Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Yeah, I agree with Brother Grimm. Only OC that GPU if your temps aren't going to suffer. That i7 and the 760 should be able to pull 60 FPS on their own at 1080p. You just need to turn down a few options like reflections and character limit.
But given that you're getting 15-25 FPS, I bet your 760 is in that advanced sleep state and GW2 isn't waking it up (a known bug on many GPUs, actually). You can verify this with GPU-Z by looking at the clock and bus settings while GW2 is running (windowed mode).
AMD Phenom II X4 955 BE 3.2 GHz
4gb RAM
Win 7.1 Home Premium 64bit
AMD Radeon HD6800
As it turns out, dialling settings down to minimum and turning off player names seems to have sorted things out, certainly while taking on the Fire Elemental last night anyway.
What you are seeing is normal for that CPU. You can OC the CPU to get a few more FPS out of it. But the fact is, it's 65% slower than current-gen Intel Haswell CPUs when it comes down to single-threaded performance.
You can lower in-game settings to reduce the load on your CPU (post-processing, character limit/quality, reflections), and that should help some, but do not expect above 18-20 FPS in fights like Maw/FE/Modniir/Jormag, etc. It's your CPU, and it's showing its age.
I recently upgraded my computer and finally felt like I could completely max out all the settings in GW2. I had been running with most things on high, so I bumped things up to ultra. Now, whenever I do a world boss like Jormag, Teq, or Karka Queen, my game crashes halfway through. After some Googling, people have attributed this to the 32-bit client running out of usable memory in these huge fights and crashing.
Are there any plans to release a 64 bit client? Or are we just going to see more encounters like Triple Trouble where instead of 1 huge zerg there are 3 smaller zergs so you’re less likely to hit the memory limit?
Check your crash log; if you see OOM, then you are demanding more memory than the Gw2.exe client can handle. The only fix I have found to get around it is to lower your character limit to Medium.
A 64-bit client would solve these issues, but there has been no news of client updates in a long time, so I doubt we will see a fix any time soon.
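To make the "check your crash log" step concrete, here is a minimal sketch of scanning a log for out-of-memory markers. The marker strings and the sample log text are assumptions for illustration; check the wording your own crash dump actually produces.

```python
# Hypothetical sketch: scan a crash log for out-of-memory markers.
# The marker strings below are assumptions, not the exact wording
# the GW2 client is guaranteed to emit.
OOM_MARKERS = ("OOM", "Out of memory", "bad_alloc")

def find_oom_lines(log_text: str) -> list[str]:
    """Return every log line containing an out-of-memory marker."""
    return [line for line in log_text.splitlines()
            if any(marker.lower() in line.lower() for marker in OOM_MARKERS)]

sample = "frame 1201 ok\n*-- Exception: OOM --*\nshutting down"
print(find_oom_lines(sample))  # ['*-- Exception: OOM --*']
```

If the search turns up a hit, the character-limit workaround above is the thing to try first.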
Glad I found this, as I just tinkered with SLI on GW2 for the first time last night.
Does anyone know how SLI affects windowed fullscreen, or vice versa? Read: I'm looking for information based on first-hand experience, not just repeating what someone else said in another post.
Anyone?
Both CrossFire and SLI require applications to be running in fullscreen mode. If you run the application in windowed mode, CrossFire/SLI is disabled for that application.
^Thanks for the good news. I hope they make the trait system available to levelers earlier rather than later.
This is all, like, opinion, man. I just leveled 2 alts and enjoyed every bit of it.
Traits don't give you additional skills to press. They give bonus damage/utility and effects.
You will still play the game exactly the same way as before, except now you can deviate into using certain skills specifically.
Every MMO uses a skill/trait system of some sort, and NO game offers it 100% upfront.
Mesmers are just weak without it. That's a class problem. Maybe their revised system is better, but it's still all subjective.
Traits offer one thing that every player can benefit from: 50% reduced fall damage. This is almost a requirement if you are doing WvW/EotM or general mapping, as there are TONS of fall-damage locations in the game. The fact that this 'has' to be a trait makes me wanna say 'traits need to be accessible at level 10 again'.
https://www.guildwars2.com/en/news/coming-soon-beta-version-of-streaming-client/
TL;DR – Players can start playing the game sooner while the game downloads in the background.
About time; this game has needed this feature for a long time. I always wondered why GW1 had it but GW2 didn't.
My Thief, more or less just playing around with the different outfits and those wings. Not sure what I'm going to do with them just yet. Kinda wish they were dyeable or available in different colors (where's white?!).
Sounds like your ISP is classifying ArenaNet traffic as torrents or something and rate-limiting it during peak hours. You need to call your ISP and open a ticket for them to inspect this.
Pro tip:
Do not bother asking for DX11; it's not gonna happen.
But do ask for a 64-bit client.
Yeah, but I don't get how the 64-bit client would help increase performance ^^
It increases the amount of memory the client can use, which means fewer things need to be reloaded from disk. Helpful if you are always in crowds with your texture and character quantity settings maxed out.
Quite simply, it would stop the game crashing with OOM errors. Every time I do a large event like Teq with max character models, I crash within 15 minutes. Set that to Medium and I never crash. This is the limitation of the 32-bit game client.
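The back-of-the-envelope numbers behind this limit can be sketched as below. The address-space figures are standard Windows/x86-64 values; the ~3.5 GB figure quoted on these forums is an observed practical ceiling, not a hard spec number.

```python
# Address-space arithmetic behind the 32-bit OOM crashes.
# A 32-bit Windows process gets 2 GiB of user address space by default,
# up to 4 GiB with the LARGEADDRESSAWARE flag on 64-bit Windows, while a
# 64-bit process typically has 2^47 bytes of user address space.
GiB = 2 ** 30

default_32bit = 2 * GiB     # default 32-bit user-mode address space
laa_32bit = 4 * GiB         # with LARGEADDRESSAWARE on 64-bit Windows
addr_64bit = 2 ** 47        # typical x86-64 user address space

print(default_32bit // GiB)  # 2
print(laa_32bit // GiB)      # 4
print(addr_64bit // GiB)     # 131072
```

So a 64-bit client wouldn't make the game faster by itself; it would simply stop the process from running out of room when max character models flood memory.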
Thanks for all the input! I decided to go build my own desktop with an i7-4790K and a GTX 980.
A 980 is overkill, and a 295X2 is faster. If you're going to do a 980, you might as well just do a 295X2; else get an R9 290X. :-)
So I have a new Mac Pro (Not a Macbook Pro)
3.5 GHz 6-Core Intel Xeon E5
16 GB 1866 MHz DDR3 ECC
2x AMD FirePro D500 3072 MB
It runs on OS X Yosemite 10.10.2
I have a beautiful 24" 3840 × 2160 at 60 Hz monitor connected via DP 1.2
Now for my question…
The game doesn’t run smooth for me unless I turn the graphics way down, which is a huge bummer with all this power. :-(
So perhaps the game is only using one of the GPUs?
The whole setup for the Mac Pro is totally unique to just the Mac Pro; follow this link for further details on the Mac Pro.
Or is there something else that is causing this issue? Thanks!
What if you drop from 4K down to 3K or 1440p? What happens then? It could be that the Mac client just cannot support 4K yet.
I am currently debating on whether or not to buy a laptop or a desktop to be able to play at 60fps with everything at max. The laptop I would get would be the ASUS ROG G751.
Its specs are:
Intel Core i7 4710HQ (2.50GHz)
16GB Memory
1TB HDD
NVIDIA GeForce GTX 970M 3GB
Is this enough to run everything max at 60 FPS (except crazy zerg clashes)? Or should I go with the desktop to get more performance for the money?
It will run at or near 60 FPS. The issue is single-threaded performance. Even with everything set to max, there are a few settings you MUST turn down to get 60 FPS outside of zerg-like content: Reflections being the most impactful, and then Character Limit the second most. Otherwise, it should be able to handle the game wonderfully.
Well, I come from the days where overclocking was unstable and severely reduced the life of the processor. Have things changed with the new Intel chips?
Overclock, but make sure you have a good cooler.
The i7-4790K will yield about 20 FPS above the i5-4690K due to HT.
But they both have the same 'zerg' level of performance because they use the same core technology.
If you have the money, get the i7. It will last you longer and hold a higher resale value when it's time to upgrade.
When you OC, the simple out-of-box test is to default your BIOS and make sure your RAM is at 1333 MHz or 1600 MHz, set the CPU voltage to 1.25 and the multiplier to 46, boot into Windows, then run Prime95. While that is running, also load up CoreTemp/HWiNFO and monitor the temps (100°C is the top, so you want between 65°C and 75°C). If you don't hardlock or BSOD, it's a stable OC.
I have only seen 6 samples of Haswell that did not OC at or above 4.6 GHz, and they were all 4670Ks and 4770Ks. The newer G3258 and 4x90Ks all OC pretty well. Just make sure you buy a GOOD motherboard with 8- or 12-phase power (Z97-G45, for example), as that is what makes those chips OC better.
WTFast runs through shared-service VPNs like most tunneling services. Unfortunately, when they change servers or providers, our systems detect large quantities of accounts suddenly being accessed from a different network. Standard preventative measures dictate that we close access to these accounts to try to ensure that each account's holdings remain secure.
As we currently don’t have an “account identity lock” system, we rely on preventative banning in this case. We are working on a better solution for this, but this type of issue doesn’t happen very often, and when it does it can quickly and easily be fixed.
In this case, it looks like this hit WTFast VPNs running out of Frankfurt, Germany, which was brought to our attention because of high gold-seller activity on it, along with a flux of accounts hitting the same, new network at the same time.
Sorry for the inconvenience, I am still looking into this issue.
You guys have heard of BGP/OSPF, right? And Internet multi-pathing? Your system would flag accounts that happen to jump routes and change source IPs due to BGP topology changes, too.
You guys need to change your security measures. Mass-banning accounts based on source IP is the STUPIDEST thing I have ever heard of, especially since accounts banned via this method need tickets opened with your helpdesk. You are just burning $$$ by going this route. There are many other, better ways to run security.
One would be to enforce every account using the email authentication system you have set up with GW2: they change IP, the system blocks access until they accept said email. Same goes for the mobile authenticator.
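The "verify on network change instead of banning" idea can be sketched in a few lines. Everything here (class and function names, the returned action strings) is hypothetical illustration, not ANet's actual API.

```python
# Hedged sketch of IP-change gated email verification: on a new source
# IP, lock the account and require an emailed confirmation rather than
# banning it outright. All names here are hypothetical.
class Account:
    def __init__(self, email: str, last_ip: str):
        self.email = email
        self.last_ip = last_ip
        self.locked = False

def on_login(account: Account, source_ip: str) -> str:
    """Return the action to take for this login attempt."""
    if account.locked:
        return "await-email-confirmation"
    if source_ip != account.last_ip:
        account.locked = True  # block access, don't ban
        return f"send-verification-email:{account.email}"
    return "allow"

def on_email_confirmed(account: Account, source_ip: str) -> None:
    """User clicked the emailed link: trust the new network."""
    account.locked = False
    account.last_ip = source_ip

acct = Account("player@example.com", "198.51.100.7")
print(on_login(acct, "198.51.100.7"))  # allow
print(on_login(acct, "203.0.113.9"))   # send-verification-email:...
```

The point of the design: a route change costs the legitimate user one email click instead of a support ticket.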
First of all, ANet doesn't say anything until they are ready to tell the world. The new camera modes announced today would likely have been shown at the two upcoming gamer cons, so it was the time to start up the hype train on that feature.
Now if they have improvements in the client such as 64-bit support and better multi-core scaling, they would keep it under wraps until the time is right to stoke the boiler of the hype train again.
I just wish for 64-bit support; being limited to 3.5GB of memory really puts a halt on using High and Ultra character model settings in the client :-)
Anything else will be a value add at this point.
I wouldn’t say 65% slower without also saying clock for clock.
And that's just the design philosophy of the two companies. AMD throws more but slower cores at processing a large number of threads, while Intel's approach is to use fewer cores with nearly 3x the performance to process the same overall workload.
As for a revamp of the engine, who knows. But for current FPS games which do scale well, that's the selling point of those game engines, which are also sold to third-party developers (as well as cross-platforming). Since ANet doesn't sell their game engine, they didn't have that additional motivation when creating it all those years ago. And they aren't sitting on an enormous pile of money like Blizzard, which could afford to spend some of it on updating their client, as well as revamping models and textures.
Everyone I know left the game due to performance-related issues, where the most fun parts of the game, like big events and WvW, are not enjoyable when the experience is the same as a PowerPoint slideshow. Investing in client-side issues would pay off, because more players coming into the world means potentially more circulation of gold and gems. Both NVIDIA and AMD offered to team up after release and solve the game's graphical issues, but guess who didn't give a crap about it and let it be?
The issue with that is, newer hardware resolves the ‘slide show’ issue.
So as long as there is a customer bound solution, the company has no ‘need’ to throw money at the issue from their end.
A customer can spend $400-600 to 'fix' the issue you describe. But the company would have to spend millions to fix it from their point of view. So, as long as the $400-600 customer-bound solution is viable right now (yields 32-45 FPS in zergs with proper settings), the company has no 'profitable' need to do it on their end instead.
And that is the cycle we are in with client side issues right now.
The other issue: if they were to upgrade the API to even DX11, every player on older DX10 cards would have to throw money at their system to use the new API anyway. For DX12 it would be a new install of Windows 10 (not available yet) and a card that supports DX12 (really new-gen stuff), and Mantle would need a supported GCN GPU (HD7790, R9 290/290X, 295X2); you are talking $300-400 right there on the customer side if they were to upgrade the API anyway.
That $300-400 could be dumped into a Haswell CPU/motherboard and yield similar 'acceptable' results today.
See the pattern here?
And that is why we will not see a client side performance fix from Anet any time soon.
Hey, I just bought the game yesterday and I accidentally messed up picking the armor chest; I picked it on the wrong character. Is there any way you guys could reset it for me? Or is it gonna stay on my warrior?
Go find a crafting table; there you can access your shared account bank. Put the box in the bank, then hit a crafting table with the other character to pull the box out of the bank.
Thanks Sir Squishy, for the detail. It looks like I’m going to be running this build for quite some time. My wife and I both run AMD FX builds.
I understand the technical differences between the processors. The point that I want to forward to ANet is that they want their game to run on as many platforms/CPUs as possible for compatibility… That's fine. But if they know of this huge discrepancy between the processor types (given the single-threaded performance), then wouldn't it be in their best interest to code the game to take advantage of the higher-end systems, AMD or Intel?
It just makes me sad that we put so much love into our gaming rigs and money into playing AAA-level games, only to have GW2, which is by far my favorite game, be the lowest-performing game on my system…
It's really not as simple as that. ANet can't do anything about it unless they redesign the game engine from the ground up. Upgrading to DX12 will isolate users to Windows 10, Mantle will favor AMD-based systems, etc.; there is always something. But the issue with the AMD line is a hardware one, not software, as every application is affected by AMD's slow single-threaded performance.
Here is a benchmark for an unrelated application to prove my point:
http://techreport.com/review/26735/overclocking-intel-pentium-g3258-anniversary-edition-processor/3
As you can see, single-threaded, the FX line gets stomped by Intel. And if you scroll down to the Crysis benchmarks, you can see that there is a HUGE difference in frame latency between the FX and the Intel line.
And for what it's worth, the G3258 is a dual-core CPU that can be overclocked.
The only way to fix a hardware issue is to replace the hardware with better hardware :-)
Well, I just have to say, that's sad, since GW2 is the only game in my library where I don't get full performance from my hardware… Just say'in…
AMD's FX line of CPUs (like the one you have) is 65% slower than Intel when it comes down to single-threaded performance. And when the next-gen CPU from Intel releases (my guess is 2016 Q1-Q2), they are going to be 75-80% slower than Intel at that point.
And that is really the root cause of your performance issues.
The game works fine with any range of GPU from a 750 Ti on up (SLI or otherwise), as the graphics in the game are not all that complex (I was able to get 90 FPS on an HD7790/R7 260X at 1080p on High graphical settings/Medium CPU settings, and that's only 10% slower than a 750 Ti).
The best thing you can do to increase performance is to make sure the load on your CPU is not being hit too hard by anything outside of GW2, and then to turn down the CPU-bound settings in-game (post-processing, character limit, reflections, etc.). Then explore overclocking your FX 8-core CPU, as that will help bridge the 65% gap between it and any 4th-gen Intel CPU.
Alternatively, you can upgrade your FX to an Intel CPU. Even a dual-core G3220 will outperform your FX for this game.
Hello everyone !
Let’s start with my pc :
8Gigs 1333 ram
i5-3570
gtx- 970 MSI OC
Win 7
When I was in town yesterday, I had around 40 FPS on the highest appearance settings with my GTX 660. Now I have 50 FPS… really? Such a small performance boost in GW2? In BF4 I went from normal settings to Ultra, same with other games; Assassin's Creed Unity and Far Cry 4 run smooth on the highest settings, and my PC can't handle GW2? 50 FPS in town is not that bad? It was an empty town, and with this I can't imagine what I'm going to see in WvW. I bought this card especially for GW2, and almost nothing changed while other games were boosted to the heavens.
What's the problem?
Well, first off, don't use the High appearance preset; there are a few things it turns on that have zero benefit to the game but take a 20-30% hit off your performance from the top.
FXAA – None
Better off running AA in your GPU's control panel than using this. It hits the CPU a tiny bit and blurs the edges rather than sharpening them.
Reflections – Sky&Terrain Only
Having this set to All enables reflections for stuff UNDER the ground that you cannot see.
Shadows – High or Medium
I see little to no benefit to running shadows above Medium in this game. High if you like the higher range, but this is more of an opinion of mine.
Post Processing – Low
On my system, None and High look exactly the same for the settings here. Low makes everything 'pop' like it should. You could turn this off, as this is mainly CPU-driven.
Character Limit – Medium
This is CPU- and memory-intensive. It lets you see more characters, yes, but at a great cost in performance.
Character Quality – High
I disable high-res textures, as I use my GPU's control panel to sharpen the image. You do lose a bit of detail by not using this option, but the performance hit is pretty huge.
Hey guys,
OK, I have a little problem with GW2.
I have quite a beefy system with an Intel 4770K overclocked to 4.4 GHz and GTX 780 Ti in SLI mode. I play at 1080p; however, the performance of the game isn't great at all, with FPS dipping to 50 in a half-empty Black Citadel. I have checked the CPU usage and saw a problem: it was running at 45-50%, so my GPUs are just waiting for nothing while the CPU is slacking. Removing the overclock does nothing at all.
Can someone please let me know if that is normal, and if it is just poor optimization of the game, or if my PC has something wrong with it?
Simple answer: our current DirectX API is complete garbage, and no matter how well you thread a game, you are going to be dependent on one core when you are truly CPU-bound, and this should only happen in MMOs and open-world games. You are CPU-bound because there are a ton of players in that zone.
As far as API? AMD Mantle fixes that. The new DirectX 12 will do the same, so will the new OpenGL.
Why don't you see total adoption of Mantle? Market share (you own NVIDIA GPUs, so it wouldn't help you). I HOPE the new OpenGL takes over, but if not, DirectX 12 will alleviate much of the CPU overhead in MMOs. It is not ANet's fault; ALL MMOs have this problem, and for what this game can actually put on screen as far as player count/effects, it is actually amazingly well optimized for our current API limitations.
You will see people argue that Crysis 3 is threaded perfectly, and it IS well threaded. The difference? You are not making the draw calls in Crysis that you are in an MMO with hundreds of players on screen. Decent threading in single-player games can work up to a point on our current ancient, crappy MS API; in open-world games and MMOs it falls apart. Chris Roberts immediately adopted Mantle because he knew CPU overhead would be a major problem with Star Citizen, and it would be an easy port to DX12.
People love to say Guild Wars 2 is badly optimized, but that is simply BS. All these newer MMOs are limited by our API. Want a badly optimized game? You can't even choose AC Unity; that is simply the devs pushing too much crap for our current API. The game was actually well threaded. A BAD port or optimization example? Far Cry 4. Mainly single-threaded, in a game that should have run like butter with decent threading. Absolutely pathetic by Ubisoft.
TL;DR: You are not going to see an MMO run at 60 FPS with tons of players on screen unless that MMO was made a decade ago, and you still need a kitten good PC to get 60 FPS in a LARGE raid or Ashran in WoW. Guild Wars 2 is much more ambitious than WoW.
I have never seen as many players on screen in WoW as I have in Guild Wars 2, and when I have? The server crashed (see Swifty and Stormwind). With this many players in SWTOR? The game was a slideshow (see Ilum). Wildstar? You need an overclocked Haswell like we have to even get decent FPS in a large raid, let alone massive PvP.
You left out one big issue with the newer APIs.
Mantle only works on certain AMD GPUs (GCN cores), and DirectX 12 will only work on Windows 10 (not yet released).
If ANet (or anyone, for that matter) were to add a new API (or replace the old one), they would be isolating a large percentage of their user base due to the hardware and software limitations.
While I would love to see an upgrade from DX9 done for GW2, I just can't see them doing it any time soon (within 2-3 years) due to the above facts.
I think you misunderstood, Squishy; Deathjester was saying to stop saying GW2 is poorly optimized, because an MMO with a hundred players on screen is going to have poor framerates even if it did have a DX12 rendering infrastructure with proper core scaling. MMOs are a different beast than FPS games, with different bottlenecks, and always will be.
No, not at all. I was just adding to his information. I totally agree 100% :-)
Move the entire game folder to the SSD, not just those two files. The game is a bit more complicated now than it was in the past due to the Coherent UI interface.
If the other files are not detected when Gw2.exe is run, then they are created before it loads the .dat file.
Just to warn you before you get scammed: do not buy the Samsung 840 EVO or 840.
http://www.overclock.net/t/1507897/samsung-840-evo-read-speed-drops-on-old-written-data-in-the-drive
I wonder if any other Kingston drives have done a bait and switch yet.
http://www.anandtech.com/show/7763/an-update-to-kingston-ssdnow-v300-a-switch-to-slower-micron-nand
The 840 EVO has a firmware update to fix that issue.
The V300 was a bait and switch; the 240GB V300 is worth about $80 (not a buck more), as it's still ~1.5x faster than a traditional HDD.
Just move Gw2.exe and Gw2.dat to a folder on your SSD and run the game; it doesn't need to be installed again. Or you can just copy the entire game folder to the SSD. Just make sure the SSD is formatted with NTFS and not FAT32.
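The copy step can be sketched as below with Python's `shutil`. The source and destination paths are examples only; adjust them to your own install, and note that another reply in this thread recommends moving the whole game folder instead of just these two files.

```python
# Minimal sketch: copy the client and data file to an SSD folder.
# File names match the GW2 install; the paths passed in are up to you.
import shutil
from pathlib import Path

def copy_game_files(src_dir: Path, dst_dir: Path) -> list[Path]:
    """Copy Gw2.exe and Gw2.dat into dst_dir; return the new paths."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for name in ("Gw2.exe", "Gw2.dat"):
        target = dst_dir / name
        shutil.copy2(src_dir / name, target)  # copy2 preserves metadata
        copied.append(target)
    return copied
```

Running the copied Gw2.exe from the SSD folder will then recreate any missing auxiliary files before it loads the .dat, per the reply above.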
Bump; still can't get it to run properly. 1-13 FPS with a GTX 970M isn't right.
Grab GPU-Z and verify there is actual load on your GTX 970M, and not the HD4600, with GW2 running. It really does sound like your graphics settings in the NVIDIA control panel switched to the HD4600 for some reason.
Newly built desktop, Jan 2015. The R9 290 (stock) is maxing out at 100% usage with low CPU usage, which is unusual, since most complaints about this CPU-intensive game are that there is minimal GPU usage.
I have completely opposite, but normal, readings when gaming on my laptop (2011 AMD Llano A8-3500M + 6750M), with expected high CPU/low GPU readings. The great thing about high CPU usage on my laptop is that I can overclock the thing from a base 1.5 GHz to 2.7 GHz, resulting in a substantial FPS gain. The frustrating thing about my desktop's low CPU usage is that any amount of overclock gets no FPS gain.
That's normal with a CPU as strong as the i7-4770K. I personally tried everything from 3.2 GHz/HT off to 4.8 GHz/HT on, and there was no change in framerates, which is normal because the GPU is running at its limit. It's still strange that you're only getting 50 FPS; you should be seeing at least 90 FPS average. Have you by any chance enabled some kind of frame limiter, like NVIDIA Inspector or in-game, or whatever software AMD may provide? Granted, that should also drop your GPU usage, but it's probably applied after processing.
Have you tried running GW2 at low resolutions, just for a test?
You might have confused me with the OP; apologies if I've hijacked the thread. I myself have an i5-4690K with an R9 290. I'm getting 18 FPS in Black Citadel with all settings maxed, with an embarrassingly low 20 FPS running around an empty Hoelbrak. The CPU is running pretty cool in the low 50s, even dipping to the high 40s; it feels like it's not even being stressed.
I wondered if it was drivers or AMD Gaming Evolved or anything else, so I did a clean Windows 8.1 install. Before I did anything else, I installed GW2 straight onto the clean Win 8.1, even before doing any updates. I had the same performance before and after installing all the CPU/GPU drivers, the DX, .NET, and C++ runtime environments, etc.
Grab GPU-Z and make sure your video card is linked at x16 version 3. It kind of sounds like it's only linked at x1 (the 18 FPS is kind of a dead giveaway with that build).
Also, make sure the GPU is in the slot closest to your CPU.
Accounts are separate and cannot be combined.
OK, that's proof I'm right about what I said. GTFO then.
We are done.
Guys, this is just a troll; it's time to ignore him.
Medion Erazer… and answer my question from before.
That is not a model number; I need the exact model number. If you will not supply it, I am done helping you.
I'll answer YOUR questions when you answer MINE.
So tell me how my specs have something to do with the fact that the game forces my laptop to use the integrated graphics? And of course it's GW2's fault; you're of course allowed to think something else if it makes you feel better. And if you don't want to help me, better not post here.
What model laptop do you have? Can we at least start there?
Edit: and of course it's a GW2 issue, as it does work normally with other games; why shouldn't it be a GW2 issue when the game doesn't switch for GW2 but does for any other game?
Well, if you want to continue acting like a kid and not be forthcoming with the requested information, then no one here (ANet included) can help you.
You might as well uninstall GW2 now and go back to one of your 'other' games that is 'working' perfectly, as you're not going to make any headway here.
Now, if you want to actually work with us and supply system information (as I first requested), then by all means do so. But staying here saying it's GW2's fault and nothing else is completely unproductive. Your time can be better used someplace else.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
And you don't understand that this has nothing to do with the specs. As I said, I have a high-end laptop, the best you can get for the money. I can run every game that's out at the moment on the highest settings without a problem. But as I said, not GW2, and it's not that intensive. So if you don't understand that, you're welcome to leave and keep your comments, which don't provide any help.
Actually it does; it has EVERYTHING to do with your laptop's specs. If you take the five minutes it takes to write the specs in a post, then we can better advise you on how to fix your issue.
My guess is you have an Intel i7 CPU with an HD 4600 GPU, and also an Nvidia 860, and your Nvidia Control Panel isn't switching to the 860 for GW2. That is a DRIVER issue and not a GW2 issue. You will have to update your drivers to fix it, or apply well-known registry fixes as a workaround. Again, that issue is nobody's fault other than your laptop's manufacturer.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Coming back to the game with a new laptop, which is high end and should be able to run the game twice over on ultra. The laptop can't even handle the game on the lowest settings.
The laptop has an Nvidia graphics card. Everything is up to date, and I checked that the game isn't running on the integrated graphics.
Even my 7-year-old other laptop can run the game smoothly on low.
I wonder if there is a bug which forces my laptop to use the integrated graphics even though I set it to use the Nvidia one.
And the system specs are?
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
First question: does GW2 support SLI? I did some digging, and people said two years ago that it didn't. Can anyone confirm this as of today?
Now, I know you're all wondering: why the GTX 680? And why not upgrade to a GTX 970 or 980?
Well, because I don't really want to waste all that money on a new card when GW2 runs fine on my GTX 680, and GW2 is mainly what I play. So why go SLI? Because a friend of mine would sell me his old GTX 680 for $150. The second game I play is BF4 at 1080p, and I'll be able to play that at ultra.
No one has addressed the main issue with SLI/CF: cost.
The performance you get by adding an additional card is not 100%; it's more like 75% at best. You also need to make sure your PSU has enough power to sustain dual 680s.
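To put rough numbers on that scaling and power math (the figures here are illustrative assumptions: ~195 W board power per GTX 680, ~100 W for the rest of the system, ~75% scaling from the second card):

```python
def sli_fps_estimate(single_card_fps, scaling=0.75):
    """FPS you might see after adding a second identical card,
    assuming ~75% scaling from the extra GPU (an assumption, not a spec)."""
    return single_card_fps * (1 + scaling)

def psu_load_watts(card_tdp_w, num_cards, rest_of_system_w=100):
    """Very rough load estimate; pick a PSU with solid headroom above this."""
    return card_tdp_w * num_cards + rest_of_system_w

print(sli_fps_estimate(60))    # -> 105.0, not 120
print(psu_load_watts(195, 2))  # -> 490, so a quality 600-650 W unit at minimum
```

So the second $150 card buys you roughly 45 extra FPS at best, and may also force a PSU upgrade on top of it.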
If I were you, I would be looking at a newer card, maybe the GTX 970: more memory, a faster GPU engine, more rendering units, etc. And it should cost about the same, if not just a little bit more. (GTX 680s on eBay are still going for ~$375 for the really good ones, such as Asus, MSI, etc.)
http://www.hwcompare.com/18057/geforce-gtx-970-vs-geforce-gtx-680/
CF/SLI is really useful when you are pushing your GPUs harder than one can work on its own. I run R9 290s in 2-way CF, and there is definitely a benefit there. But I generally only see it when running my three displays at 5040×900, rather than on a single display at 1600×900. Also, CF/SLI only works in full-screen mode, so if you run in windowed mode you will see zero benefit from it at all.
And GW2 specifically is CPU bound for performance. When your CPU takes that hit (mainly zerg content), CF/SLI will do nothing for you.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
The y50-70 is a pretty good laptop for the price range, and the specs you listed will be more than enough to play this game on ultra settings. But the same rules apply: zerg content will bring the CPU to its knees, so expect 18-24 FPS there, while I would expect 90-120 FPS for just about everything else.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
You should be able to OC that G3258 to 4.2 GHz on the stock cooler, and with the 270 you should see 70-90 FPS in the open world, with lows of 18-24 FPS in zerg-type content. Maw will drop down to single digits, though.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
The current trait system (how it's presented and delivered, as well as unlocked) needs to be reworked.
The unlocks feel clunky and out of place 99% of the time. Some unlocks are done in WvW or PvE and require some map travel to reach the areas where the unlocks occur. This is not exactly conducive to the NPE, IMHO. I feel that traits should unlock at player levels, or as players master their skills, once traits become available to use.
Take traits that have synergy with weapons: using X weapon should unlock those traits (tie it back to the AP system), like the Mesmer's scepter trait. Using the scepter should be what unlocks that trait, not some PvE exploration task.
The same goes for the falling trait: maybe surviving a fall of 20,000 damage should unlock that trait, for example.
Make it more intuitive, more fun, and less of what it is today.
Traits seriously lack synergy information in the tooltips. There are so many new players (and players that just don't care enough to visit forums) that have no idea how to properly build their character with traits. Now sure, this is purely cosmetic, but what I am going to suggest below ties into it quite well.
Have traits carry set bonuses when you are running traits that have synergy with each other. An example would be a Mesmer stun/lockdown build: take Halting Strike, Bountiful Interruption, Chaotic Interruption, and maybe Harmonious Mantras (if Mantra of Distraction is on your skill bar), have all of those traits grant some hidden bonus (maybe +5% to stun/boon effects or something like that), and reflect it in the tooltips so that it's quite obvious to players what works and what doesn't.
Sure, this does promote cookie-cutter builds and play styles, but I can see it doing more good than harm for the average player. And the set bonuses would be so small that they wouldn't really hurt players who wanted to play outside the 'synergy scope'.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
My major problem is the removal of SP gain post-80. I would earn around 5 SP a week with my level 80, which I used to earn some additional gold through the MF. Now, unless the new SP sources from HoT are repeatable, I see this change as yet one more way of earning a bit more gold being removed from play, forcing me toward mindless trains and speedruns as the only way to supplement my "income".
On top of that, the lack of SP beyond what's available from leveling 1-80 and doing SP challenges means that when leveling my next character I'll be forced to choose between buying skills and buying traits. Now, it's been mentioned that traits are changing again, but without knowing how they're being changed, I can't say my fears of running out of SP for skills are assuaged.
Let's not forget about siege upgrades using skill points (unless that has changed in the last three months). :-)
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
GPU heat while running the game is 41-45°C.
Where are you getting that info? And what model is your GPU, exactly? My R9 280X runs around 70°C at full load; 45°C is way too low.
My video card is the Radeon 280X 3GB. The temps are from SpeedFan, and it's currently running between 55-65°C.
Verify those temps with GPU-Z.
SpeedFan can be flaky at times (reading the wrong sensor for a given device ID).
Repeated with GPU-Z; the temps still read the same.
Then you got really lucky; those 280Xs run really hot. I had to swap the Asus DirectCU II cooler for an MK-26 to get my 280X's temps down. Now my top-end temp is 46°C. :-)
But that aside, have you installed the AMD Omega drivers yet? If not, I highly suggest it.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
So guys, I'm thinking of buying a new PC, but I don't have a lot of money, around $500, and since I'm building an entirely new PC, I won't have money for an Intel i5, etc. My current PC is not good at all: I play GW2 at 20-30 FPS, and in crowded areas I'm getting 5-10 FPS, which is unplayable.
Could an AMD FX-6300 get me a decent 60 FPS?
In short, no. AMD FX CPUs are not good for this game in its current state. This game's performance is determined by the single-thread performance of your CPU.
http://techreport.com/review/26735/overclocking-intel-pentium-g3258-anniversary-edition-processor/3
And as you can see, the FX-8350 does pretty badly compared to all of Intel's CPUs, and your FX-6300 is not even as good as the FX-8350.
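If you want a crude, relative feel for single-thread speed on your own machine (a toy loop only, not a substitute for the benchmark linked above; the function name is made up for this sketch):

```python
import time

def single_thread_score(iterations=2_000_000):
    """Run a serial integer workload and return iterations per second.
    Only meaningful for comparing two machines running this same script."""
    start = time.perf_counter()
    acc = 0
    for i in range(iterations):
        acc += i * i  # deliberately serial; no threading, no vectorization
    elapsed = time.perf_counter() - start
    return iterations / elapsed

print(f"{single_thread_score():,.0f} iterations/sec")
```

Run it on the FX box and on a friend's Intel box; the ratio gives a rough idea of the single-thread gap GW2 cares about.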
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
It actually kind of sounds like the collections will be for the NEW legendary precursors, while Master Crafting will be for crafting the OLD legendary precursors.
No, I'm pretty sure both new and old precursors will be obtained through the same method. You can quote me on it.
We'll just have to see. But being able to create one precursor of each type per account seems very limiting.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
That happens to me every once in a while. Restarting the GW2 client, or switching characters and then back to the affected character, has resolved it in every case. The last time it happened was the night before last in Lion's Arch; the issue followed another player as I looked at them (I think Spark was the cause). When I logged out and back in, I was in a different instance of LA, that player wasn't around, and boom, issue gone. I thought it might be game files, so I ran a -repair last night and have not been back in since.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Hm, I don't want to go off topic, but since a thread like this popped up: is it possible to change our account name? I've been curious about it for a while. That's all, ty.
Yes, you have to open an in-game ticket with the request.
False, unless you mean the account name, which is the email address associated with the account. However, I think the person above was talking about the display name:
ANet will only change display names in the event someone's personal privacy is exposed (such as it using their real name) or if the display name is offensive/violates the naming policy.
Source:
https://forum-en.gw2archive.eu/forum/support/support/Name-Questions-Please-read-merged/
Um, totally not false.
ANet will change the account display name. Sure, certain factors have to be addressed (harassment and need), BUT they will do it. And an in-game ticket is the only way to go about it.
He didn’t ask for specifics, just if it can be done.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
I have only one wish: don't make the new legendaries tradeable on the TP.
This ^.
I have only one wish: don't make the new legendaries tradeable on the TP.
Too many things are already account bound. Don’t lock more things away.
Legendaries were meant to be ultra-rare items that showed your mastery of the game; making them tradeable reduced them to a high-end checklist item that you grabbed if you wanted to, with money you grinded. I hope they make them account bound and that acquiring the parts is made far harder and more skill based.
+1, I totally agree. THIS is why I never invested in a legendary. It wasn't because of the precursor; it was because anyone and their brother could have one by dumping cash into gems and then converting to gold. IMHO that is a small version of pay-to-win, which GW2 was never about.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Which SSD are you running GW2 from? The 850 EVOs have firmware issues after a certain amount of data has been written to the drive; there is a firmware update to resolve it. You can test this by moving GW2 to your OCZ SSD.
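Moving the install really is just moving the game folder and re-pointing your shortcut; here's a minimal stdlib sketch (the paths are hypothetical; substitute your actual install and target drive):

```python
import shutil
from pathlib import Path

# Hypothetical paths -- adjust to your own install and target SSD.
src = Path(r"C:\Program Files\Guild Wars 2")
dst = Path(r"D:\Guild Wars 2")

def move_install(src, dst):
    """Move the whole game folder to another drive.
    GW2 keeps its data beside the executable, so after the move
    you only need to point your shortcut at the new Gw2.exe."""
    if dst.exists():
        raise FileExistsError(dst)
    shutil.move(str(src), str(dst))
```

If the stutter disappears on the OCZ drive, that points at the 850 EVO firmware rather than the game.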
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
