Over the last few days, my tech team has been trying various configurations to isolate the bottleneck and track down the wide range of performance they were seeing on our test systems, and it appears users in the forums have been doing the same.
Here are some things my techs noted that don’t make sense, unless there are some major problems with both CPU and GPU threading.
• System 1
Intel Core i7-3770K (more than enough CPU)
ATI Radeon HD 6970 1 GB
16 GB RAM
SSD
Win7 x64
Avg 26 fps @ 1680×1050
• System 2
Intel Pentium 4 3.4 GHz (not even close to enough CPU)
NVIDIA 7950 512 MB
4 GB RAM
SSD
Win7 x64
Avg 29 fps @ 1920×1200
Now why does this not make any sense? It is impressive that a computer from 2006 can hit nearly 30 fps, but virtually the fastest computer you can build today does not seem to be using its resources wisely.
Before some goof jumps in with “your system is messed up”: we have a test center, configured 8 different systems, and tested several motherboard/RAM/CPU/GPU combinations and various OS/driver configurations.
Here are things we tried that don’t make sense…
On System 1, we put in an NVIDIA GTS 250 768 MB with the latest drivers on a clean OS install. We were trying to pin down NVIDIA vs. ATI behavior over several generations. Overall, both ATI and NVIDIA sucked the same.
However, one interesting thing about the game: hitting Auto-Detect in game shoves everything to Medium at native resolution with the 3x-slower NVIDIA GTS 250; yet with the ATI 6970, 6850, and 5850 video cards, Auto-Detect puts everything at Low and even enables subsampling.
(**When affinity was limited to 2 CPU cores, Auto-Detect would then set the 3x-faster GPU to Medium settings. Changing affinity back to 4 cores, Auto-Detect would set everything to Low again, while the slower GPU stayed at Medium.)
That should not be happening.
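For anyone wanting to reproduce the affinity part of this test, here is a minimal sketch of how a process can be pinned to 2 cores. This uses the Linux `os.sched_setaffinity` call; on Windows (where our tests ran) the equivalent is Task Manager’s “Set affinity” dialog, and the `start /affinity 0x3 Gw2.exe` invocation shown in the comment is a hypothetical example of doing the same from cmd:

```python
import os

# Linux equivalent of the Task Manager "Set affinity" step used above.
# (On Windows cmd, a hypothetical equivalent launch would be:
#   start /affinity 0x3 Gw2.exe
# where 0x3 is a bitmask selecting logical CPUs 0 and 1.)
available = sorted(os.sched_getaffinity(0))  # CPUs this process may run on
two_cores = set(available[:2])               # first two logical CPUs
os.sched_setaffinity(0, two_cores)           # pin this process to them
print(sorted(os.sched_getaffinity(0)))
```

You can then watch per-core usage in a monitor (Task Manager, `top`) while the game runs, exactly as described in the tests above.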
On the i7, i5, and AMD CPUs that have 4 cores, the game would run a small number of threads, yet each core would consistently peg at about 1/4 usage, leaving roughly 75% of the CPU’s capacity unused.
Which does not make sense, because if the game is starving for CPU, as it appears to be, it is not maximizing the use of any core.
Expanding on this theory: when we set affinity to 1 CPU core, the game would hit 100% on that core, and with affinity on 2 CPU cores, both cores would run near 90–100%. However, when we enabled 3 CPU cores, each core’s usage dropped to around 50%, and with 4 CPU cores, usage dropped to about 25% per core, and the game was still starving for CPU.
There is no reason that expanding affinity to all cores should decrease performance and CPU load. (Yes, some CPUs throttle under heat, but that is not what we were seeing.)
If the system is starving for CPU and threading out to 4 cores, as the GW2 client was doing, it should be consuming as much of each core as possible, pushing them all to 100% if needed.
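One hypothesis consistent with this pattern is a single heavily contended lock: no matter how many cores you give the process, all threads queue on the same lock, so aggregate throughput stays at roughly one core’s worth of work, smeared thinner across more cores. Here is a toy illustration of that effect (my own sketch, not the actual GW2 code; note that CPython’s GIL also serializes pure-Python threads, which only reinforces the picture):

```python
import threading
import time

def contended_work(n_threads, iterations=100_000):
    """Toy model of the symptom: every thread must take one shared lock
    to do its work, so adding threads (cores) cannot add throughput --
    the same total work just gets spread across more cores."""
    lock = threading.Lock()
    counter = [0]

    def worker():
        for _ in range(iterations):
            with lock:  # single global lock serializes all threads
                counter[0] += 1

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter[0], time.perf_counter() - start

total, elapsed = contended_work(4)
print(total)  # 400000 increments completed, but serialized by one lock
```

With 4 threads the work completes correctly, but wall-clock time barely improves over 1 thread, and per-core usage in a monitor drops as thread count rises, much like what we measured.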
After 2 cores, enabling the 3rd and 4th no longer increased FPS in the game, even though the game was spreading work across all 4 cores; it simply used a smaller percentage of each.
I am wondering whether the new scheduling code that Tom’s Hardware reported was added is where this problem lies.
It is almost as if the game is threading too much without managing the threads well; again, it would be wiser to leave that to the NT kernel than to manage it internally in the game.
There is no reason the Pentium 4 system should run OK while a fast i7 with a much faster GPU and more resources runs so badly.
There is also no reason that game performance with only 2 cores assigned should be equal in FPS to using 3 or 4 cores on the same system.
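The flat FPS beyond 2 cores is exactly what Amdahl’s law predicts when a large serial portion dominates the frame. A quick back-of-envelope sketch (my framing, and the 0.5 parallel fraction is an illustrative assumption, not a measured value):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Ideal speedup when only `parallel_fraction` of the frame time
    can be spread across n_cores (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# If only half the frame is parallelizable, going from 2 to 4 cores
# buys very little -- roughly the flat scaling the FPS numbers showed.
print(round(amdahl_speedup(0.5, 2), 2))  # 1.33
print(round(amdahl_speedup(0.5, 4), 2))  # 1.6
```

The smaller the parallel fraction, the sooner the curve goes flat, which would explain why cores 3 and 4 added nothing.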
I see a lot of people being dismissive of players who are not happy with performance, telling them to get a newer CPU or newer GPU, etc. However, there is an issue with the game’s performance that does not match what the hardware is capable of.
So maybe be a bit kinder to your fellow players; non-informative and dismissive statements are not helping find the solution, and they also discourage the developers from taking the performance issues seriously.
So take this information with a grain of salt or use it to figure out what is up with the performance of the game.
Good luck, and I hope a developer is also listening/reading.