Graphics card driver support
Until they optimize the CPU usage in GW2 and throw more data at the GPU, driver support won't mean anything.
But on that note, I have noticed some performance increases here and there.
The GPUs in all my systems now sit between 68% and 90% usage (depending on players in the area), while my CPUs spike at 80-85% tops. But the CPU threading that GW2 uses still needs work. On an 8-core system I can only get GW2 to actively use 6 cores (not talking HT), and if I change zones it drops down to 4 cores. Forcing affinity in Task Manager is the only way I can make GW2 run on 6 of my 8 cores.
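If you get tired of re-applying that affinity by hand in Task Manager, here's a minimal sketch of the same trick with Python and the psutil library. The process name Gw2.exe and the core list 0-5 are assumptions for an 8-core box; check Task Manager for the actual executable name, adjust the cores to taste, and you may need to run it elevated.

    import psutil  # third-party: pip install psutil

    GAME_EXE = "Gw2.exe"          # assumed client executable name; verify in Task Manager
    CORES = [0, 1, 2, 3, 4, 5]    # assumed: pin to the first 6 of 8 cores

    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == GAME_EXE:
            proc.cpu_affinity(CORES)  # same effect as "Set affinity" in Task Manager
            print(f"Pinned {GAME_EXE} (pid {proc.pid}) to cores {CORES}")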
Just wish we had itemized settings for post-processing effects. I absolutely hate the green halos on character/NPC mouse-overs, just HATE IT.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
SLI and CrossFireX only help when the GPU is taking too long to render the data given to it, relative to the engine being ready to send down the next frame's data. If the GPU is already done while the CPU is still organizing the data to be sent, SLI and CrossFireX are meaningless, and sometimes worse when enabled.
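A quick back-of-the-envelope way to see why: the frame can't go out until both the CPU's prep work and the GPU's render are done, so whichever side is slower sets the frame rate. Here's a minimal sketch with made-up per-frame times (the 22 ms CPU and 9 ms GPU figures are invented for illustration, and it ignores pipelining/AFR):

    def fps(cpu_ms, gpu_ms):
        # The slower of CPU prep and GPU render paces the frame.
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpu_ms, gpu_ms = 22.0, 9.0         # hypothetical busy-zone frame times
    print(fps(cpu_ms, gpu_ms))         # ~45 fps, CPU-bound
    print(fps(cpu_ms, gpu_ms / 2))     # still ~45 fps: a second card halves GPU time, not CPU time
    print(fps(cpu_ms / 2, gpu_ms))     # ~91 fps: only speeding up the CPU side helps here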
RIP City of Heroes
True stories… Also, on my hex-core only 8-10 threads are being used, at about 60-70% usage. The GPU doesn't even go above 60% usage on the one card, and the other is dead flat at 0%. All of this needs attention, please, Anet.
On that point, upgrading a shader here and there or some higher-resolution textures would be nice, but ONLY after the performance issues have been dealt with.
Absolutely correct here.
The CPU-side code in this game is just a mess overall.
For an unrelated example, I tested BioShock Infinite and Crysis 3 with my FX-8350 at 5.0GHz and my GTX 670 locked at 1228MHz. I killed 'em both: Crysis 3 on high settings averaged 57fps, and BioShock over a hundred. CPU and GPU usage were efficiently spread, and GPU usage sat at 99%.
“But Yaweha!” You scream incoherently. “This is an MMO, they aren’t the same beast!”
Well, I went back to WoW and reinstalled the 64-bit WoW client to see whether my FX-8350 or my GTX 670 was an issue there. In high-population zones and raids (Stormwind, a random 25-man raid) I never dropped below 45fps, and averaged well over 90 with everything turned up at 1080p. Considering this 64-bit client came out years later for an already ancient and very unoptimized game engine, ArenaNet has no excuse.
So there goes that theory.
I just hope these "rumoured" fixes ArenaNet says are coming have something to do with correcting this problem, because there are a lot of people on here running absurdly powerful systems incapable of getting decent frame rates in what are arguably the best parts of this game (hell, these events are easily the game's biggest selling point), and that is really sad.
I can't even imagine what the poor slob running this on an A6- or i3-powered laptop is going through right now.
P.S. CrossFire or SLI in a game this CPU-unoptimized would be pointless. Hell, GPU driver fixes and/or updated profiles will likely have almost zero impact as well. The problem is pretty much 100% CPU-bound, and as such you should not be looking to Nvidia or AMD for the solution here.
Unless you're trying to run three monitors, in which case my advice is "good luck," and I hope you're running at least a 3930K considering how single-threaded, brute-forced, and IPC-bound this game is.
Hey, I run on 3 monitors for non-invasive content (dungeons, world bosses, champ trains, etc.). I'd never do it for WvW though! I get 38-45FPS at 5060×900 on High (not Extreme High) settings :-) WvW is like 10FPS if I'm lucky :-)
But you're right on the money. Anet has no excuse for the poor performance of this game. Looking back over history, WoW, BioShock, Crysis 3, Fallout 3: they all use the hardware efficiently.
GW2 has been out for a year now; it's really time they dumped some of their Living Story funds into fixing the performance of this game.
Because, I'll tell you what: every LS release lasts me a max of 3 days before I get all the meta achievements I want (the account reward for that release). The daily takes me 45 minutes to complete on average, the monthly takes me 2 days on average, I'll farm the champs in FSG for 5 hours a week (to keep my gold reserves up), and I run 2 dungeons a week (to work on the title). Do you know what I'm doing with the rest of my gaming time? WvW, where the game is at its weakest. IMHO this is a HUGE problem.
PvE content is lacking: there's no real reason to grind dungeons once you're 80 and in full Exotics/Ascended (except FotM 20s, for the Fused rings for your alts and such). There are no raids (or linear progression). There is a serious lack of guild activity (the weekly events take our guild 4 hours to do on a single Saturday, and that content is weak and boring). And sPvP is a tweakers' convention; while it's fun sometimes, with the skill-lag spikes and such it's not nearly as fun as WvW (IMHO).
So after you strip out the grinding content, all that's left is sPvP or WvW. Which would be FINE, if the dumb game were optimized for it.
And if, in 6 months, the performance of WvW (the main feature here) is not adequate, I highly doubt I'll be playing this game as much as I am now (just like how I haven't played Diablo 3 since GW2 was released; it's a dumb, boring game… but it's also free after the license purchase).
But in all reality, for a game that has no monthly cost, how can we as players make such demands? It's not like it's any skin off NCsoft's back if they lose 10-20% of their player base from time to time due to programming issues.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
“But Yaweha!” You scream incoherently. “This is an MMO, they aren’t the same beast!”
Game genre has nothing to do with the engines. Try comparing RuneScape to Final Fantasy XIV; both are MMORPGs, but they scale totally differently…
Could also be like comparing the latest Call of Duty and Battlefield games. Both are shooters, but they use totally different engines. In most cases, the IW engine is way more likely to have better performance than Frostbite.
Different game engines do different things. Generally speaking, though, the IW engine aims to maintain 60 fps with decent, non-post-processed graphics, while Frostbite is more about physics, motion blur, and whatever other "intense" graphics settings you can think of.
There are huge differences between WoW's game engine and GW2's; it's not even possible to compare them reasonably (apples to oranges, basically; yeah, both are fruit, I guess). A reasonable comparison might be GW and GW2, but there's no telling how different the engines really are even between those two games.
As already mentioned, GW2 is more CPU-bound than GPU-bound. You can CrossFire a couple of Radeon HD 7850s, or just use a single 7770, and it won't improve performance at all if you're on some Athlon X2 processor. It's all about upgrading the bottleneck(s); if I occasionally run out of RAM, upgrading my HDD won't help…
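If you want to confirm which side is the bottleneck on your own box before spending money, here's a minimal sketch: sample per-core CPU load with Python's psutil while you play in a busy zone, and watch GPU usage in whatever tool you already use (Afterburner, GPU-Z, etc.). One core pegged near 100% while the others idle, with the GPU well under ~95%, usually means a single-thread CPU bottleneck. The 90%/60% thresholds below are arbitrary rough cut-offs, not anything official.

    import psutil  # third-party: pip install psutil

    # Sample per-core CPU load once a second for ~30 seconds while playing.
    for _ in range(30):
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        avg = sum(per_core) / len(per_core)
        flag = "  <- looks like a single-thread CPU bottleneck" if max(per_core) > 90 and avg < 60 else ""
        print(" ".join("%5.1f" % p for p in per_core) + flag)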
Just to give a reasonable idea of how CPU-bound GW2 is, I run Eyefinity (5040×900) with highest settings (including Supersampling) on a single Radeon HD 7850, and still don’t manage 99% GPU usage in most cases. My CPU is a Phenom II X4 @ 3.3GHz (an unlocked and overclocked X3 720).
A general rule of thumb currently seems to be to prefer higher frequency over core count. A 16-core processor @ 2GHz will be terrible for GW2 compared to a 3-core processor @ 5GHz (depending on other factors of course, but hopefully the general idea comes across).
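To put rough numbers behind that rule of thumb: assume (purely for illustration) that about half of GW2's per-frame work can be spread across threads and the rest is stuck on one thread. Then Amdahl's law plus the clock difference says the 3-core @ 5GHz chip comes out well ahead. The 50% parallel fraction is a guess, and real IPC differences are ignored.

    def relative_perf(cores, clock_ghz, parallel_fraction=0.5, base_clock_ghz=2.0):
        # Amdahl's law: extra cores only speed up the parallel share of the work.
        speedup_from_cores = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
        # Scale by clock relative to a 2GHz baseline (ignores IPC differences).
        return (clock_ghz / base_clock_ghz) * speedup_from_cores

    print(relative_perf(16, 2.0))  # ~1.9x a single 2GHz core
    print(relative_perf(3, 5.0))   # ~3.8x: fewer, faster cores win when half the work is serial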