CPU bound GW2 and Megaservers
On the surface, I would guess you perhaps have the Character Model Limit set above low and/or supersampling on?
Also, I don't think we will ever get high frame rates from Anet with maxed settings during something like the Maw with large numbers of players present.
I notice zero framerate difference with supersampling enabled or disabled in meta events. Seriously. The character model settings can have an effect on FPS, and I generally play with those settings.
The issue is that, after 2 years, GW2 isn't coded with modern hardware in mind.
For example, Megaservers for meta events where most people will get 2 FPS. How is that fun?
I have a distinct feeling that the netcode for culling is causing issues, but that is just a hunch.
I don't know what Anet are thinking when they code a game for zerg play but don't code it so that most users can enjoy it at a reasonable framerate, and 2-5 FPS isn't enjoyable.
Until Intel releases a quad-core CPU that can run at a stable and consistent 6.5GHz, I can't see zerg play in GW2 being enjoyable unless you like playing in potato mode.
Megaservers for meta events where most people will get 2 fps. How is that fun?
My computer is just over 3 years old now and I get 20-30 FPS in zerg events with high settings, low model limit.
Most people don’t have modern hardware; they have very outdated hardware, hence the low FPS. Additionally, these people are used to simply slotting in a new GPU as the only needed upgrade for games, causing their CPU to fall way behind over time. ArenaNet has to set the bar somewhere.
Even if the game was perfectly optimized, there’s no way 5+ year old hardware would be able to handle 100+ people smoothly. One optimization I tend to see in other MMOs is reduced animation quality, which I find very annoying. GW2 doesn’t do that and instead reduces the quality of spell effects. Animation quality basically lowers how often an object’s animation is updated depending on how important the object is. For example, a player running into the distance would only be animated with a few frames, making the movement of their legs look very choppy. It can greatly reduce processing load, however.
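To illustrate what that animation-quality trick amounts to, here is a minimal C++ sketch; the Character struct, the animUpdateInterval thresholds, and the commented-out advanceSkeleton call are all invented for the example, not taken from GW2 or any real engine.

    // Minimal sketch (not any real engine's code): skipping skeletal animation
    // updates for distant characters, the "animation quality" trick described above.
    #include <cstdint>
    #include <vector>

    struct Character {
        float distanceToCamera = 0.0f;       // hypothetical units
        uint32_t framesSinceAnimUpdate = 0;
    };

    // Returns how many rendered frames pass between skeleton updates.
    // Nearby characters animate every frame; distant ones only every 4th/8th frame,
    // which is what makes their legs look "choppy" but saves CPU time.
    uint32_t animUpdateInterval(float distance) {
        if (distance < 20.0f) return 1;   // full quality
        if (distance < 60.0f) return 4;   // noticeably stepped
        return 8;                         // background crowd
    }

    void updateAnimations(std::vector<Character>& characters) {
        for (auto& c : characters) {
            if (++c.framesSinceAnimUpdate >= animUpdateInterval(c.distanceToCamera)) {
                c.framesSinceAnimUpdate = 0;
                // advanceSkeleton(c);  // the expensive per-bone work would go here
            }
        }
    }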
The problem with games like this is that they’re always going to be CPU-bound when it comes to zergs and massive events. There isn’t much Anet could do outside of ensuring the game uses as many cores as possible (see the rough sketch after this post), but that introduces its own problems:
Firstly, the game would need to be rewritten from the ground up. This alone makes it nearly impossible to do.
Secondly, they would need to cap how much of a CPU can be used. If the game started using all available cores, a lot of people would suddenly be complaining that the game is making their PC overheat and shut down – especially those that aren’t great at maintaining their hardware.
Thirdly, there are very few game developers who have experience in writing programs that can use 4 or more cores. As demonstrated by Ubisoft, it doesn’t work out particularly well when you have inexperienced people attempting to bodge it.
The only other alternative is to have the servers handle more and more of the load – but this is extremely expensive and we all know they’re having problems with server performance as it is.
So while it would be nice for the game to perform better in large-scale situations, I really don’t think any massive improvements in this area will be forthcoming. The most realistic hope is that there’s a pretty major leap in CPU performance with one of the next few generations that makes it a non-issue.
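To make the "use more cores" idea concrete, here is a minimal C++ sketch, purely my own illustration and nothing like the actual GW2 engine, of fanning per-character update work out across a capped pool of worker threads; the Character struct, the updateCharacter stand-in, and the half-the-cores cap (echoing the second point above) are all assumptions.

    // Minimal sketch, not the GW2 engine: spreading per-character update work
    // across a capped number of worker threads. Capping below the full core
    // count is the kind of limit mentioned above to avoid saturating every core.
    #include <algorithm>
    #include <thread>
    #include <vector>

    struct Character { float x = 0, y = 0; };

    void updateCharacter(Character& c) {
        // stand-in for the real per-character simulation work
        c.x += 0.1f; c.y += 0.1f;
    }

    void parallelUpdate(std::vector<Character>& chars) {
        // leave headroom: use at most half the hardware threads (arbitrary cap)
        unsigned workers = std::max(1u, std::thread::hardware_concurrency() / 2);
        std::vector<std::thread> pool;
        size_t chunk = (chars.size() + workers - 1) / workers;

        for (unsigned w = 0; w < workers; ++w) {
            size_t begin = w * chunk;
            size_t end = std::min(chars.size(), begin + chunk);
            if (begin >= end) break;
            pool.emplace_back([&chars, begin, end] {
                for (size_t i = begin; i < end; ++i) updateCharacter(chars[i]);
            });
        }
        for (auto& t : pool) t.join();
    }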
You simply can’t compare an MMO with singleplayer games or shooters.
Even the CryEngine in AION was lagging like hell in big zergs, as did the Unreal Engine in Lineage 2, for example (when using hardware that was quite good at the release of those games).
The best MMOs are the ones that never make it. Therefore Stargate Online wins.
There are certainly some graphical options they could add (mostly in terms of turning off or turning down the visual effects) to make large-scale zerg activities “better”, but expecting the “netcode” (not the problem) or the engine’s DirectX support (the REAL issue) to be altered on an existing 2-year-old game is just not a reasonable request, and it’s not going to happen.
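Purely as a hypothetical illustration of that kind of option, here is a short C++ sketch of automatically dialing effect quality down as the nearby player count rises; the enum, function name, and thresholds are all made up and not from GW2.

    // Hypothetical sketch of the kind of graphical option suggested above:
    // scaling particle/spell-effect quality down as nearby player count grows.
    #include <cstddef>

    enum class EffectQuality { Full, Reduced, Minimal };

    EffectQuality effectQualityFor(std::size_t nearbyPlayers) {
        if (nearbyPlayers < 20) return EffectQuality::Full;
        if (nearbyPlayers < 60) return EffectQuality::Reduced;  // fewer particles
        return EffectQuality::Minimal;                          // key hits/AoE markers only
    }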