WvW PC help
Anyone who says they get 60 FPS during a megablob fight (3-way 60v60v60) is lying. I play with everything on max except shadows off, reflections off, and the character model settings on medium, and I averaged around 25-35 in that situation.
Blame ANet for coding this game so poorly and not upgrading it to DX11/12.
i7-4930K, 3-way 780 SLI (which means nothing for this game)
Don’t go the irrational “blame Anet” route.
There isn't a game out there that runs 180 player characters, with 180 various server connections and ISPs linked together, that doesn't have the same or worse problem.
Come on already, we had the WvW stress test on the new BL, and everyone on the normal servers had huge lag spikes!
Weren't those supposed to be on different servers? Well, the lag tells a different story: now that the test is over, the lag is gone for most servers.
But as usual ANet is claiming it's not them and asking us to provide a tracert. Same old excuse, and nothing being done!
Don’t go the irrational “blame Anet” route.
There isn't a game out there that runs 180 player characters, with 180 various server connections and ISPs linked together, that doesn't have the same or worse problem.
I came back after spending around 8 months playing ESO.
I was surprised how great GW2 ran after remembering how laggy it was. So I guess go play some ESO if you want to feel really great about GW2!
Melanessa-Necromancer Cymaniel-Scrapper
Minikata-Guardian Shadyne-Elementalist -FA-
Come on already, we had the WvW stress test on the new BL, and everyone on the normal servers had huge lag spikes!
Weren't those supposed to be on different servers? Well, the lag tells a different story: now that the test is over, the lag is gone for most servers.
But as usual ANet is claiming it's not them and asking us to provide a tracert. Same old excuse, and nothing being done!
So now you're using a stress test to cry foul? Could you please try to keep the discussion based on something more reasonable?
How long have you been a programmer? They stated fairly reasonable facts that are common knowledge to those in the field, so I am going to go out on a limb here and suggest that you are making proclamations on matters you have no education or experience in. Care to share the credentials that lend weight to what you're suggesting?
Don’t go the irrational “blame Anet” route.
There isn't a game out there that runs 180 player characters, with 180 various server connections and ISPs linked together, that doesn't have the same or worse problem.
Which has nothing to do with the FPS problem. The FPS problem is because of the engine, not ISPs or anything like that.
Don’t go the irrational “blame Anet” route.
There isn't a game out there that runs 180 player characters, with 180 various server connections and ISPs linked together, that doesn't have the same or worse problem.
Which has nothing to do with the FPS problem. The FPS problem is because of the engine, not ISPs or anything like that.
So what game out there right now uses an engine that lets 180 players in different locations render fluidly to one another simultaneously, while maintaining solid pings and a consistent 60 FPS or better?
Don’t go the irrational “blame Anet” route.
There isn't a game out there that runs 180 player characters, with 180 various server connections and ISPs linked together, that doesn't have the same or worse problem.
Which has nothing to do with the FPS problem. The FPS problem is because of the engine, not ISPs or anything like that.
So what game out there right now uses an engine that lets 180 players in different locations render fluidly to one another simultaneously, while maintaining solid pings and a consistent 60 FPS or better?
CU currently has the framework built to handle 1000+ players per instance. : )
The main problem with ANet is that the internal netcode is garbage, and if you want to actually do some research instead of just blindly praising ANet, use Wireshark and look at all of the garbage each player spams the server with.
Good network design minimizes overhead; for some reason ANet decided that maximizing the number of TCP headers each of their servers has to parse was a good idea. Not to mention anything related to optimizing how conditions are handled, and literally everything to do with damage.
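Here's roughly what "minimizing overhead" means, as a made-up Python sketch (the field layout and names are invented, not GW2's actual wire format): coalesce many small per-player updates into one send so the fixed per-packet cost of the TCP/IP headers is paid once per batch instead of once per update.

import struct

# Hypothetical layout: 4-byte player id + three 4-byte floats = 16 bytes per update.
def encode_update(player_id, x, y, z):
    return struct.pack("<Ifff", player_id, x, y, z)

# Coalesce pending updates into one payload: a 2-byte count, then the fixed-size records.
def build_batch(updates):
    records = [encode_update(*u) for u in updates]
    return struct.pack("<H", len(records)) + b"".join(records)

# Two updates fit in a single 34-byte send instead of two separate packets.
batch = build_batch([(1, 10.0, 2.5, -3.0), (2, 11.5, 2.5, -4.0)])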
ANet has made so many bad design decisions over the years that literally every intelligent professional who even marginally works with computers has a reason to laugh at them: sysadmins, protocol people, database people. Seriously, look at how ANet sends the information about another player at the packet level. Enumerating outfits and armor with fixed-size fields seems like a really good idea, since literally everyone else in the industry does it with their art teams. But no, for some reason that's not implemented, so evaluating every single different Fashion Wars 2 costume becomes incredibly CPU-heavy. It's a work of sheer genius that a game this badly implemented on a technical level even runs. Props to Intel, AMD, and NVIDIA for making hardware strong enough to run this game even at 30 FPS.
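To show what enumerating cosmetics could look like (again a hypothetical sketch with made-up IDs, not GW2's real format): each outfit or armor skin gets a small integer ID, so a character's whole look becomes a few fixed-width fields the client decodes with cheap table lookups.

import struct

# Hypothetical lookup tables; a real game would ship these alongside the art assets.
OUTFITS = {0: "none", 1: "wedding", 2: "pirate"}
ARMOR_SKINS = {100: "heavy_t3", 101: "light_t2"}

APPEARANCE_FMT = "<IHH"  # player id, outfit id, armor skin id: 8 bytes total

def encode_appearance(player_id, outfit_id, skin_id):
    return struct.pack(APPEARANCE_FMT, player_id, outfit_id, skin_id)

def decode_appearance(data):
    player_id, outfit_id, skin_id = struct.unpack(APPEARANCE_FMT, data)
    # Table lookups instead of re-deriving the look from a verbose description.
    return player_id, OUTFITS[outfit_id], ARMOR_SKINS[skin_id]

print(decode_appearance(encode_appearance(42, 2, 101)))  # (42, 'pirate', 'light_t2')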
That’s all. : )
So you cannot name a game then?
So you cannot name a game then?
MAG did it with 256 people on the PS3. EVE has done it since around 2008 I believe, maybe earlier. Planetside 2 did it with over 1000 players, and I believe it still holds the world record. And CU is currently alpha testing with over 1000 “players” as well. This just took a quick Google search to find out; not sure why you need to pretend that a user’s whole post is invalid just because you don’t know how to look up information on your own.
Also, the fact that you use language like “maintains solid pings” while completely disregarding the entirety of Vermillion’s post is mildly depressing. He seems to know what he’s talking about, so maybe giving what he says some thought is a good idea.
I really enjoyed playing MAG; I played the heck out of that game. It had much worse problems with too many players on screen at once, and I could list its visual problems. I loved the game though, and I am not an FPS guy. But it did a much worse job than GW2 as far as lag goes.
Planetside 2 has all of those players rendered on each other’s screens at the same time? Got a screenshot of that?
Got a screenshot or any evidence of CU rendering 180 players at 60 FPS?
I appreciate that you did a Google search and all, but you offered about as much evidence as a Google search for Bigfoot does.
i7-3630QM, Win 8.1, Radeon 7970M
Character limit/quality: medium
Shadows: off
Rest: highest (these don’t affect the framerate)
Result: 15 is the lowest in huge zergs, 20-30 is typical during fights
The main problem is that GW2 is still DX9.
DX11 was the first to introduce some kind of support for multi-core rendering. Some games (many CryEngine 3 titles) can even utilize Hyper-Threading on i7s.
DX12 will push it even further and bring even more improvement.
But we can’t expect GW2 to gain support for either.
IMHO it makes no sense for ANet not to do so, since DX11 cards have been around for so long that anyone still on pre-DX11 hardware can’t run GW2 properly anyway. Such old computers mainly serve for web browsing and watching films. Putting DX11 in the minimum requirements wouldn’t hurt the population.
If CU renders 180 players at 60 FPS, it MUST be using DX11+.
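To make the multi-core point concrete, here is a rough Python sketch of the pattern DX11’s deferred contexts enable (purely an analogy, not real D3D code): worker threads record command lists in parallel and one thread submits them in order, whereas a DX9-style renderer has to do all of that work on a single thread.

from concurrent.futures import ThreadPoolExecutor

def record_command_list(character):
    # Per-character CPU work: culling, sorting, building draw calls...
    return [f"draw {character} piece {i}" for i in range(3)]

def render_frame(characters):
    # DX11-style: record command lists in parallel across CPU cores.
    with ThreadPoolExecutor() as pool:
        command_lists = list(pool.map(record_command_list, characters))
    # Submit serially, in order (roughly analogous to ExecuteCommandList);
    # under DX9 the recording above would also have to happen on this one thread.
    for commands in command_lists:
        for cmd in commands:
            pass  # hand cmd to the GPU here

render_frame([f"char{i}" for i in range(180)])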
On my 980 Ti, the graphics are completely choked regardless of settings, with one exception: the model limit. Changing that from high to medium during PvE boss zergs (i.e. 100+ people in front of you) makes the FPS go from 20 to 50+ or so. Changing anything else, like shadows from low to max, changes absolutely nothing.
GW2 depends far more on the CPU than the GPU. I mean, there are places in the game where you can get below 30 FPS… and there’s no one around. Try going into DR instances, for example.
i7-3630QM, Win 8.1, Radeon 7970M
Character limit/quality: medium
Shadows: off
Rest: highest (these don’t affect the framerate)
Result: 15 is the lowest in huge zergs, 20-30 is typical during fights
The main problem is that GW2 is still DX9.
DX11 was the first to introduce some kind of support for multi-core rendering. Some games (many CryEngine 3 titles) can even utilize Hyper-Threading on i7s.
DX12 will push it even further and bring even more improvement.
But we can’t expect GW2 to gain support for either.
IMHO it makes no sense for ANet not to do so, since DX11 cards have been around for so long that anyone still on pre-DX11 hardware can’t run GW2 properly anyway. Such old computers mainly serve for web browsing and watching films. Putting DX11 in the minimum requirements wouldn’t hurt the population.
If CU renders 180 players at 60 FPS, it MUST be using DX11+.
You’re wrong. The devs have said the API is not the issue; it’s the game engine and the amount of work the CPU has to do.
The only way to fix it would be to add multithreading to the game engine, so the heavy work could be spread across more than one CPU thread.
Btw, it’s pointless to talk about other games, since the amount of CPU stress they create is miles behind what GW2 can produce.
i7 5775c @ 4.1GHz – 12GB RAM @ 2400MHz – RX 480 @ 1390/2140MHz