Why is Performance Never/Rarely Addressed?
An optimized 64-bit client would indeed have advantages over an optimized 32-bit client. The problem right now is that we have neither, so it's an entirely moot point. Merely switching to 64-bit compilation does not automatically grant any benefit other than the possibility of increased RAM use, and it's been well documented that RAM use is not the biggest issue Gw2.exe is facing. Regardless of whether the client is 32- or 64-bit, ANet would have to perform essentially the same work to make it behave the way we want and expect. So asking for a 64-bit client as the first step toward performance improvement is pointless, and would only serve to drop support for the people who don't have 64-bit systems.
64-bit code doesn't buy you any significant performance improvement unless it's purely math with no I/O until the end. This isn't like the jump from 16-bit to 32-bit. The CPU cores of that era were abaci compared to today's superscalar, micro-op, out-of-order, branch-predicting cores with enormous multi-level caches and prefetching. And as soon as the code tries to access a shared resource or call a device driver in the OS, BOOM, any savings are wiped out by the relative "slowness" of that resource.
Sure, a nice CPU burn-in program that computes primes or Pi to a billion places using only the standard x86 integer and FP instructions will show a speed difference. Same with ray tracing. But real heavy-duty math is done with SSE or AVX SIMD instructions now, and those don't distinguish between 32- and 64-bit. Program logic isn't faster. Program branching isn't faster. At the end of the day, an "average" program will see little discernible performance difference between a 32- and a 64-bit build.
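To make the SIMD point concrete, here is a minimal, purely illustrative C++ sketch (nothing from the game): the vector registers this loop uses are 128 bits wide whether the code is compiled as 32-bit or 64-bit, so the math throughput is the same either way. What x64 mainly adds is more general-purpose registers and a larger address space.

```cpp
// Purely illustrative: the SSE registers this loop uses are 128 bits wide
// whether the code is built as 32-bit or 64-bit, so four floats are added
// per iteration either way. x64 mostly adds more general-purpose registers
// and a larger address space, not wider math.
#include <emmintrin.h>   // SSE2 intrinsics
#include <cstddef>

float sum_sse(const float* data, std::size_t n)
{
    __m128 acc = _mm_setzero_ps();               // 128-bit register in both modes
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_loadu_ps(data + i));

    alignas(16) float lanes[4];
    _mm_store_ps(lanes, acc);
    float total = lanes[0] + lanes[1] + lanes[2] + lanes[3];

    for (; i < n; ++i)                            // scalar tail
        total += data[i];
    return total;
}
```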
RIP City of Heroes
The performance is really bad. There is actually an easy way to improve it a lot!
Stop making mega events with hundreds of people. Don't stick to a system that never worked out. If displaying this many people is a problem, stop developing content that needs it.
It would make no difference whether I kill the mega dragon events with 40 or 400 people… I don't understand why ANet sticks to this idea so stubbornly… we don't need events this big….
Please limit events to 50 or 100 people. I hate playing a game that looks terrible. You always need to turn graphics details down to a level where I kind of want to log off…. Bad-looking characters are not a solution.
OK, I've compiled this short task list for the performance team. I know it looks oversimplified, but this is what is badly needed.
1. CPU/GPU utilization balancing – For such a graphically intensive game, the GPU sure looks underutilized and the CPU overutilized, and badly at that. Which brings us to number 2.
2. CPU core usage balancing – I keep seeing 1 or 2 CPU cores reaching 60-80% utilization at most, even though I'm running an 8-core system with 8 more logical cores from hyperthreading. We need to utilize more of the processing cores.
3. 64-bit client – 32-bit has a memory limit, as we all know. Those who have 64-bit systems need to be able to take advantage of that extra memory oomph.
4. Nvidia drivers – I have yet to see Nvidia drivers recommended for GW2. I know AMD is their "partner", but there's no reason to alienate the competition, who clearly still play the game. I have run every single beta and WHQL driver since the 306 series, and ALL of them, and I mean ALL of them, crash the game after a couple of minutes of playing before it recovers for continued play.
5. Windows 8 support – Clearly the client still has issues with the new OS that need to be addressed, specifically the crashes that happen with certain hardware combinations. It would be easy enough to include crash log reports so you have data to see which combo does what.
These are just the ones off the top of my head. I know they're oversimplified, as stated, but hey, one can't enjoy the game if one can't play it properly.
1) Basically, the thread which grinds through all the geometry to figure out what to send to the GPU is taking too long and thus starving higher-performing cards.
2) It's not a matter of core balance so much as thread balance. The OS is the one that assigns cores to threads based on which cores are available at the time. There are over 40 threads running in this game; it's just that 10 or fewer use 5% or more of a core when they run, and only three or four use over 40% of a core's available time. This tends to translate into only a few cores showing high usage and the rest not so much, simply because there really isn't enough work to keep more than three cores busy.
3) The 32-bit client is Large Address Aware, meaning it can use 4GB on a 64-bit OS. For most players it doesn't come close to this limit. For those running out of memory, I suspect it's an interaction with some other software running on those systems, perhaps whatever video driver they are using. We have no idea what ties those players together, but if out-of-memory errors were rampant, the boards would have exploded. There has to be some commonality there.
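For the curious, the "Large Address Aware" claim is something anyone can check, since the flag lives in the executable's PE header. A rough, self-contained C++/Win32 sketch using standard Windows API names (nothing game-specific):

```cpp
// Rough sketch using standard Win32 names: the "large address aware" flag is
// a bit in the PE file header, so a process can inspect its own image to see
// whether it gets the 4GB address space on a 64-bit OS.
#include <windows.h>
#include <cstdio>

int main()
{
    HMODULE self = GetModuleHandle(nullptr);                   // base address of the EXE image
    auto dos = reinterpret_cast<PIMAGE_DOS_HEADER>(self);
    auto nt  = reinterpret_cast<PIMAGE_NT_HEADERS>(
        reinterpret_cast<BYTE*>(self) + dos->e_lfanew);

    const bool laa =
        (nt->FileHeader.Characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE) != 0;
    std::printf("Large address aware: %s\n",
                laa ? "yes (up to 4GB on a 64-bit OS)" : "no (2GB limit)");
    return 0;
}
```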
RIP City of Heroes
I have Win7 64 and 4GB of RAM and have never had an "out of memory" error with GW2. In fact, GW2 never uses more than about 2.2GB of RAM for me. However, I have seen this particular error a number of times in other titles, either when the system is hitting the page file hard or when the game is trying to load a large amount of high-res textures and do high levels of AA.
I personally would suggest that those consistently getting "out of memory" errors with GW2 on a 64-bit OS and 4GB+ of RAM make sure the GPU driver is fully updated and the GPU driver settings are "application controlled". Additionally, try reducing the AA level or changing the AA method.
We have folks constantly working to improve performance; we've actually made quite a few perf updates this year. You'll see more perf updates in the upcoming releases. We have a team working entirely on performance at all times, and that won't be going away any time soon.
Apparently the only thing that won't be going away soon is the lag; it looks like it's here to stay, people. Judging by the lack of response from ANet and their so-called "Performance Team", the only impact they've had on this game is to increase the lag more and more after every patch. GG guys, you sure are a level above the rest if you can keep getting a paycheck for failing.
i5 3570K@4GHz. 4GB DDR3-1600@6-6-6-15.
NVIDIA GTX 660 Ti@950/3200.
All plugged into an ASRock Z77 Mini-ITX motherboard.
60 FPS everywhere (VSync/frame limiter turned on)…
except where there are a few thousand models to be rendered.
I then turn off shadows and reflections and I get 30-50 FPS or so.
No complaints.
Guess I can be glad I went with Intel/NVIDIA.
Sanctum of Rall since beta 3. Mesmer since 1070 AE
One of the worst offenders for me, since performance is so erratic anyway, is actually the frame limiter. Anyone know what I’m talking about?..
Okay, here, try to spot what’s wrong with this picture:
1. Random Map A.
Frame limiter disabled = 65 FPS
Frame limiter at 60 = 40 FPS
Frame limiter at 30 = 20 FPS
2. Random Map B.
Frame limiter disabled = 40 FPS
Frame limiter at 60 = 59 FPS
Frame limiter at 30 = 12 FPS
Spot the problem there?..
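For what it's worth, one plausible (purely speculative) explanation for a limiter that costs this much FPS is a naive implementation that sleeps away the leftover frame time. A hypothetical C++ sketch of that anti-pattern, with made-up numbers:

```cpp
// Hypothetical anti-pattern, not GW2's actual limiter: sleep off the leftover
// frame time with Sleep(), which by default only has ~15ms granularity.
// Example (made-up) numbers: at a 60 FPS cap the budget is 16.7ms; if the
// frame took 14ms, Sleep(2) can easily stall until the next scheduler tick,
// stretching the frame to ~29ms, i.e. roughly 35-40 FPS instead of 60.
// Limiters that spin-wait or raise the timer resolution avoid this.
#include <windows.h>

void limit_frame(double target_fps, double frame_ms_so_far)
{
    const double budget_ms = 1000.0 / target_fps;
    const double remaining = budget_ms - frame_ms_so_far;
    if (remaining > 0.0)
        Sleep(static_cast<DWORD>(remaining));   // may oversleep by a whole scheduler tick
}
```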
I'm not going to claim to be a programmer of 30 years, but I will say this: when WoW launched its 64-bit client, my FPS doubled. A 64-bit client in GW2 would mitigate the pain that is WvW. Please stop trying to convince yourselves that a 32-bit client is just as good as a 64-bit client. Even a monkey knows it's better to have 64 bananas than 32 bananas.
Too bad your comment will never be heard. A 64-bit client would go a long way toward fixing a lot of issues. So would moving from DirectX 9c only to 10 or 11. Wait a minute; we were promised GW2 would support DX10.
No, we weren't (promised DX10 or 11), and 64-bit isn't the panacea you think it is. The game can already use 4GB of memory as it is; 64-bit would just let it have loads more, and the performance improvement would be negligible. Performance issues like this are a design problem, and while refactoring the rendering pipeline is doable, it's not a quick or easy endeavor when you are making sure you aren't breaking anything or making it worse. It's obvious the game has problems when loads of players are packed into a relatively small area and/or the area is visually complex. That shows the problem is algorithmic in nature.
Also, it's easy to create lots of threads (the game spins off over 40, not counting the TP), but what's hard is doing it in a way where those threads can run efficiently in parallel without being blocked, frequently forced to sync, or left with vastly different run times; otherwise the performance improvement is nowhere near what you would think.
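As a toy illustration of that last point (nothing to do with the game's actual threads), here is a C++ sketch where eight workers all funnel through one shared lock: lots of threads, but roughly one core's worth of progress.

```cpp
// Toy example, unrelated to the game's real code: eight workers that all
// funnel through a single shared lock make barely more progress than one,
// because the "parallel" work is serialized on the mutex.
#include <mutex>
#include <thread>
#include <vector>

std::mutex g_scene_lock;     // hypothetical shared resource
long long  g_counter = 0;

void worker(int id)
{
    for (int i = 0; i < 1000000; ++i) {
        std::lock_guard<std::mutex> lock(g_scene_lock);   // every thread blocks here
        g_counter += (i ^ id) & 1;
    }
}

int main()
{
    std::vector<std::thread> threads;
    for (int t = 0; t < 8; ++t)        // 8 threads, roughly 1 core's worth of throughput
        threads.emplace_back(worker, t);
    for (auto& th : threads)
        th.join();
}
```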
Sorry, but you are not correct. We were promised DX10 support shortly after Vista came out (I believe it was back in 2007). Gaile Gray was the one who publicly announced it during an in-game chat. At the time she was very pleased that she was allowed to announce it.
What I understood from this topic: monkeys can count to 64, and dead three legged dogs run very slowly.
Sorry, but you are not correct. We were promised DX10 support shortly after Vista came out (I believe it was back in 2007). Gaile Gray was the one who publicly announced it during an in-game chat. At the time she was very pleased that she was allowed to announce it.
I'm guessing there's no print record of that from ANet, or from any publication or website covering it. I'm guessing that when gamers rejected Vista en masse, plans changed. Or maybe it was those first few games that added DX10 support but traded looks for frame rate (Lost Planet) that gave the engine team second thoughts.
RIP City of Heroes
What I understood from this topic: monkeys can count to 64, and dead three legged dogs run very slowly.
I thought it was: if monkeys were given the choice between 32 bananas and 64 bananas, they would prefer 64, because monkeys know that more is better.
“…let us eat and drink, for tomorrow we shall die.”
This game runs worse than modded Skyrim with 30+ mods (texture/lighting mods mostly), and I'm not even talking about zerg fights. Go into any level 25-70 zone (you know, the empty ones), set your graphics to maximum with reflections set to All, and see what happens. The only game I can think of that was as poorly optimized was Crysis 1 back in '07, but at least that game was a hog in the graphics department; GW2 is two generations behind.
The best performance increase I have had from changing ANY of the graphics settings has been setting "character model limit" to lowest.
It has really made a big difference for me in these zerg-like times.
Yeah, that's the one that makes the most difference, then reflections, then shadows and shaders; those are the ones with the biggest impact. Playing the game with shaders set to anything but High is fugly, though.
Optimization and performance become more important to me each day, because they removed culling and added more heavily populated events, but never really released a big performance upgrade. So my system effectively gets weaker each day because of the new features.
I'm really looking for a benchmark or a new beta client that includes performance and optimization fixes, so I can test it and give feedback. I volunteer for this job: just release a beta client or benchmark for testing new optimization fixes. Let us help you and give you feedback, ArenaNet.
Solution: limit world transfers to once a month….
I am a little disappointed that there were no performance improvements in the last update, considering that it's about a world event with 80+ players.
I hope that we will get what we were promised in the next update.
I got some performance monitoring programs to check what my computer is doing when GW2 is running.
In the Black Citadel I get 45 FPS (refresh set to 60Hz), while my GPU load is only at 70%, CPU at 40%, and RAM also at 40%.
One of the worst offenders for me, since performance is so erratic anyway, is actually the frame limiter. Anyone know what I’m talking about?..
Okay, here, try to spot what’s wrong with this picture:
1. Random Map A.
Frame limiter disabled = 65 FPS
Frame limiter at 60 = 40 FPS
Frame limiter at 30 = 20 FPS
2. Random Map B.
Frame limiter disabled = 40 FPS
Frame limiter at 60 = 59 FPS
Frame limiter at 30 = 12 FPS
Spot the problem there?..
They also have a memory leak; it's cropped back up again on my machine. I'm using 32-bit Windows and they are blaming that. But guess what: this never happens in much more memory-hungry games like STO or NWO. I never get memory error crashes in those titles, and I have their graphics turned all the way up; my frame rates are much, much better than in GW2. GW2's worlds do look better, but the space phenomena, the weapon effects, and the shininess and patterns on the ships' hulls, both in and out of battle, make for a much more demanding, memory-hungry experience in those games than in GW2. So I have to ask: why has this not been fixed? Why is this bug back? The only other time I experienced random memory error crashes was just after this game launched.
Still no optimizations? Weren't there supposed to be some "in the next updates"?
Some unforeseen problems?
Depends on where they made the changes for the performance problems. If it was something server-side they were able to change, it wouldn't show up in the patch notes. It's also possible that performance was improved in the patch but wasn't specifically mentioned in the notes. Personally I saw a small increase after the patch, but I need to play some more to be sure there really was one.
I remember one time when performance optimizations were in the patch notes.
It addressed a very specific issue, but it was highlighted.
In this thread we were told that the next optimizations would address performance issues with large groups of players/creatures and with combat in general (which is exactly what I have always had issues with).
That would be a big improvement that many of us have desired, so I cannot believe they would not mention it in the patch notes, although it is possible.
That said, I did not notice any improvements.
After a long silence concerning performance, this thread got my hopes up,
only to seemingly fall back into silence again after rather detailed information about upcoming optimizations.
Well, I seem to have lost 10 FPS on average this patch, so I guess ANet is doing something.
Oh, also the frame limiter is back to being broken after being fixed up in the last patch. So that’s great too.
Well. Back to playing something else.
We have folks constantly working to improve performance; we've actually made quite a few perf updates this year. You'll see more perf updates in the upcoming releases. We have a team working entirely on performance at all times, and that won't be going away any time soon.
I truly wish that this was a lie.
The game's current performance, or better put its waste of resources, is a disgrace both FPS-wise and especially lag-wise. Hardware that runs every other game at decent frame rates struggles as soon as there are more than a handful of players around or someone shoots a firework into the air. Slight variations in internet connection speed impact the client/server communication so drastically that the game becomes unplayable; it simply loses track of your current position. (Faced a node and could not gather? Rode the lightning into the evening sky after switching targets? Found yourself "somewhere" after using a leap skill? Crossed a combo field placed in front of you without ever touching it on the run?)
It seems that during development, scenarios other than the "perfect world" (a single top-end computer connected at 1GB/s to a private server) were largely ignored. But by now, more than 12 months after launch and with teams working on performance improvements, there should be at least some progress… but no, there isn't.
This will never change; one year later GW2 is still the least optimized game on my computer.
I can run Far Cry 3 at all max settings at 16:10, 1680 resolution, and it's smooth sailing at a constant 60+ FPS.
In GW2, however, I get FPS drops as low as 25 in huge WvW zergs….
To change this they would have to REDESIGN the whole game and engine, and that's like making a new game; they simply don't have the money for it.
It's badly optimized and badly polished, and it will always work like this. My advice: buy an Alienware computer; you can run anything on those things no matter how sloppy the people working on the game were.
I haven't noticed any choppiness in world boss fights lately, and I used to get choppiness all the time. So it looks to me like they did improve performance when there is a huge zerg around.
I am bumping this since I am wondering if the devs can give us an update on performance patches.
intel 335 180gb/intel 320 160gb WD 3TB Gigabyte GTX G1 970 XFX XXX750W HAF 932
Don't expect double FPS overnight. Also, the changes I mentioned will mostly only be noticeable in combat or in large groups of creatures/players.
-Bill
If you double the FPS in 2 months I will buy $1000 worth of gems…I am not kidding…
Guess I won't get a big Visa bill this month.
intel 335 180gb/intel 320 160gb WD 3TB Gigabyte GTX G1 970 XFX XXX750W HAF 932
I am bumping this since I am wondering if the devs can give us an update on performance patches.
There was a post by one of the devs 1-2 days ago. He said they have some more fixes that will be implemented starting as soon as the 16th and continuing to the end of the year. He also reminded us how dangerous it is to make live server changes to a live game, so they try to get everything absolutely perfect ;D
He also reminded us how dangerous it is to make live server changes to a live game so they try to get everything absolutely perfect ;D
LOL … just LOL
Stormbluff Isle [AoD]
BillFreist
Gameplay Programmer
Hello again!
I disappeared into my Bat-Cave again and finally emerged, and with some good news to share. I’ll start off with what exactly that is.
Since my last update, we've been hard at work prepping some serious server-side optimizations to relieve the bottlenecks during heavy combat. We've made some large steps and are almost ready to start rolling out these changes. The first batch of changes has been in testing, and we hope to have them start trickling in as soon as the release on Nov. 12th. We'll know for sure once we're closer to that date, and so will you.
I can't stress enough how dangerous it is to optimize a live game. I know how upsetting it can be to be in the thick of a choppy battle, but things are going to get better, and soon. We've done some temporary things on the back end to try to ease the influx of players in WvW, but those changes only go so far.
On to answering some of the common questions.
A lot of you have noticed what seems like an increase in skill lag in WvW since the beginning of Season 1. Really, this is just a large influx of players playing WvW, mostly on servers that usually don't have queues to get into the map. A lot of the higher-tiered servers are pretty used to large battles running into this issue, but obviously that isn't a valid excuse for it happening. We've only increased our focus on relieving this since the season started, and rest assured, it's a top priority.
Another common suggestion/question is about a method other games have used, known as "time dilation" or "time scaling". Well, for starters, this method is extremely risky. We've discussed it, among quite a few other alternatives, and it boiled down to causing more problems than it would solve. I know you might ask, "but the current experience is bad enough, how could it be worse?" Well, to be completely honest, it would just open another can of worms that would end up breaking the game and causing things much worse than a couple of seconds of input delay. We opted instead to focus on fixing the issue by simply making the game run better, instead of sweeping it under the rug by watering down the experience. Internally we have the ability to slow down the time scale of the game, and it just feels terrible, not to mention it breaks key mechanics of the game, such as the physics simulation.
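To make the term concrete: "time dilation" in this sense generally means scaling the simulation's delta time when the server falls behind, so everything that integrates over dt (movement, physics, cooldowns) slows down with it. A minimal, purely illustrative C++ sketch with an invented load metric, not a description of any particular engine:

```cpp
// Purely illustrative, with an invented load metric: scale the simulation's
// delta time down when the server is overloaded. Everything driven by dt
// (movement, physics, cooldowns) slows with it, which is why it feels bad
// and clashes with physics tuned for a fixed step.
struct Clock {
    double real_dt = 1.0 / 30.0;   // wall-clock time per tick, e.g. a 30Hz server
    double scale   = 1.0;          // 1.0 = normal speed
};

double simulation_dt(const Clock& c)
{
    return c.real_dt * c.scale;    // scale = 0.5 -> the world advances at half speed
}

void tick(Clock& c, double server_load)   // server_load in [0, 1], hypothetical metric
{
    c.scale = (server_load > 0.9) ? 0.5 : 1.0;   // crude dilation policy for illustration
    const double dt = simulation_dt(c);
    // integrate_physics(dt); update_skills(dt); ...  (omitted)
    (void)dt;
}
```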
Some other assumptions I'm seeing are about our server hardware and inefficient communication between it and the game's database. I'll go ahead and put these assumptions to rest. Our server hardware was actually purchased new right around the launch of the game; rest assured it's not outdated. And you can sleep at night knowing that our combat code doesn't reach out to the database for information: all of the information it needs is already loaded into memory. The game database is simply for storing persistent character and account information, which is for the most part only accessed when loading in or out of a map or during periodic saves, which are handled asynchronously. I suggest checking out the links below for more information on our servers.
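The asynchronous-save pattern being described is a common one. A rough, generic C++ sketch of it (assumed structure, not ANet's implementation): gameplay code only pushes a snapshot onto a queue and returns immediately, while a background thread does the slow database write.

```cpp
// Generic async-save pattern (assumed, not ANet's code): gameplay pushes a
// serialized snapshot onto a queue and returns immediately; a background
// thread drains the queue and does the slow database write, so combat never
// blocks on I/O.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <string>

std::queue<std::string>  g_save_queue;   // serialized character snapshots
std::mutex               g_mtx;
std::condition_variable  g_cv;

void persistence_thread()                // launched once, e.g. std::thread(persistence_thread)
{
    for (;;) {
        std::unique_lock<std::mutex> lock(g_mtx);
        g_cv.wait(lock, [] { return !g_save_queue.empty(); });
        std::string snapshot = std::move(g_save_queue.front());
        g_save_queue.pop();
        lock.unlock();
        // write_to_database(snapshot);  // hypothetical slow I/O, off the game thread
    }
}

void queue_save(std::string snapshot)    // called from gameplay code, returns immediately
{
    {
        std::lock_guard<std::mutex> lock(g_mtx);
        g_save_queue.push(std::move(snapshot));
    }
    g_cv.notify_one();
}
```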
As far as skills just not executing (I've noticed some people claiming utilities are more susceptible to this), it's mostly just a race condition in processing on the server. You'll notice that your auto-attack skill seems to process more reliably than other skills. This is mostly due to the fact that we process things like auto-attack timers before player input. Obviously that sounds like a bug (and honestly, now that I think about it, I want to look into doing something about it), but the reality is that under normal circumstances the player input would be processed before the auto-attack timer triggers. Something you can try to verify this: disable your auto-attack and see if your other skills become more responsive.
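A toy C++ sketch of that ordering (all names invented for illustration, not the game's code): if the tick drains timers before player input, an auto-attack whose timer fires this tick wins the race against a skill press that arrived during the same tick.

```cpp
// Toy model of the tick ordering described above (all names invented):
// timers are processed before player input, so under load an auto-attack
// whose timer expires this tick resolves before a skill press that arrived
// during the same tick.
#include <cstdio>

struct Player {
    double auto_attack_timer = 0.0;
    bool   skill_pressed     = false;
};

void process_auto_attack_timers(Player& p, double dt)
{
    p.auto_attack_timer -= dt;
    if (p.auto_attack_timer <= 0.0) {
        std::puts("auto-attack fires");
        p.auto_attack_timer = 1.0;        // hypothetical 1-second swing timer
    }
}

void process_player_input(Player& p)
{
    if (p.skill_pressed) {
        std::puts("skill executes");
        p.skill_pressed = false;
    }
}

void server_tick(Player& p, double dt)
{
    process_auto_attack_timers(p, dt);    // handled first -> looks more reliable under load
    process_player_input(p);              // queued presses resolve after the timers
}
```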
For you tech-savvy folks, here are some good links to help you understand our server infrastructure and some other pretty neat things: Link – Another Link!
Well, that’s it for today. I’ll be sure to update you all on our progress and when you can start looking for major improvements. Like I mentioned above, we’re pretty optimistic that this could be as early as our next release. Have a good weekend!
intel 335 180gb/intel 320 160gb WD 3TB Gigabyte GTX G1 970 XFX XXX750W HAF 932
There are two things that determine an MMO's success, in my opinion: profession balance and game performance. It is through the player classes and smooth frame rates that we experience the game and the content it has to offer. With that said, 8 FPS in a zerg and 15 FPS in Lion's Arch just ain't gonna cut it.
When you looked at the computer requirements, did you not see that it says at least a quad-core? That is the minimum requirement. As for performance issues, I have very little lag and I am running roughly a mid-range computer. There are a lot of quad-cores selling really cheap right now; consider an upgrade, or turn down your settings. If you want to PvP, there are the 5v5 games, and if you know lag is a major issue for you in WvW, don't go there. This game is designed to last for about 10 years; think about how much faster computers will be in ten years, then tell them there is a problem with performance. If you are running an AMD dual-core, your computer is probably 2-3 years old. Not that age should matter when talking about computers, right?
What type of setup does one need to run GW2 at 60 FPS or better in Lion's Arch and in large WvWvW populations?
What type of setup does one need to avoid frame drops when panning the camera?
What MMORPG is a good representation of how a game with densely populated areas is supposed to run?
When developers provide recommended system requirements, does that mean they expect the game to run at 60 FPS at all times?
Are 20 FPS dips in performance acceptable and standard for MMORPG devs, or do they expect their games to run at a more stable FPS?
Here are my computer's specs:
Intel i7 2600K @ 4.2GHz (overclocked)
16GB G.Skill Sniper Series RAM
EVGA GTX 760 video card
Win 8.1 on an SSD
GW2 on its own drive (by itself)
Broadband is FiOS 50/35
My experience with performance in GW2 is good overall. I have all settings maxed and get the following results:
In any PvE map I average 100-120 FPS
Lion's Arch at peak is about 50 FPS
WvW in the zerg averages 50-70 FPS
WvW large-scale battles (200 or more players): 20 FPS and MASSIVE SKILL LAG!!!
One thing to point out is that I have had a much better experience with Nvidia graphics cards than with AMD in most games. I also believe an Intel processor works better with GW2 than an AMD processor (not my opinion, it's just what I've read). Hope this helps!
ANet needs to follow in SOE's footsteps. They have begun to rework the Planetside 2 engine into a real multi-threaded engine, and the results show, especially for AMD users, who were the primary target since the PlayStation 4 is an AMD 8-core machine.
Going from 40 FPS with my 8350 to 120+ on the Planetside 2 PTS is jaw-dropping. It shows one thing: AMD CPUs are more than capable of delivering performance; programmers just have to endure the daunting task of rewriting their engines to take advantage of them.
I expect that if ANet made GW2 a real multi-threaded engine, performance would be very good. Those who average 30-40 FPS would probably average around double that. It would solve a lot of issues: WvW FPS issues, world event FPS issues, etc. But it will take ANet actually being willing to put the resources into doing it.
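What "a real multi-threaded engine" could mean in practice (one hedged reading, not a description of either game's engine): per-frame work is cut into independent jobs and spread over a worker pool sized to the machine, instead of living on one heavy main thread. A minimal C++ sketch with placeholder, invented work:

```cpp
// One hedged reading of "a real multi-threaded engine": per-frame work split
// into independent jobs across a pool sized to the machine. Minimal sketch
// with placeholder per-entity work; names are invented for illustration.
#include <algorithm>
#include <future>
#include <thread>
#include <vector>

void simulate_entity_range(int begin, int end)   // placeholder: animate, cull, script...
{
    for (int i = begin; i < end; ++i) { /* per-entity work */ }
}

void run_frame(int entity_count)
{
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const int chunk = (entity_count + static_cast<int>(workers) - 1) / static_cast<int>(workers);

    std::vector<std::future<void>> jobs;
    for (int begin = 0; begin < entity_count; begin += chunk)
        jobs.push_back(std::async(std::launch::async, simulate_entity_range,
                                  begin, std::min(begin + chunk, entity_count)));

    for (auto& j : jobs)
        j.get();   // wait for this frame's jobs before handing results to the renderer
}
```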
The future of gaming will probably be AMD 8-core, since that is the CPU present in both the PS4 and the Xbox One.
Windows 10