Why is Performance Never/Rarely Addressed?
Your benchmark links are not related to GW2 and they only focus on frame rates. The game can be slowed for reasons other than frame rendering performance, especially with large groups of players in a single area, and these would not show up on synthetic tests like the ones you provided.
1st) The benchmarks also show minimum frame rates, which would be much lower (around 1 FPS) if the game ran out of memory.
2nd) I’ve checked GW2 with 4GB and 8GB of RAM and saw no difference. Having a faster HDD or SSD reduces load times considerably, though.
3rd) If you read the article in the link you will see that they aren’t only using synthetic benchmarks (the BF3 multiplayer run is a good indication of this, since there are no canned benchmarks for it).
Test System: Intel Core i5-4670K, Asus Z87 Gryphon, Sapphire Radeon HD 7870 2GB, Corsair 2×4GB or 1×4GB DDR3-1600, Windows 8
We ran six benchmarks, all at a resolution of 1920×1080, and all averaging three 60-second runs through the actual gaming world, not built-in benchmarks. We used either one or two sticks of Corsair DDR3 RAM running at DDR3-1600 (9-9-9-24) settings. Thus, the 4GB system not only has a memory deficit, it was also running in single-channel mode (resulting from the use of just one stick of RAM). These two factors should have at least had some effect, right? Not so, according to our data.
Here’s what we tested, followed by the results:
(1) Deus Ex: Human Revolution (Hengsha Landing Pad) – Maximum Settings/FXAA
(2) Battlefield 3 (Swordbreaker Single-Player) – Ultra Settings/4xAA
(3) Battlefield 3 (Caspian Border Multi-Player) – Ultra Settings/4xAA
(4) Far Cry 3 (Village) – Ultra Settings/4xAA
(5) Hitman: Absolution (Chinatown) – Ultra Settings/4xAA
(6) Tomb Raider (Village) – Ultimate Settings/FXAA
And the synthetic benchmarks just mirrored the real-world ones.
Note that the 4GB configuration was just 1 stick meaning it was running single channel vs dual channel.
(edited by Swoo.5079)
The GW2 client is subject to a “2GB limit” as 32-bit applications cannot address more than 4 GB…
Well, yes. But the game STILL doesn’t use more than 2GB of shared memory. Pretty much no game does.
These operations write to and read from memory, and in doing so can bump a game file out to swap even if the operation itself didn’t need a lot of memory. If the game needs it, this will manifest itself as lag but it’s not “dropped frames” or lag that occurs due to an underpowered GPU.
Excessive swapping is simply not the problem for GW2. It’s extremely obvious to any user when it happens – it will occasionally on any system – but it doesn’t manifest in lower FPS, but large lag spikes.
Streaming just means that you see stuff on your screen before the entire scene is loaded.
Actually I was referring to streaming assets in and out of memory, which is what modern developers use to avoid the swapping mentioned above. That ties into the point about anything more than 4GB being pointless, because (competent) streaming of assets from HDD to RAM mitigates it anyway.
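As an aside, “streaming” in this sense usually just means keeping a bounded working set of assets resident and evicting the least-recently-used ones. A toy sketch of the idea, assuming a hypothetical `load_from_disk` callback and cache size (this is an illustration, not how GW2 is actually implemented):

```python
from collections import OrderedDict

class AssetCache:
    """Toy LRU asset cache: holds at most `capacity` assets in memory,
    evicting the least-recently-used one when a new asset streams in."""
    def __init__(self, capacity, load_from_disk):
        self.capacity = capacity
        self.load_from_disk = load_from_disk  # slow-path loader (hypothetical)
        self._cache = OrderedDict()

    def get(self, asset_id):
        if asset_id in self._cache:
            self._cache.move_to_end(asset_id)    # mark as recently used
            return self._cache[asset_id]
        data = self.load_from_disk(asset_id)     # cache miss: hit the disk
        self._cache[asset_id] = data
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)      # evict least-recently-used
        return data
```

With a scheme like this, RAM beyond the working set simply goes unused, which is the point being made above.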
No, it’s used for quite a bit more than that. The operation of the program relies on RAM, and the underlying OS has many processes running in the background that require a share of the RAM and can cause sluggish performance if you don’t have enough. VRAM is used by the GPU for the textures and related data for rendering the scene, and it is not connected to or shared with system memory.
Well, yeah. Just working on the assumptions that lots of background processes slowing down the game isn’t really the issue for most people suffering from poor performance, even though it would technically be a factor. I think any competent gamer knows how to run a “clean” system.
Obviously if you have a ton of kittens on your PC it will impact the game. But you’d have to have a LOT and a LOT and a LOT of stuff running against GW2 for it to become an issue even on a 4GB system.
More outdated info. The bare minimum anyone should be considering is 8 GB, but I maintain my recommendation for 16 GB if you want butter-smooth performance. You’re apparently claiming that 4GB is enough and yet you also seem to believe that GW2 doesn’t run as well as it should on your system.
That’s because you’re confusing how games behave when they run out of memory: it doesn’t cause sluggish performance or low FPS, it causes huge and obvious lag spikes (not to mention HDD noise) anytime the game needs to load a big chunk of data from the disc to continue.
In fact, I work on graphics design and I run cutting-edge AAA games on my rig without a single hint of swapping, ever. My RAM size?
2GB.
And hey, I did say that GW2 runs smooth for me …
The hallmark of any poorly-optimized game is erratic performance. GW2’s biggest problems stem from CPU usage issues, which affect low-tier and extremely high-end CPUs alike.
The bottlenecks seem to be about how and, more importantly, why mundane calculations are delegated to the CPU. The most common drain on performance is simply the number of actors – NPCs, moving objects, players, weapons – in your general vicinity, even if the game displays nothing more than nameplates.
The memory usage doesn’t even begin to enter the picture of performance issues, even if it’s poorly-handled. It’s simply that everyone – I do mean everyone – has more than enough.
(edited by Draco.2806)
Since I’m assuming all those games are 32-bit clients, if Large Address Aware then they can only use up to 4GB of memory, and if not LAA then only 2GB. Doesn’t mean they got anywhere close to those limits during the tests either, so I’m not surprised the test was a wash.
RIP City of Heroes
Since I’m assuming all those games are 32-bit clients, if Large Address Aware then they can only use up to 4GB of memory, and if not LAA then only 2GB. Doesn’t mean they got anywhere close to those limits during the tests either, so I’m not surprised the test was a wash.
FarCry 3 has a 64-bit client.
On the other hand, GW2 is 32-bit, so…
8GB RAM is atm more than enough to play games and even has a decent reserve.
(edited by Swoo.5079)
You didn’t say if you used it or not (FarCry 3 64-bit client).
But my point still stands. Unless the game being tested + 64-bit OS is actively needing more than 4GB then you won’t see a performance shift from 4GB to 8GB.
All that more system memory buys you is the ability to run more 32-bit apps at the same time without the apps fighting over who gets system memory.
RIP City of Heroes
I thought this thread was a joke, or that you were going to get railed, but I’m surprised to see people with similar issues.
I have a Radeon 6970 running at 1920×1080, a Core i5 2500, 6GB of RAM, and the game installed on an SSD, and it runs amazingly. This is a very mid-range system, if not below mid-range. I run great in WvW and never have any issues anywhere. I’m constantly amazed at how smooth this engine is, how low the system requirements are, how well it runs with massive numbers of people on screen, etc. It’s seriously the most impressive MMO engine I’ve seen to date.
Don’t expect double FPS overnight . Also, the changes I mentioned will mostly only be noticeable in combat or in large groups of creatures/players.
Thank you for the update. What about load times?
Load times are load times. The only way you’re going to improve load times is a faster hard drive. The only exception would be if they were to use more heavily compressed textures, or lower-resolution textures. I seriously doubt they will do this, nor would I want to see them do it.
SSDs are getting cheap, I would recommend making the investment soon if you haven’t. I play on my workstation at work with a hybrid drive, and it does OK, but still takes some time to load upon zoning. On my SSD at home, I never see texture loads.
What I notice a lot of is when one temp content is added, the previous is removed, and while the new is being farmed or what not WvW suffers insane lag, then 2-4 weeks later a new patch, the previous patch is removed and something new is added.
What I want to know is, in the future, will this change? Will adding more area to the map make this worse? Is this why there is no expansion in the distant future, because the game can’t handle one?
Välkyri – 80 Warrior
JQ[Lulz] – Kill fur Thrillz…
Here’s a little performance guide based on my experience with different setups, maybe it will help some of you.
Main PC: AMD Phenom II X6 3.5GHz, 8GB RAM, Radeon 6870 1GB, 7200Rpm HDD
Things I don’t list don’t affect performance much; those are at maximum:
1920×1080 fullscreen windowed – for easier alt tabbing
Antialiasing OFF – I personally don’t like the blur it applies; minimal performance hit
Reflections NONE – barely noticeable, big performance hit
Render Sampling – NATIVE
Shadows MEDIUM – if you have a weaker video card, set to low or none
Shaders HIGH – makes the game very pretty, heavy hit on the graphics card, lower if you have problems
And now the biggest hit on CPU, which most of you experience in zergs and wvw:
Character Model Limit, Character Model Quality – try switching them around in zergs and note the performance. Example: Dwayna today with 60+ people; with both settings at medium I get 15 FPS, with both at lowest I get a relatively smooth 25+ FPS. I keep it at medium everywhere except zergs; can’t expect more from an AMD CPU.
Midrange laptop: i5-3230M Ivy Bridge at 2.6GHz, 4GB RAM, GeForce 650M, 5400 RPM HDD (makes loading times much longer than the 7200 RPM drive)
Things I don’t list don’t affect performance much but I keep them at medium:
1920×1080 fullscreen windowed – for easier alt tabbing
Antialiasing OFF – same as above
Reflections NONE – seriously, avoid this
Render Sampling – NATIVE
Shadows LOW – can’t handle the dynamic shadows medium settings apply
Shaders MEDIUM – works pretty well, sometimes it’s a good idea to set to low but that makes the game ugly
Character Model Limit and Quality at Low, Lowest in zergs.
I thought this thread was a joke, or that you were going to get railed, but I’m surprised to see people with similar issues.
I have a Radeon 6970 running at 1920×1080, a core i5 2500, 6gb of RAM, and the game installed on an SSD, and it runs amazing.
That’s the kind of fluffy kittens this is all about. I have a rig with a GPU slightly meatier than yours (650Ti) and the game runs like a pack of skritt. Then there are tons of people with overclocked i7’s and they report even worse things.
After running a performance test at max settings I was able to reach 85% CPU usage and 2010MB of RAM, which is just below 2GB. A 64-bit client would allow the usage of up to 128GB of RAM, and of course a 64-bit CPU has twice the computing power of a 32-bit CPU. Instead of sinking all of your resources into trying to pump out more content, maybe you should come out with a 64-bit client.
This is just plain wrong……
Wrong in what aspect?
First, a 32-bit program is in fact able to be allocated more than 2GB of memory with a certain flag turned on. It should then be able to get 4GB instead of 2.
Second, a 64-bit client isn’t necessarily faster just because it’s 64-bit. See M$ Office 64-bit. Optimization is still the most important factor. Of course, after some good optimization, it wouldn’t be surprising for it to outperform the 32-bit client. But I don’t believe they have fully optimized the current client yet. Better to focus on the 32-bit client so everyone can benefit, rather than just a portion.
The main reason for a 64-bit program is still memory. For software that takes TONS of RAM, like Photoshop, it’s a whole different experience. For GW2, 4GB should be more than enough.
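The flag being referred to is the IMAGE_FILE_LARGE_ADDRESS_AWARE bit in the executable’s COFF header. A rough sketch of checking it, assuming a well-formed Windows PE image passed in as bytes (offsets per the PE/COFF layout):

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # COFF Characteristics flag

def is_large_address_aware(pe_bytes):
    """Return True if the PE image has the LAA bit set in its COFF header."""
    # e_lfanew (offset of the "PE\0\0" signature) lives at 0x3C in the DOS header
    (e_lfanew,) = struct.unpack_from("<I", pe_bytes, 0x3C)
    assert pe_bytes[e_lfanew:e_lfanew + 4] == b"PE\x00\x00", "not a PE image"
    # COFF header follows the 4-byte signature; Characteristics is at offset +18
    (characteristics,) = struct.unpack_from("<H", pe_bytes, e_lfanew + 4 + 18)
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
```

Without this bit, a 32-bit process gets the default 2GB user address space; with it, up to 4GB on a 64-bit OS, which is the distinction being argued over in this thread.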
I thought this thread was a joke, or that you were going to get railed, but I’m surprised to see people with similar issues.
I have a Radeon 6970 running at 1920×1080, a core i5 2500, 6gb of RAM, and the game installed on an SSD, and it runs amazing.
That’s the kind of fluffy kittens this is all about. I have a rig with a GPU slightly meatier than yours (650Ti) and the game runs like a pack of skritt. Then there are tons of people with overclocked i7’s and they report even worse things.
Interesting, makes me not want to touch my hardware setup in fear of messing up the equilibrium!
The game’s 32-bit client is large address aware (ECHO … Echo … echo) and on 64-bit Vista/Win 7/Win 8 it can use up to 4GB.
Good point, and also a reason to not hamstring yourself by buying into the erroneous belief that 4GB of memory is “more than enough”.
Processor is definitely not the bottleneck. I’m running a very similar setup, Phenom II X4 Black edition 3.4 Ghz with an EVGA Nvidia 550 ti OC and 8 GB of G.Skill ram and a Biostar TA970 MB and I run the game on max settings with almost no problems.
CPU performance bottlenecks the GPU at higher resolutions, like 1920×1080 and up. If you’re running on a 550TI then you’re probably not pushing a high resolution…and if you enable full screen anti-aliasing you’ll need a faster CPU as well as more onboard memory on your GPU to sustain decent frame rates.
1st) The benchmarks also show minimum frame rates, which would be much lower (around 1 FPS) if the game ran out of memory.
First of all, FPS benchmarks are a very general overview of what’s going on. They show the rendering performance by monitoring the hand off from the game engine to the system’s output (directx), which would not include system lag. This is easily evident in benchmarking a dual GPU setup that has stuttering issues (a very common issue with multi-gpu systems). The benchmarks will have you believing all is well as the stutters, which can last 1-2 seconds or more, do not affect the frame rate figures – the frames are counted BEFORE the system lag is encountered.
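To illustrate the point: two runs can report an identical average FPS while one of them stutters badly, because a handful of huge frame times barely move the average. A quick sketch with made-up frame times:

```python
def frame_stats(frame_times_ms):
    """Average FPS plus worst-case frame time for a run."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    return round(avg_fps, 1), max(frame_times_ms)

smooth   = [17.0] * 100             # steady frame times, no stutter
stuttery = [12.0] * 99 + [512.0]    # same average, one half-second spike

# Both runs report the same average FPS; only the max frame time
# reveals the half-second freeze in the second run.
print(frame_stats(smooth))
print(frame_stats(stuttery))
```

This is why minimum-FPS or frame-time percentile data matters more than the averages both sides of this argument keep citing.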
2nd) I’ve checked GW2 with 4GB and 8GB of RAM and saw no difference. Having a faster HDD or SSD reduces load times considerably, though.
Total BS. You didn’t check anything, and even if you did, your anecdotal commentary holds no water. The GW2 client can in fact use up to 4GB of physical memory. If your total memory is 4GB then you’re not maximizing performance. I highly doubt you play the game without experiencing lag if you only have 4GB. A 4GB Windows 7 system is sluggish in general, even with a good SSD.
3rd) If you read the article in the link you will see that they aren’t only using synthetic benchmarks (the BF3 multiplayer run is a good indication of this, since there are no canned benchmarks for it).
The fact that you’re citing unrelated articles to make your case for you shows that you do not know what you’re talking about on this topic. I don’t know why you are so hell-bent on misleading people into believing that there is no benefit to having 8GB of system memory or more when there clearly is.
And the synthetic benchmarks just mirrored the real world one.
Note that the 4GB configuration was just 1 stick meaning it was running single channel vs dual channel.
None of those games you listed are MMO-type games – they are all first-person games designed for one player or a small number of players. Their engines are optimized for speed rather than maximum aesthetics, and furthermore you’re still relying on FPS numbers alone to perpetuate the “4GB is enough” myth.
If you’re too cheap to buy more than 4GB then that’s fine, but GW2 will use up to 4GB if your system has more than 4GB of physical memory and it will run better – this is a FACT not some weak attempt at drawing parallels between GW2 and frame rate benchmarks of other games.
You didn’t say if you used it or not (FarCry 3 64-bit client).
But my point still stands. Unless the game being tested + 64-bit OS is actively needing more than 4GB then you won’t see a performance shift from 4GB to 8GB.
All that more system memory buys you is the ability to run more 32-bit apps at the same time without the apps fighting over who gets system memory.
The game doesn’t need to statically occupy all of its available physical memory for additional system memory to provide benefits. It’s not just about being able to multi-task more effectively.
Windows has a good caching system that will utilize available memory, and any time something has to be loaded from the disk, it’s better if it loads from the memory cache rather than the disk.
The specific instances where GW2 bogs can be alleviated by having at least 8 GB of memory, a decent Intel CPU and a quick graphics card. Any issues beyond that would require software optimization from ANet.
Guild Wars 2 needs RAM like WoW or Everquest need RAM. That is to say, it doesn’t. Not on modern machines.
There’s never enough variety in assets on-screen – unless you’re looking at a hundred-player zerg in WvW – to justify occupying more than it actually does, which is 2GB of SHARED memory at most (“High” textures), as mentioned earlier.
Windows can’t do anything to prefetch it either – GW2 uses a single 15GB file, which never, ever belongs in your RAM. There’s going to be swapping one way or another, and often.
The persisting performance issues have never been linked to RAM, so adding more RAM won’t do anything to alleviate them.
Similarly, having more than 1GB of VRAM on your GPU will not accomplish anything unless you’re running a multi-monitor setup with “High” textures and shadows, and supersampling instead of FXAA. There’s just not enough stuff that would need it, outside of the aforementioned dense groups of players.
All persistent problems with performance are ultimately tied to the CPU clock speed and instruction sets (and incredibly outdated shadow implementation on the side of GPU).
Basically, this means the game is plain old poorly optimized.
(edited by Draco.2806)
You didn’t say if you used it or not (FarCry 3 64-bit client).
But my point still stands. Unless the game being tested + 64-bit OS is actively needing more than 4GB then you won’t see a performance shift from 4GB to 8GB.
All that more system memory buys you is the ability to run more 32-bit apps at the same time without the apps fighting over who gets system memory.
The game doesn’t need to statically occupy all of its available physical memory for additional system memory to provide benefits. It’s not just about being able to multi-task more effectively.
Windows has a good caching system that will utilize available memory, and any time something has to be loaded from the disk, it’s better if it loads from the memory cache rather than the disk.
The specific instances where GW2 bogs can be alleviated by having at least 8 GB of memory, a decent Intel CPU and a quick graphics card. Any issues beyond that would require software optimization from ANet.
You may have misunderstood me. I’m not saying that you won’t see difference in performance between 4 and 8 GB with GW2, just the games Swoo tested.
RIP City of Heroes
Guild Wars 2 needs RAM like WoW or Everquest need RAM. That is to say, it doesn’t. Not on modern machines.
There’s never enough variety in assets on-screen – unless you’re looking at a hundred-player zerg in WvW – to justify occupying more than it actually does, which is 2GB of SHARED memory at most (“High” textures), as mentioned earlier.
Well, you’re speaking in absolutes by using words like “never”. The situations where having extra memory would be beneficial are the situations that would cause a fair amount of people to rage about “crappy performance” even if the game maintains decent frame rates the majority of the time.
It has already been explained that the GW2 client has access to 4GB on a 64-bit Windows system, and that the 2GB figure is a limitation of 32-bit applications: the default split of 2GB for the process and 2GB for the kernel within the 4GB 32-bit address space.
Windows can’t do anything to prefetch it either – GW2 uses a single 15GB file, which never, ever belongs in your RAM. There’s going to be swapping one way or another, and often.
Right, but Windows uses segment caching and not file caching, meaning that it stores frequently accessed file segments in memory. Segments do not necessarily have to be contiguous files, they are often bits and pieces of large files. Windows does not need to store the entire 15 GB GW2 data file in memory for its caching mechanisms to benefit the system.
The persisting performance issues have never been linked to RAM, so using more RAM won’t do anything to alleviate them.
What fact(s) are you basing this statement on? We KNOW that the GW2 client can use up to 4GB if it’s available. If it only needed 2GB, why would ANet have bothered to enable the Large Address Aware flag?
Maybe the people with 8 GB of RAM or more can chime in and tell us their experiences. I have a feeling that people running a decent CPU and GPU will not have many issues with this game if they have at least 8 GB of ram. Anyone sporting 4 GB or less is probably angry at ANet right now.
Similarly, having more than 1GB of VRAM on your GPU will not accomplish anything unless you’re running a multi-monitor setup with “High” textures and shadows, and supersampling instead of FXAA. There’s just not enough stuff that would need it, outside of the aforementioned dense groups of players.
Seriously obsolete information, boss. Let’s do some simple math:
1920 * 1080 = 2,073,600 pixels
RGBA True Color = 8 bits each for red, green, and blue plus 8 bits alpha (transparency) = 32 bits per pixel
32 bits = 4 bytes; 4 * 2,073,600 = 8,294,400 bytes per frame minimum
So at a paltry 1920×1080 resolution every frame requires 8.3 MB of VRAM. If you enable FSAA, the amount of memory scales based on the multiplier you select. 4x FSAA means 4 * 8.3 = 33.2 MB per frame
Frames are kept in VRAM and updated incrementally to boost performance as opposed to re-rendering each frame from scratch. This uses memory.
Textures can be large; high-res textures can be 10-20 MB in size or more, and for every texture in the scene a copy is loaded into VRAM, which adds up quickly. If you don’t have enough VRAM for your chosen resolution, you will see stuttering and slowdowns as the textures are loaded from system memory, or worse, from your disk.
The quality of the textures, the effects you enable, FSAA, and finally your screen resolution all contribute to the ideal onboard memory for a GPU. If you are playing at 1920×1080 and you want to minimize any rendering slowdowns during particle-heavy effects, while having detailed shadows and reflections on as well as 4x to 8x FSAA, having at least 2 GB of onboard memory for your GPU is a must. Anything less means that textures have to be loaded from system memory to GPU memory prior to being rendered, and this effectively creates a bottleneck that limits your GPU to the bandwidth of your system memory.
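The arithmetic above generalizes easily. A small sketch of the color-buffer portion of the budget (this covers only one RGBA8 buffer per frame; real drivers keep multiple buffers and add overhead this ignores):

```python
def color_buffer_mb(width, height, aa_factor=1, bytes_per_pixel=4):
    """Size of one RGBA8 color buffer in MB (decimal),
    scaled by a supersampling/FSAA multiplier."""
    return width * height * bytes_per_pixel * aa_factor / 1e6

print(color_buffer_mb(1920, 1080))      # the ~8.3 MB figure quoted above
print(color_buffer_mb(1920, 1080, 4))   # the ~33.2 MB figure with 4x FSAA
```

As the follow-up posts note, these buffers are a small slice of VRAM; textures dominate the rest of the budget.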
All persistent problems with performance are ultimately tied to the CPU clock speed and instruction sets (and incredibly outdated shadow implementation on the side of GPU).
Basically, this means the game is plain old poorly optimized.
Nobody said that ANet cannot improve their game engine and client but that needs to be done judiciously. The game runs well and like all games has some issues, but none that are a deal breaker. Get a system that’s up to the task and it will run fine most of the time.
You have the frame buffer being shown, the frame buffer being worked on (may be more than one), the depth buffer, any buffers used by post process effects … it can add up but it’s at worst 10% on a 1GB card.
While the game can use 4GB on a 64-bit system and 3GB on a suitably configured 32-bit system, it’s still only 2GB on a 32-bit system that isn’t set to allow 3GB apps. It still makes sense for the game to manage memory usage: freeing data that relates to zones you are currently not in, players who aren’t currently in the same zone as you, critters that aren’t in this zone, etc.
RIP City of Heroes
We have folks constantly working to improve performance, we’ve actually made quite a few perf updates this year. You’ll see more perf updates in the upcoming releases, we have a team working entirely on performance at all times that won’t be going away any time soon.
I would just like to state that from when I joined Guild Wars 2 when it opened, it ran smoothly and was very playable for me (Australian), but over the past year the game has degraded to the point of being almost unplayable (Sanctum Cove). It’s a tiny bit better now, but not at all good enough to do jumping puzzles, SAB, etc.
I’m not sure what was changed, but the game lag is pretty bad for me and quite a few Aussie friends now.
We have folks constantly working to improve performance, we’ve actually made quite a few perf updates this year. You’ll see more perf updates in the upcoming releases, we have a team working entirely on performance at all times that won’t be going away any time soon.
I would just like to state that from when I joined Guild Wars 2 when it opened, it ran smoothly and was very playable for me (Australian), but over the past year the game has degraded to the point of being almost unplayable (Sanctum Cove). It’s a tiny bit better now, but not at all good enough to do jumping puzzles, SAB, etc.
I’m not sure what was changed, but the game lag is pretty bad for me and quite a few Aussie friends now.
I think the game’s performance compared to 6 months ago is horrible. I play on TC and am an avid WvW player. We always want to be in T1, but we always hear about the bad T1 lag, and the past few months we have had really bad lag of our own in T2. If the lag in T2 is this bad, and the T1 lag we’ve experienced before was just insane, I don’t want that lag all the time. Is this the way the game is headed: lag all the time, everywhere, no matter what you are doing, unless you are on an almost empty server? If so, I might as well just quit now and stop wasting my time here if ANet isn’t going to do something other than “constantly working to improve performance,” because I doubt this will do much besides give us a little boost until they add another temp patch, and then it will be like nothing was worked on.
And I don’t ever see them adding on to this game as far as lands and dungeons go, permanently anyway. Which is sad; this could have been a good game. Why rush and do this stupid DX9 crap? Such a huge mistake on ANet’s side. Hope it was worth it in the end.
Välkyri – 80 Warrior
JQ[Lulz] – Kill fur Thrillz…
We have folks constantly working to improve performance, we’ve actually made quite a few perf updates this year. You’ll see more perf updates in the upcoming releases, we have a team working entirely on performance at all times that won’t be going away any time soon.
Colin has anyone told you how amazing you are? A game director posting directly on forums to consumers is an incredible feat.
You didn’t say if you used it or not (FarCry 3 64-bit client).
But my point still stands. Unless the game being tested + 64-bit OS is actively needing more than 4GB then you won’t see a performance shift from 4GB to 8GB.
All that more system memory buys you is the ability to run more 32-bit apps at the same time without the apps fighting over who gets system memory.
First, I’m not the person doing the tests. It is a hardware site.
Second, my point is that there are no games that actively require 8GB of RAM, and GW2 especially doesn’t require 8GB of RAM.
I’m not talking about the future – now, in September 2013, going from 8GB to 16GB or 32GB of RAM is useless for gaming. If you’re buying new, 8GB of RAM is probably a better choice than 4GB, but still don’t expect much difference, unless you go crazy with mods (which will probably hammer your graphics card’s RAM first anyway).
The OP already has 8GB of RAM, so the suggestion to get more RAM to solve his GW2 performance problems is a bad one.
That is the only point I’m addressing – currently games do not require even 8GB RAM to run perfectly.
(edited by Swoo.5079)
1st) The benchmarks also show minimum frame rates, which would be much lower (around 1 FPS) if the game ran out of memory.
First of all, FPS benchmarks are a very general overview of what’s going on. They show the rendering performance by monitoring the hand off from the game engine to the system’s output (directx), which would not include system lag. This is easily evident in benchmarking a dual GPU setup that has stuttering issues (a very common issue with multi-gpu systems). The benchmarks will have you believing all is well as the stutters, which can last 1-2 seconds or more, do not affect the frame rate figures – the frames are counted BEFORE the system lag is encountered.
2nd) I’ve checked GW2 with 4GB and 8GB of RAM and saw no difference. Having a faster HDD or SSD reduces load times considerably, though.
Total BS. You didn’t check anything, and even if you did, your anecdotal commentary holds no water. The GW2 client can in fact use up to 4GB of physical memory. If your total memory is 4GB then you’re not maximizing performance. I highly doubt you play the game without experiencing lag if you only have 4GB. A 4GB Windows 7 system is sluggish in general, even with a good SSD.
3rd) If you read the article in the link you will see that they aren’t only using synthetic benchmarks (the BF3 multiplayer run is a good indication of this, since there are no canned benchmarks for it).
The fact that you’re citing unrelated articles to make your case for you shows that you do not know what you’re talking about on this topic. I don’t know why you are so hell-bent on misleading people into believing that there is no benefit to having 8GB of system memory or more when there clearly is.
And the synthetic benchmarks just mirrored the real world one.
Note that the 4GB configuration was just 1 stick, meaning it was running single channel vs dual channel.

None of those games you listed are MMO-type games – they are all first-person games designed for one player or a small number of players. Their engines are optimized for speed rather than maximum aesthetics, and furthermore you’re still relying on FPS numbers alone to perpetuate the “4GB is enough” myth.
If you’re too cheap to buy more than 4GB then that’s fine, but GW2 will use up to 4GB if your system has more than 4GB of physical memory and it will run better – this is a FACT not some weak attempt at drawing parallels between GW2 and frame rate benchmarks of other games.
1st) The OP is not running a multi-GPU configuration – micro-stuttering in modern single-GPU configurations like the 7850 is a non-problem.
2nd) The OP already has 8GB of RAM and you recommended he get 16GB to improve performance.
Sorry, but GW2 is not going to use 8GB of RAM.
3rd) While GW2 may use up to 4GB of RAM, it will rarely do so. If 4GB of RAM weren’t enough for a Win7 64-bit system, we would have loads of threads pointing to “out of virtual memory” warnings.
4th) Developers know 4GB RAM is a common configuration (probably 2-3GB for 32-bit systems are still reasonably common) so they take measures for the game to run with less than 4GB just for itself.
5th) You said not even 8GB RAM was enough, so please no “4GB is enough myth” shenanigans.
6th) I didn’t say 4GB RAM was enough for games and you should never buy more than that. I said 8GB RAM is enough for games.
It is not my fault a large majority of games will run just fine with 4GB RAM though.
7th) No, a 4GB Win7 system isn’t sluggish in general. I have one on my left that is used everyday and the difference to this one with 8GB is not noticeable.
8th) A 64-bit Win7 with only 2GB RAM is sluggish. A 32-bit Win7 with only 1GB RAM is sluggish.
9th) If you don’t like the results I linked to, feel free to provide tests that show that 16GB or 32GB of RAM provides benefits for gaming, because as it stands now I have provided hard numbers and you have provided unquantifiable opinions.
An interesting bit of news. Not really hopeful for this iteration of GW but for the future, it looks promising.
http://www.examiner.com/article/guild-wars-2-publisher-inks-unreal-engine-4-guild-wars-3-coming
Specs for Unreal engine 4:
http://www.unrealengine.com/unreal_engine_4/
Devs: Trait Challenge Issued
Right, but Windows uses segment caching and not file caching, meaning that it stores frequently accessed file segments in memory.
Interesting.
Maybe the people with 8 GB of RAM or more can chime in and tell us their experiences. I have a feeling that people running a decent CPU and GPU will not have many issues with this game if they have at least 8 GB of ram. Anyone sporting 4 GB or less is probably angry at ANet right now.
That’s the whole point. Everyone has at least 8GB RAM these days, even older rigs. The performance issues have nothing to do with it – and why would they? RAM size helps to alleviate caching/swapping issues, but has no effect on frame-to-frame performance.
Seriously obsolete information, boss. Let’s do some simple math…
Your math came down to needing about 32MB of VRAM in the absolute most demanding scenario. That’s about right. You don’t need more than that.
The bulk of VRAM will be taken up by textures and shaders, not rendered frames.
RAM is usually used to buffer non-texture information (sounds, meshes, scripts, etc.), so it matters even less.
The need for more memory comes when the game is using more assets within a given location. If those assets don’t amount to much more than 1-2GB, the extra memory space won’t be used to try and speed up rendering, it’ll simply be left unused.
The game runs well and like all games has some issues, but none that are a deal breaker. Get a system that’s up to the task and it will run fine most of the time.
I don’t think you understand how dire the situation is, because you happen to have a rig that somehow works well with GW2.
Every day since beta people with absolutely killer rigs were reporting abysmal frame-rates, and every day since people with much, much, much lower specs than theirs assured them that GW2 runs “just fine” on theirs.
This is the problem with poor optimization: your stuff runs well on one rig, but not another that’s only different in one tiny obscure aspect.
If you’re getting the rough end of the stick, you know GW2’s performance is truly ATROCIOUS. Not just bad, not just annoying, but unforgivably horrible for what it is.
I think the game’s performance compared to 6 months ago is horrible.
Huh. And here I thought I was imagining it getting worse over time. Hard to tell.
I use the “lowest” character culling settings and, by far, this has done the most to alleviate performance issues. Just like everyone knew it would.
(edited by Draco.2806)
An interesting bit of news. Not really hopeful for this iteration of GW but for the future, it looks promising.
http://www.examiner.com/article/guild-wars-2-publisher-inks-unreal-engine-4-guild-wars-3-coming
Specs for Unreal engine 4:
http://www.unrealengine.com/unreal_engine_4/
Yea, that’s for the new games they have under development in Korea. Entirely different development teams. Aion uses URE3 I think. B&S as well, I believe.
RIP City of Heroes
(edited by Behellagh.1468)
I’m beta testing ES:O right now… and I can’t say much since I’m technically under NDA, but I will say that I am absolutely astonished by the performance of this game. Graphically a little better than GW2 and with a much higher (probably double) framerate. Very impressive, and if GW2 has any chance of competing with this game when it releases, Anet really has to get on top of things.
1st) The OP is not running a multi GPU configuration – micro stuttering in modern single GPU configurations like the 7850 is a non problem.
I didn’t say that it was, I was citing micro stuttering as an example of how frame rate benchmarks can be misleading since the frame rate numbers will not be affected by the stutters. In the same way, “system lag” will not show up in frame rate benchmarks.
2nd) The OP already has 8GB RAM and you recommended him to get 16GB RAM to improve performance.
Sorry, but GW2 is not going to use 8GB of RAM.
Do you know what he is running on his system IN ADDITION to GW2?
Should he disable his anti-virus software plus any other programs he may run in the background? He may have other software running or bloated drivers for a mouse, gamepad, motherboard, etc.
Either way, you’ve been missing the point about memory all along. By increasing the amount of physical memory in the system, ALL running applications are eligible for a larger share of physical memory, including GW2.
3rd) While GW2 may use up to 4GB RAM, it will rarely do so. If 4GB RAM weren’t enough for a Win7 64-bit system, we would have loads of threads pointing to “run out of virtual memory” warnings.
False, because most systems default to letting Windows manage the size of virtual memory, and Windows will automatically scale that file as needed, up to the available space on the disk.
To determine whether you have enough physical memory, run the “Resource Monitor” program included with Windows, accessible from Task Manager or Control Panel:
1) Start both “resource monitor” and GW2.
2) Go to the “overview” tab.
3) The top row should be labeled “CPU”. You will see all running programs there. Check the box beside “gw2.exe” to filter by that program.
4) Start the game and enter a busy city like the trader’s area in Lion’s Arch. Walk around a bit, then alt-tab back to resource monitor.
5) Expand the “memory” row and pay attention to the “hard faults/s” figure. Note the graph in the right column. You should see an orange line, which represents GW2.
Hard faults should ideally be ZERO. Any number above zero indicates that the system had to hit the page file (virtual memory) rather than physical memory. Run GW2 in windowed fullscreen and play the game while watching the graph.
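The manual check above can be approximated in code. Below is a minimal sketch of the decision rule being described, in plain Python with made-up sample data (on a real Windows system you would read these figures from Resource Monitor): a one-off burst of hard faults during a zone load is normal, but a sustained rate means the page file is being hit continuously and more RAM would help.

```python
# Hypothetical sketch: decide whether sampled hard-fault rates suggest
# the system is short on physical RAM. The sample numbers below are
# invented for illustration, not measurements.

def memory_pressure(hard_faults_per_sec, sustained_threshold=5, window=3):
    """Return True if hard faults stay above `sustained_threshold` for
    `window` consecutive samples (the page file is being hit
    continuously, not just during a one-off load)."""
    streak = 0
    for rate in hard_faults_per_sec:
        streak = streak + 1 if rate > sustained_threshold else 0
        if streak >= window:
            return True
    return False

# One-off spike during a zone load: fine.
print(memory_pressure([0, 0, 40, 0, 0]))        # False
# Continuous faulting while playing: add RAM.
print(memory_pressure([12, 30, 25, 18, 22]))    # True
```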
4th) Developers know 4GB RAM is a common configuration (probably 2-3GB for 32-bit systems are still reasonably common) so they take measures for the game to run with less than 4GB just for itself.
We know the game can run with less than 4GB – you’re confusing “viable” with “optimal”. If you want to run the game with all or even some of its eye candy enabled you will benefit from at least 8-16 GB of memory.
5th) You said not even 8GB RAM was enough, so please no “4GB is enough myth” shenanigans.
Not really, it has already been explained why 4 GB is not enough to run the game optimally. Please refer to other posts in the thread.
6th) I didn’t say 4GB RAM was enough for games and you should never buy more than that. I said 8GB RAM is enough for games.
Fortunately, I think that our discussion has shown that you are not the best person to take advice from if the goal is to end up with a PC that can run a demanding game smoothly.
7th) No, a 4GB Win7 system isn’t sluggish in general. I have one on my left that is used everyday and the difference to this one with 8GB is not noticeable.
Making unsubstantiated statements doesn’t help anyone and it certainly doesn’t bolster your position. Again, it has already been explained why Windows 7 benefits from using more ram.
8th) A 64bits Win7 with only 2GB RAM is sluggish. A 32bits Win7 with only 1GB RAM is sluggish.
Totally false. See above.
9th) If you don’t like the results I linked to, feel free to provide tests that show that 16GB or 32GB RAM provide benefits for gaming because as it stands now I provided hard numbers and you provided not quantifiable opinions.
I’ve thoroughly debunked all of your misinformation without expecting some random benchmark graphs to do it for me.
By invoking a 3rd party to make your case for you, you’ve demonstrated that you yourself do not understand what you are talking about and are merely reciting what you read somewhere on some website.
If your CPU and GPU are both up to par and your system experiences frequent “system lag” while gaming, adding more RAM can remedy the problem. It’s that simple.
That’s the whole point. Everyone has at least 8Gb RAM these days, even older rigs. The issues of performance have nothing to do with it – and why would they? RAM size helps to alleviate caching/swapping issues, but has no effect on frame-to-frame performance.
At least one person in this thread is only using 2 GB, possibly 4 GB.
Framerate is not the only factor that affects the “playability” of a game. In fact, all framerate really shows you is the “raw power” of your GPU; it does not show you lag that is introduced when the system has to load data from virtual memory (hard disk) even if it’s a small amount.
Your math came down to needing about 32Mb of VRAM in absolute most demanding scenario. That’s about right. You don’t need more than that.
32 MB per frame minimum, plus loading textures and geometry data for the scene, plus keeping partially rendered frames in memory to help improve rendering efficiency.
At 1920×1080 you’d be good with 2 GB. Above that, 3 GB…but 1 GB would cause a lot of assets to get bumped out of VRAM and into system memory, causing lag/stutters as the GPU has to load them from system memory into VRAM.
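For what it’s worth, the arithmetic behind the “32 MB per frame” figure can be sketched as follows. The assumptions here (32-bit color, one 32-bit depth/stencil buffer, triple-buffered color) are mine for illustration; actual driver allocations vary.

```python
# Back-of-the-envelope framebuffer math. Assumptions: 4 bytes/pixel
# color, 4 bytes/pixel depth/stencil, three color buffers. Real
# drivers allocate differently; this just shows the order of magnitude.

def framebuffer_mb(width, height, color_buffers=3, bytes_per_pixel=4):
    pixels = width * height
    color = color_buffers * pixels * bytes_per_pixel
    depth = pixels * bytes_per_pixel  # one depth/stencil buffer
    return (color + depth) / (1024 * 1024)

print(round(framebuffer_mb(1920, 1080), 1))  # 31.6 -- the "32 MB" figure
print(round(framebuffer_mb(2560, 1440), 1))  # 56.2 at 1440p
```

Note that even at 1440p the frame buffers themselves are a small slice of a 2 GB card; the rest goes to textures and other assets, which is the point being argued.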
Get EVGA Precision-X and enable the OSD feature. You can see, in realtime, how much VRAM is being utilized.
The bulk of VRAM will be taken up by textures and shaders, not rendered frames.
RAM is usually used to buffer non-texture information (sounds, meshes, scripts, etc.), so it matters even less.
The GPU can access system RAM directly but whatever its working on needs to be in VRAM. If your card has too little and you want to play at high resolution then it’s going to become an issue.
The Nvidia GTX 580 was considered “handicapped” for only having 1 GB of VRAM, so shortly thereafter, 1.5 GB versions started appearing. This was back when running at 1920×1080 was something only a small handful of people did. Today, most people run 1920×1080 or 1920×1200.
The need for more memory comes when the game is using more assets within a given location. If those assets don’t amount to much more than 1-2Gb, the extra memory space won’t be used to try and speed up render, it’ll simply be left unused.
You’re right, not EVERY scene in the game will require the full 2 GB of memory, but if you only have 1 GB and 1.3 GB would be needed for a given scene, you’re going to notice lag in that scene whereas the guy with 2 GB will fly through without issues.
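The 1 GB vs. 2 GB argument is just a capacity subtraction; here it is as a toy model. The numbers are illustrative only, and the 64 MB frame overhead is an assumption, not a measurement.

```python
# Toy model of the spill-over argument: assets beyond VRAM capacity
# fall back to system memory and must cross the bus mid-frame,
# causing stutter. Illustrative numbers, not measurements.

def vram_spill_mb(scene_assets_mb, vram_mb, frame_overhead_mb=64):
    """MB of scene assets that don't fit in VRAM after reserving
    space for frame buffers (assumed 64 MB here)."""
    usable = vram_mb - frame_overhead_mb
    return max(0, scene_assets_mb - usable)

print(vram_spill_mb(1300, 1024))  # 340 MB spilled on a 1 GB card
print(vram_spill_mb(1300, 2048))  # 0 MB spilled on a 2 GB card
```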
I don’t think you understand how dire the situation is, because you happen to have a rig that somehow works well with GW2.
I have a system that I built myself by having a deeper understanding of how the components within the system work together. On paper, my system isn’t that much different from what a lot of guys here are listing, and yet the game runs really smoothly for me.
Why?
I can tell you exactly why – because I didn’t cut corners under the false belief that something wasn’t necessary or that a lesser amount was “good enough”. All of the arguments I’m seeing here are what you’d see doled out in your typical gaming or PC forum without much thought behind it.
Every day since beta people with absolutely killer rigs were reporting abysmal frame-rates, and every day since people with much, much, much lower specs than theirs assured them that GW2 runs “just fine” on theirs.
I play the game at 2560×1440 with all effects enabled on high or ultra. This includes shadows, character models and character count set to max. On average I see 50-60 FPS and in the very busy sections of the game it will dip down to 40 FPS. For my GPU, a 680 GTX with 4 GB of VRAM, I do not consider these to be “abysmal” and more importantly, I do not have any stutter/lag issues so even when the frame rate drops to 40 the game is still perfectly playable.
AND
Everything looks awesome.
This is the problem with poor optimization: your stuff runs well on one rig, but not another that’s only different in one tiny obscure aspect.
Oh, I do agree that their 3D engine could be optimized but it’s far from being terrible.
If you’re getting the rough end of the stick, you know GW2’s performance is truly ATROCIOUS. Not just bad, not just annoying, but unforgivably horrible for what it is.
I haven’t experienced this.
I use the “lowest” character culling settings and, by far, this has done the most to alleviate performance issues. Just like everyone knew it would.
This is where not having enough VRAM becomes very noticeable. Just sayin.
GW2 from what I noticed doesn’t use a large amount of VRAM. I think even at 5040×900 and Supersampling I wasn’t able to come close to 1GB usage…
At least one person in this thread is only using 2 GB, possibly 4 GB.
That would be me. I use 2GB RAM and I do not experience any swapping issues. The game doesn’t need any more. It’s very obvious when the engine reads from disk, and it’s never the source of bad performance.
Framerate is not the only factor that affects the “playability” of a game.
It’s what the problem is though. Low framerate. Not swapping.
32 MB per frame minimum, plus loading textures and geometry data for the scene, plus keeping partially rendered frames in memory to help improve rendering efficiency.
32MB maximum at high resolution with high shadows and super-sampling AA for the output frame (total overkill for most systems), plus 2 to 3 pre-rendered “frames” without any of these if the GPU power allows. So, to be incredibly generous, that’s 64MB.
I don’t know where you get these ideas from. The bulk of GPU memory has always been used for storing textures/shaders. Rendered frames can’t possibly require 2GB worth of space. Does your desktop wallpaper require 2GB worth of space to render? Of course not.
High resolutions don’t require more memory, they require more bandwidth. It’s the textures and assets that need a lot of RAM/VRAM space. GW2 doesn’t use that much.
On paper, my system isn’t that much different that what a lot of guys here are listing, and yet the game runs really smoothly for me.
Why?
Because you’re lucky.
There are tons of people with completely overkill rigs that still barely average 40 FPS on a good day. People with 32GB of RAM. People with twin overclocked GPUs. People with cutting-edge i7s.
You’re just lucky. Your rig isn’t some divine gift. If anything, investing in large amounts of VRAM and high-end CPU are both wide-spread rookie mistakes.
Your ideas about memory bandwidth fall flat when turning textures, shaders, shadows, resolution and, in fact, ALL settings down to lowest OR up to highest doesn’t impact performance AT ALL in most circumstances. That’s been my experience, and the experience of most people with decent rigs.
GW2 isn’t a looker, either. It uses next to no resources at all, and it definitely shouldn’t, considering its tech and fidelity would be laughed off back in 2006.
The engine is throttled. It’s BAD. It should be sent back to the drawing board.
You didn’t say if you used it or not (FarCry 3 64-bit client).
But my point still stands. Unless the game being tested + 64-bit OS is actively needing more than 4GB then you won’t see a performance shift from 4GB to 8GB.
All that more system memory buys you is the ability to run more 32-bit apps at the same time without the apps fighting over who gets system memory.
This is just lol. A 64-bit client will utilize a 64-bit CPU, which in turn will have twice the computing power. If a program is not utilizing the hardware that’s available, then it’s a poorly designed program. The GW2 game engine is a poorly designed engine. It’s not the worst engine I’ve seen, but it’s still subpar nonetheless.
You didn’t say if you used it or not (FarCry 3 64-bit client).
But my point still stands. Unless the game being tested + 64-bit OS is actively needing more than 4GB then you won’t see a performance shift from 4GB to 8GB.
All that more system memory buys you is the ability to run more 32-bit apps at the same time without the apps fighting over who gets system memory.
This is just lol. A 64-bit client will utilize a 64-bit CPU, which in turn will have twice the computing power. If a program is not utilizing the hardware that’s available, then it’s a poorly designed program. The GW2 game engine is a poorly designed engine. It’s not the worst engine I’ve seen, but it’s still subpar nonetheless.
agreed 100%
ASUS Sabertooth Z77 | 16GB Corsair Dominator Platinum 1866MHz @ 2400MHz
Samsung 840 PRO 512GB SSD | Windows 10 x64
A 64bit client will utilize a 64bit cpu which in turn will have twice the computing power.
agreed 100%
What the fluffy kittens, guys.
NO.
You didn’t say if you used it or not (FarCry 3 64-bit client).
But my point still stands. Unless the game being tested + 64-bit OS is actively needing more than 4GB then you won’t see a performance shift from 4GB to 8GB.
All that more system memory buys you is the ability to run more 32-bit apps at the same time without the apps fighting over who gets system memory.
This is just lol. A 64-bit client will utilize a 64-bit CPU, which in turn will have twice the computing power. If a program is not utilizing the hardware that’s available, then it’s a poorly designed program. The GW2 game engine is a poorly designed engine. It’s not the worst engine I’ve seen, but it’s still subpar nonetheless.
Ah, no. As a programmer of nearly 30 years, from assembler to C++: that’s bunk. Well, not 100% bunk, but it only applies to some very specific kinds of intensely math-oriented algorithms, and most of those are now faster in SSE4/AVX, where 32- vs 64-bit doesn’t matter anymore. The primary advantage of a 64-bit app is virtually unlimited memory, or as much as Microsoft allows us access to (they cap different versions).
With modern superscalar (more than 1 instruction at a time) CPU architecture the internal difference between 32 and 64 is virtually nonexistent. Instructions, 32 or 64 bit, are broken up into “microOps”, analyzed, rearranged, fused into “macroOps” and then those are executed to attempt to utilize the redundant processing elements within a core as efficiently as possible. In the guts of a core, there is no difference.
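A quick arithmetic illustration of the point above: the practical win of a 64-bit client is address space, not per-instruction speed. The Windows specifics (2 GB default user space for a 32-bit process, 4 GB with the /LARGEADDRESSAWARE linker flag on a 64-bit OS) are the well-known caps.

```python
# Address-space arithmetic for a 32-bit process. These are
# architectural limits, not tuning knobs.

GIB = 1024 ** 3

addr_space_32 = 2 ** 32              # total addressable with 32-bit pointers
default_user_32 = addr_space_32 // 2 # typical Windows 32-bit user space

print(addr_space_32 // GIB)      # 4  (GiB, hard ceiling for 32-bit pointers)
print(default_user_32 // GIB)    # 2  (GiB, default 32-bit user partition)
```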
That would be me. I use 2Gb RAM and I do not experience any swapping issues. The game doesn’t need any more. It’s very obvious when the engine reads from disc, and it’s never the source of bad performance.
It’s not always obvious, that’s why resource monitor provides a tool to graph “hard faults”. Next time you play the game, or even when you use your comp in general, enable that tool and pay attention to the hard faults/sec metric. If it’s zero, you’re good, if it’s not then your system doesn’t have enough RAM.
It’s what the problem is though. Low framerate. Not swapping.
That may be your problem, but it’s not the only possible problem. What I explained is a phenomenon that many people attribute to an underpowered CPU or GPU when in reality it’s not enough RAM.
32Mb maximum at high resolution and high shadows and super-sampling AA for the output frame (total overkill for most systems), plus 2 to 3 pre-rendered “frames” without any of these if the GPU power allows. So, to be incredibly generous, that’s 64Mb.
See, now you’re making concessions about screen resolution. I play at 2560×1440 on a single display; multi-display systems can have even higher resolutions. If you’re at 1920×1080 you’re no longer on the high end, you’re in the mid range.
Supersampling multiplies the resolution of the frame by your selected multiplier. If you choose 4x as your anti-aliasing level, then you’re using 4x more memory per frame, PLUS you need VRAM to perform the resampling operation itself, where the frame is downsampled to the output resolution. Multiple copies of the same frame may exist in memory.
Bottom line – 1920×1080 with maxed-out settings needs more than 1 GB of VRAM, about 2 GB would be ideal for THAT resolution.
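The supersampling multiplier can be sketched numerically. This is a rough model with an assumed 4 bytes per pixel; drivers allocate more than this in practice (downsample scratch space, multiple in-flight frames).

```python
# Rough model of per-frame memory under supersampling: the render
# target's pixel count is multiplied by the SSAA factor before
# downsampling to the output resolution. Illustrative only.

def ssaa_frame_mb(width, height, factor, bytes_per_pixel=4):
    """`factor` multiplies the pixel count (e.g. 4x SSAA = 2x each axis)."""
    return width * height * factor * bytes_per_pixel / (1024 * 1024)

print(ssaa_frame_mb(2560, 1440, 1))   # 14.0625 MB plain frame
print(ssaa_frame_mb(2560, 1440, 4))   # 56.25 MB with 4x SSAA
```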
I don’t know where you get these ideas from. The bulk of GPU memory has always been used for storing textures/shaders. Rendered frames can’t possibly require 2Gb worth of space. Does your desktop wallpaper require 2Gb worth of space to render? Of course not.
Just a few posts ago you believed that VRAM was included as part of the total system memory available to the program…and now you’re suddenly the expert? Why even mention the desktop? We’re talking about 3D gaming as it pertains to GW2.
Why are you in denial? Whether or not you can afford a better card doesn’t change the fact that YOU WOULD benefit from a higher end card with more than 1 GB of VRAM. You’re obviously one of those people who looks at price first and then tries to rationalize paying less for a sub-optimal system by convincing yourself that you don’t really need anything offered on the higher end products.
High resolutions don’t require more memory, they require more bandwidth. It’s the textures and assets that need a lot of RAM/VRAM space. GW2 doesn’t use that much.
It’s not one or the other, it’s both. You obviously don’t know what GW2 “needs” because you’re complaining that the performance sucks, yet you insist on believing that it’s not because of your hardware.
Because you’re lucky.
There are tons of people with completely overkill rigs that still barely average 40 FPS on a good day. People with 32GB of RAM. People with twin overclocked GPUs. People with cutting-edge i7s.
Luck has nothing to do with it. My system runs fine because it’s spec’d correctly and I am careful about component selection. It’s easy to assemble a PC by buying random parts – that doesn’t mean that most people know the best way to do it. Believe it or not, the brand and revision of your motherboard, memory, CPU and GPU can have an effect on your system’s performance, as well as the software configuration…so rattling off spec sheets means little. The systems I build always perform better than “run of the mill” systems with similar spec sheets.
You’re just lucky. Your rig isn’t some divine gift. If anything, investing in large amounts of VRAM and high-end CPU are both wide-spread rookie mistakes.
I NEVER experience a game-busting slowdown when there’s suddenly a horde of players on screen. WvW runs like butter for me. But it’s not because my system is optimal, it’s because I’m lucky…right.
The only rookie mistake here is getting into this debate with me and not knowing when to call it quits. We’re just going in circles now.
Your ideas about memory bandwidth fall flat when turning down textures, shaders, shadows, resolution and, in fact, ALL settings to lowest OR highest doesn’t impact performance AT ALL in most circumstances. That’s been my experience, and experience of most people with decent rigs.
In one case you’re proclaiming how “bandwidth is everything not capacity” and now you’re denouncing bandwidth. Not sure what to make of that.
GW2 isn’t a looker, either. It uses next to no resources at all, and it definitely shouldn’t considering it’s tech and fidelity would be laughed off back in 2006.
The engine is throttled. It’s BAD. It should be sent back to the drawing board.
The actual graphics produced by GW2 are not bad at all. They are nice in general, and among the best by MMO standards. You’re obviously not playing this game at a high resolution with maxed-out effects.
When I play this game, it’s almost like watching a CG movie. Everything looks so nice, so detailed and it really helps draw you in. You’ll actually enjoy “vista hopping” just to see the nice views. In WvW last night I had to do a double-take because the mountain range looked so real, even with the way it reflected into the nearby lake.
Game was pushing a steady 60 FPS too. Must be my luck.
GW2 from what I noticed doesn’t use a large amount of VRAM. I think even at 5040×900 and Supersampling I wasn’t able to come close to 1GB usage…
At 2560×1440 it averages around 1.8 GB of VRAM, which means that 2 GB is a good target for most people. 1 GB is not even enough for 1920×1080.
That may be your problem, but it’s not the only possible problem. What I explained is a phenomenon that many people attribute to an underpowered CPU or GPU when in reality it’s not enough RAM.
Mistaking lag spikes caused by swapping for low framerate is completely impossible even for a novice user. You’re being ridiculous.
See, now you’re making concessions about screen resolution. I play at 2560×1440 on a single display; multi-display systems can have even higher resolutions.
You lost the trail of discussion. I was making a point about how changing resolution does not affect GW2’s performance, you were trying to make a point about how you supposedly need gigabytes of RAM just to hold the rendered frames.
For most people, it doesn’t matter what resolution they’re running or how much RAM they have with GW2. Neither changes performance in any noticeable way.
And your calculations are still bogus, as was explained earlier already. Quit it.
Just a few posts ago you believed that VRAM was included as part of the total system memory available to the program…
IT IS. HOW COULD IT NOT BE?
It’s not one or the other, it’s both. You obviously don’t know what GW2 “needs” because you’re complaining that the performance sucks, yet you insist on believing that it’s not because of your hardware.
And you have absolutely no idea what you’re talking about. Frankly, you have amazingly strange ideas. Everything that’s wrong with them was shown earlier with proper data from recognized hardware-enthusiast websites.
Your way of arguing about it isn’t respectable either.
You’re obviously not playing this game at a high resolution with maxed-out effects.
Haha!
I am.
GW2’s tech is horribly outdated. It’s not bad-looking, no, but the technology on display is extremely basic by 2006 standards.
The fun part is that it DOESN’T MATTER what graphical settings you use when you have a good enough GPU. It’s the CPU that makes the most difference thanks to all the horrible random bottlenecks.
RAM and GPU only come into the picture if you want to turn up the graphics to see just how poorly you can implement deferred shading on the old GW1 engine (which is what GW2 is based on). Thanks to the poor post-processing implementation, you’re often better off leaving it on “low”, too. The world is less of a tinted blur that way.
And the killer irony is? The reason GW2 was made with outdated graphics tech from the get-go, and the reason people were willing to tolerate it, is because the developers promised it would run well even on extremely outdated hardware by the time of release. Ha ha. Hah! Hah… Ha…
Despite all the so called performance improvements, GW2 has been performing worse today than it did at launch.
The best part? If I try to lower my GFX settings using the “best performance” button -when I’m not completely by myself in the middle of nowhere- my client decides it would rather call it quits and crash to desktop rather than load the new settings.
-BnooMaGoo.5690
I’m not going to claim to be a programmer of 30 years, but I will say this. When WoW launched its 64-bit client, my FPS doubled. A 64-bit client in GW2 would mitigate the pain that is WvW. Please stop trying to convince yourselves that a 32-bit client is just as good as a 64-bit client. Even a monkey knows it’s better to have 64 bananas than 32 bananas.
System:
- Intel i7 3770 @ 3.4GHz
- 16GB of RAM, 8-9-9-24 @ 1866MHz
- AMD Radeon 7950
- Asus Maximus V Formula motherboard
Graphic options:
- Resolution – 2560×1440
- Animation – High
- AA – None
- Environment – Med
- LOD Distance – Med
- Reflections – T & S
- Textures – High
- Render Sampling – Native
- Shadows – Med
- Shaders – Med
- Post Process – Low
- Character Model Limit – Med
- Character Model Quality – Med
- Best Texture Filtering – On
- Effect LOD – On
- High Res Character Textures – On
Average FPS in front of the Lion’s Arch bank while moving around – ~35-40 FPS. It gets better when there are fewer players.
CPU utilization – ~45%, with 2 cores doing most of the work while the others are vastly underutilized.
Memory utilization – ~1.8 GB
The game still needs a lot more improvements.
I have a
Core i5 2500
16gb DDR3 Ram
GTX 660 2gb
And the game still runs worse than a three legged dead dog.
I’m not going to claim to be a programmer of 30 years, but I will say this. When WoW launched its 64-bit client, my FPS doubled. A 64-bit client in GW2 would mitigate the pain that is WvW. Please stop trying to convince yourselves that a 32-bit client is just as good as a 64-bit client. Even a monkey knows it’s better to have 64 bananas than 32 bananas.
Too bad your comment will never be heard. A 64bit client would go a long way to fixing a lot of issues. So would moving from DirectX 9c only to 10 or 11. Wait a minute; we were promised GW2 would support DX 10.
We have folks constantly working to improve performance, we’ve actually made quite a few perf updates this year. You’ll see more perf updates in the upcoming releases, we have a team working entirely on performance at all times that won’t be going away any time soon.
It’s nice that you have “a team” working on the most important issue facing GW2. My question is why only “a team”, as this is a vastly more important issue than the living story. I would seriously advise moving to a 64-bit client, or at least offering one. There are so few people on 32-bit Windows anymore… The performance improvements alone would be worth it.
I’m not going to claim to be a programmer of 30 years, but I will say this. When WoW launched its 64-bit client, my FPS doubled. A 64-bit client in GW2 would mitigate the pain that is WvW. Please stop trying to convince yourselves that a 32-bit client is just as good as a 64-bit client. Even a monkey knows it’s better to have 64 bananas than 32 bananas.
Too bad your comment will never be heard. A 64bit client would go a long way to fixing a lot of issues. So would moving from DirectX 9c only to 10 or 11. Wait a minute; we were promised GW2 would support DX 10.
No we weren’t (promised Dx10 or 11) and 64-bit isn’t the panacea you think it is. The game already can use 4GB of memory as it is, 64-bit would just let it have loads more. Performance improvement would be negligible. Performance issues like this are a design problem and while refactoring the rendering pipeline is doable it’s not a quick or easy endeavor when you are making sure you aren’t breaking anything or making it worse. It’s obvious the game has problems when loads of players are packed into a relatively small area and/or the area is visually complex. This shows the problem is algorithmic in nature.
Also, it’s easy to create lots of threads; the game spins off over 40, not counting the TP. What’s hard is to do it in a way where those threads can run efficiently in parallel without being blocked, frequently forced to sync, or having vastly different run times, so the performance improvement is nowhere near what you would think.
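The “more threads ≠ more speed” point is classically captured by Amdahl’s law. A sketch follows, with purely illustrative fractions (GW2’s actual parallel fraction is unknown):

```python
# Amdahl's law: if only a fraction p of the per-frame work can run in
# parallel, n threads give at best 1 / ((1 - p) + p / n) speedup.
# The values of p below are illustrative, not measured from GW2.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 40 threads, a half-serial workload barely doubles:
print(round(amdahl_speedup(0.5, 40), 2))   # 1.95
# Whereas a 95%-parallel workload scales much further:
print(round(amdahl_speedup(0.95, 40), 2))  # 13.56
```

This is why blocked, frequently-syncing threads cap the frame rate regardless of core count.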
An optimized 64-bit client would indeed have advantages over an optimized 32-bit client. The problem right now is that we have neither, so it’s an entirely moot point. Merely switching over to 64-bit compilation does not automatically grant you any benefits other than the possibility of increased RAM use, but it’s been well-documented that RAM use is not the biggest issue that Gw2.exe is facing. Regardless of whether the client is 32- or 64-bit, Anet would have to perform essentially the same task to make it work the way we want and expect it to. Thus, asking for 64-bit client as the first step to performance improvement is pointless, and would only serve to drop support for the people who don’t have 64-bit systems.
performance has been getting better for me since launch, keep up the good work
There are so many possible fixes they could make, such as removing superfluous animations (arrow carts do not need to render every kittening arrow), eliminating weather effects (the snow, in particular, in the garrisons of the borderlands, not only severely obstructs visibility but takes a lot of power to render), and toning down attack animations to something very generic and not enormous and flashy.
The lag associated with having 75+ people in a zerg, having to render all their individual actions, is of an altogether different class, but as ArenaNet has not even addressed the minor changes they could be making, I severely doubt that they will ever approach the much larger problem posed by the large groups of players themselves.
Instead they’ll keep trying to emphasize small havoc-squad tactics, forcing us to abandon our labours because, at the end of the day, we’re clearly having fun wrong.