[FPS?] CPU / GPU
i5-3570K @ 4.2 GHz and a GTX 960 4 GB. FPS is 55-70 when I run around normally in WvW maps, and in zerging maybe 20 or so (too depressing to check). Need to test this evening when I can find a zerg in EotM.
Seafarer’s Rest EotM grinch
960 4 GB? Where is the 4GB one?
960 4 GB? Where is the 4GB one?
Asus Strix GTX 960 DirectCU II OC, 4 GB GDDR5. Doesn’t really help kitten, but it cost almost the same as the 2 GB model.
Seafarer’s Rest EotM grinch
Ah… so wondering if I should just go yolo and get the 980
Ah… so wondering if I should just go yolo and get the 980
The 980 is much better than the 960, but I don’t think it makes that much difference in GW2. The funniest part is that these new GPUs and CPUs are total kitten. Before, when you bought a new 200-300€ GPU you got a huge boost compared to your old one, but nowadays these budget models aren’t any faster than some 2-3 year old models.
I tried to buy a used GTX 760 or GTX 780 when my HD 7850 stopped working, but it’s kind of a risk to buy a 2-3 year old used GPU.
Seafarer’s Rest EotM grinch
(edited by Junkpile.7439)
Ah… so wondering if I should just go yolo and get the 980
For GW2? It won’t make much of a difference, as the game is severely limited by its main-thread CPU performance. I went from a 670 to a 980 Ti, and sure, I could raise settings, but massive zergs will still bring you down toward 20-30.
As usual, it’s all about price/performance as a whole. Looking quickly through a review of the Asus 960 Strix vs. a reference 980: in Tomb Raider it’s 63 vs. 110 fps @ 1080p and 38 vs. 67 fps @ 1440p. In BF4 it’s 51 vs. 88 fps @ 1080p and 32 vs. 57 fps @ 1440p.
If you can get a good price on that 980 and consider the boost worth it, go for it. But know that GW2 in particular won’t really improve your zerg performance by much.
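For a rough sense of scale, here is a quick sketch computing the relative gains from the review fps figures quoted above (the fps numbers are taken from that post; everything else is just arithmetic):

```python
# Relative gain of the reference 980 over the 960 Strix, using the
# review fps numbers quoted in the post above: (960 fps, 980 fps).
pairs = {
    "Tomb Raider 1080p": (63, 110),
    "Tomb Raider 1440p": (38, 67),
    "BF4 1080p": (51, 88),
    "BF4 1440p": (32, 57),
}
gains = {name: (hi / lo - 1) * 100 for name, (lo, hi) in pairs.items()}
for name, gain in gains.items():
    print(f"{name}: +{gain:.0f}%")  # roughly +73% to +78% across the board
```

So in GPU-bound titles the 980 buys about three quarters more fps, which makes the flat zerg performance in GW2 all the more telling.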
So when the fart is ArenaNet going to optimize WvW? That has been the main thing I wanted to play ever since the dang game came out. I still haven’t experienced it yet, LOL
Can you even play properly at 20 FPS?
I currently have an i7 and an R9 270, and I only get 30-40 fps in the Mists, so I know kitten well I can’t WvW with that GPU.
Lol, just go into EB, join a zerg and see how it performs. Put the character model limit to low (the other settings really won’t change much at low fps).
So when the fart is ArenaNet going to optimize WvW? That has been the main thing I wanted to play ever since the dang game came out. I still haven’t experienced it yet, LOL
Can you even play properly at 20 FPS?
I currently have an i7 and an R9 270, and I only get 30-40 fps in the Mists, so I know kitten well I can’t WvW with that GPU.
They can’t optimise it much further as long as the game remains on DX9.
Even though they said they would introduce DX11 within the first year after launch, here we are 3 years later, still stuck on DX9.
I used to be a PvE player like you, then I played Guild Wars 2
The game’s fps depends on the CPU, but it only uses 2 cores, so a GTX 980 won’t help you.
So when the fart is ArenaNet going to optimize WvW? That has been the main thing I wanted to play ever since the dang game came out. I still haven’t experienced it yet, LOL
ANet would have to rewrite the engine to support multithreading; currently the game mainly runs a single game loop.
Changing the renderer will not magically make the game multithreaded.
While DX11 and DX12 have better support for multithreading, that doesn’t mean the devs have to use them.
DX9 can also be multithreaded, by having multiple cores process the objects and then sending them to one thread to be rendered.
The main difference between DX9 and DX11 when it comes to multithreading is that DX11 supports multithreaded rendering.
Henge of Denravi Server
www.gw2time.com
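The DX9-era pattern described above (many cores preparing objects, one thread talking to the device) can be sketched roughly like this. It’s a toy illustration in Python, not ANet’s actual code: worker threads process objects in parallel and push results onto a queue, and a single consumer plays the role of the one render thread.

```python
# Toy sketch of "process on many cores, render on one thread":
# workers prepare draw data in parallel; one consumer drains the
# queue, since in DX9 only one thread may submit to the device.
import queue
import threading

work = queue.Queue()
draw_calls = []

def process_objects(objs):
    # worker: per-object processing can run on any core
    for o in objs:
        work.put(f"draw {o}")

objects = [f"obj{i}" for i in range(8)]
workers = [
    threading.Thread(target=process_objects, args=(objects[i::2],))
    for i in range(2)
]
for w in workers:
    w.start()
for w in workers:
    w.join()
work.put(None)  # sentinel: all workers are done

# single "render thread": the only consumer of the queue
while True:
    item = work.get()
    if item is None:
        break
    draw_calls.append(item)

print(len(draw_calls))  # all 8 objects reached the render stage
```

The point is that the parallelism lives in the object processing; the final submission stage stays strictly single-threaded, which is exactly what DX11’s multithreaded rendering relaxes.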
So when the fart is ArenaNet going to optimize WvW? That has been the main thing I wanted to play ever since the dang game came out. I still haven’t experienced it yet, LOL
ANet would have to rewrite the engine to support multithreading; currently the game mainly runs a single game loop.
Changing the renderer will not magically make the game multithreaded.
While DX11 and DX12 have better support for multithreading, that doesn’t mean the devs have to use them. DX9 can also be multithreaded, by having multiple cores process the objects and then sending them to one thread to be rendered.
The main difference between DX9 and DX11 when it comes to multithreading is that DX11 supports multithreaded rendering.
If they rewrote the engine, it would be like having to buy another game from them.
Wait, HoT is another game at a price… I’m confused now…
Gate of Madness
I’ve had a few cards in for this game. Anything stronger than a GTX 660 is wasted, unless you’re supporting multiple monitors. Better to buy a liquid cooler for your CPU and overclock.
Youtubes: https://www.youtube.com/playlist?list=PLpXd26ZeABJNWi83dXDjtoZ8Lf-4IJ9Gu
i5 4690K OC’d to 4.6 GHz and i7 5775C OC’d to 4.2 GHz, plus an R9 285. Playing at around 35-45 fps during heavy WvW blob fights.
You need fast single-core performance. Also, reducing some CPU-heavy settings will help: Reflections and Shadows disabled, Effect LOD enabled, and Character Model Limit on “Low” or “Medium” give a huge bump.
i7 5775c @ 4.1GHz – 12GB RAM @ 2400MHz – RX 480 @ 1390/2140MHz
(edited by Ansau.7326)
i7 6700K at 4.4 GHz, GTX 970. I needed a rebuild, including a new mobo and GPU, recently, so I figured why the hell not.
Only in the absolute blobbiest of blobs, where all three servers collide with map queues, do I really go below 35-40. The lowest I’ve gone without a hardware-side issue affecting CPU performance is around 23, with everything on high/highest except Effect LOD and Shadows. I saw very little performance gain from changing most other settings, since those rendering options are mostly CPU-based, and the code the game runs for those effects isn’t very demanding by modern high-end hardware standards, the way the newest PC ports of console titles often are.
As others mentioned, the bottleneck is the CPU, though. The engine just isn’t written to take advantage of multiple cores, and only so much performance can be gained going multi-core.
As far as ANet saying they would release the game on DX11: they said they “would try to.” I went back at one point and found the thread, and nowhere did they make promises. Actually, they even said it was unlikely, as none of their engineers were familiar with the platform. There was a client briefly, and it failed miserably, because parallel system design can be substantially slower than using just one core if it isn’t quite literally written perfectly.
We’ve pretty much capped out on clock speeds at this point due to heat issues with hardware. Only when something really revolutionary happens will we see big gains.
https://forum-en.gw2archive.eu/forum/professions/thief/ES-Suggestion-The-Deadeye-FORMAL/
(edited by DeceiverX.8361)
I’ve had a few cards in for this game. Anything stronger than a GTX 660 is wasted, unless you’re supporting multiple monitors. Better to buy a liquid cooler for your CPU and overclock.
This
Leto is kinda lowballing what counts as a waste for the GPU, but that’s the gist of it, and this is what you should actually do if upgrading for GW2:
Go here: https://www.cpubenchmark.net/singleThread.html
Get the best single-thread performing CPU you can, which fits the hardware you already have.
After that, tune the GPU and game graphics settings for what that allows.
Otherwise, you’re looking at building a whole new rig, and that gets you into similar decision making with very different outcomes. Right now, you’re either spending big or waiting to see more back-and-forth between the manufacturers.
If you’re buying new this year, you’re looking at boards with PCIe 3.0 slots, DDR4 RAM slots, and (if you aren’t an AMD loyalist) LGA 1151 CPU sockets.
On the GPU side, if you’re looking at Nvidia, you’re buying a GTX 980 or waiting for the next series to bump its price down, because 970s are just defective and kitten 980s, which, like 960s, will not carry you through a few years of new games.
RvR isn’t “endgame”, it’s the only game. Cu in CU.
The 970 will pretty much tackle any modern game. The whole “slower 500 MB of vRAM” thing is largely untrue at the technical level, and many optimizations have been made since release to make it pretty much a non-issue.
The 980 is a waste of money because you’ll be spending just as much for only marginally better performance at 1080p, and even your gains at 4K aren’t huge. 2016/2017 marks the use of HBM stacked chips for GPUs with Pascal’s architecture, which in early testing already outperform 4-way SLI’d Titans per card. The 1151 socket will be used for a while due to new mobo architecture and feature support that has already been released on it or is emerging.
You could get a 970 and be more inclined to upgrade in two years with big pushes in tech, or you could get a 980 for bragging rights for two years and then still have the same reason to upgrade in the same span of time, all while spending double. Instead, you could have gotten a straight-up better CPU, high-performance RAM that can take advantage of the new CPUs’ DDR4 benefits, and a better case with good cooling and USB 3.1/Type-C support.
https://forum-en.gw2archive.eu/forum/professions/thief/ES-Suggestion-The-Deadeye-FORMAL/
The 970 will pretty much tackle any modern game.
Of course it will, and that’s not the point. Looking years ahead is the point. That’s why I said what I said, the way I said it, which you aren’t addressing as I said it.
You’re making an argument that wasn’t available until you misconstrued what I said.
The whole “slower 500 MB of vRAM” thing is largely untrue at the technical level, and many optimizations have been made since release to make it pretty much a non-issue.
That’s absolutely false. The entire reason the 970 line exists is to let the manufacturer disable defective portions of a 980, as part of the lot/bin testing in the GTX 980 assembly lines’ QA process, and still sell it as a 970, because the device still works, but only to the 970 spec and with the changes that make it a 970 and not a 980.
It’s not just about the vRAM partitioning. The 970 is a weaker spec with a lower quality standard, and you aren’t “getting a great deal” with it. You’re getting what you paid for.
That doesn’t mean you shouldn’t buy it. Read again the thing you quoted and misconstrued.
RvR isn’t “endgame”, it’s the only game. Cu in CU.
I have an i7 4790K, so I should be okay with getting a 970.
Ah… so wondering if I should just go yolo and get the 980
The 980 is much better than the 960, but I don’t think it makes that much difference in GW2. The funniest part is that these new GPUs and CPUs are total kitten. Before, when you bought a new 200-300€ GPU you got a huge boost compared to your old one, but nowadays these budget models aren’t any faster than some 2-3 year old models.
I tried to buy a used GTX 760 or GTX 780 when my HD 7850 stopped working, but it’s kind of a risk to buy a 2-3 year old used GPU.
I’ve had the 970 and the 980; currently I use the 980. GW2 barely uses its potential, even at 1440p. I play at 1080p. The GPU load just fluctuates without ever really maxing out, and I use the reference card from Nvidia. I used to play on a GTX 760 192-bit OEM. It was a nice step up, yes, but in all honesty I didn’t see much difference in graphics quality over the 970 or 980. It’s just much quieter now.
I would recommend the 970 if you don’t play many other games, but the 980 doesn’t have the memory flaw, so if you play high-res games or games with far more complex graphics than GW2, then yeah… you’ll definitely want it. And for good future-proofing, a 980 or 980 Ti is a plus.
Note my system is an i7 4770S, 16 GB RAM, and a reference GTX 980. I’ve tested GW2, Aion, WoW, SWTOR, CS:GO, and a few single-player games with the GTX 760 OEM, GTX 970, ATI R7 270, and GTX 980. I like the 980 best.
(edited by KayCee.4653)
As mentioned above, GW2 is mostly CPU-bound beyond a 970 or equivalent. My old 670 rig runs about the same fps as my current OC’d 980 rig at the same graphics settings. Both machines sport i7 processors, but of different generations.
It is pretty disheartening to build a new machine that benches double the numbers in the Firestrike demo and see virtually zero improvement in GW2. Fortunately, pretty much every other game responds accordingly.
I have access to servers that sport some of the fastest small-scale processors (both Xeon and Extreme). The multi-CPU system had over a terabyte of RAM and some of the fastest SSD cards available. GW2 runs the same there as well. It seems GW2 is poor at multithreading. I would not be surprised if older dual-core CPUs run GW2 as well as modern processors.
“Youre lips are movin and youre complaining about something thats wingeing.”
(edited by Straegen.2938)
The problem was that GW2 was designed to let older PCs run it.
ANet did not even consider the rate at which technology advances, and then took longer than planned to make the game.
And so we end up with people owning more powerful computers than anticipated, while the game engine has become almost obsolete compared to what is out there now.
How many people don’t have a quad core with 8 GB of RAM these days?
And as for DX11 cards, they are as cheap as chips; it’s harder to buy a DX10 or DX9 card than a DX11 card.
And now we have DX12 and Vulkan just around the corner.
But then again, this is the WvW forum; do we honestly think we’ll all be here next year? How many of us are actually going to buy the next expansion after the HoT DLC?
I used to be a PvE player like you, then I played Guild Wars 2
All this geek talk is making me hot!
I have access to servers that sport some of the fastest small-scale processors (both Xeon and Extreme). The multi-CPU system had over a terabyte of RAM and some of the fastest SSD cards available. GW2 runs the same there as well. It seems GW2 is poor at multithreading. I would not be surprised if older dual-core CPUs run GW2 as well as modern processors.
It’s something I noticed too, and friends who run 8-core AMD chips notice it as well. Code-wise, the game really isn’t optimized for multithreading. Sorry, devs. I’ve done class projects that use 4, 6, and 8 threads on my processor, and you really notice the difference, and that was just graduate school. GW2 doesn’t seem designed to run more than about 3 threads simultaneously. Processor usage is often only at 30 to 45%; it should be 50%+ if it were using all the cores. And when you look at the way the in-game effects play out, it demonstrates how they aren’t really using the CPU efficiently or effectively. The 64-bit client does seem a bit better tuned than the original 32-bit one, though.
The graphics card and processor usage have me thinking… why isn’t the game any better graphically? Also, a friend of mine has 2 graphics cards in SLI, and I get the same performance, if not a tad better, with my single-card setup. And yeah, I have a small form factor case that only fits one graphics card, so I can’t test any dual- or triple-card setups.
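Those usage figures are consistent with the roughly-3-threads estimate: on a machine with 8 logical cores, 3 fully busy threads show up as 37.5% total CPU use, right in the 30-45% band mentioned above. A quick sanity check (core counts here are assumptions for illustration):

```python
# What total CPU utilization does a game using n busy threads show
# on a machine with c logical cores? (Idealized: threads fully busy.)
def utilization(busy_threads, logical_cores):
    return 100 * min(busy_threads, logical_cores) / logical_cores

print(utilization(3, 8))   # 37.5 -> matches the 30-45% observation
print(utilization(3, 4))   # 75.0 -> same game looks "busier" on a quad core
print(utilization(2, 8))   # 25.0
```

So a low overall utilization number on a many-core chip is itself evidence of a small fixed thread count rather than of light total load.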
We already asked ANet about this at launch: when would DX11 support be added?
They said they were looking at it and that it would be added within 6 months of launch, but again, 3 years later……
I used to be a PvE player like you, then I played Guild Wars 2
We already asked ANet about this at launch: when would DX11 support be added?
They said they were looking at it and that it would be added within 6 months of launch, but again, 3 years later……
It takes a redo of their game engine and that takes time.
It takes a redo of their game engine and that takes time.
It does not require a redo, only an optimization of the API usage. They could add some new routines, which might be intensive, but an initial move to DX11 with optimization is not a particularly large task for a gaming company.
I don’t think this would help much, though. Their main bottleneck on the client seems to be tied to a main thread that is bound by network traffic. I wouldn’t call this poor coding, since MMO systems are very complex, but it could certainly use a redesign/rewrite, which may indeed be a monumental task given their architecture.
“Youre lips are movin and youre complaining about something thats wingeing.”
It takes a redo of their game engine and that takes time.
It does not require a redo, only an optimization of the API usage. They could add some new routines, which might be intensive, but an initial move to DX11 with optimization is not a particularly large task for a gaming company.
I don’t think this would help much, though. Their main bottleneck on the client seems to be tied to a main thread that is bound by network traffic. I wouldn’t call this poor coding, since MMO systems are very complex, but it could certainly use a redesign/rewrite, which may indeed be a monumental task given their architecture.
The API is the least of this game’s troubles.
The insane number of calculations done in one or two threads is what is crushing performance.
i7 5775c @ 4.1GHz – 12GB RAM @ 2400MHz – RX 480 @ 1390/2140MHz
All this geek talk is making me hot!
Seems like you could use some liquid cooling. :^)
It takes a redo of their game engine and that takes time.
It does not require a redo, only an optimization of the API usage. They could add some new routines, which might be intensive, but an initial move to DX11 with optimization is not a particularly large task for a gaming company.
I don’t think this would help much, though. Their main bottleneck on the client seems to be tied to a main thread that is bound by network traffic. I wouldn’t call this poor coding, since MMO systems are very complex, but it could certainly use a redesign/rewrite, which may indeed be a monumental task given their architecture.
The API is the least of this game’s troubles.
The insane number of calculations done in one or two threads is what is crushing performance.
Same thing.
Straegen is talking about a theoretical API (theoretical because he can’t see the source code, but it’s a good assumption) inside Gw2.exe which abstracts the graphics work from the specific version of DirectX being used. That is the way a game engine would normally handle supporting multiple versions of DirectX and OpenGL.
Straegen is also talking about a main thread (loop), where most of the draw/render calls to that API would trace their roots to, which appears to block on network traffic. He means that there is a blocking input/output cycle in Gw2.exe’s main loop, and that cycle is getting stuck waiting for server data over the Internet.
I’d agree with him on both points, and add that there’s really no escape from the second point in online games that do real-time simulation. Even when the loop doesn’t block, you’re still going to get nasty surprises during above-average desync: things like skill-activation lag and player teleports, when either the server CPU chokes or your uplink is dropping packets.
RvR isn’t “endgame”, it’s the only game. Cu in CU.
(edited by Virtute.8251)
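The blocking-loop issue described above can be sketched in a few lines of Python (a hypothetical main loop, nothing from Gw2.exe): a loop that polls the socket with a timeout keeps ticking frames even when no server data arrives, whereas a bare blocking recv() would stall the whole frame until the server spoke.

```python
# Sketch of a non-blocking game loop: poll the network with a short
# timeout so the simulation/render tick continues when the link is quiet.
import select
import socket

server, client = socket.socketpair()  # stand-in for a game server link

frames = 0
for _ in range(5):
    # wait at most ~1 ms for network data instead of blocking forever
    ready, _, _ = select.select([client], [], [], 0.001)
    if ready:
        client.recv(4096)  # consume a server update if one arrived
    frames += 1            # the frame tick runs either way

server.close()
client.close()
print(frames)  # the loop ticked 5 times even though no data ever arrived
```

This removes the stall, but as noted above it doesn’t remove desync itself: with no fresh server data the client can only extrapolate, which is where rubber-banding and teleports come from.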
this game is so outdated it will run well with 3 year old hardware.
… Did you mean 13 year old hardware?
Youtubes: https://www.youtube.com/playlist?list=PLpXd26ZeABJNWi83dXDjtoZ8Lf-4IJ9Gu
… Did you mean 13 year old hardware?
Nah, people always forget to carry the zero, they meant 30 year old.
I used to be a PvE player like you, then I played Guild Wars 2
this game is so outdated it will run well with 3 year old hardware.
In fairness, 3-year-old systems are only marginally slower than today’s. A three-year-old mid-range system would likely be running an OC’d i5 and a GTX 770 or similar. Processors have gotten more energy-efficient but haven’t gotten much faster. Heck, the 980 came out in 2014 and is still a premier graphics card. Skylake and Broadwell are both marginal upgrades over Haswell.
“Youre lips are movin and youre complaining about something thats wingeing.”
2x Titan Black / i7 5930K stock. This is max settings + SweetFX overriding it to make the visuals look even better. I was recording to the same hard drive, but you can see the gameplay.
2x Titan Black / i7 5930K stock. This is max settings + SweetFX overriding it to make the visuals look even better. I was recording to the same hard drive, but you can see the gameplay.
Been playing the Aviators box I see…
Tacktical Killers [TK]
We’re looking for players.
PM me here or ING.
2x Titan Black / i7 5930K stock. This is max settings + SweetFX overriding it to make the visuals look even better. I was recording to the same hard drive, but you can see the gameplay.
Been playing the Aviators box I see…
It was when I first got the PC. I built it for animating, not for playing GW2. Honestly, there’s not much fps improvement over my old rig of 2x 580s on, idk, some i7 3 GHz quad-core chip, lol.
I have one older PC here with an AMD Phenom 965 that still runs the game great maxed out, and that’s with a single GTX 560 Ti.