i7 5775c @ 4.1GHz – 12GB RAM @ 2400MHz – RX 480 @ 1390/2140MHz
DirectX 11/12 request [merged]
i7 5775c @ 4.1GHz – 12GB RAM @ 2400MHz – RX 480 @ 1390/2140MHz
There are a lot of people who say “no matter which settings I pick I don’t see a difference”. That’s because higher graphic settings affect different things.
With the recent patch they added Ambient Occlusion in the game and, at least on my system, I don’t see any difference in FPS at all, although AO as a feature is usually resource intensive on other engines.
That’s probably because AO isn’t happening in the Main Thread but on a separate one. It splits properly and then the GPU takes over with minimal impact on performance. And since the Main Thread is always the bottleneck and what is causing FPS losses, AO doesn’t cost me frames because the extra cores and my GPU work on it and not the main Core (Which is already taxed too much).
This is another simple way to understand why DX12 would have a minimal impact on GW2 performance.
I think they should work on a DX12 and 64-bit client, because it would improve GW2 and could also be the groundwork for GW3.
So you are saying GW2 is multithreaded, but ANet dumps most of the work into one single thread, so the load is never balanced?
That reminds me, the GW2 engine was based on a modified GW1 engine, which is really old. Not surprising if it doesn't take advantage of multi-core architectures.
wow, so gw2 is super badly optimized eh…
No the game has two primary threads, one renders, the other does nearly everything else. There are additional threads that are more helper threads for the other two but it’s not a scaling renderer.
RIP City of Heroes
AO can be implemented as a post process pass, same with the dynamic lighting simulating our eyes adjusting to bright/dark.
RIP City of Heroes
ah, so thats why amd processors perform so badly…
Henge of Denravi Server
www.gw2time.com
dx12 pls anet; my $3000 PC only gets 20 fps
etherealguardians.com
It's just amazing how most of the posts here are made-up stats and lies. So many lies. For people who hang around a game forum, you sure know a lot about DirectX and how GW2 is made. Amazing!
And by amazing I mean sad. Very, very sad.
It would be amazing if they could add DX12 support sometime after it releases. Better graphics do have an impact on the fun factor, and for those who love GW2 it would be like playing a new MMORPG, only one that is already their favorite kind.
I used to be a power ranger, now not sure anymore
It would be amazing if they could add DX12 support sometime after it releases. Better graphics do have an impact on the fun factor, and for those who love GW2 it would be like playing a new MMORPG, only one that is already their favorite kind.
It's technically been out for a few days now.
Why is it that when someone asks about DirectX 10, 11, 12, Mantle, etc., the common answer is "most people still use DirectX 9", when that is a terrible lie?
!http://i.imgur.com/Us1knr4.png!
Sure, there might be a certain bias in the Steam surveys, but kitten, DirectX 9-only users are less than 1%. And Steam users might not represent the GW2 userbase perfectly, but… it's 1%. 1%! It's a widely known fact that the vast majority of GW2 users have access to DX10, 11 and now 12, so why do we pretend most of the userbase still runs the game on DX9?
I hope with the money they get from the constant influx of gems and now the expansion they can bother updating the client to a modern api.
Could we just get an official answer regarding this topic? If we get an official answer like “there will never ever be a client version beyond directx9” then maybe people would stop asking for it.
ermm…….
:)
You know it's a lie when people say most users are on DX9; they just have no idea what they are talking about.
Now, the thing is, DX12 isn't supported by all cards. Even if they do implement a newer DX, it will likely be DX11, with DX12 as a bonus.
Still, that doesn't fix the performance issues stemming from how the game is coded.
Henge of Denravi Server
www.gw2time.com
DirectX 11 is all I want; I'm not even asking for DX12…
ANet has its own database of the computer specs of every single player who plays GW1 & GW2, so they know exactly how many people have DX9, 10, 11, and 12 cards.
“ANet. They never miss an opportunity to miss an opportunity to not mess up.”
Mod “Posts created to cause unrest with unfounded claims are not allowed” lmao
(edited by JediYoda.1275)
People aren't saying that. People are saying that the game targeted 32-bit XP as the minimum OS requirement. Since 32-bit apps run on a 64-bit OS and Dx9 is supported on all newer versions of Windows, that's why.
Also China was still over 68% XP in 2012. Worldwide in 2012 XP was 2nd at 30% while Win 7 was at 50% and in 2011, XP was still number 1.
When you develop code you make technological decisions based on what your minimum configuration was at the time you started. Dx9 was also a very mature API, while Dx10 was new and different, and most early Dx10 titles added prettier visuals but sacrificed the almighty FPS, so it wasn't an encouraging basis for a forward-looking design. In 2007, when quad cores were still premium hardware, they built their client on the classic dual-thread design that was the common approach at the time, and an upgrade over what they had in GW.
This was a product of its time, and the decisions made then made sense. Nobody wanted a Duke Nukem Forever-style continuous restart every time a better engine came out. So you build it, plant a stake in the ground, and then have all the assets developed and tested on that client rather than descend into feature-creep hell.
RIP City of Heroes
If Blizzard, and many other companies, can upgrade their game engines to use DX11 then there’s no reason why ANet can’t either. I’d also like them to fix it so the game isn’t so CPU dependent too and to take advantage of the power of our GPUs.
If Blizzard, and many other companies, can upgrade their game engines to use DX11 then there’s no reason why ANet can’t either. I’d also like them to fix it so the game isn’t so CPU dependent too and to take advantage of the power of our GPUs.
Of course ANet can do it, as long as everyone gives them $15 a month whether they are still playing or not.
Blizzard had annual income of a billion a year for a few years. This game is around 80 million a year. There is a lot of resources you can devote if you are earning 10x as much as this game is annually.
Turbine did their upgrade, on an engine used in multiple titles, while those titles were still subscription based.
RIP City of Heroes
The thing I want to know is: why do people want DirectX 10, 11 or 12? I am not being sarcastic; I genuinely would like to know your reasons. There seem to be a lot of people who talk about DirectX without really understanding what it is and how it affects a game engine and the development of a game.
If Blizzard, and many other companies, can upgrade their game engines to use DX11 then there’s no reason why ANet can’t either. I’d also like them to fix it so the game isn’t so CPU dependent too and to take advantage of the power of our GPUs.
Of course ANet can do it, as long as everyone gives them $15 a month whether they are still playing or not.
ANet updated GW1 from DX8 to DX9 and they didn't charge $15 a month, so your point is voided!
“ANet. They never miss an opportunity to miss an opportunity to not mess up.”
Mod “Posts created to cause unrest with unfounded claims are not allowed” lmao
ANet updated GW1 from DX8 to DX9 and they didn't charge $15 a month, so your point is voided!
The difference between DX8 and DX9 was muuuuch smaller than the difference between DX9 and DX10 (not to mention DX11).
Krall Peterson – Warrior
Piken Square
The answer to all “please add DX11/12 threads” is always the same:
It’s NOT worth the effort. It’s the unfortunate truth that changing to another DX version, even DX12 which promises performance improvements won’t do anything for GW2. Not at ALL.
Why is this you may ask? Because DX12 will provide better options to split the load on the render thread of Guild Wars 2, that’s what DX12 is capable of. However, the bottleneck of the game is the Main thread, which won’t be affected AT ALL by DX11/DX12.
So what the devs are doing is trying to fix that. It's not easy; it's not as simple as changing the renderer. It's the most complex thing they could do in their game engine, close to the point where building a new engine from scratch would be easier than changing this. When they do fix it we will see massive improvements, much more important and useful for everyone than updating to another DX version.
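The claim about the main-thread bottleneck is easy to check with arithmetic. In a two-thread design where the main thread and the render thread overlap each frame, the frame time is set by the slower of the two. A toy model (all numbers invented purely for illustration):

```python
# Toy model of a two-thread frame loop: the simulation ("main") thread and the
# render thread run in parallel, so frame time is governed by the slower one.

def frame_time_ms(main_ms, render_ms):
    """Frame time when the two threads overlap fully each frame."""
    return max(main_ms, render_ms)

def fps(main_ms, render_ms):
    return 1000.0 / frame_time_ms(main_ms, render_ms)

# Say the main thread needs 40 ms in a big fight and the render thread 15 ms.
before = fps(40.0, 15.0)        # main thread is the bottleneck: 25.0 FPS
after  = fps(40.0, 15.0 / 2.0)  # a faster API halves only the render cost
# after is still 25.0 FPS: halving the render work changed nothing, because
# the main thread's 40 ms sets the frame time either way.
```

Only once the main thread's own cost drops below the render thread's does the renderer start to matter at all.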
ANet updated GW1 from DX8 to DX9 and they didn't charge $15 a month, so your point is voided!
The difference between DX8 and DX9 was muuuuch smaller than the difference between DX9 and DX10 (not to mention DX11).
That’s not the point here and congrats for missing it
“ANet. They never miss an opportunity to miss an opportunity to not mess up.”
Mod “Posts created to cause unrest with unfounded claims are not allowed” lmao
That’s not the point here and congrats for missing it
It is very much the point though.
Something that takes much less effort can be done for much less money and resources.
Krall Peterson – Warrior
Piken Square
That’s not the point here and congrats for missing it
It is very much the point though.
Something that takes much less effort can be done for much less money and resources.
nice try and bye
“ANet. They never miss an opportunity to miss an opportunity to not mess up.”
Mod “Posts created to cause unrest with unfounded claims are not allowed” lmao
A lot of the differences between Dx9 and later versions are due to advances in GPU design and how hardware features feed back into changes in the API: more memory, texture compression, significantly more complex shaders, and the move from a very generic API that evolved to catch up with the features and structure of OpenGL to one, with Dx12, that attempts to streamline things so the abstraction of features and internal data formats is all but non-existent. You can thank the console hardware development wars and the Mantle API for some of this.
Remember, Dx9 runs on old GPUs with fixed-function, pixel and vertex shader pipelines as part of its design, rather than the massive arrays of simple FPUs that Dx10-and-beyond GPUs are built around. There is a reason supercomputers are being built with high-end graphics cards nowadays.
But there is a downside to this raw power. It means that a significant portion of the driver is in reality a compiler that converts shaders into the code needed to run on these massive arrays of FPUs, and, well, those compilers aren't as good as they could be. That is why many drivers, and now game optimizers from the GPU manufacturers, essentially provide hand-optimized versions of those shaders and bypass the less efficient shader compilers for specific games. Games that are frequently used as benchmarks get special attention; benchmarks sell cards. And even if a game doesn't get personal attention, one manufacturer may suggest a series of rendering steps that is less efficient on their competitor's product.
This means the two major manufacturers don't really need to make the best driver for the published API; they can tweak consumer perception by playing favorites with benchmark titles rather than delivering a consistently good experience across all games. But that's my soapbox. I think one or both teams have commented that ArenaNet wouldn't play ball with them to get $pecial attention.
So what Dx10/11/12 could provide are shaders that improve lighting, shadows, textures and other effects that are difficult and/or slow within the limitations of Dx9.
A nice read on a modern rendering path, explained (sort of) for the layperson, is this piece on the Dx11 render path in Deus Ex: Human Revolution. It's fascinating and highlights all the steps taken to generate all those effects.
http://www.adriancourreges.com/blog/2015/03/10/deus-ex-human-revolution-graphics-study/
RIP City of Heroes
GW2 should have shipped with DX11, 64-bit and multi-GPU support from launch. DX12 should be part of the Xpac, as well as at least a 2K mode. Who knows, maybe that's why the price tag on the Xpac was kinda steep, so they can include these wonderful things.
nice try and bye
Such compelling arguments.
Krall Peterson – Warrior
Piken Square
Why is my game using DirectX 9? I wonder if this is why so many people are getting low FPS. I was looking at the modules GW2 loads and it's using d3d9.dll. That can't be normal.
Yeah it’s a darn shame.
Would be really nice to have dx11 support but nope. Same game engine as GW1 apparently.
(edited by Purple Miku.7032)
Not quite. GW’s engine was Dx8 and didn’t need two cores.
As to why Dx9? Look at the minimum requirements. Windows XP.
RIP City of Heroes
Just out of curiosity, since I’m not very computer literate. How much FPS increase can we expect by switching to a higher version? As it stands, I get a stable 70 FPS in the open world, 20-30 in zergs like Tequatl, a surprising 40 or so in WvW zergs, and again 70 or so in sPvP all at max settings with supersample.
Just wondering mainly because I suspect HoT will be a little more demanding at launch since there will be tons of events, enemies and of course players with their particle effects everywhere. Hoping to at least keep what I have now as far as FPS goes.
Just out of curiosity, since I’m not very computer literate. How much FPS increase can we expect by switching to a higher version? As it stands, I get a stable 70 FPS in the open world, 20-30 in zergs like Tequatl, a surprising 40 or so in WvW zergs, and again 70 or so in sPvP all at max settings with supersample.
Just wondering mainly because I suspect HoT will be a little more demanding at launch since there will be tons of events, enemies and of course players with their particle effects everywhere. Hoping to at least keep what I have now as far as FPS goes.
That’s the thing, the assumption is newer is faster and that’s not necessarily the case if you do a straight up replacement.
What the newer versions allow are the same graphical effects done more efficiently IF you code them with that in mind, as well as larger texture sizes and other raised limits. But Dx11's biggest gain comes from allowing the code that calls the Dx driver to run in multiple threads without issue; this can't be done safely in Dx9 or 10. Dx12 takes it one step further and allows substantial portions of the driver itself to run in each of those threads, and that's where the biggest gains are made.
But to do that the rendering code in GW2 needs to be restructured to allow it to run in multiple threads for Dx11 and higher. Otherwise the gains will not be anywhere close to the numbers people have casually tossed around from a magazine article they read.
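To illustrate the shape of that restructuring, here is a conceptual sketch in plain Python. This is not real Direct3D code; the threads and "command lists" merely stand in for what Dx11 deferred contexts and Dx12 command lists make possible, where under Dx9 all recording would have to happen on one thread:

```python
# Conceptual sketch (NOT real D3D): worker threads each record their own
# independent command list, and submission stays ordered and single-threaded,
# as it does on the real APIs.
from concurrent.futures import ThreadPoolExecutor

def record_commands(batch_id, draw_calls):
    # Each worker builds its own list; no shared state is touched while recording.
    return [f"draw(batch={batch_id}, call={i})" for i in range(draw_calls)]

def render_frame(batches):
    with ThreadPoolExecutor() as pool:
        # Recording happens in parallel; map() preserves batch order.
        lists = pool.map(lambda b: record_commands(*b), batches)
    submitted = []
    for cmd_list in lists:      # ordered, single-threaded submission
        submitted.extend(cmd_list)
    return submitted

frame = render_frame([(0, 2), (1, 3), (2, 1)])  # three batches of draw calls
```

The point is that a renderer written as one long loop of draw calls cannot take advantage of this; the engine first has to be split into independent batches, and that restructuring is the real work.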
RIP City of Heroes
Renderer code can easily be rewritten. In my line of work, "easily" means less than 6 months. As to why they are not doing it, I have no clue.
A renderer is part of the game engine (if you are a car guy, think of it as your camshaft). It can be replaced.
A renderer code can be easily rewritten. In my line of work easily means less than 6 months. As to why they are not doing it, I have no clue.
A renderer is part of the game engine (i guess if you are a car guy, that would be your camshaft). It can be replaced.
It’s likely because they don’t want to prioritise any longterm projects aside from the expansion right now. They will be in deep kitten if they goof it up and it’s definitely the most important thing for them to focus on, but still I don’t like how this has to be an excuse for nothing getting done… which sadly seems to be the case for many things the past year (or several). :P
GW2 is DX9 because it’s an MMO.
MMOs want to attract as many customers as possible and designing for the lowest common denominator is part of this. It would be nice if we had a DX11 option, but it’s perfectly normal for GW2 to default to DX9 because ANet doesn’t want to alienate people with older computers.
It’s a very old engine. I think it was developed from the GW1 engine? That’s oooold. Be happy it’s using Dx9 at all.
At this point DX9 is nothing but bottlenecking the game. I can run games like GTAV and Dying Light at High to Ultra and get 60+ FPS. Yet, GW2 drops down to 30 at times in pretty much every new area like LA, DT and SW. Sure, you have to account for server side performance lags in games like this but overall it just runs like crap.
By now most people who play GW2 have at least Win7 and a DX11 GPU except for some plebs who play on a potato laptop running XP.
If they want to keep the game running for a few more years they really have to consider upgrading their engine, like WoW did successfully. Considering ANets extremely slow development speed though… I wouldn’t count on it.
Could give support for x64 to juice out more stable FPS.
Not like it’s a big deal to make it happen anyway.
Anet gave birth to Gw2 – Anet killed Gw2.
Murican law 2015.
A renderer code can be easily rewritten. In my line of work easily means less than 6 months. As to why they are not doing it, I have no clue.
A renderer is part of the game engine (i guess if you are a car guy, that would be your camshaft). It can be replaced.
As explained multiple times in this thread alone, new renderer code, be it DX11 or DX12, wouldn't have a big impact on performance. That's because the main issue with GW2 is not the render thread; it's elsewhere, and that's what the devs are working on: making the main thread of the game split its load across multiple cores. That would have a great impact on performance; a better renderer won't. Once the main thread is better optimized, and we get the massive benefits of that, they can work on DX11 or DX12 or anything else like that.
Could give support for x64 to juice out more stable FPS.
Not like it’s a big deal to make it happen anyway.
x64 support would be amazing for big events and help with FPS much more than DX12. I’m not sure what their reasoning is behind keeping the game 32bit only, a lot of games moved to 64bit ages ago.
A 64bit game has more RAM available, which means memory fragmentation is less of an issue.
What's more memory going to solve? Probably nothing. It's the game's main thread that's bogging it down, and it's completely CPU bound. The only thing that will help is optimising that. x64 or DX12 will do nothing.
Have you ever played the same game on both a 32-bit and a 64-bit client, Mr. Mitsubishi?
Anet gave birth to Gw2 – Anet killed Gw2.
Murican law 2015.
If Blizzard, and many other companies, can upgrade their game engines to use DX11 then there’s no reason why ANet can’t either. I’d also like them to fix it so the game isn’t so CPU dependent too and to take advantage of the power of our GPUs.
If Donald Trump can buy a Ferrari and a yacht, then YOU can also do that.
That's more or less what you said here .. and of course a lot of other people said it too.
Why do you not just buy ANet and then let them make the DX12 engine .. because if rich people can do it, the poor can also do it.
Best MMOs are the ones that never make it. Therefore Stargate Online wins.
If Blizzard, and many other companies, can upgrade their game engines to use DX11 then there’s no reason why ANet can’t either. I’d also like them to fix it so the game isn’t so CPU dependent too and to take advantage of the power of our GPUs.
If Donald Trump can buy a Ferrari and a Yacht then YOU can also do that.
Thats more or less what you said here .. and of course a lot of other people also.
Why do you not just buy ANet and then let them make the DX-12 engine .. because
if rich people can do it .. poor can also do that.
It's cheaper to make a Dx12 renderer than to buy a Ferrari or a yacht. Fact.
If Blizzard, and many other companies, can upgrade their game engines to use DX11 then there’s no reason why ANet can’t either. I’d also like them to fix it so the game isn’t so CPU dependent too and to take advantage of the power of our GPUs.
If Donald Trump can buy a Ferrari and a Yacht then YOU can also do that.
Thats more or less what you said here .. and of course a lot of other people also.
Why do you not just buy ANet and then let them make the DX-12 engine .. because
if rich people can do it .. poor can also do that.
It's cheaper to make a Dx12 renderer than to buy a Ferrari or a yacht. Fact.
Did anyone else hear a WHOOSH!!?
RIP City of Heroes
If Blizzard, and many other companies, can upgrade their game engines to use DX11 then there’s no reason why ANet can’t either. I’d also like them to fix it so the game isn’t so CPU dependent too and to take advantage of the power of our GPUs.
If Donald Trump can buy a Ferrari and a Yacht then YOU can also do that.
Thats more or less what you said here .. and of course a lot of other people also.
Why do you not just buy ANet and then let them make the DX-12 engine .. because
if rich people can do it .. poor can also do that.
It's cheaper to make a Dx12 renderer than to buy a Ferrari or a yacht. Fact.
LOL Ravenmoon. I’ve agreed with your thinking this whole entire thread. I read over what you said on the first handful of pages but then I got to page like 7 and couldn’t keep reading anymore lol.
I hope Anet do become a little transparent about a DX engine upgrade.
I may be a unique case, but I went away from Alienware and now run a Mac Pro 2013 (the trash can; why? I'm a video editor) with maxed-out specs and Windows 8.1 on Boot Camp. I would love to see better hardware utilization even on my system, even though I get similar FPS to all of you hardcore PC users.
Please send Anet your CV and offer to help them. lol.
The thing I want to know is, why do people want directx 10, 11 or 12?, I am not being sarcastic, I genuinely would like to know, what are your reasons for it. There seems to be a lot of people that talk about directx without really understanding what it is and how it affects a game engine and the development of a game.
Performance increase. While most games are GPU heavy, GW2 is CPU heavy. My 2500K has trouble reaching high frame rates, and then the GPU doesn't matter much (whether I use a 460 or a 970).
Making use of DirectX 12, and of multiple cores (partly what DirectX 12 can help with, as it not only reduces the cost of calls going to the GPU but can also spread them over multiple cores), could mean a big increase in performance. The other way of getting better performance would be overclocking the CPU and/or buying a newer, more expensive CPU, which would then only be useful for a few games, because, as said, most games are GPU heavy. That makes it a pretty heavy investment for what you get, and when buying a new CPU you will often also have to buy a new motherboard. Not to mention the power consumption when overclocking.
So for customers, GW2 updating to DirectX 12 (in combination with a few other improvements) would be the best way to get a performance increase.
Now, some people say that GW2 is an MMORPG so it's fine to have lower FPS, but also think about using a VR headset: there you would need 90 FPS for the best result (on the current generation), or 75 FPS minimum. Speaking of which, the system requirements the Oculus Rift gives are a GTX 970 and an i5-4590. If I put a 970 in my system it would be close to that: the 2500K performs just a little under the 4590, and a small OC on the 2500K should put the performance at about the Rift's requirements, a system that is considered "heavy". Yet performance in GW2 would still be pretty bad.
Like somebody said very well on reddit about this. “The problem won’t be that game will look lets say poor in two years, the real problem it will look poor, and it will run just as crappy as it does now, while games looking 4x times better will run at least 2x times better than GW2.”
Whats more memory going to solve? Probably nothing. Its the games main thread thats bogging it down and its completely CPU bound. The only thing that will help is optimising this. x64 or DX12 will do nothing.
That depends. If the data the processor mostly works with is bigger than 32 bits (say they do a lot with big ints), a 64-bit processor running a 64-bit client can handle it in one step, while a 32-bit client has to cut the data into two parts, which takes longer. So it could increase performance, but that is not guaranteed.
Basically, the following code would theoretically run faster on a 64-bit client than on a 32-bit client.
int64 a = 5;
int64 b = 5;
int64 c = a * b;
But, the following code would not.
int a = 5;
int b = 5;
int c = a * b;
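To make the "cut the data into two parts" point concrete, here is a hypothetical sketch, written in Python for readability, of what a compiler effectively has to generate for a 64-bit multiply on a 32-bit target: split each operand into 32-bit halves and combine several partial products, where a 64-bit CPU does the whole job with one multiply instruction.

```python
# Low 64 bits of a 64-bit multiply, computed only with 32-bit-sized pieces,
# roughly what 32-bit code must do where 64-bit code uses one instruction.
MASK32 = (1 << 32) - 1
MASK64 = (1 << 64) - 1

def mul64_on_32bit(a, b):
    a_lo, a_hi = a & MASK32, (a >> 32) & MASK32
    b_lo, b_hi = b & MASK32, (b >> 32) & MASK32
    # Three partial products instead of one native multiply
    # (a_hi * b_hi only affects bits above 64, so it is dropped).
    lo = a_lo * b_lo
    mid = (a_lo * b_hi + a_hi * b_lo) << 32
    return (lo + mid) & MASK64

assert mul64_on_32bit(5, 5) == 25  # matches the int64 example above
```

Whether this ever shows up on a profile depends entirely on how much genuinely 64-bit arithmetic the game does, which is the "not guaranteed" caveat above.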
The thing I want to know is, why do people want directx 10, 11 or 12?, I am not being sarcastic, I genuinely would like to know, what are your reasons for it. There seems to be a lot of people that talk about directx without really understanding what it is and how it affects a game engine and the development of a game.
Performance increase. While most games are GPU heavy, GW2 is CPU heavy. My 2500k has trouble getting high frame-rate and then the GPU does not matter much (using a 460 or a 970).
Making use of DirectX 12 and better use of the multiple cores (partly what DirectX 12 can help with as it does not only reduces the calls going to the GPU, but can also split them over multiple cores) could mean a big increase in performance. The other way of getting better performance would be OCing the CPU or / and buying a newer / more expensive CPU while that would then only be useful for a few games, because as said, most games are GPU heavy. Meaning it would be a pretty heavy investment compared for what you get for it, and when buying a new GPU you will also have to buy a new motherboard. Not to mention the power-consumption when OCing the CPU.
So for customers, GW2 updating to DirectX 12 (in combination with a few other improvements) would be the best way to get a performance increase.
Now some people say that GW2 is a MMORPG so it’s fine to have lower FPS, but also think about using a VR set. Then you would need 90FPS for best result (on the current generation) or minimum 75FPS. Talking about that, when looking at the system requirements OR gives, it’s a GTX970 and an i5-4590. If I would put a 970 in my system would be close to that, the 2500k just performers a little under the 4590, a small OC on the 2500k, should put the performance on about the requirements for the OR, a system that is considered ‘heavy’, however performance in GW2 would still be pretty bad.
Like somebody said very well on reddit about this. “The problem won’t be that game will look lets say poor in two years, the real problem it will look poor, and it will run just as crappy as it does now, while games looking 4x times better will run at least 2x times better than GW2.”
The only problem is that a newer API wouldn't bring a big performance improvement, as some of us have said here and as the devs have confirmed.
If people knew how DX12 truly works, what it does and does not do, and how little impact it would have on GW2, this thread wouldn't exist.
But people prefer to read something and assume the world from it…
i7 5775c @ 4.1GHz – 12GB RAM @ 2400MHz – RX 480 @ 1390/2140MHz
Whats more memory going to solve? Probably nothing. Its the games main thread thats bogging it down and its completely CPU bound. The only thing that will help is optimising this. x64 or DX12 will do nothing.
That depends, if the data the processor works with mostly is bigger than 32 bit (let’s say they do a lot with big ints) a 64bit processor (when using a 64bit client) can do this in one step, while a 32bit processor / 32bit client will need to cut the data into two parts so it takes longer. So it could increase performance, but that is not guaranteed.
Basically, the following code would theoretically run faster on a 64bit client then on a 32bit client.
int64 a = 5;
int64 b = 5;
int64 c = a * b;
But, the following code would not.
int a = 5;
int b = 5;
int c = a * b;
The number one reason a 64bit client would be better than a 32bit client is due to Memory Fragmentation.
From https://en.wikipedia.org/wiki/Fragmentation_:
The most severe problem caused by fragmentation is causing a process or system to fail, due to premature resource exhaustion: if a contiguous allocation is needed and cannot be satisfied, failure occurs. Fragmentation causes this to occur even if there is enough of the resource, but not a continuous amount. For example, if a computer has 4 GiB of memory and 2 GiB are free, but the memory is fragmented in an alternating sequence of 1 MiB used, 1 MiB free, then a request for 1 contiguous GiB of memory cannot be satisfied even though 2 GiB total are free.
That’s why it is advised to RESTART your computer before joining TT, Tequatl, guild missions or other big world events to free up your memory blocks. Otherwise you risk memory crashes. That’s the number one reason for losing your Tequatl loot.
A 64-bit client has access to a lot more address space to allocate from, which means memory fragmentation is much less of an issue. If there are no large enough free blocks in the first 4 GB, the system can give the application more address space to work with. A 32-bit client cannot do that, because it can only address 4 GB.
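The worst case from the Wikipedia quote above is easy to simulate. This sketch models a 4 GiB address space as 1 MiB blocks with used and free blocks alternating: half the memory is free, yet no 1 GiB contiguous run exists, so the large allocation fails anyway.

```python
# Simulation of the fragmentation example: 4096 one-MiB blocks where
# even-numbered blocks are used and odd-numbered blocks are free.
total_mib = 4096
free = [i % 2 == 1 for i in range(total_mib)]

def largest_free_run(free_map):
    """Length (in MiB) of the longest contiguous run of free blocks."""
    best = run = 0
    for is_free in free_map:
        run = run + 1 if is_free else 0
        best = max(best, run)
    return best

total_free = sum(free)            # 2048 MiB free in total
largest = largest_free_run(free)  # but the largest contiguous run is 1 MiB

can_allocate_1gib = largest >= 1024  # False: the request fails regardless
```

A real 32-bit client rarely degrades to this exact pattern, but the mechanism is the same: after hours of allocating and freeing, the free memory is scattered, and it is the largest hole, not the total free amount, that decides whether a big allocation succeeds.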