DirectX 11/12 request [merged]
Since launch people have complained about how their graphics card hasn’t been utilised properly, and that GW2 is more reliant on your CPU than your GPU.
At launch the devs said they would implement DX11 in the near future, but we still don’t have it, and from the sounds of it HoT will still use DX9.
Will we ever get DX11? I believe GW2 is being severely limited graphically because of this.
Yes, I know they chose DX9 because it was what most cards supported when development first began. But it’s 2015 now, and a DX11 card is cheap as chips! Shouldn’t DX11 support be something ANet should be looking into?
Maybe they are waiting for DX12 to be released but I wouldn’t hold my breath.
Currently playing Heart of Thorns.
They might be looking into it. But keep in mind that the engine the game is built on is over 10 years old (it is a modified version of the GW1 engine), and the changes made to it for GW2 were done about 7 years ago or so.
Making it support DirectX 11/12/whatever will require a massive amount of work and resources, something that could be rather hard to justify when they don’t have a steady stream of guaranteed income every month.
Krall Peterson – Warrior
Piken Square
Better start introducing those moa backpacks in the gemstore then!
We’ll probably not see an engine update up until the next iteration of the franchise. That’s a lot of work to do for something that probably won’t increase profits by much.
https://forum-en.gw2archive.eu/forum/professions/thief/ES-Suggestion-The-Deadeye-FORMAL/
We’ll probably not see an engine update up until the next iteration of the franchise. That’s a lot of work to do for something that probably won’t increase profits by much.
Increase profits? Probably not, but a healthy, modern rendering pipeline is important for keeping people around, which is where most of ANet’s focus actually is. The living story system, daily rewards, etc., are all focused on keeping people around so that they’ll be exposed to reasons to buy gems.
If you think being stuck on DX9 won’t hurt the game over the years, well… you probably never played Final Fantasy XI, which was in much the same situation, except it was stuck on DX8 instead of DX9. To this day, I’m not sure if you can get a solid 60fps out of it, but then I wouldn’t know, since I gave up a long, long time ago.
“This game is old, so why does it run like kitten?” is never a good user experience, whether for keeping old players or acquiring new ones.
Oh, and for reference, I have a GTX 970, Core i5-3550 and 24GB of RAM. Sure, my CPU is a bit of a weak point, but I drop to 45fps just running around the dead areas of Lion’s Arch with no one in sight. There’s no excuse for that in a genre that grows based on its wide accessibility. It’s a problem.
Honestly, it’s the combination of the lack of a 64-bit client with a client that can now go over 4GB of memory that’s simply atrocious at this stage.
Either limit it back to where the client can’t go over 4GB of memory, or give us a 64-bit client.
Crashing on huge raid bosses is simply terrible.
Honestly, it’s the combination of the lack of a 64-bit client with a client that can now go over 4GB of memory that’s simply atrocious at this stage.
Either limit it back to where the client can’t go over 4GB of memory, or give us a 64-bit client.
Crashing on huge raid bosses is simply terrible.
?? I know the limitations you are referring to, but I’ve only ever had GW2 crash because of actual physical problems with my RAM, and I’ve never seen the memory usage get that high. Then again, I don’t recall the exact numbers I’ve seen, so it could be that I just forgot.
If you add some additional memory (so OTHER Windows things can run without taking up the ~3.5GB the client is limited to), you won’t crash.
I have 16GB and have NEVER crashed due to graphics or overload issues.
DX11 for this game is a pipe dream. A 64-bit client is possible, but it would require a different GW2.DAT file to actually get the most out of it. All of this is a LOT of dev resources and gains them NO sales at all.
Try going to your boss and telling them you have a 300 man-hour project you would like to do that would gain no measurable revenue, and see what they say (assuming a private-sector, for-profit business).
Fate is just the weight of circumstances
That’s the way that lady luck dances
It’s not as simple as “doesn’t generate immediate sales”.
Bringing the client up to date with more modern tech (though even DX11 is putting on quite a few years already) increases the longevity of your game, and it could open up new possibilities for content by removing performance limitations that currently prevent it.
300 man-hours is not that bad. That would mean it takes one person about two months. For a multimillion-dollar operation, two months’ wages for one programmer is a tiny investment.
We simply do not know how much effort it is in the first place, so it’s pointless to speculate.
Would be nice to hear ArenaNet’s stance on this, if only to put these speculative discussions to rest once and for all.
I don’t actually think it increases the longevity of the game though. Aren’t all these DX versions backwards compatible? The only impact I can see is a real development one, where using old standards and methods starts getting more costly than current tech. I’ve seen (and still see) an example where upgrading the DX on a game has pretty much permanently halted game development (looking at you, Funcom!), but I don’t think DX11 is so antiquated that we should be concerned.
I don’t think they will upgrade to a superior DirectX version before Guild Wars 3.
My numbers were made up to make a point. NOBODY is going to allow anyone to invest significant resources without 2 things:
1) An insurmountable problem that MUST be fixed.
2) A significant return in potential revenue.
Neither of those apply here. The game as it is is NOT suffering for lack of DX11 support (it is NOT a MUST for the game to continue).
The other side of this is supporting multiple versions, since they certainly CAN NOT discontinue support for the DX9 version. Despite what most players seem to think, that would actually INCREASE their recurring costs as well (having to test any change at least twice… having to do EVERYTHING twice).
Not gonna happen.
My numbers were made up to make a point. NOBODY is going to allow anyone to invest significant resources without 2 things:
1) An insurmountable problem that MUST be fixed.
2) A significant return in potential revenue.
Neither of those apply here. The game as it is is NOT suffering for lack of DX11 support (it is NOT a MUST for the game to continue).
The other side of this is supporting multiple versions, since they certainly CAN NOT discontinue support for the DX9 version. Despite what most players seem to think, that would actually INCREASE their recurring costs as well (having to test any change at least twice… having to do EVERYTHING twice).
Not gonna happen.
True in one sense. It comes down to quality of game play, with Guild Wars 2 having both client-side and server-side limitations. Frankly, they are about at the limit of what they can get performance-wise from the client, and PvE, WvW (and possibly some content in HoT) show this. I realize that some of the issues with WvW are server-side, but some are not. When your only option is to overclock the snot out of your CPU to get the truly best experience, it gets to be problematic. So yes, cost benefit analysis says that DX12 isn’t necessary at the moment, but they may find that their desire to add specific features to drive growth of the player base is sufficient to prompt the effort.
They keep hinting that many of the new systems in HoT are actually ground work for the future growth of the game. I wouldn’t be surprised if they want this game to last for many more years. Also, haven’t they only just released in China? Seems like this is just the beginning.
If they want a system that is gonna last, they need to update the renderer. It was borderline embarrassing in 2012 (DX11 came out in 2009) to still have DX9. Now, in 2015? It’s just… ugh, please update it!
DirectX 12 is actually around the corner now. And yes, having a DX9-only game in 2015 is atrocious. And this is the same company that updated their GW1 DirectX 8 engine to DirectX 9 when Guild Wars Factions came out. DirectX 9 is so ancient now. It is 11 years old.
If they want a system that is gonna last, they need to update the renderer. It was borderline embarrassing in 2012 (DX11 came out in 2009) to still have DX9. Now, in 2015? It’s just… ugh, please update it!
You have to keep in mind that they didn’t start developing the game in 2012, they started in 2007 (at the very latest) with an engine from 2003 or something like that.
People also seem to underestimate the work required to do something like this. “Please update it!” sounds easy and all, but in reality it would take months if not years to get it done, seeing as it would require a complete rework of the whole engine.
That time could also basically be void of new content, since adding more content designed with DirectX 9 in mind would add to the workload of converting the engine.
My numbers were made up to make a point. NOBODY is going to allow anyone to invest significant resources without 2 things:
1) An insurmountable problem that MUST be fixed.
2) A significant return in potential revenue.
Neither of those apply here. The game as it is is NOT suffering for lack of DX11 support (it is NOT a MUST for the game to continue).
The other side of this is supporting multiple versions, since they certainly CAN NOT discontinue support for the DX9 version. Despite what most players seem to think, that would actually INCREASE their recurring costs as well (having to test any change at least twice… having to do EVERYTHING twice).
Not gonna happen.
True in one sense. It comes down to quality of game play, with Guild Wars 2 having both client-side and server-side limitations. Frankly, they are about at the limit of what they can get performance-wise from the client, and PvE, WvW (and possibly some content in HoT) show this. I realize that some of the issues with WvW are server-side, but some are not. When your only option is to overclock the snot out of your CPU to get the truly best experience, it gets to be problematic. So yes, cost benefit analysis says that DX12 isn’t necessary at the moment, but they may find that their desire to add specific features to drive growth of the player base is sufficient to prompt the effort.
Some of us enjoy trying to pull 4.7 gigs out of our AMD 8150…. until it crashes… or overheats…
We’ll probably not see an engine update up until the next iteration of the franchise. That’s a lot of work to do for something that probably won’t increase profits by much.
Increase profits? Probably not, but a healthy, modern rendering pipeline is important for keeping people around, which is where most of ANet’s focus actually is. The living story system, daily rewards, etc., are all focused on keeping people around so that they’ll be exposed to reasons to buy gems.
If you think being stuck on DX9 won’t hurt the game over the years, well… you probably never played Final Fantasy XI, which was in much the same situation, except it was stuck on DX8 instead of DX9. To this day, I’m not sure if you can get a solid 60fps out of it, but then I wouldn’t know, since I gave up a long, long time ago.
“This game is old, so why does it run like kitten?” is never a good user experience, whether for keeping old players or acquiring new ones.
Oh, and for reference, I have a GTX 970, Core i5-3550 and 24GB of RAM. Sure, my CPU is a bit of a weak point, but I drop to 45fps just running around the dead areas of Lion’s Arch with no one in sight. There’s no excuse for that in a genre that grows based on its wide accessibility. It’s a problem.
Don’t worry about it. I have the new-generation 5820K, 16GB DDR4-2400, an OCed 7970, and both an SSHD and an SSD, and I get 45-50FPS in LA with literally 8 people around me AFKing and 2 at the mystic forge… that for the price of my PC going nuts on Noctua fans. The game’s performance is unbelievably bad. I’m not planning on playing a game that takes this much out of my PC for much longer. I will probably quit if nothing is done about this. I prefer keeping my PC alive rather than sacrificing it for this game.
PS. Any game that consumes more than 8GB of memory has major memory-leak issues. There is absolutely no reason it should ever consume over 8GB. Me, you, and many others have double that limit or more.
Don’t worry about it. I have the new-generation 5820K, 16GB DDR4-2400, an OCed 7970, and both an SSHD and an SSD, and I get 45-50FPS in LA with literally 8 people around me AFKing and 2 at the mystic forge… that for the price of my PC going nuts on Noctua fans. The game’s performance is unbelievably bad. I’m not planning on playing a game that takes this much out of my PC for much longer. I will probably quit if nothing is done about this. I prefer keeping my PC alive rather than sacrificing it for this game.
Exactly lol, it’s a mess. It’s like we are using candles when light bulbs have been out for years. And solar-powered lights are coming out next year (DX12)!
If the 2 (random number) people they have working on optimisations start now, they may be finished by the time the second xpac comes around. I would rather wait that long and get a huge boost than get multiple 0.1% optimisations every few months like we do right now. So it takes a long time to do this? Who cares? Get it done! You are doing optimisations anyway, might as well make it worthwhile.
Anet, I’m glad that you’re making an expansion, and bringing us all of this new content. However, I’m really curious as to whether you will be moving some of the computational load of the video game to the GPU?
GW2, compared to many other video games, hardly utilizes the GPU of my system, or of anyone’s system that I’ve talked to.
Will you be making any real improvements to the game that are beyond diversifying the content and class balance?
It was confirmed that the expansion will not get any DX 10/11/12 update, so no need to buy a new video card. Source: video from a press event at the ANet office within the past couple of weeks. Of course, things are always subject to change between now and then.
edit found it: https://www.youtube.com/watch?v=U2dBqAw7C3U at 7min 25 sec
“ANet. They never miss an opportunity to miss an opportunity to not mess up.”
Mod “Posts created to cause unrest with unfounded claims are not allowed” lmao
wtb 64bit game.
Fractal lvl 80 – 126 AR
It was confirmed that the expansion will not get any DX 10/11/12 update, so no need to buy a new video card. Source: video from a press event at the ANet office within the past couple of weeks. Of course, things are always subject to change between now and then.
edit found it: https://www.youtube.com/watch?v=U2dBqAw7C3U at 7min 25 sec
He just said that you don’t need to upgrade your HW, so DX10+ could still be possible.
http://wpwhendead.tumblr.com - a GW2 webcomic about a Charr and a Skritt
wtb 64bit game.
imagine the amount of bugs and crashes for the first 7 years after release of 64bit client
Anet, I’m glad that you’re making an expansion, and bringing us all of this new content. However, I’m really curious as to whether you will be moving some of the computational load of the video game to the GPU?
GW2, compared to many other video games, hardly utilizes the GPU of my system, or of anyone’s system that I’ve talked to.
Will you be making any real improvements to the game that are beyond diversifying the content and class balance?
I don’t know who you talked to, but outside WvW (which indeed doesn’t use GPU for some reason) the game uses mostly the GPU and not much of the CPU. If players actually TESTED their claims, it would be better, but most players just post what others post and what they read somewhere.
Open your task manager and it’s easy to see the GPU is getting way more usage than the CPU. Of course, that depends on your CPU/GPU; if your CPU is terrible and your GPU is a monster, it might not be the case.
I don’t know who you talked to, but outside WvW (which indeed doesn’t use GPU for some reason) the game uses mostly the GPU and not much of the CPU. If players actually TESTED their claims, it would be better, but most players just post what others post and what they read somewhere.
Open your task manager and it’s easy to see the GPU is getting way more usage than the CPU. Of course, that depends on your CPU/GPU; if your CPU is terrible and your GPU is a monster, it might not be the case.
this
as the owner of a 4690K @ 4.5 (which isn’t a monster by any means)
I hardly get it to 80-90% usage; it mostly sits at 30-60%
however, my HD 7850 is constantly at 99% (especially on new maps)
and if the expansion maps are even more detailed/bigger than Southsun/Drytop/SW, for lots of players a GPU change may be necessary to get a decent framerate, even if we’ll be stuck in 2005 with DX9
wtb 64bit game.
imagine the amount of bugs and crashes for the first 7 years after release of 64bit client
As opposed to the soul crushing crashes we get now whenever we do a large event, crash, and are unable to get back into that large event?
It was confirmed that the expansion will not get any DX 10/11/12 update, so no need to buy a new video card. Source: video from a press event at the ANet office within the past couple of weeks. Of course, things are always subject to change between now and then.
edit found it: https://www.youtube.com/watch?v=U2dBqAw7C3U at 7min 25 sec
He just said that you don’t need to upgrade your HW, so DX10+ could still be possible.
That’s not all he said, he said the devs are getting new tech so they can make the game and that things will stay the same for us.
I don’t know who you talked to, but outside WvW (which indeed doesn’t use GPU for some reason) the game uses mostly the GPU and not much of the CPU. If players actually TESTED their claims, it would be better, but most players just post what others post and what they read somewhere.
Open your task manager and it’s easy to see the GPU is getting way more usage than the CPU. Of course, that depends on your CPU/GPU; if your CPU is terrible and your GPU is a monster, it might not be the case.
this
as the owner of a 4690K @ 4.5 (which isn’t a monster by any means)
I hardly get it to 80-90% usage; it mostly sits at 30-60%
however, my HD 7850 is constantly at 99% (especially on new maps)
and if the expansion maps are even more detailed/bigger than Southsun/Drytop/SW, for lots of players a GPU change may be necessary to get a decent framerate, even if we’ll be stuck in 2005 with DX9
I play 95% WvW, and I can say that my GPU is woefully under-utilized, mostly sitting at 15-20%. I have tested my claims; I simply am in a different community (mostly WvW) where the CPU dominates. I don’t really spend any time on the new maps besides unlocking them and getting WPs in every corner. I simply wanted to know if there would be any optimizations that would improve resource management.
It doesn’t really affect my gameplay at all – my CPU can handle all of the processing that needs to happen in full settings, but I had a glimmer of a hope that maybe Anet would bring the game into the right decade.
Thanks for all the posts and links.
I don’t know who you talked to, but outside WvW (which indeed doesn’t use GPU for some reason) the game uses mostly the GPU and not much of the CPU. If players actually TESTED their claims, it would be better, but most players just post what others post and what they read somewhere.
Open your task manager and it’s easy to see the GPU is getting way more usage than the CPU. Of course, that depends on your CPU/GPU; if your CPU is terrible and your GPU is a monster, it might not be the case.
this
as the owner of a 4690K @ 4.5 (which isn’t a monster by any means)
I hardly get it to 80-90% usage; it mostly sits at 30-60%
however, my HD 7850 is constantly at 99% (especially on new maps)
and if the expansion maps are even more detailed/bigger than Southsun/Drytop/SW, for lots of players a GPU change may be necessary to get a decent framerate, even if we’ll be stuck in 2005 with DX9
I play 95% WvW, and I can say that my GPU is woefully under-utilized, mostly sitting at 15-20%. I have tested my claims; I simply am in a different community (mostly WvW) where the CPU dominates. I don’t really spend any time on the new maps besides unlocking them and getting WPs in every corner. I simply wanted to know if there would be any optimizations that would improve resource management.
It doesn’t really affect my gameplay at all – my CPU can handle all of the processing that needs to happen in full settings, but I had a glimmer of a hope that maybe Anet would bring the game into the right decade.
Thanks for all the posts and links.
with the new WvW maps coming, it’s highly possible that they’ll need much, much more GPU power, like the new PvE maps do
wtb 64bit game.
imagine the amount of bugs and crashes for the first 7 years after release of 64bit client
That’s really not the issue (obviously there would be issues, despite your exaggeration). The issue is doubling your testing, development and support costs because you would HAVE to keep both clients. Also, to REALLY get the most out of a 64-bit client they would likely have to develop a 64 bit DB in the DAT file (and then support 2 DAT files). If they don’t revamp the DAT file, I’m not sure how much improvement a 64-bit client would make.
I play 95% WvW, and I can say that my GPU is woefully under-utilized, mostly sitting at 15-20%. I have tested my claims; I simply am in a different community (mostly WvW) where the CPU dominates. I don’t really spend any time on the new maps besides unlocking them and getting WPs in every corner. I simply wanted to know if there would be any optimizations that would improve resource management.
It doesn’t really affect my gameplay at all – my CPU can handle all of the processing that needs to happen in full settings, but I had a glimmer of a hope that maybe Anet would bring the game into the right decade.
Thanks for all the posts and links.
I have to agree with this. I’m mostly in the same boat. My CPU is consistently above 80% during fights while my GPU is using no more than 30-40% at the same time.
When the CPU climbs over 90% I see significant frame drops and skill lag, while my GPU is never, ever maxed out.
FWIW, my specs are …
Intel i5-4670K @ 4.2GHz
MSI Z87-G45 Mobo
MSI GTX 760-TF
32GB Corsair @ 2133MHz
Samsung 840 EVO SSDs
As a former developer, I realize that there are loads of computational processes occurring in the middle of a massive fight as well as the netcode to coordinate it all across the players. That certainly takes up a lot of processing power, but I’m not convinced that it is as optimized as it really could be and, much like Hamster, I’m curious if ANet is even working on optimizations here ….
I play 95% WvW, and I can say that my GPU is woefully under-utilized, mostly sitting at 15-20%. I have tested my claims; I simply am in a different community (mostly WvW) where the CPU dominates. I don’t really spend any time on the new maps besides unlocking them and getting WPs in every corner. I simply wanted to know if there would be any optimizations that would improve resource management.
It doesn’t really affect my gameplay at all – my CPU can handle all of the processing that needs to happen in full settings, but I had a glimmer of a hope that maybe Anet would bring the game into the right decade.
Thanks for all the posts and links.
I have to agree with this. I’m mostly in the same boat. My CPU is consistently above 80% during fights while my GPU is using no more than 30-40% at the same time.
WvW has long had a problem with GPU utilization, maybe with the new borderland they will fix it. Or rather, use the same system in WvW that they use anywhere else.
WvW has long had a problem with GPU utilization, maybe with the new borderland they will fix it. Or rather, use the same system in WvW that they use anywhere else.
It’s the same everywhere. It is simply amplified in WvW due to the nature of how WvW works. When I’m in PvE, particularly the boss train, the same exact thing happens.
In some instances, Maw for example, I get about 15-20fps with 100% CPU utilization and about 50%-ish GPU. Clearly, the game engine cannot cope with what’s going on and starts dropping frames because it cannot get free CPU cycles to decide what to even send over to the GPU for processing.
Like I said earlier, the game is computationally heavy, not graphically heavy. It feels far from optimized in that regard when there are big groups of players in close proximity to each other.
WvW has long had a problem with GPU utilization, maybe with the new borderland they will fix it. Or rather, use the same system in WvW that they use anywhere else.
It’s the same everywhere. It is simply amplified in WvW due to the nature of how WvW works. When I’m in PvE, particularly the boss train, the same exact thing happens.
In some instances, Maw for example, I get about 15-20fps with 100% CPU utilization and about 50%-ish GPU. Clearly, the game engine cannot cope with what’s going on and starts dropping frames because it cannot get free CPU cycles to decide what to even send over to the GPU for processing.
Like I said earlier, the game is computationally heavy, not graphically heavy. It feels far from optimized in that regard when there are big groups of players in close proximity to each other.
seems like we play a totally different game
my CPU gets to the 90s only at Teq; it mostly sits at 30-70 (dungeons, PvP, open world, FotM)
the GPU gets to 99% easily
I don’t care about WvW, it’s too boring for me, so that may be a thing
Stop with that “you don’t need a GPU to play GW2” nonsense.
At a mere 1080p my GPU is often at 99%. The GUI issues with the latest patch made it even worse; opening the inventory can cause GPU usage to jump by 20% and lose 10fps (what is that, some old Flash?)
I’m using an HD 7850 2GB (never seen more than 700MB of VRAM being used) with a 4690K @ 4.5GHz
I don’t know, but when I play PvE I hear my cards ramp up and see it happen on my EVGA monitor. My CPU doesn’t seem to get taxed very hard. Although I have noticed that in certain world-boss areas with a lot of people, the GPU will drop down along with the fps; seems odd, it should be at full.
Resolution 5970×1200
Intel i7 3770k
2x GTX 680 4GB each in SLI
16GB RAM
I never said you don’t need a GPU to play GW2; please refrain from distorting my statement. My game experience matches what the OP has experienced, not only in WvW, but in PvE (when there’s a large zerg).
For clarity, I run two 1080p displays when gaming. One has GW2 running, the other usually has a browser window open, teamspeak and a system process monitor among any other random set of apps (spotify, etc).
There are certainly countless variables here, but I’d be happy to share empirical evidence such as process logs (CPU/GPU/RAM/HDD) usage after the next time I play.
Edit:
It’s probably worth mentioning that I also have every graphical option maxed out at 1920×1080 and all camera shake effects disabled.
I wonder if running in windowed mode keeps it from focusing on the GPU?
My CPU doesn’t seem to get taxed very hard. Although I have noticed that in certain world-boss areas with a lot of people, the GPU will drop down along with the fps; seems odd, it should be at full.
This is exactly what happens when a game tries to draw lots of (unique) things with an API that is incapable of expressing the work efficiently. For each thing the game draws, setting up the drawing operation (textures, mesh data, positions, character poses, etc.) via D3D may cost its “main” thread considerable CPU time, and then the API implementation and driver must translate this setup into GPU state. This CPU cost is the same regardless of how many pixels on the screen the object covers (if any at all), how many polygons the object has and so on.
When rendering a frame mostly consists of many drawing tasks expensive to set up on the CPU side but absolutely trivial to execute on a modern GPU, the GPU may start running out of things to do and starts showing low load. When paired with the dreaded API state synchronization where the CPU-side driver code has to wait for the GPU to finish something before touching it again, both the CPU and GPU may show low load as the game performs poorly.
Some of the earlier posters in this thread talk about “offloading” operations to the GPU, but as far as I can tell, the game is mostly already making the GPU do the things the GPU should do. These performance problems are mostly in communicating these tasks to the GPU efficiently. Newer APIs make this faster by e.g. allowing more of the draw state to be expressed as simple parameters to shaders, which can be passed to the GPU as bulk memory transfers instead of tinkering with D3D state variables one-by-one.
Any developers or programmers want to chime in on game improvements? I’m seeing a bunch of threads asking about it.
DirectX 12 would be kinda useless on a 32-bit client.
Let’s focus on getting a 64-bit client that can actually use our system resources and not be limited, before asking for a new DX.
64-bit is a lot easier to achieve than a DirectX rework.
So, how much would you be willing to pay for basically a complete rebuild of the whole engine that will probably take years and loads of new employees?
So, how much would you be willing to pay for basically a complete rebuild of the whole engine that will probably take years and loads of new employees?
Renderer != game engine.
#Gw2OnDx12
But the game engine is built for DirectX 9. You can’t just magically push a button and have it use DirectX 12. If that was the case why would any game ever use anything other than the latest version?
But the game engine is built for DirectX 9. You can’t just magically push a button and have it use DirectX 12. If that was the case why would any game ever use anything other than the latest version?
Because, believe it or not, the majority of users play on DX9-compatible GPUs. Or at least that was the case when GW2 launched, almost 3 years ago.
Hell, my R9 290X may get full DX12 support. It is still unknown, even though it has the latest iteration of GCN. At least until the 3xx series launches later this year.
All those laptops and MacBooks won’t see any improvement. But for power users like myself, this will be very welcome news.
So please, ANet, make it happen <3
But the game engine is built for DirectX 9. You can’t just magically push a button and have it use DirectX 12. If that was the case why would any game ever use anything other than the latest version?
Because, believe it or not, the majority of users play on DX9-compatible GPUs. Or at least that was the case when GW2 launched, almost 3 years ago.
Just look at the outcry about the last patch from all the people that still use XP… -sigh-
Best MMOs are the ones that never make it. Therefore Stargate Online wins.
Still doesn’t answer how much you are willing to pay for all the work required to get DirectX 12 available for GW2.
No matter how much you try to twist and turn it, it will be a rather massive cost, it will take a rather large amount of time, and they will need to hire more people.