DirectX 11/12 request [merged]
aaaand another bump!
Any news?
We need that optimisation for 60 fps in all situations.
Use more GPU and less CPU.
Thanks in advance
Corsair RM650x, Fractal Define S (with window panel)
Yes, it would be nice, but until a new engine becomes available it will not happen.
And you actually want more CPU threads for the game executable and the engine, so the GPU can be used to the max….
SO: asking to use more CPU and GPU would be wiser…
And until a miracle happens and a new engine is made available and implemented for all areas of the game, nothing will change.
I would like to see improvement, BUT I’d prefer new LS’s, expansions, the blocked legendaries, 4 (maybe 5) more dragons, the WvW implementations and more FIRST.
Content > optimisation:
no content = no players,
no optimisation = the present situation, with players who want content, to be content…
Been There, Done That & Will do it again…except maybe world completion.
@Stephanie Wise (and others):
DirectX 12 =/= performance for older applications; DX 12 is completely the wrong thing to ask for. You need a new game engine, and it will (most likely) NOT be made.
This game’s engine was written with CPU speed in mind. This is also why, at present, the game can run faster on a 4.6-5.0 GHz overclocked Pentium G3258 with 2 cores than on an i7 3930K hexcore at stock (3.2 GHz). The engine doesn’t need the threads, just pure core speed. And as it was developed for DirectX 9, all video cards made for DX 10(+) tend to be overkill.
Lastly, upgrading the engine would/could force players to buy a new computer if their present machine isn’t up to spec.
Guild Wars 2 consists of many parts, but in general the game has 3 main parts:
1) The program itself: gw2.exe handles the use of objects in-game, the interaction, movies, database, security, and 2D flats like the character select and character build screens, I/O for map placement, and game controls like volume and so on.
You are interacting with this part; it also needs to process all network data and provide the engine with the data to build the screen.
2) The 3D engine, which creates your landscape from meshes and other graphic shapes and places 2D and 3D elements, objects and characters. It is an old, extremely customized version of the Unreal Engine, which was DX 7.0/8.0 but modified to DX 9.0c, and it uses a Havok add-on to create physics effects (waves, wind, knockbacks, the waving of armor and gear, and so on).
You are looking at this part. It builds your screen…
3) I/O with the databases and synchronisation of the game (network services).
The servers regulate your placement and interaction with others in the game, which requires network communication. You only make use of the small bits of data you need to interact with others, but this info is used by the game executable.
This is necessary to make your RPG into an MMORPG.
Then there is the OS with DirectX, which has nothing to do with the game itself:
1) The OS makes the computer run and provides you with an interface to use it.
2) DirectX allows programmers to write universal code and makes sure different hardware components will function with the game… This is mostly hardware support.
The problem with the GW2 engine is this:
The game struggles to construct enough character models and effects in crowded areas, mostly because the engine was originally meant for up to 10-16 people. GW2 is now cramming 100 people into a map, and in WvW even 200; this overstresses the game and the communication with the servers, which in turn bottlenecks the game executable, which bottlenecks the engine. The GPU and the CPU have no real problems and still have headroom left, as seen by the fact that the game uses only 2 cores (likely 1 for the executable and the handling of I/O, and 1 for the graphics) plus a few added support services (networking).
The problem is that the old graphics engine of the game CANNOT HANDLE "MULTI"THREADED operation and/or communications, as that was never a concern when the engine was developed. (We are talking a few decades ago here.)
Timeline:
1998 (Unreal v1.0, optimized for 3dfx Glide; various versions of Direct3D 6.0+)
2000 (directx 8.0)
2001 32 bit single core CPU’s @ ~1-2 Ghz, 256Mb-1Gb Ram just so you know
2002 (directx 9.0)
2002 (Unreal v 2.0 @ dx9)
2003 (64 bit CPU’s) (AMD)
2004 (directx 9.0c)
2004 (Unreal v3.0 @ dx 9.0c)
2005 (dual-core CPU’s)
2005 (release of GW1) I have no clue which engine they used, but my best guess would be a modified derivative of Unreal v2.0, though there was talk they changed a lot to fit a newer engine, which could have been a modded Unreal 3.0.
2006 (quad cores)
2007 (release of EotN)
2007 (windows vista,directx10)
2008 (windows vista SP1, directx 10.1)
2009 (windows 7, directx 11)
2012 (windows 8, directx 11.1)
2012 (release of GW2)
2012 (Unreal v4.0, DX 11 & 12) <- you need an engine available before you can create your game; given GW2’s development track, Unreal 4.0 came too late. Actually GW2 was pretty late itself… and thus the previously used GW1 engine was stripped and upgraded.
2013 (windows 8.1, directx 11.2)
2015 (announced: directx 11.3)
2015 (windows 10, directx 12)
2015 (release of GW2:HoT) Problem is an engine upgrade would have been nice, but extremely costly… we should feel lucky: the sales of HoT were lower than expected, and it could have been a deathblow for the game….
2015 (Unreal 4.0 engine made available for free)…. so no initial buy, but use is still costly: a royalty has to be paid. I do not want to pay a fee just to be able to use DX 12. Would you?
As multithreading isn’t an option, and upgrading the engine to allow for a newer DirectX and multiple threads is not possible, the game is only able to deliver 1 thread.
The question whether an upgrade from DirectX 9 to DirectX 12 is worthwhile is easily answered:
1) It’s useless, and a waste of time and development at this point, to allow the use of DX 12.
2) The first change which should be made is a rebuild of a newer UNREAL engine into a new Guild Wars 2 engine, to allow for multithreaded CPU rendering and multithreaded GPU calls. This requires a complete rebuild of the heart of the Guild Wars 2 program, which might be worthwhile in the long run, but would be extremely costly: the changes would be horrendously time-consuming due to the necessity of reverse engineering the old engine and rebuilding the new engine to allow for multithreaded use and DX 12.
One remark: a new engine like Unreal 4.0 is free to license (since 2015), however a percentage of all REVENUE needs to be paid when using it! For a company completely reliant on gem store purchases this could be a bit much…. And even when they are modding it, it isn’t instantly an MMO like GW2: it is an FPS engine…
The investment to make a new game engine will run into multiple, likely even dozens of, millions of dollars, even if the engine itself costs nothing. For this game it is a bit over the top… However much you may want this change, it is most likely not happening. The devs have already stated it isn’t that easy, and you should consider the fact that DX 12 is NO PART OF ANY GAME. It allows for easier use of available hardware, but in the case of GW2 all the hardware it needs is already accessed through DirectX 9. The old engine cannot make use of more.
DirectX is used by programmers to allow a program to work on differently built computers, but having DirectX 12 on a PC doesn’t allow a DirectX 9 program to work miracles. In effect, the new features DirectX 12 provides aren’t compatible and/or necessary and therefore go unused; only the legacy DirectX 9 components will be used.
SO:
We have the game in 32- and 64-bit versions, which access a game engine to create the graphics.
We have a really old game engine, updated, but still LEGACY, running on DirectX 9.0c.
We will not have an updated game engine anytime soon (most likely, as the devs said so) and thus we will be locked into DirectX 9.
We might have access to DirectX 12 compatibility on computers 2-4 years old, allowing for partial DirectX 12 support, but not by default.
We might have access to true DirectX 12 on computers 1-2 years old, for full DirectX 12 support, but not by default.
We have access to the DirectX 12 software when running Windows 10, by default.
DirectX 12 only helps with the access to, and ways of accessing, hardware; if the program doesn’t use those features, any investment into allowing DX 12 compatibility is useless…
Even if we have DX 12 in all 3 previous cases, the game is still DX 9 due to its game engine, and it will stay DirectX 9.0… for now.
The End. At least for now.
Would utilizing a VM to create 2 virtual cores (from the 6 physical or 12 logical ones in my PC) work to improve performance, and have the virtual machine handle the data streams?
You say that this game was made with speed in mind, in the GHz of the processor. I do not agree with that, since multithreading and multiple cores / DUAL CORE were out when the game was released.
Yes, for GW1 dual cores were just the new thing; many players still played on single-core, single-thread machines. The engine developed for GW1 was reused in GW2. The game itself now uses 1 thread and the rendering queue uses 1 thread, for a total of 2 cores, with some services delegated to other cores when available. This also makes the game compatible with i3 (laptop) processors, which are still dual-core in nature.
The CPU heat problem was already known before that.
Well yes, the limits of core speed were known at the time the devs made GW2…; unfortunately the ancient engine is mostly limited to 1 core.
I can understand that they said the new dual cores had communication issues and did not solve the speed issue, and that they continued to build it single-core.
Yes; around the time of GW1’s release quad cores were about to be introduced and machines were at 2 GHz…, but they were not available yet, and most people still ran 32-bit Windows on single-core machines.
I don’t think that they failed to plan ahead and ask “what if we need a multicore engine in the future”, making the engine upgradeable, since the industry was moving in that direction when they were building GW2. And the CPU’s communication with the GPU is the API.
ArenaNet owns its own game engine, which was used for GW1 and GW2; buying a new one or making a new one would be a gigantic investment.
That is what DX 12 will fix: the devs will write smarter code that will not have to stop at the API to be validated, since the devs will have validated it before it reaches the API. That is the improvement that DX 12 brings.
DX 12 will not fix anything; you’ll need a new engine first. DX 12 alone will not provide better communication between 1 (!) CPU core, due to an old engine, and all GPU core*s*.
Less slowdown and less of a CPU-bound issue due to less work on the API (the problem will still exist, as the engine delivering the data is still MAXIMUM 1 core). Right now the API needs to validate the code and direct it, the same as a cop (the API) directing traffic: some cars would stop to ask him whether they are going to the right place, and he would need to validate all the info and send them in the right direction. By removing that job from the API there is less slowdown, and having multiple-CPU-core to multiple-GPU-core communication (you would still only use 1 CPU core, as the engine CANNOT make use of more; my 2 old 780’s have 2 × 2304 cores, and the game is using them, at 30-50%, but still) also helps the flow of information. 1-to-4608 or 1-to-4608 doesn’t matter in the end, as long as there isn’t a new engine.
Of course the complexity for the devs is higher, and the responsibility will also be higher: since the API will not validate the devs’ code, the devs will have to validate their own code beforehand, meaning more testing.
The devs will not need to “make DX 12”; they will need to build a new engine first which can INTERACT WITH DX 12…
No new engine: no need for DX 12.
If they were to rework the engine, I’m pretty sure they’d add
- DX 9/10/11/12 compatibility.
- And options to use 2, 4, 6, 8, 10, 12(+) cores.
- And maybe RAM caching for PCs with more than 8, 16 or 32 GB of RAM, or more…
- And improved SLI/CrossFire support.
- And options to customize the interface for 3840*2160/2400, 5760*1080/1200 and 11520*2160/2400 screens and VR sets.
Well that’s about it….
Been There, Done That & Will do it again…except maybe world completion.
(edited by PaxTheGreatOne.9472)
pax
Just a question: if you have data cars that all run at the same speed and they do not stop, is there a traffic jam?
Whether there is 1 lane or 4 lanes, if they do not stop, is there a traffic jam?
That is DX 12: data not stopping at the API, because the devs validate where it is going; there is no more need to stop at the API to be validated. Yes, all cores communicate on DX 12, but the main factor, what solves the CPU-bound issue, is that there is no traffic jam, since the work the API was doing before is now done by the devs before release.
Now, can the game use only 1 core and still use DX 12? That is totally another story.
One thing is sure: DX 12 is what solves the CPU-bound issue. If they ever decide to upgrade, DX 12 is the best pick.
I am talking about the overhead issue: the pipeline state objects in graphics that reduce the overhead.
Direct3D 12 cuts out the metaphorical middle man. Game code can directly generate an arbitrary number of command lists, in parallel. It also controls when these are submitted to the GPU, which can happen with far less overhead than in previous concepts as they are already much closer to a format the hardware can consume directly. This does come with additional responsibilities for the game code, such as ensuring the availability of all resources used in a given command list during the entirety of its execution, but should in turn allow for much better parallel load balancing.
In DX12, state information is instead gathered in pipeline state objects. These are immutable once constructed, and thus allow the driver to check them and build their hardware representation only once when they are created. Using them is then ideally a simple matter of just copying the relevant description directly to the right hardware memory location before starting to draw.
Direct3D 12 will allow—and force—game developers to manually manage all resources used by their games more directly than before. While higher level APIs offer convenient views to resources such as textures, and may hide others like the GPU-native representation of command lists or storage for some types of state from developers completely, in D3D12 all of these can and must be managed by the developers.
This not only means direct control over which resources reside in which memory at all times, but also makes game programmers responsible for ensuring that all data is where the GPU needs it to be when it is accessed. The GPU and CPU have always acted independently from each other in an asynchronous manner, but the potential problems (e.g. so-called pipeline hazards) arising from this asynchronicity were handled by the driver in higher-level APIs.
http://www.pcgamer.com/what-directx-12-means-for-gamers-and-developers/2/
Since you seem to like information and technical detail, you can also go read this:
https://software.intel.com/sites/default/files/managed/4a/38/Efficient-Rendering-with-DirectX-12-on-Intel-Graphics.pdf
(edited by stephanie wise.7841)
Although I would love to see DX12 or at least DX11, I wouldn’t count on it. Funny thing: even though I hate WoW, they at least did some graphics tweaking and updating over the last years.
I would like to see improvement, BUT I’d prefer new LS’s, expansions, the blocked legendaries, 4 (maybe 5) more dragons, the WvW implementations and more FIRST.
Content > optimisation:
no content = no players,
no optimisation = the present situation, with players who want content, to be content…
Yes, but content that lacks optimization is not as enjoyable when it stutters or crashes. Just look at HoT at release: it was very good content, but it crashed 10 times per hour until they released the 64-bit client optimization.
And if you look at the devs’ answers in the past, they did not see a need for a 64-bit client.
So what the devs say is not gospel, either.
Do you know that with DirectX 12, when you make a pipeline state object for graphics, you can send 10 or more objects to be drawn as only 1 draw call to the GPU?
Do you know that in the high-level APIs (DX 9, 10, 11) the driver handles the hazards and validation, and that this causes a lot of overhead (the CPU-bound issue)?
Instead of more threads, I would say better validation, and fewer hazards that need to be validated, will send the job to the GPU faster, instead of it staying idle on the CPU to be validated and resolved by the driver and the API.
They didn’t say that. They said that 64-bit would only help in delaying the memory fragmentation problem, not make the game faster, which is what the questioner suggested. And that’s what the 64-bit client did. It was an easier solution than messing with alternate memory allocation libraries, each with their own set of issues, to reduce the out-of-memory errors caused by memory fragmentation.
RIP City of Heroes