[Suggestion] Please add SMAA to GW2
[OHai] – Northern Shiverpeaks
Hey all, used the search function and found nothing, and with the suggestion section dead and no section dedicated to technical suggestions, I guess I’ll drop this here.
Arenanet, can we please, alongside your current FXAA implementation, add SMAA to the antialiasing list? Found here: http://www.iryoku.com/smaa/ SMAA has a similar performance hit to FXAA but is far more accurate, with less of the blurred-edge look and less of the flickering staircase effect you get with FXAA.
It would take almost no effort to implement, as it is a simple post-process effect exactly like FXAA, but it would benefit the game’s visuals tremendously at almost no performance cost.
There is a tool to “inject” SMAA into the game, but as it’s a third-party tool rather than built into the game itself, it can cause overlays such as Steam, Fraps, Dxtory, Overwolf etc. to stop working due to how it intercepts DirectX calls. So a proper in-game option would be much appreciated, and this suggestion is hardly any sweat off a dev team’s back as it’s a very simple feature to implement.
Thanks for reading this short but much-needed suggestion. I would much rather use SMAA over FXAA: it has a similar performance hit, if you can even say it has one, but greatly improved visuals.
The GW2 engine only implements DX9.0c. You did know that, didn’t you? It may not be supported in their engine and would have to be added.
Also this game is CPU bound, not video card bound.
this suggestion is hardly any sweat off a dev team’s back as it’s a very simple feature to implement.
Says who?
The framework is already in place in the form of FXAA. GW2 uses a deferred lighting system, so typical antialiasing like MSAA cannot be forced without some serious driver/profile fiddling. However, post-process antialiasing effects such as FXAA and SMAA have no such limitation, because they are applied after the game is rendered (which is also why FXAA/SMAA does not show up in screenshots: screenshots are taken from the game render before the effects are applied).
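To make that ordering concrete, here’s a toy sketch. The function and pass names are made up for illustration, not GW2’s actual engine code; the point is only that the screenshot is captured from the framebuffer before the post-process AA pass runs over it.

```python
# Toy model of the pipeline order described above: screenshot capture
# happens before the post-process AA pass, so AA never appears in captures.

def render_frame(scene):
    framebuffer = f"lighting({scene})"     # deferred geometry + lighting passes
    screenshot = framebuffer               # screenshot grabbed here
    framebuffer = f"fxaa({framebuffer})"   # post-process AA is the last step
    return framebuffer, screenshot

displayed, captured = render_frame("map")
print(displayed)  # fxaa(lighting(map))
print(captured)   # lighting(map) -- no AA in the capture
```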
The GW2 engine only implements DX9.0c. You did know that, didn’t you? It may not be supported in their engine and would have to be added.
Also this game is CPU bound, not video card bound.
SMAA can be implemented the same way FXAA was in GW2’s engine; SMAA is not limited to DX10/11. It is simply a more efficient and better-looking post-process effect. Both effects have hardly any bearing on a game’s performance due to their nature, especially compared to the industry-standard MSAA, which cannot apply here because GW2 uses a deferred lighting engine.
(edited by Collective.9206)
this suggestion is hardly any sweat off a dev team’s back as it’s a very simple feature to implement.
Says who?
The framework is already in place in the form of FXAA. GW2 uses a deferred lighting system, so typical antialiasing like MSAA cannot be forced without some serious driver/profile fiddling. However, post-process antialiasing effects such as FXAA and SMAA have no such limitation, because they are applied after the game is rendered (which is also why FXAA/SMAA does not show up in screenshots: screenshots are taken from the game render before the effects are applied).
The GW2 engine only implements DX9.0c. You did know that, didn’t you? It may not be supported in their engine and would have to be added.
Also this game is CPU bound, not video card bound.
SMAA can be implemented the same way FXAA was in GW2’s engine; SMAA is not limited to DX10/11. It is simply a more efficient and better-looking post-process effect. Both effects have hardly any bearing on a game’s performance due to their nature, especially compared to the industry-standard MSAA, which cannot apply here because GW2 uses a deferred lighting engine.
WRONG – FXAA has a HUGE performance hit in this game. Since the majority of the game’s work is actually done on the CPU, not the GPU, any extraneous functions do have an effect on the game’s FPS. Example: in WvW, when two huge groups are fighting, the FPS drops to nothing with FXAA on; turn it off and you recover some of the lost FPS.
WRONG – FXAA has a HUGE performance hit in this game. Since the majority of the game’s work is actually done on the CPU, not the GPU, any extraneous functions do have an effect on the game’s FPS. Example: in WvW, when two huge groups are fighting, the FPS drops to nothing with FXAA on; turn it off and you recover some of the lost FPS.
Just tested this in a big world event with players all over the show at the Karka Queen: it made no discernible difference whatsoever, on or off. FXAA is not a performance hog at all; it is a simple post-process effect overlaid on top of the game as a cheaper approximation of antialiasing.
You can google FXAA performance charts and almost all of them will paint the same picture, for instance in this review for MP3 http://www.hardocp.com/article/2012/06/12/max_payne_3_performance_iq_review/4#.U58asMdfCSE
The reviewer noted thusly:
With No AA the HD 7970 is averaging 63.8 FPS. When we turn on very high FXAA the average framerate drops to only 62.3 FPS. This is only a tiny performance drop.
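For perspective, those quoted figures work out to roughly a 2.4% framerate drop, or about 0.4 ms of extra frame time:

```python
# Quick arithmetic on the HardOCP figures quoted above.
no_aa_fps, fxaa_fps = 63.8, 62.3

drop_pct = (no_aa_fps - fxaa_fps) / no_aa_fps * 100
extra_ms = 1000 / fxaa_fps - 1000 / no_aa_fps

print(round(drop_pct, 1))  # 2.4 (% framerate drop)
print(round(extra_ms, 2))  # 0.38 (ms added per frame)
```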
(edited by Collective.9206)
WRONG – FXAA has a HUGE performance hit in this game. Since the majority of the game’s work is actually done on the CPU, not the GPU, any extraneous functions do have an effect on the game’s FPS. Example: in WvW, when two huge groups are fighting, the FPS drops to nothing with FXAA on; turn it off and you recover some of the lost FPS.
Just tested this in a big world event with players all over the show at the Karka Queen: it made no discernible difference whatsoever, on or off. FXAA is not a performance hog at all; it is a simple post-process effect overlaid on top of the game as a cheaper approximation of antialiasing.
You can google FXAA performance charts and almost all of them will paint the same picture, for instance in this review for MP3 http://www.hardocp.com/article/2012/06/12/max_payne_3_performance_iq_review/4#.U58asMdfCSE
The reviewer noted thusly:
With No AA the HD 7970 is averaging 63.8 FPS. When we turn on very high FXAA the average framerate drops to only 62.3 FPS. This is only a tiny performance drop.
I have done it here, in this game – a Radeon 7950 in my PC with the most up-to-date WHQL drivers and an i5 2500K running at 4.6 GHz. In a mob with FSAA on, 10 FPS; with it off, 35 FPS. That has been repeated again and again. It has been noted on many PC forums that FSAA is a hog and should be used judiciously.
You are assuming that all this processing occurs on the GPU, and it does not. The modified engine this game uses was written in 2006; you can assume that is what this game was written for – GPUs from that era.
When they add DX11 support to this game, much will be offloaded onto the GPU. Currently it is not.
I have done it here, in this game – a Radeon 7950 in my PC with the most up-to-date WHQL drivers and an i5 2500K running at 4.6 GHz. In a mob with FSAA on, 10 FPS; with it off, 35 FPS. That has been repeated again and again. It has been noted on many PC forums that FSAA is a hog and should be used judiciously.
You are assuming that all this processing occurs on the GPU, and it does not. The modified engine this game uses was written in 2006; you can assume that is what this game was written for – GPUs from that era.
I see where this misunderstanding is coming from. You’re talking about FSAA, an AMD-exclusive antialiasing method forced in the driver control panel; it’s similar to MSAA, and those methods are known performance hogs.
I’m talking about F[X]AA, the antialiasing built into the game’s options menu, which has next to no performance hit – similar to the SMAA method I want introduced, since SMAA is a much better post-process antialiasing solution than FXAA.
(edited by Collective.9206)
I have done it here, in this game – a Radeon 7950 in my PC with the most up-to-date WHQL drivers and an i5 2500K running at 4.6 GHz. In a mob with FSAA on, 10 FPS; with it off, 35 FPS. That has been repeated again and again. It has been noted on many PC forums that FSAA is a hog and should be used judiciously.
You are assuming that all this processing occurs on the GPU, and it does not. The modified engine this game uses was written in 2006; you can assume that is what this game was written for – GPUs from that era.
I see where this misunderstanding is coming from. You’re talking about FSAA, an AMD-exclusive antialiasing method forced in the driver control panel; it’s similar to MSAA, and those methods are known performance hogs.
I’m talking about F[X]AA, the antialiasing built into the game’s options menu, which has next to no performance hit – similar to the SMAA method I want introduced, since SMAA is a much better post-process antialiasing solution than FXAA.
That is still (unfortunately) NOT GPU-driven – it is CPU-driven (even though it is in the graphics options). FXAA DOES affect the FPS in this game, as do many of the other graphics options. This is why the most recent Nvidia WHQL drivers had specific optimizations to remove some of the calculations from the CPU. This game is also a single-threaded app.
You cannot compare Max Payne to this game – 2 different engines. A.Net built and designed their own game engine.
(edited by Dusty Moon.4382)
Trust me, FXAA is an EXTREMELY lightweight post-process driven by all of roughly two shader files. It is nothing like traditional antialiasing, which uses colour sampling to determine which lines to smooth out – that is where most of the performance cost comes in.
FXAA is a very simple set of approximation shaders, which is also why it’s a very cost-effective method of antialiasing. The only side effect is that the image can look blurry due to how it approximates aliased lines, and it can cause shimmering or a staircase effect on moving images.
Even if this game is CPU-bound rather than GPU-bound, FXAA quite literally cannot affect your FPS the way MSAA does. In any game, and I mean ANY game, enabling FXAA will not tank your FPS; it simply can’t, because the resource cost is so low.
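To show how simple the idea is, here’s a toy single-pass, luma-style edge blend in Python/NumPy. This is nowhere near real FXAA (which adds sub-pixel and edge-direction logic), just a sketch of the general shape: one cheap pass over the finished frame, no extra scene samples.

```python
import numpy as np

def toy_edge_blend(gray, threshold=0.25, blend=0.5):
    """Blend pixels whose horizontal contrast exceeds a threshold."""
    # Clamp-at-edge neighbours, like a shader sampling a texture at its border.
    left = np.pad(gray, ((0, 0), (1, 0)), mode="edge")[:, :-1]
    right = np.pad(gray, ((0, 0), (0, 1)), mode="edge")[:, 1:]
    contrast = np.maximum(np.abs(gray - left), np.abs(gray - right))
    edges = contrast > threshold            # only high-contrast pixels change
    smoothed = (left + gray + right) / 3.0  # cheap 3-tap average along the row
    out = gray.copy()
    out[edges] = (1 - blend) * gray[edges] + blend * smoothed[edges]
    return out

# A hard vertical edge: the filter softens only the pixels at the boundary.
frame = np.array([[0.0, 0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0, 1.0]])
print(toy_edge_blend(frame))
```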
FXAA is a post-process filter run over the frame just before display. Normally it has an extremely low impact on framerate.
SMAA is also a post-process filter, but it requires a bit more memory to execute and is slower than FXAA, and therefore has a greater impact on frame rate.
RIP City of Heroes
Is FXAA like an offshoot of the FX network? Their programming is OK I guess…
It’s a medical condition, they say its terminal….
RIP City of Heroes
You can use SMAA with SweetFX. Also, you can downscale using an Nvidia driver custom resolution. :/
I just tried in-game FXAA; I only lose 2 FPS with it on.
The Dragonfly Effect [Phi]
DragonBrand
You can use SMAA with SweetFX. Also, you can downscale using an Nvidia driver custom resolution. :/
The problem with forced methods is that they tend to block overlay applications like Steam/Fraps/Shadowplay/Overwolf etc.; those simply cease to function in-game. Having it built natively into the game as an AA option would be tremendously useful, as it would let such applications keep working, as well as improving the game’s overall image quality over FXAA.
Also, downscaling this game will totally annihilate your framerate. It’s already quite the performance hog in large battles due to the players on screen, and even with that nullified, the particle and attack effects are still quite draining. So enabling downscaling would destroy any chance of a playable framerate.
Which is why I prefer suggesting SMAA: its FPS hit is far more manageable. While a bit more intensive than FXAA’s negligible-at-best FPS hit, it’s still far less resource-intensive than forcing MSAA or downscaling.
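Rough numbers on why downscaling is so heavy (the resolutions here are just an example): GPU pixel work scales with the number of pixels rendered, so a 2x-per-axis custom resolution means roughly 4x the shading work before the image is scaled back down.

```python
# Pixel-count arithmetic for downsampling, using example resolutions.
native = 1920 * 1080       # displayed resolution
rendered = 3840 * 2160     # 2x per axis via a driver custom resolution
print(rendered / native)   # 4.0 -> roughly 4x the per-pixel shading work
```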
(edited by Collective.9206)
It will be in the gem store next week for 2000 gems.
Use sweetFX!
/15hugechars ash-legion
and the stupidest grown-ups who are the most grown-up.”
- C. S. Lewis
We don’t even do Dx10 or Dx11, so what are the odds they will do a manufacturer’s exclusive API?
RIP City of Heroes
RadeonPro adds SMAA injection as well, with a better GUI than the text-based SweetFX, and it has a pretty nice SweetFX configurator too. But it only works with AMD cards.
The maker of SweetFX said he would like to default to the RadeonPro implementation, but doesn’t only because of those hardware restrictions.
We don’t even do Dx10 or Dx11, so what are the odds they will do a manufacturer’s exclusive API?
SMAA is not exclusive to either GPU manufacturer; it is universal and works on either brand, the same way FXAA was created by Nvidia but can just as easily be used on AMD.
It’s a shame this game ended up on DX9. DX11 would be able to push far greater performance, as well as putting the CPU/GPU to better use rather than being limited to one or two CPU cores under DX9.
(edited by Collective.9206)
The game engine is a modified version of GW1’s, and GW1 used 9.0c. It is not a trivial bit of coding to move up to DX11, although I wish they would.
Also, DX is not what causes the game to be single-threaded; the programmers did that in the game engine.
(edited by Dusty Moon.4382)
First and foremost, GW2’s graphics were done in DX9 because of Windows XP users. More specifically, 32-bit Windows XP.
Now that Windows XP is no longer supported by Microsoft, A-Net “could” overhaul the whole game and bring it in line with Windows 7 and 8.
Will they? I have no idea.
Lady Alexis Hawk – Main – Necromancer
Ravion Hawk – Warrior
First and foremost, GW2’s graphics were done in DX9 because of Windows XP users. More specifically, 32-bit Windows XP.
Now that Windows XP is no longer supported by Microsoft, A-Net “could” overhaul the whole game and bring it in line with Windows 7 and 8.
Will they? I have no idea.
Considering how long it took them to change GW1 to not support older versions of Windows? Not very likely in the near future, but maybe eventually.
First and foremost, GW2’s graphics were done in DX9 because of Windows XP users. More specifically, 32-bit Windows XP.
Now that Windows XP is no longer supported by Microsoft, A-Net “could” overhaul the whole game and bring it in line with Windows 7 and 8.
Will they? I have no idea.
They designed this game to be played on as many PCs as possible. Just because Win XP is no longer supported does not mean people will get rid of it. The people who use it won’t change and will continue on with XP – that is a red herring argument.
I’m not quite following the CPU-bound vs. GPU-bound distinction. That being said, integrated graphics should be able to run everything maxed if it’s on an i5 or i7, right?
I’m not quite following the CPU-bound vs. GPU-bound distinction. That being said, integrated graphics should be able to run everything maxed if it’s on an i5 or i7, right?
No. The problem is that Intel integrated graphics really stinks; you really need a separate GPU if you are on Intel integrated graphics. The AMD APUs are more complex and generally designed better.
So what you’re saying is that if I have an A-10 I don’t need a graphics card? They just said the game is CPU-bound, and in every benchmark you can find, an Intel i5 will beat any AMD.
(edited by rgraze.5169)
You’re really confused, Dusty Moon: FSAA has nothing to do with FXAA, SMAA works on a D3D9 render pipeline just fine, virtually no one still running XP has the hardware to run GW2, and FXAA’s impact on frame times is negligible.
I’d stop posting if I were you.
On the suggestion itself, I’d love to see SMAA implemented. FXAA is worse than playing without antialiasing, and not everyone has the hardware to run the game supersampled.
Use sweetFX!
/15hugechars ash-legion
SweetFX applies SMAA to the whole final frame, including text.
It’s a decent alternative but native support would be vastly better.
(edited by Spiuk.8421)
@Dusty: Anet could try what Saints Row did and support both DX9 and DX11 codepaths, though admittedly that could lengthen workloads in the future, since both versions would need maintaining on Windows.
@rgraze: CPU-bound simply means a game uses the CPU more than the GPU. GPUs by design can process this kind of data far faster than a CPU can. A game leaning on the CPU more than the GPU can limit performance and bottleneck your computer in very intensive scenes; for instance, when you turn the culling options to show the highest player amount at the highest quality, your framerate tanks.
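A rough model of what CPU-bound means in practice (the numbers are made up for illustration): since the CPU and GPU work in parallel, frame time is set by whichever side is slower, so shaving a little GPU time off a CPU-limited frame changes nothing.

```python
def fps(cpu_ms, gpu_ms):
    # Frame time is dominated by the slower of the two sides.
    return 1000 / max(cpu_ms, gpu_ms)

cpu = 40.0           # e.g. a crowded WvW fight: lots of characters to process
gpu_with_aa = 12.0   # GPU frame time including a post-process AA pass
gpu_without = 11.6   # the same frame without it

print(fps(cpu, gpu_with_aa))  # 25.0
print(fps(cpu, gpu_without))  # 25.0 -- toggling the AA pass changes nothing here
```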
@Spiuk: Exactly, which is why I’d love an in-game option for SMAA. It’s easily supported by DX9, has only a slightly larger FPS hit than FXAA while antialiasing more effectively, and, as you say, an in-game option won’t force itself onto the entire rendered screen the way forcing FXAA/SMAA via driver/third-party tools does.