Summary : SLI scaling feels very poor now. What’s happened?
=========
Asus P8P67 Pro
i7-2600K at 4.4GHz
2x Gigabyte GTX 970 G1 in SLI
8GB DDR3 1600MHz
1000W PSU
2x 27" 1440p 120Hz monitors
etc etc etc
In-game settings are :
2560×1440
96hz
Unlimited / no frame limiter
Animation : High
Antialiasing : None
Environment : High
LOD Distance : Ultra
Reflections : Terrain and Sky
Textures : High
Render Sampling : Supersample
Shadows : High
Shaders : High
Postprocessing : High
Character Model Limit : Medium
Character Model Quality : Low
Ambient Occlusion : On
Best Texture Filtering : On
Depth Blur : On
Effect LOD : On
High-Res Character Textures : On
Light Adaption : On
When I started GW2 years ago, I had the same setup as above, sans the graphics cards, which were GTX 680s in SLI.
There were two spots I would stand to test performance and make adjustments.
-Ligacus Aquilo Waypoint, when you land in front of the portal, facing the Black Citadel
-Fort Marriner Waypoint, standing on the bridge facing the Grand Piazza (after the revamp, the point I used was at the end of the stone bridge just before entering the Piazza, facing the center tree)
Of course the game engine was different back then, and some features did not exist. However, Reflections (from Terrain & Sky to Full), Render Sampling (from Native to Supersample) and Shadows (from High to Ultra) each made a significant difference.
Therefore the results of comparing the GTX 680s to the GPUs used afterwards will be skewed, though there is some merit in the findings.
At the two waypoints mentioned above, with the settings enabled (those that were available at the time), the GTX 680s managed to hold 60fps as a minimum frame rate. Occasionally, with a mass of people running past the screen, this would dip just under.
Fast forward to 2014/2015, using the same hardware and settings above, except with R9 290s in CrossFire. The AMD cards did even worse, hovering around 40-50fps in those two teleport locations.
Come forward again to early 2016, using GTX 970s in SLI: Black Citadel sits in the low 60s and the Grand Piazza/Lion's Arch in the high 70s.
To point out how strange this is: when landing at Ligacus Aquilo in the Black Citadel, if you move forward a few steps then turn around to your south-east(?), facing the fog valley leading to Ligacus Notos, I am still getting no more than 80fps. Yet there's really not much there: no drawn mobs, no landscape, no extensive backdrop. It's just a mountain wall, some fog, and that's it.
I also noticed that I cannot peg 96fps in all outdoor areas. Even in ones where I am completely alone and doing nothing, there is some odd fluctuation in the frame rate, between 93 and 96, often induced by a camera pan. I would expect it to be stuck at 96, given that removing the 96Hz cap sees the cards go well over 100fps.
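For perspective, that 93-96fps wobble is tiny when expressed as frame time rather than frame rate. A quick sketch of the arithmetic (fps values taken from my observations above):

```python
# Convert fps to per-frame render time in milliseconds, to see how
# small the 93-96fps fluctuation actually is.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

cap = frame_time_ms(96)  # ~10.42 ms per frame at the 96fps cap
dip = frame_time_ms(93)  # ~10.75 ms per frame during the dip
print(f"{dip - cap:.2f} ms extra per frame")  # ~0.34 ms
```

So the dips represent only about a third of a millisecond of extra work per frame, which is why they are hard to pin on any one setting.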
Then when I encounter a few human players or small battles (or even crossing the Diessa Plateau and facing the windmill), it drops to the high 80s out of combat.
GPU and CPU usage seem fair, with all three (both GPUs and the CPU) pushing 50-60% at times where required.
I decided to try to make any bottlenecks more GPU-oriented and force higher GPU usage by upping the resolution to 4K. While GPU activity increased by almost 25% (no doubt to manage the increased resolution output), frame rates only dropped ten to fifteen fps in most outdoor areas.
Next I disabled SLI, and with a single card my results were twenty fps lower in all scenarios.
Additionally, I moved Render Sampling back down to Native and gained no performance in doing so.
Other things I’ve tried :
Using various SLI compatibility bits in Inspector, though none seem to make a difference.
Different SLI render modes (AFR 2 render mode is still the best for performance)
Removed Nvidia’s presetting for Ambient Occlusion
Disabled the following settings :
Ambient Occlusion
Best Texture Filtering
Depth Blur
Effect LOD
High-Res Character Textures
Light Adaption
Reducing quality of all normal settings.
I appreciate that after disabling many features and lowering settings, some performance was gained; however, we're talking five or six frames.
The biggest improvement came from reducing the field of view. Moving the slider to about 25% versus 60% gained in excess of twenty fps.
And of course zooming to first person shaves another five fps at times.
It’s my opinion that SLI is not scaling well: less than 50%.
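A rough back-of-the-envelope check supports that figure. Using the approximate numbers above (around 80fps with SLI, twenty fps lower on a single card; exact values vary by location, so this is illustrative only):

```python
# Estimate SLI scaling: the extra performance from the second GPU,
# expressed as a fraction of single-GPU performance.
# 1.0 would be perfect 100% scaling.
def sli_scaling(sli_fps: float, single_fps: float) -> float:
    return sli_fps / single_fps - 1.0

# Approximate figures from the testing above: ~80fps SLI, ~60fps single card.
print(f"SLI scaling: {sli_scaling(80, 60):.0%}")  # SLI scaling: 33%
```

So the second card is adding roughly a third more performance, well under the 50% mark, let alone the 70-90% scaling SLI can show in better-behaved titles.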
Admittedly these games are CPU-intensive, but unless I'm mistaken, that should not stop the GPUs from at least being more saturated, especially considering that as the resolution rises, the system becomes more dependent on the GPU.
Players boast about running 1080p at 100+ frames on a single modern performance graphics card, where the CPU matters more at the reduced resolution.
I'm toying with the idea of moving to a Skylake or even a Haswell (DC) platform for MMO gaming. Everywhere I read, though, it doesn't seem like real-world performance gets much better.
I am feeling frustrated that something is missing. I am not expecting maximum frames in large battles or PvP, though I'm disheartened that it cannot peg 96fps in all outdoor areas.
I may just have to make a custom resolution around 85Hz and then cap the frame rate to match. It'll still feel better than 60, though it's disappointing.
(edited by Cable.2746)