Dunno what kind of engine tweak it might need, but something needs tweaking, me or it.
I have two virtually identical PCs with two differences between them. Here are the specs:
- i7 4790k
- 16 GB Corsair Pro 1800 RAM
- ASUS Z97-PRO (Wi-Fi ac)
- PC1 – GTX 970 + 27" Samsung 1920×1080 @ 60Hz LED monitor
- PC1 running Win 10 (clean install), DX12
- PC2 – R9 390 + 27" BenQ 2560×1440 @ 144Hz LED monitor
- PC2 running Win 10 (upgraded from 8.1), DX12
Here’s the interesting thing: with identical in-game graphics settings, all at highest:
- PC1 = 87 FPS
- PC2 = 63 FPS (up from 56 FPS when it was running Win 8.1 & DX11)
So, where are the crazy promised increases with DX12, and why is the PC with the R9 390 running roughly 30% lower FPS?
I dropped the resolution on PC2 from 2560×1440 to 1920×1080, but it stayed stuck at 63 FPS. I don’t get it.
If I change rendering on PC2 from Supersample to Subsample, it goes up to 76 FPS, but still…
Is my R9 390 broken, or are the AMD drivers broken (yup, installed the latest)? It should be performing like a GTX 980.
/scratches head
EDIT:
Someone over on the AMD.com forums posted that GW2 is poorly optimized for AMD GPUs, which now makes sense, because none of the numbers above do when you look at the hardware.
So, is the game going to be optimized for AMD as well as NVIDIA, or is this going to be a constant case of AMD owners being fully kittened off for the rest of eternity?
Maybe the high cost of the expansion will pay for better code, or… ?