We’ve added new graphics options for players with higher-end machines.
- Tune the look of your scene with ambient occlusion or light adaptation.
- Choose between two extra antialiasing options: SMAA Low and SMAA High.
I’ve got a laptop with NVIDIA Optimus and a GTX 970M. This is a competent graphics card even by desktop standards, hitting 60–100 fps on Best Appearance. However, the only anti-aliasing option that ever shows up is FXAA, the same as when I run the game on the Intel integrated graphics.
I think what’s going on is that GW2 is misidentifying my graphics card as the Intel integrated one. The way Optimus works, the Intel integrated GPU is always active and always drives the screen. The 970M acts as a co-processor: it does the 3D rendering for 3D apps, then hands each completed frame to the Intel GPU to display on the screen.
I think GW2 is just doing a simple check for which graphics card is present. It sees the Intel integrated card and decides that’s what I have. It then limits the anti-aliasing options to something appropriate for an integrated card, even though the heavy lifting on my laptop is being done by a high-end NVIDIA card. On my five-year-old laptop with a paltry NVIDIA 325M, the FSAA x2 and x4 options do show up. That laptop has a hardware switch to turn on the NVIDIA card (and turn off the Intel integrated graphics), so only one GPU is ever visible to the game.
Could you please make the game recognize Optimus, or base the anti-aliasing options on measured performance rather than on the detected video card? Or (easier) just give us a way to manually override the idiot filter that prevents people with supposedly low-end graphics cards from turning on the high-end anti-aliasing options?