Nvidia 660 Ti, Intel i5 2320 @ 3 GHz, I think I should be getting better performance

in Account & Technical Support

Posted by: Xyphra.4601

Hey, my specs are in the title. I get tons of lag in WvW even though my CPU and GPU aren't running anywhere near full load, and I think I should be able to run my in-game graphics settings higher; I can't run shadows or anything. Is there anything I can do to fix this?

Posted by: spawnlink.1856

There seem to be known issues with Nvidia cards and shadows at the moment, so I'd say you're in the same boat. Other than turning shadows off, I haven't heard of a fix yet. Make sure your drivers are up to date.

It wouldn't hurt to post your current driver build and in-game display settings.

1100T @ 3.3GHz / Gig 990FXA-UD5 / 16GB Corsair 1600 /Radeon HD 6950 ICEQ 2GB / Win 7 pro 64
64GB & 128GB Crucial M4 / 3×1TB WD Black / 850w Corsair Mod / ARC Midi / ASUS Xonar DSX

Posted by: Cabbage.4085

I have a 660TI and I am not experiencing any issues.

Have you tried updating to the latest Nvidia drivers? If I remember right, there was a recent important update for the 6XX series.

Posted by: Xyphra.4601

I have the most current driver (I just checked). My in-game settings are:

Resolution: 1600×900
Animation: High
Antialiasing: FXAA
Environment: Medium
LOD Distance: High
Reflections: Terrain and Sky
Textures: High
Render Sampling: Native
Shadows: Medium
Shaders: High
Post-Processing: High
Texture Filtering: Best
Depth Blur: On
V-Sync: On
Frame Limit: 60 fps

With those settings, it doesn't lag in snowy environments.

I'm just thinking I shouldn't get lag in places like Lion's Arch and WvW; my CPU isn't even running high.

Posted by: Cbuzz.5083

At that low resolution you are passing most of the work to your CPU, which isn't particularly strong. Why did you even get a 660 Ti for that resolution?

i7 3770k@4.8Ghz / AIR
GTX 6GB Titan@1160Mhz
3007WFP@2560x1600

Posted by: AndyPandy.3471

At that low resolution you are passing most of the work to your CPU, which isn't particularly strong. Why did you even get a 660 Ti for that resolution?

If a €150 CPU "isn't particularly strong" for GW2, then something is seriously broken with GW2. Also, what exactly does "passing most of the work to your CPU" mean? What does resolution have to do with "passing" GPU work, if you cap to v-sync anyway?

PS: Just noticed your sig, "3930k@5Ghz"; okay, a kitten kid, that explains it. So yes, you can "solve" GW2's poor engine performance by throwing excessive money at a problem which should not exist in the first place.

(edited by AndyPandy.3471)

Posted by: Cbuzz.5083

At that low resolution you are passing most of the work to your CPU, which isn't particularly strong. Why did you even get a 660 Ti for that resolution?

If a €150 CPU "isn't particularly strong" for GW2, then something is seriously broken with GW2. Also, what exactly does "passing most of the work to your CPU" mean? What does resolution have to do with "passing" GPU work, if you cap to v-sync anyway?

PS: Just noticed your sig, "3930k@5Ghz"; okay, a kitten kid, that explains it. So yes, you can "solve" GW2's poor engine performance by throwing excessive money at a problem which should not exist in the first place.

Ha ha ha, I haven't been called a kid for 30 years, but thanks anyway. The lower the resolution you run, the more work your CPU has to do, because the GPU has fewer pixels to render and there is physically less on screen.

i7 3770k@4.8Ghz / AIR
GTX 6GB Titan@1160Mhz
3007WFP@2560x1600

Posted by: VirtualBS.3165

@Xyphra.4601:

  • Are you using triple-buffering with v-sync?
  • Have you tried disabling v-sync?

Posted by: Zakpu.2165

This is a great CPU, why spread misinformation?

Posted by: Zakpu.2165

At that low resolution you are passing most of the work to your CPU, which isn't particularly strong. Why did you even get a 660 Ti for that resolution?

If a €150 CPU "isn't particularly strong" for GW2, then something is seriously broken with GW2. Also, what exactly does "passing most of the work to your CPU" mean? What does resolution have to do with "passing" GPU work, if you cap to v-sync anyway?

PS: Just noticed your sig, "3930k@5Ghz"; okay, a kitten kid, that explains it. So yes, you can "solve" GW2's poor engine performance by throwing excessive money at a problem which should not exist in the first place.

Ha ha ha, I haven't been called a kid for 30 years, but thanks anyway. The lower the resolution you run, the more work your CPU has to do, because the GPU has fewer pixels to render and there is physically less on screen.

This is also misinformation. Where do you even get an idea like that?

Posted by: AndyPandy.3471

"The lower the resolution you run, the more work your CPU has to do, because the GPU has fewer pixels to render and there is physically less on screen."

What the heck are you talking about?
This makes no sense at all. Do you have any idea how the GPU and CPU interact in game engines? You can obviously overclock your €500 CPU, and you are "over" 30, so spend some extra time understanding the relationship between GPU, CPU, and resolution, and more specifically v-sync, triple buffering, and the new "target framerate" or "adaptive v-sync" options.

My "guess" is you're referring to the interaction between GPU/CPU load and v-sync when changing resolutions, which in some extreme cases will result in higher CPU load if you go from 20 FPS to 150+ FPS. This only happens in draw-call-bound PC games, when v-sync is disabled and you drop from something like 1920×1024 to 800×600.

(edited by AndyPandy.3471)

Posted by: VirtualBS.3165

Using v-sync without triple buffering will drop your fps straight to 30 the moment you fall from 60+ to 59 fps, and it will stay locked at 30 for as long as you would otherwise be anywhere between 30 and 59 fps.

The triple-buffering option in the nVidia control panel only works for OpenGL, so you'll need the game itself to support triple buffering, or you can use a third-party utility (D3DOverrider).

Actually, I don't think any dev has implemented proper triple buffering in D3D yet; all the common solutions are really just frame queuing (including D3DOverrider, but it works).

Bottom line: try the game first without v-sync. If that resolves your problem, get D3DOverrider (included with the RivaTuner utility).
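The 60-to-30 drop described above can be sketched with a toy model (my own illustration, not anything from GW2 or the drivers): with double-buffered v-sync on a 60 Hz screen, a finished frame is only shown on a refresh tick, so any frame that takes even slightly longer than ~16.7 ms has to wait for the tick after that.

```python
# Toy model of double-buffered v-sync on a 60 Hz display.
# Each frame is displayed only on a vertical refresh tick, so a frame
# that misses one tick waits a whole extra refresh interval.
import math

REFRESH_HZ = 60
TICK_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh interval

def displayed_fps(render_ms: float) -> float:
    """Effective fps when every frame must wait for the next refresh tick."""
    ticks = math.ceil(render_ms / TICK_MS)  # whole refresh intervals consumed
    return REFRESH_HZ / ticks

print(displayed_fps(16.0))  # fits in one interval -> 60.0
print(displayed_fps(17.0))  # just misses a tick -> 30.0
```

This is why the quantized steps are 60, 30, 20, 15 fps: the displayed rate can only be the refresh rate divided by a whole number of ticks. Triple buffering (or frame queuing) avoids the stall by letting the GPU keep rendering into a third buffer instead of waiting.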

Posted by: AndyPandy.3471

Using v-sync without triple buffering will drop your fps straight to 30 the moment you fall from 60+ to 59 fps, and it will stay locked at 30 for as long as you would otherwise be anywhere between 30 and 59 fps.

The triple-buffering option in the nVidia control panel only works for OpenGL, so you'll need the game itself to support triple buffering, or you can use a third-party utility (D3DOverrider).

Since the 300-series nVidia drivers there has been an alternative called "adaptive v-sync"; it basically just enables/disables v-sync dynamically, if I understand it correctly.

Posted by: Revolutia.6507

I am also trying to fix some fps issues, and I want to ask: how do I get rid of the tearing when I turn v-sync off?

Posted by: VirtualBS.3165

Since the 300-series nVidia drivers there has been an alternative called "adaptive v-sync"; it basically just enables/disables v-sync dynamically, if I understand it correctly.

True, thanks for pointing that out. The unfortunate side effect is that whenever the GPU is unable to deliver more than 60 fps in a particular scene, you are basically playing with v-sync off.

I think they made this option mostly to limit GPU power draw in games where you are regularly above 60 fps (so the GPU doesn't render frames that will never be seen on a 60 Hz screen).
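A minimal sketch of that behavior as I understand it (my own toy model, not NVIDIA's actual implementation): v-sync is applied only when the raw framerate can keep up with the refresh rate, otherwise it is switched off.

```python
# Toy model of adaptive v-sync on a 60 Hz display:
# sync (cap + no tearing) only when the GPU keeps up, otherwise run unsynced.
REFRESH_HZ = 60

def adaptive_vsync_fps(raw_fps: float) -> tuple:
    """Return (displayed fps, whether tearing is possible)."""
    if raw_fps >= REFRESH_HZ:
        return (REFRESH_HZ, False)  # v-sync on: capped at refresh, no tearing
    return (raw_fps, True)          # v-sync off: no 30 fps lock, but tearing

print(adaptive_vsync_fps(144))  # (60, False) -> capped, tear-free
print(adaptive_vsync_fps(45))   # (45, True)  -> no half-rate drop, may tear
```

So you avoid the hard 60-to-30 quantization of plain double-buffered v-sync, at the cost of possible tearing whenever the scene is too heavy for 60 fps.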

Posted by: VirtualBS.3165

I am also trying to fix some of the fps issues and I want to ask, how do I get rid of the tearing when I put v-sync off?

You don't; that's why v-sync exists, to eliminate tearing. The only way to play tearing-free is to use v-sync plus triple buffering (or, alternatively, frame queuing), so that when you drop below your screen's refresh rate the framerate still stays smooth, just like without v-sync.

Try D3DOverrider, or, if your motherboard supports it, LucidLogix's Virtu MVP, which comes with a more elaborate "Virtual Vsync" feature.

(edited by VirtualBS.3165)

Posted by: michaeljhuman.3940

Every game benchmark I have seen shows that if you lower your graphics settings enough (the resolution especially), you eventually become CPU-bound.

It's not that the CPU load goes up; it's that the GPU stops being the bottleneck and the CPU becomes the bottleneck instead.
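As a toy model (illustrative numbers only, not measurements of any real hardware or of GW2): treat the frame time as the slower of a roughly resolution-independent CPU cost and a GPU cost that scales with pixel count.

```python
# Toy bottleneck model: the frame rate is set by the slower stage.
# cpu_ms: per-frame game/draw-call work, roughly independent of resolution.
# gpu_ms_per_mpix: hypothetical GPU cost per megapixel of output.
def fps(cpu_ms: float, gpu_ms_per_mpix: float, width: int, height: int) -> float:
    gpu_ms = gpu_ms_per_mpix * (width * height) / 1e6  # scales with pixels
    return 1000 / max(cpu_ms, gpu_ms)  # slower of CPU and GPU wins

# Same hypothetical 12 ms CPU cost at two resolutions:
print(round(fps(12, 8, 2560, 1440)))  # GPU-bound: ~34 fps
print(round(fps(12, 8, 1600, 900)))   # CPU-bound: ~83 fps
```

At the lower resolution the GPU finishes first and idles, so lowering settings further gains nothing; that matches the observation that CPU load doesn't "go up", the CPU just becomes the limiting stage.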

Posted by: VirtualBS.3165

Every game benchmark I have seen shows that if you lower your graphics settings enough (the resolution especially), you eventually become CPU-bound.

It's not that the CPU load goes up; it's that the GPU stops being the bottleneck and the CPU becomes the bottleneck instead.

+1!

Posted by: AndyPandy.3471

Every game benchmark I have seen shows that if you lower your graphics settings enough (the resolution especially), you eventually become CPU-bound.

It's not that the CPU load goes up; it's that the GPU stops being the bottleneck and the CPU becomes the bottleneck instead.

This is true for very low resolutions; benchmarks actually "need" this to test CPU performance in non-synthetic tests. They also lower the graphics settings to the absolute minimum, basically forcing the render engine into a CPU-bound mode with v-sync off.
That's fine for a CPU benchmark, and I guess that's where this confusing idea comes from, that by lowering your resolution from, say, 1920 to 1650 you suddenly become CPU-bound or the CPU works "more". Most games are either always partially CPU-bound, or only become CPU-bound if you lower the settings/resolution to an absolute minimum.

My guess in GW2's case is that the engine is suffering from poor thread locking and synchronization code: some threads are waiting for others to finish and end up blocking each other. The CPU/GPU usage I see is very erratic and fluctuates extremely. There seem to be a multitude of problems, ranging from bad render code to bad threading code.

(edited by AndyPandy.3471)