GTX 680 FTW+ FPS @ 2560 x 1600 Full Detail?

in Account & Technical Support

Posted by: Mirithal.7685

Hi All,

I have an EVGA GTX 680 on the way and just wanted to know how many FPS I should expect on a 2560 × 1600 monitor at full detail. This card will also be driving one other monitor with a much smaller resolution.

The reason I ask is because I’m thinking about ordering another GTX 680 and putting them in SLI. Will this be necessary?

Posted by: Ilithis Mithilander.3265

You’re missing a key piece of information for predicting a high, stable FPS: what processor are you running with the GTX 680? Sadly, it makes a world of difference.

Primary Guild: Testing Eternity [TE]
Chloe (Version 3):
[i7 930 @ 4.1Ghz (1.3875V) w/Cooler Master 120M][Gigabyte G1 Gaming GTX 970 (stock)]

Posted by: Mirithal.7685

Sorry! I should have specified… i7 3770K

Posted by: Rampage.7145

A second GTX 680 would be overkill for GW2. It will be great for Crysis 3 or Far Cry 3, but for GW2 it will make no difference in low-frame-rate situations: looking at the sky you will see 400 FPS instead of 250 using SLI, but in Lion’s Arch you will still see FPS as low as 30. SLI will make no difference in WvW, cities, or heavy PvE events.
A single GTX 680 will give you very good frame rates in general at that resolution; even running triple monitors or 3D Surround it will give you respectable performance. I would avoid SLI in general for games since it causes a lot of trouble: stuttering, and some games don’t scale properly. It is more of a marketing thing, you know, made possible in order to sell more video cards, but it really causes a lot of trouble sometimes. If you want really high-end performance, wait for the new Titan video card, though it is rumored to cost around $1600, so you probably want to save money :P

(edited by Rampage.7145)

Posted by: Caedmon.6798

A second GTX 680 would be overkill for GW2. It will be great for Crysis 3 or Far Cry 3, but for GW2 it will make no difference in low-frame-rate situations: looking at the sky you will see 400 FPS instead of 250 using SLI, but in Lion’s Arch you will still see FPS as low as 30. SLI will make no difference in WvW, cities, or heavy PvE events.
A single GTX 680 will give you very good frame rates in general at that resolution; even running triple monitors or 3D Surround it will give you respectable performance. I would avoid SLI in general for games since it causes a lot of trouble: stuttering, and some games don’t scale properly. It is more of a marketing thing, you know, made possible in order to sell more video cards, but it really causes a lot of trouble sometimes. If you want really high-end performance, wait for the new Titan video card, though it is rumored to cost around $1600, so you probably want to save money :P

This statement is not true… if I disable SLI I have noticeably lower FPS all round, and yes, LA included. With SLI enabled it pops right back up… yes again, LA included. SLI has a MAJOR impact on performance if you have the CPU to back it up.

Posted by: Mirithal.7685

Thank you very much for the input! That definitely steers me away from purchasing an additional card. I’d much rather save my $500 and just purchase a high-end card that is unreal (later on in the year).

What you mention regarding other games is definitely something I should look at more (I play BF, CoD)… however, I’ll do my own research on that front.

Why do you say low framerate in LA, WvW, heavy PVE, etc. with a SLI configuration (why doesn’t SLI help in these scenarios?)?

Posted by: Ilithis Mithilander.3265

Thank you very much for the input! That definitely steers me away from purchasing an additional card. I’d much rather save my $500 and just purchase a high-end card that is unreal (later on in the year).

What you mention regarding other games is definitely something I should look at more (I play BF, CoD)… however, I’ll do my own research on that front.

Why do you say low framerate in LA, WvW, heavy PVE, etc. with a SLI configuration (why doesn’t SLI help in these scenarios?)?

It’s because even a nice i7 3770K overclocked to 4.5 GHz is the limiting factor in that configuration. GW2 was programmed kitten-poorly for multithreading; as a result, it has 3 massive threads and a bunch of small ones, and if any of those three threads gets hung up, your FPS drops to garbage. Ideally, ArenaNet would split these threads into smaller threads, but that requires a lot of engine work.

To answer your original question, though: you’ll have one of the best machines to play GW2 in its current state. Even the best, however, is still limited to low FPS (around 30) in a crowded LA, crowded dynamic events, and chaotic 100+ man WvW battles, because of how inefficient the GW2 engine is at utilizing the CPU. The GTX 680 is fine for your setup and won’t be the limiting factor.
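A toy sketch of the bottleneck described above: a frame can only complete once the slowest of a handful of big threads finishes, so one hung thread drags the whole frame rate down. The thread names and millisecond costs below are invented for illustration, not measurements of the real engine.

```python
# Toy model of a frame loop bottlenecked by its slowest "big" thread.
# Thread names and per-frame costs (ms) are made up for illustration.

def frame_time_ms(thread_costs_ms):
    """A frame finishes only when every main thread has finished,
    so the frame time is set by the slowest one."""
    return max(thread_costs_ms.values())

def fps(thread_costs_ms):
    return 1000.0 / frame_time_ms(thread_costs_ms)

# Quiet scene: the three hypothetical big threads are balanced.
quiet = {"simulation": 6.0, "render_submit": 7.0, "network": 4.0}

# Crowded scene: one thread (say, simulating 100+ players) hangs.
crowded = {"simulation": 33.0, "render_submit": 8.0, "network": 5.0}

print(round(fps(quiet), 1))    # 142.9
print(round(fps(crowded), 1))  # 30.3 -- one slow thread caps everything
```

Splitting the heavy thread into smaller ones (the engine work mentioned above) would lower the `max()` term, which is the only way to raise FPS in this model.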

Here’s to a new (dare I say) DX10/11 engine in 2014. Cheers!

Primary Guild: Testing Eternity [TE]
Chloe (Version 3):
[i7 930 @ 4.1Ghz (1.3875V) w/Cooler Master 120M][Gigabyte G1 Gaming GTX 970 (stock)]

(edited by Ilithis Mithilander.3265)

Posted by: hoegarden.4287

You can get a Sapphire Radeon Vapor-X HD 7970 GHz OC with 3 GB GDDR5. In Belgium that one is at €384.40.
The GTX 680 is around €495.69 here… and has only 2 GB of GDDR5 RAM.
I have the Radeon and I’m pretty happy with it. Far Cry, Sleeping Dogs, Hitman, and Crysis 3 run pretty nicely on max settings.

Posted by: Mirithal.7685

A second GTX 680 would be overkill for GW2. It will be great for Crysis 3 or Far Cry 3, but for GW2 it will make no difference in low-frame-rate situations: looking at the sky you will see 400 FPS instead of 250 using SLI, but in Lion’s Arch you will still see FPS as low as 30. SLI will make no difference in WvW, cities, or heavy PvE events.
A single GTX 680 will give you very good frame rates in general at that resolution; even running triple monitors or 3D Surround it will give you respectable performance. I would avoid SLI in general for games since it causes a lot of trouble: stuttering, and some games don’t scale properly. It is more of a marketing thing, you know, made possible in order to sell more video cards, but it really causes a lot of trouble sometimes. If you want really high-end performance, wait for the new Titan video card, though it is rumored to cost around $1600, so you probably want to save money :P

This statement is not true… if I disable SLI I have noticeably lower FPS all round, and yes, LA included. With SLI enabled it pops right back up… yes again, LA included. SLI has a MAJOR impact on performance if you have the CPU to back it up.

@Caedmon
The statement they made was not:

“SLI will not make a difference in FPS for GW2.”

If I understand them correctly, their statement is:

“Since the 680 is a high-end graphics card, there isn’t a CPU on the (consumer) market that will back it up. Therefore, adding an additional 680 in an SLI configuration will not benefit low-FPS scenarios like heavy WvW, big cities, etc. This doesn’t mean that an SLI configuration for lower-end graphics cards will not be beneficial in GW2.”

That’s my interpretation.

Posted by: Rampage.7145

Let me put it this way: in LA, using a single GTX 680 you will get 30-160 FPS. Using two in SLI you won’t get 50-300, you will get 30-300. SLI helps whenever the CPU is not taxed, so the cards can stretch their legs; whenever you get into a CPU-bound situation (90% of the time in GW2) you won’t get a single extra frame from two cards over one, and you can even get a jerky, stuttering screen because the CPU is unable to feed the two cards equally.
In a perfect-world situation the CPU should be able to feed the cards so they render frames 1:1; this means if you are at 120 frames per second, each card should be pulling 60. This is not the case in GW2: due to CPU limitations and the frame rate constantly going up and down, the video cards won’t render frames evenly, which means if you are getting 60 FPS, one card may be pulling 40 and the other 20. This causes microstuttering, and even while your FPS counter might show a decent frame rate, your gaming experience can be as bad as playing at 20 FPS.
So, in conclusion: dual GPUs might be a good idea for GPU-bound games, because you know for sure your CPU will be able to feed the cards properly. In CPU-bound games it can potentially be a microstutter mess (it depends on the drivers as well), but overall you will just get a higher maximum FPS while the lowest FPS stays about the same.
It would basically be more beneficial to use the $500 in cash to wipe your kitten than to buy a second GTX 680 for GW2 (and kinda sad as well).
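The scaling argument above can be sketched as a toy model: with alternate-frame rendering, a second GPU roughly halves the GPU work per frame, but the CPU still has to prepare every frame, so the bottleneck term is unchanged in CPU-bound scenes. All the millisecond numbers here are invented for illustration.

```python
# Toy model: frame time is whichever side is slower, CPU or (shared) GPU work.
# Millisecond costs are made up for illustration, not benchmarks.

def fps(cpu_ms, gpu_ms, n_gpus=1):
    frame_ms = max(cpu_ms, gpu_ms / n_gpus)  # the bottleneck wins
    return 1000.0 / frame_ms

# GPU-bound scene (staring at the sky, heavy pixels): SLI scales nicely.
print(round(fps(cpu_ms=4.0, gpu_ms=10.0, n_gpus=1)))  # 100
print(round(fps(cpu_ms=4.0, gpu_ms=10.0, n_gpus=2)))  # 200

# CPU-bound scene (Lion's Arch, big WvW fight): a second GPU buys nothing.
print(round(fps(cpu_ms=33.0, gpu_ms=10.0, n_gpus=1)))  # 30
print(round(fps(cpu_ms=33.0, gpu_ms=10.0, n_gpus=2)))  # 30
```

This is why the maximum FPS rises with SLI while the minimum FPS, which occurs exactly when `cpu_ms` dominates, stays put.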

(edited by Rampage.7145)

Posted by: Unspec.9017

If you have another 600 dollars to burn, you might as well just skip another card and upgrade to a 2011 system. In GW2, CPU is more important than your GPU.

Posted by: Ilithis Mithilander.3265

If you have another 600 dollars to burn, you might as well just skip another card and upgrade to a 2011 system. In GW2, CPU is more important than your GPU.

By 2011 I hope you mean the socket :p

If you are set on upgrading, I would make sure that Haswell-E follows suit with Sandy Bridge-E and Ivy Bridge-E. Ivy Bridge-E is due out later in 2013 IIRC, and Haswell-E most likely at the end of 2014, so that would make for a nice upgrade path if Haswell-E stayed on the same socket.

Primary Guild: Testing Eternity [TE]
Chloe (Version 3):
[i7 930 @ 4.1Ghz (1.3875V) w/Cooler Master 120M][Gigabyte G1 Gaming GTX 970 (stock)]

(edited by Ilithis Mithilander.3265)

Posted by: MaRko.3165

Simply put: after you have enough GPU power to handle your screen resolution, the limiting factor for GW2 becomes your CPU.

There are players with very fast CPUs and a single GTX 460 who get what they might consider ‘acceptable performance’ from those rigs. GW2 requires good CPU single-thread performance.

Nvidia’s SLI supports more titles than I’ve played; at my current rate of two weeks to two months per title, I’ve gone through a few dozen titles in the last few years, all of which were run in SLI/3DVision mode.

I will, however, mention that I’ve steered clear of upgrading to Kepler GPU boards because of issues some driver versions/games have sometimes had with SLI setups.

In my personal experience, using two 570s in SLI improves my 3DVision frame rate over using a single card alone.

“I was playing Farmville and a kitten MMO GW2 broke out of it…”
I cut my gaming teeth on Adventure&ZorkI,II,III.
i7-2600K/8G/GTX570SLI/WIN7/Stereoscopic_3D

Posted by: Rampage.7145

If you have another 600 dollars to burn, you might as well just skip another card and upgrade to a 2011 system. In GW2, CPU is more important than your GPU.

I think that is a bad idea, since GW2 makes no use of the extra cores and 2011 CPUs are slower clock-for-clock than the 3770K, which means that in GW2 the 3770K is the fastest CPU you can buy at the moment.

Posted by: Unspec.9017

If you have another 600 dollars to burn, you might as well just skip another card and upgrade to a 2011 system. In GW2, CPU is more important than your GPU.

I think that is a bad idea, since GW2 makes no use of the extra cores and 2011 CPUs are slower clock-for-clock than the 3770K, which means that in GW2 the 3770K is the fastest CPU you can buy at the moment.

Wouldn’t the higher amount of L3 cache be beneficial?

Posted by: Beau ter Ham.3709

OP, just for reference, since I have a very similar system to the one you currently have, I think you might be interested in my results.

I’m running GW2 at a 2560×1600 resolution (30" monitor), with a 3770k at 5.2Ghz, and a GTX 680 with a core speed of 1200 Mhz.

In my experience, when the CPU clock speed is raised from 5.2 GHz to 5.3 GHz, the FPS doesn’t change at all; there is no performance increase.
Also, when the core speed of the GTX 680 was raised from 1200 MHz to 1600 MHz, the maximum framerate increased overall, but the minimum framerate did not increase by a large margin in graphically intense scenes such as WvW (the minimum framerate in a zerg-versus-zerg moment is still 30 FPS and upwards). These results suggest that while SLI will yield some performance increase, due to the way SLI behaves in GW2 (i.e. microstuttering, sub-par scaling) it will not give you a much better gaming experience than using only one GTX 680 to power your 30" monitor.

In my specific graphics settings I have Reflections set to None and Shadows set to Low. This gives a nice boost to your minimum framerates, while the image quality is practically the same as Best Appearance.

My CPU is cooled by a phase-change unit with an evaporator temperature of -40 °C, which allows the CPU to attain speeds as high as 5.3 GHz in GW2. The test with the GTX 680 at a core speed of 1600 MHz was also done with a second phase-change unit.

In the clip posted below you can see the system running at a 5.2 GHz CPU speed and a 1600 MHz GPU speed. The clip was made in Divinity’s Reach with the graphics set to ‘Best Appearance’ without further adjustments, which makes it possible to compare my framerate with your own.

If you have the money, I would personally invest it in a fast single-GPU card (e.g. the upcoming GeForce Titan) instead of choosing GTX 680s in SLI.

The clip:

(edited by Beau ter Ham.3709)

Posted by: Rampage.7145

If you have another 600 dollars to burn, you might as well just skip another card and upgrade to a 2011 system. In GW2, CPU is more important than your GPU.

I think that is a bad idea, since GW2 makes no use of the extra cores and 2011 CPUs are slower clock-for-clock than the 3770K, which means that in GW2 the 3770K is the fastest CPU you can buy at the moment.

Wouldn’t the higher amounts of L3 cache be beneficial?

Yeah, for Cinema 4D, 3ds Max, or video editing, the higher cache and quad-channel memory are the way to go, but for games they will hardly be beneficial. The 3770K’s cache latencies are very good and 8 MB is plenty; I don’t think the extra cache could significantly improve performance in any game on the market at the moment. The i7 3820 (LGA 2011) loses to the 3770K in 9/10 benchmarks, and it has the higher cache amount. Cool stuff like quad-channel memory or support for Ivy-E CPUs is a plus for LGA 2011, but it hardly benefits a gaming setup either, at least not for GW2. Maybe for future gaming, but it is never wise to buy top-of-the-line gear as a future-proofing method, since most likely 2014’s $190 CPU will perform better than today’s $1000 CPUs :P
The fact that GW2 is a “CPU-bound” game hardly means it makes proper use of current CPU technology; in fact it makes terrible use of CPU resources, and because of this a $200 i5 2500K will perform just as well (or as badly) as the $1000 i7 3960X in this game.

(edited by Rampage.7145)

Posted by: Jazhara Knightmage.4389

Rampage, actually 2011 CPUs (the top-end ones), though slower clock-for-clock, also run cooler and will overclock better with proper cooling; unlike Ivy they didn’t use cheap, horrible TIM under the IHS.

But either way, it’s a dumb buy at this point: Haswell replaces the socket again, so you wouldn’t have any real upgrade path…

This is the biggest problem building Intel for people. Many of my clients are used to being able to buy a decent CPU for a decent price and then, in 1-3 years, buy a new CPU that will drop in and be a good bit faster. You can’t do that with Intel; they change sockets every 1-1.5 years it seems, and when they do, they totally abandon older-socket support.

AMD on the other hand, though slower core-for-core and clock-for-clock, gives you a much longer upgrade path, and with their current chips’ price for multithreaded performance AMD kitten pillages and plunders Intel’s options at the same price.

More and more engines are being optimized for more and more cores; look at BF3 performance on a stock 8350 vs the 3470, 3570K, and 3770K chips, then note that the 8350 costs the same as a locked 3470 but is fully unlocked and easy to overclock.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Raijinn.9065

In GW2 I get an average of 15-20 extra frames from enabling SLI: GTX 670 SLI with a 3770K @ 4.6 GHz, and my resolution is 5900×1080 @ 75 Hz. I would highly recommend going SLI, even if it’s only 10-15 average frames… that’s worth it to me. The key in my setup is that my motherboard lets both video cards run at x16/x16; that right there would be your bottleneck (both cards at x8/x8), especially at that resolution.

Posted by: Caedmon.6798


@Caedmon
The statement they made was not :

“SLI will not make a difference in FPS for GW2”.

If I understand them correctly, their statement is:

“Since 680 is a high-end graphics card, there isn’t a CPU on the (consumer) market that will back it up. Therefore, adding an additional 680 in a SLI configuration will not benefit low-FPS scenarios like heavy WvW, big cities, etc. This doesn’t mean that a SLI configuration for lower-end graphics cards will not be beneficial to GW2”

That’s my interpretation.

Their statement? I replied to one person, unless he is schizo, which I doubt… Anyway, I stick to what I said, and here is what I do not agree with at all:

“I would avoid SLI in general for games since it causes a lot of trouble: stuttering, and some games don’t scale properly. It is more of a marketing thing, you know, made possible in order to sell more video cards, but it really causes a lot of trouble sometimes. If you want really high-end performance, wait for the new Titan video card, though it is rumored to cost around $1600, so you probably want to save money :P”

This is just complete bollocks. I won’t even go into details, since I’m too lazy to bother; I’ve been using SLI since around 2004-2005, never had a problem, and it always boosts my FPS massively. Ignorant statement.

Posted by: Rampage.7145

This is just complete bollocks. I won’t even go into details, since I’m too lazy to bother; I’ve been using SLI since around 2004-2005, never had a problem, and it always boosts my FPS massively. Ignorant statement.

SLI does cause trouble and microstuttering. They tend to fix this kind of stuff over time, but it doesn’t always end up scaling properly, and you most likely have to wait for driver fixes for specific games. SLI is problematic; you need decent hardware knowledge to deal with a dual-card configuration, and it is not for everybody. You can check out the Nvidia forums and see thousands of people complaining about this kind of stuff… oh wait, no, they are all ignorant! A single card is ALWAYS better than dual cards. I have run 3 SLI configs in the past and say this from experience.

Posted by: Jazhara Knightmage.4389

Rampage: clearly Caedmon.6798 hasn’t bothered to read where both nVidia and AMD admit to microstutter, and the fact is, it’s worse on nVidia SLI setups than AMD ones at the moment.

I say this having built 6 SLI setups in the last year, 4 on higher-end Intel and 2 on AMD. In all cases I was able to find microstutter in some titles, not all titles but some, and there are some things that can help or even fix it for some titles, like forcing Windows 7 to use HPET (don’t do this on first-gen Core i5/i7 systems; more often than not it causes other issues). On the AMD side, I have found that using RadeonPro to tweak profiles tends to fix CrossFire microstutter in most games; mind you, you can enable forced HPET mode on any semi-modern AMD system without issue.

Microstutter is caused by frame times/latency, not FPS; you can have bad latency but good FPS…
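That distinction can be sketched with a toy example: two frame-time sequences with the same average FPS but very different pacing. The millisecond values are invented for illustration.

```python
# Toy illustration of why an FPS counter can hide microstutter:
# two sequences with the SAME average FPS but very different pacing.
# Frame times (ms) are made up for illustration.

even    = [16.7] * 12       # smooth: one frame every ~16.7 ms
stutter = [8.0, 25.4] * 6   # alternating short/long frames (AFR-style pacing)

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(round(avg_fps(even)))      # 60
print(round(avg_fps(stutter)))   # 60 -- the counter says the same...
print(max(even), max(stutter))   # ...but the worst-case frame time differs
```

The averages match, so an FPS overlay reports both as "60 FPS", yet the second sequence stalls for 25+ ms every other frame, which is what the eye perceives as stutter.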

I’m not a fanboi; I have owned as many nVidia GPUs over the years as ATI/AMD ones, and I have also owned many other brands back when we had more choices. It’s just a fact that both sides have microstutter issues:
http://www.overclockers.com/micro-stutter-the-dark-secret-of-sli-and-crossfire/

Now, one thing I have noticed is that it’s more common on Intel chipset systems than AMD or nVidia chipset systems. I’m not sure why this is, but I am far from alone in having noticed this weird trend…

Either way, microstutter is real, and as the Wikipedia article about it says, on AMD cards you can use RadeonPro to pretty much totally fix it; there’s no similar tool for nVidia anymore since nHancer died.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Rampage.7145

http://www.overclockers.com/micro-stutter-the-dark-secret-of-sli-and-crossfire/

Either way, microstutter is real, and as the Wikipedia article about it says, on AMD cards you can use RadeonPro to pretty much totally fix it; there’s no similar tool for nVidia anymore since nHancer died.

THAT CANNOT BE, MICROSTUTTER DOESN’T EXIST!!!!! Nah, maybe all those people writing articles around the web and complaining in forums about experiencing microstutter on dual-card configs are all IGNORANT; such a thing doesn’t even exist. In fact Nvidia and AMD will only produce dual-GPU cards now, cuz this is all a lie!

Troll mode: OFF

Posted by: Caedmon.6798

SLI does cause trouble and microstuttering. They tend to fix this kind of stuff over time, but it doesn’t always end up scaling properly, and you most likely have to wait for driver fixes for specific games. SLI is problematic; you need decent hardware knowledge to deal with a dual-card configuration, and it is not for everybody. You can check out the Nvidia forums and see thousands of people complaining about this kind of stuff… oh wait, no, they are all ignorant! A single card is ALWAYS better than dual cards. I have run 3 SLI configs in the past and say this from experience.

Can you read? I said I have been using SLI since 2004-2005 and never had a problem with it. The problem usually sits in front of the PC. That you’re unable to accept this as a fact is none of my concern. The thing that differs me from these people: I know what it is that I’m doing, and I do not screw around.

(edited by Caedmon.6798)

Posted by: MaRko.3165

“Micro stutter” – I’ve heard of it; I can’t say I’ve ever caught more than a glance of it. It’s never been an ‘issue’ for me; however, I’ve only run dual-card rigs.

While I’ve not actually TRIED and USED a 690 or 680 SLI configuration with GW2, IMHO such a configuration would not benefit my current rig playing GW2.

(Why folks who are not currently using an SLI rig with GW2 are commenting escapes me.)

“I was playing Farmville and a kitten MMO GW2 broke out of it…”
I cut my gaming teeth on Adventure&ZorkI,II,III.
i7-2600K/8G/GTX570SLI/WIN7/Stereoscopic_3D

Posted by: Jazhara Knightmage.4389

Can you read? I said I have been using SLI since 2004-2005 and never had a problem with it. The problem usually sits in front of the PC. That you’re unable to accept this as a fact is none of my concern. The thing that differs me from these people: I know what it is that I’m doing, and I do not screw around.

http://www.tmrepository.com/trademarks/worksforme/

In short: because you haven’t seen it, or didn’t notice it, it doesn’t exist… if only that were true… but alas, if you google microstutter it even has its own Wikipedia page…

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x