Yay or nay, GTX 680 SLI?
GW2 can do SLI, but honestly there's no need, as you won't get any FPS upgrade unless you are doing triple monitors or 4K, in which case the 2GB frame buffer of the 680 is lacking.
For BF4 at ultra 1080p, SLI 680s are great, as that game has pretty good multi-GPU scaling, especially if you can get the second card that cheap.
Edit: just make sure your motherboard has SLI support and your PSU is strong enough to run two 680s.
So it is true about not really gaining any improvement in FPS… I currently play with all settings on high except for character quality and limit on medium, anti-aliasing on FXAA, and native sampling with vsync. But when I'm in WvW and I join a zerg, FPS drops to around 30, and I was hoping this could solve that.
SLI is pretty useless.
When you really want it to increase frame rates, it doesn't, because the bottleneck is elsewhere.
There is always a chance of stuttering.
Not worth the headaches of drivers and such.
Yeah, SLI would yield an FPS gain. However, if it was just GW2 I wouldn't do it. I'm getting around 80 FPS average with a GTX 660 Ti, everything on ultra except for sampling (native) and post-processing (off), with a bare minimum of 50 FPS outside of huge zergs.
I could imagine a GTX 680 staying above 70 FPS at all times, so unless you were to play on a 120/144Hz monitor, there wouldn't be a change.
What???
SLI is not useless.
It does increase frames.
The bottleneck depends mostly on the CPU.
Stuttering? This game stutters no matter what, especially in zergs. I'm running the game in SLI and have had a 40-60 FPS gain on high-ultra settings on a 120Hz monitor with vsync on.
So SLI stutters when you really want the FPS increase. I said the problem with SLI is that it increases FPS when you don't really care. When you do care, there is always a different problem, such as a CPU bottleneck, preventing the frame increase.
Sounds pretty useless to me.
I'm using CrossFire and I'm pretty much covered at 1440p, but the frame-pacing issues are still there at 4K. It should be pretty similar for the green team.
You do get a nice boost with SLI enabled; I think I had about 25% more FPS overall. My second GPU died recently, so I'm currently on one GPU, and trust me, I can really tell the difference, especially in situations where there are 50+ people on your screen blasting effects all over the place. I really feel I'm slowing down at the moment, which didn't happen when I had SLI.
That made no sense at all. Maybe let people who are currently using CrossFire/SLI with GW2 provide constructive feedback rather than "sounds pretty useless".
Like I said: when you really want SLI to increase your frame rate, you are more likely bottlenecked somewhere else. When it does increase your frame rate, you are probably doing nothing demanding.
I find that kind of useless for SLI.
The only people who should SLI are people running higher resolutions than the normal 1080p.
I apologise. You're obviously more qualified to give definitive advice. More so than those people who are actually running CF/SLI.
As for your blanket statement about SLI only being useful for users running resolutions above 1080p, that's complete rubbish. Provide links to back up your statement on SLI scaling at 1080p. I run a 1080p monitor that's 120Hz, and the only way I can sustain that frame rate is with SLI, so that debunks your theory (it's not fact). Stop spreading this crap.
Be honest: you're not running CF/SLI, so why make these claims?
I care about minimum frame rates. During those frame-rate drops, you are more likely to be experiencing most of the action.
Take a look at an actual CPU-bottlenecked game:
CrossFire actually achieves worse worst-case performance than a single-card solution.
Please note, Civ V actually implemented CrossFire properly, unlike other games.
The sole problem with the vast majority of SLI benchmarks is that they use single-player games with little complexity. Multiplayer games such as ARMA, BF4, etc. show display issues such as micro-stutter.
I would not have made the statement that SLI is useless a few years ago. Today, single-card solutions are already CPU-bottlenecked, so there is no point in another GPU.
I will re-evaluate my statements after the Oculus Rift is released and more games have DirectX 12 support.
Again, you don't even have SLI/CF, and you're reinforcing your argument with a link to ONE game. The point is, it's not your opinion, it's someone else's. To make an objective argument, go out and actually use SLI/CF now and see the results for yourself in Guild Wars 2. 'Nuff said.
Think critically about what you're saying. How can you honestly disregard the actual experience of people using CF/SLI with GW2 in a multiplayer setting? I play BF4 multiplayer, and the net gain is huge with SLI over a single card; that's minimum frame rates too (at 1080p).
Again, blanket statements like "it's not worth the actual hassle with drivers and such".
What hassle? It took me under a minute to click a few buttons and run SLI without issues. Maybe it's time to re-evaluate your opinion? As you said, "The sole problem with the vast majority of benchmarks for sli are using single player game with little complexity. Multiplayer games such as arma, bf4, etc show display issues such as microshutter."
Thus logic dictates one should consider the experience of people ACTUALLY using CF/SLI with Guild Wars 2, correct?
Not just GW2, every other game.
Yeah, I spend time reading game forums, and there is always somebody who complains about a game not being SLI- or CF-optimized and requiring configuration.
I like to value the OP's time. The fact is that those dual-card solutions need vendor help and extra configuration to get things working. I find that a giant con, given that single-card solutions are pretty fast already. There are plenty of other games with broken SLI/CF that have micro-stuttering.
I just do not want people to add their name to those forums. Time should be spent playing games, not dealing with this BS.
https://forums.geforce.com/default/board/50/sli/1/
https://forums.geforce.com/default/topic/781212/sli/i-am-done-with-sli/post/4345003/#4345003
First question: does GW2 support SLI? I did some digging, and people said two years ago that it didn't. Can anyone confirm this as of today?
Now I know you're all like, why the GTX 680? Why not upgrade to a GTX 970 or 980?
Well, because I don't really want to waste all that money on a new card when GW2 runs fine on my GTX 680, and GW2 is mainly what I play. So why go SLI? Because a friend of mine would sell me his old GTX 680 for $150. The second game I play is BF4 at 1080p, and I'd be able to play at ultra.
No one has addressed the main issue with SLI/CF: cost.
The performance you get by adding an additional card is not 100%; it's more like 75% at best. You also need to make sure your PSU has enough power to sustain dual 680s.
If I were you, I would be looking at a newer card, maybe the GTX 970: more memory, faster GPU engine, more rendering units, etc. And it should cost about the same, if not just a little bit more. (GTX 680s on eBay are still going for ~$375 for the really good ones, such as Asus, MSI, etc.)
http://www.hwcompare.com/18057/geforce-gtx-970-vs-geforce-gtx-680/
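As a rough sanity check on that scaling claim, here is a toy calculation with the scaling fraction left as a free parameter (the 75% figure above is optimistic; a later post in this thread argues for much lower numbers — none of these are measured values):

```python
def sli_fps(single_card_fps, scaling):
    """Estimated frame rate with a second card, where `scaling` is the
    fraction of the second card's throughput you actually get
    (1.0 would be perfect linear scaling, which never happens)."""
    return single_card_fps * (1 + scaling)

# Starting from 60 fps on one GTX 680:
print(sli_fps(60, 0.75))  # 105.0 fps with the optimistic "75% at best"
print(sli_fps(60, 0.45))  # 87.0 fps with a more pessimistic ~45%
```

Either way, the second card never doubles your frame rate, so weigh the extra cost and PSU load against a 45-75% best-case gain.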
CF/SLI is really useful when you are pushing your GPUs harder than they can work singly. I run R9 290s in 2-way CF, and there is definitely a benefit there, but I generally only see it when running my 3 displays at 5040×900 rather than on a single display at 1600×900. Also, CF/SLI only works in full-screen mode, so if you run in windowed mode you will see zero benefit at all.
And GW2 specifically is CPU-bound for performance. When your CPU takes that hit (zerg content mainly), CF/SLI will do nothing here.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
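The CPU-bound point is the crux of the whole thread, and a toy frame-time model makes it concrete (the millisecond numbers are illustrative, not measurements): a frame takes as long as its slowest stage, and SLI/CF only divides the GPU stage.

```python
def frame_rate(cpu_ms, gpu_ms, n_gpus=1):
    """Toy bottleneck model: a frame takes as long as its slowest
    stage; SLI/CF (assuming ideal scaling) splits only the GPU work."""
    frame_ms = max(cpu_ms, gpu_ms / n_gpus)
    return 1000.0 / frame_ms

# GPU-bound scene (e.g. open-world scenery): a second card helps.
print(frame_rate(cpu_ms=10, gpu_ms=25, n_gpus=1))  # 40.0 fps
print(frame_rate(cpu_ms=10, gpu_ms=25, n_gpus=2))  # 80.0 fps

# CPU-bound zerg fight: the CPU stage dominates, so nothing changes.
print(frame_rate(cpu_ms=33, gpu_ms=20, n_gpus=1))  # ~30 fps
print(frame_rate(cpu_ms=33, gpu_ms=20, n_gpus=2))  # still ~30 fps
```

This is why the same SLI setup can double fps in a benchmark scene and do nothing in a 50-player zerg.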
Good god man, get some perspective. The OP will be using it for GW2 and BF4.
SLI works great on both.
There is nothing wrong with a perspective that does not agree with yours.
Sorry, I wasn't clear. I meant to say "lack of". Again, if you thought critically about your opinion, you might just come to realize that the people most qualified to provide the OP with feedback are the ones using either CrossFire or SLI with GW2.
It would be a different story if you could provide some real feedback to the OP pertaining to this game, but you can't.
Look, if SLI/CF was kitten, we'd have said so. Capiche? Good lad. And you're absolutely right, there's nothing wrong with an opinion that differs from mine, unbalanced or otherwise. Sound rude? No, it's the truth, because you don't have first-hand experience like the other posters.
If SLI/CF was as terrible as you'd like us to believe, I wouldn't have spent the $$$ in the first place. This is coming from someone who has been building gaming PCs for years, and I openly admit I'm an FPS kitten! :P
Great to see some real feedback from others in the community.
Hmmm, if you are going to insult me the whole time, you have issues. I am just telling the OP, in the harshest way, the reasons why he will have problems.
SLI/CF technology is not that great. It does not double VRAM. There are driver issues, and it requires vendor help. We do not know the exact VRAM of his cards. The OP might want to play Star Citizen down the road at very high settings.
I said it is useless today because we are in a strange situation. Single cards are powerful. DirectX 12, which will eliminate lots of problems, is not out yet. 3GB of VRAM might be the standard soon due to consoles. Many games are CPU-bottlenecked at action-packed moments.
The performance you get by adding an additional card is not 100% its more like 75% at best.
At best it is about 43% to 45%; the scaling is not even close to linear, so 75% isn't achievable in 99% of games.
SLI/CF technology is not that great. It does not double VRAM. There are driver issues, and it requires vendor help.
I said it is useless today because we are in a strange situation. Single cards are powerful.
On the contrary, it actually is great in many circumstances.
In single-monitor situations a single card is more than enough; the returns from CrossFire are quite diminished, with certain exceptions to the rule dependent on the game.
Where CrossFire really shines is in Eyefinity setups (which I currently use and play with). In a CrossFire setup at a resolution of 5760×1080, the gains can be as high as 45%, and at a minimum 10% to 15%, which is still a pretty significant increase in multi-monitor setups. Single-monitor setups will suffer a bit more, with a minimum of around 5% to 7%, in that ballpark.
In a case I have personally witnessed, two 7990s in CrossFire/quad-fire on a single 27" widescreen monitor running GW2, the gains were fairly negligible and didn't really do anything to prevent frame-rate slowdowns in zergs or world boss fights on ultra settings. In an Eyefinity setup they did provide a better frame rate in GW2, but the cost of running such a setup will be high, and I really do not recommend CrossFiring just for playing one game; for multiple other games, sure.
I cannot speak for SLI a whole lot, since I haven't used Nvidia cards for a very long time, but a friend used to play GW2 on an SLI setup, and it's the same thing really: the difference on a single monitor was practically negligible.
Thanks for trying to give feedback when you don't play GW2 on SLI/CF. Stimulating stuff, really. If you cared, as you claim you do, about the OP's question (he asked specifically about GW2 and BF4), you'd let people who play these games with CF/SLI provide real answers. Ya get it yet, loser idoit.2756?
If you don't get it, that's fine. I'll continue to point out, logically, that you don't have any first-hand experience.
Btw, GW2 does not use lots of VRAM for the frame buffer (that's my experience at 1080p), so the information you provided about "it does not double VRAM" is irrelevant. You're just parroting what you read on various forums to try and sound like some sort of aficionado.
I eagerly await your next irrelevant post!
You really have issues.
I have an issue with how SLI/CF is implemented. A two-GPU setup is practically bolted on, and per-game support has to be implemented by driver devs. I find this approach will lead nowhere in the future. I said I will re-evaluate my statement when DirectX 12 comes around and Microsoft ratifies a standard that gives a damn about dual-GPU setups.
Shuttering ? This game stutters no matter what. Especially in zergs.
I do believe that was the point being made about it being less than useful. The gains you get outside of zerg instances are just bragging rights, as nobody can see beyond 60-80 FPS anyway; getting 120 FPS is just pointless. (But I'm sure someone the video card manufacturers have brainwashed into believing otherwise will be sure to tell me all the reasons I'm ignorant in this area.)
Fate is just the weight of circumstances
That’s the way that lady luck dances
Hey, I said people who need GPU grunt, i.e. high resolutions, should SLI/CF.
Hey, do not insult him. I do not want anybody to stoop to his level.
I'd rather he insult me.
Not if you want above 60 FPS, for a number of reasons?
I can see a difference above 60 and 80Hz. (Look, I read you couldn't, just like loseridiot reads the forums and stuff, but then I changed refresh rates on my 120Hz monitor and realised you can. Brainwashing aside, of course.)
Do you have either a G-Sync monitor or a FreeSync monitor?
Monitors come in predefined refresh rates. If you don't output at the proper refresh rate, the frame gets held for an extra refresh interval.
For example, if your video card is outputting 59 FPS on a 60Hz monitor, you will end up with frames alternating between effective 60Hz and 30Hz pacing. So I find your "I can see 80Hz" claim quite strange, unless you are really responding to the fact that a higher-refresh-rate monitor has less persistence, in which case you could have used Nvidia LightBoost and had a better experience.
You know, I care a lot about stuttering and the best user experience.
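The refresh-snapping described above can be sketched as a small calculation (a simplified double-buffered vsync model; real drivers with triple buffering or adaptive sync behave differently):

```python
import math

def vsync_frame_ms(render_fps, refresh_hz=60):
    """With plain double-buffered vsync (no G-Sync/FreeSync), a
    finished frame still waits for the next refresh, so displayed
    frame times are whole multiples of the refresh interval."""
    refresh_ms = 1000.0 / refresh_hz
    render_ms = 1000.0 / render_fps
    intervals = math.ceil(render_ms / refresh_ms)  # refreshes the frame spans
    return intervals * refresh_ms

# Rendering at 59 fps on a 60 Hz panel: each frame just misses the
# refresh and is held two intervals -> ~33.3 ms, i.e. 30 fps pacing.
print(vsync_frame_ms(59, 60))   # ~33.33
print(vsync_frame_ms(61, 60))   # ~16.67 (back to 60 fps pacing)
```

That cliff between 16.7 ms and 33.3 ms frame times is exactly the 60-to-30 snap described above.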
Dude, just to say I've owned SLI systems since 2005-2006. In those days, games were pretty horribly optimized for SLI; those times are long gone, buddy. SLI has always drastically improved my FPS in most games, and the only issues I've had with SLI lately were when playing Far Cry 4, which had terrible support.
I dislike anecdotal evidence.
I'd rather post reports that people can verify themselves, from Nvidia etc.
Hey, I am trying to show that there is a much larger problem behind my blanket statement, with a reasonable alternative. Single cards are powerful enough. Intel/AMD are not advancing single-threaded performance enough.
I find it funny that SpiritSpeakerOfHoelbrak posted that this community is turning toxic, when he is no doubt part of the problem, since he cannot consider that somebody disagrees with him.
Hey, I believe in the merit of the argument rather than personal stature.
After reading through this thread thoroughly, it's crystal clear to me that it's indeed you, sir, who has the issue. I suspect you're a forum troll.
Care to explain?
The whole basis of my argument is that SLI is a time sink, and the OP does not have to bother, because single-card solutions are already fast enough. Hey, I would not write anything against SLI if there wasn't a regression in performance in the most demanding CPU-bottlenecked events.
I do not write my posts based on trust; I try to provide evidence. Compare that to my dissenter, who basically said "works for me", states "I do not have a perspective", and passes insults the whole time.
All purchases are investments at the end of the day. I'd rather provide a chunk of information the OP can verify, even if he disagrees with me.