Best graphics card?
I’m no V-Sync lover. I played games on mid-range video solutions for years, and I can’t be bothered by one lost frame when I have 60, 90 or 120. Coming from years of 15-20 FPS gaming (with dips as low as 3-5 FPS on my old Opteron with an Nvidia 9600GT), I find V-Sync an unneeded luxury, as I tend to ignore the occasional torn frame.
Been There, Done That & Will do it again…except maybe world completion.
Why do you turn V-Sync off entirely? I hate tearing.
V-Sync went obsolete the second Nvidia added Adaptive V-Sync to their drivers. Enabling it in-game is pointless. Fortunately GW2 is designed as a true exclusive fullscreen game (i.e. the graphics card can hook into it), unlike games that cheap out and use borderless windowed mode (like ESO on release; the graphics card can’t properly hook into it).
Well… this is in the open field while on the move, not in a WvW battle…
The second is in combat, due to some ghosts popping up when I wanted to take my picture. All in the Plains of Ashford…
FPS will drop significantly at world bosses or in WvW battles… so this is just to show that it is possible to get good FPS.
I’m curious what your setup is.
I have a decent rig that maxes everything, but for you to get 90 fps at 4K you must have some other good stuff in there.
Either that, or the whole “CPU” thing is debunked as far as regular play goes (the CPU really only comes into effect during larger engagements), and maybe your graphics cards are handling the scenery at 4K at that point.
I know that with my 3770K and a GTX 970 I barely get 45 fps with everything maxed in combat at 4K. I am, however, using SweetFX, so I’m sure that’s dropping me another 10-20 fps at 4K as well, but I’m wondering if I should SLI another 970 (rough scaling sketch below).
4K is still not a viable solution for a lot of games unless you currently run two GTX Titan Xs or two R9 295s (at least for games that are graphics-reliant).
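A rough back-of-envelope for the second-970 question, assuming a generic 70-80% SLI scaling rule of thumb (not a measured GW2 figure, just a sketch):

```python
# Rough sketch: what a second GTX 970 might buy at 4K, assuming 70-80% SLI
# scaling. The scaling factors are a rule-of-thumb guess, not GW2 measurements.
single_card_fps = 45
for scaling in (0.70, 0.80):
    estimated = single_card_fps * (1 + scaling)
    print(f"{scaling:.0%} scaling: ~{estimated:.0f} fps")
# Prints roughly 76 fps and 81 fps, before whatever SweetFX costs on top.
```

So on paper a second card could land you back in the 70-80 fps range, if GW2 actually scales that well.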
Base system:
A hex-core Intel i7 3930K @ 3.8 GHz on an
Asus Rampage IV Extreme cooled with a
Scythe Mugen 3 in a
Thermaltake Chaser A71 with a
Toughpower 1200 Watt modular PSU.
32 GB RAM (DDR3-1866 run at 1600; 1866 seems to work but causes hangs on 10-15% of boots, so I don’t find it stable enough. Officially the i7 3930K has a 1600 MHz max memory controller, though OC-ing should/could be possible, but I guess one of my eight 4 GB modules is not good/stable enough. All slots filled.)
2x Nvidia GTX 780 (non-OC, stock) in SLI, both on a PCI-E 16x v3.0 port running at 3.0 (full range enabled, forced to PCI-E 16x v3.0)
(running either:
1920x1200 with my 2nd screen at 1920x1080 in windowed fullscreen, or
3840x2400 in fullscreen)
Audio and network from the mainboard.
1x 256 GB OCZ Vertex 4 (SATA 3) on SATA 3 (OS and programs)
1x 250 GB WD VelociRaptor (SATA 3) on SATA 3 (user files and swap/internet cache)
2x 1 TB WD Reds (SATA 3) on SATA 3 (VelociRaptor backup/storage)
1x DVD-RW (SATA 2) on SATA 2 (because I’m old-fashioned)
Mouse: Sharkoon Drakonia Black
Keyboard: Corsair K95
Headset (main): Roccat Cave 5.1
Speakers (alt): generic Logitech 2.0 set, but since I have a wife and 2 kids I’m on the headset 99% of the time; the speakers are for my little girl(s) if she/they want to watch Postman Pat, Fireman Sam or other streamed stuff.
OS: Windows 7 Ultimate (generally up to date). It doesn’t NATIVELY support PCI-E 16x v3.0!! A registry change is needed to enable it.
Nvidia driver: 347.88 (no betas allowed in settings, generally up to date).
I seldom run services in the background and do no active scanning, but I do run an automatic virus and malware scan on shutdown or on demand. Active programs in the background: Windows (lightly tuned), Nvidia, mouse, keyboard, headset and printer drivers, and a small program called DisplayFusion (it gives me a second, independently configurable taskbar on my 2nd screen).
One remark though: fighting spawns on a map doesn’t cause too much FPS drop; generally it is only encountering other players that seems to cause a big FPS drop…
My PC is in the attic, where we have all the PCs (my wife’s old Q6600 and my old dual Opteron 2376). It can get very hot there in the summer, with ambient easily going over 27 Celsius, so I do not OC too much, as good OC-ing requires a stable ambient temperature. If all the PCs are running they add a lot to the ambient temperature, up to outside temperatures… Mid-summer I only game in the mornings and the evenings (something about not completely wanting to no-life my life; the PCs will be off during the day).
Hope this is all the info you need. During the summer my PC will be set to throttle, so clocks will be lower.
Ah, I actually took a look there: a 384-bit bus vs the 256-bit of the 970, plus the SLI. That’s probably where you’re getting the good performance, at least at 4K. At 1080p we probably wouldn’t see too much of a difference (rough bandwidth numbers below).
I think I’m still going to SLI them. I want to play Starbound, Terraria and Terraria: Otherworld at 4K on my 40”.
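For the bus-width point, a quick sketch of the raw memory-bandwidth gap. The 6.0 and 7.0 Gbps effective memory clocks are the reference-card specs I’m assuming here, and this ignores the 970’s segmented memory entirely:

```python
# Raw memory bandwidth = bus width (bits) / 8 * effective memory rate (Gbps).
# Clock figures are assumed reference-card specs.
cards = {
    "GTX 780": (384, 6.0),  # 384-bit bus, ~6.0 Gbps effective GDDR5
    "GTX 970": (256, 7.0),  # 256-bit bus, ~7.0 Gbps effective GDDR5
}
for name, (bus_bits, rate_gbps) in cards.items():
    bandwidth_gb_s = bus_bits / 8 * rate_gbps
    print(f"{name}: ~{bandwidth_gb_s:.0f} GB/s")
# GTX 780: ~288 GB/s, GTX 970: ~224 GB/s (and the 780s here are in SLI on top).
```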
Best video cards? Riva TNT2, Intel 740, Voodoo 3.
…don’t take this one seriously.
I still remember getting my Voodoo 2 to “max out” Dungeon Keeper 2… brings a tear to my eye.
Voodoo ruled the heck out of all that NVidia stuff in Diablo 2 ^^
Best MMOs are the ones that never make it. Therefore Stargate Online wins.
@Pax,
why do you have your PCs in the attic? Genuinely curious.
Why do you turn V-Sync off entirely? I hate tearing.
V-Sync went obsolete the second Nvidia added Adaptive V-Sync to their drivers. Enabling it in-game is pointless. Fortunately GW2 is designed as a true exclusive fullscreen game (i.e. the graphics card can hook into it), unlike games that cheap out and use borderless windowed mode (like ESO on release; the graphics card can’t properly hook into it).
Which is pointless if you don’t own one of the heavily marked-up monitors that support nVidia’s adaptive sync hardware. ATI just introduced their own tech and, surprise, they aren’t compatible. But unlike nVidia, their solution is free to use, in the hope that monitor manufacturers will flock to it rather than pay nVidia for the privilege and pass the expense on to the gamer.
RIP City of Heroes
I find that tearing (even minor) has a far larger impact on my gaming experience than the difference between 60 and 90 FPS (which I highly doubt I could even tell apart in a double-blind test).
I find that tearing (even minor) has a far larger impact on my gaming experience than the difference between 60 and 90 FPS (which I highly doubt I could even tell apart in a double-blind test).
You won’t notice 60 vs 90; 120 is noticeable by most, though.
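The frame-time math is part of why the jump feels smaller the higher you go; a tiny sketch:

```python
# Frame time shrinks non-linearly with FPS, so 60->90 is a much smaller
# per-frame change than 30->60.
for fps in (30, 60, 90, 120):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
# 30 fps = 33.3 ms, 60 fps = 16.7 ms, 90 fps = 11.1 ms, 120 fps = 8.3 ms
```

Going from 60 to 90 only shaves about 5.6 ms off each frame, versus 16.7 ms going from 30 to 60.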
Went from an Asus GTX 580 on an i7-2600 @ 3.4 GHz with an average of 55 to 60 FPS
to an Asus Strix 960, still on my i7-2600, with a new average of 83 to 117 FPS.
As for my CPU, I’m waiting until after launch to see how things go.
Why do you turn V-Sync off entirely? I hate tearing.
V-Sync went obsolete the second Nvidia added Adaptive V-Sync to their drivers. Enabling it in-game is pointless. Fortunately GW2 is designed as a true exclusive fullscreen game (i.e. the graphics card can hook into it), unlike games that cheap out and use borderless windowed mode (like ESO on release; the graphics card can’t properly hook into it).
Which is pointless if you don’t own one of the heavily marked-up monitors that support nVidia’s adaptive sync hardware. ATI just introduced their own tech and, surprise, they aren’t compatible. But unlike nVidia, their solution is free to use, in the hope that monitor manufacturers will flock to it rather than pay nVidia for the privilege and pass the expense on to the gamer.
Adaptive V-Sync is not G-Sync; they are two entirely different things. If you have an Nvidia card, you can force Adaptive V-Sync regardless of what monitor you use. It’s basically just “smart V-Sync”: instead of chopping a 60 Hz monitor that is really running 58 fps down to 30 fps, it disables V-Sync and runs it at 58 fps. When it reaches 60 fps, it enables V-Sync again (roughly the logic sketched below).
Don’t know if AMD uses something similar now, as my last card from them was an ATI. FreeSync is hardly impressive, though; really, really bad ghosting in the tests I’ve seen.
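A minimal sketch of that Adaptive V-Sync decision as described above; this is just an illustration of the idea, not actual driver code, and the names are made up:

```python
# Minimal sketch of the Adaptive V-Sync idea: sync at or above the refresh
# rate, tear instead of halving the frame rate below it. Illustrative only.
REFRESH_HZ = 60  # assumed monitor refresh rate

def choose_sync_mode(current_fps: float) -> str:
    if current_fps >= REFRESH_HZ:
        return "vsync_on"   # cap to refresh, no tearing
    return "vsync_off"      # run at e.g. 58 fps instead of dropping to 30

for fps in (75, 60, 58, 42):
    print(fps, "->", choose_sync_mode(fps))
```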
I play just fine on an FX-6300 and an R9 270X 4 GB. Sure, I don’t always have the almighty 60 fps everyone clamors about all the time, but I’m still able to enjoy high/ultra.
Plus my sound card allows audio processing to be offloaded to it, so high-quality sound.
@Pax,
why do you have your PCs in the attic? Genuinely curious.
Due to the quiet it provides me while gaming, and the DOG…
The dog doesn’t like the attic. He’s sweet but way madder than me, and I’m madder than the Mad Hatter…
I put a nice picture with this. I also have other pictures of him sleeping “starfish”, but I think they’d be removed due to indecent exposure… He’s 6 now, but this was taken roughly when GW2 launched, I guess, so he was about 3.5 years old at the time. The only difference is I now get more white hairs on the floor and fewer brown ones, but the total amount is the same… He has hair for everything: if you have black clothes, white hair; if you wear white, brown hair; and if you are smart and wear brown and white, he just comes over and drools all over your pants. It really doesn’t matter… When you buy one you know you need to accept that he looks good and you are the new hair collector…
Trust me: you do not want a shedding German Shorthaired Pointer anywhere near any of your hardware. We found hair everywhere. Dust-proof? Hair inside!
He wrecked two DVD players just by living with us, and I clean daily (at least twice daily during spring and fall shedding) to make sure the hairs are gone, yet I never ever seem to succeed.
That is why my PC is in the attic.
Been There, Done That & Will do it again…except maybe world completion.
@Pax,
why do you have your PCs in the attic? Genuinely curious.
Due to the quiet it provides me while gaming, and the DOG…
The dog doesn’t like the attic. He’s sweet but way madder than me, and I’m madder than the Mad Hatter…
I put a nice picture with this. I also have other pictures of him sleeping “starfish”, but I think they’d be removed due to indecent exposure… He’s 6 now, but this was taken roughly when GW2 launched, I guess, so he was about 3.5 years old at the time. The only difference is I now get more white hairs on the floor and fewer brown ones, but the total amount is the same… He has hair for everything: if you have black clothes, white hair; if you wear white, brown hair; and if you are smart and wear brown and white, he just comes over and drools all over your pants. It really doesn’t matter… When you buy one you know you need to accept that he looks good and you are the new hair collector…
Trust me: you do not want a shedding German Shorthaired Pointer anywhere near any of your hardware. We found hair everywhere. Dust-proof? Hair inside!
He wrecked two DVD players just by living with us, and I clean daily (at least twice daily during spring and fall shedding) to make sure the hairs are gone, yet I never ever seem to succeed.
That is why my PC is in the attic.
I’m sure you know about this, but my PC is literally 99% dust free because of…
Dust filters. Buy yourself a decent case with good airflow; a lot of cases come with built-in filters, or there are 2-3 UK sites that sell dust filters for popular cases like NZXT.
For the record, I use an NZXT H440, and it comes with 2 filters by itself. Using proper airflow management, I haven’t had any dust issues beyond having to quickly blow out the back exhaust for a couple of seconds.
I have a husky/Shiba Inu mix and she sheds quite a bit too, but I haven’t found any hair… yet. You probably need a case that’s designed to be dust-proof, etc.
I noticed you’re using a monster-size air cooler too. Try swapping to something like the Corsair H100i closed-loop cooler. I rarely have to run the fans up, and my radiator’s been spotless for a good 6 months now. I used to have a Noctua NH-D14, but those push a ton of air, and I figured that was part of my problem as well.
Another issue I had was that mine was close to the floor, so I lifted it onto its own magical stand (aka my mini-fridge) and life’s been great.
While this thread is around:
I just upgraded to an i5-3470, and I currently have a Radeon HD 6790. Would there be a significant benefit in upgrading to an R9 270?