Opinion on Radeon or Nvidia?
From my experience building and upgrading systems, and from the results on my own and friends' machines, I personally prefer Radeon cards.
I have dual 7870 2GB Dual-X cards: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100006519%2050001561%2040000048%20600286767%20600298540&IsNodeId=1&name=Radeon%20HD%207870%20GHz%20Edition
Depending on the price, though, I currently recommend the newer 7870, like this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202024
It has 1536 shaders. I only paid $209 for my cards; at full price, get the 1536-shader model. Also, depending on price, the best 7870/7950/7970 cards are the Gigabyte ones (they have the best coolers!).
Also, you will want to grab the latest beta drivers, which have specific tweaks/fixes for GW2 (the 13.2 betas).
Honestly, no game I own gives me any problems at 2560×1440 (downsampling to my 1920×1080 monitor); the cards run every game I have wonderfully.
Nvidia cards are OK, but on price-for-performance they are not the best option at this time.
http://tpucdn.com/reviews/AMD/Catalyst_12.11_Performance/images/perfrel_1920.gif
The 660 Ti is 8% faster than the (original) 7870, but it's $300 vs. $230–250…
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
I would get a single 7870 Tahiti LE. Only a few makes are out at the moment, so any of them will do; they're a great choice for the price.
If you go for a 660 Ti or a 7950, I suggest Asus or EVGA, as they have the better coolers for their clock speeds; even at full load and top fan speed they're super quiet, unlike Gigabyte, which can get noisy. Gigabyte cards are great (I have one myself): they keep cool and have great clocks. But if you like your computer quiet, it's a good idea to steer clear of Gigabyte graphics cards unless you find one that has been reviewed and actually does have a quiet cooler; most I've seen tend to be noisier than Asus and EVGA cards.
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asus Xonar D2X & Logitech Z5500 Sound system |
All the reviews I have seen, and every Gigabyte Windforce card I have seen, have been quiet… could just be their Nvidia cards that have noise issues…
http://www.vortez.net/articles_pages/gigabyte_hd7950_windforce_review,19.html
“Conclusion
The graphics card we have reviewed today should serve as an example to all manufacturers of how a GPU should be built. It runs very cool, goes about it’s business very quietly and as we have shown, performs at the leading edge in today’s high end sector of graphics cards. You could be forgiven for thinking this is the best graphics card we have tested to date and you wouldn’t be far off the mark were it not for some niggling drawbacks.”
http://www.pureoverclock.com/Review-detail/gigabyte-7950-windforce/16/
“Also impressive is the fact that the Windforce cooler is very quiet, even when the card is at full load. Gigabyte has really nailed the design here, producing a card that runs exceptionally cool and quiet.”
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/53456-gigabyte-hd-7950-windforce-msi-hd-7950-twin-frozr-iii-review-18.html
“Regardless of the lower temperatures both heatsinks are capable of achieving, Gigabyte once again holds a small lead over MSI. But let’s be honest here: there’s no way you’ll hear either of these cards over the sounds of your favorite game.”
http://www.techspot.com/review/496-amd-radeon-7950/page9.html
“Gigabyte claims that running the three fans at their full speed of 4200RPM, the Radeon HD 7950 will never exceed 46 degrees. This was hard to believe before testing the card, but when remaining almost silent, it only reached 59 degrees — 24% cooler than a standard HD 7950 in our tests.”
And Asus can be loud as well, depending on the model. I have had various models from them that were silent, and some that became very loud under any kind of load… (got rid of the latter; the fan auto-ramped to 80% under any load, and it was a loud fan…)
What size (in GB) do you all suggest the card should be?
The Windforce cards are awesome. I have a GTX 680 OC Windforce and I love it. Quiet, very very cool, and powerful!
Whether you go ATI or Nvidia, if you get a Windforce you will be OK. I have used both manufacturers for years and the hardware for both is pretty close. I do like Nvidia's driver suite better than AMD's, but that is my personal preference.
As for your question about VRAM size: it really depends on what resolution you're planning to run the game at. 2GB is always good, and unless you're going for Eyefinity I wouldn't get the 4GB cards.
intel 335 180gb/intel 320 160gb WD 3TB Gigabyte GTX G1 970 XFX XXX750W HAF 932
2GB for 1080p & mild AA.
3GB for 1080p & HD textures (usually from mods) and high AA.
3GB for 1440p/1600p & mild AA.
4GB+ for 1440p/1600p & high AA.
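The rough reason resolution and AA drive VRAM use can be sketched with back-of-the-envelope math. This is only a lower bound (textures and geometry dominate real usage), and the single-render-target layout here is a simplifying assumption, not how any particular game allocates memory:

```python
# Back-of-the-envelope framebuffer cost: width * height * bytes_per_pixel,
# multiplied by the MSAA sample count for anti-aliased render targets.
# Real games have many targets plus textures, so treat this as a floor.

def framebuffer_mb(width, height, msaa_samples=1, bytes_per_pixel=4):
    """Approximate size in MiB of one 32-bit color target at a given MSAA level."""
    return width * height * bytes_per_pixel * msaa_samples / 1024**2

# 1080p vs 1440p, no AA vs 4x MSAA:
for w, h in [(1920, 1080), (2560, 1440)]:
    for samples in (1, 4):
        print(f"{w}x{h} at {samples}x: {framebuffer_mb(w, h, samples):.2f} MiB")
```

The point of the arithmetic matches the list above: going from 1080p to 1440p, or adding MSAA, multiplies buffer cost, which is why the recommended VRAM steps up with resolution and AA level.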
Also, a side note: if there is any chance you'll ever want to use GW2's multi-monitor support, Nvidia hands down. ATI (Eyefinity) suffers GW2 UI issues that Nvidia users do not, and until ANet chooses to address these issues, ATI is a big inconvenience compared to Nvidia.
What you need for VRAM depends on the game. I went dual 7870 after testing at 2560×1440 and I'm perfectly happy; I can max out every game I have and add real AA or SMAA without issues (well, all games that, unlike GW2, support real AA).
I use two monitors, one for gaming and one for videos and such. At 2560×1440 I'm more than happy; even with one 7870 2GB I haven't had any problems. That said, if you're planning for the card to last 3–4 years, get a 4GB single card and add a second one later. Really, though, it tends to be better to go bang-for-buck and upgrade every 1–3 years. I used to buy top-end cards to hold off for three years, but with the way tech advances now, and the prices, it's more effective to buy cards more often and resell your old ones to friends or online.
SolarNova is correct to a point, though not for all games. I think his HD-textures point is mostly about games like Skyrim, which are, if possible, even less optimized for modern systems than GW2 (at least GW2 uses three threads…).
Funny that Stormcrow prefers the Nvidia control panel; I prefer RadeonPro to either NVCP or CCC. I do like some of the features Nvidia's drivers have built in, but RP adds most of them to Radeon cards.
If only Nvidia hadn't lost nHancer, I would have a harder time choosing between GeForce and Radeon. (nHancer was a third-party app, like RadeonPro, that let you tweak settings and use features the Nvidia drivers/control panel don't expose. It made Nvidia cards amazing back in the day, being able to mix AA modes and such.)
I still say a 7870 2GB is the best bang for the buck at the moment, be it the old model or the newer one (the newer version is of course better); it depends on the price you get them at.
Remember, VRAM doesn't matter as much as the core: if you buy a low-end card with 4+GB of RAM, it's still a low-end card.
I can't tell you how many noob gamers I've had to teach this lesson over the years. One old and very good friend of mine bought a 1GB card years ago, thinking that made it better than a 512MB card. Thing was, it was DDR2 rather than DDR3, and the chip was an 8400GS… yeah, he ended up returning it.
Also, a side note: if there is any chance you'll ever want to use GW2's multi-monitor support, Nvidia hands down. ATI (Eyefinity) suffers GW2 UI issues that Nvidia users do not, and until ANet chooses to address these issues, ATI is a big inconvenience compared to Nvidia.
THIS ^^!
Unfortunately for us Eyefinity users, ANet has been somewhat lacking in interest/replies regarding Eyefinity.
It seems that some in the Eyefinity camp believe this is a deliberate and unfair 'ploy' by ANet (based on reading threads in these ANet forums and others online).
A forum search for 'eyefinity' will verify those statements. A web search will show many personal opinions on both ATI and Nvidia graphics cards, plus many trustworthy and/or professional reviews with statistics and such.
I have used both Nvidia and ATI products for many years and both have had good qualities.
My preference at this time is for ATI Eyefinity, and I use it in a number of games on my main rig. I do have a rather 'aging' secondary rig with 2x SLI Nvidia that runs games well, but I have not tried it with any kind of widescreen (multi-monitor) gaming.
Although I do not wish to advise using workarounds or third-party fixes, the Eyefinity problems can be 'fixed'. If anyone is interested in such, the web is your bestest mate!
So: ATI Eyefinity users can play GW2 very well in a three-monitor setup (at least), but there are graphics-placement issues. There is at least one good 'fix' for the ATI issues but, sadly, using it could breach the rules/conditions set by ANet. Nvidia, on the other hand, is (as far as I am aware) fully 'fixed' and compatible with GW2.
I really doubt it's anything intentional on ANet's part. More likely it's the same reason the game runs like kitten poo and isn't getting any better: ANet doesn't care, because it isn't costing them enough money to be worth looking into, let alone fixing.
As I wrote: 'It seems that some in the Eyefinity camp believe this is a deliberate and unfair "ploy" by ANet (based on reading threads in these ANet forums and others online).'
In another thread someone suggested that ANet is discriminating against Eyefinity users, and I replied along the lines that 'discrimination' is maybe too strong a way to put it!
It is possible that some Eyefinity users feel badly done by, in that ANet has not given an official update regarding Eyefinity in quite some time, especially when a game update was said to be on the way. It would be nice if ANet gave an official response about the Eyefinity issues instead of appearing, as you suggest, simply not to care.
Anyway, I still hold that my preference is for Eyefinity, but the Nvidia solution is good too. Though I have seen the Nvidia option working (and it looks great), and I have used my SLI rig for multiple monitors, I have not used it for widescreen (multi-monitor) gaming, so I cannot give any better information regarding Nvidia.
I was asked for an 'Opinion on Radeon or Nvidia' and gave my opinion with some 'backup' information, is all.
I'm honestly kind of shocked ANet doesn't block multi-monitor. They made macros against the rules because they aren't fair to people who can't afford, or are too cheap to buy, an MMO mouse or keyboard with programmable buttons.
Other games have blocked Eyefinity/Surround because they wanted things to be "fair for everybody", despite there being no real competitive advantage in most games. And in GW2 and many others, if you play windowed with the same aspect ratio as Eyefinity/Surround, you get the same result FOV/aspect-ratio-wise.
Still, I'm really shocked they haven't blocked it or made it against the rules.
- Radeon generally performs better per dollar. If you're looking for performance for the money, go Radeon all the way. Also, the Radeon HD 7970 is by far the fastest video card on the market at the moment; it is a freaking beast.
- Nvidia has much better support; they release drivers way faster and fix problems, in general much better support. Also PhysX, which looks very nice in games that support it, very nice eye candy in the Batman games or H.A.W.K.E.N., for example.
Rampage: it really isn't true that Nvidia fixes drivers faster. They fix some things faster, but many people on other forums complain that Nvidia seems to ignore GW2, where AMD seems to give it a lot of attention.
AMD has really stepped up driver support since they dropped the monthly release cycle. The best driver for GW2 is AMD's 13.2 beta 4 or later; it has specific optimizations and fixes for performance issues in this game (shocking how much smoother it runs on single- and multi-card systems).
PhysX is great IF you play games that use it:
http://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support
I used an 8800GTS 512MB as a PhysX card for a while. Honestly, so few games have hardware PhysX support that it's kind of a non-feature unless you're REALLY into a specific title that benefits from it.
I have a feeling that within the next couple of years Nvidia will have to port PhysX from CUDA to APP/OpenCL/DirectCompute, since AMD is gaining market share thanks to their 7000-series cards being so good for the price.
I would agree that before AMD started putting out beta drivers, Nvidia was faster to get drivers out; but back then Nvidia was only faster with betas, and full (WHQL) driver updates were slower.
And yes, the 7970 is a beast. But its performance is so close to the 7870 XT, and it costs so much more (last I looked), that I found better value in two 7870 GHz Editions (pre-XT cards). Performance for price, I'm faster than a 7970 GHz at close to the same cost.
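The bang-for-buck comparisons throughout this thread boil down to a simple performance-per-dollar ratio. A minimal sketch of that calculation (the scores and prices below are made-up placeholders, not real benchmark data; plug in numbers from a trusted review):

```python
# Toy price/performance comparison: relative benchmark score divided by price.
# All figures are hypothetical placeholders for illustration only.

cards = {
    "Card A": {"perf": 100, "price": 240},  # baseline card
    "Card B": {"perf": 108, "price": 300},  # ~8% faster, but pricier
}

def perf_per_dollar(card):
    """Higher is better: relative performance units per dollar spent."""
    return card["perf"] / card["price"]

best = max(cards, key=lambda name: perf_per_dollar(cards[name]))
for name, c in cards.items():
    print(f"{name}: {perf_per_dollar(c):.3f} perf/$")
print("Best value:", best)
```

With these placeholder numbers, the slightly slower but much cheaper card wins on value, which is the same logic behind preferring a 7870 over a 660 Ti at those prices.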
Nvidia runs GW2 perfectly already; AMD had frame-latency issues, and Nvidia does not have that kind of problem on any title at the moment. In general I like Nvidia a lot, even though I have spent four years in a row buying AMD cards; I only have a 550 Ti as a dedicated PhysX card. Nvidia is just much, much better on drivers. AMD cards are fast, and I love how they perform, but the drivers are just very low quality compared to Nvidia's. Honestly, AMD's beta drivers are terrible, to say the least; try recording some GW2 gameplay on the 13.2 beta drivers and see what happens, LOL.
Nvidia drivers are on a different level of performance and stability compared to AMD. That is logical: their cards are slower and they sell much more than AMD, so they have more money to dedicate to general customer support.
Radeon cards can give you headaches sometimes due to drivers, but once the drivers are stable they perform so well you learn to love them.
This gave me a good LOL. I have a 6970 on driver 13.1 and have never had a problem with any game.
EDIT: I just updated to 13.1 from 11.12, which also gave me no problems in games.
Actually, Rampage, Nvidia does have issues on some titles; they also have SLI latency issues. It's just less noticeable than it was before the latest drivers from AMD.
I don't agree on driver quality, but it sounds like I have had more Nvidia cards than you. Nvidia has its issues; they are just different issues from AMD's. For example, since Nvidia's 7000 series they have had a problem that keeps coming back with the YV12 color space, making videos look like an old TV with rabbit ears when you turned the vacuum on (ghosting, banding… horrible). A lot of players and codecs now detect Nvidia and auto-disable YV12, because Nvidia has yet to keep this problem consistently fixed.
They have also had incidents like the driver that disabled the fans on cards, ruining many people's high-end cards. They did not take responsibility for that; they did not replace cards that were out of warranty; they effectively told people it was their own fault for installing a WHQL driver that smoked their card:
http://www.zdnet.com/blog/hardware/warning-nvidia-196-75-drivers-can-kill-your-graphics-card/7551
I personally lost a 295 to that driver. Thankfully, Best Buy took the card back (it was within the return period); they didn't have any more in stock, but they had a lot of returns on Nvidia cards due to that driver killing cards.
AMD/ATI have NEVER done that. They have had other issues, but they have never smoked people's cards with a bad driver.
If you check Nvidia-centric forums or the tech-support sections of sites, you will find that problems are common on both sides; Nvidia, for whatever reason, has HBAO auto-enabled breaking some games (in some cases you CTD when you try to load the game). You can disable it, but look at the level of most users on this forum: they don't know what NVCP or CCC are, let alone how to use them.
AMD/ATI and Nvidia have always gone back and forth on driver quality. What drove me to ATI was my experience with my 5800 Ultra: in DX9 games (betas I was in) it was slower than a 9600 256MB that cost less than a quarter of what the 5800 Ultra cost. The Nvidia drivers were HORRIBLE; quality sucked, performance sucked, even with the hacks they did that degraded quality to boost performance.
My 6800 Ultra had driver issues with some titles (most noticeable in HL2 and other Source games), but then so did my X800 XT PE. The biggest problem the X800 faced, though, was vendor-specific optimizations, like Doom 3:
http://megagames.com/news/doom-3-enhance-experience-pt2
You still see crap like this, such as AA being blocked on AMD cards in Batman (even though it has been shown to work fine if you trick the game into letting you enable it).
Note: I loved my 8800GTs until they each cooked themselves (four of them, on the stock cooler). The 8800GTS 512MB that Jeff at BFG sent me (refurbed by him) was GREAT, but they still had driver issues (YV12 being the most annoying for me; I watch a lot of videos that use that color space by default).
Others were AA and AF breaking some games, and it taking MONTHS to get them fixed. Again, this is all old news to people who have played a lot of games over the years.
I will admit Nvidia does tend to have fewer issues with some titles, or at least less noticeable ones. But here's the thing: nobody makes perfect drivers. This is partly due to the nature of PC hardware, and partly due to the nature of DirectX and OpenGL: each vendor has to create its own driver, with its own vendor-specific calls and paths.
pt1
A big part of the problem can be blamed on IP law (patents/copyrights/etc.): companies don't want to share their code or work with other companies to make things better for everybody, because they fear somebody getting an advantage over them and selling more.
Bah. In short, though: nothing you buy from Nvidia or AMD will be free of driver flaws; there will always be some games that run better on one or the other.
At times this is especially true of "The Way It's Meant to Be Played" titles. Nvidia is known to push the companies it helps to use Intel's compilers and code that should run best on Nvidia hardware. Funnily enough, this sometimes backfires, and AMD cards/drivers are better on some TWIMTBP titles. AMD-enhanced titles have rarely, if ever, been shown to use tricks like Doom 3 or Tiger Woods Golf had to gimp non-AMD/ATI cards.
It's funny: I went from being a rabid Nvidia fanboi in the Riva 128/TNT1/TNT2/GeForce 1/2/3/4 and even early GeForce 5 (FX) era to being a fan of the competition, BUT that has not stopped me from owning cards of each generation.
I have owned
nvidia
NV1 http://en.wikipedia.org/wiki/NV1
riva128
tnt1
tnt2
geforce1
geforce2gts
geforce3ti
geforce4 4600
geforce fx5800ultra
6800ultra
7800GTX (later upgraded to the below after 12 RMAs for the card burning itself out)
7950gtx
8800gt
8800GTS 512MB (replacement after the fourth 8800GT cooked itself)
gts250
gtx295
460ti (sold it after a few weeks because it was slower than what I already had from AMD)
AMD/ATI
awe32
awe64
rage
ragepro
Rage Pro Turbo (all the Rage/Rage Pro parts are really the same chip at different clocks, from what I remember)
Rage 128 (kitten good card for its day, once you modified the drivers and took parts from three different drivers to make one ultimate driver… but back then we honestly did the same for Nvidia)
Radeon9600 256mb
Radeon 9800se unlocked to full XT
Radeon x800pro vivo unlocked to xtpe
Radeon x1900xt flashed to xtx
Radeon 5770
Radeon 6870(then added a 2nd one)
Radeon 7870 2gb*2
I also owned cards from
Rendition(verite series cards rocked back in the day)
3dfx
diamond
s3
oak technologies
Trident
i could list more….many more……
I can honestly say I have never found Nvidia's drivers to be better overall than ATI/AMD's since the Radeon 9600. Before that it went back and forth A LOT, and you had to modify driver packages a lot; but you also did that with TNT-series cards if you wanted the best performance and fewest bugs across all your games (OpenGL, D3D, DDraw, etc.: you mixed and matched files from various driver versions to get the best you could. I used to make and redistribute driver packs on a few sites for the Rage 128 and TNT1/2 cards; it was fun).
Now, Intel has always had HORRIBLE video drivers. Even their current drivers are pretty bad in my experience; a lot of older games just won't run on them, when they will on any AMD/ATI/Nvidia video card out there.
S3 had bad driver support, but they were a small company that lacked resources, so it's somewhat understandable, despite the Savage 2000 being an amazing piece of kit in its day.
Sorry for the long post, but I really think it needed to be said. Again: no matter who you buy from or what you buy, you're not going to get perfect. Even 3dfx, the company whose fans to this day wear rose-colored glasses, never had a perfect driver (don't get me started on that; I had just about every product they made).
I have had Nvidia and AMD and prefer Nvidia much, much more. I LOVE the custom resolutions, FXAA/TXAA/CSAA, adaptive vsync, PhysX, ambient occlusion, Nvidia Inspector (RadeonPro is no longer supported), 3D Vision, etc., none of which AMD NATIVELY offers. I have absolutely nothing against AMD, but to me having Nvidia is like all the gadgets and stuff in a luxury car. Not needed, not a must-have, but it makes the ride so much sweeter.
Raijinn.9065, you're full of kitten poo.
japamd, the creator of RadeonPro, has been adding features for 2–4 months now; it's fully supported. Perhaps you should stop talking out your rectal orifice:
http://forums.guru3d.com/showthread.php?t=373303&page=16
http://forums.guru3d.com/showpost.php?p=4528435&postcount=391
If the creator of an app posting yesterday counts as "no longer supported", then GW2 is in serious trouble; its devs don't post often at all.
With current RadeonPro we have:
HBAO
VAO
FXAA
SMAA
Triple buffering
An advanced-code-path triple buffering for DX9 games that don't like the normal one
Dynamic Frame Control
Frame-rate lock up to the monitor's refresh rate
CPU affinity (you can pick which cores your game runs on and force high process priority)
Spoof Video Adapter ID (for games that look for Nvidia cards and lock out features if you don't have one)
An advanced OSD that's FAR better than the overlay Afterburner and the like use
SweetFX integration
Per-game Overdrive control
And of course the normal AMD AA options: edge-detect, box, adaptive, multisample, and supersample.
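For a sense of what one of those features does under the hood: a frame-rate lock is essentially a sleep inserted each frame so frame time never drops below 1/target seconds. A minimal sketch (this is not RadeonPro's actual implementation, which hooks the game's present call at the driver level; the loop-wrapping approach here is just an illustration):

```python
import time

# Minimal frame-rate limiter: sleep away whatever is left of each frame's
# time budget. Real tools hook the GPU present call rather than wrapping
# the render loop like this, but the budgeting idea is the same.

def run_capped(render_frame, fps_cap=60, frames=10):
    budget = 1.0 / fps_cap                     # seconds allotted per frame
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()                         # the actual work for this frame
        elapsed = time.perf_counter() - frame_start
        if elapsed < budget:
            time.sleep(budget - elapsed)       # burn the leftover budget
    return time.perf_counter() - start         # total wall time

# A trivially cheap "frame" should still take about frames/fps_cap seconds:
total = run_capped(lambda: None, fps_cap=100, frames=20)
print(f"20 frames at a 100fps cap took {total:.2f}s")
```

Capping like this is what keeps the GPU from rendering frames the monitor can never show, which is why a frame limiter cuts heat, noise, and (on multi-GPU setups) some of the latency variance mentioned earlier in the thread.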
My advice: next time you go to spout off, Google first. RP was stalled for a while because the creator was busy in real life, but unlike nHancer's author, japamd came back and went back to work with a vengeance.
I'm getting tired of seeing people who don't know WTF they're on about spouting BS as if it were fact. Come on, people, step up your game: if you want to bad-mouth and talk down AMD or Nvidia or whoever, don't talk out your kitten.
Also, I'm happier having somebody other than AMD work on these features, because it means they get updates and fixes more often. It's just sad that nHancer died, because it could do a lot of things NVCP can't, like mix AA modes.
When will Nvidia add SMAA, rather than blurry, crappy FXAA, to their control panel? Oh wait, they won't, because they have TXAA, another AA method that works by blurring things. (The creator, the same guy behind FXAA, admits that's how FXAA and TXAA work, and says that if you don't want that, you should use SMAA.)
Waits for Raijinn to admit he was talking out his rectal orifice.
More info: TXAA is only supported on the 650 and up, unlike all the features I mentioned that RadeonPro supports.
http://www.geforce.com/landingpage/txaa/technology
Yeah, so TXAA = FXAA = blurry mess… yuck… no thanks.
I have not used my AMD card in a year, and it was not being updated then. Anyway, with that said:
HOLY FANBOI RAGE
Wait a minute… aren't you the guy who thinks people using Surround/Eyefinity are cheating??? LOL, I quit.
Nope. What I said was that ANet views using MMO mice and gaming keyboards with programmable buttons as cheating, so I don't get why they don't also view people using more than one monitor for the game as cheating; it's an "advantage" not everybody has.
And I love being called a fanboi for calling you out on your BS. If you don't like it, stop projecting fecal matter from your mouth/fingers and you won't be called out as the fanboi you clearly are.
Personally, I think if you can afford hardware and want to use it, there shouldn't be any issue doing so. That goes for Surround/Eyefinity, MMO mice that let you program macros, keyboards that let you program macros, hell, even free software that lets you set up macros.
As long as it doesn't automate the game for AFK play, I have no issue with it. BUT ANet does: they have strict rules against things they say would give people an unfair advantage, like macros. Quite a few other developers consider Eyefinity/Surround to give an unfair advantage, yet those same developers don't ban the use of programmable keyboards/mice/gamepads/joysticks/etc.
I call BS on that.
Now, are you going to admit you spouted falsehoods (lies/FUD), or are you going to continue to look like a prat? I mean, I posted something you could have found via a ten-second Google search, proving you were full of kitten. The least you can do is admit it and tell people you're sorry for spreading FUD.
I back up the stuff I post, such as the FXAA/TXAA info.
Are you seriously going on because I didn't know kittening RadeonPro got supported again a few months ago? You're begging for an apology for something like that? What a sad little boy. If you think that warrants double-crit walls of text and makes you feel good about yourself to talk so greasy, then go ahead. I really couldn't care less about the tone you've been carrying. Anyone reading this realizes how childish you are for letting yourself get riled the kitten up about something like RadeonPro coming back. Grow the hell up; then you have the nerve to ask for an apology. Your first response was the most pathetic attempt at informing someone I have ever seen. “RadeonPro regained support a few months ago and has all those things you mentioned, plus some” would have been a normal response. But that is why I called you a fanboi (after looking at your sig): because you snapped and went off the edge, throwing out insults and walls of text, just begging me to stroke your kitten to make you feel better. That really means you lack something vital in your life, and you try to make up for it and feel better about yourself here on the forums by acting like you know everything. That is why I called you a fanboi, because Y U SO MAD?
You're clearly a fanboy, mad because I didn't know RadeonPro regained support a few months ago. I'm not even going to go there with you about the Surround/Eyefinity thing. Everyone knows how idiotic such claims are.
Oh yeah, PM me if you want to see pictures of my passive 3D Surround setup. You know, just so you can hate me some more.
(edited by Raijinn.9065)
ROFLMFAO
umadbro?
Again, ROFL… who's getting angry?
I just find people who spout blatantly false info as fact to discredit products or people quite childish, and guess what, you did just that, rather than take 10 seconds to check your facts and see that, in fact, RadeonPro has a bunch of new features, and even when its development was on hold it worked wonderfully, unlike nHancer, which was killed when Nvidia changed their drivers so it couldn't function.
But hey, if you want to get angry that I called you out and proved you're full of it, have at it. I'm not mad. I don't want an apology for myself; I want you to apologize to the people who read your post and took it as fact (because that's how it was stated).
You could have said “last I checked, RadeonPro was no longer supported” or the like, but instead you stated as fact that it was dead.
But again, umadbro?
Note for those reading these posts: this is typical behaviour of fanbois of the green team; they spout nonsense/BS, then get enraged when they are called out. Not everybody who uses a company's products is a fanboi.
I am a fan of AMD, BUT I build with Intel and Nvidia as well; if they are better for a specific job, they are what gets used. The fellow above, I'm quite sure, uses Intel/Nvidia and is quite upset when his choices are questioned by other people's choices; why else would he be so angry over being called out for spreading FUD?
Why would I have an 8800 GTS 512 MB in my work system at the moment if I were a fanboi? I have a nearly brand-new 6670 sitting on a shelf, after all. I use the 8800 because it works; why replace it if it works for the job it's doing?
Anyway, enough of this. If the fellow above wants to post another enraged green-fanboi rant, let him.
My above posts stand. I linked proof that you lose nothing but hardware PhysX and TXAA (FXAA mk2) by going AMD, unlike green boy here, who just spouted nonsense and expected people to take it as fact.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
(edited by Jazhara Knightmage.4389)
What you did there… I see it. FYI, you've been paraphrasing my words. The question is: do you feel better about yourself, or not? And I know you hate me; I want to make you hate me even more, so do you want to see pics of my passive 3D Vision Surround setup (I only want to increase the hate)?
(edited by Raijinn.9065)
You mad bro?
Seems like you are…
I couldn't give less of a crap about Surround or Eyefinity or TXAA or whatever. I play the game at 2560×1440, by choice. I could run it across 3 monitors; I have the cards to do it, I have the monitors to do it, but honestly, why bother? I like to run my second monitor with chat programs, a browser, and MPC-BE playing TV shows/anime/movies as I game (in fact, I'm watching Slayers as I post this).
Sorry you are all bent out of shape over being called out.
Anyway, this is all OT, and no, I don't hate you. I just find you funny, entertaining really. People like you are all over the net: know-nothing know-it-alls. You know, you should really join the site TechPowerUp; you would fit right in. The site's full of Intel/Nvidia fanbois who like to circle-jerk and talk out of their posteriors.
And, Raijinn.9065, bro, it's been fun sparring with you, but you're really starting to bore me. No challenge at all, like a boxing match with a guy who has no arms…
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Haha, I'll take that as a win on my part. The only reason I keep asking if you want to see my setup is that you obviously have something against people who use them, aka Y U MAD.
PS: You're pretty good at paraphrasing.
Thanks, but I really have no issue with people using Eyefinity/Surround, and never did. I could choose to run them myself, but all that head turning… lol (I did this for a while; it didn't make games any more fun, so why bother?)
If you're so proud of your system, that's koo; I'm sure your ego needs the boost.
Have fun, man, I'm going to bed. All you have in you is ad hominem attacks… how disappointing.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Go lie down somewhere and dream up your own words for tomorrow, okay?
Not going to get into a debate, but I have run both and had fewer problems with NVIDIA all around. Currently I run an NVIDIA GTX 670 and have no issues in any game I play.
I've used both over the years, and I find both do a great job. I'm with Nvidia at the moment and was lucky enough to get an EVGA GTX 690. I've also got a few gaming buddies who have ATI, and they still play the same games on max settings.
The only difference is they need 2 ATI graphics cards to keep up with my one Nvidia.
It's true that 2 ATI cards might be cheaper than 1 Nvidia, but I've never liked CrossFire or SLI.
If you don't like SLI, you need to get rid of that 690; it's a dual-GPU card (like the 7970 X2 or 7990), SLI on a single card.
http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast
So you are running the equivalent of 2 cards, just in 1 slot.
http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329.html
http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-6.html
As you can see, the AMD equivalent walks all over the 690…
http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html
I'm not trying to insult you, but you are using SLI, just as people with 7990/7970 X2 cards are using CFX, even if it only uses one PCIe slot.
In some cases it's actually a problem on both sides: single-card SLI/CFX is more susceptible to one of the GPUs getting stuck at 0% use (you see a lot of people complaining about this every time new games come out).
Either way, you got a very nice card. I know 2 people who went crazy and got 690s myself; one even got 2 of them… he has some scaling issues, though… his 3960X can't keep up with them in a lot of titles, LOL (despite being OC'd to the 4.8 GHz range).
I also have a friend with a 7990/7970 X2 he won; that card's a total beast as well, though I will say it's quieter and runs cooler than the 690s the others have. Part of that is probably that as soon as he got it, he popped the coolers off and replaced the goop, while the other 2 just left the stock stuff in place (and Nvidia's stock stuff isn't so great; I saw a 12°C drop under load on my last Nvidia build by replacing the goop with Céramique, and Céramique isn't even close to as good as PK-1, PK-3, MX-4, or a few others that are newer).
I have dual 7870 GHz Editions because, at $209 each shipped, they give me amazing performance for close to what a decent 7970 cost at the time. I honestly couldn't pass up the deal.
I have run SLI a few times as well, but with SLI I had more microstutter issues that I couldn't work out (I think in part because there's no tool for Nvidia anymore that can do what RadeonPro does).
I will say it: I would love your card or a 7990/7970 X2… but not at the price they go for compared to what I have.
What's sad is that no matter how good our CPUs or video cards are, this game will always have horrible minimum FPS… at least till ArenaNet fixes their kitten.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
I still think both companies are very good, and really it comes down to price, lol.
I kinda wish I had gotten the GTX 670 instead of the 680 OC and saved myself 100 bucks.
The major issue is that GW2 is fracking terribad client-side. It doesn't matter if you have 2× GTX 690 SLI; you're still going to get kitten-poor performance unless you're running a freaking Cray!
ArenaNet needs to get off their kitten, stop with the meet-and-greets, and start working on this client!
intel 335 180gb/intel 320 160gb WD 3TB Gigabyte GTX G1 970 XFX XXX750W HAF 932
A Cray wouldn't help, sadly; the game really needs a tri-core at 6-8 GHz to run decently, and nobody makes a chip like that.
What needs to happen is we get a group together to go to the meet-and-greet and, one at a time, ask each and every employee there the same questions about fixing the performance of the game. That may get their attention.
If I had a ride to Seattle for the event, I would go, and I would ask if they are ever going to fix it, if they are ever going to add DX11 support, or if they are waiting for the Xbox 720 before they bother.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
lol, the Cray was sarcastic.
You're right about the tri-core at 6-8 GHz, though. That is exactly what it would need. Maybe an X3 on liquid hydrogen?
If I lived on the other side of the continent, I would go for sure. I think asking them over and over is the only way to get their attention.
There have been zero posts from devs regarding this issue.
It is pretty sad, really, since the game has a ton of potential.
intel 335 180gb/intel 320 160gb WD 3TB Gigabyte GTX G1 970 XFX XXX750W HAF 932
What I think they need to do is hire a team to either rewrite the client/engine to run better or rebuild the client on a different engine, while the current team keeps working on content and bug-fixing the current client as best they can.
Either way, they need to make it so the game never drops below 30 FPS on minimum specs, and so it scales to modern systems.
DX11 support would be another big bonus that could help with performance.
Sadly, I honestly believe the game is the way it is because somebody had visions of porting it to the Xbox 360.
Look at the facts: it's got 3 heavy threads, and the 360 has a tri-core CPU; it's DX9, and the 360 has DX9-level hardware.
The bonus of that was they could sell to XP users, then tell them that to fix many errors they should move to a 64-bit OS with more RAM… (ROFL?)
Honestly, the game's got amazing potential. The few things that could make it better would be a more TERA-like combat system (aiming rather than targeting); look at Neverwinter gameplay video, the one by TotalBiscuit is great. I love the quest system in GW2; that's one place TERA drives me nuts, back and forth over and over… lol (not as bad as WoW, though; TERA does at least try to keep you headed forward).
If ArenaNet would wake up and fix the game so it ran decently, my group alone would be dumping thousands in a year, not to mention the new people I could bring in who refuse to buy the game because it runs so poorly on their systems, despite every other game, including TERA, running quite well on them.
bah!!!!
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Yeah, I have pretty much given up on any performance patches. I will do my dailies and move on to other games.
The 360 port is probably what they had envisioned when they were modding the GW1 client.
I can pretty much guarantee that they will not rewrite any code for DX11, nor will they rewrite any code for multi-threading performance gains. It would cost too much money. Besides, they are busy putting Quaggan backpacks in the TP.
intel 335 180gb/intel 320 160gb WD 3TB Gigabyte GTX G1 970 XFX XXX750W HAF 932
2 GB for 1080p and mild AA.
3 GB for 1080p with HD textures (usually from mods) and high AA.
3 GB for 1440p/1600p and mild AA.
4 GB+ for 1440p/1600p and high AA.
You don't need a 4 GB card for 1600p. There are only 2 games I have seen use more than 2 GB of VRAM at 1600p on my 7970: BF3, which used about 2.1-2.2 GB, and Crysis 2, which used about 2.4 GB max, but both games still run fine on a GTX 680 at that res, which only has 2 GB of VRAM.
4 GB of VRAM is really not needed unless you are running a 3D Eyefinity or 3D Nvidia Surround setup.
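For perspective on those tiers: the framebuffers themselves are tiny, and the real VRAM pressure at high resolutions comes from textures and AA sample buffers, which is why 2-3 GB goes further than the raw resolution numbers suggest. A rough sketch of the framebuffer math (illustrative only; the function name and the triple-buffered RGBA assumption are mine, not measured figures):

```python
# Rough framebuffer arithmetic behind the VRAM tiers above.
# Illustrative assumptions: 4 bytes/pixel (RGBA8) and triple buffering.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Raw memory for a triple-buffered RGBA framebuffer, in MiB."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "1600p": (2560, 1600)}.items():
    # Even at 1600p this is well under 50 MiB -- textures dominate VRAM use.
    print(f"{name}: {framebuffer_mb(w, h):.1f} MiB")
```

Even 1600p triple-buffered comes out under 50 MiB, so the jump from a 2 GB to a 4 GB card only matters once high-res textures and heavy MSAA (or triple-monitor setups) enter the picture.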
I see one way they will.
Xbox 720; there's a lot of money in those consoles…
The 720 and PS4 have such similar hardware, by reports, that porting to both should be cake.
An 8-core PowerPC CPU in both; the Xbox has 8 GB of RAM, the PS4 has 4 GB; they both will have AMD APUs and apparently GPUs as well (M-series chips for lower-power use).
I get the feeling they may have even gone back and re-modded the engine with the 360 in mind, then learned it wouldn't work out… because it's too heavy.
I really do think it's VERY likely they will push out an Xbox 720 port, if not a PS4 port, simply because there's so much money in that market.
If they do this, the changes to threading should carry over to PC. It will need a different compile, PPC vs. x86-64, but hey, a kitten could do that… a drunk kitten, even.
Note: I say this as somebody who's recompiled a number of apps across MIPS/x86/ARM/RISC processors (not because I wanted to, but because I had to for a project).
I'm no programmer, but if they do this work to get it running on console, that's our best hope of them fixing the game for the rest of us.
So I'm laying my hopes on that.
If they can make it run using 4-8 balanced threads rather than 3 heavy threads, we could see a nice performance boost.
And from what I gather, MS is really pushing the use of 64-bit and DX11 for the 720; they want to move things along… (took them long enough).
Till then, though, the game's not going to get any major fixes unless somebody who's really wealthy decides to pay ArenaNet to fix it because they want to play the game properly (if I won the lotto… lol).
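The "4-8 balanced threads rather than 3 heavy threads" idea above can be sketched roughly like this (a toy example, nothing like ArenaNet's actual engine; the function names are made up for illustration): cut each frame's work into many small chunks feeding a worker pool, so however many cores exist all stay busy, instead of hard-wiring the work into a few oversized jobs.

```python
# Toy illustration of task-based threading: many small chunks feed a worker
# pool, so the load scales to however many workers exist, rather than being
# pinned to a few heavy threads. (Hypothetical sketch, not GW2 code.)
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(chunk):
    # Stand-in for one slice of per-frame work (AI, physics, culling, ...).
    return sum(x * x for x in chunk)

def frame_work(data, workers):
    # Aim for ~4 chunks per worker so no thread sits idle while one
    # oversized job finishes.
    chunk_size = max(1, len(data) // (workers * 4))
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

data = list(range(10_000))
# Same result regardless of worker count; only the load balance changes.
assert frame_work(data, 3) == frame_work(data, 8) == sum(x * x for x in data)
```

The point is that once work is expressed as small independent tasks, moving from a 3-core console target to a 4- or 8-core PC is a scheduler parameter, not a rewrite.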
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x