FX 8350 - A good buy?
That one beats the i5, but not in Guild Wars 2…
The game is not really optimized for multi-threading. But I guess you will not use your computer only for playing one game :p
Besides, the new PlayStation 4 will have an AMD CPU… so that means they are still better than Intel, I think :p
If you read the FAQ and some of the threads in this section, you would know the answers.
Guild Wars 2 uses 3 heavy threads and is very, very dependent on per-core performance, so an i5 (dual core with HT or quad core) or an i7 would be faster than an 8350, but if you OC either one to around 4.5-4.6GHz, you will get about as good a performance as this game is going to offer.
This game is not optimized at all, and ANet aren't really working on it from what we can tell; they have decided that stuff like Quaggan backpacks is more important than the game running well.
The best you can hope for with GW2 is to build a system that won't drop below 30fps 99% of the time. Yes, I said 30fps…
The video card doesn't matter too much; the game is totally CPU-bound.
If any of its 3 threads hits a wall even for an instant, your fps suffers, and since they hit walls constantly, your fps in WvWvW or large DEs will drop to around 30 even with the best OC you can get out of the best processors on the market.
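To make the arithmetic behind that 30fps figure concrete, here is a minimal sketch (illustrative numbers only, not measurements from the game): fps is the reciprocal of the slowest frame's time, so a single 33ms stall on the critical thread caps that frame at roughly 30fps no matter how fast the GPU is.

```python
# Minimal sketch: why a stalled thread caps fps. The frame times below
# are illustrative assumptions, not measured values from GW2.

def fps_from_frame_ms(frame_ms: float) -> float:
    """Convert a per-frame time in milliseconds to frames per second."""
    return 1000.0 / frame_ms

for frame_ms in (10.0, 16.7, 33.3):  # fast frame, 60fps budget, stalled frame
    print(f"{frame_ms:5.1f} ms/frame -> {fps_from_frame_ms(frame_ms):5.1f} fps")
```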
Now, there is one case where the AMD will beat the Intel: if you want to record the game as you play. This is true of many games; if you're running something like XSplit or Fraps, you will find the AMD gives you better performance because it's got more real cores. (Hyper-threading actually makes GW2 suffer on quad- and hex-core Intel chips; for their dual cores it's needed.)
Your best bet, if all you care about is GW2 and fps, is a 3570K clocked to at least 4.5GHz (the higher the better), but keep in mind that Ivy Bridge runs hot even with water cooling. This is due to Intel cheaping out on the TIM under the metal lid on the CPU (the IHS); you can fix it, but that voids your warranty… it also gives you higher OC headroom…
Personally, because I do more than just GW2, I find my 8350 @ 4.6 to be a godsend.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Hey guys, thanks for your answers. I'm really interested in the performance of your 8350, Jazhara. At what clocks are you running this game? Also, is WvW totally playable for you?
As my sig says, 4.6GHz 24/7. I rarely see anything but momentary spikes below 30fps, but I don't do WvWvW because, honestly, it runs like crap, so I just forget about even bothering with it.
WvWvW doesn't run well for anybody, really. It's playable, but I get tired of invisible attackers and the other BS that comes with WvWvW in its current state. Honestly, if you want PvP, pick another game that performs better and wait to see if ANet fix GW2 when the Xbox 720 comes out.
Every other game I have runs better than GW2; GW2 is in fact the only game I ever see spikes below 30 in, even at 2560×1440… that includes PlanetSide 2, a game that even Sony, its creator, admits needs a lot of work to get its performance up (among other problems the game has).
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
I have exactly the same problem with WvW. I do like it, but it's completely unplayable, or well, at least it's not enjoyable. I sometimes run it with my guildies for even a bit better fps, but I won't really play WvW before this all gets fixed…
Still reading this? You know there is something better to be done for sure. -.-’’
For GW2, get an FX 8350 only if your current motherboard can take it, as that makes it a really cheap upgrade.
If you need a new motherboard for that, however, it would be better performance-wise to go Intel 3570K or 3770K.
If you're upgrading to do things other than gaming and want to stay on a tight budget, then get the FX 8350.
If you've got loads of money and want something that beats an FX 8350 in GW2 and can still hammer away at non-gaming computational tasks, then get a 3930K (6 physical cores, 12 threads).
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asus Xonar D2X & Logitech Z5500 sound system |
I am running my 8350 stock on an ASUS 990FX R2 with 32GB RAM and I am happy with it. Would an Intel chip have been better? Maybe, but I always have a video playing on a second monitor with an SQL query running in the background, so this chip is better for me because the work/play line is severely blurred. My understanding is there is no perfect solution, and well, that's life.
If you've got loads of money and want something that beats an FX 8350 in GW2 and can still hammer away at non-gaming computational tasks, then get a 3930K (6 physical cores, 12 threads).
Which would make little to no sense, since GW2 has very poor optimization for more than 4 threads.
Athlonpv, you misunderstand the reasoning.
An FX 8350 is better for a rig designed for gaming and computing, and it's cheap.
A 3570K is great for gaming. In GW2 it's slightly better than an FX 8350, but it's not as good at computing, as it only has 4 cores / 4 threads.
A 3930K is better at both but is significantly more expensive.
It doesn't matter how many threads GW2 uses, since it will use as many as it needs of what the CPU has available; the rest of the threads will be useful for the computing side of things. Hence the ‘and’.
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asus Xonar D2X & Logitech Z5500 sound system |
I think the FX is the best buy atm; the motherboard will support at least 2 more CPU generations for further upgrades, which is very nice. You can't say that about Intel, though. If you want to upgrade right now, go for the AMD; if you can wait until June or July, you can get the new Haswell processors. But buying Ivy at this point is just not smart.
I am particularly waiting for Haswell to upgrade my 2600K because I am not happy with its performance in GW2. Maybe if Ivy-E comes out with noticeable improvements over regular Ivy, I could consider the LGA 2011 platform.
I am running my 8350 stock on an ASUS 990FX R2 with 32GB RAM and I am happy with it. Would an Intel chip have been better? Maybe, but I always have a video playing on a second monitor with an SQL query running in the background, so this chip is better for me because the work/play line is severely blurred. My understanding is there is no perfect solution, and well, that's life.
Hey, I'm not alone; I'm always watching videos as I game.
That, or sometimes listening to audiobooks, but that's a far less intensive task than having, for example, 720p James Bond movies on my 2nd screen as I game. With my 1055T, 8120 and 8350 I don't see any performance hit doing this; with the 3570K I built recently, you would get hitching every so often, and if you were, for example, watching a TV series, every time it changed episodes the whole system would stall for 3-5 seconds…
I think the FX is the best buy atm; the motherboard will support at least 2 more CPU generations for further upgrades, which is very nice. You can't say that about Intel, though. If you want to upgrade right now, go for the AMD; if you can wait until June or July, you can get the new Haswell processors. But buying Ivy at this point is just not smart.
I am particularly waiting for Haswell to upgrade my 2600K because I am not happy with its performance in GW2. Maybe if Ivy-E comes out with noticeable improvements over regular Ivy, I could consider the LGA 2011 platform.
What I read was that Ivy-E was coming out, then Haswell even later in the year than July, and Haswell will be another one-off socket that will be replaced by Broadwell.
The best bang for the buck at the moment would be something like
http://pcpartpicker.com/p/G9kn
You'd want to add some nice case fans like Akasa Vipers, but all in all, the build I just linked is serious bang for the buck, not just for gaming but for general computing.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Wow! Thanks for all the replies, guys! Didn't expect to have so many.
Thanks to all these posts, I am starting to lean towards the AMD 8350, as I will in the future use some light AutoCAD for architecture.
I've got just a few more questions. Initially, I really don't plan on overclocking this CPU.
So basically, at stock levels, how will the 8350 run WvW?
Also, when I get the 212 EVO, what is the maximum clock I can achieve?
By doing so, let's say I get to 4.4-4.6GHz, how much will WvW performance increase?
To be honest, I'm happy in big zergs (30v30) as long as my fps is 25+. Will the 8350 make it?
And finally, while ArenaNet may perhaps be delaying further optimization, can I expect some noticeable increase in performance?
Thanks!
Overclocked, 25+ should be fine in WvWvW.
If you can swing a Xigmatek Aegir, that will get you 4.6-4.8 for sure. I would add 2x Akasa Viper fans if you get an Aegir; they move more air and are quieter than its stock fan (the fan ticks when it PWMs; not horrible, but it had me thinking something was touching one of my fans… lol).
Don't mind the double listing of the thermal goop; I didn't see I had done that… lol.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Indeed… Jaz has first-hand experience with the FX 8350. From my knowledge of it, you can expect 4.5GHz on the EVO, though it could easily be 100MHz higher if you get a good chip. At stock settings it won't fare too well in WvW, but even the 4.5GHz OC should net a significant performance increase and keep FPS above 25 in the kind of battle sizes you mentioned, considering a 4.6GHz OC will manage the same 25+ with larger battles.
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asus Xonar D2X & Logitech Z5500 sound system |
That one beats the i5, but not in Guild Wars 2…
The game is not really optimized for multi-threading. But I guess you will not use your computer only for playing one game :p
Besides, the new PlayStation 4 will have an AMD CPU… so that means they are still better than Intel, I think :p
No, the PS4 uses an AMD CPU because it needs to hit a price point.
Oh, it's not all about price. AMD offers the better overall package: AMD is better at multi-threading, AMD also has FAR FAR better video than Intel, and I'm sure AMD is supplying some sort of chipset for it, since that would make things a lot easier on Sony in hardware and software design.
Intel is faster per core, but Intel doesn't even have an 8-core chip yet, and no, quads with HT don't count… HT is great when it works, but can have HORRIBLE effects when it doesn't.
A simple example is using the x64 Ogg Vorbis encoder with dBpoweramp on all cores: my 8-core AMD flies, while my buddy's 900X (it's a 970/980/990… can't remember) at 4.8 takes a 47% performance hit with HT enabled doing the same task, making it A LOT slower than mine. With HT disabled it's slower than mine, but not by nearly so big a gap… (This was done encoding a large batch of FLAC files to Ogg using the Lancer-modified oggenc from the Hydrogen Audio forums via dBpoweramp; foobar2000 could also be used for testing, I hear… but dBpoweramp is easy and fast.)
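If you want to reproduce that kind of HT-on vs. HT-off comparison yourself, here is a rough sketch of the test using plain oggenc instead of dBpoweramp. Assumptions: the vorbis-tools `oggenc` binary is on your PATH, and `flac/` is a hypothetical folder of input files. Run it once with HT enabled and once with HT disabled in the BIOS, and compare wall-clock times.

```python
# Rough sketch: time a batch FLAC -> Ogg encode with an all-cores worker
# pool. Threads are fine here because each worker blocks on a subprocess.
import os
import subprocess
import time
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

flacs = sorted(Path("flac").glob("*.flac"))  # hypothetical input folder

def encode(src: Path) -> None:
    # oggenc reads FLAC directly; -Q silences the per-file progress output.
    subprocess.run(
        ["oggenc", "-Q", str(src), "-o", str(src.with_suffix(".ogg"))],
        check=True,
    )

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    list(pool.map(encode, flacs))  # list() forces completion and re-raises errors
print(f"{len(flacs)} files in {time.perf_counter() - start:.1f}s")
```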
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
They use AMD because it has a great price-to-performance ratio. Also, while the AMD CPU does have 8 cores, one AMD core =/= one Intel core. As for AMD having better integrated video, that doesn't really matter; the PS4 is using a dedicated graphics card anyhow.
It's using a dedicated AMD video chip; I didn't say integrated…
Intel video is a joke. nVidia video… well, nV always wants more for their chips, not to mention the whole 8600GT (and related chipset) debacle (huge failure rates; google it. Apple and many other laptop vendors were hit very hard, and nVidia didn't make things right…).
By going AMD/AMD they also get a platform where they only have to go to one place for assistance making the most of the hardware or working around any bugs that are found.
If you went by the Intel fanboi line, Sony should have used a quad-core Intel chip, any quad-core Intel chip, as it would have benched faster than any AMD chip (not kidding). But the fact is, AMD is seriously strong at multi-threading, and everybody knows that's the way of the future… even Intel… but Intel being Intel, they want to milk each upgrade by requiring a new motherboard… can't blame them, it's worked so far… I mean, people year after year upgrade their board to get a new chip that's not really much better than last year's…
It also forces OEMs to buy more boards, selling more chipsets for Intel…
Blah. Basically, yes, going Intel could give better per-core performance, but it wouldn't get them as many cores (HT doesn't count).
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Thanks mate, I'm almost sold on getting it.
How is the temperature of your CPU when gaming?
I do not overclock, and I love it. I have the exact same setup, save a single ASUS 7870. Overclocking is only a couple of clicks away, though. Enjoy!
Well, if my opinion matters at all at this point, I’d like to provide some input. My current system is:
CPU: Intel i5-3570k @ 3.4GHz (so it’s not overclocked; I am using the stock fan.)
GPU: AMD Radeon HD 7770
RAM: 8GB DDR3 1600MHz
PSU: 620W SeaSonic M12II
MB: ASRock B75m-DGS
HDD: 1TB Seagate Barracuda 7200RPM 6Gb/s
While I can't really say what my FPS drops to during larger battles (because I haven't really checked), I can say that my FPS stays well above 50-60 most of the time when I'm playing, and I keep all my settings on high. So I'm quite satisfied with my build; I can try getting into larger battles later tonight, after I finish my project for computer science, and check my FPS.
On a side note, I don't think Intel CPUs are just for “Intel fanboys”. I think it's safe to say that AMD and Intel each have their pros and cons compared to each other. For example, an i5-3570k will run better than an AMD FX-8350 in Skyrim, Diablo 3, and WoW. On the other hand, an AMD FX-8350 will run better than an i5-3570k for things like rendering, 3D modeling, and Photoshop. In terms of just gaming, Intel is typically the better processor, but is also a bit more expensive (the AMD FX-8350 is $30 cheaper on Newegg right now). So while I don't agree with Jazhara's opinion that AMD is the “way to go”, I don't think that AMD is all that bad either. It really depends on what you specifically intend to use your processor for and how software applications adapt to the current technology offered by today's hardware, which leads to the typical “do your homework” reminder when it comes to these sorts of things.
Pardon some parts of my post if they seem off-topic; I meant for it all to be relevant and hopefully useful information. If you’re looking for a processor specifically for GW2, I would say to go with an i5-3570k. But again, there are numerous factors regarding the i5-3570k and the AMD FX-8350 that you should look into and read up on before deciding.
Edit: Sorry for the wall of text! I’m a bit of a computer enthusiast.
Sheer, sorry, you didn't understand what I meant.
See, on many forums you will have Intel fanbois tell you that nothing AMD makes is worth buying for any reason. They will point to benches that show dual- and quad-core Intels are faster than 8-core AMDs as proof… what most if not all of the benches they use have in common is that they are either antiquated crap like SuperPI (nobody uses x87 code anymore; people use SSE nowadays) or games that favor fewer, individually more powerful cores rather than stronger threaded performance.
Intel is the way to go if all you do is play games. It's also the way to go if you really care about your SuperPI score. If you're playing a lot of games that have yet to add proper threading support, like GW2, and that's all you wanna do, get Intel; it's the best choice.
If you want a platform that's going to be a bit more future-proof and will give you better multi-tasking and multi-threaded support for less money, go AMD.
If you look around, Intel gets beaten hard in one very telling bench, Cinebench R11.5, a program built and compiled with Intel's compiler. For those not in the know, Intel's compiler sends non-Intel chips down a much slower code path than Intel chips; even if the other chips support the same feature set (SSE2/3/4/AVX), they just get a basic, slow code path.
http://www.agner.org/optimize/blog/read.php?i=49
If you read that thread, you will see that, for example, a VIA Nano is over 26% faster when the program thinks it's an Intel CPU. AVX can boost the performance of apps like BOINC 20-50%, but Intel's compiler blocks non-Intel chips from getting AVX code.
So the fact that AMD is trading blows with or beating Intel in Cinebench R11.5 shows just how powerful the chips really are… imagine if Intel played fair and let AMD get AVX… (there's no patch to remove the AVX blockage yet… you can remove the blockage for the SSE code paths).
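As a toy illustration of the dispatching described above (this is not Intel's actual compiler code, just the shape of the check): the runtime keys off the CPUID vendor string rather than the feature flags the chip actually reports. The sketch below is Linux-only, since it reads /proc/cpuinfo.

```python
# Toy illustration of vendor-string dispatch: "GenuineIntel" gets the fast
# path, everything else ("AuthenticAMD", "CentaurHauls" for VIA, ...) falls
# back to a generic path, regardless of the SSE/AVX flags the chip reports.
from pathlib import Path

info = Path("/proc/cpuinfo").read_text()  # Linux-only
vendor = next(line.split(":")[1].strip()
              for line in info.splitlines() if line.startswith("vendor_id"))
flags = next(line.split(":")[1].split()
             for line in info.splitlines() if line.startswith("flags"))

print(f"vendor: {vendor}")
print(f"AVX supported: {'avx' in flags}")
print("dispatcher would pick:",
      "fast vectorized path" if vendor == "GenuineIntel" else "generic slow path")
```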
http://www.techradar.com/us/reviews/pc-mac/pc-components/processors/amd-fx-6300-1117533/review
http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300_7.html#sect0
If you dig around, AMD and Intel trade blows pretty evenly app to app, game to game…
Skyrim runs fine on AMD. It runs better on Intel, but come on… both give you over 60fps… and most people's monitors can't display more than that, so what does it matter? (60Hz = 60fps.)
The same is true for most other games… and in those AMD truly sucks at, like SC2, Intel gets hammered as well.
And remember, AMD have stated they plan to use the current socket for 2 more generations of chips. With Intel, if you buy now, next year you're gonna need a new board if you want a new chip.
So yeah, in the end, if you're a gamer who uses your system for more than just gaming (most of us), it comes down to this: Intel will give you at best an extra 5-7fps in GW2 in WvWvW/LA/DEs, but will cost more and be slower in some apps.
Also note: Intel boards with comparable features tend to cost more than AMD boards, so you pay $30 extra to get a 3570K vs an 8350, then pay extra to get an equivalent motherboard…
Hell, $30 about covers the cost of a Cooler Master Hyper 212 EVO, a cooler that can get you to 4.5GHz on the AMD…
Oh yeah, and with Intel you get this fun: http://forums.anandtech.com/showthread.php?t=2261855
Yup, Ivy Bridge runs hotter than Sandy Bridge because Intel cheaped out on the TIM under the IHS; rather than using their normal fluxless solder, they used VERY VERY cheap, shoddy thermal paste… (look at the thread) a 20°C temp drop by replacing it… but to do that, you've gotta void your warranty… and risk killing your chip…
I build with both Intel and AMD for people, but I have had fewer and fewer people wanting the Intel route after they see the above links and the cost difference…
The savings going 8350 with a nice board go towards better cooling, more RAM, or a better video card…
http://pcpartpicker.com/parts/partlist/
A solid AMD build on the cheap (I would upgrade the cooler to the Aegir).
Yes, I am an AMD fan, but I do build with Intel for people whose tasks are better suited to Intel or who want Intel for whatever reason. I have no problem building with Intel if it's the client's desire or makes more sense… I just rarely find it makes more sense for the money… especially as Sandy Bridge chips become harder to get and Ivy Bridge chips have TIM issues (yes, it bothers me… I don't like building with faulty parts, and to me, TIM that bad is faulty…).
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Let me put my two cents in: if you are going to play all sorts of MMOs, most if not all rely on single-core performance for obvious reasons, and we all know which brand of CPU offers better single-core performance. Last time I checked (like a week ago), with the most current drivers and whatnot, I was getting 30ish on the Best Appearance preset, with dips to 13-15 in WvWvW zergs. I don't regret buying the AMD CPU; I regret buying this game. So decide for yourself.
My rig is an FX 8120 @ 4.7GHz
ASRock 990FX Fatal1ty Pro
MSI TF 6970
From personal experience, the Ivy Bridge line hands-down gets better performance clock-for-clock in this game… I'm running an i5-3570K at 4.84GHz (47×103) with 8GB 2470MHz on a 7870 2GB, and with shadows on medium, shaders on medium and the rest high, I can maintain a solid 30+ in a battle of maybe 75+; I think I saw it drop to 28 once. But I run the Fraps fps counter always, so I'm always observing my fps.
I would also like to note that I did some memory work for a friend and overclocked his memory to 2600 under an i7-3770K, and his min fps went up by a solid 3-4 in WvW. I haven't seen him dip below 40 in a battle of 100; it's sick, lol. Oh, BTW, his 3770K is OC'd to 5.1.
One last thing, man: I really recommend building yourself a 3770K system, especially if you want the hyper-threading tech. If you have a Micro Center near you, I'd suggest looking them up, because you can get a 3770K for like $230 and a 3570K for $180.
They use AMD because it has a great price-to-performance ratio. Also, while the AMD CPU does have 8 cores, one AMD core =/= one Intel core. As for AMD having better integrated video, that doesn't really matter; the PS4 is using a dedicated graphics card anyhow.
Nope, the PS4 is using what AMD calls an APU: 8 Jaguar cores with a graphics chip on the die. And from what “we” “know”, MS is using something that is “close” to Sony's.
AMD beats Intel CPUs when it comes to video encoding. There are some other encoding tasks which are faster as well. There is a video out on how AMD beats Intel in games when using something like Fraps while running the game at the same time.
There is also the upgrade path: Intel will go with mainboards + CPU soldered together, while AMD allows users to upgrade the CPU without needing a new mainboard.
So far AM3+ has one upgrade left, somewhere before the end of the year (probably 2014), and it is called Steamroller.
How would an OC'd 8350 compare to an i5-3470 at stock clock? Which would be better for WuvWuv?
Stock for stock they would be very close to the same; the AMD will be faster than the 3470 once you overclock it. There are some reviews where the 3470 is shown, and it's rarely faster once you OC the AMD (there are apps that still favor Intel, of course, especially those compiled with Intel's biased compiler).
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
How would an OC'd 8350 compare to an i5-3470 at stock clock? Which would be better for WuvWuv?
If you wish to use Fraps or stream your game, then the FX 8350 is a good buy.
Benchmarking is not a simple thing to do.
Most review sites benchmark at the lowest resolution and textures, which is not the average user's load. AMD's new FX chips are definitely better at video encoding and playing at the same time.
The 3470 doesn't overclock well; you have to use turbo and BCLK, and… that's VERY limiting.
On the other hand, AMD is FUN and easy to overclock:
http://www.overclock.net/t/1348623/amd-bulldozer-and-piledriver-overclocking-guide-asus-motherboard
That guide will work for ASRock as well, before you ask.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Most review sites benchmark at the lowest resolution and textures, which is not the average user's load. AMD's new FX chips are definitely better at video encoding and playing at the same time.
Yeah, I'm the first one to post those videos. Another note: if you want to run background tasks as you game, the same thing's true; moar cores is moar better when doing moar things at once (yes, the misspelling is intentional).
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
Yeah, and just look at Intel… fewer physical cores, fewer GHz; they bet it all on their hyper-threading stuff. This isn't going to work well, since it takes too much time to make applications that use that technology. It's easier if you have more actual units.
And AMD sockets don't change like a pair of socks like Intel's do. Also, if you really have to buy a motherboard, you definitely don't want to go for Intel, because you will end up at twice your budget… and maybe not even that good.
It's using a dedicated AMD video chip; I didn't say integrated…
Intel video is a joke. nVidia video… well, nV always wants more for their chips, not to mention the whole 8600GT (and related chipset) debacle (huge failure rates; google it. Apple and many other laptop vendors were hit very hard, and nVidia didn't make things right…).
By going AMD/AMD they also get a platform where they only have to go to one place for assistance making the most of the hardware or working around any bugs that are found.
If you went by the Intel fanboi line, Sony should have used a quad-core Intel chip, any quad-core Intel chip, as it would have benched faster than any AMD chip (not kidding). But the fact is, AMD is seriously strong at multi-threading, and everybody knows that's the way of the future… even Intel… but Intel being Intel, they want to milk each upgrade by requiring a new motherboard… can't blame them, it's worked so far… I mean, people year after year upgrade their board to get a new chip that's not really much better than last year's…
It also forces OEMs to buy more boards, selling more chipsets for Intel…
Blah. Basically, yes, going Intel could give better per-core performance, but it wouldn't get them as many cores (HT doesn't count).
You really can't compare AMD dedicated graphics to Intel integrated graphics; you're comparing apples to oranges. Also, most people don't upgrade their CPU each year. People who upgrade every year are usually bleeding-edge enthusiasts; the majority of people follow the 3-year cycle. And again, one AMD core isn't equal to one Intel core. So while AMD may have more cores, Intel's cores actually get more work done per cycle compared to AMD's.
There is also the upgrade path: Intel will go with mainboards + CPU soldered together, while AMD allows users to upgrade the CPU without needing a new mainboard.
So far AM3+ has one upgrade left, somewhere before the end of the year (probably 2014), and it is called Steamroller.
No, Intel denied the BGA-only rumors.
http://www.maximumpc.com/article/news/intel_says_company_committed_sockets2012
The 3470 doesn't overclock well; you have to use turbo and BCLK, and… that's VERY limiting.
On the other hand, AMD is FUN and easy to overclock:
http://www.overclock.net/t/1348623/amd-bulldozer-and-piledriver-overclocking-guide-asus-motherboard
That guide will work for ASRock as well, before you ask.
The 3470 doesn't overclock well because it doesn't have an unlocked multiplier like the K series does. Unlocked processors from Intel are just as fun and easy to overclock as AMD's.
As Unspec said, unless it's a K series, it is pretty much a waste to overclock an Intel processor.
One thing my friend has told me about (he used two AMD CPUs before switching to Intel, and currently uses two Intel CPUs in his computers; he has two computers) is that an AMD CPU's lifespan tends to be much shorter than an Intel's.
Your buddy is either full of something brown and smelly, a fanboi, or you misunderstood him…
I have been at this computer-building game for over 20 years now; neither Intel nor AMD has chips that in themselves don't last. There have been some pretty bad boards and some bad chipsets (never seen a bad AMD chipset, but Intel had a few that were horrible, and SiS Intel chipsets sucked… don't get me started on VIA…).
And Unspec, no, to me Intel isn't as fun or challenging to push. All you really can do with Intel is change the multiplier and volts… with AMD you can change the bus without breaking anything, and you can tweak the HT and the CPU/NB clocks…
FSB – 280MHz
DRAM – 1866MHz
CPU/NB – 2520MHz
HTT – 2520MHz
Those are my current settings… quite fun, really.
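For anyone wondering how those numbers relate: on AM3+, every clock is the reference (FSB) clock times a multiplier. The multipliers in the sketch below are assumptions chosen only to reproduce the settings listed above; the post doesn't state them.

```python
# How the listed clocks fall out of a 280MHz reference clock.
# The multipliers are assumed values, not taken from the post.
fsb_mhz = 280.0

multipliers = {
    "CPU core": 16.5,   # 280 x 16.5 ~= 4620 MHz, the 4.6GHz OC
    "CPU/NB":   9.0,    # 280 x 9    =  2520 MHz
    "HTT":      9.0,    # 280 x 9    =  2520 MHz
    "DRAM":     6.66,   # 280 x 6.66 ~= 1866 MT/s (DDR3-1866)
}

for name, mult in multipliers.items():
    print(f"{name:8s}: {fsb_mhz * mult:6.0f} MHz")
```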
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
To be honest, my thought on it was that it was purely the luck of the components he had that worked well or didn't work well, but I figured it wouldn't hurt to post about it anyway. I think that, as long as both CPUs are subject to the same load relative to what they're capable of at stock, they'll last about the same time. He's not really a fan of AMD or Intel in particular, but he does like Asus and will buy Asus whenever he can.
My very brief and condensed stance on AMD vs. Intel: when it comes to my digital artwork and programming, I appreciate AMD processors purely because they kick Intel to the curb at rendering and general multi-tasking; I wouldn't even consider Intel for a machine specified for such tasks. However, when it comes to gaming, I believe that, as of today, right now, Intel takes home the trophy. I don't think there's a “bad” choice unless you choose a system that isn't tailored to how you intend to use it.
You guys make it so hard.
My dilemma is the following:
• Going Intel and staying Intel for at least 4 years, since I will have to replace the whole mobo AND CPU. The reason is to guarantee good performance in GW2.
• Going AMD, since it really can be said that it is more future-proof: I only have to change to a Steamroller/Excavator CPU, as it seems both will be compatible with AM3+. Not only that, but games will certainly start using multithreading in the next year or so. (PS4)
BUT, by this route my only option is to pray that GW2's optimization improves, especially for AMD…
HTT – 2520MHz
I hope the HT doesn’t stand for Hyper Threading.
Just for your consideration, Crysis 3 benches: as you can see, the FX 8350 beats the i7 3770K and destroys the i5 3570K, and the FX 6300 ($130) beats the i5 2500K ($220), so your money is well invested in AMD tech for future gaming. In any multithreaded scenario (and future games will all be like this), AMD will beat the current Intel generation of CPUs; just look at that chart. The FX 6300 beats the i3 3220 at stock speeds (the AMD can be OC'd, i3s cannot) by around 70% at the same price point. I wonder what the people who said the i3 was better than 8-core AMDs would say about that.
http://www.pcgameshardware.de/Crysis-3-PC-235317/Tests/Crysis-3-Test-CPU-Benchmark-1056578/
Rampage, I saw that post. It only beat it on that map. There are tons of threads about that situation, and later on you will see that the i5/i7 actually beats the 8350 on almost all maps.
If it makes you feel any better, I intend to stay with the i5-3570k for the next couple of years for gaming; I do intend to replace my motherboard simply because the current one I have is a micro board and I bought it cheap ($55), but it had a PCIe 3.0 slot, which was perfect for my Radeon HD 7770 GPU. I do intend to add some more to it in the future, though, when I have the cash.
Straightforward answer: I intend to stick with the i5-3570k for the next few years, and I would encourage others to do so as well if they’re building a computer purely for gaming.
Rampage, I saw that post. It only beat it on that map. There are tons of threads about that situation, and later on you will see that the i5/i7 actually beats the 8350 on almost all maps.
So the most CPU-intensive map, you mean?
• Going AMD, since it really can be said that it is more future-proof: I only have to change to a Steamroller/Excavator CPU, as it seems both will be compatible with AM3+. Not only that, but games will certainly start using multithreading in the next year or so. (PS4)
If more games use multithreading, then you will not have any problem with AMD. Since AMD already has more cores than an i7, it will have better performance.
BTW, if you go AMD, get 2133 memory; AMD performance scales very well with memory speed and latency because of the memory controller, and it performs poorly in some games using slow memory (F1 and GW2 are pretty good examples of this). Also get a good mobo, since you will probably be using it for the next-gen chips as well; get a 990FX Asus Sabertooth: great overclocker, good performance, very good component quality and manufacturing. Don't cheap out on the mobo.
Last post about the 8350, I promise…what Rampage says is correct.
I have the ASUS SABERTOOTH 990FX R2.0 with an AMD FX-8350, CORSAIR Vengeance 32GB 1866, and an ASUS HD7870-DC2-2GD5-V2 running on an ASUS MX279H monitor, and I couldn't be happier.
It looks like turbo boost was turned off in those Crysis 3 benches. I'd like to see the results when it is turned on.
I hope the HT doesn't stand for Hyper Threading.
Clearly you don't know much about AMD. AMD don't do hyper-threading; HT stands for HyperTransport. The communication between the chipsets and the CPU is all HyperTransport-based: http://en.wikipedia.org/wiki/HyperTransport
BTW, if you go AMD, get 2133 memory; AMD performance scales very well with memory speed and latency because of the memory controller, and it performs poorly in some games using slow memory (F1 and GW2 are pretty good examples of this). Also get a good mobo, since you will probably be using it for the next-gen chips as well; get a 990FX Asus Sabertooth: great overclocker, good performance, very good component quality and manufacturing. Don't cheap out on the mobo.
You're better off with the ASRock 990FX Extreme4 than the Sabertooth. I own 2 Sabertooths, and though they are great boards, they have their flaws; for one, the VRM section of the board gets hot, and AI Suite will nag you a lot about it (even though it's within the safety range of all the parts used in that area of the board).
Another thing: the Extreme4 has a heatsink for its CPU VRM/FET area, and it also comes with a front-panel breakout for USB 3.0 in case your case doesn't have one.
It's cheaper as well.
Honestly, if I could go back, I would have an Extreme4 now… Another fun note: it has an IDE/PATA connector, so if you've got a decent PATA optical drive you want to re-use, you can… (no need for SATA on optical drives; they can't saturate the PATA bus, let alone SATA).
As to RAM, it also really depends on the board and the latency vs. clocks of the RAM. 1600 CAS 8 is better than 2133 CAS 11 or 12 in many cases; 1866 CAS 8 is kinda hard to get and not really worth the extra cost…
From my experience, the best option is to get a decent 32GB kit at 1600MHz or better, use FancyCache to cache your games drive, and then in a year or 2, when faster, lower-latency RAM is cheaper, upgrade.
I got my 32GB 1866 kit for $109.99 shipped from Newegg… Honestly, 9-9-9 at 1600 is just as fast as 1866 at 10-11-10 for GW2 (I have run it both ways; it even does 2133 just fine at 11-11-11).
Honestly, at this point in time this kit http://pcpartpicker.com/part/mushkin-memory-994069 is the best value on the market; a lot of reports of getting 1866 or better with looser timings… though honestly, CAS 9 for a 32GB kit at 1600 is very fast!!!
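The reason 1600 CAS 9 can keep up with 1866 CAS 10 is simple arithmetic: first-word latency in nanoseconds is the CAS count divided by the command clock, and the command clock is half the DDR data rate. A quick check (standard DDR3 math, nothing measured here):

```python
# First-word latency: latency_ns = cas * 2000 / data_rate_mt_s
# (the factor of 2000 converts MT/s to the command clock in ns).
kits = [
    ("DDR3-1600 CAS 9",  1600, 9),
    ("DDR3-1866 CAS 10", 1866, 10),
    ("DDR3-2133 CAS 11", 2133, 11),
]
for name, rate, cas in kits:
    ns = cas * 2000.0 / rate
    print(f"{name}: {ns:.2f} ns first-word latency")
```

All three land within about 1 ns of each other, which is why the faster kit only pays off if the timings don't loosen proportionally.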
http://pcpartpicker.com/p/Gr66
That's a build I specced up for somebody a while back.
The PSU is overkill for 1 card, but it leaves you open for a quick, easy upgrade to dual 7870 XTs and as many HDDs as you can cram in that case… ;)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
I do intend to replace my motherboard simply because the current one I have is a micro board and I bought it cheap ($55), but it had a PCIe 3.0 slot, which was perfect for my Radeon HD 7770 GPU.
There is ZERO point to running a 7770 in a PCIe 3.0 slot, ZERO. It can't saturate the bandwidth of a 1.0 slot, let alone a 2.0 or 3.0 slot…
The Bottom Line
We have put forth a great effort to get to the bottom of the PCIe 2.0 versus PCIe 3.0 debate. We put a lot of time into testing performance and verifying that our data is accurate. Except for a couple of specific scenarios, most of the performance advantage had under PCIe 3.0 was well under 10%. This actually falls in-line with the kind of performance advantages one might expect using an Ivy Bridge CPU clock-for-clock compared to a Sandy Bridge CPU. The IPC can affect performance by as much as 4-7% in favor of Ivy Bridge easily. As you noticed, most of our data when we experienced an improvement on the Ivy Bridge system was in this range of improvements. There were a few specific cases of 11% in The Witcher 2 in one test, and 19% in Batman (for part of the game only) and 14% when we cranked up the settings to unplayable levels in Max Payne 3. For the most part, at the real-world highest playable settings we found playable, all performance advantages were under 10%.
With real-world gameplay performance advantages under 10% it doesn’t change the actual gameplay experience. It in no way allows us to improve in-game quality settings nor does it give us any advantages over the PCIe 2.0 system. As we’ve stated previously in this evaluation, the technical performance advantages are “benchmarkable” but not relating to the gameplay experience.
It is also very clear from our testing that the NVIDIA GeForce GTX 680 receives an overall higher percentage of improvements with Ivy Bridge than the Radeon HD 7970 does. It is possible that similar to our past CPU frequency testing, NVIDIA GeForce GTX 680 GPUs are simply more sensitive to CPU clock speed and IPC, especially when you scale these upwards. We’ve done testing in the past that also shows NVIDIA GPUs are more sensitive to CPU clock speed than AMD GPUs are as you scale those up to dual and triple-GPUs. Therefore, we are not shocked to find that one brand might benefit with a technology more than another. It is an interesting result that we didn’t expect when we started testing.
So do not fret if you are on a Sandy Bridge PCI Express 2.0 system, you aren’t missing out on a bunch of performance compared to an Ivy Bridge PCI Express 3.0 system. Most of our readers will likely benefit from higher CPU overclocks on Sandy Bridge anyway if you are truly pushing the CPU clock and this alone will likely negate any “advantages” from PCIe 3.0 or Ivy Bridge IPC when it comes to real-world gaming scenarios. PCIe 3.0 is a great evolution, one day it may actually support a better gameplay experience compared to PCIe 2.0, but that day is not today.
There are similar articles from other sites; this one was just the first I found on Google.
There's ZERO need for PCIe 3.0 today; that's why Intel and AMD both aren't really pushing to get boards out with tons of 3.0 lanes… why bother?
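For reference, the slot bandwidth numbers behind that claim are easy to derive from the per-lane transfer rates and encoding overheads (standard PCIe spec figures, shown here as a back-of-envelope check):

```python
# Back-of-envelope PCIe x16 bandwidth per generation:
# per-lane throughput = transfer rate x encoding efficiency / 8 bits.
gens = [
    # (name, GT/s per lane, payload bits per transferred bits)
    ("PCIe 1.0", 2.5, 8 / 10),     # 8b/10b encoding
    ("PCIe 2.0", 5.0, 8 / 10),     # 8b/10b encoding
    ("PCIe 3.0", 8.0, 128 / 130),  # 128b/130b encoding
]
for name, gt_s, eff in gens:
    lane_gb_s = gt_s * eff / 8  # GB/s per lane
    print(f"{name}: {lane_gb_s:.2f} GB/s per lane, {lane_gb_s * 16:.1f} GB/s at x16")
```

That works out to roughly 4, 8, and 16 GB/s at x16, which is why a mid-range card like the 7770 is nowhere near slot-limited even on gen 1.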
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x
I have the 8350 and saw a drastic improvement over the 4100 (OC'd to the same clock speed). Doubled framerate in a lot of places. The Bulldozer core was just a piece of crap; the Piledriver core is just better all around.
You can’t beat the FX-8350 for the price.
I agree with your last statement, but I can't agree that Bulldozer was a piece of crap. I have an 8120 and didn't see that much of a gain moving to the 8350 at the same clocks. The main issue with the 4100 and the like is that it's sort of a hybrid of dual and quad core, and the front end isn't efficient at load-balancing FPU time, so you bottleneck faster.
The 8120 is a beast even though it benches poorly; the 8350 (and even the 8320) are just an evolution of that: faster, running a bit cooler, and both clock very well and are a lot of fun.
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x