AMD or Intel CPU ?

in Account & Technical Support

Posted by: Vespero.6180

Which is better: the Intel i5-4460 (Socket 1150, 3.2 GHz, 6 MB cache) or the AMD FX-8350 (8-core Vishera, 4.0 GHz)?
Both cost 150€, and I have a 200€ budget.

Posted by: Behellagh.1468

i5 hands down for this game.

We are heroes. This is what we do!

RIP City of Heroes

Posted by: Natsu Dragneel.1625

For GW2? Pick the Intel every time.

Posted by: Avelos.6798

Intel i5. I’d go for the 4690K for the overclocking headroom, too.

Posted by: vespers.1759

It’s not that simple. The 8350 will take a dump on the i5 in heavily threaded applications and games that actually use the cores (like BF4 and many newer titles). In the future, more games will take advantage of the greater number of cores and threads. The 8350 also overclocks like a beast, while this i5 cannot be overclocked (it’s not an unlocked CPU).

If ALL you care about is GW2, then get the i5, as GW2 will not make full use of the 8350. If you like to do other things on your computer (Photoshop, new games, etc.), then consider the 8350.

Personally I’d go with the 8350, simply because it is far more versatile and will be better in the long run as games and applications use the extra cores.

Bristleback can’t hit anything? Let’s fix the HP bug instead.

Posted by: Avelos.6798

I’d go with the Intel because it’s an overall better experience in everything, not just games. That has certainly been the case for me with the 4770K and the 8350. Don’t waste your time or money on the AMD, imo. They have their perks, but the price on some of them is misleading; they’re poorly advertised, or advertised as something they’re not.

Posted by: vespers.1759

> I’d go with the Intel because it’s an overall better experience in everything, not just games. That has certainly been the case for me with the 4770K and the 8350. Don’t waste your time or money on the AMD, imo. They have their perks, but the price on some of them is misleading; they’re poorly advertised, or advertised as something they’re not.

This post is one of the dumbest things I’ve ever seen. You’re seriously comparing a $180 CPU to a $340 CPU? Well, kitten, for almost twice the price of the 8350 I sure hope the 4770K is better.

When you compare AMD CPUs to Intel CPUs at the same price, the AMDs are the clear winner in value for money.

Bristleback can’t hit anything? Let’s fix the HP bug instead.

Posted by: dodgycookies.4562

For GW2, Intel, especially the unlocked versions. A high-clocked i3 will probably outdo an 8350 at this time.

For everything else it is debatable.

The i5-4460 is Intel’s offering at that price point. It has better or similar performance in most applications and only loses out in the few heavily multithreaded benchmarks; Intel’s stronger cores only start to lose ground when dealing with 6+ threads.

One problem is that the 8350 is pretty “old”. Even in an AMD-friendly game like BF4, the comparable Intel chip in terms of stock performance was the 2500K, which was three generations ago, and the current 4460 is faster than that old Sandy Bridge chip. Only when overclocking does the 8350 begin to offer real, tangible benefits over a current-gen budget i5.

The major problem with AMD’s value proposition is that software is designed around current hardware capabilities. When 2-4-thread Pentiums/i3s/i5s/APUs dominate the market, very few applications take the time to use more than 4 threads, as that is extra development time for something only a few hardware setups can take advantage of. So Intel’s bigger, more efficient cores are not better in some absolute technical sense; they are the right tool for the job as it exists.

Even worse for AMD, the programs that currently show tangible gains from 8 threads are usually expensive enterprise or content-creation packages that can also use more than 8 threads, in which case high-core-count Xeons are the preferred chip. Expensive software also usually comes with a larger hardware budget, and in those situations the minor value advantage of an overclocked 8350 no longer applies. Sure, the 8350 beats the 4460 in Cinebench, but no one who buys Cinema 4D for 3.5k is going to cheap out on either CPU; most will just go with an i7 or Xeon on a 2011 platform.

[ICoa] Blackgate


Posted by: Avelos.6798

>> I’d go with the Intel because it’s an overall better experience in everything, not just games. That has certainly been the case for me with the 4770K and the 8350. Don’t waste your time or money on the AMD, imo. They have their perks, but the price on some of them is misleading; they’re poorly advertised, or advertised as something they’re not.

> This post is one of the dumbest things I’ve ever seen. You’re seriously comparing a $180 CPU to a $340 CPU? Well, kitten, for almost twice the price of the 8350 I sure hope the 4770K is better.

> When you compare AMD CPUs to Intel CPUs at the same price, the AMDs are the clear winner in value for money.

When you’re looking at Guild Wars 2, the same-priced Intel is vastly superior to the AMD crap. I’d rather pay the higher price for something I know, from first-hand experience, will be better.

And let’s look at it a different way: compare the top-of-the-line AMD FX-8350 (not counting the FX-9 series, which is just an overclocked 8350 anyway) to the top-of-the-line mainstream Intel 4770K, and the Intel is better in every way that matters to a gamer.

Consider this, too: an Ivy Bridge i3 still slaps an FX-8350 around in Guild Wars 2, and much more so when it’s overclocked to around 4.2 GHz, like the one a buddy of mine has.

But if you prefer AMD on price alone, then feel free to live in the non-existent virtual reality where AMD is actually better than Intel, because that’s the only place it would be.

Posted by: TinkTinkPOOF.9201

>> I’d go with the Intel because it’s an overall better experience in everything, not just games. That has certainly been the case for me with the 4770K and the 8350. Don’t waste your time or money on the AMD, imo. They have their perks, but the price on some of them is misleading; they’re poorly advertised, or advertised as something they’re not.

> This post is one of the dumbest things I’ve ever seen. You’re seriously comparing a $180 CPU to a $340 CPU? Well, kitten, for almost twice the price of the 8350 I sure hope the 4770K is better.

> When you compare AMD CPUs to Intel CPUs at the same price, the AMDs are the clear winner in value for money.

In what regard are they the better value? Comparing the Intel and the AMD the OP is asking about, the Intel wins out everywhere, and big time in gaming, yet costs the same. The Intel (4460) beats out even AMD’s top-of-the-line $400 CPU, and that’s in AMD’s own baby of a game, BF4.

Something to ponder

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

Posted by: Behellagh.1468

Original source of Tink’s link. Please don’t crush this site.

http://iyd.kr/665

From a price/performance perspective they chose the i5-4590 over the 4460 because, if prices in the UK/EU are anything like the US or South Korea (where this blog post is from), the price increase is negligible (US $200 vs $190) but gives you a higher Turbo frequency (3.7 GHz vs 3.4 GHz), even though it’s only a 100 MHz difference in base clock.

The FX-8350 is fine if you are also planning to use your PC with apps that scale extensively with the number of cores; those apps tend to do noticeably better on the AMD processor. Software development, 3D rendering, video compression, etc. But when games are GPU bound, any middle-to-high-end quad core or better performs similarly. There are very few games, if any, where the FX’s eight slower cores beat four faster ones. And in a game that is CPU bound and only uses a few cores’ worth of performance, like GW2, Intel will crush AMD.

Of course OCing can alter these comparisons, but to be clear before we get linked to a Russian site extolling the virtues of massively overclocked FX CPUs: I’m talking standard clock rates here, and just gaming, not gaming plus streaming/recording.

We are heroes. This is what we do!

RIP City of Heroes


Posted by: TinkTinkPOOF.9201

> …The FX-8350 is fine if you are also planning to use your PC with apps that scale extensively with the number of cores…

This, so many times over.

And the apps that scale well with thread/core count are often business-class programs, frequently run on Xeon-based machines where an extra few hundred (or even few thousand) bucks doesn’t matter. Take the new G3258: I was going to pick one up from MC for an HTPC, as they had a combo deal on the CPU and a Z97 mobo for $100 (now OOS). Reviews of the chip are starting to roll out, and since it is unlocked, people seem to be hitting 4.2-4.8 GHz pretty easily. In games it keeps up with, and in some cases beats, the entire AMD line, all from a CPU that’s under 60 bucks, or 100 bucks for the CPU/mobo combo if you live near an MC. Pretty crazy for those looking at a budget gaming build.

One review here.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

Posted by: Brother Grimm.5176

> It’s not that simple. The 8350 will take a dump on the i5 in heavily threaded applications and games that actually use the cores (like BF4 and many newer titles). In the future, more games will take advantage of the greater number of cores and threads. The 8350 also overclocks like a beast, while this i5 cannot be overclocked (it’s not an unlocked CPU).

> If ALL you care about is GW2, then get the i5, as GW2 will not make full use of the 8350. If you like to do other things on your computer (Photoshop, new games, etc.), then consider the 8350.

> Personally I’d go with the 8350, simply because it is far more versatile and will be better in the long run as games and applications use the extra cores.

Wow…AMD needs you on their Marketing team…..or as a lawyer….

We go out in the world and take our chances
Fate is just the weight of circumstances
That’s the way that lady luck dances

Posted by: Jazhara Knightmage.4389

> For GW2, Intel, especially the unlocked versions. A high-clocked i3 will probably outdo an 8350 at this time.

Actually, as somebody who has tested this personally: my 8350, and even my 8120 @ 4.5 (an easy OC; I’ve even done it on the stock cooler), was faster than the best i3 and the low-end i5s I tested approximately 6 months ago, and since I know they haven’t rewritten the core problems with the game, I doubt that’s changed.

> For everything else it is debatable.

> The i5-4460 is Intel’s offering at that price point. It has better or similar performance in most applications and only loses out in the few heavily multithreaded benchmarks; Intel’s stronger cores only start to lose ground when dealing with 6+ threads.

> One problem is that the 8350 is pretty “old”. Even in an AMD-friendly game like BF4, the comparable Intel chip in terms of stock performance was the 2500K, which was three generations ago, and the current 4460 is faster than that old Sandy Bridge chip. Only when overclocking does the 8350 begin to offer real, tangible benefits over a current-gen budget i5.

AMD and Intel have both “hit a wall” performance-wise; neither is putting out the huge generational gains they once were. In fact, if you were a first-gen i7 adopter who got a 970/980 EE chip and a decent board, you really have no reason to upgrade to the latest i7 offerings, as the performance boost is pretty minor in real-world use.

> The major problem with AMD’s value proposition is that software is designed around current hardware capabilities. When 2-4-thread Pentiums/i3s/i5s/APUs dominate the market, very few applications take the time to use more than 4 threads, as that is extra development time for something only a few hardware setups can take advantage of. So Intel’s bigger, more efficient cores are not better in some absolute technical sense; they are the right tool for the job as it exists.

Funny, because I have a stack of games, including Elder Scrolls Online, that use all 8 cores of my chip effectively, something I’m saddened GW2 can’t do. Perhaps ANet just needs to nut up, buy the Hero Engine, and rebuild GW2 on it, since clearly they can’t be kitten’d to fix their own engine.

Decent programmers don’t write their programs/games for specific core counts; a properly threaded app can run its threads on a dual/tri/quad/hex/octa-core or higher chip without any special coding for each core count. I say this as somebody beta testing a few games (sadly I can’t say which, stupid NDAs) where the devs are working VERY hard to make their games scale to any core count offered, even testing on 16-core AMD workstation rigs.
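To make that concrete, here’s a minimal sketch of the pattern (my own toy example, assuming nothing about any specific game’s code): size the worker pool from std::thread::hardware_concurrency() and the exact same binary spreads its work across a dual core or a 16-core workstation, with zero per-core-count special cases.

```cpp
// build: g++ -O2 -std=c++14 -pthread pool.cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    // Ask the OS how many hardware threads exist; no hard-coded core counts.
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> data(10'000'000, 1.0);
    std::vector<double> partial(n, 0.0);   // one slot per worker, no contention
    std::vector<std::thread> workers;

    // Split one big job into n equal slices, one per hardware thread.
    std::size_t chunk = data.size() / n;
    for (unsigned i = 0; i < n; ++i) {
        std::size_t begin = i * chunk;
        std::size_t end = (i + 1 == n) ? data.size() : begin + chunk;
        workers.emplace_back([&, i, begin, end] {
            partial[i] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    std::cout << n << " threads, total = "
              << std::accumulate(partial.begin(), partial.end(), 0.0) << "\n";
}
```

The same source runs unchanged whether the machine reports 2, 4, 8, or 16 threads; only the slice count changes.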

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Jazhara Knightmage.4389

> Even worse for AMD, the programs that currently show tangible gains from 8 threads are usually expensive enterprise or content-creation packages that can also use more than 8 threads, in which case high-core-count Xeons are the preferred chip. Expensive software also usually comes with a larger hardware budget, and in those situations the minor value advantage of an overclocked 8350 no longer applies. Sure, the 8350 beats the 4460 in Cinebench, but no one who buys Cinema 4D for 3.5k is going to cheap out on either CPU; most will just go with an i7 or Xeon on a 2011 platform.

Funny. I recently helped set up (build and config) a few systems for video/audio editing and rendering, and they were quad-socket AMD rigs with 4×16-core chips (Opteron 6386 SE). Those chips only turbo up to 3.5 GHz, but they can do it on all cores (at least with the board we used).

The reason the company went AMD was that three of their software vendors actually recommended AMD over Intel for the workloads they process, where more cores is a bigger advantage than stronger per-clock cores. The systems also each have a Titan AND a 290X in them because, and this is the silly part, one of their software vendors refuses to optimize their OpenCL/DirectCompute paths because they are an NVIDIA partner; adding the NVIDIA card boosts performance drastically in that app AND lets it run at the same time as the others, leaving more CPU time for the stuff that must run on the CPU.

For this game, yes, Intel is better, within reason; for games like ESO there is no real advantage to either company’s product.

No, the 8350 isn’t a top-of-the-line chip, but neither is its Intel price equivalent, and the AMD gives you headroom to OC if you get a decent board. At the very least, 4.5 is an easy OC on the 8350; I’m at 4.8 on air and in all honesty could probably hit 5 on this cooler, but I don’t really see the point/need.

To the OP: if you’re only worried about this game, buy the Intel. If you want to be a bit more future-proof, go with the AMD option, as everybody is moving to support at least 8 cores in gaming; the PS4 and Xbox One both use 8-core AMD chips clocked far lower than desktop chips.
I would guess we won’t see any real disruptions in the CPU market any time soon, since software makers are having to learn, or re-learn, how to thread their games to run decently on multithreaded systems.
We also have things like Mantle taking hold, which should be exciting to anybody who really gets why it’s a big deal. Too bad ANet can’t give us a Mantle rendering path rather than just antiquated DX9.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Jazhara Knightmage.4389

>> …The FX-8350 is fine if you are also planning to use your PC with apps that scale extensively with the number of cores…

> This, so many times over.

> And the apps that scale well with thread/core count are often business-class programs, frequently run on Xeon-based machines where an extra few hundred (or even few thousand) bucks doesn’t matter. Take the new G3258: I was going to pick one up from MC for an HTPC, as they had a combo deal on the CPU and a Z97 mobo for $100 (now OOS). Reviews of the chip are starting to roll out, and since it is unlocked, people seem to be hitting 4.2-4.8 GHz pretty easily. In games it keeps up with, and in some cases beats, the entire AMD line, all from a CPU that’s under 60 bucks, or 100 bucks for the CPU/mobo combo if you live near an MC. Pretty crazy for those looking at a budget gaming build.

> One review here.

I actually tested one of these chips. They work great in games/apps that don’t use a lot of threads, but they start to choke when you run games that are properly threaded, sadly. If all you care about is GW2 and older games that don’t really thread much or at all, then this is the chip for you; it would be great in an emulator box, for example.

On the other hand, if you are looking to a future where game and engine devs are not just targeting dual- or quad-core systems but are in fact building to support the PS4 and Xbox One, then you will see the flaw in buying a dual core to run games and engines that are being optimized to get the most out of 8-core consoles.

Granted, the consoles’ CPUs are a lot slower per core than either of the Pentium’s cores; that doesn’t mean the Pentium can deal with the level of threading needed to avoid a bottleneck.

My view on this is, and will likely remain: you need to take into account whether you’re going to be running only GW2 and similarly old-school games (under the hood, that is, since GW2 is DX9 and only has 3 critical threads); if so, stick with the cheap Intel.

If you plan to play games like ESO, I suggest getting a chip with more cores, as these newer games actually take advantage of proper threading to give you improved performance.

It’s kind of a crap shoot either way, honestly. Some people on lower-mid-range AMD rigs get very good performance, same with Intel, and others with top-end Intel and AMD rigs clocked to the hilt, with top-end everything, still end up with unplayable FPS when they enter towns and the like. It’s been that way since launch, and till ANet decides to invest the money in fixing the threading and rendering issues (DX9 only in 2014… really?) you won’t see anybody getting amazing FPS game-wide. Not unless they manage to get a tri-core CPU to around 6-8 GHz…

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Jazhara Knightmage.4389

> Wow…AMD needs you on their Marketing team…..or as a lawyer….

I’m sure you will say the same to me. The funny thing is, I only ever hear this from people who haven’t used or set up a proper AMD rig and just live by the reviews and gossip on forums. It’s sad since, in reality, in most games, even GW2, AMD can work fine, and is definitely better than some of the old first-gen i3 or last-gen C2D/C2Q systems people try to tell users are better than any AMD rig they could get.

I recently set up a cheap gaming box for a buddy using an FM2+ APU. It was easy to set up and OC, and with a 270X in it, it plays GW2 just as well as his wife’s i7-3970X (she does video editing, and her company only buys systems from Dell, so she was limited in choices). We set him up a rig so he could play with her, but had to do it within a budget; the FM2+ option was the best price/performance for his needs and offers a better future upgrade path than an i3 would have (in fact he has a first-gen i3; dead-end sockets suck).

He’s happy. He still sees the game hit the teens in LA and in large groups of people, but so does she; shockingly it’s within 3 FPS, 3970X vs A10-7000 series, and the A10 was stupidly easy to OC: 4.8 GHz with no effort, using a cheap cooler (20 bucks shipped).

Thankfully we were able to rob the RAM out of his i3 (it was actually really good 1866 stuff that clocked to 2400 stable at 16 GB!). Either way, the system works great, was fairly cheap to build, and will have an upgrade path for a number of years to come, unlike Intel sockets that change every time Intel brings out a new generation of chips.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: dodgycookies.4562

Opteron is a good chip and can be better than its Xeon competitors. I did not mean all AMD chips are bad; the FX series was just not great. Specifically, I was talking about the non-existent use of the 8350 in truly multithreaded enterprise applications: people point out the multithreaded power of the 8350, which, although real, doesn’t matter at that price point or offer significant performance increases in realistic current usage scenarios.

When multithreaded applications hit the consumer software market, I am sure Intel will move to high-core-count CPUs as well. But as the latest Steam hardware survey suggests, quad cores are not outnumbering dual cores at the moment, and are dropping due to more laptop purchases (with 8-core machines sitting at 0.28%). Until that changes, not many devs will make games that require, or can use, quad core or higher.

Though some are developing scalable software, it is not quite there yet, and neither is the API work. By the time this new tech is fully implemented in games, both Intel and AMD will have whole new lines and architectures, so the argument is moot.

I have also tested AMD vs Intel for GW2: my super-old 8120 @ 4.4 vs the new G3258 @ 4.6, both running Titan Blacks. The AMD is about 20 FPS better (140 -> 160, meaningless) on empty maps with fewer than 5 NPCs and far view distances; the Intel is about 10 FPS better in heavy triple-queued 60v60v60 WvW zergs or full-server world bosses, going from an unplayable 12 FPS on the 8120 to a still bad, but no longer a slideshow, 24 FPS on the G3258.

Though all this is theoretical. In practice the OP was asking about the 4460 vs the 8350, and in GW2 and most other consumer-grade applications the Intel is the better buy at the same price.

Most of the people here recommending Intel for GW2 over AMD are doing so from personal experience, and most of us have had properly set up, decent rigs running both AMD and Intel. Overclocking and setup are about the same for both, with the Intels requiring less cooling thanks to the more efficient 22 nm chips. No one is saying AMD is bad overall; we are saying the AMD FX performs worse than, or the same as, the equivalent Intel chips, costs the same, and has more drawbacks with legacy games. So the Intels are “better” for now.

[ICoa] Blackgate


Posted by: tweeve.3782

I have an AMD 8350 in my main gaming system. I knew going in, though, that I wouldn’t get the performance of an Intel build. GW2 uses maybe 2-3 threads most of the time and is not optimized for a CPU designed for heavy multithreading. I do lots of video editing and converting, so the extra horsepower of the 8350’s extra cores is a big help to me. If GW2 were the main reason I was building a new system, I would go with an Intel chipset.

I also got the 8350 as a gamble, thinking that gaming will go more multithreaded in the future. If you look at the console market, both CPUs (PS4 and XB1) are not too different from the 8350, so I think the 8350 is fine in the long term. Just don’t expect massive frame rates from an 8350 in GW2. It can hold its own, but it won’t match the frame rates of my brother’s and friends’ Intel chips.

Posted by: Behellagh.1468

Looking at the current Steam survey, 94.46% of users have 2-4 physical cores (counting i3s as dual cores and i7s as quads). Ignoring the single-core systems, that leaves 2.14% with more than 4 cores. Now why would anyone optimize their game, or spend the effort to let it scale, beyond 4 cores on the PC? If you get that ability “for free” from licensing a game engine that someone else develops and maintains, that’s a different matter, but rolling your own 5 years ago?

Back then you had a main thread, a rendering thread, and a bunch of support/utility threads. The advantage of running on a quad is that the utility threads don’t eat into the CPU time of the two main threads. And since a thread doesn’t run exclusively on a particular core, the entire workload is spread across all cores, which is why you see activity on every core even though a couple of them may sit markedly higher than the rest.
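As a rough sketch of that layout (hypothetical; I’m assuming nothing about ANet’s actual code): a simulation thread hands frames to a render thread while a utility thread ticks along in the background, and the OS is free to migrate all three across cores, which is exactly why the load shows up on every core.

```cpp
// build: g++ -O2 -std=c++14 -pthread loop.cpp
#include <atomic>
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

std::queue<int> frames;            // hand-off from game logic to renderer
std::mutex m;
std::condition_variable cv;
std::atomic<bool> running{true};

void gameLogic() {                 // main/simulation thread: produces frames
    for (int f = 0; f < 300; ++f) {
        { std::lock_guard<std::mutex> lk(m); frames.push(f); }
        cv.notify_one();
        std::this_thread::sleep_for(std::chrono::milliseconds(3));
    }
    running = false;
    cv.notify_one();
}

void renderer() {                  // render thread: consumes frames
    for (;;) {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [] { return !frames.empty() || !running; });
        if (frames.empty() && !running) return;
        frames.pop();              // stand-in for actually drawing the frame
    }
}

void utility() {                   // support thread: asset streaming, audio, etc.
    while (running) std::this_thread::sleep_for(std::chrono::milliseconds(10));
}

int main() {
    std::thread t1(gameLogic), t2(renderer), t3(utility);
    t1.join(); t2.join(); t3.join();   // the scheduler decides which core runs what
}
```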

We are heroes. This is what we do!

RIP City of Heroes

Posted by: Jazhara Knightmage.4389

@Behellagh
http://en.wikipedia.org/wiki/PlayStation_4

http://en.wikipedia.org/wiki/Xbox_one

As to the threading issues, a lot of that will be addressed by using Mantle and eventually DX12/OpenGL’s next major revision; sadly those are years off, and DX12 will likely require you to buy a copy of Windows 9 if it follows MS’s usual pattern these days.

I know many programmers at various game studios, and app developers at a number of major companies. They are all working to support more cores, as they have ALL come to realize that’s where things are headed, since it’s easier to tack on more cores, or more partial cores, than it is to make chips run faster per clock or at higher clock speeds.

ZOS went with the Hero Engine and tacked other third-party tech onto it because they wanted an engine that could properly scale with today’s and future tech, from bottom to top end. It’s not as good as it could be, but I blame a lot of that on D3D/DX, which is honestly a huge bottleneck today. I have been beta testing titles that will likely launch with Mantle support, and it’s AMAZING how much more efficient it is; not even kidding a little, it’s insane. Removing the rendering bottleneck in one title lets you scale the physics effects to max on my system without any perf hit at all, and they haven’t even started work on a GPU-accelerated (DirectCompute or OpenCL) physics binary (they say they plan to, though).

It’s kind of crazy how, in reality, any half-decent system today SHOULD have zero trouble pushing any game out there to playable FPS, but can’t, because DX/D3D is holding them back AND developers are behind the curve.

Many old-time devs are having to re-learn their jobs. I know a few who admit it’s challenging and who in fact like to talk about the challenges. The one universal thing they all say is that they hope Mantle becomes truly cross-platform (yes, even TWIMTBP studios), because it would be a way to get everybody onto a very powerful, versatile cross-platform API. And in playing with Mantle they have found that where it REALLY shines is in helping lower-end laptop owners get the most out of their systems.

It also opens a lot of options for full-on desktop rigs, as it’s never a bad thing when you can remove a bottleneck and free up processing power for other things.

> Opteron is a good chip and can be better than its Xeon competitors. I did not mean all AMD chips are bad; the FX series was just not great. Specifically, I was talking about the non-existent use of the 8350 in truly multithreaded enterprise applications.

You do know that the Opterons are based on the exact same core design as the FX line, right?

The only real difference is the socket. There are new low-power Opterons based on Piledriver, but the 16-cores are all based on the FX-line core.

As to the perf delta between Intel and AMD, it is very workload-dependent. The mention of an 8120 @ 4.4 being slower than a cheap Intel is true in some titles, though I have put mine up against i5s and even i7s, and it doesn’t beat locked Intel chips outside a few select titles that LOVE threading.

I would never suggest an AMD chip to somebody who only plays GW2 and doesn’t do anything else. On the other hand, if the person plays GW2 and other games, or likes to multitask, AMD can be the better option.

One thing we found when testing was that even top-end Intel EE chips aren’t, on average, as good at live-streaming video while gaming as the 8-core AMD chips.

As to temps, I have clocked enough top-end Intel chips to say this: at the top end they all run hot as kitten, especially the chips that don’t have soldered cores but use cheap TIM under the lid. It doesn’t matter how well you cool those newer chips; you will hit a wall where that TIM just can’t move heat efficiently enough, and it even starts to degrade (turns to chalk).

And let me tell you, delidding an Ivy Bridge or later to replace the shoddy TIM Intel uses is a slow, time-consuming process, but it can have startling results. The first time we did it, using cheap ASC to replace the stock Intel TIM, we saw a 22C temp drop at stock. 22C… WTF… that’s INSANE!!!

http://forums.anandtech.com/showthread.php?t=2261855

Read that thread; it’s sad, but the 4770K suffers the same flaw:

http://www.overclock.net/t/1397672/deliding-a-4770k-haswell-improving-temperatures-and-maximizing-overclockablity

They are nice chips, but I question the longevity of these enthusiast-grade chips under even mild to moderate overclocking, mostly because, holy kitten, a 22C temp drop AT STOCK on the chip’s cores… WTF…

BTW, delidding is not for the faint of heart; you seriously risk wrecking your CPU. Do it only if you are willing to take the risk.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Behellagh.1468

Console development is immaterial because this game will never be targeted at consoles.

Those doing cross-development start with an engine that’s already supported on consoles and PCs and it’s the engine developer that worries over thread balancing issues.

The new consoles consist of what’s equivalent to the CPU portion of two Athlon 5350s (each a quad on its own). One is roughly 30% of the performance of an i5-4590K Intel quad core in multithreaded benchmarks. The primary advantage of consoles is that you can target their hardware exactly; no need to generalize for an unknown hardware platform. That’s also why the graphics API on them can be closer to the metal up front. No need to abstract.

When development started on this game, creating (well, modifying) their own game engine, all you had were dual and quad cores. Windows 7 wasn’t out yet, and DX10 wasn’t enough of a draw for gamers to target it, or to do a 64-bit port, because Vista wasn’t catching on.

The Haswell refresh, the “Devil’s Canyon” line of CPUs such as the i7-4790K, uses a new TIM to reduce heat build-up by a significant 10-15 degrees Celsius compared with the original Haswell CPUs. Broadwell will likely incorporate this new TIM as well.

Now you may believe that someday ANet will update the core graphics engine to support DX11/12 and ship a renderer that scales to however many cores you are sporting, but I don’t see that day coming soon. Or even two years down the line.

Back to what the OP was asking, I reiterate: there’s nothing wrong with choosing AMD if you play a wide range of GPU-dependent games or frequently use software that can scale to however many cores you have. But what matters in this game, right now, is per-core performance, and there Intel is the winner hands down.

We are heroes. This is what we do!

RIP City of Heroes

Posted by: Red Queen.7915

> (snip)

> I also got the 8350 as a gamble, thinking that gaming will go more multithreaded in the future. If you look at the console market, both CPUs (PS4 and XB1) are not too different from the 8350, so I think the 8350 is fine in the long term. Just don’t expect massive frame rates from an 8350 in GW2. It can hold its own, but it won’t match the frame rates of my brother’s and friends’ Intel chips.

I’m curious: when you say “it can hold its own”, exactly what does that mean? I’m facing a similar decision to the OP’s, and if “holding its own” means about 30 FPS, I’ll probably go for AMD. GW2 isn’t my only game (who would’ve thought…), so buying an Intel for GW2 alone is silly (not to mention stupidly expensive), and if the 8350 can run it better than the 8120 I have now, that’s fine by me. I don’t need 60+ FPS, I need a working game.

PSA: The amount of small felines serves as an indicator for just how angry I am at something.

Kaerleikur @ Elonaspitze

Posted by: Avelos.6798

The performance you’re seeing with the FX-8120, versus the increase you’d see with the 8350, would still be in the laughable department. I used an 8350 from when I bought the game on May 5th until around January 2014. At first I thought it was great, but I quickly picked up that the 8350 was not performing as well as I had been led to believe by AMD’s HORRIBLE marketing for the FX line.

Defiance, another MMO, is a huge example of CPU dependency. I started that game with a Phenom II quad-core 975 BE, and for the most part it ran fairly well. I changed to the FX-8350 and the performance was substantially worse. Then, after a while, I tried it on my new 4770K, and there’s not a single stutter anywhere, ever.

When I used the 8350 for Guild Wars 2, the best I could get, just for lols, turning all settings to low and finding a spot with hardly anything visually or CPU-heavy going on, was 180 FPS; on the 4770K I broke 320 doing the same thing. In WvW I’ve seen the FPS on the 8350 drop as low as half a frame per second. Yes, the counter actually read ZERO. That was back when it was Sanctum of Rall, Jade Quarry, and Blackgate. It was far from spectacular; it was literally a slideshow for me.

I don’t do WvW anymore, but the largest player event I’ve been in since would be the Tequatl fights. There, the lowest FPS I’ve seen on the AMD was 2; on the Intel I think I stayed around 20 (I play at 5760×1080 at max settings; I turned off shadows and reflections for that fight, but otherwise have all the eye candy on, since for what I do there isn’t really anything that can drop it now).

I think you’d be better off buying something like an Intel i5-4690K and a new board. It costs more than just buying an FX-8350, yes, but it’ll still perform better in games, and in Guild Wars 2 especially. Not to mention the thing where some games are ‘optimized’ for AMD FX: I still got better performance on them with an Intel.

Posted by: sobe.4157

Oh geez, the AMD fans are out in full force to defend what even AMD admits is not what they hoped for. The 8350 and the rest of the FX chips are simply slower than their i5/i7 counterparts in nearly every respect. That said, the 8350 is quite a chip with some oomph behind it, but because the architecture is not true 8-core, the module setup simply did not hit the market the way AMD hoped. Even in multithreaded environments Intel tends to pull ahead.

Now, that said, according to some recent news AMD will be aiming at the enthusiast crowd in 2016. Intel has been leading since Nehalem and has grown somewhat stagnant. My hope is that AMD pulls what they did back in the Athlon XP days and puts serious pressure on Intel while keeping their well-known lower prices. I want to see the underdog pull the bag of tricks back out and get one up on Intel for slouching.

That out of the way: if you want the best experience in your games, ESPECIALLY this one, go Intel. It’s 100% stupid to look at anything else if you’re buying new and this game is on your list.

Oh, and before the bashing: I am a fan of the underdog known as AMD. I WANT AMD to put a death grip on Intel; more competition means better results for consumers, regardless of which brand you go with.

3770k 4.9ghz | Koolance 380i | NexXxoS XT45 | XSPC D5 Photon | ASUS MVFormula |
Mushkin Black 16gb 1600 | 500GB Samsung 840 Evo |2×2TB CavBlack| GALAX 980 SoC |
NZXT Switch 810 | Corsair HX850 | WooAudio WA7 Fireflies | Beyerdynamic T90

Posted by: loseridoit.2756

> Oh geez, the AMD fans are out in full force to defend what even AMD admits is not what they hoped for. The 8350 and the rest of the FX chips are simply slower than their i5/i7 counterparts in nearly every respect. That said, the 8350 is quite a chip with some oomph behind it, but because the architecture is not true 8-core, the module setup simply did not hit the market the way AMD hoped. Even in multithreaded environments Intel tends to pull ahead.

Dude, CPUs have long product cycles; I’ve heard it takes 5 years. AMD invested heavily in the CMT architecture. You don’t have to write a long rant about AMD’s Hail Mary.

Posted by: sobe.4157

Dude…. You’re getting a Dell. But a majority of us already know… some people on these forums just don’t quite grasp it.

3770k 4.9ghz | Koolance 380i | NexXxoS XT45 | XSPC D5 Photon | ASUS MVFormula |
Mushkin Black 16gb 1600 | 500GB Samsung 840 Evo |2×2TB CavBlack| GALAX 980 SoC |
NZXT Switch 810 | Corsair HX850 | WooAudio WA7 Fireflies | Beyerdynamic T90

Posted by: loseridoit.2756

> Dude…. You’re getting a Dell. But a majority of us already know… some people on these forums just don’t quite grasp it.

Dell still makes decent business laptops…

There is nothing wrong with buying parts from a company that makes a decent product…

Posted by: ikereid.4637

AM3/AM3+ is done; AMD is focusing on the AM1 and FM2+ part lines now, and nothing in those lines can hold a candle to any of Intel’s Haswell+ lines.

So why are we even having this discussion?

Piledriver’s per-core performance is 45%-50% that of current Haswell cores. DX9’s rendering thread is single-threaded, and any application whose core functions are single-threaded is going to see BETTER performance on a CPU with stronger cores.

FX-8350 vs G3220, both with an R9 260X and 8 GB of RAM.

FX-8350, max FPS on medium settings:
65-72 FPS in the open world
42-48 FPS in LA
55-60 FPS in a random dungeon (CoF Path 1)

G3220:
90-123 FPS in the open world
65-92 FPS in LA
73-97 FPS in a random dungeon (CoF Path 1)

And those are FPS stats taken from systems that I own. An Intel Haswell dual core beats an FX-8350 when it comes down to single-threaded performance. Of course, the dual core will get stomped if the FX-8350 gets to use all 8 of its ‘cores’, even with the FX line being garbage in how the second set of cores on the die was designed.
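The arithmetic behind this is just Amdahl’s law (my gloss on the trade-off, with illustrative numbers, not measurements): if only a fraction p of the frame work can be spread over n cores, the best-case speedup is

```latex
S(n) = \frac{1}{(1 - p) + p/n}
```

With, say, p = 0.3 (70% of the frame stuck on one thread), eight slow cores top out at S(8) = 1/(0.7 + 0.3/8) ≈ 1.36, while a core with twice the single-thread speed speeds up the serial part, and everything else, directly. That is why the stronger-core chip wins in a game like this.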

And nothing you AMD fans can say will change these facts.

The performance line of AMD is dead right now; we are all waiting on AMD’s revival (if it ever happens) with an ‘AM4’ series CPU. We want to see DDR4 support, Thunderbolt, native USB 3.0, SAS connections, and a better SoC on the CPU, all in a performance package that competes with Intel’s current offerings.

Desktop: 4790k@4.6ghz-1.25v, AMD 295×2, 32GB 1866CL10 RAM, 850Evo 500GB SSD
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD

Posted by: sobe.4157

Sirsquishy hit the nail on the head. Until 2016, we can’t expect much.

As for the Dell comment, don’t be so sheltered; it was an old commercial, and I was making a play on your use of “dude”. Calm down…

I kind of wish squishy’s post could be stickied to answer all the AMD-or-Intel threads that pop up.

3770k 4.9ghz | Koolance 380i | NexXxoS XT45 | XSPC D5 Photon | ASUS MVFormula |
Mushkin Black 16gb 1600 | 500GB Samsung 840 Evo |2×2TB CavBlack| GALAX 980 SoC |
NZXT Switch 810 | Corsair HX850 | WooAudio WA7 Fireflies | Beyerdynamic T90

Posted by: Behellagh.1468

It all goes back to how AMD was going to match the multithreaded performance of Intel’s i7. The Phenom II, the fixed K10 architecture, wasn’t cutting it: the Phenom II 955 was still 25% slower than the i7-950 (both mid-2009), and the 965 was 15% slower than the more affordable i7-860 a few months later.

So they tried a two-pronged attack, one short-term, one long-term. For the short term they tried the hex-core, turbo-boosted Phenom IIs. That was a good stopgap right up until Intel released the Sandy Bridge i7-2600; AMD was once again behind, and their only way to compete was on price/performance, so they axed the prices of their CPUs.

For the long term they were developing the Bulldozer architecture, what we call the FX series. Their goal was to beat the performance of an Intel Nehalem (i7-9xx) HT core.

The way Intel’s Hyper-Threading (HT) works is that Intel’s integer core is so overbuilt, with duplicated internal functionality, that they could raise overall efficiency by assigning two threads to a single core. Most of the time they can squeeze an additional 10% of performance out of each core.

Whether or not it was due to patents and the like, AMD took a different tack. They designed what they call a “module”, containing two simpler, lower-performing integer cores, one per thread, along with a single shared floating-point core. That is no different from an Intel HT core running two threads that share its FPU.

When the FX-8150 came out, it could match the i7-975. But Intel had just released the i7-2700K, so that celebration was short-lived. So AMD took a second crack at the architecture, tweaked some of the internals of each core, and came out with Piledriver. It had a higher clock rate and a slight performance-per-cycle improvement, and now the FX-8350 could match the i7-2600K in multithreaded benchmarks. But Intel had had the i7-3770K out for almost six months at that point, which was again 10-15% faster.

While AMD was fixated on multithreaded benchmarks, their design had one glaring problem: when running an application that didn’t stress the CPU with 8 threads, the FX architecture was seriously inferior. When an Intel core isn’t running two threads, the single thread runs roughly 80% faster than either of the two threads did, while an FX module’s per-thread performance only goes up 25%. Starting with the Sandy Bridge cores, Intel under low thread loads ran 50% faster than an FX, and after multiple additional Intel generations that gap has widened even further.
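For what it’s worth, the 10% and 80% figures are consistent with each other; spelling out the arithmetic (my reading of the numbers above, not an official spec):

```latex
\underbrace{2 \times 0.55 = 1.10}_{\text{two threads sharing a core: } +10\%\ \text{total throughput}}
\qquad
\underbrace{\tfrac{1.00}{0.55} \approx 1.8}_{\text{one thread alone: } \approx 80\%\ \text{faster per thread}}
```

That is, HT raises total throughput by about 10%, while each co-resident thread runs at only ~55% of the speed it would get with the core to itself.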

Of course, if you are running a game built on a modern FPS engine and cranking the eye candy and resolution up, GPU load overwhelms CPU load, and CPU performance isn’t a limiting factor beyond dual vs quad and the CPU’s age/era.

But in a game where the GPU workload isn’t the limiting factor, you notice CPU performance, and you find that in spades in GW2: a game engine started 7 years ago, based on an even older in-house engine, and not necessarily designed to maximize framerate so it could be used as a video-card benchmark, but to provide a “good enough” framerate for a character jogging along on foot.

The problem appears to be that the additional work of rendering the actions of the players around you gums up the renderer, leading to an overall lower framerate.

Edit: Kitten filter problem.

We are heroes. This is what we do!

RIP City of Heroes


Posted by: Jazhara Knightmage.4389

> Console development is immaterial because this game will never be targeted at consoles.

Funny, since they planned to target the 360 and only failed to do so because the 360’s CPU couldn’t deal with the load and MS was not at all friendly to the MMO update process.

> Those doing cross-development start with an engine that’s already supported on consoles and PCs, and it’s the engine developer that worries over thread-balancing issues.

Spoken like somebody who really doesn’t deal with many developers and hasn’t paid much attention to GW2’s design. It’s clear to anybody who bothers to pay attention that GW2’s design is VERY much built for gamepad users; the limited skill slots and the way the UI works are blatant examples. Less easy to spot is the fact that the devs have said more than once that the game uses 3 critical threads, and if any one of those threads stalls for even an instant (too quick for Task Manager to show it), the whole game’s performance drops like a rock.

The fact is, they had already planned a console port; it didn’t work out because the 360’s CPU couldn’t hack it.

> The primary advantage of consoles is that you can target their hardware exactly. No need to generalize for an unknown hardware platform. That’s also why the graphics API on them can be closer to the metal up front. No need to abstract.

Actually, they can do this on PC now using Mantle, and will be able to use DX12, or the next major version of OpenGL, to the same effect. The current DX/D3D/OGL are high-level APIs that cause no end of headaches for developers trying to get the most out of modern hardware.

> When development started on this game, creating (well, modifying) their own game engine, all you had were dual and quad cores. Windows 7 wasn’t out yet, and DX10 wasn’t enough of a draw for gamers to target it, or to do a 64-bit port, because Vista wasn’t catching on.

The choice of DX9 only was directly related to the fact that the 360 hardware was DX9, and that pretty much any system at launch could run DX9, even systems that couldn’t actually run the game in any meaningful way (go back and look at the complaints from people well over min spec who still couldn’t actually play).

> Now you may believe that someday ANet will update the core graphics engine … or even two years down the line.

The devs promised they would add DX11 support, but I’m sure it’s not their fault they haven’t been allowed to, since they need to spend all that time on stuff that keeps the money flowing through the cash shop…

It would actually be smarter for ANet to put a small team on creating a Mantle renderer, since DX12 is said to be so similar to Mantle that most of the code could be reused with only minor changes.

Mantle removes the API bottleneck, allowing them to do more with less.

I suggest you read up on it, and on why DX12/OpenGL (the next big release) will be copying what Mantle is doing. Rumor is that Intel is considering adding Mantle to their video drivers once AMD releases the devkit/specs, since it would let their iGPUs play some modern games halfway decently.

A lot of the devs I talk to are working on patching Mantle support into even older games over the next couple of years. I’m beta testing a very large content expansion for one game (sorry, can’t say which) that already has a decent Mantle renderer out, faster than its already highly optimized DX9/10/11 paths. The one thing the devs keep saying when we talk about it is that Mantle is so easy to code for compared to D3D/OGL that a barely optimized Mantle renderer beats their extremely optimized D3D9/10/11 renderers in every test on the same hardware.

I think it would be worth a small outlay up front to get this added, if only as a selling point; they could even work out a promotion bundling a basic copy of GW2 with some of AMD’s higher-end video cards.

Again, I get where you are coming from, but I also work closely with some devs, and have for decades now, and I don’t know a single one who isn’t excited about moving off DX11.x/OGL to less restrictive options that let them break out of the threading bottleneck.

It’s not all about the renderer, from what I remember (it’s been a while since I read the interviews around launch where they talked about these issues). It all comes down to some very poor choices made early on that would require a major overhaul of the engine to fix, BUT that are quite doable, and could lead to an Xbox One/PS4 port such as the ones in the works for ESO.

If I could just take the good parts of GW2 and ESO and mix them, replacing the bad parts, we could have a nearly perfect game.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Behellagh.1468

And you don’t sound like someone who actually does development.

We are heroes. This is what we do!

RIP City of Heroes

Posted by: Jazhara Knightmage.4389

> Piledriver’s per-core performance is 45%-50% that of current Haswell cores.

This is very subjective. For some benches it’s very true, for others not so much. A lot of the issue comes down to compiler optimization and to finding ways around the dirty tricks in Intel’s compilers that prevent, for example, FMA optimization from working on AMD chips. Long story short, depending on what was used to compile the software you are benchmarking and what optimization was done, the number can be as high as 50%, but it can also be FAR, FAR lower (I have a couple of apps in beta testing right now where the difference between a 4770K and an 8350, both overclocked to 4.5, is around 5%).
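A concrete sketch of the kind of codegen difference I mean (a made-up multiply-add kernel, not any specific benchmark): whether this loop becomes FMA instructions depends entirely on how the binary was built, e.g. GCC’s -O2 -mfma -ffp-contract=fast, so the same source can score very differently on a chip the benchmark binary wasn’t tuned for.

```cpp
// build A: g++ -O2 kernel.cpp                            (plain mul + add)
// build B: g++ -O2 -mfma -ffp-contract=fast kernel.cpp   (may fuse to FMA)
#include <cstddef>
#include <iostream>
#include <vector>

double saxpy_sum(const std::vector<double>& x,
                 const std::vector<double>& y, double a) {
    double acc = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
        acc += a * x[i] + y[i];    // the multiply-add the compiler may fuse
    return acc;
}

int main() {
    std::vector<double> x(1'000'000, 1.5), y(1'000'000, 0.5);
    std::cout << saxpy_sum(x, y, 2.0) << "\n";   // 1e6 * (2*1.5 + 0.5) = 3.5e6
}
```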

> DX9’s rendering thread is single-threaded, and any application whose core functions are single-threaded is going to see BETTER performance on a CPU with stronger cores.

Why do people talk out of their posteriors?
http://amd-dev.wpengine.netdna-cdn.com/wordpress/media/2012/10/S2008-Scheib-ParallelRenderingSiggraph.pdf

DX9 can do multithreaded rendering. The biggest performance flaw in DX9 comes down to one major issue: everything in video RAM must also be mirrored in system RAM. Add to that the fact that DX10/11 both make rendering curved lines/circles more efficient, and there really was no reason to launch a game in 2012 that was DX9-only, other than having already put all their eggs into the dream of a 360 port and not having time to build a DX11 renderer (mind you, a proper DX11 renderer can do a feature check on the hardware and disable, or software-emulate, the features that DX9/10/10.1 cards are missing).

> FX-8350 vs G3220, both with an R9 260X and 8 GB of RAM.
>
> FX-8350, max FPS on medium settings:
> 65-72 FPS in the open world
> 42-48 FPS in LA
> 55-60 FPS in a random dungeon (CoF Path 1)
>
> G3220:
> 90-123 FPS in the open world
> 65-92 FPS in LA
> 73-97 FPS in a random dungeon (CoF Path 1)
>
> And those are FPS stats taken from systems that I own. An Intel Haswell dual core beats an FX-8350 when it comes down to single-threaded performance. Of course, the dual core will get stomped if the FX-8350 gets to use all 8 of its ‘cores’, even with the FX line being garbage in how the second set of cores on the die was designed.

I have seen top-end Intel systems get the same FPS range as your 8350 example, or worse, and you can find examples of that here on the forums if you dig back a bit. The game is just REALLY poorly optimized, and ANet management doesn’t seem to want the devs to actually fix it. Makes me sad, since if they would fix it, I would gladly buy some gems…

> And nothing you AMD fans can say will change these facts.

Again, it depends on your workload. For me, I sold my i7 rig and kept my 8350 rig, because for the stuff I do most often, my 8350 is faster.

> The performance line of AMD is dead right now; we are all waiting on AMD’s revival (if it ever happens) with an ‘AM4’ series CPU. We want to see DDR4 support, Thunderbolt, native USB 3.0, SAS connections, and a better SoC on the CPU, all in a performance package that competes with Intel’s current offerings.

I would honestly rather see them skip DDR4 and move to DDR5. As to Thunderbolt, I have zero interest, even less than I had in FireWire. USB 3.0 native or via a third-party chip doesn’t really matter; sure, it would be nice to have all the ports be 3.0, but then again, how many 3.0 devices do most people really own?

SAS connections: I hope you mean SFF-8087 connectors like my hardware RAID card has. That would be great!!!

BTW, look at FM2/FM2+; they already have native USB 3.0. AMD just didn’t bother to back-port the chipsets because, honestly, their current head was planning to fully abandon the CPU market and move everything to APUs. I think they are rethinking that now, since even though they have pretty much abandoned AM3+, they are still selling a lot of chips…

Anyway, I wish everybody would move to SFF-8087 connectors on motherboards; it would be a godsend to me (I love my BR10i: cheap, and it gave me 8 more SAS/SATA ports).

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

Posted by: Jazhara Knightmage.4389

> For the long term they were developing the Bulldozer architecture, what we call the FX series. Their goal was to beat the performance of an Intel Nehalem (i7-9xx) HT core.

The ’dozer idea had been in the works a long time. Rumor has it that it was effectively restarted at least 4 times as various people decided they didn’t like this or that and wanted to reinvent the wheel, a common problem in large corporations. It’s also why we never saw the K9 hit the market (yes, a K9 was in the works; due to internal disagreements they lost some engineers, and the K9 got kitten-canned…).

> The way Intel’s Hyper-Threading (HT) works is that Intel’s integer core is so overbuilt, with duplicated internal functionality, that they could raise overall efficiency by assigning two threads to a single core. Most of the time they can squeeze an additional 10% of performance out of each core.

It depends; in some cases you can take a pretty nasty hit with HT enabled, too. For example, when encoding multiple audio files at once using Ogg Vorbis or even LAME, I can encode 8 tracks at a time without a perf hit (tested vs 4, with 1 thread per module); my scaling is linear in this case. In fact, at the moment I’m encoding audio from FLAC to MP3 with LAME at around 341x encode speed across 8 cores (and still able to watch videos and post on forums, etc.).
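Batch encoding is the textbook embarrassingly parallel workload, which is why it scales almost linearly: each track is independent, so the tasks share no state. A minimal sketch, with encodeOneFile as a made-up stand-in (not LAME’s or Vorbis’s actual API):

```cpp
// build: g++ -O2 -std=c++14 -pthread encode.cpp
#include <future>
#include <iostream>
#include <string>
#include <vector>

// Stand-in for a real per-track encoder call; only the independence matters.
bool encodeOneFile(const std::string& path) {
    volatile double sink = 0;                  // fake CPU-bound work
    for (int i = 0; i < 20'000'000; ++i) sink += i * 0.5;
    return !path.empty();
}

int main() {
    std::vector<std::string> tracks = {"01.flac", "02.flac", "03.flac", "04.flac",
                                       "05.flac", "06.flac", "07.flac", "08.flac"};
    std::vector<std::future<bool>> jobs;
    for (const auto& t : tracks)               // one async task per track
        jobs.push_back(std::async(std::launch::async, encodeOneFile, t));

    int ok = 0;
    for (auto& j : jobs) ok += j.get();
    std::cout << ok << "/" << tracks.size() << " tracks encoded\n";
}
```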

Also keep in mind that HT is a stopgap originally used by Intel to make up for their poor design choices in the P4, and it was not in the Core 2 line of processors. They had to do a lot of work to get it working efficiently in the i5/i7 lineups, rather than causing a performance hit. This isn’t an insult to Intel, it’s just fact: HT has its place, but it can be as harmful as it is helpful at times.

> Whether or not it was due to patents and the like, AMD took a different tack. They designed what they call a “module”, containing two simpler, lower-performing integer cores, one per thread, along with a single shared floating-point core. That is no different from an Intel HT core running two threads that share its FPU.

actually, if you really wanted to understand why the FX line have some perf issues, you could read the half dozen or so analysis and break downs of why they have issues, its not because the basick idea is flawed or bad, it comes down to execution and a few poor design choices, likely by people who really didnt understand wtf they where doing (many times management think things that are a very bad idea are great ideas….)

They have one of the industry's top CPU/SoC guys working for them now (even Intel and Apple say he's one of the world's best); Steamroller was just a tweak on the current design.

I personally do not expect Intel or AMD to bring anything amazing out in the CPU market any time soon; they are both working very hard to find ways to break through the brick wall they have hit. AMD is more focused on multi-threading than Intel is, though as I hear it from friends who work for Intel, they have been looking at designs closer to AMD's, and even going farther, like what SPARC chips do (lots of weaker cores working together).

What I see coming eventually is what most of the industry has been saying was coming for a decade or so now... everybody is having to learn to multi-thread, and everybody is having to figure out newer, better, more efficient ways to use more threads to get work done, rather than heavy single threads.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

AMD or Intel CPU ?

in Account & Technical Support

Posted by: Jazhara Knightmage.4389

Jazhara Knightmage.4389

Now while AMD was so fixated on multithreaded benchmarks, their design had one glaring problem. When you were running an application that didn't use 8 threads stressing the CPU to its maximum performance, the FX architecture was seriously inferior.

Yup, and how many apps/games written today are being targeted at single-threaded perf, ignoring multi-threading?

Because when an Intel core isn’t running two threads, the single thread now runs roughly 80% faster than when two threads were running.

Doesn't seem like your math adds up; above you say a 10% gain from HT, but then say that an Intel core running 1 thread rather than 2 is 80% faster...

Now of course if you are running a game built around a modern FPS game engine, cranking the pretty and the resolution up, then GPU performance overwhelms CPU performance, and in that case the CPU isn’t a limiting factor, beyond dual vs. quad and higher, and CPU age/era.

A lot of this comes down to the nature of DX and OGL, and the trouble they give developers in making effective use of all those cores/threads they could be using. You should read some of the complaining that the devs I work closely with post about how MS even now really isn't listening to developers (or they would have DX12 out by now... for Windows Vista/7/8...).
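The funnel is easy to picture: any number of threads can build work, but with DX9-era APIs all the actual submission has to happen on one thread. A toy sketch of that shape (purely illustrative, not how any particular engine is written):

```python
# Many producers, one consumer: draw calls queue up from several game
# threads but are submitted by a single render thread -- the choke point.
import queue
import threading

draw_queue = queue.Queue()

def game_thread(n):
    # several of these run in parallel, each producing draw work
    for i in range(3):
        draw_queue.put(f"draw call {i} from worker {n}")
    draw_queue.put(None)  # sentinel: this worker is done

def render_thread(n_workers):
    # exactly one of these: every draw call funnels through here
    done = 0
    while done < n_workers:
        cmd = draw_queue.get()
        if cmd is None:
            done += 1
        # else: this is where the single-threaded API submission would go

workers = [threading.Thread(target=game_thread, args=(n,)) for n in range(4)]
renderer = threading.Thread(target=render_thread, args=(len(workers),))
for t in workers:
    t.start()
renderer.start()
for t in workers:
    t.join()
renderer.join()
print("all draw calls passed through one render thread")
```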

But in a game where the GPU workload isn’t the limiting factor, then you notice CPU performance. And you find that in spades with GW2: a game engine started 7 years ago, based on an even older engine, both designed in-house. Not necessarily designed to maximize framerate so it could be used as a benchmark for video cards, but to provide a “good enough” framerate for a character on foot jogging.

Gotta love the white knighting for Anet here... It's funny, because you don't seem to know that WoW's engine is even older than GW2's engine, yet it's been updated and optimized to make use of multiple cores (not limited to a set number, even!), and DX10/11 support has been added. Little FYI: the WoW engine is based on the Warcraft 3 engine.
Guild Wars: April 26, 2005
WoW: November 23, 2004

Looks to me like the excuse for not updating the renderer or engine core itself to fix perf issues and make efficient use of modern systems isn't really grounded in fact/reality, but in feelz...

The problem appears to be that the additional work to render the actions of fellow players around you gums up the renderer, leading to an overall slower framerate.

It's always shocked me how UE3 MMOs perform so much worse than their UE2.5x counterparts that, in many cases, look just as good... I have been told this is down to the fact that UE3 was optimized for consoles and thus doesn't have as efficient a system for multi-threading... kind of a WTF if you ask me... I only mention this because even Tera runs better for me than GW2, and it's also a very CPU-heavy game... (thanks in large part to the shoddy copy of SolidWorks that BH used creating the UI for the game).

What I would honestly just like to see is Anet accept and admit the game needs fixing, and start work on a patch to address the shortcomings.

BTW, you are welcome to look it up, but the devs more than once promised DX10 and even DX11 support at launch, then said it would be added after launch... but alas, it hasn't happened, and it probably won't until some Anet manager decides it's time to work on a port for modern consoles.

AMD FX-8350@4.8ghz on air(SilverArrowSB-E Extreme) , 32gb 1866mhz(10-11-10 cr1)
PCP&C 1200watt TC, Crosshair V F-Z, Sapphire 290x

AMD or Intel CPU ?

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

Since I know people who worked on Bulldozer at the time, I think I might have a little more insight into its development than you do. And yes, I've read those analyses. But I'm not going to bullet-point them for an audience who haven't taken a course on CPU design theory. It's hard enough simplifying how multithreading works on modern OSes for a crowd that may still believe you can compare raw clock speeds (ignoring IPC), or that "more cores = better" at all things.

You are right, it does take years laying out the groundwork for a new CPU architecture. Refining it takes a lot less time. Feature reduction buys you more than lower power and higher clock rates; it gives you the space to fix glaring problems, like the L1 instruction cache in an FX module, for one.

And yes, depending on the data flow and instruction mix, HT can have a negative impact on performance. Then again, it can give a 45% boost in certain operations, which is why I conservatively stated "most times they can squeeze an additional 10%" rather than stating the high outlier.
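And to head off the "math doesn't add up" point above: the 10% and the 80% measure different things. The 10% is the total throughput gained by running two threads on one core; the 80% is how much faster a single thread runs once it has the core to itself. A quick check of the arithmetic:

```python
# Reconciling the two figures: +10% *total* throughput from 2 HT threads
# means each thread runs at ~55% speed; alone, a thread gets 100%.
two_thread_total = 1.10                # combined throughput vs. 1.0 for a lone thread
per_thread = two_thread_total / 2      # ~0.55x each while sharing the core
alone_vs_shared = 1.0 / per_thread     # lone thread vs. one of the two
print(f"lone thread: {alone_vs_shared:.2f}x (~{alone_vs_shared - 1:.0%} faster)")
# -> lone thread: 1.82x (~82% faster), i.e. the "roughly 80%" above
```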

Intel’s big leap in performance (IPC doubled) was when they went from the Netburst architecture (Pentium 4/D) to the Core architecture, which evolved from the Pentium III. But since then, Intel has gained roughly 10% performance per generation.

The same can be said of AMD. The K10 (Athlon II/Phenom II) cores were only 25% faster than the K8 and had an IPC similar to the 1st gen of the Core 2; Intel was just able to cycle refinements faster. Between the 1st gen of the Core 2 and Sandy Bridge, Intel was able to improve IPC by 35%, while AMD, as far as customers saw, stood still with the K10. And since Sandy Bridge, Intel has boosted IPC roughly 10% with Ivy Bridge and another 10% with Haswell.
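Compound those rough figures and you get a feel for the cumulative gap (using the approximate percentages above, nothing more precise):

```python
# Compounding the rough per-generation IPC figures quoted above
gains = {"Core 2 -> Sandy Bridge": 1.35,
         "Sandy Bridge -> Ivy Bridge": 1.10,
         "Ivy Bridge -> Haswell": 1.10}
total = 1.0
for step, factor in gains.items():
    total *= factor
    print(f"{step}: x{factor:.2f} (cumulative x{total:.2f})")
# -> cumulative ~x1.63 IPC from 1st-gen Core 2 to Haswell
```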

Then AMD came out with Bulldozer, which had a worse IPC than the K10; its saving grace was that it could run at a higher clock speed (eventually, with Piledriver). Show me an independent article that didn't think the FX-8150 was a huge disappointment.

So we aren’t going to see another major leap in IPC from Intel like the one from the Pentium D to the Core 2. But after 6 generations of small iterative improvements, not counting clock-speed gains from manufacturing improvements, they don’t need to.

I am rooting for AMD. I want the competition. We may not have gotten the Core 2 if it wasn't for AMD owning Intel during the first half of the '00s with the K8 (Athlon 64).

And while it will take time for programmers to become versed in proper multithreaded design, not all applications can scale linearly with multithreading. Games will always be limited by the time it takes to load resources off mass storage and by pushing all draws through a single interface (it's the graphics driver that handles multiple GPUs, not the root application).

And as interesting as it is to pontificate on a scalable rendering engine supporting DX11/12/Mantle and a 64-bit native mode, that has nothing to do with the state of the game today. The game today rarely goes beyond 3 cores of workload (I'm not saying it only uses 3 cores or has only 3 threads; I'm saying its total workload rarely gets above 75% on a quad core, 50% on a hex core, or 38% on an octo-core). And it's because of that, and the performance tradeoff between more, slower cores vs. fewer, faster cores, that AMD isn't a good choice for this game over Intel.
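Those percentages are nothing more mysterious than ~3 cores of work divided by the core count:

```python
# "~3 cores of workload" expressed as total CPU utilization per core count
cores_of_work = 3.0
for n_cores in (4, 6, 8):
    print(f"{n_cores} cores: {cores_of_work / n_cores:.0%} total utilization")
# -> 4 cores: 75%, 6 cores: 50%, 8 cores: 38%
```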

We are heroes. This is what we do!

RIP City of Heroes

AMD or Intel CPU ?

in Account & Technical Support

Posted by: living.9361

living.9361

Intel all the way!!! my i5 kicks molasses

AMD or Intel CPU ?

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

All I see is literally walls of text.

AMD or Intel CPU ?

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

At least I use paragraphs. Better than some walls of text.

We are heroes. This is what we do!

RIP City of Heroes

AMD or Intel CPU ?

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

At least I use paragraphs. Better than some walls of text.

This is true.

/15 character minimum limit

AMD or Intel CPU ?

in Account & Technical Support

Posted by: ikereid.4637

ikereid.4637

This is very subjective; for some benches it's very true, for others not so much. A lot of the issue comes down to compiler optimization and finding ways to break the dirty tricks in Intel's compilers that prevent, for example, FMA optimization from working on AMD chips. Long story short, depending on what was used to compile the software you are benching with and what kind of optimization has been done, the numbers can be as high as 50%, but can also be FAR, FAR lower. (I have a couple of apps I'm beta testing now where the difference between a 4770K and an 8350, both overclocked to 4.5GHz, is around 5%.)
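(Side note, hedged: whether the silicon supports an extension like FMA is a separate question from whether a vendor-dispatched binary will use it. On Linux you can at least confirm what the CPU itself advertises:)

```python
# Hedged, Linux-only sketch: what instruction-set extensions does the CPU
# itself report? Vendor-dispatched binaries may still decline to use them
# on non-Intel parts, which is the "dirty tricks" complaint above.
def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("fma", "avx", "sse4_2"):
    print(feature, "supported" if feature in flags else "not reported")
```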

When benchmarking CPUs for standardization, you have to keep things the same between Intel and AMD. "Breaking" the "tricks" that are used on Intel's systems to make them faster than on AMD goes against that idea/model and is fabricated. The only time that would really make any serious difference, or even matter, is when you are compiling your OS, tools, and applications to follow the "tricks" you are working around to get the desired results. That is what real-world testing is about.

And only a very few users are going to go that far. And of the ones I know who do, none of them are gamers at all.
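If you do want a fair comparison, the harness has to be the boring part: same code, same data, only the machine changes. A minimal sketch of that idea (the workload is a stand-in, not a real benchmark suite):

```python
# Fixed-workload timing harness in the "keep things the same" spirit:
# identical code and data on every machine, median of several runs.
import time
import statistics

def workload():
    # stand-in CPU-bound task; swap in whatever you actually care about
    return sum(i * i for i in range(500_000))

def bench(fn, reps=9):
    times = []
    for _ in range(reps):
        t0 = time.perf_counter()
        fn()
        times.append(time.perf_counter() - t0)
    return statistics.median(times)

print(f"median: {bench(workload) * 1000:.1f} ms")
```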

DX9 can use multi-threaded rendering; the biggest flaw for DX9, perf-wise, comes down to one major issue: everything in video RAM must also be mirrored in system RAM. Add to that the fact that DX10/11 both make rendering curved lines/circles more efficient... well, there really was no reason to launch a game in 2013 that was DX9... other than because they had already put all their eggs in the dream of a 360 port... and didn't have time to get a DX11 renderer built. (Mind you, a proper DX11 renderer can actually do a feature check on the hardware and disable, or software-emulate, the features that DX9/10/10.1 cards are missing.)
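(To make the mirroring cost concrete, a back-of-envelope sketch with a made-up number; the doubling, not the figure, is the point:)

```python
# Rough illustration (hedged): D3D9 managed-pool resources are shadowed in
# system RAM, so assets resident in VRAM are effectively paid for twice.
vram_assets_mb = 1200               # hypothetical managed-pool textures/buffers
sysram_shadow_mb = vram_assets_mb   # the managed pool keeps a system-RAM copy
total_mb = vram_assets_mb + sysram_shadow_mb
print(f"VRAM: {vram_assets_mb} MB + shadow: {sysram_shadow_mb} MB = {total_mb} MB")
```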

For the purposes of THIS thread, GW2's performance is all that matters. And there is in fact a single decision that makes all the difference when choosing AMD vs. Intel, and that is how Anet decided to use DX9. So, sorry, not talking out of my 'posteriors' here.

Again, it depends on your workload; for me, I sold my i7 rig and kept my 8350 rig because, for the stuff I do most often, the 8350 is faster.

To each their own, but selling an i7 system (I'm going to assume your i7 was at least Sandy Bridge) was the wrong choice. The FX-8350 does not have 8 real cores; it has 4 real cores and 4 'pseudo' cores that share resources with the other 4. That is the biggest problem with the Piledriver/Bulldozer design. Aside from how IPS/IPC was derived, but that's a totally different discussion.
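You can actually see that sharing from the OS. A hedged, Linux-only sketch (on recent kernels the two integer cores of an FX module show up as siblings, much like HT pairs do on Intel):

```python
# Which logical CPUs share execution resources with each other?
import glob

for path in sorted(glob.glob(
        "/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list")):
    cpu = path.split("/")[5]              # e.g. "cpu3"
    with open(path) as f:
        print(cpu, "shares with:", f.read().strip())  # e.g. "3,7"
```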

I would honestly rather see them skip DDR4 and move to DDR5. As for Thunderbolt... I have zero interest, even less than I had in FireWire. USB3 native or via a 3rd-party chip doesn't really matter; sure, it would be nice to have all the ports be 3.0, but then again, how many 3.0 devices do most people really own? Anyway, I wish everybody would move to SFF-8087 connectors on motherboards; it would be a godsend to me. (I love my BR10i: cheap, and it gave me 8 more SAS/SATA ports.)

The fact that you want AMD to skip DDR4 and go straight to DDR5 shows your level of knowledge on this topic. AMD has to follow standards when choosing internal components such as the memory controller. Why would they go straight to DDR5 (we are talking late 2015 to mid 2016 before we see a new performance CPU release, BTW) when manufacturers of RAM are JUST starting to build, test, and deliver DDR4?!

Hell, even X99, the first DDR4 chipset, is not available yet...

As for the small feature list I posted, that is just what the newer Haswell-E is going to bring to the table on the new LGA 2011-v3 platform and the X99 chipset. AMD has to compete at that level when talking performance parts. If they skimp on a feature that is widely used, they will isolate themselves from that customer base.

And by the time AMD is ready to release their next-gen performance series, Intel is going to be working on Broadwell-E or even Skylake-E releases. So AMD is going to be 5 generations behind in the performance game. They have big shoes to fill, and if they don't fill them, they will never recover in that arena.

As for the SAS comment: a single SAS controller can deliver more SATA ports at a higher speed than a native SATA controller. X99 is going to have additional SAS ports and a BIOS add-on for a hardware RAID SAS controller module on the higher-end 'server' boards. That's where my SAS part comes in, as it's considered a 'performance' feature. It's not something you are seeing on the 1150/Haswell systems.

Desktop: 4790k@4.6ghz-1.25v, AMD 295×2, 32GB 1866CL10 RAM, 850Evo 500GB SSD
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD

AMD or Intel CPU ?

in Account & Technical Support

Posted by: Aza.2105

Aza.2105

I currently use an 8350 @ 4.7GHz. It performs fine, though not better than high-end Intel CPUs. But in fairness, the 8350 is a lot older than the current chips Intel has available.

In my experience, GW2 is the only game I play that has such a large difference in performance between Intel and AMD. In a patch several months ago, Anet made some optimizations and I got around a 14 FPS boost on average. That's not too bad.

Amd Ryzen 1800x – Amd Fury X -64GB of ram
Windows 10