What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Kalec.6589

Mr. Kalec, what was I trying to prove, according to you? I wrote that my AMD Phenom II Dual Core 560 with two cores unlocked (a B60 after the unlock), at the stock 3.3GHz with an HD 4870 card, runs this game on the highest settings with shaders on Ultra, at 1280×1024. (The HD 4870 ≈ 5770 = 6770, and is a bit faster than the 7770.) It can and will run GW2 at 35-55 fps.
And yes, I am using a CM 212 cooler, and it doesn't matter, as this chip, unlocked, can't run faster than 3.9GHz without shutting off one core. So basically what I have is an AMD quad-core that runs stable up to 3.8GHz on air with the 212.
Is it playable? It is.
Will it be playable at 1920×1080? I will tell you when I swap it with my Intel machine.
Honestly, I don't know about the FX-41xx; I am not going to buy one to test it. But from what I understand, they deliver more or less the same.

As for the guy who bought it: the 6850 is not going to hold him back, and he can OC the GPU to get 6870 performance. If his system can stay at 25 fps or above during the fps drops, he won't notice them. (What the eye sees as fps-drop lag is under 24fps, not the 30fps stated everywhere; just go to the cinema and tell me you see fps drops there.)

That AMD was an $80-90 CPU, the mobo $100 more, plus 4GB of DDR3 RAM, and the card there is 4 years old. My point is that the game is playable on high settings in that configuration.

An FX-41xx with a 6850 should be a better combo than the one I have, so yes, he should be able to play it.

As for you:
I am sorry your $2500-3000 machine isn't delivering 10x the fps of a $250-300 combo that is still playable (as long as his minimum fps stays above 24-30). And he will be able to swap and upgrade his CPU later.

Simon De Borovsk, I think you are missing the point here. You said the benchmark proved it, but it is already an incorrect test: low resolution with a much higher-end card, i.e. the best-case scenario, as already mentioned.
The FX-4100 does not perform well in single- or dual-threaded games. Quad-threaded games can be OK, but compared to its predecessor, the Phenom II X4 965, it falls short.
The i3-3220 can keep up with, or even beat, the FX-4100.

Comparing movies to games is very different: a 24fps movie and a 24fps game behave very differently.
What is the point of buying a system you will have to upgrade soon anyway? Why run at a minimum of 24-30fps and have it dip even lower in heavy situations?
That makes no sense.

As for my system, I never paid that much; it wasn't built for gaming, that was a side benefit of a work build. I do get a constant 50-60fps in the open world, and dip to about 45fps in WvW.
My system is not important here; what matters is that I know the game looks great while playing with all the detail GW2 has to show.

i7 3770k @ 4.2Ghz | Z77 Sabertooth | EVGA GTX670 4GB | Crucial M4 256GB SSD | Corsair AX650
2×8GB Crucial Ballistix 8-8-8-24 @ 1600 | Arc Midi R2 | Asus VE278Q 27" 1920×1080
Edifier S330D | Superlux 668B | Audiotrak Prodigy Cube | CMstorm Trigger | Logitech G500

(edited by Kalec.6589)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Zelta.6829

For me, GPU utilisation is much higher than CPU: at times it can hit 80-85%, whereas the CPU never exceeds 50%. It's unlikely to be a bottleneck problem, as I run an overclocked 670 with an i5 3570K @ 4.5GHz.

It's difficult to maintain a stable 60fps in even slightly crowded areas. As I said before, it's pointless to optimise one aspect by increasing the load on another; that's just lazy coding.

I might have to switch to SLI to get a stable fps.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: DeadPhone.3084

Thanks, Simon De Borovsk.7460, just the info I was looking for. I should be able to OC the GPU, and the hard-drive cage inside the case is removable, so I should be able to do it safely. I will also benchmark the busy cities near the starter areas and see what happens.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Simon De Borovsk.7460

Kalec, I am not arguing with you. Yes, I agree; I am considering upgrading my 5-year-old Q6600 system, which still performs decently here. It happens that I keep that AMD system as a backup near me, and I tend to hand my older cards down to my secondary computers. I also have an i5-460M mobile, which I used to play Guild Wars on (it's connected to my 42-inch Panasonic IPS-LED screen, btw).
I run the Q6600 with a single HD 6870, and I am not getting results far below yours. Yes, I do get drops as low as 24fps in WvW, but I am not pretending to sell 5-6-year-old Q6600s here. Look, you are running a GTX 670 4GB with 16GB of RAM and a far superior CPU to mine; I, for example, would be very disappointed if I bought your parts now and got the results you wrote you are getting.
Yes, it is indeed a mystery why AMD failed so badly with the Bulldozer generation.
Yes, the Phenom IIs seem to outperform them.

And yes, there is not much point in purchasing a low-end system for a game that makes high-end, premium-priced systems choke at the upper part of the scale.
But he said he cannot return the parts, so there is no point in him even switching the CPU; even if he got the 8150, he might not get much higher fps.
So what can he do?

As for myself…
I will wait and see what Piledriver can deliver (if the 890GX Asus board I have will still be able to run it; it can run the other AM3+ chips).
I like the AMD chipset for being a much cheaper route to X-fire, and the fact is that when I got it two years ago, it already came with SATA3 and USB 3.0 (and it replaced a Pentium D 805).

It is sad that AMD has stopped being real competition for Intel (the SB/IB chips are manufactured near where I live, btw); the market was much more competitive 2-3 years ago.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Zelta.6829

It really does feel like overclocking an AMD CPU is not as worthwhile as overclocking an Intel chip. Both start around 3.5GHz out of the box, but sadly Intel and Nvidia seem to have the pull with developers to optimise their engines for their own products; hence the Intel and Nvidia logos at the start of 99% of games.

An OCed i5 and an OCed Bulldozer simply cannot compete at the moment. Then again, if you are just a casual gamer, AMD chips are the way to go; but personally I feel the Bulldozer series is one of AMD's worst lines compared to the old Phenoms. That said, some Intel lines have been dreadful as well, like the i5 750.

As I said in another post, games nowadays are tied to certain brands like Intel, so to play at the smoothest level gamers have to go out and buy those parts; clearly an endorsement of those respective companies.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: SolarNova.1052

Oh no, lol... someone mentioned what the 'human eye can see' in terms of fps.

In layman's terms you could argue the human eye can see XX fps, but that's not really how the eye works.

We CAN see a difference between 24 and 60 fps, and even between 60 and 120 fps; it depends on the person too. But pretty much everyone can see a difference between 24 and 60.

So please don't go around saying otherwise, when anyone reading this comment can EASILY test it: just load up an old game that runs fast on your computer, limit the fps with a program like EVGA Precision, and try different fps limits to see the difference.

3930k 4.6ghz | NH-D14 Cooler | P9x79 Pro MB | 16gb 1866mhz G.Skill | 128gb SSD + 2×500gb HDD
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asux Xonar D2X & Logitech Z5500 Sound system |

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Simon De Borovsk.7460

Solar, if your screen is refreshing at 60Hz and you run a game at 60fps or at 500fps, you will not see any difference.

And the 24fps comment was aimed at the minimum during fps drops; it will not appear as a slideshow. That is why Tom's refers to 30fps as a minimum fps target (NTSC vs PAL frame rates, btw). 3D screens work in a similar way, showing a second motion-picture stream during the dead time, while the mind connects the two into one.
In filming, higher frame rates are usually used for slow motion: something like a bullet in flight is shot at 70-300 fps and then played back at 24-30 fps.
Classic cinema is shot at 24 fps.
A PAL video stream is 25fps.
An NTSC stream is 30fps.
Charlie Chaplin and Harold Lloyd, on the other hand, were shot at only 16 fps; that's why they jump around just like some players do in WvW, or in WWI footage.

The bottom line is that 24-30fps is (probably) the lower limit of what will not disturb the eye during an fps drop.

(edited by Simon De Borovsk.7460)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Zelta.6829

Seriously, you guys need to read up on refresh rates and LCD architecture before posting elaborate discussions.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Simon De Borovsk.7460

Running at a higher fps than the screen's refresh rate only has some effect on image tearing: with more frames being produced, there is visually less time for the eye to perceive the artifacts.

http://www.adkgamers.com/index.html/_/comp-team-articles/pt-3-optimization-maximization-intensification-r248

http://forums.guru3d.com/showthread.php?t=366444

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Zelta.6829

I don't understand why people mix vsync into refresh-rate discussions as if they were absolute values.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

The 4100 is getting a minimum of 41 fps @ 4GHz; it's really not that bad when you OC. I wouldn't worry about it.

Kalec.6589 is correct: those CPU benchmarks show a best-case scenario, meaning it only goes downhill from there.

CPU benchmarks using game engines are designed to answer the question: "If you had a GPU with infinite performance, how far could your CPU push the game?" That's why a 7970 at 1280×1024 is used: so that the CPU, not the GPU, is the bottleneck.

Please note that Tom's Hardware probably knows how to benchmark games, and more than likely doesn't stare at the sky or at walls for its reviews; surely the min fps reflects the most CPU-intensive moments, i.e. zergs…
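That rationale can be sketched as a toy bottleneck model (an illustration only; the function name and all fps numbers here are made up, not from any benchmark):

```python
def observed_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """The frame rate you see is limited by the slower of the two stages."""
    return min(cpu_fps_cap, gpu_fps_cap)

# CPU test: a 7970 at 1280x1024 pushes the GPU cap far above the CPU cap,
# so the measurement isolates the CPU (illustrative numbers).
print(observed_fps(cpu_fps_cap=40.0, gpu_fps_cap=300.0))  # 40.0 (CPU-bound)

# Same CPU with a lesser GPU, or a higher resolution: the GPU becomes the limit.
print(observed_fps(cpu_fps_cap=40.0, gpu_fps_cap=28.0))   # 28.0 (GPU-bound)
```

Under this model, a min-fps figure from such a test really is a ceiling: swapping in a weaker GPU can only lower the observed rate, never raise it.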

I also don't know how many more times I have to post these; it's like you Intel fanboys have blinders on for any relevant information regarding AMD CPUs.

Attachments:

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: deltaconnected.4058

I never said "Intel 4 lyfe" or "AMD sux balls". I gave the main reason why the Bulldozer architecture was a huge flop and linked an article that explains it in more detail. Are you going to say that every major reviewer is also an Intel fanboy for stating the same things I did?

As for the benchmarks, I saw a linear increase going from 3.2GHz/29fps to 4.7GHz/39fps in a controlled test (the same scene across logout/login, overlay in the far top-right). This doesn't mean that TH benchmarked the game wrong; I just can't reproduce the results they got on my hardware with regard to OC'ing a 2500K.
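For what it's worth, the two data points quoted above imply a slope of about 6.7 fps per GHz; a small sketch of that linear fit (the fit is an extrapolation from those two measurements only, nothing more):

```python
def fps_at(clock_ghz: float) -> float:
    """Linear fit through the two quoted points: (3.2 GHz, 29 fps) and (4.7 GHz, 39 fps)."""
    slope = (39 - 29) / (4.7 - 3.2)  # ~6.67 fps per GHz
    return 29 + slope * (clock_ghz - 3.2)

print(round(fps_at(4.5), 1))  # 37.7 -- what a 4.5 GHz OC would predict under this fit
```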

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

I never said "Intel 4 lyfe" or "AMD sux balls". I gave the main reason why the Bulldozer architecture was a huge flop and linked an article that explains it in more detail. Are you going to say that every major reviewer is also an Intel fanboy for stating the same things I did?

As for the benchmarks, I saw a linear increase going from 3.2GHz/29fps to 4.7GHz/39fps in a controlled test (the same scene across logout/login, overlay in the far top-right). This doesn't mean that TH benchmarked the game wrong; I just can't reproduce the results they got on my hardware with regard to OC'ing a 2500K.

No one is arguing with you, you tool; you come into a thread spouting things everyone already knows. This isn't a hurr-durr AMD vs Intel topic. WE KNOW BULLDOZER DIDN'T PERFORM AS EXPECTED. AMD KNOWS. EVERYONE KNOWS. FOR THE 3RD TIME: LOOK AT THE CHARTS. Now tell me why Guild Wars 2 is the only game that has a problem with certain AMD CPUs. And don't ramble off and state the obvious again this time.

Also, the TH results were done at 1280×1024, not 5760×1080; no wonder you're having problems replicating them.

Attachments:

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Winterchill.7156

No disrespect, but the way FX processors work is not really suited to gaming. It's not just this game; in others as well, the Intel i3 counterparts are beating the crap out of 8-core FX CPUs, and even the Phenom IIs outperform them. FX chips rely too much on having XX number of cores, but not all games/applications use all of them. That's why Intel is beating AMD at this game: their CPUs are focused on clock-for-clock performance, not core count. Unfortunately, like MANY other games, GW2 does not fully utilize that many cores, specifically in how FX CPUs handle the workload. Not ANet's fault, IMO; it's AMD's fault for a design not targeted at the current generation of games/applications.

IGN: Demons Edge
Class: Mesmer
Server: IoJ

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

CPU benchmarks using game engines are designed to answer the question: "If you had a GPU with infinite performance, how far could your CPU push the game?" That's why a 7970 at 1280×1024 is used: so that the CPU, not the GPU, is the bottleneck.

Please note that Tom's Hardware probably knows how to benchmark games, and more than likely doesn't stare at the sky or at walls for its reviews; surely the min fps reflects the most CPU-intensive moments, i.e. zergs…

I also don't know how many more times I have to post these; it's like you Intel fanboys have blinders on for any relevant information regarding AMD CPUs.

Let's get this straight.

1) Yes, Tom's Hardware knows how to benchmark games. Min fps on a CPU-bound chart is still a best-case scenario: as soon as you put a lesser GPU in that system (or increase the resolution), everything will decrease. Read my post above.

2) If you want to compare CPU prices, do it in an unbiased way:

  • i5-2500K, ~$220: 68.9fps (100%)
  • Pentium G860, $90: 52.9fps (76.8%)
  • FX-4000, ~$105-$120: 31.9fps (46.3%)

Sorry, but there is no possible reason to choose any AMD processor for this game at this moment. I really hope that can change in the future.
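The percentages in that list are each chip's average fps relative to the i5-2500K; as a quick check of the arithmetic (fps values copied from the list above):

```python
# Average fps from the chart quoted above; percentages are relative to the 2500K.
results = {"i5-2500K": 68.9, "Pentium G860": 52.9, "FX-4000": 31.9}
baseline = results["i5-2500K"]
for cpu, fps in results.items():
    print(f"{cpu}: {fps} fps -> {100 * fps / baseline:.1f}%")  # 100.0, 76.8, 46.3
```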

Attachments:

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Teknobug.3782

For now my Phenom II X6 1090T is treating me well in this game, but I do plan on getting an Ivy Bridge pretty soon; first, though, I'm going to see what happens with Piledriver. I've been using AMDs for my main PC since the late 90s, starting with the K5 and K6-III, but I've had my share of good Intel systems too (like a dual Pentium Pro, a Celeron 533MHz @ 1.1GHz, and a dual P3 2GHz).

As for the FPS discussion: you can see a difference between 30 and 60 fps. If you have Fraps, set it to 30 (or 29.97) and record something, then set it to 60 fps and record again; you'll see a natural motion blur at 60 fps. But you won't see much of a difference between 60 and 120 fps.

And I'm glad we're in the LCD/plasma era. The old CRT TVs and monitors were bad for our eyes: each refresh line makes the eyes bob up and down when you stare at one for a long period, plus there's the moderate radiation. On my last CRT I could see the difference between 60Hz and 75Hz; it was night and day, and 60Hz instantly made me look away in disgust.

Yak’s Bend WvWvW’er [Mount Phoenix Imperials]
Intel i7 3770K @ 4.5GHz | 8GB G.Skill DDR3 1600 ram | Gigabyte R9 280X 3GB (14.2)
Win 8 Pro 64bit

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Hasteo.6437

I upgraded the guts of my PC to run GW2. This is what I bought:

AMD FX-8150 (OC’d to 3.95GHz)
Asus M5A97 EVO
Patriot Viper 3 Mamba Black 8GB
CM Hyper 212 Plus
AMD Radeon HD 7770

I get 18 FPS standing near the forge in LA.
I average 35-40 FPS everywhere else.

Got to say I'm really disappointed. Is this really what I'm supposed to be getting?

Hasteo [EVOH] – Maguuma

(edited by Hasteo.6437)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

CPU benchmarks using game engines are designed to answer the question: "If you had a GPU with infinite performance, how far could your CPU push the game?" That's why a 7970 at 1280×1024 is used: so that the CPU, not the GPU, is the bottleneck.

Please note that Tom's Hardware probably knows how to benchmark games, and more than likely doesn't stare at the sky or at walls for its reviews; surely the min fps reflects the most CPU-intensive moments, i.e. zergs…

I also don't know how many more times I have to post these; it's like you Intel fanboys have blinders on for any relevant information regarding AMD CPUs.

Let's get this straight.

1) Yes, Tom's Hardware knows how to benchmark games. Min fps on a CPU-bound chart is still a best-case scenario: as soon as you put a lesser GPU in that system (or increase the resolution), everything will decrease. Read my post above.

2) If you want to compare CPU prices, do it in an unbiased way:

  • i5-2500K, ~$220: 68.9fps (100%)
  • Pentium G860, $90: 52.9fps (76.8%)
  • FX-4000, ~$105-$120: 31.9fps (46.3%)

Sorry, but there is no possible reason to choose any AMD processor for this game at this moment. I really hope that can change in the future.

What does it being a best-case scenario have to do with anything anyway? No one is talking about GPU performance.

And there's a problem with your percentages: they only take into account the highest possible fps and exclude the most demanding parts (min fps). There's also another problem with that chart: all the CPUs are clocked at 3.0GHz, so it shows neither out-of-the-box performance nor overclocking potential. What the chart does show is clock-for-clock performance, which is irrelevant unless you're dumb enough to downclock your CPU on purpose to play games.

Getting a little biased there, pal.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Teknobug.3782

I upgraded the guts of my PC to run GW2. This is what I bought:

AMD FX-8150 (OC'd to 3.95GHz)
Asus M5A97 EVO
Patriot Viper 3 Mamba Black 8GB
CM 212 Cooler
AMD Radeon HD 7770

I get 18 FPS standing near the forge in LA.
I average 35-40 FPS everywhere else.

Got to say I'm really disappointed.

Look up how to disable core parking and see what happens.

Yak’s Bend WvWvW’er [Mount Phoenix Imperials]
Intel i7 3770K @ 4.5GHz | 8GB G.Skill DDR3 1600 ram | Gigabyte R9 280X 3GB (14.2)
Win 8 Pro 64bit

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Hasteo.6437

I upgraded the guts of my PC to run GW2. This is what I bought:

AMD FX-8150 (OC'd to 3.95GHz)
Asus M5A97 EVO
Patriot Viper 3 Mamba Black 8GB
CM 212 Cooler
AMD Radeon HD 7770

I get 18 FPS standing near the forge in LA.
I average 35-40 FPS everywhere else.

Got to say I'm really disappointed.

Look up how to disable core parking and see what happens.

Tried that, no luck; maybe a 1-2 FPS increase.

Hasteo [EVOH] – Maguuma

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

Let's get this straight.

1) Yes, Tom's Hardware knows how to benchmark games. Min fps on a CPU-bound chart is still a best-case scenario: as soon as you put a lesser GPU in that system (or increase the resolution), everything will decrease. Read my post above.

2) If you want to compare CPU prices, do it in an unbiased way:

  • i5-2500K, ~$220: 68.9fps (100%)
  • Pentium G860, $90: 52.9fps (76.8%)
  • FX-4000, ~$105-$120: 31.9fps (46.3%)

Sorry, but there is no possible reason to choose any AMD processor for this game at this moment. I really hope that can change in the future.

What does it being a best-case scenario have to do with anything anyway? No one is talking about GPU performance.

I wasn’t talking about GPU performance either. Those are CPU performance numbers.

And there's a problem with your percentages: they only take into account the highest possible fps and exclude the most demanding parts (min fps). There's also another problem with that chart: all the CPUs are clocked at 3.0GHz, so it shows neither out-of-the-box performance nor overclocking potential. What the chart does show is clock-for-clock performance, which is irrelevant unless you're dumb enough to downclock your CPU on purpose to play games.

Getting a little biased there, pal.

These are not based on the highest possible fps but on average fps; huge difference there. There is no reason to compare minimum fps, other than as an estimate of the variation of the measurement.

Let's do another comparison then:

  • i5-2500K @ 3GHz, ~$220: 68.9fps (100%)
  • Pentium G860 @ 3GHz, $90: 52.9fps (76.8%)
  • FX-4000 @ 4GHz, ~$105-$120: 54.4fps (79%)

A 1GHz overclock yielded a 1.5fps lead over the Pentium G860 in a best-case scenario… lol

Let's not even compare the TDP numbers of the two architectures, if you are the one paying for your electricity.
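As a sanity check on the two lists above (average fps only; the min-fps figures from the attachments are not reproduced here):

```python
g860_fps = 52.9                # Pentium G860 @ 3GHz, from the lists above
fx_3ghz, fx_4ghz = 31.9, 54.4  # FX-4000 average fps at 3GHz and at 4GHz

gain = 100 * (fx_4ghz - fx_3ghz) / fx_3ghz
print(f"FX-4000, 3GHz -> 4GHz: +{gain:.1f}% average fps")               # +70.5%
print(f"Lead over the G860 after the OC: {fx_4ghz - g860_fps:.1f} fps")  # 1.5 fps
```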

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

Let's get this straight.

1) Yes, Tom's Hardware knows how to benchmark games. Min fps on a CPU-bound chart is still a best-case scenario: as soon as you put a lesser GPU in that system (or increase the resolution), everything will decrease. Read my post above.

2) If you want to compare CPU prices, do it in an unbiased way:

  • i5-2500K, ~$220: 68.9fps (100%)
  • Pentium G860, $90: 52.9fps (76.8%)
  • FX-4000, ~$105-$120: 31.9fps (46.3%)

Sorry, but there is no possible reason to choose any AMD processor for this game at this moment. I really hope that can change in the future.

What does it being a best-case scenario have to do with anything anyway? No one is talking about GPU performance.

I wasn’t talking about GPU performance either. Those are CPU performance numbers.

And there's a problem with your percentages: they only take into account the highest possible fps and exclude the most demanding parts (min fps). There's also another problem with that chart: all the CPUs are clocked at 3.0GHz, so it shows neither out-of-the-box performance nor overclocking potential. What the chart does show is clock-for-clock performance, which is irrelevant unless you're dumb enough to downclock your CPU on purpose to play games.

Getting a little biased there, pal.

These are not based on the highest possible fps but on average fps; huge difference there. There is no reason to compare minimum fps, other than as an estimate of the variation of the measurement.

Let's do another comparison then:

  • i5-2500K @ 3GHz, ~$220: 68.9fps (100%)
  • Pentium G860 @ 3GHz, $90: 52.9fps (76.8%)
  • FX-4000 @ 4GHz, ~$105-$120: 54.4fps (79%)

A 1GHz overclock yielded a 1.5fps lead over the Pentium G860 in a best-case scenario… lol

Let's not even compare the TDP numbers of the two architectures, if you are the one paying for your electricity.

You keep mentioning "best-case scenario" like it means anything.

Yes, sorry, the average fps. So tell me why I would care about fps above my monitor's refresh rate, or care more about average fps than about my min fps. Excluding min fps is the most ridiculous thing I've ever heard.

Taking an FX-4000 underclocked to 3.0GHz back up to its stock 4.0GHz resulted in a 41-43% increase in performance, while the 2500K gained 4 min fps. As for the FX-4000 beating the G860 slightly at stock clocks, take into account that the FX-8150 does 20-27% worse than an 1100T only in Guild Wars 2. Also take into account the mediocre overclocks the G860 can manage.

And yeah, TDP, a dual-core vs a quad-core; I'll take that into account when users run multi-monitor, multi-GPU setups, drive gasoline cars, and accidentally leave their lights on.

A Pentium dual-core might be a good choice over an FX CPU, if you want to pay $90 to play one particular game…

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Kalec.6589

I upgraded the guts of my PC to run GW2. This is what I bought:

AMD FX-8150 (OC'd to 3.95GHz)
Asus M5A97 EVO
Patriot Viper 3 Mamba Black 8GB
CM Hyper 212 Plus
AMD Radeon HD 7770

I get 18 FPS standing near the forge in LA.
I average 35-40 FPS everywhere else.

Got to say I'm really disappointed. Is this really what I'm supposed to be getting?

Sorry Hasteo, but that is about the best you are going to get. Are you running 1920×1080? Higher resolutions need more GPU power for higher settings.
The 7770 is a low-to-mid-range card, and paired with the poorly performing 8150 you get low FPS.
If you want more, you will need to upgrade your GPU to at least a GTX 670/680 or an AMD 7950/7970.
An OC to 3.95GHz is not much; you need to go much higher, to at least 4.6GHz.

i7 3770k @ 4.2Ghz | Z77 Sabertooth | EVGA GTX670 4GB | Crucial M4 256GB SSD | Corsair AX650
2×8GB Crucial Ballistix 8-8-8-24 @ 1600 | Arc Midi R2 | Asus VE278Q 27" 1920×1080
Edifier S330D | Superlux 668B | Audiotrak Prodigy Cube | CMstorm Trigger | Logitech G500

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Simon De Borovsk.7460

Seriously, Guild Wars 2 is becoming the new Crysis!
I can see ArenaNet getting a lot of free advertising, with all the future tech/computer websites benchmarking each and every GPU and CPU against GW2's demanding standards!
And the eternal question "Can it run Crysis?"
has officially been changed to "All good, but… can it run Guild Wars 2?"

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: SolarNova.1052

Just wait till Crysis 3. Apparently, after the shambles that was the Crysis 2 port, they have promised a PC-melting Crysis 3. crosses fingers

3930k 4.6ghz | NH-D14 Cooler | P9x79 Pro MB | 16gb 1866mhz G.Skill | 128gb SSD + 2×500gb HDD
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asux Xonar D2X & Logitech Z5500 Sound system |

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

You keep mentioning "best-case scenario" like it means anything.

Yes, sorry, the average fps. So tell me why I would care about fps above my monitor's refresh rate, or care more about average fps than about my min fps. Excluding min fps is the most ridiculous thing I've ever heard.

Do you even know what the average of a distribution means? Trading the average for the minimum is trading a measure of central tendency that uses all the data points for a single data point. This is why I generally don't like benchmark sites posting minimum fps; if they posted the standard deviation instead, at least people would go read up on what it means.
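To make the statistics point concrete: the minimum is a single sample, the average uses every sample, and the standard deviation summarizes the spread around that average. A sketch with a made-up fps trace (the numbers are invented for illustration, not measured):

```python
import statistics

# Invented per-second fps samples from a play session, including one dip.
fps = [58, 61, 60, 57, 23, 59, 62, 60, 58, 61]

print("avg:", statistics.mean(fps))                # 55.9 -- uses all ten samples
print("min:", min(fps))                            # 23   -- one sample dominates it
print("stdev:", round(statistics.stdev(fps), 1))   # 11.7 -- spread around the average
```

A single dip drags the minimum down to 23 even though the average barely moves; that is the one-data-point problem described above.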

Taking an FX-4000 underclocked to 3.0GHz back up to its stock 4.0GHz resulted in a 41-43% increase in performance, while the 2500K gained 4 min fps. As for the FX-4000 beating the G860 slightly at stock clocks, take into account that the FX-8150 does 20-27% worse than an 1100T only in Guild Wars 2. Also take into account the mediocre overclocks the G860 can manage.

An FX-8150 loses to an equally clocked 1100T. Here's an FX-8150 @ 3.6GHz vs an 1100T @ 3.3GHz: http://www.anandtech.com/bench/Product/434?vs=203

If the G860 doesn't need an overclock to get good results, why would there be a need to overclock it?

And yeah, TDP, a dual-core vs a quad-core; I'll take that into account when users run multi-monitor, multi-GPU setups, drive gasoline cars, and accidentally leave their lights on.

Sorry, but no. Overclocking high-TDP CPUs is much more difficult than overclocking low-TDP ones. The FX-4100 is a 95W CPU, the same as an i5-2500K. Inefficiency per watt at its best.

A Pentium dual-core might be a good choice over an FX CPU, if you want to pay $90 to play one particular game…

Or maybe to play many other games as well?
http://www.tomshardware.com/reviews/fx-4100-core-i3-2100-gaming-benchmark,3136.html
http://www.tomshardware.com/reviews/gaming-pc-overclocking-pc-building,3273.html

(edited by VirtualBS.3165)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Brocksley.3127

I am excited to see this thread; I was going crazy trying to figure out what was wrong with my system. I have an FX-6100 and an HD 6850 card, and for no apparent reason the game just ran poorly. Comparing my performance with friends who had supposedly lesser processors and graphics cards just made me more frustrated: why would theirs run so much better?

Overclocking my 6100 from the stock 3.3GHz to 4+GHz gave me a small boost in fps, but it is disheartening to see older, slower processors outperform mine. It is definitely something to do with the FX series and GW2; everything else I play runs at max/ultra settings. But GW2…

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Radium.5019

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

You keep mentioning "best-case scenario" like it means anything.

Yes, sorry, the average fps. So tell me why I would care about fps above my monitor's refresh rate, or care more about average fps than about my min fps. Excluding min fps is the most ridiculous thing I've ever heard.

Do you even know what the average of a distribution means? Trading the average for the minimum is trading a measure of central tendency that uses all the data points for a single data point. This is why I generally don't like benchmark sites posting minimum fps; if they posted the standard deviation instead, at least people would go read up on what it means.

Taking an FX-4000 underclocked to 3.0GHz back up to its stock 4.0GHz resulted in a 41-43% increase in performance, while the 2500K gained 4 min fps. As for the FX-4000 beating the G860 slightly at stock clocks, take into account that the FX-8150 does 20-27% worse than an 1100T only in Guild Wars 2. Also take into account the mediocre overclocks the G860 can manage.

An FX-8150 loses to an equally clocked 1100T. Here's an FX-8150 @ 3.6GHz vs an 1100T @ 3.3GHz: http://www.anandtech.com/bench/Product/434?vs=203

If the G860 doesn't need an overclock to get good results, why would there be a need to overclock it?

And yeah, TDP, a dual-core vs a quad-core; I'll take that into account when users run multi-monitor, multi-GPU setups, drive gasoline cars, and accidentally leave their lights on.

Sorry, but no. Overclocking high-TDP CPUs is much more difficult than overclocking low-TDP ones. The FX-4100 is a 95W CPU, the same as an i5-2500K. Inefficiency per watt at its best.

A Pentium dual-core might be a good choice over an FX CPU, if you want to pay $90 to play one particular game…

Or maybe to play many other games as well?
http://www.tomshardware.com/reviews/fx-4100-core-i3-2100-gaming-benchmark,3136.html
http://www.tomshardware.com/reviews/gaming-pc-overclocking-pc-building,3273.html

We're focusing on minimum fps because, unlike many other SP/MP games, Guild Wars 2 and most MMOs in general are CPU bound and not limited by the GPU. Hence why Tom's Hardware reviewers set the settings low and used high-end GPUs: to show CPU performance, not GPU performance. This is why we look at the minimum fps; it shows the most CPU-intensive parts of the game, i.e. zergs.

An FX-8150 and an 1100T will go back and forth on synthetic benchmarks, but the point is that they are synthetic and not real-world gaming results. Look at World of Warcraft, Starcraft 2, and Civilization V, which are CPU-bound games; the FX-8150 performs similarly to an 1100T in those. On the other hand, the 1100T performs significantly better (30% BETTER) than the FX-8150 when it comes to this game, Guild Wars 2.

Overclocking high TDP CPUs is much more difficult than low TDP ones? Wrong. The Black Edition CPUs are known for being unlocked, having a high TDP, and delivering amazing overclocks. I honestly don't know why you replied to this, because you missed the point I was trying to make to that person. Let me give you an off-the-wall example: "I'm going to buy this super duper energy-efficient desktop computer! But the model of my car is a 2004 Hummer H2!"

The tomshardware reviews you linked are irrelevant to this conversation since we were talking about Pentium Dual cores and not i3s. On top of that, the games benchmarked are GPU bound and not CPU bound. I'm sure even old Athlons can maintain a 60 fps minimum in those SP/MP games.

Here’s my attachment again; don’t let it go over your head too.

Attachments:

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Aza.2105

Aza.2105

Moz,

That's what I wrote in my post in this thread, but I don't think many people bothered to read it. The fact is the FX-8150 is the same as the 1100T (in single-threaded situations), and yeah, those CPU-dependent games prove that fact. But in GW2, the 1100T is 30% faster. When I noticed this, I wrote AMD right away. When they responded with their last email to me stating that NCsoft and AMD are working together to improve performance on FX CPUs, I thought they were kidding me.

I’m glad Tom’s hardware confirmed what they said and I’m happy my voice was actually heard.

Many people say that the FX series are bad CPUs. This is only true when you compare them against Intel CPUs. But when you compare prices (motherboard included), AMD comes out a lot cheaper.

I use software that is heavily multithreaded (I'm an illustrator), so the FX series really shines in that area. In terms of single-threaded performance, the FX-8150 trades with the 1100T depending on the program. In some games the 8150 was better, in some slightly worse. But not 30% worse.

Realistically, the FX-8150 should be pumping out around 70fps on average in GW2. You have to take into consideration turbo boost, which is poor on the Phenom series. I'll be happy when the performance boosts hit. Glad I don't have to go out and buy a new CPU just for GW2.

Amd Ryzen 1800x – Amd Fury X -64GB of ram
Windows 10

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Teknobug.3782

Teknobug.3782

With today’s patch, I’m starting to experience some fps stuttering which I didn’t experience beforehand.

Yak’s Bend WvWvW’er [Mount Phoenix Imperials]
Intel i7 3770K @ 4.5GHz | 8GB G.Skill DDR3 1600 ram | Gigabyte R9 280X 3GB (14.2)
Win 8 Pro 64bit

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

VirtualBS.3165

We’re focusing on minimal fps because unlike many other SP/MP games Guild Wars 2 and generally most mmos are CPU bound and not limited by the GPU. Hence why tom’s hardware reviewers adjusted the settings low and use high end GPUs to help show the CPU performance and not the GPU’s performance. This is why we look at the minimal fps; they show the most CPU intensive parts of the game, i.e ZERGS.

Yes, the minimum fps usually happens at the most intensive parts of the game. However, you cannot accurately interpret it in a benchmark, because the benchmark doesn't state whether it's an isolated case or happens a lot. It's not a standard deviation. You can't show with bar charts how often a particular card dips to low fps; you need a performance curve.

A video card that dips once or twice to a low minimum fps is very different from one that consistently sits there. Look at the example benchmark screenshot below. Would you say the GTX 680 is the better card, since it has a min fps of 45 vs the 7970's 38? Ofc not. The averages paint the better picture. (Taken from HardOCP.)

A FX-8150 and a 1100T will go back and forth on synthetic benchmarks, but the point is they are synthetic and not real world gaming results. Look at World of Warcraft, Starcraft 2, and Civilization V which are CPU bound games; the FX-8150 performs similarly to a 1100T when it comes to CPU bound games. On the other hand the 1100T is performing significantly better (30% BETTER) than the FX-8150 when it comes to this game, Guild Wars 2.

True, the discrepancy is monumental. However, due to Bulldozer's architectural differences, it's still a gamble whether performance improvements can be made to alleviate this.

Overclocking high TDP CPUs is much more difficult than low TDP ones? Wrong. The black edition CPUs are known for being unlocked and having a high TDP for amazing overclocks. I honestly don’t know why you replied to this because you missed the point I was trying to make to that person; Let me give you an example off the wall example, “I’m going to buy this super duper energy efficient desktop computer! But the model of my car is a 2004 Hummer H2!”

Don’t confuse high TDP with high current leakage.

The tomshardware reviews you linked are irrelevant to this conversation since we were talking about Pentium Dual cores and not i3s. On top of that the games benchmarked are GPU bound and not CPU bound. I’m sure even old Athlons can maintain 60 fps minimal on these SP/MP games.

If you find the links irrelevant, then I really don't know what else to say. The first link compared the FX-4100@3.6GHz with an i3-2100@3.1GHz, which is very close to a Pentium G860. The second link is a $500 PC that uses a Pentium G860 in various benchmarks. You said "A Pentium Dual core might be a good choice over an FX CPU, if you want to pay $90 to play one certain game…" — this proves that a Pentium G860 can chug away nicely in many more than just one game…

Let’s compare old athlons with an old Pentium G850 then… Hmmm…
http://www.anandtech.com/bench/Product/121?vs=404

Here’s my attachment again; don’t let it go over your head too.

Lol. Hardly. I've probably used more AMD CPUs in my life than Intel ones. I just don't think AMD is going in the right direction at the moment. They are distancing themselves from Intel to go the ARM route. Let's see how that turns out.

Attachments:

(edited by VirtualBS.3165)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

VirtualBS.3165

Moz,

Thats what I wrote in my post in this thread. But I don’t think many people bothered to read it. The fact is the FX 8150 is the same as the 1100t (in single threaded situations). And yea those cpu dependent game proves that fact. But in GW2, the 1100t is 30% faster. When I noticed this, I wrote AMD right away. When they responded with their last email to me stating that Ncsoft and AMD are working together to improve performance on FX cpus. I thought they were kidding me.

I’m glad Tom’s hardware confirmed what they said and I’m happy my voice was actually heard.

Many people say that the FX series are bad cpus. This is only true when you take into consideration intel cpus. But when you compare prices (motherboard included) AMD comes to be a lot cheaper.

I sincerely hope they do get better performance on the FX’s. Intel is too comfortable at the moment in the high-end CPU business, and this is never good for any consumer. My first AMD CPU was a 386DX@40 MHz, and oh boy, did it rock!

Let’s hope GW2 doesn’t rely much on bulldozer’s weak spots.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Aza.2105

Aza.2105

Virtual,

Yeah, me too. I've built a lot of AMD machines; the Athlon 64 was exceptional at the time.

Anyhow, Bulldozer performs quite well under Linux.

http://www.phoronix.com/scan.php?page=article&item=amd_fx8150_bulldozer&num=1

Amd Ryzen 1800x – Amd Fury X -64GB of ram
Windows 10

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: CptBadger.5918

CptBadger.5918

What is being done ?

Nothing.

/thread

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

Moz.8264

We’re focusing on minimal fps because unlike many other SP/MP games Guild Wars 2 and generally most mmos are CPU bound and not limited by the GPU. Hence why tom’s hardware reviewers adjusted the settings low and use high end GPUs to help show the CPU performance and not the GPU’s performance. This is why we look at the minimal fps; they show the most CPU intensive parts of the game, i.e ZERGS.

Yes, the minimum fps usually happens at the most intesive parts of the game. However you cannot accurately interpret them in a benchmark because it doesn’t state whether its an isolated case or if it happens a lot. It’s not a standard deviation. You can’t state with bar charts how often a particular gfx dips to low fps, you need a performance curve.

A videocard that dips once or twice into low minimum fps is very different than one that consistently goes there. Look at the example benchmark screenshot below. Would you say the GTX 680 is the better card, since it has a min fps of 45 vs the 7970 that has 38? Ofc not. The averages paint the better picture. (Taken from HardOCP).

A FX-8150 and a 1100T will go back and forth on synthetic benchmarks, but the point is they are synthetic and not real world gaming results. Look at World of Warcraft, Starcraft 2, and Civilization V which are CPU bound games; the FX-8150 performs similarly to a 1100T when it comes to CPU bound games. On the other hand the 1100T is performing significantly better (30% BETTER) than the FX-8150 when it comes to this game, Guild Wars 2.

True, the discrepancy is monumental. However, due to bulldozer’s architecture differences, it’s still a gamble which performance improvements can be done to alleviate this.

Overclocking high TDP CPUs is much more difficult than low TDP ones? Wrong. The black edition CPUs are known for being unlocked and having a high TDP for amazing overclocks. I honestly don’t know why you replied to this because you missed the point I was trying to make to that person; Let me give you an example off the wall example, “I’m going to buy this super duper energy efficient desktop computer! But the model of my car is a 2004 Hummer H2!”

Don’t confuse high TDP with high current leakage.

The tomshardware reviews you linked are irrelevant to this conversation since we were talking about Pentium Dual cores and not i3s. On top of that the games benchmarked are GPU bound and not CPU bound. I’m sure even old Athlons can maintain 60 fps minimal on these SP/MP games.

If you find the links irrelevant, then I really don’t know what else to say. The first link compared the FX-4100@3.6GHz with an i3-2100@3.1GHz that is very close to a Pentium G680. The second link is a $500 PC that uses a Pentium G680 in various benchmarks. You said “A Pentium Dual core might be a good choice over an FX CPU, if you want to pay $90 to play one certain game…” — this proves that a Pentium G680 can chug away nicely in many more than just one game…

Let’s compare old athlons with an old Pentium G850 then… Hmmm…
http://www.anandtech.com/bench/Product/121?vs=404

Here’s my attachment again; don’t let it go over your head too.

Lol. Hardly. I’ve probably used more AMD CPUs in my life than Intel ones. I just don’t think AMD is going in the right direction at the moment. They are distancing from Intel to go the ARM route. Lets see how that turns out.

And, again, for the 4th? 4th time?? This topic is not about AMD vs Intel. This thread was over on day one, but you babies keep flocking back stating the obvious. How long has the FX series been out? Almost a year now? You'd think after almost a year the Intel fanboys would have gone home by now…

http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268.html

Update Sept 27: We’ve re-tested the game with the release client and we’re not seeing a notable performance difference. The Core i5 gained a few FPS (as did the AMD FX-4100 to a lesser extent), but all of the other results remain similar to our published numbers. NCSoft let us know that they’re working with AMD to improve FX-series CPU performance, and if this happens in the near future we may revisit Guild Wars 2 with new benchmarks.

Here you go, I cropped the attachment this time because you clearly have eye problems.

Attachments:

(edited by Moz.8264)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Fozee.1083

Fozee.1083

The FX series is actually quite inefficient and will use considerably more power when overclocked. Just had to throw that in there.

I’m also an AMD person, and I totally understand where AMD is strong. High gaming performance, however, is not the place. I just ordered my 2500K (was on sale, or I’d have bought the 3570K) to replace my FX-4100.

Unfortunately, my FX-4100 does not like to hold an overclock of any sort. Maybe it's the motherboard. Any useful OC causes data corruption, so it was time to get rid of this board and chip anyway.

BioWare/Mythic Moderator, Terror Squid, and Funparty

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

VirtualBS.3165

And, again, for the 4th? 4th time?? This topic is not about AMD vs Intel. This thread was over on day one, but you babies keep flocking back stating the obvious. How long has the FX series been out? Almost a year now? You’d think after almost a year the intel fanboys would have gone home by then…

Here you go I cropped in the attachment this time because you are clearly have eye problems.

Get your facts straight. You are just refusing to see that the Phenom X6 and the FX-8150 are very different architectures. Past results are not assurance of anything. The Bulldozer architecture has severe drawbacks in many areas that may or may not show up depending on the CPU task, and they show up especially in desktop performance, as the architecture is heavily oriented toward server workloads.

Bulldozer weak spots:

  • Shared Op Decoders
  • Fewer integer execution units
  • High L2 cache latency
  • High Branch Misprediction Penalty
  • Low L1 Instruction Cache Hit Rate

Desktop Performance Was Not the Priority

No matter how rough the current implementation of Bulldozer is, if you look a bit deeper, this is not the architecture that is made for high-IPC, branch intensive, lightly-threaded applications. Higher clock speeds and Turbo Core should have made Zambezi a decent chip for enthusiasts. The CPU was supposed to offer 20 to 30% higher clock speeds at roughly the same power consumption, but in the end it could only offer a 10% boost at slightly higher power consumption.
http://www.anandtech.com/show/5057/the-bulldozer-aftermath-delving-even-deeper/12

Attachments:

(edited by VirtualBS.3165)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

Moz.8264

And, again, for the 4th? 4th time?? This topic is not about AMD vs Intel. This thread was over on day one, but you babies keep flocking back stating the obvious. How long has the FX series been out? Almost a year now? You’d think after almost a year the intel fanboys would have gone home by then…

Here you go I cropped in the attachment this time because you are clearly have eye problems.

Get your facts straight. You are just refusing to see that the Phenom X6 and the FX-8150 are very different architectures. Past results are not assurance of anything. The bulldozer architecture has severe drawbacks in many areas, that may show up or not depending on the CPU task, and will show up especially in desktop performance, as the bulldozer architecture is heavily oriented to server workloads.

Bulldozer weak spots:

  • Shared Op Decoders
  • Less integer execution units
  • High L2 cache latency
  • High Branch Misprediction Penalty
  • Low L1 Instruction Cache Hit Rate

Desktop Performance Was Not the Priority

No matter how rough the current implementation of Bulldozer is, if you look a bit deeper, this is not the architecture that is made for high-IPC, branch intensive, lightly-threaded applications. Higher clock speeds and Turbo Core should have made Zambezi a decent chip for enthusiasts. The CPU was supposed to offer 20 to 30% higher clock speeds at roughly the same power consumption, but in the end it could only offer a 10% boost at slightly higher power consumption.
http://www.anandtech.com/show/5057/the-bulldozer-aftermath-delving-even-deeper/12

“Past results are not assurance of anything” lol right.

You’re stating the obvious again, you’re starting to bore me.

(edited by Moz.8264)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: deltaconnected.4058

deltaconnected.4058

1) I stated how to fix the FX8000’s problem in my second post which you obviously did not read

“start /AFFINITY 55 /B /WAIT c:\path\to\gw2.exe”

2) The 1090t has 6 cores. The FX8000 has 4 with hyperthreading. Likewise, the FX4000 is really a dual-core CPU. I don’t care what the spec sheet says, I trust the technical data more.

3) I redid my comparison of 3.2 vs 4.8GHz. And it’s still a linear increase at 1024×768 because it was CPU-limited to begin with, even at 5760×1080.
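For anyone wondering what the 55 means: /AFFINITY takes a hexadecimal bitmask of logical CPUs, where bit i enables CPU i. A quick sketch in Python (decode_affinity is just an illustrative name, not a real API) that decodes the masks used in this thread:

```python
def decode_affinity(mask_hex):
    """Return the logical CPU indices enabled by a hex affinity mask."""
    mask = int(mask_hex, 16)
    return [i for i in range(mask.bit_length()) if mask & (1 << i)]

# 0x55 = 0b01010101: every other logical CPU, i.e. one core per FX module
print(decode_affinity("55"))  # [0, 2, 4, 6]
print(decode_affinity("2A"))  # [1, 3, 5]
print(decode_affinity("05"))  # [0, 2]
```

Pinning the game to one core per module avoids the shared front-end of each Bulldozer module, which is the whole point of the trick.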

Attachments:

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

VirtualBS.3165

deltaconnected.4058 is absolutely correct. I really hope ANet improves the multithreaded workload, as that will benefit everyone.

About the affinity mask trick: supposedly, two Windows 7 hotfixes released in January should alleviate the need for it:

http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested
AMD and Microsoft have been working on a patch to Windows 7 that improves scheduling behavior on Bulldozer. The result are two hotfixes that should both be installed on Bulldozer systems. Both hotfixes require Windows 7 SP1, they will refuse to install on a pre-SP1 installation.

The first update simply tells Windows 7 to schedule all threads on empty modules first, then on shared cores.

The second hotfix increases Windows 7’s core parking latency if there are threads that need scheduling. There’s a performance penalty you pay to sleep/wake a module, so if there are threads waiting to be scheduled they’ll have a better chance to be scheduled on an unused module after this update.
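The "empty modules first" behavior of the first hotfix can be sketched as a toy model (this is an illustration I wrote, not the actual Windows scheduler), assuming a 4-module/8-core chip where cores 2m and 2m+1 share module m:

```python
def pick_core(busy, modules=4):
    """Pick a core for a new thread: idle modules first, then shared cores."""
    for m in range(modules):          # pass 1: modules with both cores free
        a, b = 2 * m, 2 * m + 1
        if a not in busy and b not in busy:
            return a
    for m in range(modules):          # pass 2: settle for a half-busy module
        for c in (2 * m, 2 * m + 1):
            if c not in busy:
                return c
    return None                       # all cores busy

busy = set()
order = []
for _ in range(4):
    c = pick_core(busy)
    busy.add(c)
    order.append(c)
print(order)  # first four threads land on distinct modules: [0, 2, 4, 6]
```

Only once every module has one thread does the scheduler start doubling up on shared cores, which is exactly what the 0x55 affinity mask forces by hand.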

Other than that, it seems Windows 8 has finally fixed these issues for good (almost a 10fps increase in World of Warcraft):
http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-23.html

Attachments:

(edited by VirtualBS.3165)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Moz.8264

Moz.8264

1) I stated how to fix the FX8000’s problem in my second post which you obviously did not read

“start /AFFINITY 55 /B /WAIT c:\path\to\gw2.exe”

2) The 1090t has 6 cores. The FX8000 has 4 with hyperthreading. Likewise, the FX4000 is really a dual-core CPU. I don’t care what the spec sheet says, I trust the technical data more.

3) I redid my comparison of 3.2 vs 4.8GHz. And it’s still a linear increase at 1024×768 because it was CPU-limited to begin with, even at 5760×1080.

I didn't read it because I don't own an FX-8000?

We’re talking about this graph right? https://dviw3bl0enbyw.cloudfront.net/uploads/forum_attachment/file/1902/CPU_Cores.png

It was a 3GHz-to-4GHz overclock by Tom's Hardware; I don't know why you're addressing me and not them. Also, how does 2500K overclocking relate to this thread's topic?

(edited by Moz.8264)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Aza.2105

Aza.2105

1) I stated how to fix the FX8000’s problem in my second post which you obviously did not read

“start /AFFINITY 55 /B /WAIT c:\path\to\gw2.exe”

2) The 1090t has 6 cores. The FX8000 has 4 with hyperthreading. Likewise, the FX4000 is really a dual-core CPU. I don’t care what the spec sheet says, I trust the technical data more.

3) I redid my comparison of 3.2 vs 4.8GHz. And it’s still a linear increase at 1024×768 because it was CPU-limited to begin with, even at 5760×1080.

Where do you run that command? Because I have no idea how to do it. lol

Amd Ryzen 1800x – Amd Fury X -64GB of ram
Windows 10

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Teknobug.3782

Teknobug.3782

4 or 6 physical cores are different from 2 or 4 cores with hyperthreading.

Physical cores (e.g. X6 1090T):
1
2
3
4
5
6

Hyperthreaded 6 cores (e.g. FX6100):
1→4
2→5
3→6

Intel's HT technology is better than AMD's FX HT technology. For now, i3s and i7s have HT; not sure if the Pentium G series does or not.

Yak’s Bend WvWvW’er [Mount Phoenix Imperials]
Intel i7 3770K @ 4.5GHz | 8GB G.Skill DDR3 1600 ram | Gigabyte R9 280X 3GB (14.2)
Win 8 Pro 64bit

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

VirtualBS.3165

Where do you run that command? Because I have no idea how to do it. lol

Command prompt. But try the patches first; they should have the same result.

http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested
AMD and Microsoft have been working on a patch to Windows 7 that improves scheduling behavior on Bulldozer. The result are two hotfixes that should both be installed on Bulldozer systems. Both hotfixes require Windows 7 SP1, they will refuse to install on a pre-SP1 installation.

The first update simply tells Windows 7 to schedule all threads on empty modules first, then on shared cores.

The second hotfix increases Windows 7’s core parking latency if there are threads that need scheduling. There’s a performance penalty you pay to sleep/wake a module, so if there are threads waiting to be scheduled they’ll have a better chance to be scheduled on an unused module after this update.

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Aza.2105

Aza.2105

I've had the patches installed since they were released. My GW2 is on drive D, so how do I run the command? Every time I try, I get an error, even when I specify the path.

Amd Ryzen 1800x – Amd Fury X -64GB of ram
Windows 10

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Teknobug.3782

Teknobug.3782

start /AFFINITY 55 /B /WAIT d:\Program Files (x86)\Guild Wars 2\Gw2.exe

Yak’s Bend WvWvW’er [Mount Phoenix Imperials]
Intel i7 3770K @ 4.5GHz | 8GB G.Skill DDR3 1600 ram | Gigabyte R9 280X 3GB (14.2)
Win 8 Pro 64bit

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: deltaconnected.4058

deltaconnected.4058

Without the quotes around the whole command; you do, however, need them around the path (“path to/gw 2/gw.exe”) if it has spaces.

start /AFFINITY 55 /WAIT /b E:\GW2\Gw2.exe

(2A and 05 are the affinity masks for the 6000 and 4000 series.)

It's not the 30% the benchmarks show, but it's a noticeable increase on a friend's rig using the same side-by-side comparison I did in Lion's Arch. I'm guessing this will heavily depend on how much you have running in the background too.

The only reason I did a quick 3.2 and 4.8 test is to show that what you get on your system doesn’t have to correspond with their system and testing method. I’m not saying they’re wrong, just that it’s not a 100% reproducible statistic.

(edited by deltaconnected.4058)

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: Aza.2105

Aza.2105

start /AFFINITY 55 /B /WAIT d:\Program Files (x86)\Guild Wars 2\Gw2.exe

My GW2 folder isn't in a subfolder; it's just in the root of the drive. It looks like this:

start /AFFINITY 55 /B /WAIT D:\Guild Wars 2\Gw2.exe

But every time I try to run that, I get this error: Windows cannot find “D:\Guild” Make sure you typed the name correctly and then try again.

I have no idea why it keeps doing that. And yeah, I'm running CMD as an admin.

Amd Ryzen 1800x – Amd Fury X -64GB of ram
Windows 10

What is being done about AMD CPU performance?

in Account & Technical Support

Posted by: VirtualBS.3165

VirtualBS.3165

@Aza.2105, do:
start /AFFINITY 55 /B /WAIT “D:\Guild Wars 2\Gw2.exe”

(yes, with the quotes there; quotes are needed to specify paths that use spaces)

(edited by VirtualBS.3165)