Intel Optimizations
2x improvement?
Which one did you have? An FX-8120?
The game uses 3 threads at 70-75%, or 2 threads at 100%. Intel has stronger single-thread performance, so that's why you see such a big improvement. According to Tom's Hardware, an i5 2500 at 4.0GHz is only about 27% faster than an FX 4100 at 4.0GHz.
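You can see why with a back-of-the-envelope Amdahl's law estimate: if only part of the frame work runs in parallel, the serial remainder sets the floor, so per-core speed is what matters. A tiny sketch – the 70% parallel fraction below is an assumption for illustration, not a measurement of GW2:

```cpp
// Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
// fraction of frame time that can run in parallel. With a modest p,
// adding cores barely helps - only faster cores do.
#include <cstdio>

int main() {
    const double p = 0.70;  // assumed parallel fraction, illustration only
    for (int n : {1, 2, 3, 4, 8}) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        std::printf("%d cores -> %.2fx\n", n, speedup);
    }
    return 0;
}
```

With these numbers, 8 cores only buy about a 2.6x speedup, which is why a faster core beats more cores here.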
I recently switched from an AMD 3GHz 8-core processor to an Intel 3.5GHz i7 and saw a 2x improvement in performance in GW2. It's led me to ask:
Is Guild Wars 2 compiled with an Intel compiler using Intel-specific optimizations? I wasn't expecting such a strong improvement from the processor change.
No, it's because Intel is ~150% better at single-threaded tasks than AMD's fastest chip.
GW2 is DirectX 9, and that API is single-threaded for all the rendering that has to happen.
That is the ONLY reason Intel beats the pants off AMD in this game. If we were running DX10 or 11, I don't think the difference between AMD and Intel would be that bad.
Fun fact – my i3-2120 KILLED my FX8350 in max possible FPS in GW2. The FX8350 would cap at 65 FPS on lowest settings using an R7 260X. After swapping the motherboard and CPU over from the FX to my i3, my max FPS on lowest settings was 119.
The only difference between the two tests was the motherboard and CPU. Tested in a low-population forest area (standing where you kill boars, looking into the forest).
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
sirsquishy – that's not totally true.
Intel is not 150% faster than AMD FX in single-threaded tasks – I mean, not in every (single-threaded) game.
I'll put it like this: sometimes it's 125%, sometimes 160% – a huge difference.
Also, GW2 is not single-threaded – check your CPU usage. It uses 2-3 cores.
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268-7.html
The FX 8320 is great for its price, but the best CPU for this game is an i5 K.
Also, GW2 is not single-threaded – check your CPU usage. It uses 2-3 cores.
GW2 is single-threaded. The MAIN process that controls the CORE component of the game runs on one core; that is the path that ties gw2.exe to your DX9 libraries.
That is what 'defines' this game as single-threaded, and it is the TRUE bottleneck of every system suffering from poor performance in GW2.
And on average, Intel is 100% faster than AMD at single-threading. But from what I do for a living (VMware engineering), I can say with 100% certainty (and facts to back it up if you wish) that Intel is about 150% faster than AMD on a per-thread, per-core basis at the absolute hardware layer, games and applications aside.
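If you want to sanity-check per-core claims like this yourself, a crude way is to time a fixed chunk of serial work on each machine – a minimal sketch, not a rigorous benchmark:

```cpp
// Crude single-thread throughput probe: times a serial dependency
// chain (an LCG step) that one core has to chew through alone.
// Compare the elapsed time across two machines for a rough per-core
// comparison; pin the process to one core first to cut scheduler noise.
#include <chrono>
#include <cstdint>
#include <cstdio>

int main() {
    volatile std::uint64_t sink = 0;  // volatile: keep the result live
    const std::uint64_t iters = 2000000000ULL;

    auto t0 = std::chrono::steady_clock::now();
    std::uint64_t acc = 1;
    for (std::uint64_t i = 0; i < iters; ++i)
        acc = acc * 6364136223846793005ULL + 1442695040888963407ULL;
    sink = acc;
    auto t1 = std::chrono::steady_clock::now();

    double s = std::chrono::duration<double>(t1 - t0).count();
    std::printf("%.2f s for %llu iterations (%.1f M iters/s)\n",
                s, (unsigned long long)iters, iters / s / 1e6);
    return sink == 0;
}
```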
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Yeah, for sure, keep dreaming.
Go into the BIOS and set 1 core – test.
Set 2 cores – test.
Then test with all 4 cores enabled – and report back.
Yeah, for sure, keep dreaming.
I don't need to, as I'm more awake than you appear to be.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Yeah, for sure, keep dreaming.
I don't need to, as I'm more awake than you appear to be.
http://www.tomshardware.co.uk/answers/id-1909165/guild-wars-cpu-usage-4670k.html
I don't need to, as I'm more awake than you appear to be.
I believe you, sirsquishy; everything you said sounds about right.
I'm not sure why FlyingBee is defending AMD's single-threaded performance, since that doesn't change the fact that it's far behind Intel's. The FX 8350 is a multi-threaded CPU; it's not far behind Intel (and in many cases can exceed it) when software takes advantage of its architecture.
Windows 10
I had an i7 3770K at 3.8GHz (gave it to my bro) and an FX 6300. When I OCed the FX to 4.7GHz, it was sometimes faster in single-threaded games (TLII, LoL, …); other times, even with a 5.2GHz OC, I couldn't get close to the Intel at 3.8GHz.
I also benchmarked an FX 4300 at 3.4GHz vs an 'i7 mimic' – an i3 at 3.4GHz – in BF4 multiplayer; performance was almost the same.
Aza, check it yourself – go windowed mode in LA, open Task Manager, and check the CPU usage!
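Task Manager only shows you aggregate load; if you want a number for one process specifically, a small sampler over GetProcessTimes does the same math. A rough Windows-only sketch (pass the PID Task Manager shows for gw2.exe):

```cpp
// Samples CPU usage of one process (PID on the command line) over a
// 1-second window: delta CPU time / delta wall time / logical cores.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

static ULONGLONG toU64(FILETIME ft) {
    ULARGE_INTEGER u;
    u.LowPart = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;  // 100 ns units
}

static ULONGLONG cpuTime(HANDLE h) {
    FILETIME creation, exitTime, kernel, user;
    GetProcessTimes(h, &creation, &exitTime, &kernel, &user);
    return toU64(kernel) + toU64(user);
}

int main(int argc, char** argv) {
    if (argc < 2) { std::puts("usage: cpusample <pid>"); return 1; }
    HANDLE h = OpenProcess(PROCESS_QUERY_LIMITED_INFORMATION, FALSE,
                           std::atoi(argv[1]));
    if (!h) { std::puts("OpenProcess failed"); return 1; }

    SYSTEM_INFO si;
    GetSystemInfo(&si);  // dwNumberOfProcessors = logical core count

    ULONGLONG c0 = cpuTime(h);
    Sleep(1000);                       // 1 s sampling window
    ULONGLONG c1 = cpuTime(h);

    double oneCorePct = (c1 - c0) / 10000000.0 * 100.0;  // 100 ns -> % of 1 s
    std::printf("%.1f%% of one core (%.1f%% of total CPU)\n",
                oneCorePct, oneCorePct / si.dwNumberOfProcessors);
    CloseHandle(h);
    return 0;
}
```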
I also benchmarked an FX 4300 at 3.4GHz vs an 'i7 mimic' – an i3 at 3.4GHz – in BF4 multiplayer; performance was almost the same.
You are comparing CPUs using multi-threaded, properly coded games that utilize the GPU correctly.
GW2 is not in that mix. GW2 leans on the CPU more than any other game I can think of in recent history, and it uses DX9 in a very overloaded way. The more visual crap they add to the game, the worse their interface with DX9 gets.
If they keep going down this road with GW2, they will be forced to upgrade from DX9 to 10 or 11 to fix the issues they are bringing on themselves.
And the AMD vs Intel discussion is ONLY about THIS game. I have had NO issues with any of my other games on my AMD system(s) at all:
Thief
LoL
LoTRO
WoW (free-to-play – not paying for that crap anymore!)
Dirt 3
Titanfall (retail tomorrow!)
EverQuest (1999 called, they want me to play some!)
L4D
L4D2
Crysis 3
BF1942
…long list
None of the above games EVER gave me issues on my AMD systems. It was only ever GW2, and that's because they push TOO much data through the DX9 API – so much that they had to split the CPU/GPU load by 60% just for the game to 'run smooth'.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Yep, DirectX 9 is crap.
They should do DirectX 11.2 or even wait for DirectX 12. But then the i5 won't be such a great choice. Why? Because there won't be such a big difference between an FX 4300 and an i5 4670K.
At launch GW2 was crap for FX CPUs, but right now I can say that an OCed FX (4.5GHz+) is on the level of an i3 4340 at 3.5GHz.
Yep, DirectX 9 is crap.
And cue ignoring XFlyingBeeX, because all he can do is change the subject when he's trumped.
OP:
Intel processors are superior to AMD's multicore FX lineup now because AMD chose to basically split a single core into two halves within one module.
I thought it was an interesting thing to do, but DirectX 9 was made in a generation where per-core CPU performance was valued more, and processor speeds increased with each generation. To make up for that on an AMD FX, you'd have to superclock the thing to beast-mode speeds, whereas the Intel sort of comes clocked like that already.
Until this game receives some better coding on the processor side of things, CPU speed is everything.
They should do DirectX 11.2 or even wait for DirectX 12.
I will say this much and then leave this alone.
If GW2 were on DX10 or 11 (not sure about 11.2, I don't know much about it really), the performance gap between AMD and Intel CPUs would be marginal (8-12%) between comparable series (FX4000 = i3, FX6000 = i5, FX8000 = i7, IMHO). Intel would still win as the BiS (best in slot) part for pushing the game to extreme resolutions and features (3D and such), but AMD would NOT be having the issues we are seeing today.
But who knows when (if ever) we will see ANet upgrade the horrible API they are using to fix these issues. Time is money, and since there is no subscription to the game (yet – rumor has it there will be a sub for Chinese players), we cannot expect much in the way of a core rework of this game.
So in reality the battle here isn't AMD vs Intel; it's good design vs bad design (DX9).
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Back when I played an MMO called Defiance for a bit, I saw better performance from my Phenom II 975BE than from my FX-8350. Better single-thread performance in the Phenom lineup, I'd say.
I'll also take a moment to say that I upgraded from an AMD FX-8350 to an i7 4770K. I had my FX-8350 clocked as high as 4.7GHz and only saw maybe a 4 FPS change in Guild Wars 2. With the i7 4770K the difference was night and day – superior FPS across the board – AND my multi-card performance has substantially improved in every game I have played since.
Back when I played an MMO called Defiance for a bit, I saw better performance from my Phenom II 975BE than from my FX-8350.
I never really tested AMD against AMD other than Athlon II vs Phenom II. When I moved to my FX8350 I dumped all my other AM3 CPUs on a buddy of mine (he was building a VMware lab to pass his VCP), but my Llano was faster than my FX8350 in most regards – and Llano is based on the Phenom II core without the L3 cache.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Until this game receives some better coding on the processor side of things, CPU speed is everything.
I heard that if you disable cores 1, 3, 5, and 7 in the BIOS you can get a performance boost, since each core/module no longer has to share resources. I haven't been able to try this since my board can only disable whole modules instead of individual cores.
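If your board can't do it either, you can approximate the experiment from Windows with an affinity mask – a minimal sketch, assuming an 8-core FX where each module's two cores are adjacent logical CPUs (so mask 0x55 keeps one core per module); the PID is whatever Task Manager shows for gw2.exe:

```cpp
// Pins a running process (PID on the command line) to logical cores
// 0, 2, 4, 6 (mask 0x55) - one core per Bulldozer module, assuming the
// usual adjacent-pairs core numbering - so the two cores of a module
// stop competing for shared front-end/FPU resources. Windows-only.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) { std::puts("usage: pinhalf <pid>"); return 1; }

    HANDLE h = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                           FALSE, std::atoi(argv[1]));
    if (!h) { std::puts("OpenProcess failed (try an elevated prompt)"); return 1; }

    const DWORD_PTR mask = 0x55;  // binary 01010101 -> cores 0,2,4,6
    if (!SetProcessAffinityMask(h, mask))
        std::puts("SetProcessAffinityMask failed");
    else
        std::puts("pinned to cores 0,2,4,6 - compare your FPS now");

    CloseHandle(h);
    return 0;
}
```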
Windows 10
Same thing with my board when I was using the AMD FX-8350 – it could only disable modules.
Aza, what's your FPS in GW2 in LA when fighting the final boss?
Intel fanboys…
Check your CPU usage, man. Check that link (YouTube) – you have 2 games there, StarCraft and GW2!
Aza, what's your FPS in GW2 in LA when fighting the final boss?
Which boss? For the big fight, I'd say it's sub-30 FPS depending on the number of character models present in the scene. For the boss after that (solo instance), it's 60+ FPS.
Windows 10
Sorry – when you're fighting Scarlet.
This is exactly why I would never take ANY of XFlyingBeeX's advice on ANYTHING. All he does is derail the thread with crap that has nothing to do with THIS game.
Hurr durr, let's take a look at this MMO that uses CryEngine, it's so much more advanced and optimized for AMD FX CPUs. Well gee, that's probably because that version of CryEngine was released when the AMD FX was actually a marketed product.
This game, however – GUILD WARS 2 – uses DirectX 9, which was created when the AMD FX was not even a CONCEPT. Half-core modules were probably a preposterous idea at the time as well, until somehow AMD made them work. DX9 demands pure per-core performance. Single-thread performance! That is why every high-end processor has such amazingly high clock speeds: Intel i7, 3.5GHz. Intel i7 Extreme, 3.5GHz on SIX cores. Quad cores at 3.4GHz. Phenom II processors as high as 3.7GHz, and six-core Phenom IIs as high as 3.3GHz. Why? Because that's what the game needs if it uses DIRECT X 9.
As far as I'm concerned, XFlyingBeeX, please DO NOT respond to a topic unless you aim to keep the thread ON TOPIC and about GUILD WARS 2, because holy sweet merciful mother, ArcheAge is so amazing, so advanced, wow, such AMD FX, with its CryEngine platform. What's that going to do for Guild Wars 2? ABSOLUTELY NOTHING. This game started its development journey way back in 2007, when DirectX 10 was only accessible via Windows Vista, and we all know how THAT went down. DirectX 11 wasn't any more than a drawing-board concept then, let alone 11.1 or 11.2, or DX12, which will probably be much the same as 11.2 and which we probably won't even need.
As far as I'm concerned, this game is not going to get a new engine, a new API, or some other act of God that will change how it runs. It's up to the developers whether they want to change it or not. So if you want better optimization for your AMD FX-6300, and you continue to blindly insist that your FX-6300 3.3GHz triple-module half-6-core processor can beat any CPU Intel can dish out, then start doing it in a way that's more constructive and useful than comparing things GW2 is certainly not going to get, because its incoming resources are not substantial compared to a game like LotRO or WoW with their monthly subscriptions.
Ahem.
As far as I'm concerned, this game is not going to get a new engine, a new API, or some other act of God that will change how it runs.
You are probably right. I feel it was a mistake to build the engine on DX9. I can understand how it was a good idea financially (I think many people still use Windows XP), but more than likely those people don't upgrade their hardware often and are running ancient graphics cards and CPUs.
Long-term performance is a difficult question for GW2. DX9 is sort of a dead-end API: it doesn't have the multithreading features DX11 has (see the sketch at the end of this post), so performance will hit a brick wall. The choice is either to let GW2 keep performing badly or to rewrite the engine for a modern API.
Something like Mantle would be ideal, but I definitely don't see it happening – the number of users who could make use of it is too small. So the obvious solution would be DX11 or higher. Like you said, though, I don't foresee that happening either; it costs a lot of resources to rewrite an engine.
The only way I see it happening is if ANet decided to put GW2 on consoles. Then we might see some multithreaded features ported over to the PC version. But again, that is just fantasy. I don't see that happening either.
I think we'll be stuck with slightly better performance than we have now, and in the future, when we have CPUs more advanced than today's, you will see many players complain about how badly GW2 performs. Similar to what happened to Lineage 2: it's based on the Unreal 2003 engine and has aged poorly. My GeForce 3 and Athlon 2500 ran it at 20 FPS; my current rig (listed in my sig) runs it at 20 FPS.
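For anyone curious what "multithreading features" means concretely: DX11 lets worker threads record draw commands on deferred contexts and hand finished command lists back to the immediate context, which DX9 has no equivalent of. A minimal sketch of the shape of that API (device setup only, no actual draws):

```cpp
// DX11 multithreaded rendering skeleton: a worker thread records
// commands into a deferred context; the render thread replays them.
// Under DX9, every draw call has to go through the one device thread.
// Link against d3d11.lib.
#include <windows.h>
#include <d3d11.h>
#include <thread>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;

    // Hardware device plus the immediate context (the "main thread" path).
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &immediate))) {
        std::puts("D3D11CreateDevice failed");
        return 1;
    }

    ID3D11CommandList* cmdList = nullptr;

    // Worker thread: records commands on its own deferred context.
    std::thread worker([&] {
        ID3D11DeviceContext* deferred = nullptr;
        device->CreateDeferredContext(0, &deferred);
        // ... record state changes / draw calls on `deferred` here ...
        deferred->FinishCommandList(FALSE, &cmdList);  // bake a command list
        deferred->Release();
    });
    worker.join();

    // Render thread: replays the pre-recorded commands in one cheap call.
    if (cmdList) {
        immediate->ExecuteCommandList(cmdList, FALSE);
        cmdList->Release();
    }
    immediate->Release();
    device->Release();
    return 0;
}
```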
Windows 10
I had a lot of wonderful performance experiences with an AMD FX-8350 in a lot of my DirectX 9 based games when I was using one. Granted, they are not MMOs, but I still think performance can be enhanced for them, which is likely about the best that CAN be done. Sure, DX9 might make that harder, but are the option and the resources really there to move to an easier API?
Myself, however, I have no need to think about it now, since I decided to purchase an i7 4770K and a new board for my machine, and I've seen performance increases in every game I have played since. I can't see myself recommending AMD processors unless someone wants a very budget-friendly computer for games other than MMOs.
“The FX8350 would cap at 65 FPS on lowest settings using an R7 260X.”
If that's true, then something was wrong with your system. My FX-60 (yep – Socket 939) and 8800GT can beat that. If you need proof, you'll have to wait until tomorrow so I can grab the screenshots from my other laptop (not hooked up at the moment).
As far as I'm concerned, XFlyingBeeX, please DO NOT respond to a topic unless you aim to keep the thread ON TOPIC and about GUILD WARS 2. Ahem.
First, I never said that AMD is faster! I said it is not that bad! Yes, Intel is faster – let's say about 30-50% faster.
CryEngine is better for all CPUs, not only AMD FX…
The difference between DirectX 11 and DirectX 11.2 is HUGE – look at BF4.
What does ArcheAge have to do with GW2? More than you think… (it is still in beta). Why do you have to buy an i5 K and OC it as far as possible to get 'good FPS'?
Yes, many users will try new MMOs… ArcheAge is one of the first to use CryEngine and support 4 cores. Every year there are a lot of new MMOs!
If you think performance is not something users care about, you're wrong…
My FX-60 (yep – Socket 939) and 8800GT can beat that.
Record it at 1080p, I will be waiting.
If that's true, then something was wrong with your system.
Socket 939 and AM2+/AM3 (non-FX) chips were built for per-core performance; that was the era they were designed in. The fact remains that the FX (Bulldozer-and-later) chips have weaker per-core performance than their predecessors.
And there is nothing wrong with my FX8350 system. It's now my media-center system (upgraded from a 635 X4), and it plays all my other titles on high at 1080p at 90-120 FPS without a hitch – except GW2. But I don't play that on my 65" TV.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
I think that right now, if you put a Phenom II X4 965 against an FX 8350 in GW2, the FX may actually be faster.
I can only say that an FX 6300 at 4.5GHz is faster than an E8500 at 4.5GHz in this game.
Also, this shows that the FX is faster than the Phenom on DX9:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/57615-amd-vishera-fx-6300-fx-4300-review-11.html
I know I got better performance in Defiance (also an MMO) during huge player-population events with my old Phenom II 975 than I did with my FX-8350. I would assume it'd be the same for GW2.
What about RAM OC? Does it help?
No. RAM OC is directly tied to FSB OC; if you don't need to OC your FSB, don't OC your RAM, as the benefits just aren't there.
The exception to this rule is RAM that is meant to be OCed because it is out of spec (1866OC, 2100OC, etc.). In that case, you sometimes must OC the RAM to run it at the advertised spec.
Example:
My 1866MHz RAM had to be OCed in my AMD system because the board natively topped out at 1600MHz; however, it supported RAM up to 2100 via OC.
Whereas my Intel system natively supports 1866-2100 without any OC.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
The only time you should really OC RAM for gaming purposes is if you're using an AMD APU-based system, where OCing the RAM also feeds the onboard GPU faster and can result in significant FPS gains.
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asux Xonar D2X & Logitech Z5500 Sound system |
Well, that's true. And with the new line of APUs and those 8xxx-series GPUs in them, OCing the RAM really helps.
But if you have discrete graphics, then I still suggest not doing it.
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD
Just asking… because I OC my RAM and sometimes I see improvements, sometimes not.
http://linustechtips.com/main/topic/70824-bf4-ram-speed-benchmarks/
Totally different game… just asking whether anyone has noticed an improvement from RAM OC.
The most noticeable improvement you'd see from RAM, I think, comes from lowering the CAS latency. DDR3-800 CL6 can keep up fairly well with DDR3-3100 CL12/CL13 when it comes to games; in rated benchmarks there was roughly a 2.5 FPS difference.
Is Guild Wars 2 compiled with an Intel compiler using Intel-specific optimizations?
If I recall right, I'm pretty sure it's built with the generic VC++ compiler and not Intel's (I checked the executable with some methods on Linux). I made a post/thread about it at some point.
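For anyone who wants to repeat the check: compilers and linkers tend to leave recognizable banner strings in the binary, which is roughly what `strings` plus `grep` finds on Linux. A portable sketch of the same idea – the banner substrings below are common examples I'd look for, not an exhaustive or guaranteed list:

```cpp
// Scans a binary for printable-ASCII runs (like the Unix `strings`
// tool) and prints any that contain well-known compiler/linker
// banners. The substrings are common examples, not exhaustive:
// Intel's compiler typically embeds "Intel(R)" banners, while
// MSVC-built binaries often carry Microsoft runtime/linker strings.
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) { std::cerr << "usage: whichcc <binary>\n"; return 1; }

    std::ifstream f(argv[1], std::ios::binary);
    std::string data((std::istreambuf_iterator<char>(f)),
                     std::istreambuf_iterator<char>());

    const std::vector<std::string> banners = {
        "Intel(R) C", "Intel(R) Compiler",      // Intel compiler
        "Microsoft (R)", "MSVCR", "VCRUNTIME",  // MSVC toolchain / runtime
        "GCC: (", "clang version"               // for completeness
    };

    std::string run;  // current printable-ASCII run
    for (char c : data) {
        if (c >= 0x20 && c < 0x7f) { run += c; continue; }
        if (run.size() >= 6)
            for (const auto& b : banners)
                if (run.find(b) != std::string::npos)
                    std::cout << run << '\n';
        run.clear();
    }
    return 0;
}
```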
The most noticeable improvement you'd see from RAM, I think, comes from lowering the CAS latency.
When it comes down to memory, nothing is as effective as tightening the CL timings as far as you can while staying stable. (I had my Vengeance 1600MHz clocked down to 1480MHz at CL6, and it was faster than most of the other RAM I was benching against. But it wasn't really stable – it would throw memory errors and BSOD Windows every now and then.)
The only real exception is iGPUs, which rely on memory speed for their rendering jobs.
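For reference, true latency in nanoseconds is roughly CAS cycles × 2000 / (MT/s), and you can measure the end-to-end effect with a dependent pointer chase, where each load has to wait for the previous one. A rough sketch, not a calibrated tool:

```cpp
// Dependent pointer chase: each load's address comes from the previous
// load, so the CPU can't overlap them, and time-per-hop approximates
// round-trip memory latency - the thing CAS timings feed into.
// Results vary with TLB and prefetcher behavior.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const size_t N = 1 << 24;  // 16M entries * 8 B = 128 MB, far past the caches
    std::vector<size_t> next(N);

    // Build one random cycle through the array so the chase visits
    // every slot in an unpredictable order.
    std::vector<size_t> order(N);
    std::iota(order.begin(), order.end(), size_t{0});
    std::shuffle(order.begin(), order.end(), std::mt19937_64{42});
    for (size_t i = 0; i + 1 < N; ++i) next[order[i]] = order[i + 1];
    next[order[N - 1]] = order[0];

    const size_t hops = 20000000;
    volatile size_t idx = order[0];
    size_t p = idx;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t i = 0; i < hops; ++i) p = next[p];  // serial dependency chain
    auto t1 = std::chrono::steady_clock::now();
    idx = p;  // keep the chase from being optimized out

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
    std::printf("~%.1f ns per dependent load\n", ns);
    return idx == 0;
}
```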
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD