What is being done about AMD CPU performance?
in Account & Technical Support
Posted by: VirtualBS.3165
Seems like I will have to post my specs. Before this topic completely derails, however, a caveat: parts are already ordered. As far as OC goes, I might bump my proc to 4GHz, but no higher.
Mobo: Gigabyte GA-990XA-UD3
Proc: AMD FX-4100 Zambezi @3.6 GHz AM3+
Graphics: Sapphire Radeon HD6850
Memory: 16GB (4 × 4GB) Corsair Vengeance
HD: Corsair Force Series 120 GB SSD with 2 TB spare drive
PSU: Cooler Master Silent Pro M700
As far as cooling:
2x 200mm fans
1x 120mm fan
Case is Cooler Master Storm Enforcer SGC-1000-KWN1
Thoughts?
If you plan to play GW2 a lot, there is currently no advantage in going AMD.
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268-7.html
A Core i3 @ 3GHz (heck, even a Pentium G860 @ 3GHz that costs $90) will give you the same performance as an FX-4000 @ 4GHz.
AMD FX-4100 Vs. Intel Core i3-2100: Exploring Game Performance With Cheap GPUs
http://www.tomshardware.com/reviews/fx-4100-core-i3-2100-gaming-benchmark,3136.html
Want a $500 PC for GW2? Get this instead:
http://www.tomshardware.com/reviews/gaming-pc-overclocking-pc-building,3273.html
As you can see in the GW2 benchmark, both the Pentium G860 and the GTX 560 fare pretty well for their price points.
(edited by VirtualBS.3165)
in Account & Technical Support
Posted by: VirtualBS.3165
Thoughts?
“shoulda gone to …intel” :P
But seriously. Step away from AMD. Intel is by far the better choice when it comes to price/performance ratio.
AMD holds the price/performance ratio. Quit spreading misinformation.
Not on GW2.
in Account & Technical Support
Posted by: VirtualBS.3165
The big problem with the FX series is really the architecture used; deltaconnected.4058 explained that correctly.
Thankfully, Steamroller will have dedicated op decoders for each core, among other improvements:
http://www.anandtech.com/show/6201/amd-details-its-3rd-gen-steamroller-architecture
However… by 2013 we’ll also have Haswell, which’ll be a PITA for AMD to beat:
http://www.anandtech.com/show/6263/intel-haswell-architecture-disclosure-live-blog
in Account & Technical Support
Posted by: VirtualBS.3165
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268.html
(Update: Because we tested the game’s beta client in order to get this story ready in time for launch, the Guild Wars 2 lead engine programmer, Chad Taylor, dropped us a line to let us know that the game was updated with performance optimizations in the final build. One key change was putting the renderer in its own thread so that blocking driver calls wouldn’t create stoppages in the main game loop. He mentioned that this change should make a notable difference on machines with four or more CPU execution cores, and that a few of the graphics preset options were also tweaked.
As a result of these changes, we’d like to revisit Guild Wars 2 in the near future to re-benchmark CPU performance and update the driver settings images with examples from the full release. Stay tuned for the update, due in mid-September.)
(Update Sept 27: We’ve re-tested the game with the release client and we’re not seeing a notable performance difference. The Core i5 gained a few FPS (as did the AMD FX-4100, to a lesser extent), but all of the other results remain similar to our published numbers. NCSoft let us know that they’re working with AMD to improve FX-series CPU performance, and if this happens in the near future we may revisit Guild Wars 2 with new benchmarks.)
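To picture what “putting the renderer in its own thread” means, here is a minimal sketch of the general pattern (this is not ArenaNet’s code; FrameData, UpdateSimulation, SnapshotState, and SubmitToDriver are illustrative stand-ins): the game loop publishes frame snapshots and the render thread consumes them, so a blocking driver call stalls only the render thread, not the simulation.

// Minimal sketch of a decoupled render thread (not ArenaNet's actual code).
#include <atomic>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct FrameData {};                          // snapshot of drawable game state

static std::queue<FrameData> g_frames;        // game loop -> render thread
static std::mutex g_mutex;
static std::condition_variable g_cv;
static std::atomic<bool> g_running{true};

static void UpdateSimulation() {}             // hypothetical: input/AI/physics/network
static FrameData SnapshotState() { return FrameData{}; }
static void SubmitToDriver(const FrameData&) {} // hypothetical: Draw*/Present calls

static void RenderThread() {
    while (g_running) {
        FrameData frame;
        {
            std::unique_lock<std::mutex> lock(g_mutex);
            g_cv.wait(lock, [] { return !g_frames.empty() || !g_running; });
            if (!g_running) return;
            frame = g_frames.front();
            g_frames.pop();
        }
        SubmitToDriver(frame);                // if the driver blocks here,
    }                                         // only this thread waits
}

static void GameLoop() {
    while (g_running) {
        UpdateSimulation();                   // never stalled by the driver
        {
            std::lock_guard<std::mutex> lock(g_mutex);
            g_frames.push(SnapshotState());
        }
        g_cv.notify_one();
    }
}

int main() {
    std::thread render(RenderThread);
    GameLoop();                               // runs forever in this sketch
    render.join();
    return 0;
}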
in Account & Technical Support
Posted by: VirtualBS.3165
Nope, no crashes or glitches yet. I’ve just tested looking at the lion statue in Lion’s Arch coming from the Mystic Forge, and HyperFormance + Virtual V-sync was around 71fps vs 63fps with just Virtual V-sync enabled.
I can tell you for sure that Virtual V-sync is really worth it. It beats everything on the market right now. It’s even better than the usual no-input-lag trick of regular v-sync + triple buffering (actually D3D frame queuing) + a framerate cap at monitor Hz minus 1. And let’s not even discuss nVidia’s adaptive v-sync…
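For reference, that “monitor Hz minus 1” cap is just a frame limiter. Here is a minimal sketch of the idea, assuming a 60Hz monitor (RenderFrame is a hypothetical stand-in; real limiters also busy-wait the last fraction of a millisecond because sleep granularity is coarse):

// Frame limiter capping at 59 FPS for a 60 Hz monitor.
#include <chrono>
#include <thread>

static void RenderFrame() {}                  // hypothetical: simulate + draw one frame

int main() {
    using clock = std::chrono::steady_clock;
    // Capping just under the refresh rate keeps the GPU from queuing frames
    // ahead of v-sync, which is where the input lag comes from.
    const auto budget = std::chrono::microseconds(1000000 / 59); // ~16.95 ms
    auto next = clock::now();
    for (;;) {
        RenderFrame();
        next += budget;
        std::this_thread::sleep_until(next);
    }
}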
Dust, cigarette smoke (mixed with dust), and pet hair are common culprits of overheating components, because they form a dense layer over heatsinks and fans.
External-exhaust GPUs suffer badly from this; you really have to disassemble them and clean the inside of the exhaust.
HPET affects FPS and HIDs. With it turned on, a lot of people get a bigger delay on their mouse control (some are less sensitive to this); it usually makes for a slightly more stable framerate, but a lower one overall.
HPET has been causing these issues for quite a few years now. For instance, when playing WAR, the big tip was always to turn it off.
As far as I know, there are 4 possible combinations for timer usage: HPET on in the BIOS with useplatformclock set in Windows (the bcdedit /set useplatformclock true trick), HPET on in the BIOS without it, and the same two with HPET off in the BIOS.
Which of the first 2 (the ones with HPET on in the BIOS) were you referring to, for the slowdowns?
in Account & Technical Support
Posted by: VirtualBS.3165
I’ve been using Virtu MVP (HD3000 + GTX 560 Ti, in “i” mode, with the monitor connected to the iGPU via HDMI), with both Virtual V-sync and HyperFormance, and I can honestly say that it works 100% in GW2.
A nice benefit of using Virtu in iMode is that the iGPU renders the Windows desktop, leaving the dGPU’s memory 100% free (i.e., you get an even bigger benefit than disabling “desktop composition” for memory-hungry games).
(edited by VirtualBS.3165)
Game engine and/or software issues between GW2 and drivers.
Other MMOs have these issues as well. When the game engine, which runs fine most of the time, comes across something it just can’t sort through efficiently enough (like LOADS of players in the same area), it slows down, making your computer wait for the information.
Exactly. It is also sometimes related to network issues. If the client isn’t receiving network data fast enough, it will also slow down, which will be reflected in reduced CPU and GPU usage (both are waiting for data from the network).
If you are crashing all the time, please test your system with this method.
Get 3DMark11 and test your system. This way you can be sure whether you’re getting framerates similar to other people with systems like yours.
Check this thread, and try the linked test please.
Have you tried unlocking your CPU cores?
If you do, use this to test whether it’s stable.
in Account & Technical Support
Posted by: VirtualBS.3165
A BSOD during the install points to RAM, PSU, or HDD-controller-related problems. I would suggest starting with the RAM; test it with this: http://www.memtest.org/.
Were you doing anything else while installing the game, like watching videos or listening to music?
Use this to analyse the BSODs.
You should also do this for a stability check.
(edited by VirtualBS.3165)
You forgot the most important part: the heart! What’s your PSU?
in Account & Technical Support
Posted by: VirtualBS.3165
Guys, please try the newest nVidia betas, 306.63, and post your results in that thread also:
https://forum-en.gw2archive.eu/forum/support/tech/New-nVidia-driver-306-63-Beta
Since launch, GW2 has had numerous problems with nVidia drivers, especially in shader processing, sprites, texture mapping, and DX9 memory spooling. I’m sure all of these will be addressed by driver updates and/or game patches.
You should check this for performance estimates with different kinds of hardware:
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268.html
in Account & Technical Support
Posted by: VirtualBS.3165
First off, I did not say running in legacy mode wasn’t horrible, nor did I recommend it. You made a blanket statement that was untrue. Why you bother posting one article about a manufacturer screwing up and having to release new firmware to fix their screw-up is beyond me. The fact is, TRIM works in legacy mode.
To show that “supposed to work” is very different from actually working 100% as intended in all scenarios.
My statement was “An SSD has to be in AHCI mode to enable NCQ and TRIM”. I’m so sorry, then, for making a blanket 50% inaccurate statement!
Pagefile: show me some benefits, or even logical reasons, why you’d need it on a 64-bit high-memory system. I’ve done plenty of research and benching.
There are several important reasons for keeping a pagefile. You can start reading about them here:
http://blogs.technet.com/b/markrussinovich/archive/2008/11/17/3155406.aspx
We could go back and forth all day about this nonsense; that doesn’t change my original point: neither of these things has anything to do with how well or how badly GW2 runs.
They did for the OP, and he wanted to share his experience to help others.
If you’re on a budget ($500), get this:
http://www.tomshardware.com/reviews/gaming-pc-overclocking-pc-building,3273.html
Game doesn’t even use OpenGL. Not worth messing with if the only thing that changed was adding support for a new revision of OpenGL.
306.23: branch r304_70-93
306.63: branch r304_83-34
nVidia never changes only one thing. These drivers are from a newer branch (83) than the latest WHQL (70).
Like I said, I saw a performance increase on my card.
in Account & Technical Support
Posted by: VirtualBS.3165
To clarify my original post: The single fix that made the game playable was to return my FSB to its default multiplier. This is a verifiable and repeatable fix on my system.
You now have HPET turned on in Windows (the useplatformclock trick) and in the BIOS, correct?
Try changing the FSB again and see if the effects return. In theory, those effects were due to TSC-related issues, so you should be immune to them now.
How come they’re not on the nVidia website? ;/
There you go!
http://developer.nvidia.com/opengl-driver
in Account & Technical Support
Posted by: VirtualBS.3165
There is no guarantee that all SATA controller drivers will transmit TRIM when working in legacy (IDE) mode, or even that all SSDs accept TRIM in legacy mode. For example:
http://thessdreview.com/daily-news/latest-buzz/new-sandisk-extreme-ssd-firmware-fixes-trim/
Also, the default MS SATA driver in Windows 7 only transmits TRIM when in AHCI mode (msahci.sys), unless you specifically update your SATA controller drivers.
Running SSDs in legacy mode is horrible anyway, as you are missing the most important AHCI feature for SSDs: NCQ.
http://www.tomshardware.com/reviews/ssd-performance-tweak,2911-2.html
You want AHCI turned on because it is the specification by which Native Command Queuing (a SATA-specific technology) is enabled. SSDs boast incredibly fast response times, so they realize their best performance when fielding multiple commands simultaneously, consequently benefiting from the parallelism that defines most SSD architectures. This is precisely the reason we see better benchmark performance when we use queue depths of up to 32 versus a queue depth of one.
The benefit of having the pagefile split across several drives did not end with XP/2000. Do your research before you post.
(edited by VirtualBS.3165)
in Account & Technical Support
Posted by: VirtualBS.3165
Let’s clear up one thing about pagefiles.
First, there is no penalty in having one, but there is a benefit: when Windows runs out of VM space, the pagefile is there to stop it from mindlessly closing programs at will.
Second, there is a benefit to having pagefiles on different drives (not partitions, drives), since Windows will always use the one on the idlest drive, as a strategy to minimize any performance impact.
@VirtualBS.3165 Sure, that’s all true; HPET is the superior method for timing things in applications, period. However, we are discussing whether a more precise timer can have any FPS impact at all in a game engine, and here the answer is no, since it doesn’t affect any performance-relevant game engine subsystem.
Counter precision can affect a lot of game engine subsystems. Try doing physics on a bad counter! (A rough illustration follows below.)
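To illustrate why: a fixed-timestep physics loop trusts the timer to measure real elapsed time, so a coarse or backwards-jumping counter corrupts dt and everything downstream. A rough sketch (StepPhysics is a hypothetical stand-in; steady_clock plays the part of a well-behaved counter):

// Fixed-timestep physics driven by a monotonic timer.
#include <chrono>

static void StepPhysics(double /*dt*/) {}     // hypothetical integrator step

int main() {
    using clock = std::chrono::steady_clock;  // stands in for a good counter
    const double fixed_dt = 1.0 / 120.0;      // simulate at 120 Hz
    double accumulator = 0.0;
    auto last = clock::now();
    for (;;) {
        auto now = clock::now();
        // If the counter is imprecise or non-monotonic, this delta is wrong,
        // the slice count below is wrong, and motion stutters or explodes.
        accumulator += std::chrono::duration<double>(now - last).count();
        last = now;
        while (accumulator >= fixed_dt) {
            StepPhysics(fixed_dt);            // consume real time in fixed slices
            accumulator -= fixed_dt;
        }
    }
}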
in Account & Technical Support
Posted by: VirtualBS.3165
Pagefile with 16GB of RAM, huh? You don’t even need a pagefile with 8GB. Also, having an SSD doesn’t matter except for load times, and whether that SSD is in AHCI or IDE barely matters for that.
An SSD has to be in AHCI mode to enable NCQ and TRIM.
(edited by VirtualBS.3165)
These seem smoother than the 306.23 WHQL on my GTX 560 Ti; I’m no longer seeing 100% GPU utilization in some places where it previously happened.
Has anyone tested these issues with the latest drivers (the 306.63 betas)?
Best Graphics Cards For The Money: September 2012
http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107.html
in Account & Technical Support
Posted by: VirtualBS.3165
The AMD FX (Bulldozer) Scheduling Hotfixes Tested
http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested
AMD’s FX-8150 After Two Windows 7 Hotfixes And UEFI Updates
http://www.tomshardware.com/reviews/windows-7-hotfix-bulldozer-performance,3119.html
Sneak Peek: AMD’s Bulldozer Architecture On Windows 8
http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-23.html
It seems Windows 8 will bring the correct handling of these issues for Bulldozer. The FPS gain in Windows 8 is pretty nice.
(edited by VirtualBS.3165)
Have you tried any of this yet?
in Account & Technical Support
Posted by: VirtualBS.3165
This machine plays the game fine and that’s all that matters.
The thread OP’s question has been answered then; time to close it!
Everyone, please stop falling for this stuff!
in Account & Technical Support
Posted by: VirtualBS.3165
Best Performance preset:
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268-4.html
Take your pick!
Best Graphics Cards For The Money: September 2012
http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107.html
Also, comparison when using the Best Appearance preset:
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268-6.html
Reseating is taking the card out and putting it back in the same slot, or in a different slot, making sure that the slot is clean and that the metal connectors are free of oxidation.
On some motherboards you can only put the graphics card in the first PCI-E slot, since they tend to run everything in the other slots at PCI-E x8 max, or worse. It depends on the number of PCI-E lanes the processor provides or the motherboard supports, and on whether bridge chips are used.
in Account & Technical Support
Posted by: VirtualBS.3165
@Swedemon.4670, nice find. Any AMD Bulldozer user should make sure they’re using both of those fixes: the first one is aimed at the core-parking issue, and the second at the Windows scheduler.
@Home Style.9640, true, if you don’t pay your electricity bill
Running a lot of cooling as well. I can show benches if needed, but I guarantee I’m in the low range on both GPU/CPU.
You are really only on the low end on the CPU side; your GPU is more than capable of running GW2 at the “Best Appearance” preset when paired with a good enough CPU:
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268-6.html
in Account & Technical Support
Posted by: VirtualBS.3165
@Fu Ling U.6918, there you go!
(edited by VirtualBS.3165)
It really depends on storage media usage. Anything that reads the drive a lot (like Windows loading up) will benefit hugely, while apps that rely more on CPU processing won’t benefit as much.
While using an SSD on a SATA 2 port will reduce its throughput, SSDs are not only about MB/s: most of the latest SSDs have an access time of around 0.05ms vs ~12ms for the best 7200rpm HDDs; that’s 240x faster!
@AndyPandy.3471, thanks for raising some significant issues.
Yes, you are correct about the function call time; it is mostly irrelevant (on the order of ~0.72µs for QPC with HPET on, and ~0.0362µs with TSC). Function precision is, however, another story: QPC has a precision of ~0.0698µs with HPET on and ~0.427µs with TSC.
In the end, either should make a reliable counter, if those were the only aspects to it. However, HPET is more immune to several problems where the TSC fails miserably. If you do a Google search for useplatformclock usepmtimer, there are several topics on this issue.
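If you want to reproduce call-cost and precision numbers like the ones above on your own machine, here is a rough Windows-only sketch using the real QueryPerformanceCounter/QueryPerformanceFrequency APIs (the measurement method is mine, not taken from those threads). Run it with and without useplatformclock set to compare HPET vs TSC:

// Measure QPC tick resolution and average call cost.
#include <windows.h>
#include <cstdio>

int main() {
    LARGE_INTEGER freq, t0, t1, dummy;
    QueryPerformanceFrequency(&freq);         // QPC ticks per second
    printf("QPC tick: %.4f us\n", 1e6 / (double)freq.QuadPart);

    const int N = 1000000;                    // average over a million calls
    QueryPerformanceCounter(&t0);
    for (int i = 0; i < N; ++i) QueryPerformanceCounter(&dummy);
    QueryPerformanceCounter(&t1);

    double secs = (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart;
    printf("QPC call cost: %.4f us\n", 1e6 * secs / N);
    return 0;
}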
(edited by VirtualBS.3165)
in Account & Technical Support
Posted by: VirtualBS.3165
Bottom line: if you have 4GB of RAM, there’s no valid reason not to run a 64-bit OS, especially since any Windows license entitles you to both the 32-bit and 64-bit versions of the OS.
I think everything was done properly, and Windows is booting up crazy fast compared to the HDD. But GW2 loading times seem unaffected. Shouldn’t I see a performance increase? Is GW2 just not optimized for SSDs yet?
It depends on how much GW2 reads the disk during those loading times. Were you seeing GW2 read the disk a lot (HDD LED on almost 100% of the time)? Copy the game back to the HDD and run it from there; does the HDD read non-stop?
If not, then the storage medium is not your bottleneck; your CPU+RAM is (probably your CPU, since it’s a dual-core i3 with Hyper-Threading).
in Account & Technical Support
Posted by: VirtualBS.3165
Well, I just went into my BIOS and found nothing about either Vcore or LLC.
There was some sort of “core unlocker” that was disabled, which made it sound like my CPU wasn’t being used to its full potential; however, it warned that the computer could become unstable if enabled. I assume this option is just a different way to overclock? Anyway, I am lost as to where to configure either the Vcore or LLC.
That “core unlock” feature is made for the X2 and X3 CPUs, to turn them into X4s (they are all identical at the hardware level; some just have cores disabled, either because those cores are effectively defective or purely for market segmentation). Since you have an X4, it’s of no use to you.
What is your motherboard manufacturer/model?
After playing Batman AA and AC with PhysX on High, I could never go back to ATI. :P
Even though I haven’t tried it yet, Borderlands 2 also shows great effects with PhysX:
http://www.geforce.com/whats-new/articles/borderlands-2-physx/
In the end, any hardware can be made great, but ultimately it’s the software, the “experience”, that counts.
ATI has the most efficient GPUs, no doubt there; their execution at the hardware level is impressive. nVidia also makes good hardware, but they really win on the software front: PhysX, CUDA, and 3D Vision were their best bets.
in Account & Technical Support
Posted by: VirtualBS.3165
@shastawong.3486
Did you check with Resource Monitor if some cores were effectively parked (or oscillating) during the tests?
in Account & Technical Support
Posted by: VirtualBS.3165
On a 64-bit OS, 32-bit programs with LAA are still limited to a maximum of 4GB of memory usage.
For a practically unlimited maximum, the executable needs to be 64-bit.
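If you’re curious, a quick way to see those limits from inside a process is the standard Win32 GlobalMemoryStatusEx call, which reports the total user address space; a minimal sketch (the numbers depend on your OS and the executable’s build flags): a 32-bit build without LAA reports ~2GB, a 32-bit LAA build on a 64-bit OS reports ~4GB, and a 64-bit build reports terabytes.

// Print the user-mode virtual address space available to this process.
#include <windows.h>
#include <cstdio>

int main() {
    MEMORYSTATUSEX ms = { sizeof(ms) };       // dwLength must be set first
    GlobalMemoryStatusEx(&ms);
    printf("Total virtual address space: %.2f GB\n",
           ms.ullTotalVirtual / (1024.0 * 1024.0 * 1024.0));
    return 0;
}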
I still don’t understand why using a different timer would notably improve game FPS. While changing the timer you get higher precision, and the timer call itself may cost fewer cycles, but those timers are only used a couple of dozen times per frame, so there should be no measurable FPS gain?
http://www.nvidia.com/object/timer_function_performance.html
The time required to return from the various timeGetTime(), GetTickCount(), QueryPerformanceCounter(), and special assembly _emit 0x0F timing methods can vary by a factor of 150.
It all depends on the function used. For all we know, they could be called thousands of times per frame. At 60Hz, each frame is ~16.6ms (1.66×10^-2 s); most of those timer functions have precisions around 10^-8 to 10^-7 s. Even in the worst case (10^-7 s), they could be called 166,000 times per frame if need be.
The 7770 is the best GPU at its price point: http://www.tomshardware.com/reviews/geforce-gtx-660-geforce-gtx-650-benchmark,3297.html
It sits between the GTX 650 and the GTX 660, so nVidia is releasing a 650 Ti (made from GK106 silicon) aimed to go head-to-head with the 7770. http://www.techpowerup.com/172572/NVIDIA-GeForce-GTX-650-Ti-Specifications-Detailed.html
Still, the 7770 seems the best bang for the buck at its price point.
in Account & Technical Support
Posted by: VirtualBS.3165
@Kamatsu.8206, please check this post:
https://forum-en.gw2archive.eu/forum/support/tech/Black-screen-and-not-responding-This-might-help/279092
Your GPUs seem to be running too hot. Try doing a custom fan profile for them.
This one also seems a good choice; it comes with a GTX 660M:
http://www.techpowerup.com/172567/MSI-Announces-Upgraded-GE60-and-GE70-Gaming-Notebooks.html