nVidia Surround UI Placement/Cutscenes/Character Selection
in Account & Technical Support
Posted by: deltaconnected.4058
Yea, windowed hasn’t changed. I’ve never seen my 580s dip below 50% usage, so I’m surprised you’re able to pull that off on a single card.
in Account & Technical Support
Posted by: deltaconnected.4058
Didn’t see anything mentioned in today’s patch notes, but the screenshots speak for themselves. No more sumo characters, and party/chat is centered. Haven’t had a chance to find a cutscene yet, but I’ve got a feeling those are covered too.
Dunno about the beer, but I’ll buy gem-equivalent.
+1 for Bill.
in Account & Technical Support
Posted by: deltaconnected.4058
I never said “intel 4 lyfe” or “AMD sux balls”. I gave the main reason why the Bulldozer architecture was a huge flop and linked an article that explains it in more detail. Are you gonna say that every major reviewer is also an Intel fanboy for stating the same things I did?
As for the benchmarks, I saw a linear increase going from 3.2GHz/29fps to 4.7GHz/39fps in a controlled test (the same scene reloaded via logout/login, overlay in the far top-right). This doesn’t mean that TH benchmarked the game wrong; I just can’t reproduce the results they got on my hardware with regard to OC’ing a 2500k.
in Account & Technical Support
Posted by: deltaconnected.4058
You’re really just stating the obvious and letting the point go over your head. All this information was described in the post above you. Now quit riding Intel…
“start /AFFINITY 55 /B /WAIT c:\path\to\gw2.exe”
Your 8core BD should now run about the same as a similarly clocked 4core Phenom II. Change 55 to 2A for the FX-6000s and 05 for the FX-4000s, but GW2 has enough threading to need a quad rather than a tri- or dual-core CPU, so those two will most likely run worse with this.
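For the curious, that 55 is a hex bitmask: 0x55 = 01010101 in binary, i.e. every other logical core, which gives one thread slot per BD module (0x2A = 00101010 and 0x05 = 00000101 work the same way). If you’d rather set it from code than from a shortcut, here’s a rough C++ sketch of my own (nothing ANet ships) using the Win32 call that does the same thing:

// affinity_sketch.cpp - pin the current process to every other logical core.
// Same effect as "start /AFFINITY 55"; build with any Win32-capable compiler.
#include <windows.h>
#include <cstdio>

int main()
{
    // 0x55 = 01010101b -> logical cores 0,2,4,6 (one per module on an FX-8000)
    // 0x2A = 00101010b -> cores 1,3,5 for an FX-6000
    // 0x05 = 00000101b -> cores 0,2 for an FX-4000
    DWORD_PTR mask = 0x55;
    if (!SetProcessAffinityMask(GetCurrentProcess(), mask))
    {
        printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    printf("affinity mask set to 0x%llx\n", (unsigned long long)mask);
    return 0;
}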
in Account & Technical Support
Posted by: deltaconnected.4058
Once you know the benchmarks and technical details, yea. But the thought of saving $20 here ($40 MSRP), another $40-50 on the mobo, and seeing 4 vs 8 cores is too good to pass up for some people.
It’s not a perfect comparison, but it’s sorta like buying a car without test-driving it.
in Account & Technical Support
Posted by: deltaconnected.4058
AMD’s 8core BD: $180
Intel’s 2500k: $200
Classic case of “you get what you pay for” followed by over-ambitious assumptions of what your hardware is capable of.
One look at AMD’s architecture diagram says everything. Notice that each 2core “module” shares the same instruction decoder, L2 cache, and floating point unit. This makes an 8core BD much closer to a 4core CPU. It’s essentially reinventing the Pentium 4 with Hyper-Threading.
Worth a read if you’re interested in why Sandy is much better.
http://www.anandtech.com/show/5057/the-bulldozer-aftermath-delving-even-deeper
Larger cache may give you better performance in some things, but I doubt it’s more than a single-digit increase.
in Account & Technical Support
Posted by: deltaconnected.4058
GPU usage at 100%, has to share cycles with the desktop.
in Account & Technical Support
Posted by: deltaconnected.4058
Yea, figured as much. Had a little hope.
in Account & Technical Support
Posted by: deltaconnected.4058
The HD5000 series felt a lot more tolerant of heat than the HD6000s. My 5850s had no problems on OC runs at 80°C, though around 90°C they started to show artifacts (never a black screen). Still worth a shot to get something like Afterburner and aim for a fan curve that isn’t too loud but keeps them slightly cooler.
Asked about the sound part since I recall someone fixing it by playing around with their network settings (disabling jumbo frames, or some filters on the router).
From what I gather, there are three different black-screen problems: the usual hard lock caused by hardware through power/heat/driver, the program crash that you can still Ctrl+Alt+Del out of, and this one (a hard lock would stop sound completely or leave a very short sound loop).
in Account & Technical Support
Posted by: deltaconnected.4058
I don’t remember whether I set it myself or it shipped that way from ANet, and -repair doesn’t change my GW2.exe, so could someone else check whether the Large Address Aware bit is set? If it’s not, setting it might give 4.0GHz+ Sandys and Ivys a boost in LA and WvW.
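For anyone who wants to check without extra tools: the LAA flag is a single bit (IMAGE_FILE_LARGE_ADDRESS_AWARE, 0x0020) in the PE file header’s Characteristics field. Quick-and-dirty C++ sketch of mine; the path is just a placeholder:

// laa_check.cpp - report whether an EXE has the Large Address Aware bit set.
#include <windows.h>
#include <cstdio>

int main(int argc, char** argv)
{
    // placeholder path; pass the real one on the command line
    const char* path = (argc > 1) ? argv[1] : "C:\\path\\to\\Gw2.exe";
    FILE* f = fopen(path, "rb");
    if (!f) { printf("can't open %s\n", path); return 1; }

    IMAGE_DOS_HEADER dos;
    fread(&dos, sizeof(dos), 1, f);

    // e_lfanew points at the PE signature, followed by the file header
    fseek(f, dos.e_lfanew, SEEK_SET);
    DWORD sig = 0;
    IMAGE_FILE_HEADER fh = {};
    fread(&sig, sizeof(sig), 1, f);
    fread(&fh, sizeof(fh), 1, f);
    fclose(f);

    if (sig != IMAGE_NT_SIGNATURE) { printf("not a PE file\n"); return 1; }
    printf("Large Address Aware: %s\n",
           (fh.Characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE) ? "yes" : "no");
    return 0;
}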
in Account & Technical Support
Posted by: deltaconnected.4058
My go-to suggestion as of late is to run a benchmark (http://www.3dmark.com/3dmark11/download/) and compare your scores to the average for your hardware. If there are power or hardware problems, the score should reflect it, but I doubt that’s the reason.
“cursor won’t move, sound still plays and screen sometimes goes black (but not always)”. Elaborate on the sound part.
100% GPU sounds like it’s running at PCIe x1. Or the 100%-desktop bug is back in AMD’s drivers.
in Account & Technical Support
Posted by: deltaconnected.4058
That rig can play BF3 at maxed settings, AA included, and keep a steady 120FPS everywhere, even in the open parts of the B2K maps? I applaud you for accomplishing the impossible.
MW3, like WoW, runs on an engine that’s several years old. And you can’t compare shooters to MMOs.
http://www.3dmark.com/3dmark11/download/
and post the link to your results.
in Account & Technical Support
Posted by: deltaconnected.4058
If those suggestions don’t work, underclock.
in Account & Technical Support
Posted by: deltaconnected.4058
Ahhh, you’re right. Use Vantage instead: http://www.3dmark.com/3dmarkvantage/download/. 3DMark06 will be limited by the CPU.
in Account & Technical Support
Posted by: deltaconnected.4058
FPS won’t be incredibly high, but with everything on low it shouldn’t be single digits either. Still waiting on those 3dmark11 results, as I didn’t see anything unusual in those screenshots.
http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx is a much better tool than Windows’ default task manager (it has an option to replace it as well if you copy it to C:\Program Files). It’ll give you much more in-depth info, including threads.
Depending on what the system is doing, Windows can switch a thread from one core to another as many as several hundred times a second to distribute the load (there are ways to override this with affinity, but ANet didn’t set any). I’ve always assumed this is to reduce the amps per core, and therefore resistance and leakage, meaning less heat, but idk. Other OSes, like Linux, don’t do this by default.
As for why a single thread can’t use more than a single core: concurrency. In short, let’s say the next two things the thread will do are “x = x + y” followed immediately by “x = x + z”. If those statements were split across different cores, both would read the same old value of x and then try to store their own result at the same time. Whichever store lands, it won’t be the “x = x + y + z” that we want.
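If you want to see that lost-update problem for yourself, here’s a tiny C++ repro of mine (nothing to do with GW2’s actual code) where two threads hammer the same x with no synchronization:

// race_sketch.cpp - two threads do unsynchronized read-modify-write on x.
// Expected total is 2000000; most runs print less because updates get lost.
#include <thread>
#include <cstdio>

int x = 0; // shared, deliberately unprotected

void add_loop()
{
    for (int i = 0; i < 1000000; ++i)
        x = x + 1; // load x, add, store: another core can store in between
}

int main()
{
    std::thread a(add_loop), b(add_loop);
    a.join();
    b.join();
    printf("x = %d\n", x);
    return 0;
}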
GW2 itself has plenty of threads, but when you look at Process Explorer while playing, you’ll notice one doing a lot more work than the others. It’ll also be reported as using 1/(number of cores, physical or logical): 25% on a desktop i5, 12.5% on an i7 with Hyper-Threading, etc.
(edited by deltaconnected.4058)
Shadows are an nVidia-side problem (they work on all the HD7000 cards we have).
Threaded Optimization deals with http://msdn.microsoft.com/en-us/library/windows/desktop/ff476891%28v=vs.85%29.aspx and applies to DX11 only.
SLI does work, but for the moment the game relies solely on single-threaded CPU performance.
Dropping shadows from Ultra to High fixed the camera stutter for me (seems to affect nVidia only). 30FPS in large WvW fights and most of Lion’s Arch seems about right for your specs; nothing we can do about it until some minor improvements from ANet.
in Account & Technical Support
Posted by: deltaconnected.4058
And I’d wager that in the majority of cases where a new Ivy/Kepler or Ivy/SI rig is having trouble, it’s user error, and the other few are due to DOA hardware (dead or dying).
This “little benchmark”, combined with something like GPU-Z (to make sure your GPU is running at PCIe x8 or x16) and HWiNFO64, can give you a lot of insight as to why you’re only getting 4-5fps.
Dunno how you managed good temperatures without a fan plugged in, but glad you figured it out.
in Account & Technical Support
Posted by: deltaconnected.4058
@MaxTwelve: Interesting… Eyefinity? My char select screen looks quite a bit different (Surround). At least I can see my head and feet lol.
in Account & Technical Support
Posted by: deltaconnected.4058
Run a benchmark and post your results link here. http://downloads.guru3d.com/3DMark-11-Basic-Edition-download-2650.html
Compare your score to what others get. http://www.3dmark.com/search
“I meet the minimum system requirements. There is no excuse for it not working.” Doesn’t change the dictionary definition of ‘minimum’.
in Account & Technical Support
Posted by: deltaconnected.4058
Beer’s on me if the next build moves chat/party to the middle screen.
Thanks for the update.
The usual stuff: check temperatures, clock modulation (hwinfo64), etc.
Make sure you’re using the discrete GPU as well (specs seem vaguely familiar…)
Your physics score does seem rather low for that CPU. Based on other results, a stock 3570k should be ~6.5k.
Run something like HWiNFO64 and make sure that the min/max/avg for Clock Modulation (under CPU) is 100%. Other than that, all I can suggest is checking temps (which you say are normal), disabling C-states (including C1E) and SpeedStep in the BIOS, and changing Windows’ power plan to High Performance.
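(Side note: if you’d rather switch the power plan from a command line, this one-liner should do it. The GUID is the stock High Performance scheme; double-check yours with “powercfg /list” first.)
“powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c”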
(edited by deltaconnected.4058)
@Reize: run a benchmark anyway (e.g. http://downloads.guru3d.com/3DMark-11-Basic-Edition-download-2650.html); that’ll give us an idea of whether it’s a system issue or a GW2 issue. Yours looks like one of the very few cases where the FPS don’t match the specs.
in Account & Technical Support
Posted by: deltaconnected.4058
It’s the bombs on the sides. They keep spawning. And spawning. And because they all have the exact same coordinates (no culling or edge detection), your CPU/GPU will render every last one of them.
in Account & Technical Support
Posted by: deltaconnected.4058
Please don’t edit #3, that is amazing.
Your power supply says 1250W because you fell into a trap. Unless you have a quad-CPU, quad-SLI setup, you won’t even use two-thirds of that. Big numbers = better. Amirite?
U funny. Please don’t edit yours either.
http://www.3dmark.com/3dm11/4406641 <- my system.
Not only that: if I’m choosing between $200 for 1kW or $225 for 1.25kW, what’s $25 compared to my total cost already? I’ll probably make it back in slightly better efficiency over the years.
By your logic my Amd Phenom II X4 955 3.2GHz Black Edition CPU should run faster than my I7 2670QM 2.2 GHz. At least I know who is right and who is wrong on this one.
See #1.
(edited by deltaconnected.4058)
in Account & Technical Support
Posted by: deltaconnected.4058
1) I guess I had to be a little more thorough: raw GHz combined with a high number of instructions per cycle (be it through a longer pipeline, more ALUs/FPUs, fast last-level cache, or better branch prediction, etc.). Better? I thought this wasn’t necessary, since I never compared AMD’s GHz to Intel’s GHz and the statement holds true regardless of which you pick.
2) I don’t need to know the specs. I look up what Intel has in their mobile Ivy line, then look up what nvidia/AMD have for mobile graphics, and notice that a mid-range build from 2008-2009 could match the highest offering they have.
3) If laptops are supposed to be as stronk as desktops, why is my PSU rated at 1250W while I’ve yet to see a laptop brick that provides over 150W? And why can’t my desktop have these divine high-power/high-efficiency parts? Maybe it’s because you trade power for mobility.
But whatever, the “my laptop is best laptop” brigade can keep QQ’ing their hearts out. I’ll continue enjoying the game while you continue waiting for your 500% optimization.
in Account & Technical Support
Posted by: deltaconnected.4058
@Hickeroar: nice catch. Went in there and sent a bug report.
When your GPU is able to do FP32 math more efficiently than the CPU and there’s a unified way to do so for AMD/nvidia, let me know.
Otherwise, http://i.imgur.com/uiBK8.jpg. No game can use 4 renderers to split up the load, and there’s no escaping concurrency.
Yea, you’re right. I can see the future patch notes already: “ANet improves performance by approximately 500% to accommodate old rigs”.
Based on the latest version of Direct X 9, which came out in 2004. There is no reason why any hardware in the last 6 years shouldnt be able to run this game on LOW settings at a steady 30 FPS.
Please never edit your post. This one’s a keeper.
in Account & Technical Support
Posted by: deltaconnected.4058
“brand spanking new gaming laptop.”
Gaming laptops are a gimmick. There isn’t a single mobile chip that has the raw GHz this game needs to run smoothly everywhere, nor will there be until at least after Skylake.
Likewise, the GTX560M/GTX660M that I see in the majority of these “gaming” laptops is still worse than a desktop GTX260 from 2008. That’s the price you pay for mobility.
Luckily not all developers made the switch to target low-end machines.
Current gen desktops are having a hard time with WvW. That means you can expect that laptops will struggle with it even after Haswell.
in Account & Technical Support
Posted by: deltaconnected.4058
Swamp areas are one of the few locations in the game that max out all 3 of my GTX 580s. 100-100-100, and about 55-60fps.
(2500k @ 4.7)
in Account & Technical Support
Posted by: deltaconnected.4058
Haven’t been there since completing the zone weeks ago; all I remember is it was near the vista for the forge. Might’ve been fixed by now.
in Account & Technical Support
Posted by: deltaconnected.4058
If you extrapolate benchmarks (Vantage → 06) from the 435M to the 8800GT to the 7800GS, the 435M is below the minimum.
The CPU can’t really be compared directly because of Turbo Boost, but on single-threaded apps (SuperPi) it matches a stock Q6600, while on multi-threaded ones it will be ~35% weaker. And since the clock with two cores loaded will be 2.2GHz, that may as well be the minimum.
in Account & Technical Support
Posted by: deltaconnected.4058
There’s a similar spot in Black Citadel, absolutely destroys framerate. Reporting it as a bug is all we can do.
What I meant to say is: if you do the swap (take her hard drive and boot from it on your computer), you’ll need to reinstall the video drivers. This is the fastest way to tell if it’s a hardware or software problem.
Tried this? http://techreport.com/review/21865/a-quick-look-at-bulldozer-thread-scheduling
It won’t give you a huge difference or make your budget CPU an expensive one, but any extra FPS should help.
in Account & Technical Support
Posted by: deltaconnected.4058
The 8800GT of 5 years ago is still 2x better than your 435M. A Phenom X4 9750 of 4 years ago is about equal to your 1.6GHz Clarksfield. That looks very old to me.
I’ll repeat what I say in every laptop thread: were there laptops 4 years ago that could play Crysis? Or laptops 8 years ago that could play WoW? If every developer “optimized” for outdated hardware (if you want that, go get an Xbox or PS3), we’d still be playing Pong on a black and white screen, looking forward to the Quake 3 launch in 2016.
The 630 is a budget card. You get what you pay for.
A GTX560 Ti/660 or HD6950/7850 should be the minimum.
in Account & Technical Support
Posted by: deltaconnected.4058
That hardware is old. Very old. Unlike those other games, GW2 isn’t a console port.
If you can, swap hard drives between those two rigs. This’ll tell us whether it’s a software problem (no more crashes) or a hardware problem (still crashes). You may need to reinstall the newest Catalyst drivers if the 7750 cards are from different vendors.
in Account & Technical Support
Posted by: deltaconnected.4058
I noticed that as well; the issue is this: if, say, the main thread was doing too much, I would expect to see the core that thread is running on maxed out, with the problem being other threads hanging while waiting for it. That doesn’t seem to be the case though; it seems to me to be an issue with how objects, players, and effects are implemented in the engine and how they are being rendered.
That’s just how Windows’ scheduler works: distributed load. A single thread will still be limited to a single core at any one point in time. How much a context switch costs in performance, I have no idea, as I haven’t been able to find anything in Windows to change that behavior.
The other threads don’t need locks, as good design will make these uni-directional. E.g., the thread that pulls player character location/model from the server only has to write the data to shared memory, while the main thread only reads it. Same with sound: the main thread requests a sound event, and the sound thread queues/plays it as needed.
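A made-up C++ sketch of what I mean by uni-directional (not ANet’s code): the network thread only ever writes, the main thread only ever reads, and an atomic index is the only coordination. Simplified on purpose; a production version would triple-buffer or use sequence numbers so the writer can never touch a buffer mid-read:

// position_share.cpp - one writer thread, one reader, no locks.
#include <atomic>
#include <thread>
#include <cstdio>

struct Position { float x, y, z; };

Position buffers[2];            // double buffer
std::atomic<int> latest{0};     // index of the most recently completed write

void network_thread()           // writer: fills the spare buffer, then publishes
{
    for (int tick = 1; tick <= 5; ++tick)
    {
        int next = 1 - latest.load(std::memory_order_relaxed);
        buffers[next] = Position{ float(tick), 0.0f, 0.0f };
        latest.store(next, std::memory_order_release); // publish
    }
}

void render_frame()             // reader: grabs whatever was published last
{
    Position p = buffers[latest.load(std::memory_order_acquire)];
    printf("render at x=%.1f\n", p.x);
}

int main()
{
    std::thread net(network_thread);
    net.join();                 // in a real game these run concurrently
    render_frame();
    return 0;
}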
The best model I can come up with is, say, 10 slave threads doing parts of the render. While not swamped, say in PvE, these can manage fine; once they finish their job, the main thread can take the data and use it. In WvW, many things are happening at once, although I noticed the game starts to show fewer effects and reduce details and players on screen.
The result of rotating something around a world axis and then moving it away from the center point is not the same as moving first and then rotating. As soon as that becomes a possibility, the operations have to be done in series. Splitting it up would be very complex to say the least (maybe it could be done per terrain triangle plane/player/static object, but I can’t see that working in the cities).
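Quick worked example in C++ (my own toy numbers) showing the order dependence: take the point (1,0), rotate 90° about the origin, then translate by (1,0), then try the other order:

// order_matters.cpp - rotate-then-move vs move-then-rotate give different points.
#include <cstdio>

struct Vec2 { float x, y; };

Vec2 rot90(Vec2 p)        { return { -p.y, p.x }; }           // 90° about the origin
Vec2 move(Vec2 p, Vec2 d) { return { p.x + d.x, p.y + d.y }; }

int main()
{
    Vec2 p{1, 0}, d{1, 0};
    Vec2 a = move(rot90(p), d); // rotate first: (1,0) -> (0,1) -> (1,1)
    Vec2 b = rot90(move(p, d)); // move first:   (1,0) -> (2,0) -> (0,2)
    printf("rotate-then-move: (%g, %g)\n", a.x, a.y);
    printf("move-then-rotate: (%g, %g)\n", b.x, b.y);
    return 0;
}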
Can’t comment on the effects reduction as I haven’t seen it happen, but there’s nothing preventing it from reducing particle count if the previous frame’s render time exceeds x milliseconds.
If the engine was not implemented to take advantage of that for certain things and leaves the CPU to do it, there’s your bottleneck.
I’m not a game dev so I can’t say for sure, but with what little DirectX I know, the pipeline is designed in such a way that it can’t be done: BeginScene, call your routines that draw a bunch of triangles, EndScene. There’s no way for the CPU to interact directly with the display.
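Roughly the shape of a DX9 frame, for reference. Device/window setup is omitted, so this is a fragment rather than a full program, but the calls are the real IDirect3DDevice9 ones:

// frame_sketch.cpp - the BeginScene/EndScene bracket I'm talking about.
#include <d3d9.h>

void RenderFrame(IDirect3DDevice9* dev)
{
    dev->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
               D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);

    dev->BeginScene();
    // All draw calls (DrawPrimitive etc.) are queued here by the CPU; the
    // GPU consumes the command buffer on its own schedule. At no point does
    // the CPU touch the display directly.
    dev->EndScene();

    dev->Present(NULL, NULL, NULL, NULL); // flip/copy the back buffer to screen
}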
If there are 20 things that need to be done in parallel and completed before the next thing can start, a GPU could potentially get them all done in one cycle without any switches from the kernel, while the CPU would need many more.
Until those 20 things do FP32 math. Then they’d never get done.