Currently running an AMD FX-8320 with a GTX 660 and wanting to upgrade to an i5. Would I have to buy a new copy of Windows too?
Yes. ._. I changed from an FX-8350 to an i7 4770K. I will not go back.
I don’t have to worry about that since I don’t do world versus world. :P
Nope. Need two computers.
My mistake, yes I do have Super Sampling enabled. :P It makes my game look much more… immersive, I guess I can say. Funny enough, this might sound totally strange and you’ll call me crazy (no, we didn’t just meet), but back when I was using an AMD FX-8350, switching from Super Sampling to Native sampling actually lowered my performance very slightly in some places, more often than not. Most of the time, though, there was absolutely no difference in performance between the two. It was weird… but I was still happy about it, since it made Super Sampling a usable option haha.
I can see where the limit is, then: the amount of work being put on the cards by my current in-game settings.
I don’t run any additional graphics options outside the game, so just what I said: max everything except FXAA, because FXAA is crap. One card sitting idle in windowed mode has been a known thing for quite a long time, going back to the early days of Crossfire.
The load bounces around like when you hold the refresh key (F5) on the Performance tab of Task Manager. There are no smooth up-and-down curves like the temperature graph; it’s just straight spikes. There is, however, no noticeable FPS lag at all.
Hey guys. I don’t always make a Tech Support topic, but when I do…
It’s because I need to learn something.
What in Guild Wars 2 makes the GPU hit max load?
How is the GPU load measured?
Over the past few months I’ve upgraded the core of my machine from an AMD FX-8350 + Biostar TA990FXE to an Intel i7 4770K + Asus Maximus VI Formula Z87,
and yesterday I added two Asus Radeon R9 290X DCU II 4GB GDDR5 OC cards.
Crossfire is currently enabled without any issues.
I also run three monitors with the resolution at 5760×1080, and all of the graphics options set to max aside from FXAA (BECAUSE IT’S UGLY AS kitten)
However: watching MSI Afterburner, both GPUs in full screen are usually hitting 100% load, just like my 7970s did. In windowed mode, only one hits 100% load while the other idles.
My i7 has no overclock, sitting happy at 3.5 GHz with a turbo ceiling of 3.9 GHz, and sometimes I push it to 4.1 GHz from within the operating system using the Asus software utilities for my mainboard.
My power supply is also a 1000W.
Both slots are running at PCIe 3.0 x8 because Crossfire is enabled, according to GPU-Z and the GPU configuration shown in the mainboard BIOS.
So yes, what makes the GPU hit 100% load? Is it the CPU not feeding it fast enough? Or something else? I read something related to this a long while back but have since forgotten it.
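In case it helps anyone answer: one rough way I know of to tell whether the CPU is the limit is to log per-core CPU usage while the game is running and compare it against the GPU load Afterburner reports. A minimal Python sketch, assuming the psutil package is installed; the sample count and interval are just placeholder values:

    # Rough bottleneck check: log per-core CPU usage while the game runs.
    # GPU load is read separately from MSI Afterburner / GPU-Z alongside this.
    import psutil

    SAMPLES = 30        # how many readings to take (placeholder)
    INTERVAL = 1.0      # seconds between readings (placeholder)

    for _ in range(SAMPLES):
        per_core = psutil.cpu_percent(interval=INTERVAL, percpu=True)
        print("per-core %:", per_core, "| busiest core:", max(per_core))
        # If one or two cores sit near 100% while the GPUs are well below
        # full load, the CPU is likely the limit; if the GPUs are pegged
        # at 100% instead, the cards themselves are the limit.

If the busiest core is pinned while the cards aren’t, that would point at the CPU (or the game’s main thread) rather than the GPUs.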
(edited by Avelos.6798)
Protip: avoid Tiger Direct, or any ‘Direct’-named online retailer. They will screw you on shipping.
Send in a bug report in-game. That feature exists for a reason as the teams who handle these things will get it directly.
Why is Guild Wars 2 the longest-loading game in history?
Depends on your hardware and connection. I have a high download speed with high ping and an SSD and the game loads very fast.
Yeah the back piece is a bit of a problem… Some of them look alright though.
Shoulders are another problem too, but for me it’s mostly that I wish they’d put more attention to detail into the polygon budget of the character models.
Aren’t the first-generation desktop Llanos 32nm versions of the Phenom II with a GPU added on?
Hey Lolisamu. After checking out your screenshot, I think (this is mostly a hunch) the best settings for you would be: shadows set to low or off, reflections disabled completely, and Character Model Limit turned down to medium. You can probably keep Character Model Quality at maximum once the limit is at medium. I’d turn off FXAA since it’s… awful. It just makes things blurrier the farther away they are; it’s fake anti-aliasing. But it has pretty much no performance impact on most, if not all, systems anyway.
Render sampling… this one is weird. I got better FPS with it set to super sampling than with native back when I was using an AMD FX-8350. I’d give that a try and see if it helps at all.
I’m pretty sure you could have animations at maximum and environment at maximum. I don’t think setting post-processing to maximum will have any noticeable impact, and the same goes for shaders, since you’re running a GTX 650 Ti. These are only suggestions I’m sure you can enable for the eyecandy :P The others are the most processor-heavy settings I know of.
Yes, yes, it is your connection that’s making you lag. Something between you and the server isn’t doing its job.
Short answer: No.
Try adjusting the brightness on your display.
At low it’s still dynamic, but just a blob on the ground.
Which is what makes it not dynamic. The other shadows, such as the shadows from structures on the OFF setting, are literally just darker texture colours. The blobs are just… shadow discs that neither grow nor shrink. They’re like a commander tag, for example, but as a shadow. I can’t believe there would be any performance impact except on something running a bottom-of-the-barrel APU like AMD’s C-series.
I’d rather go with Behellagh’s advice. He knows the first-generation APUs very well since he uses one, although without a dedicated GPU, to the best of my knowledge.
However, with your own dedicated GTX 650 Ti, turn off shadows and reflections and maybe turn down character model quality and the visible player count. That should help. Better yet, you could experiment with CPU overclocking if your hardware can do it.
In response to dodgycookies, dynamic shadows actually start at medium. At low they have little to no effect on performance.
I have had three different 7970s, two of which I still have and they have had no issues at all.
Which version of the drivers, specifically? 13.12 here, no problems.
The outfits… they’re not as bad as I thought they’d be, but the stuff I bought with real money being turned into 15 minute tonics?
How I feel about that:
I put them in the bank. Didn’t play Guild Wars 2 aside from finishing my daily.
Fairly disappointed.
I deleted it because I had made a huge edit, only to find that my internet had disconnected during the save when I refreshed the page. The edit didn’t show, so I gave up, stopped bothering, and deleted it.
GW2 is CPU intensive and I do not think the CPU in the Shield meets the minimum spec of GW2.
PC games are streamed to it via network.
Assuming your PC can handle it, I don’t see why the Shield can’t.
Playing at 5760×1080 across three monitors here, no issues.
SirSquishy has awesome advice going on here. I just wanted to pop in and say that’s an awesome budget gaming machine for what you’re wanting.
It’ll even handle some other games fairly well too.
How do I disable e-mail authentication? Whenever I log in, the launcher asks me to go to my e-mail account to allow access to the game, even though I ticked the option to remember my network. I guess my ADSL connection is changing IP randomly every day… I want to turn this thing off.
You do not want to disable that feature. If you have ADSL2+, you can request a static IP from your provider (about 3 bucks a month extra) to combat that.
But GW2 accounts are in high demand with hackers and gold sellers. You want that two-step auth based on source IP. Trust me.
I had the same problem, SirSquishy, but my hardware did not support a static IP. Connecting through my friend’s internet is what solved the issue for me. E-mail authentication is not required, but it is a very useful security feature.
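If you want to confirm that the line really is getting a new IP every day (which is what keeps re-triggering the e-mail check), you can log your public address over time and see when it changes. A minimal Python sketch, assuming an outside plain-text IP echo service such as api.ipify.org (swap in whichever service you trust) and an interval chosen purely for illustration:

    # Log the public IP periodically and note when it changes, to confirm
    # whether the ISP is rotating addresses (which re-triggers e-mail auth).
    import time
    import urllib.request

    ECHO_URL = "https://api.ipify.org"   # example echo service (assumption)
    CHECK_EVERY = 15 * 60                # seconds between checks (illustrative)

    last_ip = None
    while True:
        try:
            ip = urllib.request.urlopen(ECHO_URL, timeout=10).read().decode().strip()
        except OSError as exc:
            print("lookup failed:", exc)
        else:
            if ip != last_ip:
                print(time.strftime("%Y-%m-%d %H:%M:%S"), "public IP is now", ip)
                last_ip = ip
        time.sleep(CHECK_EVERY)

If the log shows a new address every day, then a static IP from the provider (or just living with the e-mail prompts) really is the only fix on your side.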
Send in a support ticket.
Here’s a good question, and I can’t see anywhere whether it’s already been asked:
Does the screen flash in any other game when the M290X is being used?
I’d check with your laptop manufacturer about that. That’s not normal.
Going to another version of DirectX is fairly expensive, since to my understanding it usually means rewriting the entire software platform. It’s something I don’t think they have the resources for without a monthly subscription.
The AMD FX-8130 is not a real CPU. The FX-8100, 8120, and 8150 are Bulldozer Zambezi chips. There was an FX-8100, but it never reached the market. I also read about an 8170 somewhere at some point, but it was never marketed either.
The FX-8320, 8350, 9370, and 9590 are all the same Vishera chips.
They’re binned versions of the same silicon, with the FX-9590 being the most perfect chip in the line since it comes out of the box clocked at 4.7 GHz. That on its own should hold up well enough for Guild Wars 2, but I still wouldn’t expect miracles.
Edit:
My mistake. The FX-8130P is a real processor, but it was also never marketed. It too was a Zambezi Bulldozer.
(edited by Avelos.6798)
I was previously using an AMD FX-8350. The lowest FPS I got was 6, in WvW, back when I still participated. On one extremely bad evening, with what felt like literally all of WvW in one place, my FPS dropped to 2.
I’ve since upgraded to an i7 4770K myself.
I had better FPS on my Phenom II 975 Black Edition in other MMO games where I used both CPUs, so I’d imagine the same performance difference could probably be seen here.
If you want to stick with AMD, though, I’d recommend going no further than the Phenom II X6 1100T if you want the best performance you can get from AMD. Afterwards you could probably have it professionally overclocked to maybe 3.8 or 4 GHz to boost it a bit. Although at current prices the FX-8350 isn’t too bad a deal on AM3+, but only if the board supports it to begin with.
Until I know what the board is, I can recommend the Phenom II X4 980 Black Edition if it can be found (the highest clock speed without overclocking) and the Thuban Phenom II X6 1100T, which comes clocked at 3.3 GHz but is a six-core.
I guess that’s a better way to put it: polygon budget. As for Rox’s horns, I rather like them, haha. They could use a bit more curve, though.
I know they’re supposed to be hefty, but their design is not as smooth as the other races’. I’m not talking about how they look relative to stereotypes, either. I mean there are parts of the Charr body that don’t look physically possible within the body structure. On my female Charr, for example, there’s a huge, sharp bump behind her shoulders.
In some spots you’ll probably see what I’m talking about; they look really… weird. I’m not saying smooth it out as in just remove the bump, but make it more rounded. More natural and less mechanical.
They look good but I think they can look better.
(edited by Avelos.6798)
Well if someone wants the absolute best performance they can get and if Guild Wars 2 were the only game they played… :P
Yeah, but that’s just it. They’re new and exclusive. But the charr’s body looks clunky. I wish that would be fixed so it could look maybe at least as nice as the Sylvari or the Asura bodies. I mean, they DO look good, it IS a half-feral beast race and I’m quite impressed that they went that far. But I feel like some areas didn’t get as much care as they should have.
It’s about 18 GB, give or take a few hundred MB, without any other languages downloaded. I accidentally downloaded the German audio last month and it brought it up to 21.5 GB.
I moved it to the slot nearest to the CPU and now I’m getting 60 FPS. Thanks for the help.
Why does moving it to the other slot make such a difference? Because even though the three slots look the same, only the one closest to the CPU has all 16 PCIe lanes from the CPU wired to it. The other two slots pull their PCIe lanes from the H77 motherboard chipset.
I actually didn’t know that the additional lanes came off the chip. So I guess I have an additional x8 and x4 coming from my Z87. (Asus ROG Maximus 6 Formula)
If money were no object, you’d be looking at buying an Intel i7 4960X Extreme Edition, and then, with extreme processor cooling of maybe the water-cooling variety, you’d be looking at pushing it to 5 GHz or possibly 5.6. Even then, your performance in heavy WvW scenarios would likely be sub-par. Or you could wait for the rumoured 8-core Haswell-E i7 Extreme Editions and superclock one of those. (Before anyone says it, you apparently CAN see more performance with more cores on the Intel side of things, so I don’t see why a possible 8-core Intel i7 couldn’t offer a little more power than a 6-core.)
It’s ridiculous entirely I know but it’s such a sad reality right now.
What’s the spec of your i7?
Whoever serviced your laptop did a fine job. Good on ya!
I moved it to the slot nearest to the CPU and now I’m getting 60 FPS. Thanks for the help.
Why does moving it to the other slot make such a difference?
The slot closest to the CPU is the best-connected slot when it comes to PCI Express lanes. The more lanes, the more bandwidth can be pushed through it. The highest lane count is x16, and below that it goes x8, x4, sometimes x2, and lastly x1. The lower the lane count, the lower the potential GPU performance. Every PCI Express slot beyond the one nearest the CPU is a secondary one, but on any mainboard with more than one PCI Express slot there is typically at least one slot with a full x16 connection. Some very high-end motherboards (such as my own) are designed with multiple x16-length PCI Express slots even though the processor doesn’t support more than 16 lanes, which you could treat as alternative locations for a GPU.
The game can still run fine on an x8 slot. For people using two graphics cards on a regular CPU (like me, with 7970 Crossfire and an i7 4770K), the two GPUs split the 16 lanes the CPU provides for graphics and both run at x8. No single GPU fully uses 16 lanes all the time anyway; maybe a dual-chip card would (7990, GTX 690, GTX 590, and so on).
However, the Extreme Edition Intel i7s (3960X, 3930K, 4960X and so on, and the other six-core processors) come with far more lanes, so you can have up to four graphics cards running at x8. It’s a tonne of power.
But yeah, essentially: the more lanes in use on a PCI Express slot, the better the GPU performance, because data is fed to the GPU faster and thus reaches the monitor faster for you to enjoy your digital experience.
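To put rough numbers on it, here’s the back-of-the-envelope maths, assuming the commonly quoted per-lane figures of roughly 0.25 GB/s for PCIe 1.x, 0.5 GB/s for 2.0 and about 1 GB/s for 3.0 (a quick sketch, not exact spec values):

    # Approximate one-direction PCIe bandwidth by generation and lane count.
    # Per-lane figures are the usual rounded values, not exact spec numbers.
    PER_LANE_GBPS = {"1.x": 0.25, "2.0": 0.5, "3.0": 0.985}

    for gen, per_lane in PER_LANE_GBPS.items():
        for lanes in (1, 4, 8, 16):
            print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")

    # e.g. PCIe 3.0 x8 is ~7.9 GB/s, half of x16's ~15.8 GB/s, which is
    # why two cards at x8/x8 lose very little real-world performance.

So even at x8, a PCIe 3.0 slot still has about the same bandwidth as a full PCIe 2.0 x16 slot, which is why the Crossfire x8/x8 split isn’t worth worrying about.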
Asus GTX 660
AMD FX 8350
Buy everything else cheap except for the PSU.
A 120gb SSD would be a good idea as well.
Wow dude, read OP’s post. He’s got a budget.
Oh. I got 5000 Glory points when I used mine like a month or so ago. Should I have gotten Rank points as well?
GTX 750 doesn’t require external power. I’d be surprised if it didn’t run on that system without changing anything.
It seems like a lack of passion to me. I mean, they should care about each race equally, no matter the circumstances. This is the first time I’ve ever noticed something like this (I’m sure it happens in other games).
Yes! That’s what I feel it is, too. There’s not enough attention given to all of them equally. ANet says not enough people play the other races that don’t look as good as the Humans or Norn, and a number of players, myself included, are like, “Well, maybe if you gave them more, it would be an incentive to play them!”
Then don’t play one. I’m glad Charr don’t have any fan service. They’re kittenes.
I play one. I enjoy the Charr race. And if I have suggestions or opinions, I share them, much like you would for anything else.
—
I should actually be more specific. Their shape being clunky instead of smooth. Not the armor so much as body appearance. It just looks like there was little care given to it.
I’m only bothered with their sharp and clunky look.
I just kinda wanted to put my two cents in about what a character in this game looks like in 3D.
Every race’s character model looks great in 3D. They’re smooth, they’re gorgeous. All of them except the Charr. (Well, it’s not really so bad that they’re ugly or anything.)
I look at my human female character and see how much work went into making the body look really nice and well shaped, with smooth curves, and I admire the care put into character detail like that.
But I go look at my Charr and it looks big and clunky and chunky.
It’s PC 2014, not Nintendo 64 1999.
I realize that most of the resource and design budget goes to the Norn and Humans because they’re the most played, but come on, really? The Charr should at least look a bit better… I mean, they do look good, and I’m easy to please, but they could use a touch-up.
Anyway yeah that’s my two cents. Just wanted to finally say it.
And no I don’t care about clipping… unless it’s major. Otherwise it’s an MMO and you should learn to live with it.
[Edit]
Dear Forum moderator(s).
I’d like to formally request this topic be moved to another forum board that better suits it such as Guild Wars 2 discussion or something that relates to development.
This topic is in no way connected to lore in Guild Wars 2. This topic is about the 3D physical design of the Charr body.
[/Edit]
(edited by Avelos.6798)
Actually, from what I’ve seen, the 280X has been noticeably cheaper than the GTX 770.
A Radeon HD 7970 will run this game at maximum settings without a problem. You could opt for the R9 270X instead at a lower price; it will give the same results.
No experience with the GTX 770 and 780 here, but they’ll do the same at an additional cost.