No performance increase with the 780 Ti

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Stark.1350

Stark.1350

Went from a GTX 560 Ti to a 780 Ti and I’m seeing no increase in FPS. What gives?

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

That might be because the 560 Ti was already enough to easily handle the game’s graphics. If you changed from something like an AMD processor to a current-generation Intel processor, then you’d see gains.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

Game’s not GPU bound.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

Game’s not GPU bound.

Indeed, you would think with all the posts about this people would know, or at least do some research before doing an upgrade for a given game.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Stark.1350

Stark.1350

So I run an AMD Phenom II X6 1100T, which current i7 processors are barely better than, with 8 GB of RAM. I run 100 FPS in BF4, faster than 80% of all machines on the 3DMark benchmark tests, and the best I’m going to do is 30 FPS in Rata Sum at max graphics?

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

No, for this game the current i7 is vastly superior to the Phenom II X6 1100T.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: tota.4893

tota.4893

So I run an AMD Phenom II X6 1100T, which current i7 processors are barely better than, with 8 GB of RAM. I run 100 FPS in BF4, faster than 80% of all machines on the 3DMark benchmark tests, and the best I’m going to do is 30 FPS in Rata Sum at max graphics?

The game’s using the Direct3D 9 API, which hasn’t seen significant updates since 2004, and it’s drawing way more stuff than the API was ever meant to handle.

Your in-game performance depends heavily on how fast a single rendering thread can tinker with the D3D9 state: passing shader variables, configuring texture and vertex data sources, issuing draw calls, etc. The game’s probably doing 20k+ such calls per frame if there’s enough action going on.

When it comes to this kind of work, current top-of-the-line Intel i5 and i7 CPUs have a significant advantage due to their high instructions-per-clock rates.

Games like BF4 can run fast because they have limited variation in character art, relatively few (unique) visible objects in general, and are built to take maximum advantage of newer graphics APIs with less overhead.
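
To make that overhead concrete, here is a minimal sketch (not GW2’s actual renderer; the struct and function names are made up for illustration) of the per-object work a single D3D9 rendering thread does each frame. Every call below is CPU work in the runtime and driver, so thousands of objects per frame add up fast:

```cpp
// Hypothetical sketch of per-object D3D9 submission work, not GW2 code.
#include <d3d9.h>
#include <vector>

struct SceneObject {
    IDirect3DTexture9*      texture;
    IDirect3DVertexBuffer9* vertices;
    IDirect3DIndexBuffer9*  indices;
    UINT                    stride, vertexCount, triCount;
    float                   world[16];   // per-object 4x4 transform
};

// Each call crosses into the D3D9 runtime/driver on this one thread.
void DrawScene(IDirect3DDevice9* dev, const std::vector<SceneObject>& objs)
{
    dev->BeginScene();
    for (const SceneObject& o : objs) {
        dev->SetVertexShaderConstantF(0, o.world, 4);        // shader variables
        dev->SetTexture(0, o.texture);                       // texture source
        dev->SetStreamSource(0, o.vertices, 0, o.stride);    // vertex data source
        dev->SetIndices(o.indices);
        dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,        // the draw call itself
                                  0, 0, o.vertexCount, 0, o.triCount);
    }
    dev->EndScene();
    dev->Present(nullptr, nullptr, nullptr, nullptr);
}
```

A D3D9 device only accepts these calls from one thread at a time (the D3DCREATE_MULTITHREADED flag merely wraps them in a lock), which is why that single rendering thread ends up as the bottleneck.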

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Stark.1350

Stark.1350

So in short, unless I upgrade the motherboard and chip, I spent 700 dollars on a paperweight.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: sobe.4157

sobe.4157

Like what’s been said, any current i7 is vastly superior for this game. Not to mention that, from a performance standpoint, a 4770K would be nearly double the 1100T in a majority of other applications.

As for your 780Ti, not a paperweight at all, it can play all the modern titles with ease. Why on earth would you get a 780 Ti just for this game….

3770k 4.9ghz | Koolance 380i | NexXxoS XT45 | XSPC D5 Photon | ASUS MVFormula |
Mushkin Black 16gb 1600 | 500GB Samsung 840 Evo |2×2TB CavBlack| GALAX 980 SoC |
NZXT Switch 810 | Corsair HX850 | WooAudio WA7 Fireflies | Beyerdynamic T90

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Stark.1350

Stark.1350

Well, it’s not just for this game. Good news at least: my motherboard is compatible with AM3+ chips, so I just have to snag a new FX and that should help out quite a bit.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: sobe.4157

sobe.4157

Unless I’m mistaken, that may be a mistake: the Phenom should have better single-core performance than the newer FX-series chips, which is what this game draws on most.

3770k 4.9ghz | Koolance 380i | NexXxoS XT45 | XSPC D5 Photon | ASUS MVFormula |
Mushkin Black 16gb 1600 | 500GB Samsung 840 Evo |2×2TB CavBlack| GALAX 980 SoC |
NZXT Switch 810 | Corsair HX850 | WooAudio WA7 Fireflies | Beyerdynamic T90

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Stark.1350

Stark.1350

Yeah, just checked. The FX-8350 is 0.2 better in single-core performance. Guess I’ll wait a bit and pick up an Intel sometime later.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

The phrase “for this game” can’t be stressed enough. In games that are GPU bound, where faster, more costly video cards (or multiple ones) greatly improve performance, the CPU isn’t a large factor.

GW2 isn’t one of those games. Your 780 Ti is grand for all those FPS games you enjoy, but here, with a game that prefers more than 2 cores but benefits very little from more than 4 (or even 3) and has only a few CPU-intensive threads, single-core performance is a better metric for judging which CPU suits this game than all-core performance.

Comparing stock clocks, an Intel i5-4670 would be roughly twice as fast as a Phenom II X6 1100T, which in turn is slightly faster than an FX-8350. The extra cores of the Phenom II hexcore or the FX octocore don’t matter, since the limiting factor in GW2 is performance per core.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: abomally.2694

abomally.2694

So in short, unless I upgrade the motherboard and chip, I spent 700 dollars on a paperweight.

I wouldn’t say that. One thing you could do is use the GTX 560 as a PhysX processor. That can be beneficial in other games, just not GW2.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Lilywhite.6397

Lilywhite.6397

My initial build should’ve handled GW2 without much issue – 1035T OC’d, 16GB DDR-1600, and 4890s in CF.

When I moved to a 2500k, 8GB of RAM, and a (temporary) 7770, game play significantly improved.

kitten those IPCs.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: ArmoredVehicle.2849

ArmoredVehicle.2849

The game is CPU bound; almost all MMOs are, as a matter of fact.

I upgraded from a 560 Ti to a 760 just a few weeks ago. It certainly helped in other games, but it made barely any difference in GW2.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Exiled Dbl.9035

Exiled Dbl.9035

Regardless of being CPU bound or anything along those lines, I’m tired of hearing excuses from fanboys. The game is poorly optimized; this has been discussed a billion times to Sunday. It’s also on DirectX 9, and before these fanboys come in and say it’s for this reason or that: it’s 2004-2005 technology on a modern game that NEEDS DirectX 10/11, AMONGST MANY OTHER THINGS.

3570k oc 4.5Ghz on Water, 16 Gigs of RAM 1866, 2 SSD’s in RAID 0, 2x Gigabyte GTX 760 OC SLI

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

Explaining why it happens isn’t an excuse.

Do I think it should be looked at? Sure. Is it realistic to expect ANet to put the development time and money into it to re-code it to significantly improve performance? No.

Then again they may have been working on it for the last 18 months, but they won’t say anything until they are ready to roll it out. Otherwise it’ll be “is it done yet” over and over again on the forums.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Exiled Dbl.9035

Exiled Dbl.9035

Explaining why it happens isn’t an excuse.

Do I think it should be looked at? Sure. Is it realistic to expect ANet to put the development time and money into it to re-code it to significantly improve performance? No.

Then again they may have been working on it for the last 18 months, but they won’t say anything until they are ready to roll it out. Otherwise it’ll be “is it done yet” over and over again on the forums.

Well put, man, I agree with you. However, I can understand people’s frustration with high-end custom-built machines like mine, which I built myself. Yeah, I play GW2 with no issues whatsoever.

But with that being said, being a PC builder myself (I love doing it, it’s a hobby of mine), I will say I can notice when a game could be a lot better, and GW2 is definitely one of them.

3570k oc 4.5Ghz on Water, 16 Gigs of RAM 1866, 2 SSD’s in RAID 0, 2x Gigabyte GTX 760 OC SLI

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

DX10 or 11 isn’t going to help the game much. That is all.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: ikereid.4637

ikereid.4637

DX10 or 11 isn’t going to help the game much. That is all.

Yeah, it would.

DX10/11/11.2 are all multi-threaded. DX9 is not.

That in itself would be a HUGE performance increase.

GW2’s MAIN weakness is that single-threaded rendering thread tied back to the DX9 API. If that work could span 1-3 additional threads, we would see HUGE differences from what we are seeing today.
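
For what that would look like in practice, here is a minimal sketch (assumed names like RecordChunk; this is not GW2 code) of D3D11 deferred contexts, the mechanism that lets several threads record draw commands which the immediate context then submits:

```cpp
// Hypothetical sketch of multithreaded command recording with D3D11.
#include <d3d11.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Each worker records its share of the scene on its own deferred context.
void RecordChunk(ID3D11Device* dev, ID3D11CommandList** outList)
{
    ComPtr<ID3D11DeviceContext> deferred;
    dev->CreateDeferredContext(0, &deferred);

    // ... set shaders, textures, vertex buffers and issue Draw() calls here,
    // exactly as on the immediate context, but from this thread ...

    deferred->FinishCommandList(FALSE, outList);  // package the recorded work
}

void RenderFrame(ID3D11Device* dev, ID3D11DeviceContext* immediate)
{
    constexpr int kWorkers = 3;
    ID3D11CommandList* lists[kWorkers] = {};
    std::vector<std::thread> workers;

    for (int i = 0; i < kWorkers; ++i)
        workers.emplace_back(RecordChunk, dev, &lists[i]);
    for (auto& t : workers) t.join();

    // Submission is still serial, but the expensive state-setting and
    // draw-call recording above happened in parallel.
    for (auto* list : lists) {
        immediate->ExecuteCommandList(list, FALSE);
        list->Release();
    }
}
```

The final submission loop is still single-threaded, but the state-setting and draw-call recording is spread across worker threads, which is exactly the part DX9 forces onto one thread.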

Desktop: 4790k@4.6ghz-1.25v, AMD 295×2, 32GB 1866CL10 RAM, 850Evo 500GB SSD
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: OGDeadHead.8326

OGDeadHead.8326

It would be very interesting to see this game taking advantage of Mantle.

Win10 pro | Xeon 5650 @ 4 GHz | R9 280x toxic | 24 Gig Ram | Process Lasso user

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

DX10 or 11 isn’t going to help the game much. That is all.

Yeah, it would.

DX10/11/11.2 are all multi-threaded. DX9 is not.

That in itself would be a HUGE performance increase.

GW2’s MAIN weakness is that single-threaded rendering thread tied back to the DX9 API. If that work could span 1-3 additional threads, we would see HUGE differences from what we are seeing today.

DX 11 didn’t do much for WOW raids.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

It would be very interesting to see this game taking advantage of Mantle.

Same problem as Dx11, ANet would need to redesign the renderer to be a proper, scaling, multithreaded render pipeline.

If you throw a null driver on the game and measure the framerate in a boss event or WvW zerg, I’m betting the gain you get is not as great as some imagine it will be.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

It would be very interesting to see this game taking advantage of Mantle.

Why?

I mean, really, I see so many people drooling over this, and for every problem there is with a game people reply “just wait for Mantle” or “if only it supported Mantle”, when most people have no idea what it is or the VERY few areas where it helps, which tend to be AMD’s APUs because of their lopsided pairing of a weak CPU with a stronger GPU. You are also (as already stated) stuck rewriting for a new and so far unsupported API when you could just write for DX11 and its higher parallel threading rather than offloading to the GPU. Almost every review of Mantle shows that, with a strong CPU like an i5 or i7, the only improvement comes when you have a powerful GPU and drop your monitor resolution to place a higher load on the CPU.

I am all for moving ahead, but you have to understand what you are working with and what it was intended for. Most RTS and MMO games are rough on the CPU because many are still based on the DX9 API and much of the CPU power is wasted on DirectX calls.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: SolarNova.1052

SolarNova.1052

Mantle looks good, but DX12 is apparently just as good as Mantle, if not better in some cases, and it’s open to all GPUs that currently support DX11. Mantle may be ‘open’ for Nvidia to adopt, but it’s made by AMD, Nvidia’s rival, meaning they are unlikely to want to support it.

Of course, DX12 is only going to be open for development in Q4 this year, and the first games that support DX12 will likely release in Q3 of next year. Unless the GW2 devs are waiting on that, it is highly unlikely they will change the current API to fix GW2’s issues.
Remember, they have your money already; they have no financial incentive to update their game engine.

3930k 4.6ghz | NH-D14 Cooler | P9x79 Pro MB | 16gb 1866mhz G.Skill | 128gb SSD + 2×500gb HDD
EVGA GTX 780 Classified w/ EK block | XSPC D5 Photon 270 Res/Pump | NexXxos Monsta 240 Rad
CM Storm Stryker case | Seasonic 1000W PSU | Asux Xonar D2X & Logitech Z5500 Sound system |

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: madmalkav.4805

madmalkav.4805

Remember, they have your money already; they have no financial incentive to update their game engine.

- Port to OpenGL and optimize the engine in the process.
- Release the game for PS4 and Linux.
- ???
- Profit.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: ikereid.4637

ikereid.4637

Remember, they have your money already; they have no financial incentive to update their game engine.

- Port to OpenGL and optimize the engine in the process.
- Release the game for PS4 and Linux.
- ???
- Profit.

No, just NO. We do not need an influx of baddies who think consoles are better than PCs at gaming.

Desktop: 4790k@4.6ghz-1.25v, AMD 295×2, 32GB 1866CL10 RAM, 850Evo 500GB SSD
Laptop: M6600 – 2720QM, AMD HD6970M, 32GB 1600CL9 RAM, Arc100 480GB SSD

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: sobe.4157

sobe.4157

LET THE CONSOLE GAMES BEGIN! May the odds be ever in your favor… But seriously, bleh to consoles; if I want to play at medium graphics settings, I’ll lower the settings in-game on my PC from max.

3770k 4.9ghz | Koolance 380i | NexXxoS XT45 | XSPC D5 Photon | ASUS MVFormula |
Mushkin Black 16gb 1600 | 500GB Samsung 840 Evo |2×2TB CavBlack| GALAX 980 SoC |
NZXT Switch 810 | Corsair HX850 | WooAudio WA7 Fireflies | Beyerdynamic T90

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

Consoles are not far off, and some people say that is why GW2 performs as badly as it does: why it only seems to use 3 cores, why it has the skill and utility count it does, and why it was based on DX9, since the Xbox uses a somewhat modified DX9 setup. So it could very well be that they had planned on trying it out there but never did for some reason, if that is the case.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

Don’t mistake number of cores for performance per core. The current XBone with its 8 cores still has less performance than the four cores in an i5-2600. And each core in the XBone is less powerful than each core in the XBox 360. There are cell phone processors that test faster than the CPU and GPU of an XBox 360.

Believe me, nobody was targeting the current or previous gen consoles. If they had we would be getting great performance on the PC.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

The 360 was a triple core, which would have been the target, along with the API choice, if the game ever had consoles in mind. Consoles have always been and will always be behind PCs, more so now that they have taken on more PC-like builds; they will always be cheaper PCs. The only reason they play games as well as they do is the lack of HW variation and, as a result, much higher optimization than PC ports get.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

The 360 was a triple core, which would have been the target, along with the API choice, if the game ever had consoles in mind. Consoles have always been and will always be behind PCs, more so now that they have taken on more PC-like builds; they will always be cheaper PCs. The only reason they play games as well as they do is the lack of HW variation and, as a result, much higher optimization than PC ports get.

You don’t get what I’m saying. Each of the 3 cores in the XBox 360 is less powerful than each of the cores in the XBone, which in turn are each less than half as powerful as a core in an i5-2600. If the game had at one time been written to run on an XBox 360, it would run fantastically on a PC because of all the additional performance, 3 to 4 times per core at least.

Yes, consoles do have an advantage when talking to their peripherals (this includes the GPU) because there is no variation, so there’s no need for the abstraction and overhead of API to driver to hardware. But what is the underlying limitation we are seeing, CPU or GPU?

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

The 360 was a triple core, which would have been the target, along with the API choice, if the game ever had consoles in mind. Consoles have always been and will always be behind PCs, more so now that they have taken on more PC-like builds; they will always be cheaper PCs. The only reason they play games as well as they do is the lack of HW variation and, as a result, much higher optimization than PC ports get.

You don’t get what I’m saying. Each of the 3 cores in the XBox 360 is less powerful than each of the cores in the XBone, which in turn are each less than half as powerful as a core in an i5-2600. If the game had at one time been written to run on an XBox 360, it would run fantastically on a PC because of all the additional performance, 3 to 4 times per core at least.

Yes, consoles do have an advantage when talking to their peripherals (this includes the GPU) because there is no variation, so there’s no need for the abstraction and overhead of API to driver to hardware. But what is the underlying limitation we are seeing, CPU or GPU?

I get what you are saying just fine, and I don’t understand why you are trying to say I claimed the per-thread power of the CPU in the Xbox is the same as an i5’s. Not really sure where you are coming from with that. Also, you cannot judge how a game will play on a PC, with far more processing and GPU power, from how it plays on a console; the massive number of poor ports we have today shows this, and most console games run at 720p, so they are driving far fewer pixels. The point was about how the game was set up: the API choice and so on point to the fact they wanted to try consoles, and I even remember a video a while back talking about this. Why it never made it to console might be because it never reached the minimum FPS needed to be released on a console (something MS requires), or they had a change of direction; who knows.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

Tip: the i5 2600 is actually an i7, a 3.4 GHz quad core with HT.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

@Avelos

But HT simply lets the cores run more efficiently by presenting the instruction scheduler with a second set of code that doesn’t interact with the first, allowing a higher IPC.

And since the Power PC cores in the 360 and the x86 cores in the XBone are single-threaded, I thought I’d use the i5 in the comparison so HT doesn’t get in the way.

@Tink
GW2 isn’t GPU bound, but CPU bound. It doesn’t matter whether the consoles were running 1280×720 or not. Either of us can run the game at that resolution and boss events and zergs will still drive the frame rate to its knees.

If a modern yet older CPU like an i5-2600 has problems, what would it be like on something with considerably less performance? Conversely, if they had gotten the design to run well on such a slow (by today’s standards) setup, we wouldn’t be having all these threads about frame rate issues.

Edit: The CPU in the 360 consisted of 3 Power PC cores, not ARM (or x86).

We are heroes. This is what we do!

RIP City of Heroes

(edited by Behellagh.1468)

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

@Avelos

But HT simply lets the cores run more efficiently by presenting the instruction scheduler with a second set of code that doesn’t interact with the first, allowing a higher IPC.

And since the ARM cores in the 360 and the x86 cores in the XBone are single-threaded, I thought I’d use the i5 in the comparison so HT doesn’t get in the way.

@Tink
GW2 isn’t GPU bound, but CPU bound. It doesn’t matter whether the consoles were running 1280×720 or not. Either of us can run the game at that resolution and boss events and zergs will still drive the frame rate to its knees.

If a modern yet older CPU like an i5-2600 has problems, what would it be like on something with considerably less performance? Conversely, if they had gotten the design to run well on such a slow (by today’s standards) setup, we wouldn’t be having all these threads about frame rate issues.

If we were talking about one PC CPU vs. another, like comparing AMD vs. Intel builds in a generalized manner, I would agree. But we are not. When setting up for a closed ecosystem such as an Xbox or PS, you can get FAR more out of the HW, because you know exactly what you are working with in every case and are not dealing with mangled 3rd-party HW and drivers, etc.

As a PC gamer I am sure you have played many bad ports that, from a HW standpoint, should run amazingly on PC but don’t, yet still play just fine on a wholly underpowered console. This, in a nutshell, is the problem I am speaking of.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

@Avelos

But HT simply lets the cores run more efficiently by presenting the instruction scheduler with a second set of code that doesn’t interact with the first, allowing a higher IPC.

And since the ARM cores in the 360 and the x86 cores in the XBone are single-threaded, I thought I’d use the i5 in the comparison so HT doesn’t get in the way.

Just offering a correction: the “i5 2600” does not exist; it’s an i7 2600. :P Otherwise it’s just an i5 2500.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

Sorry Avelos, senior moment.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Decado.1726

Decado.1726

This game is so CPU bound that even the best current CPUs can’t run the highest graphics settings and still hold a steady 60 FPS in some areas. It’s crazy.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

@Avelos

But HT simply lets the cores run more efficiently by presenting the instruction scheduler with a second set of code that doesn’t interact with the first, allowing a higher IPC.

And since the ARM cores in the 360 and the x86 cores in the XBone are single-threaded, I thought I’d use the i5 in the comparison so HT doesn’t get in the way.

@Tink
GW2 isn’t GPU bound, but CPU bound. It doesn’t matter whether the consoles were running 1280×720 or not. Either of us can run the game at that resolution and boss events and zergs will still drive the frame rate to its knees.

If a modern yet older CPU like an i5-2600 has problems, what would it be like on something with considerably less performance? Conversely, if they had gotten the design to run well on such a slow (by today’s standards) setup, we wouldn’t be having all these threads about frame rate issues.

If we were talking about one PC CPU vs. another, like comparing AMD vs. Intel builds in a generalized manner, I would agree. But we are not. When setting up for a closed ecosystem such as an Xbox or PS, you can get FAR more out of the HW, because you know exactly what you are working with in every case and are not dealing with mangled 3rd-party HW and drivers, etc.

As a PC gamer I am sure you have played many bad ports that, from a HW standpoint, should run amazingly on PC but don’t, yet still play just fine on a wholly underpowered console. This, in a nutshell, is the problem I am speaking of.

No we are not. The XBox 360 architecture isn’t x86. Not the same instructions at all. They aren’t even the same endian.

You were the one who suggested that maybe the 3 core-ish cap in performance was related to the XBox 360 and that maybe ArenaNet was considering a version on that platform as well. You can’t be saying they were targeting the XBone because the game was in development from 2007-12.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

This game is so CPU bound that even the best current CPUs can’t run the highest graphics settings and still hold a steady 60 FPS in some areas. It’s crazy.

MMORPGs like this on DirectX 9 are very strange, but the explanation is sort of simple, I think. There’s too much load being placed on the CPU because of the API in use, which was chosen so that the game would be accessible to anyone with a computer. Windows XP, Vista and 7 are the most-used operating systems, and the Mac has DX9 support, so it’s a given that they’d want to build it on that.

The only drawback is the substantial amount of work the processor then has to do. It was made further inefficient by adding huge amounts of visual effects. Those visuals are rendered by the GPU, yes, but the CPU needs to throw them at the GPU first.

When it comes down to it, if I were one of the game’s developers I would have opted for DX9 as well, over 10 or 11, so more people could play the game. Even with the game as it is, you can get reasonably acceptable performance on most hardware configurations, since the game’s graphics are scalable.

Just another case of the pros outweighing the cons.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

No we are not. The XBox 360 architecture isn’t x86. Not the same instructions at all. They aren’t even the same endian.

You were the one who suggested that maybe the 3 core-ish cap in performance was related to the XBox 360 and that maybe ArenaNet was considering a version on that platform as well. You can’t be saying they were targeting the XBone because the game was in development from 2007-12.

The console suggestion was about the reason for the lackluster performance and the API choice; it had nothing to do with comparing the raw power of PCs vs. consoles. I also don’t understand what the development timeframe has to do with anything.

When it comes down to it, if I were one of the game’s developers I would have opted for DX9 as well, over 10 or 11, so more people could play the game. Even with the game as it is, you can get reasonably acceptable performance on most hardware configurations, since the game’s graphics are scalable.

Just another case of the pros outweighing the cons.

DX11 games will play on older HW; however, if the HW only supports DX9 and there are DX11-only effects and the like, those will not be shown/rendered on the DX9 HW.
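
The mechanism behind that is D3D11 “feature levels”: one device-creation call lists the hardware tiers the game can run on, and the runtime picks the best one the GPU supports. A minimal sketch (the helper name is made up, not from any specific game):

```cpp
// Hypothetical sketch of a D3D11 device that falls back to DX10/DX9-class HW.
#include <d3d11.h>

bool CreateDeviceWithFallback(ID3D11Device** outDev, ID3D11DeviceContext** outCtx)
{
    // Listed best-first; the runtime picks the highest level the GPU supports,
    // so the same executable runs on DX11, DX10 and DX9-class cards
    // (DX11-only effects simply have to be skipped on the lower levels).
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,  D3D_FEATURE_LEVEL_9_1
    };
    D3D_FEATURE_LEVEL chosen = {};

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
        outDev, &chosen, outCtx);

    return SUCCEEDED(hr) && chosen >= D3D_FEATURE_LEVEL_9_1;
}
```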

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

Come to think of it, I doubt there are a lot of people who even play Guild Wars 2 on a DX9-only graphics chip anymore. DX10 for Nvidia started with the GeForce 8 series gaming line, and with the Radeon HD 2000 series for ATI. Anything older than those two lines is extremely dated and probably way too slow to run the game adequately anymore, unless you had/have something like two 7800 GTs or two HD 2900s in SLI/Crossfire. Though multi-card setups of those generations would probably be more problems than solutions; Crossfire and SLI were still rickety and performance gains were eeehhhh.

Also, I went and sought out information that confirmed what you said, TinkTinkPOOF: that DX11 games will play on DX9 hardware, and that the games will have some sort of DX9 fallback mode. I find that actually interesting.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

Come to think of it, I doubt there are a lot of people who even play Guild Wars 2 on a DX9-only graphics chip anymore. DX10 for Nvidia started with the GeForce 8 series gaming line, and with the Radeon HD 2000 series for ATI. Anything older than those two lines is extremely dated and probably way too slow to run the game adequately anymore, unless you had/have something like two 7800 GTs or two HD 2900s in SLI/Crossfire. Though multi-card setups of those generations would probably be more problems than solutions; Crossfire and SLI were still rickety and performance gains were eeehhhh.

Also, I went and sought out information that confirmed what you said, TinkTinkPOOF: that DX11 games will play on DX9 hardware, and that the games will have some sort of DX9 fallback mode. I find that actually interesting.

Indeed it is, but then it makes you stop and think… WTF, devs, why are there so few games on the newer DX versions when MS has made it a point to be backward compatible? The only real answer is consoles, which is why, when new ones come out, I couldn’t care less about the HW in them; I want to know which DX they will be based on. And most of the time they are too cheap or lazy to do a proper PC version, and what we get is a chopped-together port that runs like crap.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

Come to think of it, I doubt there are a lot of people who even play Guild Wars 2 on a DX9-only graphics chip anymore. DX10 for Nvidia started with the GeForce 8 series gaming line, and with the Radeon HD 2000 series for ATI. Anything older than those two lines is extremely dated and probably way too slow to run the game adequately anymore, unless you had/have something like two 7800 GTs or two HD 2900s in SLI/Crossfire. Though multi-card setups of those generations would probably be more problems than solutions; Crossfire and SLI were still rickety and performance gains were eeehhhh.

Also, I went and sought out information that confirmed what you said, TinkTinkPOOF: that DX11 games will play on DX9 hardware, and that the games will have some sort of DX9 fallback mode. I find that actually interesting.

Indeed it is, but then it makes you stop and think… WTF, devs, why are there so few games on the newer DX versions when MS has made it a point to be backward compatible? The only real answer is consoles, which is why, when new ones come out, I couldn’t care less about the HW in them; I want to know which DX they will be based on. And most of the time they are too cheap or lazy to do a proper PC version, and what we get is a chopped-together port that runs like crap.

Because DX 11 was not available until 2009. DX 10 still required high CPU power as far as I’ve seen. ANet wouldn’t scrap 2 years of work just to change APIs anyway.

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: TinkTinkPOOF.9201

TinkTinkPOOF.9201

Because DX 11 was not available until 2009. DX 10 still required high CPU power as far as I’ve seen. ANet wouldn’t scrap 2 years of work just to change APIs anyway.

The first games with DX11 arrived in 2009, and only because we had to wait on Win7, since MS didn’t want to patch Vista for it (but ended up doing so anyway). That, however, is not the timeframe devs had to work with: DX11 was shown to the public back in ’08, and just like with DX12 now, devs already had their hands on it while it was still news to us. It also still gives no reason not to be on at least DX10, which fixed a good number of DX9’s problems; sure, it does not have the multithreaded support DX11 has, but it is better than nothing.

6700k@5GHz | 32GB RAM | 1TB 850 SSD | GTX980Ti | 27" 144Hz Gsync

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Behellagh.1468

Behellagh.1468

Experience plays a large part in choosing which APIs to use. When the project was started in 2007, DX9 was already 5 years old, 3 if you only count the latest revision. It was well known, and the tricks and gotchas of developing with it were fairly well understood.

DX11 did come out with Win 7, but it was still an API whose best practices were not well known. And you do not base your project on a “new” API just because it’s supposed to be better. Not the single project that your company’s future is leveraged on.

Of course, you would have to design the renderer to be scalable across multiple threads/cores for best DX11 support while not adding significant overhead if you are using a single thread for DX9. And then there were the rumors that the game was supposed to be out in 2010 and then 2011.

The people who embraced Dx11 early were the game engine manufacturers (it’s a competitive feature) and developers leveraged their games off of that infrastructure.

And when developing software, very early on during the process you fix your specs on what tools you are using and what APIs you are supporting. If those things change mid project, you are royally screwed as a developer.

We are heroes. This is what we do!

RIP City of Heroes

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: XFlyingBeeX.2836

XFlyingBeeX.2836

But they should implement a new API already.
Right now they have a chance to try out AMD Mantle.

Yes, it is very weird to use DX9 in 2014. Finally AMD supports a low-level API while Intel doesn’t care. It’s their fault some games still use DX9.

Yep, they make the superior CPUs, but I really wish AMD would defeat them.

GW2 is a really great game, but it is unplayable for everyone when 100 people fight.
Like I said, imagine other well-optimized games on DX9.

It is much cheaper for us to change Windows than to change an i5 2400 to an i5 4670K plus a great mobo and a great CPU cooler. Also, I don’t see much improvement, and the game is still unplayable in big fights.

Imagine playing this game at a steady 60 FPS when you’re in combat with 100 players around. Wouldn’t that be amazing?

No performance increase with the 780 Ti

in Account & Technical Support

Posted by: Avelos.6798

Avelos.6798

To save time explaining why they couldn’t and wouldn’t, I’ll just say no.