(edited by Ravenmoon.5318)
Not necessarily. It just means they were going to look into it.
If anything, good programmers are forward thinkers. That’s what sets them apart from mediocre coders. I’m fairly certain there is some form of an abstraction.
Not if the programmers were under time pressure so they could be moved to the next thing on the schedule. Not if the code was finished before Windows 7 came out.
Every layer of abstraction reduces performance. That’s why, from a performance point of view, it’s better to add an entirely new renderer for each supported API and pick one at game startup.
Do you even know how coding works <o> You don’t copy/paste renderer code across multiple projects in Visual Studio just so you can compile with different flags and support all of them. You write abstractions, then implementations of those abstractions for each API you have available. Then you instruct your compiler, through compiler flags, to compile the code you need. It’s how OO programming works. And this is just one option, there are many… for instance, you can ship multiple renderers in a single executable.
Google this kitten man. It’s basic programming principles.
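To make the abstraction idea concrete, here is a toy sketch in C++ (hypothetical names throughout, not anyone’s actual engine code): the game talks only to an interface, and one implementation per API is selected at startup.

```cpp
#include <memory>
#include <string>

// Abstract renderer interface -- game code only ever talks to this.
struct IRenderer {
    virtual ~IRenderer() = default;
    virtual std::string name() const = 0;
};

// One concrete implementation per supported API.
struct Dx9Renderer : IRenderer {
    std::string name() const override { return "DirectX 9"; }
};

struct Dx11Renderer : IRenderer {
    std::string name() const override { return "DirectX 11"; }
};

// Chosen once at game startup; the same decision could instead be
// baked in at build time with a compiler flag (#ifdef USE_DX11 ...).
std::unique_ptr<IRenderer> make_renderer(bool dx11_supported) {
    if (dx11_supported)
        return std::make_unique<Dx11Renderer>();
    return std::make_unique<Dx9Renderer>();
}
```

The rest of the game never needs to know which branch was taken; that is the whole point of the abstraction.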
maybe you should buy an AMD card.
I don’t know if ANet has any metrics, but most people I ask have superior hardware that can handle Dx 11 (and 12). Either that or there are many liars. Every time a hardware talk happens in map chat in HotM, people claim to have 290Xs in CrossFire, and here I am, happy to have a single one xD
Oh right. When the game launched I had Radeon HD 6870s in CrossFire. Gw2 has negative gains in multi-GPU builds, so I sold those and bought an R9 290X. There was some improvement, but the FPS still tanks.
It was fun though. Disabling crossfire in Gw2 raised my overall FPS. Fun times…
But so did many other games, including Crysis 2, until AMD released a driver profile for it. So I guess that’s not ANet’s fault after all. But the use of an ancient API is :U
Another funfact: among other things, with DirectX 12/Vulkan developers will spend LESS time fighting vendor drivers. They’ll have to do most of the low-level stuff themselves. On the bright side … Valve already did the heavy lifting for Vulkan.
No. I’m trying to turn this into a proper form.
Turret protection against criticals or conditions could still be traits. Because people don’t whine about turret traits already, right?
It would just shoehorn turret users further into full turret builds anyway, even more than now, assuming they wouldn’t end up in the same trait lines as the existing ones. Either way, the comparison is moot. Classes are designed and balanced differently.
Engineers even lack a second main weapon, so they have to rely even more on utilities. Thus those utilities can’t be balanced like other classes’ utilities to begin with.
Unless they change the whole design of the class, they can’t make them weaker without making them completely useless (and to be blunt, that’s exactly what some people in this forum are aiming for; I assume they’ll have to live with it).
Oh, those poor engineers with their instant main ability swap on 0 cooldown (compared to the elementalist, who also has no secondary weapon).
You know, the ele goes on cooldown when it switches attunements. The engi doesn’t.
I really hate to advertise other games in this forum, so just watch some YouTube clips and see for yourself. All I want is for my PC to be used at its fullest potential. It barely heats up in Gw2 … which basically means it is idling in this game….
I watched a 30 minute WvW video from TESO and personally it just looked horrible to me. Maybe they had better textures, but I really don’t care about that if it’s just 50 shades of brown, and player characters that run around like zombies.

And as for the FPS .. I still get, for example, 50-70 FPS when doing the pre-quests for the Svanir Shaman .. but the FPS goes down when the actual fight begins. So it’s not simply the rendering of all those characters, but either the effects in the fight, or the data that is being sent to and from the client about all that stuff, that slows down the FPS.

Sometimes stuff that people don’t really see slows down performance. I remember when devs at SOE explained why we had a quest log limit of 70: because for every step we take, the server has to check your entire quest log to see if you maybe just discovered something … or in AoC, somebody wrote about how much traffic was generated by stuff that was thrown away onto the ground ..
Yup, TESO has amazing environment graphics. Characters take a backseat, but in intense fights character animations take a backseat in Gw2 as well, to improve performance (somewhat, at least). There was even a setting where everyone ran the same character model with the same animation set. It looked funny and it still lagged. Not sure if they scrapped it.
You are absolutely correct, working with data eats limited CPU time. But here’s the example: 60 frames per second means your computer has ~16.6ms (milliseconds … 1/1000 of a second) to produce each frame. In that budget you have to do the game logic, which can get quite CPU-hogging on its own, and then go through the rendering stack, which has huge overhead. Drawing all those special effects Gw2 employs takes a lot of time. On top of that, a stuffed environment (trees, grass, structures, various textures) and, god forbid, hundreds of characters that, regardless of what you see, all have to be rendered at once.
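The frame-budget arithmetic above can be written out explicitly (a minimal sketch, with made-up function names):

```cpp
// Milliseconds available per frame at a target frame rate.
// At 60 fps this comes out to 1000/60 = ~16.67 ms.
double frame_budget_ms(double target_fps) {
    return 1000.0 / target_fps;
}

// Whatever game logic doesn't consume is all the rendering stack gets.
double render_budget_ms(double target_fps, double logic_ms) {
    return frame_budget_ms(target_fps) - logic_ms;
}
```

So if game logic alone eats 10 ms of a 60 fps frame, rendering (draw calls, driver overhead and all) has to fit in the remaining ~6.7 ms.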
The problem with the above is that DirectX 9 drivers use a single thread to process all the rendering work. The same happens in Dx 11, but there is a multi-threaded option. It’s not true multi-threading; you still have tons of overhead, but at least on paper you can split off some of the processing. Also, Dx11 multi-threading can be a user-space setting, so maybe you have it disabled in your LotRO?
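The Dx11 multi-threaded option works roughly like this toy model (purely illustrative, no real graphics API involved): worker threads record their own command lists in parallel, then a single thread submits them in order, so only the recording is parallelized while submission stays serial.

```cpp
#include <string>
#include <thread>
#include <vector>

// Toy model of Dx11-style deferred contexts: each worker thread records
// its own command list; one "immediate context" thread submits them.
struct CommandList {
    std::vector<std::string> commands;
};

CommandList record_scene_chunk(int chunk_id, int draws) {
    CommandList cl;
    for (int i = 0; i < draws; ++i)
        cl.commands.push_back("draw chunk" + std::to_string(chunk_id) +
                              " object" + std::to_string(i));
    return cl;
}

std::vector<std::string> render_frame(int chunks, int draws_per_chunk) {
    std::vector<CommandList> lists(chunks);
    std::vector<std::thread> workers;
    for (int c = 0; c < chunks; ++c)                // parallel recording
        workers.emplace_back([&lists, c, draws_per_chunk] {
            lists[c] = record_scene_chunk(c, draws_per_chunk);
        });
    for (auto& w : workers) w.join();

    std::vector<std::string> submitted;             // serial submission
    for (auto& cl : lists)
        submitted.insert(submitted.end(),
                         cl.commands.begin(), cl.commands.end());
    return submitted;
}
```

This is why it is not “true” multi-threading: the serial submission step (and the driver behind it) remains a bottleneck, which is exactly what Dx12/Vulkan command buffers were designed to relieve.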
What SOE is talking about is ancient. There are tons of sorting and searching/indexing algorithms; programming is all about data structures and representation. Database engines use those algorithms all the time to sort through millions of rows in the blink of an eye. Or you could build a cleverer quest system, kinda like Gw2’s hearts :P But using your example, nowadays you just need tons of RAM to keep that data “alive”, have the server respond only to client quest events, and only then sift through the active quests and verify the quest state. Warframe employs a similar system. Your Gw2 already does a ridiculous amount of tracking atm (e.g. all those achievements).
And yeah, the traffic issue is relative. Gw2 isn’t there yet.
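The event-driven quest idea can be sketched like this (hypothetical names, nothing to do with any real MMO server): index active quests by the event they advance on, and when the client reports an event, check only the subscribed quests instead of rescanning the whole log on every step.

```cpp
#include <string>
#include <unordered_map>

// Event-driven quest tracking: quests are indexed by the event type
// they wait on, so an event touches only its subscribers.
struct Quest {
    std::string id;
    std::string waits_for;   // event type this quest advances on
    bool complete = false;
};

class QuestTracker {
    std::unordered_multimap<std::string, Quest*> by_event_;
public:
    void activate(Quest* q) { by_event_.emplace(q->waits_for, q); }

    // Called only when the client reports an event; returns how many
    // quests were advanced. The rest of the log is never scanned.
    int on_event(const std::string& event) {
        auto range = by_event_.equal_range(event);
        int advanced = 0;
        for (auto it = range.first; it != range.second; ++it, ++advanced)
            it->second->complete = true;
        return advanced;
    }
};
```

With a 70-quest log, the SOE-style approach does 70 checks per step; this one does zero until an event actually fires.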
I never stated TESO has Dx 12. No current game has Dx 12 for the obvious reason that the SDK is not out yet.
Yeah .. but since the talk here is all about how we need DX12 for better performance, and then you bring up TESO as an example of better performance, it sounds as if DX12 would be the reason.

I don’t know how the performance of TESO really is. However, if it is really better, it’s maybe also better in DX9, and maybe just because they use very clunky animations and there is not much eyecandy there, whereas GW2 often simply has effect overkill.

DX11 in LotRO was also worse than DX9 when I tested it, and I switched back to DX9 after a very short time.
Is your GPU one with 512MB of VRAM? Because a proper implementation of Dx11 should give you at least 30% better performance. For the record, the pixel color data of a FullHD (1080p) frame is only about 8MB (1920 × 1080 × 4 bytes); the bulk of VRAM goes to the textures and whatever else the engine needs to keep resident to render the next frame.
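A quick sanity check on that framebuffer math (a minimal sketch, nothing engine-specific):

```cpp
#include <cstdint>

// Bytes needed for one 32-bit (RGBA8, 4 bytes/pixel) color buffer.
std::uint64_t framebuffer_bytes(std::uint64_t width, std::uint64_t height) {
    return width * height * 4;
}
```

Even with double or triple buffering plus a depth buffer, the buffers themselves stay in the tens of megabytes; on a 512MB card it is texture storage that runs out first.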
Also, TESO has a lot more eyecandy. Everything is more detailed, and the setting is photorealistic with higher-resolution textures. The animations, yeah, are hyper lame; then again, Gw2 has some of the best animations in the industry. But you have to be blind to say that TESO is not artistically breathtaking. And that is achieved at over 60 frames per second. It will also run on the “next gen” consoles, which are basically a $500 PC, come this July.
It’s a terrible business decision not to improve your game’s performance. I’m sorry, but let’s face it, Gw2 is not the only kid on the block. New games pop up all the time, MMOs among them, supporting new technology and utilizing everything you have. People are a lot more likely to give up on a laggy game, regardless of quality. See, I’m eyeing Black Desert, because it looks dope and runs buttery smooth. I’m eyeing Star Citizen. I’m quite tempted to pick up TESO, even though I despise ZOS for what they did to the franchise, just because the game supports – wait for it – massive battles at an acceptable framerate. Find people who have played both Gw2 and TESO and ask them which WvW was more enjoyable.
Most of the games are just another generic Asia Grinder 4711 with Unreal Engine or whatever .. and they look nice in all those videos, and after playing them for 2 hours you are already bored to death.

TESO, by the way, has NO DX12 .. and DX11 is often worse performance-wise than DX9.

http://www.reddit.com/r/Smite/comments/1scqu6/dx9_vs_dx11_differences_why_such_major/
I never stated TESO has Dx 12. No current game has Dx 12, for the obvious reason that the SDK is not out yet. TESO, however, is properly written to take advantage of the new API (and thus the new hardware), and as a result, hundred-man combat with stuff flying all over the place doesn’t kill your CPU.
Smite probably has a poor implementation of Dx11. Either that or they use stuff like HDR lighting or that supersampling technique to improve image quality; the VRAM usage difference is off the charts. Regardless, my GPU has 4gb of GDDR5, and any mid-range graphics card nowadays has 2 gigs of VRAM. VRAM is not really the issue.
P.S: I’m not completely against Asian grinders either, if they are fun. After all, I spent 8 years in Lineage 2. Good times. It had kittenty performance unless you had HDDs in RAID 0 (or later, an SSD) .. buuut RAID-ing two VelociRaptors was cheaper than a new CPU+mobo, albeit a bit loud x.o
And if we are going to compare apples to apples, TESO is built on the wretched Hero Engine, the same engine that powers SWTOR and drove that game’s cost up to $200kk. Notice the difference between the two?
http://anandtech.com/show/9124/amd-dives-deep-on-asynchronous-shading
Even more info for the uninitiated. If you’re bad with technical terms, at least enjoy the slides.
Guys, it’s not just player versus turrets. Stop being so banal. Any semi-brain-dead engi will burst you with the rifle alongside the turrets. If you do happen to destroy them, he’ll drop a crate on your face. By the time you are done with those, he’ll cast even more turrets… all while the thumper turret keeps knocking you down. Your safest bet is to +1 the guy and burst him down. And even that may prove difficult vs someone who knows what they are doing.
Cloth simulation (E.g. non-clipping capes) is a thing in DirectX 11 games. Among other things ….
I mean, really, the only negative I can see is the upfront cost. But that’s just how things work in the software world: huge upfront costs, then potentially cashing in big.
I think you misunderstood. Nobody cares about texture quality. What people care about is the major performance increase from Dx 12.
…seriously, so ppl whine about performance problems when they have an outdated computer…..and they want a higher DX version that in no way does anything good for GW2?
oh and if nobody cares about it, am i nobody then?
An i5-2500k @ 4.5 GHz and an R9 290X OC is hardly a toaster PC, my friend. Gw2 is the only game it struggles with atm. (I don’t tend to play older titles, but even if I did, the games of 10 years ago didn’t push the renderer as hard as Gw2 pushes it atm.)
Upgrading at the moment makes no sense either because:
1) You buy a year-old CPU/mobo, or
2) You miss out on the generation that is going to be revealed later this year by either AMD or Intel
3) DDR4 is still quite expensive. Buying 16gb modules now is suicide when, in a year’s time, today’s best will just be entry-level and will lose two-thirds of its price once manufacturing switches over to DDR4.
4) Current-generation GPUs are a bit of a sham compared to last gen (that includes AMD’s new R9 3xx). No die shrink was made, so they are, for the most part, the same GPUs on steroids (and in Titan X’s case, a larger die). Smaller-process GPUs are scheduled to arrive in 2016.
To the guy screaming about Mantle: sorry to disappoint, but as it is now, it is a proprietary API locked to AMD. Their plan was to open it to other vendors, but obviously nVidia wasn’t interested in supporting it. HOWEVER, the next OpenGL standard, namely Vulkan, is basically Mantle.
Here’s the blogpost about it: http://community.amd.com/community/amd-blogs/amd-gaming/blog/2015/03/03/one-of-mantles-futures-vulkan
I like AMD too but they have never had a working Mantle demo on an nVidia or Intel GPU.
Good news is, Vulkan is meant to work on any major OS (including WXP) and most modern GPUs.
AMD advised people not to focus on Mantle but on DirectX 12 and Vulkan. Mantle was a really good idea from AMD but became obsolete with DX12 and Vulkan.
Well, basically Mantle will become Vulkan … and nVidia and Intel are on board, as well as Apple (hint at the Mac Beta, which would be obsolete with DX12).
I doubt the graphics renderer is the only blocker for a real Mac port. I’m not talking PC vs Mac here, though.
I don’t need to back up anything … I’m not the one claiming we need DX12. I do, however, understand the implications of a fairly unnecessary upgrade for the game. The game works; it can be maintained and developed within the cost structure and business model surrounding it. You can slather ‘performance increase’ or ‘do more with the game’ across your rally banner all you want. It’s actually a terrible justification to upgrade from a business perspective, because what more needs to be done? How does that translate to ROI? Upgrading to DX12 doesn’t sell gems; the game would need to be unplayable or hit development dead ends before this suggestion would even make sense.
It would make more sense to just pump out more awesome skins on the Gemstore than it would be to waste time upgrading to DX12.
It’s a terrible business decision not to improve your game’s performance. I’m sorry, but let’s face it, Gw2 is not the only kid on the block. …
That’s a fallacy … people aren’t drawn to this game because of its OMG-awesome use of DX technology. If it’s a terrible business decision, why is the game still around after using ‘outdated’ technology for 2.5 years? If it were such a critical factor for players, then I think we need to understand how this contradiction exists.
Half of, if not more, of the serious WvW guilds are over at TESO. Gw2 is a game that sold 2 million copies in its first week. Where are those 2 million players?
It is sad, really. ANet has some really great and passionate developers. I saw the keynote at GDC; the in-house monitoring tools are fantastic. And the game is fantastic. But that renderer, man … on a game that tries to play at such a high scale, hundreds of players on a single map. Shooting yourself in the foot would be the mild way to put it …
P.S: Gw2 has survived for 3 years on the gem store alone. Do not underestimate that. They haven’t had a single staff cut since the game launched. On the contrary, the dev team has expanded and they are still hiring. Most of you make it seem like the game is running out of funds. If there is a time to do this, it’s after the HoT launch.
I’m curious whether people are trolling this thread or just don’t want better performance. Or maybe they’re playing on a potato and don’t want the rest of the playerbase to enjoy smooth gameplay?
No, we wouldn’t mind better performance, but we dislike those who constantly soft-pedal the effort required to add additional render pipelines that support newer APIs. It is hardly trivial. It requires people with a “deep magic” understanding of the newer APIs, and it may require massaging all the game’s assets to stay compatible with Dx9 plus the additional rendering APIs. It’s a specialized skill set. Not every code slinger can jump in feet first and knock it out.
Turbine did it because they have multiple products using the same engine so multiple income streams.
SoE/Daybreak did it because again, multiple products using the same engine so development cost is spread out.
WoW did it because they are sitting on a Smaug-sized pile of gold and can afford to without significantly impacting their bottom line.
Dice did it because they license the Frostbite Engine, and the same can be said of other game companies that license their engines to 3rd parties. They have to compete with each other, so if one does it, everybody needs to do it.
To do it right isn’t trivial. It isn’t a selling point for MMOs; nobody is rating or comparing MMOs based on frame rate. An ROI argument needs to be made inside the company: will the cost be offset by additional income, vs doing nothing or very little (tweaking the current engine and rendering code)? It’s a business decision at the end of the day, and if a business argument can’t be made for it, it won’t get done. Do not forget that in the life of ANet, they’ve had total income, GW and GW2, of less than $600 million over 15 years.
Aaand CCP did it for lulz and giggles? Since you conveniently missed that one…
I switched from a GTX 460 to a GTX 960 the other week. Man, what a difference for a CPU-bound game.
I think, a lot of the theories in this thread about the used technologies and their effects on performance are… let’s say… not very well founded.
I am running an R9 290X, which frankly destroys your GPU in every test in existence. I rarely get 100 frames per second. If I am in an intensive scene, it dips below 30.
Fighting the miasma in Lion’s Arch a year ago was cancer to me. The events after that, even more so. All those people zerging single boss mobs. That frame rate was going back in time.
To be honest, at the time I hadn’t overclocked my CPU (i5-2500k) because of heat issues, plus Crysis was maxing out at 100 fps, so it was running its stock clock of 3.3GHz + turbo (500MHz). Now that I’ve juiced an extra 1GHz out of it, the fps is more stable in low- to medium-intensity places, but if I dare go to WvW without effect LOD, I’ll be playing slideshow 2.
No, I won’t toss $600 at a CPU upgrade just to get decent FPS in one game, while everything else already runs at 60+ fps.
At this point I’ll be happy for multi-threaded Dx11 renderer xD
Face it, if 90% of games available require powerful GPU, and only Gw2 requires powerful CPU, where do you think 90% of people will invest more money at? Especially after the bullkitten Intel pulls with each generation, having to upgrade the motherboard to upgrade the CPU. Fun times.
Wish we could get some word from ANet on this topic, but communication is unfortunately their weak side.
I’ve seen MMO companies that are easier to “persuade” into adding a new feature than ANet is into answering a request.

What should they say?
If they say NO, they will be flamed.
If they say YES, they will be flamed if they don’t give a date, and if they give a date, they will be flamed if it is not ready by then.

They can’t win no matter what they do.
“We’ll look into it”
“Nothing is off the table”
“When it’s ready”
“Thank you for your feedback”
“It’s coming”
You know, nothing they haven’t already said! :p
Well, I don’t know; if non-ANet people could just stop worrying about revenues and ask for things without arguing over non-issues, maybe they’d listen.
Keep your Dx9 renderer if you love it so much. I want my Dx11/12 renderer and I’m willing to pay … wait for it … full game price for it. If you don’t like it, keep it to yourselves; it is that simple. I like to pay for my things. I don’t play Gw2 because it’s B2P; I play it because I like the game. Sub or no sub, especially at the de-facto standard of $15, really doesn’t bother me. Actually, I buy gems pretty often. Sometimes I store them for future use, because I want to support the game. True, that doesn’t give me any more rights than the next player, but just like the next player, I get to ask for things. It’s ultimately in ANet’s hands whether they do this or not. If they don’t, I think it will hurt them more. The game is already 3 years old; I would love to see it reach the age of 10 and tell its story. I like the graphics, I don’t like the framerate.
I think ANet could even pull a PR stunt out of all of this, to drive further purchases. I mean, after all, they sell TVs now with “TimeShift” technology, and I used to “TimeShift” in the 90s with my VCR.
Although a lot of people keep posting about how Blizzard created a DX10/11 renderer for WoW, it’s not the only company that has done so. Turbine created a DX11 renderer for Lotro and DDO, CCP for EVE and many many others. Did they all have way more money than Arenanet to hire actual programmers to work on that?
It is actually rather interesting that every single company people bring up as having made the change had a subscription-based game (both DDO and LotRO required a subscription back when they did the update). And yet no one seems to see the connection.
CCP’s revenue has always been lower than ANet’s. Go figure.
CCP’s best is less than ANet’s worst.
CCP recently axed World Of Darkness MMO because their funds were unstable.
Dust 514 is doing really poorly and drains the company.
Eve Online is their only stable revenue, which is low. They hope Eve Valkyrie will hit big in eSports.
I don’t understand the big deal. The game is fine as is. Real mmo gamers know that gameplay trumps pretty graphics.
It’s not about graphics, and stop insulting people by saying they play on toasters. You have the latest and greatest $500 CPU; that’s fine. The Dx9 pipeline probably doesn’t clog up your CPU scheduler with draw calls.
For the rest of us, who are used to playing games on ultra settings on anything remotely modern, it’s a huge problem having to lower settings and still play at 40 fps.
The networking layer of this game is fine for an MMO. It really is. If only the game were able to use more cores … instead of, you know, relying on a single core for draw calls and game logic.
@maddoctor – Thank you for the reminder. I currently play EVE Online alongside Gw2, and have been playing it over the years. The guys have implemented new renderers and new features every few years. A few years ago they refreshed all the textures and models in the game, making it look more modern. The last patch allowed them to use realistic lighting with surface reflections, thanks to the improved renderer. And I think their next expack will update a few more of the game’s models.
Best of all … the game runs smoothly at 120 fps completely maxed out (and frame-limited, because I don’t want it going over 120 fps).
Funny thing is, EVE Online is written in Python. Or was, the last time I checked, around 5-6 years ago.
WoW managed to make the jump from DX9 to DX11, so I don’t know why GW2 can’t do the same. I gained 25fps when Blizzard updated their engine and client to 64-bit. It would be a massive improvement to the game.
If we assume WoW currently has 7 million subscribers, each paying $15 per month, that means a guaranteed income of AT LEAST 105 MILLION every MONTH.

ArenaNet is most likely not even close to that level of income. Thus it is rather silly to compare them.
So you think the upgrade cost Blizz 105 mil? The cost of the upgrade goes into salary; nothing else needs to be revamped. They don’t need an art team for this upgrade. They don’t need a sound team; they don’t need a story team. What they need is a few hard-working devs (programmers) and 5 months. After that, some QA. But I can be a beta tester if they don’t want to pay for QA. Hell, I’ll bring all my friends to QA Dx 11/12 for free. xD
So if 10 devs working for 5 months cost ANet millions, I’m so passing them my CV lulz xD
@Obtena – Yes, the ROI on this upgrade is negative. But single-threaded DirectX 9 is forcing them to work more right now to optimize for this old renderer. Eventually those overwork hours will cost them more than the update itself. They could double the draw calls with Dx 11, and even more with Dx 12/Vulkan. It’s basic economics: in the long run, DirectX 9 is going to raise the cost of everything. In the long run they might even find it hard to find devs who know how to work with Dx 9. Keep in mind that good developers are in limited supply, and some good developers just don’t want the hassle of supporting old technology; they don’t even look at those gigs.
Ahhh, somebody can’t dodge and wants to spam condis from afar with autoattack while watching their favorite anime. I see what you did there, OP.
http://www.phoronix.com/scan.php?page=article&item=valve-lunarg-vulkan&num=1
There are your “millions of dollars”. It took Valve 4 months to port the whole kitten thing.
Education, I know it’s hard, but you have to do it.

P.S: Writing anything graphical under Linux is pure hell because of the driver mess there. These guys did it in 4 months.
P.P.S: And good guy Valve will release the whole thing as open source once Vulkan launches. And I highly doubt Mantle/Dx will be that much different to write against. Valve did the heavy lifting that y’all have been afraid of.
P.P.P.S: Now that I think about it, praise Gaben really has his aim set at Microsoft Windows and proprietary walled gardens :O
Note: You need to copy/paste the link, the forum’s redirector mangles the URL.
Valve.
The owners of Steam.
The creators of the cross-platform Source Engine that they sell to 3rd parties. An engine that already supports an OpenGL pipeline.
The company that Gabe Newell owns that made him a billionaire.
The company isn’t exactly paupers and since they sell their engine it’s important to keep up with their competitors in the game engine market.
Versus
ArenaNet, a company with less than $600 million in sales in its lifetime, which doesn’t sell its home-grown engine to anybody.
Nice attempt to equate the two companies there mate.
One thing you may have missed is that the work they just did to get this working will be released as open source.
P.S: If it wasn’t for Steam, Valve would be dead. The Source engine is also free. But even if it weren’t, there aren’t many companies using it atm outside of Valve. Mobile is dominated by Unity3D; AAA is dominated by UE/CryEngine/Frostbite/“homebrew”.
P.S: The Source Engine didn’t have an OpenGL rendering pipeline until Gaben decided to push for Linux through SteamOS. After that, they ported most of their games.
Funfact: a game can be made without a sophisticated multi-purpose graphics engine, you know, kinda like they used to make them in the 90s. But hey, if swapping a renderer means rewriting the game to you guys, by all means ignore this comment; I have nothing to talk about with you.
You clearly don’t see the difference.
The URL again: http://tinyurl.com/n6d2ndu
It’s funny, because the real pet/AI master – the ranger – has a really kittenty class mechanic for working with pets, and most people just avoid them with little to no pressure from the pet, outside a few nasty CCs that last a whole 2 seconds lulz, while the engi can just sit in the middle of a point and tank the crap out of any other build xD
Wells necro? Pls … here, have some flying turrets.
I think the HoT engi will be a step down. Those flying drones surely can’t match the flying rocket launcher!
It’s kind of a paradox that ppl complain about some build that is easy to counter and call it a braindead AFK build (which it is, in a way) when they themselves die to fire from stationary turrets because they stand in the fire and don’t even try to take them out.
One might start to think that the ppl who complain “nerf this, nerf that” on a daily basis use up all their time complaining on forums instead of practicing counterplay against the classes and builds they find difficult to play against for one reason or another.
Analytical discussion about class balance is another thing. But then you would need some valid arguments.
Be my guest and go destroy turrets vs a turret engi as a thief in a 1v1. Tell me again how that is not frustrating. Those turrets are tougher than the thief, lulz. They can’t be struck with criticals either, because structure c:
Or try with staff mesma.
Or as a necro, but have the engi place the turrets in the air.
Fun fun fun
1. shadowarts thieves
2. shadowarts thieves
What is this mythical creature you speak of <o>
NA doesn't deserve such a leaderboard since you have NO team able to defeat the Abjured. The top 5 NA players all play for the Abjured. Case closed lulz :p
This could easily be done, but I prefer S E X and sunshine. As do most other people.
Most vets and any truly serious players have since moved on. The above team found a game they were finally good at and will stick to it until its dying day.
They should come to SMITE/LoL/DotA2 and see how they really stack up against a real PvP community – not one based around meta builds, AI, or FotM. Odds are you would never even know they were playing the game.
cri mor lol
On topic: steamrolling every other team 500 to 100-something means those 5 players are better than the rest of the "pro" playaz listed here.
At least on the EU side, the WTS winners get their kitten kicked by other good players, so the matches are at least interesting, because there's no certainty about who will win.
You’re so cute with your Fanboiism.
No one cried; it was simply a stated FACT.
Now take a break, remove your head from inside their bums and breathe.
Maybe get a GF or go to a beach.
I don’t even like the Abjured. But facts are facts. Muricans can’t beat them. I’m not even from NA.
You cri.
I have a girlfriend, and it's still too cold in my country for the beach. But I will, in 2 months, when I get a month off work. Tyvm for your concern.
P.S.: Dota 2/LoL do have FotM builds. Guess you don't play those games enough to figure that out on your own :p
Turret engies are bad and they should feel bad.
#NoRespectForTurrets
This kitten ain’t skill bruh.
NA doesn't deserve such a leaderboard since you have NO team able to defeat the Abjured. The top 5 NA players all play for the Abjured. Case closed lulz :p
signed
15chars
What he said. +1
Haswell and Ivy Bridge run hot; it's not the game. GW2 is the only game that keeps my PC silent – ANY other game and it's as if I've turned on a hairdryer. Not to mention the cheap thermal paste they use under the lid, forcing people to delid their CPUs. Well, at least the ones that need some extra thermal headroom for overclocking, that is.
Googled it for you. Be thankful. The speed improvements are relative; certain applications even get slower.
P.S.: Thank you for derailing my thread with your uneducated answers. Most of you, at least.
@Ravenmoon I don’t have the problem, maybe you have fps issues and that’s why you have such a bad experience, but that doesn’t mean the game runs really crappy.
Also, I was talking performance at same speeds. Your 2500k at 4.5GHz is like a 4690k at 3.8GHz.
And about the rest – that shows how little you know about improving FPS. Yes, PCIe speed increases fps (tested). Yes, RAM speed increases fps (tested). Yes, some graphics options have a massive CPU load for minimal graphical gain. If I have time this Easter I'll run some tests on my desktop about all these kinds of things.
This post shows how little you know about hardware and how much you are willing to spend based on marketing.
Newsflash: there is NO difference between dual/quad-channel DDR3 RAM running at 1600MHz or 2133MHz. In fact, 2133 has a higher CAS latency. Google what that means, or if you're too lazy, check out the benchmark LinusTechTips made on YouTube; I'm not going to do it for you. Faster RAM means jack kitten in gaming. Period. If anything, it can slow you down.
There is virtually NO difference between PCIe 2.0 x16 and PCIe 3.0 x16 in gaming.
http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/#Conclusion
Got you a link that autoscrolls to the conclusion.
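For the RAM latency point above, here's a quick back-of-the-envelope check in Python (the CL9 and CL11 timings are typical retail DDR3 kits I picked for illustration, not numbers from this thread). The CAS count in cycles goes up with frequency, but in wall-clock nanoseconds it roughly cancels out – which is one reason gaming benchmarks show so little difference between kits:

```python
# CAS latency in nanoseconds = CAS cycles * 2000 / transfer rate in MT/s.
# (DDR transfers twice per clock, hence the factor 2000 rather than 1000.)
def cas_latency_ns(cas_cycles: int, transfer_mts: int) -> float:
    return cas_cycles * 2000 / transfer_mts

# Typical retail kits (illustrative timings, my assumption):
ddr3_1600_cl9 = cas_latency_ns(9, 1600)    # 11.25 ns
ddr3_2133_cl11 = cas_latency_ns(11, 2133)  # ~10.31 ns

print(round(ddr3_1600_cl9, 2), round(ddr3_2133_cl11, 2))
```

So the higher CAS cycle count at 2133 is real, but the absolute latency barely moves; any gaming gain or loss comes down to a couple of nanoseconds per access.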
Now, if you're done spreading bullkitten, can we please continue the conversation without derailing?
Oh, and changing a renderer does not mean rewriting the core of the game, or whatever jokes you guys came up with. You should all buy this book and read it cover to cover.
The guy explains pretty well what an engine consists of and how hard it is to abstract the rendering calls. Voodoo, I tell you.
P.S.: The problem GW2 is facing is that it can't feed the GPU fast enough, because it renders through a single core. Thread contention occurs when the scenes get intense, so you start generating frames more slowly because your CPU can't finish submitting the previous frame.
I wouldn't go as far as claiming that GW2's devs are lazy. That's far from the truth. If anything they are understaffed. GW2 is one of the major MMOs today, while having a rather small dev team compared to other AAA games – and GW2 is an AAA game. And the optimizations they've pulled off in GW2 are amazing. I respect them for that. But they are hitting the limit. You can only optimize so much without the graphics vendors fixing their drivers for you…
@tom – Win10 will be free to all windows 7 users o.O
@ravenmoon That was a UI bug fixed in a very recent patch, it’s not a valid point to argue about performance issues, as the problem has only existed for a few days in the whole game history, and it was a bug.
About the CPU, there are two reasons the 2500k struggles a little bit. First is single-core performance – Haswell has ~13% more – and second is PCIe speed, which GW2 really takes advantage of (Z77 is limited to 2.0 x16). And why do people keep mixing things up? This goes for you, Scoobaneic.
Just because a frog is an amphibian doesn't mean a horse is able to breathe underwater.
The same goes for game engines. Just because a single-player, non-online game has its code open (so people can make mods, adapt it to newer APIs…) doesn't mean an MMO like GW2 can do the same.
If you want to make a point, tell me which MMOs have open code so people can play with it…
It's still active. Just pop open a full inventory and let me know if you get any FPS hits.
Also, I’m playing every other MMO on ultra graphics with my machine at 60+ fps
Battlefield and Dragon Age: Inquisition go with more than 120 fps.
Don't compare my CPU to a stock-clocked Haswell, please. With turbo boost it goes even beyond 4.5GHz, and the liquid cooling keeps the thermals low.
Also, there is no consumer GPU available today that can max out PCIe 2.0 x16, even though they make them PCIe 3.0. I did the research before sinking $500 into an R9 290X. It is not being bottlenecked by my chipset; there's no difference between the 290X running on 3.0 or 2.0.
Again, WoW uses a proprietary engine. They've updated their renderer. It didn't cost them an arm and a leg.
Either way, in the long term the old renderer is only going to cost them more. You know, taking extra time to optimize stuff adds up in cost. I can't begin to fathom how many iterations of the HoT maps they went through before they had something that gives decent FPS. And that costs time, and time is money.
Regardless, adding another rendering pipeline would greatly benefit the game IMO, and many people agree with me. If they wrote abstract code, it should be easy for them to at least get Dx 11 running – which does have a multi-threaded rendering mode that would greatly benefit the more powerful machines.
Even if people only get a 15 fps improvement, it will be worth it. The difference between 25 and 40 fps is HUGE and quite noticeable.
P.S.: There comes a time in programming when you just have to bite the bullet and upgrade your tech, because maintaining old tech, more often than not, gets more expensive as time goes on.
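The "25 to 40 fps is HUGE" point is easiest to see in frame times rather than frame rates. A minimal Python sketch (just arithmetic, nothing from the thread):

```python
# Frame time in milliseconds for a given frames-per-second figure.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

before = frame_time_ms(25)  # 40.0 ms per frame
after = frame_time_ms(40)   # 25.0 ms per frame
print(before - after)       # 15.0 ms shaved off every single frame
```

Cutting 15 ms off a 40 ms frame is a 37.5% reduction in how long each frame hangs on screen, which is why the jump feels so much bigger than "15 more fps" sounds.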
i5-2500k OCed to 4.5GHz, why?
Plus, are you implying that BF4 doesn't process a kittenload of data? YOU are not getting what I'm saying. I'm telling you that even alone you can get framerate drops in GW2 because of scene intensity. You lose roughly 10 fps when you open your inventory (and it's actually full). I've seen it; there are videos about it.
But since you like vids: https://www.youtube.com/watch?v=JTofdajvMoo
Keep in mind I have a 120Hz screen, the framerate dips can be felt with or without VSync.
Guys, you also need to realize that DirectX 9.0c is one of the things that makes this game so heavily CPU-bound. The optimizations are obvious – many things in the game are asynchronous – and I appreciate that. But look at it from this perspective.
DirectX 9.0c launched in August 2004, and the 9.0 spec was released in 2002. That makes it more than 10-year-old rendering tech. What they have done with it is marvelous, but it's just not good enough.
Back in 2004, AMD was the king of performance with its Athlon 64. The $400 GPU of the day was the GeForce 6800 GT, Intel's best offer was the Pentium 4, and ATi was a standalone company.
We've come leaps and bounds since then in terms of performance and instruction sets. This API was tailored to that hardware; it can't take full advantage of today's. GPU computation (GPGPU) was practically a myth back then, but it's a reality today.
Not to mention that Dx11 has technology that wasn't even discussed back then. Dx12 even more so.
P.S.: Funny – in 2004 I didn't have my own PC. I was visiting internet cafés for my gaming fix hehe. I was alive when the internet café business started and died xD Much like the compact disc. kitten, I feel old now.
@Ravenmoon BF4 multiplayer is limited to 64 players, and there are fewer external things to process. In GW2, where it gets really messy is when 50+ vs. 50+ people engage.
There is a lot of stuff calculated by the CPU there (AI, player locations, projectiles, ranged attacks, AoE targeting, condition ticks, DPS calculation, hard-CC movement, pathing of transportation skills, dodges…). That's easily several times the amount of data the CPU has to manage compared to BF4.
All of that is work where the GPU does nothing, so it doesn't matter which API you use.
Just look at this video of how a GTX 660 can max the game with a minimum of 45 fps while recording:
https://www.youtube.com/watch?v=l1zWbnnjzOk
Also, the photos show how Dx12 will work compared to Dx11. Violet is the game engine, and red/blue is graphics rendering. We can see that Dx12 really matters when a game is very graphically demanding, which GW2 isn't.
You cannot compare the performance of a AAA FPS like BF4 with an MMO's performance. BF4 has far more complex graphics than GW2, and that's why Mantle/Dx12 improves its framerate so much.
GW2's framerate dips when you open your inventory… and it dips even more when you have 20 people around, not 64. BF4 can get pretty intense, with structures falling down all around, tanks shooting, and choppers, jets, and infantry firing – the list goes on and on, especially in major teamfights. And the game dips to 80 fps at worst, which is still fine in my book.
I didn't quite catch why you posted that vid. The guy in it is just wandering around an empty map in an okay scene (not too many objects).
It's one of the reasons I started playing sPvP instead of WvW. It just hurts in massive fights: I have no clue what's going on, and sometimes the particle effects fill my whole screen so I can't even see myself. That, and the weird skill lag WvW has been getting lately. But that's not a renderer issue, so I'll stop there.
OP are you a tanky turret engi? If yes, then you deserve it!
Just a reminder: Windows 10 will be free even for the pirates; however, updates will be disabled and won't be coming from Microsoft.
So really, they are just giving this OS away. I see no reason not to pick it up. And if the upgrade is as smooth as Win 8 to 8.1, I'd be all in without second-guessing.
I don't know why people think Dx12 would solve the FPS issues.
Dx12 only aims to redistribute the CPU side of graphics rendering across all cores; it doesn't change how the CPU processes the game engine.
And as GW2 is an MMO, all the NPCs, players, skill mechanics… are handled purely by the game engine. Graphics in this game are not a big deal; with a GTX 760 or R9 280 you can max it at 60 fps without problems.
Because more intense scenes drop the framerate, even when you are alone.
And as a person who has played around with Mantle in Battlefield 4, it's safe to say that I'm pushing 120 fps constantly. The same does not happen with the Dx renderer.
P.S.: @lordkrall, I don't think WoW was designed for Dx 11 when it was made 10 years ago either. Yet they have already upped their render tech, and they're upgrading the character models now (or did already, I don't follow it that closely) because the new tech allows more draw calls per frame without killing your FPS.
@Mirta – That's why I wouldn't mind a dropdown menu setting: pick the renderer you want for the best experience. I've dropped $1000 on my PC; pardon me for wanting all-around 60 fps, lol. All my other games pump out 120 at ultra graphics, but I'm fine with GW2 going at 60, really. It's just how love works. But that doesn't mean I can't ask for a little bit more. And I am.
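The "redistribute the CPU side of rendering across cores" idea being argued about above can be sketched in miniature. This is a Python stand-in, not a real graphics API – actual Dx11 deferred contexts and Dx12 command lists run on native threads, and every name here is made up for illustration – but the shape is the same: several workers record command lists in parallel, then one thread submits them in a fixed order:

```python
# Sketch: record draw commands on worker threads, submit on one thread.
# The "commands" are plain strings; a real renderer would record API calls.
from concurrent.futures import ThreadPoolExecutor

def record_commands(chunk):
    # Stand-in for the per-draw-call CPU work that Dx9 forces onto one core.
    return [f"draw({obj})" for obj in chunk]

scene = [f"obj{i}" for i in range(8)]
# Split the scene into contiguous chunks so submission order is preserved.
chunks = [scene[i * 2:(i + 1) * 2] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_commands, chunks))

# A single submission thread replays the pre-recorded lists, in order.
submitted = [cmd for cl in command_lists for cmd in cl]
print(submitted[0], len(submitted))
```

The point of contention in the thread maps onto this sketch directly: the recording step parallelizes, but the game-simulation work that produces `scene` in the first place does not, which is the other poster's objection.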
You are ignoring the fact that it is completely different to have a game built with multiple API versions in mind versus one that was not.
Guild Wars 2 was built from the ground up for DirectX 9. The whole engine is built specifically around having DirectX 9.
In order to make Guild Wars 2 DirectX 12 compatible, they would have to rebuild the engine, since it is not built to handle DirectX 12 (or 10 or 11) at this time. That will take money and resources.
You – don’t – know this for a fact.
Why do people think that a renderer is the whole game o.O Why would I need a few million in the bank? There are countless games running OpenGL/DirectX, lately DirectX/Mantle, or hell … multiple iterations of DirectX.
For instance, Warframe, a F2P co-op shooter, has Dx 9, Dx 10, and Dx 11 support. It also has a PlayStation 4 port, which to the uninitiated should sound like something other than DirectX (Sony uses its own graphics API there). You do the maths and tell me again what is possible and what isn't.
Plus from what ANet has shown so far in the game updates, their engine seems very well engineered. So I doubt they decided to cut corners with the renderer.
P.S.: World of Warcraft has both Dx9 and Dx11 renderers, btw…
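The "multiple renderers in one executable, picked at startup" setup argued for in this thread can be sketched like this (Python; the class names and backend strings are hypothetical, not from any real engine – a minimal sketch of the pattern, not an implementation):

```python
# Minimal renderer-abstraction sketch: the game code talks to Renderer
# only, and a factory picks the concrete backend at startup.
from abc import ABC, abstractmethod

class Renderer(ABC):
    @abstractmethod
    def draw(self, scene: str) -> str:
        """Render one scene; returns a tag here just so we can inspect it."""

class DX9Renderer(Renderer):
    def draw(self, scene: str) -> str:
        return f"dx9:{scene}"

class DX11Renderer(Renderer):
    def draw(self, scene: str) -> str:
        return f"dx11:{scene}"

def make_renderer(api: str) -> Renderer:
    backends = {"dx9": DX9Renderer, "dx11": DX11Renderer}
    return backends[api]()  # e.g. chosen from a settings dropdown

renderer = make_renderer("dx11")
print(renderer.draw("frame0"))
```

Because the rest of the game only ever sees the abstract `Renderer`, adding another backend means writing one new class, not rewriting the core – which is the argument being made against the "it costs millions" claim.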
Windows 8.1 (apart from the actual Start menu) is an upgraded W7. Also, I've had absolutely zero incompatibility issues in W8.1 with anything so far.
Windows 10 almost completely removes the tiles thing. From everyone actively testing it I've heard only good things… not one bad thing. Windows 10 will be a beast when it comes out.
Also, W7 doesn't know how to fully profit from the new generation of CPUs, memory usage, and so on. The jump in speed/performance I noticed when going from W7 to W8.1 was pretty big… performance that is also noticeable in games.
When W10 comes out, people still rooting for W7 will look exactly like the people who rooted for XP when W7 came out.
Let him repeat the same troll posts over and over that the internet dummies have been repeating. I absolutely love my Win 8.1. It's fast, it's sleek, and the mouse control is not shoehorned. They just moan for the sake of moaning. Don't even bother answering such people; it's obvious he has only seen Win8 in a screenshot.
As for the money I would pay: it's the same money as for any other Dx 12 game out there. In the case of GW2, though, I see myself going for collector's edition pricing, because I love the game.
P.S.: I only bought Battlefield 4 and Thief because of the Mantle API support – deluxe editions at that. I love how smooth those games are.