Showing Posts For Korval.2197:
The new colors are indeed much too bright and do not fit the rest of the WvW art style at all. The colors should be changed back to the old ones. If the goal here was to help colorblind people, then add an option in the game preferences that allows players to choose between the muted color style and the “screaming” color style.
Evan,
I’ve logged into Lord of the Rings Online today and done some experiments with the camera there to see how it behaves differently from the Guild Wars 2 camera. The most noticeable difference is that the click/drag threshold in Lotro is much, much lower than in GW2. In Lotro it looks as if the threshold is < 10 pixels, and it might be as low as five pixels – at most 25% of the threshold that you use in GW2.
I’ve been playing Lotro since 2008 and I’ve never had it happen that I selected something when I wanted to turn the camera, or that the camera turned when I wanted to select something. I’ve also never seen anyone complain on the Lotro boards about this kind of problem. In GW2, on the other hand, I ran into this click-vs-drag problem right in beta weekend 1.
Disabling right-click to attack/interact is certainly the safest first approach to this issue. There are two problems, though. Anytime you have camera-look and targeting on the same input, you will have conflicting behavior. There isn’t a bug that targets are randomly selected, but rather the game trying to be smart at recognizing clicks versus camera movement. Even when you think you’re perfectly clicking on a target, most of the time you’re also moving the camera a tiny, tiny bit. There is a small threshold that determines whether your tiny movement should actually be a target selection. Lowering this threshold will reduce the chances of this happening, but is one of those numbers where we need to sit on it for a while to see if it ‘feels right’. The problem could also be solved by completely separating the different functionality of selection, auto-attacking, and camera movement. However, completely splitting those is not something we’re prepared to work on at the moment as it is a much broader-reaching change. I hope this helps clear up what’s happening and what steps we’ll take to change it. Thank you again for all the feedback!
The primary goal should be to remove right-click selection altogether. There is no good reason why Guild Wars 2 should be a special snowflake and try to overload camera movement with selection – an action that for the past decades has customarily been done by left-clicking.
However, there also appears to be a bug in the way that the input handling code tries to differentiate between clicks and drags. You can see it by executing the following two tests:
1) stand in front of a (green) mob. Move the mouse cursor on top of the mob. Press the mouse button and move slowly left/right/up/down.
Observed behavior: at first the camera does not move and the mob is not selected. At some point the camera starts moving, namely when we cross the click – drag threshold. Releasing the mouse button as long as the camera hasn’t moved yet selects the mob. Releasing the mouse button once the camera has started moving does not select the mob.
All this is as expected.
2) Repeat test (1) but this time move the mouse very fast/flick it.
Observed behavior: there’s roughly a 50% chance that the mob will be selected after the mouse-up. If the mob is selected, the selection occurs regardless of whether the camera has moved or not.
This inconsistent behavior in the differentiation between clicks and drags is the most irritating part about this right-click “feature”.
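To make the inconsistency concrete, here is a minimal sketch of a click-vs-drag discriminator that stays consistent under fast flicks. The class name, the API and the Lotro-like threshold value are illustrative assumptions, not taken from the actual GW2 input code: the key idea is that once the cursor has ever left the threshold radius during a press, the press is a drag for good, regardless of how fast the mouse moved.

```python
# Hypothetical sketch, not the real GW2 input handling.
DRAG_THRESHOLD_PX = 5  # assumed Lotro-like low threshold

class ClickDragTracker:
    def __init__(self, threshold=DRAG_THRESHOLD_PX):
        self.threshold = threshold
        self.down_pos = None
        self.dragging = False

    def mouse_down(self, x, y):
        self.down_pos = (x, y)
        self.dragging = False

    def mouse_move(self, x, y):
        if self.down_pos is None or self.dragging:
            return
        dx = x - self.down_pos[0]
        dy = y - self.down_pos[1]
        # Once the cursor leaves the threshold radius, this press is a
        # camera drag for the rest of its lifetime, no matter how fast
        # the mouse moved. The decision depends only on displacement.
        if dx * dx + dy * dy > self.threshold ** 2:
            self.dragging = True

    def mouse_up(self, x, y):
        was_drag = self.dragging
        self.down_pos = None
        self.dragging = False
        return "drag" if was_drag else "click"
```

Because the decision is latched from total displacement rather than per-event mouse deltas, a fast flick and a slow drag over the same distance always produce the same result, which is exactly the consistency that test (2) shows is missing.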
I’ve been exclusively playing a staff elementalist in WvW since the release of the game, with a 30/30/0/10/0 build and an 80/20 ratio of offensively to defensively oriented gear.
The fundamental problem with staff, in my experience, is not that the AoE is too weak or too powerful. The fundamental problem is that the single-target spells are too unreliable, which often forces me to throw my AoE at a single player if I actually want a chance at killing him. It is pretty ridiculous that I have a higher chance of succeeding at a 1v1 if I cast Meteor Shower rather than using any of my single-target spells.
So why is that the case? There are three problems with the single-target spells:
a) Perpetual miss after target change: the first spell that you cast AFTER you have changed target from A to B, or reacquired a target that you had targeted a short time ago, will always miss. Every single time. Without exception. E.g. I have a guy to my left targeted, and now I switch my target to the guy to my right. Now I cast Fireball: guess where the Fireball is flying – to the guy to my left…
This is also one of the reasons why fighting a thief can be extremely frustrating. Every time he goes into stealth I lose him as my target. Once he comes back and I have reacquired him as my target, the first single-target spell that I cast at him will miss.
Sometimes even the second and the third cast after target acquisition misses.
b) Too slow projectile flight speed. It is easy for an enemy to dodge my projectiles or outrun them because they fly so slowly. Combined with the fact that casting single-target spells tends to take 1 to 2 seconds, this means it is very easy for my enemy to ensure that he will rarely ever get hit.
c) Projectiles don’t pierce. As soon as a Ranger’s pet decides to run between me and the Ranger – I can’t hit him anymore. Same with Necro pets or any other kind of NPC/pets.
Weakening the AoE that staff elementalists have is not the right answer. The right answer is to fix the single target spells. The AoE is currently the only reliable way that I have to actually hit my target. So if you want to see me throw less AoE around then make it actually possible to kill someone by using my single target spells.
Just to make it clear: the problem with the single target spells is not the amount of damage they cause – this part is fine – the problem is that they are too unreliable at hitting.
Cider does not use the X11 driver because Mac OS X does not use X11.
Your logic is flawed.
Cider is based on Cedega which was Transgaming’s proprietary fork of Wine. Wine uses X11.
The goal of the Cider project is to provide a Win32 implementation on top of Mac OS X. The goal of the Wine project is to provide a Win32 implementation on top of POSIX compatible systems. POSIX systems have traditionally been using X11 as their graphics interface while Mac OS X uses the proprietary Quartz architecture instead.
Consequently Cider uses a Quartz-based driver instead of the X11 driver that Wine uses.
If Cider used X11, then the GW2 app would have to come with an X server, since no X11 is installed by default on Mac OS X. But looking through the GW2 app package shows that there is no X11 server. Instead we find a library called “libquartz”. Okay, maybe Transgaming went to the trouble of creating an in-process X11 server that is linked into the wineserver or the cider process. So let’s check with Instruments: nope, none of the threads references anything X11 related.
Looking through the cider process (the thing that gets started when you double-click GW2) we see references to Cocoa (the native windowing/app API), references to CGL calls that set up OpenGL pixel formats and contexts (they wouldn’t be there if this thing used X11), and references to “macdrv”.
So considering that you cannot rely on the presence of an X11 server on a Mac, and the fact that there is no X11 server built into the Cider port, the question is how something that does not exist on the machine can draw anything on the screen. That’s a bit mysterious.
Finally, changing the GraphicsDriver entry in the config file from “macdrv” to “x11drv” causes the game to stop working – it no longer starts up with that change. Reverting the change to “macdrv” makes it work again.
I do have XQuartz installed. Is that why I had a dramatic change when adjusting the VideoRam setting?
What does dramatic mean? For me, dramatic would be an increase of 40 to 50 FPS. Anything around 10 FPS is ignorable because that is the amount by which the in-game FPS meter fluctuates over time. Up to 20 FPS is debatable since you can produce that increase by turning your avatar 360 degrees in the right spot. It’s also not clear how the change was measured.
But even if you reach more than 70 FPS you will still not necessarily be happy with the Mac OS X version of GW2. I’ve been running the game on a late-2012 27 inch iMac with the GTX 680MX graphics card with 2GB VRAM, and while the FPS is good, the responsiveness of the client to user input is bad. This is most noticeable when doing a simple 360 degree turn with the avatar, or in WvW. Turning the camera is very jerky compared to the Windows version. Note that the Windows client does in fact have the same problem, but it is far less noticeable there.
This has everything to do with how the Cider wrapper handles system memory.
There is no need to speculate since we can use the OpenGL profiler and Instruments to find out what causes the jerky behavior.
This is the test case:
a) Log into the game and make sure that you are not in the Mount Maelstrom playfield.
b) Waypoint to Bard’s Waypoint in the Mount Maelstrom playfield.
c) turn your character via the keyboard by 360 degrees.
Observed result: very jerky movement, lots of dropped frames.
According to Instruments the CPU load was between 340% and 390% while we were turning (400% is the max value for the 4-core CPU I’m running this on) – double the amount before we started the turn. The stack trace probes indicate a high amount of texture uploads (glCompressedTexImage2D() and glTexImage2D()), plus shader compile and link operations, plus a lot of framebuffer flushes.
The OpenGL profiler gives us more details: while turning, the game has uploaded 1,600 (!) compressed textures and 138 uncompressed textures to the GPU. For each texture load operation the pixel store attributes and the texture environment parameters were changed. 20 pixel and vertex shaders were compiled and linked and lots of drawing happened.
Now it is clear what is going on and why the same problem exists on the Windows side, though less pronounced. The game generates an extremely high number of texture uploads even when most of the view is blocked by a mountain right in front of the avatar. The uploads happen in a way that is inefficient for OpenGL (state changes are bad), and a number of shaders had to be translated from HLSL to GLSL, compiled and linked.
To make this easier to see, assume that one texture upload takes 1ms. At 1,600 textures this means we spend 1.6 seconds just in uploading textures. No wonder that turning is so slow.
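The back-of-the-envelope math, spelled out (the 1 ms per-upload cost is the assumption from above; real per-upload cost varies with texture size, driver and bus):

```python
# Assumed cost per texture upload; treat this as an illustration only.
ms_per_upload = 1.0

compressed, uncompressed = 1600, 138   # counts from the profiler run above
total_ms = (compressed + uncompressed) * ms_per_upload
print(total_ms)  # 1738.0 ms spent in uploads alone during one turn

# At a 60 FPS target a single frame has a ~16.7 ms budget, so roughly
# 16-17 uploads already consume an entire frame by themselves.
frame_budget_ms = 1000.0 / 60.0
print(round(frame_budget_ms / ms_per_upload))  # 17
```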
None of this is helped by the fact that the Cider port has to translate Direct3D shader programs and API calls into OpenGL shader programs and API calls at runtime, and that the DX9 API design is very different from OpenGL. The only good thing is that DX11 is design-wise closer to OpenGL. But GW2 is still stuck in DX9 land, so that doesn’t help either.
Again, this is the wrapper’s fault. Transgaming has left the wrapper where it is for over a year and a half now (if not longer). No significant engine changes have been made to it in a very long time; ask any City of Heroes player about Transgaming. You’ll be lucky to see any major updates to the wrapper in the next leap year.
There is no magic pixie dust that could be applied to the Direct3D -> OpenGL translation problem that would make it much faster. Translating the API calls and HLSL shaders to GLSL shaders is costly no matter who implements the cross-compiler.
Hi all, I think a little more explanation is in order.
First, let’s call the culling system that we’re switching to “affinity culling” just to make it easier to talk about. Under affinity culling the system handles enemies and allies independently. This means that the maximum number of allies that you can see under affinity culling is 1/2 the maximum number of characters (combined enemy & ally) that you could see under the original culling. Ditto for enemies. In exchange for that cost, the benefit that we get is that running with (or through, or past) a large group of allies (e.g. a guild) won’t prevent you from seeing the enemies who are closest to you.
…
Thanks for the details.
How are the lists organized? My impression back in December was that both the enemy and the friendlies lists are FIFOs. I was part of a zerg that did a golem rush on Overlook, and while we were running from SM to Overlook the golems would repeatedly appear and disappear for me despite the fact that I was only maybe 5m behind them. The same thing happened when we reached the gate, and it was very irritating that they sometimes appeared for me in front of the gate and a couple of seconds later were gone from my view. This is the behavior I would expect if the friendlies list is a FIFO, because then a new ally that enters my visibility area would kick the front-most entry off the queue – which in my case was the golem(s).
In any case, I think that the friendlies list should be a priority queue where the priority is based on something like the following criteria:
a) commanders.
b) mobile siege equipment, aka golems, since they can have a big impact on the outcome of a battle.
c) siege equipment operators. It is important to be able to see them in order to provide healing and protection.
d) people that are in the same group as I am.
e) people that are in the same guild as I am.
f) people that are in the same squad as I am.
g) anyone else.
Now the question is whether the enemy list should use a similar priority system. I would say no, because this would mean that enemy commanders, golems, siege operators, etc. would de facto always be visible and consequently be focus-fired down in no time. Instead the enemy list should be a simple FIFO to ensure more fairness.
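A priority-based friendlies list along the lines of (a)–(g) could be sketched like this. The tier names, the class and its API are all hypothetical; the real client’s data structures are unknown to me:

```python
import heapq

# Lower number = higher priority = evicted last. Hypothetical tiers.
PRIORITY = {
    "commander": 0,
    "golem": 1,
    "siege_operator": 2,
    "party_member": 3,
    "guild_member": 4,
    "squad_member": 5,
    "other": 6,
}

class AllyList:
    """Bounded visibility list that evicts the lowest-priority ally first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.heap = []     # min-heap; negated priority so the most
                           # expendable entry is always at the top
        self.counter = 0   # within a tier, the oldest entry is evicted
                           # first (plain FIFO tie-break)

    def add(self, name, role):
        prio = PRIORITY[role]
        heapq.heappush(self.heap, (-prio, self.counter, name))
        self.counter += 1
        if len(self.heap) > self.capacity:
            # Pop and report the lowest-priority ally; it becomes invisible.
            return heapq.heappop(self.heap)[2]
        return None

    def visible(self):
        return {name for _, _, name in self.heap}
```

With this structure a golem never gets pushed out of view by a random passer-by; only a higher-priority ally (or capacity pressure within its own tier) can displace it, which directly fixes the Overlook behavior described above.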
I don’t want to be the bearer of bad news, but since I’m a software developer who writes applications for Mac OS X I feel it is necessary to shed some light on this topic, because there seems to be some confusion here.
If we’re confused how are we seeing better performance?
See the explanation in my previous post.
Don’t get me wrong. I’d like to have a faster Mac OS X version of GW2 just like anyone else here but as long as ArenaNet doesn’t develop a native client either in-house or by outsourcing it to a porting house, our options to improve the FPS are very limited short of buying a late 2012 or newer Mac.
What you can do to improve the FPS are:
1) Go through the “Render Sampling” settings. The default is “Native”, which may not be a good choice for your graphics card. Try “Subsampling” and “Supersampling” instead. One of those settings might give you more FPS.
2) Run the game in windowed mode by checking the Window checkbox in the preferences dialog, then make the window smaller than the screen. The FPS that you can achieve is directly dependent on the number of pixels that the game needs to draw. Consequently, a lower screen resolution, or running the game in a window that is smaller than the screen, will give you better FPS.
3) Reduce the quality of reflections. Reflections require multi-pass rendering which lowers the FPS.
4) Reduce the quality of shadows. Same problem as the previous one.
5) Turn off “Depth Blur”.
6) Turn off “Vertical Sync” if you are usually below 60 frames per second while out in the open world. Having that option turned on when you do not get more than 60 FPS consistently can significantly cut down your FPS. E.g. if you hover around 50 FPS, then turning v-sync on will cut your FPS to 30, because the LCD screen is refreshed 60 times per second and a frame that isn’t ready at a refresh has to wait for the next one.
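The quantization in point 6 can be written down as a small formula. This is a simplification that assumes plain double buffering on a 60 Hz panel (no triple buffering, no adaptive sync): a frame can only be presented on a refresh boundary, so the effective rate is the refresh rate divided by the whole number of refresh intervals each frame takes.

```python
import math

def vsynced_fps(raw_fps, refresh_hz=60):
    # Number of whole refresh intervals one frame occupies, rounded up
    # because a frame that misses a refresh waits for the next one.
    intervals_per_frame = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / intervals_per_frame

print(vsynced_fps(50))  # 30.0 - each frame occupies 2 refresh intervals
print(vsynced_fps(65))  # 60.0 - capped at the refresh rate
```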
But even if you reach more than 70 FPS you will still not necessarily be happy with the Mac OS X version of GW2. I’ve been running the game on a late-2012 27 inch iMac with the GTX 680MX graphics card with 2GB VRAM, and while the FPS is good, the responsiveness of the client to user input is bad. This is most noticeable when doing a simple 360 degree turn with the avatar, or in WvW. Turning the camera is very jerky compared to the Windows version. Note that the Windows client does in fact have the same problem, but it is far less noticeable there.
None of this is helped by the fact that the Cider port has to translate Direct3D shader programs and API calls into OpenGL shader programs and API calls at runtime, and that the DX9 API design is very different from OpenGL. The only good thing is that DX11 is design-wise closer to OpenGL. But GW2 is still stuck in DX9 land, so that doesn’t help either.
I don’t want to be the bearer of bad news, but since I’m a software developer who writes applications for Mac OS X I feel it is necessary to shed some light on this topic, because there seems to be some confusion here.
Cider does not use the X11 driver because Mac OS X does not use X11. Instead it uses a Quartz-based driver. Quartz is the native graphics architecture that Mac OS X uses. This means that changing anything in the [x11drv] section does not have an impact on the GW2 client. I.e. changing the VideoRam setting to 32MB gives you the same FPS as changing it to 4096MB.
But why would you change the VideoRam setting to 32MB? To check whether it even has an impact on the client.
32MB is just enough to hold the frame buffer of the game on a 27 inch iMac. The frame buffer consists of a front and a separate back buffer, and together they consume 28MB of VRAM, which means that only 2MB would be left for textures and 2MB for shaders and geometry data. So the expectation here is that, if this setting were evaluated by the Cider graphics driver, the game should either no longer run at all, or, if it runs, the majority of the screen should be black or covered with random colors. Neither is the case, though, and the game runs fine no matter what value you assign to the VideoRam setting.
Changing the Enable3D setting in the [macdrv] section, on the other hand, from 1 to 0 does have a very significant effect on the game: it no longer starts up when you do this. Changing the setting back to 1 makes the game work again. This is as expected, because when you look into the [wine] section of the config file you’ll see that the GraphicsDriver setting is set to macdrv.
But why then would changing a setting that the game doesn’t use change the FPS in the game?
Because you are (a) looking at the FPS meter that is built into the game and (b) not doing a proper performance comparison. The FPS meter in the game is not a performance measurement tool; it only gives an indication of where you are performance-wise. It is good enough to give a general idea about the performance, but it is not precise enough to compare similar levels of performance. The in-game FPS meter tends to fluctuate a lot because it integrates over only a 1 second time span, which is not enough to filter out changes in CPU load caused by the game engine itself or by load changes in background processes (there are always dozens of processes running in the background while you play). E.g. just logging into the game, going to an area where no other mobs or players are, looking right at a wall and just standing there while not using any abilities causes the FPS to constantly change by around 10 FPS. Doing a 360 turn with the avatar and looking back at the wall “improves” the FPS for a few minutes by up to 20 FPS.
So in short: if you want to measure performance improvements, then you have to use a tool like Apple’s OpenGL Analyzer that is available for developers. You also need to make sure that you do the comparison on a freshly booted system to rule out any side effects from caching or other applications, and you need to find a place in the game where there are no other mobs or players and where you can look at something that triggers backface culling of anything that might be drawn in the distance (i.e. a wall, a tall hill, a large tree). (This type of culling has nothing to do with the culling problem in WvW – “culling” has multiple meanings in software engineering.) Otherwise those mobs or players will have an impact on the FPS value and thus on the performance comparison.
The default respawn rate is 95-120 seconds, so if you’re seeing locations where they respawn much faster, please report it with specific info! Event creatures will often appear more quickly (so the event doesn’t look dead if players kill them immediately) and if players were killing things in the same area, they might respawn near you. (i.e. if a group ran through and cleared an area a minute ago, a lone player might be in the area when their respawn timer comes up.)
A respawn time of 95 to 120 seconds is in general very fast, if not too fast. E.g. Lotro has a minimum respawn time of 180 seconds, and WoW is also in that area (modulo its dynamic spawn timer adjustments), as far as I remember from my tests back in TBC. Also, Lotro tries to avoid respawning mobs in the same location in which they died, which has the nice side effect of creating the impression that the player’s action had a consequence: namely that the mob is gone for good (although in reality it respawns, but likely out of his sight).
The problem by the way is observable all over Tyria and gets compounded by high mob density in areas like caves, skill point locations or Orr.
One reason why the current respawn rate feels so bad is that if you want to kill a veteran/champion mob and you proceed to clear out the area around that mob, then the mobs that you cleared out will respawn at least once, and potentially more than once, while you try to down the veteran. This is especially annoying if the veteran is part of a skill challenge. It is even more annoying when you remember that we are supposed to actively avoid incoming damage via dodging and other skills, because there is no RNG-based passive avoidance system in the game. But to be able to actively avoid I need space, so a high respawn rate plus a high mob density on top of that is a problem and can get aggravating fast. And that’s one big reason why so many players dislike Orr.
Even trivial things often do not work: reading a sign after you have killed the mob that was standing right beside it frequently fails, because the mob respawns on top of you while you are still reading. Or just looking at the map can be enough to get you killed, because the mob that you killed seconds before you pulled up the map has already respawned.
So what should be done to improve this is:
a) increase the minimum respawn time to at least 150 seconds.
b) a mob that has been killed should preferably not respawn in the same location where it died. Instead the spawn generator should try to find a location somewhat away from the point where the last mob died. Only if no such alternative point can be found should the new mob spawn where the old one died.
In technical terms: the spawn generator covers a disk with radius R. The last mob death point is stored by the generator on the death event and a radius r << R is associated with the death point. Spawning a mob means computing a random coordinate inside R but outside r. For every attempt at generating a valid spawn location that fails, r is halved. If N attempts have been made then r is set to 0.
c) the mob density in caves and other confined spaces should be lowered.
d) the mob density around veteran/champion/skill challenge mobs should be lowered, and/or the respawn time of regular mobs around them should be longer than the standard respawn time, so that players can kill veterans while getting at most one wave of respawns of the surrounding mobs. You may also want to increase the leash distance of veterans/champions in the open world.
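The spawn rule from point (b) could be sketched as follows. The function name, the number of attempts and the fallback behavior are illustrative assumptions; only the scheme itself (disk of radius R, shrinking exclusion radius r around the death point, r halved per failed attempt) comes from the description above.

```python
import math
import random

def pick_spawn(center, R, death_point, r, attempts=8):
    """Pick a point inside the spawn disk of radius R but outside the
    exclusion radius r around the last death point."""
    for _ in range(attempts):
        # Uniform random point in the disk of radius R (sqrt keeps the
        # distribution uniform over area, not clustered at the center).
        ang = random.uniform(0.0, 2.0 * math.pi)
        rad = R * math.sqrt(random.random())
        x = center[0] + rad * math.cos(ang)
        y = center[1] + rad * math.sin(ang)
        if math.hypot(x - death_point[0], y - death_point[1]) >= r:
            return (x, y)
        r /= 2.0  # each failed attempt halves the exclusion radius
    # After N failed attempts, r is effectively treated as 0:
    # fall back to the death location.
    return death_point
```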
(edited by Korval.2197)
You still have to change the config file since it defaults to 256 mbs of video ram see attachment.
Can you elaborate on how to do this? I’m new to the Apple world.
Many thanks!
There shouldn’t be any need to change anything in a config file. The change that Skippygoober described above was in the X11 driver section but Cider does not use the X11 driver. Instead Cider comes with a Quartz graphics driver (called “macdrv”). So if there is something you want to configure about it then you need to look into the [macdrv] section.
I’ve been running GW2 now for a couple days on a late 2012 27in iMac (3.4GHz Core i7 / 16GB / Fusion Drive / GTX 680MX with 2GB VRAM) and I get around 50 to 70 FPS depending on the area in full screen mode with high settings.
I also play Lord of the Rings Online on the same machine, for which Turbine released a native client 2 months ago, and I get around 70 to 90 FPS in the open landscape and 60 to 70 FPS in larger towns in full screen mode with high settings and high texture resolution.
While I’m quite happy with the FPS that the GW2 client achieves on that machine, I’m irritated by two mis-features compared to Lotro:
1) the fans come on easily and often when I play GW2, while they never come on in Lotro even when I play for 2 hours or more in a row.
2) if I turn the camera after I’ve entered an area or waypointed somewhere, the first 360 degree turn is jerky. It is as if the game were loading art assets on the main thread, or building some expensive data structures. This doesn’t happen in Lotro.
I think it’s a valid mechanic/tactic but culling issues makes it overpowered. In terms of counters, knockdowns, fears, etc seem to do the trick. It will be interesting to see whether it retains its value once culling issues are improved.
The primary question here, though, is not whether ANet thinks that portal bombing is a valid gameplay mechanic. The primary question is what we expect WvW to become over the next one, two and more years, and what can be done to ensure that it remains fun over a long time.
Portal bombing is a gameplay technique that is too good, in the sense that it allows you to move a large number of players from A to B near-instantly in order to take your enemy by surprise and overwhelm him, while the portal bombing side takes little risk and has to invest only a comparatively small amount of effort to pull this off. It is also a technique that favors the side that is already stronger, because it can shove more players through the portal, and by doing so the side with fewer players is demoralized even faster.
The portal bombing technique also favors a kind of gameplay where it becomes mandatory to have a certain class in your army and where those toons are considered a liability to the army outside the specific cases where they are used to create a portal.
The rendering problem that is currently triggered by portal bombing is just icing on the cake. Assuming that ANet is able to fix it, I wouldn’t be surprised at all if some players then started to shove even more people through the portal to purposefully induce lag at the portal destination. If you’ve participated in a large Stonemist Castle fight then you may have seen that AoE and heal skills start to time out at a certain amount of lag. The only way to use them in that case is to leave the area.
I don’t think that this game format has a future if it ends up in a “strategic singularity” where the only commonly accepted strategy to defend or take a keep is by using portal bombing and where alternative and possibly more elaborate strategies are being ignored because it is less time consuming to achieve your goal by portal bombing your zerg around.
Consequently the portal should be changed so that there is a limit on the number of players that can go through it.
(edited by Korval.2197)
We should not confuse LOS requirements with the AOE application area and optional facing requirements.
@Xandax
What ele skill doesn’t need LoS? I also play ele and I would like to know. They patched Lightning Strike a long time ago.
Aside from big AoEs.
On a staff elem….
Fire 2, 5
Lava Font (2): 360 degree small circle ground targeted AOE with a fixed projection axis (-Y).
Meteor Shower (5): 360 degree big circle ground targeted AOE with a fixed projection axis (-Y).
The reason why those abilities hit people which stand on top of a wall is because those AOEs are always projected along the -Y axis. This makes sense because it would be frankly very weird if meteors which are supposed to fall from the sky would suddenly fall sideways if you target a wall while casting. Complaining about how those AOEs work is like complaining that you can’t hide from rain by climbing on the roof of your house.
Those abilities do have a working LOS check. You cannot target anything that is behind geometry.
Water 2,3,4,5 (AoE healing from the ground onto a wall.)
Ice Spike (2), Geyser (3), Frozen Ground (4), Healing Rain (5): all work the same way as the fire skills mentioned above.
Air 1,4, 5 (Air 1 chain bounces doesn’t require LoS, air 4 buffs allies you can’t see behind you)
Chain Lightning (1): the LOS check is applied between the caster and the first target. Separate LOS checks are applied between the first target and the secondary targets. For the secondary targets it does not matter whether the caster has LOS to them, since the check is applied between the first and the secondary targets.
Windborne Speed (4): has no LOS requirement. This is a 360 degree AOE centered on the elementalist.
Static Field (5): another 360 degree ground targeted AOE which is also LOS checked.
Earth 3,4,5
Magnetic Aura (3): this skill covers a sphere which is centered on the elementalist.
Unsteady Ground (4): similar to the fire and water AOEs. Fixed projection axis along -Y.
Shockwave (5): hits everything in front of the elementalist that has LOS to the elementalist. It also only applies to the surface on which the elementalist is standing, i.e. it will not hit people that are on higher ground relative to the elementalist.
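The two-stage LOS rule described above for Chain Lightning can be sketched like this. Everything here is illustrative: `has_los` is a stand-in for the engine’s geometry query and the bounce count is an assumption; the point is only that each bounce is checked from the previous target, never from the caster.

```python
def chain_targets(caster, first_target, candidates, has_los, max_bounces=2):
    # The caster needs LOS only to the *first* target.
    if not has_los(caster, first_target):
        return []
    hits = [first_target]
    source = first_target
    for target in candidates:
        if len(hits) >= 1 + max_bounces:
            break  # primary hit plus at most max_bounces secondary hits
        # Each bounce is LOS-checked from the previous hit, not the
        # caster; whether the caster can see the bounce target is
        # irrelevant, which is why bounces can reach around corners.
        if target not in hits and has_los(source, target):
            hits.append(target)
            source = target
    return hits
```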
Maybe we should nerf elem?
The elementalist abilities that failed to do a proper LOS check, facing check or range check were fixed a couple of weeks ago. E.g. Lightning Strike failed to properly check range and LOS, and now it does. Flame Burst failed to check facing, and now it does, etc.
Kinda amusing to watch them go through walls and attack stuff beyond 1200 m, I admit.
it doesnt work beyond 1200, fact.
You can make the Berserker work beyond a 1200 range. Just put a few basic elements together: (1) range checks along the +Y/-Y axis are broken for some skills, (2) the range check is only applied when you hit the ability button but not on the Berserker once it has been cast, and (3) the way that aggro works in this game ensures that certain classes and builds will always be stalked by the Berserker as long as it exists, no matter where that person goes.
SteepledHat.1345:Would you guys go swing your nerf bats at something that needs it?
Just because other classes also have their share of broken mechanics doesn’t mean that we should ignore the balancing problems that the Mesmer causes. If we start going down that path then let’s at least do it right: ANet should bring back the infinite-range Lightning Strike, the Flame Burst that used to work on everything behind an elementalist, and slows that used to be castable on the guy chasing the elementalist despite the fact that the elementalist was looking in the opposite direction.
You see, it works both ways.
The problem with the Berserker is that it deals a lot of damage, can be cast through geometry by taking advantage of auto-targeting, and that the casting range can be extended far beyond 1200 by taking advantage of height differences. Either the damage should be toned down to compensate for the additional flexibility that this skill has over other damage skills, or the LOS check should be fixed so that it actually works.
We should also remember that the whole point of having walls and gates in WvW is to provide protection to the people inside a keep. Any player skill that nullifies this central aspect of walls and gates needs to be looked at.
At a minimum the Berserker skill needs to be changed so that the LOS check actually works.
if that change is implemented, then mesmers will literally have 0 attacks that can hit people on or inside keeps. is that fair?
It would be a first step towards more fairness between the Mesmer’s and other classes’ AOE behavior. An elementalist cannot cast inside a keep, with the sole exception of climbing to a higher point right beside a keep wall that is still low enough for the keep’s ground floor to be within range of his AOE. It is true that an elementalist can cast on top of a wall by targeting the front of the wall. But by doing so the AOE loses 50% of its effectiveness, because the front half of the AOE circle lies in front of the keep and only the rear half has a chance to hit any foe. The enemy, for his part, can easily avoid the AOE altogether by moving to the inside ledge of the keep wall. Why so many players fail to do this and instead keep standing on a spot where they get continuously hit by meteors is one of the many fascinatingly illogical things going on in this game.
How about this: the Berserker skill does a proper LOS check, and the Chaos Storm ability is changed so that it does direct damage, with the random condition nonsense replaced by a proper condition mechanic where you know up front, before you cast it, which condition it will apply.
Again, i just repeat, ele’s rangers and necros all they gotta do is lay their circles up top, its the same principle yet you guys single out mesmers. why not take away all classes ability to ever hit above the gate again to solve all your complaints about beserkers.
There seems to be some confusion. The Meteor Shower AOE does not cover the whole red circle on the ground. Each meteor is a separate projectile and you only take damage if you get hit by a meteor. You will not get hit if you stand between meteors – even if you are still inside the red circle. Consequently, avoiding damage from Meteor Shower should be straightforward even if you do not dodge out of it.
Casting Meteor Shower also requires that I have LOS to the target area. I cannot stand in front of a keep wall and cast Meteor Shower inside the keep, while a Mesmer can get his Berserker inside. Additionally, casting Meteor Shower takes multiple seconds during which I have to stand still, and I have to be quite close to the intended target area. The cast animation is also very obvious, and since an elementalist has to give up a lot of his defence to gain strong damage output, he can easily be killed before he finishes the Meteor Shower cast.
A Mesmer Berserker hits a light armor wearer for 5k – 6k with a single attack, while a Meteor Shower hits on average for around 2k – 3k per meteor. The high damage output of those Berserkers, combined with the simple-minded aggro mechanics this game features, means that a light armor wearer on siege equipment either has to accept that he will die fast or will spend more time killing Berserkers and healing himself back up than actually using the siege equipment. Naturally, somewhere between 1 and 5 seconds after you kill one Berserker, another one will be right back hitting you again for 5k – 6k.
At a minimum the Berserker skill needs to be changed so that the LOS check actually works.
27" iMac, 2.8 GHz Core i7, 12 GB 1067 DDR3, ATI Radeon HD 4850 512 MB, running Mountain Lion, at 2560 × 1440 with most graphics settings maxed out, except for shadows, shaders and LOD.
I tried this, and while my FR remained the same, overall performance actually got worse—character moving jerkily through scenes etc. So I’m guessing the potential performance boost may be limited to specific systems, perhaps MBP only.
I’ve got the same machine except that mine has 8GB instead of 12GB. With MTD3D turned on I get 5 – 10 more frames per second, but I also get this jerky movement behavior when running with my avatar (tested in Lion’s Arch at the WvW Asura gates). With the option turned off I get around 30 – 40 FPS depending on the environment, but the avatar moves smoothly.
The setting that has by far the biggest impact on my FPS is the Render Sampling option. Subsampling gives me the best FPS, Native is 10 frames lower and Supersampling is 20 frames lower. That is, by the way, also the case with the Windows client running under Boot Camp on the same machine. On Windows I get on average 40 – 46 FPS at the native display resolution.
Jeffrey,
I was playing through the personal story quests with my (staff) elementalist up to the level 40 quests, and I have since stopped playing the personal story because nearly every quest has been an exercise in frustration. Most quests are designed around the idea of throwing either lots of mobs or champions and/or veterans at the player. This often forces me to kite around to stay alive, and playing those quests feels very much like running around like a headless chicken. There are often NPCs with me in the quest instance, but they either:
a) fail to engage in combat and just stand there, two paces away from me, twiddling their thumbs while watching me get downed – His Majesty’s most useless soldier Logan is a prime example of this behavior,
b) engage in combat but deal no noticeable damage to their enemies and/or are incapable of taking aggro off of me,
c) get downed in less than 5 seconds.
I very much believe in the idea that a personal story should be primarily about telling a story. It should not be about challenge, because challenge can be had in places like dungeons, which are isolated pieces of content. If a player chooses not to play a dungeon, or faces too high a challenge level there, that is not a big deal. But in the case of a personal story it means blocking the player from seeing the story through to its end.
Because of this I completely disagree with the view you expressed in another thread that it is acceptable for a player to have to show up with a specific weapon and/or a specific trait setup to get through a quest. It should never be the case that a quest that is part of a personal story requires a player to retool his character just to get through it. Instead the personal story should be:
a) about telling a story in an engaging way and allowing the player to take the story in at his pace,
b) personal. A player who chooses a particular class, weapon set and trait setup has made a personal decision about what makes his character his character. A personal story quest should not force him to throw his personal decisions overboard just to get through the quest.
The main problem with the personal story in GW2 is really that it is simply not about telling a story. The central design idea seems to be “let’s see how fast we can get the player to the floor!”. That is the opposite of what I expected. I have played the epic story in Lotro and I would honestly recommend that you check out how it is implemented there. It can be played by any class with any choice of weapon and any trait setup, because the epic story there really is about telling a story. Players who primarily want a challenge can do a dungeon or raid instead of, or in addition to, the epic story.
What Turbine also does differently is that for many epic story quests the player can choose whether to do the quest in a group or solo. The quest is balanced for 6 characters in the former case; in the latter case the player receives an “elf stone” from the quest giver. The player can activate the elf stone in the quest area, and it drastically increases the player’s stats while he remains there, to the point that he can defeat elite mobs solo. This may sound a bit weird, but it works very well in practice, and it allows players to play all quests solo if they so choose.
The thing is that I still know every single epic quest that I played through in Lotro, and I have very good memories of them even though it has been years since I played the early ones. I still get goose bumps when I replay the “We can not get out” session play, although I know it is deliberately designed so that I cannot win it: it is about the small group of dwarves who tried to retake Moria and failed, and you play as one of those dwarves. But here in GW2? I don’t even know why some NPCs address my character as “advocate” or where I got that title from, because the majority of my worry and energy is focused on staying alive, and this drowns out the story aspect.
If you want to fix the personal story then the way to fix it is to replace the design goal:
“kill the player”
with
“give the player good memories of a well-thought-out story with plot twists that play out over many levels. Also, although the character is considered to be a hero, this still does not make him perfect and unable to screw up”.
My champion (basically an offensive warrior) in Lotro failed to help save the daughter of one of the rangers from the Witch-king in Angmar. Every time I meet her father he reminds me of that fact and of how much he misses his daughter. In that game I am a hero, but not a perfect one. In GW2 I am apparently a perfect hero, but for some mysterious reason I still spend a whole lot of time on the floor – throwing rocks at my enemy.
(edited by Korval.2197)
If you guys actually tried to fortify your keeps when you took them then maybe you’d keep some, but you don’t. You go and cap something then run off to the next one leaving nothing behind. If you don’t hold it, you don’t get points.
While I agree in principle with the idea that a tower/keep taken from the enemy should be upgraded as fast as possible and that a suitable amount of defensive siege equipment should be placed, I have to say that doing so in the context of the current matchup is not a reasonable thing for us on the AR side to do. The simple truth is that the WvW population on the AR side is lower in general compared to the NSP side, but particularly after 10PM US West Coast time. This is when more and more AR players start to log off while more and more players on the NSP side start showing up. The result of this shift in population can be clearly seen on the WvW map, because one keep after the other falls until, around 12AM – 1AM, the map has turned all green with one or two odd blue and red spots on it.
Defensive siege equipment is a powerful tool and also a great way to make a lot of XP and get a lot of loot when you man it. It has one flaw, though: it is dead and useless wood if you do not have enough people to man it.
Matt and Jon,
it is 11PM US West Coast time and I am standing here at the Anvil Rock spawn point in the Eternal Battlegrounds. Looking at the map I see that the Northern Shiverpeaks server owns all supply camps, towers and keeps on all battlegrounds, except one tower and supply camp that are bravely held by Borlis Pass and one supply camp that is still in our hands – at least for the next 10 seconds, until the 30-man NSP zerg steamrolls this place on its way out of the tower they just took from us.
I am not alone here at the AR spawn point, though. I am joined by 12 other battle-hardened players who are currently weighing their two options:
1) log off.
2) serve as free sources of loot and badges for the overwhelming NSP forces.
What adds insult to injury in this case is that this is an exact repetition of what happened last week, when we were also matched up against NSP. How come the score computed by the matchmaking system did not reflect the fact that NSP constantly owned 90% of all maps over the week versus the 10% that we and the other server held?
I hear what you both said and it leaves me with the impression that you do not take the problem seriously. The terms “night capping” and “day capping” are an oversimplification of what is happening in the game. The problem is that the existing matchmaking system does not take the distribution of the player population over a day into account when matching up servers. Consequently, servers with widely different numbers of players are matched against each other, which leads to one or two sides getting overwhelmed by the other side.
This unhealthy population mismatch between servers leads to further problems: it demoralizes the server(s) that get stomped into the ground in mere hours, and it allows the server with the much larger population to invest in all tower, keep and supply camp upgrades, which can be finished in quick order because the underpopulated side does not have enough manpower to sabotage or prevent successful upgrades in more than a small handful of places.
The next day, when the population of the smaller server reaches its peak, it can only take on a small number of objectives before the peak passes, because taking out a fortified tower or keep – especially one with up to a dozen arrow carts and other defensive siege engines placed on the walls and inside the courtyard – is a very time-consuming task.
Because of all this, the match is effectively decided in the first 24 to 48 hours, and it leaves the losing servers with not much to do for the rest of the matchup.
What is also hard to understand is that, on the one hand, a design goal was that WvW should be a 24/7 format, but on the other hand it has been implemented on top of a multi-server architecture where servers are pitted against each other. The problem here is that the implementation is not suitable for the idea. A 24/7 battleground really needs a single-server architecture where guilds and alliances fight against each other. This setup gives the necessary flexibility to easily adjust the balance of power as the fight rages on, and it also ensures that every side has the same chance of filling its ranks with soldiers across all time ranges. A multi-server architecture, on the other hand, is too restrictive for a 24/7 PvP format.
Speaking of multiple servers and adjusting the populations on them: while I agree with the idea that players should be able to move from server to server for free in the early days of the game to better distribute the player base, I absolutely disagree with the idea of allowing someone who has just moved from server A to server B to immediately rejoin WvW on server B. Transferring from one server to another absolutely must lock this person out of WvW for as long as the current matchup is still active. By allowing players to immediately rejoin WvW you have effectively opened the gates and invited players to misuse this service to tamper with WvW matches.
Another point I do not agree with is the idea of hiding the names of the players of the opposing faction. Since fights in PvP can get heated from time to time, it is vitally important for the quality of a game that players can be held responsible for their actions on the virtual battlefield. By not showing the names you allow players with questionable motives and a laissez-faire attitude towards cheating and hacking to hide behind a veil of anonymity. The reporting function that is available is only useful if you manage to target the player you think is hacking, which is a problem.
It is now midnight and I think I will log off and skip the rest of this WvW week, because this match has already been decided and I would rather spend my free time on something where I have the impression that my actions actually make a difference.
Players who transfer to another server should be locked out of the ongoing WvW matchup until the next matchup starts. Allowing players to immediately rejoin WvW on the destination server just encourages questionable actions, and since player names are not shown in WvW it is even more tempting for some people to do this.
As mentioned above, it is supposedly about avoiding harassment and making it so that players who are new to the format do not feel intimidated by it.
However, I think those reasons are not strong enough to justify allowing players to be nearly completely anonymous, which in turn allows some players to use cheats and exploits more safely, because they know the other side has a harder time reporting them when the player name is not shown. Although there is the option of right-clicking an enemy player to report him, the problem is that you must manage to target him fast enough to invoke the report function before he disappears from your screen. The fact that the report form does not give me any way to add additional information about what I observed does not help either.
Not showing player names can also lead to a situation where more and more players who have become the victim of a cheater or exploiter start blaming the whole enemy server or enemy guild instead of the single person who is at fault, because there is no way to identify that person.