RIP City of Heroes
(edited by Behellagh.1468)
635 hours over the past 6 months (203 days). 2 Characters, one 561 hours (Level 80 Mesmer), one 74 hours (Level 52 Thief).
What can I say, not much on TV. Note not all of it actively playing. Right now for instance I’m waiting for orders to fill.
This is what happens when your minimum requirements are an Intel Core 2 Duo 2.0 GHz, Core i3, or AMD Athlon 64 X2. It’s hard to balance between lots of threads and the lowest common denominator.
You can still scale it. And the Steam survey still shows 47% with two physical cores. Does it make marketing sense to exclude potentially half of your customers?
The same could be said of not requiring Dx11, since requiring it would exclude all of your XP and Vista customers and the 40% of your Win 7/8 customers who only have Dx10-class video cards.
Don’t forget the game is expanding into Asia shortly, and XP is still at 58% in China, 30% in South Korea and 16% in Japan (vs 13% in NA).
You code to your target audience, and that means a Dx9 and two-core minimum isn’t bad when only 5% still run a single core.
I’m sure someone will blame NCSOFT for requiring ANet to make the game available in Asia.
Wait, so you are telling me that turning FXAA off gains you 15fps? Or that forcing one of the other “standard” AA methods loses you 15fps? Because FXAA is only a fancy filter run over the completed frame, it shouldn’t take much time at all, and that’s the point of it: fewer jaggies for little loss in performance, but with occasional mistakes causing blurring of textures.
You get what’s on the tin, the good and the bad. That’s why companies are embracing it: anything that doesn’t lower performance in the benchmarks every site uses. Gaming benchmarks sell cards, so nVidia and AMD are asking devs to add FXAA so the AA/AF ultra-setting benchmarks look better than if they used MSAA.
Currently the gem chart at GW2spidy is broken for some reason (at least for me). The one at GuildWarsTrade is working but it only shows the Gold to Gem rate. The Gem to Gold rate is 72.25% of that.
http://www.guildwarstrade.com/gems
As for the OP, the Gem-to-Gold rate mirrors the Gold-to-Gem rate, and as long as more players buy Gems with Gold than vice versa, it goes up. Otherwise it goes down.
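The 72.25% figure mentioned above is consistent with a 15% exchange fee taken on each leg of the conversion (that fee split is my assumption here, not an official breakdown), since 0.85 × 0.85 = 0.7225:

```python
# Sketch: the Gem->Gold rate being 72.25% of the Gold->Gem rate is what
# you'd get if a 15% fee is charged on each direction of the exchange.
# The per-leg fee value is an assumption, used only for illustration.

FEE = 0.15  # assumed fee per conversion

def round_trip_ratio(fee: float) -> float:
    """Fraction of value left after converting gold -> gems -> gold."""
    keep = 1.0 - fee
    return keep * keep

print(round(round_trip_ratio(FEE), 4))  # 0.7225
```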
Spikes happen when something sexy goes on sale at the Gem store, and then the rate usually decays back toward the pre-spike value.
Expect the next spike when the infinite logging axe comes out unless they put bank and inventory slots back on sale before then.
I’ve been selling at 12-13c for weeks now.
Don’t forget the glut of copper that came in from the molten forge dungeon. Look at the long term price for copper at GW2Spidy. The price is simply returning to “normal” as the glut of supply goes away.
To use more cores effectively, a task needs to be split into multiple sub-tasks with a similar amount of work in each. That is not an easy thing to do, especially in a task that is providing real-time feedback.
In the end everything depends on the rendering code providing scene data to the driver via the Direct3D 9 API, and Dx9 isn’t really multithread-enabled.
You would need to create something like the command lists found in Dx11: multiple threads each create a command list for whatever they are currently rendering and queue it for execution, and one thread processes the queue and executes the Dx9 API calls. But that still means only one thread is talking to the driver, and that thread is where you can still bind up.
It’s one of these things where it’s unclear up front if the additional overhead will provide actual improvement or not.
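The command-list pattern described above can be sketched in miniature. This is a hypothetical illustration, not engine code: plain Python threads and a queue stand in for render workers and the single driver-submission thread, and the "command lists" are just lists of callables.

```python
# Hypothetical sketch of the approach described above: worker threads
# each record a "command list" (a list of callables) instead of issuing
# API calls directly, and a single submission thread replays the lists,
# so only one thread ever talks to the (single-threaded, Dx9-style)
# driver. Note the submitter can still be the bottleneck.
import queue
import threading

submit_queue: "queue.Queue[list]" = queue.Queue()
frame = []  # stand-in for driver state, touched only by the submitter

def record_commands(worker_id: int) -> None:
    # Each worker records three pretend draw calls and queues them.
    commands = [lambda i=i: frame.append((worker_id, i)) for i in range(3)]
    submit_queue.put(commands)

def submission_thread(n_lists: int) -> None:
    # The one thread allowed to make "driver" calls.
    for _ in range(n_lists):
        for cmd in submit_queue.get():
            cmd()

workers = [threading.Thread(target=record_commands, args=(w,)) for w in range(4)]
submitter = threading.Thread(target=submission_thread, args=(4,))
for t in workers:
    t.start()
submitter.start()
for t in workers:
    t.join()
submitter.join()
print(len(frame))  # 12 "API calls", all executed on one thread
```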
Here’s a SigGraph 2008 presentation (pdf) on the approach.
Sadly that open source library is no more as the company closed and assets sold off only a year or two later.
As someone who works at a wholly owned subsidiary of another company, our sales and expenses are tracked by our accounting department with our aggregate profits and losses appearing on our parent company’s quarterly and yearly statements.
Yes, in the end it’s all their money but the money first goes through our hands, pays our expenses before any left over goes on the way to theirs.
It’s not queries; it’s a removal from a character’s inventory database (character DB) and an insertion into a player’s market inventory database (market DB) when posting an item for sale. Now multiply that by thousands or tens of thousands per second during peak playing hours. Now imagine the cumulative fragmentation in the market database caused by all of the sell orders and pickups throughout the day across all worlds.
So instead of getting a DB timeout when trying to sell or pick up a single item due to peak DB manipulation, they decided to put a speed limit on every player so only those who are selling a ton of different items at once are affected.
It’s timeouts due to insertion performance and DB fragmentation reducing overall performance that they are trying to mask with the delay.
Also, the limit isn’t as low as 10/min; it’s closer to 15-20/min, or one every 3-4 seconds. Sure, that’s a lot slower than the 2 per second you can get once you get the rhythm down, but it’s not as slow as 10 per minute. I routinely sell a dozen or more salvaged runes using only a 3-second beat between sells.
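A per-player “speed limit” like the one described can be sketched as a minimum-interval check. The 3.5-second interval below is illustrative, not the game’s actual value, and the whole thing is a guess at the shape of the mechanism, not ArenaNet’s code:

```python
# Minimal sketch of a per-player listing throttle: each player must
# wait a fixed interval between listings. MIN_INTERVAL is a made-up
# number for illustration.
import time

MIN_INTERVAL = 3.5  # assumed seconds between listings
last_listing = {}   # player -> timestamp of their last accepted listing

def try_list_item(player, now=None):
    """Return True if the listing is allowed, False if throttled."""
    now = time.monotonic() if now is None else now
    last = last_listing.get(player)
    if last is not None and now - last < MIN_INTERVAL:
        return False
    last_listing[player] = now
    return True

# A player hammering the sell button only gets through every ~3.5 s.
print(try_list_item("p1", now=0.0))  # True
print(try_list_item("p1", now=1.0))  # False, throttled
print(try_list_item("p1", now=4.0))  # True again
```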
However, buying gems puts money in NCSOFT’s coffers, not ArenaNet’s. Of course some money will trickle over for new content, but most seems to go to funding NCSOFT’s other projects.
And from where are you gleaning that bit of information from? And don’t say Dontain’s video because that guy is foiling the inside of his flat to keep out the mind control rays.
The game is only using 32% of the CPU because it only has about 3 cores’ worth of threads to run. If you had 100 cores then your CPU usage would be 3%. This is why single-core performance is key, not the number of cores. AMD CPUs are simply outperformed, core vs core, by Intel ones since the introduction of the Core 2.
As for the 670’s 78%, well it can only do as much work as the CPU can provide to it.
If you are expecting an official answer from ArenaNet, don’t hold your breath.
FXAA is pretty lightweight in terms of frame-rate drop, so I don’t know what you’re talking about with poor performance. Blurring, well, that’s understandable: it is a “fake” anti-aliasing method, a post-process filter run on the completed frame, rather than something like MSAA, which is done while the frame is being rendered. It’s done the same way DoF blur is (FXAA, that is).
FXAA guesses at what is a valid edge by the change in luminance between adjacent pixels; if it exceeds a threshold, the pixel is classified as an edge and blurred (the layman’s version). It’s quick, uses little if any extra frame memory, and, like a lot of things in GPUs over the years, it’s a quick and dirty way to get a good approximation of the effect you’re going for. Much like how AF was done over the years, except AF is now actually pretty darn accurate (ever see the old colored “star burst” pictures from AF tests?) thanks to the raw horsepower of a pile of simple FP units and a lot of graphics-memory bandwidth.
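The luma-threshold edge test described above can be sketched in a few lines. This is only the “layman’s version” from the paragraph: real FXAA also does edge-direction searches and sub-pixel blending, and the threshold value here is made up.

```python
# Sketch of the edge test: compute a luma value per pixel and flag a
# pixel as an "edge" when the contrast with its 4 neighbors exceeds a
# threshold. EDGE_THRESHOLD is illustrative, not FXAA's actual constant.
EDGE_THRESHOLD = 0.2

def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b  # classic luma weights

def edge_mask(image):
    """image: 2D grid of (r, g, b) in [0, 1] -> 2D grid of booleans."""
    h, w = len(image), len(image[0])
    lum = [[luma(px) for px in row] for row in image]
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neighbors = [lum[ny][nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w]
            contrast = max(abs(lum[y][x] - n) for n in neighbors)
            mask[y][x] = contrast > EDGE_THRESHOLD
    return mask

# A hard black/white boundary is flagged; flat regions are not.
img = [[(0, 0, 0)] * 2 + [(1, 1, 1)] * 2 for _ in range(2)]
print(edge_mask(img)[0])  # [False, True, True, False]
```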
Hi robertlee,
For budgets under £400 I saw this from SolerNova’s post.
£389.99 inc vat (limited time only special 2left in stock – normal price £415)
http://www.aria.co.uk/SuperSpecials/Other+products/Gladiator+750K-GTX650Ti+AMD+4.00GHz+Quad-Core+Next+Day+Gaming+PC+?productId=54930
looks good enough to play the game decently for its price, which, to be brutally frank, is the lowest-priced gamer-oriented PC I have ever seen.
Glad that smart minds think alike. The 2nd link. For those who aren’t familiar with the Athlon X4 750K, it’s an A8/A10-5xxx/-6xxx but with the integrated GPU disabled, running at only 3.4GHz.
https://forum-en.gw2archive.eu/forum/support/tech/Need-help-for-buying-a-tower-for-250/2266598
I’d also like the Transactions tabs (buying/bought/sold/selling) to work so that you don’t have to reset every time you look at an item; it’s so kitten annoying.
Especially if you are pulling bids that you have already been overcut on.
I think there’s a youtube clip up already with the Noir intro if you ever want to watch it again. Link off of Dulfy’s site.
http://www.youtube.com/watch?feature=player_embedded&v=NgCdjc1oGOo
You didn’t look hard enough. That’s mainly the fault of the lousy search here.
Again, it’s a happy benefit that it’s annoying to the bot masters, but its main purpose is so the database handling the market across ALL servers can cope with all the selling traffic from all players without needing a server infrastructure like Amazon’s.
It’s not always about botters, sometimes it’s just being realistic with your server hardware.
ArenaNet spent more time on game looks than game optimizations. NCsoft is only allowing ArenaNet to release content to the game. Bug fixes and any hard optimizations will not be happening any time in the next few years. You can quote me on that, I’m never wrong about these things.
NCSOFT doesn’t care what ANet does as long as the money keeps pouring in. GW2 brought in more sales in the last 3 quarters than any of their other properties.
Sure lets start interfering with the golden goose.
Hello
As the title says, I’ve come for news on the optimization of a game that is already nine months old. For 9 months I’ve been waiting for even a little CPU optimization and nothing; we don’t know if it’s in progress, if you’re working on it, or if you simply don’t intend to work on it at all.
The problem with this lack of optimization is that the game only uses 3 cores.
In WvW we are left with fps drops in combat (15-20 fps), so the GPU and CPU are not optimally used (40% GPU and 35% CPU max).
Config:
- GTX 670 OC
- AMD FX (8-core) OC at 4.6 GHz
- 8 GB DDR3 2133MHz
Could you tell me whether you finally plan to optimize the game?
PS: I’m attaching a screenshot showing me in a borderlands area (WvW) with a graph of GPU usage (78%) and CPU usage (32%); the same thing happens in PvE areas.
Sorry, I’m French and my English is very bad.
I decided to go to the same spot and capture a couple of frames with Process Explorer watching Guild Wars 2. Note I am in Windowed Fullscreen, and I know I lose a frame or two per second this way, so the game isn’t running as fast as it could in true Fullscreen. What you don’t see is the zerg organizing behind the PE window (well, you do see it on the map).
The first shows the overall CPU usage, 67.6%. On my quad core you could say it takes 2.7 cores to run. On nunoumaior’s eight-core rig, assuming all 32% is the game, that’s about 2.56 cores.
The second shows the thread breakdown. Now I have a quad core so no thread can be higher than 25% or one core.
Now I surmise the first thread listed is the main game loop, taking 69.4% of a core. The second and third threads are the main rendering threads, one game-side and one driver-side, at 50.5% and 41.5% of a core respectively; it’s likely these two threads sync with one another. Those three threads make up about 60% of the entire game’s workload. The next 6 threads account for about 38%, and the remaining 36 threads (yes, on my system the game has 45 threads) about 2%. I’ll also guess that some of those 6 threads sync loosely to the main game thread.
My point, and the point of others, is that the game will only be as fast as single-core performance allows. And the FX generation of CPUs isn’t as quick as Intel’s. They may be as quick as a Phenom II, but the Core iX series CPUs for the most part have better single-core performance, and it’s getting better every generation.
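The core arithmetic used above is simple enough to show explicitly: overall CPU percentage times the core count gives “cores’ worth” of work, and on a quad core no single thread can exceed 25% overall, i.e. one full core.

```python
# Working through the numbers quoted above: 67.6% overall on a quad
# core, and 32% overall on an eight-core machine, expressed as
# "cores' worth" of work.
def cores_used(total_cpu_pct, n_cores):
    return total_cpu_pct / 100 * n_cores

print(round(cores_used(67.6, 4), 3))  # 2.704 cores on the quad core
print(round(cores_used(32.0, 8), 2))  # 2.56 cores on the eight-core rig
print(100 / 4)                        # 25.0: one thread's ceiling on a quad core
```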
Honestly, how many keys could you farm in say an hour? Compare that to number of regular dragon coffers that drop.
I was thinking along the same lines Swordbreaker, but an A10-5700/5800K but the clock speed bump you get with the A10-6800K is rather nice.
I would still stick with DDR3-1866, simply because of the integrated GPU; you’ll get about an 8% bump over DDR3-1600. The fastest way to get a GPU to underperform is to strap it to slower memory, and since the HD 7660D/8760D uses system memory …
I would skip the video card for now. If he was relying on the HD 3200 IGP, the integrated HD 7660D/8760D would still be like night and day. Plus it’s unclear if GW2 can really benefit from CrossfireX with the HD 6670 suggestion. And if overclocking is out of the picture, then so is the Evo 212; the stock heatsink is good enough.
And then there’s the dreaded Microsoft tax for a copy of the OS. That alone blows the budget. But if he is retiring his old box then he might be able to reuse the product key on the new box, assuming he can find it/didn’t scratch the sticker off the old box (assuming the old box is store bought).
But even after all this haggling, it’ll still be more than £250.
Edit: Using uk.pcpartpicker I put together the following A10-6800K build without OS at about £300; the OS is another £66. Some assembly required.
No, Windows 7 is “or better”.
The HD 3200 is a motherboard chipset based graphics device, very low end.
We need better information about your system. In one of your other threads I linked the zipped version of CPU-Z. Just unzip it into its own directory (Windows 7 has a basic integrated zip/unzip tool; just right-click the zip file and choose Extract).
This utility will give you the name of the CPU, the socket type it uses (Package), the motherboard name and chipsets used (under the Mainboard tab) and the type, size and dram frequency of your memory. If you could tell us all that it’ll be helpful in getting an idea of what your options are.
What kind of AMD processor, not just the speed. Single or multiple core? Game needs at least 2 CPU cores. I’m going to guess it’s a single core AMD Sempron LE-1200 but you could have an Athlon X2 4000+/4050e or even a 1st generation Phenom.
The HD 3200 is an IGP (meaning it’s part of the motherboard chipset) with only 40 graphic cores.
I did get GW2 to run, fairly poorly (~15-20fps parked in The Grove, @1280x720) with a 3.0GHz Athlon II X4, 4GB of memory with a HD 4200 IGP which is essentially the same graphics IGP in terms of gaming.
Of course I’m assuming it’s a desktop and not a laptop.
CPU-Z will tell you and then you can tell us exactly what you currently have.
PC game requirements in general have changed a lot in the intervening 7 years between GW1 and GW2.
You may be able to wing something at 350-400 pounds; 250 pounds is just too little, even going with a dual core and a low-end video card.
In Aria’s clearance bin they have this dual core system at 350 pounds but without Windows and this quad core at 400 pounds without Windows as well (this system is worth the extra 50 pounds IMO).
Intel CPUs are doing great because they are inherently faster than AMD’s offerings. Even the 2-generation-old i7-2600 is 35% faster than an FX-8350 in terms of CPU performance for gaming. Even the 3-generation-old i5-760 will beat it in games more often than not.
When AMD went about designing the new-from-scratch CPU used in the FX, they decided to sacrifice performance when only a few threads are running, in exchange for an FX-8xxx that is competitive with the Intel i7 only when both processors are running all cores at 100%. Games for the most part don’t max out all the cores in a four-or-more-core CPU.
So there’s nothing to “optimize” to make the AMD run better. AMD is simply not as fast.
There’s no silver bullet here. No magical means to take something temporally linear like a game and code it in a way to scale with the number of cores in a system. No divide and conquer approach to distribute to cores like you can for video compression or ray tracing.
Sure you can. Game engine optimization.
Which wouldn’t be just AMD then.
And it wouldn’t be 9 copper if there wasn’t 5 million for sale when demand is 11 thousand.
Except here the publisher also owns the developer outright. Like how EA owns Maxis or Bioware.
There’s nothing to optimize to make AMD’s faster. The FX has inferior single core performance. Period, end of story. The FX-8xxx architecture was designed to take on the Intel i7 only when all cores or logical cores in the case of Intel, are fully occupied. Great for server clusters but not much better than the Phenom II in limited threaded applications like games.
If the drop data in the wiki can be relied on, standard coffers are around 2000:1 while rich coffers are around 80:1 for a ticket.
With 500 standard coffers, there is a 77.9% chance of not getting one, a 19.5% chance of getting one and a 2.6% chance of getting more than one.
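Those percentages follow from a binomial model, assuming the wiki’s 1-in-2000 ticket chance per standard coffer quoted above:

```python
# Binomial check of the figures above: 500 coffers at an assumed
# 1-in-2000 ticket chance each.
from math import comb

def binom_pmf(k, n, p):
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 500, 1 / 2000
p0 = binom_pmf(0, n, p)
p1 = binom_pmf(1, n, p)
print(f"{p0:.1%}")           # 77.9% chance of no ticket
print(f"{p1:.1%}")           # 19.5% chance of exactly one
print(f"{1 - p0 - p1:.1%}")  # 2.6% chance of more than one
```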
Think of it this way. Gem and game sales is a way NCSOFT measures the success of the game. As long as it’s at what NCSOFT thinks is an acceptable level, relative to their other games or not, they will support it. It’s a way to keep score between their properties.
In WvW it’s CPU performance that’s limiting your frame rate. The game engine seems to have problems in crowds of players which is why boss chest fights are also poor.
I wouldn’t say your CPU sucks, just that it’s not as fast as some. At least it’s a 2.9GHz quad core. The Phenom II X4 965 is about 25% faster while the Intel i5-3570 is 95% faster. And in WvW zerg battles, it’s CPU performance that rules the day.
According to this page it only has 15A at 12 volts. That means that 420-watt PSU can only put out 180 watts at 12 volts, and that’s what both the CPU and video card feed on. That is very, very low. It’s an older-design power supply from back when most PCs drew the bulk of their power from the 5V and 3.3V side, and that hasn’t been true for at least 5 years, maybe even 10.
So yes, you need a better power supply in your new system if you want to support a modern video card. Note that it doesn’t need more watts overall, just more watts on the 12-volt side. For instance, the Antec Basiq BP430 or the Corsair CX430 430-watt PSU can put out up to 384 watts at 12 volts.
Video card manufacturers try to cover their bases with higher PSU wattage requirements in hopes that, if you do have an older-generation PSU, there would still be enough watts at 12 volts. They do it this way simply because it was getting confusing to average folk back when they also included an amps-at-12-volts figure in their requirements.
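The 12V-rail arithmetic above is just watts = volts × amps; the amp ratings used here are the ones quoted in the discussion, so check the label on any real unit:

```python
# Watts available on a PSU rail = volts x amps. The amp figures below
# come from the posts above (15 A on the old 420 W unit; a modern
# 430 W unit rated around 32 A on 12 V).
def rail_watts(volts, amps):
    return volts * amps

print(rail_watts(12, 15))  # 180: the old 420 W unit's entire 12 V budget
print(rail_watts(12, 32))  # 384: a modern 430 W unit's 12 V budget
```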
Right now with my current computer I have to run on Best Performance in WvW/PvP to avoid lag and fps drops. So I’m looking to upgrade my computer for GW2. Can the spec below run GW2 on the Best Appearance setting, and what will my fps be for zerg vs. zerg in WvW?
Operating System
Windows 8
Processor
Intel® Core™ i7-3770 Processor
Memory
16GB DDR3-SDRAM 1066MHz (4×4GB)
Hard Drive
1TB SATA 7200rpm
Graphics Card
Nvidia GTX680 2GB DDR5
First, double-check the memory speed. I’m guessing you are going nuts and getting DDR3-2133 memory. If it really is DDR3-1066, upgrade to at least DDR3-1600 CAS 9 memory.
Second the GTX 770 is faster and may be cheaper than the GTX 680, but I haven’t checked prices lately, places may be slashing the price of the GTX 680.
As to your first question “can the spec below run gw2 on Best Appearance setting” the answer is yes. PvE areas should look lovely to you now.
As to your second question, “what will my fps be for zerg vs. zerg in WvW”, I can’t guess, but I’ll say not as fast as you may want. Game performance drops quickly as the number of players near you goes up. That likely makes the CPU the limiting factor, and the i7-3770 (question: is it the K model or not?) is pretty close to the top of the CPU charts for single-core performance.
So are you planning on starting a band? If they give you a guitar, next you will be asking for a drumset, clarinet, banjo, harmonica, trumpet, cello, and trombone.
Which you then toss all of into the MF and get one of these.
Looking at the sales numbers for NCSOFT’s subsidiaries it appears that online sales are assigned to ANet while store sales are assigned to either NC Interactive or NC Europe. But in the end it’s still all NCSOFT’s money.
I wouldn’t spend $10 on a real vuvuzela.
I got my ticket in my 3rd gem chest.
Which is about the same odds I had with 75-85 normal coffers.
It’s not to fight botting, it’s to keep database ops/s under control.
When you put something up for sale, they have to remove it from your character’s inventory and put it in the market inventory under your market player inventory. That is a costly operation. Since the TP is accessed by every world, they simply put a speed limit on everyone so the database isn’t unduly strained when lots of players are selling at the same time. Stopping automated systems is just a happy benefit.
Buy orders are different, since they’re only gold and a request for an item, not the actual item itself. Fulfilling the buy order doesn’t have to happen in front of prying eyes immediately; the TP code can take its time, based on load, to transfer the item from the selling player’s market inventory to your market inventory. No need to access the character inventory database either.
It’s a matter of cost. I’m sure you could build a DB server cluster that could handle everyone selling as fast as they can click, but it’s much cheaper to simply put a speed limit on it than to run an Amazon-like server cluster.
Your old system
Athlon II X2 260, 2 cores @ 3.2GHz
HD 5770 – 800 shaders @ 850MHz with 128-bit memory @ 4800MHz
Your new system
A6-3650, 4 cores @ 2.6GHz
HD 6530D – 320 shaders @ 443MHz with 128-bit memory @ 1333MHz
HD 5450 – 80 shaders @ 650MHz with 64-bit memory @ 1600MHz
So you may have doubled your cores, but each runs at 81% the speed of your previous CPU.
Graphics wise the embedded HD 6530D has about 21% of the shader power and 28% of the graphics memory bandwidth while the HD 5450 has 8% of the shader power and 17% of the graphics memory bandwidth.
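The percentages above come from simple ratios: shader “power” as shader count × clock, and memory bandwidth as bus width × effective memory clock. This is a rough first-order comparison, not a benchmark:

```python
# Recomputing the comparison above from the listed specs.
def ratio(new, old):
    return new / old

hd5770_shader = 800 * 850   # shaders x MHz
hd5770_bw = 128 * 4800      # bus bits x effective MHz

print(round(ratio(320 * 443, hd5770_shader) * 100))  # 21 (% HD 6530D shader power)
print(round(ratio(128 * 1333, hd5770_bw) * 100))     # 28 (% HD 6530D bandwidth)
print(round(ratio(80 * 650, hd5770_shader) * 100))   # 8  (% HD 5450 shader power)
print(round(ratio(64 * 1600, hd5770_bw) * 100))      # 17 (% HD 5450 bandwidth)
print(round(ratio(2.6, 3.2) * 100))                  # 81 (% per-core clock)
```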
As for transferring the HD 5770 from the old system to the new, you will need to know whether the new computer’s power supply can handle it, because the HD 5450 is only a 20-watt card while the HD 5770 is a 110-watt card.
There’s still the slight problem that the new system has a slower CPU, but if you have games that can tax more than two cores, or you enjoy having other stuff running while playing, it may actually be faster than your old system. Let me direct you to this comparison of an Athlon II X2 265 to an A6-3650; almost at the bottom of the comparisons they look at a few games.
http://www.anandtech.com/bench/Product/190?vs=403
Dragon Age: Origins most assuredly uses more than 2 cores and shows the A6 kicking the faster dual core’s butt. However, WoW and StarCraft II show that, more often than not, single-thread performance is still the limiting factor in a lot of games.
Player A spends $300, gets 0 tickets.
Player B spends 40g, gets 5 tickets.
Player C spends nothing, gets 1 ticket.
Master salvaged another 63 rares earlier this evening: 57 ectos. During the salvaging I certainly had runs with few ectos, but then getting three ectos from a single salvage a couple of times brings the overall average back up rather quickly.
Remember getting close to the established average is easier with large samples than small ones. Streaks, lucky and unlucky, will tend to cancel each other out. But because we are dealing with randomness, there’s still an outside chance with big samples to get poor or incredible results.
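The large-samples point can be illustrated with a quick simulation. The 87.5% ecto-per-rare figure below is an assumed stand-in for the real master-kit rate, and the 0-or-1-ecto outcome is a simplification (real salvages can yield up to three); the shape of the result is what matters:

```python
# Small samples swing widely; large samples hug the average. The
# per-salvage success rate here is an assumption for illustration.
import random

def salvage_run(n_rares, p=0.875, rng=None):
    """Average ectos per rare over a run (simplified to 0 or 1 each)."""
    rng = rng or random.Random()
    return sum(rng.random() < p for _ in range(n_rares)) / n_rares

rng = random.Random(42)
small_runs = [salvage_run(10, rng=rng) for _ in range(5)]  # streaky
big_run = salvage_run(10_000, rng=rng)                     # near 0.875
print(small_runs)
print(round(big_run, 3))
```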
Salvaged 102 rares last night, got 92 ectos out of it with master kits. Night before did 125 and got 110 ectos out of it with master kits.
Working normally for me.
Actually the taxes attempt to keep the active player money supply in check and that in turn helps keep inflation at bay. But I nitpick.
Sounds like the OP forgot about the fees when pricing the object for sale and ended up with less than he was expecting and is transferring his anger from himself to the TP.
You don’t have to do just your personal story. Just explore the zone. You get XP from everything in this game. Gather ore, XP. Rez an NPC or player, XP. Explore, XP. Hold off doing the personal story until you are that level or greater. Like everywhere else in this game when you are over level for the area, the game will scale you down.
Personally I only do my story if it’s a Daily option.
In a perfect world, game developers will write their engines to the DirectX or OpenGL API standards and GPU manufactures will write drivers to translate those functions into commands that will use the GPU as efficiently as possible for the hardware features that GPU has.
Sadly this isn’t what happens. What happens is GPU manufacturers write fairly basic drivers that aren’t terribly efficient and shake down game developers, offering them help for a fee. What they end up doing, mainly with shader code, is providing a “hand-assembled” version of the shader the developer wants to use, or unilaterally replacing the shader in the driver code if the game is popular enough to be used as a major benchmark. This is because the shader compiler in either manufacturer’s driver works but doesn’t produce very optimized output. How else do you think a beta driver can suddenly give a 20% performance boost in a new game without any changes to the game code?
On one hand, writing an optimized compiler for anything isn’t easy; a lot of deep magic is involved, and clever developers who can do it are both rare and expensive to employ. On the other hand, not putting much effort into getting better results from the compiler keeps the “partnership” program and its revenue stream going.
When games favor one manufacturer over the other, it’s often because the game developer was told how to code their shaders in a way that happens to suit that manufacturer’s shader compiler, but not necessarily the other’s.
So to perform well on both, it’s now on the game developer’s shoulders to provide two or three sets of shader code depending on whose driver is in use (N, A, or generic). An application provider shouldn’t have to care whose device is doing function X; that’s the whole purpose of driver code, to abstract that stuff away. And it’s true everywhere except video drivers and games.
So ANet refused to pay to play. Congrats. But this does mean that the game will never perform as well as one from a company that was willing to pay “performance” money to the big two for their “expert” guidance.
Better armor, weapons, trinkets, higher-level crafting materials. Then there’s the old gold→gems conversion to buy stuff from the Gem store. 100G for a commander badge for WvW.
Save some money and get the i5-4670K, i7s don’t buy you a whole lot when gaming. $120 less at the moment than the i7-4770K.
While the GTX 770 is a very nice card, it’s still a hefty chunk of your budget. The GTX 660Ti is around $100 less while the GTX 670 is around $50 less.
Don’t forget the Microsoft Tax (ie a legit copy of Windows). OEM Home versions will run you $100.
Z87 motherboards aren’t anywhere near as cheap as Ilithis states, especially if you are paying for a K-class CPU for overclocking purposes. The Asus Z87-A is $150 for a motherboard with SLI and CrossfireX support. If that isn’t important, then the Asus Z87-K is $130.
As for cases, there are several nice ones for under $100, even under $50. Just remember the lower the cost, the less frills, like included fans and dust filtering.
Examples:
Antec Gaming Series One
Antec Three Hundred Two
Corsair Carbide 300R
Cooler Master CM 690 II Advanced (USB 3)
I’m using NewEgg for all my pricing.
Either the GTX 660 or the HD 7870 is around 20% faster than the HD 7850. So, is that worth $50?
Your PSU is fine.
The motherboard isn’t SLI-compatible, and CrossfireX is neutered by the PCIe x4 slot, if that’s important to you. The -LK supports SLI and full CrossfireX (x8/x8).
RAM – CAS 10? Seriously? And what do you need 16GB for anyway? Well, if you are going that route, pay the extra $10 for CAS 9.
http://www.newegg.com/Product/Product.aspx?Item=N82E16820231568
Agree with SolarNova about the Evo 212 cooler, even if you aren’t overclocking it lowers the temps a lot over the stock cooler.
Wow, I think it’s been more than a day since the last TP fee rant.
Not affiliated with ArenaNet or NCSOFT. No support is provided.
All assets, page layout, visual style belong to ArenaNet and are used solely to replicate the original design and preserve the original look and feel.
Contact /u/e-scrape-artist on reddit if you encounter a bug.