Clean up events list?
Since you brought up cost … it currently costs you less than $3 USD/mo for hosting? O_o
Anyway … invasions are still going on, so removing those wouldn’t make much sense. The Crown Pavilion and Labyrinthine Cliffs are both returning as well.
The simplest thing you can do to reduce bandwidth is to request a compressed copy of the JSON, if you aren’t already doing that; it comes down to about 1/5th of the uncompressed size. I only bring this up because you mentioned wget, which is a bit lacking there, so you might not be.
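Roughly, here is what that looks like on the command line. This is just a sketch; the URL and world_id=1001 are examples, so swap in whatever you are actually fetching. curl negotiates and decompresses gzip on its own with --compressed, while wget can ask for gzip but leaves you a .gz file to unpack yourself:

# curl: sends Accept-Encoding and transparently decompresses the response
curl --compressed -o events.json "https://api.guildwars2.com/v1/events.json?world_id=1001"

# wget: can request gzip, but you have to decompress the result yourself
wget --header="Accept-Encoding: gzip" -O events.json.gz "https://api.guildwars2.com/v1/events.json?world_id=1001"
gunzip events.json.gz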
As Khisanth said, the events are still going on, so by definition, they should be returned by the API.
Many of the concerns you’re raising, while valid, seem to be application-specific problems rather than the fault of the API service.
That being said, I agree (though I’m not necessarily advocating for a change) that it would be nice if the API offered some grouping of events. The community-provided dragon event IDs come to mind; an endpoint that returned event_group_ids for various collections (dragon events, Crown Pavilion, Labyrinthine Cliffs, Scarlet invasions, etc.) could be quite useful.
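Purely as a sketch of the idea (the event_groups.json endpoint and the field names below are invented; nothing like this exists in the API today):

# Invented endpoint, only to illustrate the grouping idea:
curl --compressed "https://api.guildwars2.com/v1/event_groups.json"
# which might return something shaped like:
# { "event_groups": [
#     { "group_id": "dragon_bosses",       "event_ids": [ ... ] },
#     { "group_id": "scarlet_invasions",   "event_ids": [ ... ] },
#     { "group_id": "labyrinthine_cliffs", "event_ids": [ ... ] } ] }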
@Khisanth:
That’s €18/month for a server (which I need because I run TS3 and because I use unix shell scripts to update my database with API information, two things that are impossible on a simple PHP/MySQL shared hosting plan).
I have a “best effort” 100 Mbps connection; I would have to pay €50/month to get a server with a “guaranteed” 200 Mbps.
Also, I know that invasions are still going on (which is why I didn’t even suggest rolling back to the way the API worked before the “inactive” status was added; that would have kept the Labyrinthine events out of the json, but not the Scarlet ones).
But unless you want to draw a live map, the only event statuses useful for Scarlet invasions are the 13 meta events (http://events.gw2organizer.com/events/events.php?mode=7&list=2), not all 1282 red events.
About the Crown Pavilion and Labyrinthine Cliffs events: they may return, which is why they should not be removed from the reference (event_names.json); but in the meantime we don’t need their status.
For these reasons, I was asking for a second, pre-filtered json, while keeping the existing one as-is, so people who want to draw full live maps for Scarlet can still do it.
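In the meantime, if I remember the parameters correctly, events.json accepts an event_id filter, so the closest workaround would be polling the 13 meta events one request at a time, something like this (world_id and the ID list are placeholders):

# One request per meta event instead of the full 1282-event dump.
# META_IDS would hold the 13 meta event GUIDs (not reproduced here).
for id in $META_IDS; do
  curl --compressed "https://api.guildwars2.com/v1/events.json?world_id=1001&event_id=$id"
done

But that is 13 requests per refresh instead of one, which is exactly why a pre-filtered json (or a whitelist parameter) would be nicer.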
Btw, tyvm for the compression tip. I didn’t know it was possible to request a compressed version of a file over HTTPS. I’ll test this!
@Killer:
Grouping IDs are a very good idea. This idea could cover both my “filtered api” and my “white list request” needs.
- all “normal & permanent” events = ID 1
- temporary-content or other living story related events = one ID for each release.
- all world bosses (and pre-events?) = another ID
- …
I would be able to call ID 1 plus the currently active temporary-content IDs every minute, and the world bosses’ ID every 20 seconds (see the sketch below).
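A rough sketch of what my updater loop could look like (the group_id parameter is entirely hypothetical; the real API has nothing like it, and group 42 for world bosses is made up too):

# Hypothetical group_id parameter, matching the grouping proposed above.
while true; do
  # group 1 = normal & permanent events, refreshed once a minute
  curl --compressed -o permanent.json "https://api.guildwars2.com/v1/events.json?world_id=1001&group_id=1"
  for i in 1 2 3; do
    # world bosses group, refreshed every 20 seconds
    curl --compressed -o world_bosses.json "https://api.guildwars2.com/v1/events.json?world_id=1001&group_id=42"
    sleep 20
  done
done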
I was thinking of DreamHost VPS or Linode. Their cheapest plans meet what you described. DreamHost has no bandwidth limits and inbound data is free on Linode, so the cost doesn’t change no matter how big the JSON file grows.
What I would say is: get a decent NAS and run your own host! That then costs nothing beyond your normal broadband bill.
Very bad idea.
> I was thinking of DreamHost VPS or Linode. Their cheapest plans meet what you described. DreamHost has no bandwidth limits and inbound data is free on Linode, so the cost doesn’t change no matter how big the JSON file grows.
I may have to do something like this.
gw2stats.net recently crossed the 5 TB bandwidth mark with my current hosting provider. I never expected that much traffic, and I had a 500 GB cap :/