Showing Posts For Demosthenex.7368:
IMO, the best way to fetch them all is to paginate through the listings using ?page_size=200&page=N, with N starting at 0 and running up to the page count reported in the X-Page-Total response header. That works out to about 121 requests per scrape.
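For what it's worth, a minimal sketch of that pagination loop in Python, assuming the commerce/listings endpoint discussed in this thread and the standard requests library (swap in whichever paginated endpoint you're scraping):

```python
import requests

# Assumed endpoint and page size from the discussion above.
BASE = "https://api.guildwars2.com/v2/commerce/listings"
PAGE_SIZE = 200

def fetch_all_pages():
    """Walk the paginated listing until every page has been fetched."""
    listings = []
    page = 0
    total_pages = 1  # placeholder until the first response reports the real count
    while page < total_pages:
        resp = requests.get(BASE, params={"page_size": PAGE_SIZE, "page": page})
        resp.raise_for_status()
        # X-Page-Total tells us how many pages exist in total.
        total_pages = int(resp.headers["X-Page-Total"])
        listings.extend(resp.json())
        page += 1
    return listings
```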
I’ll have to look into that. I’ve been using the list of ids returned by the root of each endpoint and batching them into URLs of 200 ids each.
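Roughly what I mean, as a sketch of the two-step pattern (fetch the id list from the endpoint root, then request details in batches of 200 via ?ids=); the items endpoint here is just an example:

```python
import requests

ENDPOINT = "https://api.guildwars2.com/v2/items"  # any id-based endpoint

def fetch_by_id_batches(endpoint=ENDPOINT, batch_size=200):
    # The endpoint root returns the full list of ids.
    ids = requests.get(endpoint).json()
    results = []
    # Request details in comma-separated batches of up to 200 ids.
    for i in range(0, len(ids), batch_size):
        batch = ids[i:i + batch_size]
        resp = requests.get(endpoint, params={"ids": ",".join(map(str, batch))})
        resp.raise_for_status()
        results.extend(resp.json())
    return results
```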
I was asking about bulk download because most (of my) API usage has nothing to do with requesting single items or 200 at a time. The most common use seems to be downloading everything into a local DB to query, so a bulk option would make sense.
Ever considered making a compressed bulk file available hourly for the most common endpoints?
You totally can make it a GET request with ids separated by commas. Also, the results can be paginated if you don’t want to deal with ids.
To confirm, that’s limited to 200 ids per result?
I also read somewhere that ids=all was valid, but it doesn’t appear to work for items, recipes, commerce/prices, or commerce/listings?
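A quick way to check is just to probe each endpoint with ids=all and look at the status code; this is only a convenience sketch for testing, not a statement about which endpoints actually support it:

```python
import requests

# The four endpoints mentioned above.
ENDPOINTS = [
    "https://api.guildwars2.com/v2/items",
    "https://api.guildwars2.com/v2/recipes",
    "https://api.guildwars2.com/v2/commerce/prices",
    "https://api.guildwars2.com/v2/commerce/listings",
]

for url in ENDPOINTS:
    resp = requests.get(url, params={"ids": "all"})
    print(f"{url}: HTTP {resp.status_code}")
```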
Is there a better way to bulk download price/listing data (i.e. a prepared cache) somewhere?
Unfortunately that’s a bit long to wait between refreshes for me. Back to the drawing board. Thanks.
Is the character inventory refresh interval documented somewhere?
I was trying to record my inventory before and after doing a salvage operation, but noticed that the API didn’t refresh quickly. I waited a few minutes and then logged out, and eventually the API returned the updated inventory.
If the interval is long then this method is unsuitable and I’ll return to paper notes.
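For reference, the before/after comparison I had in mind looked roughly like this. It assumes the authenticated characters inventory endpoint; the API key and character name are placeholders, and the unknown caching interval is exactly what breaks the approach:

```python
from collections import Counter
import requests

API_KEY = "XXXX"                 # placeholder key with inventory permission
CHARACTER = "My Character Name"  # placeholder character name

def snapshot(character=CHARACTER):
    """Return a Counter of item id -> count across all bags."""
    url = f"https://api.guildwars2.com/v2/characters/{character}/inventory"
    resp = requests.get(url, headers={"Authorization": f"Bearer {API_KEY}"})
    resp.raise_for_status()
    counts = Counter()
    for bag in resp.json().get("bags", []):
        if not bag:
            continue  # empty bag slot
        for slot in bag.get("inventory", []):
            if slot:
                counts[slot["id"]] += slot["count"]
    return counts

before = snapshot()
input("Salvage now, then press Enter...")
after = snapshot()
print("Gained:", after - before)
print("Lost:", before - after)
```

With a long refresh interval the second snapshot just returns the cached inventory, which is why the diff comes out empty until much later.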