Showing Posts For paddle.9654:
So you are suggesting that instead of using a long list of IDs (?ids=1,2,3), the best way to get the data is:
- Get the X-Page-Total from the HTTP header on a URL like this:
- https://api.guildwars2.com/v2/commerce/listings.json?page=0?page_size=200
- Loop through “?page=LoopNumber” until I hit the X-Page-Total
I tried doing this quickly, and when using the above URL, despite my “?page-size=200” I get a page_size of 50 and an X-Page-Total of 450 in the HTTP header. When using the IDs I could loop 200 of them at a time and get it down to around 113 requests. Do I ignore page_size and X-Page-Total and use X-Result-Total/(maximum page_size) instead?
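That last idea can be sketched in plain JavaScript (the function names are mine; the endpoint, parameter names and totals come from the numbers quoted in this thread): derive the request count from X-Result-Total and the maximum page_size, rather than trusting an X-Page-Total that was reported for the default page size of 50.

```javascript
// Number of page requests needed to cover all results,
// given the total result count and the page size you will use.
function totalPages(resultTotal, pageSize) {
  return Math.ceil(resultTotal / pageSize);
}

// Build the URL for one page; endpoint and parameter names
// follow the examples discussed in this thread.
function pageUrl(endpoint, page, pageSize) {
  return endpoint + "?page=" + page + "&page_size=" + pageSize;
}

// With the numbers quoted in this thread: 22480 results at 200 per page.
console.log(totalPages(22480, 200)); // 113 requests
```

Note the `&` between the two query parameters; a second `?` in the query string would leave page_size at its default.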
I am not sure which header containing the page count you are talking about. Looping through individual pages was causing 20,000 requests; I have dropped it down to around 170. Following your previous comment, I have stopped making requests like this:
- https://api.guildwars2.com/v2/items?id=1
- https://api.guildwars2.com/v2/items?id=2
- https://api.guildwars2.com/v2/items?id=3
And made my requests bundle the IDs into chunks of 200.
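Bundling the IDs into chunks of 200 might look like the following plain-JavaScript sketch (the chunk size and the ?ids= format come from the posts above; the function names are my own):

```javascript
// Split a flat array of item IDs into chunks of at most `size`.
function chunkIds(ids, size) {
  const chunks = [];
  for (let i = 0; i < ids.length; i += size) {
    chunks.push(ids.slice(i, i + size));
  }
  return chunks;
}

// Build one bulk request URL per chunk, e.g.
// https://api.guildwars2.com/v2/items?ids=1,2,3
function bulkUrls(endpoint, ids, size) {
  return chunkIds(ids, size).map(
    (chunk) => endpoint + "?ids=" + chunk.join(",")
  );
}
```

At 200 IDs per chunk, tens of thousands of single-item requests collapse into a couple of hundred bulk requests.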
Eearslya.6309, I think this is the correct solution, thank you. I can’t work out any other way to speed up the process.
You may want to create a database for the item data and only request it once a day or week.
For requesting massive amounts of data you should use pagination. This will reduce the number of requests to v2/commerce/prices from 22480 to 113 when using a page_size of 200 (the maximum value).
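As a sketch of that pagination loop, with the actual HTTP call abstracted behind a caller-supplied `fetchPage` placeholder (since the real transport here is getJSON or cURL, and the helper name is my own):

```javascript
// Fetch every page of an endpoint by walking page numbers.
// `fetchPage(page, pageSize)` stands in for whatever HTTP call
// you use; it should return that page's array of results, or an
// empty array once you are past the last page.
function fetchAll(fetchPage, pageSize) {
  const results = [];
  let page = 0;
  let batch;
  while ((batch = fetchPage(page, pageSize)).length > 0) {
    results.push(...batch);
    page += 1;
  }
  return results;
}
```

In practice you would stop at the X-Page-Total reported in the response headers instead of probing for an empty page, but the request count is the same either way.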
I don’t think this is what I am looking for, unless there is a way to use pagination to speed up multiple requests with individual IDs. Ideally I am looking for something like “?ids=all” for items, as my current solution of looping requests with individual IDs is a slow process.
I understand pagination would be a good way to do it if I were outputting the content directly onto a page with each call; it could then show the first 100 items with buttons to load the rest.
I am trying to find the best way to get all of the content into a JSON file which can then populate a database. For the items and their information I would update them on a daily basis if their content had changed, but for the prices and other more dynamic data I would like to update more frequently.
For the past two weeks I have been trying to work out the best way to get data from the item and commerce APIs using PHP and jQuery. I would be happy to use other methods, however I am not familiar with them.
Currently I am experimenting with jQuery. My method is as follows:
- getJSON on ‘v2/commerce/prices’ to get a list of all the IDs which are in the shop.
- Loop over all the IDs from step one.
- Inside the loop, getJSON on ‘/v2/items?id=Step_1_ID’, which gives me access to the ‘name’, ‘icon’ and other info.
- Inside the loop, getJSON on ‘v2/commerce/prices?id=Step_1_ID’, which gives me the ‘buy’ and ‘sell’ info.
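The steps above can be sketched as follows, with the two HTTP calls abstracted behind `getItem(id)` and `getPrice(id)` placeholders (hypothetical names; each stands in for one getJSON call, which is exactly why this approach makes so many requests):

```javascript
// Per-ID collection: for every price ID, fetch the item record
// and the price record, then merge them. Two requests per ID.
function collectPerId(ids, getItem, getPrice) {
  return ids.map((id) => {
    const item = getItem(id);   // name, icon, ...
    const price = getPrice(id); // buy/sell info
    return { id: id, name: item.name, buys: price.buys, sells: price.sells };
  });
}
```

With tens of thousands of IDs and two requests each, the request count explodes, which matches the slowness described here.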
I feel like I am approaching this wrong and am creating far too many requests, which could be simplified greatly.
I have also tried this method with PHP; however, it was extremely slow. I tried optimising it with cURL, but when running around 100–200 items it would take about 3 seconds per item.
The goal is to create my own local JSON file which could then be updated and used to populate a MySQL database.