Alteryx Designer Desktop Discussions

SOLVED

Maxing out API requests - need to batch API requests with the Download tool

g_ang311
5 - Atom

I am having an issue where I am only able to fetch 1,000 records from the API I am calling with the Download tool. I need to fetch roughly 40,000 records, and the API only allows fetches of 1,000 records at a time, using an offset. So essentially I am looking for a way to feed a dynamic count and offset into the Download tool, such as 1-1000, 1001-2000, 2001-3000, etc.

 

I currently have a Text Input tool for the API URL, so how can I feed both a static text input and a dynamic input into the same Download tool? And how do I build a dynamic input that generates the counts and offsets automatically so that I can pass them into the Download tool's payload?
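For illustration, the batching arithmetic being asked about is just a series of (offset, count) pairs. In Alteryx this would typically be built with a Generate Rows tool plus a Formula tool; the Python below is only a sketch of that logic, and the parameter names "count" and "offset" are placeholders for whatever this API actually expects.

    # Sketch: generate the 1000-record intervals for roughly 40,000 records.
    # TOTAL_RECORDS is assumed to be known (or fetched from the API first).
    TOTAL_RECORDS = 40000
    BATCH_SIZE = 1000

    batches = [(offset, BATCH_SIZE) for offset in range(0, TOTAL_RECORDS, BATCH_SIZE)]

    for offset, count in batches:
        # the string the Download tool's payload field would carry for each call
        payload = f"count={count}&offset={offset}"
        print(payload)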

 

[Screenshot: BrightedgeKeywordGroupMergeAutomatic-Test.yxmc open in Alteryx Designer]

6 REPLIES
geraldo
13 - Pulsar

Hi,

A question: have you already checked whether this REST API has a paging parameter? If so, you can include the page number in the payload you send, and each page would return 1,000 records. It's worth searching the API documentation.

rafalolbert
ACE Emeritus

Hi @g_ang311,

 

I had the same problem with the SharePoint API; I use a modulo function like this: Mod([record_id], 1000)

To further optimise this, I suggest compacting all of the records for each batch number into a single cell, by concatenating all columns and all records and running it as a batch macro. I explain it in my other post here: https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Passing-User-Inputted-Data-into-Nested...
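As a rough Python analogue of that idea (the Mod([record_id], 1000) batch key plus a Summarize-style concatenation of each batch into one cell), with made-up field names for illustration:

    # Rough analogue of the described batching: derive a batch key with a modulo,
    # then concatenate every record in the batch into a single string ("cell").
    # The field names (record_id, col_a) are made up for illustration.
    from collections import defaultdict

    records = [{"record_id": i, "col_a": f"value_{i}"} for i in range(1, 4001)]

    batches = defaultdict(list)
    for rec in records:
        batch_key = rec["record_id"] % 1000          # Alteryx: Mod([record_id], 1000)
        batches[batch_key].append(f'{rec["record_id"]},{rec["col_a"]}')

    compacted = {key: ";".join(rows) for key, rows in batches.items()}  # one cell per batch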

 


 

Thanks,

Rafal

 

#Excuse me, do you speak Alteryx?

g_ang311
5 - Atom

Hi Rafal, thank you so much for the proposed solution. Attached are my revised macro and the workflow. How can I take the value from the batching macro and use it to update the count and offset in the Download tool?

rafalolbert
ACE Emeritus

OK, let's start over. I misunderstood your use case; I thought this was for data upload...

 

For a record-limited download, you'll need to find out the name of the parameter that sets the starting/minimum record.

 

First, I would make a call to get the maximum record ID. Based on that and the 1,000-records-per-call limit, calculate the intervals; the resulting values should go into the payload string (which you should drive dynamically from this field). That determines the number of calls, and those calls, connected to the Download tool, will execute top to bottom and give you the entire record set in the output (the new workflow is attached).

 

The only challenges are:

 

- finding the call that returns the max record count (I do this in SharePoint; an example call is: https://site_address/_api/web/lists/getbytitle('list_name')/Items?$top=1&$orderby=ID desc)

 

- knowing the min parameter for querying your records (in SharePoint it looks like this: https://site_address/_api/web/Lists/GetByTitle('Booked Media')/items?$filter=ID gt min&$top=5000)

 

I don't know your API, so this is as close as I can approximate from my own use case, but in theory this should work.
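To make that sequence concrete, here is a rough Python sketch using the SharePoint-style calls above as placeholders; the endpoint, parameter names, and JSON response shape are assumptions and will differ for the actual API.

    # Sketch: one call to find the max record ID, then paged calls in 1000-record intervals.
    # The URL and response shape below are SharePoint-style placeholders, not the real API.
    import requests

    BASE = "https://site_address/_api/web/lists/getbytitle('list_name')"
    HEADERS = {"Accept": "application/json;odata=verbose"}
    BATCH_SIZE = 1000

    # 1) Maximum record ID (equivalent to the "$top=1&$orderby=ID desc" call above).
    top = requests.get(f"{BASE}/Items?$top=1&$orderby=ID desc", headers=HEADERS)
    max_id = top.json()["d"]["results"][0]["ID"]

    # 2) Build the intervals and page through them (the "$filter=ID gt min&$top=..." call).
    all_items = []
    for min_id in range(0, max_id, BATCH_SIZE):
        page = requests.get(f"{BASE}/items?$filter=ID gt {min_id}&$top={BATCH_SIZE}",
                            headers=HEADERS)
        all_items.extend(page.json()["d"]["results"])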

 


 

cgoodman3
14 - Magnetar
Just a tip to add to what has already been mentioned in this thread.

In the past, I've found that API calls can fail mid-workflow, for example after hitting an API call limit. So, just in case this happens, I tend to wrap the API call in a batch macro and write the results of each call out to a .yxdb file, and if there is a call limit, I use the CReW Wait a Second tool to slow the macro down.

That way, if the API fails, you can restart at the failure point and you won't end up pulling your hair out!
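Outside Alteryx, the same resilience pattern (throttle each call and checkpoint each batch so a failed run can resume where it stopped) might look roughly like the sketch below; the URL, parameter names, and one-second pause are assumptions for illustration.

    # Sketch of the pattern: pause between calls and write each batch to its own file,
    # so a failed run can be restarted without re-fetching completed batches.
    import json, os, time, requests

    BATCH_SIZE = 1000
    TOTAL_RECORDS = 40000

    for offset in range(0, TOTAL_RECORDS, BATCH_SIZE):
        out_file = f"batch_{offset}.json"
        if os.path.exists(out_file):          # already fetched on a previous run - skip it
            continue
        resp = requests.get("https://api.example.com/records",          # placeholder endpoint
                            params={"count": BATCH_SIZE, "offset": offset})
        resp.raise_for_status()
        with open(out_file, "w") as f:
            json.dump(resp.json(), f)         # checkpoint, like writing each batch to a .yxdb
        time.sleep(1)                         # like the CReW "Wait a Second" tool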
Chris
Check out my collaboration with fellow ACE Joshua Burkhow at AlterTricks.com
g_ang311
5 - Atom

Rafal, thank you very much for the explanation and example. It has gotten me most of the way there; however, I am now getting an error when I connect the workflow to my macro that says "No valid fields were selected." I double-checked the field length in the formula and even added a Select tool after the Formula tool to extend the field length, but I still get this message. Any ideas? Much appreciated!
