
Alteryx Designer Desktop Discussions


How to Create a Batch Macro in Alteryx for Dynamic API Calls to Retrieve Contact Data?

AKPWZ
8 - Asteroid

Hi Alteryx Community,

I am working on creating a Batch Macro to retrieve contact information using the following API:

"https://api.hubapi.com/contacts/v1/contact/vid/" + [vid] + "/profile?property=firstname&property=lastname&property=email&property=phone"

Scenario:

  • I have a dataset containing a large number of vids (e.g., 200,000 rows).
  • Each vid corresponds to a unique contact, and I need to make an API call for each vid to fetch the details (firstname, lastname, email, phone) dynamically.
  • Since there’s no pagination in this API, I’m considering a Batch Macro to process this data in manageable chunks or batches.

What I've Done So Far:

  1. Input Data:
    I have a list of vids in a column, stored as part of my workflow input.

  2. API Configuration:
    The API endpoint dynamically appends the vid to the URL. I plan to construct the URL using a Formula Tool:
    "https://api.hubapi.com/contacts/v1/contact/vid/" + [vid] + "/profile?property=firstname&property=lastname&property=email&property=phone"

  3. Authentication:
    I have already set up an API key and tested the endpoint successfully with cURL.

  4. Goal:
    Create a Batch Macro that:

    • Loops through each vid.
    • Makes a call to the API for each vid.
    • Processes and appends the results into a consolidated dataset.
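To make the goal concrete, here is a minimal sketch of the per-vid URL construction described above, written in Python rather than as an Alteryx formula. The endpoint shape is taken from the question; `build_url` is a hypothetical helper, not part of any Alteryx or HubSpot API.

```python
# Base URL and property list come straight from the question's endpoint.
BASE = "https://api.hubapi.com/contacts/v1/contact/vid/"
PROPS = "property=firstname&property=lastname&property=email&property=phone"

def build_url(vid):
    """Mirror of the Formula Tool expression: append the vid into the URL."""
    return f"{BASE}{vid}/profile?{PROPS}"
```

In Alteryx this is exactly what the Formula Tool expression does; the macro would then feed the resulting URL column into the Download tool.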

Challenges:

  1. Control Parameter Setup:
    I understand that the Control Parameter tool is used to dynamically pass values (in this case, the vid values) into the macro, but I’m not sure how to configure it correctly.

  2. Macro Workflow Structure:
    I’m unsure how to set up the tools within the macro for:

    • Constructing the URL dynamically.
    • Sending the API request using the Download Tool.
    • Parsing the API response.
  3. Batch Configuration:
    How do I ensure the macro processes the data in batches, or otherwise iterates through all of the vids?
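For the batch-configuration question, the underlying idea is just splitting the vid list into fixed-size groups that the macro's control parameter iterates over. A small illustrative sketch (the `chunked` helper and batch size are hypothetical, not an Alteryx API):

```python
def chunked(vids, size):
    """Split the vid list into fixed-size batches, analogous to what a
    batch macro's control parameter would iterate over."""
    for i in range(0, len(vids), size):
        yield vids[i:i + size]

# e.g. chunked(vids, 1000) yields 200 batches for 200,000 vids
```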

Please help me.
Thank you!

apathetichell
19 - Altair

You don't need a batch macro per se here ---> you can 1) set up your auth, 2) append it to your main dataflow, 3) dynamically create your URLs, 4) add a Record ID, and 5) process via the Download tool. Your real issue is timeout/concurrency and how to get this process to run as efficiently as possible.

 

Pagination should not be an issue here because you are calling each contact individually. Pagination tends to be an issue with OData APIs and situations where you retrieve a catalog in a single return -> in your case each response is for a specific entry.
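The five steps above can be sketched outside Alteryx as well. This hedged Python illustration shows the Record ID step: tagging each row with a sequential id so that downloaded responses can be joined back to their source vid after the call. `add_record_ids` is a hypothetical helper standing in for the Record ID tool.

```python
def add_record_ids(vids):
    """Equivalent of the Record ID tool: give each row a sequential id so
    the API response can later be linked back to the originating vid."""
    return [{"record_id": i, "vid": v} for i, v in enumerate(vids, start=1)]
```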

AKPWZ
8 - Asteroid

Thank you for the quick response @apathetichell 
Could you please explain how to use record_id to achieve my goal? How would the implementation of record_id work?

apathetichell
19 - Altair

Each of your 300K values needs a Record ID ---> this is so that you can manipulate your values/download data after your API call and link the data back to the call that produced it. You'd just add a Record ID tool. As a warning ---> running 300K API calls can be a fairly long process.
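Since the reply warns that hundreds of thousands of sequential calls take a long time, here is a rough sketch of running them through a bounded worker pool. In Alteryx the Download tool handles connections itself; this Python version is purely illustrative, and the `fetch` callable is injected (a hypothetical stand-in for the real HTTP request) so the sketch stays self-contained.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(rows, fetch, max_workers=8):
    """Run the per-vid calls with a bounded worker pool, keeping each
    response paired with its record_id. `fetch` is any callable that takes
    a vid and returns the (parsed) response."""
    def task(row):
        return {"record_id": row["record_id"], "response": fetch(row["vid"])}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order, so results line up with rows
        return list(pool.map(task, rows))
```

Whatever the tool, the practical concern is the same: bound concurrency so you stay inside the API's rate limits, and keep the record id attached so nothing gets orphaned on a retry.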

AKPWZ
8 - Asteroid

Hi @apathetichell, sorry, I'm still not able to follow.
Would it be possible for you to explain with a demo workflow that could clear up my doubt?

Thanks!

apathetichell
19 - Altair

I'm sorry -> what's your question here? Your workflow has 300K records which you are using to create 300K endpoints for your API? Give them a Record ID via the Record ID tool.

 

the Record ID will make parsing the API response (JSON/XML) easier after your API call.
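To illustrate that last point: once each response is paired with its record id, flattening the JSON into the four requested columns is straightforward. A minimal sketch, assuming the nested `properties.X.value` shape of HubSpot's v1 contacts responses (verify against a real payload before relying on it):

```python
def parse_profile(record_id, payload):
    """Flatten one profile response into a flat row keyed by record_id.
    The properties.<name>.value nesting is assumed from HubSpot's v1
    contacts API and should be checked against an actual response."""
    props = payload.get("properties", {})
    wanted = ("firstname", "lastname", "email", "phone")
    row = {"record_id": record_id}
    for name in wanted:
        row[name] = props.get(name, {}).get("value")  # None if missing
    return row
```

In Alteryx the same step would be a JSON Parse tool followed by a Cross Tab, with the record id carried through as the grouping key.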
