
Engine Works

Under the hood of Alteryx: tips, tricks and how-tos.
mathieuf
Alteryx

Let's perform the following sequence:

  • Data preparation
  • API call to enrich this data
  • Response analysis

 

To complete these steps, we'll use Designer Cloud and Plans.

 

Here's an overview of our orchestration with Plans:

 

PIC1.png

 

Step One: Data Preparation

 

My starting file will be a CSV file with stores and postal addresses.

 

PIC2.png

 

My first step is to clean up my postal addresses by removing special characters (é, à, è, -, ', etc.). I could have used a lookup file, but I chose to use a Text Input tool combined with the Find Replace tool.
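The same cleanup can be sketched outside Designer Cloud. This is a minimal Python illustration of the idea (not the Find Replace tool itself): decompose accented characters and drop the accents, then replace hyphens and apostrophes with spaces. The function name is mine.

```python
import unicodedata

def clean_address(text: str) -> str:
    # Decompose accented characters (é -> e + combining accent), then drop the accents
    decomposed = unicodedata.normalize("NFD", text)
    no_accents = "".join(c for c in decomposed if not unicodedata.combining(c))
    # Replace hyphens and apostrophes with spaces, mirroring the Find Replace pairs
    for ch in "-'":
        no_accents = no_accents.replace(ch, " ")
    return no_accents

print(clean_address("Rue de l'Église-Saint-André"))
# Rue de l Eglise Saint Andre
```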

 

PIC3.png

 

Once this step is completed, I can create the 4 mandatory fields for my API call (URL, Method, Headers, Body).

 

URL:

Replace('https://data.geopf.fr/geocodage/search?q=' + [Adresse postale], ' ', '%20')

 

This builds the API endpoint URL followed by the postal address, and encodes the spaces with the corresponding web code '%20'.

 

Method:

'GET'

 

Headers and Body can be empty but the fields must exist:

''
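The four fields above can be sketched in Python for readers who want to test the request shape outside Designer Cloud. Note one small difference, flagged in the comments: `urllib.parse.quote` percent-encodes all reserved characters, whereas the Alteryx formula above only replaces spaces. The function name and the returned dictionary are illustrative.

```python
from urllib.parse import quote

def build_request_fields(adresse_postale: str) -> dict:
    # quote() encodes spaces as %20 (and, unlike the Replace formula above,
    # also encodes any other reserved characters in the address)
    url = "https://data.geopf.fr/geocodage/search?q=" + quote(adresse_postale)
    # Method is GET; Headers and Body may stay empty, but the fields must exist
    return {"URL": url, "Method": "GET", "Headers": "", "Body": ""}

fields = build_request_fields("10 rue de Rivoli Paris")
print(fields["URL"])
# https://data.geopf.fr/geocodage/search?q=10%20rue%20de%20Rivoli%20Paris
```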

 

With the Select tool I delete the Postal Address field that I no longer need.

 

Then I save the output file as CSV in my Alteryx Data Storage.

 

Step Two: API Call with Plans

 

Open the Plans app and add a first Designer Cloud step. Select the workflow created earlier and verify the step configuration as follows:

 

IMG4.png

 

Then, on success of that step, add an HTTP step and check the Load Configuration from Dataset box.

 

Choose a file name to store the result returned by the API (note: we'll come back to this configuration a little later).

 

IMG5.png

 

Now we can build the second workflow to analyze the API results.

 

Step Three: Analysis of API Results

 

PIC4.png

 

For this workflow, we'll use two data inputs: the API response and the file produced in Step One, which lets us retrieve all the characteristics of our data (categories, KPIs, etc.). The join between the two is performed on the URL field.
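The join on the URL field can be sketched as follows. This is a plain-Python stand-in for the Join tool, with hypothetical rows representing the two inputs (the field names `Magasin` and `CA` are examples, not from the source file):

```python
# Hypothetical rows standing in for the two inputs; the join key is the
# request URL, which is present in both files.
api_rows = [
    {"URL": "https://data.geopf.fr/geocodage/search?q=A", "x": 652469.0, "y": 6862035.0},
]
store_rows = [
    {"URL": "https://data.geopf.fr/geocodage/search?q=A", "Magasin": "Store 1", "CA": 12000},
]

# Index the API results by URL, then enrich each store row with its match
api_by_url = {row["URL"]: row for row in api_rows}
joined = [
    {**store, **api_by_url[store["URL"]]}
    for store in store_rows
    if store["URL"] in api_by_url
]
print(joined[0])
```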

 

We can parse the file produced by the API using the JSON Parse tool.

 

Wanting to keep only the X and Y coordinates of this result, I perform a few transformation steps. Then I store the final file (required for the overall process to work).
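For reference, here is roughly what the JSON Parse step extracts. I'm assuming the geocoding service returns a GeoJSON FeatureCollection with projected `x`/`y` coordinates in each feature's properties; the sample payload below is hypothetical and much smaller than a real response:

```python
import json

# Hypothetical response shaped like the service's GeoJSON; a real payload
# carries many more properties per feature.
response_text = json.dumps({
    "type": "FeatureCollection",
    "features": [
        {"geometry": {"type": "Point", "coordinates": [2.3522, 48.8566]},
         "properties": {"label": "Paris", "x": 652469.0, "y": 6862035.0}},
    ],
})

parsed = json.loads(response_text)
best = parsed["features"][0]      # keep the best match only
x = best["properties"]["x"]       # projected X coordinate
y = best["properties"]["y"]       # projected Y coordinate
print(x, y)
```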

 

Last Step with Plans

 

Back in Plans, adjust the HTTP step: add temp to the file name so that a new timestamped file isn't generated each time the Plan is executed, and check the Delete Dataset After Plan Execution box.

 

IMG7.png

 

We can now add the JSON-parsing workflow, set to run upon success of the API call, and verify its configuration. And there you have it! 😉

 

Best Practices

 

You can add personalized emails (with variables) for success and failure of the steps of your choice.

 

If relevant, you can also add Logic steps, such as a delay or an approval step.

 

IMG8.png

 

IMG9.png