Dump data downloaded in JSON format via a Python script into the workflow
Hi All,
I am running a Python script through Alteryx, and I would like the data to be dumped into the workflow.
The data downloaded via the Python script is in JSON format, and I cannot find a way to dump it into the workflow. Initially, I tried a pandas DataFrame, but if I am not mistaken, that cannot be used directly with data in JSON format. I also have another process (within the Alteryx workflow) that transforms the data from JSON into tabular format.
My intent is to figure out how to dump the data in JSON format into the workflow.
I appreciate any help.
Thanks
Solved! Go to Solution.
Hi @othmane
The Python tool interacts with the workflow exclusively through dataframes, using the Alteryx library functions Alteryx.read("#1") and Alteryx.write(df, 1). So I believe you have three ways to deal with this:
1 - Assign the JSON to dataframe cells and then pass it to the workflow with Alteryx.write(df, 1). Depending on the size of your JSON, this may not work properly.
df
| ColumnA |
|---------|
| {data:{id:1, name: "John"… |
| {data:{id:2, name:"Carl"… |
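A minimal sketch of option 1, assuming the download produced a list of raw JSON strings (the record values here are illustrative only). The Alteryx.write call only runs inside the Python tool, so it is shown commented out:

```python
import pandas as pd

# Hypothetical JSON records returned by the download script (illustrative only)
raw_records = [
    '{"data": {"id": 1, "name": "John"}}',
    '{"data": {"id": 2, "name": "Carl"}}',
]

# Put each raw JSON string into a single cell of a one-column dataframe
df = pd.DataFrame({"ColumnA": raw_records})

# Inside the Alteryx Python tool, this pushes the dataframe to output anchor 1:
# from ayx import Alteryx
# Alteryx.write(df, 1)
```

The downstream JSON Parse step in the workflow would then unpack ColumnA.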
2 - Parse the JSON inside your Python code, and then pass the dataframe to the workflow with Alteryx.write(df, 1).
df
| id | name |
|----|------|
| 1  | John |
| 2  | Carl |
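A sketch of option 2, assuming a nested payload like the one above (the sample string is illustrative, not from the original post). pandas' json_normalize flattens the nested objects into columns, and the resulting dataframe is what Alteryx.write would pass to the workflow:

```python
import json
import pandas as pd

# Hypothetical downloaded payload (illustrative only)
raw = '[{"data": {"id": 1, "name": "John"}}, {"data": {"id": 2, "name": "Carl"}}]'

records = json.loads(raw)

# Flatten the nested "data" objects into plain columns ("data.id", "data.name")
df = pd.json_normalize(records)
df.columns = [c.replace("data.", "") for c in df.columns]

# Inside the Alteryx Python tool:
# from ayx import Alteryx
# Alteryx.write(df, 1)
```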
3 - Dump the JSON to a local file with Python, and then read the file dynamically in your workflow. I personally like this option.
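A sketch of option 3, using only the standard library. The payload and the file name are assumptions for illustration; in practice you would point an Input Data tool (or a dynamic input) at the path the script writes to:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical downloaded payload (illustrative only)
payload = [{"data": {"id": 1, "name": "John"}},
           {"data": {"id": 2, "name": "Carl"}}]

# Write the JSON to a known location; the workflow then picks the file up.
# "downloaded_payload.json" is an assumed name, not from the original post.
out_path = Path(tempfile.gettempdir()) / "downloaded_payload.json"
with open(out_path, "w", encoding="utf-8") as f:
    json.dump(payload, f)
```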
Hi
I went with dumping the file 😁
Thanks for your support.
