Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

API Downloading and Transforming Data

BJackson56
6 - Meteoroid

I'm trying to download CES data from FRED via its API. I've used this exact workflow on all my other extracts, but for some reason with CES data I'm not able to get all my dates and data on one row. After I've made a join to replace the Series IDs with the surveys' Titles, my next Cross Tab creates 3 sets of dates. Again, this is the only time it's doing this, so I'm thinking it may be something with this data, but why can't I get the values to show on the same data row?

Cross Tab Pre.JPG

Cross Tab Post.JPG

Why am I getting 3 rows for date, containing 1, 2, and 3 sets of concatenated dates? Several columns have data in the 2nd row, and only a handful have data in the 3rd row of dates. I've deleted and reconfigured the Cross Tab tool and still get the same results... Can anyone save my sanity!?
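For anyone hitting the same symptom: a Cross Tab is essentially a pivot, and duplicate (Date, Title) pairs are exactly what keeps the output from landing on a single row. A minimal pandas sketch of the situation (the titles and values here are made up, not actual CES data):

```python
import pandas as pd

# Hypothetical miniature of the data after the join: the mapping file
# contained "All Employees" twice, so the (Date, Title) pair repeats.
rows = pd.DataFrame({
    "Title": ["All Employees", "All Employees", "Construction"],
    "Date":  ["2021-01", "2021-01", "2021-01"],
    "Value": [100.0, 100.0, 50.0],
})

# A strict pivot requires unique (index, column) pairs, so the
# duplicated pair makes it fail instead of producing one clean row.
try:
    wide = rows.pivot(index="Date", columns="Title", values="Value")
except ValueError as err:
    print("pivot failed:", err)
```

Alteryx's Cross Tab doesn't error out like this; with the Concatenate method it quietly strings the duplicated values together instead, which is where the extra "sets" of dates come from.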

13 REPLIES
apathetichell
18 - Pollux

One more thing - your mapping file isn't unique. This could also be the problem. I see 56k records exiting tool 533 and 68k going into tool 554. If you are just mapping SeriesID -> Name, these numbers should be the same. The increase means duplicates, which create the additional rows you are seeing.
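The fan-out described above is easy to reproduce outside Designer: joining through a non-unique mapping multiplies rows. A small pandas sketch with made-up SeriesIDs:

```python
import pandas as pd

# Two data rows, but the mapping lists CES001 twice (the duplicate).
data = pd.DataFrame({"SeriesID": ["CES001", "CES002"],
                     "Value": [1.0, 2.0]})
mapping = pd.DataFrame({"SeriesID": ["CES001", "CES001", "CES002"],
                        "Title": ["All Employees", "All Employees",
                                  "Construction"]})

# The duplicate key matches twice, so the join emits an extra row.
joined = data.merge(mapping, on="SeriesID", how="inner")
print(len(data), "->", len(joined))  # row count grows: 2 -> 3
```

This is the same effect as 56k records becoming 68k: every duplicated SeriesID in the mapping adds one extra copy of each matching data row.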

BJackson56
6 - Meteoroid

2021.2, and yeah, I see it! It is not checked. Should it be? Would that help this situation?

 

A) For this workflow, I'm pulling all the CES series: all the industry-level data and the full series, 113 to be exact (I removed the 2 discontinued ones).

B) Did this, and still got the same results (results attached).

 

SUM in Corss Tab.JPG
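On (B): switching the Cross Tab's method to SUM will collapse the duplicates onto one row, but it silently doubles the duplicated values rather than fixing anything - which is why it can look like "the same results" or just wrong numbers. A pandas illustration with made-up figures:

```python
import pandas as pd

# One real observation that got duplicated upstream by the join.
rows = pd.DataFrame({
    "Title": ["All Employees", "All Employees"],
    "Date":  ["2021-01", "2021-01"],
    "Value": [100.0, 100.0],
})

# Summing collapses the duplicate pair to a single cell, but the
# duplicated value is added in twice: 100 + 100 = 200.
wide = rows.pivot_table(index="Date", columns="Title",
                        values="Value", aggfunc="sum")
print(wide.loc["2021-01", "All Employees"])  # 200.0, not 100.0
```

So aggregation hides the symptom; removing the duplicates before the join is the actual fix.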

BJackson56
6 - Meteoroid

@simonaubert_bd after a lot of reworking and learning more about JSON-packaged API data... you were right. I had 12 duplicate series that I was pulling in. Learned more about Cross Tabs too! All the learnings! :-)

BJackson56
6 - Meteoroid

@apathetichell nailed it! I had several series (12 or 13, I think) that were duplicates from the start. The mapping list I used for both inputs had the duplicates... learning is fun! :-) Thanks for the help!
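For the record, the fix amounts to de-duplicating the mapping list before it feeds both inputs. In pandas terms (column names assumed, not from the actual workflow):

```python
import pandas as pd

# The mapping list with the accidental duplicate rows.
mapping = pd.DataFrame({"SeriesID": ["CES001", "CES001", "CES002"],
                        "Title": ["All Employees", "All Employees",
                                  "Construction"]})

# Keep the first row per SeriesID; in Designer this is what the
# Unique tool does when grouped on the SeriesID field.
unique_map = mapping.drop_duplicates(subset="SeriesID")
print(unique_map["SeriesID"].is_unique)  # True
```

With the key unique on one side, the join can no longer fan out, and the Cross Tab gets exactly one value per (Date, Title) pair.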
