Hi,
I have a table containing various "data sets". These data sets are separated by an empty column, and they all consist of the same number of columns.
I need to add a column at the end of each data set that assigns it a unique identifier. Please see the picture below and the attached files for reference.
Could someone please give me some ideas on how to solve this?
Thanks!
Hi @aorozco
I created a workflow (WF) to do this in two ways. Please note that Alteryx does not allow duplicate field names in the same data set, while you want two different fields with the same name, "PROGRAM". You have to compromise somewhat: rename them to PROGRAM_1, PROGRAM_2, or transform the data vertically.
I guess the attached workflow doesn't exactly match yours, but it should be good enough to get you started. Good luck.
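For anyone who wants to see the logic outside of Alteryx, here is the same idea sketched in pandas. This is only an illustration under my own assumptions: the column names (SEP, DATE, VALUE) and the PROGRAM_n identifier values are placeholders, not taken from the actual attached files.

```python
import pandas as pd

# Toy example of the layout described above: two data sets side by side,
# separated by one entirely empty column (names are placeholders).
df = pd.DataFrame({
    "DATE":    ["2024-01-01", "2024-01-02"],
    "VALUE":   [10, 20],
    "SEP":     [None, None],          # the empty separator column
    "DATE_2":  ["2024-01-03", "2024-01-04"],
    "VALUE_2": [30, 40],
})

# Columns that are entirely null are the separators between data sets.
is_sep = df.isna().all()

# Running set id per column: it increments at every separator column.
group_id = is_sep.cumsum()

groups = {}
for col in df.columns:
    groups.setdefault(group_id[col], []).append(col)

blocks = []
for n, (_, cols) in enumerate(sorted(groups.items()), start=1):
    keep = [c for c in cols if not is_sep[c]]   # drop the separator itself
    block = df[keep].copy()
    # Duplicate field names are not allowed, hence PROGRAM_1, PROGRAM_2, ...
    block[f"PROGRAM_{n}"] = f"SET_{n}"
    blocks.append(block)

result = pd.concat(blocks, axis=1)
```

Each data set ends up with its own uniquely named identifier column appended at the end, which mirrors the rename compromise mentioned above.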
Thanks for the quick reply @gawa! Your solution with the Dynamic Rename is almost perfect, except that my real input file doesn't have a null column at the end, so I don't get an identifier for the last data set. Is there a way to add a null column at the end of the file and name it based on the number of fields?
I got it, @aorozco
I suppose your real data probably has a variable number of columns, or may have one in the future.
With that case in mind, I prepared the WF as attached. This WF accepts any number of columns, even if it has DATE_3, VALUE_3, DATE_4, VALUE_4, and so on.
This is achieved with a Transpose & Cross Tab trick to cater for a dynamic input schema. I won't explain it in detail here, so please open the attached WF and see how it works.
Needless to say, even if your real data always has only four columns (DATE, VALUE, DATE_2, VALUE_2), the attached WF works fine, so don't worry.
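Outside of Alteryx, the same dynamic-schema idea can be sketched in pandas by deriving the set number from each column-name suffix, so no trailing null column is needed. Again, the column names and the PROGRAM_n identifiers below are my placeholders, not the real file's schema:

```python
import re
import pandas as pd

# Toy input with three DATE/VALUE pairs and no trailing empty column
# (the case described above; column names are placeholders).
df = pd.DataFrame({
    "DATE":    ["2024-01-01"], "VALUE":   [10],
    "DATE_2":  ["2024-01-02"], "VALUE_2": [20],
    "DATE_3":  ["2024-01-03"], "VALUE_3": [30],
})

def set_number(col: str) -> int:
    """'DATE' -> 1, 'DATE_2' -> 2, 'VALUE_3' -> 3."""
    m = re.search(r"_(\d+)$", col)
    return int(m.group(1)) if m else 1

blocks = []
for n in sorted({set_number(c) for c in df.columns}):
    cols = [c for c in df.columns if set_number(c) == n]
    block = df[cols].copy()
    block[f"PROGRAM_{n}"] = f"SET_{n}"   # identifier appended per data set
    blocks.append(block)

result = pd.concat(blocks, axis=1)
```

Because the grouping key comes from the column names rather than from separator columns, this handles any number of DATE_n/VALUE_n pairs, including the last set that has no empty column after it.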
Hi @gawa
Sorry I didn't reply earlier. I'm new to Alteryx, so it took me some time to understand your workflow, adjust it to fit my real input data, and test the output...
It works beautifully👌 I'm impressed with your analytic skills and how fast you were able to provide all the different solutions.
Thank you very much for your help!
@aorozco Happy to hear that! Have a nice weekend.
BTW, I'd appreciate it if you could accept the solution on my post above, since you marked it on your own post :)