We have a batch macro which pulls in data from numerous spreadsheets and consolidates it into one concise dataset of about 6 columns.
It runs fine and produces the expected data, which can be viewed on the outgoing green arrow of the tool OR in an attached Browse tool.
However, when we try to add additional tools after the macro, they will not recognize the column names. For example, a Select tool will only show "Unknown".
Even after running the workflow that contains the macro to populate it with data, the tools still will not pick up the column information.
The macro itself has a Select tool before the output where the dynamic column is deselected, so the macro ONLY outputs the 6 columns each time. There shouldn't be any variation in the output that the workflow can't handle.
I have searched the community and tried the various solutions to similar problems, but none seemed to work.
We are using Alteryx 2023.2.
Thanks!!!
@DanielG That is strange. If you are able to see the macro output using a Browse tool, then it should work with any other tools. Are you able to see the field names in the results window? Can you try using a Field Info tool?
@binuacs Yes, we can see the results in the window, but only on the green output arrow of the macro or in the Browse tool. When you drop in a Select, or a Sort for another example, there is nothing.
I will have my teammate try the Field Info Tool to see what happens.
Do you have screenshots of the Interface Designer and Workflow Configuration windows? Or can you supply a workflow with made up data?
@lwolfie Unfortunately, no. I can't share it or a screenshot, as it isn't mine to share. I was asking on behalf of a teammate who reached out to me for the answer.
Though all you need to do to see what the config looks like is pull a Sort or Select onto a blank canvas; it looks like that.
This can be a metadata issue. Sometimes it fixes itself on opening/closing Alteryx. Sometimes it's AMP-related. It can be more problematic if you have a Cross Tab in your macro. I'd recommend unioning in a Text Input with 0 rows but the required column names after your macro. This will mandate that your fields are present downstream.
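To illustrate the idea (purely a sketch, with pandas standing in for the Union tool; the column names are made up, not the actual macro's):

```python
import pandas as pd

# Hypothetical names standing in for the six columns the macro emits.
REQUIRED_COLUMNS = ["ColA", "ColB", "ColC", "ColD", "ColE", "ColF"]

def enforce_schema(macro_output: pd.DataFrame) -> pd.DataFrame:
    """Mimic unioning a zero-row Text Input after the macro: guarantee the
    required columns are present downstream even if the macro output
    arrives without them."""
    empty_schema = pd.DataFrame(columns=REQUIRED_COLUMNS)
    combined = pd.concat([empty_schema, macro_output], ignore_index=True)
    # Keep only the required columns, in a fixed order.
    return combined[REQUIRED_COLUMNS]
```

The zero-row frame contributes nothing but its schema, which is exactly what the 0-row Text Input does in the workflow.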
@apathetichell -- Thanks. I really wish that wasn't the solution. I really want it to just work the way you would think it should.... but it is what it is. 😁
It got us where we needed to go.
Thanks to all who provided responses. Happy Holidays.
@DanielG --- one more thing... if you have fields which shift between runs, I'd recommend casting them to the needed types using a Multi-Field Formula (set for all fields --- not a specific type).
Potential problems:
1) Your field exists in one data source but not in others.
Congrats! Your field may now be byte or boolean --- when you expect it to be string.
2) Your field is going to be created downstream but sometimes is created here.
Congrats! Your field downstream is now labeled [fieldname2].
3) I successfully cast my numeric field to string using the Multi-Field Formula, as apathetichell suggested:
Congrats! Your field exists upstream and is already a string --- but it hits the Multi-Field Formula tool as a string, and you've configured it to cast a numeric version of the field into a string. Your workflow has now crashed; expect an evil, ominous message about field type changes.
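One way to dodge pitfall 3 is to check the current type before casting. A rough sketch of that guard, again using pandas only as an illustration (the column name is hypothetical):

```python
import pandas as pd

def cast_if_needed(df: pd.DataFrame, column: str) -> pd.DataFrame:
    """Cast the column to string only when it is not already string-typed,
    avoiding the double-cast crash described in point 3."""
    if column in df.columns and not pd.api.types.is_string_dtype(df[column]):
        df[column] = df[column].astype("string")
    return df
```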
How I'd love to have this handled (but it won't ever be): JSON dictionary style.
That means I could test whether a field exists --- and if it exists, do something with it, and if not, create it. Imagine testing in a Formula tool: if exists([myfield]) then '....' else [otherfield] endif --- I mean, hypothetically this should be doable, right? But if you pass in a non-existent field, you'll get an error.
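In code terms, the wished-for behavior is roughly this dictionary-style check (a sketch only; [myfield] and [otherfield] are the hypothetical names from the formula above, and pandas is just the stand-in again):

```python
import pandas as pd

def pick_field(df: pd.DataFrame) -> pd.Series:
    """The exists() test the formula language won't allow: use [myfield]
    when it exists, otherwise fall back to [otherfield]."""
    if "myfield" in df.columns:
        return df["myfield"]
    return df["otherfield"]
```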