Here's the description of the issue I've been facing:
- I've built a simple macro that takes a file name as an input question and uses an Input Data tool to import an Excel file. The file can have any schema (the macro exists to avoid having users type full paths).
- In workflows where that macro is used (green in the screenshot), I've noticed that connecting a new tool downstream (in the screenshot, the Select) immediately generates a "missing field" error in the existing tool right after the macro (the Filter in the screenshot). However, the field still exists in the data, which wasn't modified, as the results table shows.
- My hypothesis is that the new connection triggers a metadata update in the existing tools, but somehow they don't use the workflow's current metadata; instead they use another version (e.g. the metadata from the placeholder file you have to define in the Input Data tool inside the macro).
Has anyone faced the same issue? Right now my answer to people using the macros is to rerun the workflow every time they see the error, which happens often and isn't convenient.
Sure, see attached. The macro doesn't have much: the user fills in a file name, and the macro just takes the current workflow's path and appends the user's file name. The goal is to access a datastore without having to worry about where it is. There's no validation, so users can point it at any file.
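In Alteryx this path logic lives in a Formula expression feeding the Input Data tool, but the idea is simple enough to sketch in Python (function and variable names here are illustrative, not part of the macro):

```python
import os

def resolve_input_path(workflow_dir: str, user_filename: str) -> str:
    """Mimic the macro: take the workflow's directory and append the
    file name the user typed, so users never handle full paths."""
    # No validation, just like the macro: the user can supply any
    # file name and the macro will try to open it.
    return os.path.join(workflow_dir, user_filename)
```

The same one-line join is all the macro does; everything else is standard Input Data tool configuration.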
This solved the problem in a particular case: when the macro always outputs the same fields.
In that case, I also had to ensure that these fields are present in the macro output regardless of the file being imported inside the macro. This means adding a Text Input tool containing the required columns but no rows, and unioning it with the input data. This makes the macro's metadata checks succeed at all times.
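The Text Input + Union trick can be sketched outside Alteryx with pandas: concatenating the incoming data with a zero-row frame that carries the required columns guarantees those columns exist in the output, whatever the source file contained (column names below are hypothetical):

```python
import pandas as pd

# Hypothetical required schema; in the macro this is the Text Input tool.
REQUIRED_COLUMNS = ["OrderId", "Amount"]

def enforce_schema(data: pd.DataFrame) -> pd.DataFrame:
    """Mimic the Text Input + Union trick: union the incoming data
    with a zero-row frame that defines the required columns, so the
    output always contains them even if the source file doesn't."""
    template = pd.DataFrame(columns=REQUIRED_COLUMNS)  # schema only, no rows
    return pd.concat([template, data], ignore_index=True)
```

Any required column missing from the input simply comes back empty, which is exactly how the Union tool behaves with auto-configured fields.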
However, this didn't solve the issue when the macro can produce any output schema. For that case, I noticed a checkbox tucked away in the Interface Designer: "output fields change when the macro's configuration or input changes".
The effect of this setting seems to be that the workflow using the macro remembers the fields the macro last output, keeping all downstream tools happy.