Force Field Type for Blank Columns in Dynamic Input
Hi - I am having an interesting dilemma reading in a directory of Excel files, all built with the same Excel template.
Using the Dynamic Input tool, I am getting the error message "XXX has a different schema than the 1st file in the set."
I have manually confirmed that the Excel headers are the same across several of the files. So, I believe the cause of this error is that some of the templates have blank fields, even though the column header exists in Excel and maps properly to the Dynamic Input template. This appears to cause Alteryx to read those blank columns in with a field type of Double, whereas templates where the fields are populated read them in as V_String.
I can't find any way to fix this within the Dynamic Input tool. Is the best (or only) option to create a batch macro that reads in each file and, within the macro, converts the null fields to strings with a Multi-Field tool?
Any thoughts and or suggestions are appreciated. Thanks - Randy
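For anyone who prefers to see the behaviour in code terms, the same type-inference split can be reproduced outside Designer. The pandas sketch below (pandas, openpyxl, and the file names are assumptions, not part of the original workflow) shows an all-blank column being inferred as a numeric type while the populated copy of the same column comes back as a string type, mirroring the Double vs. V_String mismatch described above.

```python
import pandas as pd  # assumed available; openpyxl is needed for .xlsx support

# Build two tiny workbooks that mimic the shared template: same header row,
# one with the "Notes" column blank and one with it populated.
pd.DataFrame({"ID": [1, 2], "Notes": [None, None]}).to_excel("template_blank.xlsx", index=False)
pd.DataFrame({"ID": [1, 2], "Notes": ["a", "b"]}).to_excel("template_filled.xlsx", index=False)

# The blank column is inferred as numeric (all NaN -> float64); the populated
# one comes back as object (string) -- the same split that shows up in
# Designer as Double vs. V_String.
print(pd.read_excel("template_blank.xlsx")["Notes"].dtype)   # float64
print(pd.read_excel("template_filled.xlsx")["Notes"].dtype)  # object
```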
- Labels:
- Dynamic Processing
- Error Message
- Input
Which version of Alteryx Designer do you have? I've noticed some issues with the Dynamic tools between versions (pre-19.2 vs. 19.3). I wonder if this is a bug.
Not a great solution, but if you set the Dynamic Input tool to assume that the first row is data, it will import all the columns as strings (as long as every column name has at least one non-numeric character). Then use a Dynamic Rename tool to pull the column names from the first row. You will have to do some post-processing to get rid of the header rows from the other sheets and to get the data types right.
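As a rough pandas equivalent of that trick (the glob pattern, file locations, and the numeric column name in the comment are assumptions, not part of the attached workflow): read every cell as text so blank columns cannot flip to a numeric type, promote the first row to headers, then re-type columns as a post-processing step.

```python
import glob
import pandas as pd

frames = []
for path in glob.glob("input/*.xlsx"):
    # header=None + dtype=str is the "first row is data" trick:
    # every column arrives as text, so blanks can't change the schema.
    raw = pd.read_excel(path, header=None, dtype=str)
    raw.columns = raw.iloc[0]        # Dynamic Rename equivalent: row 0 -> headers
    frames.append(raw.iloc[1:])      # drop that file's header row

combined = pd.concat(frames, ignore_index=True)

# Post-processing: convert the columns that really should be numeric, e.g.
# combined["Amount"] = pd.to_numeric(combined["Amount"], errors="coerce")
```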
I know you wanted to avoid it, but I've had enough run-ins with mismatched formats that I usually just go with a batch macro unless I'm absolutely sure the formats are the same and every column will have data. Just set the macro up so that the full path for the sheet is the template. You don't really need to do any post-processing; just make sure the macro is configured to combine the results of each iteration by name. I've attached an example where the only extra tool is an Append so I can include the sheet names with the data.
EDIT: One more thing. Using this approach, you will get the same data types you would get if you read each sheet individually, without doing any extra work. A mix of nulls and doubles will give you Double. Nulls and strings will give you a string type (V_String or V_WString). If one sheet has string values and another has doubles in the same column (same name), then the resulting type will be string.
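That combine-by-name behaviour can be sketched the same way (the paths are again assumptions): read each workbook on its own, analogous to one macro iteration per file, and union the results by column name, letting mismatched per-file types get promoted.

```python
import glob
import pandas as pd

# One read per file, analogous to one macro iteration per file.
frames = [pd.read_excel(path) for path in glob.glob("input/*.xlsx")]

# Concatenating aligns columns by name, like the macro's "combine by name"
# output option. Where per-file dtypes disagree (e.g. float64 in one file,
# object/string in another), the combined column is promoted to object.
combined = pd.concat(frames, ignore_index=True, sort=False)
print(combined.dtypes)
```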
I'm on 19.2; I'll test when I upgrade. Thanks for the suggestion.
Thanks, Tony. I tried your suggestion and, inside the Dynamic Input template, set it so that the first row is treated as data. That resolved the majority of the issues.
However, there are still a couple of files that are failing. The only difference I can see between a file that works and one that fails is that, in the failing file, some of the fields default to V_WString (see the screenshot of the joined Field Info below), whereas the files that work all come in as V_String. That is the only schema difference I see when reading the headers in each file as data.
Consequently, I'm going to try the batch macro approach, which seems to provide a less fragile and more flexible solution.
Thanks for your help and the sample workflow! Much appreciated. Randy
