I designed a somewhat complex Alteryx flow that picks up the latest .txt file from a NAS drive. The file contains several lines that need to be parsed, and each line's record key determines specific fixed string lengths. I built it with a Dynamic Input tool, and everything worked out great using the template and multi-row logic to loop through the specific record keys.
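For context, here is a rough sketch (in Python, outside of Alteryx) of the kind of record-key-driven fixed-width parsing I mean. The record keys ("01", "02") and field layouts below are made-up examples, not my actual .flat layout:

```python
# Hypothetical layouts: each record key maps to its own list of
# (field_name, width) pairs, mirroring what the .flat file specifies.
LAYOUTS = {
    "01": [("record_key", 2), ("customer_id", 8), ("name", 20)],
    "02": [("record_key", 2), ("order_id", 10), ("amount", 12)],
}

def parse_line(line):
    """Slice a fixed-width line according to its leading record key."""
    key = line[:2]                      # record key is the first 2 chars
    fields = {}
    pos = 0
    for name, width in LAYOUTS[key]:    # widths differ per record key
        fields[name] = line[pos:pos + width].strip()
        pos += width
    return fields
```

In the Alteryx flow, the File/Field Layout option on the input does the equivalent of the LAYOUTS lookup above.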
However, the long-term solution is to store these files in Hadoop, which I sweep by pulling the latest load datetime stamp. When I try to use the Dynamic Input tool to connect to the Hadoop instance, there is no File/Field Layout selection to bring in the .flat file that specifies the fixed length of each individual field. What am I missing here? I searched the community, and I apologize if there is already a thread on this.
Many Thanks,
David