Hi everyone,
I’m creating a workflow in Alteryx that will call a Macro on a schedule. The Macro receives its inputs via a Text Input tool.
Normally, the Macro’s processing is straightforward: an Input Data tool (Oracle) and an Output Data tool (Snowflake) establish the connections, and a few Control Parameters let the Macro move a set of tables from Oracle to Snowflake sequentially.
However, this time I’m encountering some CLOB and BLOB fields that require additional processing, and I need some guidance.
I have two main questions:
1. Dynamic routing based on data type
a) How can I dynamically inspect an incoming table to identify the data type of each column?
b) Once the data types are identified, how can I route the columns to different pipelines for processing? For example:
- Text, Date, Number → default Snowflake data types
- BLOB → use the BLOB Convert tool
- CLOB → special handling (my second question below)
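To sketch the routing logic, here is a minimal Python version of the mapping I have in mind (this could live in an Alteryx Python tool, or be mirrored with a Filter/Formula tool on a metadata stream). The route names are hypothetical labels for the macro’s output paths, not Alteryx terms; the column types are what Oracle reports in the `ALL_TAB_COLUMNS` dictionary view:

```python
# Sketch: map Oracle column types (as reported by ALL_TAB_COLUMNS.DATA_TYPE)
# to a processing route. Route names here are made-up placeholders.

def route_for(data_type: str) -> str:
    t = data_type.upper()
    if t == "BLOB":
        return "blob_convert"    # send through the BLOB Convert tool
    if t in ("CLOB", "NCLOB"):
        return "clob_pipeline"   # needs the extra handling from question 2
    return "default"             # Text/Date/Number -> default Snowflake types

# In Alteryx, the metadata stream could come from a query such as:
#   SELECT column_name, data_type FROM all_tab_columns WHERE table_name = :t
columns = [("ID", "NUMBER"), ("DOC", "BLOB"), ("NOTES", "CLOB"), ("NAME", "VARCHAR2")]
for name, dtype in columns:
    print(name, route_for(dtype))
```

The idea is to query the table’s metadata first, tag each column with a route, and then drive the downstream tools off that tag rather than hard-coding column names.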
2. Handling CLOBs
The sample table I tested with doesn’t contain large CLOB values, yet the workflow still times out at this step. I assume the CLOB columns need some extra processing before they can be moved. What’s an efficient way to handle this in Alteryx?
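One workaround I’ve seen discussed for slow CLOB transfers is to slice each CLOB into fixed-size VARCHAR chunks on the Oracle side (e.g. with `DBMS_LOB.SUBSTR` in the source query) and reassemble them after load. I’m not certain this is the best fit for my case, but the slicing/reassembly logic would look roughly like this; the 4000-character chunk size is an assumption based on `DBMS_LOB.SUBSTR`’s limit, not something Alteryx imposes:

```python
# Sketch: split a CLOB's text into fixed-size chunks so each piece fits in a
# plain VARCHAR field, then reassemble after load. Pure-Python stand-in for
# what DBMS_LOB.SUBSTR would do in the source query.

CHUNK = 4000  # assumed chunk size; DBMS_LOB.SUBSTR returns at most 4000 chars

def split_clob(value: str, chunk: int = CHUNK) -> list:
    # One list entry per chunk, in order, each at most `chunk` characters.
    return [value[i:i + chunk] for i in range(0, len(value), chunk)]

def reassemble(chunks: list) -> str:
    # Concatenating the ordered chunks restores the original value.
    return "".join(chunks)

text = "x" * 10001
parts = split_clob(text)
print(len(parts), reassemble(parts) == text)  # prints: 3 True
```

Each chunk would also need the source key and a sequence number so the pieces can be ordered and re-joined on the Snowflake side.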
I’m relatively new to Alteryx and have only used a limited set of tools, so any tips, best practices, or example workflows for handling CLOBs and BLOBs in a scheduled Macro would be greatly appreciated.
Thank you in advance for your guidance!