
Uploading and downloading to Azure Data Lake with string datatypes - string limit reached

JackTomlinson
5 - Atom

Hello, 

 

I have a workflow that will run on Alteryx Server and writes .csv files to an Azure Data Lake using a SAS key. The data is then downloaded again in later steps of the workflow, which avoids the need for a file browse tool and any reformatting.
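For context, here is roughly what the Download tool configuration does under the hood: a minimal Python sketch, not the exact workflow. The account, container, SAS token and function names below are placeholders, not values from my setup:

```python
import requests

# Placeholder values: substitute your own storage account, container and SAS token.
ACCOUNT = "mystorageaccount"
CONTAINER = "mycontainer"
SAS_TOKEN = "sv=...&sig=..."  # the SAS query string, without the leading '?'

def upload_csv(blob_name: str, csv_bytes: bytes) -> None:
    """Upload a CSV as a single block blob, one PUT per file."""
    url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{blob_name}?{SAS_TOKEN}"
    resp = requests.put(
        url,
        data=csv_bytes,
        headers={"x-ms-blob-type": "BlockBlob", "Content-Type": "text/csv"},
    )
    resp.raise_for_status()  # expects 201 Created

def download_csv(blob_name: str) -> bytes:
    """Download the blob back with a plain GET on the same SAS URL."""
    url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{blob_name}?{SAS_TOKEN}"
    resp = requests.get(url)
    resp.raise_for_status()
    return resp.content
```

The whole file travels as one request body in each direction, which is why the approach runs into the string field limit once files grow.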

I am currently using the approach in the attached screenshot, which was taken from the Alteryx Community and adapted to write to a string field. Downstream steps in the workflow were built around string fields, and if I export and then re-import the data as a blob it is no longer recognised by the tools downstream.

The workflow writes successfully to the Azure Data Lake for smaller files, but for larger files an error is generated because the string field size limit is reached.

I have thought of a few workarounds, but option 1 is not very efficient, option 2 is restricted because the workflow is not currently built with the AMP Engine, and option 3 is limited by a gap in my knowledge:

 

1. I have tried updating the blob input to split the blobs into chunks. This would require an iterative macro on both the upload and download sides and would write each dataset in multiple parts, which does not seem an efficient way to write to the data lake (see the chunked-upload sketch after this list).

2. I'm aware of the Azure Data Lake tools, but I think not running the AMP Engine will restrict their use.

3. I've seen zipped files being written for SFTP uploads, which may work here, but I'm not sure how that approach translates to writing to an Azure Data Lake (see the compression sketch after this list).
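On option 1, if splitting does turn out to be necessary, the Blob REST API has native support for it (Put Block followed by Put Block List), so the chunks can be committed server-side as one blob rather than landing as multiple part-files. A minimal Python sketch, using the same placeholder account, container and SAS token as above:

```python
import base64
import requests
from urllib.parse import quote

# Placeholder values, as in the earlier sketch.
ACCOUNT = "mystorageaccount"
CONTAINER = "mycontainer"
SAS_TOKEN = "sv=...&sig=..."
CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB blocks keep each request body small

def upload_in_blocks(blob_name: str, data: bytes) -> None:
    """Upload one blob in several blocks (Put Block), then commit them (Put Block List)."""
    base_url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{blob_name}"
    block_ids = []
    for i in range(0, len(data), CHUNK_SIZE):
        # Block IDs must be base64-encoded and the same length for every block.
        block_id = base64.b64encode(f"{i // CHUNK_SIZE:08d}".encode()).decode()
        resp = requests.put(
            f"{base_url}?comp=block&blockid={quote(block_id)}&{SAS_TOKEN}",
            data=data[i : i + CHUNK_SIZE],
        )
        resp.raise_for_status()
        block_ids.append(block_id)
    # Commit the block list; the blob only becomes visible after this call.
    block_list = "".join(f"<Latest>{b}</Latest>" for b in block_ids)
    body = f'<?xml version="1.0" encoding="utf-8"?><BlockList>{block_list}</BlockList>'
    resp = requests.put(
        f"{base_url}?comp=blocklist&{SAS_TOKEN}",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/xml"},
    )
    resp.raise_for_status()
```

The blob only appears after the final commit, and the download side can pull it back in pieces with Range headers, so no single field ever has to hold the whole file.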
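On option 3, the zipping itself is independent of the destination: the SFTP examples compress the payload before the upload request, and the same step can sit in front of the blob PUT sketched earlier. A minimal sketch using gzip, where compress_csv and the earlier upload_csv are both hypothetical names:

```python
import gzip

def compress_csv(csv_bytes: bytes) -> bytes:
    """Gzip the CSV payload before upload; text data usually compresses well."""
    return gzip.compress(csv_bytes)

# Usage with the hypothetical upload_csv() sketched earlier:
# upload_csv("output/data.csv.gz", compress_csv(csv_bytes))
# On the way back down, gzip.decompress() restores the original bytes.
```

Compression shrinks the payload but does not remove the underlying limit, so for very large files it would still need to be combined with chunking.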

 

Any more efficient ideas or workflow examples would be greatly appreciated.

 

Many Thanks,
