Not yet. We're in the same boat. We've made it work three ways; hopefully one of them helps.
1. Download the CSV file and use it like normal (low-hanging fruit). Probably not what you're looking for, but at least you get metadata when you have the CSV locally.
2. Use the Download tool to pull directly from the storage blob via SAS token. It does work, but you will need to parse the results of the Download tool to get schema information every single time. Even with a macro to parse the CSV, it still requires you to run the workflow twice before metadata is available to the rest of your stream. And because the result of the Download tool is a single string-type cell, there's a maximum blob size of somewhere between 600 MB and 1 GB (depending on field lengths and quoting options). Since we use blob storage as an output from data lakes, our files can be pretty large. As an interesting note, I was able to get schema information without downloading the whole blob by using the Range parameters as documented here (there's a sketch of that request after this list). This is the same method mentioned in the Knowledge Base article here.
3. Create a custom macro that uses the Run Command tool to execute AzCopy.exe, download the file to a temp location, and then load it into the data stream (a sketch of the command is below). You still don't get metadata, but AzCopy is optimized, and this approach is currently the fastest way to get files out of Blob Storage. After AzCopy executes on the command line, you still have to wait for the CSV to load into Alteryx.
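For reference, here's roughly what the Range trick in option 2 looks like as a raw HTTP request. This is a sketch only; the account, container, and file names are placeholders, and a valid SAS token has to be appended to the URL:

    curl -H "Range: bytes=0-4095" "https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINER/yourfile.csv?YOUR_SAS_TOKEN"

Azure Blob Storage honors the standard HTTP Range header on GET requests, so the response is just the first 4 KB of the file, which is enough to read the CSV header row without pulling down the whole blob.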
If you want some help getting option 3 to work, I can try and get you a macro. We also have an analytical app for posting to blob storage that uses AzCopy.exe so our analysts don't have to mess with the command line. None of it is ideal, but we've made plenty of noise about it on our end.
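In the meantime, the download side of option 3 boils down to a single command. Here's a minimal sketch using the old-style AzCopy (pre-v10) syntax; the account, container, file name, key, and temp path are all placeholders:

    AzCopy /Source:https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINER /Dest:C:\Temp\BlobDownload /SourceKey:YOURSTORAGEKEY /Pattern:"yourfile.csv"

The Run Command tool just wraps that line; the macro can then point an Input Data tool at the temp folder to load the CSV into the stream.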
If you're just moving large swaths of data into a handful of containers, I would probably do this via the command line. This reference is useful. If you're trying to create numerous containers and move certain files into them according to specific rules, then it gets more complicated.
The basics are 1) install AzCopy, 2) stage your local folders, 3) write out your commands in Excel or Notepad, and 4) copy and paste each command into the command prompt and let it run.
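A typical upload command looks something like this (again old-style AzCopy syntax; the local folder, storage account, container, and key are placeholders):

    AzCopy /Source:C:\MoveToBlob /Dest:https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINER /DestKey:YOURSTORAGEKEY /S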
The reference I linked will help you understand this, but the command above uses the recursive /S flag to copy everything contained in the MoveToBlob folder into the container named YOURCONTAINER owned by the storage account YOURSTORAGEACCOUNT.