

Is there a way we can connect Alteryx to Azure Blob storage?

digvijay2101
5 - Atom

Do we have a direct connector available? If not, is there a way we can achieve this connection?

9 REPLIES
david_fetters
11 - Bolide

Not yet. We're in the same boat. We've made it work three ways; hopefully one of them helps.

  1. Download the CSV file and use it like normal (low-hanging fruit). Probably not what you're looking for, but at least you get metadata when you have the CSV locally.
  2. Use the Download tool to pull directly from the storage blob via SAS token. It does work, but you will need to parse the results of the Download tool to get schema information every single time. Even with a macro to parse the CSV, it still requires you to run the workflow twice before metadata is available to the rest of your stream. And because the result of the Download tool is a single string-type cell, there's a maximum blob size of somewhere between 600 MB and 1 GB (depending on field lengths and quoting options). Since we use Blob Storage as an output from data lakes, our files can be pretty large. As an interesting note, I was able to get schema information without downloading the whole blob by using the Range parameters as documented here; this is the same method mentioned in the Knowledge Base article here (see the curl sketch after this list).
  3. Create a custom macro that uses the Run Command tool to execute AzCopy.exe, download the file to a temp location, and then load it into the data stream. You still don't get metadata, but AzCopy is optimized and this approach is currently the fastest way to access files out of Blob Storage. After AzCopy executes on the command line, you still have to wait for the CSV to load into Alteryx.
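
For illustration, here's a hedged sketch of the Range trick from option 2, shown with curl rather than the Download tool (in Designer the same Range header goes in the Download tool's Headers tab). The account, container, file name, and SAS token are placeholders:

REM Hypothetical: fetch only the first 32 KB of the blob to sniff the CSV
REM header row; Azure Blob GET requests honor the standard HTTP Range header.
curl -H "Range: bytes=0-32767" ^
  "https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINERNAME/myfile.csv?sv=...SAS..." ^
  -o "%TEMP%\blob_head.csv"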

 

We use option 3 when we need it to be seamless (as part of automation pipelines with the same schemas each run), but for random things option 1 is the easiest. There is an idea you should go support: https://community.alteryx.com/t5/Alteryx-Product-Ideas/Microsoft-Azure-blob-storage/idi-p/141190

 

If you want some help getting option 3 to work, I can try to get you a macro. We also have an analytical app for posting to Blob Storage that also uses AzCopy.exe so our analysts don't have to mess with the command line. None of it is ideal, but we've raised significant noise about it on our end.
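
To make option 3 concrete, here's a minimal sketch of the kind of command such a macro might feed to the Run Command tool, assuming AzCopy v8 at its default install path; the account, container, key, and file name are placeholders:

REM Hypothetical download direction of option 3: pull one blob into a temp
REM staging folder, then an Input Data tool reads the CSV from there.
"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" ^
  /Source:https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINERNAME ^
  /Dest:"%TEMP%\blob_staging" ^
  /SourceKey:YOURSTORAGEACCOUNTKEY== ^
  /Pattern:"myfile.csv" /Y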

 

digvijay2101
5 - Atom

Hi David,

 

Thanks for your suggestions.

My current requirement is to migrate data from on-prem to Blob Storage, and I'm trying to see if we can achieve it via AzCopy.exe.

Do you have any input on how we can achieve this?

 

Thanks in advance!

david_fetters
11 - Bolide

If you're just moving large swaths of data into a handful of containers, I would probably do this via the command line.  This reference is useful.  If you're trying to create numerous containers and move certain files into them according to specific rules, then it gets more complicated.

 

The basics are: 1) install AzCopy, 2) stage your local folders, 3) write out your command-line commands in Excel or Notepad, and 4) copy and paste each command into the command prompt and let it run.

 

The commands you'll use will look like this:

"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" /Source:"C:\MyLocalFolder\MoveToBlob" /Dest:https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINERNAME /DestKey:asdfoauhtuahtaYOURACTUALSTORAGEACCOUNTKEYauda908f0ayd8fyh0== /S

The reference I linked will help you understand this, but the above uses the recursive flag /S to copy everything contained in the MoveToBlob folder into the container named YOURCONTAINERNAME owned by the storage account YOURSTORAGEACCOUNT.
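
And if you're in the "specific rules" territory mentioned above, step 3 can be scripted instead of written out by hand. A hedged .bat sketch, assuming AzCopy v8 and one virtual directory per local subfolder (all names and the key are placeholders):

REM Hypothetical: emit one AzCopy call per subfolder so each folder lands in
REM its own virtual directory under the same container.
for /D %%F in ("C:\MyLocalFolder\MoveToBlob\*") do (
  "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" ^
    /Source:"%%F" ^
    /Dest:https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINERNAME/%%~nxF ^
    /DestKey:YOURSTORAGEACCOUNTKEY== /S /Y
)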

 

 

Ting
7 - Meteor

I'm having issues with the download tool and want to try the macro. Do you mind posting some instructions/an example of how you created the macro? That would be really helpful!

 

Thanks in advance. 

 

Ting

meghak1590
7 - Meteor

@david_fetters 

Need your suggestion here.

 

I am using the Run Command tool to copy from my local machine to Blob Storage.

I am giving this command:

azcopy copy "C:\Users\fsdf\sfsdfheaders.xlsx"  https://abcgfdfdf.blob.core.windows.net/testcontainer?sv=2019-12-STORAGEKEYY%3D 

The above command works perfectly in the Command Prompt, but when I run it using the Alteryx Run Command tool it's not taking the key correctly, and hence authentication is failing... no idea why.

 

The below error is logged:

 

--------------------------------------------------------------------------------
RESPONSE Status: 401 Server failed to authenticate the request. Please refer to the information in the www-authenticate header.
Content-Length: [302]
Content-Type: [application/xml]
Date: [Fri, 18 Dec 2020 15:25:28 GMT]
Server: [Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0]
Www-Authenticate: [Bearer authorization_uri=https://login.microsoftonline.com/5b973f99-77df-4beb-b27d-aa0c70b8482c/oauth2/authorize resource_id=https://storage.azure.com]
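
One likely culprit, offered as a guess: cmd.exe mangles unquoted SAS URLs (& splits the command, and %-sequences such as %3D get expanded in a batch context), so the token never reaches the service and the request falls back to the Bearer challenge above. A hedged sketch of the quoting fix, keeping the placeholders from the command above:

REM Hypothetical fix: quote the full SAS URL; if this line lives in a .bat
REM file, also double each literal % so cmd.exe does not consume it.
azcopy copy "C:\Users\fsdf\sfsdfheaders.xlsx" ^
  "https://abcgfdfdf.blob.core.windows.net/testcontainer?sv=2019-12-STORAGEKEYY%%3D"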

 

Kindly provide some inputs here.

 

Thanks

tatemunja
6 - Meteoroid

Were you able to get any help with this? I have the exact same requirement as you.

LMB
6 - Meteoroid

I am in need of option 3; we have extremely large files.

 

AlaKh
5 - Atom

Any news regarding this?

 

AlaKh
5 - Atom

The best way to connect to Azure Blob Storage is using AzCopy with the Alteryx Run Command tool; it's really efficient. You just have to add AzCopy to your PATH environment variable so you won't encounter any problems while running the command from Alteryx.
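
A hedged sketch of that setup: the install folder is a guess, and setx only takes effect in new sessions (it also writes the combined user+system PATH back to the user PATH, so some prefer editing PATH through System Properties instead).

REM Hypothetical one-time setup: put the AzCopy folder on the user PATH.
setx PATH "%PATH%;C:\Tools\azcopy"

REM After restarting Designer, the Run Command tool can call azcopy by name
REM (v10 syntax, SAS placeholder):
azcopy copy "C:\MyLocalFolder\MoveToBlob" ^
  "https://YOURSTORAGEACCOUNT.blob.core.windows.net/YOURCONTAINERNAME?sv=...SAS..." ^
  --recursive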
