Hi
I'm trying to access a container under my data storage on Azure, and I can log in fine.
I have this bunch of .csv files.
My setup is like this (what do I fill into the file path if I want to download all files?):
I have filled this:
Is there any way to use a wildcard in the file path?
Hey @Hamder83,
Do your files have a standardised naming scheme? In that case a batch macro would work to loop through each file. Check out the community video on batch macros if you're not familiar: https://community.alteryx.com/t5/Interactive-Lessons/Creating-a-Batch-Macro/ta-p/657923.
Essentially a batch macro will let you give the input a list of files to loop through:
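In plain Python, the loop-over-a-file-list idea behind a batch macro looks roughly like this sketch (the file names and contents below are made-up examples, not files from the thread):

```python
import csv
import io

def process_all(files, read_file):
    """Run the same inner logic (read_file) once per file, like a batch
    macro looping over its control input, and collect all the rows."""
    rows = []
    for path in files:
        rows.extend(read_file(path))
    return rows

def parse_csv_text(text):
    """Stand-in for the inner workflow: parse one CSV's contents."""
    return list(csv.reader(io.StringIO(text)))

# Hypothetical in-memory 'files' to show the shape of the loop:
data = {"a.csv": "x,1\ny,2\n", "b.csv": "z,3\n"}
all_rows = process_all(sorted(data), lambda p: parse_csv_text(data[p]))
```

In Alteryx the control parameter plays the role of `files` here, and the macro body plays the role of `read_file`.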
Any questions or issues please ask :)
HTH!
Ira
Hi @IraWatt
I found a small Python script that let me read all subfolders and files:
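The script itself isn't reproduced in the thread, so here is only a minimal stdlib sketch of the filtering step; the ADLS Gen2 listing call is left as a hedged comment, and the container name and credential variables in it are hypothetical:

```python
from fnmatch import fnmatch

def csv_paths(paths, pattern="*.csv"):
    """Keep only the path names that match the wildcard pattern."""
    return [p for p in paths if fnmatch(p, pattern)]

# In practice the path list would come from the ADLS Gen2 SDK, e.g.
# (untested sketch, assuming azure-storage-file-datalake is installed):
#
#   from azure.storage.filedatalake import DataLakeServiceClient
#   service = DataLakeServiceClient(account_url, credential=credential)
#   fs = service.get_file_system_client("my-container")  # hypothetical name
#   paths = [p.name for p in fs.get_paths(recursive=True)]
#   files = csv_paths(paths)
```

`get_paths(recursive=True)` walks every subfolder of the container, which matches what the script in the thread is described as doing.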
And I set my Macro action to update value:
And if I select 1 file within the macro it works:
But I get an error when I pass my path into the macro:
Does this make any sense to you?
Ahh, missed a dash. Now it works.
Hi @hamder83
Can you please share the workflow? I'm struggling with the same problem and want to pick multiple files from Azure Data Lake.
Thanks
Me too! Please can someone share the workflow? I'm just trying to pick up multiple files with a similar name from Azure.
It would be great if someone could share a workflow on this. I'm currently setting up a service-to-service connection to develop workflows and run them from Gallery, to import the most recent CSV file with a certain naming convention into a SQL database. I currently do this from a directory, but we're shifting to Azure Data Lake and the CSV files will be hosted there soon, not in an on-prem directory. Any help is really appreciated.
Hamder, can you please share the workflow? I'm in the same situation.
Trying to pick up the file with the latest creation date/time stamp, based on a wildcard file name such as Item_Master*.
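For the latest-file-by-wildcard case, a stdlib sketch of the selection logic (the file names and timestamps below are hypothetical examples; in the ADLS case the timestamps would come from each path's properties):

```python
from fnmatch import fnmatch

def latest_match(files, pattern):
    """Return the name of the newest file whose name matches the wildcard.

    `files` maps file name -> creation timestamp (any comparable value,
    e.g. an ISO-8601 string or a datetime). Returns None if nothing matches.
    """
    matches = {name: ts for name, ts in files.items() if fnmatch(name, pattern)}
    if not matches:
        return None
    return max(matches, key=matches.get)

# Hypothetical example data:
files = {
    "Item_Master_2023-01-01.csv": "2023-01-01T00:00:00",
    "Item_Master_2023-02-01.csv": "2023-02-01T00:00:00",
    "Other_File.csv": "2023-03-01T00:00:00",
}
newest = latest_match(files, "Item_Master*")  # → Item_Master_2023-02-01.csv
```

Note that `Other_File.csv` is newer overall but is excluded by the `Item_Master*` pattern, which is the behaviour the question asks for.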