I have an architecture in which IoT data flows from Microsoft IoT Central to an Azure Data Lake (i.e. an Azure Blob Storage account with Data Lake Storage Gen2 enabled).
The data export functionality is built in and easy to set up with a simple IoT Central configuration.
Once configured, files (AVRO, I assume) are continuously written to the Data Lake. Folders and files are all created, named and appended automatically, in this case into a blob container named data.
All file paths follow the pattern shown in the attached screenshot, where the first path segment (after data/) is auto-generated.
A new folder is added automatically every full minute, so each file holds a number of rows whose timestamps all fall within that specific minute.
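To make the folder layout concrete: with plain Python I can group the minute-level file paths into, say, hourly buckets like this. The example paths and the auto-generated segment (abc123) are made up to match the pattern I see in my container, not real values:

```python
from collections import defaultdict

# Hypothetical paths following the minute-based layout:
# data/<auto-generated>/<YYYY>/<MM>/<dd>/<hh>/<mm>/<file>
paths = [
    "data/abc123/2024/01/15/10/00/part.avro",
    "data/abc123/2024/01/15/10/01/part.avro",
    "data/abc123/2024/01/15/11/00/part.avro",
]

def hour_bucket(path):
    # Keep everything up to and including the hour segment,
    # i.e. data/<auto-generated>/<YYYY>/<MM>/<dd>/<hh>
    parts = path.split("/")
    return "/".join(parts[:6])

# Group the minute-level files by hour
buckets = defaultdict(list)
for p in paths:
    buckets[hour_bucket(p)].append(p)
```

With the example paths above, this yields two buckets: two files under hour 10 and one under hour 11.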
I would like to aggregate the data from these files over some longer time interval. It is hence not as clear-cut as pointing to a single .csv file at a known, fixed path.
So, how do I use wildcards for a folder structure like this? And how do I trigger or schedule a read-and-aggregate "job" in Designer?
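To be explicit about what I mean by wildcards, something along these lines is what I am after. This is a plain-Python sketch using fnmatch; the paths are placeholders, and I realize Designer may use a different wildcard syntax:

```python
from fnmatch import fnmatch

# Wildcard over the auto-generated segment, the hour and minute folders,
# and the file name, for one specific day (placeholder values throughout):
pattern = "data/*/2024/01/15/*/*/*.avro"

candidates = [
    "data/abc123/2024/01/15/10/00/part1.avro",  # matches: right day
    "data/abc123/2024/01/16/10/00/part1.avro",  # different day: no match
]

matches = [p for p in candidates if fnmatch(p, pattern)]
```

Note that fnmatch's * also matches across / separators, so for a real job the pattern would need to be tightened; here it just illustrates the kind of selection I want Designer to perform.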
I have downloaded and installed the Azure Data Lake File Import module and I am using it. Is that the recommended approach?