
Alteryx Designer Discussions

Find answers, ask questions, and share expertise about Alteryx Designer.

Insert multiple Excel sheets to different tables in database




I've got 30 Excel files, and each file has to be inserted into a different table in the database (30 tables in total).

Can you please tell me whether this can be done dynamically through Alteryx? 


Thanks in advance!!

Yes, it is possible. 


There isn't enough detail to give a concrete solution. A rough approach would be:


Assuming your Excel files have different formats, you would probably need a simple batch macro that reads in each file and then loads it to the database. On the Input tool you can have it add the file details to the data coming out.


The Output Data tool allows you to use a field in the data to control which table it is written to.



If the Excel files are all the same format, then it is easier: there is no need for a batch macro, as you can use a wildcard in the Input tool to read them all in, and then use the same approach to send the output to the specified tables.
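Outside of Designer, the routing idea above (wildcard read, keep the source file name, write each file's rows to its own table) can be sketched in plain Python. This is a hypothetical illustration, not an Alteryx feature: for portability it uses CSV files and an in-memory SQLite database as stand-ins for the Excel files and the production database.

```python
import csv
import glob
import os
import sqlite3
import tempfile

# Create two sample files (CSV stand-ins for the Excel files) in a temp directory.
tmp = tempfile.mkdtemp()
for name, rows in [("excel1", [("a", 1), ("b", 2)]),
                   ("excel2", [("c", 3), ("d", 4)])]:
    with open(os.path.join(tmp, name + ".csv"), "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["col1", "col2"])
        w.writerows(rows)

con = sqlite3.connect(":memory:")

# Wildcard read: this loop plays the role of the wildcard Input tool
# (or a batch macro), one iteration per file.
for path in sorted(glob.glob(os.path.join(tmp, "*.csv"))):
    # The source file name decides the destination table,
    # like taking the table name from a field in the Output tool.
    table = os.path.splitext(os.path.basename(path))[0]
    con.execute(f'CREATE TABLE "{table}" (col1 TEXT, col2 INTEGER)')
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        con.executemany(f'INSERT INTO "{table}" VALUES (?, ?)',
                        [(r["col1"], r["col2"]) for r in reader])

# Each file now lives in its own table.
counts = {t: con.execute(f'SELECT COUNT(*) FROM "{t}"').fetchone()[0]
          for t in ("excel1", "excel2")}
print(counts)
```

The key point is that the file name travels with the data, so the write step can fan rows out to per-file tables instead of merging everything into one.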







I'm still not able to get the desired output.

All the Excel files are in the same format.


I used a wildcard, but all the files were merged and I got a single output.


For example:

Excel 1 has 100 rows and Excel 2 has 100 rows.

I need the data of Excel 1 to be written to table 1 and the data of Excel 2 to be written to table 2.

But currently both files are merged and I'm getting a single output with 200 rows.





Hello, you can use the following tools in a batch macro to avoid this:


1. Use a Directory tool >> Dynamic Input tool >> Formula tool that creates a column JoinValue = 1.

2. In a second stream, use a Directory tool followed by a Formula tool that creates the same column JoinValue = 1.

3. Use an equi-join on JoinValue so that FileName becomes a new column on the dataset of each file.

4. Load the entire data into a single database table, with the FileName column as the identifier.


From here on, your datasets can be distinguished by the FileName column, and you can create a new workflow or use In-DB tools to pull file-specific data out of the large dataset.
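The single-table approach above can be sketched the same way (again a hypothetical stdlib illustration, with SQLite standing in for the database and hard-coded sample rows standing in for the joined file data): load all rows into one staging table tagged with a FileName column, then filter per file downstream.

```python
import sqlite3

# Sample rows as (FileName, value) pairs -- FileName is the identifier
# column that the JoinValue = 1 join attaches in the steps above.
rows = [("excel1.xlsx", 10), ("excel1.xlsx", 20),
        ("excel2.xlsx", 30), ("excel2.xlsx", 40)]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (FileName TEXT, value INTEGER)")
con.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# Downstream (a new workflow, or In-DB tools), pull file-specific data
# out of the single large table.
excel1 = con.execute(
    "SELECT value FROM staging WHERE FileName = 'excel1.xlsx'").fetchall()
print(excel1)
```

Compared with writing 30 separate tables, this keeps the load step trivial and pushes the per-file split to query time.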


I have implemented a similar scenario. Let me know if you try it and face issues.