Hi All,
I have an Excel file (41,700 records, 8 columns). I make some format changes and apply formulas, then the data needs to be loaded into an existing SQL table. Currently this is taking 75+ minutes.
I have tried the following troubleshooting so far, but nothing seems to help:
1. Changed the output type to SQL bulk upload instead of ODBC.
2. Transferred the data into a temp table first, then moved it from the temp table to the actual table.
It is still taking 75+ minutes. Any help with this is highly appreciated.
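The staging pattern in step 2 can be sketched as follows. This is only an illustrative example using Python's built-in sqlite3 as a stand-in for SQL Server; the table and column names are made up and do not come from the actual workflow:

```python
import sqlite3

# In-memory SQLite database as a stand-in for the real SQL Server target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical destination table and a temp staging table with the same shape.
cur.execute("CREATE TABLE main_table (client_id TEXT, client_name TEXT)")
cur.execute("CREATE TEMP TABLE staging (client_id TEXT, client_name TEXT)")

rows = [("C001", "Acme"), ("C002", "Globex")]

# Load all rows into the staging table in one batched call, then append
# from staging into the main table with a single INSERT ... SELECT.
cur.executemany("INSERT INTO staging VALUES (?, ?)", rows)
cur.execute("INSERT INTO main_table SELECT * FROM staging")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM main_table").fetchone()[0])  # → 2
```

The idea is that the expensive network-bound load happens once into staging, and the staging-to-main append runs entirely inside the database.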
Thanks in advance to all the Alteryx enthusiasts.
Hello @Rakesh_Jasuja
Could you provide more details?
How is your connection set up (DSN, connection string, etc.)? What SQL Server driver are you using? Are you using the Overwrite option? What is the metadata of your dataset (data types, sizes, etc.)?
Are you using the regular Alteryx tools or the In-DB tools? Typically these issues are related to processing power. If you can use the In-DB tools instead to leverage the database's computing power, I would do that; it would only be slower if the database is bogged down.
Hi @alexnajm ,
Thanks for looking into my query!
I have not tried the In-DB tools yet, because I need to run a delete query first.
Let me try this solution.
Regards,
Rakesh J.
Hi @Thableaus ,
Thanks for reviewing my query.
It is a DSN, Microsoft SQL Server. I will share the metadata soon.
Regards,
Rakesh J.
Hi @Thableaus ,
Currently I am not overwriting the table; I am loading data into a temp table and then moving it from the temp table into the main table (appending).
Below is the metadata for the destination table.
| Column_name | Type | Length |
| --- | --- | --- |
| Case Level 1 (group) | varchar | 255 |
| Client_ID | nvarchar | 38 |
| Client Name | varchar | 255 |
| Day of Resolved Date | date | 3 |
| Day of Case Closed Date | date | 3 |
| Case Level 1 | varchar | 255 |
| Case Level 2 | varchar | 255 |
| Case Level 4 | varchar | 255 |
| Unique Document Id | float | 8 |
| Case Timeliness Definition | float | 8 |
| External Followup Required | varchar | 255 |
| Case Age Legacy | float | 8 |
| Met/Missed 2 Day | nvarchar | -1 |
| Met/Missed 5 Day | nvarchar | -1 |
| Met/Missed 10 Day | nvarchar | -1 |
| Met/Missed 20 Day | nvarchar | -1 |
| Met/Missed Overall Timeliness | nvarchar | -1 |
| case level 3 | varchar | 255 |
| day of created date | date | 3 |
| sr_owner | nvarchar | -1 |
| closed_by | nvarchar | -1 |
Hey @Rakesh_Jasuja
This is not very conclusive.
What is the DSN configuration, and which driver did you use? There are several ways to connect to SQL Server: ODBC, OLE DB, the Native Client, the ODBC Driver, etc. What is the driver version?
Is this SQL Server on-prem or in the cloud? Do you have network latency?
How did you configure the Bulk Load settings?
What is the configuration of your Output Data tool? Could you share a screenshot?
In-DB in this case does not make a difference, because you are writing data from outside the database into the database. In-DB and regular tools will have the same performance here; you will not be doing any processing inside the database.
Thanks,
Hi @Thableaus
It is connected through ODBC, and it is in the cloud.
However, I figured out a solution that is much faster and resolves my issue.
Here is what I did:
After adding some additional columns using formulas, I saved the result to an Excel file on my shared drive.
Then I used the newly created Excel file as the input and loaded it into SQL; it took around 6 minutes.
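For context on why batched loads tend to be so much faster than per-row inserts over ODBC, here is an illustrative comparison. This uses Python's sqlite3 in memory and is not the actual Alteryx mechanism; it only demonstrates the cost of committing each record individually versus one batched transaction:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (a INTEGER)")

rows = [(i,) for i in range(10000)]

# Row-by-row inserts with a commit per record: roughly what a slow,
# unbatched load looks like (one statement + one transaction per row).
start = time.perf_counter()
for r in rows:
    cur.execute("INSERT INTO t VALUES (?)", r)
    conn.commit()
slow = time.perf_counter() - start

cur.execute("DELETE FROM t")
conn.commit()

# One batched call inside a single transaction: the bulk-load pattern.
start = time.perf_counter()
cur.executemany("INSERT INTO t VALUES (?)", rows)
conn.commit()
fast = time.perf_counter() - start

print(f"per-row: {slow:.3f}s, batched: {fast:.3f}s")
```

The absolute numbers vary by machine, but the batched path avoids the per-row overhead that dominates slow loads.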
Hi Rakesh.
In the table with data types and sizes, I'm assuming -1 for nvarchar means max?
I have had similar problems where the workflow will not error but writes extremely slowly. Try changing all the max values to a length closer to what is actually needed. This solved it for us.
Worth noting: having the max length on doesn't always cause problems, but we have fixed slow writes by reducing those columns to shorter lengths. I have not looked any deeper into this behaviour.
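One way to pick a tighter size is to scan the data once for the longest value in each text column before declaring the table. A minimal sketch, with hypothetical column names and sample data (not from the actual dataset):

```python
# Sample rows standing in for the real data; in practice these would
# come from the Excel/input file.
rows = [
    {"sr_owner": "jsmith", "closed_by": "adoe"},
    {"sr_owner": "longer.username", "closed_by": "x"},
]

# Track the longest observed string per column so nvarchar(max) / -1
# fields can be declared with a concrete size instead.
max_len = {}
for row in rows:
    for col, val in row.items():
        max_len[col] = max(max_len.get(col, 0), len(str(val)))

print(max_len)  # → {'sr_owner': 15, 'closed_by': 4}
```

You would then round those lengths up with some headroom (e.g. nvarchar(50)) rather than leaving the columns at max.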