Taking a lot of time pushing data to Azure SQL Database using Alteryx Desktop
Problem Statement:
Taking a lot of time (>8 hours) to push ~2M records to Azure SQL Database through Alteryx Desktop x64 (Version 2021.4.2.02731).
Scenario:
We process the data using Alteryx Desktop, which finally loads it to an Azure SQL Database.
The workflow was previously running successfully, but we are now facing issues when loading the data to the SQL database.
All the Input and Transformation tools work fine, but the Output tool is taking a long time, and no configuration was changed in between.
Output Tool Configuration:
ODBC Connection: Microsoft ODBC Driver 17 for SQL Server (ODBC version 3.80)
File Format: ODBC Database (odbc)
Output Option: Overwrite Table (Drop)
- Labels:
- Database Connection
@Pandalion123 have you tried using the Azure SQL Database ODBC driver instead? Failing that, you could try the Azure bulk uploader option to compare performance.
I have tried both of those configurations, but neither seemed to work. I have multiple containers in the same workflow running parallel processes that each take input data from a file, apply transformations, and output it. If there is a single process, Alteryx seems to handle it fine, but in my case, where there are also multiple simple inputs and outputs, the Output tool doesn't seem to push to the SQL server.
- Is the AMP engine enabled on the workflow?
- If you run the containers separately, what's the time comparison? Could you look at sequencing the outputs if there is a significant difference?
- I'd definitely lean towards what @DataNath is saying: use the bulk upload.
- Could you try the In-DB tools for writing out? Use a Data Stream In tool and then Write Data In-DB, and compare the difference.
- If all else fails, I'd try uploading via a Python script, if that's something you're permitted to do; a minimal sketch is shown after this list.
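To illustrate the Python route, here is a minimal sketch using pyodbc with fast_executemany enabled, which sends parameter arrays rather than row-by-row inserts and is usually much faster against Azure SQL. The server, credentials, file name, table, and column names are placeholders for the example, not values taken from this thread.

```python
# Minimal sketch of a Python upload to Azure SQL, assuming pyodbc and the
# Microsoft ODBC Driver 17 for SQL Server are installed. Connection details
# and the target table/columns are placeholders, not values from this thread.
import csv
import pyodbc

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:your-server.database.windows.net,1433;"
    "Database=your-database;"
    "Uid=your-user;Pwd=your-password;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

BATCH_SIZE = 50_000  # commit in chunks rather than one row at a time

insert_sql = "INSERT INTO dbo.TargetTable (Col1, Col2, Col3) VALUES (?, ?, ?)"

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.fast_executemany = True  # batch parameterised inserts on the driver side

    with open("transformed_output.csv", newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        batch = []
        for row in reader:
            batch.append(row)
            if len(batch) >= BATCH_SIZE:
                cursor.executemany(insert_sql, batch)
                conn.commit()
                batch = []
        if batch:  # flush the final partial batch
            cursor.executemany(insert_sql, batch)
            conn.commit()
```

This is only a sketch under those assumptions; for a few million rows it is usually far quicker than default row-by-row inserts, and if it is still too slow, a true bulk path (the bulk loader option or bcp) remains the faster route.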
Since the upload process is taking a significant amount of time, it may be worth uploading a smaller amount of data while testing; of course, not into a production database.
All the best,
BS