Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Circumventing SQL Server Timeout

thughes
6 - Meteoroid

Hi all,

 

I have a complex workflow (including a macro) that iterates over all the tables in my database.  When I run it, I get an error telling me SQL Server timed out (presumably a server setting, at least for non-admin users).  Because the macro only queries the next table after it has finished with the current one, caching the tables doesn't help.  So now for my questions:

 

1. Is there a best practice for running workflows like the one I've described above that take longer than the database connection allows?

2. Is there a way to close the connection at the end of one iteration of the macro and then open a new connection at the start of the next iteration of the macro?

 

Thanks,

Tim

3 REPLIES
chris_love
12 - Quasar

If the server is timing out, this points to one of three issues:

 

1. A timeout setting on the server or in the driver

 

2. A problem with the client / server network connectivity

 

3. An issue with the connection string

 

Assuming you are using a batch macro, then yes, you are creating a new connection to the database on each iteration.

 

Can you share a screenshot of the error message in the log? It would help to understand where it's happening. In the parent module's properties, if you turn on Show Macro Messages, that would also help show the full macro output and what is occurring.

thughes
6 - Meteoroid

Thanks for confirming that a batch macro creates a new connection to the database each time.  It turned out to be the server timeout setting that was causing the error: for some of the tables my macro loops through, just doing a SELECT * was enough to time out.  I solved it by limiting to TOP 1000000, which isn't ideal, but it works.

 

Thanks again,

Tim

thughes
6 - Meteoroid

So I spoke too soon.  After running for about 23 hours, my workflow finally timed out on the 547th macro input value.  I could further limit the number of records I pull from each table, but that would further reduce accuracy.

 

So a new question arises: is there a way to continue processing a batch macro after an error occurs?  Specifically, my macro iterates over each table in a database, so if it fails for one table I would like it to simply move on to the next table in the list.
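Outside Alteryx, the skip-on-failure behaviour being asked about is just a try/except around each table. A minimal Python sketch (standard-library sqlite3 as a stand-in for SQL Server; "missing_table" is a hypothetical name chosen to fail):

```python
import sqlite3

# In-memory database with one real table; "missing_table" does not exist,
# so querying it raises an error.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE good_table (x INTEGER)")
conn.execute("INSERT INTO good_table VALUES (1)")

tables = ["good_table", "missing_table"]
results, failures = {}, []

for table in tables:
    try:
        # Cap rows per table (SQL Server would use SELECT TOP 1000000 ...).
        rows = conn.execute(f"SELECT * FROM {table} LIMIT 1000000").fetchall()
        results[table] = len(rows)
    except sqlite3.Error as exc:
        # Record the failure and move on instead of aborting the whole run.
        failures.append((table, str(exc)))

print(results)   # → {'good_table': 1}
print(failures)  # the missing table is logged, not fatal
```

The batch keeps going past the bad table, and the failure list can be reviewed afterwards to retry or investigate just those tables.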

 

Thanks in advance!

 

-Tim
