
Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Flow control, timeout and error on the macro level

PiotrGa
6 - Meteoroid

Hi Community,

In my company we use Alteryx across a wide range of business processes. The workflows are often complex, and maintaining them on the internal Gallery is a real challenge. The built-in Alteryx logs are extremely helpful, but we are missing extra functions on the governance side. Although we have developed a framework of sorts to help us and our power users with many aspects of developing and maintaining workflows, we have hit a wall.

 

One of the example is:

> We use many DB sources. They usually work fine, but sometimes they run forever - it is a database problem, but diagnosing it is a real challenge.

 

We are trying to prepare governed connectors delivered as Alteryx macros, and we want them to have a built-in parametric timeout option.

So the power user can use the connector, while we set the timeout in our admin configuration DB.

 

I tried to use/rebuild the CReW "Wait a Second" macro to count the time while the second part runs the query, but I have no clue how to finish the process when the first one is done.

To be more clear: I want to set a time limit, e.g. 10 seconds, and if it elapses before [Connect In-DB and Data Stream Out] return records, I want to return an error. However, if the DB tools are faster, I want the process to finish without an error.

 

Is it possible? Do you have any ideas?

6 REPLIES
DavidSta
Alteryx

Hi @PiotrGa,

 

personally, I'd move this a little more to the database side.

I don't have a sample workflow available, but let me outline how it might look in my opinion:

 

- you need a unique identifier for the query - maybe add a comment or something similar with a unique value, or even the real timeout, to your query

- based on this unique identifier you can run an additional watchdog process - this can be an Alteryx workflow as well

- this process checks on the database side whether your query has been running longer than the expected time - every database should be able to report this value

-- in this case you can match on your unique identifier, or extract the "injected timeout value" from the comment

- if you detect that a query is running forever, you can also kill it with a SQL statement
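The decision logic of the watchdog described above can be sketched outside Alteryx. This is a minimal illustration, assuming a hypothetical tag convention where each governed query carries a comment like `/* wfid=abc1;timeout=600 */` (timeout in seconds); the tag name, the shape of the "running queries" records, and the helper names are all made up for the example.

```python
import re
from dataclasses import dataclass

# Hypothetical tag convention injected into every governed query:
#   /* wfid=<id>;timeout=<seconds> */
TAG_RE = re.compile(r"/\*\s*wfid=(?P<wfid>\w+);timeout=(?P<timeout>\d+)\s*\*/")

@dataclass
class RunningQuery:
    session_id: int
    sql_text: str
    elapsed_seconds: int  # as reported by the database's active-session view

def queries_to_kill(running):
    """Return session ids whose tagged timeout has been exceeded.

    `running` is whatever your database's "active sessions" query returns;
    untagged queries are ignored, so the watchdog only touches governed ones.
    """
    kills = []
    for q in running:
        m = TAG_RE.search(q.sql_text)
        if m and q.elapsed_seconds > int(m.group("timeout")):
            kills.append(q.session_id)
    return kills
```

The actual kill step is database-specific (e.g. `ALTER SYSTEM KILL SESSION` on Oracle, `pg_terminate_backend()` on Postgres), which is why the watchdog typically needs higher privileges than the querying user.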

 

Hope this gives you an idea of how such a process might look.

 

Best regards,

David

PiotrGa
6 - Meteoroid

Hi @DavidSta,

 

Thank you for your solution. 

I really appreciate it; unfortunately, as we work in a high-risk environment, we have very limited access to the databases.

As a result we can't use this kind of approach.

 

At the moment, I am searching for a way to kill a workflow in the Gallery based on an external condition. In this case it would work like this:

> I call the Alteryx connector macro and log the start time in our admin DB. A job in the DB checks whether any started process has failed to finish within the expected time range; if so, it calls the Alteryx API to kill the job.

 

So far I have no clue whether this is possible (I can't see an admin API method for it).

 

Maybe you have other ideas ? 🙂 

Thanks again - I appreciate any further clues.

Piotr

 

 

DavidSta
Alteryx

Hi @PiotrGa,

 

regarding the API endpoint, I'd recommend taking a look at the Ideas section of the community:

https://community.alteryx.com/t5/Alteryx-Server/Cancel-running-workflow-using-Gallery-API-endpoint/i...

There you can contribute to the post to make this topic more visible to Product Management.

 

Another option I can think of is to "force" Alteryx to throw an error, e.g. with a Test tool, if your "long runtime" condition is met. But I'm not confident that a kill command will be sent to the database in this case; I expect the query will still run on the database until it's done.

 

But even if you are restricted in your database, the user who is executing a query must at least be able to see his own query, and with it the current runtime.

The biggest problem in this case might be the kill command, which might not be available to your users. That's why I was thinking about a watchdog process outside of your original workflow, which might have higher access rights.

Maybe you can take a look at this blog post, which is based on a similar problem with Postgres as the database: http://dbadailystuff.com/2012/05/12/pg_terminate_backend-for-non-superusers/

 

Best regards,

David

PiotrGa
6 - Meteoroid

Hi @DavidSta ,

 


Another option I can think of is to "force" Alteryx to throw an error, e.g. with a Test tool, if your "long runtime" condition is met. But I'm not confident that a kill command will be sent to the database in this case; I expect the query will still run on the database until it's done.


This might be a solution. 

I can use the "Cancel running workflow on error" option to stop the process, and in my experience it does finish the query in our Oracle DBs.

 

The problems I have here are:

> How to invoke the error after a given time?

> How to avoid invoking the error if the database returns records first?

 

If this is possible, it would be a real game-changing solution for our governance.


Do you have some ideas?

 

Thanks

Piotr

DavidSta
Alteryx

Hi @PiotrGa,

 

I played a little bit with an iterative macro to check several times whether the timeout has already been reached.

Unfortunately this does not work as expected, as we cannot guarantee that the individual steps of your workflow are executed in parallel. Even the new AMP engine cannot solve this specific part.

 

I'll attach it here in case someone is interested.

 

Right now I don't have any additional ideas for how to tackle this problem if you can't kill the query itself.

 

So my recommendation would be to participate in the Ideas section to get your requirement pushed.

 

Best regards,

David

PiotrGa
6 - Meteoroid

@DavidSta Thanks for your time and effort.
