Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.

How to capture %OutputLog% in a 'normal' workflow

cmcclellan
14 - Magnetar

The client request is to write %OutputLog% somewhere for later reference, BUT with these limitations:

- no Events configured

- no manual "Save As" after the workflow has finished

 

So the workflow is scheduled on Alteryx Server, it runs, processes and finishes, and the %OutputLog% (just like you get in an Event notification) is written to a database somewhere.

 

I don't think this is possible with a normal workflow, but maybe one of the CReW macros? (I haven't started testing them to confirm, because I think the client might reject that idea as well .... they have already rejected email Events.)

4 Replies
dwstada
11 - Bolide

Control Containers output their log as a normal table. Try putting your whole workflow in a Control Container and writing the container's output to a file.

KGT
13 - Pulsar

The reason you can't get it inside the workflow is that the log isn't generated until the workflow completes. From memory, it's produced right in between the last tool and Events...

 

This may provide more info: https://community.alteryx.com/t5/Alteryx-Server-Discussions/Save-workflow-log-in-SQL-database/td-p/1...

 

If by CReW macros you mean the Runner macros, I'd advise against them: they aren't supported on Server and can cause inconsistencies. The only way they would work for this is by offloading the run to a default engine outside of the Gallery, executing it there and then pulling the log afterwards, so you'd effectively have a harness disguising the real workflow's internals, which plays havoc with Server management.

 

You're still fine though, as that log is written to the Server DB and can be pulled from there via the API after the run. If you wanted that to happen as part of the same run, you could set up a workflow on the Server that pulls a log by job ID (or similar), then put a macro at the end of each workflow that kicks off that job straight afterwards, passing it the info it needs to pull the correct log.
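Not something from the thread itself, just a rough sketch of what that post-run API pull might look like in Python, assuming the legacy V1 Gallery API with OAuth 1.0a signing. The base URL, credential placeholders, endpoint path, and response field names are all assumptions to verify against your own Server's API documentation (V1 vs V3 endpoints differ by version):

```python
# Rough sketch: fetch a finished job's log messages from the Gallery API so they
# can be written to a database for later reference. The base URL, credential
# placeholders and response field names are assumptions to verify against your
# Server's own API docs (this uses the legacy V1 endpoints with OAuth 1.0a).
import requests
from requests_oauthlib import OAuth1

GALLERY_API = "https://yourserver/gallery/api"  # assumption: adjust to your Gallery URL
API_KEY = "your_api_key"                        # Gallery API key/secret from your user profile
API_SECRET = "your_api_secret"

auth = OAuth1(API_KEY, client_secret=API_SECRET, signature_method="HMAC-SHA1")

def get_job_messages(job_id: str) -> list:
    """Return the log messages the Server recorded for one job run."""
    resp = requests.get(f"{GALLERY_API}/v1/jobs/{job_id}", auth=auth, timeout=30)
    resp.raise_for_status()
    # The job document typically carries a "messages" list (status, toolId, text);
    # confirm the exact field names in your version's Swagger page.
    return resp.json().get("messages", [])

if __name__ == "__main__":
    for msg in get_job_messages("64f0...example_job_id"):  # hypothetical job ID
        print(msg.get("status"), msg.get("text"))
```

Running this (or an Alteryx workflow making the same call) straight after each job, as described above, would land the log in a database without needing any Events.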

 

Happy to talk it through if you want to give me a call. This account may go defunct in 2 days (like my couple of other accounts), but for the same reason I'll have time on my hands for a few weeks... Kane

jrlindem
11 - Bolide

Agree with @dwstada that using a Control Container is a fun and creative way to get at the output log. It doesn't give you everything, but it is helpful for logging the behavior of the tools inside it. Connect a Browse tool to the output side of the Control Container and you'll see what's made available to you, like this:

[Screenshot: Browse tool attached to the Control Container's log output]


For a more sophisticated approach, @KGT's post is also very useful. I agree that the CReW macros are amazing, but they're not supported on Server, so best to avoid them for this purpose.  -Jay

abacon
12 - Quasar

The Control Container approach is good to know about; that's a great idea.

 

@cmcclellan Another option is to access the underlying MongoDB persistence layer and hit the AS_Results collection, which contains the log file as a blob within the database. This is trickier, so I would go the Control Container route, but it's another option if you want to hit the log files the Server has already stored.

 

See the blog below on how to access the MongoDB from Alteryx - https://community.alteryx.com/t5/Engine-Works/A-Practical-Guide-to-the-Alteryx-Server-Usage-Report/b...
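The blog reads the persistence layer with the MongoDB Input tool inside Designer. Purely as an illustration of the same idea outside Designer, here is a rough pymongo sketch; everything in it (host, port, credentials, database, collection, and especially the field name holding the log blob) is an assumption to verify against your own Server:

```python
# Rough sketch only: read a run's stored log straight from the Server's MongoDB
# persistence layer with pymongo instead of the MongoDB Input tool used in the
# blog post. The host/port, credentials, database, collection and especially the
# field that holds the log blob are assumptions; inspect a real AS_Results
# document on your own Server for the actual names, and treat the persistence
# layer as strictly read-only.
from pymongo import MongoClient

# Assumption: the embedded MongoDB listens on 27018 and the read-only "user"
# account's password comes from the Alteryx System Settings Controller page.
client = MongoClient("yourserver", 27018, username="user",
                     password="controller_mongo_password")
db = client["AlteryxService"]  # assumption: the service database name

def fetch_result_log(result_id):
    """Pull one AS_Results document and return its (hypothetically named) log blob."""
    doc = db["AS_Results"].find_one({"_id": result_id})  # _id is usually an ObjectId
    if doc is None:
        return None
    # "_OutputLog_" is a placeholder field name; depending on the Server version
    # the log may also be stored compressed or in a companion collection.
    return doc.get("_OutputLog_")
```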

 

Bacon
