
Output to file - and use the file as input later in the flow. Being able to cache data?

Hamder83
11 - Bolide

Hi 

I have tried several different methods without achieving what I want.

Basically, I have a group of supplier IDs based on the selected customer:

I want to generate this list of IDs (SubCarrierNumberUB):

Hamder83_0-1667913731770.png

Hamder83_2-1667913755737.png

 

And use it as input to a macro later in my flow, so I only read data related to these subcarriers:

Hamder83_3-1667913810037.png



The issue is that my flow starts reading the subcarrier data before the flow has written the new file.

How can I fix this?



 

 

8 REPLIES
ChrisTX
15 - Aurora

Try using a Block Until Done tool.  Or from the CReW macros, use Parallel Block Until Done.

 

Chris

DanielMS
Alteryx

Hi @Hamder83,

 

You can achieve this by using the Block Until Done tool.

 

This tool blocks the workflow from progressing until all of the steps prior to the tool have completed.

https://community.alteryx.com/t5/Alteryx-Designer-Knowledge-Base/Tool-Mastery-Block-Until-Done/ta-p/...
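
If it helps to see the idea outside Designer, here is a minimal Python sketch of the same ordering, with placeholder file and column names rather than anything from your actual workflow: the ID list is fully written and closed before the downstream step that reads it is allowed to start, which is the sequencing Block Until Done enforces between tool groups.

import csv

# Stage 1: write the full ID list and close the file before anything reads it.
def write_id_list(ids, path="subcarrier_ids.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["SubCarrierNumberUB"])
        writer.writerows([i] for i in ids)

# Stage 2: runs only after write_id_list() has returned, so the file is complete.
def read_id_list(path="subcarrier_ids.csv"):
    with open(path, newline="") as f:
        return [row["SubCarrierNumberUB"] for row in csv.DictReader(f)]

if __name__ == "__main__":
    write_id_list(["1001", "1002", "1003"])  # upstream branch finishes first
    print(read_id_list())                    # downstream read starts afterwards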

 

Kind regards,

 

Dan

 

 

Please like and mark this as a solution if it has solved your problem.

Hamder83
11 - Bolide

Hi @ChrisTX, I have a Block Until Done in the flow.

I have to load some data and process it before I can write the file.

But it starts reading the second part even before the data has been processed.


ChrisTX
15 - Aurora

In your screenshot, you're reading in a YXDB file. Is that the data source being read before another stream, earlier in the workflow, has finished writing to the same YXDB?

 

If so, I've successfully used a Dynamic Input tool with a Parallel Block Until Done to prevent reading a YXDB file that I'm writing to earlier in the workflow.
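
In plain Python terms, the pattern is roughly the sketch below (the file names and the SubCarrierNumberUB column are only assumptions to make it concrete): the finished ID file drives which rows are pulled from the large source, which is roughly the role the Dynamic Input plays here.

import csv

# Hypothetical stand-in for the Dynamic Input step: read the finished ID list
# first, then pull only the rows whose SubCarrierNumberUB appears in that list.
def read_related_rows(ids_path="subcarrier_ids.csv",
                      data_path="subcarrier_data.csv"):
    with open(ids_path, newline="") as f:
        wanted = {row["SubCarrierNumberUB"] for row in csv.DictReader(f)}
    with open(data_path, newline="") as f:
        return [row for row in csv.DictReader(f)
                if row["SubCarrierNumberUB"] in wanted]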

 

Chris

Hamder83
11 - Bolide

Hi @DanielMS & @ChrisTX,

 

Thanks for your replies.

My issue is that I want to be able to cache the data, which I can't do this way (since it's loading many GB). And I don't know which IDs to fetch, and use for the lookup in the second part, until the initial data has loaded.

I can't seem to find a workaround.
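
In rough Python terms, what I'm after is something like the sketch below (all names are placeholders, not my real sources): reuse a cached copy of the filtered second read when it already exists, and only scan the big source when it doesn't.

import csv
import os

CACHE_PATH = "subcarrier_cache.csv"  # hypothetical cache file

def load_related_subcarriers(wanted_ids, heavy_source="big_source.csv"):
    # Reuse the cached copy of the filtered data if it already exists.
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH, newline="") as f:
            return list(csv.DictReader(f))

    # Otherwise scan the large source once, keeping only rows for the wanted IDs...
    with open(heavy_source, newline="") as f:
        rows = [r for r in csv.DictReader(f)
                if r["SubCarrierNumberUB"] in wanted_ids]

    # ...and write them out so the next run can skip the heavy load.
    if rows:
        with open(CACHE_PATH, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)
    return rows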

ChrisTX
15 - Aurora

I don't think I can help, because I don't understand the issue or exactly what you're trying to do. In the workflow screenshot you posted with the red boxes, we can't see exactly what you're trying to point out.

 

I'm guessing the workflow image shows that you're writing out to a YXDB file, then reading in that file in the same workflow, but you don't want to do it this way because the data stream is very large?

 

I'm making assumptions about the actual problem.  Can you specify exactly what you're trying to do, step-by-step?

 

If you are trying to cache a large data stream, what testing have you done with the Block Until Done tool? Is the tool not working right?

 

Maybe the problem is that you're using the CReW macro "CReW_LabelledBlockUntilDone.yxmc"? That macro seems to send all output from output anchor 1, so I'm not sure how that one works.

 

Have you tried using the native Alteryx Block Until Done tool?

 

Chris

 

Hamder83
11 - Bolide

Hi @ChrisTX 

I have attached a small workflow that simulates what I'm trying to do. But I'm not able to cache the second macro in that flow, and I assume it's because of the "dynamic" input.

That's why I tried to write to a file and use the data in the file to load data into the second macro. I tried taking the data directly, but that didn't work either.

My main purpose was to be able to cache the data from macro 2, using input from the flow to load it.

Does that make sense?


ChrisTX
15 - Aurora

Here are a few examples of what I don't understand:

 

You mentioned "But the 2. macro in that flow".

I'm guessing you mean the macro in the red box below?

 

ChrisTX_1-1669634945141.png

 

 

Where in your sample workflow do you need a cache to occur?

Where have you tried to use a Block Until Done tool?

What exactly is not working right in the sample workflow?

Where exactly do you need a specific sequential data flow?

 

The Join tool is a "blocking tool", so the join won't "miss" joined records based on timing of data going into that tool.

 

You mentioned "I tried to either write to a file, and use the data in the file to load data into the 2. macro"

  This sounds like you want to cache the data going *into* macro 2, correct?

 

You mentioned "My main purpose was to be able to cashe the data from macro 2, and using input from the flow to load it."

  This sounds like you want to cache the data going *out* from macro 2.

 

Chris
