
Alteryx Designer Desktop Discussions

SOLVED

Block Until Done not working as expected

GauravRawal
8 - Asteroid

Hi,

 

I have a requirement to read a bunch of .csv files at once, process them for some calculations, and then dump the original rows into another source table. For this, I tried to use the Block Until Done tool as shown below, where I expected the flow from output 1 to complete first and the data dump to happen from output 2. However, I observed that even when there is an error in the processing part, the data still reaches the source table, which is not correct.

 

Am I missing something here, or is there some other way to achieve this?

 

[Screenshot: Block Until Done.PNG]

6 REPLIES
Joe_Mako
12 - Quasar

How about Parallel Block Until Done from the CReW Macro Pack: http://www.chaosreignswithin.com/p/macros.html

 

[Screenshot: pbud.png]


A way to think of the Parallel Block Until Done tool is: the stream into input 1 must be fully complete (and no stopping errors) before the stream to output 2 is processed.
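If it helps to see that rule outside of the canvas, here is a rough Python sketch of the same idea; the function names and messages are invented purely for illustration.

```python
# Minimal sketch of the "Parallel Block Until Done" semantics described above.
# The function names and messages are invented for illustration.

def run_stream_1():
    """Stand-in for everything feeding input 1 (read CSVs, calculate, write the final table)."""
    print("stream 1: processing complete")

def run_stream_2():
    """Stand-in for the branch on output 2 (dump the original rows to the source table)."""
    print("stream 2: writing the source table")

try:
    run_stream_1()            # must finish fully...
except Exception as err:      # ...with no stopping errors...
    print(f"stream 1 failed, so stream 2 never runs: {err}")
else:
    run_stream_2()            # ...before anything on output 2 is processed
```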

 

 

GauravRawal
8 - Asteroid

Hi Joe,

 

That worked perfectly. But why didn't the Block Until Done tool help in the case above? My original thought was that this would be doable using product tools, without any external macro.

GauravRawal
8 - Asteroid

Also, upon some more testing I found a problem with how this works.

 

I was trying to move the source .csv files to another folder after processing the data (after writing to a snapshot table, but before writing the data to the source table). I am doing this via a .bat file in the Events section of the workflow. So the flow in a nutshell is:

 

Read data from CSV files -- Process the data -- Write to a Final Table -- Move the files to some other archive folder -- Write the source data to a source table.
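As a rough sketch of that intended sequence in Python (the folder paths, helper names and table writes below are hypothetical, not taken from the actual workflow), the point is that a failure at the move step should stop the final write:

```python
# Hypothetical sketch of the intended sequence; the paths, helper
# functions and table writes are illustrative only.
import glob
import shutil
from pathlib import Path

SOURCE_DIR = Path("C:/data/incoming")   # assumed location of the .csv files
ARCHIVE_DIR = Path("C:/data/archive")   # assumed archive folder

def read_and_process(files):
    print(f"processing {len(files)} csv file(s)")
    return files                        # stand-in for the calculated rows

def write_final_table(rows):
    print("writing the final (snapshot) table")

def write_source_table(rows):
    print("writing the source table")

def main():
    csv_files = sorted(glob.glob(str(SOURCE_DIR / "*.csv")))
    rows = read_and_process(csv_files)
    write_final_table(rows)

    # Equivalent of the .bat event: archive the inputs. If this step
    # raises, execution stops here and the source table is never written.
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    for f in csv_files:
        shutil.move(f, str(ARCHIVE_DIR / Path(f).name))

    write_source_table(rows)            # only reached if the move succeeded

if __name__ == "__main__":
    main()
```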

 

But I found that if this batch file gives an error (which is basically the end of stream 1), the data is still being written to the source table. Is this happening because it is an event, occurring on an external file, and Alteryx does not consider it an error within the flow?

 

Below is a screenshot of the error:

 

[Screenshot: CopyBatchError.PNG]

Joe_Mako
12 - Quasar

Can you move the step that has the error so that it happens before the stream goes into input 1?

 

For example, in the following, the error prevents the output file from being written:

[Screenshot: error block.png]

 

The stream into input 1 must be fully complete (and no stopping errors) before the stream to output 2 is processed.

 

Stopping errors that happen after output 1 will not prevent the stream of output 2 from running. Only stopping errors that happen before input 1 will prevent output 2 from being run.
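In rough Python terms (all names invented purely for illustration), the behaviour described above looks like this: an error raised upstream of input 1 halts everything, but an error downstream of output 1 never reaches the branch on output 2.

```python
# Invented names, for illustration only.

def upstream():
    """Everything before input 1; a stopping error here blocks both streams."""
    return ["row 1", "row 2"]

def downstream_of_output_1(rows):
    """The archive/move step as currently placed; output 2 never hears about its error."""
    raise RuntimeError("move to archive failed")

def downstream_of_output_2(rows):
    print("source table written anyway:", rows)

rows = upstream()                      # a failure here would stop the whole run
try:
    downstream_of_output_1(rows)       # fails after output 1...
except RuntimeError as err:
    print("error after output 1:", err)
downstream_of_output_2(rows)           # ...yet output 2 still runs, as observed
```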

KaneG
Alteryx Alumni (Retired)

There are a couple of things at play here. 

  1. The Block Until Done tool will not wait for stream 1 to complete before processing stream 2; it will just wait until it has finished feeding all the data out of stream 1 before starting to feed data out of stream 2. In general, tools in Alteryx know about upstream tools but not necessarily downstream tools.
  2. Hence, if there is an error after one of the Block Until Done tools, the BUD tool doesn't care... or even know about it...

That might help explain why you have to move the processing upstream, and why the parallel BUD tool is still such a valuable add-on.
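A compact way to picture the difference between the two tools, as a purely illustrative Python sketch rather than how either tool is actually implemented:

```python
# Purely illustrative; not how the tools are implemented internally.

def block_until_done(rows, output_1, output_2):
    """Plain BUD: it only controls the order in which data is RELEASED.
    All rows go out of output 1 first, then out of output 2, but it never
    hears back about errors further downstream of output 1."""
    for row in rows:
        output_1(row)
    for row in rows:
        output_2(row)

def parallel_block_until_done(rows, stream_1, stream_2):
    """CReW Parallel BUD, conceptually: the whole of stream 1 must complete
    without a stopping error before stream 2 is processed at all."""
    stream_1(rows)      # if this raises, stream_2 below never runs
    stream_2(rows)
```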

SteveKnapper
8 - Asteroid

Hello, I am aware that I am replying to an old post, but I have the same issue currently.

I am not using the CReW macros, as I schedule workflows via our Gallery.

 

What I have noticed in revisiting the BUD tool example is that there is a Message tool with it, which has variable settings.

Can anyone provide an example of how to use the BUD tool? I am thinking that I write the records from step 1 to SQL Server.

Could I then have something in the Message tool on the second step to check that step 1 has written records before continuing with step 2?

 

cheers

Steve
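A minimal sketch of the kind of pre-write check being asked for here, for example from a Python tool or an external script run between the two steps. The connection string, table name and pyodbc usage are assumptions to adapt to your environment, not a prescription.

```python
# Assumed connection details and table name; adjust for your environment.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
STEP_1_TABLE = "dbo.Step1Output"   # hypothetical table written by step 1

def step_1_wrote_records() -> bool:
    """Return True only if step 1 actually landed rows in SQL Server."""
    with pyodbc.connect(CONN_STR) as conn:
        count = conn.execute(f"SELECT COUNT(*) FROM {STEP_1_TABLE}").fetchval()
    return count > 0

if step_1_wrote_records():
    print("step 1 wrote records - safe to continue with step 2")
else:
    raise SystemExit("step 1 wrote no records - halting before step 2")
```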

 
