
Alteryx Designer Desktop Discussions

SOLVED

How to send limited amount of records at a time through a batch macro

ccagle
9 - Comet

Hi, 

There might already be an answer to this out there, but I haven't found anything super clear. All I want to do is send a certain number of records at a time through a batch macro. 

 

Here's the example - 

1. SQL table 'x' contains 1 million records

2. I want to send this in chunks of only 1K records at a time through my macro until all 1 million go through. 

 

Note that I need to send exactly 1K at a time, though; I can't send them through row by row. The reason is that the macro itself needs to take all 1K records (at a time) and format them into a JSON string for an API post. That's more of a detail, though - the main thing is sending the records through in chunks. 

 

Is there a clear post or answer on how to do this? Note that I figured out how to do this as an iterative macro, but not as a batch macro. 

9 REPLIES
BenMoss
ACE Emeritus
There are quite a few ways you can create these chunks.

I would add a Record ID to your dataset, then use the Formula tool with the expression...

CEIL([RecordID]/1000)

Then you can configure your batch macro to group by this field. I'm not sure whether you are familiar with batch macros or not; please ping us if you need further guidance.
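
For anyone who wants to sanity-check what that formula does before building the macro, here is a minimal Python sketch of the same grouping logic (the record count and batch size are made up for illustration, not from the post above):

import math

# Record IDs as the Record ID tool would assign them: 1, 2, 3, ...
record_ids = range(1, 3501)

# CEIL([RecordID]/1000): records 1-1000 -> batch 1, 1001-2000 -> batch 2, and so on
batches = {}
for rid in record_ids:
    batches.setdefault(math.ceil(rid / 1000), []).append(rid)

for batch, rows in sorted(batches.items()):
    print(f"batch {batch}: {len(rows)} records")   # 1000, 1000, 1000, 500

Every record in a chunk ends up with the same batch number, which is exactly the field the batch macro will group on.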

Ben
ccagle
9 - Comet

Thanks, I think I actually figured this out last night. The only thing to confirm, though - let's say I named the field from your formula 'Groupby'. I'm attaching a picture of the batch macro configuration. Does this look correct? Essentially the field we create here will be both the 'control' and 'input' Group By fields in the configuration. 


ChBurger
6 - Meteoroid

For anyone attempting to figure this out for themselves based on the scant reply above, here is some more information on batch processing.

First, add a Record ID to your data. This will allow us to group the records by batch. For instance, the example provided was CEIL([RecordID]/1000) - this groups the Record IDs in blocks of 1,000, so each group of 1,000 records gets the same batch number going forward. The problem with stopping here is that each record would still run as its own iteration. 

ChBurger_1-1677183936932.png

 

The second thing you need to do is summarize your batch numbers, grouping by the batch ID, so you end up with one row per batch. 

ChBurger_2-1677183962391.png

Lastly, you will need to configure your input Group By and your control Group By, setting your input Group By to the batch field on your input data and your control Group By to the summarized batch field.

ChBurger_3-1677184069222.png

 

A high-level snapshot of the workflow calling the Batch Macro.

ChBurger_0-1677183794966.png

Once this is configured, you will notice that the output returns all items processed for each control group rather than each individual line item.

ChBurger_4-1677184159395.png
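
For readers more comfortable reading logic than screenshots, here is a rough Python sketch of what the configuration above amounts to; process_batch is a hypothetical placeholder for whatever the macro actually does with each chunk, not part of the workflow shown:

import math

def process_batch(rows):
    # Stand-in for the macro's work on one chunk (e.g. build JSON, call an API).
    print(f"processing {len(rows)} records")

records = [{"RecordID": i} for i in range(1, 2501)]

# Formula tool: Batch = CEIL([RecordID]/1000)
for rec in records:
    rec["Batch"] = math.ceil(rec["RecordID"] / 1000)

# Summarize tool: group by Batch, giving one control row per batch
control = sorted({rec["Batch"] for rec in records})

# Batch macro: one run per control row, taking only the matching input group
for batch in control:
    process_batch([rec for rec in records if rec["Batch"] == batch])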

 

Cheers!

 

 

Praverkar
5 - Atom

Hi @ChBurger,

 

What is the tool you have used after the Summarize tool?

 

ChBurger
6 - Meteoroid

Hey @Praverkar,

 

This is a custom batch macro I created. In this case I am using a separate macro to obtain a bearer token, then passing the token and the JSON data into the SAP SuccessFactors upsert query. Since this is configured as a batch macro, it processes each record in the assigned batch number, then outputs all data in that batch. 

By using the Summarize tool you can concatenate the JSON strings. In the concatenate properties you can set the start, end and separator: the [ and ] brackets wrap the JSON entries into an array, and the , separator splits them accordingly.
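
As a rough illustration of that concatenation step (the field names below are made up, not the actual SAP SuccessFactors payload), the Start/Separator/End settings amount to this:

import json

# Each incoming record already formatted as its own JSON object string
json_strings = [json.dumps({"id": i, "name": f"record {i}"}) for i in range(1, 4)]

# Summarize tool concatenate: Start = "[", Separator = ",", End = "]"
payload = "[" + ",".join(json_strings) + "]"

print(payload)              # one JSON array string, ready for the API post
print(json.loads(payload))  # parses cleanly, confirming the array is valid JSON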

 

ChBurger_3-1679055996691.png

 

ChBurger_0-1679055254572.png

ChBurger_1-1679055687261.png

 

Cheers,

Chad

 

Praverkar
5 - Atom

Got it. Thank you so much @ChBurger

Cece_01
6 - Meteoroid

Hi, I have a question about batch macros. I have billions of records in my sales dataset, and the query below is what I am running - how do I use batch macros to load the data by region and store, please?

select s.[Country Key],
       s.[Currency Key],
       s.[Calendar Day],
       s.[Generic Article Key],
       s.[Product Code],
       s.[Sales Office Key],
       s.[Region Code],
       s.[Plant Key],
       s.[Store Number],
       s.[Sales Quantity]

from Interim_Reporting.PRD.vwStoreSales_Combined_2024 as s

Cece_01
6 - Meteoroid

I also have another query

I have this table, which contains all the store details for each region (65,820 records):

select [Region Code], [Region Name], [Store Number], [Store ID (dwh)], [Is City Store]
from NIT_Datasets.dbo.StoreDetails

and I have this table, which contains only the distinct product codes (17,861 records):

select distinct [Product Code]
from Interim_Reporting.PRD.vwStoreSales_Combined_2024

How do I append every product code to every store using Alteryx? What is the fastest way to do this, as it is taking so much time when I use the Append Fields tool?
