
SOLVED

Define Batch (Chunk) Size in Salesforce Output Tool

mpurdy
8 - Asteroid

 

Is there any way to define the batch (chunk) size in the Salesforce Output tool?

 

I read the documentation here: https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_concepts_lim... It states: "Batches are processed in chunks. The chunk size depends on the API version. In API version 20.0 and earlier, the chunk size is 100 records. In API version 21.0 and later, the chunk size is 200 records." I would like to define the batch size myself if possible. I assume I would have to set it in the job properties (and so create a new Salesforce Output tool), but I'm not sure of the exact syntax, or whether it would work for update/insert/delete operations.

 

Thanks in advance for any help people can give.

16 REPLIES
NeilR
Alteryx Alumni (Retired)

Hi @mpurdy,

 

The limits you're referring to (100/200 records) are the chunk sizes that SFDC itself breaks batches into for processing. The chunk sizes that SFDC expects from the API user are in the preceding section of that page (10 MB, 10,000 records, etc.). The Alteryx (version 10+) Salesforce Output tool contains logic to take care of this batching. Are you having issues with the tool?

mpurdy
8 - Asteroid

I'm not having an issue with the download tool; I'm having an issue inserting 200 records into a particular Salesforce object. I am getting the error "CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY: Trigger: System.LimitException: Too many future calls", so I need to somehow reduce the number of records in each chunk. I have tested with a different tool (Salesforce/Apex Data Loader), and when I reduce the batch size the records load successfully. Thanks

NeilR
Alteryx Alumni (Retired)

You can wrap the SFDC Output tool in a batch macro. Inside the macro, logic assigns each record a batch ID, making sure that no batch contains more than X records; the control parameter updates a Filter tool that passes one batch at a time through to the SFDC Output tool. Outside the batch macro, you'd feed the list of batch IDs you want to process into the macro's control parameter input. If you'd like, I can whip it up for you, but it might be fun for you to try first :)
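
If it helps to see the shape of that logic outside of Alteryx, here's a rough Python sketch of the same idea. This is purely an illustration, not the macro itself; send_to_salesforce is a hypothetical stand-in for whatever actually writes a batch, and 50 is just an example size:

    # Sketch of the batching logic described above (illustration only, not Alteryx code).
    # `records` is a list of dicts; send_to_salesforce is a hypothetical stand-in
    # for the step that actually writes one batch (the SFDC Output tool in the macro).

    BATCH_SIZE = 50  # pick a size small enough to avoid the trigger limit errors

    def assign_batch_ids(records, batch_size=BATCH_SIZE):
        # Give every record a batch id so that no batch holds more than batch_size records.
        return [dict(rec, batch_id=i // batch_size) for i, rec in enumerate(records)]

    def send_to_salesforce(batch):
        # Hypothetical placeholder; in the macro this is the Salesforce Output tool.
        print(f"writing {len(batch)} records")

    def process_in_batches(records, batch_size=BATCH_SIZE):
        tagged = assign_batch_ids(records, batch_size)
        for batch_id in sorted({rec["batch_id"] for rec in tagged}):  # list of batches = control parameter input
            batch = [rec for rec in tagged if rec["batch_id"] == batch_id]  # Filter tool equivalent
            send_to_salesforce(batch)

Calling process_in_batches(my_records, 50) would push the rows through 50 at a time, which is the same effect the Filter tool inside the batch macro gives you.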

mpurdy
8 - Asteroid

Thanks for the suggestion, Neil. I'm actually already using this method for my Salesforce (Input) queries (to build the WHERE clause), so I can easily replicate the logic for the Salesforce Output and include it in a batch macro.
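
For anyone following along who wants the Input-side version, the idea is the same: split your key list into chunks and build one WHERE clause per chunk. A rough Python sketch, where the field name Id and the chunk size of 200 are just example choices:

    # Sketch of building one WHERE clause per batch for the query/Input side (illustration only).
    # `ids` is a list of Salesforce record IDs; "Id" and the 200-record chunk are example choices.

    def where_clauses(ids, batch_size=200):
        clauses = []
        for start in range(0, len(ids), batch_size):
            chunk = ids[start:start + batch_size]
            quoted = ", ".join(f"'{record_id}'" for record_id in chunk)
            clauses.append(f"Id IN ({quoted})")
        return clauses

    # Each clause then drives one query, e.g.
    # SELECT Id, Name FROM Account WHERE Id IN ('001...', '001...')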

hamdylie
7 - Meteor

Hi NeilR,

 

Does this mean the Salesforce Output connector is bound by the Bulk API limits, e.g. a maximum of 10,000 batches per rolling 24-hour period and a maximum of 200 records per batch, so technically we can only upload/insert 10,000 x 200 = 2,000,000 records per rolling 24 hours?

 

Thanks

mpurdy
8 - Asteroid

I'm not an expert, but I believe the 200-record chunk size is an internal Salesforce processing size. I think Alteryx actually passes 2,000 records in a single API call, which Salesforce then processes as 10 x 200-record chunks, so I believe you can actually do 10,000 x 2,000 = 20,000,000 records. Happy for someone to correct me if they are more familiar with Salesforce limits and the Bulk API.
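
To make the arithmetic explicit (these figures are just the ones discussed in this thread, so please check the current Bulk API limits for your org):

    # Back-of-the-envelope daily ceilings under the figures discussed above
    # (assumptions from this thread, not official limits -- check your org's Bulk API limits).

    batches_per_day = 10_000

    per_day_if_200_per_batch  = batches_per_day * 200     # 2,000,000 records
    per_day_if_2000_per_batch = batches_per_day * 2_000   # 20,000,000 records

    print(per_day_if_200_per_batch, per_day_if_2000_per_batch)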

calebdugger
7 - Meteor

Is there any chance you could do an example of one of these? I haven't done any macros yet. Thank you!

jkruger
6 - Meteoroid

Would love to see how you built out the workflow.

We're trying to work out how to batch the records that run through our workflows because of memory issues when updating objects in Salesforce.

mpurdy
8 - Asteroid

Hi @jkruger, sorry for the delay getting back to you. I've attached 2 pictures showing the Alteryx Workflow and the Alteryx Macro with explanations. Let me know if you have any questions. 
