Is there any way to define the batch (chunk) size in the Salesforce Output tool?
I read the documentation here: https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_concepts_lim... and it states: "Batches are processed in chunks. The chunk size depends on the API version. In API version 20.0 and earlier, the chunk size is 100 records. In API version 21.0 and later, the chunk size is 200 records." I would like to be able to define the batch size if possible. I assume I will have to set it in the job properties (and so create a new Salesforce Output tool), but I am not sure of the exact syntax, or whether it will work for update/insert/delete operations.
The limits you're referring to (100/200 records) are the chunk sizes that SFDC breaks batches into for processing. The batch sizes that SFDC expects from the API user are in the preceding section (10 MB, 10,000 records, etc.). The Alteryx (version 10+) Salesforce Output tool contains logic to take care of this batching. Are you having issues with the tool?
I'm not having an issue with the download tool; I'm having an issue inserting 200 records into a particular Salesforce object. I am getting the error "CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:Trigger: System.LimitException: Too many future calls", so I need to somehow reduce the number of records per chunk. I tested with a different tool (the Salesforce/Apex Data Loader), and when I reduce the batch size there, the records load successfully. Thanks
You can wrap the SFDC Output tool in a batch macro. The macro would contain logic to assign each record a batch id number, ensuring that no batch contains more than X records. The control parameter would update a Filter tool that filters out a single batch at a time and passes it to the SFDC Output tool. Then, outside the batch macro, you'd pass the list of batches you want to process into the macro's control parameter input. If you'd like I can whip it up for you, but it might be fun for you to try first :)
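For anyone curious what the macro is doing under the hood, here is a rough Python sketch of the same idea (this is just an illustration of the batching logic, not anything Alteryx or Salesforce actually exposes; `send` is a hypothetical stand-in for whatever pushes a batch to SFDC):

```python
def assign_batch_ids(records, max_batch_size):
    """Tag each record with a batch id so no batch exceeds max_batch_size."""
    return [(i // max_batch_size, rec) for i, rec in enumerate(records)]

def process_in_batches(records, max_batch_size, send):
    """Filter out one batch at a time and pass it to `send` (the stand-in
    for the SFDC Output tool inside the batch macro)."""
    tagged = assign_batch_ids(records, max_batch_size)
    batch_ids = sorted({bid for bid, _ in tagged})
    for bid in batch_ids:
        batch = [rec for b, rec in tagged if b == bid]
        send(batch)
```

The Filter tool in the macro plays the role of the list comprehension that selects a single batch id per iteration.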
Thanks for the suggestion Neil. I am actually using this method for my Salesforce (Input) queries (to create the WHERE clause) so I can easily replicate the logic for the Salesforce Output and include it in a batch macro.
Does this mean the Salesforce Output connector is limited by the Bulk API limits, e.g. a max of 10,000 batches per 24 hours and a max of 200 records per batch? Technically, that would mean we can only upload/insert 10,000 x 200 = 2,000,000 records per rolling 24-hour period.
I'm not an expert, but I believe the 200-record chunk size is an internal Salesforce processing size. I think Alteryx will actually be passing up to 2,000 records in a single API call, which Salesforce then processes internally as 10 x 200 chunks. So I believe you can actually do 10,000 x 2,000 = 20,000,000 records. Happy to be corrected by someone more familiar with Salesforce limits and the Bulk API.
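To make the two readings of the limits concrete, here is the arithmetic spelled out (the numbers come from this thread and should be checked against the current Salesforce limits documentation):

```python
def max_records_per_day(batches_per_day, records_per_batch):
    """Daily ceiling = batch quota x records allowed per batch."""
    return batches_per_day * records_per_batch

# Reading 1: 200 is the per-batch limit (treating the internal chunk
# size as the batch size) -> 2,000,000 records/day.
pessimistic = max_records_per_day(10_000, 200)

# Reading 2: each API batch carries 2,000 records, which Salesforce
# processes internally as 10 chunks of 200 -> 20,000,000 records/day.
optimistic = max_records_per_day(10_000, 2_000)
```

Either way, the per-24-hour batch quota (not the chunk size) is what ultimately caps daily throughput.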