Is there any way to define the batch (chunk) size in the Salesforce Output tool?
I read the documentation here: https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_concepts_lim... and it states: "Batches are processed in chunks. The chunk size depends on the API version. In API version 20.0 and earlier, the chunk size is 100 records. In API version 21.0 and later, the chunk size is 200 records." I would like to define the batch size if possible. I assume I will have to set it in the job properties (and so create a new Salesforce Output tool), but I am not sure of the exact syntax, or whether it will work for update/insert/delete operations.
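For anyone unfamiliar with what the chunk size controls: it determines how many records are sent per batch. The general idea of splitting a record set into fixed-size batches can be sketched as follows (a plain Python illustration of the concept, not Alteryx or Salesforce code):

```python
from typing import Iterator, List


def chunk_records(records: List[dict], batch_size: int) -> Iterator[List[dict]]:
    """Yield successive batches of at most batch_size records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]


# Example: 7 records with a batch size of 5 produce batches of 5 and 2.
records = [{"Id": i} for i in range(7)]
batches = list(chunk_records(records, 5))
```

In the Salesforce Output tool itself there is no exposed batch-size setting, which is what motivates the batch-macro workaround discussed below in this thread.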
Thanks in advance for any help people can give.
Hi @NeilR,
I have an issue with the Salesforce Output tool: our Salesforce instance runs a procedure on opportunities, so I need to limit the batch size to 5 to run a mass update on opportunities. I currently use Data Loader to do this; the Salesforce Output tool's batch size of 200 always gives me an error. Could you share the workflow you built? Thanks.
@NeilR Thanks, I tried it, but I'm still confused about the control parameter configuration; something doesn't seem quite right. Could you take a look? Thank you.
@zhuyyu it doesn't look like the macro "salesforce batchsize.yxmc" made it into your attached package. Could you attach that as well?
@NeilR Sorry. Here you go. Thanks again.
@zhuyyu I got it working with two minor changes. I also added a Summarize tool within the macro just to show that it's working; you'd replace the Summarize tool (and everything after it) with the Salesforce Output tool. See attached.
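For readers following along: the batch-macro pattern used here boils down to tagging each record with a batch number (the control parameter) and running each group through the output step in turn. A rough sketch of that grouping logic, with illustrative names only (in Alteryx you would use a RecordID tool plus a Formula tool, not Python):

```python
def batch_number(row_index: int, batch_size: int) -> int:
    """Compute a batch number for a 1-based row index, analogous to an
    Alteryx Formula tool expression like FLOOR(([RecordID] - 1) / batch_size).
    This value feeds the batch macro's control parameter."""
    return (row_index - 1) // batch_size


# With batch size 5, rows 1-5 land in batch 0 and rows 6-10 in batch 1.
groups = [batch_number(i, 5) for i in range(1, 11)]
```

Each distinct batch number then triggers one iteration of the batch macro, so only that many records reach Salesforce per call.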
@NeilR Thanks a lot, it works well.