
Alteryx Designer Desktop Discussions

SOLVED

Getting Salesforce Output connector Error: Error transferring data: Failure when receiving

Rohit_Bajaj
9 - Comet

Hi All,

 

I am getting the below error when trying to run the Salesforce Output connector.

 

Salesforce Output: Error transferring data: Failure when receiving data from the peer.

 

Background: we had made some changes to the Salesforce Output Connector, specifically in the Download tool inside the 'add batches to the job' sub-module, selecting 'Data' from the 'Headers' section of the Download tool, propagating that forward to the 'retrieve batch results' sub-module, followed by some simple transformations to capture error-record details, along with the error message, in a file.

 

While using the changed connector I am running into the above-mentioned error.

 

I am not running into the above error -

 

1) If I use the original Salesforce output connector or the deprecated Salesforce output connector. This implies the issue has something to do with the changed/modified Salesforce output connector.

2) If I use the changed Salesforce output connector and the number of records is small (fewer than roughly 152).

 

This might point towards -

 

1) Some kind of SFDC governor limits.

2) Data volume and related timeouts.

3) Some erroneous data etc.

 

In earlier posts on this topic, I could see people being redirected to have a look at their LAN connection settings.

 

But since the error does not occur with the original Salesforce output connector, the deprecated version of the connector, or a smaller number of records, I believe it is something other than LAN settings.

 

In case anyone has encountered such errors, or knows of any related use case, please guide us in this regard.

 

Thanks,

Rohit

5 REPLIES
Rohit_Bajaj
9 - Comet

On further investigation, running with different sets of data (<100 records), I am getting the following errors.

 

When the 'Download' tool inside 'add batches to the job' is left at its default String (UTF-8) encoding -

 

            Salesforce Output (30)    Tool #79: Invalid document structure at Line:1 and Column:1 at row # 1

            Salesforce Output (30)    Tool #94: Error trying to add batch(es) to the job. HTTP/1.1 413 Request Entity Too Large: head ... Data from server: Error: Request Entity Too Large: head

 

When the 'Download' tool inside 'add batches to the job' is changed to Output as Blob -

 

            Salesforce Output (30)    Tool #94: Tool #22: Parse Error at char(0): Type mismatch in operator +.

            Salesforce Output (30)    Tool #79: Invalid document structure at Line:1 and Column:1 at row # 1

 

Tool 94 is Salesforce.errorCheck inside 'add batches to the job' sub module.

Tool 79 is XML Parse inside 'retrieve batch results' sub module.

 

Thanks,

Rohit

Rohit_Bajaj
9 - Comet

On further investigation, it was discovered that the 'Download' tool inside 'add batches to the job' is returning 'DownloadData' of 'Error: Request Entity Too Large: head' and 'DownloadHeaders' of 'HTTP/1.1 413 Request Entity Too Large: head', hence the error

Tool #94: Error trying to add batch(es) to the job. HTTP/1.1 413 Request Entity Too Large: head ... Data from server: Error: Request Entity Too Large: head

When this 'DownloadData' reaches the XML Parse tool, it cannot find a valid XML structure, hence the error

Tool #79: Invalid document structure at Line:1 and Column:1 at row # 1.
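The misleading parse error above comes from feeding an HTTP error body into an XML parser. One way to guard against this is to check the HTTP status in 'DownloadHeaders' before parsing 'DownloadData'. A minimal Python sketch (the field names mirror the Download tool's outputs, but the helper itself is hypothetical, not part of the connector):

```python
import xml.etree.ElementTree as ET

def parse_batch_result(download_headers: str, download_data: str):
    """Parse a batch result only if the HTTP status is 2xx.

    Illustrative sketch: 'download_headers'/'download_data' stand in for the
    Download tool's DownloadHeaders/DownloadData fields.
    """
    # The first header line looks like "HTTP/1.1 413 Request Entity Too Large"
    status_line = download_headers.splitlines()[0] if download_headers else ""
    parts = status_line.split()
    status = int(parts[1]) if len(parts) > 1 and parts[1].isdigit() else 0

    if not 200 <= status < 300:
        # Surface the server error instead of parsing the error text as XML
        raise RuntimeError(f"Server returned {status_line!r}: {download_data}")

    return ET.fromstring(download_data)
```

With this check, a 413 response raises a clear error describing the server response instead of the confusing "Invalid document structure" message from the XML parser.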

 

I even tried changing the data type to Blob inside the 'Download' tool, but it does not help.

 

What could be a possible solution to change this behavior?

Or if the requirement can be met in any other way, that would also help.

NeilR
Alteryx Alumni (Retired)

The Salesforce BULK API has batch size limits. In the underlying Salesforce Output macro, there is a sub-macro called SalesforceOutput.convertToCsvAndBatch that respects these limits when chunking the data before sending to the API. I suspect that your customizations have overridden this logic.
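The chunking NeilR describes can be sketched as follows. The limits used here (10,000 records and 10 MB per batch) are the documented Salesforce Bulk API caps; the function is an illustrative approximation, not the actual SalesforceOutput.convertToCsvAndBatch macro logic:

```python
def chunk_records(csv_rows, max_records=10_000, max_bytes=10 * 1024 * 1024):
    """Split CSV rows into batches that stay within per-batch limits.

    Assumed limits (10,000 records / 10 MB per batch) follow the documented
    Salesforce Bulk API caps; this is a sketch, not the connector's code.
    """
    batch, batch_size = [], 0
    for row in csv_rows:
        row_bytes = len(row.encode("utf-8")) + 1  # +1 for the newline
        # Start a new batch when adding this row would exceed either limit
        if batch and (len(batch) >= max_records or batch_size + row_bytes > max_bytes):
            yield batch
            batch, batch_size = [], 0
        batch.append(row)
        batch_size += row_bytes
    if batch:
        yield batch
```

Bypassing or overriding logic like this would explain why small record counts succeed while larger ones hit the 413 Request Entity Too Large response.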

Rohit_Bajaj
9 - Comet

This has now been solved: the changes made to the tools (like the Download tool) inside the Salesforce Output connector have been rolled back.

The desired logic was then built without modifying any of the existing logic inside the Salesforce Output connector.

Ufuk
5 - Atom

Hi,

  

I noticed that after using the Salesforce input or output connectors a couple of times, I always get this error: "Failure when receiving data from the peer". Another thing I noticed is that when I reboot my laptop, I do not get the error for some time.

  

I would like to share my workaround here.

 

My workaround is to renew my IP address. When I renew my IP, I do not get the "Failure when receiving data from the peer" error.

 

To renew my IP, I was simply rebooting my laptop. Then I realized I could do it using the good old Windows command prompt, by simply running the following two commands in order:

  

ipconfig /release

ipconfig /renew

 

Later on, I put the above lines in a text file and saved it as renewIP.bat.

 

I keep this renewIP.bat file on my desktop; when I get the "Failure when receiving data from the peer" error from Alteryx, I simply run renewIP.bat and continue my work in Alteryx.

 
