Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.

CASS Tool Error: Timed out in OutboundNamedPipe::WriteFile after 5000 milliseconds

gavinott1
6 - Meteoroid

Hello,

 

I have a workflow that geocodes ~45 million records using the CASS tool and US Geocoder tool.  However, the CASS tool keeps hitting the following error:


"Timed out in OutboundNamedPipe::WriteFile after 5000 milliseconds".

 

Originally, I was running all the records into the CASS tool in one workflow, and the error would pop up after ~100,000-300,000 records had run through.  I then put the tool in a batch macro that sends it 50,000 records at a time.  This seems to work better, but I'm still getting the error.

 

I'm not exactly sure what the error means or how I can fix it, but maybe increasing the timeout limit would work, if that's possible.  Has anyone dealt with this problem before?
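The batch-macro workaround above amounts to splitting the record set into fixed-size chunks and running each through CASS separately. A minimal illustrative sketch of that chunking logic (plain Python, not Alteryx-specific; the names and sizes here are just for illustration):

```python
# Illustrative sketch of the batch workaround: split a large record set
# into fixed-size chunks, as the batch macro does with 50,000 records
# per CASS pass. `batches` is a hypothetical helper, not an Alteryx API.

def batches(records, size=50_000):
    """Yield successive fixed-size chunks of a record sequence."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Example: 223,000 records -> four full batches plus a 23,000-record tail
chunks = list(batches(range(223_000)))
print([len(c) for c in chunks])  # [50000, 50000, 50000, 50000, 23000]
```

Smaller chunks reduce how much data is pushed through the named pipe in one pass, which is presumably why the macro approach fails less often.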

 

Thank you,

Gavin Ott

11 Replies
RishiK
Alteryx
gavinott1
6 - Meteoroid

Thanks for your response - I already explored that, but none of the solutions worked.

paul_whitley
5 - Atom

Hey, were you ever able to solve this? I am having the same issue.

TheSAguy
7 - Meteor

I'm getting the same issue.

Getting this error trying to CASS 223K records.

 

I already have the data in an Alteryx database and did a Data Cleanse.

 

 

joe_normile
5 - Atom

I am having the same issue and have tried breaking my files into fewer records, to no avail.  Alteryx support was no help: their answer was that my computer maxes out at 50k rows, so I showed them a screenshot of my CPU usage and of CASS successfully processing more than 50k rows.  There must be something about certain records that causes this error?
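If specific records are the trigger, one way to pin them down is bisection: run a batch, and if it fails, split it in half and retry each half until the failing record(s) are isolated. A hedged sketch of that idea, where `process` is a hypothetical stand-in for any step (such as a CASS run) that raises on a bad record:

```python
# Hedged sketch of binary-search isolation. `process` is a hypothetical
# callable standing in for the failing step; it raises on a bad record.

def find_bad_records(records, process):
    """Return the records on which `process` raises, found via bisection."""
    try:
        process(records)
        return []                    # whole batch succeeds: nothing bad here
    except Exception:
        if len(records) == 1:
            return list(records)     # isolated a single failing record
        mid = len(records) // 2
        return (find_bad_records(records[:mid], process) +
                find_bad_records(records[mid:], process))

# Example with a stand-in processor that rejects empty addresses:
def process(batch):
    if any(addr == "" for addr in batch):
        raise ValueError("bad record")

print(find_bad_records(["1 Main St", "", "2 Oak Ave"], process))  # ['']
```

For N records this takes on the order of log2(N) runs per bad record rather than N, which matters when each run is a slow workflow execution.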

joe_normile
5 - Atom

I have converted the addresses to CSV, XLSX, and YXDB, and used various row limits, and I always receive the same error.  I have enabled and disabled the AMP Engine.  I have increased memory to 30 GB in the runtime settings.  I have tried tiling to get smaller tables to run through CASS without converting to an output file, and I still get the same error.

TheSAguy
7 - Meteor

I don't get the 50K limit... I've always CASSed well into the millions without any issue.

For some reason I'm just getting an error today. I also tried the conversion approach - CSV, XLSX, and YXDB - without any luck.

 

Very frustrating. Always happens on a rush job!!

 

EDIT:

So I restarted my system, and it went from outputting ~3K records to outputting 64K records.

This is from an input of 223K.

 

joe_normile
5 - Atom
[screenshot of a workflow showing a "Bad Record:" error]

TheSAguy
7 - Meteor

I'm getting that same error message about "Bad Record:....".

That's new - I've never seen it on my other jobs.

 

From your workflow, it seems that one record is failing.

 

For me, it seems something in the record(s) is causing it, even though the data is in YXDB format and has been Data Cleansed.
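One hypothetical pre-check (an assumption on my part, not a documented CASS fix) is to flag address values containing control or non-ASCII characters before they reach the tool, since stray bytes in individual records are a plausible culprit that a standard Data Cleanse pass might miss:

```python
# Hypothetical pre-check, not a documented CASS fix: flag address values
# containing characters outside printable ASCII (control bytes, accented
# characters, etc.) so suspect records can be reviewed before processing.

import re

SUSPECT = re.compile(r"[^\x20-\x7e]")  # anything outside printable ASCII

def suspect_rows(rows, field="address"):
    """Return (index, value) pairs whose field contains suspect characters."""
    return [(i, r[field]) for i, r in enumerate(rows)
            if SUSPECT.search(r[field])]

rows = [{"address": "100 Main St"},
        {"address": "200 Oak\x00Ave"},       # embedded NUL byte
        {"address": "300 Calle Ni\u00f1o"}]  # non-ASCII character
print([i for i, _ in suspect_rows(rows)])  # [1, 2]
```

Whether such characters are actually what trips the "Bad Record" error here is unconfirmed; this just narrows the haystack when bisecting a large file by hand is impractical.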
