I have a workflow that pulls data from a Google Sheet, transforms it, and then outputs it to a different Google Sheet. The workflow was running with no errors for weeks, but now it is throwing an error and not exporting all the data. The data set is only about 20,000 records and I have a few other nearly identical workflows with smaller data sets (5,000-10,000 records) that run fine. From what I can tell looking at Workflow Messages, it splits the output into chunks or "payloads" and exports them in sequence. When I get this error, one or more of the payloads don't make it through and the corresponding rows in the Sheet will be empty, leaving large "gaps" in the data. Here is a screenshot of those messages:
In this case, it looks like payload 4 didn't make it.
HTTP/1.1 400 Bad Request
Content-Type: text/html; charset=UTF-8
X-Robots-Tag: noindex, nofollow, nosnippet
Content-Encoding: gzip
x-chromium-appcache-fallback-override: disallow-fallback
P3P: CP="This is not a P3P policy! See g.co/p3phelp for more info."
P3P: CP="This is not a P3P policy! See g.co/p3phelp for more info."
Date: Wed, 20 Feb 2019 00:06:52 GMT
Expires: Wed, 20 Feb 2019 00:06:52 GMT
Cache-Control: private, max-age=0
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Server: GSE
Set-Cookie: NID=160=RwJSLKzpSYKymkG7iieFBC-gBf-Jn4TBa5XQtVSVtnbhk_i1XM_BMziWfcceKNlosXYPx3Q5a2hZNyPe3noufdN7j5PqReMmdib7Fmo9HqcSPRo_qQmE1v8Ek5-0ifZ980zC7ex_oho3e9Vtnrxq1o9xsg-6QKmbtTjt7kJKt2c;Domain=.google.com;Path=/;Expires=Thu, 22-Aug-2019 00:06:52 GMT;HttpOnly
Set-Cookie: NID=160=PJy9Fq_aRRpxJ1l27kAYK_OuievSJJqJFCtDI_AFF2BfAPwtImJEJajthSTnLlU7PwI8MouZ0yXi7Ie7C4AlfQjYGLoT1ilktF99FOz0MHC6PNKFsA5xNbvC4Z3pLal26VkDTHUTxajeRoNGgJZyZwGoDdnaYbuKXJOVUv3yf7w;Domain=.google.com;Path=/;Expires=Thu, 22-Aug-2019 00:06:52 GMT;HttpOnly
Alt-Svc: quic=":443"; ma=2592000; v="44,43,39"
Transfer-Encoding: chunked
All my research on this issue has turned up nothing. I somehow doubt it's purely a record count issue since I can't be the only person trying to output more than 20k rows to Sheets. I also confirmed that it's not a Sheets limitation issue as I'm nowhere near the 2 million cell limit. I'm hoping there's a simple solution out there, but if not, my backup solution is to implement the setup detailed here into my workflow so that the output tool isn't exporting all 20k at once. Still, seems silly that the tool can't handle a measly 20k records. Am I missing something?
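In case it helps anyone following along, the "don't export all 20k at once" idea can be sketched outside of Alteryx too. This is just a hypothetical Python illustration of the batching-with-retry logic (the `upload_payload` argument is a stand-in for whatever actually posts a chunk to Sheets, not a real API call), so a single failed payload gets retried instead of silently leaving a gap in the sheet:

```python
def chunk_records(records, payload_size=5000):
    """Split records into consecutive payloads of at most payload_size rows."""
    return [records[i:i + payload_size]
            for i in range(0, len(records), payload_size)]

def upload_with_retry(records, upload_payload, payload_size=5000, max_retries=3):
    """upload_payload(rows) is a placeholder for the real Sheets write call.
    Returns the number of rows successfully uploaded."""
    uploaded = 0
    for payload in chunk_records(records, payload_size):
        for attempt in range(max_retries):
            try:
                upload_payload(payload)
                uploaded += len(payload)
                break
            except IOError:
                # Retry the same payload; only give up after max_retries,
                # and fail loudly rather than skipping the chunk.
                if attempt == max_retries - 1:
                    raise
    return uploaded
```

The key difference from what the output tool appears to do is that a payload failure is retried (and ultimately raised) rather than dropped.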
I am experiencing the same kind of error and would be interested in seeing what anyone has in the way of solutions.
I have found a semi-convoluted solution using the Record ID tool, several Block Until Done tools with filters, and multiple appending Google outputs, but it's not really the most reliable fix in the long term.
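For anyone wanting to reproduce the same idea, the Record ID + filter trick boils down to assigning each row a batch number and releasing one batch at a time. Here's a rough Python equivalent of just the partitioning step (hypothetical, only meant to illustrate what the Record ID tool plus filters like "RecordID <= 2000", "<= 4000", etc. are doing):

```python
def assign_batches(record_ids, batch_size=2000):
    """Group 1-based record IDs into ordered batches, mimicking a Record ID
    tool followed by range filters feeding separate outputs."""
    batches = {}
    for rid in record_ids:
        # Row 1..batch_size -> batch 0, next batch_size rows -> batch 1, etc.
        batches.setdefault((rid - 1) // batch_size, []).append(rid)
    return [batches[k] for k in sorted(batches)]
```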
Anyone else have any ways they have found to work around this without manual copy and paste?
I am also having this error, with greater regularity recently. Similar to OP, the error seems to occur only on the larger tables (5K+ rows). Seems like the cookie is timing out, and some of the data chunks are not making it to the Google Sheet. I'm wondering if there is a setting on the Google side, or if a change needs to be made to the Google Sheets Output tool to extend the timeout window?
I have tried playing around with the Google settings, but it's been to no avail. I also looked at Apps Script to see if it had any kind of fix for this, and found nothing as far as I can tell.
Honestly, it does just seem to be an issue with the Google output tool. Maybe implementing a built-in throttle INTO the output tool, or making the chunk sizes adjustable without stacking tools on tools on tools, would fix it. Not sure, though; I haven't had much one-on-one time with any devs.
Glad to see this is not just a 2 person issue though!
For a second I thought I might be going crazy haha.
We've replicated this issue on our end, and it's being escalated to Development to look into. The issue is that the Sheets API is re-directing our request to a different URL and the tool is not designed to follow redirects. It's unclear why this is just now impacting Alteryx as there are several posts going back years dealing with the same issue:
Hello @DaveF , any updates regarding this issue? I also have the same problem. Trying to output 1 sheet with 11k records. Another with 50k+. Both show the Payload posted messages and no errors, but there are large chunks of data (up to 6k lines) where the records are totally blank.