I have sent a Blob field into a Python tool before with good success. I could read it as a bytearray and all was rainbows and yellow brick roads.
However, now I'm trying the reverse: reading a file as binary into a bytearray inside the Python tool and sending it out of the tool as a Blob field. I assumed this would work, but the field doesn't come out as a Blob. Is there any guidance on this yet?
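Roughly, this is what I'm doing inside the Python tool (a minimal sketch; the file path and field names are just placeholders, and it assumes the standard ayx package and Alteryx.write call that ship with the tool):

```python
from ayx import Alteryx
import pandas as pd

# Read a file from disk as raw binary
with open(r"C:\temp\example.pdf", "rb") as f:
    file_bytes = f.read()          # a Python 'bytes' object

# Put the bytes into a one-row DataFrame and push it downstream
df = pd.DataFrame({"FileName": ["example.pdf"], "Contents": [file_bytes]})
Alteryx.write(df, 1)               # expected 'Contents' to come out as a Blob field
```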
Thanks!
Fact-checking myself: the Blob input was coming in as Python 'bytes', so I changed the output column's datatype to bytes as well, but it still doesn't come through on the output as a Blob.
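For reference, this is roughly how I checked the incoming type (a small sketch; the connection number and field name are placeholders):

```python
from ayx import Alteryx

df = Alteryx.read("#1")        # incoming stream with a Blob field
print(type(df["Blob"][0]))     # <class 'bytes'>
print(df.dtypes)               # the Blob column shows up as dtype 'object'
```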
Release notes for 2019.1 say Blob data format is supported, so I'm not crazy ... :)
https://help.alteryx.com/ReleaseNotes/Designer/Designer_2019.1.htm?tocpath=Designer%7C_____1
Here's a sample workflow to demo what I'm asking about:
Looks like the CachedData.py write() function uses pandas' .dtypes rather than Python's type() to determine the column data types, which makes sense. Pandas treats a column of Python 'bytes' values as dtype 'object', which is the same dtype it uses for regular strings. So when DatastreamUtils.py __pandasFieldTypeAttributes() is called, it maps the bytes column's 'object' dtype to a V_WString.
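You can see the ambiguity with pandas alone, no Alteryx needed (a minimal sketch):

```python
import pandas as pd

df = pd.DataFrame({
    "as_bytes": [b"\x00\x01\x02"],   # Python bytes
    "as_str":   ["plain text"],      # regular string
})
print(df.dtypes)
# as_bytes    object
# as_str      object
# dtype: object
#
# Both columns are dtype 'object', so a bytes column is indistinguishable
# from a string column by dtype alone -- hence the V_WString mapping.
```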
Looks like a __pythonFieldTypeAttributes() has been started, so my hope is that Python 'bytes' can eventually be added to that conversion to allow binary files to be output as Blob fields.
In the meantime, I compromised: I sent a temporary file path out of the Python tool and read the file with a Blob Input tool instead. It would have been cleaner to do it all in the Python tool, but we do what we can ...
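For anyone landing here later, the workaround looks roughly like this (a sketch only; the temp-file handling and the 'TempPath' field name are just examples, and the downstream Blob Input tool is configured to take its file name from that field):

```python
from ayx import Alteryx
import os
import pandas as pd
import tempfile

# Build the binary content however you need to, then write it to a temp file
file_bytes = b"\x00\x01\x02\x03"
temp_path = os.path.join(tempfile.gettempdir(), "python_tool_output.bin")
with open(temp_path, "wb") as f:
    f.write(file_bytes)

# Send only the path downstream; a Blob Input tool then reads the file
# named in this field and turns it back into a real Blob.
Alteryx.write(pd.DataFrame({"TempPath": [temp_path]}), 1)
```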