Alteryx Designer Desktop Discussions

SOLVED

Get the blob size for a PUT Operation?

Hiblet
10 - Fireball

When you use the Download tool with an API and the PUT HTTP verb, you often have to supply the number of bytes of the message body in the Content-Length header.

 

If I formulate a JSON data packet, I have to convert it to a Blob to use it with PUT.  I need to add the number of bytes of the blob to the Content-Length header.

 

A Browse tool reports the blob length when the blob field is displayed.  How can I get that reported size, so that I can insert the number of bytes into the Content-Length header?  (Note: you cannot just multiply the number of characters by 2, as UTF-8 encoding uses different numbers of bytes for different character sets [https://stackoverflow.com/questions/9761805/calc-utf-8-string-size-in-bytes].)
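
(For illustration, a minimal Python sketch of the underlying point; the JSON value is made up. The character count and the UTF-8 byte count diverge as soon as non-ASCII characters appear, so the byte count has to be measured rather than estimated from the character count.)

# Character count vs. UTF-8 byte count for the same string.
payload = '{"name": "café", "note": "日本語"}'
print(len(payload))                   # 31 characters
print(len(payload.encode("utf-8")))   # 38 bytes -- the value Content-Length needs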

6 REPLIES
jdunkerley79
ACE Emeritus

My past experience doing this kind of thing has been to use a Blob Convert tool to change the blob into hexadecimal.

 

You can then get the length of the hex string (which is two characters per byte), halve it, and use that as the Content-Length.

 

This assumes the blob is already UTF-8 encoded, of course.
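
(A minimal Python sketch of why the hex trick works, for anyone following along outside Alteryx; the sample string is made up. Hex encoding always produces exactly two characters per byte, so halving the hex length recovers the byte count.)

# Hex encoding is always 2 characters per byte, so len(hex) / 2 = bytes.
blob = '{"name": "café"}'.encode("utf-8")   # a UTF-8 encoded blob
hex_text = blob.hex()
print(len(hex_text) // 2)                   # 17, the byte count derived from the hex
print(len(blob))                            # 17, confirms the equivalence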

mceleavey
17 - Castor

Hi @Hiblet ,

 

try the "Field Info" tool. This will output the field size of the blob you create.

 

M.




Hiblet
10 - Fireball

Hi @mceleavey, sorry, this just returns the field size, like 2 GB, rather than the size of the data in the field.  Thanks for the effort though.

Hiblet
10 - Fireball

Hi @jdunkerley79, James, that is clever: it guarantees two characters per byte, so I can derive the byte length reliably.  Many thanks!

SophieJoanna
7 - Meteor

@Hiblet could you clarify how you calculated the size after converting to Hex? I'm having the same problem but struggling to follow the solution proposed here.

 

I converted to Hex, took the length of the new Hex field, then divided by 2, but I'm getting a wildly different size (128 bytes) compared to what the blob input size is showing (33,222 bytes). I must be missing something!

Hiblet
10 - Fireball

Hi @SophieJoanna, let me see if I can shed some light!  I have a JSON string.  I copy it first.  I then convert it to a blob, using "Put Text Data into Blob with Code Page" set to "Unicode UTF-8".  I then convert that blob into Hex text, using "Convert to HEX encoded binary data".  This gives me a string of Hex text, where each byte is always represented by two characters.  The Content-Length is calculated as half the length of that string.
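
(The same steps expressed as a small Python sketch, purely to mirror the arithmetic; the record used here is illustrative.)

import json

record = {"name": "café", "size": 42}
json_text = json.dumps(record)            # the JSON string
blob = json_text.encode("utf-8")          # "Put Text Data into Blob" with UTF-8
hex_text = blob.hex()                     # "Convert to HEX encoded binary data"
content_length = len(hex_text) // 2       # half the hex length = bytes to send
headers = {"Content-Length": str(content_length)}
print(headers)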

 

I have attached a mini flow that shows the tools required.

 

The value you are getting should be somewhere close to the length of the starting string.  In my example, the value is exactly the length of the JSON string, because I am only using ASCII characters.  However, if there were extended characters like foreign accented or graphic characters, the length would be out.

 

At the risk of teaching grandma to suck eggs: you have to escape a specific set of characters in JSON. This StackOverflow question has an answer that details the characters...

https://stackoverflow.com/questions/19176024/how-to-escape-special-characters-in-building-a-json-str...
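
(A quick Python illustration of the same escaping, with a made-up sample string: json.dumps applies the full set of JSON escapes. Note that escaping adds characters, so the byte count has to be taken after escaping.)

import json

raw = 'Line one\nShe said "hello" \\ done'
print(json.dumps(raw))
# -> "Line one\nShe said \"hello\" \\ done"
# Quotes, backslashes and control characters are all escaped by json.dumps.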

 

I had a problem with this recently.  A string was passed into a macro in a fixed-length text field.  As I added backslash escape characters, this expanded the string and the last few characters were lost beyond the fixed length.  I solved this by copying the input text to a V_WString field early in the macro, which allows the text to expand in the field.  A nasty little bug in my code.

 
