The post title and workflow comments describe the situation exactly.
The fact that forcing a data type locks in the byte size, regardless of whether there is an Auto Field downstream, is ridiculous and is causing a lot of problems for me.
I NEED the data type to be forced, otherwise Auto Field mucks it up. But I also need the byte size to be set dynamically (minimized via Auto Field) for each field on each run of the workflow, for optimization.
How do I do this?
EDIT:
Furthermore, it is NOT warning about data truncation. Why is this? I've just spent days uploading millions of records to a table... and now I see I could easily have tons of truncations without ever getting a warning?? Why??
Hi @Joshman108
You can try changing the sequence. First use a Select tool to force the data type, but give it a very large size so it can accommodate the biggest strings. Then use Auto Field on the required fields to reduce the string size.
Workflow:
Hope this helps 🙂
I actually accepted this too soon; it does not fully solve the issue.
My data has different qualities from these examples: namely, I have string fields that look like numbers. In your example, Auto Field will still obliterate the field type if it thinks a field is a number, regardless of whether the upstream Select specifies String: Forced.
Unfortunately my data must remain this way.
So still unresolved but.... I think we're going to use a different tool for this.
Very frustrating though.
Hi @Joshman108
Here is what you can do. To prevent strings from being converted to numbers, let's prepend "#" (it could be any character) before sending the data into Auto Field, and remove it afterwards. We can prepend "#" on multiple columns using the Multi-Field Formula tool.
Workflow:
1. Use a Select tool to force the type and give it a large size.
2. Use a Multi-Field Formula tool to prepend "#" to all string fields (expressions sketched after this list).
3. Use Auto Field to minimize the size. Since "#" is prepended, the fields won't be converted to numbers.
4. Use another Multi-Field Formula tool to remove the leading "#" and get the values back to normal.
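For steps 2 and 4, the Multi-Field Formula expressions could look roughly like this (a sketch, assuming the tool's built-in [_CurrentField_] placeholder and that only your string fields are selected in the tool; adjust to your configuration):

Step 2 (prepend the marker): "#" + [_CurrentField_]
Step 4 (strip the marker): Right([_CurrentField_], Length([_CurrentField_]) - 1)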
Hope this helps 🙂