Losing characters when using regex tool to tokenize field to 999 characters
Hello,
The ultimate goal is to insert a "||" delimiter into a comment field after every block of 999 characters whenever the string is longer than 999 characters. So far I have tried splitting the field into rows with the RegEx tool in Tokenize mode using the pattern "^(.{0,999})\b\W?". This is dropping a variable number of characters, anywhere from 2 to 2,426. I then use the Summarize tool to concatenate the rows back together with "||" at the end of each, but of course the results are incorrect because of the characters lost during tokenization.
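For reference, the intended end result can be sketched outside Alteryx. This is a hypothetical Python illustration of the target behavior (plain fixed-size slicing, which never drops characters), not an Alteryx formula:

```python
def insert_delimiters(text, size=999, delim="||"):
    """Split text into fixed-size blocks and join them with a delimiter.

    Unlike a word-boundary regex, plain slicing loses no characters.
    """
    if len(text) <= size:
        return text
    blocks = [text[i:i + size] for i in range(0, len(text), size)]
    return delim.join(blocks)

# A 2500-character comment becomes blocks of 999 / 999 / 502,
# with two "||" delimiters inserted between them.
comment = "x" * 2500
result = insert_delimiters(comment)
print(len(result))  # 2504 = 2500 characters + 2 delimiters of 2 chars
```

Every character of the input survives; only the two delimiters are added.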
Can you share some sample data to showcase the issue?
Alteryx ACE
https://www.linkedin.com/in/calvintangkw/
@caltang thank you for responding. Yes, I will build a sample dataset.
I attached the sample I worked up. For some reason, the same regex is now only truncating strings longer than 999 characters instead of splitting them into rows (see screenshot).
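Alteryx's RegEx tool uses its own engine, but both symptoms can be reproduced with Python's `re` module as a hypothetical illustration: the `^` anchor only matches at the very start of the string, so repeated matching yields a single (truncated) block, and even without the anchor the trailing `\W?` silently consumes a separator character after each block:

```python
import re

text = "word " * 500  # 2500 characters

# With the ^ anchor, repeated matching finds only ONE block:
# ^ cannot match anywhere but position 0, so the rest is discarded.
anchored = re.findall(r"^(.{0,999})\b\W?", text)
print(len(anchored), len(anchored[0]))  # 1 match, capped at 999 chars

# Without the anchor, matching restarts after each block, but the
# \W? outside the capture group eats one separator per block.
unanchored = re.findall(r"(.{1,999})\b\W?", text)
print(len(unanchored))                        # multiple blocks
print(len(text) - sum(map(len, unanchored)))  # characters silently lost
```

This is consistent with the behavior described above: truncation with the anchor, and character loss without it.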
@caltang thank you very much Calvin! I appreciate your help and will work through this now =)
@caltang this worked perfectly! Thank you Calvin!
