Hello Community,
Looking for some help to finish off a project I am working on. The concept is fairly straightforward, but I am having trouble resolving it. Basically, I have records where I want to add a delimiter after every 4th character so I can use Text to Columns to split out the 4-character ID fields I need to isolate. I feel like the solution is a regex expression or formula of some kind.
Once I can successfully split out the values using text to columns, is there a way to pivot the data from each row into one column with all the values?
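For concreteness, here's a rough Python sketch of what I'm trying to do (the record value and the pipe delimiter are just made-up examples):

import re

# Example record: several 4-character IDs run together.
record = "AB12CD34EF56"

# Step 1: insert a delimiter after every 4th character,
# which is what I'd feed into Text to Columns.
delimited = re.sub(r"(.{4})", r"\1|", record).rstrip("|")
print(delimited)  # AB12|CD34|EF56

# Step 2: split on the delimiter, then "pivot" the values
# from the row into a single column (one value per row).
for value in delimited.split("|"):
    print(value)  # AB12, CD34, EF56 on separate lines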
Any help would be greatly appreciated!
Thank you,
Dan
Hi @dgsigl2,
Assuming I've understood you correctly (which is a massive assumption), this might help you (see below).
Using the Regex tool in Tokenize mode, you can split every 4 characters into a separate row. The regex expression is simply four periods/full stops: ....
Use a RecordID tool beforehand if you want to keep track of which row each value came from.
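If it helps to see the tokenize behaviour outside Alteryx, here's a rough Python equivalent (re.findall stands in for the Tokenize option; the field value is made up):

import re

value = "AB12CD34EF56"

# Tokenize with "....": each match of exactly four
# characters becomes its own row, like the Regex tool's output.
tokens = re.findall(r"....", value)
print(tokens)  # ['AB12', 'CD34', 'EF56']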
Hello @dgsigl2,
No picture here. I have a working solution, but I'm going to describe it to you.
(.{1,4}) — a group of characters with a length of 1 to 4. This will cycle through the 4-character values. If you end up with a last set of characters that is fewer than 4, they will be put into that final group. If you only allowed a length of exactly 4, it would DROP the last values.
Trim([Input], ".") — strips leading and trailing periods.
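Outside Alteryx, a rough Python sketch of the same idea (the input string, and the assumption that Trim is cleaning up delimiter periods, are mine):

import re

value = "AB12CD34EFG"  # last group is only 3 characters

# .{1,4} grabs up to four characters at a time, so the
# short final group EFG is kept rather than dropped.
groups = re.findall(r".{1,4}", value)
print(groups)  # ['AB12', 'CD34', 'EFG']

# Analogue of Trim([Input], "."): strip any leading or
# trailing periods left over after inserting delimiters.
delimited = ".".join(groups) + "."
print(delimited.strip("."))  # AB12.CD34.EFG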
Cheers,
Mark
While I like the .... aspect of @PhilipMannering's solution and appreciate @LordNeilLord providing you with a more complete solution, I do want to point out that the solution provided REQUIRES that the sets all be 4 characters in length. If the input is ABCDEFG, the result will drop the last characters because there are not 4 characters in that set/group.
I prefer my solution because it does not require a shift to ROWS and then a shift back to a single field. If you want to use that solution, please modify the .... to .{1,4} (see the quick demo below).
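A quick demonstration of the difference, sketched in Python with the ABCDEFG example:

import re

value = "ABCDEFG"

# Four dots requires exactly four characters per match,
# so the trailing EFG is silently dropped.
print(re.findall(r"....", value))    # ['ABCD']

# .{1,4} accepts one to four characters, so the short
# final group survives.
print(re.findall(r".{1,4}", value))  # ['ABCD', 'EFG']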
Cheers,
Mark
@MarqueeCrew Good point on using the expression .{1,4} if your codes aren't a length that is a multiple of 4. Hadn't thought of that.
I hadn't thought of it either, until I tested a similar post and saw it in the sample data. Having FAILED (@mceleavey knows that I am human) with that post's data, I learned my lesson.
Cheers,
Mark