Hello,
I have working Python logic in Alteryx that processes FIX messages. My issue is that when I pass more than 50,000 records through Python it takes a long time, so I am trying to rebuild the logic natively in Alteryx:
I replace the ASCII control characters (the \x01 field delimiters) and then tokenize the string, but I am not getting it right.
The column looks like the following, and I want to parse it out into the table below:
8=FIX.4.29=175735=W34=91740952=20220517-13:49:45.50315=USD37=NA5613=045254757384130=7.1200131=7.1300132=300133=4200260=4130=7.1200131=7.1400132=100133=600275=P273=20220512-13:49:45.1571091046...130=7.1400131=7.15132=1000133=1500
130    | 131    | 132  | 133  | 260
7.1200 | 7.1300 | 300  | 4200 | 4
7.1200 | 7.1400 | 100  | 600  | 4
7.1400 | 7.15   | 1000 | 1500 | 4
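A simplified sketch of the kind of Python logic I mean is below (illustrative names only; the real field value contains the non-printing \x01 delimiters between the tag=value pairs, which do not show up in the pasted sample above):

SOH = "\x01"  # FIX field delimiter, ASCII 01

def parse_fix(message: str):
    """Split a raw FIX string into (tag, value) pairs."""
    for token in message.strip(SOH).split(SOH):
        if "=" in token:
            tag, value = token.split("=", 1)
            yield tag, value

def extract_rows(message: str,
                 group_tags=("130", "131", "132", "133"),
                 carry_tag="260"):
    """Build one row per repeating 130/131/132/133 group and copy tag 260 onto every row."""
    rows, current, carry_value = [], {}, None
    for tag, value in parse_fix(message):
        if tag == carry_tag:
            carry_value = value
        elif tag in group_tags:
            if tag in current:  # tag seen again -> a new repeating group starts
                rows.append(current)
                current = {}
            current[tag] = value
    if current:
        rows.append(current)
    for row in rows:
        row[carry_tag] = carry_value
    return rows

# Example usage on one cell value:
# for row in extract_rows(raw_message):
#     print(row["130"], row["131"], row["132"], row["133"], row["260"])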
I appreciate any help.
Thanks!
You can replace ASCII code 01 with a RegEx Replace function to convert it to a comma, which you can then parse out: Regex_Replace([Field],"\x01",",")
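As a quick way to see what that replacement produces (a minimal Python sketch, using a shortened version of the sample with the \x01 bytes written out explicitly):

import re

raw = "8=FIX.4.2\x019=1757\x0135=W\x0134=917409"  # shortened sample, SOH bytes explicit
print(re.sub("\x01", ",", raw))
# -> 8=FIX.4.2,9=1757,35=W,34=917409
# Each tag=value pair is now comma-separated, so it can be split out with
# Text To Columns or a RegEx tool in Tokenize mode.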