I have a stream of data that I want to parse into rows, eliminating some of the stream as I parse.
I have a Text Input (connecting to a server to pull the file), then I use the download (SFTP) function to get the data. I have tried RegEx as well as Text to Columns.
The stream has fields that are double-quoted and comma separated.
Should I be using something else besides Parse?
Hi Karen,
Are you trying to eliminate:
A) certain rows based on the content?
B) certain row content?
If you can post some sample data, that helps a lot.
Cheers,
Bob
I have attached a sample file. The original file is stored on our server as a .csv, but when I pull the file from that location it comes in as a stream of data. I want to eliminate the first 2 rows in the attached file, as well as the data on each of the following rows up to Order Date.
Hi Karen,
I think this does what you wanted:
The key things to note:
In the settings of the Input Data tool I specified that:
1) the first row does not contain field names
2) delimiter is a comma
3) start data import on line 3 (to drop the first two lines)
Then I replace the quotes with nothing to remove them,
use Text to Columns to split into columns on the comma delimiter,
and drop the unneeded fields (you could also rename them here). A rough script equivalent of these steps is sketched below.
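For anyone following along outside of Designer, here is a minimal Python sketch of the same steps, assuming the downloaded stream has been saved to a local file. The file name and the index of the Order Date column are placeholders, since they depend on your actual data:

```python
import csv

INPUT_FILE = "orders.csv"  # hypothetical local copy of the downloaded stream
KEEP_FROM = 3              # hypothetical 0-based index of the Order Date column

with open(INPUT_FILE, newline="") as f:
    lines = f.read().splitlines()

# Step 3 in the Input Data settings: drop the first two lines of the stream
data_lines = lines[2:]

# Steps for quotes and delimiter: csv.reader splits on commas and
# strips the surrounding double quotes in one pass
rows = []
for record in csv.reader(data_lines):
    # Drop the unneeded leading fields, keeping everything from Order Date on
    rows.append(record[KEEP_FROM:])

for row in rows:
    print(row)
```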
Hope that helps!
Cheers,
Bob
Can you create this workflow with version 10.6, or how can I import the version you created into 10.6?