Hello,
I was trying to use the Blob Input tool to transfer my source data into an output file (in a separate tab), but the Blob Input tool is taking forever. My input source file is big, with 11k+ line items, and every time I run the workflow it seems like the Blob tool goes through every single line item one by one. I connect a Formula tool to the input file to get a file-path column, then connect that Formula tool into the Blob Input tool and choose "Input" from the Formula tool using the file option under "Modify File Name Using Optional Input". Please advise.
Hi @kauser, you may want to use a Sample tool or a Summarize tool to reduce the number of blobs, because by the sound of it you will be generating 11k+ blobs. If you want to move the file from one location to another, use either a Sample tool or a Summarize tool to reduce the record count to one per file you want to move (e.g. one record if you have one file with 11k+ rows); the workflow will then move only that file.
Thanks for your quick response. If I use a Sample/Summarize tool, then it won't be the same as the original source file. I want to keep that file exactly the same as the original source file.
@kauser - every record running through a Blob Input/Output tool will input/output a blob file. If you have 11,000 records, that's 11,000 files. If you have one file you are looking to create (say from a text conversion), you would use a Summarize tool with "\n" as the delimiter to concatenate your entire data stream into one row of data, and then send that single row to your Blob tool. If instead you have a single file that you are looking to append to the data stream (like an attachment on an e-mail), you would read that file in once and then append it to your data stream via an Append Fields tool.
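Outside Alteryx, the same idea can be sketched in plain Python: the slow pattern writes one file per record, while the fast pattern concatenates all records with a newline delimiter (the Summarize step above) and performs a single write (the Blob Output step). The record contents and the `output.blob` filename here are made-up stand-ins, not anything from the original workflow.

```python
# Stand-in for the 11k+ input rows of the source file.
records = [f"line {i}" for i in range(11000)]

# Summarize-with-"\n"-delimiter equivalent: collapse the whole
# data stream into ONE string instead of 11k separate blobs.
payload = "\n".join(records)

# Blob Output equivalent: a single write of the combined payload,
# rather than one file write per record.
with open("output.blob", "w", encoding="utf-8") as f:  # hypothetical path
    f.write(payload)
```

The design point is the same as in the tools above: the cost is driven by the number of blob writes, so reducing 11k+ records to one row before the Blob tool turns 11k+ file operations into one.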
Does that help?
Hi @kauser, it will be the same source file, because you will only be moving it once when using a Sample or Summarize tool, whereas your current setup is trying to move it 11k times.