on 05-13-2016 08:08 AM - edited on 06-18-2019 03:21 PM by SydneyF
In a previous article, we showed you how to upload to FTP using the Download tool. With the release of Alteryx 10.5, the Download tool now supports uploading to SFTP as well. With this addition, we'll take the opportunity to walk through some more examples of uploading data to SFTP/FTP and show how seamless it can be.
In this article, I'll show two examples of uploading to an SFTP server:
a file from local disk or network share
workflow data directly from within Alteryx as a CSV file
If, like me, you don't have an SFTP server to work with but would still like to test this functionality, you can download and run an SFTP server locally - I'm testing with this free self-contained one. Let's get started!
1. Upload a file from local disk or network share
In the first example, I will point Alteryx to a file on local disk and use the Download tool to upload it. I start by using a Text Input tool to enter the URL (the address of the FTP/SFTP server), including the port number and the full remote path with the filename where I want the file saved. I'm running my SFTP server on localhost port 22334 and I want to save my file in the root directory of the SFTP server, so my URL is sftp://localhost:22334/filename.xlsx. Don't forget to start the URL with sftp:// or ftp:// depending on which protocol the server is running. I also add the local file location in a second field. Of course, you can use a Formula tool to create more dynamic URLs and file names, but I'm keeping it simple for this example.
I haven't mentioned any special requirements for uploading data to SFTP/FTP, so now is a good time to do so. For the Download tool to upload data, the data must be transformed into a Blob object and the tool has to be configured to send a PUT command. I will read in the file and transform its content into a blob in one go using the Blob Input tool. In the Blob Input tool, I put a placeholder file name, configure the tool to Replace Entire Path With Field, and point it to the field that contains the input file path. The output of the Blob Input tool is a blob object.
In the Download tool, on the Basic tab, I point the URL to the field that holds the server details - "Address" in my case. On the Payload tab, I set the HTTP Action to PUT, select Take Query String/Body from Field, and point it to the field that holds the Blob object. On the Connection tab, I enter the correct username and password. Now, when I run the workflow, Alteryx reads the file, creates the blob object, and uploads it to the SFTP server.
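To make the mechanics concrete, here is a rough Python sketch of what the workflow above is doing conceptually: read a local file as a blob of bytes and PUT it to a path on the SFTP server. Alteryx does not run Python for this - the code is purely illustrative, and the host, port, credentials, and the third-party paramiko library are assumptions, not part of the Alteryx tooling.

```python
from urllib.parse import urlparse


def parse_sftp_url(url: str):
    """Split an sftp:// or ftp:// URL into (host, port, remote_path)."""
    parts = urlparse(url)
    if parts.scheme not in ("sftp", "ftp"):
        raise ValueError("URL must start with sftp:// or ftp://")
    return parts.hostname, parts.port or 22, parts.path


def upload_blob(url: str, username: str, password: str, blob: bytes) -> None:
    # paramiko is a widely used third-party SSH/SFTP library
    # (assumption: installed via `pip install paramiko`).
    import paramiko

    host, port, remote_path = parse_sftp_url(url)
    transport = paramiko.Transport((host, port))
    transport.connect(username=username, password=password)
    try:
        sftp = paramiko.SFTPClient.from_transport(transport)
        # Write the blob to the remote path - the equivalent of the
        # Download tool's PUT action with the body taken from a field.
        with sftp.open(remote_path, "wb") as fh:
            fh.write(blob)
    finally:
        transport.close()


# Parsing the article's example URL:
host, port, path = parse_sftp_url("sftp://localhost:22334/filename.xlsx")
```

Reading the local file with `open(local_path, "rb").read()` and passing the bytes to `upload_blob` would mirror the Blob Input tool feeding the Download tool.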
My SFTP server root folder is set to C:\temp\sftp and there I can see the filename.xlsx file that I just uploaded.
2. Upload workflow data directly into a CSV file on the SFTP server
In the second example, I'll read in data from an Excel file, deselect some columns, and then upload the result to a CSV file using SFTP. You can think of the Select tool as the data-processing part of your workflow, and everything from the Record ID tool to the Download tool as the part that prepares the upload. The upload process is very similar to the first example; the majority of the workflow is for preparing the data and creating the correct object to upload.
Now that I have read in, cleaned, and prepared my data, this is how it looks.
As a reminder, we need a blob object to feed into the Download tool with the PUT command. The next section of the workflow takes the values in the columns and rows of my table and concatenates them into a single field containing comma-separated data. I will feed this field into the Blob Convert tool and pass its output to the Download tool to upload it to the server. Since I will be transposing and filtering data, I need a point of reference back to the original records, so I use the Record ID tool to create one. I then transpose the table using the Transpose tool with RecordID as the key field, which lets me work with a single column instead of replicating work across multiple columns - in general, transposing data and working with Name-Value columns tends to simplify transforming and manipulating tables.
Now that I have my data in a single column, I use a Summarize tool to create the CSV rows by grouping on RecordID and concatenating the values with "," as the separator. I also need to create the first row of the CSV file - the field names - so I take the first record (RecordID = 1) with a Filter tool and concatenate the values in the Name column with "," as the delimiter. The result: each record in my data becomes one row of comma-separated data, all in one column.
With the two tables ready, the next step is to union the field names and the values into a single table with the Union tool, making sure the tool is configured to Set a Specific Output Order so the field names come before the field values in the output. The resulting table is then concatenated into a single field with the Summarize tool, using "\n" (newline) as the delimiter.
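The Transpose/Summarize/Union sequence above can be sketched in a few lines of plain Python. This is illustrative only - the field names and rows are made-up placeholders, and a robust implementation would use Python's csv module to handle quoting and escaping, whereas this mirrors the workflow's simple concatenation.

```python
# Placeholder data standing in for the cleaned workflow table.
rows = [
    {"Name": "Alice", "City": "London"},
    {"Name": "Bob", "City": "Paris"},
]

fields = list(rows[0].keys())

# Filter (first record) + Summarize on the Name column -> header row.
header = ",".join(fields)

# Summarize grouped by RecordID -> one comma-separated row per record.
body = [",".join(str(r[f]) for f in fields) for r in rows]

# Union (header first) + Summarize with "\n" -> one CSV string.
csv_text = "\n".join([header] + body)

# Blob Convert equivalent: the bytes that get PUT to the server.
blob = csv_text.encode("utf-8")
```

Running this yields a single string with the header row followed by the data rows, which is exactly the shape of the field fed into the Blob Convert tool.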
The resulting data is then fed into the Blob Convert tool to create the Blob object, and the target URL is appended to the table using an Append Fields tool. The Download tool is set up with a configuration similar to the first example.
Execute the workflow and, voilà, the file is uploaded to the server as filename.csv.
Both the workflow used in this article and the RebexTinySFTPServer configuration file (rename the file from .log to .exe.config) are attached to this post. Please note that you must change the configuration of the Download tool (address and credentials) for the workflow to work on your setup. These examples were built with Alteryx 10.5 (10.5.9.15014).