Hello #AlteryxAddicts,

tl;dr - Alteryx Partners leveraging the Designer API via .NET should not upgrade to Alteryx Designer 11.8, 2018.1, or 2018.2, because doing so will render the API integration inoperable. The issue will be addressed in Alteryx Designer 2018.3 (expected GA end of August 2018), at which time users can safely upgrade.

FAQ

What happened?
We uncovered a previously unknown issue that renders the Designer API from .NET inoperable in Alteryx Designer 11.8, 2018.1, and 2018.2. (Note: the API code itself is sound; the issue is 100% attributable to licensing.)

What is the Designer API?
This notice is specifically for APIs accessed via the .NET assemblies such as "Allocate.Net.dll" and "AlteryxAPI.Net.dll"; documentation can be found on page 1 of [installdirectory]\Alteryx\APIs\readme.pdf.

Why did this happen?
In our work to improve the Alteryx Designer licensing experience, we inadvertently broke the ability to license the API in Designer when using the .NET assemblies. The issue was identified in a timely manner and is expected to be resolved in Alteryx Designer 2018.3.

What will this problem look like?
The user will receive a message that the API is not licensed.

Who's impacted?
Anyone leveraging the Alteryx API on Designer 11.8, 2018.1, and/or 2018.2.

Who's not impacted?
Anyone leveraging the Alteryx Designer API in Alteryx 11.7 and older. Alteryx Designer APIs accessed via C++ or the CLI were not impacted and will continue to work as expected.

When will the issue be fixed?
Alteryx Designer 2018.3 (expected GA end of August 2018).

Next steps?
If you are using 11.8, 2018.1, or 2018.2 and the Alteryx API, this issue directly impacts you. We recommend downgrading to 11.7 until 2018.3 is released; otherwise, wait until 2018.3 is released and use the API integration with it. Note: you can also use Alteryx 11.7 and older without interruption until you choose to upgrade to 2018.3 (or a future release).

Questions? Contact email@example.com

Alter Everything!
-- Alteryx Product Extensibility team
The 'How to Guide' to Cognitive Services Text Analytics Macro
As Microsoft continues to grow its Machine Learning capabilities, Alteryx is following suit and has built a new connector that takes advantage of Microsoft's Cognitive Services Text Analytics API.
The new Cognitive Services Text Analytics Macro will replace the AzureML Text Analytics Macro, keeping sentiment analysis and key phrase extraction, while also adding Language and Topic detection.
To use this macro you will need to create a Microsoft account, as well as sign up for a Microsoft Azure account.
Sign in to or create a Microsoft Azure account; this should take you to the Azure portal. You will need to click 'Create a resource' and search for 'Text Analytics'.
You can then click 'create' in the bottom right corner.
You will need to set up a subscription if you haven't already. Once you have set up an account, you can return to this stage and continue.
Once you have set up a subscription, you will be able to create an application by filling out the following window. Hit 'Create' once you are finished, and this prompts Azure to deploy the application (this may take a minute; you should see a notification in the top right of the web browser).
You can then click on the notification tab, select 'Pin to Dashboard', and then go to the resource.
Once you have done this you should see the screen below. (If you don't, please refresh or sign in and out of Cognitive Services until you see the information you added above within the information ribbon below.)
Click on Keys, then copy and paste them into the Cognitive Services Macro and select your field and the type of text analysis.
Error: Invalid subscription key
This error tends to occur because the account hasn't been fully processed by Microsoft.
Solution: Please log in & out of Azure, verify the information looks correct and then copy and paste the key into Alteryx.
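If you'd rather script the same kind of call outside the macro, the request is straightforward to assemble. The Python sketch below builds the headers and JSON body for a sentiment call; the endpoint region, key placeholder, and sample text are illustrative assumptions, so check the Text Analytics documentation for the values that match your resource:

```python
import json

# Assumptions: the region in the endpoint and the key are placeholders;
# the payload shape follows the public Text Analytics documentation.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"
API_KEY = "YOUR_SUBSCRIPTION_KEY"  # paste the key copied from the Azure portal

def build_sentiment_request(texts):
    """Build the headers and JSON body for a sentiment call."""
    headers = {
        "Ocp-Apim-Subscription-Key": API_KEY,
        "Content-Type": "application/json",
    }
    body = {"documents": [
        {"id": str(i), "language": "en", "text": t}
        for i, t in enumerate(texts, start=1)
    ]}
    return headers, json.dumps(body)

headers, body = build_sentiment_request(["Alteryx makes blending easy!"])
```

Sending `body` with those headers as a POST (via the Download Tool or any HTTP client) returns a sentiment score per document.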
In a workflow, not too far, far away...
Structured data has vanished. In its absence, the sinister Dirty Data Order has risen from the ashes of the Server and will not rest until Data Analytics have been destroyed.
With the support of the Alteryx Engineers, Solutions Engineer Tony Moses leads a brave RESISTANCE. He is desperate to find structured data and gain its help in restoring blending, joining and analytics to the galaxy.
Tony has sent his most daring Community Leader, Matt DeSimone, on a secret mission to Jakku, where an old ally has discovered a clue to the structured data's whereabouts....
Welcome to the Star Wars universe!
Ever wanted to know the most important details of your favorite characters from Star Wars? Me too!
Our generous friends, Paul Hallett and team, have given us the Star Wars API - the world's first quantified and programmatically-accessible store of Star Wars data.
After hours of watching films and trawling through content online, Paul presents us all the People, Films, Species, Starships, Vehicles and Planets from Star Wars.
The data is formatted in JSON and exposed to us via a REST implementation that allows us to programmatically collect and measure it.
Now, how was I able to retrieve this treasure of information via Alteryx? Easy! I've built a REST API connection using the Download Tool to pull information based on a user inputted query in an Alteryx Application (attached as v2018.1 Star Wars.yxwz).
Normally, once you've retrieved JSON-formatted data, structuring and parsing it would be a nightmare! With Alteryx, this is just one tool away. The JSON Parse Tool allows you to identify the JSON field, in this case our download data field, and easily extract Name and Value columns. From there it's some simple formatting and using the reporting tools to present a nice clean composer file (pcxml).
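For readers curious what that name/value flattening looks like conceptually, here is a minimal Python sketch of the same idea; the sample record is illustrative rather than actual swapi.co output:

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested JSON into (name, value) pairs, using dotted paths
    similar in spirit to what the JSON Parse Tool emits."""
    rows = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            rows += flatten(v, f"{prefix}.{k}" if prefix else k)
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            rows += flatten(v, f"{prefix}.{i}")
    else:
        rows.append((prefix, obj))
    return rows

raw = '{"name": "Luke Skywalker", "height": "172", "films": ["A New Hope"]}'
pairs = flatten(json.loads(raw))
```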
Man, if only the Rebels could process information as fast as Alteryx then they wouldn't have had to send poor R2 to find Obi Wan.
I'll be bringing you, the Alteryx Community, updates of the app with each new movie release!
I hope you enjoy the API and may the Force be with you!
Fact: workflows are the best. Look it up. They’re all about getting things done and, with hundreds of tools and the ability to integrate external processes, there’s no shortage of things you can get done. We know that there are some areas of analytics that require a little extra firepower, however, and that’s why you can leverage your workflows in apps and macros for added functionality.
If you haven’t used the Run Command Tool just yet, that’s great. It means that whatever your analyses required, we had it covered with basic Designer functionality. But in spite of how great the Designer is, it just can’t do everything. There is a utility on your computer that can do just about anything, however, and it’s the command line. The Run Command Tool pairs the two into a dynamic tag-team duo that can wrestle all the computation you could need into one, integrated, Designer workflow:
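Conceptually, that pairing looks like the Python sketch below: launch an external process, capture its output, and carry on with the results, which is roughly what the Run Command Tool automates inside a workflow (the command here is a trivial stand-in for whatever utility you would call):

```python
import subprocess
import sys

# Run a command-line process and capture its output, much as the Run
# Command Tool does with its "Read Results" option. The child process
# here just prints a string; swap in any external command you need.
result = subprocess.run(
    [sys.executable, "-c", "print('hello from the command line')"],
    capture_output=True, text=True, check=True,
)
output = result.stdout.strip()
```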
API connections give access to many web-based applications, database systems, or programs by exposing objects or actions to a developer in an abstracted format that can easily be integrated into another program. In other words, an API functions as a point of integration where you can programmatically access or manipulate information for a program (or workflow) you’re building. In the context of Designer use, APIs are most often used to access data stores, utilize web-hosted services while blending, or to build connectors.
Most APIs are in RESTful architecture, the software architectural style of the World Wide Web, and typically communicate over HTTP with the same HTTP actions, or “verbs”, that web browsers use to retrieve web pages and transfer data: GET, POST, PUT, DELETE, among others. As such, any tool or software that gives you the ability to use the HTTP request-response protocol will let you communicate with these RESTful APIs. The three I most commonly use to establish my connections are Postman (a great testing app that resembles the Alteryx Download Tool configuration), cURL (a command line tool and one of the predominant HTTP/FTP transfer libraries), and everybody’s favorite software: Alteryx. Between these three tools, and maybe a web traffic monitor like Fiddler (this will let you see the HTTP requests being sent/received over your network), you should be able to establish, maintain, and automate connectivity to just about any REST API of your choosing. This article briefly exposes how to do so in cURL and Alteryx so that you can (1) easily pivot between the two for more robust troubleshooting/implementation and (2) more readily implement API connections into your Alteryx workflows, even when they are documented programmatically.
The first step in establishing an API connection is to locate the developer documentation. The quality and detail of an API's documentation is often the limiting reagent in establishing a connection, so when choosing an API, weigh functionality first, then documentation quality. The documentation will then help walk you through the API's authentication before introducing the different requests you can make. Since cURL is frequently used amongst API developers, you'll notice many of the example requests you are introduced to will be in cURL syntax. Being able to decipher those cURL requests will help you easily recreate them in a program with an interface, like Postman or Alteryx. Consider the GET and POST requests below (all included in the attached v10.6 workflow for reference; please note that the workflow will not run due to redacted credentials to the API):
The documentation gives us a URL to communicate with via an HTTP action (two in this case, GET and POST), specific to this request of listing users on search criteria: http://community.lithium.com/community-name/restapi/vc/search/users. It also notes a parameter, or argument, to accompany the request – those keywords are most often associated with the request payload. Along with a payload argument for authentication (redacted), we’ll use these elements to generate our requests. For the cURL requests, feel free to use the executable installed along with your Alteryx Designer, located in C:\Program Files\Alteryx\bin\RuntimeData\Analytic_Apps\ by default (you can get your own download here).
Request in cURL (command line):
(the –i option is used to include the response headers to resemble the Alteryx format below)
Request in Alteryx Download Tool:
Request in cURL (command line):
Request in Alteryx Download Tool:
Note: A series of cURL commands can also be implemented into a .bat file to be executed within a workflow via the Run Command Tool (master it here).
Translating the Requests
Above you can see side-by-side examples of cURL syntax and how the API request would look in the Designer. While we can’t go over every type of request you may need, we can equip you with a metaphorical pocket dictionary that will help you translate between the two:
The URL in your cURL request will simply need to make its way to the URL Field in the Basic Tab of the Download Tool.
Timeout time (--connect-timeout <seconds> in cURL) can be changed in the Connection Tab.
The schematics below identify some of the elements from our lookup table, but in syntax:
Additionally, Postman, the tool mentioned earlier, can build requests much like in Alteryx and has a “Generate Code” feature that will convert the request to usable cURL syntax that can be helpful in translating.
While the above helps elucidate some of the similarities between cURL and Alteryx HTTP requests, there are also some notable specificities to using each. For example, cURL will give you far more control over the more granular configuration options for each request; things like what version of HTTP to use, passing cookies, using proxies, among others. Alteryx, on the other hand, gives you the flexibility to build an entirely different request from each row of data entering the Download Tool, making it near effortless to generate a large number of requests customized to your data or to automate API interactions that would otherwise require daunting programming. In addition, Alteryx makes it far easier to parse an API response into usable data such that it can be blended with your other datasets – all inside the same workflow that made the requests. Depending on the types of requests you’ll be making to your API, you’ll have to look at the different formats above and determine the optimized approach. For example, what if you need very specific request configuration but see yourself making a large number of requests? You could use Alteryx to automate writing all the syntax of your cURL commands to a .bat file, and then run that .bat file in a Run Command Tool from Alteryx. Now that you know both, choose wisely!
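As a loose illustration of that translation, the Python sketch below builds the same kind of GET a cURL command would issue, by appending an encoded query string to the base URL. The parameter names and the credential placeholder are illustrative assumptions, not the documented Lithium arguments:

```python
from urllib.parse import urlencode

BASE = "http://community.lithium.com/community-name/restapi/vc/search/users"

def build_get(base_url, params):
    """Mirror cURL's '-G --data-urlencode key=value' behaviour:
    URL-encode the parameters and append them as a query string."""
    return f"{base_url}?{urlencode(params)}"

# 'phrases' and 'auth_token' are illustrative parameter names; the
# token placeholder stands in for the redacted credential.
url = build_get(BASE, {"phrases": "alteryx", "auth_token": "REDACTED"})
```

In the Download Tool the equivalent is the URL field plus name/value pairs on the Payload tab; in cURL it is the same string handed to the command line.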
Question Does Alteryx support web crawling?
Yes. In Alteryx you can look at a web page, find embedded links (e.g. using regular expressions), and add to a queue of "links to visit". Then continue visiting/adding indefinitely, while also extracting various other tidbits of interest from each page visited.
In a Text Input Tool, enter URLs to crawl. Alteryx can take the URLs from a data stream (a database where we have all of the URLs we want to crawl) and iteratively repeat the process of connecting and getting the code beneath that URL:
Use the Download Tool and point it to a web address:
Alteryx returns the whole content available for that URL:
The attached v10.0 workflow allows you to connect to Wikipedia and "crawl" the content of that URL. The content can be saved, parsed, etc., and additional functionality may be added to create a very powerful crawling engine.
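The crawl loop itself can be sketched in a few lines of Python. To keep the sketch self-contained, the pages below live in an in-memory dict; a real crawler would fetch each URL with the Download Tool instead:

```python
import re
from collections import deque

LINK_RE = re.compile(r'href="(https?://[^"]+)"')

def crawl(pages, start_url, limit=10):
    """Breadth-first crawl: visit a page, extract links with a regular
    expression, queue the links, repeat until the queue is empty."""
    queue, visited = deque([start_url]), []
    while queue and len(visited) < limit:
        url = queue.popleft()
        if url in visited:
            continue
        visited.append(url)
        for link in LINK_RE.findall(pages.get(url, "")):
            queue.append(link)
    return visited

# Two toy pages that link to each other; the loop terminates because
# already-visited URLs are skipped.
pages = {
    "https://example.com/a": '<a href="https://example.com/b">b</a>',
    "https://example.com/b": '<a href="https://example.com/a">a</a>',
}
order = crawl(pages, "https://example.com/a")
```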
Question Can you wait X seconds between processing each row in Alteryx?
Yes! Thanks to Inviso for creating the Inviso Macro Pack and posting it on their blog here.
The "Wait a Second" macro lets you wait X number of seconds before processing each row in the dataset.
One application is contacting an API with multiple requests: pausing between requests with the WaitASecond macro may give the API long enough to process multiple rows without issue.
It can also be used to scrape sites without putting heavy loads on their servers. For an Inviso sample of scraping the Alteryx Community, see Insights to the Alteryx Community.
As you can see, the part of the flow that runs through the WaitASecond tool gets NOW timestamps that are 5 seconds apart, whereas the bottom stream, which does not run through the WaitASecond tool, gets the same timestamp for every row.
There are essentially two macros:
The first one assigns a unique id to each record and then uses that ID for the batch macro.
The batch macro has a Run Command tool that runs a ping that waits x seconds before timing out (it pings 126.96.36.199; if that address exists on your network, it won't work).
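A rough Python equivalent of the WaitASecond behavior stamps each row and pauses between rows. The ping trick is just one way to implement the pause inside Alteryx; time.sleep is the direct analogue in code:

```python
import time
from datetime import datetime

def process_with_wait(rows, seconds=1):
    """Stamp each row with the current time, pausing between rows,
    which is the effect the WaitASecond batch macro achieves via ping."""
    stamped = []
    for row in rows:
        stamped.append((row, datetime.now()))
        time.sleep(seconds)
    return stamped

result = process_with_wait(["row 1", "row 2"], seconds=1)
gap = (result[1][1] - result[0][1]).total_seconds()
```

With a 1-second wait, the timestamps on consecutive rows end up at least a second apart, exactly like the top stream in the screenshot above.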
The macro can be downloaded here (InvisoMacros.zip).
To do your best data blending, it is critical to have the flexibility to connect to as many data stores as possible. No puzzle reveals a complete picture without all the pieces in place, and the same adage holds true in analytics. While we're proud to boast a list of supported input file formats and data platforms that may even be large enough for database storage itself, unfortunately, in the ever-expanding world of data you just can't catch them all. Enter the Download Tool. In addition to FTP access, this tool can web scrape or transfer data via API (check your data source; there's almost always an API!), giving you access to even the most secluded data stores. With the examples compiled below, and the wealth of data accessible on the web, you can turn nearly any analytical puzzle into the Mona Lisa:
Topics discussed below:
Currency Field Formatting
Currency Field Formatting - Strings to Doubles
We often get questions from new users about how to convert fields with currency formats to doubles and vice versa. If you have currency fields in your data that come into Alteryx in currency format (e.g. $1,354.00) and you want to perform any kind of numeric calculation, you must convert these fields to numeric types (e.g. Double, FixedDecimal, Int32, etc.). For more information on data types click here.
There are a couple ways to convert this string format to a numeric format. Below I will demonstrate two ways: one is using the formula tool, the other is using the multi-field formula tool. Both are similar but one is a bit more flexible than the other.
Using the Formula Tool:
If you’re using the Formula tool you must make a new field for the converted string; make sure the type is numeric (I usually go with Double). Then use the formula shown below: the first part of the expression replaces the $ and the comma with empty strings, while ToNumber() changes the result to a number.
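For reference, the same conversion in Python mirrors the Alteryx expression exactly: strip the symbols, then convert:

```python
def currency_to_number(value):
    """Replicate ToNumber(Replace(Replace([Field], "$", ""), ",", "")):
    remove the dollar sign and commas, then convert to a float."""
    return float(value.replace("$", "").replace(",", ""))

amount = currency_to_number("$1,354.00")
```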
Using Multi-Field Formula:
Using the Multi-Field Formula tool is very similar to using the Formula tool. In this tool we can specify multiple fields if we so choose, and we don't have to make a new field; we can change the ones we already have. In the configuration, select 'Text' fields so they appear in the selection window, and choose the fields you want to change. The formula is exactly the same, except you use [_CurrentField_] as your field variable. This variable runs the expression against every field you selected at the top.
Currency Field Formatting - Doubles to Strings
After you have done your calculations and you want this back into currency format you can simply use this expression in your formula tool:
'$'+ToString([FieldName], 2, 1)
More info about this formula here.
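A Python analogue of that expression, assuming the standard two-decimal, thousands-separated format that ToString([FieldName], 2, 1) produces:

```python
def number_to_currency(value):
    """Mirror '$' + ToString([FieldName], 2, 1): format with two
    decimal places and thousands separators, then prepend '$'."""
    return "$" + format(value, ",.2f")

label = number_to_currency(1354.0)
```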
Now on to a fun topic: Currency Conversion
Those who gather data in a different currency than the one they want in their reports could always look up conversion rates manually and do the math themselves (using Alteryx, of course). The only problem is that this becomes tedious, and currency conversion rates change all the time. The best way to get this real-time data is to make an API call to a website that offers it.
The workflow I have attached has a macro that I have built (version 10.6), which allows a user to choose the currency their field is in with a dropdown interface and convert it to a different currency. This macro uses the xe.com free API to get currency conversion rates in real time.
The base URL we make for this request is http://www.xe.com/currencyconverter/convert/?Amount=1&From=”FROM”&To=”TO”
The “FROM” and “TO” will change when the user chooses the currencies. After that happens, they are replaced with the currencies' ISO 4217 codes, and the Download Tool gathers all information pertaining to that URL. After some parsing, we obtain the currency conversion rate and place it in its own field, which we use to calculate our new currency.
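The substitution step can be sketched in Python; the {FROM}/{TO} placeholders below stand in for the values the macro's interface tools inject:

```python
BASE = "http://www.xe.com/currencyconverter/convert/?Amount=1&From={FROM}&To={TO}"

def conversion_url(from_code, to_code):
    """Substitute the user's ISO 4217 currency codes into the base URL,
    as the macro does before handing the URL to the Download Tool."""
    return BASE.replace("{FROM}", from_code).replace("{TO}", to_code)

url = conversion_url("USD", "EUR")
```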
To learn about APIs and how to connect to them using Alteryx, I would check out this article.
NOTE: This article was written by Paul Houghton and was published originally here. All credit goes to Paul Houghton and his group at the Information Lab. We thought this was such a great solution that we wanted to make sure it was made available on the Alteryx Community as well.
I have recently been working with a client who wanted to get their data into a Google Big Query table with Alteryx. It sounded like a simple problem and should have been pretty easy for me to get working.
Unfortunately that was not to be the case. Connecting to a big query table with the Simba ODBC driver is pretty easy, but that driver is just read only. Dead end there.
So off to the Big Query API documentation.
I found that uploading a single record was relatively easy but slow; there had to be a better way to upload thousands of records.
Deeper digging I went. Using the web interface I could upload a CSV to Google Cloud Storage, then load that file into Big Query; the challenge was to automate the process.
Step 1 - Throw it into the (Google) Cloud
So I was quite lucky that my colleagues Jonathan MacDonald and Craig Bloodworth had already made a functioning cloud uploader using the cURL program. All I had to do was doctor the URL to work for me. Ideally, though, I wanted to make the entire process native to Alteryx.
… (T)he next phase of the automation, (is) importing the csv into big query.
Step 2 - Big Query your Data
So the second step was to get the data into Big Query. I found this a bit of a humdinger, and it really stretched my API-fu. So what did I have to do?
What query do I run?
I struggled with this for a while until I attended the Tableau User Group held at the Google Town Hall in London, where I was able to get in touch with Reza Rokni. He pointed me to the example builder at the bottom of the Jobs List page, where you can build an example of the query to send.
The biggest problem with this is getting the JSON file with the table schema right. I got it eventually, but there were a fair few challenges.
Step 3 - Get it to the Cloud with Alteryx
Once I had the whole process working I wanted to go back to the problem of uploading a file with Alteryx. I knew it was surely possible to upload a file with the Download Tool; I just couldn't work out how to get the file into a row to upload.
What settings do I use?
Enter the Blob Family
What made the upload work was discovering what the blob tools do and how they work. So what is a blob? It's a Binary Large Object: basically a file without the extension.
So now using the blob input tool I'm able to read in a file to a single field. This is exactly what I needed. It’s relatively simple from here to read the file into a single cell and use the normal download tool for the upload process.
The Final Furlong
Now to pull this all together the last step is to put in the details needed to do the upload. So what are the settings?
Well on the 'Basic' we simply want to set the target URL (check the API documentation to work out what that would be), and what we want to do with the response from the server.
On the headers page we need to define the 'Authorization' parameter and the 'Content Type' parameter that is needed by Google's API.
The last set of configuration is the payload, and that is where we define the use of a POST command and what column contains the Blob.
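Putting those three settings together, a hedged Python sketch of the request assembly might look like this. The URL, token, and content type are placeholders, not the real Google endpoint values; consult the Google API documentation for those:

```python
def build_upload_request(blob_bytes, access_token, upload_url):
    """Assemble the pieces the Download Tool needs: the target URL,
    the Authorization and Content-Type headers, and a POST body taken
    from the Blob field. All values here are illustrative placeholders."""
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/octet-stream",
    }
    return {"url": upload_url, "method": "POST",
            "headers": headers, "body": blob_bytes}

req = build_upload_request(b"id,name\n1,Luke\n", "TOKEN",
                           "https://www.googleapis.com/upload/example")
```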
And after all that I have managed to have success. By combining these two processes the records are uploaded as csv, then imported into a pre-defined Big Query table.
Paul has uploaded this to his Alteryx Gallery page (v10.5) and would love to get feedback from anyone who uses it in its current configuration: how it works as-is, how easy it is to configure, and any ideas on how to improve the tool. He already has ideas to automate the table schema file creation, develop separate uploaders for Big Query and Google Cloud, see if changing the upload query will make it more robust, possibly build in compression, and who knows what else would be useful. Please post your comments here.
Web scraping, the process of extracting information (usually tabulated) from websites, is an extremely useful way to gather web-hosted data that isn't supplied via APIs. In many cases, if the data you are looking for is stand-alone or captured completely on one page (no need for dynamic API queries), it is even faster than developing direct API connections to collect.
With the wealth of data already supplied on websites, easy access to this data can be a great supplement to your analyses, providing context or just the underlying data to ask new questions. Although there are a handful of approaches to web scraping (two detailed on our community, here and here), there are a number of great, free tools online (parsehub and import.io to name a few) that can streamline your web scraping efforts. This article details one approach that I find to be particularly easy: using import.io to create an extractor specific to your desired websites, and integrating calls to them into your workflow via a live query API link they provide through the service. You can do this in a few quick steps:
1. Navigate to their homepage, https://www.import.io/, and “Sign up” in the top right hand corner:
2. Once you’re signed up to use the service, navigate to your dashboard (a link can be found in the same corner of the homepage once logged in) to manage your extractors.
3. Click “New Extractor” in the top left hand corner and paste the URL that contains the data you’re trying to scrape into the “Create Extractor” pop-up. Since fantasy football drafting season is just ahead of us, we’ll use as an example the tabulated data from last year’s top scorers provided by ESPN so you don’t end up like this guy (thank me later). We know our users go hard and the stakes are probably pretty high, so we want to get this right the first time, using an approach that is reproducible enough to supply the requisite information to keep us among the top teams each year.
4. After a few moments, import.io will have scraped all the data from the webpage and display it to you in their “Data view.” Here you can add, remove, or rename columns to the table by selecting elements on the webpage – this is an optional step that can help you refine your dataset before generating your live query API URL for transfer, you can just as easily perform most of these operations in the Designer. For my example, I renamed the columns to reflect the statistic names on ESPN and added the “Misc TD” field that escaped the scraping algorithm.
5. Once your data is ready for import, click the red “Done” button in the top right hand corner. You’ll be redirected back to your dashboard where you can now see the extractor you created in the last step – select this extractor and look for the puzzle piece “Integrate” tab just below the extractor name in your view. You can copy and paste the “Live query API” (there’s also an option to download a CSV file of your data) listed here into a browser window to copy the JSON response that contains your data, or you can implement a call to it directly into your workflow using the Download Tool (just be sure to de-select “Encode URL Text” as you’re specifying the URL field):
That’s it! You should now have an integrated live query API for your webpage, with an extractor that can be leveraged to scrape data from that website if you want to try other pages as well. If you’d like to learn more about the approach, or how to customize it with external scripts, try the import.io community. The sample I used above is attached here in the v10.5 workflow Webscrape.yxmd; you just have to update the live query API with one specific to your account, extractor, and webpage URL. If you decide to give it a try with the example above, be sure to let us know if we helped your fantasy team win big!
Suppose you have a datetime stamp in a dataset for the timezone where you are. This dataset includes data for locations in timezones other than the one you're in, and you want to convert your datetime stamp to reflect the local timezones of the locations in your data.
In the attached example, we'll pretend you're in the Eastern Standard Timezone and your datetime stamps are EST. Your data includes the following locations:
San Mateo, CA
Sydney, NSW, Australia
Toronto, Ontario, Canada
We'll convert your EST datetime stamp to the local timezones of these locations. This solution will use latitude/longitude coordinates to determine the local timezones, so your data will need to have lat/long coordinates for each of your locations.
The solution will use a Google API to find the UTC time offsets. Depending on the date, this could include an adjustment for daylight savings as well. To read more about the API being used, visit https://developers.google.com/maps/documentation/timezone/intro#Introduction. This site will give you information about which parameters are used to make an API call, and what data is returned. This is informational since Alteryx handles the data call and data engineering.
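For reference, the call the macro assembles can be sketched in Python. The coordinates, Unix timestamp, and key below are placeholders:

```python
from urllib.parse import urlencode

def timezone_url(lat, lng, timestamp, api_key):
    """Build a Time Zone API request: a 'lat,lng' location, a Unix
    timestamp (used to decide whether daylight savings applies on that
    date), and your API key."""
    params = {"location": f"{lat},{lng}", "timestamp": timestamp, "key": api_key}
    return "https://maps.googleapis.com/maps/api/timezone/json?" + urlencode(params)

url = timezone_url(37.5630, -122.3255, 1500000000, "YOUR_API_KEY")
```

The JSON response includes the UTC offset and any daylight-savings offset, which is what the macro parses to shift your EST stamps to local time.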
The Google Maps Time Zone API (like most APIs in general) requires some form of authentication. This means you will need to get an API key. It’s free to get a key, but you are limited in the number of calls you can make per day (I believe it’s currently set at 5K; if you are going to have more than 5K lines of data, let me know and we may be able to come up with something else). Go to https://developers.google.com/maps/documentation/timezone/get-api-key#key and look for this:
Click on get a key. A new browser tab will open that looks like this:
Go ahead and click Continue. You’ll see a message in a black box that says ‘Creating a new project’. A new window will open that looks like this:
You can give your key any name you want, but you will need to enter your IP address. To find your IP address, go to https://www.whatismyip.com/. Look for ‘Your IP Address Is:’ and copy/paste your IP. Click Create. It says it may take up to 5 minutes for settings to take effect.
Once you have your API key, copy and save it somewhere like NotePad. You’ll need to enter your API key in the macro configuration window. Once you enter your key and save the workflow, you shouldn’t have to reenter it again.
Your input data looks like this:
Where the DateTimeStamp is in EST. The goal is to change EST to the local time of each location. Attach your input file to the 'UTC to Local Time Converter Example' macro.
Replace 'YOUR_API_KEY' with the API Key you acquired and pasted/saved in NotePad (from above):
The browser results show the results from the conversion:
NOTE: the attached example was developed on Alteryx Designer v10.1.7.1288
The UTC to Local Time Converter Example macro will make the appropriate daylight savings adjustment.
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.
Implementing APIs into macros isn’t a difficult process; you only need to first understand macros and how they’ll interact with your API requests. In short, a macro gives you the ability to encapsulate just about any repeatable process in an Alteryx workflow as a tool. Understanding that, you only need to identify what in your API request needs to change with each row the process/request is repeated for, and how to update that request dynamically.
For each row of your input data stream, you can use fields to reference what the individual values will be; referencing them in a Formula tool builds out the parts of the request that change with each record. If instead you need to update an argument of the request just once for all your records, try using an interface tool and a place-holding value. Need to update parts of a request for only certain records? You can use formula logic or the batch macro’s control parameter approach.
Consider the Google Maps Geocoding API request format below:
If we were to send a request to their API to geocode a single address (specifying an xml output), this would look like:
To update this dynamically, within a macro, we need only to map our input fields to their appropriate places in the request, emulating the rest of the request syntax with formula logic:
(the replace function exchanges spaces for + characters, the remainder of the + characters are added as literal strings to mirror the format above)
Then only updating our key remains before passing this to a Download Tool, and this will be the same for all our input rows:
The v10.5 example above is attached for reference. It is an adaptation of a more robust Google Maps Geocoder hosted on our gallery.
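Outside the macro, the same dynamic request building can be sketched in Python. The address and key are placeholders, and the xml output format matches the request format discussed above:

```python
def geocode_url(address, api_key):
    """Mirror the macro's Formula-tool logic: swap spaces for '+' and
    splice the address and key into the request format."""
    return ("https://maps.googleapis.com/maps/api/geocode/xml?address="
            + address.replace(" ", "+") + "&key=" + api_key)

url = geocode_url("1600 Amphitheatre Parkway, Mountain View, CA", "YOUR_KEY")
```

Run per row (as the macro does), each record's address field produces its own request string for the Download Tool.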
Please note that in order to use this macro, you must first generate a server key from the Google Developers Console. Each key has a limit of 2,500 free requests per day. Click here for more information on usage limits for the Google Maps API.
This macro demonstrates requests to Google Maps' web service API and is meant as a proof of concept only. It is not intended for production use.
To go along with our example on how to download a file from FTP, we’ve assembled steps in v10.1 below (credentials, server removed) as an example of uploading a file to FTP. In this example (attached) I’ve encoded a string field as a Blob to be posted as a text file. Theoretically, all your fields could be concatenated to a CSV format, or another delimited format, to be converted and posted using the same steps:
My field string to be converted:
1. First identify the field to be converted to Blob in your Blob Convert Tool:
2. Specify in a Formula Tool your FTP URL and filename in the format URL/filename.extension:
3. Have your Download Tool use this field as the URL field in the Basic Tab:
4. In the Payload tab specify the HTTP action PUT and select the option “Take Query String/Body from Field” and specify your Blob field:
5. Specify your credentials in the Connection tab of the Download Tool, leave all other configuration options default:
6. Run the workflow!
After running, you should be able to confirm the successful transfer of your file in the DownloadHeader field returned from the Download Tool (it'll also be hosted on your FTP path):
Take a look at the results below:
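Step 2's URL/filename.extension string can be sketched in Python; the server and path below are placeholders for your own FTP details:

```python
def ftp_put_target(server, path, filename, extension):
    """Build the URL/filename.extension string from step 2; the
    server and path are illustrative, not real credentials or hosts."""
    return f"ftp://{server}/{path}/{filename}.{extension}"

target = ftp_put_target("ftp.example.com", "uploads", "output", "txt")
```

This string goes into the Formula Tool field that the Download Tool then uses as its URL for the PUT.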