UPDATE: Upon further review of the customer impact, we are postponing the deprecation of the C# (.NET) SDK until we are able to identify an alternative that satisfies all of the needs of our users and partners. The C# SDK will continue to work as a frontend and backend solution in Designer and Server until further notice.
tl;dr: The C# (.NET) SDK will be deprecated in December 2019. Users currently leveraging the C# SDK for custom tools will need to convert their tools to a different backend (C++, Python, or macro) ahead of its deprecation. As part of this, the UI of the tool will also need to be reconfigured using the HTML GUI SDK, or Interface tools if using a macro.

FAQ

What Is Happening? The C# (.NET) SDK will be deprecated in December 2019. See the 'Next Steps' section for more information.

What is the C# SDK? A way for .NET developers to create their own custom tools for Alteryx. Documentation can be found here: [installdirectory]\Alteryx\APIs\SampleCode\DotNetCustomTools.pdf

Why is this changing? Alteryx is putting more resources into improving the C++ and Python SDKs.
Who's Impacted? Anyone who is leveraging the C# (.NET) SDK.

Who's Not Impacted? Anyone who is not utilizing the C# (.NET) SDK.

What should we use instead? The Python SDK, C++ SDK, or a macro backend (https://help.alteryx.com/developer/current/BuildCustomTools.htm).

Next steps? The C# (.NET) SDK will be deprecated in December 2019. Custom tools utilizing the C# (.NET) SDK might continue to work on Windows, but not all existing functionality will be available, and you should consider converting the backend of your tools to something that will be supported, such as Python, C++, or a macro. The UI of the tool will also need to be reconfigured using the HTML GUI SDK, or a macro (Interface tools) if using a macro. Links to backend options documentation: Python, C++, or macro. Links to frontend options documentation: macro or HTML GUI SDK documentation.

Questions?
Contact email@example.com Alter Everything!
The Dynamic Rename Tool is part of the Developer category of tools. It allows the user to quickly rename any or all fields within an input stream using several different methods.
The user has the option to rename only certain fields, all fields, or even dynamic/unknown fields at runtime (e.g. after a Cross Tab Tool). The options for renaming fields are:
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.
Suppose you have a dataset that will pass through one macro if a condition is true, and an entirely different macro if the condition is false. In the event the condition is true for all records, no records will be sent to the false-side macro. The false-side macro is expecting data and throws an error whenever it doesn't find any. Similarly, when the condition is false for all records, no records will be sent to the true-side macro, and it errors. You need a process that will bypass the macros whenever data is unavailable. Let’s see how you can do that.
First, create data scenarios where all conditions can be tested. In the first scenario, I assigned a field, Sum_Test, with a value of 1 to half of the records and a value of 0 to the other half. A second scenario assigns a value of 1 to all records. A third scenario assigns a value of 0 to all records. Setting up these scenarios will allow me to test all possibilities.
The next step is to filter by the condition. Depending on which data scenario from above you use, data may not exist on either the true or the false side. No data means your workflow will fail. We need a workaround so that doesn't happen.
What happens after the data goes down the true side or the false side is essentially the same in terms of process. The batch macro below is found downstream from the true side as well as the false side, and functionally both copies work the same. The batch macro determines if data is available. If it is, the data is detoured to the Formula tool. If not, the Detour will bypass the Formula tool entirely and keep the workflow from throwing an error.
NOTE: the Formula tool in the illustration below is a representation of any process that requires data if an error is to be avoided. This could be anything, but is usually another macro. For the purpose of this illustration, I'm simply showing a single tool.
What follows are instructions for how each tool in the batch macro/Detour combination is configured.
First, enter ‘Sum_Test’ as the label for the Control Parameter.
Then write an expression in the Condition tool that checks if ‘Sum_Test’ is null.
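The screenshot of the expression itself isn't reproduced here. In the Condition tool, values from connected Interface tools are referenced by connection number, so assuming the Control Parameter is the first connection ([#1]), a check along these lines would work (a sketch, not the macro's exact expression):

```
IsNull([#1]) OR IsEmpty([#1])
```

IsNull and IsEmpty are standard Alteryx formula functions; using both covers the case where the incoming control value is empty text rather than a true null.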
If ‘Sum_Test’ is null (True), then direct the detour tool to go to the right in the ‘T’ side action box.
Similarly, if ‘Sum_Test’ is not null (False), then direct the detour tool to go to the left in the ‘F’ side action box.
Connect both action boxes to a Detour tool. The Detour tool has no configuration.
Every Detour must be stopped by a Detour End tool.
Build a similar batch macro for the false side of the Filter tool.
Union the results from the true and false sides.
I put a Frequency Table tool after the Union to verify the results.
The entire process looks like this:
To verify each scenario, disconnect the input to the Filter tool and connect a different test dataset in turn.
To find the full path and filename of a saved file in Excel, you use the =CELL function.
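For comparison, the minimal Excel version looks like this (the A1 reference is just an illustrative anchor; any cell in a saved workbook works):

```
=CELL("filename", A1)
```

This returns the full path with the workbook name in brackets followed by the sheet name, and it only returns a result after the file has been saved at least once.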
In Alteryx, you use a Field Info Tool to get this information:
The Field Info Tool allows you to see in tabular form the name of fields in a file as well as the field order, field type, and field size.
Name: field names within the file
Type: type of data field
Size: length of a data field
Scale: with respect to fixed decimal data types, scale refers to the digits of precision
Source: contains the full path and filename
Description: may or may not contain information; you can add a description via the Select tool
We're only interested in the Source field and this information will be the same for each field.
Using a Sample Tool, we select just the first record:
Notice the data in Source begins with 'File:'. We don't want that in the final output so we'll use a SUBSTRING function in the expression of a Formula tool to clean it up. Complete the workflow with a Select tool so we only get the Source field:
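The exact expression isn't shown here; assuming the prefix is literally 'File:' (5 characters), a Formula tool expression along these lines would strip it:

```
Substring([Source], 5)
```

Alteryx's Substring is zero-based, and omitting the optional length argument returns the remainder of the string.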
I'll mention here that you can also use the Directory Tool to find the full paths and filenames in a directory.
Select the directory you want to search. File Specification has wildcard characters so you can limit your search to files containing specific character patterns or file types. In the example below, let's set up the File Specification to only return files with the '.xlsx' file extension:
We're only interested in the field 'FullPath' (first column) so we'll use a Select tool to drop the remaining fields.
While the Directory tool returns multiple filenames, it will not contain a worksheet name if the file is an Excel file. To get that information, you'll need to use the Field Info Tool as we did above.
Building out a workflow and find yourself stumped when trying to add needed functionality to your process? While the Designer does an incredible job of packaging just about every operation an analyst could need, sometimes you need that extra mile. We get that. Here at Alteryx we are all about going the extra mile; if we don’t have a tool that explicitly captures the functionality you need, we try to equip you with tools that make it easy to reach that functionality from resources just outside of the Designer environment (see our R Tool, the API-based connector tools, and the Run Command Tool). In this article we’ll go over an introduction to making and using batch files – these make it easy to incorporate command-line-based scripting into your workflows and will help you do just about everything short of feeding your dog.
In short, a batch file is a plain text file that lists a series of commands for the command line interpreter to run in Windows. They’re frequently used to make, remove, rename, move, or even copy directories or files, ping IP addresses, run other programs or services, and manipulate environment variables – and that’s hardly even the tip of the iceberg. There’s a wealth of resources online listing the different batch commands available to you, but the Ben/Peter Parker rule applies to them all: “with great power comes great responsibility.” Please use them responsibly!
To make a batch file, all you need to do is take your desired batch command(s) and write them into a text file:
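The screenshot of the text file isn't reproduced here; matching the description below, its entire contents could be this single line (mkdir happens to be the same command in cmd and most shells):

```shell
mkdir batchfolder
```

Saved as a .bat file, running it creates a folder named "batchfolder" in the current working directory.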
Be sure to take note of the paths being used! This batch command will make a directory named “batchfolder” in whatever directory it is in when run. Before it can be run as a script, however, we have to save it as a .bat file:
In “Save As” change the “Save as type” to “All Files (*.*)” and change the “File name” .txt extension to .bat:
In your working directory you should now see a different icon, extension, and type, describing the file:
How easy was that? If you want, you can test to see if it works by navigating to the directory the batch file resides in and typing its name into the Command Prompt:
You’ll now see the directory we made with the execution of the batch file:
See, I’m not making this stuff up. Now let’s get all this set up to run for us in the Designer. All you need to do is specify the .bat file name in the “Command” configuration option - by default, it will look for this file in the directory the workflow is saved in:
In the attached example, I use the Run Command Tool as an input, reading a test file (specified in “Read Results”) before writing the file to the new directory made by executing the .bat file.
If the API that you are working with requires you to sign or authenticate your requests, it may utilize an implementation of OAuth 2.0 or another authentication method to show that you have the access needed to consume the web service. There are some key words that you can look for in the API documentation that you are using that will help you quickly choose the appropriate grant flow to use in Alteryx.
Now you are ready to take your workflow one step further and add some error checking. If something is wrong with your request or something goes wrong on the server, an API will usually return an error message. It is useful to capture this information and clearly display it to the user of the connector. The Message tool will come in handy here. It will help you to troubleshoot the error and actually stop the workflow from processing if you would like.
Alteryx can use the Run Command Tool to run PowerShell scripts and perform any PowerShell-specific commands. Note that in order to run PowerShell scripts you must make sure you have scripting enabled. You should consult with your IT department to see if you are allowed to enable this functionality.
Sometimes clients have asked how they can re-input the same Excel file that they have just output into the same workflow. Normally, their gut instinct is to open up a new workflow and start fresh with the updated file. However, this can be a bit cumbersome, especially if they want to do this multiple times, or for those wanting to do some sort of logging process in an app. Luckily, there is a quick and easy trick for this.
Attached to this article is a workflow that I go through below...
Now, the initial step might differ depending on whether or not you already have a file that you want to bring in. If you are using a regular Input tool, make sure you obtain the file path by selecting “Full Path” in the “Output File Name as Field” drop-down. If you are using a Text Input tool, make sure you have a field specifying the full path. This will be the same file path where we will output the data and bring it back in. The file paths must be consistent throughout the workflow.
After your data blending and preparation is complete and you are ready to output the file, place a Block Until Done tool at the end of your stream. Make sure its first output is connected to an Output tool with your specified settings and the proper file path. Next, place a Dynamic Input tool, make sure it is reading the field with the file path, and change the action drop-down to “Change Entire File Path”. For the “Input Data Source Template” box, I just reference the same file that I am going to write to.
And that is how you output a file then re-input it within the same workflow.
How do I pass a parameter/constant in a workflow?
You can use a workflow constant to pass a parameter. Click on the Workflow - Configuration - Workflow tab. Click the + button to add a new User type constant, and give your constant a name and value!
It is quite common to hardcode values (‘constants’) in workflows using any of the following ways:
Text input tool
Variables within formula tool
Hardcode the values within formula/any other transformation tools
If the values are hardcoded within tools, especially in very large or complex workflows, maintenance can be difficult. The ‘constants’ property allows us to define all static variables in one place, so that they are accessible throughout the entire workflow.
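Once defined, a user constant can be referenced from any expression-capable tool. Assuming a constant named TaxRate was added on the Workflow tab and a field named Sales exists in the stream (both names are hypothetical), a Formula tool expression could use it like this:

```
[User.TaxRate] * [Sales]
```

Changing the constant's value in the Workflow tab then updates every expression that references it, with no tool-by-tool edits.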
Can you wait X seconds between processing each row in Alteryx?
Yes! Thanks to Inviso for creating the Inviso Macro Pack and posting it on their blog here.
The "Wait a Second" macro lets you wait X number of seconds before processing each row in the dataset.
One application is contacting an API with multiple requests. The WaitASecond macro can pause between requests long enough to process multiple rows without issue.
It can also be used to scrape sites without putting heavy loads on their servers; for an Inviso sample of scraping the Alteryx Community, see Insights to the Alteryx Community.
As you can see, the part of the flow that runs through the WaitASecond tool gets NOW timestamps that are 5 seconds apart, whereas the bottom stream, which does not run through the WaitASecond tool, gets the same timestamp on every row.
There are essentially two macros:
The first one assigns a unique ID to each record and then uses that ID for the batch macro.
The batch macro has a Run Command tool that runs a ping that waits x seconds before timing out (against 220.127.116.11; if that IP exists in your network, this won’t work).
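The macro's exact command isn't shown; a common cmd pattern that uses ping's reply timeout as a delay (assuming the target IP is unreachable, here waiting roughly 5 seconds) looks like this sketch:

```
ping -n 1 -w 5000 220.127.116.11 > nul
```

The -w flag is the per-reply timeout in milliseconds, so the command blocks for that long only while the host stays silent; if the IP actually responds, ping returns immediately and no delay happens, which is why an IP that exists on your network defeats the trick.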
The macro can be downloaded here (InvisoMacros.zip).
API connections give access to many web-based applications, database systems, or programs by exposing objects or actions to a developer in an abstracted format that can easily be integrated into another program.
When working with SPSS, values can have both a text label and a numeric representation of the categories (the equivalent of string factors in R). Columns can also have an encoded name, e.g. Q_1, and a longer descriptive name that maps Q_1 to the original question that was asked (the closest thing in R is the data frame attribute on the column).
Alteryx reads .sav files and loads either the numeric representation or the textual representation of values based on the user’s selection. It also reads the variable labels into the Alteryx Field Description. When writing .sav output, Alteryx will write either the text or the numeric values (depending on what was used in the workflow) as well as the SPSS variable labels which were displayed in the description field. However, sometimes to maintain the integrity of the whole SPSS file, clients will want the value labels, value levels, and variable labels to appear in the output file. For these cases, using the Alteryx tools and a few lines of R code (while leveraging the thousands of R packages on CRAN) wrapped in a macro gives us the needed functionality. Attached is a macro which will write the data, variable, and value labels back into SPSS.
In this section, we will explain the R code logic that is specific to this macro. You can get an introduction to writing custom R code in Alteryx here.
Before we can do anything, we will need to pass the data to the tools inside the macro (more details on macros here). The raw numeric data should be connected to the D input of the macro. In addition, and because the data frames created in R don’t contain the Field Description data, we need to pass Field Description values to the M input (M for Metadata) of the macro. We’re using the Field Info Tool to extract the description into the rows and send it to the macro. With that done we can now look inside the macro.
Inside the Macro
Inside the macro, we are using the R Tool to contain the main functionality. We connect it to the Interface tools (Macro Inputs, File Browse, Action Tool) to get the data, metadata, and output file path from the user. Finally, we’re using the API tool to pass a message back to the user in the Alteryx Execution Messages.
The R Tool contains the code to properly format the input data and write it out to the .sav file. The majority of the work is already done for us in the 'sjmisc' package from CRAN (R users know that most of the time they can find a package that does what they want). This package, among other features, implements reading and writing .sav files with both variable and value labels. We first check whether the package is already installed and, if not, attempt to install it.
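A common R idiom for that install-if-missing check is the following (a sketch; the macro's actual code may differ):

```r
# Install sjmisc from CRAN only if it is not already available,
# then load it for use by the rest of the script.
if (!requireNamespace("sjmisc", quietly = TRUE)) {
  install.packages("sjmisc")
}
library(sjmisc)
```

requireNamespace avoids attaching the package just to test for it, so the check stays side-effect free when the package is already present.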
This workflow installs the additional sjmisc package. To avoid package version and dependency issues, it is possible to use Microsoft R Client as the base R. More details here.
If you would like to install the package separately you can use the R install packages App from the Alteryx Gallery.
filePath <- "c:\\temp\\outputRAlteryx.sav"
Within the R code we also define a static 'filePath' pointing to where the output data should be written. The Action Tool that we previously mentioned will update this file path to the one chosen by the user, while at the same time correctly escaping the backslashes to work in R.
inputData <- read.Alteryx("#1", mode="data.frame")
ColumnLabels <- as.vector(read.Alteryx("#2", mode="data.frame")$Description)
We then read the data that we pass to the macro from input ‘#1’ into an R data frame. In this case we are depending on R’s default behavior of converting text to factors to get the value encodings for all columns, e.g. Male (1), Female (2). In addition, we read the metadata from input ‘#2’ into R. The sjmisc function set_label, which applies the variable labels to the data frame, expects them to be passed as an object of type vector. To make it work, we have to convert the Description column of the data frame we’re reading in into a vector with the as.vector() base R function. For more details about ‘sjmisc’, you can find the documentation here.
labeledData <- sjmisc::set_label(inputData,ColumnLabels)
Finally, we label inputData with the labels we just read, store the result in the labeledData data frame, and then write it to the user’s file path using sjmisc’s write_spss function.
MessageOut <- paste("file written to: ",filePath)
names(MessageOut) <- "Output File Path"
We also pass the filepath as a message to the R Tool output to be displayed to the user.
Edit: It was brought to our attention that the macro has an issue writing out text columns that are longer than 120 characters. Unfortunately this is a defect in the underlying R package. As a workaround for now, the macro was modified to trim all text fields to 120 characters. Please keep this in mind when writing out data.
Mandatory Note: This macro and sample code were developed by the authors as a proof of concept to show what's possible. This is not a production-ready macro and is not supported by Alteryx. Do ask questions on this thread - BUT use at your own risk!
WriteSPSSWithLabels_sjlabelled.yxzp has been updated to use the R package sjlabelled instead of sjmisc, because the set_label function has been deprecated in sjmisc and now lives in sjlabelled.
As currently designed, the Amazon S3 Download tool only allows one file, or object, to be read in at a time. This article explains how to create a workflow and batch macro that will read in the list of objects in a bucket and allow you to filter for the file(s) you want using wildcards!
Have you ever wanted to restrict the fields that flow through your workflow based on their data type? What about only allowing fields that begin with the same word or are in a specific position? The Dynamic Select tool is just what you’re looking for!