Alteryx Knowledge Base

Get started: helpful articles, sample modules, and apps!

Most of the Alteryx advanced analytics capabilities - including most of the tools in the Predictive, AB Testing, Time Series, Predictive Grouping, and Prescriptive categories - are built as R-based macros under the hood. If there is a piece of functionality you are looking for that is missing from Alteryx but available in R, and you have modest R coding abilities, you can extend Alteryx by creating your own R-based Alteryx tool.

The macro creation process involves four steps, each covered by its own guide in this series:

1. Find and install an appropriate R package to provide the needed functionality.
2. Develop an Alteryx workflow that makes use of the relevant R functions via an R tool. This workflow becomes the basis of the macro.
3. Create a macro that provides the basic functionality you want, and test it in a new workflow.
4. Polish the macro by documenting it, giving it the ability to generate a report, and making other refinements.

The various Alteryx files created in this tutorial are attached to this post. Once you've created the new tool, don't forget to share it with the wider community by publishing it to the Alteryx Analytics Gallery.

Background

Recently I have been working with an existing customer that is considering expanding the use of Alteryx within their organization to other groups. Some of those groups focus on developing predictive analytics models, and their members currently use a number of different software products. As a result, there are certain features they use regularly in those products that are not available "out of the box" in Alteryx. While these features are heavily used by some members of this group, they are not as widely used in general. A trade-off we face in developing Alteryx is providing generally needed functionality without inflating the number of available tools to the point where their sheer number overwhelms new Alteryx users.

In a number of instances we have developed new tools at the request of customers to address their needs, provided them with the tools immediately, and then folded them into a subsequent release of the product or published them to the Predictive District on the Alteryx Analytics Gallery. A case in point is the MB Affinity tool, which was part of the 10.0 release of Alteryx. The MB Affinity tool provides cosine similarity/distance measures for items, a common method used in creating recommendation systems of the "people who bought this item also bought" variety.

Getting back to the issue faced by this customer's predictive analytics team: one feature of another product they currently use, which is not pre-packaged in Alteryx, is a tool that examines the importance of potential numeric predictors for a categorical target field using an entropy-based measure known as information gain or Kullback–Leibler divergence. In this series, I illustrate how to create an Alteryx macro that provides this measure.
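To make the measure concrete, below is a minimal R sketch (not part of the macro, and not how FSelector implements it) that computes information gain by hand for one predictor: the entropy of the target minus the entropy of the target conditional on a discretized version of the predictor. The entropy() helper and the quartile binning are my own illustrative choices; the FSelector package used later in this series wraps this kind of calculation behind a simple formula interface.

# Entropy of a categorical variable, in bits
entropy <- function(x) {
  p <- prop.table(table(x))
  p <- p[p > 0]
  -sum(p * log2(p))
}

# Illustration with R's built-in iris data: one numeric predictor,
# discretized into quartile bins, scored against the categorical target
target    <- iris$Species
predictor <- cut(iris$Petal.Length, breaks = quantile(iris$Petal.Length), include.lowest = TRUE)

# Weighted entropy of the target within each predictor bin
cond_entropy <- sum(sapply(split(target, predictor),
                           function(y) length(y) / length(target) * entropy(y)))

# Information gain = H(target) - H(target | predictor)
entropy(target) - cond_entropy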
View full article
Alteryx has recently released (Feb 2017) a new Google Analytics Connector. You can download it here.

Here is an overview of the new GA tool in Alteryx Designer 11.0:

Connecting to Google Analytics is becoming more and more popular. There are a few things you need in order to use the Google Analytics macro:

- A Google Account (e.g., Gmail)
- Authorized access to an existing Google Analytics account

Step 1: Set up a Google Analytics account

Please visit the Google Analytics webpage and sign in: https://www.google.com/analytics/

On the landing page for Google Analytics you will need to add the Account Name, Website Name, and website URL. Once you have entered this information, click 'Get Tracking ID' to generate a Tracking Code for the website you would like to collect information on. Generating this code takes you to the Google Analytics home page. Creating the Tracking ID also creates a Profile ID and/or View ID for the associated website URLs, which are used behind the scenes by the Google Analytics macro within Alteryx. To find this information, click on the Admin tab on the Google Analytics home page and navigate to 'Property Settings' and 'View Settings' to see the Tracking ID and the Profile ID/View ID, respectively.

Step 2: Set up the Client ID, Client Secret, and Refresh Token needed for the Google Analytics macro

Go to the Google developers page: https://developers.google.com/

Navigate to the 'Google API Console' (this can be found at the bottom of the developers.google.com page), then click on the 'Analytics API' link.

On the landing page for the Analytics API, press the 'Enable' button. Once it is enabled, the button should change to 'Disable'. Congratulations! You have registered your application by creating a project.

Step 3: Generate your Client ID and Client Secret

Within the API Manager you should see a 'Create credentials' option. Click 'OAuth Client ID' when the drop-down menu appears. On the next page, make sure the Application type selection is Web application. This will generate additional required information below. You can leave the Name as Web Client 1, but change the Authorized redirect URIs (second option under restrictions) to: https://developers.google.com/oauthplayground

Hit Create and wait a few seconds for Google to create your new project.

Acquire your Refresh Token

In another tab in your web browser, navigate to Google's OAuth Playground: https://developers.google.com/oauthplayground

1. Once on the landing page, first click the cog (gear) icon near the upper-right corner of the page.
2. Check the box Use your own OAuth credentials and make sure Access type is set to Offline.
3. Paste your Client ID into the 'OAuth Client ID' field and your Client Secret into the 'OAuth Client Secret' field, then hit Close.
4. In the sidebar on the left, under Step 1 Select & authorize APIs, scroll down to the Google Analytics API v3. Click the little grey triangle on the left and select https://www.googleapis.com/auth/analytics.readonly
5. Hit Authorize APIs (you will be directed to another page). When prompted, hit Allow. You will be redirected back to the OAuth 2.0 Playground.
6. Once you've been redirected back to the OAuth 2.0 Playground, hit the Exchange authorization code for tokens button.

Your Refresh Token will be contained in a JSON object towards the bottom of the Request / Response section.
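For the curious, here is a rough R sketch (using the httr package, outside of Alteryx) of what these three credentials are ultimately for: a connector following Google's standard OAuth 2.0 flow exchanges the long-lived refresh token for a short-lived access token before querying the API. This is purely illustrative, not the macro's own code, and the token endpoint URL and placeholder values are assumptions on my part.

library(httr)

# Hypothetical placeholders - substitute the values you generated above
client_id     <- "YOUR_CLIENT_ID.apps.googleusercontent.com"
client_secret <- "YOUR_CLIENT_SECRET"
refresh_token <- "YOUR_REFRESH_TOKEN"

# Standard OAuth 2.0 refresh-token grant: trade the refresh token for an access token
resp <- POST(
  "https://oauth2.googleapis.com/token",   # assumed Google OAuth 2.0 token endpoint
  body = list(
    client_id     = client_id,
    client_secret = client_secret,
    refresh_token = refresh_token,
    grant_type    = "refresh_token"
  ),
  encode = "form"
)
access_token <- content(resp)$access_token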
Copy the refresh token string from that JSON response and save it in the same location as your Client ID and Client Secret. Congratulations!! You now have all the pieces you need to use the Alteryx Google Analytics Connector!!

Step 4: Let's move to Alteryx!

The Google Analytics macro can be downloaded here and found in the Connectors tab. Once you have the macro on the canvas you will have two login options:

- Online Login: This will take you to your Google login and will automatically create a Client ID, Secret, and Refresh Token.
- Offline Login: This will allow you to enter your Client ID, Secret, and Token. This process is recommended for those scheduling the GA tool.

Once logged in you will be prompted to select an available Account, Web Property, and Profile. Each configuration window will prompt you to press Next to move to the next screen. The GA tool will then let you select the Date, Metrics & Goals, and Dimensions & Segments, and gives you a summary view of your selections. Once you see this summary, press Run and you will see your data.

Tips and Tricks

- Check out the S datastream output - it contains summary information with each run, and in this case shows all of the account/property/view combinations that are associated with the credentials and loaded in. The D stream will show the data from Google Analytics.
- When querying custom metrics, dimensions, or segments, only certain combinations are allowed. The GA tool will notify you of this.
- Every query requires you to select a profile, at least one metric, and a date range. Everything else is optional (for a rough sketch of the underlying API request this maps to, see the code after the troubleshooting section below).

Click on Spoiler to see all error messages and troubleshooting tips!

Common Issues (Prior to Version 3)

The refresh token lasts about an hour, so please remember to refresh it; an expired token will cause errors. Please repeat the 'Acquire your Refresh Token' steps above to refresh the token, and remember to add your Client ID and Secret into the OAuth credentials before authorizing the API!

'The Field 'id' is not contained in the record…' (Upgrade to the latest GA tool)

This error message can allude to a number of issues. However, to limit trial and error I have prioritized the solutions below based on prior troubleshooting experience (I know, I'm awesome).

1) Please request 'Full control' or, at minimum, 'read & write' permissions to the supporting macros folder. For admin installs this folder can be found in C:\Program Files\Alteryx\bin\RuntimeData\Macros\Supporting_Macros (relative to where you installed Alteryx). For non-admin installs this folder can be found in C:\Users\{USERNAME}\AppData\Local\Alteryx\bin\RuntimeData\Macros\Supporting_Macros (relative to your user name; AppData is a hidden folder, so you may need to turn on hidden folders). The reason is outlined in more detail below, but in short, the GA connector reads and writes the files necessary for the API connection in these locations. If we do not have write permissions, the API connection will fail and give us the error above. ****DISCLAIMER - YOU WILL NEED TO GET YOUR IT DEPARTMENT'S PERMISSION TO DO THIS, unless you have admin rights to your machine.

2) Your refresh token has expired. Please follow the 'Acquire your Refresh Token' steps above, and remember to add your Client Secret and ID into the OAuth credentials section on the right-hand side before authorizing the API.

3) Are you inside your company firewall? If you have tried the above solutions and neither of them worked, please try outside of your company firewall (with your IT department's blessing, of course). I have not seen many instances of this, but it did resolve the issue in a few cases.
'Tool #349: Tool #4 Error Transferring data: Failure when receiving data from the peer' (Upgrade to the latest GA tool)

Check the Profile ID. This can also be a firewall issue; check with IT whether they are blocking transactions from Google Analytics to the user.

'Tool #574: Tool #522: Error creating the file "C:\Program Files\Alteryx\bin\RuntimeData\Macros\Supporting_Macros\GoogleAnalytics.DIMENSIONS.xml": Access is denied.' (Upgrade to the latest GA tool)
'Tool #574: Tool #706: Error creating the file "C:\Program Files\Alteryx\bin\RuntimeData\Macros\Supporting_Macros\GoogleAnalytics.PROFILES.xml": Access is denied.' (Upgrade to the latest GA tool)

The Google Analytics macro depends on deleting and updating four XML files within Program Files: Profiles, Dimensions, Segments, and Metrics. Currently a command line window flashes for a second at the start of running in Update mode - this is to circumvent an access limitation (because the installation files are located in Program Files, they cannot be overwritten, but they can be deleted and written anew). This allows the Metrics, Dimensions, and Segments XML files to be updated. Sometimes, due to internal settings, these files cannot be accessed and you may get an 'Access Denied' error (referenced above). If so, you can contact your IT department to grant you permissions to this file location. Alternatively, you can run Alteryx as Administrator, which may give you the elevated privileges needed to write to this location from Alteryx (right-click on the Alteryx icon and choose 'Run as Administrator').

'You have set up the Google Analytics Connector Tool and want to know the number of records produced in the data output.' (Upgrade to the latest GA tool)

The macro aggregates the data across the entire time period, grouped by dimension. If you don't choose any segments in the query, there will be one row. If segments are selected, you'll get multiple rows back for each possible value of the dimensions. If you want to retrieve one record per day, the best way is to set up a little batch or iterative macro to loop through a collection of dates.

'You have a Client ID, Client Secret, and a Refresh Token, but no results are returned.' (Upgrade to the latest GA tool)

Make sure you have installed a Google tracking code on the target website. If not, this is something the web developer would have to do. For more information please look here: https://support.google.com/analytics/answer/1008080?hl=en

'Receiving a createRecord: A record was created with no fields error.' (Upgrade to the latest GA tool)

Try right-clicking on the Alteryx Designer icon and choosing 'Run as Administrator'. This can give elevated permissions to access the Dimensions, Profiles, and Metrics files in Program Files.

'Unknown Variable' (Upgrade to the latest GA tool)

This error can appear when you enter the Client ID, Secret, and Refresh Token. Do not fear - just configure the search tab and this error will be removed once you run the GA connector. Please refer back through the steps above, as you have more than likely missed a step in the configuration. This error has also appeared when the 'Analytics API' has not been enabled.

'Could not find file…' (Upgrade to the latest GA tool)

Please check the 'Reset to default' option and run the Google Analytics Connector. This will re-write the four XML files into Program Files.
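One last look under the hood before wrapping up: the reason every query needs a profile (view), at least one metric, and a date range is that these are the required parameters of the Core Reporting API v3 request the connector builds from your selections. The httr sketch below is an illustration of that kind of request - it is not the macro's own code, and the placeholder view ID and access token are assumptions; the connector then aggregates the returned rows as described above.

library(httr)

# Hypothetical values: the View (Profile) ID from the GA Admin page and an
# access token obtained via the refresh-token exchange sketched earlier
view_id      <- "ga:12345678"
access_token <- "ACCESS_TOKEN"

# Profile (ids), at least one metric, and a date range are required;
# dimensions and segments are optional
resp <- GET(
  "https://www.googleapis.com/analytics/v3/data/ga",
  query = list(
    ids          = view_id,
    `start-date` = "2017-01-01",
    `end-date`   = "2017-01-31",
    metrics      = "ga:sessions,ga:users",
    dimensions   = "ga:date",
    access_token = access_token
  )
)
ga_rows <- content(resp)$rows   # one row per returned dimension combination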
CONGRATS - you have now made it through the world's longest but most informative Google Analytics article (in my opinion). Now go free and play with your Google Analytics data in Alteryx!

However, if you continue to have problems with the Google Analytics connector, please reach out to Alteryx Support.

Best,

Jordan Barker
Client Service Representative
View full article
This post is part of the "Guide to Creating Your Own R-Based Macro" series.

In the case of this macro, polishing involves several elements: adding error checking, adding a reporting capability to the macro, documenting the workflow of the macro, making the connections for the interface tools "wireless", and providing the macro with its own icon. In terms of error checking, a lot of what typically would need to be addressed is handled by Alteryx's interface tools, which limit user input to only appropriate data types. There are two other possible user input errors for this macro: the user may neglect to select any potential predictor fields, or may not select any of the three entropy measures.

Adding error checking involves adding Error Message tools and connecting them to the appropriate interface tools (in this case the List Box tool used to select predictors, and the three Check Box tools used to select importance measures). The Error Message tools determine whether predictors and/or predictor importance measures have been selected by the user, and return an error message if one or both of them has not been provided. Examining the Error Message tools in the macro accompanying the introductory post in this series is the best way to see how this is done.

Adding a report requires both some additional lines of code in the R tool and some additional tools on the macro's canvas. One construct that is very common in the R code used in the predictive macros packaged with Alteryx is the use of what are really key-value pair tables. A common one is often labeled grp_out in the R code, and contains the fields grp (the labels) and out (the values). It is used to bundle report elements together in a way that allows them to be easily sent from R to Alteryx, and easily manipulated within Alteryx to create a report. To assist in accomplishing these objectives, there are R "helper functions" included in the AlteryxRDataX package (an open source R package that is part of the Alteryx predictive installer) to quickly format data in a convenient way. In addition, there is a set of "supporting macros" that help format the data from R quickly in Alteryx. Often these tools help address differences in the "dimensionality" of outputs. In this case, we want to include the name of the target field that is the focus of the analysis in the report, which is only a single data item. In contrast, the number of reported measures depends on both the set of potential predictors specified by the user (which needs to be one or more) and the set of measures to report (which can range from one to three).

Ultimately, R data frames (or certain types of lists) are passed as tables to Alteryx, so all the data elements need to have the same number of fields when passed as a single data frame/table. The use of the grp_out table allows this to be accomplished. To make things more concrete, the target field name is passed in the first row of the R data frame, with its label, "target field", in the "grp" field and the actual name contained in the "out" field. The header row and the data rows of the table require a bit more processing. Each row of values is converted into a string, which consists of a (numerically rounded) set of table values in a pipe ("|") delimited string. There is an R helper function to accomplish this, named matrixPipeDelim.
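If you do not have the AlteryxRDataX package in front of you, the following minimal sketch shows what a pipe-delimiting helper along the lines of matrixPipeDelim might do; the function below is my own illustrative stand-in (the rounding precision is an assumption), not the actual helper.

# Illustrative stand-in: round each row of a numeric matrix and collapse it
# into a single pipe-delimited string, one string per row
pipe_delim_rows <- function(m, digits = 6) {
  apply(round(m, digits), 1, paste, collapse = "|")
}

# Example: a 3 x 2 matrix of measures becomes three "value|value" strings
pipe_delim_rows(matrix(c(0.123456789, 0.2, 0.3, 0.4, 0.5, 0.6), nrow = 3))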
The pipe-delimited string is the value of "out" for each row of the table, and the label contained in the "grp" field is "table header" for the table header and "table data" for each row (potential predictor) of the data. The code used in the macro's R tool to create the grp_out table is given below, and shows the use of the matrixPipeDelim function as part of a character string manipulation operation:

## Create a grp_out key-value pair table for creating a report
## using Alteryx's reporting tools
# The grp (key) field
grp <- c("target field", "table header", rep("table data", length(names(the_data)) - 1))
# The out (value) field
out <- paste(as.character(the_output[[1]]), matrixPipeDelim(as.matrix(the_output[-1])), sep = "|")
out <- c(the_target, paste(names(the_output), collapse = "|"), out)
grp_out <- data.frame(grp, out)
write.Alteryx(grp_out, nOutput = 2)

The portion of the macro that creates the report is shown in Figure 1. The Filter tool makes use of the "grp" field to split the target name from the table of importance weight measures. The name of the target variable is sent to a Report Text tool to create a report title, while the table of importance weight measures is sent to the Pipe to Table Rows supporting macro to convert the pipe-delimited strings into an actual table. The Pipe to Table Rows macro is located in the directory C:\Program Files\Alteryx\bin\RuntimeData\Macros in a standard Alteryx installation.

There are three ways comments can be inserted into an Alteryx macro to document the underlying process: the Text Comment tool, the Tool Container tool, and the annotation capability of standard Alteryx tools. The ways in which these tools are used reflect personal taste to some extent. Personally, I'm inclined to make use of the annotation capability of Alteryx tools, since I can restructure a workflow without having to move a number of Text Comment tools around as well. Other people make heavy use of Text Comment tools. To get to the annotation panel of a standard Alteryx tool, press the "pencil" icon in the tool's configuration window, as shown in Figure 2.

By right-clicking on the input and output nodes of the interface tools, a context menu appears that allows the user to make the connection to or from that tool wireless.

The final bit of polishing is giving the macro a new icon. By default, the icon a new macro receives is a blue circle. You can either use this or other generic icons provided with Alteryx, create a completely new icon from scratch, or use clip art images from the Internet or other sources. Ideally, the icon has some connection to the tool. In this case, we are creating a macro to provide entropy-based measures, so something that conveys entropy seems like a good choice. An image that has always conveyed entropy to me is the artist Salvador Dali's melting pocket watches from the painting The Persistence of Memory. After a bit of a search, I found an image of a melting pocket watch that works well as an icon. To use the icon, go to the Interface Designer for the macro and press the wrench icon; from there, you can change the macro's icon, as shown in Figure 3.

The workflow of the completed macro is shown in Figure 4.

It took me about one hour and 20 minutes to create the base version of the macro, and a little over an hour to polish it, for a total time of around two and a half hours from start to finish.
View full article
This post is part of the "Guide to Creating Your Own R-Based Macro" series.

The workflow is now ready to be converted into a macro. To do this, click on the canvas, then on the Workflow tab of the Properties window click the Macro radio button to convert the workflow into a Standard Macro. At this point you will want to use the drop-down menu option File > Save As... to save the file in yxmc format. My original workflow was saved to the file Entropy_Importance.yxmd, and I saved the macro to the file Entropy_Importance.yxmc.

We are now ready to add the user interface elements to the macro, and make several other changes. Figure 1 shows the final version of the basic macro.

As the figure suggests, the major changes are the addition of a number of interface tools, the conversion of the Text Input tool into a Macro Input tool, and the replacement of the original workflow's Browse tool with a Macro Output tool. All of these tools fall under the Interface tool group. Chad Martin wrote an excellent blog post, around the time of the 9.0 release, that provides an overview of what these tools provide and how to work with them, so reading that post will likely be helpful if you have not yet used these tools.

I won't go into great detail, but I do want to give an overview of what is going on with the interface tools in the macro. Starting from the top left of the canvas, the first interface tool is a Drop Down tool that allows the user to select the target variable for the analysis. It is configured to only allow string type fields (which are converted to categorical variables in R) to be selected. The Action tool that it connects to modifies the upper Select tool to filter out all fields except the target field.

Moving to the right, the List Box tool allows the user to select a set of predictors. Within the tool's configuration, only numeric variables (the various integer, float, fixed decimal, and double types) are allowed to appear in the user interface. The Action tool associated with it modifies the lower Select tool based on the user's selection.

The final three tools as you move to the right in the canvas are Check Box tools which, if checked, indicate whether a particular measure will be calculated. As you may have guessed, the macro itself will provide not only the information gain measure, but also the option of including the gain ratio and symmetrical uncertainty entropy-based measures as well.

Given the above, the code within the R tool (provided below) has gone through some alterations to allow for this additional functionality.
In addition, the code example also illustrates how the user's input to the Check Box tools can be used as "question constants" in an R tool's code:

# Load the FSelector package
suppressWarnings(library(FSelector))
# Read in the data from Alteryx into R
the_data <- read.Alteryx("#1")
# Create a string of the potential predictors separated by plus signs
the_preds <- paste(names(the_data)[-1], collapse = " + ")
# Get the name of the target field
the_target <- names(the_data)[1]
# Create a formula expression from the names of the target and predictors
the_form <- as.formula(paste(the_target, the_preds, sep = " ~ "))
# Initialize the output data frame
the_output <- data.frame(Field = names(the_data[-1]))
col_names <- "Field"
# Calculate the entropy based measure(s) selected by the user
# via the "question constants"
if ('%Question.info.gain%' == "True") {
    out <- information.gain(the_form, the_data)
    the_output <- cbind(the_output, out[[1]])
    col_names <- c(col_names, "Information Gain")
}
if ('%Question.gain.ratio%' == "True") {
    out <- gain.ratio(the_form, the_data)
    the_output <- cbind(the_output, out[[1]])
    col_names <- c(col_names, "Gain Ratio")
}
if ('%Question.symm.uncertainty%' == "True") {
    out <- symmetrical.uncertainty(the_form, the_data)
    the_output <- cbind(the_output, out[[1]])
    col_names <- c(col_names, "Symmetrical Uncertainty")
}
# Prepare the final output
names(the_output) <- col_names
# Output the results
write.Alteryx(the_output)

It is now time to test whether the basic macro works as expected in a workflow using different data. For the test workflow I decided to work with the Bank Marketing dataset from the UC Irvine Machine Learning Archive. The full dataset was used, which comes in CSV file format, so the Auto Field tool was used to set appropriate field types. In addition, one of the predictor fields (pdays) is the number of days since a prospective customer was previously contacted with an offer to invest in a term savings account; those who were never contacted for this product were given a code of -1. Given this, the data is separated into those who have, and those who have not, received a past telemarketing offer for a term savings account using a Filter tool. Finally, the basic macro was inserted into the workflow twice (by right-clicking on the canvas and inserting the macro each time) and used against both of the data streams coming from the Filter tool, with a Browse tool attached to each. The completed version of the test workflow is shown in Figure 2.

Frequently, things will work as expected in the workflow contained in the macro, but not when the macro is used in a new workflow; the test workflow should allow you to find any major errors in your macro.
View full article
This post is part of the "Guide to Creating Your Own R-Based Macro" series.

Now that we have the needed R packages installed, we can use them in an Alteryx workflow. The real purpose of this workflow is to begin to put together the macro itself. As a result, there will be some minor differences between this workflow and the one you would likely create if you didn't plan on using it as the basis of a macro. The starting workflow of the macro is shown in Figure 1.

The data used in this macro (contained in a Text Input tool) is Fisher's well-known Iris dataset. The data consists of the length and width of both the petals and sepals of individuals from three species of the Iris flower family. In this instance we want to know how important these four measures are in determining the species to which a particular flower belongs. While this dataset is pretty far afield from a business application, it is a nice dataset to work with for creating this macro since it is small (150 rows and five fields), and represents the correct case (a categorical target, species, and numeric predictors, the length and width measurements).

The basic workflow consists of only six tools. A Text Input tool contains the Iris data, which feeds into two Select tools. The upper Select tool selects out the target field (the field Species), while the lower one selects the potential predictor fields to be examined. The downstream Join tool is used to bring the data back together so that the first column contains the target, and the subsequent columns contain the potential predictors to be examined.

This combination of three tools would be somewhat out of place in a standard (non-macro) workflow. In general, column position does not matter; moreover, even if it did, a single Select tool could be used to alter column position. However, in this case we will alter the position of columns based on the user's choices in the final macro's user interface, and the use of two Select tools allows us to accomplish this task.

The data flowing into the R tool now consists of only the target field (the first column) and the selected numeric predictors in the remaining columns. The R tool contains the following lines of code:

# Load the FSelector package
suppressWarnings(library(FSelector))
# Read in the data from Alteryx into R
the_data <- read.Alteryx("#1")
# Create a string of the potential predictors separated by plus signs
the_preds <- paste(names(the_data)[-1], collapse = " + ")
# Get the name of the target field
the_target <- names(the_data)[1]
# Create a formula expression from the names of the target and predictors
the_form <- as.formula(paste(the_target, the_preds, sep = " ~ "))
# Get the information gain measures
out1 <- information.gain(the_form, the_data)
# Prepare the results for output
out <- data.frame(a = names(the_data)[-1], b = out1[[1]])
names(out) <- c("Field", "Information Gain")
# Output the results
write.Alteryx(out)

The R code is fairly straightforward, with the possible exception of how the locations of values are indexed. For example, the code snippet names(the_data)[-1] takes all the provided field names except the first one (the [-1] index), which is the target field. The code snippet out1[[1]] obtains the first (and only) column of the data frame returned by the information.gain R function.

The contents of the Browse tool (the sixth and last tool in the workflow) are the results of the analysis.
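If you would like to sanity-check the R portion outside of Alteryx, a minimal sketch using R's built-in iris data (the same dataset, reordered so the target is in the first column to mirror the Join tool's output) looks like this:

library(FSelector)

# Put the target (Species) first, mirroring the column order produced by the
# two Select tools and the Join tool in the workflow
the_data <- iris[, c("Species", "Sepal.Length", "Sepal.Width",
                     "Petal.Length", "Petal.Width")]

the_preds <- paste(names(the_data)[-1], collapse = " + ")
the_form  <- as.formula(paste(names(the_data)[1], the_preds, sep = " ~ "))

# One information gain value per potential predictor
information.gain(the_form, the_data)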
View full article
This post is part of the "Guide to Creating Your Own R-Based Macro" series.

There are two major repositories of R packages: CRAN (the Comprehensive R Archive Network) and Bioconductor. The Bioconductor repository has over 1,000 packages, which are focused specifically on bioinformatics-related applications, while CRAN does not focus on a specific application area and has over 6,000 contributed packages. In general, the functionality you will want to bring to Alteryx via R will be from a package on the CRAN repository.

With over 6,000 packages, searching for a CRAN package with specific functionality by browsing through the contents of the CRAN repository is not very practical. The two ways I recommend finding a relevant package are either looking at the appropriate "Task View" (a description of available packages that address a particular application area) or doing a web search on the feature you are hoping to obtain, coupled with the addition of "R" to the search string.

For this macro, I used the web search approach, and entered the search string "entropy information gain R" into my preferred search engine. The first hit on this search was a link to the CRAN package FSelector. Examining the documentation of this package revealed that it delivers the desired functionality through a function called information.gain, one of three entropy-based measures the package provides (the other two are the gain ratio and symmetrical uncertainty). All three of these functions take as arguments a formula of the form

target ~ predictor1 + predictor2 + ... + predictorN

and an R data frame (R's equivalent of a data table) containing the data. The output of each of these functions is a data frame that contains a single column with the value of the selected measure, with one row for each of the predictor fields. The predictor field names are contained in the row.names metadata element of the data frame. We will make use of this information in creating an Alteryx macro to wrap this functionality.

The FSelector package provides exactly what we need, so it is time to install the package. There are a number of ways to install an R package so that it can be used with Alteryx. The one complication that can arise is on machines where multiple copies of R are installed. For users not using Microsoft R, the Alteryx predictive installer places the R executables within the Alteryx installation (usually C:\Program Files\Alteryx). To make sure you are installing packages into the version of R that Alteryx is using, open a command prompt and enter the command

"C:\Program Files\Alteryx\R-3.3.2\bin\x64\Rgui.exe"

making sure to use the quotes. This will bring up the R console program. In the console window, type the command

install.packages("FSelector")

This will bring up a GUI asking you to select a CRAN mirror from which to download the package and its dependencies (there are several). Select a mirror that is geographically close to you for best performance. In addition, the FSelector package makes use of several other packages that call Java, so you also need to have a JVM installed on your computer to create and use this macro (I'd recommend the Windows x64 Offline version available here).

Once R is done downloading and installing the packages, make sure that FSelector and all its dependencies were correctly installed. To do this, in the R console enter the command

library(FSelector)

This will cause R to load the FSelector package.
If you did get an error message that some packages were not available (one possibility is the RWekajars package), install them using the install.packages command in the R Console. Once the needed packages have been installed, you can exit the R Console program.
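As a quick sanity check that the package landed in the library used by Alteryx's copy of R, you can also run the following in the same R console before closing it (a minimal sketch; the exact library path you see will depend on your installation):

# Which library paths is this R instance using?
.libPaths()

# Is FSelector installed there, and which version?
"FSelector" %in% rownames(installed.packages())
packageVersion("FSelector")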
View full article
We continue to improve the user connection experience and hope the enhanced Input Data Tool will allow users to connect and manage their data much more easily.
View full article
Installing Alteryx Server? Everything you ever need to know, and then some.
View full article
With all the bells and whistles to play around with in the Reporting Tool Category, it’s hard not to leave some out of your reporting workflows every now and again. Just don’t forget about the Report Text Tool – the tool that’ll help you painlessly add text to your reporting objects, presentations, or documents to help spruce up their readability or formatting.
View full article
As Alteryx analysts, we're whipping up insight at blazing fast speeds. Workflow after workflow, tool after tool, we're gleaning functional understanding from inert webs of data that empowers us to make better decisions. Good insight is only as good as it is shareable, however, and to enable better sharing, any Alteryx analyst can take advantage of Workflow Dependencies to simplify input or output path dependencies in shared workflows.
View full article
As long as you know where to look, data has all the answers. Sometimes, though, those answers aren’t clear as day. More often than not, they need to be communicated in an effective format - a format that can let the data talk and highlight the important motifs for you. Another favorite of the Reporting Tool Category, the Charting Tool can do just that by adding expressive visuals to any report or presentation. Offering an exhaustive list of charts to choose from (area, stacked area, column, stacked column, bar, stacked bar, line, tornado, pareto, box and whisker, scatter, bubble, polar, radar, pie), the Charting Tool will give you the ability to add descriptive visuals, with legends and even watermarks, to your reporting workflows that will help you find the answers in your data.
View full article
There are a few important things to note when using the Trade Area Labelling Macro. First, the Trade Area tool should have the distances in descending order to avoid having the largest polygon overlap the smaller ones on the map. Within the macro's properties you can select the Spatial Object, the Label Type (miles, kilometers, minutes), and whether you want the labels to display horizontally to the right of the point or vertically above it. The result of the macro is a label field and a spatial object field that contains the point to be labeled. Within the Map tool, configure the points so that they are size 0 (meaning they won't appear in the final map), and make sure to choose the appropriate label field so that the labels appear on the map.
View full article
The Auto Field Tool: a tool so easy you don't have to do anything - just put it on your canvas and voilà: automatically optimized data types. If you're running into data type related issues and errors in your workflows, or just looking to add some speed or reduce the disk space your data is hoarding - look no further than the Preparation Tool Category's Auto Field Tool, which reads through all the records of an input and sets each field type to the smallest possible size relative to the data contained within the column.
View full article
Believe it or not, data can be beautiful. Take your black and white data points and add some color to them with the suite of tools found in the Reporting Category (https://help.alteryx.com/current/index.htm#Getting_Started/AllTools.htm#Report_Presentation_Tools)! If you're looking to create reports, presentations, images, or simply output data with a bang, you can use the Render Tool (https://help.alteryx.com/current/PortfolioComposerRender.htm) paired with other Reporting Tools to create HTML files (*.html), Composer files (*.pcxml), PDF documents (*.pdf), RTF documents (*.rtf), Word documents (*.docx), Excel documents (*.xlsx), MHTML files (*.mht), PowerPoint presentations (*.pptx), PNG images (*.png), and even Zip files (*.zip) - packed with formatting and visual aesthetics that'll make any data-geek's mouth water.
View full article
When you're frequently writing and rewriting data to Excel spreadsheets that you use for Excel graphs and charts, it can quickly become a hassle to make and remake your reporting objects to keep them up-to-date so you're visualizing the most recent data. A best practice exists to keep the hassle out of the process, though! If you keep your plots isolated to their own sheet, referencing cell values in another sheet used to capture your data, you can simply overwrite the source data sheet and your plots will update automatically upon launching Excel. In the example below (attached in the v10.6 workflow Dynamically Update Reporting from Excel Spreadsheets.yxzp) we've included the workaround to make your Excel outputs seamless.
View full article
Is there a workaround for not being able to use the Folder Browse Tool in the Gallery? Though it may not be as clean as being able to use the Folder Browse Tool, the simple workaround for this is to use the Text Box Interface Tool instead. This will allow the user to copy a directory path from Windows Explorer and paste it into the Text Box. In the workflow, all you need to do is connect the Text Box Tool to an Output Data Tool and have the Action Tool update the path portion of the Output Data Tool. You can even enter in a default path in the Default Text section of the Text Box if there is a path that is most commonly used.
View full article
The Input Data Tool is where it all starts in the Designer. Sure, you can bring in webscraped or API data with the Download Tool (master it here) and our prebuilt Connector Tools, but the tool that makes it a breeze to grab data from your most used file formats and databases is the Input Data Tool.
View full article
When it comes to spatial analyses, few tools come up more than the Trade Area Tool. Whether you’re looking to pad polygons around your spatial objects in distance or drive time, you won’t need to make a trade-off - just the Trade Area Tool.
View full article
Any time you want to get a good point across, it's best to show your data. Show your data off in style in your reports or presentations by adding formatting to otherwise bland data with the Table Tool! Found in the Reporting Tool Category, the Table Tool makes it easy to add flair to your raw data and give it the pop it needs to really sink in.
View full article
A picture is worth a thousand words, right? Save your breath and snap a picture to supplement your analyses and reports with the Image Tool, the camera-icon tool residing next to all your other reporting needs in the Reporting Tool Category. Whether you're looking to build a presentation or report from scratch, or simply add graphics to accentuate your raw data - this tool will make it a breeze to access image files from disk, store image files in physical workflows, or dynamically access image files (even in Blob format!) to pair with any Alteryx output.
View full article