Alteryx Knowledge Base

Definitive answers from Designer experts.

The Alteryx Gallery is full of interesting and useful macros which provide 'out of the box' solutions to a lot of use cases! With well over 1,000 macros available, which ones do you find most useful?
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros. In this part, we demonstrate how to read in multiple files with different schemas using a Batch Macro.
View full article
Fact: workflows are the best. Look it up. They’re all about getting things done and, with hundreds of tools and the ability to integrate external processes, there’s no shortage of things you can get done. We know that there are some areas of analytics that require a little extra firepower, however, and that’s why you can leverage your workflows in apps and macros for added functionality.
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.

If you’ve ever built an analytical app and used the Interface Designer (View >> Interface Designer), you’ve probably spent some time in the Test View. This menu provides a great runtime view of the interface as you’re adding and configuring tools, and also lets you interact with them, much like you would when selecting “Run As An Analytical App” in your Designer. Here you can clear (Reset), save (Save, as a .yxwv analytical app value file), reopen (Open, to search for your .yxwv files), and investigate the XML capture of your test values (View; these will initialize to your specified defaults), but the real value is in using the “Open Debug” button to open your app debug workflow.

Open Debug creates a new module with the workflow that would result from executing all the actions of your interface tools (individual values, tools, even XML can be updated). You can also see these values, along with an actions log, in a comment box preceding the tools themselves. The workflow will even show you errors if your interface tools created any after updates! This comes in handy as you’re updating detours, opening and closing tool containers, and performing complex updates to your workflows via interface tools, because it gives you a snapshot of what, exactly, is happening with each set of Test View values and, in effect, at runtime.

For example, with the default app values in the Test View (attached as v10.5 App Debugger.yxwz), opening the debug workflow (attached as v10.5 Debug Workflow Default Values.yxmd) shows that our row value is still “Test” and our detour defaults to the right. If you update the values in the Test View and reopen the debug workflow, you’ll see in the new workflow (attached as v10.5 Debug Workflow New Values.yxmd) that our row value is now updated to “DebugTest” and our detour no longer goes to the right.

This is a simple example of the power of the tool, but using it more often in your troubleshooting will help pinpoint where your errors or conflicts are arising, freeing more time for you to build out more apps!
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.

The Detour Tool and its counterpart, the Detour End Tool, come in handy when building custom Analytical Apps and macro workflows where you want to turn entire sections of the workflow on or off based on a user input. While handy, there is an alternative approach: encapsulate the sections you’d like to turn on or off in Tool Containers, then use a Radio Button (or other Interface Tool) with the “Enable/Disable Container from Condition” action. Other action types are also useful if you’d like to add more logic to the enable/disable approach. As long as you join the outputs of each Tool Container with a Union Tool, none of your data streams is required to output records, successfully completing your bypass!

Attached is a short v10.5 example of the approach, using Radio Buttons and the “Update Value with Formula” action to update the “Disabled” attribute of Tool Containers.
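As an illustration of what that action is updating, below is a minimal Python sketch that toggles a Tool Container's Disabled setting by editing the workflow XML directly. The file name and the Disabled element layout are assumptions for illustration, not a documented Alteryx API.

# Hedged sketch: flip every Tool Container's "Disabled" value in a
# workflow's XML. Paths and element names are illustrative assumptions.
import xml.etree.ElementTree as ET

tree = ET.parse("example_workflow.yxmd")  # hypothetical workflow file
for node in tree.getroot().iter("Disabled"):
    node.set("value", "True")  # "True" bypasses the container's tools
tree.write("example_workflow_disabled.yxmd")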
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.

Implementing APIs in macros isn’t a difficult process; you only need to first understand macros and how they’ll interact with your API requests. In short, a macro gives you the ability to encapsulate just about any repeatable process in an Alteryx workflow as a tool. Understanding that, you only need to identify what in your API request will need to change with each row the process/request is being repeated for, and how to update this request dynamically. For each row of your input data stream, you can use fields to reference what the individual values will be; doing so in a Formula tool will build out the parts of the request that change with each record. If instead you need to update an argument of the request just once for all your records, try using an Interface tool and a place-holding value. Need to update parts of a request for only certain records? You can use formula logic or the batch macro’s Control Parameter approach.

Consider the Google Maps Geocoding API request format. If we were to send a request to their API to geocode a single address (specifying an XML output), it would look like:

https://maps.googleapis.com/maps/api/geocode/xml?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&key=YOUR_API_KEY

To update this dynamically within a macro, we need only map our input fields to their appropriate places in the request, emulating the rest of the request syntax with formula logic:

"https://maps.googleapis.com/maps/api/geocode/xml?"+"address="+replace([Address]," ","+")+"+"+[City]+"+"+[State]+"+"+[Zip]+"&key="+"X"

(The replace function exchanges spaces for + characters; the remaining + characters are added as literal strings to mirror the format above.) Then only updating our key remains before passing this to a Download Tool, and this will be the same for all our input rows.

The v10.5 example above is attached for reference. It is an adaptation of a more robust Google Maps Geocoder hosted on our Gallery.

Please note that in order to use this macro, you must first generate a server key from the Google Developers Console. Each key has a limit of 2,500 free requests per day. See Google's documentation for more information on usage limits for the Google Maps API.

This macro demonstrates requests to Google Maps' web service API and is meant as a proof of concept only. It is not intended for production use.
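For readers who want to see the same request logic outside Alteryx, here is a minimal Python sketch of the formula above. The geocode_url helper is a hypothetical name and YOUR_API_KEY is a placeholder; one request is made per input row, just as the macro does per record.

# Build the same request the formula builds: spaces become "+" and the
# address components are joined with literal "+" characters.
from urllib.request import urlopen

def geocode_url(address, city, state, zip_code, api_key):
    parts = "+".join([address.replace(" ", "+"), city, state, zip_code])
    return ("https://maps.googleapis.com/maps/api/geocode/xml?"
            "address=" + parts + "&key=" + api_key)

url = geocode_url("1600 Amphitheatre Parkway", "Mountain View", "CA",
                  "94043", "YOUR_API_KEY")
xml_response = urlopen(url).read()  # the Download Tool's role, one row at a time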
View full article
Question: Can Alteryx run a PowerShell script or perform any PowerShell-specific commands?

Answer: Yes! We can use the Run Command tool to do exactly that.

Note: In order to run PowerShell scripts, you must make sure you have scripting enabled. You should consult with your IT department to see if you are allowed to enable this functionality.

Below is an example made in 10.6 demonstrating the necessary Run Command tool configuration. The command will just be “powershell” to enable PowerShell mode in cmd.exe. Then your command arguments should be the path where the script is located so it can run.

In this particular case I want to read the results of my script into the Designer, so I specify the file being written as the Read Results. My “helloworld.ps1” script contains only the line below:

"Hello World" | Out-File c:\temp\test1.txt

As you can see, this kicks off in the Designer and opens the script output file to continue downstream, successfully implementing PowerShell scripting.
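For reference, here is a rough Python equivalent of what this Run Command configuration does; the subprocess call illustrates the mechanics, not how Alteryx itself invokes the command.

# Command: "powershell"; command arguments: the script path.
import subprocess

subprocess.run(["powershell", r"c:\temp\helloworld.ps1"], check=True)

# "Read Results": Out-File writes UTF-16 ("Unicode") by default in
# Windows PowerShell, so read the results file with that encoding.
with open(r"c:\temp\test1.txt", encoding="utf-16") as f:
    print(f.read())  # Hello World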
View full article
This article is part of the Client Services Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.
View full article
Are you tired of your boring old workflow? Just sitting there in your Designer, slacking off?
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.

When should you use the Group By function of the batch macro versus the Control Parameter tool, or even both? In most cases we recommend using Group By with “simple” batch macros and the Control Parameter with more “complex” batch macros, but I want to be a little more specific and give a few scenarios where you would use one over the other, or even both.

Group By:

As mentioned above, Group By can be used in simpler batch macros. For instance, suppose your data set contains store data and date data. In the batch macro, you want to process each store individually so that the date data does not overlap across stores. In this case, creating the batch macro will allow you to process each store individually. In order to use Group By, change the workflow configuration to Batch Macro and add Macro Input and Macro Output Interface tools. This is enough to create the batch macro and enable the Group By function. When you deploy the macro, you will see an inverted question mark (¿) anchor: this is the Control Group By field, the data that you are going to use to group by. In our example, this would be the list of stores. A good rule is to use either a Unique tool or a Summarize tool just before the macro to create a list in which each variable appears only once for the Control Group list. The other input will be your data.

In the Input Group By field, choose the field in your data that contains the same groups as the Control Group. You can think of it as a join. If you are pulling your Control Group from the same data set, it will more than likely be the same field.

The Group By function is used in situations where you want to take groups of data and process those groups one at a time, i.e., where you are not using the group to change other tools within the macro, but want to pass each group through and process it one at a time (see the sketch below for the idea in miniature).

Control Parameter:

The Control Parameter Interface tool allows the user to update specific tools within the batch macro with the Control Group. If certain tools within your macro need to be configured by the Control Group - whether it’s a certain formula, filter, or report - you can use the Control Parameter and the subsequent Action tool (which appears once connected) to update the tool with the Control Group data. Generally, the rule of thumb is that if you need to batch a tool within the macro, you will need a Control Parameter to update that tool.

A great example of using the Control Parameter can be found in your sample data: Help > Sample Workflow > Macro Samples > Batch Macro Sample Workflow.

When to use both:

The best case for using both Group By and the Control Parameter is when you want to batch a large group through the macro, but then also batch data within a smaller group of the larger group. For instance, say you want to do some calculations for a region of stores; then, within that region, you want to do calculations by ZIP code. You can use Group By to select your region field and then use the Control Parameter to update your tools with the ZIP codes.

This macro would first look for the region group specified in the group list and then move to batching the data by ZIP code as it runs through the macro.

The sample attached was completed in 10.6.8.17850.
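The sketch below illustrates the Group By idea outside Alteryx, using pandas as a stand-in (an assumption for illustration; Alteryx does not run macros this way). The macro body runs once per group, so one store's dates never mix with another's.

# One "batch" per store: the loop body stands in for the batch macro.
import pandas as pd

df = pd.DataFrame({
    "Store": ["A", "A", "B", "B"],
    "Date":  ["2016-01-01", "2016-01-02", "2016-01-01", "2016-01-02"],
    "Sales": [100, 120, 80, 95],
})

results = []
for store, batch in df.groupby("Store"):  # the Control Group By list
    batch = batch.assign(RunningTotal=batch["Sales"].cumsum())
    results.append(batch)

print(pd.concat(results))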
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.

Many macros need to be especially dynamic, allowing the user to select which field to use in a very complex process. Take the Make Grid tool, for example. It asks the user to specify a spatial object field and carries only that field through its calculations. What comes out of the tool are two new fields, GridName and Grid, and none of the original fields at all.

I set out to build a macro just like this tool, except to generate a hexagonal grid. I started by building a normal workflow that could do this process, and when I was ready to convert it to a macro, I realized that I wasn't sure of the best way to enable it to choose this field dynamically.

There are two main ways to get data into your macro. Here's a quick summary of how they work:

Field Maps

The Macro Input tool has a checkbox in its configuration that reads Show Field Map.

If this is unchecked, your macro won't do anything with the data - it will just stream in everything as is and trust that the stuff inside knows how to account for anything you throw at it.

If it is checked, your macro will create drop-down menus in its configuration window that ask for the fields you have present in the Template Input. These drop-down menus let you select which fields to stream into the macro in place of the ones in its template.

The field map needs all those drop-downs to be filled out for it to do its thing, but if you want to make one of these inputs optional, just add (Optional) to the field name in your macro template.

Advantages:
1. Easy to set up! One checkbox and your template is all that's needed.
2. Makes sure only mapped fields enter the macro. This is good when converting a workflow to a macro because you don't need to worry about every form the input data stream could be in. If your stream has other fields, they will get tacked on to the stuff coming out of the macro.

Drop Down

Drop Down menus are an alternative way to bring fields into your macro that offers a bit more control of the process. They're particularly useful when connected to the anchor of a Macro Input tool.

You can then update a Select tool with the selected field to choose which field is being passed along.

Advantages:
1. Allows you to specify which fields to show to the user from a list of field types. (In this example, I am only allowing spatial objects.)
2. You can have a default selection populate the interface. (Here, any field starting with "SpatialObj" gets selected automatically in the configuration of the macro.)
3. If you want something to be optional, you can use the [None] option.

"The Select Tool Trick"

If you make use of the Drop Down tool to bring in your data, you'll need to update a Select tool. Here's a little trick that will make converting your workflows a lot easier.

First, uncheck *Unknown in the Select tool, since this will bring in every field not explicitly unchecked here. Then, have only the field you're selecting for checked, and navigate over to your Action tool and point it at the selected field.

Instead of repeating this for every tool using this field, just have the field renamed in the Select tool, and refer to it by that name in all your downstream tools.
This turned out to be just what I needed for the Make Hex Grid macro, where I have a ton of stuff happening downstream and I only wanted one field to get through my Select tool. Check out the example for a simplified version of this. (Created in version 10.1)
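Here is the renaming idea from "The Select Tool Trick" expressed in Python with pandas (an illustrative assumption, not how the macro runs): rename the user-chosen field to one fixed internal name so everything downstream refers to that single name.

# run_macro and the "SpatialObj" internal name are hypothetical.
import pandas as pd

def run_macro(df: pd.DataFrame, chosen_field: str) -> pd.DataFrame:
    # The Drop Down supplies chosen_field; the Select tool keeps only
    # that column and renames it to the internal name.
    work = df[[chosen_field]].rename(columns={chosen_field: "SpatialObj"})
    # Downstream steps reference "SpatialObj" regardless of the original name.
    work["GridName"] = "hex_" + work.index.astype(str)
    return work

print(run_macro(pd.DataFrame({"geom": ["POINT(0 0)"], "name": ["a"]}), "geom"))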
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.

So let’s say you have a dataset and you want to keep the order of the fields the same as it goes through your data blending or analytical process. Most of the time you may not have to worry about field order changing. But suppose you have to split the data up for some reason and join it back together. Normally, your instinct would be to use a Select tool to manually reorder the fields. But if you have 100 fields, this could be a very time-consuming and annoying process. And if you had a macro that requires these field orders to be updated dynamically, you wouldn’t be able to use a Select tool at all.

Luckily, there is a neat little trick you can do using the Sample and Union tools.

Let’s take a look at a simple example. Here we have a dataset with four fields. We select two fields which, let’s say, hypothetically undergo some change but keep the same field names.

Now we join them back by record position in the original dataset using the Join tool. Alteryx will by default rename any fields coming in the “Right” input with a “Right_” prefix if those fields share the same name as any fields in the “Left” input.

Since you want the new fields to overwrite your old fields, make sure the original dataset goes into the “Right” input while your modified fields go into the “Left” input. The main reason for this is that in the Join tool you can deselect all duplicates automatically by going to Options > Deselect Duplicate Fields.

Note: If you wanted to do this in a batch macro, you would attach a Dynamic Select tool after the Join tool. Then select “Select via a Formula” in the drop-down and type !StartsWith([Name],"Right") in the “Expression” box to dynamically deselect duplicate fields.

Now that your fields are updated, you notice that your data is not in the order it was originally.

To get this data back into its previous order, attach a Sample tool to the original dataset and select “First N Records” with N = 0. This will give you the field headers in the proper original order.

Connect the output of this first, and then the output of your Join, to a Union tool and select "Auto Config by Name". The order of the connections matters: the connection coming from the Sample tool must be the first connection to the Union tool.

Now your fields will be back in their original positions!

If new fields were added before the Join, the Union tool will default to putting those fields at the end.

The example below was built in Alteryx version 10.1.
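The same trick translates naturally outside Alteryx; the pandas sketch below (an illustrative assumption) captures the original column order up front, the way the N=0 Sample captures the headers, and restores it after the join.

# Capture the original column order, then restore it after the join.
import pandas as pd

original = pd.DataFrame({"A": [1], "B": [2], "C": [3], "D": [4]})
modified = original[["B", "D"]] * 10          # the fields that changed
joined = modified.join(original[["A", "C"]])  # order is now B, D, A, C

header_order = list(original.columns)         # the N=0 "header only" sample
extras = [c for c in joined.columns if c not in header_order]
restored = joined[header_order + extras]      # new fields land at the end

print(restored.columns.tolist())  # ['A', 'B', 'C', 'D']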
View full article
This article is part of the CS Macro Development Series. The goal of this series is to communicate tips, tricks, and the thought process that goes into developing good, dynamic macros.

Suppose you have a dataset that will pass through one macro if a condition is true, and an entirely different macro if the condition is false. In the event the condition is true for all records, no records will be sent to the false-side macro. The false-side macro is expecting data and throws an error whenever it doesn't find any. Similarly, in a case where the condition is false for all records, no records will be sent to the true-side macro, and it errors. You need a process that will bypass the macros whenever data is unavailable. Let’s see how you can do that.

First, create data scenarios where all conditions can be tested. In the first scenario, I assigned a field, Sum_Test, a value of 1 for half of the records and a value of 0 for the other half. A second scenario assigns a value of 1 to all records. A third scenario assigns a value of 0 to all records. Setting up these scenarios will allow me to test all possibilities.

The next step is to filter by the condition. Depending on which data scenario from above you use, data may not exist on the true or the false side. No data means your workflow will fail. We need a workaround so that doesn't happen.

What happens after the data goes down the true side or the false side is essentially the same in terms of process. The batch macro below is found downstream from the true side as well as the false side, and functionally they work the same. The batch macro determines whether data is available. If there is, data is detoured to the Formula tool. If not, the detour bypasses the Formula tool entirely and keeps the workflow from throwing an error.

NOTE: the Formula tool in the illustration below is a stand-in for any process that requires data if an error is to be avoided. This could be anything, usually another macro. For the purpose of this illustration, I'm simply showing a single tool.

What follows are instructions for how each tool in the batch macro/detour combination is configured.

First, enter ‘Sum_Test’ as the label for the Control Parameter.

Then write an expression in the Condition tool that checks whether ‘Sum_Test’ is null.

If ‘Sum_Test’ is null (True), direct the Detour tool to go to the right in the ‘T’ side Action box.

Similarly, if ‘Sum_Test’ is not null (False), direct the Detour tool to go to the left in the ‘F’ side Action box.

Connect both Action boxes to a Detour tool. The Detour tool has no configuration. Every Detour must be stopped by a Detour End tool.

Build a similar batch macro for the false side of the Filter tool. Union the results from the true and false sides.

I put a Frequency Table tool after the Union to verify the results.

To test all the various conditions, disconnect the input to the Filter tool and connect a new test scenario.

Example workflow developed in Alteryx Designer v10.1.
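The bypass logic itself is simple to state outside Alteryx; the Python sketch below (an illustration, not the macro) shows the core idea: only run the fragile process when records actually arrive on that side.

# fragile_process stands in for any tool or macro that errors on empty input.
import pandas as pd

def fragile_process(df: pd.DataFrame) -> pd.DataFrame:
    if df.empty:
        raise ValueError("no records received")
    return df.assign(Sum_Test=df["Sum_Test"] * 2)

def with_bypass(df: pd.DataFrame) -> pd.DataFrame:
    # The detour: skip the process entirely when no data arrived.
    return df if df.empty else fragile_process(df)

all_true_side = pd.DataFrame({"Sum_Test": []})  # nothing reached this side
print(with_bypass(all_true_side))               # bypasses without error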
View full article