
Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!

Featured Ideas

When working on a complex, branching workflow, I sometimes go down paths that do not give the correct result but that I want to keep, as they are helpful for determining the correct path. I do not want these branches to run, since they slow down the workflow and may produce errors or warnings that muddy debugging. These paths can be several tools long and are not easily put in a container and disabled. Similar to the Cache and Run Workflow feature that prevents upstream tools from refreshing, I am suggesting a Disable All Downstream Tools feature. In the workflow below, all the tools in the container could be disabled by right-clicking the first Sample tool in the container.
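
A disable action like this would just need to walk the connection graph. Below is a minimal sketch, assuming the standard .yxmd XML layout (Connection elements with Origin/Destination children); the file name and starting ToolID are hypothetical.

```python
# Sketch: find every tool downstream of a given ToolID in a .yxmd file,
# assuming <Connection> elements with <Origin> and <Destination> children.
import xml.etree.ElementTree as ET
from collections import defaultdict, deque

def downstream_tools(yxmd_path: str, start_tool_id: str) -> set:
    tree = ET.parse(yxmd_path)
    graph = defaultdict(list)  # origin ToolID -> destination ToolIDs
    for conn in tree.iter("Connection"):
        origin, dest = conn.find("Origin"), conn.find("Destination")
        if origin is not None and dest is not None:
            graph[origin.get("ToolID")].append(dest.get("ToolID"))
    seen, queue = set(), deque([start_tool_id])
    while queue:  # breadth-first search from the chosen tool
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen  # everything a "Disable all Downstream Tools" action would cover

print(downstream_tools("workflow.yxmd", "5"))  # hypothetical file and ToolID
```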

 


 


I'm dealing with a database that is not always up when my production workflow runs. When the database is down, the workflow errors out because the Input Data connection fails. I want the workflow to gracefully execute some other steps if the database is down. I need the Input Data tool to allow errors to be treated as warnings, via a checkbox, so I can have Alteryx run different logic when 0 rows are returned.
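
Until a checkbox like this exists, the closest approximation is probably a Python tool that probes the database itself. A rough sketch, with a hypothetical DSN and query:

```python
# Sketch: treat a down database as 0 rows instead of a fatal error.
# The DSN and query are hypothetical placeholders.
import pandas as pd
import pyodbc

def read_or_empty(conn_str: str, query: str) -> pd.DataFrame:
    try:
        with pyodbc.connect(conn_str, timeout=10) as conn:
            cur = conn.execute(query)
            cols = [d[0] for d in cur.description]
            return pd.DataFrame.from_records(cur.fetchall(), columns=cols)
    except pyodbc.Error as err:
        print(f"Warning (not error): database unavailable: {err}")
        return pd.DataFrame()  # 0 rows -> downstream logic can branch on this

df = read_or_empty("DSN=prod_db", "SELECT * FROM daily_orders")
if df.empty:
    print("Running fallback steps...")  # the 'other steps' the idea mentions
```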

Consider, for a moment, Standard Macros as old-school Subroutines in which you would have a library of Subroutines that could be invoked from numerous code sets.  Each Subroutine could have any number of arguments, and when the Subroutine is invoked, the calling code provides the arguments and their values to the Subroutine.

 

You can do this in Alteryx - but with a very large but.  The source field names being passed to the Standard Macro have to be the same field names the Standard Macro is expecting.  To make the Standard Macros more "library friendly" - allow the calling workflow to alias fields in the dataset for the sole purpose of sending them to the Macro.

 

Example:  Standard Macro that returns a Vendor ID based on a Location and Item Number.

Macro Input: Location ID, Item Number

Calling Workflow has: Purchasing Location and Item Code

 

The Macro on the calling workflow would have a mapping:

Data Set Object         Macro Input Object
-------------------     ------------------
Purchasing Location     Location ID
Item Code               Item Number
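
For illustration, here is roughly what that aliasing amounts to, sketched in Python with pandas: the caller renames fields only on the copy it hands to the shared routine. The field names come from the example above; the lookup itself is a placeholder.

```python
# Sketch: alias caller fields to the names the "macro" expects, without
# renaming them in the caller's own dataset.
import pandas as pd

ALIAS = {"Purchasing Location": "Location ID", "Item Code": "Item Number"}

def vendor_macro(macro_input: pd.DataFrame) -> pd.DataFrame:
    # Stand-in for the Standard Macro: expects these exact field names.
    assert {"Location ID", "Item Number"} <= set(macro_input.columns)
    return macro_input.assign(**{"Vendor ID": "V-001"})  # placeholder lookup

caller = pd.DataFrame({"Purchasing Location": ["NYC"], "Item Code": ["A-100"]})
# Rename only the copy passed in; `caller` keeps its original field names.
result = vendor_macro(caller.rename(columns=ALIAS))
```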

SOOOOoooooo many times it'd be great to just dictate the character length/count (fixed width) for the parse (just like you can in Excel), instead of being constrained by a delimiter or obligated to create potentially complex RegEx. Ideally you could go into the column and insert a <break> (multiple times if needed) after the character where you'd like the parse to occur. Anything past the last <break> would be included in the final parsed section/field.

You could also make it a little less visual and just type the character count you want for each column. To really enhance this idea, you could include the ability to name the fields and prescribe the data types. Those would just be gravy on the meat of the idea, however, which is: provide the ability to parse by fixed-length fields.
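
For comparison, a minimal sketch of the parse being requested; the break positions and sample line are hypothetical.

```python
# Sketch: parse by fixed character counts, mimicking the <break> idea above.
breaks = [10, 15]  # a <break> after character 10 and after character 15

def parse_fixed(line: str, breaks: list) -> list:
    cuts = [0, *breaks, len(line)]  # everything past the last break is kept
    return [line[a:b].rstrip() for a, b in zip(cuts, cuts[1:])]

print(parse_fixed("WIDGET-XL 00042BLUE", breaks))
# -> ['WIDGET-XL', '00042', 'BLUE']
```

(For whole files, pandas.read_fwf already takes a widths list plus optional column names and dtypes, which would cover the "gravy" part too.)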

 

 


 

Dear UX Usual Suspects,

 

I've created a video for you to observe the idea.

 

 

With 400+ fields available, I find it challenging, when validating my formula output, to look at the "referenced" fields of data plus the new data fields. It would be oh-so-nice to press a button and look only at the "valuable" data.

 

How about you? Do you want a little of this idea, @Hollingsworth @T_Willins @Aguisande @NicoleJ?

 

Cheers,

 

Mark

...and now for probably the most trivial request in a long time, but also one of the most annoying things (for me, anyway)...

When viewing a Browse window, it's so darn awesome to be able to sort and search. However, it would be even awesomeer (yes, I just made up a word) if, when you actually conduct a sort or search, you could make your selection (for sorts) or type in your criteria (for searches) and simply press the Enter key to do the same thing that clicking "Apply" with the mouse does. This is common Windows functionality and I think should be easy to implement.

In Powerpoint, you can right-click on a picture and replace it with a different picture without losing formatting.

 

Similar functionality would be useful for replacing custom macros.

  • I would like to be able to swap an old version of a custom macro for a new version in situ, without losing the connections to other tools, the interface tools, or its location in a container.

Currently, the only option is to insert the new custom macro and then reset all incoming and outgoing connections. Some downstream tools (e.g., Cross Tab) lose their existing settings, and those have to be reset too. On complicated workflows, this can introduce silent errors.

 

This capability would be especially helpful for team coding, where different team members are revising different modules for a parent workflow.

 

Currently:

Right-clicking on the canvas shows Insert > Macro > (choose from list of open macros)

Right-clicking on an existing macro shows Open Macro

 

New functionality:

Right-clicking on an existing macro shows Replace/Change Macro > (choose from list of open macros)
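
Under the hood this could be a small edit, since workflows are XML. A minimal sketch, assuming macro nodes carry an EngineSettings element with a Macro attribute (as in the .yxmd format); the file and macro names are hypothetical:

```python
# Sketch: swap a macro reference in place so every connection stays intact.
import xml.etree.ElementTree as ET

def replace_macro(yxmd_path: str, old_macro: str, new_macro: str) -> None:
    tree = ET.parse(yxmd_path)
    for engine in tree.iter("EngineSettings"):
        if engine.get("Macro") == old_macro:
            engine.set("Macro", new_macro)  # connections are untouched
    tree.write(yxmd_path, encoding="utf-8", xml_declaration=True)

replace_macro("parent_workflow.yxmd", "VendorLookup_v1.yxmc", "VendorLookup_v2.yxmc")
```

This only stays safe when the new macro keeps the same anchors and interface questions, which is presumably what a built-in Replace Macro option would need to validate.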

 

 

A typical macro does the same job every time. I therefore want it to have the same annotation each time.

I want it to have a default annotation that I save in the Interface Designer. This annotation will be shown on the canvas whenever the macro is added.


We are experiencing performance issues with fetching schema/table/column info in Alteryx Designer when using a Vertica database.

 

From troubleshooting with Alteryx support, the query hitting "odbc_columns" is contributing to the performance issue. Our Vertica DBA suggests using "columns" instead of "odbc_columns". I am submitting this request to change the query.
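
Alteryx's internal metadata query isn't public, but the DBA's suggestion amounts to something like the following; the DSN and table name are illustrative:

```python
# Sketch: query Vertica's v_catalog.columns instead of odbc_columns
# for column metadata.
import pyodbc

SLOW = "SELECT column_name, data_type FROM v_catalog.odbc_columns WHERE table_name = ?"
FAST = "SELECT column_name, data_type FROM v_catalog.columns WHERE table_name = ?"

with pyodbc.connect("DSN=vertica_dsn") as conn:
    for column_name, data_type in conn.execute(FAST, "sales").fetchall():
        print(column_name, data_type)
```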

 

Refer to case 00551930 for more info.

When I'm organizing my workflow, sometimes I want to move a whole tool container on the canvas. Currently, the only way to do this is to first find the header, then select and drag it. When the ends of the container are off screen, it can be hard to know how far to move the container to get it where I want it relative to the other tools around it. It would be nice to be able to select anywhere on the tool container and drag it around (possibly holding right-click and dragging so that current tool-selection capabilities aren't hindered).

 

 

In a simplified example, suppose I want my tool container to vertically align just above the Browse tool.


 

 

I can't currently see the top of the tool container to move it, though, so I must first navigate to that part of the workflow to select the header.


 

Creating .yxi files is currently a manual and somewhat fiddly process. It would be great to have a menu option that automatically packages the macro into a tool installer file.
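
A .yxi is, as far as I know, just a ZIP archive with a Config.xml manifest plus the tool's folder, so the auto-packaging could be as simple as this sketch (paths hypothetical):

```python
# Sketch: package a tool directory into a .yxi installer.
import zipfile
from pathlib import Path

def package_yxi(tool_dir: str, out_file: str) -> None:
    # tool_dir should hold the installer's Config.xml at its top level plus
    # one folder per tool (the macro, its own Config.xml, and an icon).
    root = Path(tool_dir)
    with zipfile.ZipFile(out_file, "w", zipfile.ZIP_DEFLATED) as yxi:
        for f in root.rglob("*"):
            if f.is_file():
                yxi.write(f, f.relative_to(root))

package_yxi("MyMacroTool", "MyMacroTool.yxi")
```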



Hi - 

 

We are using the new(ish) Anaplan connector tools; in particular, the "Anaplan Output" tool (send data TO Anaplan).

 

The issue I'm having is that the Anaplan Output tool only accepts a CSV file. This means I must run one workflow to create the CSV file, then another workflow to read the CSV file and feed the Anaplan Output tool.

 

If it were possible to have an output anchor on the Output Data tool that would simply pass the CSV records through to the Anaplan Output tool, the workflows would be drastically simplified.

 

Thanks,

Mark Chappell

When switching modes, the Python tool sometimes restarts and loses all the code.


Currently only the VADER algorithm is available; however, other algorithms might be interesting alternatives: TextBlob, Flair, and a Custom option.
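
For a sense of the difference, a minimal sketch scoring one sentence with VADER and TextBlob (Flair works similarly via its TextClassifier but needs a model download); the sample text is arbitrary:

```python
# Sketch: the alternative scorers side by side
# (pip install vaderSentiment textblob).
from textblob import TextBlob
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

text = "Alteryx makes this analysis surprisingly easy!"

vader = SentimentIntensityAnalyzer().polarity_scores(text)["compound"]
blob = TextBlob(text).sentiment.polarity  # both scores range from -1.0 to 1.0

print(f"VADER: {vader:+.3f}  TextBlob: {blob:+.3f}")
```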


 

Cheers,

Pawel

A Pre-Filter, as a new option in the Input tool, might reduce imported data and allow inputting only selected data (i.e., for a specific period or data meeting certain conditions).
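
A Pre-Filter would effectively push the condition into the source query, so only matching rows leave the database. A sketch of the equivalent today, with a hypothetical connection and table:

```python
# Sketch: pre-filtering at the source instead of after import.
import pandas as pd
import pyodbc

query = """
    SELECT *
    FROM sales.orders
    WHERE order_date >= '2022-01-01'  -- the 'specific period' condition
"""
conn = pyodbc.connect("DSN=warehouse")
df = pd.read_sql(query, conn)  # only the filtered rows are transferred
conn.close()
```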


 

Cheers,

Pawel

Currently, saving output as PowerPoint is possible only via a workaround using the Render tool, as in Megan's article (link below). It would be more intuitive to add PowerPoint to the supported options in the "Output Mode" dropdown.


https://community.alteryx.com/t5/Engine-Works/Reporting-in-Alteryx-Generating-a-PowerPoint-With-a-Gr...
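
In the meantime, a Python tool with python-pptx can write slides directly; a minimal sketch with illustrative slide content:

```python
# Sketch: direct PowerPoint output via python-pptx (pip install python-pptx).
from pptx import Presentation
from pptx.util import Inches

prs = Presentation()
slide = prs.slides.add_slide(prs.slide_layouts[5])  # title-only layout
slide.shapes.title.text = "Monthly Sales"

table = slide.shapes.add_table(3, 2, Inches(1), Inches(2),
                               Inches(6), Inches(2)).table
for r, (month, total) in enumerate([("Month", "Total"),
                                    ("Jan", "1,200"), ("Feb", "1,450")]):
    table.cell(r, 0).text, table.cell(r, 1).text = month, total

prs.save("report.pptx")
```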

 

Cheers,

Pawel

Add the PowerPoint file format (ppt/pptx) to the supported file types for direct connection in the Input tool.


 

 

PS: I know there is a workaround that allows importing PowerPoint slides into Alteryx, but I'm describing an automated solution :)
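
For reference, the automated direction could look like this sketch using python-pptx, reading each slide's text into records (the deck name is hypothetical):

```python
# Sketch: read PowerPoint slide text into rows that could feed a data stream.
from pptx import Presentation

rows = []
prs = Presentation("deck.pptx")
for slide_no, slide in enumerate(prs.slides, start=1):
    for shape in slide.shapes:
        if shape.has_text_frame:
            for para in shape.text_frame.paragraphs:
                text = "".join(run.text for run in para.runs)
                if text:
                    rows.append({"slide": slide_no, "text": text})

print(rows)  # one record per non-empty paragraph
```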

 

Cheers,

Pawel

The Edit menu allows you to see what your next undo/redo actions are. This is super helpful; however, sometimes I decide to scrap an idea I was starting on and need to perform multiple undos in a row. It would be great if we could see a list of actions, like in the debug undo/redo stack menu, then select how many steps we'd like to undo/redo.

 

For example, if I want to undo both the Change Summarize Properties action and the Modify Summarize action beneath it, currently I have to do that in two steps. I'd like to be able to click Modify Summarize and have the workflow undo all commands up to and including that one.
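
Conceptually this is just a multi-pop on the undo stack; a toy sketch (action names taken from the example above):

```python
# Sketch: undo every action up to and including a chosen one.
class UndoStack:
    def __init__(self):
        self.done = []

    def push(self, action: str) -> None:
        self.done.append(action)

    def undo_through(self, action: str) -> list:
        """Undo all actions up to and including the named one."""
        idx = len(self.done) - 1 - self.done[::-1].index(action)
        undone, self.done = self.done[idx:], self.done[:idx]
        return undone[::-1]  # newest first, the order they'd be undone in

stack = UndoStack()
for a in ["Modify Summarize", "Change Summarize Properties"]:
    stack.push(a)
print(stack.undo_through("Modify Summarize"))
# -> ['Change Summarize Properties', 'Modify Summarize'] (one click, two undos)
```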


 

Currently the Databricks in-database connector allows for the following options when writing to the database:

  1. Append Existing
  2. Overwrite Table (Drop)
  3. Create New Table
  4. Create Temporary Table

This request is to add a fifth option that would execute:

  • Create or Replace Table

Why is this important?

  • Create or Replace is similar to Overwrite Table (Drop) in that it fully replaces the existing table; however, the key differences are:
    • Drop Table completely removes the table and its data from Databricks
      • Any users or processes connected to that table live will fail during the writing process
      • No history is maintained on the table, a key feature of the Databricks Delta Lake
    • Create or Replace does not remove the table
      • Any users or processes connected to that table live will not fail, as the table is not dropped
      • History is maintained for table versions, which is a key feature of the Databricks Delta Lake

 

While this request was specific to testing on Azure Databricks, the Databricks documentation for both Azure and AWS recommends using "Replace" instead of "Drop" and "Create" for Delta tables.
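
Today the statement can be issued manually, e.g. via the databricks-sql-connector; a sketch with placeholder connection details:

```python
# Sketch: issuing the requested write mode manually.
from databricks import sql

DDL = """
CREATE OR REPLACE TABLE analytics.daily_sales AS
SELECT * FROM staging.daily_sales
"""  # table replaced in place: readers don't break, Delta history is kept

with sql.connect(server_hostname="...", http_path="...", access_token="...") as conn:
    with conn.cursor() as cursor:
        cursor.execute(DDL)
```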

 


Hyperion Smartview Connect
