The Product Idea boards have gotten an update to better integrate them within our Product team's idea cycle! However, this update does have a few unique behaviors; if you have any questions about them, check out our FAQ.

Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!
Submitting an Idea?

Be sure to review our Idea Submission Guidelines for more information!


Featured Ideas

When passing a data connection to the Dynamic Input Tool as a string and using the 'Change Entire File Path' option, the password parameter of the connection string is not encrypted and is displayed in the metadata source information.

 

We have since changed our macro that was using this method, but wanted to raise awareness of this situation. I suggest that the same procedure used to encrypt the password in all other connection methods be called when a workflow is configured to pass a connection string through the input.
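In the meantime, anyone passing connection strings this way may want to scrub the password before the string can surface anywhere visible. A minimal sketch, assuming the usual PWD=...; key/value layout of ODBC connection strings (the function name is made up):

    import re

    def redact_password(conn_str: str) -> str:
        """Replace the PWD value in an ODBC-style connection string."""
        return re.sub(r"(PWD=)[^;|]*", r"\1********", conn_str, flags=re.IGNORECASE)

    print(redact_password("odbc:DSN=MyDSN;UID=user;PWD=hunter2|||MY.TABLE"))
    # -> odbc:DSN=MyDSN;UID=user;PWD=********|||MY.TABLE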

Hi All,

 

This is a fairly straightforward request. I'd like to be able to pass interface tool values through to the workflow events the same way I would pass them through to a tool in the workflow (%Question.<tool name>%). One use case for this: we call a workflow and pass in an ID, and if that workflow fails, I'd like to trigger an event that calls back to the application and says that this specific workflow failed for this ID.

 

The temporary solution is to have the workflow write to a temp file and have the event reference that temp file, but this is clunky and risky if there are parallel runs occurring. 
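For anyone else stuck with the temp-file approach for now, keying the file name by the incoming ID at least keeps parallel runs from clobbering each other. A rough sketch (the file name pattern and status values are made up):

    import os
    import tempfile

    def write_status(run_id: str, status: str) -> str:
        """Write a per-run status file and return its path for the event to read."""
        path = os.path.join(tempfile.gettempdir(), f"wf_status_{run_id}.txt")
        with open(path, "w", encoding="utf-8") as f:
            f.write(status)
        return path

    print(write_status("12345", "FAILED"))  # e.g. C:\Temp\wf_status_12345.txt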

 

Best,

devKev

I use detours to bypass areas of the workflow when I do not need them in the loop, and it would be great to have a single enable/disable button (similar to how you would control this in a macro). That way I would not have to disable three detours individually, just one "button."

The same button should also work with the tool container's 'disable' and/or collapse options.
Can you add an ExceptionHandler to the Tile tool? The tool crashed on a large dataset with the error "Tile (1): No values found before GetMean()". I had selected the Smart Tile option on the 'unique_zips_count' field, grouped by 'ID'.
To track the problem down, I had to use the Sample tool to grab N records at a time and see whether they would run through the Tile tool, skipping and selecting the first N records until I had narrowed the problem down to 20 records. As it turned out, all values were 0 in a specific group. I found a workaround by pulling all records per group with a value of 0 and bypassing the Tile tool for those. Instead of having to do that, could you add an ExceptionHandler and report which RecNo it crashed on?
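For anyone hitting the same crash, the all-zero groups can also be flagged up front rather than found by bisecting with the Sample tool. A pandas sketch (the field and group names mirror this example):

    import pandas as pd

    df = pd.DataFrame({"ID": [1, 1, 2, 2],
                       "unique_zips_count": [0, 0, 3, 5]})

    # True for every row whose entire group is zero - these trip GetMean()
    all_zero = (df.groupby("ID")["unique_zips_count"]
                  .transform(lambda s: (s == 0).all()))

    safe, bypass = df[~all_zero], df[all_zero]  # tile `safe`, route `bypass` around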

Can you also add an option to use 1, 2, or 3 standard deviations in addition to Smart Tile? That way all my groups will be uniform.
  
When using an App from the Gallery, I would like to keep track, in real time, of how many (Allocate dataset) checkboxes a user has toggled by having a counter on the same form. The functionality would be the same as a WinForm containing a label control with a counter that updates every time a checkbox is toggled or untoggled. I would then make the 'Next' or 'Continue' button active once the counter reaches a certain threshold, so it needs to be dynamic. Maybe, in general, make the forms/Apps more dynamic so you have to do less chaining of apps?
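For illustration only, here is roughly the behavior being asked for, sketched with tkinter (the threshold and captions are invented):

    import tkinter as tk

    THRESHOLD = 2  # invented threshold

    root = tk.Tk()
    vars_ = [tk.IntVar() for _ in range(4)]
    label = tk.Label(root, text="Selected: 0")
    next_btn = tk.Button(root, text="Next", state="disabled")

    def update_counter():
        # Recount checked boxes and enable "Next" once the threshold is met.
        n = sum(v.get() for v in vars_)
        label.config(text=f"Selected: {n}")
        next_btn.config(state="normal" if n >= THRESHOLD else "disabled")

    for i, v in enumerate(vars_, start=1):
        tk.Checkbutton(root, text=f"Dataset {i}", variable=v,
                       command=update_counter).pack(anchor="w")
    label.pack()
    next_btn.pack()
    root.mainloop()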

Implement a process to allow looping in a workflow without resorting to macros. Although macros do generally solve the issue, I find them confusing and non-intuitive.

 

I would suggest enabling looping through two new tools: a StartLoop and an EndLoop tool.

 

The StartLoop tool would have two (or more) input anchors: one for the initial input and the other(s) for additional iterative inputs. The StartLoop would hold all iterative inputs until the original inputs have passed the gate, and then resubmit them in the order in which they returned to the StartLoop.

 

The EndLoop tool would have three output anchors. One anchor would be for data exiting the loop upon reaching the exit condition. Another anchor would be for the iterative (return) data; note that transformations can be performed on the data BEFORE it re-enters the loop. The third would be an "overloop" exit anchor, for any data that failed to meet the exit condition within the (configurable) maximum number of iterations. Data from the overloop anchor could then be handled as required by the business rules for the unsatisfied records.

 

The primary configuration would be on the EndLoop tool, where you would set the exit condition and the maximum iteration expression. The tool would also create an iteration counter field, with a configuration checkbox to "retain iteration count field on exit": if checked, the field would be kept; if not, it would be dropped as the data exits the loop.
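To make the proposed semantics concrete, here is a small Python sketch of what the two tools would do together (names and signatures are illustrative, not a proposed API):

    def run_loop(records, step, exit_cond, max_iterations):
        """Sketch of StartLoop/EndLoop semantics with an overloop anchor."""
        done, overloop = [], []
        pending = [dict(r, _iteration=0) for r in records]
        while pending:
            rec = pending.pop(0)
            if exit_cond(rec):
                done.append(rec)                 # exit-condition anchor
            elif rec["_iteration"] >= max_iterations:
                overloop.append(rec)             # "overloop" anchor
            else:
                rec = dict(step(rec))            # transform before re-entry
                rec["_iteration"] += 1
                pending.append(rec)              # iterative (return) anchor
        return done, overloop

    # Example: double a value until it reaches 100, giving up after 5 passes.
    done, over = run_loop([{"v": 3}, {"v": 60}],
                          step=lambda r: {**r, "v": r["v"] * 2},
                          exit_cond=lambda r: r["v"] >= 100,
                          max_iterations=5)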

 

This would make looping a bit more intuitive, and it would be graphically self-documenting as well. Worth a mention at least.

In the question type "Drop Down/List Box", there's an option to use an external file. The file should be able to influence whether each record is selected or not; for example, a reserved column named "SELECTED" containing True/False values that the UI would key off.
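To make the idea concrete, the external file might look like the following (everything other than the proposed SELECTED column is illustrative):

    Name,SELECTED
    North Region,True
    South Region,False
    East Region,True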

Sometimes, Control Containers produce error messages even if they are deactivated by feeding an empty table into their input connection.

 

(Screenshot: error raised in spite of the Control Container being deactivated.)

(Note that this is a made-up example of something that can happen when input tables come from different sources and have different columns, so that they need separate treatment.)

 

According to the product team, this is expected behaviour, since a selection does not allow zero columns to be selected. This might be true (which I doubt a bit), but it is at least counter-intuitive. If this behaviour cannot be avoided entirely, I have a proposal that would improve the user experience without changing the entire workflow validation logic.

(The support engineer understands the point and has raised a defect.)

 

Instead of writing messages from tools inside Control Containers directly to the log output (on screen, in the logfile) and marking the workflow as erroneous, I propose introducing a message stack (messages, warnings, errors) for tools inside Control Containers; a sketch follows the list below:

  1. When the configuration validation is executed:
    1. Messages (messages, warnings, errors) produced outside of Control Containers are output to the screen log and to the log files (as today).
    2. Messages (messages, warnings, errors) produced inside of Control Containers are not yet output but stored in a message stack.
  2. At the moment when it is decided whether a Control Container is activated or deactivated:
    1. If Control Container activated: Write the previously stored message stack for this Control Container to the screen and to the log output, and increase error and warning counts accordingly.
    2. If Control Container deactivated: Delete the message stack for this Control Container (w/o reporting anything to the log and w/o increasing error and warning count).
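A minimal sketch of how such a per-container stack could behave (class and method names are illustrative, not Alteryx internals):

    from collections import defaultdict

    class MessageStack:
        def __init__(self):
            self.buffered = defaultdict(list)   # container_id -> messages

        def report(self, container_id, severity, text):
            if container_id is None:
                self._emit(severity, text)      # outside any container: emit immediately
            else:
                self.buffered[container_id].append((severity, text))

        def resolve(self, container_id, activated):
            """Called once it is known whether the container runs."""
            msgs = self.buffered.pop(container_id, [])
            if activated:
                for severity, text in msgs:     # flush to log, update counts
                    self._emit(severity, text)
            # deactivated: discard silently, without raising error/warning counts

        def _emit(self, severity, text):
            print(f"[{severity}] {text}")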

This would result in a different sequence of messages than today (because everything inside activated Control Containers would be reported later than it is today). Since there's no logical order of messages anyway, this should not matter. And it would avoid the apparently illogical case of deactivated Control Containers producing errors.

Hi there,

 

When creating a database connection - Alteryx's default behaviour is to create an ODBC DSN-linked connection.

 

However, DSN-linked connections do not work well in a large server environment, because they would require administrators to create these DSNs on every worker node and on every disaster recovery node, and to update them all every time a canvas changes.

They are also not fully safe, because part of the configuration of your canvas is held in the DSN, so you cannot rely solely on the code that's under version control.

 

So:

Could we add a feature to Alteryx Designer that allows a user to expand a DSN into a fully declared connection string?

In other words - if the connection string is listed as 

- odbc:DSN=DSNSnowFlakeTest;UID=Username;PWD=__EncPwd1__|||NEWTESTDB.PUBLIC.MYTESTTABLE

Then offer the user the ability to expand this out by interrogating the ODBC Connection manager to instead have the fully described connection string like this:
odbc:DRIVER={SnowflakeDSIIDriver};UID=Username;pwd=__EncPwd1__;authenticator=Snowflake;WAREHOUSE=compute_wh;SERVER=xnb27844.us-east-1.snowflakecomputing.com;SCHEMA=PUBLIC;DATABASE=NewTestDB;Staging=local;Method=user

 

NOTE: This is exactly what users need to do manually today to get to a DSN-less connection string: they have to create a file DSN to figure out all the attributes (by opening it up in Notepad) and then paste these into the connection string manually.
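As a rough illustration of the "interrogate the ODBC manager" step on Windows: user DSNs live under HKCU\Software\ODBC\ODBC.INI (system DSNs under HKLM), so something like the following sketch can recover the attributes. It is an approximation, not how Designer would have to do it:

    import winreg

    def expand_dsn(dsn_name: str) -> str:
        """Read a user DSN's attributes from the registry and assemble a
        DSN-less connection string (system DSNs live under HKLM instead)."""
        path = rf"Software\ODBC\ODBC.INI\{dsn_name}"
        attrs = {}
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, path) as key:
            i = 0
            while True:
                try:
                    name, value, _ = winreg.EnumValue(key, i)
                    attrs[name] = value
                    i += 1
                except OSError:
                    break  # no more values under this key
        # NB: the registry stores the driver DLL path, not the friendly name,
        # so a real implementation would map it via ODBCINST.INI.
        driver = attrs.pop("Driver", "")
        parts = [f"DRIVER={{{driver}}}"] + [f"{k}={v}" for k, v in attrs.items()]
        return ";".join(parts)

    print(expand_dsn("DSNSnowFlakeTest"))  # DSN name from the example above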

 

Thanks all 

Sean

 

 

The idea behind password masking: we have the Download tool in the Developer tab, which is used to download files from a given site. For example, let's take a mainframe. I have a scenario where the Alteryx workflow should connect to the mainframe FTP server and download the required file, which is used for downstream transformation. For the download, I get the username and password information from a database table (to reduce manual intervention and prevent errors).

While passing the username and password as parameters to the Download tool macro (a custom macro that accepts the username/password and filename dynamically), the workflow will obviously show the username and password in the results window (as they are considered output data from the Input tool). I want that particular password field to be masked, so that whenever the workflow is shared with a user, the password field remains unexposed. I know there's a way to mask a field using the "MD5 HASH" formula, but that masks the data itself rather than protecting a password (the hash would be treated as a new string, not a valid password).

This feature would be really beneficial to developers who use the Download tool often. A new tool or custom macro embedding this feature would be great for users who need masking functionality.
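As a stop-gap, the field can at least be masked for display while the real value is handed to the Download tool. A pandas sketch (the column names are invented):

    import pandas as pd

    creds = pd.DataFrame({"user": ["ftpuser"], "password": ["s3cret!"]})

    # Keep `creds` for the Download tool; send only the masked copy downstream
    # to anything that surfaces in the results window.
    masked = creds.assign(password=creds["password"].str.replace(r".", "*", regex=True))
    print(masked)  # user: ftpuser, password: *******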

Hi!

 

Can you please add a tool that stops the flow? And I don't mean the "Cancel running workflow on error" option.

Today you can stop the flow on error with the Message tool, but not everything you want to stop for raises an error.

 

E.g., I can use the Block Until Done tool to check whether there are files in a specific folder, and if there are not, I want to stop the flow from reading any more inputs.

 

Today I have to build workarounds; an easy stop-if tool would be much appreciated!
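For reference, this is the kind of workaround I mean, done in the Python tool: raise an error when the folder is empty so that "Cancel running workflow on error" halts the run (the folder path is made up):

    import os
    import sys

    folder = r"C:\data\incoming"  # illustrative path

    if not any(os.scandir(folder)):
        sys.exit("No input files found - stopping the workflow")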

This could be an option on the Message tool (a checkbox, like the Transient option).

 

Cheers,

EJ

Hello,

Please remove the hard limit of five outputs from the Python tool, if possible.

It would be very helpful to be able to forward any number of tables, in any format, each with different columns.
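For context, the limit shows up as soon as you write to a sixth connection. A sketch, assuming the ayx package that ships with the Python tool:

    from ayx import Alteryx
    import pandas as pd

    tables = [pd.DataFrame({"n": [i]}) for i in range(6)]

    for i, df in enumerate(tables, start=1):
        Alteryx.write(df, i)  # errors once i exceeds the fifth output anchor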

 

Best regards,

 

In many of our tools, before processing any file, we create a backup and move it to a backup location with a datetime stamp.

Could we have an option like "CreateBackup" (with a timestamp) in the Input and Output tools?
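A sketch of what the option could do under the hood (the folder layout and naming scheme are just one possibility):

    import shutil
    from datetime import datetime
    from pathlib import Path

    def create_backup(path: str, backup_dir: str = "backup") -> Path:
        """Copy the file aside with a datetime stamp before processing it."""
        src = Path(path)
        dest_dir = src.parent / backup_dir
        dest_dir.mkdir(exist_ok=True)
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        dest = dest_dir / f"{src.stem}_{stamp}{src.suffix}"
        shutil.copy2(src, dest)  # copy2 preserves the file's timestamps
        return dest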

Currently there is a maximum amount of data that can be passed into the Dynamic Input tool: 1 MB. I often hit this limit, and it is infuriating. If this were upped to 5 MB, that would solve a lot of my issues, but 50 MB would be AMAZING.

 

Thoughts?

 

-Nick

I'm stealing this idea from Tableau's number formatting; it's a timesaver.

 

In the DateTime tool, if I've initially selected a value besides Custom in the "Select the format..." list, then when I click Custom, rather than having the Custom textbox be blank, I'd like it automatically populated with whatever format string I just selected (e.g., picking "yyyy-MM-dd" and then clicking Custom would pre-fill "yyyy-MM-dd" as a starting point for tweaking).

Hello gurus - 

 

Pretty much every coding framework supports this. If we really want Alteryx to embrace no-code, we've got to have some ability to control commits / rollbacks across transactions. As it stands, it is pretty easy to write out parent records, fail to write out the children, and wind up with a database state that makes the end users very sad.
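For comparison, this is the guarantee code-first tools get almost for free, sketched here with sqlite3: the parent and child inserts commit together or not at all.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE parent (id INTEGER PRIMARY KEY);
        CREATE TABLE child  (id INTEGER PRIMARY KEY,
                             parent_id INTEGER REFERENCES parent(id));
    """)
    try:
        with conn:                       # one transaction for both writes
            conn.execute("INSERT INTO parent VALUES (1)")
            conn.execute("INSERT INTO child  VALUES (1, 1)")
    except sqlite3.Error:
        pass                             # a failure rolls back the parent row too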

 

Thanks!

 

brian

When reading and writing large data frames to/from a Python script in Alteryx, there seem to be limitations in the SQLite component of the tool. Given that this option is recommended only when the user is having issues in the Python tool, why is it selected by default? A colleague and I spent a couple of hours trying to work through an issue importing a data frame larger than 1000x1000; once we found this option (SQLite override) and unchecked it, the data was written back to Alteryx without any problems.

 

Hint provided by the tool: "This changes the intermediate data format between Alteryx and Jupyter from yxdb to SQLite. Use only if running into issues. See help for more details."

 

(Screenshot: the SQLite override is the default selection.)

(Screenshot: error message provided by the tool.)

After unchecking the option, the workflow ran without any errors.

 

Recommendation: the Python tool should default to the SQLite override being unchecked.

 

Hi - I think it would be great to only have to open one Debug window, and as I add to my workflow, have the Debug workflow automatically update to include the new parts of my workflow.

 

As it is now, I believe I have to open a new Debug window whenever I have added new components to my workflow.

Hello All,

 

We are new to Alteryx, and we see that the supported data sources from IBM are the following:

  • IBM DB2
  • IBM Netezza/Pure Data Systems
  • IBM SPSS

How about adding IBM Sterling to this list?

We want Alteryx to support connections to IBM Sterling OMS, which would help meet our business requirements.

Can anyone post some suggestions on this? How can we connect to Sterling?

 

Thanks,

Praveen C

 

Idea:

As a method of deploying preprocessing and ML models, it would be awesome to be able to convert a workflow to Java...

 

Rationale:

Models need to be deployed into complex event processing or decision systems. Even for SAS, there is a need to implement the DATA step algorithms and procs so they run on the JVM.

 

 

Quickwin:

It is possible to convert a workflow into a PMML file and then use the JPMML package to convert that to Java. Yet the full workflow, with all of its preprocessing alternatives and series of ML methods, may not be captured fully.
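For anyone exploring this quick win, the export side can be sketched with the sklearn2pmml package (this assumes scikit-learn, sklearn2pmml, and a local Java runtime are installed); JPMML can then take the PMML file the rest of the way to the JVM:

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier
    from sklearn2pmml import sklearn2pmml
    from sklearn2pmml.pipeline import PMMLPipeline

    # Fit a trivial model and export it to PMML for downstream Java scoring.
    X, y = load_iris(return_X_y=True)
    pipeline = PMMLPipeline([("classifier", DecisionTreeClassifier())])
    pipeline.fit(X, y)
    sklearn2pmml(pipeline, "model.pmml")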

 

Competitor example:

For the SAS case, here is a similar solution: http://www.dullesresearch.com/carolina-features/
