The Product Idea boards have gotten an update to better integrate them into our Product team's idea cycle! However, this update does have a few unique behaviors; if you have any questions about them, check out our FAQ.

Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!
Submitting an Idea?

Be sure to review our Idea Submission Guidelines for more information!

Submission Guidelines

Featured Ideas

In the Dynamic Input tool,

where you "Read a List of Data Sources", there should be a radio button below the "Action" field to

"INCLUDE FIELD OF DATA SOURCES".

You would then have an output field with the isolated name of the source each record came from, so you wouldn't be required to "Include Full File Path" and then parse out the sheet the data came from.
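For context, a rough sketch of the workaround this would replace: today the full file path has to be included and the source name parsed out afterwards. This is done here in Python rather than with Alteryx tools, and the path format and values are assumptions for illustration.

import re

# Hypothetical full path as written by "Include Full File Path" for an Excel input.
full_path = r"C:\data\sales_2023.xlsx|||`Sheet1$`"

# Isolate the file name and the sheet it came from.
file_name = re.search(r"([^\\|]+)\.xlsx", full_path).group(1)
sheet_match = re.search(r"\|\|\|`?([^`$]+)", full_path)
sheet = sheet_match.group(1) if sheet_match else ""

print(file_name, sheet)  # sales_2023 Sheet1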

Hello All,

 

I'm using the dynamic input tool for SQL requests in my Workflow (WF).

I'm using the "Replace a Specific String" to replace elements in the SQL statement dynamically depeding on results of prevoius tools, user input etc.

So the statement looks like

select * from Schema_Name_xx where invoice_number = 'invoice_number_xx'

 

Since Schema_Name_xx is not a valid schema in the database, the statement (and therefore the validation) won't work. It only works once Schema_Name_xx is replaced by e.g. Invoice_Data_Current; the same goes for the invoice number, where invoice_number_xx is replaced by e.g. 4711.

Therefore, validation makes no sense and will never succeed; the correct schema is only inserted into the SQL statement by the "Replace a Specific String" function while the WF is running.
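For illustration only, here is a minimal sketch in Python (values taken from the example above) of what the replacement amounts to at run time, and why only the resolved statement could ever validate:

# Template as stored in the workflow; the placeholders are not valid objects/values.
template = "select * from Schema_Name_xx where invoice_number = 'invoice_number_xx'"

# What "Replace a Specific String" effectively does while the WF runs.
resolved = (
    template
    .replace("Schema_Name_xx", "Invoice_Data_Current")
    .replace("invoice_number_xx", "4711")
)

print(resolved)
# select * from Invoice_Data_Current where invoice_number = '4711'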

It would be great to be able to disable validation in the user settings or somewhere else in Designer; changing a config file would also be fine :-)

Please note: I'm not thinking about changing/disabling anything in the Alteryx Server settings (since I'm not allowed to anyway ;-)).

 

Reason:

1. Speed: Validating a WF with SQL statements that don't work takes time (every time I save it); sometimes I even get a timeout...

2. WF error entries: Each upload with a failed validation creates an entry in the WF results list, which makes it harder to separate them from the "real" WF errors...

 

Thanks & Best Regards,

Thomas

Hi, currently if you use the Cross Tab tool and the names of the new fields contain special characters, those characters end up being replaced with underscores ("_") in the new headers and then need to be updated in some way. It would be great if this was all handled in the tool; in other words, the new headers would keep the special characters as desired.
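To illustrate the behaviour, here is a sketch in Python with made-up header values (the exact character set Cross Tab replaces is an assumption): today the underscored names have to be mapped back to the desired ones afterwards, e.g. with a Dynamic Rename.

desired_headers = ["Revenue (EUR)", "Margin %", "Store #12"]

def cross_tab_sanitise(name):
    # Approximation of the current behaviour: non-alphanumeric characters become "_".
    return "".join(ch if ch.isalnum() else "_" for ch in name)

# Lookup from the underscored headers Cross Tab produces back to the desired headers.
rename_map = {cross_tab_sanitise(h): h for h in desired_headers}
print(rename_map)
# {'Revenue__EUR_': 'Revenue (EUR)', 'Margin__': 'Margin %', 'Store__12': 'Store #12'}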

Hi, is it possible to have Alteryx workflows run when a file has been dropped into a folder, or something along those lines? I.e. when an external activity has taken place.
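As a rough sketch of that kind of external trigger, done outside Designer as a stop-gap: poll a drop folder and launch the workflow when a new file appears. The folder, workflow and engine paths are hypothetical, and calling AlteryxEngineCmd.exe assumes the Desktop Automation add-on is available.

import os
import subprocess
import time

WATCH_DIR = r"C:\drop_folder"                                   # hypothetical drop location
WORKFLOW = r"C:\workflows\process_drop.yxmd"                    # hypothetical workflow
ENGINE = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"   # assumed install path

seen = set(os.listdir(WATCH_DIR))
while True:
    current = set(os.listdir(WATCH_DIR))
    for new_file in sorted(current - seen):
        # Run the workflow once for each newly dropped file.
        subprocess.run([ENGINE, WORKFLOW], check=False)
    seen = current
    time.sleep(10)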

Hi,

 

Due to our setup, we need to have the paths defined as environment variables, so they point to different locations depending on whether a user opens the WF locally or the server is running it.

 

The issue is that the dependency path does not accept the Windows-defined variables:

 

(screenshots: "alteryx relative path with environment vars.png", "alteryx relative path with environment vars defined.png")

 

Thank you!

I know that the container title/label should be short and as descriptive as possible. Also, adding extra comments inside the box helps give a more detailed explanation of what process runs inside the container. Visually, though, if I collapse the container, the short title given can't be of much help.

Could it be possible to enhance the "caption" for the Container title, i.e. to allow typing 2, 3 or more lines of text? This would make the container title more descriptive and visually allow the containers to stay collapsed but with a reasonable amount of text that describes (as much as possible) what happens inside the container.

 

At the moment, if I type a certain amount of text, the container expands according to the length of the text.

 

Below is a typical container title:

(screenshot: "Normal Container Title.JPG")

 

 

Below is the current situation if a person would like to give a bit more description in the container header (the container expands):

 

(screenshot: "Extra Text in Container_CurrentProblem.JPG")

 

A dream would be to have the workflow with all containers collapsed and with titles that tell you what they do (see image below).

 

(screenshot: "Ideal Alteryx Containe.JPG")

 

This is a hybrid idea related to both posts regarding dynamic tool configuration during runtime / without having to run an analytic app.

 

What I would like to propose is a new optional connection type for the interface tools that can be updated with incoming connections (those having a Q letter with a white background), namely the Drop Down, List Box, Tree and Map tools. This could be a simple R letter in a square, for example, located to the left of the incoming question anchor.

 

Use Case

 

Imagine an app with two control containers and three interface tools (Action tools excluded from the count) outside those containers. One of them is a Text Box connected (via an Action tool) to a Filter tool in the first control container, with the purpose of limiting the dataset by specifying a city, for example; another is a Numeric Up Down for limiting the dataset to average transaction amounts greater than the specified amount. These two interface tools are contained in a Group Box in the Interface Designer.

 

The third interface tool is a Drop Down tool which obtains its values (Store Name for this example) from the results of the Select tool in the second control container (which is connected to the output anchor of the first control container); that Select tool is fed by an incoming Filter tool which is modified by the previously mentioned interface tools. The output anchor of this Select tool is connected to the hypothetical R anchor on top of the Drop Down tool, which is then connected to an outgoing Filter tool followed by a series of tools ending with a Browse tool that displays basic KPI information for the store selected in the Drop Down tool.

 

The main difference between the R (Refresh) anchor and the Q anchor is that it would enable the user to dynamically update the incoming values (i.e., the choices for a Drop Down tool) without having to run the workflow. Alteryx Designer would automatically execute only the tools necessary to update those values (up to a certain point of the workflow, which may also be indicated by the boundaries of the control containers containing the target tool) for the applicable Interface tools connected via an R anchor. This would happen when the user clicks a hypothetical confirm button (same appearance as the Apply Data Manipulations button) which only appears next to the Interface tools (or the Group Boxes containing them) that Alteryx Designer automatically determines to be providing downstream data to the tools (the T anchor of the Filter tool, for example) that send values to the applicable Interface tools having an incoming R anchor connection.

 

I saw that a similar feature recently became available in the Alteryx Analytics Cloud Platform with the App Builder product, and I think that Alteryx Designer Desktop could definitely benefit both from this feature and from additional App Builder features (that can be adapted to the Desktop counterpart) in upcoming releases.

I usually use the comment tool by:

- dragging it onto the canvas and then

- repositioning and expanding it to cover the tools I'd like to comment on.

 

What if I could select the tools I wanted to comment on and then use a key combination or double-click so that the comment tool surrounds those tools for me?

 

Note: an additional enhancement would be to anchor the comment to the selected tools, but I see that this was dropped from consideration: https://community.alteryx.com/t5/Alteryx-Designer-Desktop-Ideas/Anchoring-comment-boxes-to-tools/idi...

 

I think it would be great to have a tool that allows you to update a dataset with another dataset. For example, this could be used in updating an archive table on a daily basis as data changes. Having a tool available that streamlines this data operation would be helpful to simplify workflows.

 

In the tool, you would be given the option to select your primary key fields, which are the fields used to identify records. Additionally, you would have the option to perform an insert, modify, or delete operation according to the primary key fields that you choose in the configuration.

 

Obviously this is something that anybody could create a macro for if they wanted to. But it would be nice to have a tool in place so that we don't have to worry about it. I think this would be a nice use case to bolster Alteryx usage as a data engineering tool, for relational database management in particular.
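As a rough sketch of the keyed update the proposed tool would perform, here it is in pandas with made-up field names and values (insert and modify only; delete left out for brevity):

import pandas as pd

archive = pd.DataFrame({"invoice_id": [1, 2, 3], "amount": [100, 200, 300]})
updates = pd.DataFrame({"invoice_id": [2, 4], "amount": [250, 400]})
KEY = "invoice_id"

# Modify: drop archive rows whose key appears in the update set, then
# Insert: append every update row (new keys and replacements alike).
merged = (
    pd.concat([archive[~archive[KEY].isin(updates[KEY])], updates])
    .sort_values(KEY)
    .reset_index(drop=True)
)
print(merged)
#    invoice_id  amount
# 0           1     100
# 1           2     250
# 2           3     300
# 3           4     400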

I try to use the Comment tool for documentation within workflows for team members (and my future self when I have to revisit it months after I built it). It would be helpful to be able to use markdown formatting inside the tool.

This might even encourage more documentation. *fingers crossed*

Hi there,

 

When you connect to a DB using a connection string or an alias, this shows up in the Workflow Dependencies window in a way that is very useful for identifying impacts if a DB is moved or migrated.

 

However, in 2023.1, if you use DCM then the database dependencies just show up as .\ which makes dependency management much more difficult.

 

 
 

(screenshot: "screenshot1.png")

 

Please could you add the capability to view the DCM dependencies correctly in the dependency window?

 

BTW, this workflow Dependency window would be a great place to build a simple process to move existing DB connections to a DCM connection!

 

CC: @wesley-siu @_PavelP 

Hi there,

 

When connecting to data sources using DCM - could we please add the ability to make JDBC connections?

 

see:

https://community.alteryx.com/t5/Alteryx-Designer-Desktop-Ideas/Add-the-ability-to-connect-to-data-s...

https://community.alteryx.com/t5/Engine-Works/JDBC-Connections-in-Alteryx/ba-p/968782

 

As mentioned in these threads, JDBC is very common in large enterprises, and in many cases it is better supported by the technology teams / developer community, so it is much easier to make a connection. Added to this, there are many databases (e.g. DB2) where JDBC connections are just much easier.

 

Please could you add JDBC connections to the DCM tooling?

 

Thank you

Sean

 

cc: @wesley-siu @_PavelP 

Alteryx should seriously consider incorporating certain Excel features into its Browse tool, as they would greatly enhance usability and functionality.

 

Currently, when selecting specific records in the Browse tool, users are unable to obtain important metrics such as sum, average, or count without resorting to additional steps, such as adding a Summarize tool or filters.

 

However, integrating a concise bar below the results window that provides these essential statistics, which are immensely beneficial to users, would undoubtedly elevate the Browse tool to the next level.

 

By implementing this enhancement, Alteryx would make a significant impact and establish the Browse tool as a must-have resource.
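As a tiny illustration of the quick statistics such a bar would surface for a selection (the values are made up):

import pandas as pd

selected = pd.Series([120.50, 98.00, 305.25, 87.10])  # hypothetical selected cells
print(f"Count: {selected.count()}  Sum: {selected.sum():.2f}  Average: {selected.mean():.2f}")
# Count: 4  Sum: 610.85  Average: 152.71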

 

 

(screenshot: "SaadNaser_0-1684918867896.png")

 

 

(screenshot: "SaadNaser_1-1684918880407.png")

I think we can all agree that the Workflow Summary Tool is immensely powerful for summarizing large and/or complicated workflows. However, some companies have begun to bar the use of certain GenAI applications, like ChatGPT. Unfortunately, this makes the use of the Workflow Summary Tool impossible. At the same time, those companies are allowing the use of other forms of GenAI, like AzureAI.

 

In the Workflow Summary tool, it would be nice to have the capability to select which GenAI engine you want to use (ChatGPT, AzureAI, etc.) so that you don't break corporate policy by using barred applications. This could simply be a dropdown in the GUI configuration for the Workflow Summary Tool with a list of the most common engines. The user would then supply their API key for that engine, and you're off to the races.

Alteryx is not able to read generated Excel sheets which have the prefix "x:" within their XML tags. Often this occurs when an xlsx file is created by a bot or RPA process. Example file attached.
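For reference, a small sketch of a namespace-aware parse of such a file, using the Python standard library (the file name is hypothetical). Prefixes like "x:" are just aliases for the spreadsheetml namespace, so a namespace-aware reader treats <x:row> and an unprefixed <row> bound to that namespace as the same element:

import zipfile
import xml.etree.ElementTree as ET

NS = {"main": "http://schemas.openxmlformats.org/spreadsheetml/2006/main"}

with zipfile.ZipFile("bot_generated.xlsx") as z:       # hypothetical RPA-generated file
    sheet_xml = z.read("xl/worksheets/sheet1.xml")

root = ET.fromstring(sheet_xml)
# The lookup works whether the file writes <row> or <x:row>, as long as the
# elements are bound to the spreadsheetml main namespace.
rows = root.findall(".//main:row", NS)
print(f"rows found: {len(rows)}")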

Hi there,

 

When creating a database connection - Alteryx's default behaviour is to create an ODBC DSN-linked connection.

 

However, DSN-linked connections do not work well in a large server environment, because this would require administrators to create these DSNs on every worker node and on every disaster-recovery node, and to update them all every time a canvas changes.

They are also not fully safe, because part of the configuration of your canvas is held in the DSN, so you cannot just rely on the code that's under version control.

 

So:

Could we add a feature to Alteryx Designer that allows a user to expand a DSN into a fully declared connection string?

In other words - if the connection string is listed as 

- odbc:DSN=DSNSnowFlakeTest;UID=Username;PWD=__EncPwd1__|||NEWTESTDB.PUBLIC.MYTESTTABLE

Then offer the user the ability to expand this out by interrogating the ODBC Connection manager to instead have the fully described connection string like this:
odbc:DRIVER={SnowflakeDSIIDriver};UID=Username;pwd=__EncPwd1__;authenticator=Snowflake;WAREHOUSE=compute_wh;SERVER=xnb27844.us-east-1.snowflakecomputing.com;SCHEMA=PUBLIC;DATABASE=NewTestDB;Staging=local;Method=user

 

NOTE: This is exactly what users need to do manually today anyway to get to a DSN-less connection string - they have to create a file DSN to figure out all the attributes (by opening it up in Notepad) and then paste these into the connection string manually.
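As an illustration of that manual step, here is a hedged sketch (Windows only, Python, user DSNs only) that reads the attributes stored under a DSN so they can be pasted into a DSN-less string; the DSN name follows the example above, and the output still needs hand-editing (e.g. the DRIVER= entry) before use.

import winreg

DSN_NAME = "DSNSnowFlakeTest"  # from the example connection string above

def read_user_dsn(dsn_name):
    # User DSNs live under HKCU\Software\ODBC\ODBC.INI\<dsn name>.
    path = rf"Software\ODBC\ODBC.INI\{dsn_name}"
    attrs = {}
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, path) as key:
        index = 0
        while True:
            try:
                name, value, _ = winreg.EnumValue(key, index)
            except OSError:
                break
            attrs[name] = value
            index += 1
    return attrs

# Join the stored attributes into the skeleton of a fully declared connection string.
attrs = read_user_dsn(DSN_NAME)
print(";".join(f"{k}={v}" for k, v in attrs.items()))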

 

Thanks all 

Sean

 

 

Hi there,

 

The Snowflake documentation only refers to connection strings which use a DSN, such as this page, Snowflake | Alteryx Help, which gives the connection string as odbc:DSN=Simba_Snowflake_JWT;UID=user;PRIV_KEY_FILE=G:\AlteryxDataConnectorsTeam\OAuth project\PEMkey\rsa_key.p8;PRIV_KEY_FILE_PWD=__EncPwd1__;JWT_TIMEOUT=120

 

However, for canvases which need to be productionized on Alteryx Server, it is critical to use DSN-less connection strings so that the canvases can be deployed and run on any worker node without having to set up DSNs on every worker node.

 

A DSN-less connection string looks like this: 

ODBC:DRIVER={SnowflakeDSIIDriver};UID=UserName;pwd=Password;WAREHOUSE=compute_wh;SERVER=server.us-east-1.snowflakecomputing.com;SCHEMA=PUBLIC;DATABASE=NewTestDB;Staging=local;Method=user|||NEWTESTDB.PUBLIC.MYTESTTABLE

 

Please could you consider updating the help texts to provide and describe a DSN-free connection string as well as the DSN-driven connections?

 

Many thanks

Sean

Have you ever had the business deliver an Excel (EEK!) file to be passed into Alteryx with a different number of header rows (because it looks pretty and is convenient)? Never, you say? Lies! 

 

I would suggest adding an option to the Input Data Tool that would give us the ability to concatenate multiple header rows. This would help enable accurate data profiling for columns on output and eliminate loss from unnecessary conversion errors. Currently, the options allow us to Start Data Input on Line X; however, if the header for a column spans multiple rows, the headers have to be entered manually after input, because we can only select the lowest possible row to ensure the data is accurately passed. The solution would be the ability to specify the number of rows that contain headers, concatenate them into a single row (ignoring nulls and carriage returns), and then output that as the header.

 

The current functionality, in a situation where each file has a variable number of header rows, forces errors such as a numeric value being converted to a string in scientific notation.
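A rough sketch of the proposed concatenation, done outside Alteryx in pandas for illustration (the file name and the two-row header are assumptions):

import pandas as pd

HEADER_ROWS = 2  # number of rows that make up the header in this file

raw = pd.read_excel("pretty_report.xlsx", header=None)   # hypothetical file
header_part = raw.iloc[:HEADER_ROWS].fillna("")
data = raw.iloc[HEADER_ROWS:].reset_index(drop=True)

# Concatenate the header fragments for each column, skipping blank cells.
data.columns = [
    " ".join(str(v).strip() for v in header_part[col] if str(v).strip())
    for col in header_part.columns
]
print(data.head())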

If you cancel a workflow while it's writing to a file, the file creation will not be rolled back, and hence a partial file will have been created.

This is problematic when working with incremental loads that rely on files from previous runs.

 
My proposal is to have an output mode which allows transactional writing: if the workflow is cancelled, nothing is written. This could be done by writing to a temporary file first and then renaming it.
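A minimal sketch of that temporary-file-then-rename approach in Python (the output file name is illustrative):

import os
import tempfile

def atomic_write(path, rows):
    # Write to a temp file in the target directory, then rename it into place,
    # so a cancelled run never leaves a partial output file behind.
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".partial")
    try:
        with os.fdopen(fd, "w", encoding="utf-8", newline="") as tmp:
            for row in rows:
                tmp.write(row + "\n")
        os.replace(tmp_path, path)  # atomic on the same filesystem
    except BaseException:
        os.remove(tmp_path)  # discard the partial file if anything fails
        raise

atomic_write("daily_extract.csv", ["id,amount", "1,100", "2,250"])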

There should be a quick way to delete all unwanted tools for a specific Output/Browse tool in a workflow. This would be useful when we have a huge workflow with multiple cross-connections. Deleting all tools that are not required would make it easier and faster to test that part in isolation.
