Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!

Featured Ideas

Hi, I was looking for this but couldn't find a similar idea, so I'm posting a new one. If someone knows about a similar idea, please ask the moderators to merge them. The idea is a native function to count characters in a string:

 

CountChars(<String>, <char to count>, <case sensitive>)

 

Where <char to count> and <case sensitive> are optional parameters.

If <char to count> is not provided, the function will return the total character count within the <String>.

If <char to count> is provided, it will return the number of occurrences of that character within the <String>.

 

PS: For those tempted to suggest a workaround, I've been using REGEX_CountMatches() for this. The focus, though, is to simplify the user's experience and improve workflow performance by providing a native function, instead of using REGEX, which is very demanding on resources.
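
For illustration only, here is a minimal sketch of the proposed behaviour, written in Python since the function doesn't exist in Alteryx yet. The argument names mirror the proposal above; treating <case sensitive> as defaulting to true is my own assumption.

```python
# Sketch of the proposed CountChars semantics - illustrative, not Alteryx formula syntax.
def count_chars(string, char_to_count=None, case_sensitive=True):
    if char_to_count is None:
        return len(string)              # no character given: total character count
    if not case_sensitive:
        string = string.lower()
        char_to_count = char_to_count.lower()
    return string.count(char_to_count)  # occurrences of the given character

count_chars("Alteryx")              # 7
count_chars("Alteryx", "a")         # 0 (case sensitive by default, per my assumption)
count_chars("Alteryx", "a", False)  # 1
```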

The canvas has 3 options as demonstrated by exhibit A:

[Exhibit A: Capture.PNG]

 

The user settings can change 2 of the 3 defaults, as demonstrated in exhibit B: the layout direction and the connection progress setting can both be defaulted for all new workflows.

[Exhibit B: Capture2.PNG]

 

Thus, I would propose that a user setting be added for the annotation default so that I can set it to Hide.

 


As @JordanB mentioned in his post (https://community.alteryx.com/t5/Alteryx-Knowledge-Base/Stop-workflow-on-a-condition/tac-p/74403#M19...) - there's a common need to stop a workflow when a condition is met.

However, at present there's no way to do this without generating an error.

 

Please can we either alter the Message/Test tools to allow for error-free termination on a formula condition, or alternatively implement the fuller idea that Mark ( @MarqueeCrew ) mentioned in his Programmatic Detour idea?

 

https://community.alteryx.com/t5/Alteryx-Product-Ideas/Programmatic-Detour/idi-p/12763

We now have the ability to output to an ESRI File Geodatabase, which is great, but it only allows output in the WGS84 coordinate system. I would like the same functionality to export to other projections or coordinate systems, similar to the ESRI Shapefile or ESRI Personal Geodatabase output tools (we specifically need NAD83, but I'm sure others would like other options as well).

Apologies if this has been suggested or exists. I often find myself using manual Excel files as a data source. These files frequently use cell formatting elements, such as cell color and text color, to convey important information. However, when these files are imported into Alteryx, this valuable formatting information is unfortunately lost.

 

To address this, a dedicated input tool that can read Excel files with separate fields for these formatting elements would be very helpful. This would be incredibly beneficial, especially when the data lacks other fields that relate to the coloring. Currently, I manage to achieve this using a Python tool, but integrating this as a built-in feature in Alteryx would undoubtedly be more efficient and user-friendly. This enhancement would not only simplify data preparation but also ensure the preservation of the full context of the original Excel file.
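
For reference, here is a minimal sketch of the kind of output such an input tool could produce, using openpyxl the way I currently do in the Python tool. The file name and sheet name below are placeholders.

```python
# Read cell values together with their formatting as separate fields - a sketch
# of what a formatting-aware Excel input could surface.
import pandas as pd
from openpyxl import load_workbook

wb = load_workbook("manual_file.xlsx")   # placeholder path
ws = wb["Sheet1"]                        # placeholder sheet name

records = []
for row in ws.iter_rows():
    for cell in row:
        records.append({
            "Row": cell.row,
            "Column": cell.column_letter,
            "Value": cell.value,
            "FillColor": cell.fill.start_color.rgb,                         # cell background colour
            "FontColor": cell.font.color.rgb if cell.font.color else None,  # text colour
        })

df = pd.DataFrame(records)   # one record per cell, with formatting kept as extra fields
```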

I'd like to hold CTRL, click on a tool and drag it to somewhere else on the canvas to copy it. 

 

This is functionality common in other software (e.g. Tableau, MS Office). 

 

Currently I have to either:

right click > Copy, right click > Paste, or

Ctrl + C, Ctrl + V. 

 

 

The Dynamic Input tool will not accept inputs with different record layouts. The "brute force" solution is to use a standard Input tool for each file separately and then combine them with a Union tool. The Union tool accepts files with different record layouts and issues warnings. Please enhance the Dynamic Input tool (or, perhaps, add a new tool) so that it combines the Dynamic Input functionality with the more laid-back, inclusive Union tool approach. Thank you.
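
For illustration, a minimal sketch (in Python, outside Designer) of the permissive behaviour described above; the folder path and file pattern are placeholders.

```python
# Read every file matching a pattern and union them even when layouts differ -
# columns are matched by name and missing fields are filled with nulls.
import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob(r"C:\data\monthly\*.csv")]
combined = pd.concat(frames, ignore_index=True, sort=False)
```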

I would like to be able to draw a box around some tools, then maybe right-click to add them to a container.

To avoid some errors occurring during an upgrade or even an installation, it would be great to add an option in the installer to go with a fresh installation (remove any previous Alteryx Designer).

 

If selected, option would:

- Warn users that everything Alteryx related is going to be deleted

- Generate a log of what is going to be removed

- Rename folders and registry keys listed there: https://community.alteryx.com/t5/Alteryx-Designer/Complete-Uninstall-of-Alteryx-Designer/ta-p/402897

(rename instead of delete to avoid "bad surprises")

 

A similar option could exist when one would like to uninstall Alteryx Designer.

 

This would remove the frustration of having to rely on a "white knight" when something happens in the middle of an upgrade or an installation.

 

Thanks,

 

PaulN 

Hello,

As of today, only English is available. But it's hard to convince French customers with French-language data to buy the AIS if it cannot work with their data.

Best regards,

Simon

Alteryx doesn't support querying tables within Apache Ignite via the Ignite ODBC connector. Since Ignite is an in-memory database, adding ODBC connectivity to it from Alteryx would be a big help.

 

https://apacheignite-sql.readme.io/docs/overview 
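
For context, a rough sketch of what the connection looks like through a generic ODBC client. The driver name and connection-string attributes are taken from the Ignite ODBC documentation linked above and should be treated as assumptions; the host, port, and table are placeholders.

```python
# Query an Apache Ignite cluster over its ODBC driver - a sketch, not tested.
import pyodbc
import pandas as pd

conn = pyodbc.connect(
    "DRIVER={Apache Ignite};"      # driver name as registered by the Ignite ODBC installer (assumption)
    "ADDRESS=ignite-host:10800;"   # placeholder host and port
    "SCHEMA=PUBLIC;"
)
df = pd.read_sql("SELECT * FROM my_table", conn)   # placeholder table
conn.close()
```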

Add the ability to lock comment boxes' size, shape, position (send to back), and location on the canvas. This would allow a developer to use a template when creating a workflow without accidentally selecting and/or adjusting these attributes. It would also allow a user to put a tool over the top of the comment box without fear of messing up the visual display of the workflow or of it getting hidden underneath the comment box.

 

Scenario:

Upstream tools end in a Summarize tool that has a set of records with the following fields: EmailAddress, AttachmentUNCPath. So you get a bunch of recipients with various attachments. Each recipient can have different attachments, and this will change each time it's run. In other words, it's fully dynamic.

 

If the same recipient has multiple attachments, then it would be nice to group the recipient and just separate the attachments with a semi-colon (or whatever) in the same field.  Essentially creating one record per recipient, and therefore one email per recipient, and having the Email Tool attach each file.  In other words, mbarone@paychex.com gets one email with 5 attachments.  And next week maybe only 3 attachments, and so on.  

 

Currently the only way I see to accomplish this is with a batch macro.  


It would be infinitely more convenient to have the Email tool accept, by default, multiple attachments in a field as long as they are separated by a semicolon, much like it does for the "To" field.
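
To make the ask concrete, here is a minimal sketch of the grouping step that would feed such an Email tool. The field names match those above; the addresses and paths are made up.

```python
# Collapse multiple attachment records into one semicolon-delimited field per recipient.
import pandas as pd

df = pd.DataFrame({
    "EmailAddress": ["a@example.com", "a@example.com", "b@example.com"],
    "AttachmentUNCPath": [r"\\server\share\one.pdf", r"\\server\share\two.pdf", r"\\server\share\three.pdf"],
})

one_per_recipient = (
    df.groupby("EmailAddress")["AttachmentUNCPath"]
      .apply(";".join)          # join each recipient's attachments with a semicolon
      .reset_index()
)
# EmailAddress    AttachmentUNCPath
# a@example.com   \\server\share\one.pdf;\\server\share\two.pdf
# b@example.com   \\server\share\three.pdf
```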

 

 

Hi there,

 

Adam ( @AdamR_AYX ), Mark ( @MarqueeCrew) and many others have done a great job in putting together super helpful add-in macros in the CREW pack - and James ( @jdunkerley79 ) has really done an incredible job of filling in some gaps in a very useful way in the formula tools.

 

Would it be possible to include a subset of these in the core product as part of the next release?

I'm thinking of (but others will chime in here to vote for their favourite):

- Unique only tool (CReW)

- Field Sort (CReW)

- Wildcard XLSX input (CReW) - this would eliminate a whole category of user queries on the discussion boards

- Runner (CReW) - although this may have issues with licensing since many people don't have command-line permission; Alteryx does really need the ability to do chained dependency flows in a smoother way.

- Date Utils (JDunkerly) - all of James's Date utils - again, these would immediately solve many of the support questions asked on the discussion forum

 

I think that these would really add richness & functionality to the core product, and at the same time get ahead of many of the more common queries raised by users.   I guess the only question is whether the authors would have any objection?

 

Thank you

Sean

When working in a large workflow, wireless connections help make it easier to work with. However, sometimes you want to be able to see all your connections (when debugging).

 

I'd like to see a toggle (a button on the toolbar) which would display all the connections, including wireless ones. Ideally the wireless connections would be a different color. You could then click the button again to make the wireless connections invisible.

 

Reason:

The existing display options are limited, as you have to click on individual tools to see their connections.

 

Please enhance the Input tool with a feature to test whether the file is there, and another to allow the workflow to pause for a definable period if the input file is locked by another user, then retry opening. The pause time-frame should be definable in N seconds, and the number of iterations it cycles through should also be definable, so you can limit how many attempts it makes to open a file.
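
To illustrate the behaviour I mean, here is a minimal sketch in plain Python. The path, wait time, and retry count are placeholders, and the locked-file check assumes a Windows-style PermissionError.

```python
# Wait-and-retry open of an input file: test presence first, then retry while locked.
import os
import time

FILE_PATH = r"\\server\share\daily_input.csv"   # placeholder
WAIT_SECONDS = 30                                # definable pause between attempts
MAX_ATTEMPTS = 10                                # definable limit on retries

def open_when_available(path, wait, attempts):
    """Return an open handle, or None if the file is absent or stays locked."""
    if not os.path.exists(path):
        return None                   # file not there: caller decides whether that's an error
    for _ in range(attempts):
        try:
            return open(path, "rb")   # succeeds once the other user releases the lock
        except PermissionError:
            time.sleep(wait)          # pause, then retry
    return None

handle = open_when_available(FILE_PATH, WAIT_SECONDS, MAX_ATTEMPTS)
if handle is None:
    print("File missing or still locked; stopping without raising an error.")
```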

 

File presence should be something we could use to control workflow processing.  

 

A use case would be a process that runs periodically and looks to see if a file is there; if so, it opens and processes it. But if the file is not there, it goes to sleep for a definable period before trying again, or simply ends processing of the workflow without attempting to run any downstream tools that might otherwise throw "errors" trying to process a null stream.

 

An extension of this idea would be a separate tool that could evaluate a condition (a null stream, field content, or a file-not-found condition) and terminate the process without raising an error indicator; or perhaps it could be configurable, so you could choose whether or not an error occurs.

 

Using this latter idea, we would have an enhanced Input tool that can pass a value downstream or generate a null data stream to the next tool. That next tool could then evaluate a condition (much like a Filter tool), such as a null stream, a file-not-found indicator, or something else, and terminate processing per its configuration, either with or without a failure indicated, according to the wishes of the user. I have had times when a file was not there and I just wanted the workflow to stop without throwing errors; other times I may want it to error out so that I investigate. In other scenarios my data goes through a filter or two, no data passes the last filter, and downstream tools still run and generally fail because they have no data to act on. I don't want that: it may be perfectly valid that on a Sunday or holiday no data passes the filters.

 

Having meandered through this, I'll sum up: the ideal would be to enhance the Input tool to test file presence and pass that information on to another tool that can evaluate it and control the workflow run accordingly. As a separate tool, it could be applied to a wider variety of scenarios and test a broader scope of conditions to decide whether to proceed or terminate the workflow.

 

It can be daunting to find the tool that is currently being processed by the engine in workflows that contain hundreds of tools with many ins, outs, and branches. During runtime, I want to be shown the tool that is running on the canvas. This functionality should be in the form of a button or something to direct focus to that area. It should not be the default.

This functionality would allow the user to select (through a highlight box or Ctrl+click) only the tools in a workflow they want to run; the tools that are not selected would be skipped. The idea is similar to the new "add selected tools to a new tool container", but it would run them instead.

 

I know the conventional wisdom is to either put everything you don't want run into a tool container and disable it, or to just copy/paste the tools you want run into a blank workflow. However, for very large workflows, it is very time-consuming to disable a dozen or more containers, only to re-enable them shortly afterwards, especially if those containers have to be created just to isolate the tools that need to be run. Overall, this would be a quality-of-life improvement that could save the user some time, especially with large or cumbersome workflows.

Many software & hardware companies take a very quantitative approach to driving their product innovation so that they can show an improvement over time on a standard baseline of how the product is used today; and then compare this to the way it can solve the problem in the new version and measure the improvement.

 

For example:

- Database vendors have been doing this for years using TPC benchmarks (http://www.tpc.org/), where a FIXED set of tasks is agreed as a benchmark, and the database vendors then iterate year over year to improve performance based on these benchmarks

- Graphics card companies or GPU companies have used benchmarks for years (e.g. TimeSpy; Cinebench etc).

 

How could this translate for Alteryx?

- Every year at Inspire - we hear the stats that say that 90-95% of the time taken is data preparation

- We also know that the reason for buying Alteryx is to reduce the time & skill level required to achieve these outcomes - again, as reinforced by the message that we're driving towards self-service analytics & citizen data analytics.

 

The dream:

Wouldn't it be great if Alteryx could say: "In the 2019.3 release - we have taken 10% off the benchmark of common tasks as measured by time taken to complete" - and show a 25% reduction year over year in the time to complete this battery of data preparation tasks?

 

One proposed method:

  • Take an agreed benchmark set of tasks / data / problems / outcomes, based on a standard data set - these should include all of the common data preparation problems that people face like date normalization; joining; filtering; table sync (incremental sync as well as dump-and-load); etc.
  • Measure the time it takes users to complete these data-prep/ data movement/ data cleanup tasks on the benchmark data & problem set using the latest innovations and tools
  • This time then becomes the measure - if it takes an average user 20 mins to complete these data prep tasks today; and in the 2019.3 release it takes 18 mins, then we've taken 10% off the cost of the largest piece of the data analytics pipeline.

 

What would this give Alteryx?

This could be very simple to administer; and if done well it could give Alteryx:

- A clear and unambiguous marketing message that they are super-focussed on solving for the 90-95% of your time that is NOT being spent on analytics, but rather on data prep

- It would also provide focus to drive the platform in the direction of the biggest pain points - all the teams across the platform can then rally around a really deep focus on the user and accelerating their "time from raw data to analytics".   

- A competitive differentiation - invite your competitors to take part too just like TPC.org or any of the other benchmarks

 

What this is / is NOT:

  • This is not a run-time measure - i.e. this is not measuring transactions or rows per second
  • This should be focussed on "Given this problem; and raw data - what is the time it takes you, and the number of clicks and mouse moves etc - to get to the point where you can take raw data, and get it prepped and clean enough to do the analysis".
  • This should NOT be a test of "Once you've got clean data - how quickly can you do machine learning; or decision trees; or predictive analytics" - as we have said above, that is not the big problem - the big problem is the 90-95% of the time which is spent on data prep / transport / and cleanup.

 

There are loads of ways this could be administered - the starting point is to agree to drive this quantitatively on a fixed benchmark of tasks and data.

 

@LDuane ; @SteveA ; @jpoz ; @AshleyK ; @AJacobson ; @DerekK ; @Cimmel ; @TuvyL ; @KatieH ;  @TomSt ; @AdamR_AYX ; @apolly 

I'm sure there's a reason behind it, but can we please be allowed to run calculations on null values in a Formula tool? Right now, if we sum three values (1 + 3 + [null]) it produces [null]. Can the Formula tool just ignore the null values? The only way around this is to fill the [null] cells with a value, and that adds an additional step to what should be a fairly straightforward process. That fill value would have to be different for a multiplication formula vs. an addition formula in order to not change the answer materially, whereas ignoring the value is a more consistent solution.
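
To illustrate the point outside Alteryx, here is a tiny Python sketch of the two behaviours and why the fill-value workaround depends on the operation:

```python
# Null-propagating vs. null-ignoring arithmetic, plus the operation-dependent fill values.
values = [1, 3, None]

# Today's Formula tool behaviour: any null makes the whole result null
strict_sum = None if any(v is None for v in values) else sum(values)    # -> None

# Requested behaviour: simply ignore the nulls
lenient_sum = sum(v for v in values if v is not None)                   # -> 4

# Workaround today: replace nulls first, but the neutral value differs by operation
fill_for_addition = 0        # 1 + 3 + 0 = 4
fill_for_multiplication = 1  # 1 * 3 * 1 = 3
```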