
Alteryx Designer Desktop Ideas


Featured Ideas

Hey Designer Gurus + @NicoleJ,

 

Here's a picture of my canvas (running):

 

[Image: capture.png]

 

I'd like to be able to see COUNTS and PERCENT completion as the workflow is running. In my case, the numbers are BIG, and they are drawn BEHIND the connection lines. In the case of % complete, they obfuscate (fancy term for block) the progress shown on the tool.

 

Currently, if I want to watch the water boil, the paint dry, or the workflow crawl/walk/run, I must change the workflow before saving it to maximize the distance between the tools. I'd like to be able to see both the COUNTS and % complete without the added effort. My idea is to have someone at Alteryx figure out an enhancement for this without engaging the likes of @Hollingsworth, who'll devise some evil keyboard shortcut.

 

Cheers,

 

Mark

Please add support for Databricks' Unity Catalog

 

Currently, when selecting a Databricks connection in the “Connect In-DB” tool and opening the “Query Builder”, only tables in the catalog named “hive_metastore” are listed. That is, Alteryx issues the following table-listing request to Databricks:

Listing tables 'catalog : hive_metastore, schemaPattern : %, tableTypes : null, tableName : %'

 

However, with Unity Catalog in Databricks the namespace is three-tier (catalog.schema.table), and there may be multiple catalogs (not just the "hive_metastore" catalog); see https://docs.microsoft.com/en-gb/azure/databricks/lakehouse/data-objects#--what-is-a-catalog
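For illustration, here is a minimal sketch (not Alteryx internals) of how the three-level namespace can be browsed with the databricks-sql-connector Python package; the hostname, HTTP path, token, and table names are all placeholders:

```python
# Hedged sketch: listing catalogs beyond hive_metastore with the
# databricks-sql-connector package. All connection values are placeholders.
from databricks import sql

with sql.connect(server_hostname="adb-12345.azuredatabricks.net",
                 http_path="/sql/1.0/warehouses/abc123",
                 access_token="dapi...") as conn:
    with conn.cursor() as cursor:
        cursor.execute("SHOW CATALOGS")       # Unity Catalog exposes many catalogs,
        print(cursor.fetchall())              # not just 'hive_metastore'

        # Unity Catalog tables are addressed with a three-level name:
        # <catalog>.<schema>.<table> (names below are hypothetical)
        cursor.execute("SELECT * FROM main.sales.orders LIMIT 10")
```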

 

I reached out to Alteryx Support, who replied that there is currently a feature request for implementing this change (ID TDCB-4056) and suggested that I also post the idea here.

 

Thanks in advance.

Hi Community,

Microsoft will deprecate Basic authentication, so OAuth2 support needs to be included in the Email tool. https://docs.microsoft.com/en-us/exchange/clients-and-mobile-in-exchange-online/deprecation-of-basic...

Microsoft is removing the ability to use Basic authentication in Exchange Online for Exchange ActiveSync (EAS), POP, IMAP, Remote PowerShell, Exchange Web Services (EWS), Offline Address Book (OAB), and Outlook for Windows and Mac. This change will be effective on October 1, 2022.
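For illustration, a minimal sketch of the app-only OAuth2 flow involved, using MSAL and the Microsoft Graph sendMail endpoint; the tenant, client, secret, and addresses are all placeholders:

```python
# Hedged sketch, not the Email tool's implementation: app-only OAuth2 via MSAL,
# then sending through the Microsoft Graph sendMail endpoint.
# tenant_id, client_id, client_secret and the addresses are placeholders.
import msal
import requests

app = msal.ConfidentialClientApplication(
    client_id="<client_id>",
    authority="https://login.microsoftonline.com/<tenant_id>",
    client_credential="<client_secret>",
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

message = {
    "message": {
        "subject": "Workflow finished",
        "body": {"contentType": "Text", "content": "Done."},
        "toRecipients": [{"emailAddress": {"address": "user@example.com"}}],
    }
}
resp = requests.post(
    "https://graph.microsoft.com/v1.0/users/sender@example.com/sendMail",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    json=message,
)
resp.raise_for_status()   # Graph returns 202 Accepted on success
```

Best regards,

JP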

Hello!

 

I'm submitting this idea to ask that other products be added to the Alteryx students program. I think that we (students) should have access to study these products as well (not only the Intelligence Suite, but Server too).

We see canvasses every day where dozens of fields are brought into a canvas or a macro but never used - and this just creates slowness for no good benefit.

 

Given that one of the selling features of Alteryx is the speed of processing - could we look at three improvements to the Alteryx engine & Designer:

  • easiest: Keep track of every field brought in / created - and if they are not used in an output, then throw a warning at the end of the execution process
    • For example - you bring in fields a,b,c - you create field d and e during the flow in formula tools
    • Field d is never used as an input to any filters or formulae - and it doesn't appear on any output - so it's just waste
    • Field a and b are part of the output, so they are fine
    • Field c is never used at all - so that's just waste.
    • Field e is used to filter the records before output - so this one is fine.
    • So we've immediately found 2 fields that we can eliminate and make this canvas faster (see the sketch after this list)
  • Medium: Ignore the unused fields in the execution engine
  • Hardest: Tell the users that their field is unused in Alteryx Designer by doing a lineage analysis of the tools, just like software environments such as Visual Studio do. This may require a change to the engine & to Designer, because we would need to make each tool capture the full detail of the fields it knows about in its configuration in order to do this trace.
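To make the "easiest" option concrete, here is a rough Python sketch of the bookkeeping involved, using the a/b/c/d/e example above; the tool names and structures are illustrative, not the engine's actual internals:

```python
# Illustrative sketch only (not the Alteryx engine): detect unused fields by
# tracing which fields are read by any downstream tool or output.
fields_in_or_created = {"a", "b", "c", "d", "e"}

# Fields each downstream tool actually reads (filters, formulas, outputs).
reads_by_tool = {
    "Formula": {"a", "b"},   # d and e are created from a and b
    "Filter":  {"e"},        # e drives the filter before output
    "Output":  {"a", "b"},   # only a and b are written out
}

used = set().union(*reads_by_tool.values())
unused = fields_in_or_created - used
print(f"Warning: unused fields {sorted(unused)}")   # -> ['c', 'd']
```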

 

 

 

We've been looking into the phoneHome information that collects usage of Designer in the enterprise, and it looks like this data lands in the UsageReports collection, I believe.

Please can you add the CanvasFilename that was run to this data - we need to be able to surveil the use of Alteryx in our enterprise that is not happening within the Server environment, and without the canvas name this becomes tremendously difficult.
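For illustration, this is the kind of query we would like to be able to run once the canvas name is captured; note that the database name and the CanvasFilename field are assumptions on my part, not the current schema:

```python
# Hypothetical query, not the current schema: the database name
# 'AlteryxService' and the 'CanvasFilename' field are assumptions.
from pymongo import MongoClient

usage = MongoClient("mongodb://server:27018")["AlteryxService"]["UsageReports"]
for doc in usage.find({}, {"UserName": 1, "CanvasFilename": 1, "StartTime": 1}):
    print(doc)
```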

 

Reference:

 

https://help.alteryx.com/server/11.0/admin/index.htm#Configuration/SaveDesignerUsageData.htm%3FTocPa...

https://community.alteryx.com/t5/Analytics-Blog/Alteryx-Analytics-11-0-Monitoring-and-Reporting-on-A...

 

cc: @BenG @avinashbonu @Deeksha @BenBu @revathi

We frequently have issues where users report slowness from an Alteryx installation on a particular machine, or where a specific tool or package fails to install correctly.

 

For our admin teams, this becomes a debugging exercise of working through different permutations to understand the cause - and if it is escalated to Alteryx Support, it becomes even tougher.

 

Could we think about including a basic "Self Diagnostic" in Alteryx which runs through the basic functionality of Alteryx with some basic timings; checks that Python is working correctly; checks the memory allocation and temporary disk space - and then either persists this to disk and/or sends it to a central environment for analysis?
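As a sketch of the sort of checks meant here (standard-library Python only; the probe, paths, and output location are illustrative, not a proposed spec):

```python
# Illustrative self-diagnostic sketch; all checks and paths are placeholders.
import json, os, shutil, sys, tempfile, time

report = {}

t0 = time.perf_counter()                            # basic CPU timing probe
sum(i * i for i in range(1_000_000))
report["cpu_probe_seconds"] = round(time.perf_counter() - t0, 3)

report["python_version"] = sys.version.split()[0]   # is embedded Python alive?

tmp = tempfile.gettempdir()                         # temporary disk space
total, used, free = shutil.disk_usage(tmp)
report["temp_free_gb"] = round(free / 1e9, 1)

# Persist to disk and/or ship to a central collector for the admin team.
with open(os.path.join(tmp, "alteryx_selfcheck.json"), "w") as f:
    json.dump(report, f, indent=2)
print(json.dumps(report, indent=2))
```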

 

Given a large deployed environment like ours (over 10 000 seats deployed), self-check telemetry like this would give the central team a massive increase in their ability to manage the deployed base, and at the same time significantly reduce the time to resolve support issues.

There needs to be a way to step into a macro which is a component of a parent workflow, for debugging.

 

Currently the only way to debug these is to capture the inputs to the macro from the parent workflow, and then run the macro with those amended inputs. For iterative / batch macros, there is no option to debug at all. This can be tedious, especially if there are a number of inputs, large amounts of data, or you have nested macros.

 

There should be an option on the tool representing the macro in the parent workflow to trigger a debug when running the workflow. This would result in the same behavior as choosing 'Debug' from the Interface Designer panel in the macro itself: a new 'debug' workflow is created with the inputs received from the parent workflow.

 

For iterative / batch macros, the iteration / control-parameter value on which the debug should trigger would also need to be specified. So if a macro returns an error on the 3rd iteration, the user ticks 'Debug' and sets Iteration = 3. If the run doesn't reach the 3rd iteration, then no debug workflow is created.

(1) I would like to have more text formatting options available in the Comment Tool, such as:

  • set a different format for selected words (color, bold, underline, size...)
  • indenting
  • bullet list or numbered list

(2) Option to remove or recolor the blue outline of the comment box. (Especially when I have a comment in a color-filled comment box, I would prefer a comment box without a dark outline.) 

 

(3) UX -  Add an arrow cursor to indicate resizing functionality

Hello all,

 

Some databases, including Hive, natively support scheduled queries (yes, the scheduling configuration lives inside the database, not in the ETL/data-prep system). I think this would be an interesting feature for in-DB workflow output: you run the workflow once and then only have to re-run it when the logic changes; the database does the scheduling (a sketch follows the quoted intro below).



https://cwiki.apache.org/confluence/display/Hive/Scheduled+Queries

Intro

Executing statements periodically can be useful for:

  • Pulling information from external systems
  • Periodically updating column statistics
  • Rebuilding materialized views
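For example, under the Hive 4 scheduled-query DDL documented on the wiki page above, an in-DB output could hand the refresh over to the database itself; a rough Python sketch where the host, table names, and cron spec are placeholders:

```python
# Sketch, assuming the Hive 4 scheduled-query DDL from the wiki page above;
# host, table names, and the (Quartz-format) cron spec are placeholders.
from pyhive import hive

ddl = """
CREATE SCHEDULED QUERY refresh_sales_agg
CRON '0 0 * * * ? *'
AS INSERT OVERWRITE TABLE sales_agg
   SELECT region, SUM(amount) FROM sales GROUP BY region
"""

conn = hive.connect(host="hive-server.example.com")
cur = conn.cursor()
cur.execute(ddl)    # from here on, Hive re-runs the refresh hourly by itself
cur.close()
conn.close()
```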

 

Best regards,

Simon

Speed up canvas edits - The Create/Remove Space Tool

 

Usually by day two of working with a canvas I realize that I have been a fool and have come up with a significantly more elegant or simpler solution. Moving all of the containers or tools to fit my slick new container is cumbersome and slow. I've created a GIF of a feature several tools have which allows the user to easily move and arrange items on the canvas.

 

Open source tool used in demo: bpmnJs

 

 

[Animation: giphy.gif]

 

 

 

 

The Azure Machine Learning Training and Scoring Tools seem like a great way to improve the Azure ML process.

Introducing: The Azure Machine Learning Training and Scoring Tools 

We tried to use these tools but can't log in to Azure ML correctly. We have several tenant IDs, and the login lands in the tenant for Office 365, not the one for Azure ML.

====================== <Error Message> ==========================================================
Error: Azure ML Training (367): UserErrorException:
    Message: You are currently logged-in to 55f0a...-.............................................. tenant. You don't have access to d846a...-............................................. subscription, please check if it is in this tenant. All the subscriptions that you have access to in this tenant are =
 [SubscriptionInfo(subscription_name='Microsoft Azure Enterprise', subscription_id='754c5...-...........................')].
 Please refer to aka.ms/aml-notebook-auth for different authentication mechanisms in azureml-sdk.
    InnerException None
    ErrorResponse
=======================================================================================================

Microsoft states that the tenant needs to be specified if you have access to multiple tenants.

Set up authentication for Azure Machine Learning resources and workflows 
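As a point of reference, the azureml-sdk already allows pinning the tenant explicitly; a hedged sketch of the workaround in Python, where all IDs and names are placeholders:

```python
# Workaround sketch with the azureml-sdk (not the Alteryx tool itself):
# InteractiveLoginAuthentication accepts an explicit tenant. All IDs and
# names below are placeholders.
from azureml.core import Workspace
from azureml.core.authentication import InteractiveLoginAuthentication

auth = InteractiveLoginAuthentication(tenant_id="<azure_ml_tenant_id>")
ws = Workspace.get(
    name="my-workspace",
    subscription_id="<subscription_id>",
    resource_group="my-resource-group",
    auth=auth,          # logs in to the right tenant, not the Office 365 one
)
print(ws.name)
```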

[Image: temp.JPG]
Could you add a Tenant ID entry to the Azure credentials so that we can use this tool?

[Image: temp2.JPG]

Please upgrade the "curl.exe" packaged with Designer from 7.15 to 7.55 or greater to allow for the -k flag. Also, please expose the -k functionality in the Alteryx Download tool.

 

-k, --insecure

(TLS) By default, every SSL connection curl makes is verified to be secure. This option allows curl to proceed and operate even for server connections otherwise considered insecure.

The server connection is verified by making sure the server's certificate contains the right name and verifies successfully using the cert store.
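In the meantime, a stopgap sketch via the Python tool: the requests library can skip certificate verification the same way curl -k does (the URL is a placeholder; only use this for endpoints you trust):

```python
# Stopgap sketch: requests' verify=False is the equivalent of curl -k.
# The URL is a placeholder; skipping verification is insecure by design.
import requests
import urllib3

urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
resp = requests.get("https://self-signed.example.com/api", verify=False)
print(resp.status_code, len(resp.content))
```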

 

Regards,

John Colgan

Have you ever used a Join tool with several (or many) join fields, looked at the L and R outputs, and wondered why those records didn't join? When there are many columns in your data, this can be a hard question to answer. It would be very handy if Alteryx could somehow report the field(s) that each record failed to join on (perhaps as an optional added field on the L and R outputs).
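To sketch the requested diagnostic in pandas terms (toy data, not the Join tool's implementation): for each unjoined left record, test each join field on its own against the right input:

```python
# Conceptual sketch of the requested diagnostic, using pandas with toy data.
import pandas as pd

left = pd.DataFrame({"id": [1, 2], "region": ["EU", "US"], "x": [10, 20]})
right = pd.DataFrame({"id": [1, 3], "region": ["EU", "EU"], "y": [7, 8]})
keys = ["id", "region"]

merged = left.merge(right, on=keys, how="left", indicator=True)
for _, row in merged[merged["_merge"] == "left_only"].iterrows():
    # A field "failed" if its value never occurs at all in the right input;
    # if every field occurs individually, it was the combination that missed.
    failed = [k for k in keys if row[k] not in set(right[k])]
    print(f"id={row['id']}: no match on {failed or 'the key combination'}")
```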

Preface: I have only used the in-DB tools with Teradata so I am unsure if this applies to other supported databases.

 

When building a fairly sophisticated workflow using in-DB tools, the workflow may sometimes fail because the underlying queries run up against CPU / memory limits. This is most common when doing several joins back to back, as Alteryx sends this as one big query with various nested subqueries. When working with datasets in the hundreds of millions and billions of records, this can be extremely taxing for the DB to run as one huge query. (It is possible to get around this by using an in-DB write-out to a temporary table as an intermediate step in the workflow.)

 

When a routine does hit an in-DB resource limit and the DB kills the query, Alteryx immediately fails the workflow run. Any "temporary" tables Alteryx creates are in reality permanent tables that Alteryx usually just drops at the end of a successful run. If the run does not end successfully due to hitting a resource limit, these "temporary" (permanent) tables are not dropped. I only noticed this after building out a workflow and running up against a few resource limits; I then started getting database out-of-space errors. Upon looking into it, I found all the previously created "temporary" tables were still there and taking up many TB of space.

 

My proposed solution is for Alteryx's in-DB tools to drop any "temporary" tables they have created when a run ends - regardless of whether the entire module finished successfully.
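Until then, the behavior can be approximated by wrapping the run so the intermediate tables are always dropped, pass or fail; a rough Python sketch where the DSN, query, and table names are placeholders (Teradata via pyodbc):

```python
# Sketch of the proposed pass-or-fail cleanup; DSN and names are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=teradata_prod")
temp_tables = ["stage_join_1", "stage_join_2"]
cur = conn.cursor()
try:
    # ...the big nested in-DB query, staged through the temp tables...
    cur.execute("CREATE TABLE stage_join_1 AS (SELECT 1 AS c) WITH DATA")
    cur.execute("SELECT * FROM stage_join_1")   # may blow a resource limit
finally:
    for t in temp_tables:
        try:
            cur.execute(f"DROP TABLE {t}")      # drop even when the run failed
        except pyodbc.Error:
            pass                                # never created / already gone
    conn.commit()
    conn.close()
```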

 

 

Thanks,

Ryan

I have been developing and accumulating custom functions over the years, and they have proved to be very useful. I am submitting them here; I hope they are found to be beneficial.

Functions included in the attached file include:

  • DateTime
    • StandardDate(String) - Transforms any valid string to the standard date format yyyy-mm-dd
  • File
    • FileDirDepth(Path) - Returns the zero based depth of the path (zero being the root)
    • FileGetFolder(Path, Depth) - Returns the folder name given the zero based depth in the path (zero being the root)
  • String
    • LeftPart(String, Separator) - Returns the left part of a string up to the first separator
    • RightPart(String, Separator) - Returns the right part of a string after the first separator
    • Split(String, Delimiter, Index) - Returns the zero indexed part of a delimited string
    • CleanSpace(String) - Trims string and replaces multiple spaces with a single space
    • UnicodeToASCIIBasic(String) - Replaces all Unicode Characters with ASCII Basic equivalents
  • Test
    • InList(Variable, List) - If Variable is in List returns True. List must be pipe delimited
    • IsValidEmail(String) - Returns True if string is a valid email format
    • IsUUID(String) - Returns True if string is a valid UUID

To make these functions available in Alteryx, place the attached XML file in the folder C:\Program Files\Alteryx\bin\RuntimeData\FormulaAddIn if you have a standard installation. If the install is non-standard, find the \bin\RuntimeData\FormulaAddIn folder and place the attached XML file there. Alteryx will need a restart for the functions to become available.
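For anyone curious about the intended semantics, here are hedged Python equivalents of a few of the string functions (the attached XML add-in is the actual implementation; these are just illustrations of the behavior described above):

```python
# Python equivalents of a few of the functions listed above, for illustration.
import re

def left_part(s: str, sep: str) -> str:
    """Left part of s up to the first separator."""
    return s.split(sep, 1)[0]

def right_part(s: str, sep: str) -> str:
    """Right part of s after the first separator."""
    parts = s.split(sep, 1)
    return parts[1] if len(parts) > 1 else ""

def split_part(s: str, delim: str, index: int) -> str:
    """Zero-indexed part of a delimited string."""
    parts = s.split(delim)
    return parts[index] if 0 <= index < len(parts) else ""

def clean_space(s: str) -> str:
    """Trim and collapse runs of whitespace to a single space."""
    return re.sub(r"\s+", " ", s).strip()

assert left_part("a|b|c", "|") == "a"
assert split_part("a|b|c", "|", 2) == "c"
assert clean_space("  too   many  spaces ") == "too many spaces"
```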

 

When documenting Alteryx screens I sometimes hit Print Screen and need to paste the important parts into a Comment tool...

But there is no paste from clipboard 😞

 

[Image: idea.png]

I suggest adding a minor icon that enables not only reading an image from a PNG file but also pasting a screenshot or other image copied to the clipboard...

 

For now I need the following workaround: I hit Print Screen and capture the screen as is;

 

[Image: Join doc.png]

 

Then I put that into a PNG or JPG file using Paint, and then prepare a comment box with that image in the background...

 

[Image: Becomes.png]

 

I believe that, in addition to the already-suggested idea of an option to avoid sending one email per record, the attachments capability should be overhauled. Sending multiple attachments in a single email is a common need, but the only existing Community idea addresses the issue partially by requesting the ability to use semicolon-separated paths in a single field as the attachment criterion. That doesn't seem optimal given the potential usefulness of the tool and ease-of-use considerations.

 

I think that a full solution should include:

  • The capability to select a (file paths) field of all desired attachments which can then be uploaded into a single email
  • The ability to use wildcards or directories in the file input mode (as you find in the Input tool) in order to attach multiple files to a single email

This would be a transformative solution to a common email need, and I think it would be greatly appreciated!
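As a sketch of the intended behavior using only Python's standard library (addresses, server, and file paths are placeholders): one message, many attachments taken from a list of paths:

```python
# Sketch of the requested behavior: one email, many attachments from a list
# of file paths. All addresses, server names, and paths are placeholders.
import smtplib
from email.message import EmailMessage
from pathlib import Path

attachment_paths = ["report1.xlsx", "report2.xlsx", "summary.pdf"]

msg = EmailMessage()
msg["Subject"], msg["From"], msg["To"] = "Reports", "me@example.com", "you@example.com"
msg.set_content("All reports attached to a single email.")

for p in map(Path, attachment_paths):
    msg.add_attachment(p.read_bytes(),
                       maintype="application", subtype="octet-stream",
                       filename=p.name)

with smtplib.SMTP("smtp.example.com") as smtp:
    smtp.send_message(msg)
```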

I know that incoming and outgoing connections can be wired or wireless, and that they highlight when one clicks on a tool. However, it would be very useful to be able to highlight a particular connector in a particular colour (selected from a palette, perhaps, from the drop-down window or from the configuration). This would be especially useful when there are many connectors originating from a single tool.

Thanks

The behavior of an "Overwrite Sheet (Drop)" configuration is such that it breaks formulas (#REF!) that point to the overwritten sheet, as well as named ranges that reference it. This is a bummer because the only way I've found to overcome the issue is to write a script that re-applies the named range. That works, but it greatly raises the barrier to using this tool, and in some corporate environments it won't even be possible.

 

What would probably be a good alternative behavior is to delete the contents of the sheet rather than the rows/columns/cells of the sheet. I think both probably have valid use cases, but my proposed functionality would cause fewer issues and be the more popular behavior for most users. I believe there is a Google Sheets API call for just this kind of behavior...
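For what it's worth, the proposed behavior is easy to express with openpyxl: blank out the cell values instead of dropping the sheet, so formulas and named ranges that point at it keep their references (file and sheet names are placeholders):

```python
# Sketch of the suggested alternative: clear contents, keep the sheet object.
# The file name and sheet name are placeholders.
from openpyxl import load_workbook

wb = load_workbook("report.xlsx")
ws = wb["Data"]
for row in ws.iter_rows():
    for cell in row:
        cell.value = None      # clear the value; the sheet itself survives
# ...write the new data into ws here, then:
wb.save("report.xlsx")
```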
