The Product Idea boards have gotten an update to better integrate them within our Product team's idea cycle! However, this update does have a few unique behaviors; if you have any questions about them, check out our FAQ.

Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!
Submitting an Idea?

Be sure to review our Idea Submission Guidelines for more information!

Submission Guidelines

Featured Ideas

0 Likes

My organization requires users to change their Active Directory passwords periodically. That's fine. It does create the chore of updating passwords in every workflow that contains a Tableau Server publish, but that's not the real issue here.

 

What's more annoying is that once that password is changed, the data source name and project are lost. When updating multiple data sources across workflows (or within the same workflow), this becomes quite cumbersome and annoying.

 

I know there are limitations on the partner side, as this is technically controlled by Tableau, but is there anything that can be done here?

0 Likes

I often copy chunks of a workflow and paste them into the same workflow (or a different one). The paste always seems to land just diagonally below the upper-left-most tool, which creates a real mess. I'd like to be able to select a small area within the canvas and have the pasted chunk drop there, instead of on top of the existing build.

0 Likes

Hello All,

I received the following message from an AWS advisor:

_____________________________________________

Skip Compression Analysis During COPY
Checks for COPY operations delayed by automatic compression analysis.

Rebuilding uncompressed tables with column encoding would improve the performance of 2,781 recent COPY operations.
This analysis checks for COPY operations delayed by automatic compression analysis. COPY performs a compression analysis phase when loading to empty tables without column compression encodings. You can optimize your table definitions to permanently skip this phase without any negative impacts.

Observation

Between 2018-10-29 00:00:00 UTC and 2018-11-01 23:33:23 UTC, COPY automatically triggered compression analysis an average of 698 times per day. This impacted 44.7% of all COPY operations during that period, causing an average daily overhead of 2.1 hours. In the worst case, this delayed one COPY by as much as 27.5 minutes.

Recommendation

Implement either of the following two options to improve COPY responsiveness by skipping the compression analysis phase:

  • Use the column ENCODE parameter when creating any tables that will be loaded using COPY.
  • Disable compression altogether by supplying the COMPUPDATE OFF parameter in the COPY command.

The optimal solution is to use column encoding during table creation, since it also maintains the benefit of storing compressed data on disk. Execute the following SQL command as a superuser in order to identify the recent COPY operations that triggered automatic compression analysis:
WITH xids AS (
    SELECT xid FROM stl_query
    WHERE userid > 1 AND aborted = 0
      AND querytxt = 'analyze compression phase 1'
    GROUP BY xid)
SELECT query, starttime, complyze_sec, copy_sec, copy_sql
FROM (SELECT query, xid, DATE_TRUNC('s', starttime) starttime,
             SUBSTRING(querytxt, 1, 60) copy_sql,
             ROUND(DATEDIFF(ms, starttime, endtime)::numeric / 1000.0, 2) copy_sec
      FROM stl_query q JOIN xids USING (xid)
      WHERE querytxt NOT LIKE 'COPY ANALYZE %'
        AND (querytxt ILIKE 'copy %from%' OR querytxt ILIKE '% copy %from%')) a
LEFT JOIN (SELECT xid,
                  ROUND(SUM(DATEDIFF(ms, starttime, endtime))::numeric / 1000.0, 2) complyze_sec
           FROM stl_query q JOIN xids USING (xid)
           WHERE (querytxt LIKE 'COPY ANALYZE %'
                  OR querytxt LIKE 'analyze compression phase %')
           GROUP BY xid) b USING (xid)
WHERE complyze_sec IS NOT NULL
ORDER BY copy_sql, starttime;

Estimate the expected lifetime size of the table being loaded for each of the COPY commands identified by the SQL command. If you are confident that the table will remain under 10,000 rows, disable compression altogether with the COMPUPDATE OFF parameter. Otherwise, create the table with explicit compression prior to loading with COPY.

_____________________________________________

 

When I ran the suggested query to check the COPY commands executed, I realized they all came from the Redshift Bulk output from Alteryx.

 

Is there any way to implement this “Skip Compression Analysis During COPY” recommendation in Alteryx to maximize performance, as suggested by AWS?
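For reference, outside Alteryx the two options AWS describes look roughly like this in plain SQL (the table, column, bucket, and IAM role names below are invented for illustration):

```sql
-- Option 1: declare column encodings at table creation, so COPY into the
-- empty table skips its automatic compression analysis phase.
CREATE TABLE sales (
    sale_id BIGINT       ENCODE az64,
    sold_at TIMESTAMP    ENCODE az64,
    notes   VARCHAR(256) ENCODE lzo
);

-- Option 2: disable compression analysis on the COPY command itself.
COPY sales
FROM 's3://my-bucket/sales/'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
COMPUPDATE OFF;
```

The idea would be for the Redshift Bulk output to either emit COMPUPDATE OFF on its COPY, or honor explicit encodings on pre-created tables.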

 

Thank you in advance,

 

Gabriel

0 Likes

I have a process that sends out about 1,500 emails. Every once in a while, it gets stuck at some percentage and I eventually have to cancel the workflow, figure out how many emails were sent, and then skip that many emails to avoid sending duplicates. To figure out how many were sent, I currently take the tool's percentage at cancellation minus 50% (since that is where it starts), multiply it by 2, and then multiply that percentage by the number of rows to get the approximate row where it froze; then I reach out to individuals to see if they received the email, to narrow down exactly where the error occurred.

 

Example: 60% - 50% = 10%; 10% x 2 = 20%; 20% of 1,249 rows = 249.8.

 

This has been pretty accurate in the past, but it is obviously not ideal. Is there no way to show how many emails were sent, even if the workflow was cancelled mid-tool?
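The back-of-envelope estimate above can be written as a tiny function (the 50% starting point and the example numbers come from the post; the function itself is just an illustration):

```python
def estimate_rows_sent(cancel_pct: float, total_rows: int) -> float:
    """Estimate how many rows the Email tool processed before cancellation.

    The tool's progress reportedly starts at 50%, so the fraction of rows
    actually sent is (cancel_pct - 0.50) * 2.
    """
    sent_fraction = (cancel_pct - 0.50) * 2
    return round(sent_fraction * total_rows, 1)

print(estimate_rows_sent(0.60, 1249))  # → 249.8, matching the worked example
```

A per-row count in the Results window (or a log of rows completed) would make this guesswork unnecessary.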

 

0 Likes

Our company is loving the Insights tool, but I am constantly asked by users whether they can export the data behind the graphs. For example, we have an inventory dashboard for vehicles that starts at a corporate level but can be drilled down to a "Regional" and then an even more focused "Managed Area" level. Once users get down to the "Managed Area" level, they want to export the line-level data feeding into the Insights chart so they can actually view, work, and action the data at a vehicle level.

 

Essentially an option to export the data feeding into the graphs. 

0 Likes

As we know, Alteryx uses R for a lot of its predictive and data analysis tools, and a workflow takes a while to run whenever an R-based tool is involved. I was told by a solutions engineer that this is because Alteryx opens and closes R in the background.

 

Sometimes my workflow has a bunch of tools running R in the background, and it takes forever to run.

 

I think there should be a user setting that lets the user choose whether to start R along with Alteryx and keep it running in the background.
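A long-lived, pre-warmed worker process is the usual pattern for this kind of start-up overhead. A minimal sketch of the idea in Python (multiprocessing stands in for R here, and the doubling task is just a placeholder; this is not how Designer actually invokes R):

```python
import multiprocessing as mp

def worker(tasks: mp.Queue, results: mp.Queue) -> None:
    # Imagine the expensive interpreter start-up happening once, here,
    # instead of once per tool run.
    while True:
        item = tasks.get()
        if item is None:           # sentinel: shut the worker down
            break
        results.put(item * 2)      # stand-in for an R-based tool's work

def run_tasks(values):
    tasks, results = mp.Queue(), mp.Queue()
    proc = mp.Process(target=worker, args=(tasks, results))
    proc.start()                   # pay the start-up cost once
    for v in values:
        tasks.put(v)
    tasks.put(None)
    out = [results.get() for _ in values]   # drain results, then reap worker
    proc.join()
    return out

if __name__ == "__main__":
    print(run_tasks([1, 2, 3]))    # → [2, 4, 6]
```

Every task after the first avoids the spawn cost, which is the saving the setting above would provide for R-based tools.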

 

Thanks,

0 Likes

Hi there,

 

As a beginner in Alteryx with experience in other analytics software, I noticed a very simple thing that I think could be adjusted to improve the experience of a beginner in Alteryx. I'm also happy to hear if this is already possible.

 

When I was doing an introduction training, I noticed that a lot of the questions were about not being able to see the right output, even though the right tools and settings were used. Luckily, we had a good trainer who immediately saw the very simple reason for this: the 'output' anchor (sometimes named differently; in a Filter, for instance, it is called 'True' or 'False') was not selected. Instead, people were looking at the input or something else. I can even imagine that some more advanced users have spent a few minutes wondering what was wrong until they realised they weren't looking at the output.

 

It seems a bit random whether the output or the input gets selected, and as someone with experience in (preventing) addiction in the gaming industry, I know that the first experience is crucial for getting someone 'hooked' :-), and this small inconsistency breaks the flow a bit. Could you make the default such that a tool shows its output rather than its input? A possible addition would be an option that switches a tool back to input every time it gets deselected. From a programmer's/data-science perspective, that would also make a lot of sense.

 

Regards,

Charles

0 Likes

The older versions of the Publish to Tableau Server macro had an option to request an authentication token; however, the latest version does not. Please return this option to the tool, as it is very useful for constructing REST API call scripts.

 

Thank you!

 

~ Eric Marowitz

0 Likes

Please allow a hover-over that shows the value of a variable in the Formula tool. My formulas sometimes get long, and it would be nice to see the value of each variable just by putting the mouse over it; just show the first row, like the preview does. There is similar functionality in Visual Studio, and it makes coding easier.

0 Likes

Can a spell-check option be included that checks the spelling in the Comment tool's text box and in Tool Container captions?

 

Ideally, a global check of each tool's annotation would be nice, especially if it can detect whether I changed a tool's annotation from its default.

0 Likes

Hello,

 

If I go to Options --> Advanced Options --> System Settings, why do I have to click the [Next] button several times before I can get to the "Engine" tab at the very bottom? Why not simply create a user-friendly UI screen where we could navigate directly to the section we want?

 

Please improve the UI.

 

Thanks!

0 Likes

There should be a macro that could be used as a read-input macro for In-DB tools.

Similarly, there should be a write macro for In-DB tools.

 

0 Likes

This is more of an enhancement than a new idea. When building an application that, on success, uses separate Browse windows to display the results, it would be nice to be able to give the Browse windows a title. Currently you see Browse (22), Browse (38), etc. My app checks a certain key value in multiple tables/files and presents the table results if found. I have to rename the data to know which file it comes from, whereas if the Browse windows had a title, you would know which file they represent. The titles could be added in the Interface Designer (see attached).

 

 

0 Likes

I just downloaded the new 2018.4 version of Designer Desktop, and I feel like the window UI is going backwards. I prefer to keep my Results window on a separate screen where I can review it side by side with my workflow. I have three large monitors, and I have no need to keep that window tabbed or docked.

 

With the removal of the 'Close' option, I no longer have a way of closing that window without putting it back into a mode I do not want.

When I first started using Alteryx, the Results window behaved like any normal Windows window, but over time I've lost the ability to quickly maximize it, and now I can't even close it. One of the most critical windows is getting harder and harder to use.

 

Pretty please!

0 Likes

Good afternoon,

 

I work with a large group of individuals (close to 30,000), and a lot of our files are run as .dif/.kat files used to import into certain applications and software that pertain to our work. We were wondering if this has been brought up before and what the possibility might be.

0 Likes


Yeah, so when you have 15 workflows for some folks and you've decided to publish to a test database first, and now you have to publish to a production database, it is a *total hassle*, especially if you are using custom field mappings. Basically you have to go remap N times, where N equals your number of new outputs.

 

Maybe there is a safety / sanity check reason for this, but man, it would be so nice to be able to copy an output, change the alias to a new destination, and just have things sing along.  BRB - gotta go change 15 workflow destination mappings. 

0 Likes

When saving my workflows to the Gallery, I'd like the description field to be populated with the text I used during the last save. I like to add text here, and it is frustrating to have to re-type it every time.

0 Likes

When using the Output Data tool with the following options, there should be an option to not overwrite existing database values with Nulls:

  • File Format = ODBC Database (odbc:)
  • Output Options = Update;Insert if new

 

This applies to MS SQL Server databases in my case, but it might affect other destinations as well.
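For comparison, this is roughly what the desired behavior looks like in plain T-SQL: an upsert that falls back to the existing value whenever the incoming one is Null (the table and column names below are invented for illustration):

```sql
-- Sketch (SQL Server): "Update; Insert if new" that never overwrites
-- an existing value with NULL from the staging data.
MERGE INTO dbo.customers AS t
USING dbo.customers_staging AS s
    ON t.customer_id = s.customer_id
WHEN MATCHED THEN UPDATE SET
    t.email = COALESCE(s.email, t.email),  -- NULL in staging -> keep old value
    t.phone = COALESCE(s.phone, t.phone)
WHEN NOT MATCHED THEN
    INSERT (customer_id, email, phone)
    VALUES (s.customer_id, s.email, s.phone);
```

An "ignore Nulls on update" checkbox on the Output Data tool could generate COALESCE-style updates like this behind the scenes.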

0 Likes

To compare a grid shape before and after a change, both shapes should share a reference point from which the grids are created.

The reference point should be adjustable according to the coordinate system that the designer wants to use.
