
Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!

Featured Ideas

0 Likes

Create a standardized Mailbox application that could bolt onto Alteryx Server to handle incoming attachments from sources like a service desk (ServiceNow, for example) and other applications.

 

Essentially, anything that regularly exports data as emailed attachments could be handled: using a set of predefined user rules and a designated email address, Alteryx would place those attachments into various directories ready for processing by automated Alteryx workflows.

 

This would save a huge amount of time, as people currently have to manually drag and drop files. At least the on-board Alteryx designers here haven't been able to come up with a solution. It would also avoid any messy programming around systems like Outlook and bending any security rules within those systems. Many, many other applications have this simple feature built into their products, especially service desks. I believe there would be a huge benefit to this very simple bolt-on.

0 Likes

 

Why do we need yxmd files? Why shouldn't the default be yxmz? The workflow logic is the same. If you don't add any interface tools it will run, and if you want to add an interface, you can.

 

If you start off with a yxmd and then decide to make it an app, you now have two files to worry about.

 

As a habit I no longer save things as yxmd. As soon as I start a new workflow I save it as a yxmz.

 

Thoughts?

 

0 Likes

It would be a huge time saver if the Select tool had an option to invert the selection: unselect the fields that are selected and select the fields that are not.

0 Likes

@KuoL 

 

Yes, I know, it's weird to have a situation where a decision tree decides that no branches should be created, but it happened, and caused great confusion, panic, and delay among my students.

 

v1.1 of the Decision Tree tool does a hard stop and outputs nothing when this happens, not even the successfully created model object, while v1.0 of the tool still creates the model ("O") and the report ("R") ... just not the "I" (interactive report). Using the v1.0 version of the tool, I traced the problem down to this call:

 

dt = renderTree(the.model, tooltipParams = tooltipParams)

Where `renderTree` is part of the `AlteryxRviz` library.

 

I dug deeper and printed a traceback.

 

9: stop("dim(X) must have a positive length")
8: apply(prob, 1, max) at <tmp>#5
7: getConfidence(frame)
6: eval(expr, envir, enclos)
5: eval(substitute(list(...)), `_data`, parent.frame())
4: transform.data.frame(vertices, predicted = attr(fit, "ylevels")[frame$yval],
       support = frame$yval2[, "nodeprob"], confidence = getConfidence(frame),
       probs = getProb(frame), counts = getCount(frame))
3: transform(vertices, predicted = attr(fit, "ylevels")[frame$yval],
       support = frame$yval2[, "nodeprob"], confidence = getConfidence(frame),
       probs = getProb(frame), counts = getCount(frame))
2: getVertices(fit, colpal)
1: renderTree(the.model)

The problem is that `getConfidence` pulls `prob` from the `frame` given to it, and in the case of a model with no branches, `prob` is a list. And `dim(<a list>)` returns NULL. Ergo, explosion.
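
For anyone reproducing this outside Alteryx, here is a minimal sketch of the failure in plain R (not the AlteryxRviz internals): `apply` refuses any input whose `dim` is NULL, which is exactly what a list hands it.

prob <- list(0.62, 0.38)   # roughly what frame yields for a branchless tree
dim(prob)                  # NULL -- a plain list carries no dim attribute
apply(prob, 1, max)        # Error: dim(X) must have a positive length

prob <- matrix(c(0.62, 0.38), nrow = 1)
apply(prob, 1, max)        # 0.62 -- the same call works once prob is a matrix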

 

Toy dataset that triggers the error: a sample from the Titanic Kaggle competition (in which my students are competing). Predict "Survived" by "Pclass".
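
If you want to reproduce a branchless rpart model outside the tool, a sketch like this works (made-up rows rather than the attached sample; cp = 1 simply makes every split unprofitable so the fit stays at the root):

library(rpart)
titanic <- data.frame(Survived = factor(c(0, 1, 0, 1, 0, 1)),
                      Pclass   = factor(c(3, 1, 3, 1, 2, 2)))
fit <- rpart(Survived ~ Pclass, data = titanic, control = rpart.control(cp = 1))
nrow(fit$frame)   # 1 -- a single root node, i.e. no branches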

0 Likes

Dear Team

 

Consider a heavy workflow in the development phase, where we are working on the last section. Every time we run the workflow, it starts running from the Input tool. Instead, we could have a checkpoint tool, where the data flow is fixed up to the checkpoint and running the workflow starts from that specific checkpoint input.

 

This would reduce my development time a lot. Please advise on the same.

 

Thanks in advance.

 

Regards,
Gowtham Raja S

+91 9787585961 

0 Likes

The error message is:

 

Error: Cross Validation (58): Tool #4: Error in tab + laplace : non-numeric argument to binary operator

 

This is odd, because I see that there is special code that handles naive Bayes models. It seems that the model$laplace parameter is _not_ NULL by the time it hits `update`. I'm not sure yet which line is triggering the error.
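
For context, the error class itself is easy to reproduce in plain R -- it appears whenever a table is added to something non-numeric (a hypothetical illustration, not the cross-validation macro code):

tab <- table(c("yes", "no", "yes"))
laplace <- "0"      # hypothetical: laplace arriving as a string rather than a number
tab + laplace       # Error in tab + laplace : non-numeric argument to binary operator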

0 Likes

 

The Cross Validation tool in Alteryx requires that, if a union of models is passed in, all models to be compared must be induced on the same set of predictors. Why is that necessary -- isn't it only comparing prediction performance for the plots, while doing the predictions separately? The tool runs fine when I remove that requirement. Theoretically, model performance can be compared using nested cross-validation: choosing a set of predictors at a deeper level, and then assessing the model at an upper level. So I don't immediately see an argument for enforcing this requirement.

 

This is the code in question:

if (!areIdentical(mvars1, mvars2)) {
  errorMsg <- paste("Models", modelNames[i], "and", modelNames[i + 1],
                    "were created using different predictor variables.")
  stopMsg <- "Please ensure all models were created using the same predictors."
}
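
One possible relaxation, sketched with the names from the snippet above (this is not the shipped macro): downgrade the hard stop to a warning so that models built on different predictor sets can still have their prediction performance compared.

if (!areIdentical(mvars1, mvars2)) {
  warning(paste("Models", modelNames[i], "and", modelNames[i + 1],
                "were created using different predictor variables;",
                "comparing their prediction performance anyway."))
}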

As an aside, why does the CV tool still require Logistic Regression v1.0 instead of v1.1?

 

And please please please can we get the Model Comparison tool built into Alteryx, and upgraded to accept v1.1 logistic regression and other models that don't pass `the.formula`. Essential for teaching predictive analytics using Alteryx.

 

0 Likes

This would allow for a couple of things:

 

Set the fiscal year for the datasource to a new default.

 

Allow specific filters on the .tde (we use this for row-level security with our datasources).

 

Thanks

0 Likes

The Multi-Field Binning tool, when set to equal records, will assign any NULL values to an 'additional' bin.

e.g. if 10 tiles are set, a bin called 11 will be created for the NULL values.

 

However, when this is done it doesn't remove the NULLs from the equal distribution of records across the remaining bins (1-10). Assuming the NULLs should be ignored (if the rest are numeric), the binning of the remaining items is wrong.

 

The suggestion is to add a tickbox to the tool to say whether NULL values should be binned (the current setup) or ignored (removed completely before bin allocations are made).
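
To make the requested 'ignore' behaviour concrete, here is a small sketch in plain R (not the tool's implementation): drop the NULLs first, tile only the remaining records, and leave the NULL rows unbinned.

x <- c(5, NA, 12, 7, NA, 3, 9, 1)
n_tiles <- 2
bins <- rep(NA_integer_, length(x))
ok <- !is.na(x)
bins[ok] <- ceiling(rank(x[ok], ties.method = "first") / (sum(ok) / n_tiles))
bins   # 1 NA 2 2 NA 1 2 1 -- the six non-NULL records split evenly, NULLs left out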

0 Likes

I've run into an issue where I'm using an Input (or Dynamic Input) tool inside a macro (attached) which is being updated via a File Browse tool. I work at a large company with several data sources, so we use a lot of Shared (Gallery) Connections. The issue is that whenever I try to enter any sort of aliased connection (Gallery or otherwise), it reverts to the default connection I have in the Input or Dynamic Input tool. It does not act this way if I use a manually typed connection string.

 

Initially I thought this was a bug, so I brought it to Support's attention. They told me that this is the default behavior of the tool. So I'm suggesting that the default behavior of the Input and Dynamic Input tools be changed so they can be overridden by aliased connections via File Browse and Action tools. The simplest way to implement this would probably be to translate the alias before pushing it to the macro.

0 Likes

The Chart tool is really nice for creating quick graphics efficiently, especially when using a batch macro, but the biggest problem I have with it is the inability to replace the legend icon (the squiggly line) with just a square or circle to represent the color of the line. The squiggly line is confusing, and I think the legend would look crisper with a solid square, a circle, or even a customized icon!

 

Thank You!!!!

0 Likes

Some of the predictive tools put out a "Score" field when output is run through the scoring tool, and some put out a "Score_1" and/or "Score_0".  Since I frequently reuse the same workflow template for different predictive model types, it would be nice if they were consistent so that I wouldn't have to crash the workflow the first time through to get the input field names correct for downstream tools (e.g., Sort).  Thank you

0 Likes

I have long and large workflows that, IMO, are getting difficult to follow. I'd like the ability to highlight the joins and set specific colors, or at the very least highlight and toggle the highlights on/off. I'd also like to be able to move my joins so they are not curving all over the canvas.

0 Likes

Hi,

 

In the Input tool, it would be useful to have the Saved Database Connections option higher in the menu, not last. Most users I know use this drop-down frequently, and I find myself always grabbing the Other Databases option instead, as it expands before my mouse gets down to the next one. I would vote to have it directly after File..., so that the top two options cover either desktop data or "your" server data. To me, all the other options are one-offs used on a case-by-case basis and don't need to sit above things that are used far more frequently. Just two cents from a long-time user... love the product either way!

 

 


 

Thanks!!

 

Eli Brooks

0 Likes

Recently, I posted a problem about merging a column with the same value using the Table tool. I had a hard time finding a solution until @HenrietteH showed me how to do it. What she showed helped me a lot to do what I want in my module; however, it would be much easier if this feature were added to the Table tool itself.

 

Thank you

0 Likes

Hi,

 

Would it be possible to simplify some of the workarounds needed for generating chart titles when using grouping, by allowing the grouping variable to be used in the title string so that accurate descriptions can be generated? At the moment this requires the use of a Report Text tool, which is not as neat given the output that grouping necessarily generates.

 

Only a thought,

Peter

0 Likes

I have three groups: a control group, a group that got product A, and a group that got product B. There is a way to test the differences across all groups rather than running separate t-tests (which inflates type I error). If my outcome is the percent of people who were contacted, I want to see if that percent differs across groups.

 

Control Group % who were contacted: 10%

Product A group % who were contacted: 25%

Product B group % who were contacted: 33%

 

I shouldn't have to run a t-test comparing control to A, then another comparing control to B, and then a third comparing A to B. I know the method is pairwise comparisons, but I can't find how to do this in Alteryx. I've looked on the Community and, surprisingly, the answer seems to be "you can't", yet this is not a rare statistical test!

A product analyst at Alteryx helped build a macro in R to run the tests, but the variables need to be categorical rather than continuous. The ideal solution would be an additional predictive analytics tool that can run these ANOVA-style tests, with an option to specify whether the variable is categorical or continuous.
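
For reference, the overall and pairwise proportion comparisons described above are one-liners in base R's stats package; the counts below are made up to match the quoted percentages, so treat this only as a sketch of what the requested tool could wrap:

contacted <- c(10, 25, 33)     # contacted per group (assumed counts)
n         <- c(100, 100, 100)  # group sizes (assumed)
prop.test(contacted, n)        # overall test: are the three percentages equal?
pairwise.prop.test(contacted, n, p.adjust.method = "holm")   # pairwise comparisons, corrected for multiplicity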

 

-Justin

0 Likes

Hi,

 

As it is so important to be able to calculate and present time-related concepts in modern businesses, would it not be possible to have a better output choice? I have seen the reporting Chart tool, I have looked at the TS Plot tool, and I even noticed that the Laboratory Charting tool has disappeared. So can you please provide an output tool with some focused functionality for this missing piece?

 

Kind regards,

Peter 

0 Likes

1. The Union tool 

 

When switching to the Manual method and then adding fields upstream, the result is a warning: "Field was not found". I don't look for warnings; this should create a red error. Having fields fall off the workflow is a pain.

 

2. Unique tool

 

Changing fields upstream causes the tool to error out when the workflow runs. No issues are shown before the run. 

 

3. Having containers all open up when I reopen a workflow is a nightmare when you have 20+ containers all overlapping.

 

0 Likes

It would be great if deselecting fields in a Select tool updated the output window (before the next run) as a "review", so you can make sure you are removing what you expect and/or see other items left behind that should be removed. This would also be useful for seeing field names update as you organize and rename.

 

Often I join tables without pre-selecting the exact fields I want to pass, and so I clean up at the end of the join. I know this is not the best way, but a lot of times I need something downstream and have to basically walk through the whole process to move the data along.

 

 
