Hello,
After using the new "Image Recognition Tool" for a few days, I think it could be improved:
> by displaying the image dimension constraints next to each of the pre-trained models,
> by adding a proper tool to split the training data correctly (so that each label has an equivalent number of images),
> finally, by allowing the tool to use black & white images (I wanted to test it on MNIST, but the tool tells me it requires RGB images).
Question: do you plan to let the user choose between CPU and GPU usage in the future?
In any case, thank you again for this new tool. It can certainly be improved, but it is very simple to use, and I sincerely think it will allow a greater number of people to understand the many use cases made possible by image recognition.
Thank you again.
Kévin VANCAPPEL (France ;-))
I would like to see a way to partially execute a workflow (specifically for an App) for the purpose of allowing the user to make selections based on a dynamic data flow.
Ex:
1. Database Selection Interface
Click Next
2. Select from available columns to pass through to the output file.
Click Next
3. Pick from selected fields which fields should be pivoted.
4. Output the file and complete the run.
This was a simple example to explain a case, but the most common use I could see is for APIs.
Would be nice to have the RegEx tool allow you to drop the original input field and report an error if any records fail to parse.
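As a sketch of the behaviour being asked for (outside Alteryx, in plain R): flag any records the expression cannot parse, then drop the original field once the parsed columns exist. The data frame, field name, and pattern below are made up for illustration.

```r
# Sketch only: the requested RegEx-tool behaviour, in plain R.
df <- data.frame(RawDate = c("2024-01-15", "2024-02-28", "not a date"),
                 stringsAsFactors = FALSE)
pattern <- "^(\\d{4})-(\\d{2})-(\\d{2})$"

failed <- !grepl(pattern, df$RawDate)
if (any(failed)) {
  warning(sum(failed), " record(s) failed to parse")    # report an error
}

groups <- regmatches(df$RawDate, regexec(pattern, df$RawDate))
df$Year  <- sapply(groups, function(g) if (length(g)) g[2] else NA)
df$Month <- sapply(groups, function(g) if (length(g)) g[3] else NA)
df$Day   <- sapply(groups, function(g) if (length(g)) g[4] else NA)

df$RawDate <- NULL                                       # drop the original input field
```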
Alternative data sources, namely #altdata, are key for enriching data. One source is social media.
I believe Alteryx is lacking in social media analytics.
If you are into #media, #advertising, #marketing analytics, or #influencer analytics, please support the idea by seconding...
https://www.instagram.com/developer/authorization/ is the link for the Graph API, updated after the latest Facebook scandal... now fixed...
Can we get a more robust read.Alteryx function for mode="data.frame"? If it is reading the stream as a data frame, can we have the option stringsAsFactors = FALSE?
I am getting tripped up a lot because code that executes fine in RStudio shows mysterious behaviour when it runs within the R Tool. I am manually converting variables to character strings in my R Tool code, which I don't have to do in RStudio. However, I'm not a highly detail-oriented R developer, so I miss variable data type conversions and have spent a lot of time going down the wrong path. It also makes it difficult to maintain two different scripts for the same routine.
I have started using the glimpse() function in my R Tool code to help catch data type conversions, since it writes its output to the message log.
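For reference, this is roughly what the manual workaround looks like today inside the R Tool. It's a sketch only; the factor-to-character loop is exactly the step a stringsAsFactors = FALSE option would make unnecessary.

```r
library(dplyr)   # for glimpse()

# Read the incoming stream; today character columns come back as factors.
df <- read.Alteryx("#1", mode = "data.frame")

# Manual workaround: convert every factor column back to character.
factor_cols <- vapply(df, is.factor, logical(1))
df[factor_cols] <- lapply(df[factor_cols], as.character)

glimpse(df)      # column types appear in the message log as a sanity check

write.Alteryx(df, 1)
```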
Rob Campanell
I understand that Server and Designer + Scheduler versions have the option to "cancel workflows running longer than X".
I'd like to see that functionality in the desktop edition as well.
Would like to use arrows and other shapes for documentation. Moreover, having "anchors" (i.e., like in a wiki) would really facilitate moving about large workflows. I imagine the former is not hard to implement, though I'm uncertain about the latter.
Check out the mock-up workflow for an admittedly bad example.
When I publish using the tabcmd publish command in an event or using the Publish to Tableau Server macro, the extract becomes LIVE. I do not want it to become LIVE, because when it does, I cannot refresh the extract using the tabcmd refreshextracts command or set up a refresh schedule in Tableau Server. Is there any way to make this TDE stay an extract after Alteryx rewrites the file? When the extract is live, it will not refresh until I manually select Refresh in Tableau Server while I am in the workbook connected to the data source I am publishing.
I posted the above question and was told this would be good to add to the New Idea module. Thanks!
Currently I am running two versions of Alteryx, and some of the macros were created/updated in the newer version. I would like to see only one error message displayed for all of the macros created in a newer version, rather than having a dialog box pop up a dozen-plus times (one for each macro) every time I open an instance.
As Alteryx becomes more focussed on the Enterprise - it is important that we build capabilities that support the needs of large-scale BI.
One of these critical needs is dealing with heterogeneous data from different systems that use different IDs for every critical entity / concept (e.g. client; product)
Here's the example:
Problem:
- In any large enterprise - there are several thousand different line-of-business systems
- Each of these was probably built at a different time, and uses a different key for specific concepts - like Client & Product
- Most large enterprises that I've worked at do not have a pre-built way of transforming these codes so...
- This means that any downstream analytics finds it almost impossible to give single-view-of-customer or single-view-of-product.
Solution option A:
Re-engineer all upstream systems. Not feasible.
Solution option B:
Expect some reference-data team to fix this by building translations. More feasible, but not fast.
Remaining Solution Option:
Just as Kimball talked about - the only real way is to define a set of enterprise dimensions, which are the defined master list of critical concepts that you need to slice-and-dice by (client; product; currency; shipping method; etc.) in a way that is source-system agnostic.
Then you need a method in the middle to transform incoming data to use these codes. This process is called "Conforming".
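Conceptually, the conform step is just a keyed lookup from each system's local ID onto the shared enterprise key. Here is a minimal sketch in R; all table and column names are hypothetical.

```r
# Hypothetical crosswalk: one row per (source system, local client ID) pair,
# mapped onto a single enterprise-wide client key.
crosswalk <- data.frame(
  source_system         = c("CRM",  "Billing", "CRM"),
  source_client_id      = c("C-17", "00042",   "C-99"),
  enterprise_client_key = c(1001,   1001,      1002),
  stringsAsFactors = FALSE
)

# Incoming line-of-business data still carrying local IDs.
incoming <- data.frame(
  source_system    = c("Billing", "CRM"),
  source_client_id = c("00042",   "C-17"),
  amount           = c(250, 75),
  stringsAsFactors = FALSE
)

# The "method in the middle": swap local IDs for the enterprise key.
conformed <- merge(incoming, crosswalk,
                   by = c("source_system", "source_client_id"))

# Downstream analytics can now roll up to a single view of the client.
aggregate(amount ~ enterprise_client_key, data = conformed, FUN = sum)
```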
What would this look like in Alteryx?
Setup
In Use:
Impact:
In BI in smaller contexts, or quick rapid-fire BI - you don't have to worry about this. But as soon as you go past a few hundred line-of-business systems and are trying to do enterprise reporting, you really have to take this seriously. This is a HUGE part of every BI person's role in a large enterprise - and it is painful, slow, and not very rewarding. If we could create this idea of a simple-to-use and high-velocity conforming process - this would absolutely tear the doors off enterprise BI - and no-one else is doing this yet!
+ @AshleyK @BenG @NickJ @ARich @patrick_digan @JoshKushner @samN @Ari_Fuller @Arianna_Fuller
It may be user-friendly to display the DSN description. It's usually stored in the Windows registry under "Description":
This should be displayed in the log and when you configure a connection.
Hi all,
As per the post here: https://community.alteryx.com/t5/Data-Preparation-Blending/Dynamic-input-not-respecting-data-sort/td... - there are situations where you need to use something like a dynamic input to query data, but need it to be brought back in the order that you specified on the input stream.
The Dynamic Input tool sorts the input stream deliberately, to check for duplicate queries so that it doesn't waste time bringing back duplicate data.
It would be great if we could extend the Dynamic Input tool to allow users to specify that they want the data left unsorted, and that they are OK with the consequences of possibly running the same query twice. Even if this is a setting that can only be set through XML, it would still be helpful.
Many thanks
Sean
The option to "Disable all tools that Write Output" is great during testing but I often need to toggle back and forth and its location on the Runtime tab of the Workflow Config is inconvenient.
I think it would be great to have a button for that on the toolbar, with the added feature that it would visually display whether the feature is on or off (so you don't need to see an Output Data tool to determine the current status).
Hope this is fairly self-explanatory.
I'd like to be able to create presets for the Summarize tool. Instead of having Group By, Sum, Count, Count Non Null, etc. sit on top of the libraries of functions, put them into their own category. Users could then create a Favorites section, and the functions they use the most would be stored there (editable by the user).
It would be good to replicate some of the key workflow configuration settings as shortcut icons in the main shortcut toolbar.
For instance, I often use 'Disable all tools that write output' and need to toggle it on/off quickly when I'm testing a workflow. It takes too many clicks to deselect a tool, open workflow configuration, open the Runtime tab and select the checkbox. Many end-users I work with also don't even know the option is there because it is so well-hidden.
It would be much simpler and easier if I could toggle it straight from the shortcut bar. Having a keyboard shortcut to do it, like Ctrl+R (to run), would be even better.
Having shortcuts would also be good for:
Alternative data sources are key for enriching data. One source is social media. I believe Alteryx is lacking in social media analytics.
If you are into #media, #advertising, #marketing analytics, or #influencer analytics, please support the idea by seconding...
There is an API called the YouTube Data API:
https://developers.google.com/youtube/v3/getting-started
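To make the ask concrete, here is a minimal sketch of the kind of call a YouTube connector would wrap, using the search endpoint from the linked documentation. The API key is a placeholder you would create in the Google Cloud console, and httr/jsonlite are just one way to make the request.

```r
library(httr)
library(jsonlite)

api_key <- "YOUR_API_KEY"   # placeholder; obtain a real key from Google Cloud

# Search for recent videos matching a query via the YouTube Data API v3.
resp <- GET("https://www.googleapis.com/youtube/v3/search",
            query = list(part = "snippet", q = "alteryx",
                         type = "video", maxResults = 5, key = api_key))
stop_for_status(resp)

results <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))

# Channel, title and publish date for each hit - the sort of fields an
# influencer/media analytics workflow would start from.
results$items$snippet[, c("channelTitle", "title", "publishedAt")]
```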
Some say to mato and some say to_mato, but how about: to/mato?
While working with my new friend @Cedric, we ran across a field in his data that contained a '/' character. We were building a macro where we updated the value of the field [AB/CD] with another field selected from the incoming data. Our error message was something akin to: Field AB was not found.
We worked around the issue, but what remained was the fact that certain characters are permitted in field names within some aspects of Alteryx and not in others. I don't know if you're aware of this limitation.
Cheers,
Mark
Would love to have a 'commonly used' tab, rather than a Favourites box (as that lags behind what I am currently using).
Would be nice to have it look at my usage and create a table sorted by frequency of use. Could also be done across all users (some kind of opt-in telemetry data?).
When output is disabled, Alteryx's output tools are helpfully grayed out and include the message 'output has been disabled by the workflow properties.'
However, if a macro has an output, there is no visual indicator that output is disabled, even though the macro's output will also be suppressed by this workflow configuration.
Obviously, macros can be very complex, and could have both a file and a macro output, or have an optional file output, so these cannot be entirely locked out just because there is an output.
To that end, I suggest some other kind of color-coding/shading be applied visually to these tools, and that a message be added to the interface for these macros that says something like "output has been disabled, this macro may not perform all of its functions".
I just spent about 10 minutes debugging why a macro wasn't working properly in one workflow but was working in another, and it was because I had disabled output, which I wasn't thinking of because this particular macro uses the Render tool to produce a hyperlink. I wouldn't have spent more than 30 seconds on this if there was some kind of visual indicator showing me what I was doing wrong!
Pretty much the only time I add Browse tools during development now is to get access to the Cell Viewer to examine values better.
Would love to be able to do this in the output window.
I'm loving the ability to read from a zip file! However, I would love the ability to read all file types. For me, I don't see .accdb or .flat, and I assume other folks might be missing other file formats that they use. I find it confusing that the input tool accepts a lot of file types, but selecting the zip format then limits my choices. I believe @Aguisande mentioned this issue in the 10.5 beta.
Thanks!