Hello,
After using the new "Image Recognition Tool" for a few days, I think you could improve it:
> by listing the image dimension constraints next to each of the pre-trained models,
> by adding a proper tool to split the training data correctly (so that each label has an equivalent number of images),
> at the very least, by allowing the tool to use black & white images (I wanted to test it on MNIST, but the tool tells me it strictly requires RGB images); see the workaround sketch below.
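In the meantime, the workaround I use is to duplicate the grayscale channel into RGB before feeding the images to the tool. A minimal sketch, assuming the images sit as PNG files in a local folder (paths are hypothetical):

# Convert grayscale images to RGB so the tool accepts them.
from pathlib import Path
from PIL import Image

src = Path("./mnist_gray")   # hypothetical input folder
dst = Path("./mnist_rgb")
dst.mkdir(exist_ok=True)

for png in src.glob("*.png"):
    # .convert("RGB") copies the single grayscale channel into R, G and B
    Image.open(png).convert("RGB").save(dst / png.name)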
Question: will you, in the future, allow the user to choose between CPU and GPU usage?
In any case, thank you again for this new tool. It is certainly perfectible, but it is very simple to use, and I sincerely think it will allow a greater number of people to understand the many use cases made possible by image recognition.
Thank you again
Kévin VANCAPPEL (France ;-))
This has probably been mentioned before, but in case it hasn't....
The Dynamic Input tool is useful for bringing in multiple files / tabs, but it quickly stops being fit for purpose if schemas / fields differ even slightly. The common solution is to put a Dynamic Input tool inside a batch macro and set the macro to 'Auto Configure by Name', so that it waits for all files to be run and can then output knowing what it has received.
It's a pain to create these batch macros for relatively straightforward and regular processes. Would it be possible to have 'Auto Configure by Name' as an option directly in the Dynamic Input tool, removing the need for a batch macro?
Thanks,
Andy
This has probably been mentioned before, but in case it hasn't....
Right now, if the Dynamic Input tool skips a file (which it often does!), it just raises a warning and continues processing. Whilst continuing is often still useful, could an 'error if files are skipped' option be built into the tool?
As it stands, it is easy to miss that this is happening, and in production / on Server you may want the process to stop instead.
Thanks,
Andy
With an increasing number of projects involving different machine learning models, it's becoming difficult to manage package versions across workflows. Currently, the Python tool has a single virtual environment, so we have to develop models in every project using the same Python and package versions as the Python tool venv. While this doesn't affect the code itself too much, it becomes a problem as soon as we store and load pickled models, which are sensitive to even minor changes in packages.
This is even more of a problem when we are working on Alteryx Server, where different teams might use different packages. Currently, only the server admin can install packages on the server, and there can only be one version of each package.
So, more robust venv management in the Python tool would be much appreciated!
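Until then, a defensive pattern that at least makes version drift fail loudly is to store the package versions next to the pickled model and check them on load. A minimal sketch (nothing Alteryx-specific; the tracked package list is illustrative):

# Pickle a model together with the package versions it was trained under,
# and warn on load if the current environment has drifted.
import pickle
from importlib.metadata import version

TRACKED = ["scikit-learn", "pandas", "numpy"]  # packages the model is sensitive to

def save_model(model, path):
    payload = {"model": model, "versions": {p: version(p) for p in TRACKED}}
    with open(path, "wb") as f:
        pickle.dump(payload, f)

def load_model(path):
    with open(path, "rb") as f:
        payload = pickle.load(f)
    for pkg, wanted in payload["versions"].items():
        found = version(pkg)
        if found != wanted:
            print(f"WARNING: {pkg} was {wanted} at training time but is {found} here")
    return payload["model"]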
As we do more work analyzing the canvasses that our folk are producing, it's becoming more and more necessary to have a well-documented definition and schema for the XML that Alteryx canvasses use.
Please could you publish the full XML definition and schema for Alteryx canvasses? This would allow groups to perform deeper analytics on how people are using Alteryx: automate quality checks, look for learning gaps, scan for dependencies, etc.
Note: this relates to an idea from @dataprep here: https://community.alteryx.com/t5/Alteryx-Designer-Ideas/Documentation-tool-list-fileformat/idi-p/184...
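To illustrate the kind of analysis we're after, here is a rough sketch that counts tool usage on a canvas. It assumes the (undocumented, hence this idea) structure we've observed in .yxmd files, where each tool carries a GuiSettings element with a Plugin attribute; the filename is hypothetical:

# Count which tools a canvas uses, based on observed .yxmd structure.
import xml.etree.ElementTree as ET
from collections import Counter

tree = ET.parse("MyWorkflow.yxmd")  # hypothetical canvas file
plugins = Counter(
    gui.get("Plugin", "unknown")
    for gui in tree.getroot().iter("GuiSettings")
)
for plugin, count in plugins.most_common():
    print(f"{count:4d}  {plugin}")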
At the moment, if part of your Python code takes more than 30s to run, Jupyter times out and Alteryx cancels the workflow. This makes the Python tool unusable for anything intensive; the timeout should be removed by default or be configurable per workflow.
I've raised this idea because none of the solutions in these threads feels satisfactory:
Hi All,
Was very happy to see the Bulk Loader introduced for Snowflake in the last release. This bulk loader is available specifically for Snowflake environments hosted on AWS, but provides no functionality for environments using Azure. As Snowflake continues to build momentum, I imagine this will be a common request. Is there something in the pipeline to add this functionality?
As an interim solution, we will be working toward some generic scripts/SnowSQL to mimic that bulk load, but ultimately we'd love to have this as part of the tool.
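For reference, the interim approach we have in mind stages a local extract and COPYs it in. A sketch using the snowflake-connector-python package (account, credentials and object names are placeholders):

# Mimic the bulk loader: upload a file to the table stage, then COPY it in.
import snowflake.connector

con = snowflake.connector.connect(
    account="myaccount.azure",   # Azure-hosted Snowflake account (placeholder)
    user="LOADER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = con.cursor()
cur.execute("PUT file:///tmp/extract.csv @%TARGET_TABLE")  # upload to the table stage
cur.execute(
    "COPY INTO TARGET_TABLE FROM @%TARGET_TABLE "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)
con.close()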
Best,
devKev
How about a quick method of disabling a container?
Current state - Click on the container, pan the mouse all the way over to the tiny checkbox target in the configuration pane, and click Disable.
Future state - a little icon by the rollup icon that can be clicked to disable/enable, differentiated perhaps by a color change of the minimized pane.
I know what you're thinking, "talk about lazy, he's whining about moving the mouse (which his hand was already on) 2 cm along his desktop and clicking"... but still what an easy usability win and one less click to do a task I find myself repeating frequently.
The Alteryx.Flexnetoperations.com license management site needs major work.
On the View Licenses page it shows all licenses going back several years. A basic need is to show only licenses which haven't expired, but that is not an option. You cannot even sort on the expiration column, while you can sort on most other columns.
The simplest need is to see a list of my current active license users, but I don't see a way to do that.
I tried an "Advanced Search" and chose an expiration date after 2019-10-29, and none of my licenses which expire in 2020 appear - I get a blank list.
Similarly, on the Administer Machines page you cannot filter out expired licenses, or even filter on the licenses column (which doesn't sort either).
The help link on the page doesn't bring you to help specific to that page, but to the general activation help front page. After several clicks I found this page:
But the help is incomplete (it doesn't list machine types or the difference between Active and Inactive).
Also, there is no export capability - copying and pasting into Excel is a formatting headache, as it brings in the check-boxes.
Lots of room for improvement here.
Cheers,
Bob
P.S. I understand that work is being done on this, but an ETA would be greatly appreciated.
I use a mouse which has a horizontal scroll wheel. This allows me to quickly traverse the columns of Excel documents, webpages, etc.
This interaction is not available in Alteryx Designer, and having it when working with wide data previews would improve my UX drastically.
Please add support for Windows authentication to the Download tool. I know there's a workaround, but it involves using cURL and the Run Command tool. The Run Command tool is awful and should be avoided at all costs, so please improve the Download tool so I can use internal APIs.
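For anyone stuck on this today, the Python tool is a less painful route than cURL + Run Command. A sketch using the requests_ntlm package (URL and credentials are placeholders):

# Call an internal API with Windows (NTLM) authentication from the Python tool.
import requests
from requests_ntlm import HttpNtlmAuth

resp = requests.get(
    "https://internal.example.com/api/data",   # placeholder internal API
    auth=HttpNtlmAuth("DOMAIN\\username", "password"),
    timeout=30,
)
resp.raise_for_status()
print(resp.json())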
Here's a sample of what you get if no records are read into a Python tool:
Error: CReW SHA256 (4): Tool #1: Traceback (most recent call last):
File "D:\Engine_10804_3513901e8d4d4ab48a13c314a18fd1f9_\2f1b1eb4701e445775092128efe77f76\workbook.py", line 7, in <module>
df = Alteryx.read('#1')
File "C:\Program Files\Alteryx\bin\Miniconda3\envs\DesignerBaseTools_venv\lib\site-packages\ayx\export.py", line 35, in read
return _CachedData_(debug=debug).read(incoming_connection_name, **kwargs)
File "C:\Program Files\Alteryx\bin\Miniconda3\envs\DesignerBaseTools_venv\lib\site-packages\ayx\CachedData.py", line 306, in read
data = db.getData()
File "C:\Program Files\Alteryx\bin\Miniconda3\envs\DesignerBaseTools_venv\lib\site-packages\ayx\Datafiles.py", line 500, in getData
data = self.connection.read_nparrays()
RuntimeError: DataWrap2WrigleyDb::GoRecord: Attempt to seek past the end of the file
I've fixed this in my macro by forcing a DUMMY record into the Python tool (and deleting it on the back end). It would be much nicer to have error handling that prevents the issue; even a configuration option for what to do with no input would simplify things.
This error condition potentially affects every Python tool created.
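The guard I'd like the tool to have built in looks something like this sketch (the fallback column names are illustrative and would need to match what the rest of the script expects):

# Tolerate an empty incoming connection instead of crashing.
from ayx import Alteryx
import pandas as pd

try:
    df = Alteryx.read("#1")
except RuntimeError:
    # No records reached the tool: fall back to an empty frame.
    df = pd.DataFrame(columns=["Field1", "Field2"])

# ... normal processing ...
Alteryx.write(df, 1)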
Cheers,
Mark
It would be really great if we could add single sign-on (SSO) support to the Download tool. Many users face this issue when trying to download data from a web URL: in some cases the URL verifies the sign-on and then redirects to a link from which the data can be downloaded. Currently the Download tool fails to verify SSO or SiteMinder authentication.
Currently, the only way to do an IF / FOR / WHILE loop is either in the Formula tool or via an iterative/batch macro.
Instead, it would be hugely useful and a lot more intuitive to be able to build FOR / WHILE logic embedded in a container (similar to the LabVIEW interface: https://www.ni.com/en-sg/support/documentation/supplemental/08/labview-for-loops-and-while-loops-exp...), as sketched after the list below.
Advantages include:
- Increased readability (not having to go into a macro!)
- Increased agility (more power / features can be added or modified on the go, for something that is more than a Formula tool but without as much interface as a macro app)
- More intuitive
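To make the ask concrete, here is a rough sketch of the semantics such a container could expose; run_contained_tools() is a made-up stand-in for whatever tools the user drops inside it:

# Illustrative only: WHILE-style container semantics, no iterative macro needed.
def run_contained_tools(records):
    # stand-in for the tools placed inside the container
    records = [r + 1 for r in records]
    return records, max(records) < 10   # (output, loop condition)

records = [0, 3, 7]          # stand-in for the container's incoming records
for _ in range(100):         # max-iteration guard, as iterative macros have today
    records, keep_going = run_contained_tools(records)
    if not keep_going:       # the condition would be configured on the container
        break
print(records)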
Dawn.
It would be great to have the below functionality in Alteryx.
A workflow is built in Alteryx, and a button click generates the equivalent SQL code, which can then be run on a specific database platform such as SQL Server, or in external editors such as SQL Server Management Studio. Thanks.
While Alteryx allows for a proxy username and password in the settings, these are not passed properly to an NTLM proxy. Support for NTLM authentication would be incredibly useful for the many corporations that use this firewall setup.
We currently have to download either via Python or via cURL through batch commands called by Alteryx. Since Alteryx uses a cURL back end, this should be a fairly simple addition to the existing Download tool: allow selection of the proxy server, port, and authentication method in addition to the proxy username and password. This could be done either in the tool itself or in User Settings.
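For reference, our current workaround shells out to cURL, which does speak NTLM to proxies. A sketch (proxy host, port and credentials are placeholders):

# The workaround this idea would make unnecessary: curl called from Python,
# using curl's native NTLM proxy support.
import subprocess

subprocess.run(
    [
        "curl", "--silent", "--show-error",
        "--proxy", "http://proxy.example.com:8080",   # placeholder proxy
        "--proxy-ntlm",                               # NTLM handshake with the proxy
        "--proxy-user", "DOMAIN\\username:password",
        "https://example.com/file.csv",
        "--output", "file.csv",
    ],
    check=True,
)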
When training people on the use of Action tools, something I always have to hit on is that when you are telling the tool which piece of the XML you are adjusting, it's difficult to tell what you have selected, and super easy to accidentally select something else.
Example:
When you initially select the action to take, it's this nice blue color. However, it still doesn't feel like you have actually selected anything or told the Action tool what to do, since it's so easy to select any other one of these actions.
A slightly different problem: if you select an action that has been previously configured, it is just this light grey color. So it can be easy to accidentally change your settings, because you may not realize it's actually set up.
Here is a recent community post that sort of outlines a few of these problems.
I love the dynamic rename tool because quite often my headers are in the first row of data in a text file (or sometimes, Excel!).
However, whenever I open a workflow, I have to run the workflow first in order to make the rest of the workflow aware of the field names that I've mapped in the dynamic rename tool, and to clear out missing fields from downstream tools. When a workflow takes a while to run, this is a cumbersome step.
Alteryx Designer should be aware of the field names coming out of the Dynamic Rename tool and make them available for use downstream as soon as they are mapped (or when the workflow is first opened, without it having to be run first).
It would be a handy feature if it were possible to choose a data type for the Input tool to read the data in as. For example, if a dataset has multiple fields with different data types, it would be handy to be able to make the Input tool read and output them all as strings if needed. This would also make a handy standalone tool: a sort of blanket data conversion that converts all fields to the specified type.
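Until then, the closest equivalent is a blanket cast after the fact, e.g. in a Python tool. A minimal sketch, assuming the data arrives on connection #1:

# Blanket-convert every incoming field to a string - what a "read everything
# as string" option on the Input tool would give us up front.
from ayx import Alteryx

df = Alteryx.read("#1")
Alteryx.write(df.astype(str), 1)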
Allow users the ability to add a delay on the connection between Control Container tools. I frequently have to rerun workflows that use Control Containers because the workflow has not registered that the file written by one Output tool was properly closed before the next container started. The network drives haven't resolved and still show the file as open while the workflow has moved on to the next Control Container. Users should have an option in the configuration screen to add a delay before the signal is sent for the next container to run.
In the past I was able to use a CReW tool (Wait a Second) in conjunction with the Block Until Done tool to add the delay manually, but I have since converted all of my workflows over to Control Containers. Since then, about half the time a workflow runs I encounter the following errors.
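In the meantime, the stopgap I've fallen back to is a Python tool between the containers that simply sleeps and passes the data through. A sketch (the 10 seconds is arbitrary; a native, configurable delay is what this idea asks for):

# Stand-in for the old "Wait a Second" CReW tool, dropped between containers.
import time
from ayx import Alteryx

df = Alteryx.read("#1")
time.sleep(10)   # give the network drive time to release the file
Alteryx.write(df, 1)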