Hello,
After using the new "Image Recognition Tool" for a few days, I think you could improve it:
> by adding the image-dimension constraints in front of each of the pre-trained models,
> by adding a proper tool to split the training data correctly (so that there is an equivalent number of images for each label),
> lastly, by allowing the tool to use black & white images (I wanted to test it on MNIST, but the tool tells me it requires RGB images) - a possible pre-processing workaround is sketched below.
Question: do you plan to let the user choose between CPU and GPU usage in the future?
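For reference, here is a minimal Python sketch of the pre-processing workaround I currently use for the B&W limitation, assuming the grayscale images sit in a local folder (the folder names are placeholders): it simply duplicates the single grayscale channel into three channels so the tool's RGB requirement is satisfied.

```python
import os
from PIL import Image

SRC_DIR = "mnist_grayscale"   # placeholder folder of B&W images
DST_DIR = "mnist_rgb"         # placeholder output folder

os.makedirs(DST_DIR, exist_ok=True)
for name in os.listdir(SRC_DIR):
    img = Image.open(os.path.join(SRC_DIR, name))
    # Duplicate the single grayscale channel into R, G and B.
    img.convert("RGB").save(os.path.join(DST_DIR, name))
```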
In any case, thank you again for this new tool. It can certainly be improved, but it is very simple to use, and I sincerely think it will help a greater number of people understand the many use cases made possible by image recognition.
Thank you again.
Kévin VANCAPPEL (France ;-))
I am trying to use Dynamic Replace to selectively update records in a set of variables from survey data. That is, I do not have all potential values in the "R" input of Dynamic Replace. Instead, I have a list of values that I would like altered from their current values by respondent (RespondentID) and question number (Q#). Currently, when I run the workflow, any Q#/RespondentID combos that are not in my "R" input are replaced with blanks. However, I would like an option that maintains the original data if there is nothing to replace it with. Without this option, there are few workarounds (I'm still working on some) to ensure the integrity of the data.
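To illustrate the desired behaviour, here is a minimal pandas sketch (the frames and column names are placeholders standing in for the "D" and "R" inputs): a left join keeps every original record, and the original value is retained wherever no replacement exists instead of being blanked out.

```python
import pandas as pd

# Placeholder frames standing in for the D and R inputs of Dynamic Replace.
data = pd.DataFrame({
    "RespondentID": [1, 1, 2, 2],
    "Q#": ["Q1", "Q2", "Q1", "Q2"],
    "Value": ["a", "b", "c", "d"],
})
replacements = pd.DataFrame({
    "RespondentID": [1, 2],
    "Q#": ["Q2", "Q1"],
    "NewValue": ["B", "C"],
})

# Left join keeps every original record; combine_first keeps the original
# value wherever no replacement exists instead of blanking it out.
merged = data.merge(replacements, on=["RespondentID", "Q#"], how="left")
merged["Value"] = merged["NewValue"].combine_first(merged["Value"])
merged = merged.drop(columns="NewValue")
print(merged)
```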
Matt
Sometimes we have workflows that fail, and it is a pain to reschedule them. I wish there were a button in the bottom-right corner of each workflow's result, just like the Output button, to reschedule the failed run.
Please add XBRL - eXtensible Business Reporting Language (https://www.xbrl.org/ , Wikipedia , http://www.xbrleurope.org/ ) - as an output file format.
XBRL is based on XML and is used in the financial world; for example, all public companies in the USA send their financial reports to the Securities and Exchange Commission in XBRL format. (http://xbrl.sec.gov/)
In Japan, the central bank and the Financial Services Agency (FSA) collect financial data from banks and financial companies in XBRL format.
Thank you.
Regards,
Cristian
Please evaluate the opportunity to export Alteryx workflows in the XML-based GraphML format, so that they can be imported into yEd, the free graph editor. (https://www.yworks.com/products/yed)
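As a rough illustration of the target format, here is a minimal Python sketch using networkx, with a made-up three-tool workflow standing in for a real one; yEd can open the resulting file directly.

```python
import networkx as nx

# Placeholder representation of a small workflow: tools are nodes,
# connections are edges.
g = nx.DiGraph()
g.add_node("Input Data", tool="DbFileInput")
g.add_node("Filter", tool="Filter")
g.add_node("Output Data", tool="DbFileOutput")
g.add_edge("Input Data", "Filter")
g.add_edge("Filter", "Output Data")

# Write GraphML, which yEd imports natively.
nx.write_graphml(g, "workflow.graphml")
```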
Thank you.
Regards,
Cristian
Please evaluate the opportunity to export Alteryx workflows in .dot format, the same file format used by Graphviz (http://www.graphviz.org).
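For illustration, a minimal Python sketch of the target format (the tool names are placeholders); Graphviz can then render the file with `dot -Tpng workflow.dot -o workflow.png`.

```python
# Placeholder tool connections extracted from a workflow.
edges = [("Input Data", "Filter"), ("Filter", "Output Data")]

lines = ["digraph workflow {"]
for src, dst in edges:
    lines.append(f'    "{src}" -> "{dst}";')
lines.append("}")

with open("workflow.dot", "w") as f:
    f.write("\n".join(lines))
```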
Thank you.
Regards,
Cristian
It would be nice to tie the labels to the spatial objects being labeled.
The community could benefit from easier integration of splitting and applying functions to grouped data. The Summarize tool is great for splitting your data and applying summary statistical functions. It would be super useful to take that one step further and allow users to apply any other (aggregate) function to their grouped data instead of just the built-in functions in the Summarize tool. I envision that aggregate function being either a custom function built from existing user-specified functions within Alteryx (e.g. in the Formula tool) and/or an interface that lets you use other Alteryx macros on the grouped data.
Applying user-defined functions, or other powerful Alteryx macros, to grouped data is a very common operation in a data analyst's daily workflow, and being able to do so seamlessly, without reverting to batch/iterative macros, would be very helpful.
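As a point of comparison, here is the kind of split-apply pattern I have in mind, sketched in pandas with made-up columns and a user-written aggregate that the built-in summary functions do not offer.

```python
import pandas as pd

df = pd.DataFrame({
    "Region": ["East", "East", "West", "West", "West"],
    "Sales":  [100, 250, 80, 120, 300],
})

def value_range(s: pd.Series) -> float:
    # Any user-written aggregate logic could go here.
    return s.max() - s.min()

# Split by Region, apply the custom aggregate to each group.
result = df.groupby("Region")["Sales"].agg(value_range)
print(result)
```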
I recently did some extensive work using the Download tool to invoke RESTful web services. A lot of the initial effort went into ensuring that the data being passed in the header and body of the request was exactly what the service required. Following a review of experiences on the community, I used a tool called Fiddler to view directly what was being sent and to identify problems in my transformations of the data going into the Download tool. The idea is to make the raw HTTP request and response messages available directly in the Alteryx Results window when running a workflow, removing the need for another tool.
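For context, this is the kind of raw visibility I mean; a minimal Python sketch (the URL is just a stand-in) that prints the wire-level request and response, similar to what Fiddler shows, without routing traffic through an external proxy.

```python
import http.client
import requests

# Echo the "send:" / "reply:" lines of the HTTP conversation to the console.
http.client.HTTPConnection.debuglevel = 1

resp = requests.get("https://httpbin.org/get", params={"q": "test"})
print(resp.status_code)
print(resp.request.headers)  # the exact headers that were sent
print(resp.text)             # the raw response body
```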
Add a search or find function that looks for content within a tool rather than just the tool number, e.g. Ctrl+F to look for any tool that uses a keyword or field in a formula/join/etc. This would save me a boatload of time editing, updating, and troubleshooting my workflows.
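In the meantime, a rough workaround is possible outside Designer because workflows are stored as XML; a minimal Python sketch (the file name and keyword are placeholders) that lists every tool whose configuration mentions a given string:

```python
import xml.etree.ElementTree as ET

KEYWORD = "CustomerID"          # field or keyword to look for (placeholder)
WORKFLOW = "my_workflow.yxmd"   # placeholder workflow path

tree = ET.parse(WORKFLOW)
for node in tree.iter("Node"):
    tool_id = node.get("ToolID")
    config_xml = ET.tostring(node, encoding="unicode")
    if KEYWORD.lower() in config_xml.lower():
        print(f"Tool {tool_id} references '{KEYWORD}'")
```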
Currently using these macros from Adam Riley: http://www.chaosreignswithin.com/
The stuff he has over there is great and makes Alteryx much more powerful. Two tools in particular that I am using from there are Conditional Runners and the Parallel Block Until Done. These tools are vital for us, and it would be nice if Alteryx would pick them up and maintain them for stability.
During execution the user cannot scroll around. Large workflows need to be shrunk to very small icons to be able to follow the progress. Either add an option to automatically center on the active icon, or allow scrolling during execution.
It would be really useful to have a Join function that updated an existing file (not a database, but a flat or yxdb file).
The rough SAS equivalents are the UPDATE and MODIFY statements:
http://support.sas.com/documentation/cdl/en/basess/58133/HTML/default/viewer.htm#a001329151.htm
The goal would be to have a join function that would allow you to update a master dataset's missing variables from a transaction database and, optionally, to overwrite values on the master data set with current ones, without duplicating records, based on a common key.
The first use case: you have an original file, new information comes in, and you want to fill in the data that was originally missing (where the transaction file has data for that variable) without overwriting the original data. In this case only missing data is changed.
Or, as a separate use case, you had original data which has now been updated and you do want to overwrite the original data. In this case any variable with new values is updated, and variables without new values are left unchanged.
Why this is needed: if you don't have an Oracle-type database, it is difficult to do this task inside Alteryx, and information changes over time (customers buy new products, customers update profiles, or you have a file that is missing some data and want to merge it with a file that has better data for the missing values but worse data for the existing values, e.g. because it is from an older time period). In theory you could do this with "IF isnull() THEN replace" statements, but you'd have to build them for each variable and have a long data flow to capture the correct updates. Right now it is much faster to do it in SAS and import the updated file back into Alteryx.
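For comparison, both use cases can be expressed very compactly in pandas (the file and key names below are placeholders), which is roughly the behaviour I'm asking for in a join-style tool:

```python
import pandas as pd

# Placeholder master and transaction files keyed on CustomerID.
master = pd.read_csv("master.csv").set_index("CustomerID")
transactions = pd.read_csv("transactions.csv").set_index("CustomerID")

# Use case 1: only fill values that are missing on the master file.
filled = master.combine_first(transactions)

# Use case 2: overwrite master values wherever the transaction file has a
# non-missing value; variables without new values stay unchanged.
updated = master.copy()
updated.update(transactions)

filled.reset_index().to_csv("master_filled.csv", index=False)
updated.reset_index().to_csv("master_updated.csv", index=False)
```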
Visio is our organization's most common method of communicating business processes and workflows. Being able to export an Alteryx workflow to Visio would help us communicate the tool's functionality to process owners.
It would be nice to be able to schedule a workflow to run after another workflow finishes, instead of only at a certain time.
So instead of running workflow B at 6:00 AM every morning, you could run workflow B when workflow A finishes, so that if workflow B relies on anything happening in workflow A, everything finishes in the correct order.
I think it would be nice to be able to append a timestamp to the file name in the Output tool. As a mainframe programmer I could append a timestamp to file names; this is helpful when doing batch jobs that are scheduled on a server.
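For illustration, the naming pattern I have in mind, sketched in Python (the base name is just an example); today the closest equivalent is building the name in a Formula tool and letting the Output tool take the file name from that field.

```python
from datetime import datetime

# Produces e.g. sales_report_20240131_0600.csv
stamp = datetime.now().strftime("%Y%m%d_%H%M")
filename = f"sales_report_{stamp}.csv"
print(filename)
```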
I would love to be able to double-click on an input or output file and have the file open. Second to that would be a clickable hyperlink to the filepath that could be used to open the file, or a "go to" button or something. Anything would be better than my current process of copying and pasting the filepath into an explorer window.
We often use the Summarize tool to concatenate strings - we love this functionality.
However, we would also like to be able to concatenate just the unique values of strings. This can be done by running the preceding text field through the Unique tool first and then concatenating, but when we are doing this for multiple text variables and need to summarize other types of data at the same time, it becomes a very unnatural combination of joins & macros.
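To make the request concrete, here is the behaviour sketched in pandas with made-up columns: distinct string values are concatenated per group while a numeric column is summarized in the same pass.

```python
import pandas as pd

df = pd.DataFrame({
    "Customer": ["A", "A", "A", "B", "B"],
    "Product":  ["x", "x", "y", "z", "z"],
    "Sales":    [10, 20, 5, 7, 3],
})

# Concatenate only the distinct products per customer, while still
# summarizing the numeric column in the same grouping pass.
result = df.groupby("Customer").agg(
    Products=("Product", lambda s: ", ".join(sorted(s.unique()))),
    TotalSales=("Sales", "sum"),
)
print(result)
```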
Thanks for considering,
Jeremy
Create a tool that allows the user to create calculated fields for Tableau and output them along with a .tde, so they are available when opening the .tde.
There are several situations where pre-calculated, materialized data will visualize inaccurately in Tableau and calculated fields need to be used instead.
Hi,
I think it would be nice to have the possibility to select which fields are output from the Distance tool.
When calculating distances between large datasets, I do not always want my two sets of points as part of the output, and I would like to drop them directly in the tool.
Thank you
Simon
Korem
XGBoost regression is now the benchmark for every Kaggle competition and seems to consistently outperform random forest, spline regression, and all of the more basic models. For those of us using predictive modeling on a regular basis in our actual work, this tool would allow for a quick improvement in our model accuracy. And I think, from a marketing standpoint, having a core group of users competing in Kaggle using Alteryx would be a great way to show off Alteryx's power.
It is readily available as an R package: https://cran.r-project.org/web/packages/xgboost/index.html
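The same library also ships as a Python package; a minimal sketch on synthetic data (the data and parameters are illustrative only) shows how little code a wrapper tool would need.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

# Synthetic data standing in for a real modeling dataset.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10))
y = 3 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Fit a gradient-boosted tree regressor and score it on held-out data.
model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```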