

Repetition parameter for Neural Network

TL;DR: Add a repetition parameter, rep, to the Neural Network tool's configuration panel (and probably to other Predictive tools). The tool would then train #rep models and return only the best one.

 

I have done some research comparing the R neuralnet function (from the neuralnet package) with the Alteryx Neural Network tool. The XOR example seemed a good one for evaluating a neural network system, since it is one of the most basic non-linear truth tables.

 

  • R Neuralnet function

The neuralnet function is able to train on and predict all 4 configurations of an XOR table (see page 3 of https://cran.r-project.org/web/packages/neuralnet/neuralnet.pdf for the test script).

 

However, several attempts must be made. This is why the parameter

rep=5

is important, followed by

plot(net.xor, rep="best")

This way, 5 sets of initial weights are randomly generated instead of 1, and 5 neural networks are trained independently. When plotting, only the network with the lowest error is shown (I got a 3% error). For XOR, around 5 tries seem to be enough to find a set of weights that reproduces all 4 rows of the table correctly.
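
For reference, here is a minimal sketch of what that test looks like, loosely following the XOR example in the neuralnet documentation (the column names and hidden = 2 are my own choices, not taken verbatim from the package script):

library(neuralnet)

# XOR truth table: two binary inputs, one binary output
xor.data <- data.frame(expand.grid(Var1 = c(0, 1), Var2 = c(0, 1)),
                       XOR = c(0, 1, 1, 0))

# Train 5 networks from 5 randomly generated sets of initial weights
net.xor <- neuralnet(XOR ~ Var1 + Var2, data = xor.data,
                     hidden = 2, rep = 5)

# Plot only the repetition with the lowest error
plot(net.xor, rep = "best")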

 

Another parameter of the function is stepmax:

"The maximum steps for the training of the neural network. Reaching this maximum leads to a stop of the neural network’s training process."

This caps the number of training iterations for each single attempt described above.

 

If one sets stepmax = 100 instead of rep = 5, the single resulting neural network usually does not have an average error below 5% (I actually get 49.7%).
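
For comparison, the single-attempt run capped at 100 steps would look roughly like this (same toy data as above):

# One repetition only, training capped at 100 steps
net.xor.capped <- neuralnet(XOR ~ Var1 + Var2, data = xor.data,
                            hidden = 2, rep = 1, stepmax = 100)
# With such a small step budget the run often stops before converging,
# in which case neuralnet only returns a warning for that repetition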

 

  • Alteryx Neural Network

The Neural Network tool in Alteryx has a parameter documented as follows:

"The maximum number of iterations for model estimation: This value controls the number of attempts the algorithm can make in attempting to find improvements in the set of model weights relative to the previous set of weights. If no improvements are found in the weights prior to the maximum number of iterations, the algorithm will terminate and return the best set of weights. This option defaults to 100 iterations. In general, given the behavior of the algorithm, it is likely to make sense to increase this value if needed, at the cost of lengthening the runtime for model creation."

 

I tried this workflow with this parameter at its default value of 100, and at even greater values, and I cannot get an average error below 5% (I get stuck around 50%).

 

After carefully reading these pieces of documentation and testing, my guess is that this last parameter is the equivalent of stepmax.

 

If this is right, then it would be practical to add an equivalent of the rep parameter to the configuration panel; rep seems more useful to me than stepmax. Maybe there is already a way to simulate this parameter; if so, please let me know. Otherwise I am going to include an R script in my workflow.
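
In case it helps anyone, here is a rough sketch of what such an R script could look like. It assumes the underlying engine is a single-hidden-layer network like nnet (which is what the tool's options suggest) and simply re-trains the model #rep times with different seeds, keeping the best run; none of this is taken from the actual tool's code:

library(nnet)

# Same XOR toy data as before
xor.data <- data.frame(expand.grid(Var1 = c(0, 1), Var2 = c(0, 1)),
                       XOR = c(0, 1, 1, 0))

rep.count <- 5
fits <- lapply(seq_len(rep.count), function(i) {
  set.seed(i)  # a different set of random initial weights per attempt
  nnet(XOR ~ Var1 + Var2, data = xor.data, size = 2,
       maxit = 1000, trace = FALSE)
})

# Keep the attempt with the lowest final value of the fitting criterion
errors <- sapply(fits, function(fit) fit$value)
best.fit <- fits[[which.min(errors)]]
round(predict(best.fit, xor.data), 3)  # ideally close to 0, 1, 1, 0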

6 Comments
Atabarezz
13 - Pulsar

Can we train multi-hidden-layer networks by slightly changing the existing NN tool?

LGrs
6 - Meteoroid

It does not seem so; you can only choose the number of nodes in the single hidden layer.

For the XOR example, however, one hidden layer should be enough.

Atabarezz
13 - Pulsar

By the way, adding the rep parameter as a separate tool in the Alteryx toolbox that could iterate any parametric model would be an even greater idea, don't you think?

 

LGrs
6 - Meteoroid

It would be awesome indeed! With the example, I showed that the final result you get from a predictive tool can depend heavily on the random initial weights. So this tool would have to run a stream (output 1) #rep times with different random seeds and output (output 2) the random seed that gave the stream the highest average/max/sum on one field (set as a parameter)?

 

I guess the problem with implementing such a tool is that the information it requires depends on what tools come after it in output 1, and I don't know of any tool that currently works that way in Alteryx.

 

My idea of adding #rep to the configuration panel of the Neural Network tool (and maybe other predictive tools too) comes from the fact that these predictive tools appear to run R functions that already implement this parameter, so it should be a fairly easy thing to add in Alteryx.
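
As a toy illustration of that contract (purely hypothetical, not an existing Alteryx tool), with run_stream standing in for whatever the downstream tools would compute from output 1:

# Hypothetical: run_stream(seed) means "run the stream on output 1 with
# this random seed and return the field value we want to maximize"
run_stream <- function(seed) {
  set.seed(seed)
  runif(1)  # placeholder metric; the real stream would return its own score
}

seeds <- 1:5
scores <- sapply(seeds, run_stream)
best.seed <- seeds[which.max(scores)]  # output 2: the winning seed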

Community_Admin
Alteryx
Status changed to: Inactive
 
Community_Admin
Alteryx

The status of this idea has been changed to 'Inactive'. This status indicates that:

 

1. The idea has not had activity in the form of likes or comments in over a year.

2. The idea has not reached ten likes.

3. The idea is still in the 'New Idea' status. 

 

However, this doesn't mean your idea won't be implemented! The Community can still like and comment on this idea. With enough renewed interest, this idea can be brought back into the 'New Idea' status. 

 

Thank you for contributing to the Alteryx Community and the Alteryx Product Idea Boards!