I'm not exactly sure what you mean by the cutoff. The data you feed into your various models should have a binary target variable containing 1s and 0s. The Model Comparison is just asking which value, 1 or 0, is the positive class in your case. So if 1 means "Yes" for you, then you would enter that.
If I'm understanding you correctly, there is no cutoff in the models themselves - if your data is 80% 1s and 20% 0s, the models will train on that distribution. If you want something different, you can use the Oversample Tool to change the percentages, so it's an even 50/50 split.
My question was about how to choose the cutoff after the model produces the probability.
Does the model automatically predict the outcome with the higher probability?
For example, if a logistic regression model produces 0.45 for 1 and 0.55 for 0, can I choose to have it predict 1 rather than 0?
When I build the model in base R, I can program it to choose the cutoff that gives me the best results (confusion matrix, AUC, etc.). Is there any way to do this in any of the Alteryx macro tools?
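For readers wondering what that cutoff search looks like in code: it is just a scan over candidate thresholds, scoring each one against the validation labels. A minimal sketch in Python (the data, variable names, and the accuracy criterion are all illustrative assumptions, not anything produced by Alteryx; in practice you might optimize Youden's J or another metric instead):

```python
# Hypothetical validation data: true labels and model probabilities P(class = 1).
labels = [0, 0, 0, 1, 0, 1, 1, 1]
probs  = [0.1, 0.2, 0.3, 0.35, 0.4, 0.45, 0.7, 0.9]

def accuracy_at(cutoff):
    # Predict 1 when the probability meets the cutoff, then compare to labels.
    preds = [1 if p >= cutoff else 0 for p in probs]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Scan a grid of candidate cutoffs and keep the one with the best accuracy.
candidates = [i / 100 for i in range(1, 100)]
best_cutoff = max(candidates, key=accuracy_at)
print(best_cutoff, accuracy_at(best_cutoff))
```

On this toy data the best cutoff is below 0.5, which is exactly the situation where overriding the default matters.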
So the cutoff isn't hard-coded in the tools themselves, but rather embedded in the underlying libraries. Logistic regression, for instance, uses ROC. If you wanted to adjust this, you'd have to change how the tool makes its predictions by modifying the code inside the tool itself.
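A workaround that avoids editing the macro is to keep the probability field from scoring and apply your own cutoff in a downstream step, rather than relying on the default predicted class (for a binary target, "predict whichever class has the higher probability" is equivalent to a 0.5 cutoff). A hedged Python sketch of that post-processing step, with made-up field names:

```python
# Hypothetical scored rows: "Score_1" stands in for a probability field
# like the one a scoring step would output for P(target = 1).
rows = [
    {"Score_1": 0.55},
    {"Score_1": 0.45},
    {"Score_1": 0.20},
]

CUTOFF = 0.4  # chosen from your own confusion-matrix / AUC analysis

# Re-derive the predicted class with the custom cutoff.
for row in rows:
    row["Predicted"] = 1 if row["Score_1"] >= CUTOFF else 0

print([row["Predicted"] for row in rows])
```

Note that with the 0.4 cutoff the 0.45 row is classified as 1, which the default 0.5 behavior would have called 0.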