
Alteryx Designer Desktop Discussions


Probability Cutoff Value in R Model Comparison Tool

shakir_juolay
8 - Asteroid

I developed two separate Logistic Regression Models using the R Logistic Regression Tool.

In both cases the tool reported an OPTIMAL PROBABILITY CUTOFF that was close to the proportion of the positive class in my training set.

Also, in both cases this OPTIMAL PROBABILITY CUTOFF was used to create the Confusion Matrix.

 

When I compared these two models with the R Model Comparison Tool on a separate validation set, the tool used a probability cutoff of 0.5 for both models, leading to large differences between the confusion matrices of the training and validation sets.
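Since the Model Comparison tool appears to fix the cutoff at 0.5, the effect the cutoff has on the confusion matrix can be reproduced outside the tool from the scored probabilities. A minimal Python sketch (the scores and labels below are made up for illustration, not data from this thread):

```python
# Hypothetical predicted probabilities and true labels (illustrative only).
probs = [0.12, 0.45, 0.78, 0.33, 0.91, 0.05, 0.62, 0.27]
actual = [0, 1, 1, 0, 1, 0, 1, 0]

def confusion_matrix(probs, actual, cutoff):
    """Classify each record as positive when its probability >= cutoff,
    then tally the four confusion-matrix cells (tp, fp, tn, fn)."""
    tp = fp = tn = fn = 0
    for p, y in zip(probs, actual):
        pred = 1 if p >= cutoff else 0
        if pred == 1 and y == 1:
            tp += 1
        elif pred == 1 and y == 0:
            fp += 1
        elif pred == 0 and y == 0:
            tn += 1
        else:
            fn += 1
    return tp, fp, tn, fn

# Default 0.5 cutoff (what the Model Comparison tool uses) vs. a custom cutoff:
print(confusion_matrix(probs, actual, 0.5))   # -> (3, 0, 4, 1)
print(confusion_matrix(probs, actual, 0.35))  # -> (4, 0, 4, 0)
```

In a workflow, one could score the validation set, then apply the cutoff reported by the Logistic Regression Tool in this manner (for example with a Formula tool on the score field) to get confusion matrices comparable to the training report.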

 

Can I specify a different probability cutoff in the R Model Comparison Tool, or have it use the OPTIMAL PROBABILITY CUTOFF reported by the R Logistic Regression Tool?

 

I would also like to know how the R Logistic Regression Tool calculates the OPTIMAL PROBABILITY CUTOFF.
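The Alteryx documentation does not (to my knowledge) spell out the exact method, so the following is only a guess at what "optimal" means here. One common approach is to scan candidate cutoffs on the training data and keep the one maximizing Youden's J (sensitivity + specificity − 1); the observation that the reported cutoff sits near the positive-class proportion is also consistent with simple prevalence-based thresholding. A sketch of the Youden approach:

```python
def youden_optimal_cutoff(probs, actual):
    """Scan each observed probability as a candidate cutoff and keep the
    one maximizing Youden's J = sensitivity + specificity - 1.
    Assumes both classes are present in `actual`."""
    pos = sum(actual)
    neg = len(actual) - pos
    best_cut, best_j = 0.5, -1.0
    for cut in sorted(set(probs)):
        tp = sum(1 for p, y in zip(probs, actual) if p >= cut and y == 1)
        tn = sum(1 for p, y in zip(probs, actual) if p < cut and y == 0)
        j = tp / pos + tn / neg - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut

# e.g. youden_optimal_cutoff([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1]) -> 0.35
```

On an imbalanced training set, a cutoff chosen this way typically lands well below 0.5, which matches the behavior described above.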

 

Alteryx Designer Version 2021.2.1.35394

2 REPLIES
Theechat
5 - Atom

Hi,

 

Did you find a workaround for this problem? If so, could you please share it?

 

Theechat

shakir_juolay
8 - Asteroid

Hi,

 

No, I didn't.

What I do now is build models on a balanced dataset, mostly using the Python-based Assisted Modeling tool.
