While reviewing a binary logistic regression that predicts customer default (1) versus non-default (0), an analyst remarks: "The coefficient for DebtRatio is -0.405, so reducing DebtRatio by one unit will lower the probability of default by 40.5%." Which explanation best corrects the analyst's misinterpretation?
The statement is wrong because coefficients must first be standardized; only standardized values can be interpreted as percentage changes in probability.
A negative coefficient mainly signals severe multicollinearity; therefore the analyst should drop correlated predictors instead of interpreting the value.
Coefficients whose absolute value is below 0.5 are effectively zero, so DebtRatio has no meaningful or interpretable effect on default risk.
Logistic regression coefficients are expressed in log-odds, so you must exponentiate -0.405 (≈ 0.67) to see that the odds of default fall by about 33%; the change in probability is not 40.5%.
Logistic regression models the natural logarithm of the odds of the positive class, so a coefficient represents an additive change in log-odds, not in probability. To obtain an interpretable effect you must exponentiate the coefficient: exp(-0.405) ≈ 0.67. A one-unit increase in DebtRatio therefore multiplies the odds of default by about 0.67, a 33% reduction in the odds (a one-unit decrease multiplies them by exp(0.405) ≈ 1.50); this is not a 40.5% drop in probability. The actual change in probability is nonlinear and depends on the baseline probability, so it cannot be read directly from the raw coefficient. Standardizing predictors, multicollinearity, and the absolute size of the coefficient do not explain the analyst's error.
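A quick numeric sketch (plain Python, standard library only) makes the point concrete: the baseline probabilities below are illustrative assumptions, chosen to show that the same odds ratio produces very different probability changes depending on where you start.

```python
import math

beta = -0.405  # raw log-odds coefficient for DebtRatio (from the question)

# Exponentiate the coefficient to get the odds ratio per one-unit increase
odds_ratio = math.exp(beta)
print(f"odds ratio: {odds_ratio:.3f}")  # ~0.667, i.e. odds fall ~33%

def new_probability(baseline_p, ratio):
    """Probability after multiplying the baseline odds by the odds ratio."""
    odds = baseline_p / (1 - baseline_p)
    new_odds = odds * ratio
    return new_odds / (1 + new_odds)

# The same odds ratio shifts probability by different amounts
# at different baselines -- the effect is nonlinear:
for p in (0.10, 0.50, 0.90):
    q = new_probability(p, odds_ratio)
    print(f"baseline p={p:.2f} -> new p={q:.3f} (change {q - p:+.3f})")
```

Running this shows the probability drop is only a few points at a 10% baseline but roughly ten points at a 50% baseline, which is why the raw coefficient can never be read as a fixed percentage change in probability.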