CompTIA DataX DY0-001 (V1) Practice Question

A risk-analytics team is retraining a credit-default logistic regression that contains more than 200 highly correlated applicant attributes.

With pure L2 (ridge) regularization the model keeps almost every feature, making it hard to interpret. Switching to pure L1 (lasso) causes most coefficients to be driven to zero, but recall drops because only one variable from each correlated group survives. The team needs a regularization approach that (1) still shrinks coefficients to control variance, (2) can keep several correlated predictors that all carry signal, (3) can eliminate truly irrelevant variables, and (4) exposes a hyperparameter to dial the trade-off between these behaviors.

Which regularization technique best meets these requirements?

  • Increase the penalty term in ridge (L2) regression

  • Apply adaptive lasso so that weights guide variable selection

  • Use early stopping when the validation loss stops improving

  • Elastic net regularization with a tunable mixing parameter α
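
The mixing behavior the scenario describes can be sketched with scikit-learn's elastic-net logistic regression. The dataset and feature layout below are invented for illustration and are not part of the exam scenario; note that scikit-learn names the mixing parameter `l1_ratio` (0 gives pure ridge, 1 gives pure lasso), whereas the question's α follows the common glmnet convention.

```python
# Hedged sketch: elastic-net logistic regression on synthetic data with one
# highly correlated feature group. All names and values here are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
signal = rng.normal(size=(n, 1))

# Three highly correlated copies of the signal (a "correlated group")
# plus five irrelevant noise attributes.
X = np.hstack([signal + 0.05 * rng.normal(size=(n, 3)),
               rng.normal(size=(n, 5))])
y = (signal.ravel() + 0.5 * rng.normal(size=n) > 0).astype(int)

# penalty="elasticnet" requires the saga solver; l1_ratio dials the
# L1/L2 trade-off the question asks about.
model = LogisticRegression(penalty="elasticnet", solver="saga",
                           l1_ratio=0.5, C=1.0, max_iter=5000).fit(X, y)

print(model.coef_.round(2))
```

With `l1_ratio=0.5` the ridge component tends to spread weight across the correlated group (rather than keeping a single survivor, as pure lasso would), while the L1 component can still zero out irrelevant attributes, which is why the mixing parameter satisfies all four requirements.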
