CompTIA DataX DY0-001 (V1) Practice Question

A data-science team trains a GradientBoostingClassifier with these key settings (see the code sketch after the answer choices):

  • n_estimators = 400
  • learning_rate = 0.10
  • max_depth = 4
  • subsample = 1.0

The model attains an F1-score of 0.96 on the training data but only 0.80 on a held-out validation set. Because of limited compute, the team must reduce overfitting without noticeably increasing training time, and they may adjust exactly one hyper-parameter. Which single change is most likely to achieve this goal?

  • Double n_estimators to 800 while keeping all other parameters unchanged.

  • Raise learning_rate to 0.30 and cut n_estimators to 150.

  • Increase max_depth to 8 to capture higher-order feature interactions.

  • Lower subsample to 0.7 so each tree is trained on a random 70% of the rows.
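
For reference, a minimal sketch of the baseline configuration described in the question stem, assuming scikit-learn's GradientBoostingClassifier; the synthetic dataset, variable names, and train/validation split below are illustrative assumptions, not part of the question.

```python
# Illustrative sketch only: placeholder data stands in for the team's dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Hypothetical synthetic dataset and 80/20 train/validation split.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Baseline hyperparameters exactly as listed in the question stem.
model = GradientBoostingClassifier(
    n_estimators=400,
    learning_rate=0.10,
    max_depth=4,
    subsample=1.0,
    random_state=42,
)
model.fit(X_train, y_train)

# Compare training vs. validation F1 to observe the generalization gap.
print("train F1:", f1_score(y_train, model.predict(X_train)))
print("val   F1:", f1_score(y_val, model.predict(X_val)))
```

Each answer choice corresponds to changing one of the keyword arguments above and re-running the same comparison.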
