CompTIA DataX DY0-001 (V1) Practice Question

A machine learning engineer is training a deep neural network for a non-stationary problem and notices that the learning process has effectively halted. They determine that their current optimizer, Adagrad, has caused the learning rate to diminish to a near-zero value. To mitigate this, they decide to switch to the Root Mean Square Propagation (RMSprop) optimizer. What is the key mechanism in RMSprop that directly addresses this issue of a rapidly vanishing learning rate seen in Adagrad?

  • It computes adaptive learning rates by storing an exponentially decaying average of past gradients (first moment) and past squared gradients (second moment).

  • It introduces a penalty term to the loss function based on the magnitude of the model's weights to prevent overfitting.

  • It adds a fraction of the previous weight update vector to the current one, helping to accelerate convergence and dampen oscillations.

  • It calculates a moving average of the squared gradients using a decay parameter, which prevents the denominator of the update rule from monotonically increasing.
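For intuition about the mechanism being asked about, the contrast between the two optimizers' accumulators can be sketched in a few lines of NumPy. This is a minimal illustration, not the implementation from any particular library; the function names, hyperparameter defaults, and variable names are chosen here for clarity.

```python
import numpy as np

def adagrad_update(w, grad, cache, lr=0.01, eps=1e-8):
    # Adagrad: cache accumulates every squared gradient it has ever seen,
    # so the denominator grows monotonically and the effective step size
    # shrinks toward zero over long training runs.
    cache += grad ** 2
    w -= lr * grad / (np.sqrt(cache) + eps)
    return w, cache

def rmsprop_update(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    # RMSprop: cache is an exponentially decaying moving average of squared
    # gradients, so older gradients are gradually forgotten, the denominator
    # does not grow without bound, and the effective learning rate can recover.
    cache = decay * cache + (1 - decay) * grad ** 2
    w -= lr * grad / (np.sqrt(cache) + eps)
    return w, cache
```

The decay parameter (often written as rho or beta, commonly around 0.9) is what keeps the squared-gradient accumulator from increasing monotonically.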
