CompTIA DataX DY0-001 (V1) Practice Question

A data scientist is training a deep neural network for a complex image classification task using the Adam optimizer. They notice that during the initial training steps, the effective learning rate appears smaller than the configured alpha, leading to slower initial convergence; however, convergence speeds up after several iterations. Which intrinsic mechanism of the Adam optimizer corrects this initial behavior?

  • The adaptive scaling of learning rates for each parameter based on the second moment estimate (the moving average of squared gradients).

  • The application of a predefined learning rate decay schedule, which reduces the learning rate over time to allow for finer-grained convergence.

  • Bias correction for the first and second moment estimates, which counteracts their initialization at zero and provides a more accurate estimate in the early stages of training.

  • The calculation of the first moment estimate (the moving average of the gradients), which accelerates movement along directions where the gradient is consistent.
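
For reference, here is a minimal sketch of a single Adam update (illustrative variable and function names, not taken from any particular library). Because the moment estimates m and v are initialized at zero, their raw values are shrunk toward zero in early iterations; dividing by (1 − β₁ᵗ) and (1 − β₂ᵗ) is the bias correction that restores the intended step size:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, alpha=1e-3,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameters theta at step t (t starts at 1).
    m and v are the running first and second moment estimates."""
    m = beta1 * m + (1 - beta1) * grad       # first moment: moving average of gradients
    v = beta2 * v + (1 - beta2) * grad**2    # second moment: moving average of squared gradients
    m_hat = m / (1 - beta1**t)               # bias correction: undoes the zero-initialization shrinkage
    v_hat = v / (1 - beta2**t)
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

At t = 1 the uncorrected first moment is only (1 − β₁) = 10% of the current gradient, which matches the smaller-than-alpha effective steps described in the stem; the correction factor 1 / (1 − β₁ᵗ) exactly undoes that shrinkage and decays to 1 as t grows, which is why convergence speed recovers after several iterations.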
