CompTIA DataX DY0-001 (V1) Practice Question

An engineering team implements dropout regularization in a feed-forward neural network using the "inverted" convention adopted by modern libraries. The dropout rate is set to 0.30 (each unit is dropped with probability 0.30). Which statement correctly describes what happens to the activations during training and inference under this convention?

  • Training: each unit is multiplied by Gaussian noise with mean 0.70 and variance 0.21; inference: activations are divided by 0.70 before being passed forward.

  • Training: each unit is set to zero with probability 0.30 and the surviving activations are multiplied by 0.70; inference: no additional scaling is required because the network learns to compensate automatically.

  • Training: each unit is set to zero with probability 0.30 and the surviving activations are divided by 0.70; inference: no units are dropped and no extra scaling is applied.

  • Training: each unit is set to zero with probability 0.30 with no scaling; inference: all activations are multiplied by 0.70 to compensate for the missing units.
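The inverted-dropout convention the question describes can be sketched in a few lines of NumPy. This is a minimal illustration, not any particular library's implementation: during training, units are zeroed with probability `rate` and survivors are divided by the keep probability `1 - rate`, so the expected activation is unchanged; at inference, the input passes through untouched.

```python
import numpy as np

def inverted_dropout(activations, rate=0.30, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability
    `rate` and divide survivors by the keep probability (1 - rate).
    At inference, return the activations unchanged."""
    if not training:
        # Inference: no units dropped, no extra scaling applied.
        return activations
    rng = rng if rng is not None else np.random.default_rng()
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob  # True with prob 0.70
    # Dividing by keep_prob keeps E[output] equal to the input activation.
    return activations * mask / keep_prob

x = np.ones(10_000)
out = inverted_dropout(x, rate=0.30, training=True,
                       rng=np.random.default_rng(0))
print(out.mean())  # close to 1.0: scaling preserves the expected activation
```

Because the training-time division by 0.70 already compensates for the dropped units, the inference path needs no correction, which is what distinguishes the inverted convention from the original dropout formulation (where activations were instead scaled at test time).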

Domain: Machine Learning