CompTIA DataX DY0-001 (V1) Practice Question

You have finished training a convolutional neural network that places a BatchNormalization layer after every convolution. Before exporting the model, you call the network with training=False so that all BatchNormalization layers run in inference mode.

Which statement correctly describes how those BatchNormalization layers normalize their inputs when training=False?

  • It subtracts the layer's moving mean, divides by the square root of the moving variance plus ε, and then applies the learned γ and β parameters that were updated during training.

  • It normalizes each feature by second-order moment estimates (e.g., Adam's vₜ) stored in the optimizer, so no internal moving averages are required.

  • It recomputes the mean and variance of the current inference batch, applies them for normalization, and updates the layer's moving averages without back-propagating gradients.

  • It skips normalization altogether and only performs the affine transform γx + β using the parameters learned during training.
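For reference, inference-mode batch normalization uses the moving statistics accumulated during training rather than the current batch's statistics, computing y = γ·(x − μ_moving)/√(σ²_moving + ε) + β. A minimal NumPy sketch (the function name and toy values are illustrative, not from any specific framework):

```python
import numpy as np

def batchnorm_inference(x, moving_mean, moving_var, gamma, beta, eps=1e-3):
    """Inference-mode batch norm: normalize with the stored moving
    statistics, then apply the learned affine transform gamma*x_hat + beta."""
    x_hat = (x - moving_mean) / np.sqrt(moving_var + eps)
    return gamma * x_hat + beta

# Toy check: with gamma=1, beta=0, eps=0 and exact statistics of x,
# the output is standardized (mean 0, population std 1).
x = np.array([1.0, 2.0, 3.0, 4.0])
y = batchnorm_inference(x, moving_mean=2.5, moving_var=1.25,
                        gamma=1.0, beta=0.0, eps=0.0)
```

Note that no statistics are recomputed from the inference batch and no moving averages are updated; the layer is a fixed element-wise transform at this point.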
