CompTIA DataX DY0-001 (V1) Practice Question

A machine learning engineer is developing a neural network for a multi-class classification problem with five distinct, mutually exclusive categories. The output layer of the network is designed with five neurons, and the engineer chooses the Softmax activation function. For training this model, which loss function should be paired with the Softmax output layer to ensure both mathematical efficiency and a meaningful interpretation of the model's error?

  • Binary Cross-Entropy, because it can be applied to each output neuron individually, treating the multi-class problem as a series of independent binary classification tasks.

  • Mean Squared Error (MSE), because it is a versatile loss function that effectively penalizes the squared difference between the predicted probabilities and the one-hot encoded true labels.

  • Categorical Cross-Entropy, because its combination with Softmax results in a simplified and stable gradient calculation, where the gradient for each output neuron is the difference between the predicted probability and the actual target value.

  • Hinge Loss, because it is designed for maximum-margin classification and is effective at creating a clear separation between the output class probabilities.
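The gradient claim in the categorical cross-entropy option can be checked numerically. The sketch below (a minimal NumPy illustration, not part of the original question) computes the analytic gradient of the combined Softmax + categorical cross-entropy loss with respect to the logits, which simplifies to the predicted probability minus the one-hot target, and verifies it against a central-difference numerical gradient for a five-class example:

```python
import numpy as np

def softmax(z):
    # Shift by the max logit for numerical stability
    e = np.exp(z - z.max())
    return e / e.sum()

def categorical_cross_entropy(p, y):
    # y is a one-hot target vector; p holds predicted probabilities
    return -np.sum(y * np.log(p))

# Five-class example matching the question scenario
z = np.array([2.0, 1.0, 0.1, -1.0, 0.5])   # raw logits from five output neurons
y = np.array([0.0, 1.0, 0.0, 0.0, 0.0])    # one-hot label: true class is index 1

p = softmax(z)
analytic_grad = p - y   # the simplified gradient w.r.t. the logits

# Verify against a central-difference numerical gradient of the loss
eps = 1e-6
numeric_grad = np.zeros_like(z)
for i in range(len(z)):
    zp, zm = z.copy(), z.copy()
    zp[i] += eps
    zm[i] -= eps
    numeric_grad[i] = (categorical_cross_entropy(softmax(zp), y)
                       - categorical_cross_entropy(softmax(zm), y)) / (2 * eps)

assert np.allclose(analytic_grad, numeric_grad, atol=1e-5)
```

The agreement between the two gradients illustrates why this pairing is both mathematically efficient (no separate backpropagation through the Softmax Jacobian is needed) and interpretable: each output neuron's error signal is simply how far its predicted probability is from the true target.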

Domain: Machine Learning