CompTIA DataX DY0-001 (V1) Practice Question

A machine learning engineer is upgrading a natural language processing pipeline that uses an RNN-based architecture for machine translation. The existing model struggles with long-term dependencies in lengthy sentences and faces slow training times due to its sequential nature. To address these issues, the engineer decides to implement a Transformer model. Which core component of the Transformer architecture directly addresses both the challenge of capturing long-range dependencies and the bottleneck of sequential processing?

  • Residual connections and layer normalization.

  • The encoder-decoder stack.

  • Positional encodings.

  • The self-attention mechanism.

Domain: Machine Learning
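To see why self-attention addresses both problems the question describes, consider a minimal sketch of scaled dot-product self-attention (single head, no learned query/key/value projections, NumPy only; this is an illustrative toy, not a full Transformer layer). Every position attends to every other position in one matrix multiplication, so any long-range dependency is a single step away, and all positions are computed in parallel rather than sequentially as in an RNN.

```python
import numpy as np

def self_attention(X):
    """Toy scaled dot-product self-attention over token embeddings X
    of shape (seq_len, d). Uses X itself as queries, keys, and values
    (real Transformers apply learned projections first)."""
    d = X.shape[-1]
    # Pairwise similarity between all positions: (seq_len, seq_len).
    scores = X @ X.T / np.sqrt(d)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted sum over ALL positions at once.
    return weights @ X, weights

# A toy "sentence" of 4 token embeddings of dimension 8.
X = np.random.default_rng(0).normal(size=(4, 8))
out, w = self_attention(X)
```

Note that the attention weight matrix `w` directly links position 0 to position 3 with a single weight, whereas an RNN would have to propagate that information through three sequential hidden-state updates.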