AWS Certified Data Engineer Associate DEA-C01 Practice Question

An e-commerce company must move 50 TB of on-premises MySQL data to AWS. Updates will continue on-premises for several months, but analysts want to begin running Amazon Redshift queries immediately. The final architecture will use Redshift local tables. The migration team needs an interim solution that keeps storage costs low and avoids a full reload when the cut-over occurs. Which approach best aligns the storage design with these migration requirements?

  • Ship MySQL backups on an AWS Snowball Edge device, restore them into Amazon RDS for MySQL, and run Redshift federated queries against the RDS instance until the migration is finished.

  • Export nightly CSV dumps to Amazon S3 and issue a Redshift COPY after each export; perform a final full reload on cut-over day.

  • Use AWS Database Migration Service to replicate the MySQL data directly into Amazon Redshift tables and let CDC keep the cluster up to date until cut-over.

  • Use AWS Database Migration Service to replicate the MySQL tables into Amazon S3 in Parquet format. Define Amazon Redshift Spectrum external tables on the S3 data and, at cut-over, COPY the same Parquet files into Redshift local tables.
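The Spectrum-then-COPY pattern described in the last option can be sketched with the two Redshift SQL statements it implies: an external schema so analysts can query the Parquet files in place, and a COPY that loads the same files into a local table at cut-over. This is a minimal sketch; the schema, table, bucket, and IAM role names below are hypothetical placeholders, not values from the question.

```python
# Sketch of the Spectrum-then-COPY pattern. All identifiers (role ARN,
# bucket prefix, schema and table names) are hypothetical placeholders.
IAM_ROLE = "arn:aws:iam::123456789012:role/RedshiftSpectrumRole"  # placeholder
S3_PREFIX = "s3://example-bucket/mysql-export/orders/"            # placeholder

# 1) Interim phase: expose the DMS-written Parquet files to analysts
#    through a Redshift Spectrum external schema (queried in place,
#    so no cluster storage is consumed).
create_external_schema = f"""
CREATE EXTERNAL SCHEMA migration
FROM DATA CATALOG
DATABASE 'mysql_export'
IAM_ROLE '{IAM_ROLE}'
CREATE EXTERNAL DATABASE IF NOT EXISTS;
"""

# 2) Cut-over: load the very same Parquet files into a Redshift local
#    table with COPY, avoiding a separate full re-export from MySQL.
copy_to_local = f"""
COPY public.orders
FROM '{S3_PREFIX}'
IAM_ROLE '{IAM_ROLE}'
FORMAT AS PARQUET;
"""

print(create_external_schema)
print(copy_to_local)
```

Because both phases read the same S3 objects, the interim external tables and the final local load stay consistent without a second extraction from the source database.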

Domain: Data Store Management