AWS Certified Data Engineer Associate DEA-C01 Practice Question

An on-premises manufacturing system writes 200 GB of CSV sensor files to a NAS share every hour. The analytics team must ingest all files into Amazon S3 once per day for downstream processing. The solution must minimize operational overhead, automatically retry failed transfers, and preserve the source files for audit. Which approach is the most cost-effective and best aligns with AWS best practices for batch ingestion?

  • Install the Kinesis Agent on the NAS server to continuously stream the files to an Amazon Kinesis Data Firehose delivery stream configured for S3.

  • Deploy an AWS DataSync agent on-premises and schedule a daily task to copy the NAS share to an S3 bucket with data integrity verification enabled.

  • Establish a Site-to-Site VPN, mount an Amazon EFS file system on-premises, and copy the files directly to EFS each day.

  • Run a daily AWS Database Migration Service (AWS DMS) full-load task to migrate the files into Amazon S3 using an S3 target endpoint.
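
For reference, one of the options above proposes a scheduled AWS DataSync copy from the NAS share to S3. The sketch below shows, in hedged form, roughly what that setup could look like with boto3, assuming a DataSync agent has already been deployed and activated on-premises; every hostname, path, bucket name, role, and ARN here is a hypothetical placeholder, not part of the original question.

"""Minimal sketch: schedule a daily AWS DataSync copy from an on-premises
NFS (NAS) share to Amazon S3 with data integrity verification enabled.
All ARNs, hostnames, paths, and bucket names are hypothetical placeholders."""
import boto3

datasync = boto3.client("datasync", region_name="us-east-1")

# Source location: the NAS share, exposed over NFS via an already-activated agent.
source = datasync.create_location_nfs(
    ServerHostname="nas.example.internal",                  # hypothetical NAS host
    Subdirectory="/exports/sensor-data",                     # hypothetical export path
    OnPremConfig={"AgentArns": [
        "arn:aws:datasync:us-east-1:111122223333:agent/agent-EXAMPLE"
    ]},
)

# Destination location: the S3 bucket that serves as the raw, audit-preserving landing zone.
destination = datasync.create_location_s3(
    S3BucketArn="arn:aws:s3:::example-sensor-raw-bucket",    # hypothetical bucket
    Subdirectory="/sensor-csv",
    S3Config={"BucketAccessRoleArn":
              "arn:aws:iam::111122223333:role/DataSyncS3AccessRole"},
)

# Daily task: DataSync retries failed transfers automatically, and the
# POINT_IN_TIME_CONSISTENT verify mode checks data integrity end to end.
task = datasync.create_task(
    Name="daily-nas-to-s3-sensor-copy",
    SourceLocationArn=source["LocationArn"],
    DestinationLocationArn=destination["LocationArn"],
    Options={"VerifyMode": "POINT_IN_TIME_CONSISTENT"},
    Schedule={"ScheduleExpression": "cron(0 2 * * ? *)"},    # one run per day at 02:00 UTC
)
print("Created DataSync task:", task["TaskArn"])

In this sketch the schedule expression limits the copy to a single daily run, and the built-in verification and retry behavior covers the integrity and reliability requirements without any custom transfer logic.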

AWS Certified Data Engineer Associate DEA-C01
Domain: Data Ingestion and Transformation