AWS Certified Data Engineer Associate DEA-C01 Practice Question

A DynamoDB table that stores IoT sensor readings peaks at 40,000 writes per second. The analytics team must land every new item in an Amazon S3 data lake within 60 seconds. The solution must auto-scale, provide at-least-once delivery, and minimize operational overhead. Which architecture meets these requirements MOST effectively?

  • Create an AWS Glue streaming ETL job that consumes the table's stream ARN directly and writes the data to Amazon S3.

  • Use AWS Database Migration Service in change data capture mode to replicate the DynamoDB table continuously to an S3 target.

  • Enable DynamoDB Streams with the NEW_IMAGE view type and configure the stream as an event source for an AWS Lambda function; inside the function, batch the records and submit them to an Amazon Kinesis Data Firehose delivery stream that writes to S3.

  • Schedule an AWS Glue batch job every minute to export the entire table to S3 by using DynamoDB export to S3.
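The DynamoDB Streams → Lambda → Firehose pattern in the third option can be sketched as a short Lambda handler. This is a minimal illustration, not a production implementation; the delivery stream name `sensor-landing-stream` and the record-shaping helper are hypothetical, and error handling (e.g. retrying `FailedPutCount` entries for at-least-once delivery) is elided.

```python
import json


def build_firehose_records(event):
    """Convert DynamoDB stream records into Firehose PutRecordBatch entries.

    Only INSERT/MODIFY records carry a NewImage; REMOVE events are skipped.
    Each entry is newline-delimited JSON so objects landing in S3 are
    line-oriented and easy to query.
    """
    records = []
    for rec in event.get("Records", []):
        new_image = rec.get("dynamodb", {}).get("NewImage")
        if new_image is None:  # e.g. a REMOVE event has no NewImage
            continue
        records.append({"Data": json.dumps(new_image) + "\n"})
    return records


def handler(event, context):
    import boto3  # provided by the Lambda runtime

    firehose = boto3.client("firehose")
    batch = build_firehose_records(event)
    # PutRecordBatch accepts at most 500 records per call, so chunk the batch.
    for i in range(0, len(batch), 500):
        firehose.put_record_batch(
            DeliveryStreamName="sensor-landing-stream",  # hypothetical name
            Records=batch[i : i + 500],
        )
```

Because Firehose buffers by size or time (configurable down to 60 seconds before the buffering-interval minimum was lowered further), this pipeline can meet the 60-second landing requirement while Lambda's event-source mapping scales with the stream's shards.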

Data Ingestion and Transformation