AWS Certified Data Engineer Associate DEA-C01 Practice Question

A company ingests high-frequency IoT sensor readings and must land them in Amazon S3 in under 30 seconds. Operations teams also need the ability to replay any portion of the incoming stream if a downstream transformation job fails. Which solution meets these requirements while keeping operational overhead to a minimum?

  • Use an AWS IoT Core rule to write the sensor messages directly to an S3 bucket with the "Customer managed" retry option enabled.

  • Send the data to Amazon Kinesis Data Streams with a 24-hour retention period, add an Amazon Kinesis Data Firehose delivery stream as a consumer, and configure Firehose buffering to 1 MiB or 10 seconds before writing to Amazon S3.

  • Create an Amazon Kinesis Data Firehose delivery stream with Direct PUT as the source, and reduce the S3 buffering hints to 1 MiB and a 10-second interval.

  • Deploy an Apache Kafka cluster on Amazon EC2, configure a topic for the sensors, and use a Kafka Connect S3 sink to write data to Amazon S3 every 10 seconds.
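
To make the configuration terms in the options concrete, here is a minimal sketch of the parameters a Kinesis Data Streams–sourced Firehose delivery stream would take, with 1 MiB / 10-second S3 buffering. All ARNs, names, and the helper function are hypothetical placeholders, not real resources; actually creating the stream would require boto3 and AWS credentials.

```python
# Hypothetical illustration only: builds the create_delivery_stream parameters
# for a Firehose delivery stream that consumes a Kinesis data stream and
# buffers 1 MiB or 10 seconds (whichever comes first) before writing to S3.
# ARNs and names below are placeholders.

def firehose_request(stream_arn: str, bucket_arn: str, role_arn: str) -> dict:
    """Assemble the parameter dict for firehose.create_delivery_stream."""
    return {
        "DeliveryStreamName": "sensor-landing",
        # Read from an existing Kinesis data stream (which retains records
        # for replay) rather than accepting Direct PUT producers.
        "DeliveryStreamType": "KinesisStreamAsSource",
        "KinesisStreamSourceConfiguration": {
            "KinesisStreamARN": stream_arn,
            "RoleARN": role_arn,
        },
        "ExtendedS3DestinationConfiguration": {
            "RoleARN": role_arn,
            "BucketARN": bucket_arn,
            # Firehose flushes when EITHER threshold is hit, so records
            # land in S3 well inside the 30-second requirement.
            "BufferingHints": {"SizeInMBs": 1, "IntervalInSeconds": 10},
        },
    }

params = firehose_request(
    "arn:aws:kinesis:us-east-1:111122223333:stream/sensor-readings",
    "arn:aws:s3:::sensor-landing-bucket",
    "arn:aws:iam::111122223333:role/firehose-delivery-role",
)
# With boto3: boto3.client("firehose").create_delivery_stream(**params)
```
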

Data Ingestion and Transformation