AWS Certified Data Engineer Associate DEA-C01 Practice Question

A retail company ingests 50,000 clickstream events per second into an on-premises Apache Kafka cluster. The team wants to migrate the workload to AWS to offload cluster patching and node-failure recovery while keeping existing Kafka producer and consumer code, retaining data for 7 days, and minimizing ongoing operational cost. Which approach meets these requirements?

  • Provision a multi-AZ Amazon MSK cluster with the required number of brokers and set the topic retention to 7 days.

  • Write events directly to Amazon DynamoDB using Kinesis Data Streams for DynamoDB and enable DynamoDB Time to Live (TTL) set to 7 days.

  • Send events to an Amazon SQS FIFO queue, trigger AWS Lambda for processing, and archive messages in Amazon S3 after 7 days.

  • Create an Amazon Kinesis Data Stream with enough shards, enable 7-day extended retention, and migrate producers to use the Kinesis Producer Library.
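As a quick sanity check on the 7-day retention figure that several options hinge on, here is a minimal sketch of how that value maps to Kafka's standard `retention.ms` topic config. The broker address and topic name (`clickstream`) are placeholders, not values from the question:

```python
# 7 days expressed in milliseconds, the unit Kafka's retention.ms config expects.
retention_ms = 7 * 24 * 60 * 60 * 1000
print(retention_ms)  # 604800000

# Illustrative kafka-configs.sh invocation; <msk-broker> and the topic
# name are hypothetical placeholders for this sketch.
cmd = (
    "kafka-configs.sh --bootstrap-server <msk-broker>:9092 "
    "--alter --entity-type topics --entity-name clickstream "
    f"--add-config retention.ms={retention_ms}"
)
print(cmd)
```

Because Amazon MSK runs standard Apache Kafka, existing producer and consumer code and tooling like `kafka-configs.sh` continue to work unchanged.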

Data Store Management