AWS Certified Data Engineer Associate DEA-C01 Practice Question

An e-commerce company must ingest millions of clickstream events per minute with sub-second latency, maintain strict event ordering within each user session, and fan the data out simultaneously to a real-time AWS Lambda analytics function and an Amazon S3 data lake for cold storage. Which solution is the MOST cost-effective way to meet these requirements?

  • Send events to an Amazon Kinesis Data Stream partitioned by session ID; configure an enhanced fan-out AWS Lambda consumer for analytics and a Kinesis Data Firehose delivery stream to load the data into Amazon S3.

  • Write events directly to an Amazon DynamoDB table using the session ID as the partition key and enable DynamoDB Streams for a Lambda function and for a Kinesis Data Firehose stream that delivers records to Amazon S3.

  • Publish each event to an Amazon SNS standard topic; subscribe one AWS Lambda function for analytics and one Amazon SQS queue that an AWS Glue job reads to write the data to Amazon S3.

  • Deploy an Apache Kafka cluster on Amazon EC2; create Kafka consumer groups for a Lambda analytics function and for a connector that writes records to Amazon S3.
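For reference, here is a minimal producer sketch (assuming Python with boto3; the stream name and event fields are hypothetical) illustrating the mechanism the Kinesis-based option relies on: using the session ID as the partition key routes every event from a given session to the same shard, which is what gives Kinesis its strict per-session ordering guarantee.

```python
import json
import boto3

# Hypothetical region and stream name for illustration only.
kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_clickstream_event(event: dict) -> None:
    """Send one clickstream event to the Kinesis data stream.

    Using the session ID as the PartitionKey sends all events from the
    same session to the same shard, so Kinesis preserves their order
    within that session.
    """
    kinesis.put_record(
        StreamName="clickstream-events",         # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["session_id"],        # keys per-session ordering
    )

publish_clickstream_event(
    {"session_id": "sess-1234", "page": "/checkout", "ts": 1700000000}
)
```

Downstream of the stream, an enhanced fan-out consumer gets its own dedicated read throughput per shard, so the Lambda analytics function and the Firehose delivery stream can both read the same data without competing for the shared 2 MB/s-per-shard read limit.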

Domain: Data Ingestion and Transformation