AWS Certified Data Engineer Associate DEA-C01 Practice Question

A company ingests clickstream events into an Amazon Kinesis data stream. A data engineer must validate each record and deliver the data to Amazon S3 as 64 MiB objects. The solution must retry automatically if S3 is temporarily unavailable and must require the least operational effort while remaining fully serverless. Which approach meets these requirements?

  • Enable Amazon Redshift streaming ingestion on the Kinesis data stream and UNLOAD the materialized view to S3 at 64 MiB intervals.

  • Deploy an Amazon MSK Connect S3 Sink connector to consume from the Kinesis data stream and write 64 MiB objects to S3.

  • Configure the Kinesis data stream to trigger a Lambda function that stores incoming records in memory and uploads them to S3 when the total reaches 64 MiB.

  • Create a Kinesis Data Firehose delivery stream that uses the Kinesis data stream as its source, add a Lambda function for record validation, set the S3 destination, and configure a 64 MiB buffer size.
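The last option describes a Kinesis Data Firehose delivery stream, which handles buffering, S3 retries, and Lambda-based transformation as managed features. A minimal sketch of the parameters such a configuration implies is shown below; all ARNs, names, and the 300-second interval are placeholder assumptions, not values from the question.

```python
# Sketch only (not executed against AWS): parameters for a Firehose
# delivery stream that reads from a Kinesis data stream, validates
# records with Lambda, and buffers 64 MB objects to S3.
# Every ARN below is a placeholder.

STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/clickstream"   # placeholder
ROLE_ARN = "arn:aws:iam::111122223333:role/firehose-delivery-role"         # placeholder
BUCKET_ARN = "arn:aws:s3:::clickstream-landing-bucket"                     # placeholder
VALIDATOR_ARN = "arn:aws:lambda:us-east-1:111122223333:function:validate"  # placeholder

delivery_stream_params = {
    "DeliveryStreamName": "clickstream-to-s3",
    # Read directly from the existing Kinesis data stream.
    "DeliveryStreamType": "KinesisStreamAsSource",
    "KinesisStreamSourceConfiguration": {
        "KinesisStreamARN": STREAM_ARN,
        "RoleARN": ROLE_ARN,
    },
    "ExtendedS3DestinationConfiguration": {
        "RoleARN": ROLE_ARN,
        "BucketARN": BUCKET_ARN,
        # Flush to S3 when the buffer reaches 64 MB or after 300 s,
        # whichever comes first (interval value is an assumption).
        "BufferingHints": {"SizeInMBs": 64, "IntervalInSeconds": 300},
        # Firehose invokes the validation Lambda on each buffered batch
        # before delivery, and retries S3 writes automatically.
        "ProcessingConfiguration": {
            "Enabled": True,
            "Processors": [{
                "Type": "Lambda",
                "Parameters": [
                    {"ParameterName": "LambdaArn",
                     "ParameterValue": VALIDATOR_ARN},
                ],
            }],
        },
    },
}

# With boto3, this dictionary would be passed as:
#   boto3.client("firehose").create_delivery_stream(**delivery_stream_params)
```

Because buffering, retries, and the Lambda invocation are all built into the service, no consumer code or stateful Lambda buffering (as in the third option) is needed.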

Domain: Data Ingestion and Transformation