AWS Certified Data Engineer Associate DEA-C01 Practice Question

An advertising analytics team stores about 100 TB of clickstream logs in a single Amazon S3 bucket. Analysts query the most recent 30 days of logs frequently, but access to older data is uncommon. Compliance rules mandate that all logs be retained for 2 years. The team wants to minimize storage costs without any recurring manual data movement. Which approach best meets these requirements?

  • Create external tables in Amazon Redshift Spectrum and schedule a script that runs ALTER TABLE UNLOAD commands to move data to S3 Glacier 30 days after ingestion.

  • Enable an Amazon EFS lifecycle policy to move the objects to the EFS Infrequent Access storage class after 30 days and configure EFS to delete the files after 2 years.

  • Enable S3 Intelligent-Tiering so that objects are automatically migrated across all storage tiers for the entire 2-year retention period.

  • Create an S3 Lifecycle rule that stores new objects in S3 Standard, transitions them to S3 Standard-IA after 30 days, transitions them again to S3 Glacier Deep Archive after 90 days, and automatically deletes them after 2 years.
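For reference, the tiering schedule described in the final option maps directly onto a single S3 Lifecycle rule. The sketch below builds the configuration as the dictionary that boto3's `put_bucket_lifecycle_configuration` accepts; the rule ID and bucket name are illustrative assumptions, and the actual API call is left commented out so the snippet runs offline:

```python
# Sketch of the S3 Lifecycle configuration from the final option.
# Rule ID and bucket name are assumed for illustration.
lifecycle_config = {
    "Rules": [
        {
            "ID": "clickstream-log-tiering",   # assumed rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},          # apply to every object in the bucket
            "Transitions": [
                # Hot for 30 days in S3 Standard, then move to Standard-IA
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                # Rarely accessed after 90 days: move to Glacier Deep Archive
                {"Days": 90, "StorageClass": "DEEP_ARCHIVE"},
            ],
            # Compliance: expire objects after the 2-year retention period
            "Expiration": {"Days": 730},
        }
    ]
}

# To apply (requires AWS credentials; shown for context only):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="clickstream-logs-bucket",  # assumed bucket name
#     LifecycleConfiguration=lifecycle_config,
# )
```

Because the transitions and expiration are declared once on the bucket, S3 moves and deletes objects automatically as they age, with no recurring manual data movement.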

Data Store Management