AWS Certified Data Engineer Associate DEA-C01 Practice Question

A company stores customer records in Salesforce and keeps a data lake in Amazon S3. For nightly analytics, the team must load only data that changed in the previous 24 hours. The output objects must be partitioned into year/month/day folders, respect Salesforce API limits, and avoid any custom code or servers. Which approach meets these requirements with the least ongoing management?

  • Deploy an AWS DataSync agent, export daily CSV files from Salesforce, and schedule DataSync to transfer the files to Amazon S3 using a date prefix.

  • Configure AWS Database Migration Service (AWS DMS) to use Salesforce as the source and Amazon S3 as the target with a replication task that runs once per day.

  • Create an Amazon AppFlow flow that uses Salesforce as the source, runs on a daily schedule with incremental transfer, and enables partitioning by year/month/day in the S3 destination.

  • Develop an AWS Glue Spark job that calls the Salesforce Bulk API, filters records changed in the last day, and writes partitioned Parquet files to Amazon S3.
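The AppFlow option above can be sketched as a `create_flow` request payload. This is a minimal illustration, not a complete configuration: the field names follow the boto3 `appflow.create_flow` API, while the flow name, connection profile, bucket, and Salesforce object are placeholder assumptions. The key settings for this scenario are `dataPullMode: "Incremental"` (transfer only records changed since the last run) and a date-granularity prefix on the S3 destination (year/month/day folders).

```python
def build_flow_request(flow_name="salesforce-daily-incremental",
                      connector_profile="my-salesforce-profile",  # assumed profile name
                      bucket="my-data-lake"):                     # assumed bucket name
    """Build a request payload for appflow.create_flow (illustrative sketch)."""
    return {
        "flowName": flow_name,
        "triggerConfig": {
            "triggerType": "Scheduled",
            "triggerProperties": {
                "Scheduled": {
                    # Daily schedule; exact expression format per the AppFlow docs
                    "scheduleExpression": "rate(1days)",
                    # Incremental: only records changed since the previous run
                    "dataPullMode": "Incremental",
                },
            },
        },
        "sourceFlowConfig": {
            "connectorType": "Salesforce",
            "connectorProfileName": connector_profile,
            "sourceConnectorProperties": {
                # "Account" is an example object; any Salesforce object works
                "Salesforce": {"object": "Account"},
            },
        },
        "destinationFlowConfigList": [
            {
                "connectorType": "S3",
                "destinationConnectorProperties": {
                    "S3": {
                        "bucketName": bucket,
                        "s3OutputFormatConfig": {
                            "fileType": "PARQUET",
                            # PATH-style prefixes at DAY granularity yield
                            # year/month/day folder partitions in S3
                            "prefixConfig": {
                                "prefixType": "PATH",
                                "prefixFormat": "DAY",
                            },
                        },
                    },
                },
            }
        ],
        # Map all source fields straight through to the destination
        "tasks": [
            {
                "sourceFields": [],
                "taskType": "Map_all",
                "connectorOperator": {"Salesforce": "NO_OP"},
                "taskProperties": {},
            }
        ],
    }
```

A real deployment would pass this payload to `boto3.client("appflow").create_flow(**request)`; because AppFlow is fully managed, no servers or custom scheduling code are needed, and the service handles Salesforce API throttling.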

Data Ingestion and Transformation