AWS Certified Data Engineer Associate DEA-C01 Practice Question

A data engineering team must run a weekly batch pipeline to extract data from Amazon RDS, transform it with an AWS Glue job, and load the results into Amazon Redshift. The team wants a fully managed orchestration solution that provides visual workflow monitoring, native retry logic, and minimal operational overhead. Which approach meets these requirements?

  • Implement the pipeline as a DAG in Amazon Managed Workflows for Apache Airflow, using Python and Redshift operators, and rely on Airflow's scheduler to run it weekly.

  • Build an AWS Glue Workflow that triggers the Glue job followed by an AWS Lambda function that executes the Redshift COPY command, and invoke the workflow weekly with EventBridge.

  • Configure two EventBridge rules: one to start the AWS Glue job and a second rule scheduled an hour later to run the Redshift COPY operation. Monitor both steps with CloudWatch alarms.

  • Create an Amazon EventBridge rule that triggers an AWS Step Functions Standard workflow. The workflow first uses the native StartJobRun integration to run the AWS Glue job, then calls the Amazon Redshift Data API to load the data.
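The Step Functions approach in the last option can be sketched as an Amazon States Language definition. This is an illustrative fragment, not a complete deployment: the Glue job name, cluster identifier, database, table, S3 path, and IAM role ARN below are all placeholder values. It uses the real `glue:startJobRun.sync` optimized integration (the `.sync` suffix makes the state wait for the Glue job to finish) and the AWS SDK integration for the Redshift Data API's `ExecuteStatement`, with a retry policy showing the native retry logic the question calls for.

```json
{
  "Comment": "Weekly batch pipeline sketch: run a Glue job, then COPY into Redshift. All names are illustrative.",
  "StartAt": "RunGlueJob",
  "States": {
    "RunGlueJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::glue:startJobRun.sync",
      "Parameters": { "JobName": "weekly-transform-job" },
      "Retry": [
        {
          "ErrorEquals": ["States.ALL"],
          "IntervalSeconds": 60,
          "MaxAttempts": 3,
          "BackoffRate": 2.0
        }
      ],
      "Next": "LoadIntoRedshift"
    },
    "LoadIntoRedshift": {
      "Type": "Task",
      "Resource": "arn:aws:states:::aws-sdk:redshiftdata:executeStatement",
      "Parameters": {
        "ClusterIdentifier": "analytics-cluster",
        "Database": "dev",
        "DbUser": "etl_user",
        "Sql": "COPY sales FROM 's3://example-bucket/output/' IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole' FORMAT AS PARQUET;"
      },
      "End": true
    }
  }
}
```

A weekly EventBridge schedule rule (for example, `cron(0 6 ? * MON *)`) would target this state machine, and the Step Functions console provides the visual monitoring of each execution.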

Data Operations and Support