GCP Professional Data Engineer Practice Question

Your data engineering team maintains a daily batch workflow that 1) executes a Dataflow job, 2) conditionally calls an external REST API, 3) waits for a BigQuery transformation to finish, and 4) sends a status email on completion. The workflow must offer dependency management, configurable retries and back-off for each step, a graphical DAG view for operators, and access to a large ecosystem of community-maintained connectors. The operations team also requires centralized Cloud Monitoring and Logging without having to manage Airflow itself. Which Google Cloud service should you recommend to orchestrate this workflow?

  • Workflows, expressed in YAML, invoking Cloud Functions and Cloud Run for each step.

  • Cloud Composer, the managed Apache Airflow service, defining the process as a DAG with task-level retries and native monitoring.

  • A single Dataflow Flex Template that embeds the orchestration logic inside the pipeline itself.

  • Cloud Scheduler jobs that publish Pub/Sub messages to trigger individual Cloud Functions sequentially.
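For illustration, below is a minimal sketch of how the workflow in the scenario could be expressed as an Apache Airflow DAG running on Cloud Composer. The project ID, Dataflow template path, HTTP connection ID, SQL statement, and email recipient are placeholder assumptions, not values taken from the question.

```python
"""Illustrative Airflow DAG for the scenario above (placeholder values throughout)."""
from datetime import timedelta

import pendulum
from airflow import DAG
from airflow.operators.email import EmailOperator
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.operators.dataflow import DataflowTemplatedJobStartOperator
from airflow.providers.http.operators.http import SimpleHttpOperator

default_args = {
    "retries": 3,                              # configurable retries per task
    "retry_delay": timedelta(minutes=5),       # base delay between attempts
    "retry_exponential_backoff": True,         # back-off for each step
}

with DAG(
    dag_id="daily_batch_workflow",
    schedule="0 2 * * *",                      # daily batch run
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    default_args=default_args,
) as dag:

    # Step 1: execute the Dataflow job (placeholder template path and project).
    run_dataflow = DataflowTemplatedJobStartOperator(
        task_id="run_dataflow_job",
        template="gs://my-bucket/templates/daily_batch",
        project_id="my-project",
        location="us-central1",
    )

    def _should_call_api(**context):
        # Placeholder condition; return the task_id of the branch to follow.
        return "call_partner_api" if context["logical_date"].day == 1 else "skip_api_call"

    # Step 2: conditionally call the external REST API via a branch task.
    decide_api_call = BranchPythonOperator(
        task_id="decide_api_call",
        python_callable=_should_call_api,
    )

    call_partner_api = SimpleHttpOperator(
        task_id="call_partner_api",
        http_conn_id="partner_api",            # connection configured in Airflow
        endpoint="/v1/notify",
        method="POST",
    )

    skip_api_call = EmptyOperator(task_id="skip_api_call")

    # Step 3: wait for the BigQuery transformation to finish (placeholder SQL).
    run_bq_transform = BigQueryInsertJobOperator(
        task_id="run_bq_transformation",
        configuration={
            "query": {
                "query": "CALL `my-project.analytics.daily_transform`();",
                "useLegacySql": False,
            }
        },
        # Run once either branch has finished (the other branch is skipped).
        trigger_rule="none_failed_min_one_success",
    )

    # Step 4: send a status email on completion (placeholder recipient).
    send_status_email = EmailOperator(
        task_id="send_status_email",
        to="data-team@example.com",
        subject="Daily batch workflow finished",
        html_content="The daily batch workflow completed for {{ ds }}.",
    )

    run_dataflow >> decide_api_call >> [call_partner_api, skip_api_call] >> run_bq_transform >> send_status_email
```

In this sketch, task-level retries and exponential back-off come from default_args, the conditional API call is modeled with a branch task, and the BigQuery step joins the branches with a trigger rule so it runs regardless of which branch was taken.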
