GCP Professional Data Engineer Practice Question

Your company runs a nightly Apache Beam pipeline on Cloud Dataflow. The pipeline, triggered by Cloud Scheduler, reads compressed files from the Cloud Storage bucket "raw-logs", transforms the data, and appends the results to tables in the BigQuery dataset "analytics". The pipeline executes under the service account pipeline-sa@project. A new security policy requires replacing the project-level Editor role currently assigned to this account with a least-privilege alternative while keeping the job fully functional. Which IAM configuration meets the requirement?

  • Create a custom project-level role containing bigquery.tables.update, storage.objects.get, and dataflow.jobs.create, and bind it to pipeline-sa.

  • Grant pipeline-sa roles/dataflow.worker on the project, roles/bigquery.dataEditor on the "analytics" dataset, and roles/storage.objectViewer on the "raw-logs" bucket.

  • Replace the Editor role with the primitive Viewer role at the project level and grant no further permissions.

  • Grant roles/bigquery.admin and roles/storage.admin on the project to pipeline-sa; no Dataflow role is needed because the account already owns the project.
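For context when weighing the options: fine-grained grants of this kind are applied on each resource's own IAM surface rather than as a single project-level binding. The sketch below uses the Python client libraries with placeholder project and service-account names (not taken from the question) to show how a bucket-level and a dataset-level binding could be added; a project-level binding such as roles/dataflow.worker would normally be added with `gcloud projects add-iam-policy-binding` or the Resource Manager API.

```python
# Illustrative sketch only: project ID and service-account e-mail are placeholders.
from google.cloud import bigquery, storage

SA_EMAIL = "pipeline-sa@my-project.iam.gserviceaccount.com"

# Bucket-scoped grant: let the service account read objects in "raw-logs".
storage_client = storage.Client(project="my-project")
bucket = storage_client.bucket("raw-logs")
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectViewer", "members": {f"serviceAccount:{SA_EMAIL}"}}
)
bucket.set_iam_policy(policy)

# Dataset-scoped grant: WRITER access on "analytics" is the dataset-level
# equivalent of roles/bigquery.dataEditor.
bq_client = bigquery.Client(project="my-project")
dataset = bq_client.get_dataset("my-project.analytics")
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="WRITER",
        entity_type="userByEmail",
        entity_id=SA_EMAIL,
    )
)
dataset.access_entries = entries
bq_client.update_dataset(dataset, ["access_entries"])

# A project-level binding (e.g. roles/dataflow.worker) is typically added with
# `gcloud projects add-iam-policy-binding` rather than through these clients.
```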
