GCP Professional Data Engineer Practice Question

An analytics startup runs a Dataflow pipeline in the production project analytics-prod. The pipeline executes under service account [email protected], which currently holds the basic (formerly primitive) Editor role at the project level. A security audit flags this as excessive. The pipeline must do only two things:

  1. Read objects already stored in Cloud Storage bucket gs://ingest-data.
  2. Write query results into the existing BigQuery dataset marketing_prod.

To follow the principle of least privilege while keeping administrative overhead low, how should you update the service account's IAM permissions?

  • Remove the Editor role and instead grant Storage Object Viewer and BigQuery Data Owner as project-level roles.

  • Create a custom project-level role containing only storage.objects.get and bigquery.tables.updateData, then assign it to the service account.

  • Remove the Editor role and grant Storage Object Viewer on gs://ingest-data, BigQuery Data Editor on the marketing_prod dataset, and BigQuery Job User at the project level.

  • Keep the Editor role but add an organization policy that disables sensitive service APIs to mitigate risk.
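
For illustration of the mechanics the options describe, here is a minimal sketch of how bucket-scoped and dataset-scoped role bindings can be applied with the google-cloud-storage and google-cloud-bigquery Python clients, assuming the resource and service-account names from the scenario. Project-level changes (such as removing Editor or granting a project-scoped role like BigQuery Job User) would be made separately on the project's IAM policy, for example with gcloud.

    from google.cloud import bigquery, storage

    # Names taken from the scenario above.
    PROJECT = "analytics-prod"
    SA_EMAIL = "[email protected]"

    # Bucket-scoped binding: Storage Object Viewer on gs://ingest-data only.
    storage_client = storage.Client(project=PROJECT)
    bucket = storage_client.bucket("ingest-data")
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append(
        {"role": "roles/storage.objectViewer", "members": {f"serviceAccount:{SA_EMAIL}"}}
    )
    bucket.set_iam_policy(policy)

    # Dataset-scoped access entry on marketing_prod: WRITER is the dataset-level
    # access role that corresponds to BigQuery Data Editor.
    bq_client = bigquery.Client(project=PROJECT)
    dataset = bq_client.get_dataset(f"{PROJECT}.marketing_prod")
    entries = list(dataset.access_entries)
    entries.append(
        bigquery.AccessEntry(role="WRITER", entity_type="userByEmail", entity_id=SA_EMAIL)
    )
    dataset.access_entries = entries
    bq_client.update_dataset(dataset, ["access_entries"])

    # Project-level bindings (e.g. removing roles/editor, or granting
    # roles/bigquery.jobUser so the pipeline can run query jobs) live on the
    # project's IAM policy and are typically changed with
    # gcloud projects remove-iam-policy-binding / add-iam-policy-binding.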
