GCP Professional Data Engineer Practice Question

Your organization is migrating multiple analytics projects to Google Cloud. For regulatory reasons, every new resource, including BigQuery datasets, must reside exclusively in the europe-west3 region. Infrastructure is provisioned through a centralized Terraform-based CI/CD pipeline managed at the level of a folder that contains dozens of projects. What is the most effective way to programmatically guarantee that engineers cannot create resources outside europe-west3 in either existing or future projects while keeping day-to-day operational effort low?

  • Set each project's BigQuery default dataset location to europe-west3 in Terraform modules and monitor Cloud Logging for any deviations.

  • Create a custom IAM role that includes bigquery.datasets.create and add an IAM condition limiting its use to europe-west3, then bind this role to all engineering service accounts.

  • Place all projects in a VPC Service Controls perimeter restricted to europe-west3 so that resource creations outside the region are automatically blocked.

  • Define an organization policy at the folder level that restricts the constraints/gcp.resourceLocations list constraint to allow only the europe-west3 region, and manage it with a Terraform google_org_policy_policy resource.
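
For reference, the folder-level location restriction described in the last option could be expressed roughly as follows. This is a minimal Terraform sketch assuming the google provider's google_org_policy_policy resource; the folder ID is a placeholder, and the "in:europe-west3-locations" value group is one way to express the region (the bare value "europe-west3" is an alternative).

  resource "google_org_policy_policy" "restrict_resource_locations" {
    # Attach the policy at the folder so every existing and future project beneath it inherits the constraint.
    name   = "folders/123456789012/policies/gcp.resourceLocations"
    parent = "folders/123456789012"

    spec {
      rules {
        values {
          # Allow only locations in the europe-west3 value group (placeholder assumption; the plain region name also works).
          allowed_values = ["in:europe-west3-locations"]
        }
      }
    }
  }

Because the policy is defined once at the folder and inherited by every project beneath it, no per-project configuration is needed as new projects are added to the pipeline.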

Exam: GCP Professional Data Engineer
Domain: Designing data processing systems