GCP Professional Data Engineer Practice Question (Designing data processing systems)

Your security team requires that an external data ingestion service have only the minimum permissions needed to load CSV files into the existing BigQuery table sales_raw.daily_import in project retail-prod. The service must be able to append new rows with the bq load command every night, but it must never be able to read table data, overwrite or delete the table, or access other datasets in the project. What is the most appropriate way to satisfy the requirement while following the principle of least privilege?

  • Grant the predefined role BigQuery Data Editor (roles/bigquery.dataEditor) on the sales_raw dataset to the service account.

  • Create a custom IAM role containing only bigquery.tables.get and bigquery.tables.updateData, and bind that role to the service account on the sales_raw.daily_import table.

  • Assign the service account the predefined role BigQuery Job User (roles/bigquery.jobUser) on the project, which is sufficient for running bq load without additional permissions.

  • Grant the predefined role Storage Object Creator (roles/storage.objectCreator) on the project, because BigQuery load jobs read from Cloud Storage and write data implicitly.
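
For reference, here is a minimal sketch of how the custom-role approach (the second option) could be wired up with the gcloud and bq CLIs. The service account name, the Cloud Storage bucket, and the file path are illustrative assumptions, not part of the question. Note that running any load job also requires bigquery.jobs.create at the project level, which roles/bigquery.jobUser supplies.

    # Assumption: the ingestion service runs as this service account (name is illustrative).
    SA="serviceAccount:loader@retail-prod.iam.gserviceaccount.com"

    # Custom role limited to the permissions needed to append rows via a load job.
    gcloud iam roles create csvAppender \
        --project=retail-prod \
        --title="CSV Appender" \
        --permissions=bigquery.tables.get,bigquery.tables.updateData \
        --stage=GA

    # Bind the custom role on the single target table (table-level IAM),
    # so the service account cannot touch other tables or datasets.
    bq add-iam-policy-binding \
        --member="$SA" \
        --role="projects/retail-prod/roles/csvAppender" \
        retail-prod:sales_raw.daily_import

    # Load jobs need bigquery.jobs.create; grant Job User at project scope.
    gcloud projects add-iam-policy-binding retail-prod \
        --member="$SA" \
        --role=roles/bigquery.jobUser

    # Nightly append; --noreplace keeps existing rows (bucket and path are assumed).
    bq load --source_format=CSV --noreplace \
        retail-prod:sales_raw.daily_import \
        gs://retail-ingest/daily/sales_$(date +%F).csv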
