GCP Professional Data Engineer Practice Question

Your manufacturing company ingests several terabytes of IoT sensor data each day as partitioned Parquet files in a Cloud Storage data lake. The data engineering team must give hundreds of analysts interactive SQL access while enforcing BigQuery row-level security policies and maintaining a unified audit trail. Management wants to avoid duplicating the data and minimize storage cost, but still needs BigQuery-style metadata caching for performance. Which architecture best satisfies these requirements?

  • Run a daily Dataflow pipeline that loads the Parquet files into partitioned BigQuery native tables and apply column- and row-level security there.

  • Load the Parquet data into Cloud Bigtable using Dataflow and query it from BigQuery through a Bigtable external connection.

  • Expose the bucket as a BigQuery external table and control access only through Cloud Storage IAM roles.

  • Create BigLake tables in BigQuery that reference the Parquet objects in Cloud Storage and apply row-level security policies on those tables.
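
For concreteness, here is a minimal sketch of the BigLake pattern described in the last option. It assumes a Cloud resource connection has already been created, and the project, dataset, bucket, connection, and group names are all hypothetical placeholders; it uses the google-cloud-bigquery Python client to run the DDL.

```python
from google.cloud import bigquery

# Hypothetical identifiers -- replace with your own resources.
PROJECT = "my-project"
DATASET = "sensor_lake"
CONNECTION = "my-project.us.gcs-biglake"  # pre-created Cloud resource connection
BUCKET_PREFIX = "gs://my-iot-bucket/readings"

client = bigquery.Client(project=PROJECT)

# Create a BigLake table over the partitioned Parquet files in Cloud Storage.
# The WITH CONNECTION clause is what makes this a BigLake table rather than a
# plain external table, enabling row-level security and metadata caching.
create_table = f"""
CREATE OR REPLACE EXTERNAL TABLE `{PROJECT}.{DATASET}.iot_readings`
WITH PARTITION COLUMNS
WITH CONNECTION `{CONNECTION}`
OPTIONS (
  format = 'PARQUET',
  hive_partition_uri_prefix = '{BUCKET_PREFIX}/',
  uris = ['{BUCKET_PREFIX}/*'],
  max_staleness = INTERVAL 4 HOUR,    -- acceptable metadata-cache staleness
  metadata_cache_mode = 'AUTOMATIC'   -- BigQuery refreshes the cache itself
)
"""
client.query(create_table).result()

# Apply a row-level access policy so analysts see only their plant's rows.
create_policy = f"""
CREATE OR REPLACE ROW ACCESS POLICY plant_a_only
ON `{PROJECT}.{DATASET}.iot_readings`
GRANT TO ('group:plant-a-analysts@example.com')
FILTER USING (plant_id = 'plant-a')
"""
client.query(create_policy).result()
```

With this in place, analysts query the table like any native BigQuery table: the Parquet files never leave Cloud Storage, the row filter is applied at query time, and access is logged through BigQuery's audit trail rather than per-bucket Cloud Storage IAM.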
