GCP Professional Data Engineer Practice Question

Your analytics team has created an external BigQuery table that references Parquet files stored in a Cloud Storage bucket owned by your project. You must make this dataset available to an external partner through an Analytics Hub listing so they can query the data from their own project. Corporate policy strictly prohibits granting the partner any IAM role on the Cloud Storage bucket, and you want to avoid copying the data into native BigQuery tables. Which approach meets these requirements?

  • Generate signed URLs for the Cloud Storage objects and send them to the partner so they can create their own external table in their project.

  • Publish the existing external table in an Analytics Hub listing and rely on BigQuery's default proxy access without granting any additional permissions.

  • Schedule a daily batch job that loads the Parquet files into a native BigQuery table, then share that table through Analytics Hub with read permissions.

  • Convert the external table into a BigLake table, publish the dataset in an Analytics Hub listing, and grant the partner BigQuery read permissions on the shared table.
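The BigLake approach in the last option works because the table is read through a Cloud Resource connection: the connection's service account is the identity that reads the Parquet files, so the partner only ever needs BigQuery-level read access, never an IAM role on the bucket. As a non-authoritative sketch (all project, dataset, connection, and bucket names below are hypothetical), the conversion boils down to re-creating the external table with a `WITH CONNECTION` clause:

```python
# Sketch: build the BigQuery DDL that turns a plain external Parquet table
# into a BigLake table by attaching a Cloud Resource connection. All
# identifiers passed in below are illustrative placeholders, not values
# from the question.

def biglake_ddl(dataset: str, table: str, connection: str, uri: str) -> str:
    """Return CREATE EXTERNAL TABLE DDL using WITH CONNECTION (BigLake)."""
    return (
        f"CREATE OR REPLACE EXTERNAL TABLE `{dataset}.{table}`\n"
        f"WITH CONNECTION `{connection}`\n"
        "OPTIONS (\n"
        "  format = 'PARQUET',\n"
        f"  uris = ['{uri}']\n"
        ");"
    )

ddl = biglake_ddl(
    dataset="analytics",                     # hypothetical dataset
    table="events_biglake",                  # hypothetical table name
    connection="my-project.us.gcs-conn",     # hypothetical connection ID
    uri="gs://my-bucket/events/*.parquet",   # hypothetical bucket path
)
print(ddl)
```

After running DDL like this (and granting the connection's service account `roles/storage.objectViewer` on the bucket), the dataset can be published as an Analytics Hub listing and the partner subscribes from their own project.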

Exam: GCP Professional Data Engineer. Objective: Preparing and using data for analysis.