GCP Professional Data Engineer Practice Question

Your organization stores clickstream logs as Parquet files in Cloud Storage and exposes them in BigQuery as external tables. A partner company needs SQL access to these logs for its own BI workflows, but you must be certain the partner cannot read or copy the underlying Cloud Storage objects, and you want to avoid duplicating the data. Using Analytics Hub, what is the most appropriate way to publish the data to the partner?

  • Schedule a daily job that copies the external tables into managed BigQuery tables in a new dataset and share that dataset with the partner through Analytics Hub.

  • Convert the external tables to BigLake tables, add them to a private data exchange listing, and grant the partner only the analyticshub.subscriber role so it can query the linked dataset.

  • Export the Parquet files to a separate Cloud Storage bucket and distribute signed URLs to the partner instead of using Analytics Hub.

  • Publish the existing external tables in a public data exchange and rely on Cloud Storage object-level ACLs to restrict who can download the files.
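
For reference, the BigLake-plus-Analytics-Hub approach described in the options comes down to two steps: recreate the external tables as BigLake tables backed by a Cloud Storage connection, then publish the dataset as a private data exchange listing that the partner subscribes to with only the Analytics Hub Subscriber role. The sketch below shows the first step with the Python BigQuery client; the project, dataset, connection, and bucket names are placeholders for illustration, not values taken from the question.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical identifiers -- substitute your own project, dataset,
# connection, and bucket; none of these come from the question itself.
PROJECT = "my-project"
DATASET = "clickstream"
CONNECTION = "my-project.us.gcs-conn"      # BigQuery connection to Cloud Storage
BUCKET_URI = "gs://clickstream-logs/*.parquet"

client = bigquery.Client(project=PROJECT)

# Step 1: recreate the external table as a BigLake table. The WITH CONNECTION
# clause lets BigQuery read the Parquet files with the connection's service
# account, so subscribers query the table without ever receiving (or needing)
# permissions on the underlying bucket.
ddl = f"""
CREATE OR REPLACE EXTERNAL TABLE `{PROJECT}.{DATASET}.clickstream_logs`
WITH CONNECTION `{CONNECTION}`
OPTIONS (
  format = 'PARQUET',
  uris = ['{BUCKET_URI}']
)
"""
client.query(ddl).result()

# Step 2 (not shown here): add the dataset to a private Analytics Hub data
# exchange as a listing, then grant the partner roles/analyticshub.subscriber
# on that listing so they can create a read-only linked dataset in their project.
```

Because the partner only ever queries the linked dataset, the Cloud Storage permissions stay with the connection's service account rather than with the partner, and no copy of the Parquet data is created.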

GCP Professional Data Engineer
Preparing and using data for analysis