GCP Professional Data Engineer Practice Question

A smart-city analytics team ingests billions of JSON sensor readings each day through Pub/Sub and immediately writes them to a raw staging location. Compliance rules require that the unmodified records be retained for five years, with older data automatically moved to colder, less expensive storage classes. Engineers will later run Dataflow jobs that cleanse the data and load curated subsets into BigQuery on demand. Which sink best satisfies the retention, cost, and future-processing requirements for the raw data layer?

  • Persist the records in a wide-column Cloud Bigtable instance

  • Insert the records into a partitioned BigQuery table using streaming inserts

  • Store the records as objects in a Cloud Storage bucket with lifecycle rules

  • Load the records into a Cloud SQL PostgreSQL database and enable point-in-time recovery
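The "lifecycle rules" the scenario calls for can be expressed as a bucket lifecycle policy. Below is a minimal sketch of such a policy as JSON; the bucket name, the 30-day/365-day tiering thresholds, and the storage-class choices are illustrative assumptions, not details from the question — only the five-year (≈1825-day) retention horizon comes from the scenario.

```python
import json

# Hypothetical lifecycle policy for a raw staging bucket.
# Assumed tiering schedule (not specified in the question):
#   - after 30 days, transition objects to Nearline
#   - after 365 days, transition objects to Coldline
#   - after 5 years (~1825 days), delete the raw records
lifecycle_policy = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
            "condition": {"age": 30},
        },
        {
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 365},
        },
        {
            "action": {"type": "Delete"},
            "condition": {"age": 1825},
        },
    ]
}

# Print the policy; saved to a file, it could be applied with e.g.:
#   gsutil lifecycle set lifecycle.json gs://raw-sensor-staging
print(json.dumps(lifecycle_policy, indent=2))
```

Because the rules live on the bucket itself, objects age into cheaper storage classes automatically, with no scheduled jobs to maintain.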

Exam domain: Ingesting and processing the data