GCP Professional Data Engineer Practice Question

A retail company needs to redesign its IoT ingestion layer. Millions of sensors worldwide will stream small JSON readings every few seconds. The solution must:

  • accept HTTPS pushes with end-to-end latency under 100 ms,
  • automatically scale without capacity planning,
  • survive regional outages while retaining undelivered messages for up to 7 days, and
  • fan out each event to several independent Dataflow streaming pipelines for further processing.

Which service should you recommend as the primary ingestion point?

  • Upload each JSON reading to Cloud Storage via signed URLs and trigger Cloud Functions that forward data to downstream pipelines.

  • Post sensor readings to a regional Cloud Run service that writes directly into Cloud Spanner for later processing.

  • Publish sensor events to a Cloud Pub/Sub standard topic and let each Dataflow pipeline consume from its own subscription.

  • Deploy an Apache Kafka cluster on Cloud Dataproc to ingest events and have Dataflow read from Kafka topics.
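
For reference, the Pub/Sub option describes the standard fan-out pattern on Google Cloud: every subscription attached to a topic receives its own complete copy of the stream, so each Dataflow pipeline can consume independently. Below is a minimal sketch of that pattern using the google-cloud-pubsub client library; the project, topic, and subscription names are placeholders, not values taken from the question.

```python
import json
import time

from google.cloud import pubsub_v1

PROJECT_ID = "my-project"        # placeholder project ID
TOPIC_ID = "sensor-readings"     # placeholder topic name

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# One subscription per downstream pipeline; each receives a full,
# independent copy of every message published to the topic.
subscriber = pubsub_v1.SubscriberClient()
for sub_id in ("enrichment-pipeline", "alerting-pipeline"):
    sub_path = subscriber.subscription_path(PROJECT_ID, sub_id)
    subscriber.create_subscription(
        request={"name": sub_path, "topic": topic_path}
    )

# Publish one small JSON reading, as a sensor (or an HTTPS frontend
# acting on its behalf) would. Unacknowledged messages are retained
# per the subscription's retention setting, configurable up to 7 days.
reading = {"sensor_id": "s-001", "temp_c": 21.4, "ts": time.time()}
future = publisher.publish(topic_path, json.dumps(reading).encode("utf-8"))
print(f"Published message ID: {future.result()}")
```

Each Dataflow pipeline would then read from its own subscription, e.g. via apache_beam.io.ReadFromPubSub(subscription=...), which is what allows several pipelines to process the same event stream without interfering with one another.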
