GCP Professional Data Engineer Practice Question

A manufacturing company streams millisecond-level telemetry (temperature, pressure, event_timestamp, machine_id) from thousands of machines into BigQuery. Analysts must calculate mean time between failures per plant and quarter, enrich results with machine specifications (model, install_date) and plant attributes (country, business_unit), and frequently filter on event_timestamp ranges. To balance query performance, storage cost, and the ability to update machine or plant attributes without rewriting historical events, how should you map these business requirements to a warehouse data model?

  • Store telemetry in a fact table partitioned on event_timestamp and clustered by machine_id; place machine specifications and plant attributes in separate dimension tables referenced through surrogate keys.

  • Use a single table where machine and plant metadata are stored as nested STRUCTs inside every telemetry event record.

  • Create a single denormalized table that repeats all machine and plant attributes in every telemetry row to eliminate joins.

  • Model machine specifications as the fact table and treat each telemetry event as a dimension record linked to it.
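For reference on the syntax the first option alludes to: in BigQuery, partitioning a table on a timestamp column and clustering it by an identifier is declared directly in the DDL. The sketch below is illustrative only — dataset, table, and column names (`plant_analytics`, `telemetry_fact`, `machine_dim`) are hypothetical, not from the question.

```sql
-- Hypothetical sketch of a partitioned, clustered fact table plus one
-- dimension table; names are illustrative, not prescribed by the question.
CREATE TABLE plant_analytics.telemetry_fact (
  event_timestamp TIMESTAMP NOT NULL,
  machine_id      STRING NOT NULL,    -- surrogate key into machine_dim
  temperature     FLOAT64,
  pressure        FLOAT64
)
PARTITION BY DATE(event_timestamp)    -- lets timestamp-range filters prune partitions
CLUSTER BY machine_id;                -- co-locates rows sharing a machine_id

CREATE TABLE plant_analytics.machine_dim (
  machine_id   STRING NOT NULL,
  model        STRING,
  install_date DATE,
  plant_id     STRING                 -- surrogate key into a plant dimension
);
```

Because the dimension rows live outside the fact table, a machine's model or a plant's business_unit can be updated in one place without rewriting historical telemetry partitions.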

Objective: Storing the data