
GCP Professional Data Engineer Practice Question

Your team is building a Dataflow streaming pipeline that ingests point-of-sale events from Cloud Pub/Sub. The dashboard must show the total revenue for every strict, non-overlapping 60-second interval aligned to the wall-clock minute (for example, 10:05:00-10:05:59). Late events that arrive up to 30 seconds after the interval closes must update the previously reported total, and no results should be emitted before the 60-second interval is complete. Which Apache Beam windowing and trigger configuration best meets these requirements with minimal complexity?

  • Use a 60-second fixed window with the default trigger and zero allowed lateness so that late events are discarded automatically.

  • Use a 60-second sliding window with a 30-second hop and an AfterProcessingTime trigger that fires every 10 seconds to refresh the dashboard frequently.

  • Place all events in a global window, add an AfterProcessingTime trigger that fires every 60 seconds, and manually clear state after each firing.

  • Use a 60-second fixed window aligned to minute boundaries, apply an AfterWatermark.pastEndOfWindow() trigger with 30 seconds of allowed lateness, and set the accumulation mode to ACCUMULATING_FIRED_PANES.
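The semantics the question asks for — strict 60-second fixed windows, results emitted only once the window is complete, and late arrivals within 30 seconds updating the already-emitted total (accumulating panes) — can be illustrated with a toy, in-memory model. This is a sketch only: the window-assignment arithmetic and lateness check mirror what Apache Beam's `FixedWindows`, `AfterWatermark` trigger, and `allowed_lateness` setting do, but the `aggregate` helper and its event-tuple format are invented here for illustration, not part of any Beam API.

```python
WINDOW_SIZE = 60       # strict, non-overlapping 60-second intervals
ALLOWED_LATENESS = 30  # late events accepted up to 30 s after the window closes

def window_start(event_ts):
    """Fixed windows aligned to wall-clock minute boundaries:
    an event at 10:05:37 lands in the [10:05:00, 10:06:00) window."""
    return (event_ts // WINDOW_SIZE) * WINDOW_SIZE

def aggregate(events):
    """events: list of (event_time, revenue, arrival_time) tuples.

    Returns {window_start: total_revenue}. An event counts only if it
    arrives no later than ALLOWED_LATENESS after its window closes;
    accumulating mode means a late event updates the previously
    reported total rather than producing a separate delta."""
    totals = {}
    for event_ts, revenue, arrival_ts in events:
        start = window_start(event_ts)
        window_close = start + WINDOW_SIZE
        if arrival_ts <= window_close + ALLOWED_LATENESS:
            totals[start] = totals.get(start, 0) + revenue
        # else: beyond allowed lateness, the event is dropped
    return totals

events = [
    (600, 10.0, 601),  # on time, window [600, 660)
    (630, 5.0, 659),   # on time, same window
    (640, 2.0, 680),   # 20 s late -> within allowed lateness, updates total
    (650, 99.0, 700),  # 40 s late -> beyond allowed lateness, dropped
]
print(aggregate(events))  # -> {600: 17.0}
```

In a real pipeline, a watermark (not wall-clock arrival time) decides when the window closes and when the lateness horizon passes; the model above collapses that into a simple arrival-time comparison to keep the example self-contained.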

Exam: GCP Professional Data Engineer
Objective: Ingesting and processing the data