GCP Professional Data Engineer Practice Question

A healthcare provider runs a regional pipeline that exports HL7v2 messages to Pub/Sub, processes them in near-real time with Dataflow Streaming Engine, writes temporary files to a Cloud Storage bucket, and loads curated records into BigQuery. A new compliance policy requires that every persisted copy of the data, including transient shuffle and Streaming Engine state, be encrypted at rest with a customer-managed Cloud KMS key that the security team rotates. The team wants the simplest design that meets this requirement. What should you recommend?

  • Keep the existing Dataflow job, set the --dataflowKmsKey pipeline option, and use a CMEK-protected Cloud Storage bucket for staging and temporary data.

  • Enable CMEK on the BigQuery dataset and configure Pub/Sub with a customer-managed key; rely on Dataflow's default at-rest encryption.

  • Re-implement the streaming transformation on a Dataproc cluster whose VM boot disks and all HDFS or Cloud Storage paths use the organization's CMEK.

  • Run the Dataflow job inside a VPC Service Controls perimeter and ensure TLS is used for Pub/Sub and BigQuery connections without changing at-rest encryption.
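The first option can be sketched with the gcloud CLI. This is a minimal illustration, assuming a classic-template launch; the project, region, bucket, key, and template paths are hypothetical placeholders:

```shell
# Hypothetical resource names; substitute your own project, bucket, and key ring.
KMS_KEY="projects/example-proj/locations/us-central1/keyRings/hipaa-ring/cryptoKeys/pipeline-key"

# CMEK-protected bucket for staging and temporary files:
# the bucket's default encryption key covers every object written to it.
gcloud storage buckets create gs://example-hl7-temp \
    --location=us-central1 \
    --default-encryption-key="$KMS_KEY"

# Launch the streaming job with a Dataflow CMEK so that job state,
# including shuffle and Streaming Engine data, is protected by the same key.
gcloud dataflow jobs run hl7-stream \
    --gcs-location=gs://example-templates/hl7-template \
    --region=us-central1 \
    --staging-location=gs://example-hl7-temp/staging \
    --dataflow-kms-key="$KMS_KEY" \
    --enable-streaming-engine
```

Note that the Dataflow service accounts must hold the Cloud KMS CryptoKey Encrypter/Decrypter role on the key, and the key must reside in the same region as the job.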

Ingesting and processing the data