GCP Professional Cloud Security Engineer Practice Question

Your security team stores all audit logs in a central BigQuery dataset for threat hunting. An organization-level aggregated log sink has already been created with includeChildren=true and a filter of logName:"/cloudaudit.googleapis.com". Several project teams are starting to create their own project-level sinks that export logs to Pub/Sub for near-real-time alerting. The governance group wants to avoid duplicate BigQuery ingestion charges while still guaranteeing that every audit log entry from every project, including any entries not routed by a project sink, reaches the central dataset. Which configuration should you implement on the organization-level sink to meet these requirements?

  • Convert the sink to intercepting mode so it always copies logs before project sinks can export them.

  • Restrict the sink's filter to severity>=ERROR to reduce duplicates without changing interception behavior.

  • Keep includeChildren=true but set the sink as non-intercepting so it only exports entries not already captured by project sinks.

  • Disable includeChildren on the sink and ask every project to add the central BigQuery dataset as an additional destination.
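For reference, an organization-level aggregated sink of the kind described in the stem could be sketched with the gcloud CLI roughly as follows. The organization ID, sink name, destination project, and dataset name are placeholders, and the `--include-children` / `--intercept-children` flags are assumed from current gcloud behavior:

```shell
# Sketch: create an org-level aggregated sink that routes audit logs from
# every child project to a central BigQuery dataset. ORG_ID, PROJECT_ID,
# and central_audit_logs are illustrative placeholders.
gcloud logging sinks create central-audit-sink \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/central_audit_logs \
  --organization=ORG_ID \
  --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'

# Aggregated sinks are non-intercepting by default, so project-level sinks
# (e.g. the Pub/Sub exports) still see the same entries. Adding
# --intercept-children would instead make the sink intercepting, preventing
# matching entries from reaching sinks lower in the resource hierarchy.
```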

GCP Professional Cloud Security Engineer
Managing operations