GCP Professional Cloud Security Engineer Practice Question

A security team created an aggregated sink at the prod folder level with includeChildren set to true. The sink exports all audit logs generated in that folder to a BigQuery dataset that resides in a dedicated security-project. Soon after activation, the dataset begins receiving duplicate log entries that originate from the security-project itself, even though it is not part of the prod folder. You must continue exporting every audit log produced by projects in the prod folder while avoiding re-exporting any log entries that originate in the security-project. Which change solves the problem with the least ongoing maintenance effort?

  • Reconfigure the existing folder-level aggregated sink to operate in non-intercepting mode so it ignores log entries that have already been exported to another sink.

  • Edit the sink filter to add a condition that excludes any entries where resource.labels.project_id equals security-project.

  • Delete the aggregated sink and create individual project sinks in every prod project, each exporting to the BigQuery dataset.

  • Move the sink to the organization root and add an exclusion filter for logName:projects/security-project.
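One of the options above amends the sink's filter to exclude entries whose originating project is the security-project. A minimal sketch of what that could look like with the gcloud CLI — the sink name, folder ID, and exact filter string are hypothetical and would need to match your environment:

```shell
# Hypothetical names — substitute your own sink name and folder ID.
SINK_NAME="prod-audit-sink"
FOLDER_ID="123456789"

# Keep audit-log entries but drop anything originating in security-project.
FILTER='logName:"cloudaudit.googleapis.com" AND resource.labels.project_id != "security-project"'

# Update the existing folder-level aggregated sink in place
# (requires roles/logging.configWriter on the folder). Shown commented
# out here since it needs a live GCP project:
# gcloud logging sinks update "$SINK_NAME" \
#   --folder="$FOLDER_ID" \
#   --log-filter="$FILTER"

echo "$FILTER"
```

Because the exclusion lives in the one folder-level sink, no per-project configuration needs to be created or maintained as projects are added to the prod folder.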

GCP Professional Cloud Security Engineer
Managing operations