Your organization runs Dataflow streaming jobs orchestrated by Cloud Composer in four separate Google Cloud projects. Compliance mandates that all pipeline-related logs, including Dataflow worker logs, Composer/Airflow logs, and any custom application logs, be retained for exactly two years, remain searchable in the Cloud Console's Logs Explorer, and be isolated from application teams so that only the central Site Reliability Engineering (SRE) team can access them. The SRE team wants the simplest solution that avoids running additional infrastructure or duplicating log data across services. Which logging architecture best meets these requirements?
Create an aggregated log sink at the organization (or folder) level that routes every project's logs to a dedicated Cloud Logging bucket in an operations project. Set the bucket's retention to 730 days and grant only the SRE group Viewer access to that bucket.
Configure an aggregated log sink that exports all logs to a BigQuery dataset in the operations project, set table partition expiration to 730 days, and grant the SRE group BigQuery access.
In each project, export logs to a Nearline Cloud Storage bucket with a 730-day lifecycle policy; instruct the SRE team to analyze the exported logs with Cloud Storage Logs Insights.
Increase the default retention of the _Default log bucket to 730 days in every project and grant the SRE group the Logs Viewer role on each project.
A centralized Cloud Logging bucket hosted in a dedicated operations project satisfies every stated requirement. An aggregated log sink created at the organization (or folder) level can route all logs from the four application projects into this single bucket, eliminating the need to deploy collectors or other infrastructure. Log buckets natively support custom retention periods of up to 3,650 days, so setting the bucket's retention to 730 days meets the two-year mandate without external storage. Because the logs stay inside Cloud Logging, they remain fully searchable through the Logs Explorer, and IAM policies applied to the centralized bucket restrict visibility to only the SRE team.
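As a rough sketch of the setup (the organization ID, the operations project ops-logging-prj, the bucket and sink names, the writer identity, and the SRE group address below are all illustrative placeholders, not values from the scenario), the whole architecture can be configured with a handful of gcloud commands:

    # 1. Create a dedicated log bucket in the operations project with
    #    two-year (730-day) retention.
    gcloud logging buckets create central-pipeline-logs \
        --project=ops-logging-prj \
        --location=global \
        --retention-days=730

    # 2. Create an aggregated sink at the organization level that routes
    #    matching logs from every child project into that bucket. The
    #    filter shown is illustrative; broaden it as needed to capture
    #    custom application logs.
    gcloud logging sinks create pipeline-logs-sink \
        logging.googleapis.com/projects/ops-logging-prj/locations/global/buckets/central-pipeline-logs \
        --organization=123456789012 \
        --include-children \
        --log-filter='resource.type="dataflow_step" OR resource.type="cloud_composer_environment"'

    # 3. Grant the sink's writer identity (reported by the create command,
    #    or by: gcloud logging sinks describe pipeline-logs-sink
    #    --organization=123456789012) permission to write into the bucket.
    gcloud projects add-iam-policy-binding ops-logging-prj \
        --member='serviceAccount:WRITER_IDENTITY_FROM_SINK' \
        --role='roles/logging.bucketWriter'

    # 4. Give only the SRE group read access to logs in the operations
    #    project.
    gcloud projects add-iam-policy-binding ops-logging-prj \
        --member='group:sre-team@example.com' \
        --role='roles/logging.viewer'

Because application teams hold no IAM roles in the operations project, they cannot see the centralized bucket; if finer-grained isolation is ever needed, log views on the bucket combined with the roles/logging.viewAccessor role can scope access further.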
Sending logs to BigQuery would satisfy retention but moves search into BigQuery rather than the Logs Explorer, and it incurs additional storage and query costs. Cloud Storage exports are not indexed in the Logs Explorer, so they break the searchability requirement. Extending each project's _Default bucket leaves the logs inside the application projects, where project-level viewers can still read them, so it neither centralizes control nor isolates access. Therefore, routing all project logs to a centrally managed log bucket with extended retention is the correct solution.