Monitoring, Optimization, and Security (GCP PDE) Flashcards
GCP Professional Data Engineer Flashcards
| Front | Back |
|-------|------|
| How can you ensure sensitive data is not exposed in logs | Use log exclusions and redact sensitive data in Cloud Logging |
| How can you improve Spark job performance in Dataproc | Tune executor memory and use dynamic allocation |
| How can you optimize query performance in BigQuery | Use partitioned and clustered tables |
| How can you reduce costs in a data processing environment by optimizing storage usage | Use lower-cost storage tiers like Coldline or Archive for infrequently accessed data |
| How does Cloud Armor help secure data workflows | Protects against DDoS attacks and enforces security policies at the edge |
| How does Cloud Logging help with security | Logs access and actions for auditing purposes |
| What does the BigQuery reservation model help optimize | Cost efficiency for workloads with predictable query patterns |
| What GCP feature allows you to manage access and permissions for resources | IAM (Identity and Access Management) |
| What GCP feature can help you set spending limits and avoid unexpected costs | Budget alerts and quotas |
| What GCP feature enables automatic adjustment of processing resources to match workload demands | Autoscaling |
| What GCP practice helps reduce data egress costs | Store data closer to the region where it will be processed or consumed |
| What GCP service allows analysis of logs for troubleshooting and auditing purposes | Cloud Logging |
| What GCP service can inspect, classify, and redact sensitive data in your workflows | Data Loss Prevention API (DLP API) |
| What GCP tool helps to visualize system performance and bottlenecks in real time | Cloud Monitoring dashboards |
| What is a cost-saving technique for managing idle resources | Use preemptible VMs or automate resource shutoff during low usage |
| What is the best practice for setting up alerts for anomalies in workflows | Configure alerting policies in Cloud Monitoring |
| What is the main advantage of using Regional buckets over Multi-Regional buckets | Lower cost and latency for region-specific workloads |
| What is the purpose of a Service Account in GCP | Provide applications or VM instances with identities for accessing resources securely |
| What practice should you follow to ensure secure data transmission in GCP | Use encryption in transit with TLS |
| What service enables centralized log export and analysis across multiple projects | Log sinks in Cloud Logging |
| What tool automatically identifies anomalous patterns in metric data in GCP | Cloud Monitoring's Anomaly Detection feature |
| What tool in GCP is used for monitoring resource metrics and creating dashboards | Cloud Monitoring |
| Why is enabling Audit Logs important for cloud resources | Tracks who did what, when, and where for security and compliance |
| Why should you audit IAM role assignments regularly | To ensure the principle of least privilege is maintained |
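The cards above are the deck's substance; the sketches below illustrate a handful of them using the Python client libraries. All project, dataset, bucket, and resource names in these sketches are placeholders, not values from the deck. First, for the BigQuery query-performance card, a minimal sketch of creating a partitioned and clustered table with google-cloud-bigquery:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table ID and schema for illustration.
table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("region", "STRING"),
    ],
)

# Partition by day on the timestamp column so queries that filter on
# event_ts scan only the matching partitions.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)

# Cluster within each partition so filters on these columns prune blocks.
table.clustering_fields = ["customer_id", "region"]

client.create_table(table)
```

Queries only benefit when they filter on the partitioning column (and ideally the clustering columns), so the choice of event_ts, customer_id, and region should mirror real query patterns.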
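For the Dataproc Spark card, a sketch of submitting a job with executor memory and dynamic allocation set through Spark properties; the cluster, jar, and main class are assumed names, and the right memory value depends on the cluster's machine types:

```python
from google.cloud import dataproc_v1

region = "us-central1"
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "my-cluster"},       # placeholder cluster
    "spark_job": {
        "main_class": "com.example.Pipeline",          # placeholder entry point
        "jar_file_uris": ["gs://my-bucket/pipeline.jar"],
        "properties": {
            # More headroom per executor for shuffle-heavy stages.
            "spark.executor.memory": "6g",
            # Let Spark add and remove executors with the workload.
            "spark.dynamicAllocation.enabled": "true",
            "spark.dynamicAllocation.maxExecutors": "50",
        },
    },
}

operation = client.submit_job_as_operation(
    request={"project_id": "my-project", "region": region, "job": job}
)
operation.result()  # blocks until the job finishes
```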
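For the storage-cost card, a sketch that attaches lifecycle rules moving objects to Coldline and then Archive as they age; the bucket name and age thresholds are assumptions to adapt to actual access patterns:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-data-lake")  # placeholder bucket

# Objects stay readable in every tier; only storage price, retrieval
# cost, and minimum storage duration change as they move down tiers.
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=90)
bucket.add_lifecycle_set_storage_class_rule("ARCHIVE", age=365)
bucket.patch()
```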
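For the DLP card, a sketch of de-identifying text with google-cloud-dlp before it is written anywhere (logs included); the project ID and infotype list are assumptions:

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

response = dlp.deidentify_content(
    request={
        "parent": "projects/my-project",  # placeholder project
        # Detect these infotypes in the text...
        "inspect_config": {
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}]
        },
        # ...and replace each match with its infotype name.
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {"primitive_transformation": {"replace_with_info_type_config": {}}}
                ]
            }
        },
        "item": {"value": "Contact jane@example.com for access"},
    }
)

print(response.item.value)  # e.g. "Contact [EMAIL_ADDRESS] for access"
```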
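For the centralized log export card, a sketch creating a sink that routes matching entries to a BigQuery dataset. The names are placeholders, and true cross-project aggregation uses an organization- or folder-level aggregated sink, which this project-level example only approximates:

```python
from google.cloud import logging as cloud_logging

client = cloud_logging.Client(project="my-project")  # placeholder project

# Route WARNING-and-above entries to BigQuery for SQL-based analysis.
sink = client.sink(
    "central-analysis-sink",
    filter_="severity>=WARNING",
    destination="bigquery.googleapis.com/projects/my-project/datasets/logs",
)
sink.create()
```

After creation, the sink's writer identity still needs write access on the destination dataset before entries flow.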
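For the alerting card, a sketch of an alerting policy with google-cloud-monitoring that fires when instance CPU utilization stays above 80% for five minutes; the metric filter and threshold are illustrative choices:

```python
from google.cloud import monitoring_v3
from google.protobuf import duration_pb2

client = monitoring_v3.AlertPolicyServiceClient()

policy = monitoring_v3.AlertPolicy(
    display_name="High CPU (illustrative)",
    combiner=monitoring_v3.AlertPolicy.ConditionCombinerType.OR,
    conditions=[
        monitoring_v3.AlertPolicy.Condition(
            display_name="CPU > 80% for 5 min",
            condition_threshold=monitoring_v3.AlertPolicy.Condition.MetricThreshold(
                filter=(
                    'resource.type = "gce_instance" AND '
                    'metric.type = "compute.googleapis.com/instance/cpu/utilization"'
                ),
                comparison=monitoring_v3.ComparisonType.COMPARISON_GT,
                threshold_value=0.8,
                # Must stay above threshold this long before firing.
                duration=duration_pb2.Duration(seconds=300),
            ),
        )
    ],
)

client.create_alert_policy(
    name="projects/my-project",  # placeholder project
    alert_policy=policy,
)
```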
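Finally, for the IAM-audit card, a sketch that lists a project's role bindings with google-cloud-resource-manager and flags the broad primitive roles that most often violate least privilege; the project ID is a placeholder:

```python
from google.cloud import resourcemanager_v3

client = resourcemanager_v3.ProjectsClient()
policy = client.get_iam_policy(
    request={"resource": "projects/my-project"}  # placeholder project
)

# Primitive roles grant far more than most principals need.
BROAD_ROLES = {"roles/owner", "roles/editor"}

for binding in policy.bindings:
    flag = "  <-- review" if binding.role in BROAD_ROLES else ""
    print(f"{binding.role}: {list(binding.members)}{flag}")
```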
This deck focuses on monitoring data workflows with Cloud Monitoring and Cloud Logging, optimizing costs/performance, and implementing security best practices in data processing environments.