AWS Cloud Practitioner Practice Test (CLF-C02)
Use the form below to configure your AWS Cloud Practitioner Practice Test (CLF-C02). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.

AWS Cloud Practitioner CLF-C02 Information
The AWS Certified Cloud Practitioner (CLF-C02) is an entry-level certification for individuals looking to understand the fundamentals of Amazon Web Services (AWS). This exam is designed for both technical and non-technical professionals who need a general understanding of cloud computing and AWS services. It does not require prior cloud experience, making it an ideal starting point for those new to the field.
Exam Overview
The CLF-C02 exam consists of 65 multiple-choice and multiple-response questions. It is a 90-minute test that costs $100 USD. The exam is scored on a scale of 100 to 1,000, with a minimum passing score of 700. It is available in multiple languages, including English, Japanese, Korean, and Simplified Chinese.
Exam Objectives
This exam covers four domains: Cloud Concepts; Security and Compliance; Cloud Technology and Services; and Billing, Pricing, and Support. Cloud Concepts includes the benefits of cloud computing, the AWS global infrastructure, and the shared responsibility model. Security and Compliance focuses on AWS Identity and Access Management (IAM), compliance programs, and security best practices. Cloud Technology and Services tests knowledge of AWS compute, storage, networking, and databases, as well as the AWS Well-Architected Framework and serverless computing. Finally, Billing, Pricing, and Support covers AWS pricing models, total cost of ownership, and AWS support plans.
Who Should Take This Exam?
The AWS Certified Cloud Practitioner certification is suitable for individuals who want to learn about cloud computing and its business applications. It is beneficial for professionals in sales, finance, project management, and other roles that interact with cloud technology. It is also useful for those planning to pursue advanced AWS certifications.
How to Prepare
To prepare for the CLF-C02 exam, candidates should review the AWS Certified Cloud Practitioner Exam Guide and take advantage of AWS’s Free Tier for hands-on experience. AWS also offers training through its Skill Builder platform, which includes practice questions and study materials. Additionally, taking practice exams can help candidates identify areas where they need improvement.
Summary
The AWS Certified Cloud Practitioner (CLF-C02) is a foundational certification that validates a broad understanding of AWS services and cloud concepts. It is an excellent starting point for professionals who want to build cloud expertise or advance in cloud-related careers.
Free AWS Cloud Practitioner CLF-C02 Practice Test
Press start when you are ready, or press Change to modify any settings for the practice test.
- Questions: 15
- Time: Unlimited
- Included Topics: Cloud Concepts; Security and Compliance; Cloud Technology and Services; Billing, Pricing, and Support
A startup is looking to itemize expenditures associated with their various development initiatives to optimize resource allocation and financial reporting. Which approach should they adopt to enhance their capability to sort and filter expenditure data by specific projects?
- Rely on the initial price estimation tools to determine ongoing project expenses retroactively.
- Establish a combined budget tracking system for all projects within their cost management dashboard.
- Enable detailed billing features to categorize expenses using metadata tags pertaining to each project.
- Consolidate their billing accounts under a central management account to break down costs by department.
Answer Description
By activating cost allocation tags and tagging resources, the startup can categorize and track their spending in finer detail. This practice enables the organization to attribute costs to specific projects or teams, facilitating precise analysis and budgetary control. The ability to accurately assign and report expenses is crucial for companies to make informed decisions about resource utilization and cost-saving opportunities.
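To make this concrete, here is a minimal Python (boto3) sketch that groups a month of spending by a hypothetical Project cost allocation tag using the Cost Explorer API. The tag key and dates are assumptions, and the tag must already be activated as a cost allocation tag in the Billing console before it appears in results.

```python
import boto3

# Query Cost Explorer for one month of costs, grouped by the (hypothetical)
# "Project" tag. The tag must be activated as a cost allocation tag first.
ce = boto3.client("ce")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "Project"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    tag_value = group["Keys"][0]  # e.g. "Project$mobile-app"
    cost = group["Metrics"]["UnblendedCost"]["Amount"]
    print(f"{tag_value}: ${float(cost):.2f}")
```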
Costs for on-premises environments typically include physical hardware, facility expenses, and utilities regardless of the current demand for resources.
- False
- True
Answer Description
The statement is correct. On-premises environments usually involve significant upfront capital expenditure for hardware purchase, as well as ongoing costs for maintaining the physical space, such as rent or real estate, power, and cooling. These costs are incurred even if the demand for resources fluctuates, leading to potential inefficiencies when resources are underutilized.
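A toy calculation makes the point. The figures below are invented purely for illustration: a fixed on-premises bill stays constant while a pay-per-use bill tracks actual utilization.

```python
# Invented numbers for illustration only.
ON_PREM_MONTHLY = 10_000.00      # fixed: hardware amortization, space, power, cooling
CLOUD_RATE_PER_HOUR = 13.70      # pay-as-you-go rate for equivalent capacity
HOURS_PER_MONTH = 730

for utilization in (0.25, 0.50, 1.00):
    cloud_cost = CLOUD_RATE_PER_HOUR * HOURS_PER_MONTH * utilization
    print(f"{utilization:4.0%} utilized: on-premises ${ON_PREM_MONTHLY:8,.0f}  vs  cloud ${cloud_cost:8,.0f}")
```

At full utilization the two models can be comparable, but at low utilization the on-premises bill does not shrink, which is exactly the inefficiency the statement describes.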
Which deployment model combines cloud-based resources with on-premises infrastructure?
- Hybrid cloud
- Public cloud
- Private cloud
- Community cloud
Answer Description
The hybrid deployment model integrates both cloud-based resources and on-premises infrastructure, offering flexibility and enabling various use cases, such as gradual cloud migration, disaster recovery, and compliance with regulatory requirements.
Which service provides a physical device suitable for easing the transfer of voluminous data sets into the cloud to expedite the migration process?
- Snowcone
- DataSync
- Migration Hub
- Snowball
- Simple Storage Service (S3)
- Glue
Answer Description
The 'Snowball' device is the correct answer because it's purpose-built to physically transport substantial quantities of data into and out of the cloud infrastructure. This service is particularly useful when dealing with limitations in internet bandwidth or when transferring data would otherwise be time-consuming or costly. The 'Snowcone' device is more compact and suited for edge computing and data transfer jobs of a smaller scale. 'DataSync' is a service that facilitates online data transfers, without providing a physical transport solution. The 'Migration Hub' is more of a management tool to track applications during the migration but doesn't offer data transport. 'Simple Storage Service (S3)' does provide cloud storage but doesn't include a physical data transfer solution. 'Glue' is for data integration and does not involve physical data transfer devices.
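As a rough sketch of how such a transfer is set in motion programmatically, the Python (boto3) snippet below creates a Snowball import job. Every identifier here (address ID, role ARN, bucket ARN) is a hypothetical placeholder for resources that must exist beforehand, and the parameter choices are illustrative rather than prescriptive.

```python
import boto3

snowball = boto3.client("snowball")

# All IDs and ARNs below are hypothetical placeholders.
response = snowball.create_job(
    JobType="IMPORT",  # move data INTO AWS; "EXPORT" goes the other way
    Resources={
        "S3Resources": [
            {"BucketArn": "arn:aws:s3:::example-migration-bucket"}
        ]
    },
    AddressId="ADID-00000000-0000-0000-0000-000000000000",  # registered shipping address
    RoleARN="arn:aws:iam::123456789012:role/SnowballImportRole",
    SnowballType="STANDARD",
    SnowballCapacityPreference="T80",
    ShippingOption="SECOND_DAY",
)
print("Snowball job created:", response["JobId"])
```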
Given a scenario where a company's database workload has a highly variable query pattern with unpredictable, large spikes in read traffic, which database hosting option would allow for cost-efficient scalability?
- Deploying a self-hosted database on Amazon EC2 with reserved instances to handle peak loads
- Using DynamoDB with on-demand capacity mode for handling unpredictable workloads
- AWS Aurora with serverless deployment
- Creating multiple Amazon RDS Read Replicas to manage the read traffic spikes
Answer Description
AWS Aurora's serverless option automatically adjusts the database capacity in fine-grained increments to match the workload demand, which allows handling unpredictable workload spikes in a cost-efficient manner without the need for provisioning and managing scaling operations. Unlike a self-hosted database on Amazon EC2 or using read replicas, which requires manual intervention or planning for maximum capacity, Aurora Serverless provides an on-demand auto-scaling capability.
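The sketch below shows, in Python (boto3), roughly how an Aurora Serverless v2 cluster might be created with a scaling range, so capacity floats between a floor and a ceiling as query load changes. The identifiers and the engine version string are assumptions for illustration.

```python
import boto3

rds = boto3.client("rds")

# Identifiers and the engine version are hypothetical/illustrative.
rds.create_db_cluster(
    DBClusterIdentifier="example-serverless-cluster",
    Engine="aurora-mysql",
    EngineVersion="8.0.mysql_aurora.3.05.2",  # a Serverless v2-capable version (check current ones)
    MasterUsername="admin",
    ManageMasterUserPassword=True,            # store the password in Secrets Manager
    ServerlessV2ScalingConfiguration={
        "MinCapacity": 0.5,    # Aurora Capacity Units when the workload is quiet
        "MaxCapacity": 32.0,   # headroom for unpredictable read spikes
    },
)

# Serverless v2 capacity is attached via a "db.serverless" instance in the cluster.
rds.create_db_instance(
    DBInstanceIdentifier="example-serverless-instance",
    DBClusterIdentifier="example-serverless-cluster",
    DBInstanceClass="db.serverless",
    Engine="aurora-mysql",
)
```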
A global retail chain is facing challenges managing resource provisioning to handle diverse customer demands during seasonal peaks and promotional events. Which advantage of cloud computing directly alleviates this concern by scaling resources in response to traffic fluctuations?
- Dedicated hosting environments
- Elasticity
- Service homogeneity
- Single-tenancy infrastructure
Answer Description
Elasticity is a key advantage of cloud computing that allows systems to dynamically scale resources according to fluctuating workloads. This feature is particularly useful for a retail business experiencing variable demand levels, as it ensures that their customer-facing applications can handle spikes in usage during peak seasons or promotional events without manual intervention. This avoids the risks of system overload and enhances the user experience while also controlling costs by not having to maintain excess idle resources.
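In practice, elasticity on AWS is commonly implemented with an Auto Scaling group plus a target tracking policy, so capacity follows load automatically. A minimal Python (boto3) sketch, with a hypothetical group name and an illustrative CPU target:

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Group name and target value are hypothetical/illustrative.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="retail-web-asg",
    PolicyName="keep-average-cpu-near-50",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        # Instances are added when average CPU rises above 50% and
        # removed when it falls back, with no manual intervention.
        "TargetValue": 50.0,
    },
)
```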
DynamoDB allows the modification of the primary key of an existing item to support dynamic schema changes.
- This statement is correct
- This statement is incorrect
Answer Description
DynamoDB does not allow the modification of an item's primary key once it has been created. To 'change' a primary key, the item must be deleted and re-created with the new key. This enforces a rigid schema on the primary key level while allowing flexibility in the attributes. Test takers might be tempted to select the wrong answer due to the flexible schema nature of NoSQL databases, but understanding the immutable nature of the primary key is essential when designing DynamoDB tables.
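Because the key is immutable, "changing" it is really a copy-then-delete. A minimal Python (boto3) sketch, assuming a hypothetical Orders table whose partition key is OrderId:

```python
import boto3

# Hypothetical table with partition key "OrderId".
table = boto3.resource("dynamodb").Table("Orders")

old_key = {"OrderId": "order-123"}

# 1. Read the existing item.
item = table.get_item(Key=old_key)["Item"]

# 2. Write a copy of it under the new key value...
item["OrderId"] = "order-456"
table.put_item(Item=item)

# 3. ...then delete the original. DynamoDB has no "rename key" operation.
table.delete_item(Key=old_key)
```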
Amazon EC2 Reserved Instances can be modified to switch between instance families after the purchase.
- True
- False
Answer Description
Amazon EC2 Reserved Instances offer some flexibility, including the ability to modify instances, but modifications are subject to certain limitations. You can change the Availability Zone, the scope (regional or a specific Availability Zone), and the instance size within the same instance family, but you cannot switch between different instance families. This constraint is designed to maintain the reserved capacity for a particular family due to demand and capacity planning by AWS. (Convertible Reserved Instances can end up in a different instance family, but that is done through an exchange, which is a separate operation from a modification.)
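To illustrate what a permitted modification looks like, here is a Python (boto3) sketch that splits a hypothetical reservation into smaller sizes within the same family; the reservation ID and sizes are invented for illustration.

```python
import boto3

ec2 = boto3.client("ec2")

# Hypothetical: split 1 x m5.xlarge into 2 x m5.large. The total footprint
# must match, and the target stays in the m5 family; a move to, say,
# the c5 family would be rejected.
ec2.modify_reserved_instances(
    ReservedInstancesIds=["11111111-2222-3333-4444-555555555555"],
    TargetConfigurations=[
        {
            "InstanceCount": 2,
            "InstanceType": "m5.large",
            "Scope": "Region",
        }
    ],
)
```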
Which service should you use to consolidate security alert notifications and prioritize them based on their importance, while also maintaining compliance with industry standards and regulations?
- AWS Shield
- AWS Config
- Amazon Inspector
- AWS Security Hub
Answer Description
The service best suited for consolidating security alert notifications and prioritizing them is AWS Security Hub, as it provides a central location to view and manage security and compliance states across your resources. Its functionality extends to collecting security data from various sources and helping you understand your AWS security and compliance posture. On the other hand, AWS Shield focuses primarily on DDoS protection, Amazon Inspector on the assessment of application security, and AWS Config on tracking and management of resource configurations, without the specific emphasis on centralized security event management and compliance reporting.
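As a small illustration of that central view, the Python (boto3) sketch below pulls active critical- and high-severity findings from Security Hub; the filter values are just one plausible query.

```python
import boto3

securityhub = boto3.client("securityhub")

# Fetch active findings labeled CRITICAL or HIGH (values in a filter list are OR'ed).
response = securityhub.get_findings(
    Filters={
        "RecordState": [{"Value": "ACTIVE", "Comparison": "EQUALS"}],
        "SeverityLabel": [
            {"Value": "CRITICAL", "Comparison": "EQUALS"},
            {"Value": "HIGH", "Comparison": "EQUALS"},
        ],
    },
    MaxResults=10,
)

for finding in response["Findings"]:
    print(finding["Severity"]["Label"], "-", finding["Title"])
```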
A corporation intends to transition its existing relational database systems to a managed cloud environment with the requirement of having ongoing data replication to keep the source and destination databases in sync. Which service should be utilized to meet these specifications most effectively?
- Database Migration Service
- Schema Conversion Tool
- DataSync
- Relational Database Service Snapshot
Answer Description
The correct service for ongoing replication and real-time data synchronization is the Database Migration Service. When you need to move databases with minimal downtime and require continuous replication, this service is the optimal choice. It not only allows for the migration of data but also keeps the databases in sync during the migration process.
- The Snapshot feature within the Relational Database Service platform is primarily for backups, not for ongoing replication or real-time synchronization.
- The Schema Conversion Tool is mainly utilized for converting database schemas for migration purposes, but it does not handle real-time data replication.
- DataSync is more suitable for large-scale data transfer tasks and does not specialize in the specific requirements of ongoing database replication or synchronization.
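For a concrete feel for the "full load plus ongoing replication" setup, here is a Python (boto3) sketch that creates a DMS task in full-load-and-cdc mode. All ARNs are hypothetical placeholders for endpoints and a replication instance created beforehand.

```python
import boto3
import json

dms = boto3.client("dms")

# All ARNs below are hypothetical placeholders.
dms.create_replication_task(
    ReplicationTaskIdentifier="orders-db-migration",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",
    TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",
    MigrationType="full-load-and-cdc",  # full copy, then continuous change data capture
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all-tables",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    }),
)
```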
A company is looking to automate the transition of their data to more cost-effective storage solutions as the data ages. Which of the following would be the BEST solution for automatically transitioning objects from standard storage to a lower-cost storage option after 30 days, and then deleting objects that are older than 365 days?
- Manually moving objects to Amazon S3 Glacier after 30 days and setting calendar reminders to delete objects after 365 days.
- Using AWS Storage Gateway for a cached volume to archive data and relying on the automated clean-up process.
- Configuring Amazon EBS snapshots to occur every 30 days and writing a script to delete snapshots older than 365 days.
- Creating lifecycle policies on Amazon S3 to transition objects to S3 Glacier after 30 days and permanently delete them after 365 days.
Answer Description
Lifecycle policies in Amazon S3 allow for the automatic transition of objects between storage classes and the configuration of object expiration. In this case, lifecycle policies let the company reduce storage costs by automatically moving objects to a lower-cost storage class after 30 days and deleting objects that are no longer needed after 365 days, ensuring cost optimization and adherence to data retention policies. The other options do not provide automatic transitioning or expiration based on object age, which the question requires.
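A minimal Python (boto3) sketch of such a rule, assuming a hypothetical bucket name:

```python
import boto3

s3 = boto3.client("s3")

# Bucket name is a hypothetical placeholder.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object in the bucket
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"}  # move to S3 Glacier after 30 days
                ],
                "Expiration": {"Days": 365},  # delete after 365 days
            }
        ]
    },
)
```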
Amazon EC2 instances can only be launched using the AWS Management Console.
- False
- True
Answer Description
This statement is not true because there are several methods to launch Amazon EC2 instances including the AWS Management Console, the AWS Command Line Interface (CLI), AWS SDKs, and infrastructure as code tools like AWS CloudFormation. While the Management Console is a user-friendly, web-based interface for deploying and managing EC2 instances, the other methods provide more automation and can be better suited for repeatable, programmable deployments.
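For example, launching an instance from the AWS SDK for Python (boto3) looks roughly like this; the AMI ID is a hypothetical placeholder that you would replace with a current image for your Region.

```python
import boto3

ec2 = boto3.client("ec2")

# The AMI ID below is a hypothetical placeholder.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "sdk-launched"}],
    }],
)
print("Launched:", response["Instances"][0]["InstanceId"])
```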
When using AWS services, how does the pricing model typically differ from the traditional data center expenses?
- You pay only for the resources you consume.
- You pay a fixed monthly fee regardless of resources used.
- You have significant upfront hardware costs that depreciate over time.
- You are charged based on a long-term contract with fixed resource allocation.
Answer Description
The correct answer is 'You pay only for the resources you consume.' This is in contrast to traditional on-premises environments where companies incur significant upfront capital expenses for hardware that may not always be utilized to its full capacity. AWS operates on a pay-as-you-go model, allowing customers to scale their usage up or down based on demand and pay only for the services they use. This shift from fixed to variable costs is a key economic benefit of the AWS Cloud.
A company is evaluating AWS storage solutions for infrequently accessed data, which will be stored for at least six months. The company wants to minimize storage costs. Which storage option best aligns with the company's requirements?
- Amazon S3 Standard - Infrequent Access (S3 Standard-IA)
- Amazon S3 Intelligent-Tiering
- Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)
- Amazon S3 Glacier Deep Archive
Answer Description
Amazon S3 Glacier Deep Archive is the most cost-effective storage option AWS provides for long-term data archiving where data is retrieved infrequently and retrieval times of 12 hours or longer are acceptable. S3 Standard-IA and S3 Intelligent-Tiering are not optimized for such specific long-term archiving needs and incur higher storage costs than Glacier Deep Archive. S3 One Zone-IA is cost-effective for infrequently accessed data, but not as cost-effective as Glacier Deep Archive for long-term archiving.
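Objects can be written directly into the class at upload time. A minimal Python (boto3) sketch, with hypothetical bucket, key, and file names:

```python
import boto3

s3 = boto3.client("s3")

# Bucket, key, and local file are hypothetical placeholders.
with open("ledger-2024.tar.gz", "rb") as body:
    s3.put_object(
        Bucket="example-archive-bucket",
        Key="backups/2024/ledger-2024.tar.gz",
        Body=body,
        StorageClass="DEEP_ARCHIVE",  # lowest-cost class; retrievals take about 12+ hours
    )
```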
Which aspect of cloud infrastructure security is the provider directly responsible for under the shared responsibility model?
- Installing security patches on the operating system
- Physical security of the infrastructure
- Setting up permissions and roles within identity services
- Implementing client-side data encryption mechanisms
Answer Description
Under the shared responsibility model, the provider is responsible for the security 'of' the cloud, which encompasses the physical infrastructure, including data center facilities, servers, and networking equipment. Customers are responsible for security 'in' the cloud, which can include managing their guest operating systems, configuring IAM, and encrypting client-side data.
Gnarly!
Looks like that's it! You can go back and review your answers or click the button below to grade your test.