AWS Certified Developer Associate Practice Test (DVA-C02)
Use the form below to configure your AWS Certified Developer Associate Practice Test (DVA-C02). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 90 questions and set a time limit.

AWS Certified Developer Associate DVA-C02 Information
AWS Certified Developer - Associate showcases knowledge and understanding of core AWS services, their uses, and basic AWS architecture best practices, as well as proficiency in developing, deploying, and debugging cloud-based applications by using AWS. Preparing for and attaining this certification gives certified individuals more confidence and credibility. Organizations with AWS Certified developers can be confident that they have the right talent to give them a competitive advantage and to ensure stakeholder and customer satisfaction.
The AWS Certified Developer - Associate (DVA-C02) exam is intended for individuals who perform a developer role. The exam validates a candidate’s ability to demonstrate proficiency in developing, testing, deploying, and debugging AWS Cloud-based applications. The exam also validates a candidate’s ability to complete the following tasks:
- Develop and optimize applications on AWS.
- Package and deploy by using continuous integration and continuous delivery (CI/CD) workflows.
- Secure application code and data.
- Identify and resolve application issues.
Free AWS Certified Developer Associate DVA-C02 Practice Test
Press start when you are ready, or press Change to modify any settings for the practice test.
- Questions: 15
- Time: Unlimited
- Included Topics: Development with AWS Services, Security, Deployment, Troubleshooting and Optimization
A developer needs to ensure that the credentials used by a serverless function to connect to a database are stored securely and are not exposed in plain text. What is the recommended method for managing the credentials to maximize security?
Leverage a secrets manager to handle the credentials and retrieve them dynamically within the serverless function.
Place the encrypted credentials in a cloud storage service bucket and retrieve them via the serverless function.
Hardcode the credentials directly into the code base that is part of the serverless function.
Encrypt the credentials manually and include them as part of the serverless function's environment variables.
Answer Description
Utilizing a dedicated secret management service, specifically designed to handle sensitive information, provides robust security features such as encryption at rest and controlled access policies. This service also takes care of secure storage and rotation of secrets, which is not inherently handled when encrypting and storing values directly in the serverless function's environment variables or when using simple storage services.
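As a rough illustration, a Python Lambda function might retrieve the credentials at runtime from AWS Secrets Manager via boto3. The secret name prod/db-credentials and the JSON keys are placeholders, not values from the question:

```python
import json

import boto3

# Create the client once so warm Lambda invocations reuse it.
secrets_client = boto3.client("secretsmanager")

def handler(event, context):
    # "prod/db-credentials" is a hypothetical secret name.
    response = secrets_client.get_secret_value(SecretId="prod/db-credentials")
    credentials = json.loads(response["SecretString"])
    # credentials["username"] and credentials["password"] can now be used
    # to open the database connection; nothing sensitive lives in the code.
    return {"statusCode": 200}
```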
Your application uses an Amazon DynamoDB table to serve user profile information. You notice an increase in the read load causing throttling issues due to repeated accesses to the same items. Which caching strategy should you implement to minimize read latency and reduce the load on the DynamoDB table while ensuring data consistency for frequently accessed items?
Increase the read capacity units (RCUs) for the DynamoDB table to handle the higher load
Implement a read-through cache using Amazon ElastiCache
Use a write-through cache to preload user profiles into Amazon ElastiCache
Apply lazy loading to load user profiles into the cache only when updates occur
Set a time-to-live (TTL) for user profiles to invalidate the cache periodically
Store user profiles in Amazon S3 and synchronize them with DynamoDB
Answer Description
Implementing a 'read-through' caching strategy will minimize read latency and reduce the load on the DynamoDB table by caching the data after the first read from the database. Subsequent reads are served directly from the cache, which decreases the burden on the table. As the cache is populated on-demand during a database read, it ensures that the data in the cache is consistent with the database. Other caching strategies might not be as effective for this scenario. For example, write-through caching is more suited for write-heavy applications, lazy loading can lead to stale data if not managed properly, and TTL doesn't directly address the problem of high read load and throttling.
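A minimal read-through sketch, assuming an ElastiCache for Redis endpoint reached with the redis-py client and a DynamoDB table named user_profiles (the endpoint and names are placeholders):

```python
import json

import boto3
import redis  # redis-py client, commonly used with ElastiCache for Redis

cache = redis.Redis(host="my-cache.example.com", port=6379)  # placeholder endpoint
table = boto3.resource("dynamodb").Table("user_profiles")    # placeholder table

def get_profile(user_id: str) -> dict:
    key = f"profile:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit: DynamoDB is not touched
    # Cache miss: read from DynamoDB, then populate the cache for later reads.
    item = table.get_item(Key={"user_id": user_id}).get("Item", {})
    cache.set(key, json.dumps(item, default=str), ex=300)  # 5-minute expiry
    return item
```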
A developer is working on enhancing the security of a serverless infrastructure where user authentication is handled by an OIDC-compliant external identity provider. Upon a user's successful sign-in, the external service issues a token. The developer needs to ensure that this token is validated before allowing access to the serverless function endpoint. Which approach should the developer implement to enforce token validation?
Utilize a Lambda function programmed to evaluate and verify the token before proceeding with the request.
Deploy client-side certificates to secure the endpoint and validate the incoming tokens.
Configure a role with specified permissions that authenticates users based on the provided token.
Apply a resource-based policy directly on the function to check for the presence of the token in the request.
Answer Description
The developer should implement a Lambda authorizer, which runs custom authorization logic before access to the serverless function endpoint is granted. The Lambda authorizer can verify the validity of the token and determine whether the request should be allowed or denied. This approach is particularly useful in serverless architectures where application components are loosely coupled and an external identity provider manages user authentication; the verification is performed within the AWS environment without a round trip to the external identity provider. In contrast, IAM roles are for access management within AWS services and resources, not for validating tokens directly. Resource-based policies define permissions for AWS resources but do not provide a method for validating bearer tokens. Client-side certificates are used for mutual TLS (mTLS) authentication and do not apply to verifying tokens issued by an external identity provider.
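For illustration, a minimal TOKEN-type Lambda authorizer might look like the sketch below. Verifying the JWT with the PyJWT library is one possible approach; the public key, audience, and algorithm are assumptions:

```python
import jwt  # PyJWT; one common choice for verifying OIDC-issued JWTs

# In practice the signing key is fetched (and cached) from the provider's
# JWKS endpoint; a static placeholder keeps the sketch short.
PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----..."  # placeholder
AUDIENCE = "my-api"                           # placeholder

def handler(event, context):
    token = event["authorizationToken"].removeprefix("Bearer ")
    try:
        claims = jwt.decode(token, PUBLIC_KEY, algorithms=["RS256"], audience=AUDIENCE)
        effect = "Allow"
    except jwt.InvalidTokenError:
        claims, effect = {}, "Deny"
    # The authorizer returns an IAM policy that API Gateway enforces.
    return {
        "principalId": claims.get("sub", "anonymous"),
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event["methodArn"],
            }],
        },
    }
```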
A developer is integrating a serverless function with a third-party web service. The developer needs to confirm that the function reacts appropriately under different response conditions, including successful data retrieval, errors, and timeouts, without interacting with the actual service. What is the most effective method to mimic the third-party service and assess the function's various operational responses?
Activate the serverless function in a live setting with enhanced logging to track its handling of different operational conditions.
Implement an auxiliary serverless function to reproduce the behavior of the third-party service for testing purposes.
Embed static response conditions within the serverless function code to facilitate response scenario assessment.
Establish an Amazon API Gateway and configure mock integrations for reproducing varied operational scenarios.
Answer Description
Setting up an Amazon API Gateway with mock integrations is the most effective method to mimic third-party web service responses. It enables the developer to configure the expected responses, such as those indicating a successful operation, an error, or a timeout, without needing to rely on the real web service. This provides a controlled environment for thorough assessment of the serverless function's response to diverse scenarios. Other methods, such as hardcoding responses or utilizing additional serverless functions for emulation, do not offer the same level of flexibility or convenience as API Gateway's mock integrations for simulating a wide spectrum of behaviors.
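A sketch of wiring up a mock integration with boto3; the API and resource IDs are placeholders, and a GET method is assumed to already exist on the resource:

```python
import boto3

apigw = boto3.client("apigateway")

rest_api_id, resource_id = "a1b2c3", "xyz123"  # placeholders for an existing API

# MOCK integrations return responses from API Gateway without any backend.
# The mapping template selects which status code the mock will produce.
apigw.put_integration(
    restApiId=rest_api_id, resourceId=resource_id, httpMethod="GET",
    type="MOCK",
    requestTemplates={"application/json": '{"statusCode": 200}'},
)

# Shape the canned body returned for that status code.
apigw.put_integration_response(
    restApiId=rest_api_id, resourceId=resource_id, httpMethod="GET",
    statusCode="200",
    responseTemplates={"application/json": '{"message": "simulated success"}'},
)
```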
Which service offers the capability to obtain temporary, privileged credentials for making authenticated requests to various resources, particularly in scenarios involving identity federation or cross-account access?
Key Management Service
Cognito User Identity Pools
Security Token Service
Simple Storage Service
Answer Description
The correct service for retrieving short-term, privileged credentials is the Security Token Service, which is often used in conjunction with federated user authentication or for assuming roles. These credentials are constrained by time and the permissions defined in their associated policies, fostering a secure environment that adheres to the principle of least privilege. The Security Token Service is a component of the overarching identity management system provided by AWS but specializes in this temporary credential issuance. In comparison, Cognito is mostly oriented towards user identity and access management in apps, whereas the Key Management Service is involved in creating and controlling encryption keys, not in issuing temporary security credentials based on permission policies.
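For example, a caller can exchange its identity for short-lived credentials with sts.assume_role; the role ARN below is a placeholder for a cross-account role:

```python
import boto3

sts = boto3.client("sts")

# The role ARN and session name are placeholders for a cross-account role.
response = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/PartnerAccess",
    RoleSessionName="cross-account-demo",
    DurationSeconds=900,  # the credentials expire automatically
)
creds = response["Credentials"]

# A client built from the temporary credentials acts under the assumed role.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```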
A web application is leveraging a content delivery network (CDN) to distribute images. The development team wants to implement caching optimizations to serve different image qualities based on the user's preference for reduced data usage, which is indicated by the Save-Data client hint request header. Which strategy should be implemented to provide the most suitable image quality while optimizing for performance and bandwidth usage?
Determine the user's device type and capabilities by examining the User-Agent header and serve images from a cache optimized for desktop or mobile.
Cache different versions of images depending on the Save-Data header value to serve lower-quality images if the client requests data savings.
Ignore request headers and always serve images from the same high-quality cache to all users to simplify the caching logic.
Use the Referer header to determine the originating domain of the request and serve a universally optimized image quality based on the most frequent domain.
Answer Description
The best approach is to use the Save-Data request header to determine whether the user is requesting data savings. If the Save-Data header is present and set to on, the CDN should serve lower-quality images to conserve bandwidth. This header is part of the Client Hints standard, which allows clients to indicate preferences in the HTTP request. Incorrect strategies, such as caching based on the User-Agent header or ignoring headers, fail to account for the bandwidth-saving preference conveyed by Save-Data. Serving only the highest-quality images ignores client preferences and can lead to unnecessarily high data usage and slower load times for users with limited bandwidth.
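The question describes a generic CDN; assuming Amazon CloudFront, one way to cache separate variants per Save-Data value is a cache policy that adds the header to the cache key. The policy name is a placeholder:

```python
import boto3

cloudfront = boto3.client("cloudfront")

# Including Save-Data in the cache key makes CloudFront keep separate cached
# variants for clients that do and do not request data savings.
cloudfront.create_cache_policy(CachePolicyConfig={
    "Name": "images-vary-on-save-data",  # placeholder policy name
    "MinTTL": 0,
    "ParametersInCacheKeyAndForwardedToOrigin": {
        "EnableAcceptEncodingGzip": True,
        "EnableAcceptEncodingBrotli": True,
        "HeadersConfig": {
            "HeaderBehavior": "whitelist",
            "Headers": {"Quantity": 1, "Items": ["Save-Data"]},
        },
        "CookiesConfig": {"CookieBehavior": "none"},
        "QueryStringsConfig": {"QueryStringBehavior": "none"},
    },
})
```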
Your organization requires encryption for sensitive data stored in Amazon S3. The security policy mandates that the organization manages its own encryption keys and that encryption must occur before the data leaves the organization's premises. Which encryption method aligns most closely with these requirements?
Server-side encryption with AWS KMS-managed keys (SSE-KMS)
Server-side encryption with customer-provided keys (SSE-C)
Client-side encryption with a customer-managed encryption key
Server-side encryption with Amazon S3-managed keys (SSE-S3)
Answer Description
Client-side encryption is the process where data is encrypted on the client's side (i.e., within the organization's boundary) before it is transferred to the server or service, such as S3. The organization retains full control of the encryption keys and manages the encryption process itself, so client-side encryption with a customer-managed key best addresses the security policy's requirements that the organization manage its own keys and that data be encrypted before leaving the premises. Server-side encryption encrypts data only after it has been uploaded to S3, with keys managed by S3 (SSE-S3), by AWS KMS (SSE-KMS), or supplied by the customer with each request (SSE-C). While secure, none of these options meets the requirement to encrypt data before it leaves the organization's premises.
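A minimal client-side encryption sketch using the cryptography library's Fernet recipe (one of several possible choices); the bucket, file name, and key handling are placeholders:

```python
import boto3
from cryptography.fernet import Fernet  # symmetric encryption, one option

# The key is generated and held on premises; AWS never sees it.
key = Fernet.generate_key()  # in practice, load from your own key store
cipher = Fernet(key)

with open("report.csv", "rb") as f:        # placeholder file name
    ciphertext = cipher.encrypt(f.read())  # encrypted before leaving premises

# S3 only ever receives ciphertext.
boto3.client("s3").put_object(
    Bucket="example-bucket", Key="report.csv.enc", Body=ciphertext
)
```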
When designing an application that interacts with Amazon DynamoDB, what is the purpose of serialization?
Serialization is a specific DynamoDB operation to replicate data across multiple tables.
Serialization is the process of compressing data to minimize the storage space required in Amazon DynamoDB.
Serialization is the process of converting an object's state into a format that can be stored in a database like Amazon DynamoDB.
Serialization is the process of encrypting data before storing it in Amazon DynamoDB to ensure security.
Answer Description
Serialization is the process of converting an object's state or data structure into a format that can be stored in a database or transmitted over a network. In the context of Amazon DynamoDB, whose API accepts and returns items as JSON documents, serialization ensures that object data can be properly saved to the table. This enables you to store complex objects as items in the DynamoDB table, which would not be possible in their native in-memory form.
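As a small illustration, a Python object can be serialized to a plain dict before being written to a table; the table name and attributes are invented for the example:

```python
from dataclasses import asdict, dataclass

import boto3

@dataclass
class UserProfile:
    user_id: str
    display_name: str
    logins: int

# Serialization: the in-memory object becomes a plain dict that the SDK
# can translate into DynamoDB's attribute types.
profile = UserProfile("u-123", "Ada", 42)
table = boto3.resource("dynamodb").Table("profiles")  # placeholder table
table.put_item(Item=asdict(profile))
```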
A development team needs to allow an external consultant's account to access a specific Amazon S3 bucket to store and retrieve files essential for a joint project. The external consultant should not be given user credentials within the team's AWS account. What type of policy should the development team attach to the S3 bucket to allow access directly to the bucket itself?
Resource-based policy (e.g., S3 bucket policy)
Service control policy (SCP)
IAM group policy
Identity-based policy attached to a user
Answer Description
A resource-based policy, specifically an S3 bucket policy, is the correct way to grant permissions on the S3 bucket directly to entities in another AWS account without sharing any user credentials. The policy names the external account as a principal and grants it the necessary permissions on the bucket. A service control policy (SCP) applies to the IAM entities within an AWS Organizations account and does not grant permissions to outside accounts, and identity-based policies are attached to users, groups, or roles within your own account, not to resources.
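A sketch of such a bucket policy applied with boto3; the account ID, bucket name, and actions are placeholders:

```python
import json

import boto3

# The account ID, bucket name, and granted actions are placeholders.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowConsultantAccount",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": "arn:aws:s3:::joint-project-bucket/*",
    }],
}
boto3.client("s3").put_bucket_policy(
    Bucket="joint-project-bucket", Policy=json.dumps(policy)
)
```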
Given a situation where a developer must design an API that processes payment transactions, which of the following approaches BEST ensures that the API handles repeated submissions of the same transaction in an idempotent manner?
Store the state of each transaction in a database with enforced write locks to prevent concurrent writes.
Use a stateless protocol that does not require server-side tracking of transaction states.
Implement idempotent receipt tokens that must be submitted with each transaction.
Generate random transaction IDs for each submission and use these IDs to detect duplicates.
Answer Description
The correct answer is to implement idempotent receipt tokens. By providing a unique token for each transaction, the system can recognize and ignore duplicate submissions, preventing the same transaction from being processed more than once, which is the core principle of idempotency. Storing transaction states and enforcing write locks might be part of the solution but do not by themselves make the client's API calls idempotent. Generating random transaction IDs does not ensure idempotency because each submission, including a retry, receives a new ID, so duplicates cannot be recognized as the same transaction.
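One common way to enforce idempotency tokens is a conditional write against a DynamoDB table, sketched below with placeholder table and attribute names:

```python
import boto3
from botocore.exceptions import ClientError

table = boto3.resource("dynamodb").Table("payments")  # placeholder table

def process_payment(idempotency_token: str, amount: int) -> str:
    try:
        # The conditional write succeeds only the first time the token is
        # seen; a retry with the same token is rejected atomically.
        table.put_item(
            Item={"token": idempotency_token, "amount": amount, "status": "charged"},
            ConditionExpression="attribute_not_exists(#t)",
            ExpressionAttributeNames={"#t": "token"},
        )
        return "processed"
    except ClientError as err:
        if err.response["Error"]["Code"] == "ConditionalCheckFailedException":
            return "duplicate ignored"  # idempotent: no double charge
        raise
```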
When you need to perform ad-hoc analysis of log data and write queries using a service-specific query language to gain insights into your application’s operations, which service do you use?
AWS CloudTrail
AWS Config
Amazon Elasticsearch Service
Amazon CloudWatch Logs Insights
Answer Description
The correct answer is Amazon CloudWatch Logs Insights. This service allows you to use a specialized query language to search, analyze, and visualize log data from your applications and AWS resources. It is particularly well-suited for ad-hoc log analysis and gaining operational insights. AWS CloudTrail is used for recording API calls and related events for your AWS account. Amazon Elasticsearch Service is for running Elasticsearch clusters at scale, providing search and data analytics capabilities. AWS Config is for assessing, auditing, and evaluating the configurations of AWS resources, not for querying log data for application insights.
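For example, a query can be run programmatically with boto3's start_query and get_query_results; the log group name and query string are illustrative. Note that every query is bounded by an explicit start and end time:

```python
import time
from datetime import datetime, timedelta, timezone

import boto3

logs = boto3.client("logs")
end = datetime.now(timezone.utc)
start = end - timedelta(hours=1)

# Log group name and query string are illustrative.
query_id = logs.start_query(
    logGroupName="/aws/lambda/my-function",
    startTime=int(start.timestamp()),
    endTime=int(end.timestamp()),
    queryString="fields @timestamp, @message | sort @timestamp desc | limit 20",
)["queryId"]

# Poll until the query finishes, then read the results.
while True:
    result = logs.get_query_results(queryId=query_id)
    if result["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)
```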
What is an essential characteristic of a unit test when developing applications?
It should interact with a live database to validate integration points.
It should be able to be run in a production environment to test real user scenarios.
It should be isolated, testing a single component without external dependencies.
It should cover multiple components and their interactions to ensure integration.
Answer Description
A unit test should be isolated, meaning it must test a single component (a 'unit') of the software without dependencies on external resources or the state of other units. Isolation allows for more predictable and faster tests which can pinpoint errors directly within the tested unit. Tests that rely on external state or resources might be integration tests or functional tests but are not considered 'unit' tests.
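A small self-contained example of an isolated unit test using Python's standard unittest with a mocked dependency; all names are invented for illustration:

```python
import unittest
from unittest.mock import MagicMock

# The unit under test: pure logic plus an injected dependency, so the test
# never needs a live database or network call.
def premium_greeting(repo, user_id):
    user = repo.get_user(user_id)
    return f"Hello, {user['name']}!" if user["premium"] else "Hello!"

class PremiumGreetingTest(unittest.TestCase):
    def test_premium_user_gets_personal_greeting(self):
        repo = MagicMock()  # stands in for the real data store
        repo.get_user.return_value = {"name": "Ada", "premium": True}
        self.assertEqual(premium_greeting(repo, "u-1"), "Hello, Ada!")
        repo.get_user.assert_called_once_with("u-1")

if __name__ == "__main__":
    unittest.main()
```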
Utilizing Amazon CloudWatch Logs Insights for querying application logs does not support time-based queries, making it unsuitable for isolating incidents that occurred during specific time frames.
True
False
Answer Description
The correct answer is false. Amazon CloudWatch Logs Insights does support time-based queries: every query runs against a specified time range, which allows developers to isolate and examine log data from specific time frames. This is a critical troubleshooting capability for pinpointing when specific incidents or anomalies occurred; the 'true' option trades on unfamiliarity with this feature.
Which service is designed to provide developers with insights into the performance and operation of their distributed applications, offering capabilities to collect, analyze, and visualize tracing data?
CloudTrail
X-Ray
CloudFormation
Inspector
Answer Description
The service designed for providing insights into the performance and operation of distributed applications is X-Ray. It allows the collection, analysis, and visualization of tracing data, which is crucial for developers to understand and optimize their applications. CloudTrail, on the other hand, focuses on logging and auditing AWS account activity. Inspector assesses applications for vulnerabilities and deviations from best practices. CloudFormation automates infrastructure provisioning, which is unrelated to application performance tracing.
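A minimal instrumentation sketch with the AWS X-Ray SDK for Python, assuming the code runs where an active segment exists (for example, Lambda with tracing enabled):

```python
from aws_xray_sdk.core import patch_all, xray_recorder

patch_all()  # auto-instruments supported libraries (boto3, requests, ...)

@xray_recorder.capture("load_order")  # records a timed subsegment in the trace
def load_order(order_id):
    # Downstream AWS calls made here show up in the X-Ray service map.
    pass
```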
Your application is generating logs at a high volume and you have been tasked with identifying occurrences of timeout errors that might be hidden within the vast amount of log data. You decide to use Amazon CloudWatch Logs Insights to parse these logs efficiently. Which of the following CloudWatch Logs Insights queries will accurately find log entries that contain the timeout error message 'TimeoutException', along with the timestamp and the requestId of the associated requests?
display @timestamp, requestId where @message = 'TimeoutException'
filter @logStream = 'TimeoutException' | stats count() by requestId
search 'TimeoutException' | fields @timestamp, requestId
fields @timestamp, requestId, @message | filter @message like /TimeoutException/
Answer Description
The correct answer uses the fields command to select the relevant information, such as @timestamp and requestId, and the filter command to narrow the search to log entries containing the specific error message 'TimeoutException'. Including @message in the selected fields lets developers verify that the filtered log entries contain the correct message. This query provides a precise way to sift through the logs for specific error messages, which is necessary for effective troubleshooting.
That's It!
Looks like that's it! You can go back and review your answers or click the button below to grade your test.