Prepare for the AWS Certified Developer Associate DVA-C02 exam with this free practice test. Randomly generated and customizable, this test allows you to choose the number of questions.
Customer-managed keys in AWS KMS can be created, owned, and managed by the user, while AWS managed keys are managed by AWS but are still dedicated to the user's AWS account.
True
False
The statement is correct. Customer-managed keys offer increased flexibility, allowing the user to manage the lifecycle of the keys, including creation, rotation, and deletion according to the organization's policies and compliance requirements. AWS managed keys are indeed managed by AWS, which simplifies the management of encryption keys, but they are dedicated to the user's AWS account and used by AWS services to protect resources in the account.
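As a rough illustration with boto3 (the description and commented-out deletion call are placeholders), a customer-managed key's lifecycle can be controlled directly, which is not possible with AWS managed keys:

```python
import boto3

kms = boto3.client("kms")

# Create a customer-managed key; you control its policy, rotation, and deletion.
key = kms.create_key(Description="App data encryption key")
key_id = key["KeyMetadata"]["KeyId"]

# Opt in to automatic rotation -- a lifecycle control you own; AWS managed
# keys are rotated by AWS on your behalf.
kms.enable_key_rotation(KeyId=key_id)

# You can even schedule the key for deletion, something AWS managed keys
# do not allow:
# kms.schedule_key_deletion(KeyId=key_id, PendingWindowInDays=7)
```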
A developer is integrating third-party social identity provider authentication in an application and needs to allow authenticated users to interact with cloud storage and computing resources. Which feature of Amazon Cognito should the developer use to fulfill this requirement?
Authorize with ID tokens from the social identity provider.
Create a User Pool.
Deploy an Identity Broker service.
Use an Identity Pool.
Identity Pools, also known as Federated Identities in Amazon Cognito, allow developers to create unique identities for their application's users and enable those users to obtain temporary AWS credentials for direct access to cloud resources. This feature fits the scenario because it lets the application grant authenticated users the permissions needed to interact with storage and computing resources. Conversely, User Pools are primarily used for user sign-up and sign-in, and do not directly enable user access to storage and computing resources without an Identity Pool. Directly using ID tokens or an identity broker are methods of authentication, but they do not offer the same federated capabilities combined with resource access that an Identity Pool provides.
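A minimal sketch of the Identity Pool flow with boto3, assuming a Google sign-in; the region, pool ID, and token are placeholders:

```python
import boto3

cognito = boto3.client("cognito-identity", region_name="us-east-1")

# Placeholder identity pool ID and a token issued by the social provider.
POOL_ID = "us-east-1:00000000-0000-0000-0000-000000000000"
logins = {"accounts.google.com": "<ID token from the provider>"}

# Map the external identity to a Cognito identity...
identity = cognito.get_id(IdentityPoolId=POOL_ID, Logins=logins)

# ...then exchange it for temporary AWS credentials scoped by the pool's IAM roles.
creds = cognito.get_credentials_for_identity(
    IdentityId=identity["IdentityId"], Logins=logins
)["Credentials"]

# These short-lived keys can now sign calls to S3, DynamoDB, Lambda, etc.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretKey"],
    aws_session_token=creds["SessionToken"],
)
```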
When preparing a microservice for deployment that requires multiple sensitive configurations, which approach would ensure the secure and environment-agnostic management of these settings?
Fetch the configurations from the host's instance metadata service upon container initialization
Embed the configurations into the application code within the container image
Declare sensitive environment variables within the build specification file of the container
Dynamically inject the configurations using a secrets management service at container startup
Using a secrets management service to dynamically inject configurations at runtime ensures that the microservice remains environment-agnostic and that sensitive information is not stored within the container image. This approach promotes security and flexibility in configuration management, essential for maintaining best practices in cloud application deployments. Hard-coding settings violates security best practices and reduces flexibility. Instance metadata is meant for obtaining information about the host, not for storing sensitive configuration data. Defining sensitive information as environment variables within the image build process exposes them in the image layers, which is not secure.
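For instance, a container entrypoint might pull its settings from AWS Secrets Manager at startup; a minimal sketch with boto3, assuming the secret name and JSON layout shown here:

```python
import json
import boto3

def load_config(secret_name: str = "myapp/prod/db") -> dict:
    """Fetch sensitive settings at startup instead of baking them into the image."""
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])

# Called once when the container starts; the image itself stays generic.
config = load_config()  # e.g. {"DB_HOST": "...", "DB_PASSWORD": "..."}
```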
Your application requires a cost-effective storage solution to archive infrequently accessed data. The solution should maintain the data with 99.999999999% (11 9's) durability and provide retrieval times within minutes. Which AWS cloud storage option would best suit these requirements?
Amazon RDS snapshot
Amazon S3 Standard
Amazon EBS snapshots
Amazon S3 Glacier Instant Retrieval
Amazon S3 Glacier Instant Retrieval offers the necessary balance between cost-effectiveness for archiving and quick retrieval times for infrequently accessed data. It delivers 99.999999999% (11 9's) durability, similar to other Amazon S3 storage classes, but is designed to minimize costs for data that is retrieved infrequently. Other storage options either do not provide the same level of durability, have higher costs for storage and retrieval, or focus on different retrieval timeframes.
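As a small illustration with boto3 (bucket, key, and file names are placeholders), an object can be written straight into the Glacier Instant Retrieval storage class:

```python
import boto3

s3 = boto3.client("s3")

# Archive the object directly into the Glacier Instant Retrieval class.
with open("q4.parquet", "rb") as f:
    s3.put_object(
        Bucket="my-archive-bucket",
        Key="reports/2023/q4.parquet",
        Body=f,
        StorageClass="GLACIER_IR",
    )
```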
A software development team is implementing a set of microservices that will be exposed to their clients through endpoints. They are looking to adopt best practices for testing newly developed features without affecting the live production environment accessed by their customers. What is the most effective method to achieve this within the service that provides the endpoints?
Generate a clone of the existing production environment and make code changes directly there to validate new features
Reuse the production endpoints for testing by applying request filtering based on a custom header to differentiate traffic
Implement feature toggles in the production codebase to switch between production and development configurations
Create a new deployment stage with specific configurations for the microservices that are separate from production settings
The most effective method involves setting up a new deployment stage within the service that provides the endpoints, which in this context is Amazon API Gateway. Stages enable you to create independent environments for development and production. Setting up stage variables, specific authentication methods, and throttling rules for the development environment ensures that testing does not interfere with the production environment. This approach allows the team to test new code changes under conditions that closely mimic the live setup without risking the integrity and stability of the production systems.
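A hedged sketch with boto3 of deploying to a separate stage with its own stage variables; the REST API ID and variable values are placeholders:

```python
import boto3

apigw = boto3.client("apigateway")

# Deploy the current API configuration to a dedicated "dev" stage.
apigw.create_deployment(
    restApiId="a1b2c3d4e5",  # placeholder REST API ID
    stageName="dev",
    stageDescription="Isolated environment for feature testing",
    variables={"backendUrl": "https://dev.internal.example.com"},
)

# Production keeps its own "prod" stage with different variables and
# throttling settings, so tests never touch customer traffic.
```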
You are working on a serverless application where each component is designed to execute a distinct operation within an e-commerce checkout process. During the development cycle, you want to confirm that each component functions independently and as expected without making actual calls to cloud services. What technique should you employ within your unit tests to simulate the behavior of the external dependencies?
Create instances of client objects specific to cloud resources within your unit tests
Reference recorded responses from an object storage service during test execution
Utilize SDK mocking utilities to emulate the behavior of external service calls
Configure an API management service to handle dependencies during test runs
To simulate the behavior of external dependencies, you should utilize mocking utilities provided with the SDK for your chosen programming language. This ensures that tests can run without the need for actual service calls, allowing for true unit testing by isolating the code from external interactions. Creating real client instances during testing would result in calls to the actual services, which contradicts the principles of unit testing. Similarly, configuring API Gateway to manage test dependencies creates actual service interactions. Using sample responses from an object storage service introduces external dependency to tests, which is contrary to the concept of isolation in unit testing.
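One concrete option in Python is botocore's Stubber, which queues canned responses so no network call is ever made; the business logic below is a made-up placeholder:

```python
import io

import boto3
from botocore.stub import Stubber

def get_report_size(s3_client, bucket: str, key: str) -> int:
    """Placeholder business logic: size of an object's contents."""
    body = s3_client.get_object(Bucket=bucket, Key=key)["Body"].read()
    return len(body)

def test_get_report_size():
    s3 = boto3.client("s3", region_name="us-east-1")
    stubber = Stubber(s3)

    # Queue a canned response so no real S3 call is made during the test.
    stubber.add_response(
        "get_object",
        {"Body": io.BytesIO(b'{"total": 42}')},
        {"Bucket": "orders", "Key": "123.json"},
    )
    with stubber:
        assert get_report_size(s3, "orders", "123.json") == 13
```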
When deploying an AWS Lambda function, which packaging option allows you to include common dependencies across multiple functions to promote code reuse and reduce the size of deployment packages?
Lambda Environment Variables
Lambda Execution Role Policies
Lambda Layers
Lambda Container Images
Lambda Layers are a way to centrally manage common components across multiple Lambda functions, enabling code reuse and reducing the size of deployment packages. Layers allow developers to include additional code such as libraries or custom runtimes, which can be used by multiple functions, avoiding the need to package these dependencies with every function deployment. This concept is important for developers because it streamlines the management of code dependencies and can help improve the deployment process.
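A rough sketch with boto3 of publishing a layer and attaching it to a function; the layer name, zip file, and function name are placeholders:

```python
import boto3

lam = boto3.client("lambda")

# Publish shared dependencies (zipped site-packages) once as a layer.
with open("common-deps.zip", "rb") as f:
    layer = lam.publish_layer_version(
        LayerName="common-deps",
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.12"],
    )

# Attach the layer to any function that needs it; the function's own
# deployment package now contains only its handler code.
lam.update_function_configuration(
    FunctionName="checkout-handler",
    Layers=[layer["LayerVersionArn"]],
)
```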
Which type of IAM policy allows entities to share a set of permissions defined and updated by the cloud provider without any customization?
Principal policies
User-defined policies
Inline policies assigned directly to entities
Managed policies provided by the cloud service provider
Managed policies provided by the cloud service provider are designed for use across multiple accounts and are maintained by the provider, ensuring they align with security best practices. In contrast, customer-managed policies are created, customized, and maintained by the users themselves, providing tighter control over specific permissions but requiring proactive management and revision by the user.
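For example (the role name is a placeholder), an AWS managed policy is attached by its well-known ARN rather than authored by the user:

```python
import boto3

iam = boto3.client("iam")

# AWS managed policies live under the shared "aws" ARN namespace and are
# maintained by AWS; you attach them as-is, with no customization.
iam.attach_role_policy(
    RoleName="report-reader",  # placeholder role
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)
```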
A developer is tasked with maintaining the consistent deployment of a serverless function across multiple environments. This function requires a specific version of an external library. What approach should the developer take to manage the dependencies of the serverless function efficiently?
Manage the library version through a configuration management service.
Embed the dependencies directly into the function's deployment package.
Incorporate the library code within the function's script.
Utilize layers to package the external library, managing its version independently.
The correct approach is to utilize layers, a feature provided by the serverless function service, which allows dependencies to be packaged separately from the function code. This method promotes consistent deployment across different environments and makes it easier to update or modify the shared components without redeploying the entire function codebase. Inserting the dependencies directly into the function's deployment package would not provide the separate management and versioning that layers offer. Incorporating the library code within the function script lacks modularity and complicates updates. Using a configuration management service, which is typically meant for runtime configuration and not for binary dependencies, is not the right tool for managing external library versions.
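A minimal sketch of the versioning benefit, assuming hypothetical function names and a layer ARN: because layer version ARNs are immutable, pinning one ARN keeps every environment on the identical library build:

```python
import boto3

lam = boto3.client("lambda")

# Layer version ARNs are immutable, so pinning ":4" guarantees every
# environment loads the exact same build of the external library.
PINNED_LAYER = "arn:aws:lambda:us-east-1:123456789012:layer:requests-lib:4"

for function_name in ("orders-dev", "orders-staging", "orders-prod"):
    lam.update_function_configuration(
        FunctionName=function_name,  # placeholder function names
        Layers=[PINNED_LAYER],
    )
```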
Which description best exemplifies a loosely coupled component within a system architecture?
Calling a specific instance of an application server directly to process a task
Hardcoding the endpoint of a specific Lambda function for invocation from an EC2 instance
Using a shared database table for direct communication between two microservices
Publishing messages to an SQS queue instead of making direct API calls to another service
A loosely coupled component is designed to interact with other components with minimal dependencies or knowledge about them, often communicating through well-defined interfaces or messaging systems. Option 'Publishing messages to an SQS queue instead of making direct API calls to another service' is an example of loose coupling because it uses a message queue as an intermediary, which reduces the dependencies between the sender and the receiver. The other options describe situations that create direct dependencies, characterizing tight coupling, which is less desirable for scalability and resilience.
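A small boto3 sketch of that pattern (the queue URL and payload are placeholders); the producer never needs to know which service consumes the message, how many consumers exist, or whether they are online:

```python
import json
import boto3

sqs = boto3.client("sqs")

# The producer only knows the queue URL, not the consumer behind it.
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/order-events",
    MessageBody=json.dumps({"orderId": "A-1001", "event": "PLACED"}),
)
```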
You have been given the task to test an AWS Lambda function locally, which is designed to process JSON data from an Amazon API Gateway proxy integration. The function expects to receive a payload with a nested JSON object in the body that contains project details. Which of the following test events correctly emulates the expected format of the incoming event from API Gateway?
{"httpMethod": "POST", "body": "{"project": {"name": "New Project", "description": "An upcoming project."}}", "path": "/projects", "requestContext": {"httpMethod": "POST", "path": "/prod/projects"}, "isBase64Encoded": false}
{"method": "POST", "body": {"project": {"name": "New Project", "description": "An upcoming project."}}, "path": "/projects"}
{"httpMethod": "POST", "body": "{'project': {'name': 'New Project', 'description': 'An upcoming project.'}}", "path": "/projects", "requestContext": , "isBase64Encoded": false}
{"httpMethod": "POST", "body": "{"project": {"name": "New Project", "description": "An upcoming project"}}", "path": "/projects"}
The correct answer is the one that adheres to the standard event structure that an AWS Lambda function expects from an API Gateway with proxy integration. This includes the correct use of 'httpMethod', 'body' with properly escaped JSON string, 'path', and 'requestContext'. The body must also be a string that is a serialized JSON object since the Lambda function will parse it as JSON. The correct option presents a nested JSON in a string format, properly serialized and escaped. Other options either present the body as an object instead of a string, improperly escape double quotes, or miss important property fields.
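A minimal local harness, assuming a hypothetical handler, shows why the body must be a serialized string: the function calls json.loads on it before it can reach the nested object:

```python
import json

def handler(event, context):
    # With proxy integration, event["body"] arrives as a *string*,
    # so it must be deserialized before the nested object is usable.
    payload = json.loads(event["body"])
    project = payload["project"]
    return {
        "statusCode": 200,
        "body": json.dumps({"received": project["name"]}),
    }

# Local check with the correctly escaped test event from the question:
event = {
    "httpMethod": "POST",
    "body": "{\"project\": {\"name\": \"New Project\", \"description\": \"An upcoming project.\"}}",
    "path": "/projects",
    "requestContext": {"httpMethod": "POST", "path": "/prod/projects"},
    "isBase64Encoded": False,
}
print(handler(event, None))  # {'statusCode': 200, 'body': '{"received": "New Project"}'}
```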
To verify the accuracy of a function tasked with handling file processing and database updates, which practice should a developer employ to simulate interactions with the storage and database services during automated tests?
Create various branches and aliases for the function to segregate operational environments during the testing phase.
Generate live instances of file storage and managed database services and execute test cases against them.
Incorporate a mocking framework to replicate the behavior of file storage and managed database services within the test suite.
Implement a strategy for stepping through the function code with a debugger to inspect the file processing and updates.
Utilizing mocking frameworks to create simulated versions of storage and database services enables the verification of application logic without affecting live resources. This method tests how the function would behave with different inputs while avoiding direct impacts on actual resources, which could lead to unintended data manipulation and unnecessary costs. Creating real instances for testing can be resource-intensive and is not as efficient for automated unit testing purposes. Versioning and branching strategies are more pertinent to deployment management and are not specific to automated testing itself. Debugging provides a way to examine code execution line by line, but it is a manual process rather than an automated testing practice.
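A brief sketch using Python's built-in unittest.mock; the function under test and its resource names are invented for illustration:

```python
from unittest import mock

# Placeholder function under test: reads a file from S3 and records it in DynamoDB.
def process_file(s3, table, bucket, key):
    data = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    table.put_item(Item={"pk": key, "size": len(data)})
    return len(data)

def test_process_file():
    # Stand-ins for the S3 client and DynamoDB table resource.
    fake_s3 = mock.Mock()
    fake_s3.get_object.return_value = {"Body": mock.Mock(read=lambda: b"hello")}
    fake_table = mock.Mock()

    assert process_file(fake_s3, fake_table, "bucket", "file.txt") == 5

    # Verify the database interaction happened exactly as expected,
    # without ever touching live AWS resources.
    fake_table.put_item.assert_called_once_with(Item={"pk": "file.txt", "size": 5})
```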
An organization's development team is preparing to roll out a serverless application that utilizes multiple cloud resources, including object storage, a NoSQL database, and serverless compute functions. The application must be able to read and write data to specific storage buckets and database tables. To comply with best security practices, how should you provision access for this application?
Construct a custom security profile for the application, restricting permissions exclusively to the operations required on designated storage buckets and database tables.
Generate an access key and secret key combination for the application, granting full management capabilities for all services to avoid potential disruptions.
Employ the root user's credentials for the application to ensure uninterrupted service access without having to manage multiple permission sets.
Deactivate explicit permission policies and deploy network-based controls to govern access to the necessary service resources.
Based on the principle of least privilege, the correct approach is to create a custom security profile (also known as an IAM role when implemented in AWS) for the application with the minimal set of permissions needed, which are those that allow reading from and writing to the specified storage buckets and database tables only. Granting full management capabilities across all services, as well as using the root user's credentials, would violate this principle by providing unnecessarily broad permissions and introduce significant security risks. Relying solely on network-based access controls is also incorrect as these are different types of security measures that do not manage the permissions for accessing cloud resources the way IAM roles or policies do.
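A hedged example of such a profile expressed as an IAM policy via boto3; every bucket, table, and account number below is a placeholder:

```python
import json
import boto3

iam = boto3.client("iam")

# Permissions are limited to the exact operations and resources the
# application needs -- nothing more.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::app-data-bucket/*",
        },
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:UpdateItem"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/AppTable",
        },
    ],
}

iam.create_policy(
    PolicyName="app-least-privilege",
    PolicyDocument=json.dumps(policy),
)
```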
What deployment strategy gradually shifts traffic from a previous version of an application to the latest version, enabling the ability to monitor the effects on a subset of users before full deployment?
Canary deployment
Resume-based deployment
Blue/green deployment
Rolling updates
A canary deployment strategy involves rolling out changes to a small subset of users to test the impact before deploying it to the entire infrastructure. This minimizes the risk as it allows for monitoring and quick rollback if necessary. Blue/green deployment involves switching traffic between two identical environments that only differ in the version of the application they are running. Rolling updates incrementally replace the previous version with a new version across all hosts. Resume-based deployment is not a recognized AWS deployment strategy.
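One way this looks in practice on AWS, sketched with boto3 and placeholder names, is weighted alias routing for a Lambda function:

```python
import boto3

lam = boto3.client("lambda")

# Send 10% of invocations through version 5 while 90% stay on the
# alias's current version; widen the weight once metrics look healthy,
# or remove it to roll back instantly.
lam.update_alias(
    FunctionName="checkout-handler",  # placeholder function
    Name="live",
    RoutingConfig={"AdditionalVersionWeights": {"5": 0.10}},
)
```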
A developer is working on enhancing the security of a serverless infrastructure where user authentication is handled by an OIDC-compliant external identity provider. Upon a user's successful sign-in, the external service issues a token. The developer needs to ensure that this token is validated before allowing access to the serverless function endpoint. Which approach should the developer implement to enforce token validation?
Apply a resource-based policy directly on the function to check for the presence of the token in the request.
Configure a role with specified permissions that authenticates users based on the provided token.
Utilize a Lambda function programmed to evaluate and verify the token before proceeding with the request.
Deploy client-side certificates to secure the endpoint and validate the incoming tokens.
The developer should implement a Lambda authorizer, which is a way to handle custom authorization logic before granting access to the serverless function endpoint. The Lambda authorizer can verify the validity of the token and determine if the request should be allowed or denied. This approach is particularly useful in serverless architectures where application components are loosely coupled and an external identity provider manages user authentication. This verification is performed within the AWS environment without making a round trip to the external identity provider. In contrast, IAM roles are for access management within AWS services and resources, not for validating tokens directly. Resource-based policies define permissions for AWS resources, but they do not provide a method for validating bearer tokens. Client-side certificates are used for mutual TLS (mTLS) authentication but do not apply to the scenario involving verification of tokens provided by an external identity provider.
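A simplified authorizer sketch in Python, assuming the PyJWT library is packaged with the function; a production version would verify the token's signature against the provider's published JWKS keys, as noted in the comments:

```python
import jwt  # PyJWT, shipped in the deployment package or a layer

def authorizer_handler(event, context):
    """TOKEN-type Lambda authorizer sketch: real code would verify the
    signature against the provider's JWKS keys plus the issuer and
    audience claims before allowing the request through."""
    token = event.get("authorizationToken", "").removeprefix("Bearer ")
    try:
        # Simplified for the sketch: decode without signature verification.
        claims = jwt.decode(token, options={"verify_signature": False})
        effect = "Allow"
    except jwt.InvalidTokenError:
        claims, effect = {}, "Deny"

    # Return an IAM policy that API Gateway enforces for this request.
    return {
        "principalId": claims.get("sub", "anonymous"),
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event["methodArn"],
            }],
        },
    }
```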