
Microsoft Azure Solutions Architect Expert Practice Test (AZ-305)

Use the form below to configure your Microsoft Azure Solutions Architect Expert Practice Test (AZ-305). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.

Questions
Number of questions in the practice test
Free users are limited to 20 questions; upgrade for unlimited questions
Seconds Per Question
Determines how long you have to finish the practice test
Exam Objectives
Which exam objectives should be included in the practice test

Microsoft Azure Solutions Architect Expert AZ-305 Information

The Microsoft Azure Solutions Architect Expert AZ-305 exam is a pivotal certification for professionals who design and implement solutions on Microsoft's cloud platform. This exam validates a candidate's expertise in translating business requirements into secure, scalable, and reliable Azure solutions. Aimed at individuals with advanced experience in IT operations, including networking, virtualization, and security, the AZ-305 certification demonstrates subject matter expertise in designing cloud and hybrid solutions. Success in this exam signifies that a professional can advise stakeholders and architect solutions that align with the Azure Well-Architected Framework and the Cloud Adoption Framework for Azure.

The AZ-305 exam evaluates a candidate's proficiency across four primary domains. These core areas include designing solutions for identity, governance, and monitoring, which accounts for 25-30% of the exam. Another significant portion, 30-35%, is dedicated to designing infrastructure solutions. The exam also assesses the ability to design data storage solutions (20-25%) and business continuity solutions (15-20%). This structure ensures that certified architects possess a comprehensive understanding of creating holistic cloud environments that address everything from identity management and data storage to disaster recovery and infrastructure deployment.

The Strategic Advantage of Practice Exams

A crucial component of preparing for the AZ-305 exam is leveraging practice tests. Taking practice exams offers a realistic simulation of the actual test environment, helping candidates become familiar with the question formats, which can include multiple-choice, multi-response, and scenario-based questions. This familiarity helps in developing effective time management skills, a critical factor for success during the timed exam. Furthermore, practice tests are an excellent tool for identifying knowledge gaps. By reviewing incorrect answers and understanding the reasoning behind the correct ones, candidates can focus their study efforts more effectively on weaker areas.

The benefits of using practice exams extend beyond technical preparation. Successfully navigating these tests can significantly boost a candidate's confidence. As performance improves with each practice test, anxiety about the actual exam can be reduced. Many platforms offer practice exams that replicate the look and feel of the real test, providing detailed explanations for both correct and incorrect answers. This active engagement with the material is more effective than passive reading and is a strategic approach to ensuring readiness for the complexities of the AZ-305 exam.

  • Free Microsoft Azure Solutions Architect Expert AZ-305 Practice Test
  • 20 Questions
  • Unlimited
  • Design identity, governance, and monitoring solutions
  • Design data storage solutions
  • Design business continuity solutions
  • Design infrastructure solutions
Question 1 of 20

You manage several Azure SQL Databases that run in the General Purpose service tier. Corporate policy requires the following for all database backups:

  • Backups must remain available if the primary Azure region suffers an outage.
  • Auditors must be able to recover the databases in the paired secondary region without the need to maintain a continuously running secondary database.
  • The solution should use the lowest-cost built-in capability and must not require creating or managing active geo-replicated databases.

Which configuration should you recommend to meet these requirements?

  • Enable long-term retention (LTR) backups in a secondary region.

  • Implement active geo-replication to a secondary database in the paired region.

  • Enable zone-redundant backup storage for each database.

  • Configure read-access geo-redundant (RA-GRS) backup storage for the databases.
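For context, the built-in backup capabilities this scenario hinges on can be sketched with the Azure CLI. Resource-group, server, and database names below are placeholders, and exact parameter names depend on your CLI version:

```shell
# Write point-in-time backups to geo-redundant (RA-GRS) storage so they
# survive a regional outage (placeholder resource names).
az sql db update \
  --resource-group rg-sql --server sql-contoso --name OrdersDb \
  --backup-storage-redundancy Geo

# Add a long-term retention policy: 12 weekly and 12 monthly backups.
az sql db ltr-policy set \
  --resource-group rg-sql --server sql-contoso --name OrdersDb \
  --weekly-retention P12W --monthly-retention P12M
```

Neither setting requires a running secondary database; restores in the paired region are performed on demand from the geo-replicated backup copies.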

Question 2 of 20

Contoso operates a public SaaS API that is deployed in active-active mode across three Azure regions. You must design a single entry point that: terminates TLS, supports path-based routing and session affinity, automatically directs each client to the lowest-latency healthy backend, and performs fail-over to another region within seconds if an outage occurs, without relying on client DNS cache expiration. Which Azure service best meets these requirements?

  • Azure cross-region Standard Load Balancer

  • Azure Application Gateway deployed in each region and distributed by Azure DNS weighted records

  • Azure Front Door Standard/Premium

  • Azure Traffic Manager with performance routing

Question 3 of 20

You manage 50 Azure subscriptions for a global company. The security team operates a third-party SIEM that ingests data from Azure Event Hubs. All Azure Activity logs and resource diagnostic logs must be streamed to the SIEM in near-real time and retained in Azure for at least 365 days for audit purposes. Operational effort must be kept to a minimum. Which solution should you recommend?

  • Assign an Azure Policy that configures diagnostic settings on all subscriptions to send logs to a centralized Log Analytics workspace (with 365-day retention) and to a shared Event Hub namespace used by the SIEM.

  • Create an Automation Account in each subscription that regularly exports logs to an Azure Storage account and then forwards the files to the SIEM over REST.

  • Deploy Azure Sentinel in every subscription and use Sentinel data connectors and playbooks to push collected logs to the SIEM.

  • Enable Continuous Export from Azure Monitor to an Azure Storage account and configure the SIEM to pull log files from the storage account.
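For reference, the diagnostic setting that a policy-driven approach would deploy per resource can be sketched as follows. All resource IDs and names are hypothetical, and the `--logs` category-group syntax requires a recent CLI version:

```shell
# One diagnostic setting sending all logs to both a central Log Analytics
# workspace (for 365-day retention) and an Event Hub consumed by the SIEM.
az monitor diagnostic-settings create \
  --name send-to-siem \
  --resource "/subscriptions/<sub-id>/resourceGroups/rg-app/providers/Microsoft.KeyVault/vaults/kv-contoso" \
  --workspace "/subscriptions/<sub-id>/resourceGroups/rg-logs/providers/Microsoft.OperationalInsights/workspaces/law-central" \
  --event-hub evh-siem \
  --event-hub-rule "/subscriptions/<sub-id>/resourceGroups/rg-logs/providers/Microsoft.EventHub/namespaces/evhns-siem/authorizationRules/RootManageSharedAccessKey" \
  --logs '[{"categoryGroup":"allLogs","enabled":true}]'
```

At scale, an Azure Policy with a deployIfNotExists effect would create this setting automatically on every in-scope resource instead of running the command per resource.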

Question 4 of 20

Developers commit code to Azure Repos Git. Web APIs are deployed to multiple App Service instances in different Azure subscriptions. You must design an automated solution that builds on every commit, runs tests, deploys to staging slots, and pauses for manual approval before promoting to production. The deployment definition must reside with the code, and you want to avoid introducing additional services. Which Azure service should you recommend to implement the pipeline?

  • Azure DevOps multi-stage YAML pipelines

  • GitHub Actions workflows with environment protection rules

  • Azure Automation runbooks triggered by webhooks

  • Azure App Service Deployment Center with local Git integration


Question 5 of 20

Contoso Ltd. employs 30 first-line support engineers who must be able to restart any virtual machine in the company's three Azure subscriptions during their 8-hour shift. Security policy requires that:

  • Engineers receive only the minimum permissions necessary.
  • Access must expire automatically at the end of each shift.
  • A shift lead must approve the access request before it is granted.

You need to recommend an authorization solution that meets the requirements while minimizing administrative effort. What should you recommend?

  • Add the engineers to the built-in Contributor role at each subscription scope and configure Azure AD Access Reviews to run once per month.

  • Create an Azure Automation runbook that restarts virtual machines and grant the engineers permission to invoke the runbook through an Azure DevOps pipeline.

  • Use Azure AD PIM to make each engineer eligible for the built-in Virtual Machine Contributor role at the resource-group level with no approval workflow and a permanent assignment.

  • Create a custom Azure RBAC role that includes only the Microsoft.Compute/virtualMachines/restart/action permission, onboard each subscription to Azure AD Privileged Identity Management, and assign the role as eligible directly to every engineer at the subscription scope. Configure PIM to require shift-lead approval and set the activation duration to eight hours.
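As background, a least-privilege custom role of the kind described can be defined with the Azure CLI. Role name and subscription IDs are placeholders; listing VMs in the portal would additionally require a read action:

```shell
# Custom role containing only the restart action (hypothetical names).
cat > vm-restart-role.json <<'EOF'
{
  "Name": "Virtual Machine Restart Operator",
  "Description": "Can restart virtual machines only.",
  "Actions": [
    "Microsoft.Compute/virtualMachines/restart/action"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/<sub-id-1>",
    "/subscriptions/<sub-id-2>",
    "/subscriptions/<sub-id-3>"
  ]
}
EOF
az role definition create --role-definition @vm-restart-role.json
```

The approval workflow and eight-hour activation window would then be configured on the eligible assignment in Privileged Identity Management, which has no direct CLI equivalent and is typically set in the portal.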

Question 6 of 20

Contoso Ltd. has an on-premises Active Directory forest with 10,000 users. The company will adopt several Azure and SaaS applications that support SAML 2.0 or OAuth 2.0. Security requirements: users must sign in with their on-premises domain credentials; multi-factor authentication (MFA) must be enforced for all cloud logons; no user password hashes may be stored in Azure AD. You must recommend an authentication solution that meets the requirements while keeping additional on-premises infrastructure to a minimum. Which solution should you recommend?

  • Deploy an Active Directory Federation Services (AD FS) farm and configure federated authentication with Azure AD Multi-Factor Authentication Server.

  • Implement Azure AD Pass-through Authentication with Seamless Single Sign-On and enable Azure AD Multi-Factor Authentication.

  • Configure Azure AD Password Hash Synchronization with Seamless Single Sign-On and Conditional Access to enforce Multi-Factor Authentication.

  • Create an Azure AD B2C tenant and integrate the on-premises Active Directory as an identity provider by using custom policies.

Question 7 of 20

You are designing a compute platform for a containerized background worker that processes orders placed in an Azure Service Bus queue. The job runs a custom Docker image that includes proprietary machine-learning libraries larger than the 250-MB limit for Azure Functions code packages. During most weekdays the queue is empty, but from Friday night through Sunday morning it can exceed 50,000 messages and must be drained within four hours. The operations team wants the solution to scale automatically down to zero instances when idle and to require the least possible infrastructure management effort.

Which Azure compute service should you recommend?

  • Azure Kubernetes Service with the cluster autoscaler enabled

  • Azure Functions in a Premium plan running the container image

  • Azure Container Apps with KEDA-based autoscaling on the Service Bus queue

  • Azure Container Instances launched by an Azure Logic App each time messages arrive
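For context, scale-to-zero queue processing in Azure Container Apps is driven by a KEDA scale rule. A sketch with placeholder names follows; it assumes the `containerapp` CLI extension is installed and the environment already exists:

```shell
# Worker app that scales 0..30 replicas based on Service Bus queue depth.
az containerapp create \
  --name order-worker --resource-group rg-workers \
  --environment cae-workers \
  --image contoso.azurecr.io/order-worker:latest \
  --secrets sb-conn="<service-bus-connection-string>" \
  --min-replicas 0 --max-replicas 30 \
  --scale-rule-name sb-queue \
  --scale-rule-type azure-servicebus \
  --scale-rule-metadata queueName=orders messageCount=100 \
  --scale-rule-auth connection=sb-conn
```

With `--min-replicas 0`, the app consumes no compute while the queue is empty; KEDA adds replicas as the weekend backlog grows.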

Question 8 of 20

You are planning the assessment phase for migrating 450 VMware vSphere virtual machines from an on-premises datacenter to Microsoft Azure. The migration plan must discover all virtual machines without installing software inside each guest, collect 30 days of performance data to recommend right-sized target SKUs, map inter-VM network dependencies without requiring in-guest agents, and produce a total cost of ownership (TCO) comparison between the current environment and Azure. Which on-premises component should you deploy first to satisfy all these requirements?

  • Run the Microsoft Assessment and Planning (MAP) Toolkit

  • Deploy the Azure Migrate appliance configured for Discovery and Assessment

  • Install the Azure Site Recovery Mobility service on each virtual machine

  • Onboard the servers to Azure Arc and install the Azure Monitor Dependency agent

Question 9 of 20

Your organization must safeguard customer-owned data-encryption keys used by several Azure services. The solution must meet the following requirements:

  • Keys must reside in a FIPS 140-2 Level 3 validated hardware security module (HSM).
  • The HSM must run in a dedicated single-tenant boundary that Microsoft manages for you.
  • Administrators must assign granular permissions by using Azure role-based access control (RBAC).

Which Azure service should you recommend for storing the keys?

  • Azure Key Vault Managed HSM

  • Azure Dedicated Hardware Security Module (HSM) service

  • Azure Key Vault Premium tier

  • Azure Confidential Ledger

Question 10 of 20

A mission-critical Linux virtual machine (VM) runs an internal line-of-business API for a European customer. The VM (Standard D8s v4, premium SSD) is currently deployed as a single instance in the West Europe region.
The solution must meet the following requirements:

  • Remain fully operational if an entire Azure datacenter inside West Europe becomes unavailable.
  • Provide a VM connectivity SLA of at least 99.99 percent.
  • Keep all customer data inside the West Europe region.
  • Minimize additional licensing and ongoing management overhead.

Which deployment option should you recommend?

  • Enable Azure Site Recovery to replicate the VM to another availability zone in West Europe and configure automatic failover.

  • Redeploy the workload as a virtual machine scale set with a minimum of two instances distributed across West Europe availability zones and fronted by an Azure Standard Load Balancer.

  • Add the VM to an availability set configured with two fault domains and two update domains in the existing datacenter.

  • Move the VM to an Azure Dedicated Host group that spans two fault domains within West Europe.

Question 11 of 20

A retailer stores 5 TB of semi-structured catalog data in JSON format and serves it to web and mobile apps worldwide. The solution must provide the following:

  • 99.999 percent read and write availability
  • Automatic regional failover in less than one minute if a region is unavailable
  • Recovery point objective (RPO) of under five seconds during regional outages
  • Ability to accept writes from any region

Which Azure data-platform configuration meets these requirements with the least administrative effort?

  • Deploy Azure Cosmos DB across multiple Azure regions with multi-region writes enabled and session consistency.

  • Store the data in an Azure SQL Database Hyperscale instance and configure active geo-replication to a secondary region.

  • Deploy Azure Cosmos DB in two regions with a single write region and automatic failover.

  • Store the data in an Azure Storage account configured for read-access geo-zone-redundant storage (RA-GZRS).
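For reference, a globally distributed database account with writes accepted in every region can be sketched with the Azure CLI. The account name and region choices are illustrative:

```shell
# Cosmos DB account with three regions, multi-region writes, and session
# consistency (hypothetical account name).
az cosmosdb create \
  --name cosmos-catalog --resource-group rg-data \
  --default-consistency-level Session \
  --enable-multiple-write-locations true \
  --locations regionName=westeurope failoverPriority=0 isZoneRedundant=false \
  --locations regionName=eastus failoverPriority=1 isZoneRedundant=false \
  --locations regionName=southeastasia failoverPriority=2 isZoneRedundant=false
```

Note that the 99.999 percent read and write SLA applies only when multiple write regions are enabled; a single-write-region account carries a lower write SLA.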

Question 12 of 20

Your company is building a multi-tenant SaaS solution hosted in Azure. Each tenant receives its own operational database. You expect about 250 small databases that normally consume roughly 1 DTU but can spike to 60 DTUs for a few hours each month. Management wants to minimize overall compute cost without manual intervention while keeping platform maintenance low. Which Azure relational database deployment option should you recommend?

  • Create an Azure SQL Database elastic pool sized for the combined peak demand and place all tenant databases in the pool.

  • Deploy each tenant database as an individual Azure SQL Database in the serverless compute tier with auto-pause enabled.

  • Provision an Azure SQL Managed Instance in the General Purpose tier and host all tenant databases within the instance.

  • Deploy all tenant data in a single Azure SQL Database at the Business Critical tier and use workload management to enforce per-tenant limits.

Question 13 of 20

Your company hosts an on-premises ASP.NET Core web API that must be consumed by several external partner organizations. The partners already authenticate with their own Azure AD tenants. You must expose the API while meeting the following requirements:

  • Authenticate and authorize users by using Azure AD groups.
  • Enforce Azure AD Conditional Access policies for multifactor authentication.
  • Avoid opening any inbound ports through the corporate firewall or adding new perimeter-network infrastructure.

You need to recommend the simplest Azure-based approach.

Which solution should you recommend?

  • Set up Active Directory Federation Services (AD FS) with Web Application Proxy in a perimeter network and federate the partner Azure AD tenants.

  • Deploy Azure AD Domain Services in Azure, join the API server to the managed domain, and enable Azure AD Kerberos authentication.

  • Establish a site-to-site VPN to Azure and publish the API behind an internal Load Balancer fronted by Azure Application Gateway.

  • Install an Azure AD Application Proxy connector on the on-premises network and publish the API through Azure AD Application Proxy.

Question 14 of 20

A retail company runs a 500 GB mission-critical SQL Server 2019 database on-premises. They plan to migrate it to Azure with minimal code changes. The solution must meet these requirements:

  • The platform must handle operating-system and database patching and retain automated backups for 30 days.
  • Provide synchronous high availability across at least two availability zones in the same region.
  • Deliver predictable read/write latency below 5 milliseconds.

Which Azure service and service tier should you recommend?

  • Azure SQL Database Hyperscale tier

  • Azure Virtual Machine running SQL Server 2019 Enterprise with an Always On availability group

  • Azure SQL Managed Instance in the Business Critical tier

  • Azure SQL Database single database in the Premium tier

Question 15 of 20

A financial-services company plans to migrate an on-premises risk-calculation engine to Azure. The solution runs only on Windows Server 2022 and needs 5-500 identical VM instances that must grow or shrink automatically according to CPU utilization. All instances must be patched by replacing them with the latest version of a single golden image, and they must be distributed across three availability zones to meet a 99.99 percent SLA while keeping administrative effort low. Which Azure service meets all the requirements?

  • Provision Windows Server 2022 VMs on Azure Dedicated Hosts distributed across zones and use Azure Functions to start and stop VMs on demand.

  • Use Azure Batch with a custom Windows Server 2022 node image to run the workload and rely on built-in pool autoscaling.

  • Deploy a Virtual Machine Scale Set that uses a versioned Windows Server 2022 image from an Azure Compute Gallery and spans three Availability Zones.

  • Create an Availability Set of Windows Server 2022 VMs built from a managed image and configure Azure Automation runbooks to add or remove VMs.
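As background, a zone-spanning scale set built from a versioned golden image, plus CPU-based autoscale, can be sketched as follows. All names, the image ID, and the SKU are placeholders, and Windows deployments will also prompt for admin credentials:

```shell
# Scale set from a versioned Azure Compute Gallery image, spread across
# three availability zones (placeholder IDs and names).
az vmss create \
  --name vmss-risk --resource-group rg-risk \
  --image "/subscriptions/<sub-id>/resourceGroups/rg-images/providers/Microsoft.Compute/galleries/galGolden/images/win2022-risk/versions/1.2.0" \
  --zones 1 2 3 \
  --instance-count 5 \
  --vm-sku Standard_D4s_v5 \
  --admin-username azureuser

# Autoscale between 5 and 500 instances on CPU utilization.
az monitor autoscale create \
  --resource-group rg-risk --name autoscale-risk \
  --resource vmss-risk --resource-type Microsoft.Compute/virtualMachineScaleSets \
  --min-count 5 --max-count 500 --count 5

az monitor autoscale rule create \
  --resource-group rg-risk --autoscale-name autoscale-risk \
  --condition "Percentage CPU > 70 avg 5m" --scale out 10
```

Patching by image replacement then amounts to publishing a new gallery image version and rolling the scale set to it.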

Question 16 of 20

Your company hosts an e-commerce application on virtual machines deployed in two Azure regions in an active-active pattern. You must publish the site through one globally reachable hostname that provides TLS/SSL termination close to users, URL path-based routing to different microservice endpoints, automatic failover to the secondary region if the primary becomes unavailable, and centralized protection against common web exploits such as the OWASP Top 10. Which Azure networking service should you include in the solution design to satisfy all of these requirements?

  • Expose the application through Azure Firewall instances advertising public IP prefixes over ExpressRoute.

  • Deploy Azure Traffic Manager in priority mode and point it to regional Azure Load Balancer public IP addresses.

  • Deploy Azure Front Door and apply a Web Application Firewall (WAF) policy to it.

  • Place an Azure Application Gateway v2 in each region and use a cross-region Azure Load Balancer to distribute traffic.

Question 17 of 20

Your company runs several microservices on Azure Kubernetes Service (AKS). Each service exposes its own internal REST endpoint. A new B2B program requires that you:

  • Publish a single public HTTPS endpoint for partners.
  • Enforce subscription keys, Azure AD authentication, and per-partner call quotas.
  • Perform lightweight request/response transformations and response caching without modifying the microservices.

Which Azure service should you recommend to meet all of these requirements with minimum redevelopment effort?

  • Create an Azure Application Gateway with a Web Application Firewall to route partner traffic.

  • Implement an Azure Functions HTTP-triggered proxy that forwards requests to each microservice.

  • Publish each microservice through Azure Service Bus topics and let partners subscribe.

  • Deploy Azure API Management in front of the AKS services and expose a unified API.

Question 18 of 20

Your company ingests two data streams from a consumer IoT product. Devices send about 5 GB/hour of JSON telemetry that dashboards must query for the last seven days with sub-100 ms latency and allow flexible schema changes. Devices also upload 2 MB JPEG images that are accessed often for 30 days, seldom after, but must be retained for five years. To meet requirements at the lowest cost and administration effort, which Azure storage combination should you recommend?

  • Azure Cache for Redis to store telemetry and zone-redundant Premium SSD managed disks for images

  • Azure SQL Database Hyperscale for telemetry and Azure Files with the Cool access tier for images

  • Azure Cosmos DB (NoSQL) with autoscale throughput for telemetry, and Azure Blob Storage with lifecycle rules to move images from the Hot tier to Cool after 30 days and to Archive after 180 days

  • Azure Data Lake Storage Gen2 to store both telemetry and images in a single storage account with hierarchical namespace enabled
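For context, tiering blobs by age is handled by a storage lifecycle management policy rather than by application code. A sketch with a hypothetical account name and container prefix:

```shell
# Lifecycle policy: Hot -> Cool after 30 days, -> Archive after 180 days,
# delete after 5 years (1825 days). Account and prefix are placeholders.
cat > lifecycle.json <<'EOF'
{
  "rules": [
    {
      "enabled": true,
      "name": "tier-images",
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": ["blockBlob"], "prefixMatch": ["images/"] },
        "actions": {
          "baseBlob": {
            "tierToCool":    { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 180 },
            "delete":        { "daysAfterModificationGreaterThan": 1825 }
          }
        }
      }
    }
  ]
}
EOF
az storage account management-policy create \
  --account-name stcontosoimages --resource-group rg-data \
  --policy @lifecycle.json
```

The policy runs automatically in the background, so no scheduled jobs or manual tier changes are needed to meet the 30-day and five-year requirements.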

Question 19 of 20

An insurance company runs a mission-critical policy administration system on SQL Server 2017 Enterprise Edition. The database uses SQL Agent jobs, cross-database queries, SQL CLR procedures, and Service Broker messaging. To reduce operational overhead, the company will migrate to Azure, requiring automatic backups, built-in high availability, and minimal code changes in a platform as a service (PaaS) model. Which Azure data service should you recommend?

  • Azure SQL Database elastic pool in the Premium tier

  • Azure SQL Database single database in the Business Critical tier

  • Azure SQL Managed Instance in the General Purpose tier

  • SQL Server 2022 on an Azure Virtual Machine protected by Azure Backup

Question 20 of 20

Contoso Ltd. plans to consolidate data from several on-premises Oracle and SAP databases and from the Salesforce SaaS application into an Azure Data Lake Storage Gen2 account. The data must be loaded daily, enriched with code-free transformations, and then written to an Azure Synapse Analytics dedicated SQL pool. The solution must minimize infrastructure management, provide a large library of built-in connectors, and allow visual pipeline authoring, scheduling, and monitoring. Which Azure service should you recommend as the primary data integration and orchestration layer?

  • Azure Event Grid

  • Azure Databricks

  • Azure Data Factory

  • Azure Logic Apps