
Microsoft Azure Solutions Architect Expert Practice Test (AZ-305)

Use the form below to configure your Microsoft Azure Solutions Architect Expert Practice Test (AZ-305). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.

Questions
Number of questions in the practice test
Free users are limited to 20 questions; upgrade for unlimited questions
Seconds Per Question
Determines how long you have to finish the practice test
Exam Objectives
Which exam objectives should be included in the practice test

Microsoft Azure Solutions Architect Expert AZ-305 Information

The Microsoft Azure Solutions Architect Expert AZ-305 exam is a pivotal certification for professionals who design and implement solutions on Microsoft's cloud platform. This exam validates a candidate's expertise in translating business requirements into secure, scalable, and reliable Azure solutions. Aimed at individuals with advanced experience in IT operations, including networking, virtualization, and security, the AZ-305 certification demonstrates subject matter expertise in designing cloud and hybrid solutions. Success in this exam signifies that a professional can advise stakeholders and architect solutions that align with the Azure Well-Architected Framework and the Cloud Adoption Framework for Azure.

The AZ-305 exam evaluates a candidate's proficiency across four primary domains. These core areas include designing solutions for identity, governance, and monitoring, which accounts for 25-30% of the exam. Another significant portion, 30-35%, is dedicated to designing infrastructure solutions. The exam also assesses the ability to design data storage solutions (20-25%) and business continuity solutions (15-20%). This structure ensures that certified architects possess a comprehensive understanding of creating holistic cloud environments that address everything from identity management and data storage to disaster recovery and infrastructure deployment.

The Strategic Advantage of Practice Exams

A crucial component of preparing for the AZ-305 exam is leveraging practice tests. Taking practice exams offers a realistic simulation of the actual test environment, helping candidates become familiar with the question formats, which can include multiple-choice, multi-response, and scenario-based questions. This familiarity helps in developing effective time management skills, a critical factor for success during the timed exam. Furthermore, practice tests are an excellent tool for identifying knowledge gaps. By reviewing incorrect answers and understanding the reasoning behind the correct ones, candidates can focus their study efforts more effectively on weaker areas.

The benefits of using practice exams extend beyond technical preparation. Successfully navigating these tests can significantly boost a candidate's confidence. As performance improves with each practice test, anxiety about the actual exam can be reduced. Many platforms offer practice exams that replicate the look and feel of the real test, providing detailed explanations for both correct and incorrect answers. This active engagement with the material is more effective than passive reading and is a strategic approach to ensuring readiness for the complexities of the AZ-305 exam.

  • Free Microsoft Azure Solutions Architect Expert AZ-305 Practice Test
  • 20 Questions
  • Unlimited time
  • Exam objectives covered:
    • Design identity, governance, and monitoring solutions
    • Design data storage solutions
    • Design business continuity solutions
    • Design infrastructure solutions

Free Preview

This test is a free preview; no account is required.
Subscribe to unlock all content, keep track of your scores, and access AI features!

Question 1 of 20

Your company runs several Azure subscriptions that host production App Service web apps. A group of support engineers must be able to restart any web app during an incident. They must not have permissions to change configuration or deploy code. Access must be requested on demand, limited to two hours per activation, and all activations must be auditable. Which authorization approach should you recommend?

  • Create a custom Azure RBAC role that contains only the restart and stop actions for Microsoft.Web sites and assign it to the support engineers as an eligible assignment by using Azure AD Privileged Identity Management with a two-hour maximum activation.

  • Permanently assign the built-in Website Contributor role to the support engineers at the subscription scope.

  • Generate a user-delegation SAS token for each web app's deployment slot and give the tokens to the support engineers.

  • Create and assign an Azure Policy initiative that allows the Restart action on App Service resources.
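For orientation on the custom-role option: an Azure custom role is a JSON document whose Actions array lists the exact operations permitted, and PIM eligibility with the two-hour activation window is layered on top of the role assignment. A minimal sketch in Python, assuming placeholder subscription IDs, of a definition that `az role definition create --role-definition @role.json` would accept:

```python
import json

# Hypothetical custom role limited to restarting/stopping/starting web apps.
# Subscription IDs in AssignableScopes are placeholders.
web_app_operator_role = {
    "Name": "Web App Restart Operator",
    "Description": "Can restart, stop, and start App Service web apps only.",
    "Actions": [
        "Microsoft.Web/sites/read",            # list and view the web apps
        "Microsoft.Web/sites/restart/action",  # restart during an incident
        "Microsoft.Web/sites/stop/action",
        "Microsoft.Web/sites/start/action",
    ],
    "NotActions": [],
    "AssignableScopes": [
        "/subscriptions/00000000-0000-0000-0000-000000000000",
        "/subscriptions/11111111-1111-1111-1111-111111111111",
    ],
}

# Write the file that `az role definition create --role-definition @role.json` consumes.
with open("role.json", "w") as f:
    json.dump(web_app_operator_role, f, indent=2)
```

Because the role contains no write or deploy actions, activating it through PIM grants restart capability without configuration or code-deployment rights, and every activation is recorded in the PIM audit log.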

Question 2 of 20

You are designing a new relational data tier for a multi-tenant SaaS application running in Azure. The workload is read-heavy, experiences unpredictable spikes, and is forecast to grow from 1 TB today to more than 15 TB over the next 18 months. The solution must allow rapid, non-blocking compute scale-out for reporting queries and must let you increase storage capacity on demand without lengthy migrations. Which Azure SQL deployment option meets the requirements?

  • Deploy a single Azure SQL Database in the Hyperscale service tier.

  • Deploy a single Azure SQL Database in the Premium tier and configure active geo-replication.

  • Place the database in an Azure SQL elastic pool using the General Purpose serverless tier.

  • Use an Azure SQL Managed Instance in the Business Critical tier with read-scale-out enabled.
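As background for the Hyperscale option: a Hyperscale database is created like any single database, just with an HS-series SKU; storage grows automatically (up to 100 TB) and read replicas can be added without blocking the primary. A rough sketch using the azure-mgmt-sql package, with placeholder names, passing the request body as a plain dict:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

# Placeholder subscription, resource group, and server names.
sql = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = sql.databases.begin_create_or_update(
    resource_group_name="rg-saas",
    server_name="sql-saas-prod",
    database_name="tenantdb",
    parameters={
        "location": "eastus",
        # Hyperscale SKU: 8 vCores on Gen5 hardware.
        "sku": {"name": "HS_Gen5", "tier": "Hyperscale", "capacity": 8},
        "read_scale": "Enabled",               # route read-intent traffic to replicas
        "high_availability_replica_count": 1,  # replicas can be scaled out later
    },
)
database = poller.result()  # long-running operation
```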

Question 3 of 20

Contoso works with several external design firms. Guest users from these firms must be able to submit a single request that grants them access to multiple Microsoft Teams, SharePoint Online sites, and Azure AD groups for each project. Access should expire automatically 60 days after approval unless extended, and project owners must receive recurring prompts to review and confirm each guest's continued need. You need a solution that meets these requirements while minimizing ongoing administrative effort. Which Azure AD capability should you choose?

  • Configure Azure AD entitlement management to publish an access package containing the required groups, Teams, and sites.

  • Implement Azure AD B2C user flows so external users can sign up and receive access to needed resources.

  • Enable Azure AD Privileged Identity Management and assign eligible roles to external users for the required resources.

  • Create Conditional Access policies that restrict guest user sign-in locations and require multifactor authentication.
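For context on the entitlement-management option: access packages are created through the Microsoft Graph entitlement management API, and the 60-day expiration plus recurring reviews are configured on the package's assignment policy rather than on the package itself. A rough sketch, assuming a placeholder catalog ID and that the request body matches the current v1.0 contract (worth verifying against the Graph documentation):

```python
import requests
from azure.identity import DefaultAzureCredential

# Acquire a Graph token; the signed-in principal needs entitlement
# management permissions (e.g., EntitlementManagement.ReadWrite.All).
token = DefaultAzureCredential().get_token("https://graph.microsoft.com/.default")
headers = {"Authorization": f"Bearer {token.token}"}

# Create an access package in an existing catalog (catalog ID is a placeholder).
resp = requests.post(
    "https://graph.microsoft.com/v1.0/identityGovernance/entitlementManagement/accessPackages",
    headers=headers,
    json={
        "displayName": "External design firm - project access",
        "description": "Teams, SharePoint sites, and groups for one project",
        "catalog": {"id": "<catalog-id>"},
    },
)
resp.raise_for_status()
print("Access package id:", resp.json()["id"])
```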

Question 4 of 20

Your company runs an e-commerce platform on an on-premises SQL Server instance. You plan to migrate the 9 TB database to Azure. Forecasts show it will exceed 12 TB within a year. The workload experiences unpredictable spikes during promotions and must be able to double compute capacity within minutes while remaining fully managed. You also need point-in-time restore for at least seven days. Which Azure SQL deployment option and service tier should you recommend?

  • SQL Server on an Azure Virtual Machine with Premium SSD storage

  • Azure SQL Database elastic pool in the General Purpose tier

  • Azure SQL Database single database in the Hyperscale service tier

  • Azure SQL Managed Instance in the Business Critical tier

Question 5 of 20

Contoso operates several Azure subscriptions for different business units. You must design a logging and monitoring strategy with the following requirements:

  • Consolidate platform and resource diagnostic logs from all subscriptions so that administrators can run cross-resource Kusto queries.
  • Retain all log data for seven years to meet regulatory compliance.
  • Keep costs low; interactive querying is required only for the most recent 90 days, and historical data can tolerate up to 12 hours of retrieval latency.

Which solution should you recommend?

  • Create a Log Analytics workspace in each subscription and set the workspace retention to 2,555 days (seven years).

  • Deploy a single Log Analytics workspace in a dedicated monitoring subscription; send all diagnostic and activity logs to it; keep 90 days of data in the default retention and move older data to the workspace archive tier.

  • Configure diagnostic settings on every resource to stream logs to an Event Hub and ingest them into an Azure Data Explorer cluster set to seven-year retention.

  • Export activity and diagnostic logs from each subscription directly to an Azure Storage account with a seven-year lifecycle policy and rely on Storage analytics for ad-hoc queries.
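Background for the retention options: Log Analytics distinguishes per-table interactive retention from total retention; data older than the interactive window moves to the lower-cost long-term tier, where retrieval runs as a search job that can take hours rather than seconds. A sketch against the workspace Tables REST API, with placeholder names (2,556 days is the documented seven-year value):

```python
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")

# Placeholder subscription, resource group, workspace, and table names.
url = (
    "https://management.azure.com/subscriptions/<sub-id>"
    "/resourceGroups/rg-monitoring/providers/Microsoft.OperationalInsights"
    "/workspaces/law-central/tables/AzureDiagnostics"
    "?api-version=2022-10-01"
)
resp = requests.patch(
    url,
    headers={"Authorization": f"Bearer {token.token}"},
    json={"properties": {
        "retentionInDays": 90,         # interactively queryable window
        "totalRetentionInDays": 2556,  # seven years including long-term tier
    }},
)
resp.raise_for_status()
```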

Question 6 of 20

Contoso stores approximately 3 PB of high-resolution images in Azure. New images are uploaded daily, but after the first 30 days fewer than 1 percent are ever read again. Compliance requires the data to stay in a single region; no cross-region replication is necessary. The solution must maximize durability while keeping overall storage cost as low as possible and still allow immediate reads when an occasional file is requested. Which Azure storage configuration should you recommend?

  • Provision a Premium block blob storage account with zone-redundant storage (ZRS) and keep the blobs in the Hot tier.

  • Create a General Purpose v2 storage account with locally redundant storage (LRS) and keep the blobs in the Hot access tier.

  • Create a General Purpose v2 storage account with locally redundant storage (LRS) and set the blobs to the Cool access tier.

  • Create a General Purpose v2 storage account with geo-redundant storage (GRS) and move the blobs to the Archive access tier.
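For context on the access tiers: Cool-tier blobs remain immediately readable (Archive requires rehydration that can take hours), and a lifecycle management rule with a tierToCool action after 30 days would automate the transition. A minimal sketch of re-tiering a single existing blob with azure-storage-blob; the account and blob names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

# Placeholder account URL, container, and blob name.
blob = BlobClient(
    account_url="https://<account>.blob.core.windows.net",
    container_name="images",
    blob_name="2024/06/photo-0001.tif",
    credential=DefaultAzureCredential(),
)

# Move the blob to Cool: lower per-GB price, reads still served immediately.
blob.set_standard_blob_tier("Cool")
```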

Question 7 of 20

Your company is modernizing an on-premises .NET Core REST API that currently runs on Windows Server virtual machines. The team will containerize the application and deploy it to Azure. The future workload must meet these requirements:

  • Accept HTTP requests from the internet.
  • Scale automatically from zero to thousands of concurrent requests.
  • Require no virtual-machine management by the operations team.
  • Support individual request executions that can run for up to one hour.

Which Azure compute service should you recommend?

  • Azure Container Apps

  • Azure Kubernetes Service with the cluster autoscaler enabled

  • Azure App Service on Linux with autoscale

  • Azure Functions deployed on a Consumption plan

Question 8 of 20

Your company operates 20 Azure subscriptions for different business units. The security team needs every platform diagnostic log (including the Activity Log, AKS control-plane logs, and Key Vault audit events) streamed to a central Log Analytics workspace in a dedicated security subscription. The same logs must also be forwarded in near real time to a third-party SIEM that ingests data from an Event Hub. Minimizing per-subscription configuration effort is a priority. Which solution should you recommend?

  • Install the Azure Monitor agent on every resource and configure data collection rules that forward logs simultaneously to the workspace and the Event Hub.

  • Create an Azure Policy initiative that deploys diagnostic settings for every supported resource type, routing each log stream to both the central Log Analytics workspace and an Event Hub namespace in the security subscription.

  • Deploy Azure Monitor private-link scoped data collection endpoints that push logs to the workspace and stream a copy to the SIEM.

  • Send all logs only to the central Log Analytics workspace and enable continuous export from that workspace to the Event Hub used by the SIEM.
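Background for the diagnostic-settings option: a single diagnostic setting can target a Log Analytics workspace and an Event Hub at the same time, and that setting is exactly what a deployIfNotExists policy would stamp onto every supported resource. A sketch for one Key Vault using azure-mgmt-monitor; all IDs are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

monitor = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

# One setting, two destinations: the central workspace and the SIEM's Event Hub.
monitor.diagnostic_settings.create_or_update(
    resource_uri=(
        "/subscriptions/<sub-id>/resourceGroups/rg-app"
        "/providers/Microsoft.KeyVault/vaults/kv-prod"
    ),
    name="central-logging",
    parameters={
        "workspace_id": "<central-log-analytics-workspace-resource-id>",
        "event_hub_authorization_rule_id": "<event-hub-namespace-auth-rule-id>",
        "event_hub_name": "siem-stream",
        "logs": [{"category": "AuditEvent", "enabled": True}],  # Key Vault audit events
    },
)
```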

Question 9 of 20

Contoso will migrate all workloads to Azure within six months and wants to decommission its on-premises Active Directory Domain Services (AD DS) as soon as possible. Identity requirements are:

  • Provide single sign-on for new Azure-hosted applications that use SAML or OpenID Connect.
  • Keep several legacy line-of-business VMs that authenticate by using LDAP and NTLM.
  • Let external partners who already use Microsoft Entra ID access selected resources with minimal overhead.
  • Minimize ongoing infrastructure administration.

Which solution should you recommend?

  • Create an Azure AD B2C tenant, migrate internal identities into it, and federate partner tenants through custom identity providers.

  • Create a cloud-only Microsoft Entra ID tenant, enable Azure AD Domain Services for the virtual network, and use Azure AD B2B guest collaboration for partner access.

  • Keep the on-premises AD DS environment, synchronize it to Azure AD with password hash sync, and use Conditional Access policies for partner users.

  • Deploy domain controllers on Azure IaaS VMs, configure Active Directory Federation Services for single sign-on, and invite partners through AD FS claims.

Question 10 of 20

Contoso Ltd. has an on-premises Active Directory forest with 10,000 users. The company will adopt several Azure and SaaS applications that support SAML 2.0 or OAuth 2.0. Security requirements: users must sign in with their on-premises domain credentials; multi-factor authentication (MFA) must be enforced for all cloud logons; no user password hashes may be stored in Azure AD. You must recommend an authentication solution that meets the requirements while keeping additional on-premises infrastructure to a minimum. Which solution should you recommend?

  • Create an Azure AD B2C tenant and integrate the on-premises Active Directory as an identity provider by using custom policies.

  • Configure Azure AD Password Hash Synchronization with Seamless Single Sign-On and Conditional Access to enforce Multi-Factor Authentication.

  • Implement Azure AD Pass-through Authentication with Seamless Single Sign-On and enable Azure AD Multi-Factor Authentication.

  • Deploy an Active Directory Federation Services (AD FS) farm and configure federated authentication with Azure AD Multi-Factor Authentication Server.

Question 11 of 20

You are designing authentication for several Azure Kubernetes Service (AKS) clusters that will be deployed in three different Azure subscriptions. The clusters will host identical micro-services that must read secrets from a central Azure Key Vault and send diagnostics to an Event Hub namespace. The solution must:

  • Eliminate hard-coded or file-based credentials in the containers.
  • Allow the same identity to be shared by workloads running in every cluster and subscription.
  • Ensure that credential rotation never requires redeploying the applications.

Which approach should you recommend?

  • Create an Azure AD application with a client secret and store the secret in Kubernetes Secrets mounted into the pods.

  • Enable a system-assigned managed identity on every AKS cluster and grant each identity the required access to Key Vault and Event Hub.

  • Configure an Azure AD application that uses a certificate stored on each cluster node and rotate the certificate annually.

  • Create one user-assigned managed identity, grant it access to Key Vault and Event Hub, and attach that identity to the node pools of all AKS clusters.
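For context on the user-assigned option: application code addresses a shared user-assigned identity by its client ID, and the platform rotates the underlying credential, so nothing ships inside the container and nothing is redeployed on rotation. A minimal sketch of a pod reading a secret, assuming the identity is already attached to each cluster's node pools and granted Key Vault access; the client ID and vault URL are placeholders:

```python
from azure.identity import ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient

# Request tokens as the shared user-assigned identity (client ID is a placeholder).
credential = ManagedIdentityCredential(
    client_id="<user-assigned-identity-client-id>"
)

# The same code works unchanged in every cluster and subscription.
secrets = SecretClient(
    vault_url="https://kv-shared.vault.azure.net",
    credential=credential,
)
db_password = secrets.get_secret("db-password").value
```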

Question 12 of 20

Your organization operates workloads in five Azure subscriptions. Compliance policy requires that all Azure resource diagnostic and activity logs be kept for at least seven years. Operations engineers must be able to run ad-hoc Log Analytics queries against the most recent 90 days of data and create near-real-time alert rules based on those queries. The solution must minimize operational effort and overall cost while satisfying the retention and query requirements.

Which approach should you recommend?

  • Stream all diagnostic and activity logs from each subscription to an Azure Event Hubs namespace, process them with an Azure Function that inserts records into an Azure SQL Database configured with long-term backup retention, and build alert rules that query the database.

  • Configure each resource to send diagnostics directly to a central storage account that has immutable blob storage and lifecycle rules to move data to the Archive access tier; deploy Azure Sentinel workspaces in every subscription to query and alert on the stored data.

  • Enable Azure Monitor metrics exporter on every resource and send the output to an Azure Data Explorer cluster configured for seven-year retention; use scheduled Kusto queries in Data Explorer to generate alerts.

  • Create a dedicated Log Analytics workspace in a central operations subscription. Configure subscription-level diagnostic settings in each subscription to forward activity logs, and configure resource or resource group diagnostic settings to forward resource logs, all to this workspace. Set the workspace's interactive retention to 90 days, enable the Archive tier with seven-year retention for older data, and define Azure Monitor alert rules on the workspace.

Question 13 of 20

A company runs critical workloads on 200 Linux and Windows virtual machines that span three Azure subscriptions. Operations must be alerted automatically if any VM's CPU utilization remains above 80 percent for longer than one minute. Administrators also need to review CPU performance trends for up to 90 days while keeping monitoring costs to a minimum. Which Azure Monitor-based solution should you recommend?

  • Create a single Log Analytics workspace, install the Azure Monitor agent on all VMs, collect the \Processor(_Total)\% Processor Time counter every minute, and configure log alert rules.

  • Install the Application Insights agent on each VM and rely on smart detection rules to identify sustained CPU spikes.

  • Use the automatically collected Percentage CPU platform metric stored in the Azure Monitor metrics store, and configure a static metric alert on each VM that triggers when the average value exceeds 80 percent for one minute.

  • Enable Azure Monitor for VMs (VM Insights) with Change Analysis and create log-based alert rules on the InsightsMetrics table.
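Background for the platform-metric option: Percentage CPU is emitted automatically for every VM at no ingestion cost and kept in the metrics store for 93 days, which covers the 90-day trend requirement. A sketch of one static metric alert rule with azure-mgmt-monitor; all names and resource IDs are placeholders:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    MetricAlertAction,
    MetricAlertResource,
    MetricAlertSingleResourceMultipleMetricCriteria,
    MetricCriteria,
)

monitor = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

vm_id = (
    "/subscriptions/<sub-id>/resourceGroups/rg-app"
    "/providers/Microsoft.Compute/virtualMachines/vm01"
)

monitor.metric_alerts.create_or_update(
    resource_group_name="rg-monitoring",
    rule_name="vm01-cpu-above-80",
    parameters=MetricAlertResource(
        location="global",
        severity=2,
        enabled=True,
        scopes=[vm_id],
        evaluation_frequency=timedelta(minutes=1),  # check every minute
        window_size=timedelta(minutes=1),           # sustained for one minute
        criteria=MetricAlertSingleResourceMultipleMetricCriteria(
            all_of=[
                MetricCriteria(
                    name="HighCpu",
                    metric_name="Percentage CPU",  # built-in platform metric
                    operator="GreaterThan",
                    threshold=80,
                    time_aggregation="Average",
                )
            ]
        ),
        actions=[MetricAlertAction(action_group_id="<action-group-resource-id>")],
    ),
)
```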

Question 14 of 20

Your company hosts 200 customer-specific SQL Server databases, each under 10 GB. Usage is unpredictable: most stay idle outside business hours, but many experience concurrent spikes during month-end reporting. You plan to migrate the solution to Azure SQL Database and must minimize overall compute cost while avoiding any need to manually scale individual databases. Which Azure SQL deployment and compute option should you recommend?

  • Deploy every database as an individual Azure SQL Database that uses the serverless compute tier.

  • Consolidate all data into one Azure SQL Database in the Hyperscale service tier with auto-scaling enabled.

  • Migrate all databases to a single Azure SQL Managed Instance on the General Purpose service tier.

  • Place all databases in an Azure SQL Database elastic pool that uses the General Purpose vCore service tier.
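For context on the elastic pool option: pooled databases share one provisioned set of vCores, and per-database min/max caps stop a single spiking customer from starving the rest, so idle databases cost nothing extra while month-end spikes draw on the shared headroom. A minimal sketch with azure-mgmt-sql; names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

sql = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

pool = sql.elastic_pools.begin_create_or_update(
    resource_group_name="rg-data",
    server_name="sql-customers",
    elastic_pool_name="pool-customers",
    parameters={
        "location": "eastus",
        # 8 shared vCores for all 200 small databases.
        "sku": {"name": "GP_Gen5", "tier": "GeneralPurpose", "capacity": 8},
        # Each database may use 0 to 2 vCores of the shared pool.
        "per_database_settings": {"min_capacity": 0, "max_capacity": 2},
    },
).result()
```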

Question 15 of 20

Your company hosts several legacy ASP.NET applications on-premises that use Windows Integrated Authentication (Kerberos). Identities are synchronized to Azure Active Directory with Azure AD Connect. Management wants staff working from home to reach these applications over the internet without requiring a VPN. Access must be evaluated by Azure AD Conditional Access and the solution must avoid exposing the internal network or adding significant new infrastructure. Which approach should you recommend?

  • Deploy an Azure VPN Gateway and require users to establish a Point-to-Site VPN before accessing the applications.

  • Migrate the applications to Azure App Service and enable Azure AD authentication with Conditional Access.

  • Deploy Azure AD Domain Services, join the web servers to the managed domain, and control access through Azure role assignments.

  • Publish the applications by using Azure AD Application Proxy and configure Kerberos Constrained Delegation for single sign-on.

Question 16 of 20

You manage 40 Azure subscriptions that belong to a single tenant. The security team uses an on-premises Splunk installation and must receive all Azure Activity log events from every subscription and all Azure AD sign-in log entries within minutes. They also require that a raw, immutable copy of the same logs be retained in Azure for at least two years at the lowest possible cost. Which logging architecture should you recommend?

  • Create a diagnostic setting in each subscription to send the Azure Activity log to a central Event Hub namespace and to a storage account in a dedicated logging subscription, and configure a single Azure AD diagnostic setting to send sign-in logs to the same destinations.

  • Stream Activity logs from each subscription to individual storage accounts, then use Azure Data Factory pipelines to copy log files both to Splunk and to a long-term archive account.

  • Enable Azure Sentinel in each subscription and install a Splunk Universal Forwarder on the Sentinel VMs to pull data from the workspaces.

  • Deploy a Log Analytics workspace in each subscription, collect both log types into the workspace, and use Continuous Export from every workspace to separate Event Hubs that Splunk will poll.

Question 17 of 20

You are designing the data layer for a multi-tenant SaaS application that will migrate 500 on-premises SQL Server databases to Azure. Each tenant has its own database and exhibits unpredictable, bursty workloads: most databases are idle for long periods, but any tenant may suddenly require more compute resources. The tenant count is expected to grow to about 3,000 within two years. Management wants the most cost-effective solution that can automatically redistribute compute capacity among all databases during usage spikes, without manual intervention. Which Azure SQL deployment option should you recommend?

  • Deploy each tenant database as an Azure SQL Database in the Hyperscale tier and enable read-scale replicas.

  • Place all tenant databases in an Azure SQL Database elastic pool in the General Purpose tier.

  • Migrate the workload to an Azure SQL Managed Instance configured with provisioned compute and automatic storage growth.

  • Create separate Azure SQL Database serverless single databases for each tenant.
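Background for the serverless option: a serverless single database bills compute per second between a minimum and maximum vCore bound and auto-pauses after a configurable idle delay, so a dormant tenant costs storage only. A minimal sketch of provisioning one tenant database with azure-mgmt-sql; names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

sql = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

sql.databases.begin_create_or_update(
    resource_group_name="rg-tenants",
    server_name="sql-tenants",
    database_name="tenant-0421",
    parameters={
        "location": "eastus",
        # GP_S_* SKUs are the serverless compute tier; capacity is the max vCores.
        "sku": {"name": "GP_S_Gen5", "tier": "GeneralPurpose", "capacity": 4},
        "min_capacity": 0.5,     # vCore floor while the database is online
        "auto_pause_delay": 60,  # minutes idle before compute pauses
    },
).result()
```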

Question 18 of 20

Your company is developing a multi-tenant Software-as-a-Service (SaaS) web application that will run on Azure App Service. Employees of customer organizations must authenticate by using their existing Azure Active Directory (Azure AD) accounts, and individual consumers must sign in with Google or Facebook. You need to recommend an authentication platform that meets both requirements while requiring the fewest possible changes to application code. Which solution should you recommend?

  • Deploy Azure AD Domain Services and join the App Service to the managed domain for Kerberos-based authentication.

  • Create an Azure AD B2C tenant, configure Google and Facebook as identity providers, and add each customer's Azure AD tenant as an OpenID Connect identity provider.

  • Enable App Service authentication with a single-tenant Azure AD registration and configure social identity providers in custom application code.

  • Register a multi-tenant application in the company's Azure AD tenant and use Azure AD External Identities with social identity providers.

Question 19 of 20

Your company collaborates with several partner organizations. Each month, about 250 external users require access to two SharePoint Online sites, a Microsoft Teams channel, and a specific resource group in an Azure subscription. Security requirements: Access must be granted only after manager approval, permissions must automatically expire after 45 days unless re-approved, and auditors need a single report of all assignments. You need to recommend an identity governance solution that minimizes operational effort. Which Azure AD feature should you use?

  • Azure AD dynamic group membership with an expiration policy

  • Azure AD Privileged Identity Management just-in-time role assignments

  • Azure AD entitlement management access packages

  • Azure AD access reviews on a security group

Question 20 of 20

Your organization manages six Azure subscriptions that are frequently reorganized. You must design a logging solution that meets the following requirements: collect all activity and resource diagnostic logs from every subscription in a single location, retain the data for at least 730 days even if a subscription or resource is deleted, and require the least ongoing administrative effort when new subscriptions are created. Which solution should you recommend?

  • Deploy an Event Hubs namespace in every subscription, stream all diagnostic data to an on-premises SIEM, and archive the data there for two years.

  • Enable export of each subscription's Activity Log to a storage account in the same subscription and configure lifecycle management to keep the data in the Cool tier for two years.

  • Configure each resource to send diagnostic logs to individual Application Insights instances and enable continuous export from each instance to an immutable storage account.

  • Create a Log Analytics workspace in a dedicated management subscription, use an Azure Policy initiative to configure subscription-level diagnostic settings that send all Activity Log and resource diagnostic logs to the workspace, and set the workspace retention to 730 days.
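For context on the policy-driven option: the Activity Log leaves a subscription through a subscription-level diagnostic setting, and a deployIfNotExists initiative can create the same setting automatically in each new subscription; the 730-day retention is then set once on the workspace. A sketch of the equivalent REST call for a single subscription, with placeholder IDs:

```python
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")

# Subscription-level diagnostic setting (subscription ID is a placeholder).
url = (
    "https://management.azure.com/subscriptions/<sub-id>"
    "/providers/Microsoft.Insights/diagnosticSettings/central-logging"
    "?api-version=2021-05-01-preview"
)

# The Activity Log's category names.
categories = [
    "Administrative", "Security", "ServiceHealth", "Alert",
    "Recommendation", "Policy", "Autoscale", "ResourceHealth",
]

resp = requests.put(
    url,
    headers={"Authorization": f"Bearer {token.token}"},
    json={"properties": {
        "workspaceId": "<central-log-analytics-workspace-resource-id>",
        "logs": [{"category": c, "enabled": True} for c in categories],
    }},
)
resp.raise_for_status()
```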