ISC2 CISSP Practice Test
Certified Information Systems Security Professional
Use the form below to configure your ISC2 CISSP Practice Test. The practice test can be configured to only include certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.

ISC2 CISSP Information
The (ISC)² Certified Information Systems Security Professional (CISSP) exam is one of the most widely recognized credentials in the information security field. It covers an extensive body of knowledge related to cybersecurity, including eight domains: Security and Risk Management, Asset Security, Security Architecture and Engineering, Communication and Network Security, Identity and Access Management, Security Assessment and Testing, Security Operations, and Software Development Security. This broad scope is designed to validate a candidate’s depth and breadth of knowledge in protecting organizations from increasingly complex cyber threats.
Achieving a CISSP certification signals a strong understanding of industry best practices and the ability to design, implement, and manage a comprehensive cybersecurity program. As a result, the exam is often regarded as challenging, requiring both practical experience and intensive study of each domain’s key principles. Many cybersecurity professionals pursue the CISSP to demonstrate their expertise, enhance their credibility, and open doors to higher-level roles such as Security Manager, Security Consultant, or Chief Information Security Officer.
Free ISC2 CISSP Practice Test
Press start when you are ready, or press Change to modify any settings for the practice test.
- Questions: 15
- Time: Unlimited
- Included Topics: Security and Risk Management, Asset Security, Security Architecture and Engineering, Communication and Network Security, Identity and Access Management (IAM), Security Assessment and Testing, Security Operations, Software Development Security
A global organization has determined that its primary security governance requirement is to create a comprehensive model that connects high-level business objectives with specific technical security implementations. Which security control framework would be BEST suited for these requirements?
Sherwood Applied Business Security Architecture
Control Objectives for Information and Related Technology
National Institute of Standards and Technology framework
International Organization for Standardization framework
Answer Description
Sherwood Applied Business Security Architecture (SABSA) is the correct answer because it provides a layered architectural model that helps organizations trace security requirements from business drivers all the way through to technical implementation. SABSA uses six layers (Business, Architect's, Designer's, Builder's, Tradesman's, and Service Manager's views) to ensure security solutions are aligned with business needs at every level.
The other frameworks, while valuable, don't provide the same comprehensive architectural approach to linking business objectives with technical implementations. NIST frameworks focus more on specific security controls and risk management approaches. ISO 27001 is centered around Information Security Management Systems with less emphasis on architectural design. COBIT offers IT governance and management practices but doesn't provide the same level of architectural guidance that connects business vision directly to technical implementation details.
During the initial phase of a mission-critical financial system development, the security architect must determine which of the following approaches is BEST for gathering stakeholders' security requirements?
Conduct facilitated workshops with key stakeholders representing different business functions
Delegate security requirement gathering to department heads who will submit their needs independently
Deploy automated scanning tools to generate a list of security requirements based on industry standards
Review past security breach reports from similar financial institutions to define requirements
Answer Description
The correct answer is conducting facilitated workshops with key stakeholders representing different business functions. When gathering security requirements for mission-critical systems, especially financial ones, bringing together diverse stakeholders in facilitated workshops provides several advantages. This approach ensures comprehensive input from various perspectives (technical, business, compliance, end-users), allows for real-time discussion of security trade-offs, helps build consensus, and identifies potential conflicts early. The other approaches have limitations: automated scanning tools focus on technical vulnerabilities rather than business requirements; reviewing past breach reports may provide insights but doesn't capture current business needs or priorities; and delegating requirement gathering to department heads may miss cross-functional dependencies and create disjointed requirements.
What is key escrow in cryptographic systems?
A procedure to securely transfer keys between systems
A method to derive multiple keys from a single master key
A technique to encrypt the same data with multiple keys
A practice where a trusted third party holds copies of encryption keys
Answer Description
The correct answer is "a practice where a trusted third party holds copies of encryption keys." Key escrow is a practice in which a trusted third party (the escrow agent) holds copies of encryption keys. The purpose is to ensure that authorized parties, such as law enforcement with proper legal authorization or organizations needing business continuity, can access encrypted data if the primary keys are unavailable.
A technique to encrypt the same data with multiple keys is incorrect because this describes multi-key encryption, not key escrow.
A method to derive multiple keys from a single master key is incorrect because this describes key derivation functions, not key escrow.
A procedure to securely transfer keys between systems is incorrect because this relates to key distribution, not escrow.
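To make the escrow relationship concrete, here is a minimal sketch in Python using the `cryptography` library. The workflow and variable names are illustrative assumptions, not a production design: the organization wraps its data-encryption key (DEK) under the escrow agent's public key, so only the agent can recover it later.

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Escrow agent generates a key pair; the private key never leaves the agent.
agent_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
agent_public = agent_private.public_key()

# The organization's data-encryption key (DEK) protects production data.
dek = os.urandom(32)

# Deposit: the DEK is wrapped under the agent's public key and stored in escrow.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
escrowed_dek = agent_public.encrypt(dek, oaep)

# Recovery: only the escrow agent can unwrap the DEK if the original is lost.
recovered = agent_private.decrypt(escrowed_dek, oaep)
assert recovered == dek
```

The design point is that the escrow agent holds the means of recovery without ever seeing the plaintext data itself.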
An organization is implementing an identity solution that allows users to authenticate once and access both on-premises applications and cloud-based SaaS platforms. Which component is essential in this hybrid architecture?
Security token service
Credential translation module
Cloud access security broker
Directory synchronization agent
Answer Description
In a federated identity architecture, a security token service (STS), such as Active Directory Federation Services (AD FS) or a similar technology, issues security tokens that allow users to authenticate once and then access resources in both environments without re-authenticating. It establishes and maintains trust relationships, performs claims transformation when necessary, and enables single sign-on across the hybrid environment.
The STS is essential in a hybrid architecture because it serves as the intermediary that brokers trust between the organization's identity provider and the service providers across both on-premises and cloud environments.
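As a rough illustration of what a token service does, the sketch below uses the PyJWT library to issue and validate a signed token. The shared HMAC key and claim names are simplified assumptions; real deployments use asymmetric signing and standards such as SAML or OpenID Connect.

```python
import time
import jwt  # PyJWT

SHARED_SIGNING_KEY = "demo-secret"  # hypothetical; real STSs use asymmetric keys

def issue_token(user: str, audience: str) -> str:
    """STS issues a signed, time-limited token after the user authenticates once."""
    claims = {"sub": user, "aud": audience, "exp": int(time.time()) + 3600}
    return jwt.encode(claims, SHARED_SIGNING_KEY, algorithm="HS256")

def service_accepts(token: str, audience: str) -> bool:
    """Each relying service validates the signature and audience, not the password."""
    try:
        jwt.decode(token, SHARED_SIGNING_KEY, algorithms=["HS256"], audience=audience)
        return True
    except jwt.InvalidTokenError:
        return False

token = issue_token("alice", "payroll-app")
print(service_accepts(token, "payroll-app"))  # True: signed for this audience
print(service_accepts(token, "other-app"))    # False: wrong audience
```

The user's credential is checked once by the token issuer; downstream services only verify tokens, which is what makes single sign-on across environments possible.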
Which strategy best enhances the recovery capabilities for critical data following a system outage?
Updating backups once a week in one location
Creating backups on physical media stored onsite
Employing varied storage methods to protect data across locations
Utilizing a remote backup solution
Answer Description
The most effective recovery strategy employs a combination of diverse storage options to safeguard against data loss and improve recovery times. This approach allows for flexibility and redundancy, ensuring that if one method is compromised, others are available. The other options lack sufficient diversity in storage methods or do not provide optimal frequency for backups, which can impede recovery effectiveness.
Which role is primarily responsible for ensuring that data is handled appropriately throughout its lifecycle?
Data processor
Data owner
Data subject
Data custodian
Answer Description
The correct answer is the data owner. This role is accountable for determining how data should be classified, how it is handled, and who has access to it. Unlike data custodians, who focus on the technical aspects of data management, the owner has the authority to make decisions about data policies and controls. Responsibilities are distinct among the roles involved in the data lifecycle; for example, while data processors carry out processing activities on the owner's behalf, the ultimate authority for defining classification and security practices rests with the data owner.
A company is implementing new procedures for accessing sensitive financial information. Which of the following practices would best ensure that only authorized personnel can access this data?
Implement role-based access controls to restrict data access based on job functions.
Restrict data access to the IT department.
Require users to create complex passwords for accessing the data.
Conduct access reviews annually to ensure that access rights are still valid.
Answer Description
Implementing role-based access controls ensures that individuals are granted access based on their specific job responsibilities, minimizing the risk of unauthorized access to sensitive information. In contrast, requiring users to create complex passwords might enhance security, but it does not restrict access based on roles. Restricting access solely to the IT department does not address the need for appropriate access control for other authorized personnel, and merely conducting access reviews annually may lead to time gaps where unauthorized access could occur.
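A minimal sketch of role-based access control in Python follows; the roles, permissions, and users are hypothetical, but the core idea is accurate: permissions attach to roles, and users acquire them only through role membership.

```python
# Role -> permission mapping; users never hold permissions directly.
ROLE_PERMISSIONS = {
    "teller":  {"read_account"},
    "analyst": {"read_account", "read_financials"},
    "auditor": {"read_account", "read_financials", "read_audit_logs"},
}

# User -> role assignments (hypothetical users).
USER_ROLES = {"jsmith": {"teller"}, "mlee": {"analyst", "auditor"}}

def is_authorized(user: str, permission: str) -> bool:
    """Grant access only if one of the user's roles carries the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_authorized("jsmith", "read_financials"))  # False: teller role lacks it
print(is_authorized("mlee", "read_financials"))    # True: granted via analyst role
```

Because access flows through roles, changing a person's job function means changing one role assignment rather than auditing dozens of individual grants.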
A multinational corporation is preparing to expand operations into new international markets. Which of the following requirements should the security executive prioritize FIRST?
Local privacy legislation compliance
Corporate security infrastructure harmonization
Industry-specific security certification
Supply chain security assessment
Answer Description
Compliance with local privacy legislation, such as the General Data Protection Regulation (GDPR), is the requirement the Chief Information Security Officer (CISO) should address first. Privacy regulations often impose stringent data protection requirements with significant penalties for non-compliance. Before conducting business in a jurisdiction with such regulation, the organization must ensure its data protection practices align with requirements including data subject rights, consent mechanisms, breach notification procedures, and data protection impact assessments.
While the other options are important security considerations, addressing local privacy regulations represents a legal requirement specific to the expansion scenario described and would need immediate attention before beginning operations in new jurisdictions.
What is the primary purpose of fuzz testing software as part of a security assessment?
To identify vulnerabilities by sending unexpected or random inputs
To validate input sanitization routines against known attack patterns
To verify the integrity of compiled binaries against their source code
To perform dynamic taint analysis of data flows within an application
Answer Description
The correct answer is to identify vulnerabilities by sending unexpected or random inputs. Fuzz testing works by automatically generating and sending malformed, unexpected, or random data to an application to trigger error conditions, crashes, or unexpected behaviors that might indicate security vulnerabilities. It's particularly effective at finding input validation issues, buffer overflows, and other boundary condition problems.
Performing dynamic taint analysis of data flows within an application is a different security testing technique that tracks how untrusted data moves through an application to identify potential vulnerabilities. While this approach can reveal security issues, it uses instrumentation to monitor data propagation rather than generating random inputs as fuzz testing does.
Validating input sanitization routines against known attack patterns is more closely related to penetration testing or security scanning with predefined patterns. Unlike fuzz testing, which generates random or unexpected inputs, this approach uses known malicious inputs to test specific defenses.
Verifying the integrity of compiled binaries against their source code is related to software assurance and supply chain security. This process ensures that the compiled code matches the reviewed source code and hasn't been tampered with during the build process, but it doesn't involve sending unexpected inputs to find vulnerabilities.
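The sketch below shows the core fuzzing loop against a toy parser; the parser and the exception triage are illustrative assumptions. Real fuzzers such as AFL or libFuzzer add coverage feedback and smarter mutation, but the principle is the same: feed unexpected data and watch for unhandled failures.

```python
import random

def parse_record(data: bytes) -> tuple:
    """Toy parser under test: expects b'name:age' with a numeric age."""
    name, age = data.split(b":")
    return name.decode(), int(age)

random.seed(1)
for _ in range(10_000):
    # Generate random byte strings of random length as malformed inputs.
    blob = bytes(random.randrange(256) for _ in range(random.randrange(1, 32)))
    try:
        parse_record(blob)
    except (ValueError, UnicodeDecodeError):
        pass  # expected, handled validation failures
    except Exception as exc:
        # Any other exception is a candidate bug worth triaging.
        print(f"input {blob!r} raised {type(exc).__name__}: {exc}")
```

In a memory-unsafe language, the interesting signal would be crashes and memory corruption rather than Python exceptions, which is where buffer overflows surface.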
What is the primary purpose of conducting a tabletop exercise for disaster recovery planning?
To conduct recovery processes to validate effectiveness
To confirm team members can execute their roles effectively
To evaluate and refine response strategies through discussion
To practice technical skills in real-time environments
Answer Description
The correct answer emphasizes that tabletop exercises help teams evaluate their response strategies by discussing scenarios in a controlled, low-pressure setting. This form of testing fosters collaboration, uncovers weaknesses in the plan, and enhances overall preparedness. The other choices describe relevant aspects of recovery testing, but they miss the discussion-based, evaluative nature of a tabletop exercise.
During which phase of the Software Development Lifecycle (SDLC) should security requirements first be defined?
Testing phase
Design phase
Requirements phase
Implementation phase
Answer Description
The correct answer is the requirements phase because security requirements should be identified and documented at the earliest possible stage of development. Establishing security requirements during the requirements phase ensures they're treated with the same importance as functional requirements and properly incorporated into subsequent design and implementation decisions.
**The design phase** is too late for initial security requirements definition. While security architecture designs are created during this phase, they should be based on security requirements already established. Delaying security considerations until the design phase can result in architectural decisions that are difficult to secure properly.
**The implementation phase** is significantly too late to begin considering security requirements. By this point, the architecture is established, and major changes to accommodate security needs would be expensive and time-consuming. Security during implementation should focus on following secure coding practices based on previously established requirements.
**The testing phase** is entirely too late for defining security requirements. At this point, the application is largely built, and discovering that it doesn't meet fundamental security needs would require substantial rework. The testing phase should verify that the implementation meets the security requirements defined much earlier in the lifecycle.
During a disaster recovery operation, who is primarily responsible for coordinating the recovery activities and ensuring that the disaster recovery plan is executed according to established procedures?
Chief Security Officer
Incident Response Team Leader
Disaster Recovery Coordinator
Business Continuity Manager
Answer Description
The Disaster Recovery Coordinator is responsible for coordinating all recovery activities during a disaster recovery operation. This individual oversees the execution of the disaster recovery plan, manages the recovery team, communicates with stakeholders, and ensures that recovery procedures are followed correctly. The Disaster Recovery Coordinator is appointed specifically to lead the recovery efforts and serves as the central point of authority during the recovery process. The other roles mentioned have important but different responsibilities within an organization's security framework.
An organization is decommissioning several storage devices, including traditional Hard Disk Drives (HDDs) and Solid-State Drives (SSDs). A technician is instructed to use degaussing for media sanitization. For which type of media would this technique be ineffective?
Magnetic tapes
Floppy disks
Hard Disk Drives (HDDs)
Solid-State Drives (SSDs)
Answer Description
Degaussing is a media sanitization technique that uses a powerful magnetic field to neutralize the magnetic data stored on media. This process is effective for magnetic storage like Hard Disk Drives (HDDs) and magnetic tapes. However, it is ineffective for Solid-State Drives (SSDs) because SSDs store data electronically in flash memory (NAND) cells, which are not affected by magnetic fields. Appropriate sanitization techniques for SSDs include cryptographic erasure, using the ATA Secure Erase command, or physical destruction.
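As a conceptual sketch of cryptographic erasure, the Python snippet below (using the `cryptography` library) shows why destroying a key sanitizes encrypted data: the ciphertext remains on the flash cells, but without the key it is computationally unrecoverable. Self-encrypting drives implement the same idea in hardware, which is what a secure-erase command triggers; the data and workflow here are illustrative assumptions.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Data at rest is always stored encrypted under a media encryption key.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)
nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, b"sensitive customer records", None)

# "Sanitize" the media by destroying the key, not by overwriting the data.
del aesgcm
key = None

# The ciphertext still physically exists, but no magnetic field, software
# tool, or forensic read of the NAND cells can decrypt it without the key.
```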
A financial services company is developing a new mobile banking application that will interact with their existing backend systems through multiple APIs. During the security assessment phase, the security team needs to evaluate these APIs for potential security vulnerabilities. Which of the following testing approaches would be BEST for identifying authentication bypass vulnerabilities in the application's APIs?
Schema validation testing of API request and response formats
Load testing the APIs to measure their performance under stress
Port scanning the backend servers hosting the APIs
Fuzzing the API endpoints with unexpected input values
Answer Description
Fuzzing, or fuzz testing, is the most effective approach for identifying authentication bypass vulnerabilities in APIs because it systematically sends unexpected, malformed, or random data inputs to the API endpoints to discover how they handle invalid or unexpected inputs. This technique is particularly effective at finding authentication bypass vulnerabilities as it can reveal edge cases where input validation fails or where error handling might expose sensitive information that could assist in bypassing authentication controls.
While schema validation testing is valuable for ensuring API inputs conform to expected formats, it doesn't specifically target authentication logic flaws. Load testing focuses on performance under stress rather than security vulnerabilities. Port scanning is a network reconnaissance technique that identifies open ports and services but doesn't test application-level authentication mechanisms in APIs.
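A hedged sketch of fuzzing an API's authentication follows: the endpoint URL, token value, and mutation list are hypothetical placeholders, and the `requests` library is assumed. The signal to look for is any success response to a deliberately malformed credential.

```python
import random
import requests

BASE = "https://api.example.test/v1/accounts"  # hypothetical endpoint
valid_token = "eyJhbGciOi.example.payload"     # placeholder, not a real credential

def mutate(token: str) -> str:
    """Produce malformed variants of a valid bearer token."""
    variants = [
        token[:-1],                 # truncated signature
        token + "A",                # padded with extra data
        token.replace(".", "", 1),  # structural damage to the token format
        "",                         # empty credential
        "null",                     # common parser edge case
    ]
    return random.choice(variants)

for _ in range(100):
    bad = mutate(valid_token)
    resp = requests.get(BASE, headers={"Authorization": f"Bearer {bad}"})
    # Any 2xx response to a malformed credential suggests an authentication bypass.
    if resp.ok:
        print("possible auth bypass with token variant:", bad[:40])
```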
A global corporation is implementing a secure remote access solution for network administrators who need to perform system maintenance from various locations, including home networks and public Wi-Fi. The security team wants to ensure that all administrative sessions are properly authenticated, encrypted, and logged. Which of the following remote access solutions would best meet these requirements?
VPN connection with standard domain credentials
SSH with multi-factor authentication
Telnet with enhanced password policies
Remote Desktop Protocol over TLS
Answer Description
Secure Shell (SSH) with multi-factor authentication is the most appropriate solution for secure remote administrative access. SSH provides encrypted communications for administrative sessions, ensuring confidentiality of sensitive network operations. When combined with multi-factor authentication, it significantly enhances security by requiring something the user knows (password) plus something they have (token or mobile device) or something they are (biometrics). SSH also has robust logging capabilities for auditing administrative actions, which is crucial for security monitoring and compliance.
VPN alone provides encryption but lacks the specific administrative controls and logging capabilities needed for administrative access. Remote Desktop Protocol has known security vulnerabilities when not properly secured and doesn't inherently include multi-factor authentication. Telnet transmits data in plaintext, making it fundamentally insecure for administrative functions.
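As an illustration of the "something you have" factor, the sketch below uses the `pyotp` library to verify a time-based one-time password (TOTP), one common second factor layered on top of SSH key or password authentication; the enrollment flow shown is a simplified assumption.

```python
import pyotp

# Enrollment: the server generates and stores a per-user secret, shown to
# the user once (typically as a QR code) for their authenticator app.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Login, factor 1: password or SSH key check (not shown).
# Factor 2: the six-digit code from the user's device.
code_from_user = totp.now()  # in practice, typed by the user at the prompt
print(totp.verify(code_from_user))  # True only within the validity window
```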
Neat!
Looks like that's it! You can go back and review your answers or click the button below to grade your test.