ISC2 CISSP Practice Test
Certified Information Systems Security Professional
Use the form below to configure your ISC2 CISSP Practice Test. You can limit the test to specific exam objectives and domains, choose between 5 and 100 questions, and set a time limit.

ISC2 CISSP Information
The (ISC)² Certified Information Systems Security Professional (CISSP) exam is one of the most widely recognized credentials in the information security field. It covers an extensive body of knowledge related to cybersecurity, including eight domains: Security and Risk Management, Asset Security, Security Architecture and Engineering, Communication and Network Security, Identity and Access Management, Security Assessment and Testing, Security Operations, and Software Development Security. This broad scope is designed to validate a candidate’s depth and breadth of knowledge in protecting organizations from increasingly complex cyber threats.
Achieving a CISSP certification signals a strong understanding of industry best practices and the ability to design, implement, and manage a comprehensive cybersecurity program. As a result, the exam is often regarded as challenging, requiring both practical experience and intensive study of each domain’s key principles. Many cybersecurity professionals pursue the CISSP to demonstrate their expertise, enhance their credibility, and open doors to higher-level roles such as Security Manager, Security Consultant, or Chief Information Security Officer.
Free ISC2 CISSP Practice Test
- Questions: 15
- Time: Unlimited
- Included Topics: Security and Risk Management, Asset Security, Security Architecture and Engineering, Communication and Network Security, Identity and Access Management (IAM), Security Assessment and Testing, Security Operations, Software Development Security
True or False: A system that implements the 'fail securely' principle should default to granting access when authentication mechanisms fail.
True
False
Answer Description
The correct answer is False. The 'fail securely' principle states that when a system fails, it should default to a secure state rather than an insecure one. In practice, this means that when authentication mechanisms fail or encounter errors, the system should deny access rather than grant it. This ensures that security is maintained even during failure conditions.
Failing securely is sometimes referred to as 'fail-closed' and is an important security design principle that helps prevent unauthorized access during exceptional conditions or system failures. (In software design this matches the classic principle of 'fail-safe defaults'; note that in physical security, 'fail-safe' usually means the opposite: doors that unlock during a power failure to protect life, as distinct from 'fail-secure' doors that stay locked.) If a system were to grant access by default when authentication fails, it would create significant security vulnerabilities that attackers could exploit.
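As a minimal illustration, here is a Python sketch of deny-by-default error handling; the `verify_credentials` callback is a hypothetical placeholder for any real authentication backend:

```python
def authenticate(credentials, verify_credentials):
    """Grant access only on a positive, error-free verification result."""
    try:
        # Anything other than an explicit True is treated as a denial.
        return verify_credentials(credentials) is True
    except Exception:
        # Backend outage, timeout, malformed response: fail closed.
        return False
```

The inverted anti-pattern, `except Exception: return True`, is exactly the insecure default the question describes.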
Which security concept ensures that information is accessible and usable upon demand by authorized entities?
Confidentiality
Authenticity
Integrity
Availability
Answer Description
Availability is the security concept that ensures information systems and data are accessible and usable by authorized users when needed. This is a fundamental component of the CIA triad (Confidentiality, Integrity, and Availability) and one of the five pillars of information security.
Confidentiality ensures that information is not disclosed to unauthorized parties. Integrity ensures data remains accurate and unaltered. Authenticity verifies that data, transactions, or documents are genuine. Nonrepudiation prevents parties from denying their actions.
A healthcare organization is developing a new patient portal system. The CISO has instructed the project team to follow a proactive rather than reactive approach to data protection throughout the development lifecycle. Which approach best demonstrates the principle the CISO is emphasizing?
Conducting a comprehensive data flow assessment during the requirements phase to identify potential risks before architecture decisions are made
Creating detailed compliance documentation that will be reviewed by legal counsel before system deployment
Adding detailed audit logging capabilities to track user activities once the system goes live
Implementing strong encryption protocols after the system architecture has been finalized
Answer Description
The CISO is emphasizing the Privacy by Design principle, which was developed by Dr. Ann Cavoukian and is now widely regarded as a global standard for privacy protection. It advocates incorporating data protection measures into the design and architecture of systems from the beginning, rather than adding them later as a reaction to problems.
The correct answer involves conducting a data flow assessment during the requirements phase, which allows the team to systematically analyze how personal information will be collected, used, shared, and maintained before any technical decisions are made. This proactive approach helps identify and mitigate risks early, embedding protection into the system architecture itself.
The other options represent reactive approaches that address concerns after design decisions have been made, focus only on compliance documentation without addressing architectural considerations, or implement technical controls without considering broader implications of data handling throughout the entire system lifecycle.
A company's termination process removes a departing employee's access only to email and shared drives. The company believes this approach sufficiently secures its sensitive data. Is this approach adequate?
False
True
Answer Description
This statement is false because it does not address the full scope of deprovisioning necessary for security. Merely removing access to email and shared drives overlooks other critical systems and applications the employee may have accessed. Comprehensive deprovisioning involves revoking all access rights and disabling accounts associated with the employee to protect sensitive information and maintain security across the organization.
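As a hedged sketch of what "all access rights" means in practice, the Python below iterates over a connector registry; every connector name and function here is hypothetical and would wrap a real system's API (identity provider, VPN, badge system, SaaS consoles, and so on):

```python
# Hypothetical connector registry; each entry would wrap a real system's API.
CONNECTORS = {
    "email":        lambda user: print(f"disabled mailbox for {user}"),
    "shared_drive": lambda user: print(f"revoked drive ACLs for {user}"),
    "vpn":          lambda user: print(f"revoked VPN certificate for {user}"),
    "sso_idp":      lambda user: print(f"disabled IdP account for {user}"),
    "badge_system": lambda user: print(f"deactivated badge for {user}"),
}

def deprovision(user):
    """Revoke access on every registered system, not just email and drives."""
    failures = []
    for name, revoke in CONNECTORS.items():
        try:
            revoke(user)
        except Exception:
            failures.append(name)  # anything not revoked must be escalated
    return failures

print(deprovision("jdoe"))  # an empty list means every revocation succeeded
```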
During the recovery phase of a major data breach incident, the security team has restored critical systems from backups and verified data integrity. What is the BEST next step to take before returning systems to production?
Restore user access to systems and data
Apply security configurations and patches or updates that were missing before the incident
Document recovery actions taken in the incident report
Update the incident status in the tracking system
Answer Description
The correct answer is to apply security configurations and patches or updates that were missing before the incident. This step is crucial in the recovery process because returning systems to production without addressing the original vulnerability would likely result in a recurring breach.
While documenting the recovery actions taken is important, it can be completed after systems are secure and operational. Updating the incident status would be premature without ensuring the vulnerability is addressed. Restoring user access at this stage could potentially re-expose the system to threats if the original vulnerability hasn't been patched. The application of security patches addresses the root cause of the incident and helps prevent similar incidents in the future, making it the most critical next step in the recovery process.
A multinational corporation has discovered that its customer database contains duplicate records, outdated address information, and inconsistent formatting across different regional branches. Which data maintenance practice would BEST address these issues?
Data cleansing
Data encryption
Data compression
Data normalization
Answer Description
Data cleansing is the BEST answer because it specifically addresses the issues described in the scenario: duplicate records, outdated information, and inconsistent formatting. Data cleansing is the process of detecting and correcting (or removing) corrupt, inaccurate, or irrelevant records from a dataset. This process involves identifying incomplete, incorrect, inaccurate, or irrelevant parts of the data and then replacing, modifying, or deleting this dirty data to improve data quality.
While data normalization (standardizing data formats) would help with the inconsistent formatting issue, it doesn't address the duplicate records or outdated information problems comprehensively. Data compression is about reducing storage requirements, not fixing data quality issues. Data encryption focuses on protecting the confidentiality of data, but doesn't address the data quality problems described in the scenario.
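A minimal pandas sketch of the cleansing steps, using made-up records and illustrative standardization rules:

```python
import pandas as pd

# Toy records exhibiting the scenario's problems: inconsistent formatting
# and a duplicate that only becomes visible once formatting is standardized.
df = pd.DataFrame({
    "customer_id": [101, 101, 102],
    "name":    ["Ana Ruiz", " ana  ruiz ", "Ben Cho"],
    "country": ["US", "us", "Germany"],
})

# 1. Standardize formatting (whitespace, case, country codes).
df["name"] = df["name"].str.strip().str.title().str.replace(r"\s+", " ", regex=True)
df["country"] = df["country"].str.upper().replace({"GERMANY": "DE"})

# 2. Remove the now-detectable duplicate records.
df = df.drop_duplicates()

print(df)  # two clean rows remain
```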
A financial institution is implementing an HPC cluster to process large datasets for real-time fraud detection. Which security control would be MOST appropriate for addressing the unique security challenges of this HPC environment?
Implementing resource isolation through containerization
Enabling hardware-level encryption for data exchanges
Allowing elevated root access for system administrators
Applying perimeter-based network monitoring
Answer Description
The correct answer is implementing resource isolation through containerization. High-Performance Computing (HPC) systems face unique security challenges due to their multi-tenant nature and shared resource architecture. When multiple processes or users share computing resources, there is a risk of data leakage between workloads or unauthorized access to sensitive information.
Containerization provides strong isolation between workloads while maintaining the performance advantages of HPC systems. It creates logical boundaries between different processing tasks, ensuring that fraud detection algorithms and sensitive financial data remain segregated from other workloads. This approach aligns with the principle of defense in depth by adding another layer of protection.
Perimeter-based network monitoring is insufficient for HPC environments as many threats can originate from within the cluster. Allowing elevated root access would violate the principle of least privilege and create unnecessary security risks. While hardware-level encryption is beneficial, it addresses data protection rather than the resource isolation needs specific to multi-tenant HPC environments.
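As a hedged illustration of per-workload isolation, a Python snippet that shells out to the Docker CLI with standard isolation flags; the image name, tenant scheme, and resource values are assumptions for illustration only:

```python
import subprocess

def run_isolated(tenant, image="fraud-detector:latest"):
    """Launch one worker per tenant in its own resource-capped container."""
    cmd = [
        "docker", "run", "--rm",
        "--name", f"worker-{tenant}",
        "--cpus", "2", "--memory", "4g",  # cap resources per workload
        "--network", "none",              # no lateral movement via the network
        "--read-only",                    # immutable container filesystem
        image,
    ]
    return subprocess.run(cmd, check=True)
```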
An organization is developing handling procedures for sensitive data. What is the primary purpose of establishing information and asset handling requirements?
To transfer liability from the organization to individual employees
To reduce the overall cost of information technology operations
To maximize the value of data through extensive sharing
To ensure consistent protection of information assets throughout their lifecycle
Answer Description
The primary purpose of establishing information and asset handling requirements is to ensure consistent protection of information assets throughout their lifecycle. These requirements define how information should be handled, stored, processed, and transmitted based on its classification and sensitivity. Without established handling requirements, organizations risk inconsistent security practices, potential data breaches, and compliance violations. Proper handling requirements ensure that everyone in the organization understands how to work with information assets according to their sensitivity level and relevant regulations.
A financial services company is developing a new mobile banking application that will interact with their existing backend systems through multiple APIs. During the security assessment phase, the security team needs to evaluate these APIs for potential security vulnerabilities. Which of the following testing approaches would be BEST for identifying authentication bypass vulnerabilities in the application's APIs?
Load testing the APIs to measure their performance under stress
Port scanning the backend servers hosting the APIs
Fuzzing the API endpoints with unexpected input values
Schema validation testing of API request and response formats
Answer Description
Fuzzing, or fuzz testing, is the most effective approach for identifying authentication bypass vulnerabilities in APIs because it systematically sends unexpected, malformed, or random data inputs to the API endpoints to discover how they handle invalid or unexpected inputs. This technique is particularly effective at finding authentication bypass vulnerabilities as it can reveal edge cases where input validation fails or where error handling might expose sensitive information that could assist in bypassing authentication controls.
While schema validation testing is valuable for ensuring API inputs conform to expected formats, it doesn't specifically target authentication logic flaws. Load testing focuses on performance under stress rather than security vulnerabilities. Port scanning is a network reconnaissance technique that identifies open ports and services but doesn't test application-level authentication mechanisms in APIs.
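To make the idea concrete, here is a rough Python sketch of credential-field fuzzing against a login endpoint; the URL, field names, and payload set are all assumptions for illustration, not a complete fuzzing harness:

```python
import random
import string
import requests  # third-party: pip install requests

URL = "https://api.example.com/v1/login"  # hypothetical endpoint under test

def random_credential():
    """Produce unexpected or malformed values for an auth field."""
    return random.choice([
        "",                                               # empty value
        "A" * 10_000,                                     # oversized input
        {"$ne": None},                                    # operator-injection style
        "".join(random.choices(string.printable, k=64)),  # random noise
        None,                                             # missing/null field
    ])

for _ in range(100):
    body = {"username": random_credential(), "password": random_credential()}
    resp = requests.post(URL, json=body, timeout=5)
    # Any 2xx response to garbage credentials suggests a possible bypass.
    if resp.ok:
        print("possible bypass:", resp.status_code, body)
```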
Which of the following best describes the primary purpose of a security sandbox?
A production testing area where developers debug application code before deployment
An isolated environment to run and analyze untrusted code without risking production systems
A decoy system designed to attract hackers away from legitimate resources
A tool for scanning network traffic to identify known malware signatures
Answer Description
A sandbox is an isolated environment used to execute suspicious or untrusted code and applications without risking harm to the host device or network. By providing this controlled environment, security professionals can observe the behavior of potentially malicious software and analyze its actions without exposing production systems to risk. Sandboxes are particularly valuable for testing unknown files, suspicious email attachments, or applications from untrusted sources before allowing them into the production environment.
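As one small ingredient of sandboxing, the Unix-only Python sketch below caps CPU time and memory for an untrusted child process; a real sandbox would add filesystem, network, and syscall isolation (VMs, containers, seccomp), so treat this as illustrative only:

```python
import resource
import subprocess

def run_limited(cmd):
    """Run a command under crude CPU and memory ceilings (Unix only)."""
    def set_limits():
        resource.setrlimit(resource.RLIMIT_CPU, (5, 5))             # 5 s of CPU
        resource.setrlimit(resource.RLIMIT_AS, (256 * 2**20,) * 2)  # 256 MiB
    return subprocess.run(cmd, preexec_fn=set_limits,
                          capture_output=True, timeout=10)

print(run_limited(["/bin/echo", "hello from the sandbox"]).stdout)
```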
A company experienced a data breach after failing to patch a known vulnerability for six months. During litigation, they would most likely be found to have failed which of the following?
Business impact analysis
Due diligence
Due care
Code of ethics
Answer Description
Due care refers to taking reasonable steps that a prudent person would take in a given situation to prevent harm or meet obligations. In this scenario, the company failed to apply a patch for a known vulnerability for an extended period (six months), which represents a failure to exercise due care. This demonstrates a lack of reasonable action to protect systems and data, which a prudent organization would typically address in a more timely manner.
Due diligence, in contrast, refers to the investigation and research process undertaken before making decisions or taking actions, such as assessing risks before implementing systems. The scenario specifically shows a failure to act on known information rather than a failure to investigate.
The other options are incorrect: a code of ethics violation typically involves professional conduct issues rather than security maintenance practices, and a business impact analysis is a process for identifying critical business functions, not one directly related to patch management failures.
A healthcare organization wants to implement an access control system that can make decisions based on the patient's relationship to the healthcare provider, time of day, location of access attempt, and sensitivity of the medical records. Which access control model would BEST meet these requirements?
Mandatory Access Control (MAC)
Attribute-based Access Control (ABAC)
Role-based Access Control (RBAC)
Discretionary Access Control (DAC)
Answer Description
Attribute-based Access Control (ABAC) is the correct answer because it evaluates access requests based on attributes of subjects (users), objects (resources), actions, and environmental conditions. In this healthcare scenario, ABAC can use multiple attributes like the relationship between provider and patient, time of access, location, and data sensitivity level to make dynamic access decisions.
Role-based Access Control (RBAC) would be insufficient as it primarily makes access decisions based on pre-defined roles and doesn't easily accommodate environmental conditions like time and location. Discretionary Access Control (DAC) relies on the resource owner to grant access rights and lacks the fine-grained control needed for multiple attributes. Mandatory Access Control (MAC) uses security labels and clearance levels in a rigid hierarchy, which doesn't allow for the contextual, relationship-based decisions required in this healthcare scenario.
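A toy Python sketch of ABAC evaluation for this scenario; the attribute names and rules are invented for illustration:

```python
from datetime import time

# Each rule is a predicate over the combined attributes of the subject,
# resource, action, and environment.
POLICY = [
    lambda a: a["relationship"] == "treating_physician",
    lambda a: time(7, 0) <= a["access_time"] <= time(19, 0),
    lambda a: a["location"] == "on_site" or a["sensitivity"] != "high",
]

def is_permitted(attributes):
    """Grant access only when every policy rule evaluates to true."""
    return all(rule(attributes) for rule in POLICY)

request = {
    "relationship": "treating_physician",
    "access_time": time(14, 30),
    "location": "remote",
    "sensitivity": "high",
}
print(is_permitted(request))  # False: high-sensitivity record accessed remotely
```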
What approach significantly enhances security for accessing applications by requiring verification through multiple channels?
Using various forms of verification for access
Limiting access by IP address
Accessing with a username and password
Allowing access based on device recognition
Answer Description
This question describes multi-factor authentication (MFA): requiring two or more independent forms of identification and verification greatly improves security. Even if one credential is compromised, unauthorized users still face additional hurdles before gaining entry to applications. In contrast, relying on a single credential, such as a username and password, leaves systems vulnerable, while device recognition and IP address filtering do not provide comprehensive security on their own.
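For a concrete second factor, here is a compact RFC 6238 TOTP generator in Python using only the standard library; the secret shown is a throwaway demo value:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, period=30, digits=6):
    """Time-based one-time password (RFC 6238) over HMAC-SHA1 (RFC 4226)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # demo secret; real ones are per-user
```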
Portable USB storage devices should be given unrestricted physical access to sensitive computing environments because most organizations have endpoint security controls already in place.
True
False
Answer Description
The statement is false because portable USB storage devices present significant security risks even when endpoint security controls are in place. These devices can:
- Introduce malware into secure environments through infected media
- Be used for unauthorized data exfiltration
- Bypass network monitoring controls
- Potentially deliver hardware-based attacks (like BadUSB)
Comprehensive access control for devices requires:
- Restricting unauthorized devices through physical means
- Implementing technical controls like device whitelisting (see the sketch below)
- Enforcing policies on approved device usage
- Possibly disabling USB ports in highly sensitive environments
Endpoint security alone is insufficient, as it may have vulnerabilities or configuration gaps that USB devices could exploit. Defense-in-depth principles require both physical access restrictions and technical controls.
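A minimal, Linux-specific sketch of the device whitelisting idea mentioned above, reading USB vendor/product IDs from sysfs and checking them against a hypothetical allowlist; real enforcement would live in endpoint management or udev policy rather than a loose script:

```python
import pathlib

# Hypothetical allowlist of approved (vendor_id, product_id) pairs.
APPROVED = {("0781", "5583")}

def connected_usb_ids():
    """Yield vendor/product IDs of attached USB devices (Linux sysfs)."""
    for dev in pathlib.Path("/sys/bus/usb/devices").iterdir():
        vid, pid = dev / "idVendor", dev / "idProduct"
        if vid.exists() and pid.exists():
            yield vid.read_text().strip(), pid.read_text().strip()

for ids in connected_usb_ids():
    verdict = "approved" if ids in APPROVED else "BLOCK: not on allowlist"
    print(ids, verdict)
```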
What is the MOST appropriate security testing approach for identifying time-of-check to time-of-use (TOCTOU) vulnerabilities?
Dependency scanning
Fuzzing
Static code analysis
Race condition testing
Answer Description
The correct answer is Race condition testing. Time-of-check to time-of-use (TOCTOU) vulnerabilities are a type of race condition where a program checks a condition and then acts on that information assuming it's still valid, which can create a security vulnerability if the condition changes between the check and use. Race condition testing specifically focuses on identifying situations where timing differences can lead to security issues by systematically testing for conditions where operations can be interrupted or reordered in ways that expose vulnerabilities.
Static code analysis is incorrect because while static code analysis can identify some potential race conditions, it often struggles with TOCTOU vulnerabilities because these issues depend on runtime timing and execution order that may not be apparent from static analysis alone.
Fuzzing is incorrect because fuzzing focuses on finding input handling vulnerabilities by providing unexpected or malformed inputs. While fuzzing is valuable for many types of security testing, it's not specifically designed to detect timing-based race conditions like TOCTOU issues.
Dependency scanning is incorrect because dependency scanning focuses on identifying known vulnerabilities in third-party libraries and components rather than application logic vulnerabilities like race conditions. It doesn't test the dynamic behavior of code that could lead to TOCTOU issues.
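The classic filesystem example, as a Unix-only Python sketch: the first pattern separates the check from the use and is racy; the second makes the open itself enforce the condition:

```python
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "report.txt")

# VULNERABLE: check and use are separate steps. If an attacker swaps the
# file for a symlink between the two calls, the write follows the link.
if os.access(path, os.W_OK):
    with open(path, "w") as f:   # time-of-use: the path may have changed
        f.write("data")

# SAFER: collapse check and use into one atomic open. O_NOFOLLOW (Unix)
# refuses symlinks, eliminating the window between check and use.
fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_NOFOLLOW, 0o600)
with os.fdopen(fd, "w") as f:
    f.write("data")
```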