ISC2 CISSP Practice Test
Certified Information Systems Security Professional
Use the form below to configure your ISC2 CISSP Practice Test. The practice test can be configured to only include certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.

ISC2 CISSP Information
The (ISC)² Certified Information Systems Security Professional (CISSP) exam is one of the most widely recognized credentials in the information security field. It covers an extensive body of knowledge related to cybersecurity, including eight domains: Security and Risk Management, Asset Security, Security Architecture and Engineering, Communication and Network Security, Identity and Access Management, Security Assessment and Testing, Security Operations, and Software Development Security. This broad scope is designed to validate a candidate’s depth and breadth of knowledge in protecting organizations from increasingly complex cyber threats.
Achieving a CISSP certification signals a strong understanding of industry best practices and the ability to design, implement, and manage a comprehensive cybersecurity program. As a result, the exam is often regarded as challenging, requiring both practical experience and intensive study of each domain’s key principles. Many cybersecurity professionals pursue the CISSP to demonstrate their expertise, enhance their credibility, and open doors to higher-level roles such as Security Manager, Security Consultant, or Chief Information Security Officer.

Free ISC2 CISSP Practice Test
- 20 Questions
- Unlimited
- Security and Risk Management, Asset Security, Security Architecture and Engineering, Communication and Network Security, Identity and Access Management (IAM), Security Assessment and Testing, Security Operations, Software Development Security
Free Preview
This test is a free preview, no account required.
Subscribe to unlock all content, keep track of your scores, and access AI features!
A company is implementing new procedures for accessing sensitive financial information. Which of the following practices would best ensure that only authorized personnel can access this data?
Require users to create complex passwords for accessing the data.
Restrict data access to the IT department.
Conduct access reviews annually to ensure that access rights are still valid.
Implement role-based access controls to restrict data access based on job functions.
Answer Description
Implementing role-based access controls ensures that individuals are granted access based on their specific job responsibilities, minimizing the risk of unauthorized access to sensitive information. In contrast, requiring users to create complex passwords might enhance security, but it does not restrict access based on roles. Restricting access solely to the IT department does not address the need for appropriate access control for other authorized personnel, and merely conducting access reviews annually may lead to time gaps where unauthorized access could occur.
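The role-based model described above can be sketched in a few lines. This is a minimal illustration, not a production authorization system; the role names and permission strings are invented for the example.

```python
# Minimal role-based access control (RBAC) sketch.
# Roles and permissions here are illustrative, not from any real system.
ROLE_PERMISSIONS = {
    "accountant": {"read_financials", "update_ledger"},
    "auditor": {"read_financials"},
    "support_agent": {"read_customer_profile"},
}

def is_authorized(role: str, permission: str) -> bool:
    """Grant access only if the user's role includes the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

Access is decided by job function, not by individual identity or password strength, which is why RBAC scales better than per-user grants.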
Ask Bash
Bash is our AI bot, trained to help you pass your exam. AI-generated content may display inaccurate information; always double-check anything important.
What is Role-Based Access Control (RBAC)?
How does Role-Based Access Control differ from other access control models?
Why are annual access reviews considered insufficient for managing access control?
An e-commerce company stores customer order records in a relational database that feeds analytics dashboards and the customer-service portal. Over time, stale shipping addresses and duplicate customer accounts have begun to cause reporting errors and slow queries. The security manager is updating the data-maintenance plan. Which action will BEST ensure the ongoing accuracy and relevance of operational data while still supporting business use?
Apply read-only permissions to all production tables
Schedule periodic data cleansing and validation to remove duplicates and correct outdated values
Rotate the database encryption keys every six months
Move order records older than one year to cold storage
Answer Description
Scheduling regular data-cleansing and validation routines identifies duplicates, inconsistent formats, and obsolete fields so they can be corrected or removed. This proactive process maintains data integrity and keeps information reliable for day-to-day operations. Rotating encryption keys improves confidentiality but does not address data quality. Moving old records to cold storage is an archival control that does nothing for the accuracy of active data. Applying read-only permissions protects against unauthorized changes but will not fix existing errors or outdated information.
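A scheduled cleansing job like the one described might look like the following sketch. Field names (`email`, `address`, `updated`) are assumptions for illustration; a real routine would also validate formats and flag unresolvable conflicts for human review.

```python
def cleanse(records):
    """Deduplicate customer records by email and normalize fields.

    Keeps the most recently updated record for each customer key.
    Field names are illustrative, not from a real schema.
    """
    seen = {}
    for rec in records:
        key = rec["email"].strip().lower()
        rec = {**rec, "email": key, "address": rec["address"].strip()}
        if key not in seen or rec["updated"] > seen[key]["updated"]:
            seen[key] = rec
    return list(seen.values())
```

Running such a routine on a schedule keeps duplicates and stale values from accumulating, which is exactly the proactive control the question is testing.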
Ask Bash
Why are regular audits important for maintaining data quality?
What is the difference between proactive and retrospective data management?
How do access controls and retention periods differ from regular audits in data lifecycle management?
A multinational organization with thousands of assets, ranging from laptops and servers to cloud-based databases and proprietary algorithms, has struggled to reconcile discrepancies between purchase orders and what IT and finance believe is actually deployed. Senior management asks the security manager to recommend a single approach that will establish and maintain an accurate inventory of both tangible and intangible assets. Which action BEST meets this goal?
Create a licensing repository for patents and trademarks and add physical equipment when time allows.
Develop a program that documents every physical and information asset in a centralized register and schedules regular audits to reconcile the records with reality.
Add digital assets to the existing annual physical inventory checklist handled by facilities staff.
Deploy an automated discovery tool that records all new purchases and existing devices in real time.
Answer Description
A comprehensive asset-inventory program requires two key elements: first, every physical and information asset must be recorded in a centralized register with details such as owner, location, and classification; second, the records must be validated through scheduled physical or logical audits. ISO/IEC 27002 control 5.9 explicitly calls for an accurate, up-to-date inventory and recommends regular reviews to ensure accuracy. Adding digital items to an existing physical checklist, relying solely on automated discovery tools, or focusing on intangible assets alone each addresses only part of the requirement and therefore does not fully satisfy management's objective.
Ask Bash
Why is regular auditing important in inventory management?
How can organizations effectively document both physical and information assets?
What are the benefits of combining manual audits with automated tracking tools?
A financial services firm recently completed a data classification project, labeling its client investment data as 'Confidential'. The Chief Information Security Officer (CISO) is now tasked with developing corresponding security policies. When establishing the specific handling requirements for this 'Confidential' data, what should the primary objective of these requirements be to align with security best practices and the data's classification?
To streamline data access for auditors and regulators to ensure compliance.
To optimize data storage and transmission costs by using advanced compression techniques.
To ensure the data is protected against unauthorized access and disclosure throughout its lifecycle.
To guarantee 99.999% availability for all financial reporting systems using the data.
Answer Description
The primary purpose of establishing handling requirements based on a 'Confidential' classification is to protect the information from unauthorized access and disclosure. While availability, cost optimization, and streamlining access for compliance are all valid business and security considerations, the core principle of handling 'Confidential' data is to maintain its secrecy. The data classification level dictates that confidentiality is the foremost priority over other aspects of the CIA triad or other operational goals.
Ask Bash
What does 'sensitive information' include in the context of handling requirements?
How do handling requirements ensure confidentiality, integrity, and availability (CIA)?
What are some examples of protocols for sharing and disposing of sensitive information?
What is the primary security benefit of formal code verification methods like mathematical proofs?
They are easier to implement than standard code reviews
They verify the absence of specific classes of vulnerabilities with mathematical certainty
They automatically fix identified vulnerabilities
They scan code faster than traditional static analysis tools
Answer Description
The correct answer is "They verify the absence of specific classes of vulnerabilities with mathematical certainty." Formal verification methods use mathematical techniques to prove that code adheres to specific security properties or is free from certain vulnerability classes. Unlike testing, which can only show the presence of bugs, formal verification can demonstrate their absence for specified properties. This provides a higher level of assurance than can be achieved through testing or conventional static analysis alone.
"They scan code faster than traditional static analysis tools" is incorrect because formal verification methods are typically much more computationally intensive and time-consuming than traditional static analysis tools. The rigor of mathematical proving usually comes at the cost of performance.
"They automatically fix identified vulnerabilities" is incorrect because formal verification methods identify violations of specifications but do not automatically fix issues. They provide proof of correctness or counterexamples, but remediation still requires developer intervention.
"They are easier to implement than standard code reviews" is incorrect because formal verification methods are generally much more complex and difficult to implement than standard code reviews. They require specialized expertise, formal specifications, and significant computational resources.
Ask Bash
What distinguishes formal code verification from testing?
Why is formal verification computationally intensive?
What kind of expertise is needed for formal code verification?
Which role has the authority to define access and protection policies for data within an organization?
Data custodian
Data processor
Data steward
Data owner
Answer Description
The correct answer is the data owner, as this role encompasses the responsibility for defining how data should be accessed, who can access it, and what protections are to be implemented around it. Other roles, such as custodians and stewards, are tasked with managing the physical storage and administration of the data according to the policies set by the owner, but they do not have the authority to define those policies. Similarly, processors manage the data processing tasks but follow predefined guidelines without authority to set those guidelines, while users primarily interact with the data without control over its management.
Ask Bash
What are the responsibilities of a Data Owner?
How does a Data Custodian support a Data Owner?
What’s the difference between a Data Processor and a Data Steward?
What is the process of categorizing data into different classes based on its sensitivity and the impact to the organization if it were disclosed?
Information review
Data management
Asset evaluation
Data classification
Answer Description
The process of categorizing data is known as data classification. This involves assigning a label to data which informs users and systems how it should be handled according to its sensitivity. Different classifications help guide security protocols, access control, and the implementation of appropriate protective measures. For instance, sensitive information may require stronger access controls and encryption compared to less sensitive data.
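The idea that a classification label drives handling requirements can be shown as a simple lookup. The labels and baseline controls below are illustrative assumptions, not a standard taxonomy.

```python
# Map classification labels to baseline handling controls.
# Labels and control values are illustrative examples only.
HANDLING_RULES = {
    "public":       {"encryption_at_rest": False, "access_review_days": 365},
    "internal":     {"encryption_at_rest": False, "access_review_days": 180},
    "confidential": {"encryption_at_rest": True,  "access_review_days": 90},
    "restricted":   {"encryption_at_rest": True,  "access_review_days": 30},
}

def controls_for(label: str) -> dict:
    """Return the baseline controls mandated by a classification label."""
    return HANDLING_RULES[label.lower()]
```

Once data carries a label, systems and users can derive the required protections mechanically instead of deciding case by case.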
Ask Bash
Why is data classification important for organizations?
What are common categories used in data classification?
What security controls are typically applied based on data classification levels?
What term refers to the process of assigning categories to data based on its level of sensitivity and the impact to the organization if that data is disclosed or compromised?
Data encoding
Data classification
Data mapping
Answer Description
Data classification is the systematic approach used to categorize data by its level of sensitivity. This process helps organizations apply appropriate security measures based on potential risk. In contrast, terms like 'data mapping' or 'data encoding' do not encapsulate this concept, as they pertain to organizing or translating data rather than assessing sensitivity levels.
Ask Bash
Why is data classification important for organizations?
What are the key steps involved in data classification?
How is data classification different from data mapping?
During a quarterly internal audit, the newly appointed Chief Privacy Officer discovers that the organization's asset register does not distinguish between publicly available information and data that could cause harm if exposed. She instructs each department to tag its records according to a new classification scheme. Which type of information should the team classify as sensitive rather than public or internal?
Company-wide announcements accessible to all employees.
Data about the organization's history that is already published.
Marketing materials that promote the brand.
Employee salary information that is handled with confidentiality measures.
Answer Description
Sensitive assets require stricter handling because unauthorized disclosure could adversely affect individuals or the organization. Employee salary records contain personal financial details that, if exposed, could lead to workplace disputes, privacy violations, or even identity-theft risk. Company announcements, marketing collateral, and already published historical data are intended for broad distribution and therefore do not warrant the sensitive label.
Ask Bash
What does it mean for an asset to be classified as sensitive information?
How does asset classification benefit an organization’s security strategy?
What are some examples of non-sensitive assets, and why are they considered less critical?
Which of the following statements BEST describes an organization's responsibility when setting data-retention periods?
Having an internal data-retention policy is unnecessary because external regulations always override it.
Internal data-retention policies must align with applicable legal and regulatory requirements to avoid compliance risks.
Organizations can set any retention period they prefer as long as it is shorter than regulatory minimums.
As long as internal retention rules are strictly enforced, alignment with external regulations is optional.
Answer Description
Internal data-retention schedules must be cross-checked against all applicable laws and industry regulations. Statutes such as HIPAA require covered documentation to be kept for six years, Sarbanes-Oxley requires certain audit records to be retained for seven years, and GDPR mandates that personal data not be stored longer than necessary. Failing to align internal policy with these external requirements can lead to fines, litigation, or regulatory action. Therefore, the only accurate statement is that internal policies must align with applicable legal and regulatory requirements.
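The cross-check the explanation describes, verifying that an internal retention period meets or exceeds each regulatory minimum, can be expressed as a small rule. The record-type keys below are invented for the example; the year values reflect the HIPAA and SOX figures cited above.

```python
# Illustrative regulatory minimum retention periods, in years.
# Keys are made-up identifiers; values follow the cited mandates.
REGULATORY_MINIMUM_YEARS = {
    "hipaa_documentation": 6,
    "sox_audit_records": 7,
}

def internal_policy_compliant(record_type: str, internal_years: int) -> bool:
    """An internal retention period must meet or exceed the regulatory minimum."""
    return internal_years >= REGULATORY_MINIMUM_YEARS.get(record_type, 0)
```

A retention period shorter than the mandate fails the check, which mirrors why "shorter than regulatory minimums" is a wrong answer above.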
Ask Bash
Why must internal data-retention policies align with external regulations?
What is the consequence of failing to align with legal data-retention mandates?
What are examples of regulations with specific data-retention periods?
Your organization is decommissioning a rack of self-encrypting solid-state drives (SSDs) that once stored highly sensitive customer records. The drives will be released to an outside recycling vendor, but they must first be sanitized so that the data can never be recovered, while still allowing the vendor to reuse the hardware. Which sanitization option BEST fulfills this requirement?
Perform a quick format of each partition
Physically shred each SSD into small fragments
Delete all files and empty the recycle bin
Issue the drive's Secure Erase (cryptographic erase) command and verify completion
Answer Description
Issuing the drive's Secure Erase (cryptographic erase) command performs a purge-level sanitization recognized by NIST SP 800-88. The command overwrites user-accessible and hidden cells or destroys the encryption key, rendering residual data mathematically unrecoverable while leaving the SSD functional for resale. Physical shredding would also eliminate the data but destroys the asset, contradicting the reuse goal. Quick format and simple deletion only alter file-system metadata and can be reversed with basic forensic tools.
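The principle behind cryptographic erase can be shown with a toy model: if all stored data is encrypted under a media key, destroying that key renders the ciphertext unrecoverable even though no cell is overwritten. The SHA-256 counter-mode keystream below is a stand-in for the drive's hardware AES engine; it is an illustration of the concept, not real cryptography.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream (SHA-256 in counter mode). Illustration only;
    a real self-encrypting drive uses a hardware AES engine."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The drive transparently encrypts everything with a media encryption key.
media_key = secrets.token_bytes(32)
stored = encrypt(media_key, b"sensitive customer record")

# Cryptographic erase: destroy the key. The ciphertext still sitting in
# the flash cells is now unrecoverable without it.
media_key = None
```

This is why cryptographic erase sanitizes the drive in seconds while leaving the hardware reusable.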
Ask Bash
Why is overwriting data multiple times considered effective for data deletion?
What is the difference between deleting and overwriting data?
Are there standards or tools for ensuring secure data destruction?
What refers to the geographical or logical whereabouts of data within an information system?
Data integrity
Data governance
Data residency
Data encryption
Answer Description
The correct answer defines where data resides, which is crucial for managing data privacy, compliance, and access control. Knowing the specific location of data helps in implementing appropriate security controls and effectively handling data according to regulations and organizational policies. The other options relate to aspects of data management but do not specifically indicate the concept of data location.
Ask Bash
Why is data residency important for compliance?
How does data residency differ from data governance?
What security controls are influenced by data residency?
An organization is replacing its outdated technology systems that are no longer supported. What should be the primary action taken regarding the equipment being retired?
Remove the hardware but retain data on it in case it is needed later.
Transfer the data to a cloud service directly from the legacy systems after appropriate preparation.
Format the hard drives and store the equipment in a secure location as a backup.
Ensure the elimination of sensitive data from the hardware before retirement.
Answer Description
Ensuring the elimination of sensitive data from retired hardware is crucial to prevent potential data breaches. This action needs to involve secure methods of data removal that make information irretrievable. The other options describe concrete activities, but none of them guarantees proper data deletion, leaving the organization exposed to unauthorized access and undermining its data security posture.
Ask Bash
What are secure methods of data removal?
Why is it important to ensure data is irretrievable before retiring hardware?
Are data wiping and formatting the same process?
A financial services company is migrating a large dataset containing sensitive customer PII to a new cloud platform. A debate has emerged within the project team regarding the specific encryption standards and access control policies that must be applied to the data in its new environment. According to established information security governance principles, which role holds the ultimate authority and accountability for making these decisions on data handling requirements?
Data owner
Data subject
Data custodian
Data processor
Answer Description
The correct answer is the data owner. The data owner is the role with the ultimate authority and accountability for data. This includes classifying the data and making decisions about protection, handling requirements, and access controls. While a data custodian implements technical controls and a data processor acts on the data, the data owner is the one who defines the policies and is answerable for the data's protection throughout its lifecycle. The data subject is the individual to whom the data pertains and does not have data management responsibilities.
Ask Bash
What is the difference between a data owner and a data custodian?
What is meant by the data lifecycle in this context?
How do data roles like data processor and data controller differ from the data owner?
During a scheduled audit of organizational resources, the security team must ensure that every resource is properly accounted for and assessed. What key action should the team prioritize to ensure thorough management?
Collect a comprehensive record of all resources and their risk assessments.
Inspect physical safeguards in place for securing critical infrastructure.
Implement training programs to increase awareness of information security.
Review the current policies regarding personnel access to sensitive systems.
Answer Description
The team should prioritize collecting a thorough inventory of all resources, as this is essential for identifying what the organization possesses and managing risks associated with those resources. This inventory facilitates compliance with security protocols and helps in identifying any vulnerabilities. The other options, while relevant to security, do not directly address the need for accurate documentation of resources.
Ask Bash
Why is collecting a comprehensive record of resources important for security management?
What is included in a resource inventory for security purposes?
How does a resource inventory support risk assessments?
During an audit, a multinational bank discovers that sensitive customer account information exchanged between its web application and back-end services is only protected by basic network firewalls. To reduce the risk of eavesdropping or tampering while the data is moving across public networks, which control should the security architect implement FIRST to provide both confidentiality and integrity for the data in transit?
Establish a TLS 1.3 encrypted channel for all customer transactions.
Deploy a network-based intrusion prevention system to detect malicious packets.
Implement strict role-based access control on the back-end database.
Configure secure hashing (SHA-256) of files before transmission.
Answer Description
Transport Layer Security (TLS) 1.3 establishes an encrypted channel that uses authenticated-encryption-with-associated-data (AEAD) ciphers. The encryption protects confidentiality, while the integrated message-authentication codes verify that packets have not been altered, thereby ensuring integrity. Controls such as IDS/IPS or firewalls focus on detection or traffic filtering, role-based access control secures data at rest or in use, and hashing files before transfer provides integrity only. None of those alternatives ensure both confidentiality and integrity for data crossing an untrusted network.
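Pinning a connection to TLS 1.3 is a one-line policy in most TLS libraries. The sketch below uses Python's standard `ssl` module; the hostname in the commented usage is a placeholder, not a real endpoint.

```python
import ssl

# Client-side context that refuses anything older than TLS 1.3.
# create_default_context() already enables certificate verification
# and hostname checking.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3

# Usage sketch (placeholder host):
# import socket
# with socket.create_connection(("example.com", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="example.com") as tls:
#         tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
```

Because TLS 1.3 uses only AEAD cipher suites, every record is both encrypted and authenticated, giving the confidentiality-plus-integrity pairing the question asks for.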
Ask Bash
What are encryption protocols, and why are they important for data transmission?
How do encryption protocols ensure both confidentiality and integrity?
What is the difference between SSL/TLS and IPsec encryption protocols?
A financial services organization is decommissioning several servers that were used to process and store highly sensitive customer financial data. The servers contain solid-state drives (SSDs). According to the company's security policy, which is aligned with NIST 800-88, the data on these drives must be made irrecoverable using the method that provides the highest level of assurance. Which of the following procedures BEST meets this requirement?
Degaussing the SSDs with a certified, high-power degausser
Physically shredding the SSDs to a particle size of 2mm or less
Executing a cryptographic erase (CE) command on the drives
Performing a multipass overwrite using a DoD 5220.22-M compliant tool
Answer Description
According to NIST 800-88, physical destruction (Destroy) provides the highest level of assurance that data is irrecoverable. For SSDs, methods like shredding to a small particle size ensure that the memory chips are physically destroyed, making data recovery impossible. Cryptographic erase is a valid 'Purge' method but relies on the correct implementation of the drive's firmware and encryption, which carries a residual risk if compromised. Degaussing is ineffective on SSDs as they are not magnetic media. Multipass overwriting is also not recommended for SSDs due to wear-leveling and over-provisioning, which can leave data remnants in unaddressable areas.
Ask Bash
Why does rendering a storage device unusable provide the highest assurance of data elimination?
What is the difference between overwriting and physical destruction of a storage device for data removal?
What are some effective methods to physically destroy a storage device?
A financial services company schedules nightly backups of customer databases. The backup system initiates a TLS 1.3 tunnel and copies the encrypted backup files from the primary data center in Chicago to a geographically separate disaster-recovery site in Denver. During the actual copy operation across the leased MPLS link, in which data state must the backup files be protected?
The data is archived until it reaches the disaster-recovery site.
The data is in use while the backup application accesses it.
The data is at rest at the primary data center until the transfer ends.
The data is in transit during the copy across the network.
Answer Description
Because the files are actively moving between two locations, they are in the "in transit" state. Data in transit should be protected with controls such as strong end-to-end encryption and authenticated connections. Data at rest would apply once the files are stored at either site, and data in use would apply only when the files are opened and processed in memory. The term "archived" refers to long-term inactive storage, which is not occurring during the copy operation.
Ask Bash
What does 'data in transit' mean?
How does encryption protect data in transit?
How is 'data at rest' different from 'data in transit'?
Under the GDPR purpose-limitation principle, which practice best helps an organization remain compliant when it designs an online form to collect personal data from customers?
Use a blanket consent statement that allows the organization to repurpose the data for any future processing.
Document and disclose, before collection, exactly which personal data will be collected and the legitimate purposes for each field.
Rely on the corporate privacy policy alone and omit purpose statements on the form to avoid confusing customers.
Request every piece of information that could be useful in future projects, provided the data is stored securely.
Answer Description
Documenting and disclosing, in advance, the exact personal data fields and the specific lawful purposes for each satisfies the GDPR's requirement that data be collected only for "specified, explicit and legitimate purposes." This transparency lets data subjects understand how their information will be used and allows auditors to verify compliance. The other choices either encourage collecting data "just in case," rely on blanket consent for undefined future uses, or omit purpose statements altogether, all of which breach the purpose-limitation and transparency obligations.
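Purpose limitation can be enforced at design time by declaring, per form field, what is collected and why, and refusing to collect anything without a documented purpose. The field names and purpose strings below are invented for illustration.

```python
# Declare, per form field, the specific purpose before collection.
# Field names and purposes are illustrative assumptions.
FORM_FIELDS = {
    "email": "Send order confirmations and account notices",
    "shipping_address": "Deliver purchased goods",
    "date_of_birth": None,  # no documented purpose -> must not be collected
}

def fields_to_collect(fields: dict) -> list:
    """Collect only fields that have a documented, specific purpose."""
    return [name for name, purpose in fields.items() if purpose]
```

Fields without a purpose are simply never rendered on the form, which operationalizes "specified, explicit and legitimate purposes" rather than relying on blanket consent.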
Ask Bash
What is the GDPR purpose-limitation principle?
Why is documenting and disclosing data collection purposes important?
What happens if an organization uses blanket consent forms?
Who holds the ultimate responsibility for the data within an organization?
Data processor
Data owner
Data subject
Data custodian
Answer Description
The data owner is the individual who is ultimately responsible for the data. They have the authority to make decisions regarding data handling, classification, and access. This role includes defining who can access the data and determining what protections are necessary. Other roles, such as custodians and processors, carry out responsibilities for the data, but they do not have the final authority on data management decisions.
Ask Bash
What are the primary responsibilities of a data owner?
How does the role of a data custodian differ from that of a data owner?
Why is it important to distinguish between the data owner and other roles like data processors?
That's It!
Looks like that's it! You can go back and review your answers or click the button below to grade your test.