[Free] New Updated (October) ISC SSCP Real Exam 71-80

Ensurepass

 

QUESTION 71

What is the primary role of smartcards in a PKI?

 

A.

Transparent renewal of user keys

B.

Easy distribution of the certificates between the users

C.

Fast hardware encryption of the raw data

D.

Tamper resistant, mobile storage and application of private keys of the users

 

Correct Answer: D

Explanation:

Reference:

HARRIS, Shon, All-In-One CISSP Certification Exam Guide, 2001, McGraw-Hill/Osborne, page 139; SNYDER, J., What is a SMART CARD?.

 

Wikipedia has a nice definition at: http://en.wikipedia.org/wiki/Tamper_resistance

 

Security

Tamper-resistant microprocessors are used to store and process private or sensitive information, such as private keys or electronic money credit. To prevent an attacker from retrieving or modifying the information, the chips are designed so that the information is not accessible through external means and can be accessed only by the embedded software, which should contain the appropriate security measures.

 

Examples of tamper-resistant chips include all secure cryptoprocessors, such as the IBM 4758 and chips used in smartcards, as well as the Clipper chip.

 

It has been argued that it is very difficult to make simple electronic devices secure against tampering, because numerous attacks are possible, including:

 

physical attack of various forms (microprobing, drills, files, solvents, etc.)

 

freezing the device

 

applying out-of-spec voltages or power surges

 

applying unusual clock signals

 

inducing software errors using radiation

 

measuring the precise time and power requirements of certain operations (see power analysis)

 

Tamper-resistant chips may be designed to zeroise their sensitive data (especially cryptographic keys) if they detect penetration of their security encapsulation or out-of- specification environmental parameters. A chip may even be rated for “cold zeroisation”, the ability to zeroise itself even after its power supply has been crippled.

 

Nevertheless, the fact that an attacker may have the device in his possession for as long as he likes, and perhaps obtain numerous other samples for testing and practice, means that it is practically impossible to totally eliminate tampering by a sufficiently motivated opponent. Because of this, one of the most important elements in protecting a system is overall system design. In particular, tamper-resistant systems should “fail gracefully” by ensuring that compromise of one device does not compromise the entire system. In this manner, the attacker can be practically restricted to attacks that cost less than the expected return from compromising a single device (plus, perhaps, a little more for kudos).

 

Since the most sophisticated attacks have been estimated to cost several hundred thousand dollars to carry out, carefully designed systems may be invulnerable in practice.
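The zeroisation behaviour described above can be sketched as a toy model in Python. This is purely illustrative (the class and method names are invented for the example); real smartcards implement this in tamper-sensing hardware, not in application code.

```python
import hmac
import hashlib

class TamperResistantKeyStore:
    """Toy model of a tamper-resistant key store: the private key never
    leaves the object, and a detected tamper event zeroises it."""

    def __init__(self, key: bytes):
        self._key = bytearray(key)   # mutable, so it can be overwritten
        self._zeroised = False

    def sign(self, message: bytes) -> bytes:
        # Stand-in for an on-card operation: the key is used internally
        # but is never exported to the caller.
        if self._zeroised:
            raise RuntimeError("key material has been zeroised")
        return hmac.new(bytes(self._key), message, hashlib.sha256).digest()

    def on_tamper_detected(self) -> None:
        # Overwrite the key in place before marking the store unusable.
        for i in range(len(self._key)):
            self._key[i] = 0
        self._zeroised = True
```

After `on_tamper_detected()` runs, the key bytes are all zero and any further `sign()` call fails, mirroring the "zeroise on penetration" behaviour the text describes.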

 

 

QUESTION 72

What security model implies a central authority that defines rules, and sometimes global rules, dictating what subjects can have access to what objects?

 

A.

Flow Model

B.

Discretionary access control

C.

Mandatory access control

D.

Non-discretionary access control

 

Correct Answer: D

Explanation:

As a security administrator you might configure user profiles so that users cannot change the system’s time, alter system configuration files, access a command prompt, or install unapproved applications. This type of access control is referred to as nondiscretionary, meaning that access decisions are not made at the discretion of the user. Nondiscretionary access controls are put into place by an authoritative entity (usually a security administrator) with the goal of protecting the organization’s most critical assets.

 

Non-discretionary access control is when a central authority determines what subjects can have access to what objects based on the organizational security policy. Centralized access control is not an existing security model.

Both Rule-Based Access Control (RuBAC) and Role-Based Access Control (RBAC) fall into this category.

 

Reference(s) used for this question:

 

Harris, Shon (2012-10-18). CISSP All-in-One Exam Guide, 6th Edition (p. 221). McGraw-Hill. Kindle Edition.

KRUTZ, Ronald L.& VINES, Russel D., The CISSP Prep Guide: Mastering the Ten Domains of Computer Security, John Wiley & Sons, 2001, Chapter 2: Access control systems (page 33).
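The central-policy idea can be sketched in a few lines of Python. The role and permission names below are invented for the example; the point is that the access decision is derived entirely from a centrally administered policy, not from the individual user's discretion.

```python
# Role and permission names are invented for illustration.
ROLE_PERMISSIONS = {          # defined centrally by the security administrator
    "clerk":   {"read:invoice"},
    "auditor": {"read:invoice", "read:audit_log"},
}

USER_ROLES = {"alice": "clerk", "bob": "auditor"}

def is_allowed(user: str, permission: str) -> bool:
    # The user cannot grant or change access; the decision comes from
    # the central policy -- the essence of non-discretionary control.
    role = USER_ROLES.get(user)
    return permission in ROLE_PERMISSIONS.get(role, set())
```

Here only the administrator can change `ROLE_PERMISSIONS` or `USER_ROLES`; a user such as alice has no way to grant herself access to the audit log.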

 

 

QUESTION 73

Which of the following divisions is defined in the TCSEC (Orange Book) as minimal protection?

 

A.

Division D

B.

Division C

C.

Division B

D.

Division A

 

Correct Answer: A

Explanation:

The criteria are divided into four divisions: D, C, B, and A ordered in a hierarchical manner with the highest division (A) being reserved for systems providing the most comprehensive security.

 

Each division represents a major improvement in the overall confidence one can place in the system for the protection of sensitive information.

 

Within divisions C and B there are a number of subdivisions known as classes. The classes are also ordered in a hierarchical manner, with systems representative of division C and lower classes of division B being characterized by the set of computer security mechanisms that they possess.

 

Assurance of correct and complete design and implementation for these systems is gained mostly through testing of the security- relevant portions of the system. The security-relevant portions of a system are referred to throughout this document as the Trusted Computing Base (TCB).

 

Systems representative of higher classes in division B and division A derive their security attributes more from their design and implementation structure. Increased assurance that the required features are operative, correct, and tamperproof under all circumstances is gained through progressively more rigorous analysis during the design process.

 

TCSEC provides a classification system that is divided into hierarchical divisions of assurance levels:

 

Division D – minimal security

Division C – discretionary protection

Division B – mandatory protection

Division A – verified protection

 

Reference:

page 358 AIO V.5 Shon Harris

Source: KRUTZ, Ronald L.& VINES, Russel D., The CISSP Prep Guide: Mastering the Ten Domains of Computer Security, page 197.

 

Also:

THE source for all TCSEC “level” questions:

http://csrc.nist.gov/publications/secpubs/rainbow/std001.txt

 

 

QUESTION 74

The control measures that are intended to reveal the violations of security policy using software and hardware are associated with:

 

A.

Preventive/physical

B.

Detective/technical

C.

Detective/physical

D.

Detective/administrative

 

Correct Answer: B

Explanation:

The detective/technical control measures are intended to reveal the violations of security policy using technical means.

Source: KRUTZ, Ronald L.& VINES, Russel D., The CISSP Prep Guide: Mastering the Ten Domains of Computer Security, 2001, John Wiley & Sons, Page 35.


 

QUESTION 75

Which TCSEC level is labeled Controlled Access Protection?

 

A.

C1

B.

C2

C.

C3

D.

B1

 

Correct Answer: B

Explanation:

C2 is labeled Controlled Access Protection.

 

The TCSEC defines four divisions: D, C, B and A where division A has the highest security.

 

Each division represents a significant difference in the trust an individual or organization can place on the evaluated system. Additionally divisions C, B and A are broken into a series of hierarchical subdivisions called classes: C1, C2, B1, B2, B3 and A1.

 

Each division and class expands or modifies as indicated the requirements of the immediately prior division or class.

D — Minimal protection

 

Reserved for those systems that have been evaluated but that fail to meet the requirements for a higher division

 

C — Discretionary protection

 

C1 — Discretionary Security Protection

Identification and authentication

Separation of users and data

Discretionary Access Control (DAC) capable of enforcing access limitations on an individual basis

Required System Documentation and user manuals

C2 — Controlled Access Protection

More finely grained DAC

Individual accountability through login procedures

Audit trails

Object reuse

Resource isolation

 

B — Mandatory protection

 

B1 — Labeled Security Protection

Informal statement of the security policy model

Data sensitivity labels

Mandatory Access Control (MAC) over selected subjects and objects

Label exportation capabilities

All discovered flaws must be removed or otherwise mitigated

Design specifications and verification

B2 — Structured Protection

Security policy model clearly defined and formally documented

DAC and MAC enforcement extended to all subjects and objects

Covert storage channels are analyzed for occurrence and bandwidth

Carefully structured into protection-critical and non-protection-critical elements

Design and implementation enable more comprehensive testing and review

Authentication mechanisms are strengthened

Trusted facility management is provided with administrator and operator segregation

Strict configuration management controls are imposed

B3 — Security Domains

Satisfies reference monitor requirements

Structured to exclude code not essential to security policy enforcement

Significant system engineering directed toward minimizing complexity

Security administrator role defined

Audit security-relevant events

Automated imminent intrusion detection, notification, and response

Trusted system recovery procedures

Covert timing channels are analyzed for occurrence and bandwidth

An example of such a system is the XTS-300, a precursor to the XTS-400

 

A — Verified protection

 

A1 — Verified Design

Functionally identical to B3

Formal design and verification techniques including a formal top-level specification

Formal management and distribution procedures

An example of such a system is Honeywell’s Secure Communications Processor SCOMP, a precursor to the XTS-400

Beyond A1

System Architecture demonstrates that the requirements of self-protection and completeness for reference monitors have been implemented in the Trusted Computing Base (TCB).

Security Testing automatically generates test cases from the formal top-level specification or formal lower-level specifications.

Formal Specification and Verification is where the TCB is verified down to the source code level, using formal verification methods where feasible.

 

Trusted Design Environment is where the TCB is designed in a trusted facility with only trusted (cleared) personnel.

 

The following are incorrect answers:

 

C1 is Discretionary Security Protection.

C3 does not exist; it is only a distractor.

B1 is called Labeled Security Protection.

 

Reference(s) used for this question:

 

HARE, Chris, Security Management Practices CISSP Open Study Guide, version 1.0, April 1999.

 

AIOv4 Security Architecture and Design (pages 357-361); AIOv5 Security Architecture and Design (pages 358-362).

 

 

QUESTION 76

Because all the secret keys are held and authentication is performed on the Kerberos TGS and the authentication servers, these servers are vulnerable to:

 

A.

neither physical attacks nor attacks from malicious code.

B.

physical attacks only

C.

both physical attacks and attacks from malicious code.

D.

physical attacks but not attacks from malicious code.

 

Correct Answer: C

Explanation:

Since all the secret keys are held and authentication is performed on the Kerberos TGS and the authentication servers, these servers are vulnerable to both physical attacks and attacks from malicious code.

Because a client’s password is used in the initiation of the Kerberos request for the service protocol, password guessing can be used to impersonate a client.

Source: KRUTZ, Ronald L.& VINES, Russel D., The CISSP Prep Guide: Mastering the Ten Domains of Computer Security, 2001, John Wiley & Sons, Page 42.

 

 

QUESTION 77

The Orange Book is founded upon which security policy model?

 

A.

The Biba Model

B.

The Bell LaPadula Model

C.

Clark-Wilson Model

D.

TEMPEST

 

Correct Answer: B

Explanation:


From the glossary of Computer Security Basics:

The Bell-LaPadula model is the security policy model on which the Orange Book requirements are based. From the Orange Book definition, “A formal state transition model of computer security policy that describes a set of access control rules. In this formal model, the entities in a computer system are divided into abstract sets of subjects and objects. The notion of secure state is defined and it is proven that each state transition preserves security by moving from secure state to secure state; thus, inductively proving the system is secure. A system state is defined to be ‘secure’ if the only permitted access modes of subjects to objects are in accordance with a specific security policy. In order to determine whether or not a specific access mode is allowed, the clearance of a subject is compared to the classification of the object and a determination is made as to whether the subject is authorized for the specific access mode.”

The Biba Model is an integrity model of computer security policy that describes a set of rules. In this model, a subject may not depend on any object or other subject that is less trusted than itself.

The Clark Wilson Model is an integrity model for computer security policy designed for a commercial environment. It addresses such concepts as nondiscretionary access control, privilege separation, and least privilege.

TEMPEST is a government program that prevents the compromising electrical and electromagnetic signals that emanate from computers and related equipment from being intercepted and deciphered.

Source: RUSSEL, Deborah & GANGEMI, G.T. Sr., Computer Security Basics, O’Reilly, 1991.

Also: U.S. Department of Defense, Trusted Computer System Evaluation Criteria (Orange Book), DOD 5200.28-STD. December 1985 (also available here).
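As a sketch (the level names and function names are illustrative, not from the source), the two Bell-LaPadula rules reduce to comparisons between a subject's clearance and an object's classification:

```python
# Example classification lattice; the level names are illustrative.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "TopSecret": 3}

def blp_read_allowed(subject_clearance: str, object_label: str) -> bool:
    # Simple security property: no read up.
    return LEVELS[subject_clearance] >= LEVELS[object_label]

def blp_write_allowed(subject_clearance: str, object_label: str) -> bool:
    # * (star) property: no write down, so information cannot leak
    # from a higher level into a lower-level object.
    return LEVELS[subject_clearance] <= LEVELS[object_label]
```

So a Secret-cleared subject may read Confidential data but not write to it, preventing downward information flow.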

 

 

QUESTION 78

Which of the following is an example of a passive attack?

 

A.

Denying services to legitimate users

B.

Shoulder surfing

C.

Brute-force password cracking

D.

Smurfing

 

Correct Answer: B

Explanation:

Shoulder surfing is a form of a passive attack involving stealing passwords, personal identification numbers or other confidential information by looking over someone’s shoulder. All other forms of attack are active attacks, where a threat makes a modification to the system in an attempt to take advantage of a vulnerability.

Source: HARRIS, Shon, All-In-One CISSP Certification Exam Guide, McGraw-Hill/Osborne, 2002, chapter 3: Security Management Practices (page 63).

 

 

QUESTION 79

What does the * (star) integrity axiom mean in the Biba model?

 

A.

No read up

B.

No write down

C.

No read down

D.

No write up

 

Correct Answer: D

Explanation:

The * (star) integrity axiom of the Biba access control model states that a subject at one level of integrity is not permitted to modify an object at a higher level of integrity (no write up).

Source: KRUTZ, Ronald L.& VINES, Russel D., The CISSP Prep Guide: Mastering the Ten Domains of Computer Security, John Wiley & Sons, 2001, Chapter 5: Security Architectures and Models (page 205).
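As a sketch (the integrity-level names and function names are illustrative), the Biba axioms reduce to level comparisons, the integrity dual of Bell-LaPadula:

```python
# Example integrity levels, invented for illustration.
INTEGRITY = {"Low": 0, "Medium": 1, "High": 2}

def biba_write_allowed(subject_level: str, object_level: str) -> bool:
    # * (star) integrity axiom: no write up -- a subject may not
    # modify an object at a higher integrity level.
    return INTEGRITY[subject_level] >= INTEGRITY[object_level]

def biba_read_allowed(subject_level: str, object_level: str) -> bool:
    # Simple integrity axiom: no read down -- a subject may not be
    # contaminated by lower-integrity data.
    return INTEGRITY[subject_level] <= INTEGRITY[object_level]
```

A Low-integrity subject can thus never corrupt High-integrity data, which is exactly the "no write up" rule the question tests.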

 

 

QUESTION 80

Almost all types of detection permit a system’s sensitivity to be increased or decreased during an inspection process. If the system’s sensitivity is increased, such as in a biometric authentication system, the system becomes increasingly selective and has the possibility of generating:

 

A.

Lower False Rejection Rate (FRR)

B.

Higher False Rejection Rate (FRR)

C.

Higher False Acceptance Rate (FAR)

D.

It will not affect either FAR or FRR

 

Correct Answer: B

Explanation:

Almost all types of detection permit a system’s sensitivity to be increased or decreased during an inspection process. If the system’s sensitivity is increased, such as in a biometric authentication system, the system becomes increasingly selective and has a higher False Rejection Rate (FRR).

 

Conversely, if the sensitivity is decreased, the False Acceptance Rate (FAR) will increase. Thus, to have a valid measure of the system performance, the Crossover Error Rate (CER) is used. The Crossover Error Rate (CER) is the point at which the false rejection rate and the false acceptance rate are equal. The lower the value of the CER, the more accurate the system.

 

There are three categories of biometric accuracy measurement (all represented as percentages):

 

False Reject Rate (a Type I Error): When authorized users are falsely rejected as unidentified or unverified.

 

False Accept Rate (a Type II Error): When unauthorized persons or imposters are falsely accepted as authentic.

 

Crossover Error Rate (CER): The point at which the false rejection rates and the false acceptance rates are equal. The smaller the value of the CER, the more accurate the system.

 

NOTE:

Within the ISC2 book, the terms Accept or Acceptance and Reject or Rejection are both used when referring to the types of errors within biometrics. Below we use Acceptance and Rejection throughout the text for consistency. However, on the real exam you could see either of the terms.

Performance of biometrics

 

Different metrics can be used to rate the performance of a biometric factor, solution or application. The most common performance metrics are the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).

 

When using a biometric application for the first time, the user needs to enroll in the system. The system requests fingerprints, a voice recording, or another biometric factor from the user; this input is registered in the database as a template which is linked internally to a user ID. The next time the user wants to authenticate or identify himself, the biometric input provided by the user is compared to the template(s) in the database by a matching algorithm, which responds with acceptance (match) or rejection (no match).

FAR and FRR

The FAR, or False Acceptance Rate, is the probability that the system incorrectly authorizes a non-authorized person, due to incorrectly matching the biometric input with a valid template. The FAR is normally expressed as a percentage: per the FAR definition, it is the percentage of invalid inputs which are incorrectly accepted.

 

The FRR, or False Rejection Rate, is the probability that the system incorrectly rejects access to an authorized person, due to failing to match the biometric input provided by the user with a stored template. The FRR is normally expressed as a percentage: per the FRR definition, it is the percentage of valid inputs which are incorrectly rejected.

 

FAR and FRR are very much dependent on the biometric factor that is used and on the technical implementation of the biometric solution. Furthermore, the FRR is strongly person dependent; a personal FRR can be determined for each individual.

 

Take this into account when determining the FRR of a biometric solution: one person is insufficient to establish an overall FRR for a solution. The FRR might also increase due to environmental conditions or incorrect use, for example when using dirty fingers on a fingerprint reader. The FRR usually drops as a user gains more experience in how to use the biometric device or software.

 

FAR and FRR are key metrics for biometric solutions; some biometric devices or software even allow them to be tuned so that the system matches or rejects more readily. Both FRR and FAR are important, but for most applications one of them is considered most important.

 

Two examples to illustrate this:

 

When biometrics are used for logical or physical access control, the objective of the application is to disallow access to unauthorized individuals under all circumstances. It is clear that a very low FAR is needed for such an application, even if it comes at the price of a higher FRR.

 

When surveillance cameras are used to screen a crowd of people for missing children, the objective of the application is to identify any missing children that come up on the screen. When the identification of those children is automated using face recognition software, this software has to be set up with a low FRR. A higher number of matches will then be false positives, but these can be reviewed quickly by surveillance personnel.

 

False Acceptance Rate is also called False Match Rate, and False Rejection Rate is sometimes referred to as False Non-Match Rate.

[Figure: FAR and FRR error curves plotted against threshold, indicating the crossover error rate (CER)]

 

CER

The Crossover Error Rate or CER is illustrated on the graph above. It is the rate where both FAR and FRR are equal.

 

The matching algorithm in biometric software or a device uses a (configurable) threshold which determines how close to a template the input must be for it to be considered a match. This threshold value is in some cases referred to as sensitivity; it is marked on the X axis of the plot. When you reduce this threshold there will be more false acceptance errors (higher FAR) and fewer false rejection errors (lower FRR); a higher threshold will lead to lower FAR and higher FRR.
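The threshold trade-off can be demonstrated numerically. The sketch below uses made-up matcher scores (higher score means a closer match) and sweeps the threshold to find the point where FAR and FRR are closest, an approximation of the CER:

```python
# Made-up matcher scores for illustration only.
genuine_scores  = [0.9, 0.85, 0.8, 0.75, 0.6, 0.55]   # authorized users
impostor_scores = [0.7, 0.5, 0.45, 0.4, 0.3, 0.2]     # unauthorized users

def far(threshold):
    # FAR: fraction of impostor attempts incorrectly accepted.
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

def frr(threshold):
    # FRR: fraction of genuine attempts incorrectly rejected.
    return sum(s < threshold for s in genuine_scores) / len(genuine_scores)

# Sweep the threshold and take the point where FAR and FRR are closest:
# an approximation of the Crossover Error Rate (CER).
thresholds = [t / 100 for t in range(0, 101)]
cer_threshold = min(thresholds, key=lambda t: abs(far(t) - frr(t)))
```

Lowering the threshold raises `far` and lowers `frr`, exactly the trade-off described above; with these particular scores the sweep settles near a threshold of 0.56, where FAR and FRR are equal.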

 

Speed

Most manufacturers of biometric devices and software can give clear numbers on the time it takes to enroll, as well as on the time it takes for an individual to be authenticated or identified using their application. If speed is important, take the time to consider this: 5 seconds might seem a short time on paper or when testing a device, but if hundreds of people will use the device multiple times a day, the cumulative loss of time might be significant.

 

Reference(s) used for this question:

 

Hernandez CISSP, Steven (2012-12-21). Official (ISC)2 Guide to the CISSP CBK, Third Edition ((ISC)2 Press) (Kindle Locations 2723-2731). Auerbach Publications. Kindle Edition.

KRUTZ, Ronald L.& VINES, Russel D., The CISSP Prep Guide: Mastering the Ten Domains of Computer Security, 2001, John Wiley & Sons, Page 37.

http://www.biometric-solutions.com/index.php?story=performance_biometrics
