The Computer Forensics and Cybersecurity Governance Model



Copyright © 2003 Information Systems Audit and Control Association. All rights reserved. www.isaca.org.

The Computer Forensics and Cybersecurity Governance Model
An Analysis of the Role of Technology Assurance in Understanding the Risks and Controls Governing Computer Forensics and Preventive Information Security Defenses Over Homeland Security

By Kenneth C. Brancik, CISA

The Computer Forensics and Cybersecurity Governance Model (figure 1), as described in this article, provides a logical and compartmentalized approach for assessing risk. Specifically, technology assurance procedures were developed as a major component of the computer forensics and cybersecurity examination methodology to provide structure to the risk assessment process. Corporate governance over computer forensics and cybersecurity is a partnership among employees, management, the audit committee, the board of directors, regulators and consultants. Establishing a partnership with each individual within a corporation or government agency is crucial to the initial success of establishing and maintaining a workable governance model. The execution and enforcement of the governance model typically is performed by a technology assurance professional. The communication reporting chain of command starts from the field auditor or examiner, who reports findings to a supervisor, and then to the CIO and CEO of the company or government agency. Conclusions reached and recommendations made by the technology assurance personnel are then communicated to the audit committee and finally to the board of directors. Any material high-risk issues are then reported to the regulators and law enforcement as needs warrant.

Figure 1—The Computer Forensics and Cybersecurity Governance Model
[Diagram: A Macro Processes Overview (cross-reference figure 5); Homeland Security Composite Rating for Computer Forensics and Cybersecurity Preparedness (cross-reference figure 2)]

The Governance Model’s Processes
There are three prerequisite steps needed for the effective use of the governance model and for understanding whether an intersection exists between information security (infosec) and computer forensics. For each of the following three steps, the primary plan used for conducting the risk assessment is the model's examination processes contained within this document.
• Step 1: Infosec risk assessment (prevention)—The unique infosec risk profile of an entity differs depending on its infrastructure (e.g., operating system, network) and the applications and systems it supports. Senior management needs to understand the organization's specific infosec risks by ensuring that a continual risk assessment is performed on the infrastructure components and applications determined to be the most critical to supporting the entity's operations. Upon completion of the model's examination procedures, a color risk scorecard is prepared based on management's perceived infosec risk assessment (see figures 2 and 3).
• Step 2: Computer forensic assessment (detection)—The model functions in a dual capacity, both for infosec risk assessment (prevention) and for performing a post mortem forensic analysis (detection) of a suspected unauthorized intrusion or hack. Performing step 2 provides senior management a clearer understanding of the weaknesses and vulnerabilities in the infosec safeguards that were not identified in the initial risk assessment performed in step 1. Upon completion of the model's examination processes, a color scorecard is prepared on management's actual infosec risks (see figures 2 and 3).
• Step 3: Analyzing the intersection between infosec and computer forensics—The color infosec (prevention) and computer forensic (detection) risk scorecards completed in steps 1 and 2 can now be analyzed. Specifically, the infosec analyst needs to compare and contrast the findings of the two color scorecards to identify similarities and, more importantly, disparities between the two scorecard ratings.
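The step 3 comparison can be sketched mechanically. Everything in this sketch is illustrative, not part of the model: the file names, the one-`component:rating`-pair-per-line format, and the sample ratings are all assumptions.

```shell
# Fabricated prevention (step 1) and detection (step 2) scorecards,
# one "component:rating" pair per line, sorted by component for join.
cat > prevention.txt <<'EOF'
cirt:A
firewall:B
journaling:D
EOF
cat > detection.txt <<'EOF'
cirt:A
firewall:B
journaling:E
EOF

# Join the two scorecards on component and flag any rating disparity;
# matching rows (an "intersection") validate management's attestation.
join -t: prevention.txt detection.txt |
  awk -F: '$2 != $3 {print $1 ": prevention=" $2 " detection=" $3 " -> investigate"}'
```

Only `journaling` is flagged here, mirroring the article's rule that a color disparity triggers further investigation and root cause analysis while matching ratings validate the original attestation.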

[Figure 1 (continued)—Heat Map Risk Scorecard components: Validation of CIRT; Validation of Journaling; Validation of Cybersecurity; Validation of Firewall Security; Validation of IDS, Vulnerability Assessment and Other Network Testing; Validation of Infosec Policies and Procedures; Management Risk Assessment of Computer Forensics and Cybersecurity Preparedness]

INFORMATION SYSTEMS CONTROL JOURNAL, VOLUME 2, 2003

Figure 2—Computer Forensics and Cybersecurity Examination Methodology
Step 1: Complete the Computer Forensics and Cybersecurity Governance Model exam procedures.
Step 2: Complete the Process and Practice Risk Scorecards.
Step 3: Complete the Heatmap (a consolidation of all color-coded risk scorecards).
Step 4: Evaluate the Process Risk Scorecard for completeness and accuracy against the computer forensic and cybersecurity policy and procedures ("processes").
Step 5: Complete the Computer Forensics Examination Procedures—Cybersecurity Risk Assessment (process review of policies and procedures).
Step 6: Complete a Process Conclusion Memo describing strengths and weaknesses of computer forensic and cybersecurity processes (policy and procedures).
Step 7: Complete the Computer Forensics and Cybersecurity Examination Procedures, Practices ("the practice review").
Step 8: Evaluate the management-prepared Practice Risk Scorecard for completeness and accuracy against the results of the recently completed Computer Forensics and Cybersecurity Examination Procedures.
Step 9: Complete a Practice Conclusion Memo describing strengths and weaknesses of computer forensic and cybersecurity practices.
Step 10: Make revisions to the Process and Practice Risk Scorecards and Heatmap based on the completed Computer Forensics and Cybersecurity Examination Procedures ("the final risk scorecard," adjusted after the forensic examination).
Step 11: Develop an Executive Summary Report that summarizes the final conclusion on computer forensic and cybersecurity risks ("the reporting process").
Responsibility: Step 1 is performed by senior management as a computer forensics/cybersecurity risk self-assessment (process review); steps 2 through 11 are performed by assurance, infosec, regulatory and/or law enforcement (LE) personnel.


Figure 3—The Computer Forensics and Cybersecurity Governance Model: The Risk Assessment Scorecard Criteria

Rating A
• Confidentiality: Infosec policies and procedures provide strong documented controls to protect data confidentiality.
• Integrity: Infosec policies and procedures strongly protect data integrity through a high level of controls governing authenticity, nonrepudiation and accountability.
• Availability: Infosec policies and procedures provide a strong internal control standard to protect the timely availability of information technology resources (system and data).

Rating B
• Confidentiality: Infosec policies and procedures provide adequate documented controls to protect data confidentiality.
• Integrity: Infosec policies and procedures adequately protect data integrity through a high level of controls governing authenticity, nonrepudiation and accountability.
• Availability: Infosec policies and procedures provide reasonable internal control standards to protect the timely availability of information technology resources (system and data).

Rating C
• Confidentiality: Infosec policies and procedures need strengthening to ensure the existence of strong documented controls to protect data confidentiality.
• Integrity: Infosec policies and procedures need strengthening to ensure a strong level of data integrity exists through a high level of controls governing authenticity, nonrepudiation and accountability.
• Availability: Infosec policies and procedures need strengthening to protect the timely availability of information technology resources (system and data).

Rating D
• Confidentiality: Infosec policies and procedures are inadequate to ensure the existence of strong documented controls to protect data confidentiality.
• Integrity: Infosec policies and procedures are inadequate to ensure a strong level of data integrity exists through a high level of controls governing authenticity, nonrepudiation and accountability.
• Availability: Infosec policies and procedures are inadequate to protect the timely availability of information technology resources (system and data).

Rating E
• Confidentiality: Infosec policies and procedures are critically deficient to ensure the existence of strong documented controls to protect data confidentiality.
• Integrity: Infosec policies and procedures are critically deficient to ensure a strong level of data integrity exists through a high level of controls governing authenticity, nonrepudiation and accountability.
• Availability: Infosec policies and procedures are critically deficient and require fundamental improvement to protect the timely availability of information technology resources (system and data).
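The parallel wording of the A–E scorecard criteria and the Homeland Security advisory levels (figure 4) suggests a direct correspondence: strong maps to Low, adequate to Guarded, and so on down to critically deficient mapping to Severe. The sketch below encodes that roll-up; note the mapping is an inference from the two figures' wording, not a rule the article states explicitly.

```shell
# Map a component's A-E scorecard rating to an advisory level.
# The correspondence is inferred from the matching adjectives in
# figures 3 and 4 (strong/adequate/need strengthening/inadequate/
# critically deficient); treat it as an assumption.
rating_to_level() {
  case "$1" in
    A) echo "Low (strong)" ;;
    B) echo "Guarded (adequate)" ;;
    C) echo "Elevated (needs strengthening)" ;;
    D) echo "High (inadequate)" ;;
    E) echo "Severe (critically deficient)" ;;
    *) echo "unknown rating: $1" ;;
  esac
}

rating_to_level E   # prints: Severe (critically deficient)
```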

An intersection exists when the prevention and detection color scorecard ratings match, which validates management's original risk attestation (prevention) based on the completed computer forensic analysis performed in step 2. Conversely, an intersection does not exist when there is a disparity in the color risk ratings assigned during the infosec (prevention) and computer forensic (detection) processes. As a result of such a disparity, further investigation and root cause analysis need to be completed and the problem(s) resolved to prevent future occurrences of the infosec vulnerabilities. Included within figure 5 are the procedures that should be followed by assurance, infosec and regulatory professionals when implementing the Computer Forensics and Cybersecurity Governance Model and the Homeland Security Advisory System. Additional supporting details can be found throughout the article.

There is a need for comprehensive common criteria that allow public and private sector entities to understand the risks and controls governing computer forensics and cybersecurity. Recent federal legislation, as noted within Homeland Security PDD3, signed by US President George W. Bush on 11 March 2002, provides a comprehensive and effective means to disseminate information regarding the risk of terrorist acts to federal, state and local authorities and to the American people, but it is focused exclusively on protection against terrorist threats to the nation's physical infrastructure and does not establish common criteria for measuring and understanding cybersecurity risks. The key objective of the Computer Forensics and Cybersecurity Governance Model is to establish an initial governance framework that attempts to clarify the role of technology assurance in understanding the risks and controls governing computer forensics and preventive information security defenses. Raising awareness and understanding of the risks and controls governing computer forensics, and of their interrelationship with preventive information security defenses, is critical to developing the preventive controls needed to safeguard information systems against cyberattacks and to reduce the need for computer forensics. A large risk exists in the general lack of a clear understanding of computer forensics and cybersecurity among assurance professionals and other IT and security personnel. To address the aforementioned deficiency, a significant amount of research was conducted on this elusive topic. A secondary source of information originated from discussions with practitioners of computer forensics in US federal law enforcement and academia.

Figure 4—Homeland Security Advisory System: Advisory System for Computer Forensics and Cybersecurity Preparedness (cross-reference figure 6)
• Severe—Infosec controls are critically deficient. Infosec policies and practices are CRITICALLY DEFICIENT and need immediate remedial corrective action. As a result of woefully inadequate infosec controls, the potential for computer crime and the need for computer forensics are extremely high.
• High—Infosec controls are inadequate. Infosec policies and practices are INADEQUATE to reduce cybersecurity risk. As a result of inadequate infosec controls, the potential for computer crime and the need for computer forensics are high.
• Elevated—Infosec controls need strengthening. Infosec policies and practices NEED STRENGTHENING to ensure adequate controls exist to safeguard against cybersecurity threats and to reduce the need for a computer forensics examination.
• Guarded—Infosec controls are adequate. Infosec policies and practices are ADEQUATE to reduce the risk of unauthorized access into mission-critical systems. As a result of the adequate controls, the likelihood of cybercrime and the need for a computer forensics examination are reduced.
• Low—Infosec controls are strong. Infosec policies and practices are STRONG, greatly reducing the risk of unauthorized access into mission-critical systems. As a result of the strong controls, the likelihood of cybercrime and the need for a computer forensics examination are reduced.

Develop Common Criteria to Identify, Measure, Monitor and Control Cybersecurity Risks and Computer Forensics Evidentiary Data
There is a need for comprehensive common criteria that allow public and private sector entities and assurance professionals to understand the risks and controls governing computer forensics and cybersecurity. The US federal government has made strong efforts to train special agents, criminal investigators and other law enforcement personnel in the science of computer forensics; however, limited information on this important subject matter is available in the public domain. Assurance, infosec and regulatory professionals play an important role in contributing to the evaluation of the risks in computer forensics and cybersecurity. Paradoxically, the assurance, infosec and regulatory professionals who should be on the front line in assessing computer forensics and cybersecurity risks may not be able to conveniently receive training of a quality comparable to that available to the law enforcement community. With only a few universities and private corporations providing specialized computer forensic training for non-law enforcement individuals, the nation's critical infrastructure (e.g., financial institutions) runs the risk of being inaccurately audited or examined by assurance, infosec and regulatory personnel. As a negative consequence, there are islands of intelligence ("dots"), which include unauthorized computer intrusions or hacks into systems that may go undetected, go unreported to law enforcement, or not be analyzed internally by the entity in a complete and accurate manner.

Figure 5—Computer Forensics and Cybersecurity Governance Model (Homeland Security)
• Step 1 (Management)—Risk Assessment and Preliminary Scorecard: Management performs a computer forensic and cybersecurity risk self-assessment and prepares a preliminary scorecard for the governance model.
• Step 2 (Assurance, infosec and regulatory personnel)—Validation: The adequacy of the following processes and practices is assessed and validated: policies, standards and procedures; IDS, vulnerability assessments and other network testing; cybersecurity; firewall security; journaling; CIRT.
• Step 3 (Assurance, infosec and regulatory personnel)—Risk Scorecard Enhancements: Enhancements to management's initial risk-assessed scorecard are made, as appropriate, upon completion of the governance model's examination procedures.
• Step 4 (Assurance, infosec and regulatory personnel)—Heatmap: All processes and practices listed within step 2 are consolidated into one final composite risk assessment rating based on the criteria outlined within the governance model's Risk Assessment Scorecard.
• Step 5 (Assurance, infosec and regulatory personnel; Homeland Security)—Final Scorecard: Using the Heatmap scores, a composite rating is developed in accordance with the criteria noted within the Homeland Security Advisory System.

Figure 6—Computer Forensics and Cybersecurity Methodology: Parent (Master) Process Flow Diagram (cross-reference figure 5 for the risk assessment scorecard)
[Diagram: "Computer forensics and cybersecurity risks" sits at the center, connected to six processes and practices: (1) Infosec policies and procedures (process); (2) IDS, vulnerability assessment and other network testing (practice); (3) Firewall security (practice); (4) Cybersecurity risk assessment; (5) Journaling (practice); (6) CIRT activities (practice)]

Computer Information Systems Security Legislation and Computer Crime Survey
Assurance and infosec professionals and regulators should have a strong working knowledge of past and present legislation governing information security and cybersecurity before performing the detailed computer forensics examination procedures that were developed as a major component of the governance model. Listed below is US federal legislation that directly impacts infosec in both the public and private sectors and indirectly impacts the science of computer forensics.
• Computer Security and Privacy Act of 1987
• Presidential Decision Directive 63 (PDD63), 1998
• OMB A-130: Management of Information Resources, Appendix III, "Security of Federal Automated Information Resources," amended 28 November 2000
• National Plan for Information Systems Protection of 2000
• The USA Patriot Act
• Homeland Security Presidential Directive-3 (PDD3), 2002

The Importance of Having a Strong CIRT
Every organization and federal agency should have a computer incident response team (CIRT). The CIRT may consist of a single individual or many, depending on the complexity and size of the organization or government agency. Members of the CIRT should consider membership and involvement in the appropriate US federally sponsored organizations, which include FIRST, NYECTF and INFRAGARD. Documented CIRT policies and procedures need to be developed and maintained on an ongoing basis to ensure compliance with established technical, legal and ethical requirements. Quality, ongoing training and testing of the CIRT plan are needed to ensure everyone involved in the process can effectively implement the plan as a matter of routine. Trigger points should be established for each intrusion scenario, including when to contact local and/or federal law enforcement computer crimes units. Similar to continuity of business (COB) planning, a contact list needs to be developed as part of a comprehensive CIRT plan to ensure timely and accurate communication with the appropriate people and organizations.

The Crime is in Process
CIRT's responsibility—If the organization has adequately performed due diligence in preparing for such an event, CIRT personnel will be better able to capture key data from an intrusion in process through the use of IDS software and logs, in addition to various preventive controls such as HoneyPots, HoneyNets and padded cells. HoneyPots are decoy systems that attempt to lure an attacker away from critical systems. Specifically, the decoy systems contain fabricated information, and while the attacker is still within the system, the systems administrator (SA), security officer and others closely track the hacker and gather intelligence on the suspected unauthorized party. HoneyNets take HoneyPots a step further by expanding the single-system concept of a HoneyPot to a network of multiple systems in a production environment. The primary purpose of a HoneyNet is intelligence gathering, to learn more about the tools, tactics and motives of the hacker community; it can be set up on a number of different platforms (e.g., different flavors of UNIX, NT, Linux). A padded cell waits for a traditional IDS to detect an attacker, who is then transferred to a special padded cell host, which baits the intruder with what is presumed to be attractive data and information.

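The "capturing key data...through the use of IDS software and logs" task can be illustrated with a minimal log-triage sketch. The log file, its line format, and the field positions below are fabricated sample data, not any real platform's format:

```shell
# Fabricated auth-style log entries; real log formats vary by platform.
cat > auth.log <<'EOF'
Mar 11 02:14:01 host sshd: Failed password for root from 10.0.0.5
Mar 11 02:14:03 host sshd: Failed password for root from 10.0.0.5
Mar 11 02:14:07 host sshd: Accepted password for alice from 10.0.0.9
Mar 11 02:14:09 host sshd: Failed password for admin from 10.0.0.5
EOF

# Count failed login attempts per source address: the last field of
# each matching line is the source, tallied and sorted descending.
grep 'Failed password' auth.log | awk '{print $NF}' | sort | uniq -c | sort -rn
```

Here the tally flags 10.0.0.5 with three failed attempts, the kind of "dot" a CIRT would want surfaced, preserved and correlated rather than left buried in the raw journal.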

Figure 7—The Computer Forensics and Cybersecurity Governance Model: The Risk Assessment Scorecard
(Completed for each process and practice listed within figure 5)
[Scorecard layout: each process (1) and practice (2-6) receives a risk scorecard rating based on the risk assessment scorecard criteria (figure 3); practices 2-6 have separate rating criteria. Each process scorecard rating is transferred to the Heatmap (figure 8). Sample letter ratings (A-E) appear across the original scorecard grid.
Input: CIRT—1. Purpose, mission and scope; 2. Code of conduct; 3. Hiring and maintaining CIRT staff; 4. Communications (internal and external); 5. Critical information to collect; 6. Tools for incident response; 7. Handling major events; 8. Physical issues; 9. Sharing information with other agencies.
Input: Computer Forensics (Electronic Crime Scene Investigations)—10. Importance of returning the network to a secure operating level; 11. Securing and evaluating the scene; 12. Documenting the scene; 13. Evidence collection.
Input: General and Application Controls—14. Encryption; 15. Employee surveillance; 16. E-mail; 17. Firewalls; 18. Internet; 19. Intranet; 20. Extranet; 21. LAN/WAN/WAP; 22. Logging controls/audit trails; 23. Microcomputers; 24. Outsourcing; 25. Passwords; 26. Virus prevention, detection and removal; 27. Contingency planning; 28. Data classification; 29. Digital signatures; 30. E-commerce; 31. Laptops; 32. Data privacy; 33. Telecommunications; 34. Telephone systems; 35. VPN; 36. PKI; 37. Steganography; 38. Physical security; 39. Operating systems; 40. Ethical hack; 41. IDS; 42. Vulnerability assessment; 43. Social engineering; 44. Audit reports; 45. Identification and authentication; 46. Software import control; 47. Controlling interactive software; 48. Software licensing; 49. Remote access; 50. Access to internal databases; 51. Data integrity; 52. Intellectual property; 53. Regulatory and legal requirements.]


Trap-and-trace—The most prudent strategy is to consult legal counsel and local, state and federal law enforcement representatives for appropriate guidance on this matter. Generally speaking, a court order apparently is not required for a non-law enforcement person conducting a trap-and-trace on a "black hat" who is trying to gain access to a company's network from the Internet; however, such activities may be frowned upon by certain courts. Specifically, the courts may view a trap-and-trace activity as a violation of wiretap laws. For a network trap-and-trace, a UNIX SA or other qualified person who has root access can type the command who, which will provide the login name of each user, the terminal being used, the location of the terminal, the date and time the user logged in, and the IP address. The SA and other authorized personnel should carefully document this information. Second, assuming a UNIX operating system, the SA can issue a finger command to receive more detailed information on the user (e.g., the user's full name and the device number of the user's terminal).
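The commands above can be sketched as follows. The login name `suspect_user` and the log file name are placeholders, and `finger` is not installed on every system, so the sketch guards for it:

```shell
# Snapshot current logins: 'who' lists each login name, terminal,
# login date/time and (for remote sessions) the originating host/IP.
who

# 'finger', where installed, adds detail for a given login name
# (e.g., full name, terminal device). 'suspect_user' is a placeholder
# login name, not a real account.
if command -v finger >/dev/null 2>&1; then
  finger suspect_user || true
fi

# Preserve the observation in a timestamped file so the SA's careful
# documentation rests on a recorded snapshot, not memory.
who > "trace-$(date +%Y%m%dT%H%M%S).log"
```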

Figure 8—The Computer Forensics and Cybersecurity Governance Model: The Risk Assessment Scorecard Heat Map and Final Scorecard Rating
[Heat Map Scorecard Rating: each of the six processes and practices carries its heat map rating into a composite risk assessment—Process 1: Infosec Policies and Procedures (process); Process 2: IDS, Vulnerability Assessment and Other Network Testing; Process 3: Firewall Security (practice); Process 4: Third-party and Internal Audits and Risk Assessments (practice); Process 5: Journaling (practice); Process 6: CIRT Activities (practice). Sample letter ratings appear in the original grid. *Final Scorecard Rating: prior to assigning a composite risk assessment rating, refer to figure 4, "Advisory System for Cybersecurity Preparedness," for the final rating criteria.]

Involving the Authorities
If an unauthorized intrusion requires contacting the authorities according to the organization's CIRT policies and procedures, the first point of contact should be the US Secret Service (USSS). A violation of internal policy or standards alone may not prompt calling in the authorities when the matter can be handled through business-as-usual (BAU) internal investigative processes. If a crime is suspected and senior management of the institution approves, then contacting the USSS would be appropriate. It is recommended, however, that management of the institution first complete the appropriate documentation (e.g., a network incident report and a facility security survey, which includes a section of computer security questions), which can be downloaded from www.ectaskforce.org. The seizure of "federal interest computers" falls under the jurisdiction of both the USSS and the US Federal Bureau of Investigation (FBI); however, the first point of contact should be the USSS, which will decide after further investigation and examination whether the alleged crime falls within the FBI's domain.

The Computer Forensic Examination Process
There are basically three steps in the actual computer forensic examination process: a) imaging—creating a bit-stream backup of hard disks and floppy disks; b) searching and analyzing the multiple acquired pieces of media; and c) the investigation. Specialized training is recommended for the practitioner at each stage of the examination, and close consultation with legal, audit and law enforcement officials is advised.
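Step a), the bit-stream image, can be sketched with standard UNIX tools. The demo below images a small file-backed "disk" so it is safe to run anywhere; in real casework the input would be a write-blocked suspect device (the `/dev/sdX` in the comments is a placeholder), and the matching hashes document that the copy is a faithful duplicate:

```shell
# Fabricate small demo media (128 zeroed sectors) in place of a real
# suspect device such as /dev/sdX, which must never be imaged without
# a write blocker.
dd if=/dev/zero of=disk.raw bs=512 count=128 2>/dev/null

# Bit-stream copy: every sector, not just the files, so deleted data
# and slack space are preserved. conv=noerror,sync keeps going past
# read errors, padding failed blocks so sector offsets stay aligned.
dd if=disk.raw of=evidence.img bs=64K conv=noerror,sync 2>/dev/null

# Hash source and image; matching digests document a faithful copy.
sha256sum disk.raw evidence.img
```

Hashing both sides immediately after acquisition, and again before analysis, is what later lets the examiner show the evidence was not altered between the two steps.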

Conclusion
The single greatest impediment that this author envisions in successfully formulating the standardization and common criteria for computer forensics, digital evidence and cybersecurity is the continual reticence on the part of private sector entities and public sector agencies to be forthcoming in revealing embarrassing details about unauthorized attempts to break into their systems. Ostensibly, the admission of actual or suspected unauthorized computer intrusions by an entity's management amounts to pleading mea culpa to security shortcomings, which may result in reputation risk, negative public perception and reduced profitability for the victimized private sector entities, attributable to the loss of the public's confidence in the organization.

Future Steps
To ensure the law enforcement community, private sector entities and government agencies are all in lockstep, with consistent and universal standards and procedures governing the collection, analysis, preservation and reporting of evidentiary digital data in the US, a task force needs to be established and directed by the US Office of Homeland Security. Specifically, the composition of the cybersecurity and computer forensics preparedness task force needs to include private and public sector industry technology and legal experts, along with representation from the federal, state and local law enforcement community, to provide insight in developing standardization and common criteria over the science of digital evidence and required information security defenses.


Disclaimer: The contents of this article represent the opinions of the author only. They are not necessarily shared by, or in agreement with, the opinions of his current or previous employers.

Kenneth C. Brancik, CISA has worked in technology assurance for approximately 17 years. He is a senior bank examiner for the Federal Reserve Bank of New York, where he is involved in understanding the risks and controls over emerging technologies as they impact the banking industry. Brancik's prior employers include Citigroup, where he was a vice president/manager within the audit and risk review department; PricewaterhouseCoopers LLP Assurance & Business Advisory Service, as a manager; and Merrill Lynch & Company, as a corporate technology auditor. He currently is a doctoral student at Pace University's Computer Science Department, where he is studying software engineering and the impact of emerging technologies on technology and software development. He has a master's degree in management systems from New York University.

Information Systems Control Journal, formerly the IS Audit & Control Journal, is published by the Information Systems Audit and Control Association, Inc. Membership in the association, a voluntary organization of persons interested in information systems (IS) auditing, control and security, entitles one to receive an annual subscription to the Information Systems Control Journal. Opinions expressed in the Information Systems Control Journal represent the views of the authors and advertisers. They may differ from policies and official statements of the Information Systems Audit and Control Association and/or the IT Governance Institute® and their committees, and from opinions endorsed by authors' employers, or the editors of this Journal. Information Systems Control Journal does not attest to the originality of authors' content. © Copyright 2003 by Information Systems Audit and Control Association Inc., formerly the EDP Auditors Association. All rights reserved. ISCA™ Information Systems Control Association™. Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, Mass. 01970, to photocopy articles owned by the Information Systems Audit and Control Association Inc., for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume, and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the association or the copyright owner is expressly prohibited. www.isaca.org

