Risk Management and Analysis - Process and Policy before Technology
According to studies released by the FBI and the Computer Security
Institute (CSI), over 70% of all attacks on sensitive data and resources reported by organizations originated from within the organization itself. Implementing an internal security
policy therefore appears to be just as important as an external strategy. The objective of this report is to highlight the necessity of internal processes and policy, alongside technology, when managing and mitigating risk. The author examines the problems of security
that arise from the unseen forces in an individual that influence thought, behavior and personality. Computers do not yet have the intelligence to question human reasoning, understand the human psyche and then take action based upon logical deduction. The subject matter for this dissertation is based on the author's own working experience, the modules taught in the Master of Software Engineering, and the course materials used.
Background (maybe part of opening Chapter 1)
Many of the firms the author has worked for invest significant sums of money per annum in technology, in the newfound belief that software creates competitive advantage and brings business value to the marketplace. These assets, some of which are tangible, require many forms of security
to protect them from vandals, hackers, thieves and, yes, even competitors. It is the traditional technique of relying on hardware and software alone to manage this risk that the author believes to be the underlying problem in safeguarding these information commodities.
There is not yet a computer with the artificial intelligence to understand that one person accessing a system with another person's credentials may be cause for suspicion. It cannot discuss this with a peer computer, or explain the extrasensory feelings it has to its human superior. It does not have the ability to correlate the company's compliance rules on computer access with the activity a person is performing on a machine it knows does not belong to that person.
Just as computers need rules and boundaries to operate within, so do people; as a society we remain sure of this. We cannot, however, assume that a person knows the consequences of their actions, or understands that what they are doing may be wrong under the rules the company has put in place. We have to educate and teach first, and discipline and enforce last.
The report should demonstrate the use of software engineering subject matter taught in -
1. Practical Software Engineering
2. People and Security
3. Risk Analysis and Management
4. Security Principles
5. Software Development Management
- An Oxford layout
- At the start of each (proper) chapter I'd expect to see a paragraph along the lines of "In this chapter we're going to do X. We're going to start by thinking about Y and then we'll move on to tackle Z."
- The theory of nature versus nurture in software engineering should be the central point of this project. I am trying to theorise that a person's genetic make-up and social upbringing (both parental and cultural) have a definitive role to play in software engineering. Is it not logical that our traits and imperfections are carried over to become part of the things we create? If so, how do we analyse this, and what do we do to mitigate and manage the risk involved?
Questions to think about
- Why does a person hack computer systems? What makes them do this? Is there a median hacker age? What statistics can prove this? If there is an age pattern, would that become part of a process companies would implement to help prevent internal hacking based on age, gender, or chemical make-up? What are the moral implications of such a thing?
Examples would be to analyse some of the major software engineering failures, such as the following, and to ask whether better process could have been implemented to prevent them from happening -
The AT&T network collapse (1990)
In 1990, 75 million phone calls across the US went unanswered after a single switch at one of AT&T's 114 switching centres suffered a minor mechanical problem, which shut the centre down. When the centre came back up soon afterwards, it sent a message to other centres, which in turn caused them to trip, shut down and reset.
The culprit turned out to be an error in a single line of code -- not hackers, as some claimed at the time -- that had been added during a highly complex software upgrade. American Airlines alone estimated this small error cost it 200,000 reservations.
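Published post-mortems reportedly traced the fault to a misplaced `break` statement in the C code of the switch software. The sketch below is purely illustrative (a hypothetical message handler with invented names, written in Python rather than the original C), but it shows how one stray control-flow statement can skip recovery bookkeeping and turn every "back in service" message into another reset:

```python
# Illustrative sketch (hypothetical; not AT&T's actual code) of how a
# single misplaced control-flow statement can cascade into mass resets.

def handle_messages(messages, buggy=False):
    """Process 'back in service' messages from neighbouring switches.

    Returns the number of resets this switch performs. In the buggy
    version one misplaced `continue` skips the bookkeeping that marks
    the sender as healthy, so every later message from it is treated
    as a fault and forces another reset.
    """
    healthy = set()
    resets = 0
    for sender in messages:
        if sender not in healthy:
            if buggy:
                resets += 1      # treat first contact as a fault...
                continue         # ...and skip marking the sender healthy
            healthy.add(sender)  # correct path: record recovery, no reset
        # normal call processing for a known-healthy neighbour goes here
    return resets

msgs = ["switch-A", "switch-A", "switch-B", "switch-A"]
print(handle_messages(msgs, buggy=False))  # 0
print(handle_messages(msgs, buggy=True))   # 4
```

The point for this dissertation is that no amount of hardware spend catches such a fault; only process (code review, staged rollout) could have.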
Mars Climate Orbiter metric problem (1998)
Two spacecraft, the Mars Climate Orbiter and the Mars Polar Lander, were part of a space program that, in 1998, was supposed to study the Martian weather, climate, and the water and carbon dioxide content of the atmosphere. But a problem occurred when a navigation error caused the orbiter to fly too low in the atmosphere, and it was destroyed.
What caused the error? A sub-contractor on the Nasa programme had used imperial units (as used in the US), rather than the Nasa-specified metric units (as used in Europe).
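The failure is often cited as an argument for carrying units in the software itself rather than in documentation. A minimal Python sketch (illustrative numbers and a hypothetical function, not NASA's navigation code) of the mismatch:

```python
# Minimal sketch (hypothetical) of the Mars Climate Orbiter unit mismatch:
# thruster impulse reported in pound-force seconds (lbf*s) where the
# navigation software expected newton-seconds (N*s).

LBF_TO_NEWTON = 4.448222   # 1 pound-force in newtons

def delta_v(impulse_ns, mass_kg):
    """Velocity change (m/s) from an impulse given in newton-seconds."""
    return impulse_ns / mass_kg

mass = 338.0           # spacecraft mass in kg (illustrative figure)
impulse_lbf_s = 100.0  # value as the contractor's software reported it

# Fed in raw, the lbf*s number understates the impulse ~4.45x:
wrong = delta_v(impulse_lbf_s, mass)
right = delta_v(impulse_lbf_s * LBF_TO_NEWTON, mass)
print(round(right / wrong, 2))  # 4.45
```

Every trajectory correction computed from the raw figure was off by that factor, and the accumulated error put the spacecraft too low in the atmosphere.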
Mariner 1 space probe (1962)
A bug in the flight software for Mariner 1 caused the rocket to divert from its intended path on launch. Mission control destroyed the rocket over the Atlantic Ocean. The investigation into the accident discovered that a formula written on paper in pencil had been improperly transcribed into computer code, causing the computer to miscalculate the rocket's trajectory.
Article that reflects my thoughts for this dissertation
However, I’ve seen this particular assertion — “all programming languages are the same because they’re all Turing complete” — used repeatedly as long as I’ve been a programmer. It drives me nuts.
Sure, it’s true on a technical level. Any computer language we write gets interpreted and compiled down to machine code, so at a practical level a C program with a for(;;) loop and a Python list comprehension might end up with the same values flowing over my registers and the same instructions dropping into the CPU. But this reductionist view of programming completely ignores the incredibly important role that language plays in thought.
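The article's for(;;)-versus-comprehension point can be made concrete. A small Python sketch, assuming nothing beyond the standard language, showing two formulations that compute identical results yet read (and, per the article's argument, shape thought) very differently:

```python
# The quoted point made concrete: a C-style index loop and a list
# comprehension producing identical results in different "idioms".

values = [1, 2, 3, 4, 5]

# C-style accumulation, translated literally into Python:
squares_loop = []
for i in range(len(values)):
    squares_loop.append(values[i] * values[i])

# The same computation as Python's native idiom:
squares_comp = [v * v for v in values]

print(squares_loop == squares_comp)  # True
```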
The traditional view of languages — human or computer — is that they’re a tool we use to express thought. But modern literary and linguistic theory holds that it’s a two way street: our thought drives our language, but the language we use leaves an indelible imprint on our thought processes. I’m not a linguist, but from what I can tell the Sapir-Whorf hypothesis is the main designator for this idea of language influencing thought.
The hypothesis postulates that a particular language's nature influences the habitual thought of its speakers: That is, different language patterns yield different patterns of thought. This idea challenges the possibility of perfectly representing the world with language, because it implies that the mechanisms of any language condition the thoughts of its speaker community.
There’s no question in my mind that this applies full-force to software development: different languages make it easier or harder to conceive of certain types and classes of algorithms. So-called “syntactic sugar” can make a big difference in efficiency: one language might naturally lend itself to writing something close to the theoretically optimal case, while another might lead you towards a different, less efficient, solution.
Most importantly, though, is the way that computer languages intersect with our own thoughts. You’ll often hear a developer talk about how his favorite language “fits my brain” or “matches the way I think.” As a group of analytical types, we often dismiss these assertions in favor of more quantitative measurements of performance or memory consumption. But that’s a huge mistake: we’ll always be more productive in a language that promotes a type of thought with which we’re already familiar.
According to the theory of neuroplasticity, thinking, learning, and acting actually change both the brain's physical structure, or anatomy, and functional organization, or physiology from top to bottom.
In other words, what you think changes what you *will* think.
At least 50 are expected; here are some that I have been researching already
[ja1] John Soat, 2008, Tomorrow’s CIO: Process Before Technology, http://www.informationweek.com/blog/main/archives/2008/06/tomorrows_cio_p.html
[ja2] Matt Blaze, 2004, Safecracking for the computer scientist, http://www.crypto.com/papers/safelocks.pdf
[b1] Bruce Schneier, 2000, Secrets and Lies: Digital Security in a Networked World, ISBN 1, John Wiley
[b2] Drew Miller, Michael Bednarczyk, 2005, Black Hat Physical Device Security: Exploiting Hardware and Software, ISBN X, Syngress
[b3] Harold F. Tipton, Micki Krause, 2007, Information Security Management Handbook, ISBN 2, CRC Press
[b4] Pierpaolo Degano, 2007, Programming Languages and Systems: 12th European Symposium on Programming, ESOP 2003, Held as Part of the Joint European Conferences on Theory and Practice of Software, ETAPS 2003, Warsaw, Poland, April 7-11, 2003 : Proceedings, ISBN 1, Springer
[b5] James S. Tiller, 2005, The Ethical Hack: A Framework for Business Value Penetration Testing, ISBN X, CRC Press
[b6] Albert-László Barabási, 2003, Linked: The New Science of Networks, ISBN 9, Basic Books
[b7] Watts S. Humphrey, 1997, Introduction to the Personal Software Process, ISBN 7, Addison-Wesley
[b8] Thomas A. Birkland, 2005, An Introduction to the Policy Process: Theories, Concepts, and Models of Public Policy Making, ISBN 8, M.E. Sharpe
[b9] G. David Garson, 1995, Computer Technology and Social Issues, ISBN 4, Idea Group Inc (IGI)
[b10] Louis A. Poulin, 2005, Reducing Risk with Software Process Improvement, ISBN X, CRC Press
[b11] Bruce Schneier, 2003, Beyond Fear: Thinking Sensibly about Security in an Uncertain World, ISBN X, Springer
[b12] Kevin David Mitnick, 2002, The Art of Deception: Controlling the Human Element of Security, ISBN 4, John Wiley and Sons
[b13] Eric Gander, 2003, On Our Minds: How Evolutionary Psychology is Reshaping the Nature-versus-nurture Debate, ISBN 8, JHU Press
[b14] Lorrie Faith Cranor, Simson Garfinkel, 2005, Security and Usability: Designing Secure Systems that People Can Use, ISBN 9, O'Reilly
[b15] Ross Anderson, 2001, Security Engineering: A Guide to Building Dependable Distributed Systems, ISBN 3, John Wiley and Sons
World Wide Web
[www1] Wikipedia, (2008), Wile E. Coyote and Road Runner, http://en.wikipedia.org/wiki/Wile_E._Coyote_and_Road_Runner
[www2] Wikipedia, (2008), Cash is King, http://en.wikipedia.org/wiki/Cash_is_king
[www3] Wikipedia, (2008), Disk Cloning, http://en.wikipedia.org/wiki/Disk_cloning
[www4] Wikipedia, (2008), Firewall, http://en.wikipedia.org/wiki/Firewall
[www5] Wikipedia, (2008), Intrusion detection system, http://en.wikipedia.org/wiki/Intrusion-detection_system
[www6] Wikipedia, (2008), Virtual private network, http://en.wikipedia.org/wiki/Vpn
[www7] Wikipedia, (2008), Smart Card, http://en.wikipedia.org/wiki/Smart_card
[www8] Wikipedia, (2008), Anti-virus, http://en.wikipedia.org/wiki/Antivirus
[www9] Wikipedia, (2008), Encryption, http://en.wikipedia.org/wiki/Encryption
[www10] Wikipedia, (2008), Nature versus nurture, http://en.wikipedia.org/wiki/Nature_versus_nurture
[www11] Kimberly Powell, (2008), Nature vs. Nurture - Are We Really Born That Way?, http://genealogy.about.com/cs/geneticgenealogy/a/nature_nurture.htm
"10 Top IT Disasters." (2007) Retrieved June 5, 2009 from http://www.itepistemology.com/2007/11/10-top-it-disasters-by-zdnet-plus-one.html
Barrett, D. (2009). Computer virus strikes U.S. Marshals, FBI affected. Associated Press. Retrieved June 1, 2009 from http://www.boston.com/news/nation/washington/articles/2009/05/21/apnewbreak_virus_attacks_us_marshals_computers/
Charette, R.N. (2005). Why Software Fails. IEEE Spectrum, 42
"Computer system." United States Department of Commerce. Retrieved June 1, 2009
Darwin, C. (1859). On the origin of species. London: John Murray.
Deci, E.L., & Ryan, R.M. (1985). Intrinsic motivation and self-determination in human behavior. New York:Plenum Press.
Denning, D. (1990). Information warfare and security. Reading: Addison-Wesley.
Douglas, J.Y. "Nature" versus "Nurture": The Three Paradoxes of Hypertext. Retrieved June 2, 2009, from http://web.nwe.ufl.edu/~jdouglas/readerly.pdf
Gander, E. 2003, On Our Minds: How Evolutionary Psychology is Reshaping the Nature-versus-nurture Debate, ISBN 0801873878, JHU Press
Hack. (2009). In Merriam-Webster Online Dictionary. Retrieved June 5, 2009, from http://www.merriam-webster.com/dictionary/hack
Hafner, K., & Markoff, J. (1995). Cyberpunk: outlaws and hackers on the computer frontier. New York: Simon and Schuster.
Halasz, Frank G. "Reflections on NoteCards: Seven Issues for the next Generation of Hypermedia systems." Communications of the ACM 31.7 (1988): 836-852.
Johnson, K., Zuckerman, M.J., & Solomon, D. (2000). Online boasting leaves trail. usatoday.com, 7 June.
Josang, A., AlFayyadh, B., Grandison, T., AlZomai, M., & McNamara, J. (2007). Security Usability Principles for Vulnerability Analysis and Risk Assessment. Computer Security Applications Conference (ACSAC 2007), Twenty-Third Annual, 10-14 Dec. 2007, pp. 269-278.
Kailey MP, Jarratt P. RAMeX: a prototype expert system for computer security risk analysis and management. Computers & Security 1995;14(5):449-63.
Kaplan-Moss, J. (2008). "Syntactic Sugar".
Bilbao A. TUAR: a model of risk analysis in the security field, CH3119-5/92. IEEE; 1992.
Karabacak, B., Sogukpinar, I. (2005). ISRAM: information security risk analysis method. Computers & Security, 24, 147-159.
Human Behavior. http://en.wikipedia.org/wiki/Human_behavior
Human Development. Retrieved June 2, 2009, from http://medical-dictionary.thefreedictionary.com/Human+development+(psychology)
"Linguistic Relativity." Retrieved June 2, 2009, from http://en.wikipedia.org/wiki/Sapir-Whorf_hypothesis
Kay, P., Kempton, W. (1984) What Is the Sapir-Whorf Hypothesis? American Anthropologist, New Series, Vol. 86, No. 1 (Mar., 1984), pp. 65-79
Latour, Bruno. Science in Action: How to Follow Scientists and Engineers through Society. Cambridge: Harvard University Press, 1987.
Lucy, John A. (1997). Linguistic Relativity. Annual Review of Anthropology, Vol. 26, pp. 291-312.
Climate Failure. NASA. Retrieved June 2, 2009, from http://marsprogram.jpl.nasa.gov/msp98/news/mco991110.html
McEvoy N, Whitcombe A. Structured risk analysis. InfraSec 2002, LNCS 2437; 2002. p. 88-103.
National Institute of Standards and Technology (NIST). Risk management guide for information technology systems 2001. Special Publication 800-30.
Neumann, P.G. (2006) System and Network Trustworthiness in Perspective. Retrieved June 2, 2009, from http://delivery.acm.org/10.1145/1190000/1180406/p1-neumann.pdf?key1=1180406&key2=3110034421&coll=GUIDE&dl=GUIDE&CFID=39339693&CFTOKEN=94956378
"Neuroplasticity." Retrieved June 2, 2009, from: http://www.medterms.com/script/main/art.asp?articlekey=40362
"PC Basics." Retrieved June 2, 2009, from: http://www.adminxp.com/begin/index.php?aid=230
Petkovic D., Thompson G., Todtenhoefer, R. (2006). Teaching Practical Software Engineering and Global Software Engineering: Evaluation and Comparison. Retrieved June 2, 2009, from http://delivery.acm.org/10.1145/1150000/1140202/p294-petkovic.pdf?key1=1140202&key2=8594224421&coll=GUIDE&dl=GUIDE&CFID=39221703&CFTOKEN=47198951
Pontell, H.N. & Rosoff, S.M. (2009). White-collar delinquency. Crime, Law and Social Change, 51(1), February 2009, 147-162.
Risk management. Retrieved June 2, 2009, from http://www.businessdictionary.com/definition/risk-management.html
Rodgers, M. (1999). A new hackers' taxonomy. Retrieved June 2, 2009, from www.mts.net/~mkr/hacker.doc.
Sherry, John L. (2004). Media Effects Theory and the Nature/Nurture Debate: A Historical Overview and Directions for Future Research. Media Psychology, 6(1), 83-109.
"Small Business Corner." Retrieved June 2, 2009, from http://csrc.nist.gov/groups/SMA/sbc/index.html
Smith, John B. and Stephen F. Weiss. "Hypertext." Communications of the ACM 31.7
Software Engineering 2004: Curriculum Guidelines for Undergraduate Degree Programs in Software Engineering, The Joint task Force on Computing Curricula, IEEE Computer Society, Association of Computing Machinery, August 2004
Spinellis D, Kokolakis S, Gritzalis S. Security requirements, risks and recommendations for small enterprise and home office environments. Information Management & Computer Security 1999;7(3):121-8.
Smetters, D. and R. Grinter. (2002). Moving from the Design of Usable Security Technologies to the Design of Useful Secure Applications. In Proceedings of the New Security Paradigms Workshop, pages 82-89. ACM Press.
Sterling, B. (1992). The hacker crackdown: law and disorder on the electronic frontier. London: Penguin.
Stoneburner, G., Goguen, A., Feringa, A. (2002) Risk Management Guide for Information Technology Systems.
National Institute for Standards and Technology. Retrieved June 2, 2009, from http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf
Taylor, P. (2000). Hackers. Crime in the digital sublime. London: Routledge.
United States General Accounting Office (USGAO). Information security risk assessment. Retrieved June 2, 2009, from http://www.gao.gov/cgi-bin/getrpt-GAO/AIMD-00-33O; 1999.
Hacker 'Mafiaboy' pleads guilty. (2001). Usatoday.com, 18 January.
Watson, J.B. (1925). Behaviorism. New York: Peoples Institute Publishing.
"What is Risk analysis." Retrieved June 2, 2009, from http://searchmidmarketsecurity.techtarget.com/sDefinition/0,,sid198_gci1182538,00.html
Young, J.R. Top 10 Threats to Computer Systems Include Professors and Students. The Chronicle of Higher Education.