OS security revolves around the appropriate protection of four elements. Confidentiality prevents or minimizes unauthorized access to and disclosure of data and information. Integrity ensures that the data being worked with is actually the correct data. Availability, as defined in RFC 2828, is the property of a system or system resource being accessible and usable upon demand by an authorized system entity, according to the performance specifications for the system. Authenticity makes it possible for a computer system to verify the identity of a user.
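As an illustration only (not from the source; the user table, ACL, and password scheme below are all hypothetical simplifications), the four properties can be mapped onto the checks a toy reference monitor might perform: authenticity as identity verification, confidentiality as a read-permission check, integrity as a write-permission check, and availability as the system actually serving authorized requests.

```python
import hashlib

# Hypothetical identity store (authenticity): user -> password digest.
USERS = {"alice": hashlib.sha256(b"secret").hexdigest()}

# Hypothetical access-control list: resource -> user -> permitted actions.
# "read" guards confidentiality; "write" guards integrity.
ACL = {"report.txt": {"alice": {"read", "write"}}}

def authenticate(user, password):
    """Authenticity: verify the claimed identity."""
    digest = hashlib.sha256(password.encode()).hexdigest()
    return USERS.get(user) == digest

def authorize(user, resource, action):
    """Confidentiality ('read') and integrity ('write') checks."""
    return action in ACL.get(resource, {}).get(user, set())

def request(user, password, resource, action):
    """Availability: serve the resource on demand, but only to
    authenticated and authorized entities."""
    if not authenticate(user, password):
        return "denied: unknown identity"
    if not authorize(user, resource, action):
        return "denied: insufficient rights"
    return "granted"

print(request("alice", "secret", "report.txt", "read"))   # granted
print(request("mallory", "x", "report.txt", "read"))      # denied: unknown identity
```

Real operating systems implement these checks in the kernel (credential structures, file permission bits, ACLs), but the division of labor is the same: establish who is asking before deciding what they may do.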
The operating system is usually only a portion of all the software running on a particular system (Mattord, 2007), but it controls access to system resources. Operating system security is only one part of overall computer security, yet its importance has been growing greatly. There are many reasons why operating system security receives special attention today.
The evolution of computer systems over the last few decades has been of staggering magnitude. As computers have become more accessible, the risks associated with security have also increased. But one thing has remained constant throughout this time: digital systems have become increasingly complex. Microprocessors have become more complex. Operating systems have become more complex. Computers have become more complex. Networks have become more complex (Hetico, 2007). Individual networks are combined, further increasing their complexity. A clear example of this is the Internet, the vast network of computers whose growing complexity makes it ever more insecure. Given that software is never without faults, complex software is likely to fail, and a percentage of those failures affect security. It is also important to note that complex systems are necessarily modular; they could not otherwise handle the complexity. But increased modularity often means decreased security, because faults arise where two modules communicate (Reiter, 2008).
The only reasonable way to test the security of a system is to perform assessments on it. However, the more complex the system, the harder the security assessment becomes. A more complex system will have more security-related errors in its analysis, design, and programming. Unfortunately, the number of errors and the difficulty of evaluation do not grow in proportion to the complexity; they grow much faster. The more complex a system, the harder it is to understand. There are all sorts of points of vulnerability (interfaces between user and machine, interactions between system components), and these grow exponentially once it is no longer possible to keep the entire system in one's head. The more complex a system, the harder it is to carry out this type of analysis. Everything becomes more complicated: analysis, design, programming, and use (Hetico, 2007).
Operating systems are not immune to this reality and have become increasingly complex. One example is Microsoft Windows: when Windows 3.1 was published in 1992 it comprised about 3 million lines of code; Windows 95 reached 15 million, and Windows 98 has 18 million; Windows NT, released in 1992, was 4 million ...