Systems Security: Problems and Potential Solutions

With the increasing connection of computers to networks comes a corresponding increase in the threats to the integrity and security of the data on those computers. Outbreaks of Internet “worms,” computer viruses, and other malicious programs are becoming more frequent and more virulent, as evidenced by reports of maladies named “Melissa,” “ILOVEYOU” (Weaver, 2001), and, most recently, “MyDoom” (Legon, 2004). Malicious code, or “malware,” is a general term for all types of infectious programs, including viruses and worms (McAfee Security, 2003). Malware attacks in all their forms exploit a combination of human psychology and the vulnerabilities or design weaknesses found in software to spread rapidly from computer to computer across a network. Such attacks cause tangible damage: depletion of computing resources, time spent repairing systems, and loss of electronic data. They also have lasting, if less tangible, effects on the behaviors of computer users and software developers. Solving the problem of systems security and information assurance will require a sophisticated mix of behavioral changes by individual users, software vendors, and possibly governments. Given the complexity of the problem, however, the vulnerabilities of computer systems may never be completely eliminated.

The Problem of Systems Insecurity

Computer malware has existed since the 1970s. The earliest programs were academic experiments in self-replicating code written for early versions of the Unix operating system (OS). With the advent of the personal computer in the early 1980s, viruses began to appear with greater frequency and greater potential for harm (Weaver, 2001). Virtually every computer operating system ever developed is a potential virus target, with the virus population ranging from at least 5 identified variants that affect Unix systems to over 45,000 that affect Microsoft Windows systems (year 2000 estimates; Hancock, 2000, p. 389). In the years before widespread computer networking, viruses spread slowly, relying on humans to carry them from computer to computer on removable media (e.g., floppy disks). Once computers became networked, however, both the opportunities for viruses to spread and the methods by which they could do so multiplied. Malware can now traverse a network at high speed, for reasons examined below.

The effects of viruses and worms vary greatly. Some viruses may replicate and spread with no visible effect on the user, while others are designed specifically to inflict damage on the systems they infect, on other systems on the network, or both. Even in the most benign cases, resources are consumed, usually in the form of hard drive space, memory, processor time, or network bandwidth. Additionally, viruses meant to be benign, as is usually the case with artificial life experiments, may become malignant in certain operating environments or may mutate into newer, more harmful programs as a result of transmission errors (Weaver, 2001, Propagation). Viruses that are meant to cause harm most often do so by deleting or altering files on the infected system, causing the loss of electronic data. Furthermore, virus outbreaks may consume so much processor or network capacity that the affected machine or network becomes unusable (Weaver, 2001, Malicious Payloads). When such effects are multiplied across many computers in a network, the costs to those who maintain the network, in time, money, and lost productivity, are significant (Geer et al., 2003, p. 9).

In addition to the physical damage, malware outbreaks have significant behavioral effects. First, the openness and trust that characterized early networks have virtually disappeared, partly because of malware and security problems. Email, for example, is frequently used as a carrier of viruses and other malignant programs. This abuse progressively erodes the trust that people are willing to place in email, which leads to the erection of communication barriers such as filtering, “blacklisting,” and “whitelisting.” The overall effect is a fragmented and unreliable communications environment in which one cannot be certain that one's message will be received. Second, software becomes more complex as vendors respond to revealed security vulnerabilities. Patches, security updates, and upgrades produce more complex code that may expose new vulnerabilities or degrade system performance (Schultz, 2003b), further eroding people's trust in technology.
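
To make these filtering barriers concrete, the sketch below models, in Python, the decision a naive mail filter makes when combining a whitelist and a blacklist. The addresses and the quarantine policy are hypothetical illustrations, not features of any particular mail system.

    # Illustrative sketch of whitelist/blacklist mail filtering.
    # All addresses and the quarantine policy are hypothetical
    # examples, not drawn from any real mail system.

    WHITELIST = {"colleague@example.edu"}   # always accepted
    BLACKLIST = {"spammer@example.net"}     # always rejected

    def filter_message(sender: str) -> str:
        """Classify a message by its sender address alone."""
        if sender in WHITELIST:
            return "deliver"
        if sender in BLACKLIST:
            return "reject"
        # Unknown senders fall into a gray area; a strict site may
        # quarantine or silently drop their mail.
        return "quarantine"

    for sender in ("colleague@example.edu", "spammer@example.net",
                   "stranger@example.org"):
        print(sender, "->", filter_message(sender))

The casualty in such a scheme is legitimate mail from unknown senders, which is precisely the fragmentation described above.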

Perspectives on the Causes of Network Security Problems

The ability of malware to reproduce quickly over a network is attributable to a variety of interrelated factors, but two perspectives appear to dominate. One perspective that has gained much attention recently focuses on the near homogeneity of computer operating systems. A report released by the Computer & Communications Industry Association (CCIA) examines the antitrust suits against Microsoft, as well as the dominance of Microsoft products in the personal computer market, with regard to the effects of these developments on the security of computer networks. Microsoft operating systems and applications have become the de facto standard for personal computer (PC) systems through a combination of ease of use, integration between programs and the operating system, and, at times, dubious business practices. The CCIA report describes a network monoculture dominated by Microsoft products that allows the rapid distribution of malware (Geer et al., 2003, p. 12). The monoculture argument rests primarily on dominance in PC systems, as opposed to server systems, where operating system software is more varied. Additionally, server environments tend to be better monitored and maintained than personal computers, and the design and implementation of such systems are more attentive to security and access concerns.

The monoculture perspective draws upon research that compares computer networks to biological ecologies. Jorgensen et al. (2001) describe how electronic ecologies exhibit behaviors and traits similar to those of natural ecologies. They propose an analogy that relates computer code to DNA, the computer that contains the code to a cell, and the network of computers to an organism. Such an analogy allows one to analyze the spread of computer viruses using well-established epidemiological principles. The network-as-ecology metaphor is also presented by Wassenaar and Blaser (2002), who state that, like natural monocultures, “[t]he electronic monoculture that improves communication also increases the risk for contagion” (p. 336).
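
One way to see why epidemiological principles transfer so readily is to run a classic SIR (susceptible-infected-recovered) model with machines standing in for hosts. The minimal discrete-time simulation below is a sketch only; the population size, infection rate (beta), and clean-up rate (gamma) are assumed values chosen for illustration, not measurements of any real outbreak.

    # Minimal discrete-time SIR model with machines as hosts.
    # The population, infection rate (beta), and clean-up rate
    # (gamma) are assumed values chosen only for illustration.

    def simulate_sir(hosts=100_000, infected=10.0,
                     beta=0.5, gamma=0.1, days=30):
        s, i, r = hosts - infected, infected, 0.0
        for day in range(1, days + 1):
            new_infections = beta * s * i / hosts  # S-I contacts
            new_recoveries = gamma * i             # patched or cleaned
            s -= new_infections
            i += new_infections - new_recoveries
            r += new_recoveries
            print(f"day {day:2d}: susceptible={s:9.0f} "
                  f"infected={i:9.0f} recovered={r:9.0f}")

    simulate_sir()

In a monoculture, essentially every host begins in the susceptible pool, which maximizes the outbreak's initial growth rate; platform diversity shrinks that pool and slows the epidemic.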

The monoculture argument is a compelling assessment of the ability of malware creators to exploit common vulnerabilities across a large population of similar platforms, where epidemiological concepts are valid: a high density of similar operating systems presents an ideal environment for malware, all but ensuring the successful dissemination of such programs across a network. The monoculture perspective fails, however, to account for the possibility that, because Microsoft systems are so ubiquitous, Microsoft products simply draw more of the attention and effort of malware creators. If a different operating system, such as Unix or the Macintosh OS, were dominant, there would likely be a corresponding shift in the amount of malicious code targeting that system (Hancock, 2000).

In contrast to the monoculture argument, another perspective on the network security problem implicates a more basic cause of security vulnerabilities: the integrity of software code. The software perspective recognizes that perfection in complex software systems is impossible to attain. To achieve it, programmers would have to anticipate and simulate every possible condition in which the software might be used, which would require prohibitive amounts of time. Thorough testing is seldom possible in software development because of market pressure to release products quickly, and that pressure forces a compromise between quality assurance and the security and stability of the product (Schultz, 2003a). Additionally, the more complicated the product, as is the case with operating systems, the more likely it is that the software will be sold with unknown vulnerabilities. Furthermore, software developed before the prevalence of “always-on” networked environments was not designed for the security concerns of the network. This is particularly true of older versions of Microsoft software, which tend to enable many insecure features by default. For example, the “Love Bug” virus in 2000 took advantage of Microsoft Windows Visual Basic Scripting (VBS) support, which allowed the malicious program to run with full access to the operating system but without the user's knowledge or ability to intervene (Hancock, 2000, p. 390).

Two Regions of Complexity in Security

Two broad, interrelated areas define the complexity of the network insecurity problem and explain why it cannot easily be solved. The first is a continuum between usability and security. Security features necessitate some compromise in ease of use (Schultz, 2003a, p. 271). For a computer program or operating system to be easy for non-technical users, security features must be nearly or totally transparent to the user. Yet security features such as encryption and anti-virus programs often require user intervention or configuration to be effective. Likewise, disabling networking and communication features by default makes it difficult for users to take advantage of them. For example, using a firewall to block network access to one's computer may interfere with other programs, including instant messaging clients, streaming media, and games. Enabling firewall access for specific programs is typically not easy for inexperienced computer users and is a common source of confusion and frustration. If users have consistent difficulty operating a program, the program is effectively unusable, which defeats the purpose of its existence.
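
The sketch below illustrates the kind of default-deny, per-program decision a personal firewall asks users to configure. The program names, ports, and rule structure are hypothetical; real firewalls differ in detail.

    # Toy model of a personal firewall's per-program allow list.
    # The program names, ports, and rule structure are hypothetical
    # illustrations of the decisions a user must configure.

    ALLOW_RULES = {
        ("messenger.exe", 5190),    # instant messaging client
        ("mediaplayer.exe", 554),   # streaming media (RTSP)
    }

    def permit(program: str, port: int) -> bool:
        """Default-deny: traffic passes only if a rule matches."""
        return (program, port) in ALLOW_RULES

    print(permit("messenger.exe", 5190))  # True: rule exists
    print(permit("game.exe", 27015))      # False: blocked by default

Default-deny is the secure choice, but every new program obliges the user to author another rule, which is exactly the burden that confuses inexperienced users.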

The second area is a continuum between software complexity and reliability. As programs and operating systems integrate more features, they become more complex. The larger the body of code, the more likely it is that vulnerabilities and errors will go unnoticed by the software's programmers. As mentioned earlier, this problem is exacerbated by the continual process of fixing newly discovered security vulnerabilities, which adds still more size and complexity to the code. As a result, the software becomes less reliable and, therefore, less secure. A solution to network security must strike a balance between these disparate concerns or it will fail, either because it demands too much adaptation from computer users or because it cannot resolve all potential software vulnerabilities.
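
A toy calculation suggests how this feedback loop can run backwards. Assuming, purely for illustration, a fixed density of latent defects per thousand lines of code (KLOC), each patch cycle that ships new code can introduce more latent defects than it removes; every rate below is invented for the example.

    # Toy model of the complexity/reliability trade-off: each patch
    # cycle fixes known flaws but ships new code carrying its own
    # latent flaws. All rates below are invented for the example.

    kloc = 1_000          # size of the product, in thousands of lines
    density = 5.0         # assumed latent defects per KLOC
    latent = kloc * density

    for cycle in range(1, 6):
        fixed = 200       # defects removed by this patch cycle
        patch_kloc = 50   # new code shipped with the patch
        latent += patch_kloc * density - fixed
        kloc += patch_kloc
        print(f"cycle {cycle}: {kloc} KLOC, ~{latent:.0f} latent defects")

With these assumed numbers, each cycle removes 200 defects but introduces 250, so the latent total grows even under diligent patching.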

Potential Solutions

A number of solutions to systems security problems have been proposed. In response to the monoculture perspective, security could be improved by reducing the homogeneity of networked platforms. Creating a heterogeneous network is a difficult proposition, however, because it would require that viable alternatives to Microsoft operating systems be made available. Linux is often cited as a competitor to Microsoft Windows, but Linux does not yet offer a user interface that can match the ease of use and administration that helped make Windows a dominant OS. The Macintosh OS was once a viable competitor to Windows, but it could not maintain a competitive advantage, due in large part to the lack of third-party software products (Kling & Star, 1998, p. 26). Barring the emergence of a suitable competitor to Microsoft Windows, government regulation or influence may be the only way to diversify the largely homogeneous PC market (Geer et al., 2003, p. 19). Policy instruments that could increase diversity include economic incentives for diversification, government endorsement and use of alternatives to Microsoft products, regulation of critical infrastructure networks, and the subdivision of Microsoft in response to monopoly litigation.

The software design perspective, in turn, suggests that security can be increased only by improving the integrity of software, perhaps through mandated security standards. The software industry, or potentially the government, could define regulations that prescribe minimum standards for the security testing and evaluation of all new software products, particularly operating systems and network software. Such standards would entail increased development costs and time, which would in turn raise the price of software for the end user. For computer users to accept these additional costs, the costs would have to be less than the estimated or perceived losses from malware outbreaks.

A third set of solutions proposes altering computer designs themselves to resolve network security issues. Various software and hardware vendors, including Microsoft, have proposed a set of standards called “trusted computing” (TCPA, 2003). Such standards would use new hardware designs, along with software encryption and verification technologies, to redesign how computers communicate and operate. In a trusted computing system, communications with non-trusted platforms, software, or networks would be mediated, reducing the likelihood of intrusion or abuse by anonymous network users or malware. Trusted computing directly affects the interoperability of computer hardware and software and will take time to develop. Once available, it is likely to take a significant amount of time to be adopted by average computer users, and in the interim it will likely create a number of unanticipated complications for electronic communications. Furthermore, the notion of trust is amorphous and difficult to define, and trusted systems may prove either just as exploitable as current systems and networks or unreasonably restrictive for computer users (Schoen, 2003).
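
At its core, the verification step in a trusted computing design resembles checking code against a known-good cryptographic digest before allowing it to run. The sketch below reduces that idea to a single hash comparison; real designs anchor the check in hardware and use public-key signatures, and the “known good” bytes here are placeholders.

    # Greatly simplified sketch of the verification idea behind
    # trusted computing: refuse to run code whose digest differs
    # from a known-good value. Real designs anchor the check in
    # hardware and use public-key signatures; the "known good"
    # bytes here are placeholders.

    import hashlib

    KNOWN_GOOD = hashlib.sha256(b"trusted program bytes").hexdigest()

    def is_trusted(program_bytes: bytes) -> bool:
        """Allow execution only if the code is byte-for-byte unchanged."""
        return hashlib.sha256(program_bytes).hexdigest() == KNOWN_GOOD

    print(is_trusted(b"trusted program bytes"))   # True
    print(is_trusted(b"tampered program bytes"))  # False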

Conclusion

Network security problems have existed for as long as computers have been interconnected. Malicious code, in the form of viruses, worms, and other malware, exploits vulnerabilities in software and can cause significant damage to electronic data and computer systems. As networks continue to expand and more information is made available through them, the risk of serious data loss also increases. The causes of network insecurity are many and are made complex by the interaction of two major tensions: security versus usability and complexity versus reliability. Several sets of solutions may be pursued, all of which require behavioral and policy changes by computer users, the software industry, and governments. Whatever solutions are implemented, none is likely to be ideal for all stakeholders, and network insecurity will remain a problem for the foreseeable future.

References

Geer, D., Bace, R., Gutmann, P., Metzger, P., Pfleeger, C., Quarterman, J., & Schneier, B. (2003). CyberInsecurity: The cost of monopoly, how the dominance of Microsoft's products poses a risk to security. Retrieved on 2 February, 2004 from http://www.ccianet.org/papers/cyberinsecurity.pdf.


Hancock, B. (2000). Microsoft a popular virus target due to ubiquity. Computers and Security, 19(5), 389-391.


Jorgensen, J., Rossignol, P., Takikawa, M., & Upper, D. (2001). Cyber ecology: Looking to ecology for insights into information assurance. DARPA Information Survivability Conference & Exposition II (DISCEX '01) Proceedings, 2, 287-296.


Kling, R., & Star, S. L. (1998). Human centered systems in the perspective of organizational and social informatics. Computers and Society, 28(1), 22-29.


Legon, J. (2004). Tricky 'MyDoom' e-mail worm spreading quickly. Retrieved on 7 February, 2004 from http://www.cnn.com/2004/TECH/internet/01/26/mydoom.worm/.


McAfee Security (2003). Virus glossary. Retrieved on 6 February, 2004 from http://us.mcafee.com/virusInfo/default.asp?id=glossary.


Schoen, S. (2003). Trusted computing: Promise and risk. Retrieved on 7 February, 2004 from http://www.eff.org/Infra/trusted_computing/20031001_tc.php.


Schultz, E. E. (2003a). Why can't Microsoft stay out of the InfoSec headlines? Computers and Security, 22(4), 270-272.


Schultz, E. E. (2003b). Patching pandemonium. Computers and Security, 22(7), 556-558.


Trusted Computing Platform Alliance (TCPA) (2003). Home page. Retrieved on 7 February, 2004 from http://www.trustedcomputing.org/home.


Wassenaar, T., & Blaser, M. (2002). Letter to the editor. Emerging Infectious Diseases, 8(3), 335-336.


Weaver, N. (2001). A brief history of the worm. Retrieved on 6 February, 2004 from http://www.securityfocus.com/printable/infocus/1515.