
Computers and Society

Computers and Society. Just as the tools people invented transformed the societies in which they lived, so the needs of those societies inspired new inventions, among which the computer became a cornerstone in the 20th century. Throughout the world computers have become a source as well as a conduit of information. This article takes a discursive look at computers, related technological developments, and social issues affected by the inclusion of computers in everyday life, with particular reference to issues of privacy.

The Information Revolution. Information technology has long been acknowledged as a revolutionary social force. In medieval times the invention of the printing press by Gutenberg eventually broke down the monopoly of the church over written knowledge. Before the printing press, handwritten manuscripts, approved and edited by the church, were the only texts available; cheap, widely available books that lacked church approval were viewed by some monks as a threat to their control over learning. Printing presses were initially banned from medieval universities in an attempt to control the dissemination of knowledge. However, once a valuable technology has been experienced, it cannot easily be suppressed. More important, because printing permitted the efficient documentation of knowledge, learning became a cumulative process, with each new discovery or insight building on all previously understood kernels of knowledge. Thus printing, and the resulting rapid exchange of ideas, contributed materially to the explosion of discoveries in art and science during the Renaissance.

The advent of printing also helped fuel the Industrial Revolution in Europe in the 18th and 19th centuries by creating more productive ways to learn and by making accumulated knowledge readily accessible; this was the first "information revolution" of modern times. New technologies for generating energy and harnessing it to mills then helped bring about the Industrial Age, accelerating the growth of cities and concentrating economic, political, and military power in the modern nation-state.

Information versus Energy as a Revolutionary Force. Nuclear fission power and the digital computer emerged in the United States as the most dramatic new technologies developed and honed during World War II. Nuclear energy, with its promise of infinite amounts of cheap energy, was initially thought to be the more revolutionary of the two developments, since energy technology is essential to the growth of modern industrial economies. However, the digital computer has, in fact, made a far more significant social impact, and the technical discontinuity represented by the modern computer is no less dramatic than that of nuclear energy. Whether one measures technical progress by the speed of computation, the data rate of telecommunications, or the volumetric density of data storage, the increase in capabilities, from the early digital computers of the late 1940s to the supercomputers of the early 1990s, was measured in factors ranging from a million to a billion.

Transformation of the U.S. Economy. In 1976 the sociologist Daniel Bell recognized that information, not energy, would be the new bellwether of economic progress and agent of social change. His book, The Coming of Post-Industrial Society, mapped the social changes associated with the emerging transformation of industrial economies. Information technologies responsible for this social transformation include computers, telecommunications, the software that controlled their functions, and the information they created, processed, stored, and retrieved. The "Age of Information" may be said to have arrived when the majority of the people became engaged in the creation, gathering, storage, processing, or distribution of information, rather than in agriculture or manufacturing. By the early 1990s more than 60% of the U.S. workforce was estimated to be engaged in such activities, and perhaps 40% of the world's population was so employed. Since much of the information-intensive activity occurred in the service sector, it was not surprising that during the 1980s two-thirds of the U.S. workforce was engaged in services, nearly one-third in manufacturing, and only 2% in agriculture. The facilities and institutions that support these information-intensive services are collectively called the information infrastructure, by analogy to the transportation services and facilities that dominate the physical infrastructure of a modern state. The legal scholar Anne Wells Branscomb defined information infrastructure in 1982 as "the lifeline through which we sustain our business and commerce, amuse ourselves, exchange our gossip, console our friends, and cry out for help in emergencies."

In the 1990s the U.S. computer and telecommunications industries represented the largest manufacturing sector in the U.S. economy, outpacing the auto industry. In the period from 1965 to 1995, computer hardware of a given speed and power fell in cost by about 15% per year, an unprecedented rate of industrial productivity improvement. The creation of software that gave the hardware its useful function proved more resistant to productivity growth, resulting in a shift of economic activity from hardware to software. The best estimate of the total size of all segments of the U.S. software industry, made for the year 1988 by C. A. Zraket, president of Mitre Corporation, was an annual revenue of $100 billion, or roughly three times official government estimates for that period. This represented about a 60% share of world software revenue. Despite very rapid growth of the Japanese computer industry and a strong position in Japan and Europe in digital telecommunications, the United States was still the dominant industrial power in information technology in the early 1990s. Although the industry's growth rate for hardware and software slowed in the early 1990s, the pace of technological change itself became a source of economic and social stress, for individuals and institutions often have difficulty accommodating such rapid change.

Effects of Computers on Society. The social effects of the computers that surround us, seen and unseen, in our daily lives are of two kinds: immediate effects on each individual who uses a computer or encounters services delivered by computer, and aggregate effects on society as a whole. (These societal issues will be addressed in the following section.) Residents of the United States find the effects on individuals most visible, since they have become immersed in a society in which virtually every activity has become dependent on the reliable functioning of information technology and the people who manipulate it.

Perhaps the most dramatic evidence for the extent of computer use was seen in the fact that by the early 1990s over 20 million personal computers (PCs) were installed in homes across the United States. The base of installed PCs in U.S. businesses, government, and education (excluding homes) continued to grow throughout the 1990s. These statistics do not include embedded microprocessors, which were installed by the hundreds of millions in automobiles, appliances, timepieces, television sets, videocassette recorders (VCRs), telephones, and fax machines, and which had countless other uses in business, agriculture, education, and the military. Products whose functions are controlled by microprocessors have often been referred to as smart products. Because they are more energy efficient and more responsive to users' needs, they have changed the way people live and work in many ways.

Among the most pernicious social effects of information technology's contribution to economic productivity is the economic disenfranchisement of those who receive an inadequate education. Manual tools can be learned by apprenticeship; information tools require formal education. Increasingly, the tools of a modern economy incorporate computer intelligence and require educated users. Because these tools make their operators more productive, the operators command higher compensation, leaving the educationally deprived even more disadvantaged.

Generally, computers perform functions long familiar to people in other forms. Why, then, has the shift from file cabinets, adding machines, and voice telephones to digital computers, telecommunications, and software made such a big difference in our lives? Digital information handling differs from the handling of analog, or uncoded, information in five crucial respects:

(1) Retrieval, copying, storage, and transmission of information, once digitally encoded, can be made essentially error free by using mathematical "tricks" such as parity checking (illustrated in the sketch following this list).

(2) Computers can perform operations with extraordinary speed and can inexpensively store very large amounts of information in a very small space. This facility introduces great economies of scale and scope to information applications and makes possible the cross-comparison of enormous databases for consistency, activities that would be inconceivably complex using paper records.

(3) Because electronic communication is virtually instantaneous, computer communications create affinities of interest that are independent of distance. This was the idea behind Marshall McLuhan's concept of a "global electronic village."

(4) Digitally encoded information can be exactly recognized by a machine, which allows information to be searched, compared, and logically processed, then used to create new information and to trigger further actions automatically.

(5) Intelligent machines, such as those implementing "neural networks" that emulate the wiring of synapses in the brain, can be designed and programmed so that they "learn" from repeated experience with a task.
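The parity checking mentioned in point (1) can be made concrete. The following sketch in Python is a minimal illustration, not any production protocol: it appends a single even-parity bit to a block of bits so that a receiver can detect a single flipped bit in transmission.

```python
def add_parity(bits):
    """Append an even-parity bit so the block always carries an even number of 1s."""
    return bits + [sum(bits) % 2]

def check_parity(block):
    """Return True if the block still has an even number of 1s (no single-bit error)."""
    return sum(block) % 2 == 0

message = [1, 0, 1, 1, 0, 1, 0]
sent = add_parity(message)          # -> [1, 0, 1, 1, 0, 1, 0, 0]
assert check_parity(sent)           # an intact block passes the check

corrupted = sent.copy()
corrupted[3] ^= 1                   # flip one bit "in transit"
assert not check_parity(corrupted)  # the error is detected
```

A single parity bit detects any odd number of flipped bits; practical systems layer richer error-detecting and error-correcting codes on the same principle, which is what makes digital copies and transmissions effectively error free.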

Realization of the elusive promise of artificial intelligence was long delayed. The hope, in the 1960s, of automatic language translation remained unrealized in the 1990s, but other areas of artificial intelligence research, including expert systems, image and sensory recognition, and robotics, have been integrated into many applications. Expert systems, pioneered by Professor Edward Feigenbaum, draw logical conclusions by processing a large store of expert knowledge. Other computer systems can "see," "hear," "feel," and manipulate objects. These two lines of investigation are coming together in new tools for exploring artificial or "virtual" reality.
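The machine "learning" mentioned in point (5) above can likewise be made concrete with a toy example. The sketch below is a minimal illustration, not a rendering of any system named in this article: it trains a single artificial neuron of the kind composed into neural networks, a perceptron, to reproduce the logical AND function by nudging its connection weights after each mistake.

```python
# A toy perceptron "learning" the logical AND function from repeated trials.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1 = w2 = bias = 0

for _ in range(10):                      # repeated experience with the task
    for (x1, x2), target in examples:
        output = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0
        error = target - output          # zero when the neuron is right
        w1 += error * x1                 # nudge each weight toward correctness
        w2 += error * x2
        bias += error

# After training, the neuron reproduces AND on all four inputs.
for (x1, x2), target in examples:
    assert (1 if w1 * x1 + w2 * x2 + bias > 0 else 0) == target
```

The program is never told the rule for AND; the weights simply settle, over repeated passes, on values that classify every input correctly.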

Despite their extraordinary capabilities and promise, and over a quarter-century of research on artificial intelligence, computers are deterministic machines; they do not think for themselves but instead doggedly follow the instructions in their programs, using the data in their stores. This fact has produced a great deal of frustration on the part of those computer users who expect a computer to respond to their instructions in the same way a human being might. This frustration has been perhaps best expressed in a poem posted anonymously on the bulletin board of the University of Wisconsin computer center:

I'm sick and tired of this machine
I wish that they would sell it.
It never does just what I want,
But only what I tell it.

Millions of citizens who use personal computers take satisfaction in their mastery of the new technology, but their frustrations also color their attitudes toward the use of computers by government, merchants, banks, and other institutions they encounter. People complain that computers can be hard to understand and that messages created by them sometimes can be inappropriate or incomprehensible. Human factors and ergonomics experts continue to attempt to overcome these difficulties; the object of their efforts has been a "user friendly" machine, a phrase used more often in irony than in admiration.

Computers and Privacy. Citizens continue to be deeply concerned about threats, both real and perceived, to their personal liberty and privacy. Government agencies and commercial firms operate a vast array of computer systems and databases that contain information about individuals, usually created to provide people with services or benefits. Each time a credit card is used, a computer record is created; combined with records of mail-order purchases, magazine subscriptions, bank transactions, and hotel and airline reservations, such records make it possible to paint a picture of an individual's tastes, preferences, buying habits, and ability to pay. But the records contained in these machines, and the algorithms that process them, have been largely invisible to the individuals whose actions created them. These records can be used to improve service; they can also be used to bombard individuals with unwanted business solicitations and junk mail. Errors in these records, whether introduced accidentally or maliciously, can create serious difficulties for individuals, such as the loss of financial credit, improper billings, or even accusations of illegal behavior.
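In computing terms, the profile-building described above is a simple join of records from separate databases on a shared identifier. A toy sketch follows; every name and field in it is hypothetical, invented purely for illustration:

```python
# Hypothetical records from two unrelated databases.
credit_records = [
    {"holder": "J. Doe", "purchase": "airline ticket"},
    {"holder": "J. Doe", "purchase": "hotel room"},
    {"holder": "A. Smith", "purchase": "groceries"},
]
subscriptions = [
    {"subscriber": "J. Doe", "magazine": "Yachting Monthly"},
]

# Matching on the shared name assembles a profile that neither
# database contains by itself.
name = "J. Doe"
profile = {
    "name": name,
    "purchases": [r["purchase"] for r in credit_records if r["holder"] == name],
    "subscriptions": [r["magazine"] for r in subscriptions if r["subscriber"] == name],
}
print(profile)
```

Real matching systems must cope with messier identifiers (variant spellings, shared names), but once records are digital, the economies of scale noted earlier make such cross-comparison cheap.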

Privacy Concerns. Privacy concerns became a national issue in the 1960s, crystallized by the publication of Alan Westin's book Privacy and Freedom in 1967. They peaked during the Ford administration, when Vice Pres. Nelson Rockefeller chaired a national commission to develop a legislative response to the issues raised; in the United States, government behavior had been the focus of concern. The computer industry responded early to these concerns and promulgated voluntary privacy principles, insisting, successfully, that the principles should apply to all records of personal data, whether in electronic or paper form. The Privacy Act of 1974, amended in 1976, was the primary legislative achievement. The law provides citizens with access to computer records pertaining to them and limits government agencies' authority in the use of those records; however, substantial exceptions have been made for national security requirements. Regulation of private-sector data activities has been quite limited in the United States; the primary regulations guarantee citizens access to their financial credit records and the right to correct them. In the United Kingdom and in Europe generally, public concern about privacy has focused not so much on governments as on private firms. In some countries all commercial databases containing personal information must be registered; in others, special ombudsman offices have been established to protect citizens' interests.

Many privacy issues remain unresolved, for they involve trade-offs between desirable and undesirable attributes of the same application. Citizens must cooperate in the creation of some national statistical data collections, such as the U.S. Census, if their value to society is to be realized. The National Crime Information Center (NCIC), operated by the Department of Justice, maintains banks of computers in which state and local police records are stored along with federal data. Citizens may approve of the FBI (Federal Bureau of Investigation) using the NCIC to track suspected, but unindicted, drug dealers through local police surveillance and through electronic tracking of credit card use and hotel reservations, yet the same people strenuously object to the idea that their own travels and buying activities might be revealed to the FBI. During Judge Robert Bork's Supreme Court confirmation hearings in 1987, the record of his videotape rentals, contained in a video store's computer, was published. The public outcry against this blatant invasion of his privacy was so intense that Congress almost immediately passed a law making such records confidential.

Confidentiality of medical records is another such example. Should records of HIV- (human immunodeficiency virus-) positive individuals be disclosed to prospective employers or insurance companies? Should a national data bank of genetic information, analogous to the FBI's fingerprint files, be created?

Nation-states assert a national right to privacy, often referred to as cultural sovereignty. Satellites broadcast television signals across national borders, and computer networks carry culture-sensitive information across them as well. Nations seeking to minimize political or cultural intrusion from their neighbors, and hoping to protect national markets for their own artists and producers, find little support in international law for asserting their cultural sovereignty. Similar concerns surround the prospect of earth-observing satellites acquiring information across borders.

Data Encryption. The growth of computer networks and data banks in the 1960s and 1970s created a need for good encryption, for the best way to protect electronic messages and records is to encrypt, or encipher, them. In 1976 the U.S. government promulgated a Data Encryption Standard (DES) for communicating and storing non-national-security data, based on a strong computer cipher invented by International Business Machines Corporation (IBM). Because cryptography in the United States, as in other countries, has been closely regulated by government as a matter of the highest national security, the commercial introduction of the DES was accompanied by considerable controversy. The export of products incorporating the DES was, and is, tightly regulated. This presented the government with a dilemma: while it wished to ensure the privacy of U.S. computer messages and records, it also wished to obtain intelligence by reading the secret messages of foreign governments, terrorists, and other criminals. Some academic and commercial cryptologists therefore wondered whether the government had weakened the DES so that the government, with its immense resources, could crack it even though private entrepreneurs could not; some even questioned whether IBM, lured by the promise of government contracts, had connived with the government to leave a "trap door" in the DES algorithm. The DES, however, has stood the test of time; no evidence that it is insufficiently robust for personal and commercial use has ever been publicly documented.
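Functionally, the DES is a block cipher: under an 8-byte key (56 effective bits), it transforms 8-byte blocks of data into ciphertext and back. Below is a minimal sketch of that interface in Python, assuming the third-party pycryptodome package is installed; the package and its calls are this sketch's assumption, not anything specified in this article, and the DES with the simple ECB mode shown is far too weak for modern use, appearing here only as historical illustration.

```python
from Crypto.Cipher import DES      # third-party: pip install pycryptodome

key = b"8bytekey"                  # DES keys are 8 bytes, 56 effective bits
plaintext = b"ATTACK AT DAWN.."    # DES operates on 8-byte blocks (16 bytes here)

ciphertext = DES.new(key, DES.MODE_ECB).encrypt(plaintext)
recovered = DES.new(key, DES.MODE_ECB).decrypt(ciphertext)
assert recovered == plaintext      # only a holder of the key recovers the text
```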

The U.S. government has sought to resolve its dilemma by prohibiting the export of the DES and convincing other countries to do the same. In the 1980s and 1990s, with encryption becoming ever cheaper, better, and more common, another threat loomed: the government might no longer be able to understand the wiretapped telephone calls of criminal suspects. The government therefore proposed that two organizations hold the encryption keys to commercially sold telephone scramblers and make those keys available to law-enforcement agencies on court order. The system, called key escrow, was not immediately adopted. Opponents felt that criminals would not use the escrowed scramblers even if they were conveniently available and that widely available strong encryption beneficially enlarged citizens' privacy.

The laws in some Asian and European countries restrict the right of private persons and business firms to use cryptography to protect messages communicated across their borders. Some require that the encryption key be divulged to government security authorities. Private firms, suspicious of government complicity in industrial espionage, often circumvent these regulations.

Social Effects of Information Technology. It has become commonplace to think of technological change as creating social consequences, for good or ill. The anticipation of social effects of technology has become a field of study in social science called technology assessment. While it can hardly be described as an exact science, it has become a useful guide for monitoring the effects of a technology and for planning regulations, training, and other interventions to mitigate the undesired consequences and maximize the benefits from a new technology. However, technological change has itself become a product of social forces. Thus the character of social change resulting from a technology may be quite different in different societies, and the evolution of the technology in those societies also may be quite different. Indeed, the fact that transnational computer networks link societies with quite different laws and customs and create legal questions about jurisdiction and accountability has become a major source of conflict arising from the global spread of information technology.

The explosive growth of information technologies raised concerns on two counts: the possibility of technological abuse by authoritarian and bureaucratic governments intent on exercising political control over their citizens, and concern that computers could displace human beings in clerical jobs as well as in other forms of employment. The first of these concerns may be best illustrated by George Orwell's apocalyptic novel Nineteen Eighty-Four, first published during the Cold War era, in which an authoritarian government uses new information technologies to deprive individuals of their humanity and their freedom.

When the year 1984 arrived, reassessments were made of Orwell's warning; most scholars found some evidence for his fears but concluded, on balance, that illicit access to copying machines and foreign computer networks, together with the transnational diffusion of audiocassettes and of radio and television broadcasts, contributed substantially to the erosion of control by the USSR's Communist party. During the abortive coup in Moscow in August 1991, an international computer network used by elementary schoolchildren was employed to exchange messages in support of democracy. The American KIDNET, over which thousands of children performed cooperative science experiments, had been linked to an elementary school in Moscow. Russian teachers were in the process of testing the network when the coup occurred. The messages they sent to the United States during the crisis were later circulated on the KIDNET, providing a graphic experience of history in the making.

In the 1990s these technologies rapidly expanded, creating links between reformers in Russia and other republics and supporters of democratic government. An estimated 100,000 modems were installed on computers in Moscow City alone, which allowed their owners to connect with other computers in Russia and to networks reaching around the globe. The Moscow Telephone network sought the registration of these modems, many of which were installed without official approval. A computer network in Moscow, GLASNET (its name derived from Glasnost, or "openness"), was associated with PeaceNet, Econet, GreenNet, and many others. It provided contact among ordinary Russian citizens and foreigners sharing their interest in peace, human rights, and environmental protection. Supporters of the Russian president Boris Yeltsin made use of GLASNET during the August 1991 coup.

Global Computer Networks. The largest collection of interconnected networks in the world is the Internet, a network of networks among which electronic mail messages flow. In 1995 over 100 countries were connected directly or indirectly to the Internet. In 1994 Anthony Rutkowski, executive director of the Internet Society, estimated that at least 10 million people worldwide had access to the Internet, and Internet traffic grew exponentially during the early 1990s. By the mid-1990s some 28 million personal computers in the United States, or 56% of all PCs installed there, were attached to others through local area networks (LANs). Many more can be expected eventually to be linked to the Internet.

Transborder Data Flow. When commercial applications of computer networks began to span national boundaries in the 1970s, with American companies in the lead, Europeans became concerned about foreigners controlling their domestic information resources and threatening their national sovereignty. For example, the fire department in Malmo, Sweden, purchased an interactive database providing information about the city of Malmo from a U.S. firm. Concerns about the vulnerability of Swedish citizens to a possible breakdown of the computer in Columbus, Ohio, where the data were stored, were widely publicized. Economies of scale and scope accelerated the formation of information services that crossed national boundaries. Governments reacted by creating constraining regulations. For instance, the Brazilian government required all commercial databases to be moved inside Brazilian borders before they could be used. In the 1990s, however, the level of public concern about transborder data flow subsided.

Intellectual Property. A second, unresolved information trade issue discussed in the General Agreement on Tariffs and Trade (GATT) concerned national differences in the way intellectual property is protected and traded. In the 1990s a major international effort was under way to bring more uniformity to various nations' rules; the discussion took place in the GATT, regionally in the European Economic Community, and bilaterally among the industrialized countries. The existence of so many transnational information networks meant that intellectual property embodied in documents, images, and multimedia circled the globe at the speed of light. Unless these ideas, inventions, and creative works are protected in foreign lands as well as in the United States, their originators could lose the opportunity to appropriate the benefits, reducing the incentive to create.

This problem has become particularly acute for computer software, where copyright and patent law have proved a less-than-perfect fit to the nature of software. Software has become very expensive to develop and produce but costs almost nothing to duplicate electronically. The software industry in the United States and Canada estimated in 1990 that $2.4 billion worth of commercial software was illegally copied and distributed. This figure was equal to nearly half of the $5.7 billion in sales of end-user, commercial software for the same year.

Developing-Country Interests. At the same time the developing countries, concerned that they had been left behind in the rush to construct a global information infrastructure, sought concessions from technologically richer countries. Strident calls for a "New World Information Order" that dominated many United Nations meetings in the 1970s and 1980s gave way to more sophisticated efforts to obtain foreign investments to create information resources under favorable conditions. Are the new electronic information technologies widening the gap between rich and poor nations, or are they providing the very tool developing countries need to overcome their isolation? The spread of computer networks around the world can accelerate a developing country's access to the world's knowledge resources. At the same time, because their information infrastructure is weak, their economies tend to lag behind those of industrial societies. The net result seems to be islands of modernity, such as Singapore and Hong Kong, amid oceans of poverty and ignorance.

Organizational Behavior. Early in the institutional use of computers, commercial firms used centralized computer systems to increase efficiency, reduce waste, protect assets, and monitor operations. As interactive computer networks became commonplace, transaction processing replaced the more labor-intensive manipulation of paper documents. The result was increased efficiency and control, reinforcing the hierarchical structure of organizations. The ARPANET, a computer network developed by university scientists working with the Defense Department's Advanced Research Projects Agency, had the opposite effect: it empowered individuals at all levels of an organization to share ideas, plans, and solutions, flattening the institution's hierarchical structure. The ARPANET model quickly gave rise to commercial computer networks. By 1985 some 80,000 IBM employees were interconnected through a simple peer network called VNET. A similar store-and-forward network, BITNET, linked universities in 26 nations and was also connected through a gateway to the Internet. By 1992 over 10,000 IBM personnel, three-quarters of them in the United States, had Internet addresses, and the number continued to rise rapidly. The coexistence of hierarchical and peer-to-peer networks, often using the same physical communications facilities, gives organizations a remarkable combination of creativity and control, with the virtues of both vertical and horizontal organizational relationships.

In the 1980s private corporate networks began linking themselves to one another to provide electronic data interchange (EDI). A manufacturer's orders to its suppliers are transmitted electronically, allowing the supplier to adopt "just-in-time" manufacturing efficiencies. Similarly, the manufacturer's internal network may be connected to those of its customers, providing quicker service and feedback. Each firm may have many such information relationships with its customers, suppliers, banks, stock exchanges, regulatory agencies, universities, and other institutions. Many of these links circle the globe. The result is the interlinking of the economies of many nations, with a geographic dispersion of economic activity that makes nationalistic policies increasingly inappropriate and ineffective. This intricate web of electronic information relationships is of such enormous complexity that many people have become concerned about its vulnerability to technical instabilities as well as industrial espionage.

Revolution in Engineering and Industrial Innovation. Computers have revolutionized engineering and science and their use in industrial innovation. Because production processes and the properties of materials can be quantitatively characterized, the pace of industrial innovation has accelerated, productivity has been enhanced, and the quality of production has greatly improved. These new industrial methods are called computer-integrated manufacturing (CIM). Product designs created using computer-aided design (CAD) can be tested and refined through computer simulation. Automated production tooling can then be programmed through computer-aided manufacturing (CAM) tools. The logistics of manufacturing, including just-in-time deliveries of essential components by suppliers, are managed by materials-control and process-flow-control computers. The whole of this process is CIM. Firms using it successfully often gain a substantial competitive advantage. Thus one important social impact of the information age has been a radical change in the way manufacturing is performed. With the aid of these new tools, it has become much easier to distribute manufacturing activities across many different locations and to introduce new products and processes simultaneously worldwide. By using electronic data interchange, cooperating firms gain the advantages of vertical integration without the inflexibility and bureaucratic impediments of aggregating all the needed capabilities within a single firm.

The negative side of this revolution in the nature of work, especially for inhabitants of wealthy countries, has been a permanent alteration in the global division of labor. The geographic dispersion of the locus of production, made possible through computer control of processes and logistics, means that U.S. workers compete with others in faraway places. Even service jobs have been exported to low-cost locations. People in the Caribbean island of Barbados, for example, were employed to enter the data on gasoline credit slips into massive banks of computers, connected to America by satellite dish. From the perspective of the developing countries, this phenomenon provided significant economic opportunity.

Telecommuting and Collaborating at a Distance. Just as industrial firms have used computer networks to create geographically distributed industrial alliances, individuals began to use computer networking as an alternative to commuting to work. In 1991 there were 31.2 million American households in which one or more persons worked at home at least part-time, up 12.6% from the previous year. This figure represented one-third of all U.S. households. These households used information technology to make work at home feasible; two-thirds of them had an electronic answering machine, and nearly half had a personal computer. The annual growth rates of home computers, facsimile (fax) machines, and modems for telecommuting were estimated at 28% per year. Computer networks also made it possible for individuals at widely separated locations to collaborate in a common task. As applied to scientific and engineering research, the National Science Foundation referred to this arrangement as a "collaboratory." It was extensively used in software design and development and spread to many other fields of endeavor.

Computer Crime and Sabotage. Computers give rise to problems that have proved a challenge to the U.S. legal system. Courts traditionally look to precedent to resolve conflicts and to create appropriate codes of behavior; however, when radical technological change occurs, the old models on which cases have been based can no longer be stretched to cover new situations, and new laws must be written. A new legal specialty arose: computer law. Thousands of attorneys specialize in it.

It was not until 1980 that Congress determined that computer software could be covered by copyright protection. The legal basis for controlling unauthorized entry into computers is complex. Wiretapping of computer communications was not illegal until enactment of the Electronic Communications Privacy Act in 1986. The Computer Fraud and Abuse Act of 1986 makes deliberate unauthorized computer penetration a felony and inadvertent entry a misdemeanor. Every state but one (Vermont) had, by 1991, passed a similar statute dealing with computer crime.

Computer systems are vulnerable to destructive activities ranging from deliberate sabotage, to theft or alteration of data, to computer vandalism, to relatively harmless pranks by "hackers." One particularly pernicious form of unauthorized intrusion is the self-replicating software program, whether one that attaches itself to other programs (a computer "virus") or one that travels through a network on its own (a computer "worm"); such a program may destroy data, clog networks, or merely post unexpected messages. One example of such a worm was the program released onto the Internet in 1988 by Robert T. Morris, Jr. Although he insisted he had no intention of causing havoc, networks became hopelessly clogged all across the United States, and computers had to be shut down and restarted, seriously disrupting hundreds of institutions. He was convicted under the 1986 federal statute.

Most deliberate attempts to steal, alter, or destroy data can be prosecuted under existing criminal laws where intent to do harm can be shown. When an intrusion takes place in a country other than the country from which the data was stolen, the jurisdiction in which the offense should be prosecuted may be unclear. The hazards of computer terrorism, that is, deliberate attempts to wreak havoc against a nation or an institution, are considerable. The information systems most critical to national security are rigorously protected from outside attack. Most of the press reports of intrusions into computers on military installations relate to access to mail systems or other applications at relatively low security levels.

So-called computer hackers represent quite a different phenomenon. The term hacker is generally applied to young people with a special talent for computers, a distrust of all large institutions and their bureaucracies, and a deep conviction that information should be treated as a public good, open to all. Sometimes a hacker's motivation is simply to demonstrate that the security of computer installations is not as great as authorities claim; hackers justify their intrusions by asserting that they contribute to deeper understanding of computer security. Four hackers from a group called the Legion of Doom, once arrested for entering a commercial computer, created Comsec Data Security, a computer security company designed to protect computer owners from other would-be intruders. The fact that hacker culture is creative, if disrespectful of authority, does not excuse the unauthorized tampering with systems used by others. But many individuals who practiced "hacking" in their youth have become major contributors to technical advances in computer software.

Future Prospects. As the use of computers and other digital information technologies becomes commonplace in many countries, the nature of concerns about their social effects changes. In Europe during the 1970s, concerns about computers displacing human employment were high on the political agenda and undoubtedly contributed to Europe's loss of competitive position in the world computer industry. Automation of both clerical and manufacturing tasks became widely accepted as a necessary tool for remaining competitive in world markets, even as Europe and North America struggled with high unemployment levels in the 1990s. Early science fiction stories, in which computers escaped human control and through "artificial intelligence" reduced humans to a position inferior to the machine, are no longer a source of serious concern.

There is another way, however, that computers may be reducing human autonomy even as they empower people to leverage their intelligence. As the applications of computers become more complex, and computers are asked to help make decisions when events are moving too fast for humans to track, the individual must cede control of events to the computer. In this process the opportunity for human instinct to intervene when unwanted outcomes seem in prospect is lost; the human becomes, in that sense, a victim of the machine. Consider, for example, the destruction of an Iranian commercial airliner by the U.S.S. Vincennes in 1988. The ship's automatic weapons were computer controlled, designed to intercept missiles such as the Exocet that sank the H.M.S. Sheffield in the Falklands War. The people operating the weapons on the Vincennes were in a windowless room full of computers; everything happened far away, too quickly to be seen by human eyes. The reality confronted by the human beings was an artificial reality; the complexity of the real world was filtered by the algorithms built into the radars and computers. Unexpected consequences of the assumptions built into such systems were no longer susceptible to challenge; the humans were surrounded by a sterile world largely devoid of the subtleties that make real events intelligible. As Gary Chapman, the leader of the Century 21 project to define goals for U.S. science, has written, "We seem to be moving away from the concept of the rational citizen who can examine the world and assess and implement proposals for improvement, and toward a society of spectators, people in a state of suspended animation with a cognitive inability to penetrate the fast-paced phantasmagoria of mediated experience presented to them."

People have become concerned about the vulnerability of societies dependent on networks of computers too complex for most humans to fully understand. For example, can one be certain that world financial markets will not become unstable when so much computer-programmed trading drives market activity? A second issue is the unemployability of undereducated people in an information society: if even the simplest tasks become automated, what jobs await those who drop out of school? A third issue concerns the loss of an individual's anonymity in a world where merchants, government officials, and possibly criminal elements can learn a great deal about an individual's preferences and activities from computer data records available for purchase without any legal restraint; most states, for example, will sell to direct mailers the names and addresses of everyone who has registered a car. Finally, in a world where information linkages cross political boundaries and create self-defined communities of interest not based on geography or nationality, what will be the future of the nation-state? As global interdependence gives way to global connectivity, will the result be a resurgence of democracy and individual liberty, or a chaotic situation in which ethnic and religious factions, unconstrained by stable political systems, are at war with one another? Experiences in the last decade of the 20th century gave evidence of both.

Lewis Branscomb
Harvard University

Bibliography

Bell, Daniel, The Coming of Post-Industrial Society (Basic Bks. 1976).

Bowyer, Kevin W., Ethics and Computing: Living Responsibly in a Computerized World (Inst. of Electrical & Electronics Engineers 1995).

Branscomb, Anne W., Who Owns Information? From Privacy to Public Access (Basic Bks. 1994).

Feigenbaum, Edward, ed., Building Blocks of Artificial Intelligence, 1956–1986, 2 vols. (Addison-Wesley 1993).

Gore, Al, et al., "Computers, Networks and Public Policy," Scientific American (September 1991). Special issue includes articles by Al Gore, Lee Sproull, Nicholas Negroponte, Mitchel Kapor, and Anne W. Branscomb.

Kelly, Kevin, Out of Control (Addison-Wesley 1994).

Marcus, George E., ed., Technoscientific Imaginaries: Conversations, Profiles, and Memoirs (Univ. of Chicago Press 1995).

McAfee, John, and Colin Haynes, Computer Viruses, Worms, Data Diddlers, Killer Programs, and Other Threats to Your System (St. Martin's 1989).

McLuhan, Marshall, and Bruce R. Powers, The Global Village: Transformations in World Life and Media in the 21st Century (Oxford 1992).

Negroponte, Nicholas, Being Digital (Knopf 1995).

Orwell, George, Nineteen Eighty-Four, a Novel (Harcourt 1949).

Schorbach, Karl, The Gutenberg Documents, ed. by D. C. McMurtie (Oxford 1941).

Talbott, Stephen L., The Future Does Not Compute: Transcending the Machines in Our Midst (O'Reilly & Assocs. 1995).

Westin, Alan, Privacy and Freedom (Atheneum Pubs. 1967).