The 20th century may well be labeled the "century of science." Phenomenal scientific developments unfolded across three broad areas during the century: conceptual reformulations that fundamentally changed the way in which the natural world is understood and portrayed; changes in the status of science in society; and the establishment of powerful new relations between science and technology.
Most of the basic ideas held by scientists at the end of the 19th century were overturned in the course of the 20th century. As a result, a host of theoretical novelties emerged across a broad range of disciplines to radically transform our present understanding of the world.
The Einsteinian Revolution in Physics
In the last decade of the 19th century a nagging series of problems began to undermine the intellectual edifice of contemporary physics, and by the dawn of the 20th century a serious crisis had developed. The discovery of radioactivity in 1896 by French physicist Antoine Henri Becquerel challenged the accepted principle of the immutability of atoms. The discovery of the electron in 1897 by British physicist Joseph John Thomson called into question the traditional, indivisible atom as the smallest unit of matter. Beginning in 1887, the repeated failure of American physicist Albert A. Michelson and his collaborators to detect the motion of the Earth relative to the ether, an all-pervasive substance posited as the physical substrate for the propagation of light, likewise proved problematical.
Well educated technically in the central dogmas of the field, yet young and professionally marginal enough not to be locked into established beliefs, Albert Einstein was perfectly positioned to effect a revolution in contemporary physics. In 1905 he published an extraordinary series of papers that redirected the discipline. The most earth-shaking dealt with special relativity, the modern physics of moving bodies. In essence, by positing that nothing can move faster than light, Einstein reformulated the mechanics of Isaac Newton, in which speeds greater than that of light remain possible. The upshot was special relativity, an interpretation of motion that dispensed with the traditional reference frames of absolute space and absolute time. All observations (such as when an event takes place, how long a ruler is, or how heavy an object is) become relative; they depend on the position and speed of the observer. As a feature of his new physics, Einstein posited his celebrated formula, E = mc², which equates energy and mass (via the constant speed of light, c) and makes matter a kind of frozen energy. In 1915, Einstein published on general relativity, the modern physics of accelerated motions, wherein he equated gravity with acceleration. Observations of a total solar eclipse in 1919 seemed to confirm Einstein's prediction that the mass of the Sun should bend starlight, and precise calculations of Mercury's orbit around the Sun likewise agreed with general relativity.
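To give a sense of what the formula implies, here is a back-of-the-envelope calculation (my own illustration, not from the article) of the energy locked in a single gram of matter:

```python
# Mass-energy equivalence: E = m * c**2
# Illustrative arithmetic only; the numbers are standard physical values.
c = 2.99792458e8   # speed of light in meters per second (exact, by definition)
m = 0.001          # one gram, expressed in kilograms
E = m * c ** 2     # energy in joules
print(f"{E:.3e} J")  # roughly 9e13 joules
```

That is on the order of 9 × 10¹³ joules, comparable to the energy released by a roughly 20-kiloton nuclear explosion, which is why "frozen energy" is an apt description.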
Particle Physics and the Quantum Revolution
With the discovery of the electron and radioactivity, the atom became a prime focus of research. In 1904, J. J. Thomson proposed a model of the atom having negatively charged electrons sprinkled about like raisins in a cake. Using radioactive emissions to investigate the interior structure of the atom, British physicist Ernest Rutherford in 1911 announced that atoms were composed mostly of empty space. Rutherford proposed a model that had electrons orbiting a small, dense nucleus, much as planets orbit the Sun. In the 1920s, problems with this solar-system model (Why do electrons maintain stable orbits? Why do atoms radiate energy only in discrete amounts?) led to the formulation of the seemingly paradoxical new theory of quantum mechanics. The principles of quantum mechanics were difficult to accept. Empirical studies supported the theory, however, as did the social network that arose around Niels Bohr and his Institute for Theoretical Physics in Copenhagen. Essentially, quantum theory replaced the deterministic mechanical model of the atom with one that sees atoms (and, indeed, all material objects) not as sharply delineated entities in the world but, rather, as having a dual wave-particle nature, the existence of which can be understood as a "probability wave." That is, quantum-mechanical "waves" predict the likelihood of finding an object, whether an electron or an automobile, at a particular place within specified limits. The power of this counterintuitive analysis was greatly strengthened in 1926 when two distinct mathematical means of expressing these ideas, the matrix mechanics developed by Werner Heisenberg and the wave equations developed by Erwin Schrödinger, were shown to be formally equivalent. In 1927, Heisenberg proposed his famous uncertainty principle, which states that an observer cannot simultaneously and with absolute accuracy determine both the position and the speed (or momentum) of a body.
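In the standard modern notation (not spelled out in the article itself), the uncertainty principle bounds the product of the uncertainties in position and momentum by Planck's constant:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi}
```

Because ħ is so tiny (about 1.05 × 10⁻³⁴ joule-seconds), the indeterminacy is utterly negligible for automobiles yet decisive for electrons.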
In other words, quantum mechanics reveals an inherent indeterminacy built into nature. Knowledge of fundamental particles grew rapidly thereafter. In 1930, Wolfgang Pauli postulated the existence of a virtually massless uncharged particle, later christened the neutrino. In 1932 the discovery of the neutron, a neutral body otherwise similar to the positively charged proton, complemented the existence of electrons and protons. In that same year, detection of the positron, a positively charged electron, revealed the existence of antimatter, a special kind of matter that annihilates regular matter in bursts of pure energy. Through the work of the flamboyant American physicist Richard Feynman and others, quantum theory was extended into quantum electrodynamics, the prototype of the modern quantum field theories. Hand in hand with experimental high-energy physics, it has revealed an unexpectedly complex world of fundamental particles. Using particle accelerators of ever greater energy, nuclear physicists have created and identified more than 200 such particles, most of them very short-lived.
Active research continues in contemporary particle physics. For example, elusive particles known as the Higgs boson and the graviton (posited to mediate the force of gravity) are now prime quarry. Theorists succeeded in the 1970s in conceptually unifying electromagnetism and the weak nuclear force into the so-called electroweak force; an as-yet-unrealized grand unification theory will perhaps add the strong nuclear force. However, the Holy Grail of an ultimate theory uniting all the forces of the universe, including quantum gravity, remains a distant goal.
Cosmology and Space Science
In the 18th century, astronomers broached the idea that our Galaxy (the Milky Way) is an "island universe." Observations of nebulous bodies in the 19th century further suggested that additional galaxies may populate the cosmos outside the Milky Way. Only in the 1920s, however, in particular because of the work of American astronomer Edwin Hubble, did the extragalactic nature of some nebulous bodies, the immense distances that they involve, and the apparent expansion of the universe as a whole become established in cosmology. The question of how to account for the apparent expansion of the universe puzzled theorists in the 1950s. Essentially, two mutually exclusive views divided their allegiance. One, the steady-state theory, held that new matter appears as space expands. The alternative theory proposed that the universe originated in an incredibly hot and dense "big bang," resulting in a universe that continues to expand. Both camps had their supporters and arguments, although most cosmologists seemed predisposed to steady-state models throughout the first half of the 20th century. The debate was ultimately decided only after 1965 with the almost accidental discovery by two Bell Laboratories scientists, Robert W. Wilson and Arno A. Penzias, of the so-called 3-kelvin (3 K, where 0 K is absolute zero) background radiation. (The idea that explains their discovery is elegant. If the universe began as a hot fireball with a big bang, then it should have "cooled" over time, and calculations could predict what the residual temperature of the universe should be today.) It was this relic heat, measured by an omnipresent background radiation at roughly three degrees above absolute zero, that Penzias and Wilson stumbled upon. Their discovery won them the Nobel Prize in 1978 and sounded the death knell for steady-state cosmology.
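As a quick check of scale (my own illustrative calculation, not part of the article), Wien's displacement law shows why radiation from a body at roughly 3 K peaks at millimeter wavelengths, which is why Penzias and Wilson stumbled upon it with a microwave antenna:

```python
# Wien's displacement law: blackbody radiation from a body at
# temperature T peaks at wavelength lambda_max = b / T.
# Illustrative numbers only, not from the article.
b = 2.898e-3   # Wien's displacement constant, in meter-kelvins
T = 2.7        # approximate temperature of the cosmic background, in kelvins
lambda_max = b / T
print(f"{lambda_max * 1000:.2f} mm")  # about 1 mm: squarely in the microwave band
```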
Research in cosmology continues in a number of areas. Precisely how old is the universe? What will be its fate? Work in quantum cosmology concerning such esoteric concepts as "imaginary" time, "virtual" particles, quantum "tunneling," and the chance coming-into-being of matter out of the quantum vacuum may prove useful in addressing these issues. Today, theorists also entertain superstrings, multiple universes, and other equally provocative but perhaps intellectually productive concepts. Research into black holes, those extraordinarily dense celestial objects whose gravity prevents even the escape of light but that nonetheless "evaporate" energy, may also illuminate these ultimate questions.
More proximate celestial matters also drew attention in the 20th century. For example, a notable landmark concerns the evidence (uncovered in 1980) suggesting that a catastrophic meteorite or cometary impact caused a mass extinction, including the extinction of the dinosaurs, at the end of the Cretaceous Period 65 million years ago. The significance of this finding lies in what it implies about the role of accident in the history of life, including the origin and fate of humans. Scientists are now surveying the orbits of comets and asteroids to see whether any of them pose a threat of future collisions.
Planetary space science has similarly transformed the way in which humans visualize the Earth and its near neighbors. The century had begun with some remaining hope that life (and in some cases, perhaps even Earthlike life) might yet be found on other planets of the solar system. Long before the century's end these hopes had almost wholly been dashed, most spectacularly in the case of cloud-hidden Venus, which space probes revealed to be a place too hellishly hot to support any life form whatsoever. Mars, the other potentially Earthlike planet, did provide suggestions of possible life in the geological past, but the planet as it now exists is an arid desert. Some hope for solar-system life still remains at Europa, a satellite of Jupiter that may contain a liquid sea locked beneath its frozen surface. In the meantime, however, space science beyond the Sun's local system opened up dramatically in the later 20th century with the confirmed discovery of objects circling other stars. The bodies found thus far are notably un-Earthlike, but their mere existence offers the chance for vistas yet unguessed.
The Life Sciences
The 19th century bequeathed many fundamental biological principles: the concept of the cell, the germ theory of disease, rejection of the possibility of spontaneous generation, experimental methods in physiology, and a mature field of organic chemistry. However, at the turn of the 20th century the scientific community was far from accepting one notable axiom, that of evolution by natural selection, first proposed by Charles Darwin in 1859.
Around 1900, rediscovery of the work on heredity of the Austrian monk Gregor Mendel set the stage for the 20th-century synthesis of Darwin's findings. Mendel had shown that individual traits maintained themselves over the course of generations, thus providing a basis for understanding variation and selection from a Darwinian point of view. Thomas Hunt Morgan, experimenting with the Drosophila fruit fly at Columbia University in the 1910s and '20s, showed that chromosomes within cells contain the information for heredity and that genetic change produces variation. Mathematicians specializing in population genetics soon demonstrated that a single variation could come to characterize a whole new species, and by the later 1940s DNA (deoxyribonucleic acid) looked to be the material basis of life and inheritance. The uncovering of the double helical structure of DNA in 1953 by James D. Watson and Francis Crick provided the key conceptual breakthrough for understanding reproduction, the nature of heredity, and the molecular basis of evolution.
Although Darwin had postulated the evolution of humans from simian ancestors, a coherent story of human evolution was the product of 20th-century scientific thought. The discoveries of the Neandertal variety of Homo sapiens and of Paleolithic cave art (produced by anatomically modern humans; see prehistoric art) were generally not accepted until the turn of the 20th century. The first Homo erectus fossil was unearthed in 1891, and the first Australopithecus surfaced only in 1925. The "discovery" of the notorious Piltdown man in 1912 long confused the emerging picture of human descent from more primitive ancestors, and only in 1953 was the Piltdown specimen proven to be a fraud. Since that time, an increasingly clear picture has emerged of the stages of human evolution, from Australopithecus afarensis ("Lucy" and her kin, first uncovered by Donald Johanson in 1974), through Homo habilis, H. erectus, and then varieties of H. sapiens.
The scientific study of the human mind likewise expanded in the period. Several divergent strands characterize psychology in the 20th century. One of those strands, the work of Sigmund Freud, was of signal theoretical importance into the 1960s. In such works as The Interpretation of Dreams (1900), Freud made an indelible impression on the century through his emphasis on the unconscious and on infantile sexuality as operative not only in psychological illness but also in everyday life, and Freudian psychoanalysis provided a new avenue for self-awareness and improved mental health. By the same token, 20th-century psychology was characterized by a wholly reductionist and nonsubjective strand. The Russian experimentalist I. P. Pavlov pioneered this line of work in proving that conditioning can determine behavior. The American psychologists John B. Watson and, later, B. F. Skinner formalized these notions as behaviorism, a psychology that emphasized learned behaviors and habit and that rejected the study of mental processes as unscientific. Psychology has since moved in the direction of cognitive explorations and is currently under challenge by increasingly sophisticated biochemical and physiological accounts of mental experience. Sociobiology and evolutionary psychology, fields that postulate that patterns of culture may be explained by genetics, are likewise affecting the science of psychology.
Striking theoretical novelties that arose in geology and the Earth sciences in the 20th century also suggested interesting new chapters in the development and evolution of life forms. Most notably, the idea that the continents may actually be "plates" riding on the Earth's mantle was generally rejected when it was first put forward in 1915 by German geologist Alfred Wegener, but it gained virtually universal acceptance in the 1960s for a variety of technical (and social) reasons. Understanding the theories of plate tectonics and continental drift has since allowed scientists to "run the film backwards," that is, to recapitulate the geological history of the Earth, with all the implications that this process holds for both geology and biology.
The Scientific Enterprise in Society
Changes in the social character of science in the 20th century evolved hand in hand with theoretical reformulations.
In the 20th century the pursuit of science became a professional career and the full-time occupational concern of nearly a million people in the United States who engaged in research and development. Although wide latitude exists in scientific employment, the social pathways available for scientific training are quite narrow. Whether an individual will become a scientist is in many respects determined during his or her middle school years. Almost universally, the individual has to graduate from high school with the requisite science courses, complete an undergraduate major in a field of science, pursue graduate work, obtain a Ph.D., and then (usually) continue training for a period of years after the Ph.D. in temporary positions known as "post-docs." The Ph.D. constitutes the scientist's license, but from there on career paths diverge. Traditionally, the norm has been to pursue careers of research and teaching in universities. Increasingly, young men and women of science find productive lives for themselves in government and private industry. Also, what scientists do in the present day spans a broad range of practice: from disinterested, pure-science research in universities or specialized research facilities, to more mundane, applied-science work in industry. Generally speaking, whether in academe, industry, or government, younger scientists tend to be the more active producers of scientific knowledge.
The status of women in science changed dramatically in the 20th century. Women have never been completely absent from the social or intellectual histories of science; and, as they gained admission to universities in the 19th century, gradually increasing numbers of women became scientists. Heroic accounts often single out Marie Curie, who was the first female professor at the Sorbonne and the only woman ever to win two Nobel Prizes (in 1903 and 1911). The Austrian physicist Lise Meitner held a professorship at the University of Berlin before escaping Nazi Germany; she played a seminal role in articulating the theory behind the atomic bomb, and her case exemplifies the increasing presence and professionalism of women in modern science. With her mastery of X-ray diffraction techniques, the physical chemist Rosalind Franklin was instrumental in the discovery of the double helix of DNA. Franklin's story presents cautionary lessons as well, however, because the social system of British science in the 1950s did little to welcome her or give her credit for her work. It remains difficult for women to pursue science-based careers, but, mirroring other shifts in Western societies over the last several decades, opportunities have improved for women; the idea of a female scientist is no longer exceptional.

Scientists are expected to publish research results in scientific journals, and the old adage of "publish or perish" continues to apply to contemporary science. Participation in professional societies forms another requisite part of scientific life today, as does getting grants. Grants typically include funds for investigators' salaries, graduate and post-doc support, and equipment. Budgets usually also include so-called "indirect" costs, paid not to support the research itself but to underwrite the grantee's institution, in what amounts to an institutional subsidy.
Honorary organizations such as the National Academy of Sciences (founded in 1863) cap the elaborate social organization of contemporary science. International organizations such as the International Council of Scientific Unions (1931) govern science on a world level. Finally, at least in the public mind, the Nobel Prizes, given annually since 1901 in the fields of physics, chemistry, and physiology or medicine, stand at the very top of the social and institutional pyramid of contemporary science.
More was involved in 20th-century science than simply the linear evolution of scientific ideas or successive stages in the social and professional development of science. The exponential growth of knowledge represented another characteristic feature of science in the 20th century. By every indication, science has grown at a geometric rate since the 17th century. Several paradoxical consequences follow from this exponential growth. For example, a high proportion of all the scientists who ever lived, some 80 to 90 percent, are said to be alive today. Given its exponential growth, the scientific enterprise clearly cannot expand indefinitely, for such growth would sooner or later consume all resources. Indeed (and as predicted), an exponential growth rate for science has not been maintained since the 1970s. In any particular area of science, however, especially "hot" newer ones such as superconductivity or AIDS research, exponential growth followed by an ultimate plateau remains the typical pattern. Other methods of growth measurement, notably citation studies, add to these basic understandings. Scientists back up their results by referencing other papers, and much can be learned from studying these citation patterns. For example, a significant percentage of published work is never cited or actively used. In essence, much of this production disappears into "black holes" in the scientific literature (a fraction, however, that is considerably smaller than in the humanities). Citation studies also show a drastically skewed inequality of scientific productivity among researchers. That is, for any collection of scientists, a handful will be big producers while the vast majority will produce little, if any, work. A corollary of this differential productivity affects institutions, too, in that a handful of elite universities dominates the production of science. Citation studies also show an unusual "half-life" of scientific information.
That is, older scientific results prove less useful to practicing scientists than more recent science, and therefore citations fall off with time. In contrast with the traditional humanities, the scientific enterprise thus displays a characteristic ahistorical present-mindedness.
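The arithmetic behind the claim that most scientists who ever lived are alive today is simple enough to sketch. The calculation below is my own illustration, assuming a 15-year doubling time for scientific output (a figure often attributed to Derek de Solla Price's quantitative studies of science) and a 45-year active career:

```python
# Sketch (not from the article): why exponential growth implies that
# most scientists who ever lived are alive today.
doubling_time = 15.0   # years for scientific output to double (assumed)
career_span = 45.0     # years a scientist remains active (assumed)

# If cumulative output is proportional to 2**(t / doubling_time), the
# fraction produced within the most recent `career_span` years is:
fraction_recent = 1.0 - 2.0 ** (-career_span / doubling_time)
print(f"{fraction_recent:.1%}")  # 87.5%, consistent with the 80-90 percent figure
```

Three doublings within one career span mean that only one-eighth of all output predates the working lifetime of today's scientists, hence the 80 to 90 percent estimate quoted above.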
Big Science and the Bomb
The development of the atomic bomb by the United States during World War II marks a watershed in the history of modern science and technology. The reasons are twofold. First, the case dramatically revealed the practical potential of science and what could emerge from turning theory toward useful ends. Second, it demonstrated what might be forthcoming when government supported large-scale scientific research and development with abundant resources. Neither of these considerations was entirely new in the 1940s. The novelty of the atomic bomb case, and what it portended for the future of science and technology in the second half of the 20th century, stemmed from the combination of these two factors: large-scale government initiatives used to exploit scientific theory for practical ends.
The scientific theory behind the bomb emerged only in the late 1930s. In 1938 the German chemists Otto Hahn and Fritz Strassmann demonstrated that certain heavy elements (such as uranium) could undergo nuclear fission, "splitting" into lighter components. Then, in early 1939, Lise Meitner and her nephew Otto Frisch explained the process and calculated the immense amount of energy it could release, and physicists quickly grasped the possibility of an explosive nuclear chain reaction. On Aug. 2, 1939, with war nearly underway in Europe, Albert Einstein wrote a historic letter to President Franklin Roosevelt about the matter. The result was the largest science-based research-and-development venture to date in history. The Manhattan Project, as it was called, involved 43,000 people working in 37 installations across the country, and it ended up costing (in contemporary money) $2.2 billion. On July 16, 1945, directed by American physicist J. Robert Oppenheimer, the team set off the world's first atomic explosion in New Mexico.
The atomic bomb brought World War II to a dramatic end. It also launched the cold war. The military-industrial establishments of the United States and the Soviet Union raced to develop larger atomic weapons and then, from 1952, even more powerful thermonuclear bombs that derived their energy from the nuclear fusion of hydrogen into helium. World War II gave a push to a number of other government-funded applied-science projects, such as radar, penicillin production, jet propulsion, and the earliest electronic computers. The war established a new paradigm for science, government, and university relations that has endured to this day. After World War II it became hard to think of technology as anything other than applied science.
The Manhattan Project typified a new way of doing science: the industrialization of scientific production, or what has been called "big science." In the 20th century, research came to necessitate large installations and expensive equipment, increasingly beyond the resources of individual experimenters or universities. Teams of researchers began to replace the labor of individuals. Each member of such a team was a specialized worker, and papers issuing from team-based science were sometimes authored by hundreds of individuals. For example, the top quark was discovered in 1995 at the Fermi National Accelerator Laboratory by two separate teams, each numbering some 450 scientists and technicians and each staffing a detector that cost $100 million. Individual or small-group research continues in some fields, such as botany, mathematics, and paleontology, but in other areas, such as particle physics, biomedical engineering, or space exploration, big science represented an important novelty of the 20th century.
Science and Technology
The modern merger of science and technology, or "technoscience," continued in the second half of the 20th century. In 1995, $73 billion in federal money went to scientific research and developmenta figure that represented 2.6 percent of U.S. gross domestic product. The nation's total public and private investment in basic research hovered at $160 billion at the century's end.
Military defense overwhelmingly drives government patronage for science in the United States. Indeed, the science budget of the U.S. Department of Defense is three times that of the next largest consumer of federal science dollars, the National Institutes of Health (NIH), and defense-related science expenditures continue to receive more than half of all federal science spending. Similarly, federal dollars overwhelmingly go to support applied science. The descending order of funding, from the Defense Department, to the NIH, to the National Aeronautics and Space Administration (NASA), and finally to the National Science Foundation (NSF), is also revealing in this regard: funding for defense, health, and NASA space projects (which have always entailed significant political and employment goals) comes well before support of the agency nominally charged with the promotion of pure scientific research, the NSF. Even there, only $2.3 billion of the NSF's overall 1995 budget of $3.4 billion went to research.
For many years following the start of the cold war, the big applied science of space exploration claimed its impressive share of U.S. appropriations. NASA itself came into being in the late 1950s, at a time when the Soviet Union was alarming the United States with its launch of the world's first successful artificial satellites. However, after a few early disasters the American manned space program forged ahead with an effort that resulted in the 1969 landing of the first humans on a celestial body other than the Earth, the Moon. The Soviet Union also reached the Moon by means of automated vehicles, but its space ventures were far surpassed by U.S. probes that explored the far limits of the solar system and returned important images and data on its bodies. Many nations have since joined in the space enterprise to different degrees and for a wide range of purposes. The use of satellite data in various scientific, technological, meteorological, and military programs has long since become a staple of human life, and Russia, the United States, and a number of other countries are currently engaged in plans for building a permanent space station.
Examples of applied science in the 20th century were almost too numerous to mention: electric power, radio, television, satellite communications, computers, huge areas of scientific medicine and pharmaceuticals, genetics, agriculture, materials science, and so on. They all testified to the power of modern scientific research and development. The historical union of science and technology in the 20th century produced a significant new force for change. Marxist thinkers in the former Soviet Union recognized the novelty of this development, labeling it the 20th century's "scientific-technological revolution." The term has not been taken up generally, but it captures much of what is at issue here: the effective merger of science and technology in the 20th century and the increasing importance of science-based technologies and technology-based sciences for the economic and social well-being of humanity today.
In the period following World War II, science enjoyed an unquestioned moral, intellectual, and technical authority. Through the operations of its apparently unique "scientific method," theoretical science seemed to offer an infallible path to knowledge, while applied science promised to transform human existence. Paradoxically, or perhaps not, just as the effects of a fused science-technology culture began to refashion advanced societies, with the bomb, television, interstate highways, computers, the contraceptive pill, and so on, so too, beginning in the 1960s, antiscience and antitechnological reactions began to manifest themselves. The counterculture of the 1960s and '70s represents one example. Failures of nuclear reactors at Three Mile Island (1979) and Chernobyl (1986), perceived threats from depletion of the ozone layer, and the spread of new diseases such as AIDS and Ebola made many people suspicious of the material benefits of science and technology. Increasing societal concerns with ecology, recycling, "appropriate technologies," and "green" politics derive from such suspicions.
Work in the philosophy of science and the sociology of knowledge has challenged any uniquely privileged position for science as a means of knowing or of staking claims about ultimate truth. Most thinkers now recognize that scientific-knowledge claims are relative, inevitably fallible human creations and not final statements about an objective nature. Although some would seize on this conclusion to promote intellectual anarchy or the primacy of theology, the paradoxical fact remains that no better mechanism exists for understanding the natural world than the human enterprise of science and research.
James E. McClellan III
Ash, Mitchell G., and William R. Woodward, eds., Psychology in Twentieth-Century Thought and Society (1987).
Christianson, Gale E., Edwin Hubble: Mariner of the Nebulae (1995).
Daniel, Glyn, and Christopher Chippindale, eds., The Pastmasters: Eleven Modern Pioneers of Archaeology (1989).
Deutsch, Karl W., et al., eds., Advances in the Social Sciences, 1900-1980 (1986).
Ewing, John, ed., A Century of Mathematics through the Eyes of the [American Mathematical] Monthly (1994).
Flowers, Charles, A Science Odyssey: 100 Years of Discovery (1998).
Forman, Paul, and Sánchez-Ron, José M., National Military Establishments and the Advancement of Science and Technology: Studies in 20th-Century History (1996).
Gleick, James, Genius: The Life and Science of Richard Feynman (1993).
Gornick, Vivian, Women in Science: 100 Journeys into the Territory, rev. ed. (1990).
Hawking, Stephen W., A Brief History of Time, rev. ed. (1996).
Horgan, John, The End of Science: Facing the Limits of Knowledge in the Twilight of the Scientific Age (1997).
Krige, John, and Pestre, Dominique, eds., Science in the Twentieth Century (1997).
Maddox, John Royden, What Remains to Be Discovered: Mapping the Secrets of the Universe, the Origins of Life, and the Future of the Human Race (1998).
Ogilvie, Marilyn Bailey, and Meek, Kerry Lynne, Women and Science: An Annotated Bibliography (1996).
Outhwaite, William, and Bottomore, T. B., eds., The Blackwell Dictionary of Twentieth-Century Social Thought (1993).
Pais, Abraham, "Subtle Is the Lord …": The Science and Life of Albert Einstein (1982).
Piel, Gerard, The Age of Science: What Scientists Learned in the Twentieth Century (2001).
Rhodes, Richard, Dark Sun: The Making of the Hydrogen Bomb (1995), and The Making of the Atomic Bomb (1986).
Rossiter, Margaret W., Women Scientists in America: Struggles and Strategies to 1940 (1982), and Women Scientists in America: Before Affirmative Action, 1940-1972 (1995).
Watson, James D., The Double Helix, ed. by Gunther S. Stent (1980).
Williams, Trevor I., ed., Science: A History of Discovery in the Twentieth Century (1990).