Ian Hacking
Ian Hacking is a philosopher and historian of science, trained in analytic philosophy of language, who documented the development of probability from the seventeenth century to the late nineteenth in his two major works, The Emergence of Probability (1975) and The Taming of Chance (1990). Hacking identifies probability with the mathematics of randomness and chance, which did not appear until the Renaissance. From the beginning, he says, probability was dual. It has an epistemic aspect, having to do with degrees of belief, and an ontological aspect, having to do with the performance of randomizing devices like dice and coins in the long run of large numbers of trials. The first is epistemic or a priori probability; the second is the ontological and a posteriori frequency statistics that we get from experiments. Probabilities are theories used to establish degrees of belief. Statistics are experiments that may validate some of those theories.

In The Taming of Chance, Hacking argues for a nineteenth-century "erosion of determinism" that made room for genuine chance. (Other historians, e.g., Stephen Brush, made similar claims at about the same time.) As Hacking puts it in his opening pages:

"The most decisive conceptual event of twentieth century physics has been the discovery that the world is not deterministic. Causality, long the bastion of metaphysics, was toppled, or at least tilted: the past does not determine exactly what happens next. This event was preceded by a more gradual transformation. During the nineteenth century it became possible to see that the world might be regular and yet not subject to universal laws of nature. A space was cleared for chance."

Most of the mathematicians who developed the calculus of probabilities (Abraham de Moivre, Pierre-Simon Laplace, Carl Friedrich Gauss, and others), and most nineteenth-century physical scientists, believed that the randomness in chance events, including the atomic and molecular randomness that succeeded in explaining irreversibility and the second law of thermodynamics, might be the result of unknown underlying universal laws of nature, reflected in regularities such as the "law of large numbers" and the "normal distribution." Laplace explained the appearance of chance as the result of human ignorance: "The word 'chance,' then, expresses only our ignorance of the causes of the phenomena that we observe to occur and to succeed one another in no apparent order." For most of these scientists, the growing indeterminism described by Hacking was traceable to human ignorance of the detailed motions of atomic particles.

To be sure, there were some vociferous nineteenth-century proponents of "absolute" chance, such as Charles Sanders Peirce and the French philosophers Charles Renouvier and Alfred Fouillée, who inspired Peirce and his colleague William James. But the kind of indeterminism we have as a result of quantum mechanical indeterminacy is quite different from typical nineteenth-century views of probability and chance. For example, Arthur Stanley Eddington, who was intimately familiar with the statistical-mechanical basis of the second law of thermodynamics, maintained that the determinism of classical physics, which had presumably encompassed chance and probability, was gone forever.
In The Nature of the Physical World (1928), Eddington dramatically announced: "It is a consequence of the advent of the quantum theory that physics is no longer pledged to a scheme of deterministic law." Prominent dissenters from quantum theory, such as Max Planck, Albert Einstein, Louis de Broglie, Erwin Schrödinger, and David Bohm, hoped that an underlying deterministic explanation for quantum randomness would one day be found. Many philosophers, and a few scientists, still hold to this possibility of a return to strict determinism and causality.
The example of C. S. Peirce
Hacking uses Charles Sanders Peirce as his model of a nineteenth-century thinker who embraced ontological chance (Peirce called it tychism). While Peirce is an excellent choice, he is not at all typical. And Peirce had his doubts about chance; for example, he criticized the role of chance in the Darwinist version of evolution.
Peirce actually modeled his thinking on the work of Charles Darwin, but he was not satisfied with Darwin's fortuitous variation and natural selection. He wrongly associated it with the Social Darwinist thinking of his time and called it a "greed philosophy." Peirce also rejected the deterministic evolution scheme of Herbert Spencer and proposed his own grand scheme for the evolution of everything, including the laws of Nature! He called this third possibility synechism, a term he coined for continuity, in clear contrast to the merely random events of his tychism. With his typical triad of chance, determinism, and continuity, Peirce's evolutionist thinking resembles that of Hegel. It was the basis for the evolutionary growth of variety, of irregular departures from an otherwise mechanical universe, including life and Peirce's own original thoughts. For Peirce and Hegel, ideas are living things with meanings that grow over time. Peirce was a "realist" in that he believed these ideas have a metaphysically real existence. Peirce argued that the laws of nature themselves change with time, or at least that laws "emerge" at different epochs and that the laws of biology are not reducible to the laws of chemistry and physics, an idea Peirce likely got from Émile Boutroux.

Hacking ends The Taming of Chance with a paean to Peirce:

"Peirce denied determinism. He also doubted that the world is a determinate given. He laboured in a community seeking to establish the true values of Babbage's constants of nature; he said there aren't any, over and above those numbers upon which we increasingly settle. He explained inductive learning and reasoning in terms of merely statistical stability. At the level of technique, he made the first self-conscious use of randomization in the design of experiments: that is, he used the law-like character of artificial chances in order to pose sharper questions and to elicit more informative answers. He provided one of the standard rationalia for statistical inference — one that, named after other and later workers, is still with us. He had an objective, frequentist approach to probability, but pioneered a measure of the subjective weight of evidence (the log odds). In epistemology and metaphysics, his pragmatic conception of reality made truth a matter of what we find out in the long run. But above all, he conceived of a universe that is irreducibly stochastic."

Free Will
Hacking ends his opening argument with a famous quote from Kant on free will (from the essay Idea for a Universal History with a Cosmopolitan Intent), which shows that Kant believed statistics may appear to be random but are in fact governed by universal laws.
"Whatsoever difference there may be in our notions of the freedom of will metaphysically considered, it is evident that the manifestations of this will, viz. human actions, are as much under the control of universal laws of nature as any other physical phenomena. It is the province of History to narrate these manifestations; and, let their causes be ever so secret, we know that History, simply by taking its station at a distance and contemplating the agency of the human will upon a large scale, aims at unfolding to our view a regular stream of tendency in the great succession of events — so that the very same course of incidents which, taken separately and individually, would have seemed perplexed, incoherent, and lawless, yet viewed in their connection and as the actions of the human species and not of independent beings, never fail to discover a steady and continuous, though slow, development of certain great predispositions in our nature. Thus, for instance, deaths, births, and marriages, considering how much they are separately dependent on the freedom of the human will, should seem to be subject to no law according to which any calculation could be made beforehand of their amount: and yet the yearly registers of these events in great countries prove that they go on with as much conformity to the laws of nature as the oscillations of the weather."

Hacking also looks briefly at twentieth-century arguments for freedom and tries to understand why they differ from those of a century earlier. He explains why probability seemed to create space for freedom in 1936, despite the fact that it had seemed to rule freedom out in 1836. But this hardly explains why leading quantum scientists like Max Planck, Albert Einstein, and especially Erwin Schrödinger, who endorsed the nineteenth-century view of probability and statistical mechanics developed by Ludwig Boltzmann, should by 1936 have been more determinist than Peirce and other late nineteenth-century thinkers had, in Hacking's view, become. The second wave of quantum mechanics, which commenced in 1926, established that the fundamental laws of microphysics are irreducibly probabilistic.
On Thomas Kuhn
Hacking wrote an introductory essay for the 50th-anniversary edition of Kuhn's classic The Structure of Scientific Revolutions.