2007 - Science, Reason, Truth
University of San Marino, Ancient Monastery of Santa Chiara
August 17-19, 2007
An international Symposium organized in collaboration with the Templeton Foundation and in connection with the XXIX edition of the Meeting for Friendship Amongst Peoples
The International Symposium "Science Reason and Truth" will be the second edition of the "San Marino Symposium", following the successful 2006 Edition. As for last year, the invitation-only 2007 San Marino Symposium will be related with the great "Rimini Meeting for Friendship of Peoples", which takes place every summer since 1980 in the nearby town of Rimini. This year, the Rimini Meeting theme will be: "Truth is the destiny for which we were made". The San Marino Symposium will bring the general theme of the Rimini Meeting to the academic level with a coordinated and focused discussion involving a limited number (about 12) world class scholars. The Invited Speakers will work together and discuss key interdisciplinary issues such as the capacity to attain truth, the relationship between scientific enquiry and other methods for reaching truth, and the boundaries of human reason. The set of presentations and discussions will be divided in three sessions, focussed on the three key words of the Symposium title ("science", "reason", "truth"), each specified by a set of topical questions.
The proceedings have been published in Euresis Journal.
1st session - Science
Scientific knowledge is usually perceived as the result of a powerful and rigorously defined method based on experimental studies and logical-deductive reasoning. This approach would seem to require a highly specialized predisposition and talent, involving a remarkable but limited set of intellectual and human capacities. Here I suggest that scientists actually engaged in the battle of scientific research, in order to move towards new knowledge and discovery, use a much wider range of rational and affective capabilities than usually assumed. Wonder and aesthetic attraction to natural phenomena are essential to initiate and to maintain scientific interest, curiosity and imagination. A deep appreciation of the ultimate questions about meaning, origin and destiny appears to act as a decisive — though often implicit — motivation for the creativity and dedication of most great scientists. Conceiving an observation or experiment involves rational processes that are analogous to the ability to frame and ask questions. Although classical falsification seems to limit the result of any single experiment to a "negative truth", the quest for a "positive truth" is approachable as a convergence of a large number of independent clues. All this implies that scientific investigation calls into play a much wider variety of intellectual and personal features than usually thought: empirical evidence and the ability to demonstrate are essential, but their possibility seems to rest on a much broader notion of reason. It is also widely believed that scientific knowledge, as it proceeds, should narrow down the space for wonder and the sense of mystery. Here I suggest that this is not the case and that, on the contrary, there are reasons to expect that such attitudes are actually enhanced by new scientific discoveries.
Traditionally, the laws of physics are considered to be perfect, immutable, transcendent, infinitely precise mathematical relationships, imprinted on the universe from without at the moment of its birth. I claim that this is an idealized fiction. Instead the laws are more like computer software: programs being run on the great cosmic computer. They emerge with the universe at the big bang, and are inherent in it, not stamped on it from without like a maker's mark. Man-made computers are limited in their performance by finite processing speed and memory. So too, the cosmic computer is limited in power by its age and the finite speed of light. Seth Lloyd has calculated how many bits of information the observable universe has processed since the big bang. The answer is one followed by 122 zeros. Crucially, however, the limit was smaller in the past because the universe was younger. Just after the big bang, when the basic properties of the universe were being forged, its information capacity was so restricted that the consequences would have been profound. If a law is a truly exact mathematical relationship, it requires infinite information to specify it. In my opinion, however, no law can apply to a level of precision finer than all the information in the universe can express. In the first split second of cosmic existence, the laws must therefore have been seriously fuzzy. Then, as the information content of the universe climbed, the laws came into focus and homed in on the weird, life-encouraging form we observe today. Thus the "wiggle room" in the laws was enough for the universe to engineer its own bio-friendliness. Thus, three centuries after Newton, symmetry is restored: the laws explain the universe even as the universe explains the laws. If there is an ultimate meaning to existence, as I believe is the case, the answer is to be found within nature, not beyond it.
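As a rough back-of-the-envelope illustration of this kind of counting (an assumption-laden sketch added here, not a calculation taken from the talk): if the information capacity of the observable universe is bounded by its horizon area in Planck units, it grows with cosmic age roughly as

\[
I_{\max}(t) \sim \left(\frac{t}{t_P}\right)^{2},
\qquad
I_{\max}(t_{\mathrm{now}}) \sim \left(\frac{4.3\times 10^{17}\ \mathrm{s}}{5.4\times 10^{-44}\ \mathrm{s}}\right)^{2} \approx 10^{122}\ \mathrm{bits},
\]

where \(t_P\) is the Planck time. The same scaling gives only of order \(10^{42}\) bits at \(t \sim 10^{-22}\ \mathrm{s}\), which conveys the sense in which any law whose exact specification would require more bits than the young universe could hold must then have been "fuzzy".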
Since the first days of the Mechanical Philosophy in the Scientific Revolution, western science and conventional Christian theology have adopted a view of Nature as totally ordered, despite surface appearances, under the deterministic laws of physics. This image was hardly disturbed even by quantum mechanics; now it is just probabilities, if not single events, that evolve deterministically. In the last twenty years, however, this picture of total order has been breaking down, in a quiet, unheralded revolution. The breakdown does not come in one fell swoop, from some cataclysmic shift in fundamental science, but rather is seen through careful, detailed studies of practices and techniques, successes and failures across the sciences, from geology to biology to economics and in physics itself. From discipline to discipline, those who study how science works and what its successes can teach about Nature itself are replacing the image of a Nature totally ordered under immutable law with an image of a Nature where laws are changeable and contingent, in which total order does not lie waiting to be discovered beneath the surface of our perceptions but rather must be created. This talk reviews work across the disciplines in which this breakdown is made visible and raises some serious theological questions that emerge: what new understanding of the relation between God and man, and of man's responsibilities, must we have in a universe where order is not guaranteed but in which people can create and destroy it?
This paper proposes a relation of mutual (albeit asymmetrical) implication between the natural sciences and theology, a relation deriving from the fact that neither science nor theology is or can ever be philosophically neutral. The proposed mutual-asymmetrical implication is direct, by way of a relation that is not deductive (in either direction) but (intrinsically) analogical. Although the proposal is that science also has direct implications for theology (1), the paper focuses primarily on the sense in which theology (Christian faith), via ontology, bears direct implications for the observation and theory-construction characteristic of science. The argument of the paper is developed in terms of Christian ontology, whose hallmark feature is taken to be the claim that being is gift. Following a brief discussion of the meaning of this claim (in light of the metaphysics of Thomas Aquinas and also Vatican II's teaching regarding the "imago Dei," especially as developed in the pontificates of John Paul II and Benedict XVI), the paper looks at the question of the relation between science and theology in terms of "abstraction." The distinction between the natural sciences and theology turns on the nature of abstraction—of what it means for reason to "pull or take out from" existing beings one specific aspect, for example, spatial and temporal extension insofar as this can be measured by appropriate instruments. It is often said that the problems of positivism and scientism (the presumption that rationality coincides with the empirical sciences) and reductionism (the presumption that all "higher" aspects or kinds of being can be "analyzed down" to physical properties conceived mechanistically) result from a failure to observe the limits specified by a science's distinct mode of abstraction. Such problems would be avoided, so the argument runs, were it recognized that science does not claim to exhaust the intelligibility of an object in the integrity of its existing being—were physics or biology thus to remain just physics or biology and not venture onto the terrain of philosophy or theology. While recognizing the important sense in which this is of course true, the paper attempts to show that such an argument, nevertheless, is still governed by an idea of abstraction needing further differentiation, and just so far instantiates a petitio principii. The argument commonly presupposes an "additive"—or purely external—relation between the aspect of being that is "taken out" for consideration in a given science and those aspects of being that for "disciplinary" purposes are left aside or not "taken out." The present paper argues instead for an internal relation between x and non-x, insisting that failure to recognize this internal relation results in what will remain "softer" versions of positivism and reductionism. It is, again, often said that philosophy and theology consider the (original) "that" and "why" of things, while science considers the "how." The paper attempts to show that philosophy/theology and science on the contrary both imply—within the disciplinary abstraction proper to each—judgments about the "that," the "why," and the "how," though each discipline bears these judgments in its own way. The paper attempts to show the sense in which this is so, in terms of the ontological claim that being is gift, and concludes by illustrating the significance of the paper's proposal relative to some scientific claims made in the name of disciplinary abstraction.
(1) In an asymmetrical way that will be clarified but not extensively developed.
2nd session - Reason
The plate tectonic revolution occurred between 1966 and 1968, when this model replaced all previous tectonic models. Scientists who had been living for years with a fixed-Earth model suddenly found themselves dealing with a highly mobile and changing Earth. The drift of the continents was first proposed by Alfred Wegener in 1912. But Continental Drift was rejected by the scientific community in the 1920s because it could not account for most of the observations then being made by geophysicists. In 1960, Harry Hess began to float a new hypothesis, "Sea Floor Spreading", which explained how the renewal of the ocean floor could account both for the drift of the continents and for the recent discovery of the young age of the oceans. It was the time during which the floor of the oceans was first thoroughly investigated through a major financial effort of the US Navy. Sea Floor Spreading was the product of this systematic exploration, which demonstrated that the ocean floors were much younger than the crust of the continents. Earth scientists had finally decided to look overboard into the oceans. They thus discovered that their continental boat was indeed moving.
I lived this revolution from the inside, as an actor. I joined the Lamont Geological Observatory in the United States in 1959 to participate in a world-circling cruise of the RV Vema, in order to test the continuity of the Rift Valley of the Mid-Oceanic Ridge around South Africa into the Indian Ocean and then the Pacific Ocean. At the same time, the necessity of detecting nuclear tests led to the first modern seismic observatory network and to the discovery of what would later be called plate boundaries. These two new discoveries were brought together by the plate tectonic model, and their compatibility was first put into evidence in the first global plate kinematic model, which I published in 1968. The surface of the Earth was covered with a few plates that moved with respect to each other. Earthquakes occurred at the boundaries of the plates and were due to their relative motions. One could measure these motions and relate them to the seismicity and tectonics which they induced.
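As an illustrative aside (not part of the abstract): in a global plate kinematic model of this kind, each rigid plate rotates on the sphere about an Euler pole, so the magnitude of the relative velocity between two plates at a given point depends only on their relative rotation rate and the point's angular distance from the pole, roughly

\[
v = \omega\, R_{\oplus} \sin\Delta ,
\]

where \(\omega\) is the relative angular velocity, \(R_{\oplus}\) the Earth's radius, and \(\Delta\) the angular distance from the rotation pole. Quantities of this kind are what can be compared with the seismicity along plate boundaries and, later, with geodetic measurements.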
It is important to realize that when the model was adopted through a broad consensus of the scientific community, there was no actual demonstration of the validity of plate tectonics. That came in the next twenty years, with deep-sea drilling and the geodetic measurement of plate velocities. In 1968, the model was nearly unanimously adopted because its explanatory and predictive power was immensely superior to that of previous models. Yet there were still, at the time, major observations that were unexplained by the plate tectonic model. I will especially discuss this point, with which I was confronted and which I believe to be very important to recognize. For there are always scientists, such as the famous Sir Harold Jeffreys at the time, who think that if any evidence conflicts with the model, the scientific attitude is to look for a new idea that may reconcile observations and theory. As a result, Jeffreys never accepted plate tectonics, although he lived until 1989. This attitude, if widely adopted, would have delayed the adoption of plate tectonics by the scientific community by more than twenty years.
What, then, is the meaning of truth within such a context? Experimental science is pragmatic. It uses the framework that best explains the observations and best predicts the results expected. In the case of plate tectonics, the judgment on whether the new theory passed this test depended on the disciplines to which the scientists involved belonged. We were not looking for truth but for efficiency in dealing with the Earth, given the results available at the time. In the end, nearly the whole of the scientific community adopted this theory because of its overall efficiency, although a few major observations still appeared to remain unexplained or even to contradict the theory. Clearly, we used reason to elaborate this new chapter of our science, but we never claimed that we were looking for truth.
Results going back to Turing and Gödel provide us with limitations on our ability to decide the truth or falsity of mathematical assertions in a number of important mathematical contexts.
There are two kinds of such limiting results that must be carefully distinguished. Results of the first kind state the nonexistence of any algorithm for determining whether any statement among a given set of statements is true or false.
Results of the second kind are much deeper and represent much greater challenges. They point to a specific statement A, among a given set of statements, where we can neither prove nor refute A using accepted principles of mathematical reasoning.
We give a brief survey of these limiting results. These include limiting results of the first kind from number theory in mathematics and from idealized computing devices in theoretical computer science.
The highlight of the talk is a discussion of limiting results of the first kind in the context of simplified physical systems, and a discussion of limiting results of the second kind. The simplified physical systems involve a small number of bodies operating in a potentially infinite, one-dimensional, discrete space-time.
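To make the first kind of limiting result concrete, here is a minimal Python sketch of Turing's classic diagonal argument (an illustration added here, not material from the talk): any candidate procedure that claims to decide halting can be handed a program built to do the opposite of whatever the procedure predicts about it.

```python
# Sketch of Turing's diagonal argument: no function halts(program, data) can
# correctly decide, for every program and every input, whether it halts.

def make_counterexample(halts):
    """Given any candidate halting decider, build a program that defeats it."""
    def diagonal(program):
        if halts(program, program):   # the decider predicts: halts on itself
            while True:               # ...so loop forever instead
                pass
        return "halted"               # the decider predicts: loops, so halt at once
    return diagonal

# Any concrete candidate is wrong about the diagonal program applied to itself.
def always_yes(program, data):
    return True                       # a hopeless decider that always answers "halts"

diagonal = make_counterexample(always_yes)
# Evaluating diagonal(diagonal) would loop forever, contradicting always_yes's verdict;
# a decider that answered "loops" would be contradicted in the opposite way.
```

A related self-referential construction, carried into arithmetic, underlies the deeper results of the second kind.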
The rapid, perplexing increase in the incidence of autism has led to a correlative increase in research on it. The most salient feature of autism is now thought to be its severe impairment in what psychologists call 'social cognition', or what some philosophers call 'mind-reading', namely, the knowledge of persons and their mental states. Autism's deficits as regards social cognition or mind-reading have also made researchers increasingly aware of what normally developing children (and adults) can do effortlessly, and the new research has done a great deal to illuminate the nature of social cognition or mind-reading. These new studies about the knowledge of persons illuminate the nature of a second-person experience or perspective, and the results are highly suggestive for more than one area of philosophy, including not only epistemology but also the philosophy of art.
The Enlightenment is often described as "the Age of Reason", but in fact it often (in David Hume, for instance) limited the role of Reason to being "the slave of the passions". Secularism is largely the consequence of this limitation, and might more properly be called "strong evidentialism", the axiom that all beliefs must be proportioned to available empirical evidence. Reason was given a more positive role by theologians like Anselm and by philosophers like Hegel. For them, it was an objective intelligibility and beauty discernible by intellect. Empirical evidence is always necessary, but never sufficient, for the construction of a rational worldview. This paper argues against strong evidentialism, and for a stronger view of reason as the creative construction of a coherent, comprehensive, and plausible metaphysics. The intelligible cosmos disclosed by science is part of such a metaphysics. But equally important is the axiological dimension of value, purpose and meaning, found in distinctively personal experience. When this is taken into account, so I argue, religion can assume a proper and rational place in a comprehensive metaphysics of human experience. My paper will attempt to locate science, as a rational enterprise, within a broader notion of reason that also allows for rationality in morals, value theory and religion - and which may in turn illuminate the scientific concept of reason.
3rd session - Truth
It is often assumed that the way to truth is a linear process starting from observations, measurements or evident principles and proceeding to a logical conclusion. This way is generally implicit in books on science and theology. Examination of how scientists and theologians actually work shows that, on the contrary, truth is usually obtained through the convergence of many different indications, none of them conclusive on its own, but all pointing in the same direction. Newman called this way of knowing the illative sense. This way to truth will be illustrated by several examples from science and theology.
The aim of this lecture is to give an overview of several different notions of the concepts of truth and proof in mathematics. This includes the two main directions of 'Platonic realism' and 'Formalism', with some variants, and other views such as 'Intuitionism', empiricism and quasi-empiricism, Field's fictionalism, and social constructivism and realism. The lecture concludes with remarks on the notion of proof, including very recent progress obtained by computer scientists for understanding the overall notion of complexity of proof checking, and finally with some personal reminiscences and remarks on the subject.
Here is a series of statements that illustrates a way in which many philosophers think about truth and falsity. The proposition (thesis, belief) that the Earth goes round the sun is about certain objects, the Earth and the sun; the proposition says that these objects are related in a certain way (the former goes round the latter); it is true because its objects, the things it is about, do stand in the relation it says they stand in. The proposition may thus be said to be objectively true, since the status "true" is conferred on it by the way things stand with its objects. In a similar way, the proposition that the sun goes round the Earth is objectively false—and was objectively false even when most people believed that the sun went round the Earth. The objective truth or falsity of the things that human beings happen to believe or disbelieve is thus in one very obvious sense independent of whether those things are believed or disbelieved; it is in fact independent of the existence of human beings and their mental states: even if there had never been any human beings—or any other sapient creatures—it would still be true that the Earth went round the sun and false that the sun went round the Earth. Philosophers who think of truth and falsity in the way this series of statements illustrates are today called (metaphysical) realists. But not all philosophers are realists. There are also (metaphysical) anti-realists, who hold that, in some important sense, the very existence of truth and falsity depends on the cognitive activities of human (or other sapient) beings.
The phrase "objective truth" occurs in political as well as in philosophical writing. The phrase is central to the thought of one of the greatest of the political writers of the last century, George Orwell. Orwell's views on what he calls "objective truth" are presented in very stark and dramatic fashion in 1984, and particularly in the "debate" about truth that is the climax of the novel.
This paper asks what the relation is between these two debates—the current debate among philosophers in the universities about "objective truth" and the fictional debate between Winston Smith and O'Brien in the cells of the Ministry of Love about "objective truth." It will be argued that the obvious similarity between the two debates is not merely verbal, and that the very same concept of "objective truth" is at stake in both. In the end, philosophical anti-realists cannot evade this question: How does your position differ from O'Brien's?
I begin with the natural desire for truth and the natural belief that the natural desire for truth is satisfiable. This belief is consistent with our evidence, but is not demonstrated by evidence. It requires basic trust in our belief-forming faculties and many of our emotions. I then argue that the exercise of the capacities we trust makes both strong and weak forms of epistemic egoism inconsistent, and I argue that the ideal of epistemic autonomy is incoherent. I then argue that trust that the natural desire for truth is satisfiable leads to trust that other natural desires are satisfiable, in particular, the natural desire for connectedness to the universe. Belief that this desire is satisfiable has important implications for the sorts of beliefs we accept as reasonable.
CONVENOR:
Charles L. Harper, John Templeton Foundation Vice President
CO-CHAIRS:
Marco Bersanelli, Astronomy and Astrophysics, Physics Department, University of Milano
Linda Zagzebski, Department of Philosophy, University of Oklahoma, USA
SCIENTIFIC COMMITTEE:
Tommaso Bellini, Department of Medical Biotechnology and Translational Medicine, University of Milano
Marco Bersanelli, Physics Department, University of Milano
Charles Harper, John Templeton Foundation Vice President
Giorgio Petroni, University of San Marino
Elio Sindoni, CEUR Foundation
ORGANIZING COMMITTEE:
Marco Aluigi, Meeting for Friendship Amongst Peoples
Tonino Ceccoli, Euresis Association
Hyung S. Choi, John Templeton Foundation
Donatella Pifferetti, Euresis Association
Nicola Sabatini, Euresis Association