This book aims to provide scientists and engineers, and those interested in scientific issues, with a concise account of how the nature of scientific knowledge evolved from antiquity to a seemingly final form in the Twentieth Century, a form that now strongly limits the knowledge people would like to gain in the Twenty-first Century. Some might think that such issues are of interest only to specialists in epistemology (the theory of knowledge); however, today’s major scientific and engineering problems—in biology, medicine, environmental science, etc.—involve enormous complexity, and it is precisely this complexity that runs up against the limits of what is scientifically knowable. To understand the issue, one must appreciate the radical break with antiquity that occurred with the birth of modern science in the Seventeenth Century, the problems of knowledge and truth engendered by modern science, and the evolution of scientific thinking through the Twentieth Century. While the book was originally aimed at practicing scientists and engineers, it is my hope that it can provide a generally educated person with a basic understanding of how our perspective on scientific knowledge has evolved over the centuries to escape pre-Galilean commonsense thinking. Such an appreciation is not only beneficial for one’s general education, but is also important for non-scientists who must teach young students or make policy decisions in government or business.

Physicist and historian Gerald Holton states the dilemma faced by many: "By having let the intellectuals remain in terrified ignorance of modern science, we have forced them into a position of tragic impotence; they are, as it were, blindfolded in a maze through which they feel they cannot traverse. They are caught between their irrepressible desire to understand the universe and, on the other hand, their clearly recognized inability to make any sense out of modern science" (Holton, 1996). Perhaps this small book can help some make sense of modern science and the crisis of complexity that will bedevil the Twenty-first Century.

Except for the last chapter, mathematics has been avoided, and even in that chapter it has been kept minimal, the only exception being Section 7.6, which requires some details of the Wiener filter; these are provided. Biological networks are used to illustrate complexity issues, but these are kept mainly at the descriptive level.

Beyond the general issues that have interested me since I first encountered them in my genomic research, the immediate motivation behind the book comes from three sources. First, for several years I have been giving lectures on the “Foundations of Translational Science,” which, as the name suggests, concerns the translation of scientific knowledge into practice. The term is popular in medicine; more generally, it refers to modern engineering. The lectures place the problems of computational biomedicine into the framework of classical scientific knowledge and consider the problems of large-scale modeling in medicine. The audience has consisted of Ph.D. students, postdoctoral researchers, and faculty. I have successively added more historical development of scientific epistemology because the audience always asks for more. This book provides it.
Second, in 2011, my colleague Michael Bittner and I published the book Epistemology of the Cell: A Systems Perspective on Biological Knowledge, which discusses epistemological problems relating to cellular biology, with emphasis on biomarkers and network models in genomic medicine (Dougherty and Bittner, 2011). That book has some historical and philosophic background but, as it has turned out, not a sufficient amount for the large number of contemporary students who have virtually no background in the philosophy of science. The current book rectifies that problem, focuses on science and engineering more generally than on cellular biology, includes an extensive discussion of the emerging complexity problems, and puts forward ideas on how one might begin to address these problems in translational science.

Third, in the summer of 2015 I attended a small workshop in Hanover, Germany, entitled How to Build Trust in Computer Simulations—Towards a General Epistemology of Validation. The workshop brought together researchers from different fields who were interested in the emerging crisis of scientific knowledge. It was apparent that the issues I had been grappling with were ubiquitous across science, economics, engineering, and social science. The discussions in Germany stimulated my thinking. This was accentuated when, upon giving a lecture at the University of Munich, I was asked to contribute a chapter to a forthcoming book on epistemology, with the idea of speculating on how to deal with model complexity from the perspective of validation and data in the context of translational science (Dougherty, 2016). Those speculations, which have developed since last summer and have reached a plateau, are discussed in the last chapter of the book, with applications to biomedicine, pattern recognition, and signal processing.

The book is short, about one hundred and fifty pages. This is intentional because the goal is to succinctly and cohesively cover the points necessary for one to grasp the meaning and structure of scientific thinking, and then to engage the current crisis of validation. These are exciting times for a scientist (or anyone) interested in fundamental problems of complex systems. Just as physicists in the first half of the Twentieth Century had to squarely confront the unintelligibility of Nature, today’s scientist must confront the virtual impossibility of reconciling the desire to model big systems with small data within the context of existing scientific epistemology. The profound question for scientists in the Twenty-first Century is this: Is it possible to weaken scientific epistemology and broaden the domain of science without destroying it?

Edward R. Dougherty
College Station, Texas
October 2016