As disciplines, what would you say are the most significant ways in which philosophy and science differ?
Science tells us what the world is like; philosophy tells us why we should take seriously the claim that the world is the way science says it is. Strictly speaking, however, the difference between philosophy and science, and hence the relation between the two, is a philosophical problem. There are two extreme views: naturalism and apriorism.
On the apriorist view, philosophy is fully distinct from science, both in its methods and its subject matter. The method of philosophy is taken to be conceptual analysis, and its subject matter provides firm foundations for science and human practice. On the naturalist view, philosophy is continuous with science both in its methods and its subject matter: philosophy deals with questions that arise at the theoretical end of science and relies on the very same methods that scientists use. These two views of the relation between science and philosophy are exemplified in the tree image of knowledge that René Descartes put forward in the 1640s and in the mariner-at-sea image put forward by Otto Neurath in the 1930s.
On Descartes’ view, metaphysics (or ‘first philosophy’) is the roots of the tree of knowledge, physics its trunk, and the various sciences its branches. Philosophy—for him a purely a priori discipline—makes possible the scientific knowledge of the world and sets constraints on the theoretical and empirical models of the world. On Neurath’s view, our scientific account of the world is all there is. He invited us to think of this account as a boat at sea, with no dry land upon which the boat could be docked and inspected when problems arise. All we can do is repair the boat while keeping it afloat, fixing it piece by piece.
In my own view, the correct position is somewhere in the middle. There is no dry dock from which to view our scientific image of the world from the outside and fix its problems. But philosophy does not thereby lose its autonomy and significance. Philosophy, I think, plays two distinct roles regarding science: explicative and critical.
In its explicative role, philosophy aims to explicate (that is, to render more precise and more definite) the various concepts that are employed by science in general and by the various sciences in particular—and hence, to specify their common content as well as their differences and relations. In its critical role, philosophy aims to criticise the various conceptions of science as well as the various ways of presenting science, its methods, and its aims. A key object of the critical function of philosophy, for instance, is to disentangle the part of scientific theories that is up to us from the part that is up to the world; in other words, to disentangle the contribution of the mind and the contribution of the world in our scientific image of reality.
“The critical function of philosophy is to disentangle the contribution of the mind and the contribution of the world in our scientific image of reality”
Importantly, the various sciences offer us perspectives on reality. They conceptualise the world by means of different structures of concepts. Philosophy offers the space in which the various images of the world provided by the individual sciences are fused together into a stereoscopic view of reality. Philosophy offers a more global (but not absolute) perspective on reality—for seeing the whole picture. Even if there is no way to put together a coherent and unified image of the world—even if, that is, the scientific image is characteristically disunified and disconnected—this can be ‘seen’ only within philosophy.
Can you give a sense of the type of questions that the philosophy of science is concerned with?
Broadly speaking, there are four kinds of questions that philosophy of science deals with: metaphysical, epistemic, conceptual, and practical.
On the metaphysical front, the key issue concerns the implications of the scientific image of the world for the basic ontological categories of the natural (and social) world. For instance, are there laws of nature? What does causation consist in? Are there natural kinds and properties? Do things in the world happen by necessity? Do worldly objects possess causal powers? Are there mechanisms that generate or support various functions and behaviours?
On the epistemic front, the key issue is the epistemic credentials of science and, in particular, the status of scientific knowledge. Science relies on theories, hypotheses, and principles which, typically but not invariably, go beyond the observable aspects of the world and describe it as possessing a hidden-to-the-senses causal-explanatory structure. How are these theories supported or licensed by the evidence? How seriously should we take the scientific image of the world to be true, or truth-like, if we are to have a just view of science?
On the conceptual front, the key issue is the ways in which scientific theories represent the world, as well as the conditions of representational success. Science represents the world via theories, and theories employ a number of representational media, from language to models to diagrams and so on. Scientific theories employ, almost invariably, idealisations and abstractions in representing natural phenomena. How do scientific concepts acquire their content? How is it best to understand the conceptual connections between theories? How is it best to evaluate the representational content of theories? How is it best to understand the relation of theories to experience and experiment?
On the practical front, there are a number of issues having to do with ethical, social, and other practical problems. Science is far from value-free, and the investigation of the place, role and function of values in science has been an important element of our scientific thinking. Values do not function as methods do, yet they are constitutively involved in scientific judgements and in theory-choice and evaluation in science. Feminist approaches to science have played a key role in uncovering various cognitive and social biases and have promoted the image of a socially responsible science. Issues about the ethics of science, the structure of scientific research, risk-analysis, and the role of science (and of the scientists and the scientific institutions) in policy-making have acquired prominence.
“Science relies on theories, hypotheses and principles which, typically but not invariably, go beyond the observable aspects of the world”
In practice, all these fronts are intertwined. In my view, a proper engagement with philosophy of science should deal with all four kinds of issues and have a historical dimension, too. Philosophy of science has a rich history, the understanding of which (apart from the intellectual worth it has in its own right) can help us have a better view of the significance of current approaches to science and of attempted solutions to perennial problems.
It is occasionally remarked that the higher the complexity of any particular area of science, the more it strays into philosophical territory. Do you think this is true? What need does an individual scientist have for philosophy?
I certainly think this is true. Complexity arises, typically, when a theory is extended to cover new phenomena and the foundations upon which it was built become shaky. This, incidentally, is the view Einstein himself had. He famously said that in periods of normal science, the scientist might well leave the philosophising about science to the philosopher. But when the stakes are higher, that is, when:
. . . experience forces us to seek a newer and more solid foundation, the physicist cannot simply surrender to the philosopher the critical contemplation of the theoretical foundations; for, he himself knows best, and feels more surely where the shoe pinches.
The way I read Einstein is that scientists have an important theoretical reason to be actively engaged in the philosophical scrutiny of science when science seems to be in trouble.
A key task for philosophy vis-à-vis science is to create—or to contribute to the creation of—the very conceptual framework through which scientific theories represent the world. Einstein suggests that this philosophical task requires the active engagement of scientists with philosophy: it cannot be successfully performed unless scientists are engaged in philosophy. All this does not imply that scientists should become philosophers, or the converse. But it calls for osmosis between the two distinct perspectives.
“Scientists have an important theoretical reason to be actively engaged in the philosophical scrutiny of science when science seems to be in trouble”
This osmosis was the hallmark of major past scientists and philosophers such as René Descartes, Isaac Newton, James Clerk Maxwell, Hermann von Helmholtz, Henri Poincaré, and Jean Perrin—to name but the most notable cases. Philosophy gives scientists the conceptual tools to reflect on their theories and practices, to question entrenched assumptions and presuppositions, and to defend the achievements of science.
Your first choice is Understanding Philosophy of Science by James Ladyman. Can you tell me about this introduction and why you have chosen it?
A good introduction is like a good appetiser in a meal. It’s meant to get you excited about what lies ahead, and it should prepare your senses for the main course. James Ladyman’s introduction is an excellent appetiser to a full meal of philosophy of science. But it can also stand on its own as a substantial main course. What makes this book stand out is, on the one hand, the clarity with which it is written and, on the other hand, its in-depth coverage of issues not normally treated in general introductions to philosophy of science.
“James Ladyman’s introduction is an excellent appetiser to a full meal of philosophy of science”
The book covers standard material that a novice or beginner should understand: the problem of the description and justification of scientific method (with particular emphasis on the justification of induction); Karl Popper’s falsificationism and its problems; Thomas Kuhn’s account of scientific revolutions; the relation between theory and observation. But it goes on to keep the reader up to speed with the intricate recent debates concerning scientific realism. The scientific realism debate is a key controversy concerning science in general. Roughly put, the question is whether there are good reasons to take science to be ‘on the right track’; to have latched onto reality. Scientific realism is the view that mature and predictively successful scientific theories are (approximately) true of the world; hence, the entities they posit (or entities like the ones posited) are part of reality.
This optimistic view of science has been challenged in many ways. For instance, it is argued that empirical evidence systematically underdetermines theories, hence it is impotent in turning the evidential balance in favour of one theory. Or it is argued that the history of science is full of theories that were once empirically successful and yet were abandoned as false later on. It is then concluded that current theories will be abandoned as false in due course, despite their empirical successes.
In the face of challenges such as the above, realists have retreated to weaker positions such as selective realism (only some parts of the theory, those that ‘fuel’ the empirical successes of the theory, get credit from these successes), structural realism (the theory gets the structure of the world right) and others. These (and other) developments, though utterly significant for understanding science’s relation to the world, are not treated in many recent textbooks. Ladyman’s book is a very welcome exception.
Next on your list is The Scientific Image by Bas van Fraassen. This is a hugely significant work arguing against scientific realism. This debate largely centres on the status of “unobservable entities”: entities that do explanatory work in science and yet are not empirically detectable. An example of this might be the electron or the quark. Can you outline why van Fraassen contests scientific realism?
Van Fraassen’s position is subtle. He does not deny that unobservable entities exist. Rather, he says that one need not believe in their existence in order to have a reasonable view of science and its practice. His anti-realism is based on the empiricist tenet that belief should be constrained by what is observable and actual and amounts to a kind of agnosticism about the existence of unobservables.
He therefore defends what he calls ‘Constructive Empiricism’ in opposition to scientific realism. His account of scientific realism is in my view somewhat idiosyncratic, since he takes scientific realism to involve two theses: one axiological and another doxastic (about belief). The axiological thesis says that theories aim at truth; the doxastic thesis says that acceptance of a theory involves belief in its truth.
He takes Constructive Empiricism, by contrast, to say that theories aim at empirical adequacy, and that acceptance of a theory involves belief only in its empirical adequacy (though he adds that acceptance involves more than belief, viz., commitment to use the theory to interpret the worldly phenomena). I said that this way of viewing the realism debate is idiosyncratic, since one can be a scientific realist or a constructive empiricist without thinking that theories have achieved their respective aims.
Be that as it may, van Fraassen’s key insight is that an empiricist should set limits to what is accepted on the basis of experience, and since he thinks that unobservable entities are beyond experience, belief in them should be “supererogatory”. His key argument is that the extra risk that realists seem to take by believing in the reality of unobservable entities is illusory, since the theory can only be proved wrong by showing that it is empirically inadequate. In any case, he adds, the claim that a theory is empirically adequate (i.e., that it saves all phenomena) is always more probable than (or at least as probable as) the claim that the theory is true.
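To spell out that last point: the truth of a theory entails its empirical adequacy (a true theory saves all phenomena), so elementary probability theory guarantees that

$$P(T \text{ is empirically adequate}) \;\geq\; P(T \text{ is true}),$$

and the belief van Fraassen recommends is therefore never a riskier bet than the belief the realist recommends.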
You are one of the most prominent contemporary defenders of scientific realism. Where, in your view, does van Fraassen go wrong?
I think a key problem with van Fraassen’s view is that it is inherently unstable. To see this, we have to reflect a bit more on the notion of empirical adequacy. A theory is empirically adequate if and only if it saves all phenomena, past, present, and future. Now, this is a no less utopian aim than proving the theory true, since at any given moment of time, scientists have only a finite amount of data available. Hence, even if these data do not refute the theory, they are far from proving that the theory is empirically adequate. The claim that a theory is empirically adequate is already ‘inflated’ vis-à-vis the available data, which show at most that a theory is unrefuted.
But why go for belief in empirical adequacy as opposed to belief in truth? If the argument is that the former is epistemically safer than the latter, then this makes Constructive Empiricism unstable: the epistemic safety principle, if sensible at all, makes safer still the even weaker belief that the theory is unrefuted (as opposed to the stronger belief that the theory is empirically adequate). Empiricism could be stricter than constructive empiricism: it could claim that the aim of science is to produce unrefuted theories. Constructive Empiricism is more liberal than this, but in being so, it sets boundaries to what can be known that go beyond what a strict version of empiricism would allow, viz., that only what has been experienced can be known. But then, there is no logical obstacle to setting the boundaries a little higher, as realists demand.
The issue of observability has drawn considerable attention among philosophers of science. There is a famous argument, by Grover Maxwell, to the effect that all entities are observable under suitable circumstances. Maxwell’s point was that ‘observability’ is best understood as ‘detectability through or by means of some instrument’. Now, van Fraassen takes it that an entity is observable if it could be observed by a suitably placed observer. This claim is modal (‘could be observed’), but it is not clear how the modality is to be understood.
Are dinosaurs, for instance, ‘observable’ even if their observation would require time travel? And are sun spots ‘observable’ even if, strictly speaking, no one could be close enough to the sun to observe them? To be sure, van Fraassen claims that observability concerns empirically discoverable facts about humans qua organisms in the world. But even if we were to grant that there is an empirically discoverable divide between observable entities and unobservable ones, we are still left with a question: why should the observable/unobservable distinction capture the border between what is epistemically accessible and what is not?
“Are dinosaurs, for instance, ‘observable’ even if their observation would require time-travel?”
What van Fraassen has failed to establish is that the boundaries of experience should include only claims about unobserved-yet-observables and that they ought to exclude all claims about unobservables. In fact, there is a venerable empiricist tradition, exemplified by Hans Reichenbach and Wesley Salmon, according to which an empiricist epistemology can lead to accepting the reality of unobservable entities, based on suitable ampliative methods, without thereby abandoning empiricism.
Constructive empiricism is not the only available anti-realist conception of science. Can you characterise the other forms of anti-realism? Which would you say gives the strongest case?
It’s interesting that the key anti-realist claim has been based on some kind of epistemic dichotomy—that some realms of being are cognitively impenetrable by us. The dichotomy has been, by and large, vertical: there is something epistemically suspicious about the unobservable per se, or about some aspects of the unobservable. As we have seen, for Constructive Empiricism, the epistemic dichotomy is drawn quite sharply along the line of the observable/unobservable distinction. What’s worth noting is that subsequent forms of anti-realism were also weak realist positions, since the dichotomy is now drawn within the realm of the unobservable, thereby allowing that there is epistemic access to some unobservable parts of reality.
The two most promising, though ultimately unsuccessful, anti-realist views are Kyle Stanford’s neo-instrumentalism and Derek Turner’s historical hypo-realism. In the latter, the dichotomy is between the past and the tiny. Turner claims that we can know more about the tiny than the past; hence, it is safer to be a scientific realist about the tiny unobservables, such as electrons. He bases his claim on a distinction between a unifier (an entity that plays a unifying role) and a producer (an entity that can be manipulated to produce new phenomena), and argues that past (un)observables (like dinosaurs) can at best be unifiers, whereas tiny unobservables can be producers, too.
“Turner claims that we can know more about the tiny than the past; hence, it is safer to be a scientific realist about the tiny unobservables, such as electrons”
In Stanford’s case, the epistemic dichotomy is between those entities to which there is an independent route of epistemic access (mediated by theories that cannot be subjected to serious doubt) and those entities to which all supposed epistemic access is mediated by high-level theories. Kyle Stanford takes it that the former are epistemically accessible, while the latter are impenetrable. High-level theories are taken to be useful conceptual tools for guiding action rather than maps of a reality unavailable to the senses. Part of Stanford’s motivation is the claim that, given the past record of science (especially when it comes to high-level theories), it is likely that the truth lies in the space of currently unconceived alternatives to extant scientific theories.
But, for one, this kind of argument neglects the fact that as science grows, the space of unconceived alternatives is constrained and restricted by what we already know; that is, by well-established scientific theories. For another, the problem of the existence of unconceived alternatives is a general problem for epistemology. Given that there is no deductive link between evidence and theory, it is always possible that current theories are false, and hence that there are unconceived alternatives to them. This gives rise to the issue of under what conditions we are entitled to talk about justification and knowledge, which is a general epistemological problem to be dealt with independently of the realism debate.
Let’s move on to your third choice. This is Peter Lipton’s Inference to the Best Explanation. Tell me about this one.
Peter Lipton (1954–2007) was an exceptionally talented philosopher of science. His book Inference to the Best Explanation is a model of lucidity, rigorous argumentation, and philosophical depth. It came out in 1991, with a second edition in 2004 that contained substantial new material.
Inference to the Best Explanation (IBE) is a pervasive mode of inference (or reasoning) in science. The key idea is that ‘best explanation’ is a guide to truth. It is related to what the American Pragmatist philosopher Charles Sanders Peirce called “abduction”. This is a reasoning process which proceeds as follows: “The surprising fact C is observed. But if A were true, C would be a matter of course. Hence, there is reason to suspect that A is true”.
IBE is taken by scientific realists to be the way in which scientists form beliefs and accept hypotheses about unobservable entities. For instance, the best explanation of the macroscopic behaviour of gases is that they are composed of atoms. In fact, the most basic argument for realism itself is an inference to the best explanation: that scientific theories are (approximately) true is the best explanation of their predictive successes. But IBE has been notoriously hard to formalise and to defend (or justify). Many philosophers ask: what does explanation have to do with truth?
“The most basic argument for realism itself is an inference to the best explanation: that scientific theories are (approximately) true is the best explanation of their predictive successes”
Lipton attempted to answer this question by distinguishing between loveliness and likeliness. Loveliness is a function of the explanatory qualities of a hypothesis; that is, how simple, comprehensive, unified and natural it is. Likeliness has to do with how likely a hypothesis is. Hence, Lipton unravels two facets of IBE: inference to the Loveliest Explanation and Inference to the Likeliest Explanation, where the loveliest explanation is one which would, if true, be the most explanatory or provide the most understanding.
In effect, Lipton’s strategy has been to impose two types of filter on the choice of hypotheses. One selects a relatively small number of potential explanations of the evidence as plausible, while the other selects the best among them as the actual explanation. Both filters operate on the basis of explanatory considerations. That is, both filters should act as explanatory-quality tests. Then, he argued that the loveliness of an explanation is a symptom of its likeliness. Hence, explanations that are lovely will also be likely. But what guides the inference is the loveliness (explanatory power) of an explanation.
He was aware, though, that this was not the end of the story. What he called the ‘problem of matching’—the extent of the match between loveliness and likeliness—is still with us. In the second edition of the book, Lipton made an extra effort to reconcile IBE with Bayesianism—that is, the view that scientific inference is probabilistic and modelled by a famous theorem in the theory of probability, known as Bayes’s theorem. This has proven to be a very fruitful area of research that flourished after Peter’s untimely death.
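For readers who have not met it, Bayes’s theorem in its simplest form relates the probability of a hypothesis H given evidence E to the prior probability of H and the likelihood of E under H:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

On the Bayesian picture, scientific inference is a matter of updating from P(H) to P(H | E) as evidence comes in; the question Lipton and others have pursued is where explanatory loveliness fits into that updating.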
You have also picked The Advancement of Science: Science without Legend, Objectivity without Illusions by Philip Kitcher. What is the project of this book?
For me, this book has a personal significance. I was given a copy of it in August 1993 by my then thesis supervisor, David Papineau, while I was writing up my doctoral dissertation. David advised me not to read it until I had submitted my dissertation so that I wouldn’t get distracted. I followed his advice and read it at the beginning of 1994. It was a revelation for me.
The book aims to deflate the “legend” that science is a march to truth (to the one complete true story of the world) and that this is achieved by the use of a fully objective scientific method. Many critics of science in the twentieth century, from Thomas Kuhn to the social constructivists, have taken the failures of the legend to show that science cannot reveal truths about the world, or to question its objectivity, rationality and hegemony.
But Kitcher does not want to do this. In his book, he aims to show how scientific progress and objectivity can still be defended, even though the legend is just a legend.
“Many critics of science in the twentieth century, from Thomas Kuhn to the social constructivists, have taken the failures of the legend to show that science cannot reveal truths about the world”
This is done within a thoroughly naturalistic framework in which scientists are seen, not as sole knowers, but as biological and social beings with various cognitive constraints and limitations. Individual cognitive practices are integrated into a network of collective consensus-forming practices. One such practice aims to offer cogent unifying explanations of the worldly phenomena, where the unification consists in using the same explanatory schemata to account for diverse phenomena, as Darwin did with his explanatory pattern of natural selection. The scientific enterprise is progressive in that more and more significant truths about the world are discovered and more and more refined classifications of natural kinds are made.
Let’s look more closely at the legends and illusions that he’s advocating we resist. On what basis has the idea of scientific objectivity been challenged in the twentieth century?
The two major challenges to scientific objectivity have come from the Kuhnian notion of incommensurability and the social constructivist programme in the sociology of scientific knowledge. The notion of incommensurability was introduced by Kuhn to capture the relation between scientific paradigms before and after a scientific revolution. The pre-revolutionary and the post-revolutionary paradigms were said to be incommensurable in that there was no strict translation of the terms and predicates of the old paradigm into those of the new.
Though Kuhn developed this notion in several distinct ways, its core is captured by the thought that two theories are incommensurable if there is no language into which both theories can be translated without residue or loss. Kuhn supplemented this notion of untranslatability with the notion of lexical structure: two theories are incommensurable if their lexical structures (that is, their taxonomies of natural kinds) cannot be mapped into each other.
To many philosophers, this notion threatened scientific objectivity, since competing paradigms cannot be properly compared. Hence, there is no objective sense in which the new paradigm can be said to be more progressive than the old. Kuhn went to extremes by claiming that:
The proponents of competing paradigms practice their trades in different worlds . . . Practicing in different worlds, the two groups of scientists see different things when they look from the same point of view in the same direction.
This made the world well lost. To be sure, it is best to see Kuhn’s philosophy as a version of neo-Kantianism, in that it implied a distinction between the world-in-itself, which is epistemically inaccessible to inquirers, and the phenomenal world, which is constituted by the concepts and categories of the inquirers and is therefore epistemically accessible to them.
But Kuhn’s neo-Kantianism was relativised: he thought that there was a plurality of phenomenal worlds, each being dependent on, or constituted by, some community’s paradigm. The paradigm imposes, so to speak, a structure on the world of appearances: it carves up this world in ‘natural kinds’. This is how a phenomenal world is ‘created’. But different paradigms carve up the world of appearances into different networks of natural kinds.
The second challenge was based on the so-called ‘symmetry principle’ in the ‘Strong Programme of the Sociology of Scientific Knowledge’ (SSK). This programme aimed at a causal-naturalistic explanation of scientific belief and the claim was that, as David Bloor put it, the same types of cause would explain true and false, or rational and irrational, beliefs. Accordingly, the world drops out as a factor for the explanation of scientific belief. For the advocates of SSK, there are only locally credible reasons, and not “absolute proofs”, that one scientific theory is better than another. But, of course, scientists do not talk about “absolute proofs” of theories. Still, there are typically good evidential reasons to prefer one theory to another.
In the extreme case of social constructivist views, the claim is that scientific entities are constructed by means of negotiations and other socially influenced consensus-making processes among scientists. Science is taken to be only one of any number of possible “discourses”, none of which is fundamentally truer than any other. What unites this cluster of views are vague slogans such as ‘scientific truth is a matter of social authority’ or ‘nature plays (little or) no role in how science works’.
Given that scientists are socially immersed beings, how much room does Kitcher leave for social influences to impact their research and, with it, their scientific objectivity?
Kitcher clearly accepts that scientists are social beings and that there are a number of social influences on their views and work. However, he defends the view that the various social influences and biases are not so powerful that they prevent scientists from abandoning false beliefs and accepting truer ones. In other words, the social influences are seldom so powerful as to render negligible reality’s contribution to scientific belief.
How would you say Kitcher’s approach to scientific progress ultimately wrestles free from the claws of someone like Thomas Kuhn?
For Kitcher, there is conceptual, explanatory and cognitive progress as science grows. He argues that there is no significant incommensurability between competing theories, since for him, scientific expression-types are no longer associated with single (putative) referents. Instead, each expression-type is endowed with a reference potential: a potential such that its tokens may refer to more than one (putative) entity, depending on the event that has initiated the production of each particular token. This allows him to speak of reference-preserving translation between competing theories.
For instance, Joseph Priestley’s “dephlogisticated air” has in its reference potential both phlogiston-free air and oxygen. Depending on the context of utterance, tokens of “dephlogisticated air” may refer to either of the two members of the reference potential; hence, some tokens fail to refer altogether (there being no phlogiston-free air), while others refer to oxygen.
For Kitcher, conceptual progress is refinement of the reference-potential of concepts. Besides, unlike Kuhn, Kitcher thinks that there is considerable progress towards a truer account of the world. Even if our perception of nature may be theory-dependent, it does not follow that nature itself is theory-dependent.
Having discussed Kuhn, that leads us nicely to your final choice. This is Dynamics of Reason by Michael Friedman, a work also concerned with the nature of scientific revolutions but through a Kantian lens. Tell me about this book.
This is, in many ways, a tour de force. Friedman repeats the call of Hermann von Helmholtz, who is one of his philosophical heroes: Back to Kant! And while Helmholtz had in mind the excesses of German idealism, Friedman is moved by a deeper reading of logical positivism, which was the scapegoat of Kuhnians and Popperians. This deeper reading is not in essence a reinterpretation, but a rehabilitation. Friedman unveils the Kantian origin of the basic tenets of logical positivism, showing at the same time how they were transformed and redefined in the light of Frege’s and Russell’s new logic, Hilbert’s axiomatic method, and the fundamental changes in physics and geometry at the turn of the twentieth century.
On Friedman’s reading, both Kant and the logical positivists shared a common project: showing how the scientific image of the world can yield objective knowledge. Kant found in Newtonian theory a model of how the fundamental laws of nature are founded in universal principles of human knowledge, and especially in the principles of mathematics and Euclidean geometry. These universal and necessary principles of human knowledge (the forms of pure intuition) provided the framework within which scientific knowledge and objectivity, as exemplified by Newtonian mechanics, are defined and defended.
“On Friedman’s reading, both Kant and the logical positivists shared a common project: showing how the scientific image of the world can yield objective knowledge”
According to the Kantian conception of knowledge, the possibility of human knowledge presupposes synthetic a priori constraints on building models of the world based on experience. The synthetic a priori principles are universal, necessary, and certain. Being independent of experience, they are unrevisable. At the same time, they constitute the object of knowledge.
In a similar fashion, the logical positivists sought to show how objectivity could be redefined and defended in light of the new, post-Newtonian scientific worldview that was shaped by Einstein’s theory of relativity and quantum physics. The transformation of the very idea of objectivity and the validity of scientific knowledge relied on the thought that the principles of mathematics and logic constitute an a priori scaffolding upon which the empirical knowledge of the world is hooked. Central to this transformation is Hans Reichenbach’s Relativitätstheorie und Erkenntnis Apriori, published in 1920.
In the light of the theory of relativity, which challenged both Newtonian mechanics and the underlying Euclidean geometry, Reichenbach proposed a distinction between two elements of the Kantian conception of synthetic a priori principles: (a) a priori principles are considered unrevisable, thus necessarily true; and (b) a priori principles are considered to be constitutive of the object of knowledge. Reichenbach accepted the second element, but denied the first. That is, he denied that a priori principles are necessarily true and unrevisable. Instead, being dependent on a framework, they must be abandoned when the framework they constitute is abandoned. The framework is abandoned for broadly empirical reasons; in particular, when the theories that are embedded in it are in persistent conflict with experience.
This new conception of the a priori, qua a set of principles constitutive of a theoretical framework, retains the spirit of the Kantian idea that there can be no systematic attempt to know the world unless the acceptable empirical theories are limited in such a way as to satisfy a set of a priori principles, which describe the basic structure that the world must have in order for it to be knowable. But these a priori principles become at the same time relativised; that is, revisable.
This way of viewing things leads Friedman to show that the distinct and autonomous role of philosophy is not established by its being cut off from science. Rather, the role of philosophy is to provide the (meta-scientific) domain within which reason is called upon to unveil and highlight the rationality that permeates, and the continuity that characterises, otherwise radical scientific revolutions. In other words, philosophy offers the domain in which the various constitutive a priori principles of the various theoretical frameworks are detected and explained, as well as the space in which the reasons for their change become visible. This is what Friedman calls ‘Dynamics of Reason’.
“The transformation of the very idea of objectivity and the validity of scientific knowledge relied on the thought that mathematics and logic constitute an a priori scaffolding upon which the empirical knowledge of the world is hooked”
The Kuhnian approach to science, from which Friedman takes some cues, oscillates between two fundamentally different and conflicting assumptions: the rationality of normal science and the irrationality of revolutionary change. The synthesis sought by Friedman restores rationality in scientific change within a meta-scientific (and hence philosophical) domain, highlighting the role of constitutive and at the same time revisable principles.
But if the constitutive principles are framework dependent and revisable, how is (descriptive) naturalism or scepticism avoided? Friedman’s answer is that philosophy provides a regulative ideal: viewing the succession of theoretical frameworks or paradigms as a convergent series in which “we successively refine our constitutive principles in the direction of ever greater generality and adequacy” (DR, 63).
Philosophy and the sciences are in a perpetual relationship of dynamic interaction and mutual determination. The prime philosophical project therefore consists in seeking the “universal, unchanging principles” of reason, as Ernst Cassirer had put it.
Like Kitcher, Friedman is optimistic about scientific objectivity as an attainable goal. What would you say is the most significant way that their approaches differ?
Kitcher’s approach is a lot more naturalistic and, as of late, pragmatic. Friedman’s approach is anti-naturalistic: he places greater emphasis on the role of the human mind in the constitution of the object of scientific knowledge.
What do you consider to be the most interesting directions that current philosophy of science is exploring?
I would single out three (revealing my own preferences and biases). The first is in the metaphysics of science and has to do with the implications of the scientific image for the deep structure of the world. ‘Metaphysics’ is no longer a dirty word. The on-going battle between neo-Humean and neo-Aristotelian conceptions of the world is a case in point.
The neo-Aristotelian tradition inflates ontology with causal powers, necessary connections and the like in order to explain and ground the regularity there is in the world, while the neo-Humean tradition takes regularity as a brute fact, does away with regularity-enforcers and advances a metaphysically thin conception of laws of nature. In between, there are the structuralists and the primitivists. I find the engagement with the role of mechanisms in causation, explanation and scientific practice in general particularly promising.
“‘Metaphysics’ is no longer a dirty word”
The second direction that I consider most interesting has to do, unsurprisingly, with the scientific realism debate. In particular, there are attempts to re-evaluate the role and strength of the historical challenge to realism, to discuss the microstructure of theory-change in science and to develop new forms of anti-realism.
The third direction has to do with the role of values (both epistemic and social) in science and in science-based policy-making. Here I think there is still a lot to be learned. The big challenge is to unveil the role of values in science (in theory-choice, theory-appraisal, and decision-making under uncertainty) while at the same time defending the objectivity of science and scientific knowledge. There is immensely interesting work going on (in relation to climate change, among other issues) and we still have a lot to learn from feminist perspectives on science, including my own favourite: standpoint epistemology in general, and feminist standpoint theory in particular.
“Feminist approaches to science have played a key role in uncovering various cognitive and social biases and have promoted the image of a socially responsible science”
A fourth direction, which has attracted my own interest lately, is the history of philosophy of science (HoPoS). Apart from its own intrinsic interest, engagement with HoPoS has a lot to teach us about current debates and why they have taken the turns they have.
With the growing concerns about a post-truth climate, and diminishing trust in science among certain demographics, would you say the scientific community needs philosophy more than ever?
Absolutely! Philosophy offers a magnifying glass through which the invisible causes of certain prejudices, biases, assumptions, or presuppositions are clearly seen and questioned. Science is by far the best way we humans have invented to push back the frontiers of ignorance and error, to achieve a deep understanding of the world and of our place in it, and to make the world a better place to live. But science is not a faultless, value-neutral and interest-free way to understand and change the world. Hence, science needs critical defence against excessive scepticism, relativism and public distrust.
“Science needs critical defence against excessive scepticism, relativism, and public distrust”
Philosophy can play a critical role in defending the objectivity of science, in showing the robustness of scientific facts, and in combating the ‘post-truth’ ideology. Science needs philosophy more than ever. In fact, society at large needs philosophy more than ever. Philosophy is the living example of the tremendous achievements of human reason. It is our collective insurance against unreason, authoritarianism, and conceptual vacua. It is sad that many scientists treat philosophy as an after-retirement pastime. Philosophy does not merely fill the cracks of the scientific image of the world. It is the glue that holds it together.