For the book Philosophical Semantics, to be published by CSP in 2017/2 (draft version)
Indem die Besinnung auf das Destruktive des Fortschritts seinen Feinden überlassen bleibt, verliert das blindlings pragmatisierte Denken seinen aufhebenden Charakter, und darum auch die Beziehung auf Wahrheit.
[When reflection on the destructive side of progress is left to its enemies, blindly pragmatized thought loses its sublating character, and with it its relation to truth.]
Theodor Adorno & Max Horkheimer
Making empty is the result of making small.
Science (mainly applied science) rises, while culture (artistic, religious, philosophical) falls. Whereas culture was once a source of values, today science and technology have made cultural values seem superfluous.
The critical theory of society has offered some explanations for this. Drawing on Max Weber’s basic concept of the ‘disenchantment of the world’ (Entzauberung der Welt), it asserts that in our modern technological society instrumental reason prevails over valuing reason, promoting mass culture and furthering science and technology at the expense of the old mystical-humanistic culture, without having sufficient resources to fill the void left behind.
Under the pressure of this scientistic institutional framework, it is no wonder that the prevailing kind of philosophy all too often mimics, materially and institutionally, the workings of particular scientific fields – much as a great deal of continental philosophy has mimicked literary forms – often counterfeiting the most proper forms of philosophical argumentation and thereby losing its relation to truth. For instance, by taking into account only the discussions of recent years, one might proceed as if philosophy underwent the same linear development as science, only to find oneself sooner or later in a foreseeable cul-de-sac. But this inevitably fragmented philosophy of the ‘latest novelty,’ made for ‘immediate consumption’ by and for specialists and related scientists, no longer seems, as in the tradition, an independent conjectural undertaking making balanced use of whatever new scientific knowledge can serve its purposes. More often, it seems a busy handmaiden of science suffering from loss of identity and self-esteem: particularized proto-scientific speculation, atomized conjectural work that does not look beyond its own narrow interests and scarcely touches the central philosophical problems bequeathed by the tradition.
In pointing to this, I am far from embracing Manichaeism. I am not claiming that for science to exert great influence on philosophy is inevitably specious and unfruitful. Often philosophy plays a role in furthering the development of particular sciences. Moreover, there are felicitous cases, like the rapid proliferation of theories of consciousness over the last four decades – a striking example of fruitful philosophical work directly associated with the development of empirical science, which has deepened the field of investigation – and this is only one example among many.
Nevertheless, it is important to remember that this same intellectual movement can easily become an ideologically motivated agenda if it tempts the theoretical philosopher to import new knowledge from particular sciences – formal or empirical – in ways that cause him to lose sight of the vast scope of the philosophical landscape. A possible consequence of this may be what some have aptly labeled expansionist scientism: an effort to forcefully reduce some domain of philosophy to the scope of investigative strategies and views derived from a more or less established particular science. To achieve this aim, the particular (formal or empirical) scientific field must be expanded so as to answer questions belonging to some central domain of philosophy, using a reductionist strategy that underestimates philosophy’s encompassing and multifaceted character (a first example of expansionist scientism was, in my view, Pythagoreanism, which sought answers to the problems of life in numbers and their applications). The price one must pay for this may be that persistent, distinctive philosophical difficulties that cannot be accommodated within the new particularizing model must be minimized, if not quietly swept under the rug.
A chief inconsistency of scientism arises from the fact that while the sciences are in various ways all particular, philosophy is most properly ‘holistic’: as Wittgenstein once wrote, the fundamental problems of philosophy are so interconnected that it is impossible to solve any one of them without first having solved all the others. Insofar as his claim has any truth, it means that a persistent difficulty of the central philosophical problems is that we need a proper grasp of the whole in order to evaluate and answer them properly. This is what can make philosophy so unbearably complex and multifarious. And the lack of this is what can make philosophy appear like a headless turkey running around. Taking account of parts as belonging to a whole, trying to see things sub specie totius, is also what the great systems of classical philosophy – such as those of Aristotle, Kant and Hegel – strove to achieve, even at a price we are now better able to see as unavoidably high in terms of misleading and aporetic speculation. Nonetheless, it would be too easy to conclude that true comprehensiveness is no longer a fundamental desideratum of philosophy (Wittgenstein was well aware of this when he called for more ‘Übersichtlichkeit’).
A main reason for the narrowness and fragmentation of much of our present linguistic-analytical philosophy can be explained as follows. The new Anglo-American philosophy – from W. V. O. Quine to Donald Davidson, and from Saul Kripke to Hilary Putnam and Timothy Williamson – has challenged a great variety of inherited commonsensical starting points, often in undeniably insightful and imaginative ways, although in my view with ultimately unsustainable results. Because of this, a large part of theoretical philosophy has increasingly lost touch with its intuitive commonsensical grounding in the way things prima facie seem to be and for the most part really are.
Take, for instance, the concept of meaning: the word ‘meaning’ was dismissed by Quine as too vague a noise to be reasonably investigated. But an approach is inevitably limited if it starts from a positivist, reductionist perspective that denies or ignores commonsense certainties, like the obvious fact that meanings exist. Indeed, using this strategy of skeptically questioning all kinds of deeply ingrained truisms, scientistically oriented philosophers have sawed off the branches they were sitting on. The reason is that the result of the adopted strategy could only be the replacement of true comprehensiveness with a superficializing positivistic fragmentation of often misleadingly grounded philosophical concerns, which ends by plunging philosophy into what Scott Soames called the ‘age of specialization.’
This fragmentation can be regarded as dividing in order to conquer, I admit; but it may also be a matter of dividing in order to subjugate; and what is here to be subjugated is often the philosophical intellect. Indeed, by focusing too much on the trees, we may lose sight of the philosophical forest and thereby even of where the trees are. Without the well-reasoned assumption of some deep commonsensical truisms, no proper descriptive metaphysics (P. F. Strawson) remains possible. And without this, the only path left for originality in the philosophy of language, after rigorous training in techniques of argumentation, may turn out to be the use of new formalistic pyrotechnics of unknown value. This would have the end effect of blocking the paths of inquiry, disarming adequate philosophical analysis and increasing the risk that the whole enterprise will degenerate into a sort of scholastic, fragmented, vacuous intellectual Glasperlenspiel.
It may be that practitioners of reductive scientistic philosophy are aware of the problem, but they have found plausible excuses for not solving it. Some have suggested that any attempt to do philosophy on a comprehensive level could not meet the present standards of scholarly adequacy demanded by the academic community. But in saying this they forget that philosophy does not need to be pursued close on the heels of new advances in the sciences, which are continually producing and handing down new authoritative developments. Philosophy in itself remains an autonomous cultural enterprise: it is inherently conjectural and depends on indispensable metaphorical elements intrinsic to its pursuit of comprehensiveness. Most of philosophy remains a relatively free cultural enterprise with a right to controlled speculation, experimentation and even transgression, though most properly done in the pursuit of truth.
Others have concluded that today it is impossible to develop a truly encompassing theoretical philosophy. For them this kind of philosophy cannot succeed because of the difficulties imposed by the overwhelming amount of information required, putting the task far beyond the cognitive capacity of individual human minds. We are – to borrow Colin McGinn’s metaphor – cognitively closed to finding decisive solutions to the great traditional problems of philosophy: in our efforts to do ambitious comprehensive philosophy, we are like chimps trying to develop the theory of relativity. Just as they lack the mental capacity to grasp relativity, we lack the mental capacity to develop comprehensive philosophy and will therefore never succeed! Hence, if we wish to make progress, we should shift our efforts to easier tasks...
This last answer is perhaps specious and borders on defeatism. The very ability to initiate the discussion of broadly inclusive philosophy suggests that we might also be able to accomplish the task. As Wittgenstein once wrote, if we are able to pose a (true) question, it is because we are also in principle able to find its answer. In contrast to human thinkers, one indication that chimps could never develop a theory of relativity is that they are unable even to pose questions such as what would happen if they could move at the speed of light. The intelligence needed to pose such questions is about the same as the intelligence needed to answer them. Even if the total amount of scientific knowledge available to us has increased immensely, it may well be that the amount of truly essential information needed to answer any given question is limited enough for us to grasp and apply. As Russell once theorized, very often the science needed to do philosophy can be limited to very general findings. Moreover, not all philosophical approaches need to be taken into account, since they often overlap or supersede one another. The main difficulty may reside in the circumstances, strategies and authenticity of attempts, in limits on the context of discovery, rather than in the sheer impossibility of progress. In any case, it is a fact that in recent years true comprehensiveness has almost disappeared from the philosophy of linguistic analysis. However, the main reason does not seem to be impossibility in principle, but rather loss of the proper cultural soil in which a more comprehensive philosophy could flourish.
In this book, I begin by arguing that more fruitful soil can be found if we start with a better reasoned and more affirmative appreciation of commonsense truisms, combined with a more pluralistic approach, always prepared to incorporate the relevant (formal and empirical) results of science. Perhaps it is precisely against the unwanted return of a broader pluralistic approach that much of the mainstream of our present philosophy of language secretly struggles. This is often obscured by a dense, nearly scholastic scientistic atmosphere, so thick that practitioners barely notice it surrounding them. The intellectual climate sometimes recalls the Middle Ages, when no one was allowed to challenge established religious dogmas. I even entertain the suspicion that in some quarters the attempt to advance any plausible comprehensive philosophy of language against the institutional power of reductive scientism runs the risk of being ideologically discouraged as a project and, in fact, silenced.
Ernst Tugendhat, who (together with Jürgen Habermas) attempted with considerable success to develop comprehensive philosophy in the seventies, has recently seemed to raise the white flag, conceding that the heyday of philosophy is past. The problem is in my view aggravated because we live in a time of widespread cultural indifference, heavily influenced by the steady, almost exponential development of science and technology, which minimizes the role of valuing reason. Though quite indispensable from the viewpoint of instrumental reason, our scientific age tends to impose a compartmentalized form of alienation on philosophical research that works against more broadly oriented attempts to understand reality.
In the present book, I insist on swimming against the current. My main task here – a risky one – is to establish grounds for a more comprehensive philosophy of meaning and reference, while arguing against certain reductionist-scientistic approaches that are blocking the available paths of inquiry. Hence, it is an attempt to restore to the philosophy of language its deserved integrity, without offending either common sense or science; an effort to give a balanced, systematic and sufficiently plausible overview of meaning and the mechanisms of reference, using bridges laboriously constructed between certain summits of philosophical thought. In this way, I hope to realize the old philosophical ambition of a comprehensive synthesis, insofar as this still sounds like a reasonable undertaking.