Advanced draft 2
Table of Contents
Chapter I: Introduction
Appendix to Chapter I: How Do Proper Names Really Work? (Cutting the Gordian knot)
1. A meta-descriptivist rule for proper names
2. Identification rules at work
3. Objection of vagueness
4. Ignorance and error
5. Names versus descriptions
6. Autonomous definite descriptions
7. Some classical counterexamples
8. Explanatory failure of the causal-historical view
Chapter II: Against the Metaphysics of Reference: Methodological Assumptions
1. Common sense and meaning
2. Critical common-sensism
3. Ambitious versus modest common sense
4. Primacy of established knowledge
5. Philosophizing by examples
6. Tacit knowledge of meaning: traditional explanation
7. A very simple example of semantic-cognitive rules
8. Criteria versus symptoms
9. Challenges to the traditional explanation (i): John McDowell
10. Challenges to the traditional explanation (ii): Gareth Evans
11. Non-reflexive semantic cognitions
Appendix to Chapter II: Modal Illusions: Against Trans-epistemic Metaphysical Identities
Chapter III: Wittgensteinian Semantics
1. Semantic-cognitive link
2. Why can’t reference be meaning?
3. Failure of Russell’s atomistic referentialism
4. Meaning as a function of use
5. Meaning as a kind of rule
6. Meaning as a combination of rules
7. Meaning and language games
8. Meaning and forms of life
9. Tying the threads together
10. Criteria and symptoms again
11. Transgressions of the internal limits of language
12. The form of semantic cognitive rules
13. What is wrong with the private language argument?
14. Concluding remarks
Appendix to Chapter III: Trope Theory and the Unbearable Lightness of Being
1. Introducing tropes
2. Tropes and universals
3. Tropes and concrete particulars
Chapter IV: An Extravagant Reading of Fregean Semantics
1. Reference of a singular term
2. Sense of a singular term
3. Reference of a predicative expression
4. Ontological level
5. Referring to particularized properties (tropes)
6. Difficulties with the concept of unsaturation
7. Unsaturation as ontological dependence
8. Sense of a predicative term
9. The dependence of the predicative sense
10. The concept of horse paradox
11. Existence as a property of concepts
12. Two naïve objections
13. Existence attributed to objects
14. Existence of objects and its identification rules
15. Existence of spatio-temporal locations: indexicals
16. Advantages of the higher-order view of existence
17. The ubiquity of existence
18. Answering some final objections
19. Reference of concepts again: a metaphysical excurse (Mill)
20. The reference of a sentence as its truth-value
21. Structural status of facts
22. Ontological status of facts
23. Church’s slingshot argument
24. Facts: sub-facts and grounding facts
25. Taking seriously the sentence’s reference as a fact
26. The problem of identity in difference
27. Sense of sentences: the thought
28. The thought as the truth-bearer
29. Facts as true thoughts?
30. The thought as a verifying rule
31. Frege’s Platonism
32. Avoiding Frege’s Platonism
33. Further ontological consequences
34. A short digression on contingent futures
Appendix to Chapter IV: Frege, Russell, and the Puzzles of Reference
1. Russell’s solutions to the puzzles of reference
2. Fregean solutions to the same puzzles
3. Reviewing Fregean assumptions
4. Reviewing Russellian assumptions
5. Building a bridge between both views
Chapter V: Verificationism Redeemed
1. Origins of semantic verificationism
2. Wittgensteinian verificationism
3. Verifiability rule as a criterial rule
4. Objection 1: the principle is self-refuting
5. Objection 2: a formalist illusion
6. Objection 3: verificational holism
7. Objection 4: existential-universal asymmetry
8. Objection 5: arbitrary indirectness
9. Objection 6: empirical counterexamples
10. Objection 7: formal counterexamples
11. Objection 8: skepticism about rules
12. Defending analyticity
Appendix to Chapter V: The Only Key to Solving the Humean Problem of Induction
1. Formulating a Humean argument
2. The basic point
3. Reformulating PF
Chapter VI: Sketch of a Unified Theory of Truth
1. Compatibility between verificationism and correspondence
2. The nature of correspondence
3. Formalizing the correspondence relation
4. Negative truths
5. Pragmatics of the correspondence relation
6. Anterograde versus retrograde procedures
7. General statements
8. Some questioned facts
9. Expansion to formal sciences
10. Why can analytic truth be called true?
11. The insufficiency of coherence
12. Coherence as a mediator
13. What about the truth of the truth-maker?
14. Objection of the linguistic-cognitive circle
15. Answering the objection of the linguistic-cognitive circle
16. Answering traditional arguments against direct realism
17. Axioms of externality
18. Skeptical scenarios
19. Verification and intentionality: Husserl
20. Solving two Husserlian problems
21. Truth and existence
22. Verifiability rules and truthmaking procedures
23. The rule’s structural mirroring of the world
24. Synopsis of this book
Epilogue: The Discovery of Wine
(to be completed...)
Un coup de dés jamais n’abolira le hasard.
[A throw of the dice will never abolish chance]
Stéphane Mallarmé
Indem die Besinnung auf das Destruktive des Fortschritts seinen Feinden überlassen bleibt, verliert das blindlings pragmatisierte Denken seinen aufhebenden Charakter, und darum auch die Beziehung auf Wahrheit.
[Insofar as reflection on the destructive aspect of progress is left to its enemies, blindly pragmatized thought loses its sublating character, and thereby also its relation to truth.]
Theodor Adorno & Max Horkheimer
Making empty is the result of making small.
Science (mainly applied science) rises, while culture (artistic, religious, philosophical) falls. Whereas culture was once a source of values, today science and technology have made cultural values seem superfluous.
The critical theory of society has offered some explanations for this. Drawing on Max Weber’s basic concept of the ‘disenchantment of the world’ (Entzauberung der Welt), it asserts that in our modern technological society instrumental reason prevails over valuing reason, promoting mass culture and furthering science and technology at the expense of the old mystical-humanistic culture, without having sufficient resources to fill the void left behind.
In this reductive scientistic institutional framework, we should not wonder that a kind of philosophy prevails that all too often materially and institutionally mimics the ways particular scientific fields work. For instance, taking into account only the discussions of recent years, we might suppose that philosophy had the same linear development as science. This segmented philosophy of the ‘last novelty’, made for ‘immediate consumption’ by and for specialists, no longer seems, as in the tradition, to be an independent conjectural undertaking making balanced use of whatever new scientific knowledge can serve its purposes. More often, it seems to be a busy handmaid of science: particularized proto-scientific speculation that does not demand knowledge beyond its narrow interests. In saying this, I am not claiming that a strong scientific influence is inevitably specious and unfruitful. Often it finds a role in furthering the development of particular sciences. Moreover, there are some felicitous cases, like the rapid proliferation of theories of consciousness over the last four decades, which serves as a striking example of fruitful philosophical work directly influenced by the development of empirical science – and this is only one example among many.
Nevertheless, it is important to remember that this same intellectual steering can also easily become an ideologically motivated endeavor if it tempts the theoretical philosopher to import new knowledge from particular sciences – formal or empirical – in ways that cause him to lose sight of the vastness of the true philosophical landscape. A consequence of this endeavor may be what some have aptly labeled expansionist scientism: an effort to reduce the central domain of philosophy to the scope of investigative strategies and views derived from some more or less established particular science. To achieve this aim, the particular (formal or empirical) scientific field must be expanded to answer questions belonging to this central domain of philosophy, using a reductionist strategy that underestimates philosophy’s comprehensive and multifaceted character (e.g., the often misguided use of modal logic in philosophy). The price one must pay for this is that persistent, distinctive philosophical difficulties that cannot be accommodated within the new particularizing model must be minimized, if not quietly swept under the rug.
A chief inconsistency of scientism arises from the fact that while sciences are all particular, philosophy is most properly ‘holistic’: As Wittgenstein once noted, the fundamental problems of philosophy are so interconnected that it is impossible to solve any one philosophical problem without first having solved all the others. This means that a persistent difficulty of the central philosophical problems is that we need a proper grasp of the whole to be able to evaluate and try to answer them properly. This is what often makes philosophy so enormously complex and multifarious. Taking account of parts as belonging to a whole, trying to see things sub specie totius, is also what the great systems of classical philosophy – such as those of Aristotle, Kant and Hegel – strove to achieve, even if paying a price that we are now better able to see as unavoidably high in terms of misleading and aporetic speculation. Nonetheless, it would be too easy to conclude that true comprehensiveness is no longer a fundamental desideratum of philosophy (Wittgenstein was well aware of this when he called for more ‘Übersichtlichkeit’).
A main reason for the lack of comprehensiveness in much of our present linguistic-analytical philosophy can be explained as follows. The new Anglo-American philosophy – from W. V. O. Quine to Donald Davidson, and from Saul Kripke to Hilary Putnam and Timothy Williamson – has challenged all kinds of inherited commonsensical starting points in undeniably insightful and imaginative ways, although in my view with ultimately unsustainable results. Because of this, much of our theoretical philosophy has increasingly lost touch with its intuitive commonsensical grounding in the way things prima facie seem to be and for the most part really are.
Take, for instance, the concept of meaning: the word ‘meaning’ was dismissed by Quine as too vague to be reasonably investigated. But an approach is inevitably limited if it starts from a purely reductionistic-scientistic perspective that denies or ignores commonsense certainties, like the obvious fact that meanings exist. In fact, using this strategy of skeptically questioning all kinds of deeply ingrained truisms, scientistically oriented philosophers have sawed off the branches they were sitting on. The reason is that the result of this strategy could only be to replace true comprehensiveness (with its predictable depth) with a superficializing positivistic fragmentation of often misleadingly grounded philosophical concerns, which ends by plunging philosophy into what Scott Soames uncritically called the ‘age of specialization.’
This fragmentation could be regarded as dividing to conquer, I admit; but it may also be a matter of dividing to subjugate; and what is here to be subjugated is the philosophical intellect. Indeed, without the well-reasoned assumption of deep commonsensical truisms, no proper descriptive metaphysics remains possible. Without this, the only path left for originality in philosophy of language, after rigorous training in techniques of argumentation, may turn out to be the use of new formalistic pyrotechnics of unknown value. And the end-effect of this may be to limit the possibilities for inquiry, preventing adequate philosophical analysis and increasing the risk that the philosophical enterprise will degenerate into a sort of scholastic, fragmented, vacuous intellectual Glasperlenspiel.
It may be that the practitioners of scientistic philosophy are aware of the problem, but they have found plausible excuses for it. Some have suggested that any attempt to do philosophy on a comprehensive level would not suffice to meet the present standards of scholarly adequacy demanded by the academic community. But in saying this they forget that philosophy does not need to be pursued close behind new advances in the sciences, with their continual succession of new authoritative developments. Philosophy in itself still remains an autonomous cultural enterprise: it is inherently conjectural and dependent on the indispensable metaphorical elements intrinsic to its pursuit of comprehensiveness. Most of philosophy remains a relatively free cultural enterprise with a right to controlled speculation, experimentation and even transgression, though typically done in the pursuit of truth.
Others have concluded that today it is impossible to develop a truly comprehensive theoretical philosophy. For them this kind of philosophy cannot succeed because of the overwhelming amount of information required, which puts the task far beyond the cognitive capacity of individual human minds. We are – to borrow Colin McGinn’s metaphor – cognitively closed to finding decisive solutions to the great traditional problems of philosophy: in our efforts to do ambitious comprehensive philosophy, we are like chimps trying to develop the theory of relativity. Just as they lack the mental capacity for relativity theory, we lack the mental capacity for comprehensive philosophy and will therefore never succeed! Hence, if we wish to make progress, we should shift our efforts to easier tasks...
This last answer looks suspiciously close to defeatism. The very ability to initiate the discussion of comprehensive philosophy suggests that we might also be able to accomplish our task. As Wittgenstein once noted, if we are able to pose a question, it is because we are also able to find its answer. In contrast to human thinkers, chimps could never develop relativity theory, but neither could they ever ask what would happen if they could move at the speed of light. Even if the amount of scientific knowledge has increased immensely, it may well be that the amount of really essential information remains sufficiently limited for us to grasp and apply it. As Russell once noted, the science needed to do philosophy can very often be limited to its most general findings. Moreover, not all philosophical approaches need to be taken into account, since many overlap with or have displaced one another. The main difficulty may reside in the circumstances, strategies and authenticity of attempts – in the limits imposed on the context of discovery more than in any sheer impossibility of making progress. In any case, it is a fact that in recent years true comprehensiveness has nearly disappeared from the philosophy of linguistic analysis. However, the main reason does not seem to be impossibility in principle, but rather the loss of the right cultural soil in which comprehensive philosophy could flourish.
In this book, I begin by arguing that more fruitful soil can be found if we start with a better reasoned and more affirmative appreciation of commonsense truisms, combined with a more pluralistic approach, prepared to incorporate the relevant (formal and empirical) results of science. Perhaps it is precisely against the unwanted return of a broader pluralistic approach that many in the mainstream of our present philosophy of language secretly struggle. This is often (though not always) obscured by some sort of dense, nearly scholastic scientistic atmosphere, so thick that practitioners barely notice it surrounding them. The intellectual climate sometimes recalls the middle ages, when no one was allowed to challenge religious dogmas. I even entertain the suspicion that in some quarters the attempt to advance a plausible comprehensive philosophy of language against the institutional power of reductive scientism runs the risk of being ideologically discouraged as a project and silenced as a fact.
Ernst Tugendhat, who (together with Jürgen Habermas) attempted with considerable success to develop comprehensive philosophy in the seventies, seems recently to have given up, declaring that the heyday of philosophy is past. The problem is in my view aggravated because we live in a time of widespread cultural indifference, heavily influenced by the increasing development of science and technology. Though quite indispensable, this development tends to cause a compartmentalized form of alienation in research that works against more comprehensive attempts to understand reality.
In the present book, I insist on swimming against the current. My main task here – a risky one – is to establish grounds for a new comprehensive orthodoxy, while arguing against certain reductionist-scientistic approaches that are blocking the way. Hence, it is an attempt to restore to the philosophy of language its deserved integrity, without contradicting either common sense or science; an effort to give a balanced, systematic and sufficiently plausible overview of meaning and the mechanisms of reference, using bridges laboriously constructed between some summits of philosophical thought. In this way I hope to realize the old philosophical ambition of a comprehensive synthesis, insofar as it still seems to be a reasonable undertaking.
Before any acknowledgments, I must emphasize Wittgenstein’s major influence on my philosophical outlook. His extremely suggestive and multifarious approach is more far-reaching than unprepared readers could possibly grasp, and the originality of his philosophical mind is indebted to his freedom from the burdens of the academic factory. I must also name the strong influence on my work of two living philosophers: John Searle and Ernst Tugendhat.
(to be completed...)
- I -
Logic, I should maintain, must no more admit a unicorn than zoology can; for logic is concerned with the real world just as truly as zoology, though with its more abstract and general features.
A philosophical tradition which suffers from the vice of horror mundi in an endemic way is condemned to futility.
Kevin Mulligan, Peter Simons, Barry Smith
Could the old orthodoxy of the philosophy of language that prevailed in the first half of the twentieth century, with its insistence on the centrality of meaning, its eroded semantic principle of verifiability, its naïve correspondentialism, its elementary distinction between analytic and synthetic, its crude descriptivist-internalist theories of proper names and general terms, its monolithic dichotomy between the necessary a priori and the contingent a posteriori… be nearer to the truth than the now still dominant causal-externalist orthodoxy?
This book was written in the conviction that this question should be answered affirmatively. I am convinced that the philosophy of language of the first half of the twentieth century was more profound, comprehensive and closer to the truth than the approaches of the new orthodoxy, and its insights were often more powerful. The reason seems to lie in the socio-cultural background. Cultural revolutions are products of great conflicts. And the period between the end of the nineteenth century and the Second World War was one of increasing social turmoil. This cast doubt on all established cultural values, providing the right atmosphere for intellectuals and artists disposed to develop sweepingly original innovations. This could be witnessed not only in philosophy and the arts, but also in the human and natural sciences.
However, in saying this I am not necessarily dismissing the more institutionalized philosophy that came later. In the philosophy of language I don’t, for instance, reject the philosophical interest of anti-verificationist arguments like those of W. V. O. Quine. Nor do I reject the deep philosophical originality and relevance of the new causal-externalist mainstream founded mainly by Saul Kripke and Keith Donnellan in the early seventies and later elaborated by Hilary Putnam, David Kaplan and many others. These and other accomplishments are relevant and in a sense even indispensable for reaching my goals.
However, the value of their labors is in my judgment predominantly negative, since I think their conclusions fall short of the truth. In other words, their significance consists mostly in being dialectically relevant challenges, which if adequately met would be followed by an enriching reformulation of old primarily descriptivist-internalist-cognitivist views of meaning and reference. These views could become increasingly complex in very positive and productive ways.
The aim of the present book is to contribute to moving in the proposed direction. My approach to the topics considered here consists in gradually developing and defending a primarily internalist, cognitivist and neo-descriptivist analysis of the nature of the cognitive meaning of our expressions and their mechanisms of reference. But this approach will be indirect, since the analysis will usually be supported by a critical examination of some central views of traditional analytic philosophy, particularly those of Wittgenstein and Frege. Furthermore, such explanations will be complemented by a renewed reading and defense of the idea that existence is a higher-order property, a detailed reconsideration of the verificationist view of meaning, and a reassessment of the correspondence theory of truth, which I see as complementary to the suggested form of verificationism and dependent on a renewed treatment of the old problem of perception.
The obvious assumption that makes my project prima facie plausible is the idea that language is a system of rules, some of which are more properly sources of meaning. The most central meaning-rules are those responsible for what Aristotle called apophantic speech – representational discourse, whose meaning-rules I call semantic-cognitive rules. Indeed, it is prima facie highly plausible to think that the cognitive meaning (i.e., informative content, not mere linguistic meaning) of our representational language cannot be given by anything other than semantic-cognitive rules or combinations of such rules. Our knowledge of these rules or conventions is – as will be defended – usually tacit, implicit, non-reflexive; that is, we are able to use them correctly but are very often unable to state them in a linguistically explicit way.
My ultimate aim should be to investigate the structure of semantic-cognitive rules by examining our basic referential expressions, which are singular terms, general terms and in a sense declarative sentences, in order to furnish an appropriate explanation of their reference mechanisms. In the present book, I do this only very partially, often in the appendices, summarizing ideas already presented in my last book which still require development (see 2014, Ch. 2, 3, 4). I do this because in the main text of the present work my central aim is rather to justify and clarify my own assumptions on the philosophy of meaning and reference.
In developing these views, I realized in retrospect that my main goal was essentially to revive a program already speculatively developed by Ernst Tugendhat in his classical work Vorlesungen zur Einführung in die sprachanalytische Philosophie. This book, published in 1976, can be considered the swansong of the old orthodoxy, defending a non-externalist and non-causalist program that was gradually abandoned during the next decade under the ever-growing influence of the new causal-externalist orthodoxy. Tugendhat’s strategy in developing this program can be understood in its core as a semantic analysis of the fundamental singular predicative statement. This statement is not only epistemically fundamental; it is also the indispensable basis for building our first-order truth-functional language. In summary, given a statement of the form Fa, he suggested that:
1) the meaning of the singular term a should be given by its identification rule (Identifikationsregel),
2) the meaning of the general term F should be given by its application rule (Verwendungsregel), which I also call a characterization or (preferably) ascription rule,
3) the meaning of the complete singular predicative statement Fa should be given by its verifiability rule (Verifikationsregel), which results from the combined application of the first two rules.
(cf. Tugendhat & Wolf 1983: 235-6; Tugendhat 1976: 259, 484, 487-8).
The verifiability rule is in this case obtained by jointly applying the first two rules in such a way that the identification rule of the singular term must be applied first, in order then to use the general term’s ascription rule. Thus, for instance, Yuri Gagarin, the first man to orbit the Earth from beyond its atmosphere, gazed out of his space capsule and exclaimed: ‘The Earth is blue!’ In order to verify this statement, he must first have identified the Earth by applying the identification rule of the proper name ‘Earth’; then, based on the result of this application, he would have been able to apply the ascription rule of the predicative expression ‘…is blue.’ In this combined application, these two rules work as a kind of verifiability rule for the statement ‘The Earth is blue.’ That is: if these rules can be conjunctively applied, the statement is true; otherwise, it is false. Tugendhat saw this not only as a form of verificationism, but also as a kind of correspondence theory of truth – a conclusion contested by some readers.
In order to test Tugendhat’s view, we can critically ask whether it is not possible that we really first apply the ascription rule of a predicative expression. For example, suppose that one night you see something burning at a distance without knowing what is on fire. Only after approaching it do you see that it is an old, abandoned factory. It may seem that in this example you first applied the ascription rule and only later the identification rule. However, in suggesting this we forget that to see the fire one must first direct one’s eyes at a certain spatio-temporal spot, thereby localizing the place where something is on fire. Hence, a primitive identification rule for a place was applied first, and initially the statement will not be ‘That old building is on fire,’ but simply ‘There is a fire… over there.’ Later, when you are closer to the building, you can make a more precise statement. In the same way, while looking out of his space capsule’s window Gagarin could think, ‘There is blue color down below me’, before saying ‘The Earth is blue’. Even in this case, the ascription rule cannot be applied without the earlier application of some identification rule, even if it is one that can only identify a vague spatio-temporal region seen from the window. To expand on the objection, we could consider a statement like ‘It is all a white fog.’ Nevertheless, even here ‘It is all…’ expresses an identification rule (of the whole visual field here and now) for the singular term, while ‘…a white fog’ expresses the ascription rule for the general term.
Tugendhat came to his conclusions as a result of purely speculative considerations, without analyzing the structure of these rules and without answering the many obvious external criticisms of the program, like the numerous well-known objections already made against verificationism. But what is extraordinary is that he was arguably right, since I believe the present book will make it hard to contest his main views.
My methodological strategies, as will be seen, are also different from those used in the more formally oriented approaches opposed by this book, which are mostly inherited from the philosophy of ideal language in its positivistic developments. My approach is primarily oriented by the communicative and social roles of language, which I use as the fundamental units of analysis. This means that I am more influenced by the so-called ordinary language tradition than by the ideal language tradition. I believe a comprehensive understanding of language must emphasize its unavoidable involvement in overall societal life. Consequently, I assign a heuristic value to common sense and ordinary language intuitions, often seeking support in a more careful examination of concrete examples of how our linguistic expressions are effectively employed.
Finally, my approach is systematic. The chapters of this book are interconnected so that the plausibility of each is better supported when regarded in its relation to arguments developed in the preceding chapters and their often critical appendices. Even if complementary, these appendices are placed as counterpoints to the chapters, aiming to justify the expressed views, if not to add something to them.
 English translation: Traditional and Analytical Philosophy: Lectures on the Philosophy of Language (2016).
 In this book I use the word ‘statement’ in most cases as referring to the speech act of making an assertion.
 An antecedent of this is J. L. Austin’s correspondence view, according to which an indexical statement (e.g. ‘This rose is red’) is said to be true when the historical fact correlated with its demonstrative convention (here represented by the demonstrative ‘this’) is of the type established by the sentence’s descriptive convention (the red rose type) (Austin 1950: 122). This is a first approximation of conventionalist strategies later employed by Dummett in his interpretation of Frege (cf. 1981: 194, 229) and still later more cogently explored by Tugendhat under some Husserlian influence.
 The ideal language tradition (inspired by the logical analysis of language) and the ordinary language tradition (inspired by the real workings of natural language) represent opposed (though arguably complementary) views. The first was founded by Frege, Russell and the early Wittgenstein. It was also strongly associated with philosophers of logical positivism, particularly Rudolf Carnap. With the rise of Nazism in Europe, most philosophers associated with logical positivism fled to the USA, where they strongly influenced American analytical philosophy. The philosophies of W. V. O. Quine, Donald Davidson, and later Kripke, Putnam and David Kaplan, along with the present mainstream philosophy of language with its metaphysics of reference, are in indirect ways later products of ideal language philosophy. The ordinary language tradition, in its turn, was represented after the Second World War by the Oxford School. It was inspired by the analysis of what Austin called ‘the whole speech act in the total speech situation’. Its main theorists were J. L. Austin, Gilbert Ryle and P. F. Strawson, although it had an antecedent in the later philosophy of Wittgenstein and still earlier in G. E. Moore’s commonsense approach. Ordinary language philosophy also affected American philosophy through relatively isolated figures like Paul Grice and John Searle, whose academic influence was predictably not as great. For the historical background, see J. O. Urmson (1956).