This is only an advertisement for the 490-page book I will publish in the coming months with Cambridge Scholars Publishing. The present draft shows what I am trying to do.
Logic, I should maintain, must no more admit a unicorn than zoology can; for logic is concerned with the real world just as truly as zoology, though with its more abstract and general features.
—Bertrand Russell
A philosophical tradition which suffers from the vice of horror mundi in an endemic way is condemned to futility.
—Kevin Mulligan, Peter Simons, Barry Smith
The old orthodoxy of the philosophy of language that prevailed during the first half of the twentieth century was marked by an insistence on the centrality of meaning, an eroded semantic principle of verifiability, naive correspondentialism, an elementary distinction between analytic and synthetic, crude descriptivist-internalist theories of proper names and general terms, a monolithic dichotomy between the necessary a priori and the contingent a posteriori… Could it nevertheless come closer to the truth than the now dominant causal-externalist orthodoxy?
This book was written in the conviction that this question should be answered affirmatively. I am convinced that the philosophy of language of the first half of the twentieth century that formed the bulk of the old orthodoxy was often more virtuous, more comprehensive, more profound and closer to the truth than the approaches of the new orthodoxy, and that its rough-hewn insights were often more powerful, particularly in the works of philosophers like Wittgenstein, Frege, Russell and also Husserl. My conjecture is that the reason lies in the socio-cultural background. Even if it is also motivated by a desire to approach the authentic consensual truth only possible for science, philosophy has its own epistemic place as a cultural, conjectural endeavor, inevitably harboring metaphorical components comparable to those of the fine arts and comprehensive aims comparable to those of religion, even if it remains in itself independent of both (Costa 2002). At its best, the first half of the twentieth century preserved these traits. One reason might be that this was still a very elitist and hierarchical intellectual world, while our present world has been much more leveled by a scientifically oriented pragmatic society, which does not make it the best place for philosophy as an effort to reach surveyability. Another reason is that great culture is the result of great conflict that puts received views in doubt. And the period between the end of the nineteenth century and the Second World War was one of increasing social turmoil. This conflict cast doubt on all established cultural values, creating the right atmosphere for intellectuals and artists motivated to develop sweepingly original innovations. This could be felt not only in philosophy and the arts, but also in fields reserved for the particular sciences.
Philosophy of language since the Second World War has been much more a form of strongly established academic ‘normal philosophy,’ to borrow Thomas Kuhn’s term. On the one hand, it was a continuation of the old orthodoxy, represented in the writings of philosophers like John Austin, P. F. Strawson, Michael Dummett, John Searle, Ernst Tugendhat, Jürgen Habermas… whose side I usually take. On the other hand, we have seen the emergence of what I call the new orthodoxy, founded by Saul Kripke and Keith Donnellan in the early seventies and later elaborated by Hilary Putnam, David Kaplan and many others. In opposition to the old orthodoxy, this approach emphasizes externalism about meaning, causalism and anti-cognitivism. This new orthodoxy has become the contemporary mainstream position in philosophy of language.
I do not deny the philosophical relevance of this new orthodoxy, nor do I reject its originality and dialectical force. Nevertheless, the new orthodoxy has long since lost much of its creative impetus and has now transformed itself into a kind of scholastic discussion among specialists. Moreover, the value of the new orthodoxy in philosophy of language is in my judgment predominantly negative, since most of its conclusions fall short of the truth. This means that the significance of its ideas consists mostly in their being dialectically relevant challenges which, I believe, can be adequately answered by an improved reformulation of the old, primarily descriptivist-internalist-cognitivist views of meaning and reference that are to some extent developed in the present book. Indeed, I intend to show that the views of the old orthodoxy can be reformulated in much more sophisticated ways, not only answering the challenges of the new orthodoxy, but also suggesting solutions to problems that contemporary philosophy of language has not addressed as well as it should.
My approach to the topics considered here consists in gradually developing and defending a primarily internalist, cognitivist and neo-descriptivist analysis of the nature of the cognitive meanings of our expressions and their inherent mechanisms of reference. But this approach will be indirect, since the analysis will be supported by a critical examination of some central views of traditional analytic philosophy, particularly those of Wittgenstein and Frege. Furthermore, these explanations will be supplemented by a renewed reading and defense of the idea that existence is a higher-order property, a detailed reconsideration of the verificationist explanation of cognitive meaning, and a reassessment of the correspondence theory of truth, which I see as involving coherence, as complementary to the form of verificationism suggested here, and as dependent on a correct treatment of the epistemic problem of perception.
The obvious assumption that makes my project prima facie plausible is the idea that language is a system of rules, some of which are the proper sources of meaning. Following Ernst Tugendhat, I assume that the most central meaning-rules are those responsible for what Aristotle called apophantic speech: representational discourse, whose meaning-rules I call semantic-cognitive rules. Indeed, it seems at first highly plausible to think that the cognitive meaning (i.e., the informative content and not the mere linguistic meaning) of our representational language cannot be given by anything other than semantic-cognitive rules or associations of such rules. Our knowledge of these typically conventional rules is – as will be shown – usually tacit, implicit, non-reflexive. That is, we are able to use them correctly in a cognitive way, though we find almost insurmountable difficulties when trying to analyze them in a linguistically explicit way when they belong to philosophically relevant concepts.
My ultimate aim is to investigate the structure of semantic-cognitive rules by examining our basic referential expressions – singular terms, general terms and also declarative sentences – in order to furnish an appropriate explanation of their reference mechanisms. In the present book, I do this only partially, often in the appendices, summarizing ideas already presented in my last book (2014, Chs. 2 to 4), aware that they still require development. I proceed in this way because in the main text of the present book my main concern is rather to justify and clarify my own assumptions about the philosophy of meaning and reference.
1. Ernst Tugendhat’s analysis of singular predicative statements
In developing these views, I soon realized that my main goal could be seen as essentially a way to revive a program already speculatively developed by Ernst Tugendhat in his classic work Traditional and Analytical Philosophy: Lectures on the Philosophy of Language (2016). This book, first published in German in 1976, can be considered the swan song of the old orthodoxy, defending a non-externalist and not properly causalist program that was gradually forgotten during the following decades under the ever-growing influence of the new causal-externalist orthodoxy. Tugendhat’s strategy in developing this program can be understood in its core as a semantic analysis of the fundamental singular predicative statement. This statement is not only epistemically fundamental; it is also the indispensable basis for building our first-order truth-functional language. In summary, given a statement of the form Fa, he suggested that:
1) the meaning of the singular term a should be its identification rule (Identifikationsregel),
2) the meaning of the general term F should be its application rule (Verwendungsregel), which I also call a characterization or (preferably) an ascription rule,
3) the meaning of the complete singular predicative statement Fa should be its verifiability rule (Verifikationsregel), which results from the collaborative application of the first two rules.
(Cf. Tugendhat & Wolf 1983: 235-6; Tugendhat 1976: 259, 484, 487-8).
In this case, the verifiability rule is obtained by the sequential application of the first two rules, in such a way that the identification rule of the singular term must be applied first, in order then to apply the general term’s ascription rule. Thus, for instance, Yuri Gagarin, the first man to orbit the Earth above its atmosphere, gazed out of his space capsule and exclaimed: ‘The Earth is blue!’ In order to verify this statement, he first had to identify the Earth by applying the identification rule of the proper name ‘Earth.’ Then, based on the result of this application, he was able to apply the ascription rule of the predicative expression ‘…is blue.’ In this form of combined application, these two rules work as a kind of verifiability rule for the statement ‘The Earth is blue.’ That is: if these rules can be conjunctively applied, the statement is true; otherwise, it is false. Tugendhat saw this not only as a form of verificationism, but also as a kind of correspondence theory of truth – a conclusion that I find correct, although it has been rejected by some of his readers.
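To make the compositional structure of this example explicit, the following schema restates the sequence of applications in a semi-formal way. The notation is my own, not Tugendhat’s: ‘IdR(a)’ abbreviates the identification rule of the singular term a, ‘AscR(F)’ the ascription rule of the general term F, and ‘VerR(Fa)’ the resulting verifiability rule:

\[ \mathrm{VerR}(Fa):\quad Fa \ \text{is true} \iff \mathrm{IdR}(a)\ \text{picks out an object}\ o\ \text{and}\ \mathrm{AscR}(F)\ \text{applies to}\ o;\ \text{otherwise}\ Fa\ \text{is false.} \]

The order of application matters: AscR(F) takes as its input whatever object IdR(a) delivers, which is why, on this view, the identification rule must be applied first.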
In order to test Tugendhat’s view, we can ask whether it is not possible that we really first apply the ascription rule of a predicative expression. For example, suppose that one night you see something burning at a distance without knowing what is on fire. Only after approaching it do you see that it is an old, abandoned factory. It may seem that in this example you first applied the ascription rule and only later the identification rule. However, in suggesting this you forget that in order to see the fire one must first direct one’s eyes to a certain spatio-temporal spot, thereby localizing the individualized place where something is on fire. Hence, a primitive identification rule for a place at a certain time was first generated and applied.
That is, initially the statement will not be ‘That old building is on fire,’ but simply ‘Over there… is fire.’ Later on, when you are closer to the building, you can make a more precise statement. Thus, in the same way, while looking out of his space capsule’s porthole, Gagarin could have thought ‘Out there below the porthole it is blue’ before saying ‘The Earth is blue.’ But even in this case, the ascription rule cannot be applied without the earlier application of some identification rule, even if it is one that is only able to identify a vague spatio-temporal region from the already identified porthole. To expand on the objection, we could consider a statement like ‘It is all white fog.’ Nonetheless, even here ‘It is all…’ expresses an identification rule (of my whole visual field covering the place where I am right now) for the singular term, while ‘…white fog’ expresses the ascription rule that can afterwards be applied to the whole place where I am. Even if there is no real property, as when I state ‘It is all darkness,’ what I mean can be translated into the true statement ‘Here and now there is no light.’ And from this statement it is clear that I first apply the indexical identification rule for the here and now and only afterwards see the inapplicability of the ascription rule for light expressed by the negation ‘…there is no light,’ corresponding to the predicative expression ‘…is all darkness.’
Tugendhat reached his conclusions through purely speculative considerations, without analyzing the structure of these rules and without answering the many obvious external criticisms of the program, like the numerous well-known objections already made against verificationism. But what is extraordinary is that he was arguably right, since the present book intends to show that his main views are hard to contest.
2. The virtue of comprehensiveness
My methodological strategies also differ from those used in the more formally oriented approaches criticized in this book, insofar as those approaches follow a positivist form of ideal language philosophy that hypostasizes form, ignoring or distorting empirical truisms. In opposition, my approach is primarily oriented by the communicative and social roles of language, which are regarded as the fundamental units of analysis. It must be so because I assume that the most proper philosophical approach must be as comprehensive as possible and that an all-inclusive understanding of language and meaning must fairly contemplate its unavoidable involvement in societal life as a whole. This means that I am more influenced by what could broadly be called the natural language tradition (understood in a broad, critical sense) than by much of the ideal language tradition, and am thus inclined to assign a fair amount of heuristic value to common sense and to the critical examination of natural language intuitions, often seeking support in a more careful examination of concrete examples of how linguistic expressions are actually employed in adequately chosen contexts.
Finally, my approach is systematic, which means that coherence plays a heuristic role in it. The chapters of this book are so interconnected that the plausibility of each is better supported when regarded in relation to the arguments developed in the preceding chapters and their often critical appendices. Though complementary, these appendices are included as a sometimes indispensable counterpoint to the chapters, aiming to better justify the views expressed, if not to add something relevant to them.
The whole inquiry strives toward comprehensiveness, aiming to reintegrate theoretical philosophy under the recognition that there is no philosophical question completely independent of all the others (Appiah 2003: 377). In this way it shows itself to be an attempt to analyze linguistically approached concepts like meaning, reference, existence and truth, insofar as they are internally associated with one another and, unavoidably, with a cluster of central metaphysical and epistemological framework concepts constitutive of our understanding of the world.
 Original German title: Vorlesungen zur Einführung in die sprachanalytische Philosophie.
 I use the word ‘statement’ in most cases as referring to the speech act of making an assertion.
 An antecedent of this is J. L. Austin’s correspondence view, according to which an indexical statement (e.g., ‘This rose is red’) is said to be true when the historical fact correlated with its demonstrative convention (here represented by the demonstrative ‘this’) is of the type established by the sentence’s descriptive convention (the red rose type) (Austin 1950: 122). This was a first approximation of conventionalist strategies later employed by Dummett in his interpretation of Frege (Cf. 1981: 194, 229) and still later more cogently explored by Tugendhat under some Husserlian influence.
 Tugendhat’s thesis has a wide scope. Consider a conversational implicature: – ‘Do you know how to cook?’ – ‘I am French,’ which implicates the statement ‘I know how to cook’ (Recanati 2004: 5). But this does not affect Tugendhat’s thesis, for the proper and implied meanings conveyed by the statement ‘I am French’ would then both be established by means of verifiability rules.
 The ideal language tradition (inspired by the logical analysis of language) and the natural language tradition (inspired by the real workings of natural language) represent opposed (though arguably also complementary) views. The first was founded by Frege, Russell and the early Wittgenstein. It was later also strongly associated with philosophers of logical positivism, particularly Rudolf Carnap. With the rise of Nazism in Europe, most philosophers associated with logical positivism fled to the USA, where they strongly influenced American analytic philosophy. The philosophies of W. V. O. Quine, Donald Davidson, and later Saul Kripke, Hilary Putnam and David Kaplan, along with the present mainstream philosophy of language, with its metaphysics of reference, are in indirect ways later American developments of ideal language philosophy. What I prefer to call the natural language tradition was represented after the Second World War by the Oxford School as the sometimes dogmatically restrictive ‘ordinary language philosophy.’ The latter was inspired by the analysis of what J. L. Austin called ‘the total speech act in the total speech situation.’ Its main theorists were Austin, Gilbert Ryle and P. F. Strawson, although it had an antecedent in the later natural language philosophy of Wittgenstein and, still earlier, in G. E. Moore’s commonsense approach. Natural language philosophy also affected American philosophy through relatively isolated figures like Paul Grice and John Searle, whose academic influence has predictably not been as great... For the initial historical background, see J. O. Urmson (1956).
 After his broad exposition of contemporary philosophy, K. A. Appiah concluded: ‘The subject is not a collection of separate problems that can be addressed independently. Issues in epistemology and the philosophy of language reappear in the discussions of philosophy of mind, morals, politics, law, science and religion… What is at the root of the philosophical style is a desire to give a general and systematic account of our thought and experience, one that is developed critically, in the light of evidence and arguments.’ (2003: 377-378) (Cf. also Kenny 1993: 9)