
Wednesday, September 6, 2017

INTRODUCTION TO "PHILOSOPHICAL SEMANTICS"

Introduction to Philosophical Semantics - draft version (final version published in 2018 by CSP)



- I -
INTRODUCTION


Logic, I should maintain, must no more admit a unicorn than zoology can; for logic is concerned with the real world just as truly as zoology, though with its more abstract and general features.
Bertrand Russell

A philosophical tradition which suffers from the vice of horror mundi in an endemic way is condemned to futility.
Kevin Mulligan, Peter Simons, Barry Smith

The old orthodoxy of the philosophy of language that prevailed during the first half of the twentieth century was marked by an insistence on the centrality of meaning, an eroded semantic principle of verifiability, naïve correspondentialism, an elementary distinction between analytic and synthetic, crude descriptivist-internalist theories of proper names and general terms, a monolithic dichotomy between the necessary a priori and the contingent a posteriori… Could it nevertheless come closer to the truth than the now dominant causal-externalist orthodoxy?
  This book was written in the conviction that this question should be answered affirmatively. I am convinced that the philosophy of language of the first half of the twentieth century that formed the bulk of the old orthodoxy was often more virtuous, more comprehensive, more profound and closer to the truth than the approaches of the new orthodoxy, and that its rough-hewn insights were often more powerful, particularly in the works of philosophers like Wittgenstein, Frege, Russell and Husserl. My conjecture is that the reason lies in the socio-cultural background. Although increasingly influenced by science, philosophy in itself has its own epistemic place as a cultural conjectural endeavor, harboring quasi-aesthetic metaphorical components and quasi-mystical comprehensive aims (Costa 2002). The first half of the twentieth century preserved these traits. It was still a highly hierarchical intellectual world, while our present world is much more levelled, which does not make it the best place for high cultural development. Moreover, great culture is the result of great conflict. And the period between the end of the nineteenth century and the Second World War was one of increasing social turmoil. This conflict cast doubt on all established cultural values, creating the right atmosphere for intellectuals and artists disposed to develop sweepingly original innovations. This could be felt not only in philosophy and the arts, but also in fields reserved to the sciences.
  Philosophy of language since the Second World War has been much more a form of strongly established academic ‘normal philosophy.’  On the one hand, it was a continuation of the old orthodoxy, represented in the writings of philosophers like John Austin, P. F. Strawson, Michael Dummett, Ernst Tugendhat, Jürgen Habermas and John Searle… whose side I take. On the other hand, we have seen the emergence of what I call the new orthodoxy, founded by Saul Kripke and Keith Donnellan in the early seventies and later elaborated by Hilary Putnam, David Kaplan and many others. In opposition to the old orthodoxy, this approach emphasizes externalism about meaning, causalism and anti-cognitivism. This new orthodoxy has become the contemporary mainstream in philosophy of language.
  I do not deny the relevance of this new orthodoxy. Nor do I reject its originality and dialectical force. Perhaps I am more indebted to it than I would like to admit. Nevertheless, the new orthodoxy has long since lost much of its creative power, and it now risks transforming itself into a kind of scholastic discussion among specialists. Moreover, the value of the new orthodoxy in philosophy of language is in my judgment predominantly negative, since I think most of its conclusions fall short of the truth. This means that the significance of its ideas consists mostly in their being dialectically relevant challenges, which, I believe, could be adequately answered by an enriched reformulation of the old, primarily descriptivist-internalist-cognitivist views of meaning and reference that are to some extent developed in the present book. Indeed, I intend to show that the views of the old orthodoxy could be reformulated in much more sophisticated ways, not only answering the challenges of the new orthodoxy, but also addressing problems that contemporary philosophy of language has not properly addressed.
  My approach to the topics considered here consists in gradually developing and defending a primarily internalist, cognitivist and neo-descriptivist analysis of the nature of the cognitive meanings of our expressions and of what I regard as their inherent mechanisms of reference. But this approach will be indirect, since the analysis will usually be supported by a critical examination of some central views of traditional analytic philosophy, particularly those of Wittgenstein and Frege. Furthermore, such explanations will be supplemented by a renewed reading and defense of the idea that existence is a higher-order property, a detailed reconsideration of the verificationist explanation of meaning, and a reassessment of the adequation theory of truth, which I see as complementary to the suggested form of verificationism and dependent on a correct treatment of the epistemic problem of perception.
  The obvious assumption that makes my project prima facie plausible is the idea that language is a system of rules, some of which are more properly sources of meaning. The most central meaning-rules are those responsible for what Aristotle called apophantic speech – representational discourse, whose meaning-rules I call semantic-cognitive rules. Indeed, it seems at first highly plausible to think that the cognitive meaning (i.e., informative content and not mere linguistic meaning) of our representational language cannot be given by anything other than semantic-cognitive rules or combinations of such rules. Our knowledge of these typically conventional rules is – as will be defended – usually tacit, implicit, and non-reflexive. That is, we are able to use them correctly, but we are almost never able to analyze them in a linguistically explicit way.
  My ultimate aim is to investigate the structure of semantic-cognitive rules by examining our basic referential expressions, which are singular terms, general terms and, in a sense, declarative sentences, in order to furnish an appropriate explanation of their reference mechanisms. In the present book, I do this only very partially, often in the appendices, summarizing ideas already presented in my last book (2014, Ch. 2, 3, 4), though aware that they still require development. I proceed in this way because in the main text of the present work my central aim is rather to justify and clarify my own assumptions concerning the philosophy of meaning and reference.
  In developing these views, I realized in retrospect that my main goal could be seen as essentially a way to revive a program already speculatively developed by Ernst Tugendhat in his classic work Vorlesungen zur Einführung in die sprachanalytische Philosophie.[1] This book, published in 1976, can be considered the swansong of the old orthodoxy, defending a non-externalist and basically non-causalist program that was gradually abandoned during the next decade under the ever-growing influence of the new causal-externalist orthodoxy. Tugendhat’s strategy in developing this program can be understood at its core as a semantic analysis of the fundamental singular predicative statement. This statement is not only epistemically fundamental; it is also the indispensable basis for building our first-order truth-functional language.[2] In summary, considering a statement of the form Fa, he suggested that:[3]

1) the meaning of the singular term a should be given by its identification rule (Identifikationsregel),
2) the meaning of the general term F should be given by its application rule (Verwendungsregel), which I also call a characterization or (preferably) ascription rule,
3) the meaning of the complete singular predicative statement Fa should be given by its verifiability rule (Verifikationsregel), which results from the combined application of the first two rules.
(cf. Tugendhat & Wolf 1983: 235-6; Tugendhat 1976: 259, 484, 487-8).

In this case, the verifiability rule is obtained by jointly applying the first two rules in such a way that the identification rule of the singular term must be applied first, in order to then use the general term’s ascription rule. Thus, for instance, Yuri Gagarin, the first man to orbit the Earth above its atmosphere, gazed out of his space capsule and exclaimed: ‘The Earth is blue!’ In order to make this a true statement, he must first have identified the Earth by applying the identification rule of the proper name ‘Earth.’ Then, based on the result of this application, he would have been able to apply the ascription rule of the predicative expression ‘…is blue.’ In this form of combined application, these two rules work as a kind of verifiability rule for the statement ‘The Earth is blue.’ That is: if these rules can be conjunctively applied, the statement is true; otherwise, it is false. Tugendhat saw this not only as a form of verificationism, but also as a kind of adequation theory of truth – a conclusion that I find correct, although it is rejected by some of his readers.
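  The order of application can also be pictured schematically. What follows is only a rough illustrative sketch, modeling the two rules as simple functions and their combined application as a procedure; the names used (verifiability_rule, identify_earth, is_blue and so on) are merely illustrative labels and form no part of Tugendhat’s own account:

```python
# Illustrative sketch only: Tugendhat's three rules modeled as functions.
# The identification rule maps a context of use to an identified object (or
# nothing); the ascription rule tests whether the general term applies to it.

from typing import Any, Callable, Optional

IdentificationRule = Callable[[Any], Optional[Any]]  # context -> identified object, or None
AscriptionRule = Callable[[Any], bool]               # object -> does the predicate apply?

def verifiability_rule(identify: IdentificationRule,
                       ascribe: AscriptionRule,
                       context: Any) -> Optional[bool]:
    """Combined application: apply the identification rule first,
    and only then apply the ascription rule to its result."""
    obj = identify(context)      # step 1: identify the referent of the singular term
    if obj is None:              # nothing identified: the statement cannot be verified
        return None
    return ascribe(obj)          # step 2: check whether the general term applies

# Toy version of 'The Earth is blue':
context = {"visible_objects": [{"name": "Earth", "colour": "blue"}]}
identify_earth = lambda ctx: next(
    (o for o in ctx["visible_objects"] if o["name"] == "Earth"), None)
is_blue = lambda obj: obj["colour"] == "blue"

print(verifiability_rule(identify_earth, is_blue, context))  # -> True
```

The point of this sketch is merely to display the asymmetry defended below: the ascription rule operates on the output of the identification rule, so the latter must in some way be applied first.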
  In order to test Tugendhat’s view, we can critically ask whether it is not possible that we really first apply the ascription rule of a predicative expression. For example, suppose that one night you see something burning at a distance without knowing what is on fire. Only after approaching it do you see that it is an old, abandoned factory. It may seem that in this example you first applied the ascription rule and later the identification rule. However, in suggesting this you forget that to see the fire you must first direct your eyes at a certain spatio-temporal spot, thereby localizing the individualized place where something is on fire. Hence, a primitive identification rule for a place was first formed and applied.
  That is, initially the statement will not be: ‘That old building is on fire,’ but simply ‘Over there… is fire.’ Later on, when you are closer to the building, you can make a more precise statement. Thus, in the same way, while looking out of his space capsule’s porthole, Gagarin could think, ‘Out under the porthole there is blue’, before saying ‘The Earth is blue’. Even in this case, the ascription rule cannot be applied without the earlier application of some identification rule, even if it is one that is only able to identify a vague spatio-temporal region from the porthole. To expand on the objection, we could consider a statement like ‘It is all white fog.’ Nevertheless, even here, ‘It is all…’ expresses an identification rule (of my whole visual field covering the place where I am now) for the singular term, while ‘…white fog’ expresses the ascription rule that can afterwards be applied to the place where I am. Even if there is no property, as when I state ‘It is all darkness,’ what I mean is that the statement ‘Here and now there is no light’ is true. And from this statement it is clear that I first apply the indexical identification rule for the here and now and afterwards see the inapplicability of the ascription rule for light expressed by the negation or by the predicative expression ‘…is all darkness.’
  Tugendhat reached his conclusions through purely speculative considerations, without analyzing the structure of these rules and without answering the many obvious external criticisms of the program, like the numerous well-known objections already made against verificationism. But what is extraordinary is that he was arguably right, since I believe the present book will make it hard to contest his main views.
  My methodological strategies, as will be seen, are also different from those used in the more formally oriented approaches criticized in this book, which seem to be mostly inherited from some mixture of pragmatism and the philosophy of ideal language in its positivistic developments. My approach is primarily oriented by the communicative and social roles of language, which can be regarded as the fundamental units of analysis. It must be so because I assume that a philosophical approach must be as comprehensive as possible and that an all-inclusive understanding of language and meaning must emphasize their unavoidable involvement in overall societal life. This means that I am more influenced by the so-called natural language tradition than by the ideal language tradition, being led to assign heuristic value to common sense and natural language intuitions, often seeking support in a more careful examination of concrete examples of how linguistic expressions are effectively employed in chosen contexts.[4]
  Finally, my approach is systematic, which means that coherence plays a heuristic role in it. The chapters of this book are interconnected, so that the plausibility of each is better supported when regarded in its relation to arguments developed in the preceding chapters and their often critical appendices. Though complementary, these appendices are added as counterpoints to the chapters, aiming to justify the views expressed there, if not to add something relevant to them.








[1] English translation: Traditional and Analytical Philosophy: Lectures on the Philosophy of Language (2016).
[2] In this book, I use the word ‘statement’ in most cases as referring to the speech act of making an assertion.
[3] An antecedent of this is J. L. Austin’s correspondence view, according to which an indexical statement (e.g. ‘This rose is red’) is said to be true when the historical fact correlated with its demonstrative convention (here represented by the demonstrative ‘this’) is of the type established by the sentence’s descriptive convention (the red rose type) (Austin 1950: 122). This is a first approximation of conventionalist strategies later employed by Dummett in his interpretation of Frege (cf. 1981: 194, 229) and still later more cogently explored by Tugendhat under some Husserlian influence.
[4] The ideal language tradition (inspired by the logical analysis of language) and the natural language tradition (inspired by the real workings of natural language) represent opposed (though arguably also complementary) views. The first was founded by Frege, Russell and the early Wittgenstein. It was also strongly associated with the philosophers of logical positivism, particularly Rudolf Carnap. With the rise of Nazism in Europe, most philosophers associated with logical positivism fled to the USA, where they strongly influenced American analytic philosophy. The philosophies of W. V. O. Quine, Donald Davidson, and later Kripke, Putnam and David Kaplan, along with the present mainstream philosophy of language with its metaphysics of reference, are in indirect ways later developments of ideal language philosophy. The natural (or ordinary) language tradition, in turn, was represented after the Second World War by the Oxford School. It was inspired by the analysis of what Austin called ‘the whole speech act in the total speech situation.’ Its main theorists were J. L. Austin, Gilbert Ryle and P. F. Strawson, although it had an antecedent in the later philosophy of Wittgenstein and, still earlier, in G. E. Moore’s commonsensical approach. Natural language philosophy also affected American philosophy through relatively isolated figures like Paul Grice and John Searle, whose academic influence has predictably not been as great. For the historical background, see J. O. Urmson (1956).
