
Thursday, June 28, 2018

PHILOSOPHY OF LANGUAGE

This is a draft of the introduction to the book Philosophical Semantics, to be published by Cambridge Scholars Publishing in 2018/2.


PHILOSOPHY OF LANGUAGE: A METHODOLOGICAL APPROACH



The old orthodoxy in the philosophy of language that prevailed during the first half of the twentieth century was marked by an insistence on the centrality of meaning, an eroded semantic principle of verifiability, naive correspondentialism, an elementary distinction between the analytic and the synthetic, crude descriptivist-internalist theories of proper names and general terms, a monolithic dichotomy between the necessary a priori and the contingent a posteriori… Could it nevertheless have come closer to the truth than the now dominant causal-externalist orthodoxy?
     This book was written in the conviction that this question should be answered affirmatively. I am convinced that the philosophy of language of the first half of the twentieth century, which formed the bulk of the old orthodoxy, was often more virtuous, more comprehensive, more profound and closer to the truth than the approaches of the new orthodoxy, and that its rough-hewn insights were often more powerful, particularly in the works of philosophers like Wittgenstein, Frege, Russell and even Husserl. My conjecture is that the reason lies in the socio-cultural background. Even if it is also motivated by a desire to approach the kind of consensual truth attainable only by science, philosophy has its own epistemic place as a conjectural cultural endeavor, unavoidably harboring metaphorical components comparable to those of the fine arts and comprehensive aims comparable to those of religion, even though it remains independent of both (Costa 2002). At its best, the first half of the twentieth century preserved these traits. One reason might be that this was still a very elitist and hierarchical intellectual world, whereas our present academic world is much more regimented by a scientifically oriented pragmatic society, which does not make it the best place for philosophy as an effort to achieve surveyability. A more important reason is that great culture is the result of great conflict. And the period between the end of the nineteenth century and the Second World War was a time of increasing social turmoil with tragic dimensions. This conflict cast doubt on all established cultural values, creating the right atmosphere for the emergence of intellectuals and artists disposed to develop sweepingly original innovations. This could be felt not only in philosophy and the arts, but also in fields reserved for the particular sciences.
     Philosophy of language since the Second World War has largely been a form of well-established academic ‘normal philosophy,’ to adapt Thomas Kuhn’s term. On the one hand, it has continued the old orthodoxy, represented in the writings of philosophers like John Austin, P. F. Strawson, Michael Dummett, John Searle, Ernst Tugendhat, Jürgen Habermas… whose side I usually take. On the other hand, we have seen the emergence of what is called the new orthodoxy, founded by Saul Kripke and Keith Donnellan in the early seventies and later elaborated by Hilary Putnam, David Kaplan, and many others. In opposition to the old orthodoxy, this approach emphasizes externalism about meaning, causalism, and anti-cognitivism. This new orthodoxy has become the mainstream position in contemporary philosophy of language.
     I do not deny the philosophical relevance of this new orthodoxy. Nor do I reject its originality and dialectical force. Perhaps I am more indebted to it than I wish to admit. Nevertheless, it has long since lost much of its creative impetus and has now turned into a kind of scholastic discussion among specialists. Moreover, the value of the new orthodoxy in the philosophy of language is, in my judgment, predominantly negative, since most of its conclusions fall short of the truth. This means that the significance of its ideas consists mostly in their being dialectically relevant challenges which, I believe, can be adequately answered by an improved reformulation of the old, primarily descriptivist-internalist-cognitivist views of meaning and its connection with reference, views that are to some extent developed in the present book. Indeed, I intend to show that the views of the old orthodoxy can be reformulated in much more sophisticated ways, not only answering the challenges of the new orthodoxy, but also suggesting solutions to problems that contemporary philosophy of language has not addressed as well as it should.
     My approach to the topics considered here consists in gradually developing and defending a primarily internalist, cognitivist and neodescriptivist analysis of the nature of the cognitive meanings of our expressions and their inherent mechanisms of reference. This approach will be indirect, since the analysis will be supported by a critical examination of some central views of traditional analytic philosophy, particularly those of Wittgenstein and Frege. Furthermore, these explanations will be supplemented by a renewed reading and defense of the idea that existence is a higher-order property, a detailed revaluation of the verificationist explanation of cognitive meaning, and a reassessment of the correspondence theory of truth, which I see as complementary to the form of verificationism developed here, as involving coherence, and as dependent on a correct treatment of the epistemic problem of perception.
     The obvious assumption that makes my project prima facie plausible is the idea that language is a system of rules, some of which must be the proper sources of meaning. Following Ernst Tugendhat, I assume that the most central meaning-rules are those responsible for what Aristotle called apophantic speech: representational discourse, whose meaning-rules I call semantic-cognitive rules. Indeed, it seems at first highly plausible to think that the cognitive meaning (i.e., informative content, not mere linguistic meaning) of our representational language cannot be given by anything other than semantic-cognitive rules or associations of such rules. Our knowledge of these typically conventional rules is – as will be shown – usually tacit, implicit, non-reflective. That is, we are able to use them correctly in a cognitive way, though we face almost insurmountable difficulties when trying to analyze them in a linguistically explicit way, particularly when they belong to philosophically relevant concepts.
     My ultimate aim is to investigate the structure of semantic-cognitive rules by examining our basic referential expressions – singular terms, general terms and declarative sentences – in order to furnish an appropriate explanation of their reference mechanisms. In the present book, I do this only partially, often in the appendices, summarizing ideas already presented in my last book (2014, Chs. 2 to 4) and aware that they still require development. I proceed in this way because, in the main text of the present book, my chief concern is rather to justify and clarify my own assumptions about the philosophy of meaning and reference.
1. Ernst Tugendhat’s analysis of singular predicative statements
In developing these views, I soon realized that my main goal could be seen as essentially a way to revive a program already speculatively developed by Ernst Tugendhat in his classic work Traditional and Analytical Philosophy: Lectures on the Philosophy of Language.[1] This book, first published in 1976, can be considered the swan song of the old orthodoxy, defending a non-externalist and, strictly speaking, non-causalist program that was gradually forgotten during the following decades under the ever-growing influence of the new causal-externalist orthodoxy. Tugendhat’s strategy in developing this program can be understood, at its core, as a semantic analysis of the fundamental singular predicative statement. This statement is not only epistemically fundamental but also the indispensable basis for building our first-order truth-functional language. In summary, given a statement of the form Fa, he suggested that:

1)     the meaning of the singular term a should be its identification rule (Identifikationsregel);
2)     the meaning of the general term F should be its application rule (Verwendungsregel), which I also call a characterization or (preferably) an ascription rule;
3)     the meaning of the complete singular predicative statement Fa should be its verifiability rule (Verifikationsregel), which results from the collaborative application of the first two rules.
(Cf. Tugendhat & Wolf 1983: 235-6; Tugendhat 1976: 259, 484, 487-8)

In this case, the verifiability rule is obtained by the sequential application of the first two rules, in such a way that the identification rule of the singular term must be applied first so that the general term’s ascription rule can then be applied. Consider, for instance, Yuri Gagarin, the first man to orbit the Earth above its atmosphere, who gazed out of his space capsule and exclaimed: ‘The Earth is blue!’ To make this a true statement, he must first have identified the Earth by applying the identification rule of the proper name ‘Earth.’ Then, based on the result of this application, he was able to apply the ascription rule of the predicative expression ‘…is blue.’ In this form of combined application, these two rules work as a kind of verifiability rule for the statement ‘The Earth is blue.’ That is, if these rules can be conjunctively applied, the statement is true; otherwise, it is false. Tugendhat saw this not only as a form of verificationism, but also as a kind of correspondence theory of truth – a conclusion that I find correct, although it is rejected by some of his readers.[2]
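     To make the order of application explicit, the composition of these rules can be modeled in a minimal Python sketch. This is only my own illustration, not Tugendhat’s formalism: the toy ‘world’ dictionary and the names identification_rule, ascription_rule and verifiability_rule are invented for the example.

# Illustrative sketch only: a toy world mapping identified objects
# to the properties that can be truly ascribed to them.
world = {'Earth': {'blue', 'planet'}}

def identification_rule(name):
    # Identification rule of the singular term: pick out the object, if any.
    return name if name in world else None

def ascription_rule(obj, predicate):
    # Ascription rule of the general term: check whether it applies to the object.
    return predicate in world[obj]

def verifiability_rule(name, predicate):
    # Verifiability rule of 'Fa': the identification rule is applied first,
    # and only its result can feed the ascription rule.
    obj = identification_rule(name)
    if obj is None:
        return False  # nothing identified, hence nothing to verify
    return ascription_rule(obj, predicate)

print(verifiability_rule('Earth', 'blue'))  # True: 'The Earth is blue'
print(verifiability_rule('Earth', 'red'))   # False

The design point is simply that ascription_rule never receives an argument except through a prior, successful application of identification_rule – the sequential order Tugendhat’s analysis requires.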
     In order to test Tugendhat’s view, we can critically ask whether it is not possible to apply the ascription rule of a predicative expression first. For example, suppose that one night you see something burning at a distance without knowing what is on fire. Only after approaching it do you see that it is an old, abandoned factory. It may seem that in this example you first applied the ascription rule and only later the identification rule. However, this suggestion overlooks the fact that in order to see the fire one must first direct one’s eyes to a certain spatiotemporal spot, thereby localizing the individual place where something is on fire. Hence, a primitive identification rule for a place at a certain time must first be generated and applied.
     That is, initially the statement will not be ‘That old building is on fire,’ but simply ‘Over there… is fire.’ Later on, when you are closer to the building, you can make a more precise statement. In the same way, while looking out of his space capsule’s porthole, Gagarin could think, ‘Out there below the porthole it is blue,’ before saying ‘The Earth is blue.’ But even in this case, the ascription rule cannot be applied without the earlier application of some identification rule, even one that is only able to identify a vague spatiotemporal region from the already identified porthole. To press the objection, one could consider a statement like ‘It is all white fog.’ Nevertheless, even here ‘It is all…’ expresses an identification rule (of my whole visual field covering the place where I am right now) for the singular term, while ‘…white fog’ expresses the ascription rule that can afterward be applied to the whole place where I am. Even if there is no real property, as when I state ‘It is all darkness,’ what I mean can be translated into the true statement ‘Here and now there is no light.’ And from this statement, it is clear that I first apply the indexical identification rule for the here and now and only afterward see the inapplicability of the ascription rule for light, expressed by the negation ‘…there is no light’ corresponding to the predicative expression ‘…is all darkness.’
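     The same toy schema accommodates this indexical case. In the following equally hypothetical extension, the identification rule takes a context of utterance and picks out a spatiotemporal region rather than a named object; the context and region labels are again invented for the example.

def indexical_identification_rule(context):
    # A primitive identification rule: pick out the spatiotemporal region
    # given by the context of utterance ('over there,' 'here and now').
    return context.get('region')

def indexical_verifiability_rule(context, predicate, regions):
    # The same order of application as before: first identify the region,
    # then apply the ascription rule to what was identified.
    region = indexical_identification_rule(context)
    if region is None:
        return False
    return predicate in regions.get(region, set())

regions = {'over there': {'fire'}}
print(indexical_verifiability_rule({'region': 'over there'}, 'fire', regions))  # True: 'Over there… is fire'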
     Tugendhat reached his conclusions through purely speculative considerations, without analyzing the structure of these rules and without answering the many obvious external criticisms of the program, such as the numerous well-known objections already raised against verificationism. What is extraordinary, however, is that he was arguably right, since the present book will make his main views hard to contest.[3]








[1] Original German title: Vorlesungen zur Einführung in die sprachanalytische Philosophie.
[2] An antecedent of this is J. L. Austin’s correspondence view, according to which an indexical statement (e.g., ‘This rose is red’) is said to be true when the historical fact correlated with its demonstrative convention (here represented by the demonstrative ‘this’) is of the type established by the sentence’s descriptive convention (the red rose type) (Austin 1950: 122). This was a first approximation of conventionalist strategies later employed by Dummett in his interpretation of Frege (Cf. 1981: 194, 229) and still later more cogently explored by Tugendhat under some Husserlian influence.
[3] Tugendhat’s thesis passes over peculiarities of linguistic interaction. Consider a conversational implicature: – ‘Do you know how to cook?’ – ‘I am French,’ which implicates the statement ‘I know how to cook’ (Recanati 2004: 5). Obviously, this does not affect Tugendhat’s thesis, for the proper and the implied meanings conveyed by the statement ‘I am French’ would then both be established by means of verifiability rules.
[4] The ideal language tradition (guided by the logical analysis of language) and the natural language tradition (guided by the actual workings of natural language) represent opposed (though arguably also complementary) views. The first was founded by Frege, Russell and the early Wittgenstein. It was later strongly associated with the philosophers of logical positivism, particularly Rudolf Carnap. With the rise of Nazism in Europe, most philosophers associated with logical positivism fled to the USA, where they strongly influenced American analytic philosophy. The philosophies of W. V. O. Quine, Donald Davidson, and later Saul Kripke, Hilary Putnam and David Kaplan, along with the present mainstream philosophy of language, with its metaphysics of reference, are in indirect ways later American developments of ideal language philosophy. What I prefer to call the natural language tradition was represented after the Second World War in Oxford by the sometimes dogmatically restrictive ‘ordinary language philosophy.’ Its main theorists were J. L. Austin, Gilbert Ryle, and P. F. Strawson, although it had an antecedent in the less restrictive natural language philosophy of the later Wittgenstein and, still earlier, in G. E. Moore’s commonsense approach. Natural language philosophy also affected American philosophy through relatively isolated figures like Paul Grice and John Searle, whose academic influence has predictably not been as great. For the initial historical background, see J. O. Urmson (1956).
[5] After his broad exposition of contemporary philosophy, K. A. Appiah concluded: ‘The subject is not a collection of separate problems that can be addressed independently. Issues in epistemology and the philosophy of language reappear in the discussions of philosophy of mind, morals, politics, law, science, and religion… What is the root of the philosophical style is a desire to give a general and systematic account of our thought and experience, one that is developed critically, in the light of evidence and arguments.’ (2003: 377-378) Because of this, the hardest task for those committed to comprehensive coherence is to reach a position that enables the evaluation of the slightest associations among issues belonging to the most diverse domains of our conceptual network (Cf. Kenny 1993: 9).
