Some of the major issues at the intersection of philosophy of language and philosophy of mind are also dealt with in modern psycholinguistics. Some important questions include: How much of language is innate? Is language acquisition a special faculty in the mind? What is the connection between thought and language?
There are three general perspectives on the issue of language learning. The first is the behaviorist perspective, which holds that not only is the bulk of language learned, but it is learned via conditioning. The second is the hypothesis-testing perspective, which understands the child's learning of syntactic rules and meanings to involve the postulation and testing of hypotheses, through the use of the general faculty of intelligence. The third is the innatist perspective, which holds that at least some syntactic settings are innate and hardwired, based on certain modules of the mind.
There are varying notions of the structure of the brain when it comes to language. Connectionist models emphasize the idea that a person's lexicon and their thoughts operate in a kind of distributed, associative network. Nativist models assert that there are specialized devices in the brain that are dedicated to language acquisition. Computational models emphasize the notion of a representational language of thought and the logic-like, computational processing that the mind performs over such representations. Emergentist models focus on the notion that natural faculties are complex systems that emerge from simpler biological parts. Reductionist models attempt to explain higher-level mental processes in terms of the basic low-level neurophysiological activity of the brain.
Language and thought
An important problem which touches both philosophy of language and philosophy of mind is to what extent language influences thought and vice versa. There have been a number of different perspectives on this issue, each offering a number of insights and suggestions.
Linguists Sapir and Whorf suggested that language limited the extent to which members of a "linguistic community" can think about certain subjects (a hypothesis paralleled in George Orwell's novel Nineteen Eighty-Four). In other words, language was analytically prior to thought. Philosopher Michael Dummett is also a proponent of the "language-first" viewpoint.
The stark opposite to the Sapir–Whorf position is the notion that thought (or, more broadly, mental content) has priority over language. The "knowledge-first" position can be found, for instance, in the work of Paul Grice. Further, this view is closely associated with Jerry Fodor and his language of thought hypothesis. According to his argument, spoken and written language derive their intentionality and meaning from an internal language encoded in the mind. The main argument in favor of such a view is that the structure of thoughts and the structure of language seem to share a compositional, systematic character. Another argument is that it is difficult to explain how signs and symbols on paper can represent anything meaningful unless some sort of meaning is infused into them by the contents of the mind. One of the main arguments against such a view is that positing these levels of language can lead to an infinite regress. In any case, many philosophers of mind and language, such as Ruth Millikan, Fred Dretske and Fodor, have recently turned their attention to explaining the meanings of mental contents and states directly.
Another tradition of philosophers has attempted to show that language and thought are coextensive – that there is no way of explaining one without the other. Donald Davidson, in his essay "Thought and Talk", argued that the notion of belief could only arise as a product of public linguistic interaction. Daniel Dennett holds a similar interpretationist view of propositional attitudes. To an extent, the theoretical underpinnings of cognitive semantics (including the notion of semantic framing) suggest the influence of language upon thought. However, the same tradition views meaning and grammar as a function of conceptualization, making it difficult to assess in any straightforward way.
Some thinkers, like the ancient sophist Gorgias, have questioned whether or not language was capable of capturing thought at all.
...speech can never exactly represent perceptibles, since it is different from them, and perceptibles are apprehended each by the one kind of organ, speech by another. Hence, since the objects of sight cannot be presented to any other organ but sight, and the different sense-organs cannot give their information to one another, similarly speech cannot give any information about perceptibles. Therefore, if anything exists and is comprehended, it is incommunicable.
There are studies suggesting that languages shape how people understand causality, some of which were performed by Lera Boroditsky. For example, English speakers tend to say things like "John broke the vase" even for accidents, whereas Spanish or Japanese speakers would be more likely to say "the vase broke itself." In studies conducted by Caitlin Fausey at Stanford University, speakers of English, Spanish and Japanese watched videos of two people popping balloons, breaking eggs and spilling drinks either intentionally or accidentally. Later everyone was asked whether they could remember who did what. Spanish and Japanese speakers did not remember the agents of accidental events as well as English speakers did. In another study, English speakers watched the video of Janet Jackson's infamous "wardrobe malfunction", accompanied by one of two written reports. The reports were identical except in the last sentence, where one used the agentive phrase "ripped the costume" while the other said "the costume ripped." The people who read "ripped the costume" blamed Justin Timberlake more.
Russian speakers, who make an extra distinction between light and dark blue in their language, are better able to visually discriminate shades of blue. The Pirahã, a tribe in Brazil, whose language has only terms like few and many instead of numerals, are not able to keep track of exact quantities.
In one study German and Spanish speakers were asked to describe objects having opposite gender assignment in those two languages. The descriptions they gave differed in a way predicted by grammatical gender. For example, when asked to describe a "key" — a word that is masculine in German and feminine in Spanish — the German speakers were more likely to use words like "hard," "heavy," "jagged," "metal," "serrated," and "useful," whereas Spanish speakers were more likely to say "golden," "intricate," "little," "lovely," "shiny," and "tiny." To describe a "bridge," which is feminine in German and masculine in Spanish, the German speakers said "beautiful," "elegant," "fragile," "peaceful," "pretty," and "slender," and the Spanish speakers said "big," "dangerous," "long," "strong," "sturdy," and "towering." This was the case even though all testing was done in English, a language without grammatical gender.
In a series of studies conducted by Gary Lupyan, people were asked to look at a series of images of imaginary aliens. Whether each alien was friendly or hostile was determined by certain subtle features but participants were not told what these were. They had to guess whether each alien was friendly or hostile, and after each response they were told if they were correct or not, helping them learn the subtle cues that distinguished friend from foe. A quarter of the participants were told in advance that the friendly aliens were called "leebish" and the hostile ones "grecious", while another quarter were told the opposite. For the rest, the aliens remained nameless. It was found that participants who were given names for the aliens learned to categorize the aliens far more quickly, reaching 80 per cent accuracy in less than half the time taken by those not told the names. By the end of the test, those told the names could correctly categorize 88 per cent of aliens, compared to just 80 per cent for the rest. It was concluded that naming objects helps us categorize and memorize them.
In another series of experiments, a group of people was asked to view furniture from an IKEA catalog. Half the time they were asked to label the object (whether it was a chair or a lamp, for example), while the rest of the time they had to say whether or not they liked it. It was found that when asked to label items, people were later less likely to recall the specific details of products, such as whether a chair had arms or not. It was concluded that labeling objects helps our minds build a prototype of the typical object in the group at the expense of individual features.
Social interaction and language
A common claim is that language is governed by social conventions. Questions inevitably arise on surrounding topics. One question is, "What exactly is a convention, and how do we study it?", and another, "To what extent do conventions even matter in the study of language?" David Kellogg Lewis proposed an influential reply to the first question by expounding the view that a convention is a rationally self-perpetuating regularity in behavior. However, this view seems to compete to some extent with the Gricean view of speaker's meaning, requiring either one (or both) to be weakened if both are to be taken as true.
Some have questioned whether or not conventions are relevant to the study of meaning at all. Noam Chomsky proposed that the study of language could be done in terms of the I-Language, or internal language of persons. If this is so, then it undermines the pursuit of explanations in terms of conventions, and relegates such explanations to the domain of "meta-semantics". Metasemantics is a term used by philosopher of language Robert Stainton to describe all those fields that attempt to explain how semantic facts arise. One fruitful source of research involves investigation into the social conditions that give rise to, or are associated with, meanings and languages. Etymology (the study of the origins of words) and stylistics (philosophical argumentation over what makes "good grammar", relative to a particular language) are two other examples of fields that are taken to be meta-semantic.
Not surprisingly, many separate (but related) fields have investigated the topic of linguistic convention within their own research paradigms. The presumptions that prop up each theoretical view are of interest to the philosopher of language. For instance, one of the major fields of sociology, symbolic interactionism, is based on the insight that human social organization is based almost entirely on the use of meanings. In consequence, any explanation of a social structure (like an institution) would need to account for the shared meanings which create and sustain the structure.
Rhetoric is the study of the particular words that people use to achieve the proper emotional and rational effect in the listener, be it to persuade, provoke, endear, or teach. Some relevant applications of the field include the examination of propaganda and didacticism, the examination of the purposes of swearing and pejoratives (especially how it influences the behavior of others, and defines relationships), or the effects of gendered language. It can also be used to study linguistic transparency (or speaking in an accessible manner), as well as performative utterances and the various tasks that language can perform (called "speech acts"). It also has applications to the study and interpretation of law, and helps give insight to the logical concept of the domain of discourse.
Literary theory is a discipline that some literary theorists claim overlaps with the philosophy of language. It emphasizes the methods that readers and critics use in understanding a text. This field, an outgrowth of the study of how to properly interpret messages, is unsurprisingly closely tied to the ancient discipline of hermeneutics.
Language and continental philosophy
In continental philosophy, language is not studied as a separate discipline, as it is in analytic philosophy. Rather, it is an inextricable part of many other areas of thought, such as phenomenology, semiotics, hermeneutics, Heideggerian ontology, existentialism, structuralism, deconstruction and critical theory. The idea of language is often related to that of logic in its Greek sense as "Logos", meaning discourse or dialectic. Language and concepts are also seen as having been formed by history and politics, or even by historical philosophy itself.
The field of hermeneutics, and the theory of interpretation in general, has played a significant role in 20th century continental philosophy of language and ontology, beginning with Martin Heidegger. Heidegger combines phenomenology with the hermeneutics of Wilhelm Dilthey. Heidegger believed language was one of the most important concepts for Dasein. Heidegger believed that language today is worn out because of overuse of important words, and would be inadequate for an in-depth study of Being (Sein). For example, Sein (being), the word itself, is saturated with multiple meanings. Thus, he invented new vocabulary and linguistic styles, based on Ancient Greek and Germanic etymological word relations, to disambiguate commonly used words. He avoided words like consciousness, ego, human, nature, etc. and instead talked holistically of Being-in-the-world, Dasein.
With such new concepts as Being-in-the-world, Heidegger constructs his theory of language, centered on speech. He believed speech (talking, listening, silence) was the most essential and pure form of language. Heidegger claims writing is only a supplement to speech, because even a reader constructs or contributes one's own "talk" while reading. The most important feature of language is its projectivity, the idea that language is prior to human speech. This means that when one is "thrown" into the world, one's existence is characterized from the beginning by a certain pre-comprehension of the world. However, it is only after naming, or the "articulation of intelligibility", that one can have primary access to Dasein and Being-in-the-world.
Hans-Georg Gadamer expanded on these ideas of Heidegger and proposed a complete hermeneutic ontology. In Truth and Method, Gadamer describes language as "the medium in which substantive understanding and agreement take place between two people." In addition, Gadamer claims that the world is linguistically constituted, and cannot exist apart from language. For example, monuments and statues cannot communicate without the aid of language. Gadamer also claims that every language constitutes a world-view, because the linguistic nature of the world frees each individual from an objective environment: "... the fact that we have a world at all depends upon [language] and presents itself in it. The world as world exists for man as for no other creature in the world."
Paul Ricœur, on the other hand, proposed a hermeneutics which, reconnecting with the original Greek sense of the term, emphasized the discovery of hidden meanings in the equivocal terms (or "symbols") of ordinary language. Other philosophers who have worked in this tradition include Luigi Pareyson and Jacques Derrida.
Semiotics is the study of the transmission, reception and meaning of signs and symbols in general. In this field, human language (both natural and artificial) is just one among many ways that humans (and other conscious beings) are able to communicate. It allows them to take advantage of and effectively manipulate the external world in order to create meaning for themselves and transmit this meaning to others. Every object, every person, every event, and every force communicates (or signifies) continuously. The ringing of a telephone, for example, is the telephone. The smoke that I see on the horizon is the sign that there is a fire. The smoke signifies. The things of the world, in this vision, seem to be labeled precisely for intelligent beings who only need to interpret them in the way that humans do. Everything has meaning. True communication, including the use of human language, however, requires someone (a sender) who sends a message, or text, in some code to someone else (a receiver). Language is studied only insofar as it is one of these forms (the most sophisticated form) of communication. Some important figures in the history of semiotics are Charles Sanders Peirce, Roland Barthes, and Roman Jakobson. In modern times, its best-known figures include Umberto Eco, A.J. Greimas, Louis Hjelmslev, and Tullio De Mauro. Investigations of signs in non-human communication are the subject of biosemiotics, a field founded in the late 20th century by Thomas Sebeok and Thure von Uexküll.
One issue that has troubled philosophers of language and logic is the problem of the vagueness of words. The specific instances of vagueness that most interest philosophers of language are those where the existence of "borderline cases" makes it seemingly impossible to say whether a predicate is true or false. Classic examples are "is tall" or "is bald", where it cannot be said that some borderline case (some given person) is tall or not-tall. In consequence, vagueness gives rise to the paradox of the heap (the sorites paradox). Many theorists have attempted to solve the paradox by way of n-valued logics, such as fuzzy logic, which have radically departed from classical two-valued logics.
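As a rough illustration of how an n-valued treatment departs from two-valued logic, a predicate like "is tall" can be modeled as a graded membership function rather than a true/false classifier. The following sketch is purely illustrative: the thresholds and the linear ramp are arbitrary assumptions, not drawn from any particular fuzzy-logic theory.

```python
# A fuzzy-logic-style sketch: "is tall" as a degree between 0 and 1
# rather than a two-valued predicate. The cutoffs (160 cm, 190 cm)
# and the linear ramp between them are arbitrary choices.
def tallness(height_cm: float) -> float:
    """Degree to which a person counts as tall (0 = clearly not, 1 = clearly so)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    # Borderline cases get intermediate truth values instead of
    # being forced into "tall" or "not tall".
    return (height_cm - 160) / 30

print(tallness(155))  # clearly not tall: 0.0
print(tallness(175))  # borderline case: 0.5
print(tallness(195))  # clearly tall: 1.0
```

The borderline region is exactly where classical two-valued logic struggles; by assigning intermediate values, the sorites-style step "one more centimeter cannot make someone tall" loses its bite, since each step changes the degree of truth only slightly.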
Problem of universals and composition
One debate that has captured the interest of many philosophers is the debate over the meaning of universals. One might ask, for example, "When people say the word rocks, what is it that the word represents?" Two different answers have emerged to this question. Some have said that the expression stands for some real, abstract universal out in the world called "rocks". Others have said that the word stands for some collection of particular, individual rocks that we merely group under a common name. The former position has been called philosophical realism, and the latter nominalism.
The issue here can be explicated if we examine the proposition "Socrates is a Man".
From the radical realist's perspective, the connection between S and M is a connection between two abstract entities. There is an entity, "man", and an entity, "Socrates". These two things connect in some way or overlap.
From a nominalist's perspective, the connection between S and M is the connection between a particular entity (Socrates) and a vast collection of particular things (men). To say that Socrates is a man is to say that Socrates is a part of the class of "men". Another perspective is to consider "man" to be a property of the entity, "Socrates".
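The two nominalist readings just described can be glossed in elementary first-order notation (a standard textbook rendering, not attributed to any particular author): class membership on the one hand, predication of a property on the other.

```latex
% Class-membership reading: Socrates belongs to the class of men
s \in \{\, x \mid \mathrm{Man}(x) \,\}

% Property/predicate reading: "man" is predicated of Socrates
\mathrm{Man}(s)
```

In classical logic the two formulas are interderivable, which is part of why the metaphysical dispute cannot be settled by notation alone: the symbolism leaves open what, if anything, the predicate "Man" itself stands for.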
There is a third way, between nominalism and radical realism, usually called "moderate realism" and attributed to Aristotle and Thomas Aquinas. Moderate realists hold that "man" refers to a real essence or form that is really present and identical in Socrates and all other men, but "man" does not exist as a separate and distinct entity. This is a realist position, because "Man" is real, insofar as it really exists in all men; but it is a moderate realism, because "Man" is not an entity separate from the men it informs.
Nature of language
Many philosophical discussions of language begin by clarifying terminology. One item which has undergone significant scrutiny is the idea of language itself. Those philosophers who have set themselves to the task ask two important questions: "What is language in general?" and "What is a particular, individual language?"
Some semiotic outlooks have stressed that language is the mere manipulation and use of symbols in order to draw attention to signified content. If this were so, then humans would not be the sole possessors of language skills. On the other hand, many works by linguist Noam Chomsky have emphasized the role of syntax as a characteristic of any language.
More puzzling is the question of what it is that distinguishes one particular language from another. What is it that makes "English" English? What is the difference between Spanish and French? Chomsky has indicated that the search for what it means to be a language must begin with the study of the internal language of persons, or I-languages, which are based upon certain rules (or principles and parameters) which generate grammars. This view is supported in part by the conviction that there is no clear, general, and principled difference between one language and the next that may apply across the field of all languages. Other attempts, which he dubs E-languages, have tried to explain a language as usage within a specific speech community with a specific set of well-formed utterances in mind (markedly associated with linguists like Bloomfield).
Formal versus informal approaches
Another of the questions that has divided philosophers of language is the extent to which formal logic can be used as an effective tool in the analysis and understanding of natural languages. While most philosophers, including Frege, Alfred Tarski and Rudolf Carnap, have been more or less skeptical about formalizing natural languages, many of them developed formal languages for use in the sciences or formalized parts of natural language for investigation. Some of the most prominent members of this tradition of formal semantics include Tarski, Carnap, Richard Montague and Donald Davidson.
On the other side of the divide, and especially prominent in the 1950s and 60s, were the so-called "ordinary language philosophers". Philosophers such as P. F. Strawson, John Langshaw Austin and Gilbert Ryle stressed the importance of studying natural language without regard to the truth-conditions of sentences and the references of terms. They did not believe that the social and practical dimensions of linguistic meaning could be captured by any attempts at formalization using the tools of logic. Logic is one thing and language is something entirely different. What is important is not expressions themselves but what people use them to do in communication.
Hence, Austin developed a theory of speech acts, which described the kinds of things which can be done with a sentence (assertion, command, inquiry, exclamation) in different contexts of use on different occasions. Strawson argued that the truth-table semantics of the logical connectives (e.g., ∧, ∨ and →) do not capture the meanings of their natural language counterparts ("and", "or" and "if-then"). While the "ordinary language" movement basically died out in the 1970s, its influence was crucial to the development of the fields of speech-act theory and the study of pragmatics. Many of its ideas have been absorbed by theorists such as Kent Bach, Robert Brandom, Paul Horwich and Stephen Neale.
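One standard way to see the gap Strawson pointed to is the material conditional: in classical two-valued logic, "p → q" is defined as "not p or q", so it comes out true whenever its antecedent is false. The sketch below simply tabulates this definition (an illustrative gloss of the textbook truth table, not a reconstruction of Strawson's own argument).

```python
# Truth table for the material conditional "p -> q",
# defined classically as "not p or q".
def material_conditional(p: bool, q: bool) -> bool:
    return (not p) or q

for p in (True, False):
    for q in (True, False):
        print(f"p={p!s:5} q={q!s:5} p->q={material_conditional(p, q)}")

# Note that a false antecedent makes the conditional true regardless
# of q: "if the moon is made of cheese, then it is raining" counts as
# true, which jars with the ordinary-language use of "if-then".
```

The last two rows of the table (false antecedent, conditional true) are exactly the cases where the formal connective and the everyday "if-then" seem to come apart, which is the sort of mismatch ordinary language philosophers took as evidence against treating logic as an analysis of natural language.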
While keeping these traditions in mind, the question of whether or not there is any grounds for conflict between the formal and informal approaches is far from being decided. Some theorists, like Paul Grice, have been skeptical of any claims that there is a substantial conflict between logic and natural language.