What Must Be Learned?
In order to present an adequate description and analysis of language acquisition, language itself must first be defined. In debates over how language is acquired and whether it can be taught to non-humans, the opposing sides often disagree on exactly what qualifies as language and what is merely gesturing, mimicry, or some other non-linguistic communication system. Aitchison (1983) defines language by means of ten necessary features: semanticity, creativity, displacement, arbitrariness in symbol-meaning relationships, organization into two or more layers, structure-dependence, spontaneous usage, turn-taking, cultural transmission, and even use of the vocal-auditory channel, which necessarily excludes written communication and signed languages. Aside from being a bit cumbersome, this definition simply excludes too much: what you are reading right now fails to meet at least two of the ten requirements.
It may seem obvious that languages have words, which can be used to form sentences, which express ideas, but defining such things as "words", "sentences", and "ideas" proves surprisingly difficult. For instance, how many words does the sentence "The soon-to-be-married cab driver bought 103 candles" contain in spoken English? Would there be any difference if "103" were spelled out as "a hundred and three"? A similar problem exists in defining sentences: spoken language includes no punctuation to indicate where one sentence ends and the next begins. Clearly, defining language in such nebulous terms as words and sentences is asking for trouble.
A much more easily defined linguistic unit is the signal. A signal is any bit of sound, text, or gesture that, through social convention, conveys some sort of information. This includes morphemes, words, constructions, sentences, and even long speeches. For the purposes of this paper, the word "language" will refer to any system of communication in which two or more signals can be combined to create a larger signal with a new meaning that is somewhat (if not entirely) predictable from the original signals and the method used to combine them. This definition does not go so far as to include traffic lights (contra Aitchison 1983), but it does lump certain types of animal communication into the same category as human language. That alone is enough to make many researchers balk.
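The combinatorial definition above can be sketched informally in code. The sketch below is a toy illustration only, not a linguistic model; the signal names, the pairing of form with meaning, and the concatenation rule are all invented for the example:

```python
# Toy sketch of the paper's definition of "language": signals are
# (form, meaning) pairs, and a combination rule builds a larger signal
# whose meaning is (mostly) predictable from its parts and the rule used.

def combine(signal_a, signal_b):
    """Combine two signals by concatenating their forms.

    The resulting meaning is constructed from the component meanings,
    so it is somewhat -- though, as with idioms, not always entirely --
    predictable from the original signals.
    """
    form_a, meaning_a = signal_a
    form_b, meaning_b = signal_b
    return (form_a + " " + form_b, ("combined", meaning_a, meaning_b))

# Two hypothetical signals.
black = ("black", "color:black")
bird = ("bird", "animal:bird")

# "black bird": a larger signal whose meaning derives from its parts.
print(combine(black, bird))
```

On this view, a system counts as a language whenever some such combination rule exists, whether the signals are spoken, written, or gestured.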
The purpose of language is communication. When performing any sort of linguistic analysis, it must be kept in mind that an ordinary language learner does not try to form a system that can generate "all the possible utterances" of the language (contra Aitchison 1983); ey simply try to make eirself understood. Eir primary concerns are guessing people's communicative intentions more accurately and making eir own intentions known more easily. The social conventions of language exist purely to serve these ends.
Despite the impressive communication systems of such species as dolphins and bonobos, humans seem to have a special capacity for language. Certain apes have been somewhat successful in learning human language systems (Washoe, Koko, and Kanzi, for example), but none have done so anywhere near as effortlessly and completely as an average human. Thus, humans must possess some degree of innate linguistic ability (Pinker 1996).
If we are genetically predisposed to use language and our ancestors were not, this predisposition must have developed gradually. Language is a cultural phenomenon, not a physical adaptation that could be of any use to an individual in isolation. A random change in an individual's genetic code resulting in enhanced communication skills would prove useless unless some sort of cultural communication system were already in use. Therefore, the earliest form of language must have been possible without any language-specific genetic adaptation. Perhaps we used to walk on all fours, occasionally standing up to make hand gestures. As gesturing became more of a necessity, those who could habitually walk upright would be at a distinct advantage. It is this practice of walking upright that probably led to the descent of the human larynx, making complex vocal communication much more feasible (Armstrong 1999). So, if our linguistic abilities developed gradually, then our view of language should reflect this, and the communication systems of the great apes should be viewed as likely precursors to our own.
Of course, not everyone feels this way. Noam Chomsky proposed that human language is possible because of a genetic adaptation that endows our species with a "universal grammar" (Aitchison 1983). Chomsky explains the similarities across the natural languages of the world by positing a single, underlying grammar, which every child is born knowing and uses to acquire eir native language. The drawback of this view is that natural languages show immense variation in both form and structure. In order for the hypothesized grammatical rules of a given language to remain (in a sense) universal, they must sometimes become incredibly complex (Pinker 1996). Another complication is that the hypothesized "language organ" of the brain has yet to be found. So-called "language genes" have been shown to affect language only indirectly (O'Grady 2005). The transformations used in Chomsky's original Standard Theory have been shown to be unrealistic (Aitchison 1983). Perhaps the strongly nativist idea of an inborn, universal grammar will prove similarly dubious.
On the opposite end of this debate is Benjamin Whorf, who viewed the world's languages as different right down to the concepts they encode. Stressing that the categories and types we encounter do not "stare every observer in the face" (Bloom 2002), Whorf proposed that the way a human views the world is dependent on eir native language. This idea finds some support in the work of Choi and Bowerman (1991), who observed that the spatial perceptions of two-year-olds are affected by the language they are in the process of acquiring. In this way, the acquisition of language guides cognitive development (Tomasello 2003). People everywhere possess the intuitive feeling that the objects and creatures of our world fall into natural categories. This "naive essentialism" seems to be universal, but it is scientifically ungrounded (Bloom 2002). Such classifications as colors, animal species, and the states of matter are ultimately man-made, no more a part of our natural environment than the words we use to describe them.
The linguistic distinctions between morphology, syntax, and discourse are similarly artificial. It is useful to put animals that can mate and produce fertile offspring into a single category and differentiate them from animals of other species, but separating the various systems of signal combination within a given language makes less sense. Morphology builds words from the smallest units of meaning, syntax builds sentences from words, and discourse builds larger constructions out of sentences. So the distinctions between these systems hinge on the ideas of "words" and "sentences". In simpler terms, all three systems combine signals to create larger signals.
The only reliable way to separate syntactic and morphological processes is through analysis of a language's written component. But even today, many writing systems do not mark word boundaries: are the particles of Japanese really postpositions, or are they suffixes? Another problem with this method is that writing systems are based much more on tradition than on logic. Is the Spanish command "¡Dámelo!" ("Give me it!") really one word, as it is written, or should it be broken down into "da", "me", and "lo"? Certainly, words are seen as holding a more complete meaning than their component morphemes, but sentences convey a more complete message still. And in any case, part of knowing the meanings of such words as "a" and "some" is knowing how these words interact with nouns (Bloom 2002). There is so much overlap between morphology and syntax that Dabrowska (2000) has referred to syntactic constructions as simply "big words". Within a language, many systems can be described as operating primarily on their own sets of rules (number generation, proper names and titles, and verb phrase construction, for example), but these should all be viewed as subsystems of the language's grammar.
When language is viewed in historical terms, the systems of morphology, syntax, and discourse blur together even more. As Talmy Givon (1979) puts it, yesterday's discourse is today's syntax. And in fact, yesterday's syntax is today's morphology. Loose discourse is syntacticized by such processes as concatenation, reduction, and reanalysis (Tomasello 2003). Pertinent examples can be found in contemporary American speech. The separate phrases of the utterance "If you would, sign here please," can, through imperfect imitation or massive repetition, be combined into the single tone contour of "If you would sign here please." This phrase is then reanalyzed as a simple conditional that functions as a request. This allows such phrases as "If I could get you to sign here please," to serve as complete sentences. In response to "What's the problem?", a person might answer, "What the problem is is a cow's on the tracks." Over time, such a common nominal construction as "what the problem is" can be reduced to simply "the problem is", resulting in the conventionally ungrammatical sentence "The problem is is a cow's on the tracks." For some reason, the repetition of the word "is" is noticed and applied to such statements as "The thing is is a cow's on the tracks," when a similar message is intended. And so, English phrases are being modified and reanalyzed into new grammatical structures even now.
If a language's grammar is nothing more than the cumulative result of millennia's worth of discursive practices becoming formalized, then perhaps the so-called "universals" of language can be explained by our cultural, and not necessarily biological, similarities as human beings (Tomasello 2003).