Morphosyntactic learning: a neurobehavioral perspective.
Structural linguistic and psycholinguistic approaches to morphosyntax are plagued with a major logical-empirical caveat that prevents the definition of a plausible explanatory theory. An alternative approach is presented, viewing morphosyntactic regulations as operating on line according to sequential and associative principles.
Keywords: morphosyntactic regulations, semantics, pragmatics, generative linguistics, implicit learning.
Author: Jean A. Rondal
Publication: The Behavior Analyst Today, Vol. 11, Issue 2, Spring 2010. Behavior Analyst Online. ISSN 1539-4352.
In his last book, Ernst Moerk laments:
How could a field that is between 100 and 200 years old, whose data are so abundantly and so readily at hand, and which has produced impressive evidence for the wealth of input and its effects, be at present still in a state where almost everything is controversial and where misleading conclusions are so predominant? While year in and year out about two billion young people acquire the various levels of widely differing, and therefore learned, mother tongues, learnability of language has been seriously questioned and rejected in some quarters. (2000, p.179)
It can be argued that the major reason for this situation is that the linguistic grammatical classes are still viewed as psychologically real and necessary for language acquisition. Whereas the formal concepts forged by linguists may be appropriate for describing sentence relationships, it is dubious that they are used by native speakers. A sequential-associative theory of morphosyntactic functioning, rooted in pragmatics and semantics, may be proposed as a plausible alternative.
Tongue and Language
Few people have realized the unrealistic character of such a research agenda. Linguistics is a hermeneutic of the tongues. It lacks the methodological tools to go beyond description. Linguists have no experimental control over the situations in which language behaviors occur and no objective methods for empirically validating alternative theoretical models. There exists a belief in that field (assumed uncritically in psycholinguistics) that what is descriptively relevant must be ipso facto appropriate for explaining how real people proceed when producing sentences. However, where language functioning is concerned, one is addressing a neuropsychological question calling for a behavioral methodology.
Asking people is enough to convince oneself that native speakers (non-specialists in language) are unaware of grammatical notions. They rely on semantic categories instead. For example, grammatical subjects are agents or topics of state, verbs specify states, actions, or events, clauses express "complete" ideas, etc. Compare with the geometrical definitions in structural linguistics (for example, the reverse-tree scheme for sentence representation): the grammatical subject is the noun head of the noun phrase located immediately below the symbol for the sentence, and there is only one noun in this position.
Such a state of affairs is not alien to the generative linguist; Chomsky (1965) himself acknowledged as much.
Assuming for the sake of discussion that the typical native speaker tacitly has at her/his disposal the formal machinery described by generative grammar (disregarding differences between successive versions of the theory), where could such knowledge come from? Generative linguists and psycholinguists (e.g., Pinker, 1994) insist that syntactic categories cannot be induced from the input, given that they are not overtly marked and have no one-to-one correspondence with the semantic categories. Syntactic categories, it is assumed, must be supplied innately or elaborated under the guidance of innate representations. The trouble is that representational innatism has no empirical foundation. Genes coding for universal grammatical representations have yet to be discovered. It is even doubtful that the genome has sufficient capacity for encoding the huge number of binary decisions that would be necessary to account for a linguistic grammar (Kurzweil, 2006).

One language gene has been identified. FOXP2 is a single, autosomal, dominant gene located on chromosome 7 (Lai, Fisher, Hurst, Vargha-Khadem, & Monaco, 2001). The FOX (forkhead box) genes are a large family of genes coding for proteins that bind to a specific area of DNA and regulate the expression of a number of target genes. The null mutation of a gene from the FOX family can affect a potentially large number of other genes, and many forkheads are critical regulators of embryonic development. In the KE family studied by Lai et al. (2001), members over several generations had a variety of problems with spoken and written language. Corresponding indications emerge from Stromswold's (2001) review of a number of genetic studies of language (concordance analyses of disorders in twins, adoption studies, and linkage studies). Genetic factors account for much of the variance in language abilities among people with disorders and some of the variance in normal people.
Nowhere, however, is there a demonstration that genetic factors are involved in coding for the abstract linguistic notions implied by representational innatism. It is more likely that they play a role in the development of the neural structures supporting language functioning. If ordinary speakers have no clear awareness of the descriptive linguistic categories, and these do not develop in the brain as a consequence of some particular genetic blueprint, there is no reason to consider that they have a role in the neurobehavioral organization called language.
Morphosyntactic Functioning and Learning
Language pragmatics and semantics, interfacing respectively with social and conceptual cognition, are assumed to supply the early stages in contemporary models of language production (for example, Levelt, 1989, 1999). They are very much part of the behavioral process, with the proviso that the message needs to be further patterned according to the requirements of the tongue.
Any communicative intention presupposes a "theory of mind" (Gazzaniga, 2008). In a handful of milliseconds, the communicative intention activates a pragmatic framework specifying the objective of the message and the contrast between old information (what the speaker may reasonably hold to be known or immediately accessible to the addressee) and new information. This contrast controls the use of ellipsis and emphasis (prosodic or syntactic, e.g., use of the passive voice instead of the active). Other aspects of the future utterance are also programmed at this stage, such as the illocutionary type (declaration or request), person or participant deixis (distinction between first, second, or third person, or speaker, recipient, and "bystander," respectively), optionally social deixis (e.g., polite forms), time and place deixis (e.g., proximal v. distal reference), polarity (positive v. negative), aspect (e.g., completion, duration, frequency of a given action, state, or process), and mood (expressing probability, usuality, presumption, plausibility, degree, intensity, conditionality, obligation, permission, prohibition, exemption).
Lexical and semantic relational concepts are called upon as soon as the pragmatic framework is activated. Various semantic theories have been proposed (e.g., Fillmore, 1968; Chafe, 1970; Van Valin, 1999). They concur in viewing semantic relations as organized around a small number of verb types (e.g., state, process, action, action-process) and a series of subtypes (see, for example, Chafe, 1970, for more details).
A morphosyntactic component operates on the output of the pragmatic-semantic component, organizing the expression according to the sequential requirements of the tongue. Sahin, Pinker, Cash, Schomer, and Halgren (2009), using intracranial electrophysiology with epileptic patients undergoing clinical evaluation, showed that a sequence is implemented in Broca's area, with distinct neuronal firing for lexical (roughly 200 milliseconds following initiation of the language task), morphosyntactic (320 milliseconds), and phonological activity (450 milliseconds).
Specifying the psychological nature of the morphosyntactic component is one of the major challenges of today's language psychology. As semantic relations are not ordered sequentially, a mechanism is needed for translating the semantic fabric into utterances. This is where the so-called functional categories of the linguists were supposed to operate (for example, the notions of "grammatical subject" or "object of the verb"). The problem, as mentioned, is that common language users are not in possession of these notions. Consequently, they must have other means available for the same purpose. One may want to go back to suggestions by Skinner (1957) regarding morphosyntactic patterning as a purely sequential process. Most relevant is the suggestion that:
"Verbal responses cannot be grouped or ordered until they have occurred or at least are about to occur" (p. 332). Osgood (1971) expresses the same opinion that utterances are patterned on line: "Grammars are not time-bound in generating sentences; speakers and hearers operate within time constraints and sentences must be created and understood on a 'left-to-right' basis" (p. 521). Skinner (1957) also suggested that inflectional morphology corresponds to both particular semantic characteristics of the entities referred to and proximal and/or distal associations between words.
Moreover, language production, according to Skinner, is largely formulaic. Some sentences are standard comments or responses to common situations and events. Others are nearly complete "skeletal frames" upon which a specific indication or two may be hung.
Skinner's proposals were flatly rejected by Chomsky (1959). Formulaism is indeed anathema to the generative school, which has always put to the fore the so-called creative aspect of language. Actually, the fact that a language makes infinite use of finite means in no way rules out a formulaic account of morphosyntactic functioning. Open formulas exploited by language users, along with paradigmatic substitutions (see below), supply the computational power necessary to account for creativity in language use. According to Chomsky (1957), the idea that morphosyntactic patterning operates sequentially cannot be accepted because it implies a finite-state Markov process (i.e., in producing a sentence, the speaker begins in the initial state and produces the first word, thereby switching into a second state which limits the choice of the second word, and so on; each state encompasses the syntactic restrictions that limit the choice of the next word at that point in the utterance). Chomsky claims that natural languages are not finite-state languages, not least because they exhibit recursive mechanisms of diverse types. It would follow that a Markov-process conception of language is irrelevant, at least for the purpose of explaining grammar. Actually, there is no obligation for a "left-to-right" syntax to operate strictly linearly, i.e., to pertain to the simple-state grammar to which Chomsky was alluding in his 1957 essay. Nor is it necessary for a finite-state grammar to conform to a simple Markov chain. A finite-state syntax may include an unlimited number of recurrent loops generating an infinite number of utterances of unbounded length. Moreover, assimilating Markov processes to a simple device only able to generate concatenations of the first order, i.e., one ruled by transitional probabilities holding between adjacent items, is misleading. Sophisticated Markov processes can deal with the production of sequences of higher order.
The system can include probability networks dealing with sequences of words or groups of words. The selective probability of an item in a sequence may depend on the presence or absence of one or several preceding items, according to the notion of "limited horizon." The states the "machine" passes through may be known (so-called visible Markov models) or not (only some probabilistic function of the state sequence is known; hidden Markov models, as currently exploited in speech recognition devices; Kurzweil, 2006). The reader is referred to Manning and Schütze (1999) for mathematical details.
Utterance production is necessarily sequential but its processing need not be strictly linear. Chomsky's formulations rule out only a strictly linear syntax. A sequential syntax is different to the extent that, even though the words follow each other, semantic dependencies between them (captured by transitional probabilities) may hold in proximity as well as at a distance. In this respect, it may be relevant (and amusing) to spot another confusion in Chomsky's 1957 contribution. Grammatical but nonsensical sentences, such as "Colorless green ideas sleep furiously," were taken therein to suggest that "Probabilistic models give no particular insight into some of the basic problems of syntactic structure" (p. 17).
The alleged reason was that the probabilities holding between the words are nil or extremely low. In utterances of this kind, however, highly probable associations do exist, not between individual words but between the semantic categories to which they belong. In the example above, colorless and green are qualities, ideas are entities, sleep is a state, and furiously is the characteristic of a state (or an "action"). It is these categorical semantic (and, of course, corresponding syntactic) dependencies that render the utterance grammatical.
A reasonable hypothesis is that syntactic patterning proceeds on line following the distributional regularities of the tongue. Pragmatic-semantic preferences may also influence word ordering. For example, in many languages there is a progression in nominal phrases from the kind of element that has the greatest specifying potential (the deictic) to that which has relatively less (for example, the quantifier or the qualifier). In French, the qualifier (in the function of epithet, as opposed to the attribute, which is introduced by a copula) may be placed either before or after the entity (une belle maison; une maison belle), sometimes with noticeable differences in meaning (for example, un homme grand means a man of elevated stature, whereas un grand homme means a man of exceptional moral qualities). Often the anteposed epithet expresses a meaning that is not literal. There is a formal parallelism between the nominal and the verbal phrase. The verbal phrase begins with the finite verb (i.e., the verbal operator expressing person, number, tense, polarity, and modality, e.g., can, must), which is the verbal equivalent of the deictic, relating the meaning of the verb to the speaker at the time of enunciation. Corresponding suggestions may be made for other types of phrases. Phrases are ordered in various ways to form clauses depending on the tongue. There is no need to posit complicated linguistic mechanisms operating at hierarchical levels in order to account for the patterning of complex sentences. The editing mechanism controlling concord across clauses in complex sentences proceeds by proximal or distal association. Longer and/or more complex or composite sentences may force the speaker to go back to the departure semantic matrix. Behavioral correlates of this monitoring have been identified (Goldman-Eisler, 1968).
Inflectional morphology reinforces cohesion within and between phrases, clauses, and sentences. The forms vary across formal categories and there are many exceptions. This constrains the language user to learn largely case by case, although converging analogies can be exploited for the regular and some of the irregular forms. As suggested by Skinner (1957), inflectional morphology proceeds on line. A word inflected at the beginning of the utterance (for pragmatic/semantic reasons) serves as a discriminative flag reminding the speaker to operate cohesively with the following words if the norms of the tongue require that they be inflected too. Given the arbitrary complexity of that part of the grammar and the high speed of language production (approximately 3.3 words, or 12 to 15 phonemes, per second; around 200 words per minute), it is not conceivable that inflectional marking could operate through anything other than a sequential-associative process rendered automatic by repeated practice.
Form--and meaning-based analogies play an important role in phrase, clause, and sentence production. They pertain to the paradigmatic or substitutive axis of language storing the expressive potential of the tongue. The speaker may choose among a large number of alternatives for translating her/his pragmatic-semantic plans and contents into patterned sequences of lexemes. Substitutions are dealt with on line with the proviso that there must be a distributional correspondence between substitutable lexical elements or groups.
Language production is enhanced by its formulaic character. At least a part of the utterances produced are idiomatic to some extent (Wray, 2002), meaning that the lexicon and the sequential pattern are frozen. Frequent lexical combinations are stored in memory as chunks or formulas, which allows them to be activated faster. Communicators do not just have words and isolated grammatical devices; rather, they have prepackaged linguistic constructions already available. Such constructions are not only idiomatic in the sense of being frozen. They may also correspond to what Skinner (1957) labels "skeletal frames," involving predetermined slots where newly selected lexical material is inserted. Such frames are flexible prepatterned strings. Zipf's mathematical formula (together with Mandelbrot's; Manning & Schütze, 1999) shows that a limited set of lexical elements with a high frequency of occurrence in a corpus of language accounts for the major part of the token distribution (corresponding to a hyperbolic function). Lexemes are related to each other in groups of limited size (horizontal or hierarchical relations)--see, for example, the associative norms compiled by Cramer (1968) for American English. Networks of elements are constituted in this way. They supply the material for the skeletal frames.
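The hyperbolic shape of the token distribution can be illustrated with a minimal sketch over an invented mini-corpus; real corpora of any size show the same pattern far more dramatically:

```python
from collections import Counter

# A short invented text stands in for a corpus.
text = ("the cat sat on the mat and the dog sat by the door while "
        "the cat and the dog looked at the bird on the mat").split()

freqs = Counter(text)
ranked = freqs.most_common()          # word types sorted by frequency

# Zipf's observation: frequency falls off roughly as 1/rank, so a small
# set of high-frequency types covers most of the tokens in the corpus.
total = len(text)
coverage = sum(f for _, f in ranked[:5]) / total
print(f"Top 5 of {len(ranked)} types cover {coverage:.0%} of {total} tokens")
```

Even in this 26-token fragment, the five most frequent types account for well over half of all tokens, which is the property that makes a small stock of high-frequency elements so serviceable for skeletal frames.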
Learning Morphosyntactic Regulations
In order to explain the existence of implicit morphosyntactic procedural knowledge in humans, one must, almost logically, appeal to implicit learning. Other components interact with this type of learning, including:
(a) Innate brain structures devoted to language treatment from the time of late fetal life.
(b) A proper theory of mind serving as a basis for interpersonal development and language pragmatics.
(c) Relevant cognitive development supporting lexical and relational semantic development.
(d) Parents supplying children with an input adapted to their abilities and adequate corrective feedback contingent upon the children's verbal productions.
Innate Devoted Brain Structures
Neonates recognize their mother's voice and tongue based on prosodic characteristics (Nazzi, Bertoncini, & Mehler, 1998). This ability is a direct consequence of the fetus' exposure to maternal speech during the last three months of pregnancy. By that time the peripheral and central auditory systems are fully functional (there is a loss of 40 decibels on the incoming speech frequencies due to the aquatic milieu). Newborns can discriminate categorically between all pairs of sounds of natural languages (Jusczyk, 1997). They react differentially to short sequences of variegated syllables (e.g., bagaba v. babaga; Marcus, Vijayan, Bandi Rao, & Vishton, 1999). Dehaene-Lambertz, Dehaene, and Hertz-Pannier (2002) have adapted the functional magnetic resonance technology for use with babies. Their data suggest that the left hemisphere is already dominant for the perception of speech sounds in neonates. The neuronal circuits involved in speech perception are functional right from the start. Cerebral electrophysiological data (e.g., mismatch negativity measures, brain event-related potentials) point to the same conclusion (Dehaene-Lambertz, 1997, 2000).
Numerous data document further analytical capacities in slightly older infants: for example, 8-month-olds differentiating newly presented words from older ones based on the serial order of syllables (Saffran, Aslin, & Newport, 1996); six-month-olds differentiating prosodically well- versus ill-formed English clauses even when embedded in sentences (Nazzi, Kemler Nelson, Jusczyk, & Jusczyk, 2000); and infants discriminating between lexical items randomly selected from maternal speech belonging either to closed linguistic classes (e.g., articles, prepositions, auxiliaries, conjunctions) or to open classes (e.g., nouns, verbs, qualifiers), based on relative length and prosodic characteristics (Shi, Werker, & Morgan, 1999).
Infants develop sensitivity to nonadjacent input regularities in the course of the first year (Gomez & Gerken, 1999). Eighteen-month-olds are able to identify sequential dependencies at short left-to-right distances corresponding to groups of words (Gomez, 2002). They are reactive to the relationship between the auxiliary is and the morpheme ing in the following verb, provided that the two elements are not separated by more than three syllables.
The statistical structure of the natural tongues supplies an enormous source of distributional information for syntactic learning that is only beginning to be properly evaluated (Manning & Schütze, 1999). Sequential regularities can be used for identifying words, groups of words, and sequences of groups forming clauses, and for inducing clause organization in complex sentences.
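The simplest such computation, forward transitional probability between adjacent syllables, is enough to locate word boundaries in a continuous stream, which is the mechanism usually credited for the Saffran-style findings cited above. A minimal sketch, using an invented three-word syllable stream:

```python
from collections import Counter

# Invented syllable stream: three "words" (ba-da, ku-pi, ti-go)
# concatenated without pauses, as in artificial-speech experiments.
stream = "ba da ku pi ti go ba da ti go ku pi ba da ku pi ti go".split()

# Forward transitional probability: P(next | current) = f(xy) / f(x).
pair_counts = Counter(zip(stream, stream[1:]))
syll_counts = Counter(stream[:-1])
tp = {pair: n / syll_counts[pair[0]] for pair, n in pair_counts.items()}

# Within-word transitions (ba->da, ku->pi, ti->go) have TP = 1.0;
# between-word transitions have lower TP, so dips in the TP profile
# mark likely word boundaries.
boundaries = sorted(pair for pair, p in tp.items() if p < 1.0)
print(boundaries)
```

No grammatical category is consulted anywhere in this computation; segmentation falls out of the frequency structure of the stream alone.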
The task of breaking the code's regulations is facilitated by the ways parents address their language-learning children. A substantial literature exists on this topic (see Moerk, 2000). Maximal formal simplification is observed around 12 months of age. Virtually at any moment in development, the child is confronted with a language input finely tailored to her/his receptive and productive ability (Cross, 1977; Rondal, 1978). For example, over the course of language acquisition, maternal mean length of utterance (MLU) directed to the child exceeds that of the child by roughly 2.5 points for the younger child and one point for the older child (Rondal, 1985). Mother and child evolution in MLU are linearly correlated (product-moment r = +.69 in Moerk's (1975) data, between 2 and 5 years; +.55 in Rondal's (1978) data, between 1 and 3 years; both coefficients statistically significant). Parental input to language-learning children is found to be almost perfectly grammatical (Marcus, 1993). This observation annihilates one of the favorite arguments of the generative tradition against the mere possibility of syntactic learning, i.e., the poverty-of-stimulus argument (Pinker, 1994).
A wealth of convergent data leaves little doubt that parental dynamic adaptation facilitates morphosyntactic learning. Children can concentrate their distributional analyses on series of utterances properly calibrated for length and intrinsic complexity; utterances that are proposed within adequate pragmatic and semantic interactive frameworks.
Some authors have suggested that there is no clear indication in the literature regarding the existence of negative feedback contingent upon the morphosyntactic dimension of children's utterances. Baker and McCarthy (1981) defined the "logical problem" of language development as the acquisition of a grammatical system in the absence of negative evidence. There is no question, however, that such evidence does exist. Like any other language users, parents are guided primarily by truth value and contextual adequacy, less by grammatical preoccupations. Recent data indicate, however, that they reformulate children's grammatically ill-formed utterances much more often than they repeat correct ones (for example, Chouinard & Clark, 2003).
Implicit Morphosyntactic Learning
The paradigm of implicit learning and tacit knowledge was first introduced by Reber (1967). In implicit (or procedural), as opposed to explicit (or declarative), learning, the acquisition phase is incidental (i.e., subjects are unaware of the regularities governing the material that they learn) and learning leads to knowledge that is difficult or impossible to access consciously and/or to report verbally. Implicit morphosyntactic learning is computational in the sense that it exploits the stochastic characteristics of the data fed into the system and the relative frequencies of the various sequential and associative patterns appearing in the input. Numerous observations confirm that the calendars of grammatical acquisition correspond to the cumulative effects of the following parameters: (a) the relative frequencies of the relevant structures in the input; (b) the intrinsic complexity of the forms to be acquired; and (c) the availability of the corresponding cognitive notions. Abundant evidence supports point (a), the other two being self-evident. For example, Moerk (1980) has shown that the inflectional morphemes most often used by the parents of the Harvard children were the first to reach 90% correct production in obligatory contexts. Lahey, Liebergott, Chesnick, Menyuk, and Adams (1992), and Wilson (2003), concur that differences in parental frequency of use of the grammatical structures play a decisive role in the acquisition calendar. Rowland and Pine (2000) and Rowland, Pine, Lieven, and Theakston (2003) report that the frequency of interrogative and relative WH-terms, as well as the inversion or noninversion of the verbal element and the pronoun in parental interrogative clauses, predicts the corresponding evolution in children's productions.
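The frequency effect in parameter (a) amounts to a rank correlation between parental frequency of use and order of mastery. A minimal sketch of the Spearman computation, with invented ranks for four morphemes (the data are illustrative only, not Moerk's):

```python
# Invented illustration: morphemes ranked by parental input frequency
# and by order of mastery (90% correct in obligatory contexts).
input_rank   = {"-ing": 1, "plural -s": 2, "possessive 's": 3, "past -ed": 4}
mastery_rank = {"-ing": 1, "plural -s": 2, "possessive 's": 4, "past -ed": 3}

def spearman(r1, r2):
    """Spearman rho from rank differences: 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    n = len(r1)
    d2 = sum((r1[k] - r2[k]) ** 2 for k in r1)
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

print(spearman(input_rank, mastery_rank))
```

With these toy ranks the two orderings disagree on a single adjacent swap, yielding rho = 0.8; the claim in the text is that real input/acquisition rankings show correlations of this kind.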
Implicit learning is concerned with the surface features of the material. Some sort of abstraction may be involved with repeated exposure, but there is no need to refer to tacit knowledge remote from the regularities existing in the learning material. The principles governing associative learning are sufficient to account for the acquisition of complex cognitive structures (Perruchet & Gallego, 1997). Knowledge derived from the regularities present in the input through a general process of implicit learning concerns proximal and distal associations, sequential patterns, and distributional and other on line statistics. The present theory implies a direct mapping of semantic relationships (given a series of preliminary pragmatic choices) onto sequences of words and inflectional morphemes. Research on the implicit learning of graphotactic and morphological regularities in written French and English (Pacton, Perruchet, Cleeremans, & Fayol, 2001; Pacton, Perruchet, & Fayol, 2005; Deacon, Pacton, & Conrad, 2008) shows that rules are not abstracted even after massive amounts of exposure to rule-based material. Instead, subjects (children as well as adults) keep relying on the statistical regularities in the material even when they have the opportunity to develop symbolic rules. They deal with associative regularities between orthographic features relating to co-occurrences of letters and morphemes within words and words within clauses, as well as with graphotactic analogies holding between words. I believe that the same conclusion holds for the morphosyntactic aspects of spoken languages.
There is no need for descriptive linguistic categories and mysterious genetic hypotheses in order to account for the human morphosyntactic ability. This core instrument of language is the object of a sequential-associative type of implicit learning enhanced in the way the input is delivered to the children. There exists an innate support to language acquisition but it is not of the representational type as claimed in the generative tradition. Instead, and more consistent with present-day tenets in neurosciences, it corresponds to an evolved sensitivity of devoted brain structures to patterns of word concatenations.
Baker, C., & McCarthy, J. (1981). The logical problem of language acquisition. Cambridge, MA: MIT Press.
Chafe, W. (1970). Meaning and the structure of language. Chicago, IL: Chicago University Press.
Chomsky, N. (1957). Syntactic structures. The Hague, The Netherlands: Mouton.
Chomsky, N. (1959). Review of Verbal behavior by B.F. Skinner. Language, 35, 26-58.
Chomsky, N. (1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press.
Chouinard, M., & Clark, E. (2003). Adult reformulations of child errors as negative evidence. Journal of Child Language, 30, 637-669.
Cramer, P. (1968). Word association. New York, NY: Academic.
Cross, T. (1977). Mother's speech adjustments: The contribution of selected child's listener variables. In C. Snow & C. Ferguson (Eds.), Talking to children (pp. 151-188). New York, NY: Cambridge University Press.
Deacon, H., Pacton, S., & Conrad, M. (2008). A statistical learning perspective on children's learning about graphotactic and morphological regularities in spelling. Canadian Psychology, 49, 118-124.
Dehaene-Lambertz, G. (1997). Assessment of perinatal pathologies in premature neonates using a syllable discrimination task. Biology of the Neonate, 71, 299-305.
Dehaene-Lambertz, G. (2000). Cerebral specialization for speech and non-speech stimuli in infants. Journal of Cognitive Neuroscience, 12, 449-460.
Dehaene-Lambertz, G., Dehaene, S., & Hertz-Pannier, L. (2002). Functional neuroimaging of speech perception in infants. Science, 298, 2013-2015.
Fillmore, C. (1968). The case for case. In E. Bach & R. Harms (Eds.), Universals in linguistic theory (pp. 1-87). New York, NY: Holt, Rinehart & Winston.
Gazzaniga, M. (2008). Human. New York, NY : HarperCollins.
Goldman-Eisler, F. (1968). Psycholinguistics. Experiments in spontaneous speech. New York, NY : Academic.
Gomez, R. (2002). Variability and detection of invariant structure. Psychological Science, 13, 431-436.
Gomez, R., & Gerken, L. (1999). Artificial grammar learning by 1-year-olds leads to specific and abstract knowledge. Cognition, 70, 109-135.
Hauser, M., Chomsky, N., & Fitch, W. (2002). The faculty of language: What is it, who has it, and how did it evolve? Science, 298, 1559-1569.
Jusczyk, P. (1997). The discovery of spoken language. Cambridge, MA: MIT Press.
Kurzweil, R. (2006). The singularity is near: When humans transcend biology. London: Duckworth.
Lahey, M., Liebergott, J., Chesnick, M., Menyuk, P., & Adams, J. (1992). Variability in children's use of grammatical morphemes. Applied Psycholinguistics, 13, 373-398.
Lai, C., Fisher, S., Hurst, J., Vargha-Khadem, F., & Monaco, A. (2001). A forkhead-domain gene is mutated in a severe speech and language disorder. Nature, 413, 519-523.
Levelt, W. (1989). Speaking: From intention to articulation. Cambridge, MA: MIT Press.
Levelt, W. (1999). Producing spoken language: A blueprint of the speaker. In A. Brown & P. Hagoort (Eds.), The neurocognition of language (pp. 83-122). New York, NY: Oxford University Press.
Manning, C., & Schütze, H. (1999). Foundations of statistical natural language processing. Cambridge, MA: MIT Press.
Marcus, G. (1993). Negative evidence in language acquisition. Cognition, 46, 53-85.
Marcus, G., Vijayan, S., Bandi Rao, S., & Vishton, P. (1999). Rule learning by seven month-old infants. Science, 283, 77-80.
Moerk, E. (1975). Verbal interaction between children and their mothers during the preschool years. Developmental Psychology, 11, 788-794.
Moerk, E. (1980). Relationships between parental input frequencies and children's language acquisition: A reanalysis of Brown's data. Journal of Psycholinguistic Research, 7, 105-118.
Moerk, E. (2000). The guided acquisition of first language skills. Stamford, Connecticut: Ablex.
Nazzi, T., Bertoncini, J., & Mehler, J. (1998). Language discrimination by newborns: Towards an understanding of the role of rhythm. Journal of Experimental Psychology: Human Perception and Performance, 24, 1-11.
Nazzi, T., Kemler Nelson, D., Jusczyk, P., & Jusczyk, A. (2000). Six-month-olds' detection of clauses embedded in continuous speech: Effects of prosodic well-formedness. Infancy, 1, 123-147.
Osgood, C. (1971). Where do sentences come from? In D. Steinberg & L. Jakobovits (Eds.), Semantics (pp. 497-529). New York: Cambridge University Press.
Pacton, S., Perruchet, P., Cleeremans, A., & Fayol, M. (2001). Implicit learning out of the lab: The case of orthographic regularities. Journal of Experimental Psychology: General, 130, 401-426.
Pacton, S., Perruchet, P., & Fayol, M. (2005). Children's implicit learning of graphotactic and morphological regularities. Child Development, 76, 324-339.
Perruchet, P., & Gallego, J. (1997). A subjective unit formation account of implicit learning. In D. Berry (Ed.), How implicit is implicit learning? (pp. 124-161). Oxford, United Kingdom: Oxford University Press.
Pinker, S. (1994). The language instinct. New York, NY: Morrow.
Reber, A. (1967). Implicit learning of artificial grammars. Journal of Verbal Learning and Verbal Behavior, 6, 855-863.
Rondal, J.A. (1978). Maternal speech to normal and Down's children matched for mean length of utterance. In E. Meyers (Ed.), Quality of life in severely and profoundly mentally retarded people: Research foundations for improvement (pp. 193-265). Washington, DC: American Association on Mental Deficiency, Monograph Series No. 3.
Rondal, J.A. (1985). Adult-child interaction and the process of language acquisition. New York, NY: Praeger.
Rowland, C., & Pine, J. (2000). Subject-auxiliary inversion errors and WH-question acquisition: What children do know. Journal of Child Language, 27, 157-181.
Rowland, C., Pine, J., Lieven, E., & Theakston, A. (2003). Determinants of acquisition order in WH-questions: Re-evaluating the role of caregiver speech. Journal of Child Language, 30, 609-635.
Saffran, J., Aslin, R., & Newport, E. (1996). Statistical learning by 8-month-old infants. Science, 274, 1926-1928.
Sahin, N., Pinker, S., Cash, S., Schomer, D., & Halgren, E. (2009). Sequential processing of lexical, grammatical, and phonological information within Broca's area. Science, 326, 445-449.
Santelmann, L., & Jusczyk, P. (1998). Sensitivity to discontinuous dependencies in language learners: Evidence for limitations in processing space. Cognition, 69, 105-134.
Shi, R., Werker, J., & Morgan, J. (1999). Newborn infants' sensitivity to perceptual cues to lexical and grammatical words. Cognition, 72, B11-B21.
Skinner, B.F. (1957). Verbal behavior. Englewood Cliffs, NJ: Prentice-Hall.
Stromswold, K. (2001). The heritability of language: A review and meta-analysis of twin, adoption, and linkage studies. Language, 77, 647-723.
Van Valin, R. (1999). Generalized semantic roles and the syntax-semantics interface. In F. Corblin, C. Dobrovie-Sorin, & J. Marandin (Eds.), Empirical issues in formal syntax and semantics (Vol. 2; pp. 373-389). The Hague, The Netherlands: Thesus.
Wray, A. (2002). Formulaic language and the lexicon. Cambridge, United Kingdom: Cambridge University Press.
This essay is dedicated to the memory of Ernst Moerk. I am indebted to James Welch for his careful checking of the English language.
Author contact information:
Jean A. Rondal
Department of Cognitive Sciences
University of Liege, Bat. 32; Sart Tilman
4000 Liege, Belgium