1. Inflectional zero morphology - Linguistic myth or neurocognitive reality? Front Psychol 2022; 13:1015435. PMID: 36571055; PMCID: PMC9773071; DOI: 10.3389/fpsyg.2022.1015435.
Abstract
Knowledge of language, its structure, and its grammar is an essential part of our education and daily activities. Despite the importance of language in our lives, linguistic theories that explain how the language system operates are often disconnected from our knowledge of the brain's neurocognitive mechanisms underpinning linguistic function. This is reflected, for example, in the inclusion of abstract and often controversial elements in theories of language. Here, we discuss the case of the so-called null constituent and its smallest and most controversial variant, the zero morpheme: a hypothetical morphosyntactic device that has no overt physical (phonological or orthographic) expression. Focusing on the putative inflectional zero morpheme, we discuss the theoretical origins and pitfalls of this approach and advocate an important role for neurobiological research that could elucidate the neurocognitive reality of such constructs in linguistic communication.
2. Cartography: Innateness or Convergent Cultural Evolution? Front Psychol 2022; 13:887670. PMID: 35548511; PMCID: PMC9084363; DOI: 10.3389/fpsyg.2022.887670.
Abstract
Haspelmath argues that linguists who conduct comparative research and try to explain patterns that are general across languages can only consider two sources of these patterns: convergent cultural evolution of languages, which provides functional explanations of these phenomena, or innate building blocks for syntactic structure, specified in the human cognitive system. This paper claims that convergent cultural evolution and functional-adaptive explanations are not sufficient to explain the existence of certain crosslinguistic phenomena. The argument is based on comparative evidence of generalizations based on Rizzi and Cinque's theories of cartographic syntax, which imply the existence of finely ordered and complex innate categories. I argue that these patterns cannot be explained in functional-adaptive terms alone.
3. A Gestalt Theory Approach to Structure in Language. Front Psychol 2021; 12:649384. PMID: 34220621; PMCID: PMC8249935; DOI: 10.3389/fpsyg.2021.649384.
Abstract
The fact that human language is highly structured and that, moreover, the way it is structured shows striking similarities across the world's languages has been addressed from two different perspectives. The first, and more traditional, generative hypothesis is that the similarities are due to an innate language faculty: there is an inborn 'grammar' with universal principles that manifest themselves in each language, and cross-linguistic variation arises from different settings of the parameters of those universal principles. A second perspective is that there is no inborn, innate language faculty, and that structure instead emerges from language usage. This paper aims to develop and illustrate a third perspective, according to which the structural similarities in human languages are the result of the way the cognitive system works in perception. The essential claim is that structural properties follow from the limitations of human cognition, in particular its limited focus in perception.
4. Human Linguisticality and the Building Blocks of Languages. Front Psychol 2020; 10:3056. PMID: 32082208; PMCID: PMC7006236; DOI: 10.3389/fpsyg.2019.03056.
Abstract
This paper discusses the widely held idea that the building blocks of languages (features, categories, and architectures) are part of an innate blueprint for Human Language, and notes that if one allows for convergent cultural evolution of grammatical structures, then much of the motivation for it disappears. I start by observing that human linguisticality (=the biological capacity for language) is uncontroversial, and that confusing terminology ("language faculty," "universal grammar") has often clouded the substantive issues in the past. I argue that like musicality and other biological capacities, linguisticality is best studied in a broadly comparative perspective. Comparing languages like other aspects of culture means that the comparisons are of the Greenbergian type, but many linguists have presupposed that the comparisons should be done as in chemistry, with the presupposition that the innate building blocks are also the material that individual grammars are made of. In actual fact, the structural uniqueness of languages (in lexicon, phonology, and morphosyntax) leads us to prefer a Greenbergian approach to comparison, which is also more in line with the Minimalist idea that there are very few domain-specific elements of the biological capacity for language.
5. Language, Memory, and Mental Time Travel: An Evolutionary Perspective. Front Hum Neurosci 2019; 13:217. PMID: 31333432; PMCID: PMC6622356; DOI: 10.3389/fnhum.2019.00217.
Abstract
Language could not exist without memory, in all its forms: working memory for sequential production and understanding, implicit memory for grammatical rules, semantic memory for knowledge, and episodic memory for communicating personal experience. Episodic memory is part of a more general capacity for mental travel both forward and backward in time, and extending even into fantasy and stories. I argue that the generativity of mental time travel underlies the generativity of language itself, and could be the basis of what Chomsky calls I-language, or universal grammar (UG), a capacity for recursive thought independent of communicative language itself. Whereas Chomsky proposed that I-language evolved in a single step well after the emergence of Homo sapiens, I suggest that generative imagination, extended in space and time, has a long evolutionary history, and that it was the capacity to share internal thoughts, rather than the nature of the thoughts themselves, that more clearly distinguishes humans from other species.
6.
Abstract
A central concern of generative grammar is the relationship between hierarchy and word order, traditionally understood as two dimensions of a single syntactic representation. A related concern is directionality in the grammar. Traditional approaches posit process-neutral grammars, embodying knowledge of language, put to use with infinite facility both for production and comprehension. This has crystallized in the view of Merge as the central property of syntax, perhaps its only novel feature. A growing number of approaches explore grammars with different directionalities, often with more direct connections to performance mechanisms. This paper describes a novel model of universal grammar as a one-directional, universal parser. Mismatch between word order and interpretation order is pervasive in comprehension; in the present model, word order is language-particular and interpretation order (i.e., hierarchy) is universal. These orders are not two dimensions of a unified abstract object (e.g., precedence and dominance in a single tree); rather, both are temporal sequences, and UG is an invariant real-time procedure (based on Knuth's stack-sorting algorithm) transforming word order into hierarchical order. This shift in perspective has several desirable consequences. It collapses linearization, displacement, and composition into a single performance process. The architecture provides a novel source of brackets (labeled unambiguously and without search), which are understood not as part-whole constituency relations, but as storage and retrieval routines in parsing. It also explains why neutral word order within single syntactic cycles avoids 213-like permutations. The model identifies cycles as extended projections of lexical heads, grounding the notion of phase. This is achieved with a universal processor, dispensing with parameters. The empirical focus is word order in noun phrases. 
This domain provides some of the clearest evidence for 213-avoidance as a cross-linguistic word order generalization. Importantly, recursive phrase structure “bottoms out” in noun phrases, which are typically a single cycle (though further cycles may be embedded, e.g., relative clauses). By contrast, a simple transitive clause plausibly involves two cycles (vP and CP), embedding further nominal cycles. In the present theory, recursion is fundamentally distinct from structure-building within a single cycle, and different word order restrictions might emerge in larger domains like clauses.
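The parsing model described above is built on Knuth's stack-sorting algorithm. As a minimal illustrative sketch (our own, not the paper's implementation), the classical one-pass stack sort is shown below. Under this textbook push/pop convention, the sortable permutations are exactly those avoiding the 231 pattern; whether the forbidden pattern is written as 231 or 213 depends on the directionality conventions a given model adopts for mapping word order to hierarchy.

```python
from itertools import combinations, permutations

def stack_sort(perm):
    """One pass of Knuth's stack-sorting: scan the input, popping
    stack elements smaller than the incoming one to the output,
    then push; finally flush the stack."""
    stack, out = [], []
    for x in perm:
        while stack and stack[-1] < x:
            out.append(stack.pop())
        stack.append(x)
    while stack:
        out.append(stack.pop())
    return out

def same_pattern(a, b):
    """True if sequences a and b are order-isomorphic."""
    return all((a[i] < a[j]) == (b[i] < b[j])
               for i in range(len(a)) for j in range(i + 1, len(a)))

def avoids(perm, pattern):
    """True if no subsequence of perm is order-isomorphic to pattern."""
    return not any(same_pattern([perm[i] for i in idx], pattern)
                   for idx in combinations(range(len(perm)), len(pattern)))

# Under this convention, the one-stack-sortable permutations are exactly
# the 231-avoiders: a Catalan number of them (5 of the 6 length-3 cases).
sortable = {p for p in permutations((1, 2, 3)) if stack_sort(p) == [1, 2, 3]}
avoiders = {p for p in permutations((1, 2, 3)) if avoids(p, (2, 3, 1))}
```

Running the comparison confirms that `sortable` and `avoiders` coincide, with only (2, 3, 1) excluded; this is the formal sense in which a single-stack parser enforces a pattern-avoidance universal on word order.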
7. Learning Biases Underlie "Universals" in Avian Vocal Sequencing. Curr Biol 2017; 27:3676-3682.e4. PMID: 29174890; DOI: 10.1016/j.cub.2017.10.019.
Abstract
Biological predispositions in vocal learning have been proposed to underlie commonalities in vocal sequences, including for speech and birdsong, but cultural propagation could also account for such commonalities [1-4]. Songbirds such as the zebra finch learn the sequencing of their acoustic elements ("syllables") during development [5-8]. Zebra finches are not constrained to learn a specific sequence of syllables, but significant consistencies in the positioning and sequencing of syllables have been observed between individuals within populations and between populations [8-10]. To reveal biological predispositions in vocal sequence learning, we individually tutored juvenile zebra finches with randomized and unbiased sequences of syllables and analyzed the extent to which birds produced common sequences. In support of biological predispositions, birds tutored with randomized sequences produced songs with striking similarities. Birds preferentially started and ended their song sequence with particular syllables, consistently positioned shorter and higher frequency syllables in the middle of their song, and sequenced their syllables such that pitch alternated across adjacent syllables. These patterns are reminiscent of those observed in normally tutored birds, suggesting that birds "creolize" aberrant sequence inputs to produce normal sequence outputs. Similar patterns were also observed for syllables that were not used for tutoring (i.e., unlearned syllables), suggesting that motor biases could contribute to sequence learning biases. Furthermore, zebra finches spontaneously produced acoustic patterns that are commonly observed in speech and music, suggesting that sensorimotor processes that are shared across a wide range of vertebrates could underlie these patterns in humans.
8. Commentary: "An Evaluation of Universal Grammar and the Phonological Mind" - UG Is Still a Viable Hypothesis. Front Psychol 2016; 7:1029. PMID: 27471480; PMCID: PMC4943953; DOI: 10.3389/fpsyg.2016.01029.
Abstract
Everett (2016b) criticizes The Phonological Mind thesis (Berent, 2013a,b) on logical, methodological and empirical grounds. Most of Everett’s concerns are directed toward the hypothesis that the phonological grammar is constrained by universal grammatical (UG) principles. Contrary to Everett’s logical challenges, here I show that the UG hypothesis is readily falsifiable, that universality is not inconsistent with innateness (Everett’s arguments to the contrary are rooted in a basic confusion of the UG phenotype and the genotype), and that its empirical evaluation does not require a full evolutionary account of language. A detailed analysis of one case study, the syllable hierarchy, presents a specific demonstration that people have knowledge of putatively universal principles that are unattested in their language and these principles are most likely linguistic in nature. Whether Universal Grammar exists remains unknown, but Everett’s arguments hardly undermine the viability of this hypothesis.
9. An Evaluation of Universal Grammar and the Phonological Mind. Front Psychol 2016; 7:15. PMID: 26903889; PMCID: PMC4744836; DOI: 10.3389/fpsyg.2016.00015.
Abstract
This paper argues against the hypothesis of a “phonological mind” advanced by Berent. It establishes that there is no evidence that phonology is innate and that, in fact, the simplest hypothesis seems to be that phonology is learned like other human abilities. Moreover, the paper fleshes out the original claim of Philip Lieberman that Universal Grammar predicts that not everyone should be able to learn every language, i.e., the opposite of what UG is normally thought to predict. The paper also underscores the problem that the absence of recursion in Pirahã represents for Universal Grammar proposals.
10. Subtle Implicit Language Facts Emerge from the Functions of Constructions. Front Psychol 2016; 6:2019. PMID: 26858662; PMCID: PMC4729932; DOI: 10.3389/fpsyg.2015.02019.
Abstract
Much has been written about the unlikelihood of innate, syntax-specific, universal knowledge of language (Universal Grammar) on the grounds that it is biologically implausible, unresponsive to cross-linguistic facts, theoretically inelegant, and implausible and unnecessary from the perspective of language acquisition. While relevant, much of this discussion fails to address the sorts of facts that generative linguists often take as evidence in favor of the Universal Grammar Hypothesis: subtle, intricate knowledge about language that speakers implicitly command without being taught. This paper revisits a few often-cited such cases and argues that, although the facts are sometimes even more complex and subtle than is generally appreciated, appeals to Universal Grammar fail to explain the phenomena. Instead, such facts are strongly motivated by the functions of the constructions involved. The following specific cases are discussed: (a) the distribution and interpretation of anaphoric one, (b) constraints on long-distance dependencies,
11. Linguistic explanation and domain specialization: a case study in bound variable anaphora. Front Psychol 2015; 6:1421. PMID: 26441791; PMCID: PMC4585305; DOI: 10.3389/fpsyg.2015.01421.
Abstract
The core question behind this Frontiers research topic is whether explaining linguistic phenomena requires appeal to properties of human cognition that are specialized to language. We argue here that investigating this issue requires taking linguistic research results seriously, and evaluating these for domain-specificity. We present a particular empirical phenomenon, bound variable interpretations of pronouns dependent on a quantifier phrase, and argue for a particular theory of this empirical domain that is couched at a level of theoretical depth which allows its principles to be evaluated for domain-specialization. We argue that the relevant principles are specialized when they apply in the domain of language, even if analogs of them are plausibly at work elsewhere in cognition or the natural world more generally. So certain principles may be specialized to language, though not, ultimately, unique to it. Such specialization is underpinned by ultimately biological factors, hence part of UG.
12.
Abstract
The question of identifying the properties of language that are specific human linguistic abilities, i.e., Universal Grammar, lies at the center of linguistic research. This paper argues for a largely Emergent Grammar in phonology, taking as the starting point that memory, categorization, attention to frequency, and the creation of symbolic systems are all nonlinguistic characteristics of the human mind. The articulation patterns of American English rhotics illustrate categorization and systems; the distribution of vowels in Bantu vowel harmony uses frequencies of particular sequences to argue against Universal Grammar and in favor of Emergent Grammar; prefix allomorphy in Esimbi illustrates the Emergent symbolic system integrating phonological and morphological generalizations. The Esimbi case has been treated as an example of phonological opacity in a Universal Grammar account; the Emergent analysis resolves the pattern without opacity concerns.
13. Computational principles of syntax in the regions specialized for language: integrating theoretical linguistics and functional neuroimaging. Front Behav Neurosci 2014; 7:204. PMID: 24385957; PMCID: PMC3866525; DOI: 10.3389/fnbeh.2013.00204.
Abstract
The nature of computational principles of syntax remains to be elucidated. One promising approach to this problem would be to construct formal and abstract linguistic models that parametrically predict the activation modulations in the regions specialized for linguistic processes. In this article, we review recent advances in theoretical linguistics and functional neuroimaging in the following respects. First, we introduce the two fundamental linguistic operations: Merge (which combines two words or phrases to form a larger structure) and Search (which searches for and establishes a syntactic relation between two words or phrases). We also illustrate certain universal properties of human language, and present hypotheses regarding how sentence structures are processed in the brain. Hypothesis I is that the Degree of Merger (DoM), i.e., the maximum depth of merged subtrees within a given domain, is a key computational concept to properly measure the complexity of tree structures. Hypothesis II is that the basic frame of the syntactic structure of a given linguistic expression is determined essentially by functional elements, which trigger Merge and Search. We then present our recent functional magnetic resonance imaging experiment, demonstrating that the DoM is indeed a key syntactic factor that accounts for syntax-selective activations in the left inferior frontal gyrus and supramarginal gyrus. Hypothesis III is that the DoM domain changes dynamically in accordance with iterative Merge applications, the Search distances, and/or task requirements. We confirm that the DoM accounts for activations in various sentence types. Hypothesis III successfully explains activation differences between object- and subject-relative clauses, as well as activations during explicit syntactic judgment tasks. Future research on the computational principles of syntax will further deepen our understanding of uniquely human mental faculties.
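The DoM of Hypothesis I can be made concrete with a small sketch. Assuming a representation of our own (not the paper's): lexical items are strings, each Merge forms a pair, and the DoM of a constituent is its maximum nesting depth.

```python
def merge(a, b):
    """Merge combines two constituents into a larger one; here a
    constituent is either a word (str) or a pair of constituents."""
    return (a, b)

def degree_of_merger(node):
    """Degree of Merger (DoM): maximum depth of merged subtrees.
    A bare lexical item has depth 0; each Merge adds one level
    on top of its deeper daughter."""
    if isinstance(node, str):
        return 0
    left, right = node
    return 1 + max(degree_of_merger(left), degree_of_merger(right))

# Deeper right-branching structure vs. a shallower bracketing:
deep = merge("read", merge("the", merge("old", "book")))  # DoM 3
shallow = merge(merge("the", "old"), "book")              # DoM 2
```

On this toy measure, the more deeply merged structure receives the higher DoM, matching the intuition that parametric increases in merger depth should track the syntax-selective activation modulations the paper reports.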
14.
Abstract
Human language is both highly diverse (different languages have different ways of achieving the same functional goals) and easily learnable. Any language allows its users to express virtually any thought they can conceptualize. These traits render human language unique in the biological world. Understanding the biological basis of language is thus both extremely challenging and fundamentally interesting. I review the literature on linguistic diversity and language universals, suggesting that an adequate notion of 'formal universals' provides a promising way to understand the facts of language acquisition, offering order in the face of the diversity of human languages. Formal universals are cross-linguistic generalizations, often of an abstract or implicational nature. They derive from cognitive capacities to perceive and process particular types of structures and from biological constraints upon integration of the multiple systems involved in language. Such formal universals can be understood on the model of a general solution to a set of differential equations; each language is one particular solution. An explicit formal conception of human language that embraces both considerable diversity and underlying biological unity is possible, and fully compatible with modern evolutionary theory.