
Language, Thought, and the Biological Foundations of Grammar

(Generative Grammar, Cognition, and Human Uniqueness)

Riaz Laghari
Lecturer in English (Linguistics)
National University of Modern Languages (NUML), Islamabad

Preface

This post grows out of a sustained engagement with Noam Chomsky’s generative program, not as doctrine, but as a research tradition animated by a small set of deceptively simple questions:

What kind of system is human language?
How can children acquire it so rapidly?
How could such a system have evolved?
Why does it exhibit the particular formal properties it does?

The immediate stimulus for the present post is a public interview that Chomsky gave to the Oxford University Linguistic Society in 2020. Yet the ambition of the post is broader: to situate that discussion within the long arc of generative inquiry, to extract its conceptual core, and to show why these ideas remain central to linguistics, philosophy of language, and cognitive science.


This is not a biography, not an introduction, and not an apologetic defense. It is an attempt to articulate, systematically and critically, the explanatory vision underlying the biolinguistic enterprise.

Introduction: From Description to Explanation

Modern linguistics did not begin as a science of explanation. For much of the early twentieth century, it was a discipline of classification. Structural linguistics, particularly in its American incarnation, conceived of language as an inventory of observable elements (phonemes, morphemes, constructions) organized through distributional procedures.


This approach achieved descriptive rigor, but at a cost. It deliberately renounced questions about mental reality, acquisition, and biological grounding. Language was treated as an external artifact rather than an internal system.


The publication of Syntactic Structures (1957) marked a rupture. Chomsky’s intervention was not merely technical; it was methodological and philosophical. He rejected the idea that linguistics could remain taxonomic and still count as an explanatory science. A theory of language, he argued, must explain:

How speakers know what they know
How children acquire that knowledge
Why languages have the properties they do

This post is organized around the consequences of taking those questions seriously.

1: The Failure of Taxonomic Linguistics

Structural linguistics inherited its self-image from the natural sciences, but its methods resembled those of pre-theoretical classification, not explanation. The “phonemic principle,” distributional analysis, and segmentation procedures were explicitly designed to avoid reference to unobservable mental entities.

Two assumptions dominated:

  1. The Boasian principle: languages can differ from one another without limit.
  2. Behaviorist learning theory: language is acquired through habit formation and analogy.

These assumptions were mutually reinforcing, and jointly untenable.

The poverty of stimulus argument shattered the learning-theoretic foundation. Children converge on highly specific grammatical systems despite receiving fragmentary, noisy, and finite input. No taxonomic procedure could explain this convergence.


The deeper problem, however, was methodological: procedures cannot discover the entities needed for explanation. To explain linguistic competence, one must posit abstract structures not recoverable from surface distributions alone.

2: Generative Grammar as a Biolinguistic Program

Generative grammar is often mischaracterized as a theory of syntax. In fact, it is better understood as a research program within cognitive biology.


On this view:

Language is an internal system (I-language), not a collection of utterances.
The object of study is a mental organ, comparable in abstract status to the visual system.
Linguistic theory must respect biological constraints: learnability, evolvability, and neural implementability.

This framing transforms traditional questions. The goal is no longer to catalogue constructions but to identify the computational principles that make language possible at all.

3: The Poverty of Stimulus Revisited

The poverty of stimulus is not an empirical curiosity; it is a logical problem.

Children acquire knowledge that:

Is not explicitly taught
Is underdetermined by input
Is uniform across populations

Recent experimental work confirms that by age two or three, children control abstract syntactic dependencies that cannot be inferred from surface frequency.


Any theory that attributes acquisition to general learning mechanisms must explain how such mechanisms extract structure that is not present in the data. To date, no non-innate account has succeeded.

4: Evolutionary Paradox and the Minimalist Turn

A persistent objection to generative grammar is evolutionary: how could such a complex system arise?

Minimalist theory reframes the question by identifying the simplest possible operation capable of generating linguistic structure: Merge.


Merge takes two elements and forms an unordered set. From this trivial operation, three properties follow:

Hierarchical structure
Displacement
Infinite generativity

If Merge is computationally minimal, and computational systems require such operations anyway, then language evolution becomes plausible. Complexity arises not from baroque mechanisms, but from interaction with interfaces.
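The three consequences of Merge can be illustrated with a short sketch. This is a deliberate simplification, not a formal proposal: the lexical items and the helper names (`merge`, `depth`, `embed`) are my own illustrative choices, and real syntactic objects carry labels and features that a bare `frozenset` omits.

```python
# A minimal sketch of Merge as unordered-set formation.
# Illustrative only; syntactic objects are reduced to strings and sets.

def merge(x, y):
    """Merge: take two syntactic objects, form an unordered set."""
    return frozenset([x, y])

# Order-independence: Merge yields a set, not a sequence.
assert merge("read", "books") == merge("books", "read")

# Hierarchy: Merge can apply to its own output, nesting structure.
vp = merge("read", "books")        # {read, books}
tp = merge("will", vp)             # {will, {read, books}}

def depth(obj):
    """Depth of nesting, i.e. amount of hierarchical structure."""
    if isinstance(obj, frozenset):
        return 1 + max(depth(e) for e in obj)
    return 0

def embed(obj, n):
    """Unbounded generativity: repeated Merge deepens structure without limit."""
    for _ in range(n):
        obj = merge("that", obj)
    return obj
```

Displacement, on this view, is just the special case in which Merge takes an element already inside the structure it combines with; nothing beyond the one operation is added.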

5: Internal Language and Externalization

One of the most consequential distinctions in recent theory is between:

The internal system: a thought-generating computational mechanism
Externalization: mapping internal structures to sensory-motor systems

This distinction resolves multiple puzzles:

Why languages are learned uniformly but differ superficially
Why linear order is so variable and unstable
Why sign languages share deep grammatical properties with spoken languages

Externalization is messy because it interfaces with ancient motor and perceptual systems. The internal system is simple because it evolved recently and serves a narrow function: structuring thought.

6: Structure Dependence and the Brain

Structure dependence, the fact that grammatical operations target hierarchical structure rather than linear order, is among the strongest universals in linguistics. Forming a yes/no question from "eagles that can fly can swim", for example, speakers front the main-clause auxiliary ("Can eagles that can fly swim?"), never the linearly first one ("*Can eagles that fly can swim?").
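The contrast can be made concrete with a toy sketch, using the classic auxiliary-fronting case. This is not a parser, and the word lists and helper names are illustrative assumptions: a linear rule fronts the first auxiliary it meets and derives an ungrammatical string, while a structure-sensitive rule, for which the relative clause is embedded inside the subject, fronts the main-clause auxiliary.

```python
# Toy illustration of structure dependence.
# Declarative sentence: "eagles that can fly can swim".

def linear_question(words):
    """Front the FIRST auxiliary in linear order (the wrong rule)."""
    i = words.index("can")
    return [words[i]] + words[:i] + words[i + 1:]

def flatten(tree):
    """Yield the words of a nested list in order."""
    out = []
    for node in tree:
        out += flatten(node) if isinstance(node, list) else [node]
    return out

def structural_question(subject, predicate):
    """Front the MAIN-clause auxiliary (the structure-dependent rule).
    The relative clause is nested inside the subject, so its auxiliary
    is simply not visible to the rule."""
    aux, *rest = predicate
    return [aux] + flatten(subject) + rest

flat = ["eagles", "that", "can", "fly", "can", "swim"]
subject = ["eagles", ["that", "can", "fly"]]   # relative clause embedded
predicate = ["can", "swim"]

# Linear rule:     *"Can eagles that fly can swim" (ungrammatical)
# Structural rule:  "Can eagles that can fly swim" (what children produce)
```

Children never produce the linear-rule output, even though it is the computationally simpler rule over strings; this is the core of the structure-dependence argument.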


Neurocognitive evidence now supports this claim. Experimental studies show that:

Hierarchical stimuli activate core language areas
Linear-order-based systems do not
Even cognitively impaired individuals can acquire structure-based grammars

This convergence between theory and neuroscience is rare, and significant. It suggests that generative principles track real properties of the brain.

7: Semantics Without Reference

Chomsky’s critique of referential semantics is often misunderstood. He does not deny reference; he denies that words refer by themselves.


Reference is an act, performed by speakers using words in context. Meaning resides in mental structures, not in direct word–world pairings.


Formal semantics, when properly understood, operates over mental representations, not objects in the external world. Events, individuals, and properties are theoretical constructs, useful, but internal.


This view aligns generative linguistics with a long philosophical tradition from Aristotle to Strawson.

8: Language as a System for Thought

The most radical implication of the biolinguistic view is this:

Language is not primarily for communication.


Communication is secondary, an externalization of an internal system whose primary function is thought construction.


This explains why language is so poorly adapted to communication:

Ambiguity is rampant
Externalization is inefficient
Interpretation relies heavily on pragmatics

What language does extraordinarily well is generate structured thoughts.

Conclusion: Why Generative Grammar Still Matters

Generative grammar persists not because of institutional inertia, but because it addresses questions other approaches avoid:

Why language exists at all
Why it has the form it does
Why it is uniquely human

The biolinguistic framework does not claim final answers. It claims something more modest and more demanding: that linguistic theory must be explanatory, biologically grounded, and cognitively real.

That challenge remains unmet elsewhere.


Suggested Readings:

Chomsky, N. (2000). New Horizons in the Study of Language and Mind. Cambridge: Cambridge University Press.

Ohta, S. (2020). Review of Why Only Us: Language and Evolution, by Robert C. Berwick and Noam Chomsky (MIT Press, Cambridge, MA, 2016, vii+224pp). English Linguistics: Journal of the English Linguistic Society of Japan, 37(1), 101–111.

Source: Interview with Professor Noam Chomsky - Oxford University Linguistics Society
