Language, Thought, and the Biological Foundations of Grammar
Preface
This post grows out of a sustained engagement with Noam Chomsky’s generative program, not as doctrine, but as a research tradition animated by a small set of deceptively simple questions: What constitutes knowledge of language? How is that knowledge acquired? Why does language have the properties it has?
The immediate stimulus for the present post is a public interview given by Chomsky at the Oxford University Linguistics Society in 2020. Yet the ambition of the post is broader: to situate that discussion within the long arc of generative inquiry, to extract its conceptual core, and to show why these ideas remain central to linguistics, philosophy of language, and cognitive science.
This is not a biography, not an introduction, and not an apologetic defense. It is an attempt to articulate, systematically and critically, the explanatory vision underlying the biolinguistic enterprise.
Introduction: From Description to Explanation
Modern linguistics did not begin as a science of explanation. For much of the early twentieth century, it was a discipline of classification. Structural linguistics, particularly in its American incarnation, conceived of language as an inventory of observable elements (phonemes, morphemes, constructions) organized through distributional procedures.
This approach achieved descriptive rigor, but at a cost. It deliberately renounced questions about mental reality, acquisition, and biological grounding. Language was treated as an external artifact rather than an internal system.
The publication of Syntactic Structures (1957) marked a rupture. Chomsky’s intervention was not merely technical; it was methodological and philosophical. He rejected the idea that linguistics could remain taxonomic and still count as an explanatory science. A theory of language, he argued, must explain:
- What constitutes a speaker’s knowledge of language;
- How that knowledge is acquired;
- How it is put to use.
This post is organized around the consequences of taking those questions seriously.
1: The Failure of Taxonomic Linguistics
Structural linguistics inherited its self-image from the natural sciences, but its methods resembled those of pre-theoretical classification, not explanation. The “phonemic principle,” distributional analysis, and segmentation procedures were explicitly designed to avoid reference to unobservable mental entities.
Two assumptions dominated:
- The Boasian principle: languages differ without limit.
- Behaviorist learning theory: language is acquired through habit formation and analogy.
These assumptions were mutually reinforcing, and jointly untenable.
The poverty of stimulus argument shattered the learning-theoretic foundation. Children converge on highly specific grammatical systems despite receiving fragmentary, noisy, and finite input. No taxonomic procedure could explain this convergence.
The deeper problem, however, was methodological: procedures cannot discover the entities needed for explanation. To explain linguistic competence, one must posit abstract structures not recoverable from surface distributions alone.
2: Generative Grammar as a Biolinguistic Program
Generative grammar is often mischaracterized as a theory of syntax. In fact, it is better understood as a research program within cognitive biology.
On this view:
- Language is an internal property of individual minds/brains, not an external social artifact.
- The language faculty is a biological system, comparable to the visual or immune systems.
- The task of linguistics is to characterize that system’s initial state and the states it attains.
This framing transforms traditional questions. The goal is no longer to catalogue constructions but to identify the computational principles that make language possible at all.
3: The Poverty of Stimulus Revisited
The poverty of stimulus is not an empirical curiosity; it is a logical problem.
Children acquire knowledge that:
- Goes far beyond the evidence available to them;
- Is remarkably uniform across individuals and input conditions;
- Emerges without explicit instruction or reliable correction.
Recent experimental work confirms that by age two or three, children control abstract syntactic dependencies that cannot be inferred from surface frequency.
Any theory that attributes acquisition to general learning mechanisms must explain how such mechanisms extract structure that is not present in the data. To date, no non-innate account has succeeded.
4: Evolutionary Paradox and the Minimalist Turn
A persistent objection to generative grammar is evolutionary: how could such a complex system arise?
Minimalist theory reframes the question by identifying the simplest possible operation capable of generating linguistic structure: Merge.
Merge takes two elements and forms an unordered set. From this trivial operation follow:
- Hierarchical structure;
- Discrete infinity (unbounded recursion);
- The structure dependence of grammatical rules.
If Merge is computationally minimal, and computational systems require such operations anyway, then language evolution becomes plausible. Complexity arises not from baroque mechanisms, but from interaction with interfaces.
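The set-theoretic character of Merge can be made concrete with a small Python sketch. The lexical items, the `frozenset` encoding, and the `depth` helper are illustrative assumptions of mine, not part of the theory; the point is only that one binary, order-free operation suffices to build unbounded hierarchy.

```python
# A minimal sketch of Merge, modeling unordered binary sets as frozensets.
# Lexical items and helper functions here are illustrative, not theoretical claims.

def merge(x, y):
    """Merge: combine two syntactic objects into an unordered set."""
    return frozenset([x, y])

# Lexical items are atoms; repeated application of Merge yields hierarchy.
dp1 = merge("the", "girl")   # {the, girl}
dp2 = merge("the", "dog")    # {the, dog}
vp  = merge("saw", dp2)      # {saw, {the, dog}}
s   = merge(dp1, vp)         # {{the, girl}, {saw, {the, dog}}}

def depth(obj):
    """Hierarchical depth: atoms count 0; each set adds one level."""
    if isinstance(obj, frozenset):
        return 1 + max(depth(member) for member in obj)
    return 0

# Merge is symmetric (unordered), and the output is hierarchical, not linear.
print(merge("a", "b") == merge("b", "a"))  # True
print(depth(s))                            # 3
```

Because the output is a set rather than a sequence, linear order plays no role at this level; ordering only arises later, under externalization.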
5: Internal Language and Externalization
One of the most consequential distinctions in recent theory is between:
- The internal computational system (I-language), which generates hierarchical expressions; and
- Externalization, the mapping of those expressions onto sensorimotor systems (speech or sign).
This distinction resolves multiple puzzles, among them why languages vary so widely on the surface while sharing a common structural core. Externalization is messy because it interfaces with ancient motor and perceptual systems. The internal system is simple because it evolved recently and serves a narrow function: structuring thought.
6: Structure Dependence and the Brain
Structure dependence, the fact that grammatical operations are defined over hierarchical structure rather than linear order, is among the strongest universals in linguistics.
Neurocognitive evidence now supports this claim. Experimental studies show that language-related regions such as Broca’s area respond selectively to rules defined over hierarchical structure, whereas rules defined over linear order engage distinct, non-linguistic networks.
This convergence between theory and neuroscience is rare, and significant. It suggests that generative principles track real properties of the brain.
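The textbook illustration of structure dependence is yes/no question formation: children front the auxiliary of the main clause, never the linearly first auxiliary. A toy Python sketch can contrast the two rules; the tree encoding and rule functions are my illustrative assumptions, not a formal grammar.

```python
# A toy contrast between a linear rule and a structure-dependent rule for
# yes/no question formation. The encoding is illustrative only.

AUX = {"is", "can", "will"}

# "The man who is tall is happy": the relative clause sits inside the
# subject, so the main-clause auxiliary is the *second* "is" in the string.
sentence = (("the", "man", ("who", "is", "tall")), ("is", "happy"))

def words(tree):
    """Flatten a nested tuple tree into its linear word string."""
    out = []
    for node in tree:
        if isinstance(node, tuple):
            out.extend(words(node))
        else:
            out.append(node)
    return out

def linear_rule(tree):
    """Front the linearly first auxiliary -- the rule children never try."""
    ws = words(tree)
    i = next(j for j, w in enumerate(ws) if w in AUX)
    return [ws[i]] + ws[:i] + ws[i + 1:]

def structural_rule(tree):
    """Front the main-clause auxiliary, ignoring auxiliaries embedded
    inside the subject."""
    subject, (aux, *rest) = tree
    return [aux] + words(subject) + rest

print(" ".join(linear_rule(sentence)))      # *is the man who tall is happy
print(" ".join(structural_rule(sentence)))  # is the man who is tall happy
```

The linear rule is simpler to state over the word string, yet no child ever produces its ungrammatical output, which is precisely the acquisition fact that structure dependence explains.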
7: Semantics Without Reference
Chomsky’s critique of referential semantics is often misunderstood. He does not deny reference; he denies that words refer by themselves.
Reference is an act, performed by speakers using words in context. Meaning resides in mental structures, not in direct word–world pairings.
Formal semantics, when properly understood, operates over mental representations, not objects in the external world. Events, individuals, and properties are theoretical constructs, useful, but internal.
This view aligns generative linguistics with a long philosophical tradition from Aristotle to Strawson.
8: Language as a System for Thought
The most radical implication of the biolinguistic view is this:
Language is not primarily for communication.
Communication is secondary, an externalization of an internal system whose primary function is thought construction.
This explains why language is so poorly adapted to communication: it is rife with ambiguity, ellipsis, and structure that is never pronounced, properties that hinder transmission but are harmless, or even useful, to internal computation.
What language does extraordinarily well is generate structured thoughts.
Conclusion: Why Generative Grammar Still Matters
Generative grammar persists not because of institutional inertia, but because it addresses questions other approaches avoid: why language has the specific properties it has, how children acquire it from impoverished input, and how it could have arisen in the species.
The biolinguistic framework does not claim final answers. It claims something more modest and more demanding: that linguistic theory must be explanatory, biologically grounded, and cognitively real.
That challenge remains unmet elsewhere.
Suggested Readings:
Chomsky, N. (2000). New Horizons in the Study of Language and Mind. Cambridge: Cambridge University Press.
Ohta, S. (2020). Review of Why Only Us: Language and Evolution, by Robert C. Berwick and Noam Chomsky (MIT Press, Cambridge, MA, 2016, vii+224pp.). English Linguistics: Journal of the English Linguistic Society of Japan, 37(1), 101–111.
Source: Interview with Professor Noam Chomsky - Oxford University Linguistics Society

