Deep Syntax — From Phrase Structure to Merge: The Hidden Mechanics of Grammar
Intro
Welcome.
What follows is not a description of language in the ordinary sense. It is an attempt to trace what lies beneath it: the computational architecture that allows finite human minds to generate infinite structured expressions.
At first glance, syntax appears to be a system of rules governing sentence formation. But as we move deeper, from phrase structure rules to transformations, from constraints to features, and finally to Minimalism, we begin to see something more radical emerging:
language is not a set of rules about sentences, but a recursive system for building structure itself.
1. Syntax as a System of Expansion
Traditional generative grammar begins with a simple insight:
A finite set of rules can generate an infinite number of sentences.
We formalize this using phrase structure rules:
S → NP + VP
VP → V / V + NP / V + NP + NP / V + S
NP → Det + N / Proper N / N + PP
PP → P + NP
N → Adj + N / lexical entries
At first, this looks like a descriptive taxonomy. But its implications are deeper:
Grammar is not a list of sentences.
It is a generative system.
From a small rule set emerges potentially unbounded expression.
This is the first foundational claim of generative linguistics:
Finitude of rules, infinity of sentences.
2. Generation and Parsing — Two Directions of the Same System
Once rules exist, they can be used in two ways:
Generation
Start with S → expand until a sentence emerges.
Parsing
Start with a sentence → reconstruct the structure that could have generated it.
These are not separate cognitive activities. They are inverse processes over the same system.
This leads to a deeper claim:
The mind does not store sentences. It computes them.
Language, then, is not retrieval; it is derivation.
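The claim that the mind computes rather than retrieves can be made concrete with a minimal sketch: the phrase structure rules from section 1, stored as a small Python dictionary, plus a recursive expansion function. The lexicon (cat, mouse, Mary, etc.) is invented here purely for illustration; only the rule shapes come from the text.

```python
import random

# The phrase structure rules from section 1, as a rewrite table.
# Nonterminals map to lists of possible expansions; any symbol not
# in the table is treated as a terminal word.
GRAMMAR = {
    "S":     [["NP", "VP"]],
    "VP":    [["V"], ["V", "NP"], ["V", "NP", "NP"]],
    "NP":    [["Det", "N"], ["PropN"], ["Det", "N", "PP"]],
    "PP":    [["P", "NP"]],
    "Det":   [["the"], ["a"]],
    "N":     [["cat"], ["mouse"], ["house"]],
    "PropN": [["Mary"]],
    "V":     [["saw"], ["gave"]],
    "P":     [["in"], ["near"]],
}

def generate(symbol="S"):
    """Start with a symbol and expand until only words remain (derivation)."""
    if symbol not in GRAMMAR:                  # terminal: a word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])  # pick one rewrite
    words = []
    for sym in expansion:
        words.extend(generate(sym))             # recursive expansion
    return words

print(" ".join(generate()))  # e.g. "Mary saw the mouse in the house"
```

Because NP can contain a PP, which contains another NP, repeated runs produce arbitrarily long sentences from this finite table: finitude of rules, infinity of sentences.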
3. The Hidden Structure Problem
Consider:
“Mary saw the mouse in the house”
This sentence admits multiple structural interpretations.
Why?
Because linear order does not determine hierarchical structure.
This leads to a key realization:
Surface strings underdetermine syntax.
So grammar must involve:
hierarchical structure
invisible attachments
structural ambiguity resolution
Thus:
Grammar is not linear sequencing; it is hierarchical computation.
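The ambiguity of "Mary saw the mouse in the house" can be sketched directly: two distinct trees, encoded here as nested `(label, children...)` tuples, that flatten to the identical word string. The tuple encoding is just an illustrative convention, not a standard notation.

```python
# Reading 1: the PP attaches inside the NP — the mouse that is in the house.
np_attach = ("S",
    ("NP", "Mary"),
    ("VP",
        ("V", "saw"),
        ("NP",
            ("Det", "the"), ("N", "mouse"),
            ("PP", ("P", "in"),
                   ("NP", ("Det", "the"), ("N", "house"))))))

# Reading 2: the PP attaches to the VP — the seeing happened in the house.
vp_attach = ("S",
    ("NP", "Mary"),
    ("VP",
        ("V", "saw"),
        ("NP", ("Det", "the"), ("N", "mouse")),
        ("PP", ("P", "in"),
               ("NP", ("Det", "the"), ("N", "house")))))

def leaves(tree):
    """Flatten a tree back into its linear string of words."""
    if isinstance(tree, str):
        return [tree]
    words = []
    for child in tree[1:]:       # tree[0] is the label
        words.extend(leaves(child))
    return words

# Different hierarchies, one surface string:
assert np_attach != vp_attach
assert leaves(np_attach) == leaves(vp_attach)
```

Flattening is lossy: `leaves` destroys exactly the information (the attachment site) that distinguishes the two readings, which is why the surface string underdetermines the syntax.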
4. Deep Structure, Surface Structure, and Transformations
To explain discrepancies between meaning and form, classical generative grammar introduces two levels:
Deep Structure
Represents underlying semantic relations:
who did what to whom
predicate–argument structure
Surface Structure
Final expressed form:
word order
stylistic variation
syntactic rearrangement
Transformations connect the two.
Example:
Deep:
Mary saw the mouse
Surface:
The mouse was seen by Mary
This is not lexical substitution; it is structural reorganization.
5. Syntax as Movement and Constraint
Transformations include:
movement
deletion
insertion
reordering
But unrestricted transformations overgenerate possibilities.
So grammar introduces constraints:
locality restrictions
island constraints
structure preservation
movement limitations
This yields a key shift:
Grammar is not only generative; it is also restrictive.
It does not merely produce sentences; it filters impossible ones.
6. Features — The Invisible Mechanism of Agreement
Modern theory replaces many surface rules with features:
[+plural], [−plural]
[+tense]
φ-features (person, number, gender)
Agreement is no longer a rule like:
“add -s in third person singular”
Instead:
it is a feature-checking process during derivation.
Example:
“The cats runs”
fails because:
subject = plural
verb = singular
feature mismatch → derivational crash
Grammar becomes:
a system of formal consistency checking.
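Feature checking can be sketched as a few lines of Python: a toy lexicon annotated with a number feature, and a check that either converges or crashes the derivation. The lexicon and feature inventory here are invented for illustration.

```python
# A toy lexicon carrying a number feature ([+/- plural] reduced to sg/pl).
LEXICON = {
    "cat":  {"num": "sg"},
    "cats": {"num": "pl"},
    "runs": {"num": "sg"},
    "run":  {"num": "pl"},
}

def check_agreement(subject_noun, verb):
    """Converge only if phi-features match; otherwise the derivation crashes."""
    subj, v = LEXICON[subject_noun], LEXICON[verb]
    if subj["num"] != v["num"]:
        raise ValueError(
            f"feature mismatch: {subject_noun}[{subj['num']}] vs "
            f"{verb}[{v['num']}] -> derivational crash")
    return f"the {subject_noun} {verb}"

print(check_agreement("cats", "run"))   # prints "the cats run" — converges
# check_agreement("cats", "runs")       # raises ValueError — feature mismatch
```

Note there is no rule of the form "add -s in third person singular" anywhere: only a consistency check over features carried by the lexical items themselves.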
7. Recursion — The Engine of Infinite Expression
Recursion allows structure to embed within structure:
NP → NP + PP
VP → VP + PP
N → Adj + N
Examples:
the cat
the furry cat
the very furry cat in the house near the river
the man who said that the woman who believed that…
This is not stylistic elaboration.
It is structural necessity:
recursion allows infinite generation from finite rules.
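A minimal sketch of the NP → N + PP, PP → P + NP loop as a single self-invoking function; the vocabulary is invented, and the depth parameter stands in for however many times the derivation chooses the recursive rule.

```python
NOUNS = ["cat", "house", "river"]

def np(depth):
    """Build an NP with `depth` levels of PP embedding: NP -> N (+ PP -> P + NP)."""
    noun = NOUNS[depth % len(NOUNS)]
    if depth == 0:
        return f"the {noun}"
    return f"the {noun} near " + np(depth - 1)   # the rule re-invokes itself

print(np(0))  # the cat
print(np(1))  # the house near the cat
print(np(2))  # the river near the house near the cat
```

One rule, three lines of code, unbounded output length: the infinity lives in the recursion, not in the rule inventory.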
8. Ambiguity — A Structural Property, Not an Error
Ambiguity arises naturally because:
linear strings collapse structure
multiple trees map to one surface form
attachment sites are underspecified
Thus ambiguity is not noise.
It is:
a structural consequence of hierarchical compression.
Meaning is not ambiguous because language is flawed—it is ambiguous because structure is richer than surface form.
9. Parsing as Computation
Parsing is not interpretation alone.
It is:
reconstruction of hidden derivational structure from observable output.
This aligns syntax with broader computational problems:
decoding signals
reconstructing latent variables
interpreting compressed data
Thus syntax becomes:
a theory of human symbolic computation.
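Parsing as the inverse of generation can be sketched with a tiny recursive-descent recognizer over a fragment of the earlier rule set: given a word string, it asks whether any derivation of the grammar could have produced it. Grammar and lexicon are illustrative, as before.

```python
# A small fragment of the grammar; symbols not in the table are words.
GRAMMAR = {
    "S":     [["NP", "VP"]],
    "VP":    [["V", "NP"], ["V"]],
    "NP":    [["Det", "N"], ["PropN"]],
    "Det":   [["the"]],
    "N":     [["mouse"], ["cat"]],
    "PropN": [["Mary"]],
    "V":     [["saw"]],
}

def parse(symbol, words):
    """Return the leftover words for every way `symbol` can cover a prefix."""
    if symbol not in GRAMMAR:                       # terminal word
        return [words[1:]] if words and words[0] == symbol else []
    results = []
    for expansion in GRAMMAR[symbol]:
        rests = [words]
        for sym in expansion:                       # thread remainders through
            rests = [r2 for r in rests for r2 in parse(sym, r)]
        results.extend(rests)
    return results

def recognize(sentence):
    """A sentence is grammatical if some S-derivation consumes every word."""
    return [] in parse("S", sentence.split())

print(recognize("Mary saw the mouse"))   # True
print(recognize("saw Mary mouse the"))   # False
```

Generation runs the rules forward from S; `parse` runs the same rules backward from the observable output, reconstructing the latent derivation. They are inverse processes over one system.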
10. The Minimalist Turn — Why Syntax Becomes Simpler
Classical grammar contains:
phrase structure rules
transformations
constraints
feature systems
Minimalism asks a deeper question:
What is the smallest possible system that can generate language?
The answer radically simplifies everything.
11. Merge — The Core Operation of Syntax
At the center of Minimalist theory lies one operation:
Merge
Merge combines two elements into one structured object:
Merge(A, B) → {A, B}
That is all.
From this single operation, everything follows.
12. External Merge — Building Structure
External Merge combines separate items:
Merge(the, cat) → [the cat]
Merge(saw, [the mouse]) → [saw the mouse]
Merge(Mary, [saw the mouse]) → sentence structure emerges
There are no phrase structure rules.
Only:
recursive combination.
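External Merge can be sketched almost literally: `Merge(A, B) → {A, B}` becomes a one-line function returning a frozen set (frozen so that merged objects are hashable and can themselves be merged again). The example sentence follows the text; the encoding is illustrative.

```python
def merge(a, b):
    """The single structure-building operation: combine two objects into one."""
    return frozenset({a, b})

# Building "Mary saw the mouse" with nothing but repeated Merge:
the_mouse = merge("the", "mouse")          # {the, mouse}
saw_the_mouse = merge("saw", the_mouse)    # {saw, {the, mouse}}
sentence = merge("Mary", saw_the_mouse)    # {Mary, {saw, {the, mouse}}}

# No phrase structure rules anywhere — hierarchy falls out of nesting:
assert the_mouse in saw_the_mouse
assert saw_the_mouse in sentence
```

Note that the output is a two-element set, not a sequence: Merge yields hierarchy (containment) directly, while linear order is left to be fixed elsewhere.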
13. Internal Merge — The Origin of Movement
Movement is reinterpreted:
not displacement, but re-merging within structure.
Example:
Mary saw the mouse
→ Which mouse did Mary see?
The object is not moved—it is re-merged at a higher structural position.
Thus:
movement is an effect of recursion, not a separate operation.
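Internal Merge can be sketched as re-merge: the "moved" element is located inside the existing structure and merged again at the top, so no separate movement operation is needed. Ordered pairs replace sets here purely for readability; the `which mouse` example follows the text.

```python
def merge(a, b):
    """Merge sketched as an ordered pair, for readable output."""
    return (a, b)

# External Merge builds the underlying clause:
vp = merge("saw", "which mouse")
clause = merge("Mary", vp)            # ('Mary', ('saw', 'which mouse'))

def internal_merge(structure, target):
    """Re-merge `target`, which must already be inside `structure`, at the top."""
    def contains(node):
        return node == target or (isinstance(node, tuple)
                                  and any(contains(c) for c in node))
    assert contains(structure), "Internal Merge only re-merges existing material"
    return merge(target, structure)

question = internal_merge(clause, "which mouse")
print(question)
# ('which mouse', ('Mary', ('saw', 'which mouse')))
```

The lower copy of `which mouse` is still in place after re-merge: nothing was displaced, the same object simply participates in Merge twice, once low and once high.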
14. Why Merge Generates Hierarchy
Merge inherently produces:
binary branching
nested structures
hierarchical embedding
Therefore:
hierarchy is not imposed on language; it is produced by the operation itself.
15. Features as Constraints on Merge
Not all combinations are allowed.
So Merge is regulated by features:
agreement conditions
case requirements
tense compatibility
Example:
“The cats runs”
fails because feature mismatch blocks derivation.
Grammar becomes:
a system of constrained structure-building.
16. Economy Principles — The Logic of Minimal Effort
Minimalism introduces optimization constraints:
Least Effort: choose the simplest derivation
Last Resort: move only if necessary
Grammar behaves like an optimization system:
not maximal generation, but minimal computation.
17. The Collapse of Grammar into One Function
At its most abstract level:
all syntactic structure reduces to recursive Merge under feature constraints.
Everything else becomes:
derived
emergent
or descriptive shorthand
Phrase structure rules become summaries of repeated Merge.
Transformations become properties of re-merge.
Trees become visualizations of computation.
18. Language as a Mathematical System
Merge behaves like a recursive function:
takes two inputs
produces structured output
applies repeatedly
generates infinite hierarchy
Thus language resembles:
a naturally occurring formal system of symbolic computation.
Conclusion — What Syntax Finally Reveals
If we follow the trajectory from phrase structure to Minimalism, grammar undergoes a complete transformation:
from rules → to operations
from lists → to recursion
from transformations → to derivation
from complexity → to minimal computation
And at the end, one idea remains:
Language is not a set of structures. It is a single recursive act repeated endlessly.
If earlier linguistics asked what sentences are possible, Minimalism asks something deeper:
What is the smallest computational system that makes sentence formation possible at all?
And the surprising answer is:
almost nothing, just Merge constrained by features.

