
A Computational Approach to Syntax

 


Feature-Based Grammar in the Minimalist Tradition with Urdu and Saraiki Comparisons

Riaz Laghari
Visiting Lecturer in English, Quaid-i-Azam University & National University of Modern Languages (NUML), Islamabad


For my students (QAU, NUML, and beyond)

STRUCTURE 

PART I — FOUNDATIONS OF COMPUTATIONAL SYNTAX

1: What is Syntax? From Rules to Computation

2: The Lexicon and Feature Architecture

3: Phrase Structure and X-bar Theory

4: Hierarchical Structure vs Linear Order

PART II — CORE MECHANISMS

5: Merge and Structure Building

6: Agree and Feature Checking

7: Case Theory Across Languages

8: Theta Theory and Argument Structure

PART III — MOVEMENT AND CONSTRAINTS

9: A-Movement

10: A′-Movement (Wh, Focus, Topicalization)

11: Locality Constraints (Subjacency, Phases)

12: The EPP and Subjecthood

PART IV — INTERFACES AND INTERPRETATION

13: Binding Theory

14: Information Structure

15: PF and LF Interfaces

PART V — CROSS-LINGUISTIC SYNTAX (CORE CONTRIBUTION)

16: English vs Urdu Word Order

17: Saraiki Syntax and Argument Structure

18: Case Systems: Nominative vs Ergative

19: Agreement Systems in South Asian Languages

PART VI — ADVANCED TOPICS

20: Phase Theory

21: Minimalism and Economy Conditions

22: Computational Modeling of Syntax

23: Syntax and Cognition

PART VII — PEDAGOGICAL AND RESEARCH EXTENSIONS

24: Teaching Syntax Effectively

25: Syntax in NLP and AI

26: Research Directions in Pakistani Linguistics 


Preface

Syntax, when viewed through the lens of generative grammar, emerges not as a descriptive inventory of sentence patterns but as a computational system of the human mind. Since the foundational work of Noam Chomsky, linguistic theory has shifted toward uncovering the formal properties of this system, reducing grammar to operations, features, and interface conditions.

This book advances that project by integrating:

Minimalist syntax
Feature-based computation
Cross-linguistic evidence from English, Urdu, and Saraiki

The inclusion of Urdu and Saraiki is not merely illustrative; it is theoretical. These languages reveal how parametric variation emerges from feature specifications, not from fundamentally different grammars.

PART I — FOUNDATIONS OF COMPUTATIONAL SYNTAX

1: What is Syntax? From Rules to Computation

1.1 Introduction: The Shift from Description to Explanation

Syntax has undergone a profound transformation over the last century. What began as a largely descriptive enterprise, cataloguing sentence patterns and grammatical rules, has evolved into a formal science of the human mind. This transformation is most closely associated with the work of Noam Chomsky, who reconceptualized language as a generative system capable of producing an infinite number of sentences from a finite set of elements.


In traditional grammar, the sentence was the unit of analysis. In modern syntax, however, the focus shifts to the underlying computational system that generates those sentences. The central question is no longer “What is a grammatical sentence?” but rather:


What mental operations and representations allow humans to produce and understand sentences?


This question marks the transition from surface description to deep explanation.

1.2 Language as a Computational System

At its core, human language can be understood as a computational procedure. This procedure takes lexical items as input and generates structured expressions as output. The system is:

Finite in means (limited lexicon and rules)
Infinite in output (unbounded sentence generation)

This property is known as discrete infinity, a defining characteristic of human language.

Consider the recursive nature of embedding:

John believes [that Mary said [that Ali thinks [that…]]]

There is no theoretical limit to such recursion. This cannot be explained by memorization or linear rules alone. Instead, it requires a generative mechanism.
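This recursive behavior can be made concrete with a small sketch. The Python function below (a hypothetical illustration, not a linguistic formalism) applies one finite rule, "X believes that S", recursively: a finite means yields output of any depth.

```python
# Discrete infinity in miniature: one finite rule ("X believes that S"),
# applied recursively, generates embeddings of arbitrary depth.
def embed(subjects, depth):
    """Build a clause with `depth` levels of 'believes that' embedding."""
    if depth == 0:
        return "it is raining"
    subj = subjects[depth % len(subjects)]
    return f"{subj} believes that {embed(subjects, depth - 1)}"

print(embed(["John", "Mary", "Ali"], 3))
# → John believes that Ali believes that Mary believes that it is raining
```

No finite list of memorized sentences could reproduce this behavior; only the recursive rule does.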

1.3 Competence vs Performance

A crucial distinction in generative grammar is that between:

Competence: the internalized knowledge of language
Performance: the actual use of language in real situations

Performance is affected by:

Memory limitations
Processing constraints
Social and contextual factors

Syntax, as a scientific discipline, is concerned primarily with competence, the idealized system underlying linguistic ability.

1.4 The Generative Enterprise

A grammar, in the generative sense, is not a set of prescriptive rules but a formal system that generates all and only the grammatical sentences of a language.

Such a grammar must satisfy three criteria:

Descriptive adequacy: correctly captures native speaker intuitions
Explanatory adequacy: explains how such knowledge is acquired
Computational efficiency: operates with minimal mechanisms

The third requirement becomes central in later developments, particularly in the Minimalist Program.

1.5 From Phrase Structure Rules to Minimalism

Early generative grammar relied heavily on phrase structure rules, such as:

S → NP VP
VP → V NP

While descriptively useful, these rules were:

Redundant
Language-specific
Not cognitively economical

The evolution toward Minimalism seeks to reduce grammar to:

General principles
Feature-driven operations

Thus, instead of multiple rules, we derive structure from a single operation: Merge.

1.6 Hierarchical Structure vs Linear Order

One of the most important insights of modern syntax is that language is hierarchical, not merely linear.

Consider the ambiguity:

Old men and women

This can mean:

[Old men] and [women]
Old [men and women]

This ambiguity cannot be captured by word order alone. It requires hierarchical structure, typically represented through tree diagrams or bracket notation.
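The two readings can be made explicit by encoding each bracketing as a nested list (an illustrative encoding chosen here, not standard notation): both trees have exactly the same linear yield but different constituents.

```python
# Same string, two structures: nested lists stand in for trees.
parse1 = [["old", "men"], "and", "women"]    # [old men] and [women]
parse2 = ["old", ["men", "and", "women"]]    # old [men and women]

def leaves(tree):
    """Return the linear yield (word sequence) of a tree."""
    if isinstance(tree, str):
        return [tree]
    return [word for child in tree for word in leaves(child)]

# Identical word order, distinct hierarchies:
assert leaves(parse1) == leaves(parse2) == ["old", "men", "and", "women"]
assert parse1 != parse2
```

The ambiguity lives entirely in the hierarchy; the linear string cannot distinguish the readings.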

1.7 Evidence from Urdu and Saraiki

The importance of hierarchical structure becomes even clearer when we compare English with Urdu and Saraiki.

English (SVO):

John ate apples.

Urdu (SOV):

جان نے سیب کھائے
Jān-ne seb khāye (John-ERG apples ate)

Saraiki:

جان نے سیب کھادے
Jān-ne seb khāde (John-ERG apples ate)

Despite differences in word order, all three languages share:

Predicate–argument structure
Hierarchical organization
Feature dependencies

This strongly supports the hypothesis of a Universal Grammar, where variation is superficial and deeply constrained.

1.8 The Role of Features

Modern syntax reduces grammatical variation to features, which are properties of lexical items.

Examples include:

Person (1st, 2nd, 3rd)
Number (singular, plural)
Case (nominative, accusative, ergative)

Features drive syntactic computation. They determine:

Agreement
Movement
Case assignment

Thus, syntax becomes a system of feature checking and valuation.

1.9 Grammaticality Judgments

A central method in syntactic theory is the use of grammaticality judgments.

Examples:

✓ She likes him
*She likes he

The asterisk (*) marks ungrammaticality.

Such judgments are not arbitrary; they reflect the internal grammar of speakers. Importantly, they allow linguists to:

Test hypotheses
Identify constraints
Build formal models

1.10 The Notion of Structure Dependence

One of the strongest arguments for the mental reality of syntax is structure dependence.

Consider forming a question:

The boy is happy → Is the boy happy?

Now consider:

The boy who is playing is happy

The correct question is:

Is the boy who is playing happy?

Not:

*Is the boy who playing is happy?

This shows that rules operate on structure, not linear order.
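The contrast can be simulated directly. In the sketch below (the clause encoding is a deliberate oversimplification introduced here for illustration), a linear rule "front the first auxiliary" derives the starred string, while a structure-dependent rule that treats the whole subject NP as one constituent derives the grammatical question.

```python
def linear_rule(words):
    """WRONG: front the linearly first auxiliary 'is'."""
    i = words.index("is")
    return " ".join(["Is"] + words[:i] + words[i + 1:]) + "?"

def structural_rule(subject_np, aux, predicate):
    """RIGHT: front the matrix auxiliary, keeping the subject NP intact."""
    return " ".join([aux.capitalize()] + subject_np + predicate) + "?"

sentence = "the boy who is playing is happy".split()
print(linear_rule(sentence))
# → Is the boy who playing is happy?   (the starred, ungrammatical string)
print(structural_rule(["the", "boy", "who", "is", "playing"], "is", ["happy"]))
# → Is the boy who is playing happy?
```

The linear rule fails precisely because it ignores constituency; the structural rule succeeds because it operates on the hierarchy.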

1.11 Syntax and Cognition

Syntax is not an isolated system; it is part of a broader cognitive architecture.

It interfaces with:

Semantics (meaning)
Phonology (sound)

Thus, language involves:

Form
Meaning
Computation

This triadic relationship makes syntax central to understanding the human mind.

1.12 Toward a Minimalist Perspective

The Minimalist Program seeks to answer a fundamental question:

What is the simplest possible system that can account for linguistic competence?

To answer this, it proposes:

Elimination of redundancy
Reduction of operations
Economy principles

The goal is not just descriptive accuracy but theoretical elegance.

1.13 Why Urdu and Saraiki Matter

Most syntactic theory has historically been based on English and a few European languages. However, Urdu and Saraiki provide crucial insights:

Ergative alignment
Rich agreement systems
Flexible word order

These features challenge simplistic models and push theory toward greater universality.

1.14 Summary

This chapter has established the conceptual foundation of the book:

Syntax is a computational system
Structure is hierarchical, not linear
Grammar is feature-driven
Variation across languages is parametric

These principles will guide the chapters that follow.

1.15 Exercises

Exercise 1

Identify whether the ambiguity below is structural or lexical:

Visiting relatives can be annoying

Exercise 2

Provide Urdu and Saraiki equivalents for:

The boy is eating an apple
Analyze the word order.

Exercise 3

Explain why the following is ungrammatical:

*Is the boy who playing is happy?

1.16 Further Reading

  • Chomsky, N. (1957). Syntactic Structures
  • Chomsky, N. (1995). The Minimalist Program
  • Carnie, A. Syntax: A Generative Introduction

2: The Lexicon and Feature Architecture

2.1 Introduction: The Lexicon as the Engine of Syntax

If Chapter 1 established syntax as a computational system, the present chapter identifies its fuel: the lexicon.


In the generative tradition associated with Noam Chomsky, the lexicon is not a simple list of words. It is a structured repository of feature bundles that feed the syntactic computation. Every derivation begins with the selection of items from this repository, forming what is known as the numeration.


Thus, syntax does not operate on words per se—it operates on features encoded in lexical items.

2.2 The Nature of Lexical Items

A lexical item is a complex object consisting of multiple layers of information:

Phonological Form (PF): how the item is pronounced
Semantic Form (LF): what the item means
Syntactic Features: how the item behaves structurally

A simplified representation is:

Lexical Item = ⟨Phonology, Syntax, Semantics⟩

For example:

eat:

  Category: V

  θ-grid: ⟨Agent, Theme⟩

  Features: [V, uφ]

This representation shows that lexical items are instructions for computation, not mere labels.

2.3 Feature Types: The Core Distinction

Modern syntax revolves around a crucial distinction:

Interpretable vs Uninterpretable Features

Interpretable features (iF)

Contribute to meaning
Survive at Logical Form

Uninterpretable features (uF)

Have no semantic content
Must be checked and deleted before Spell-Out

This distinction is central to the Minimalist Program and is presented accessibly in introductory textbooks such as Carnie's Syntax: A Generative Introduction.

2.4 Phi-Features (φ-features)

Phi-features encode agreement properties:

Person: 1st, 2nd, 3rd
Number: singular, plural
Gender: masculine, feminine, neuter

English:

She runs (3rd person singular agreement)

Urdu:

وہ جاتی ہے
woh jātī hai 'she goes' (agreement in gender and number)

Saraiki:

اوہ ویندی اے
oh vendī e 'she goes'

Unlike English, Urdu and Saraiki exhibit rich agreement morphology, making φ-features more visible.

2.5 Case Features

Case is a fundamental property of noun phrases (DPs).

English Case System:

Nominative: subject position
Accusative: object position

Urdu/Saraiki Case System:

Case              | Marker     | Function
Ergative          | -نے (-ne)  | subject (perfective)
Accusative/Dative | -کو (-ko)  | object/experiencer

Example:

جان نے کتاب پڑھی
John-ERG book read

Here, the subject is marked ergative, and agreement shifts accordingly.

2.6 The Feature Matrix

Lexical items can be represented as feature matrices, which formalize their properties.

Example (English DP):

John:

Category: D

Features:

[iφ: 3rd, singular]

 [uCase]

Example (Urdu Verb):

کھایا (khāyā):

Category: V

Features:

[θ: Agent, Theme]

[uφ]

[Aspect: perfective]

These matrices allow syntax to function as a feature-matching system.
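Such feature matrices have a direct computational rendering. The sketch below (the class and field names are my illustrative encoding, following the chapter's iF/uF notation rather than any standard library) represents lexical items as feature bundles.

```python
from dataclasses import dataclass, field

@dataclass
class LexicalItem:
    """A lexical item as a bundle: form + category + feature matrix."""
    form: str
    category: str                              # D, V, T, C, ...
    iF: dict = field(default_factory=dict)     # interpretable features
    uF: set = field(default_factory=set)       # uninterpretable features

# The English DP and the Urdu verb from the examples above:
john = LexicalItem("John", "D", iF={"person": 3, "number": "sg"}, uF={"Case"})
khaya = LexicalItem("کھایا", "V",
                    iF={"aspect": "perfective", "theta": ("Agent", "Theme")},
                    uF={"phi"})
print(john.uF, khaya.uF)   # each item enters the derivation with its uF intact
```

On this view, the lexicon is literally a database of such objects, and syntax is the procedure that consumes them.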

2.7 The Numeration

A derivation begins with a numeration, a selected set of lexical items:

Numeration = {John, will, eat, apples}

Each item enters the derivation with its feature specifications intact.

The syntactic system then:

Merges items
Checks features
Eliminates uninterpretable features

2.8 Functional vs Lexical Categories

A fundamental distinction in syntax is between:

Lexical Categories

N (noun)
V (verb)
A (adjective)

These carry semantic content.

Functional Categories

T (tense)
C (complementizer)
D (determiner)
v (light verb)

These carry grammatical features and drive computation.

2.9 Functional Structure in English vs Urdu/Saraiki

English:

Strong T → overt auxiliaries
Fixed word order

Urdu/Saraiki:

Rich morphology
Flexible word order
Strong v features (ergativity)

Thus, variation across languages emerges from feature distribution, not different grammatical systems.

2.10 Feature Checking and Valuation

Features must be checked through syntactic operations.

Agree Mechanism:

A Probe (with uF) searches for a Goal (with iF)
Once matched, features are valued and deleted

Example:

T [uφ] → agrees with DP [iφ]

This explains subject–verb agreement across languages.
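A minimal sketch of this valuation step (the dict encoding is illustrative, not a standard formalism): a probe with an unvalued φ-feature copies values from a matching goal and deletes its uF.

```python
def agree(probe, goal):
    """Value the probe's unvalued phi-features from the goal, then delete them."""
    if "phi" in probe["uF"] and "phi" in goal["iF"]:
        probe["valued"] = dict(goal["iF"]["phi"])  # copy person/number values
        probe["uF"].discard("phi")                 # uF checked and deleted
        return True
    return False

T = {"uF": {"phi"}, "valued": None}                    # probe: T [uφ]
DP = {"iF": {"phi": {"person": 3, "number": "sg"}}}    # goal: DP [iφ]
agree(T, DP)
print(T)   # phi now valued on T; its uF set is empty
```

If no matching goal is found, the uF survives to Spell-Out and the derivation crashes; the function returning False models that failure.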

2.11 Strong vs Weak Features

Features can differ in strength:

Strong features → trigger movement
Weak features → checked in place

English:

Weak agreement → limited movement

Urdu/Saraiki:

Strong agreement → richer morphology

2.12 Lexical Variation and Parametric Differences

Languages differ in:

Feature strength
Feature presence
Feature interpretation

This leads to:

Word order variation
Agreement patterns
Case systems

Thus, linguistic diversity is reduced to parametric variation in feature systems.

2.13 A Comparative Feature Table

Feature     | English | Urdu     | Saraiki
Word Order  | SVO     | SOV      | SOV
Agreement   | Limited | Rich     | Rich
Case System | Nom-Acc | Ergative | Ergative
Gender      | Minimal | Strong   | Strong

2.14 Derivational Example

Sentence:

John will eat apples

Feature Interaction:

John: [iφ, uCase]
T: [uφ, EPP]
eat: assigns θ-role

Process:

Merge VP
Introduce subject
T agrees with subject
Case assigned
EPP satisfied

2.15 Urdu/Saraiki Derivation

Sentence:

جان نے سیب کھائے

Key differences:

Ergative Case assigned by v
Agreement with object
Verb-final structure

This demonstrates how the same system, under different feature settings, yields different outputs.

2.16 Lexicon and Cognitive Economy

The lexicon is not arbitrary; it is constrained by:

Economy principles
Learnability
Interface conditions

Thus, feature systems are:

Minimal
Efficient
Universal

2.17 Summary

This chapter has established that:

The lexicon consists of feature bundles
Features drive syntactic computation
Agreement and Case arise from feature interactions
Cross-linguistic variation is feature-based

2.18 Exercises

Exercise 1

Construct a feature matrix for:

She runs

Exercise 2

Analyze Case marking in:

علی نے کتاب پڑھی

Exercise 3

Compare agreement in:

She eats
وہ کھاتی ہے

3: Phrase Structure and X-bar Theory

Hierarchical Organization of Syntax with Urdu and Saraiki Comparisons

3.1 Introduction: From Words to Structures

Syntax is fundamentally hierarchical. Words are combined into phrases, phrases into clauses, and clauses into sentences. Understanding this hierarchy is crucial for:

Explaining ambiguity
Modeling movement
Linking form to meaning

X-bar Theory provides a formal schema that abstracts over individual languages while capturing universal properties of phrase structure.

3.2 The Motivation for X-bar Theory

Early generative grammar relied on phrase structure rules:

S → NP VP
VP → V NP

Limitations of this approach:

Redundancy: Separate rules for every category
Lack of generalization: No cross-linguistic abstraction
No explicit hierarchical representation beyond surface order

X-bar theory addresses these limitations by:

Introducing intermediate projections
Uniformly representing all categories
Distinguishing heads, complements, and specifiers

3.3 Basic X-bar Schema

The universal schema can be represented as:

XP → (Specifier) X′
X′ → X (Complement)

Where:

X = Head (lexical or functional)
X' = Intermediate projection
XP = Maximal projection
Specifier = typically subject, DP, or operator

3.4 Head, Complement, and Specifier

Head (X)

Core element
Determines the category of the phrase

Complement

Sister to the head
Completes the argument structure

Specifier

Sister to X'
Fulfills EPP, topicalization, or focus functions

Example (English VP):

[VP [V' V [NP apples]]]

Example (Urdu VP):

سیب کھایا (apples ate)
[VP [V' V [NP سیب]]]

Example (Saraiki VP):

سیب کھادے
[VP [V' V [NP سیب]]]

Notice: All languages share X-bar hierarchy, but surface order differs due to movement and feature-driven derivation.
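The uniformity claim can be expressed as a single constructor: one function builds every category's phrase, [XP Spec [X′ Head Complement]]. The nested-tuple encoding below is my illustration, not standard notation.

```python
def xbar(head, complement=None, specifier=None):
    """One schema for all categories: (XP Spec (X' Head Complement))."""
    return ("XP", specifier, ("X'", head, complement))

# English and Urdu VPs: identical hierarchy; only linearization differs.
vp_en = xbar(head=("V", "eat"), complement=("NP", "apples"))
vp_ur = xbar(head=("V", "کھایا"), complement=("NP", "سیب"))
print(vp_en)
# → ('XP', None, ("X'", ('V', 'eat'), ('NP', 'apples')))
```

Because every phrase comes out of the same constructor, rules that refer to "the specifier" or "the complement" generalize across N, V, A, and P for free.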

3.5 Cross-Linguistic Phrase Structure

English (SVO)

TP → Spec-TP (subject) + T′
T′ → T + VP
VP → V + NP
[TP John [T′ will [VP eat apples]]]

Urdu/Saraiki (SOV)

TP → Spec-TP (subject) + T′
T′ → vP + T
vP → DP (subject) + VP
VP → NP + V
[TP جان [T′ [vP tJohn [VP سیب کھایا]] T]]

Observation:

The underlying hierarchical relations remain constant
Verb-final word order is derived from feature-driven movement

3.6 Intermediate Projections (X')

X′ ensures that all phrases have uniform internal structure, which is necessary for:

Movement
Binding
Scope interpretation

Diagram:

XP
├── Specifier
└── X′
    ├── Head
    └── Complement

All categories (N, V, A, P) conform to this pattern.

3.7 Adjunction

Adjuncts are added at the X' level:

Optional, recursive
Do not affect argument structure

Example (English VP):

John eats apples quickly.
[VP [V′ [V′ V NP] [AdvP quickly]]]

Urdu VP:

جان نے سیب جلدی کھایا
[VP [V′ [V′ V NP] [AdvP جلدی]]]

Adjunction is cross-linguistically uniform but surface placement varies.

3.8 Specifier Positions

Specifiers serve multiple roles:

Subject position (Spec-TP)
Wh-elements (Spec-CP)
Focus/Topicalization

English:

Who did John see t?
Spec-CP hosts the wh-word

Urdu/Saraiki:

کس نے جان کو دیکھا؟ (kis-ne Jān ko dekhā? 'Who saw John?')
Spec-CP hosts the wh-word, but the verb remains final

Observation: Specifier positions are universal, but movement triggers vary.

3.9 Maximal Projection (XP)

XP represents the full phrase. Every constituent can be considered maximal:

NP → maximal projection of N
VP → maximal projection of V

Example:

[NP the tall man]
Head: man
Specifier: the
Adjunct: tall

3.10 Feature Checking within X-bar Theory

X-bar theory interacts with feature-driven syntax:

Head carries features (φ, Case, EPP)
Specifier may be a goal for Agree
Complement satisfies θ-roles

Example (Urdu SOV):

جان نے سیب کھایا
Element | Features  | Function
جان     | DP, iφ    | subject
سیب     | DP, iCase | object
کھایا   | V, uφ     | head, assigns θ-roles

3.11 Trees in Urdu and Saraiki

Urdu:

[TP جان [T′ [vP tJohn [VP سیب V]] T]]

Saraiki:

[TP جان [T′ [vP tJohn [VP سیب V]] T]]

Key observations:

Hierarchy identical
Surface verb placement differs due to v-movement and parametric feature differences
Agreement pattern differs due to φ-feature valuation

3.12 Adjunction vs Specifiers: A Clear Distinction

Feature         | Specifier | Adjunct
Obligatory      | Sometimes | Never
Number allowed  | Single    | Multiple
θ-role assigned | No        | No

3.13 Implications for Minimalism

Uniform X-bar structure allows economy in computation
Cross-linguistic variation is captured without changing the system
Provides a framework for movement, binding, and agreement

3.14 Exercises

Exercise 1

Draw X-bar trees for:

English: John quickly ate apples
Urdu: جان نے سیب جلدی کھایا
Saraiki: جان سیب جلدی کھادے

Exercise 2

Identify Spec, Head, and Complement in:

English: Who did Mary see?
Urdu:  کس نے ماریا کو دیکھا؟
Saraiki: کس نے ماریا کو ویکھیا؟

Exercise 3

Explain why adjuncts are recursive but specifiers are not.

3.15 Summary

X-bar theory provides a universal schema for phrases
All categories have Head → Complement → Specifier
Adjunction is optional and recursive
Urdu and Saraiki confirm hierarchical uniformity despite surface differences

4: Hierarchical Structure vs Linear Order: Linearization and Spec-Head Relations

From Structural Hierarchy to Surface Order: English, Urdu, and Saraiki

4.1 Introduction

Syntax operates on hierarchical structure, not simply linear strings of words. Yet, humans perceive and produce sentences linearly. Understanding how hierarchy maps to linear order is crucial for both theoretical syntax and computational modeling.

This chapter addresses:

Spec-Head agreement
C-command relations
Linearization across English, Urdu, and Saraiki
Illustrative X-bar and full phrase trees

4.2 Spec-Head Agreement: Formal Overview

Definition

Spec-Head agreement is a universal mechanism where features of the head (X) are checked against the features of the specifier (Spec-XP).

Trigger: Head carries unvalued features (uF)
Goal: Specifier carries interpretable features (iF)
Outcome: Features are valued and deleted before Spell-Out

Formal Representation

[XP Specifier_i [X' X_j ...]] → Agree(X_j[uF], Specifier_i[iF])

X_j[uF] = head with unvalued feature
Specifier_i[iF] = DP with interpretable feature
Agree ensures φ-feature valuation (person, number, gender)

4.3 English Spec-Head Agreement

Example 1: Subject-Verb Agreement

John runs.

Tree Representation:

[TP John_i [T′ T[uφ] [VP runs_j]]]

Mechanism:

T[uφ] probes DP in Spec-TP
Features matched → T values φ-features
Agreement triggers subject-verb concord

Observation: English exhibits nominative–accusative alignment. Spec-Head agreement is mandatory for grammaticality.

4.4 Urdu Spec-Head Agreement

Example 2: Ergative Alignment

جان نے کتاب پڑھی
Jān-ne kitāb paṛhī (John-ERG book read-FEM)

Analysis:

Spec-TP: DP جان (John)
T carries uninterpretable φ-features
v assigns ergative Case
Agreement aligns with object (NP کتاب) due to ergative pattern
[TP جان_i [T′ T[uφ] [vP tJohn_i [VP کتاب_j پڑھی]]]]

Observation: Unlike English, agreement can target the object, illustrating parametric variation.

4.5 Saraiki Spec-Head Agreement

جان سیب کھادے
Jān seb khāde (John apple ate-MASC)
Same hierarchical structure as Urdu
Verb agrees with object (φ-feature), not subject
Demonstrates cross-linguistic consistency of X-bar, with parametric feature variation

4.6 C-command: Definition and Applications

Definition

A node A C-commands node B if:

Every branching node dominating A also dominates B
A does not dominate B

Formal Rule

A c-commands B ↔ the first branching node dominating A also dominates B, and neither A nor B dominates the other

Significance

C-command underlies:

Binding (pronouns and reflexives)
Scope relations
Movement dependencies
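The definition can be checked mechanically. In the sketch below (binary trees as nested tuples; leaf labels assumed unique, an illustrative simplification), a node c-commands exactly what its sister contains.

```python
def contains(tree, x):
    """True if x occurs anywhere in the (sub)tree."""
    if tree == x:
        return True
    if isinstance(tree, tuple):
        return any(contains(child, x) for child in tree)
    return False

def c_commands(tree, a, b):
    """a c-commands b iff some node has a as one daughter
    and b somewhere inside the other daughter."""
    if not isinstance(tree, tuple):
        return False
    left, right = tree
    if left == a and contains(right, b):
        return True
    if right == a and contains(left, b):
        return True
    return c_commands(left, a, b) or c_commands(right, a, b)

# "John saw himself" as [TP John [VP saw himself]]:
tp = ("John", ("saw", "himself"))
print(c_commands(tp, "John", "himself"))   # → True  (binder c-commands reflexive)
print(c_commands(tp, "himself", "John"))   # → False (no upward c-command)
```

The asymmetry of the two calls is exactly what Binding Theory exploits: John can bind himself, but not vice versa.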

4.7 Binding Theory via C-command

Principle A (Reflexives)

A reflexive pronoun must be bound in its governing category.
Binding = c-command + coindexation

Example (English):

John_i saw himself_i → ✓
*Him_i saw John_i → ✗

Example (Urdu):

جان نے اپنے آپ کو دیکھا
Jān-ne apne āp ko dekhā
Reflexive bound in Spec-TP domain

4.8 Linearization of Hierarchical Structure

Although syntax builds trees, humans speak linearly. Linearization determines:

Word order
Adjunct placement
Verb positioning

English Linearization (SVO)

TP: Spec-TP → T → VP
VP: V → NP
Adjuncts follow VP or attach to X'

Urdu/Saraiki Linearization (SOV)

VP-final: NP → V
Subject DP remains in Spec-TP
Adjuncts can precede or follow VP
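A single head-direction parameter suffices to map one hierarchy onto both surface orders. The `head_final` flag below is an illustrative parameter of this sketch, not an established API.

```python
def linearize(tree, head_final=False):
    """Tree = (head, complement); words are strings.
    One hierarchy, two orders, depending on the head-direction setting."""
    if isinstance(tree, str):
        return [tree]
    head, comp = tree
    h, c = linearize(head, head_final), linearize(comp, head_final)
    return c + h if head_final else h + c

vp = ("eat", "apples")
print(linearize(vp))                      # → ['eat', 'apples']   English (head-initial)
print(linearize(vp, head_final=True))     # → ['apples', 'eat']   Urdu/Saraiki (head-final)
```

Note that the tree passed in is identical in both calls; only the linearization step differs, which is precisely the parametric claim of this chapter.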

4.9 X-bar Trees with Linearization

English Example: “John quickly eats apples”

[TP John_i [T′ T[uφ] [VP [AdvP quickly] [V' V [NP apples]]]]]
Linearization respects head-initial VP

Urdu Example: “جان نے سیب جلدی کھایا”

[TP جان_i [T′ T[uφ] [vP tJohn_i [VP [NP سیب] [AdvP جلدی] [V' V]]]]]
Verb-final linearization, Spec-Head relations preserved

Saraiki Example: “جان سیب جلدی کھادے”

[TP جان_i [T′ T[uφ] [vP tJohn_i [VP [NP سیب] [AdvP جلدی] [V' V]]]]]
Verb-final order, identical hierarchical relations

4.10 The Interaction of C-command and Linearization

C-command is structural, independent of surface order
Linearization imposes phonetic sequencing
Examples:

English:

Wh-movement: Who did John see?

Wh-word moves to Spec-CP
Linearization places it sentence-initial

Urdu:

Wh-movement: کس نے جان کو دیکھا؟
Wh-word in Spec-CP
Verb remains final due to SOV linearization

4.11 Cross-Linguistic Observations

Property            | English         | Urdu               | Saraiki
Word Order          | SVO             | SOV                | SOV
Spec-Head Agreement | Subject         | Object in ergative | Object in ergative
C-command           | Universal       | Universal          | Universal
Linearization       | Head-initial VP | Head-final VP      | Head-final VP

Observation:

Universal principles (Spec-Head, C-command)
Parametric differences (linearization, agreement target)

4.12 Exercises

Exercise 1

Draw full TP-VP trees for:

English: Mary quickly ate the apples
Urdu: مریم نے سیب جلدی کھایا
Saraiki: مریم سیب جلدی کھادے

Exercise 2

Identify c-command relations in:

John_i saw himself_i → which nodes c-command which?
جان نے اپنے آپ کو دیکھا → identify binding domain

Exercise 3

Compare Spec-Head agreement patterns across the three languages for the following:

English: The boys run
Urdu: لڑکوں نے دوڑا
Saraiki: لڑکے دوڑے

4.13 Summary

Spec-Head agreement is universal; targets vary by language
C-command explains binding and scope relations
Linearization maps hierarchical trees to surface order
English is head-initial, Urdu/Saraiki head-final
X-bar structure remains cross-linguistically uniform

PART II — CORE MECHANISMS

5: Merge and Structure Building

Formal Operations and Cross-Linguistic Derivations in English, Urdu, and Saraiki

5.1 Introduction: Merge as the Core Syntactic Operation

In the Minimalist Program, syntax is built from a single operation: Merge. Merge takes two syntactic objects and combines them into a new, hierarchically structured unit.

It replaces the older proliferation of phrase structure rules.
It is recursive, allowing unbounded sentence generation.
It respects feature requirements and drives agreement, movement, and linearization.

This chapter provides a formal, cross-linguistic account of Merge and structure building.

5.2 Formal Definition of Merge

Definition

Merge is a structure-building function that takes two syntactic objects, α and β, and creates a new syntactic object γ:

Merge(α, β) = {α, β}

α and β can be lexical items or previously merged structures
The resulting set forms a new node in the tree

Key Properties

Recursion: Merge can apply to its own output
Hierarchical output: Produces X′/XP structures
Feature-driven: Only merges that satisfy feature requirements are licensed
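The definition fits in a few lines. Using frozenset mirrors the unordered set {α, β}; labels and headedness are deliberately omitted in this sketch.

```python
def merge(alpha, beta):
    """Merge(α, β) = {α, β}: a binary, recursive set-former."""
    return frozenset({alpha, beta})

v_obj = merge("eat", "apples")   # first Merge: V + NP
vp = merge("John", v_obj)        # recursion: Merge applies to its own output
print(v_obj in vp, "John" in vp)
# → True True
```

Because the output of merge is itself a valid input, unbounded hierarchical structure falls out of one binary operation, with no phrase structure rules anywhere.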

5.3 Binary vs N-ary Merge

Binary Merge

Combines exactly two elements
Standard in Minimalism
Example: Verb + Object → V′
VP → Merge(V, NP)

Tree Illustration (English):

[V′ [V eat] [NP apples]]

N-ary Merge

Combines more than two elements simultaneously
Rarely needed; often reducible to successive binary Merge
Example: VP → Merge(V, NP, AdvP)
Minimalist preference: binary Merge for economy

5.4 Headedness in Merge

Every Merge operation designates a head

Head determines:

Category (N, V, A, P, T)
Valency (θ-roles)
Feature projections

Binary Merge with Headedness

Merge(Head H, Complement XP) → H′
Head projects to intermediate (X′)
Maximal projection XP may attach a specifier

5.5 Feature-Driven Merge

Merge is licensed only when features are compatible:

Selectional features: V selects NP
Agreement features: T merges with DP for φ-feature checking
Case features: DP merges in a position where Case can be assigned

Example: English

Merge(DP John[iφ], T[uφ]) → Spec-TP
Merge(T, VP[eat apples]) → TP
Result: John eats apples

Example: Urdu

Merge(DP جان[iφ], T[uφ]) → Spec-TP
Merge(vP, T) → TP
Merge(VP → NP V) → verb-final structure: جان نے سیب کھایا

Example: Saraiki

Same derivation: جان سیب کھادے
vP feature settings trigger agreement with object instead of subject

5.6 Merge Across Categories

Category | Example (English) | Example (Urdu) | Example (Saraiki)
VP       | eat apples        | سیب کھایا      | سیب کھادے
NP       | the tall man      | لمبا آدمی      | لمبا من
PP       | on the table      | میز پر         | میز تے

Observation: Merge applies universally, with surface variation determined by feature specification.

5.7 Successive-Cyclic Merge and Movement

Movement can be analyzed as successive-cyclic reapplication of Merge:

Wh-movement: Who did John see?
Merge(Wh-word, Spec-CP)
Must satisfy feature-checking (uQ in C, uφ in T)

Urdu Wh-Movement

Spec-CP hosts the wh-word: کس نے جان کو دیکھا؟
Verb remains final due to head-final vP linearization

Saraiki Wh-Movement

Spec-CP hosts the wh-word: کس نے جان کو ویکھیا؟
Verb-final linearization preserved

Observation: Merge + feature checking accounts for both movement and linearization.

5.8 Cross-Linguistic Merge Trees

English: “John quickly ate apples”

[TP John_i [T′ T[uφ] [VP [AdvP quickly] [V' V [NP apples]]]]]
Binary Merge at every step
Specifier = DP John
Adjunct = AdvP quickly
Head projects = V

Urdu: “جان نے سیب جلدی کھایا”

[TP جان_i [T′ T[uφ] [vP tJohn_i [VP [NP سیب] [AdvP جلدی] [V' V]]]]]
Verb-final
Spec-Head agreement involves φ-feature and ergative assignment

Saraiki: “جان سیب جلدی کھادے”

[TP جان_i [T′ T[uφ] [vP tJohn_i [VP [NP سیب] [AdvP جلدی] [V' V]]]]]
Same hierarchical output
Feature-driven agreement targets object

5.9 Merge and θ-Theory

Merge ensures θ-role assignment: Agent, Theme, Experiencer
Head determines argument structure
Complement merges in θ-position

English Example:

eat → Merge(V, NP)
Assigns θ-role: Agent (John), Theme (apples)

Urdu/Saraiki Example:

کھایا / کھادے → Merge(V, NP)
Assigns θ-roles respecting ergativity

5.10 Economy and Merge

Merge is binary and feature-driven for computational efficiency
N-ary Merge is avoided
Minimalist principle: Do not build structures unnecessarily

5.11 Merge and Interface Conditions

PF: Linearization
LF: Scope and binding
Merge + feature-checking ensures both interfaces are satisfied

5.12 Exercises

Exercise 1

Draw Merge trees for:

English: Mary quickly reads the book
Urdu: مریم نے کتاب جلدی پڑھی
Saraiki: مریم کتاب جلدی پڑھے

Exercise 2

Analyze feature checking in:

English: The boys run
Urdu: لڑکوں نے دوڑا
Saraiki: لڑکے دوڑے

Exercise 3

Explain why binary Merge is preferred over n-ary Merge in Minimalist syntax.

5.13 Summary

Merge is the core structure-building operation
Binary Merge is economically preferred
Headedness determines category, θ-roles, and features

Feature-driven Merge accounts for:
Agreement
Case assignment
Movement
Cross-linguistic variation
Urdu and Saraiki illustrate parametric variation with shared hierarchical principles

6: Agree and Feature Checking

The Mechanics of Feature Valuation in English, Urdu, and Saraiki

6.1 Introduction: From Structure to Feature Satisfaction

In the previous chapter, we established Merge as the core structure-building operation. Yet, hierarchical structure alone does not guarantee grammaticality. For a derivation to converge at the interfaces (PF & LF), features must be valued and checked.

The Agree operation formalizes this feature valuation process, linking probes (unvalued features) to goals (interpretable features). This chapter examines:

The formal definition of Agree
Feature types
Spec-Head and long-distance Agree
Cross-linguistic behavior in English, Urdu, and Saraiki

6.2 Formal Definition of Agree

Agree is a syntactic operation that:

Selects a probe with unvalued features (uF)
Searches its c-command domain for a goal with matching interpretable features (iF)
Values the probe’s features
Deletes uninterpretable features before Spell-Out
Agree(Probe[uF], Goal[iF]) → Probe[iF]   (uF deleted)

Conditions:

C-command: Probe must c-command Goal
Minimality: The closest matching Goal is selected
Feature Compatibility: Only matching features are valued

6.3 Types of Features in Agree

Feature Type  | Description                 | Example
φ-features    | Person, number, gender      | English: She runs (3rd sg)
Case features | Case assignable to DPs      | Urdu: -نے (ergative)
EPP           | Movement-triggering         | English: Spec-TP must be filled
Tense/Aspect  | Tense or aspectual features | Urdu: perfective -ا / -ئی

6.4 Agree in Spec-Head Configurations

English Subject-Verb Agreement

John runs.

Representation:

[TP John_i [T′ T[uφ] [VP runs_j]]]

Process:

T[uφ] probes DP John[iφ]
φ-features are copied to T
DP is marked for Case (nominative)

Urdu Ergative Agreement

جان نے کتاب پڑھی

Analysis:

v assigns ergative Case to subject DP
T[uφ] agrees with object (NP کتاب) in perfective context
[TP جان_i [T′ T[uφ] [vP tJohn_i [VP کتاب_j پڑھی]]]]

Observation: Spec-Head Agree can target objects in ergative languages.

Saraiki Agreement

جان سیب کھادے
Same hierarchical structure as Urdu
T[uφ] agrees with the object (NP سیب)
Subject agreement remains optional or defaults to third person

6.5 Long-Distance Agree

Agree is not restricted to adjacent elements. Long-distance Agree occurs when:

Probe is higher in the hierarchy
Goal is embedded
Minimality condition is respected

Example (English):

I believe [that she_i runs] → the embedded T agrees with she_i; Agree by a matrix probe into the embedded clause is rare in English and more common in pro-drop languages

Urdu Example:

مجھے لگتا ہے کہ وہ گئی ہے
Mujhe lagta hai ke woh gayi hai
Embedded T agrees with subject for φ-features

6.6 Locality and Minimality

Agree respects locality:

Probe targets the closest matching goal
Intervention effects arise when another potential goal blocks Agree

English Example:

*The picture of the girls are on the wall → agreement error: the verb must agree with the singular head noun picture, not the closer plural girls

Urdu Example:

تصویر لڑکیوں کی ہے
tasveer larkiyon ki hai → Correct φ-feature agreement with head noun

6.7 Case Assignment and Agree

Case checking is intimately linked to Agree:

Nom-Acc languages (English):
T assigns nominative to Spec-TP
v assigns accusative to object

Ergative languages (Urdu, Saraiki):

v assigns ergative Case in perfective contexts
Agreement can target object rather than subject

6.8 Interaction of EPP and Agree

EPP features trigger movement to specifier positions
English: DP moves to Spec-TP to satisfy EPP and allows φ-feature checking
Urdu/Saraiki: EPP satisfied without overt subject movement in some constructions, as verb-final position dominates

6.9 Feature Checking Examples

English:

They run.
DP They [iφ] → Spec-TP
T [uφ] → Agree(DP) → φ-valued

Urdu:

لڑکوں نے دوڑا
larkon-ne dora
Subject DP [iφ] assigned ergative case
T[uφ] agrees with object if perfective

Saraiki:

لڑکے دوڑے
Feature checking aligns with object φ in verb-final structure

6.10 Feature-Driven Derivation Trees

English TP-VP

[TP They_i [T′ T[uφ, EPP] [VP run_j]]]
T[uφ] probes Spec-TP → values φ
DP occupies Spec-TP → satisfies EPP

Urdu TP-vP-VP

[TP لڑکوں_i [T′ T[uφ] [vP tLarkon_i [VP دوڑا]]]]
v assigns ergative Case
φ-feature valued on object

Saraiki TP-vP-VP

[TP لڑکے_i [T′ T[uφ] [vP tLarke_i [VP دوڑے]]]]
Object agreement maintained
Subject movement optional

6.11 Agree and Economy Principles

Minimal search: probe targets closest goal
Feature valuation must be complete for the derivation to converge
Avoid unnecessary Merge or movement

Principle: Features must be valued exactly once for derivation to converge.
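The "exactly once" principle can be verified over a record of the derivation. A minimal sketch, assuming a hypothetical valuation log rather than any standard formalism:

```python
# Sketch of the convergence condition: a derivation converges only if
# every required uninterpretable feature was valued exactly once, and
# no feature outside the required set was valued.

from collections import Counter

def converges(valuation_log, required_features):
    """valuation_log: list of feature names valued during the derivation."""
    counts = Counter(valuation_log)
    return (all(counts[f] == 1 for f in required_features)
            and all(f in required_features for f in valuation_log))

assert converges(["phi", "case"], {"phi", "case"})             # each valued once
assert not converges(["phi", "phi", "case"], {"phi", "case"})  # valued twice
assert not converges(["phi"], {"phi", "case"})                 # case left unvalued
```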

6.12 Summary

Agree ensures all unvalued features (uF) are checked
Spec-Head relation is a common site for Agree
C-command governs the accessibility of goals
Cross-linguistic variation emerges in target of agreement and movement triggers
Urdu and Saraiki illustrate ergative agreement patterns, English illustrates nominative–accusative patterns

6.13 Exercises

Identify all φ-features and EPP satisfaction in:
English: The girls are running
Urdu: لڑکیوں نے دوڑا
Saraiki: لڑکیاں دوڑیں
Draw TP-vP-VP trees showing Agree and feature checking in Urdu/Saraiki.
Explain why long-distance Agree is limited by minimality.

7: Case Theory Across Languages

Formal Mechanisms of Case Assignment in English, Urdu, and Saraiki

7.1 Introduction: The Role of Case in Syntax

Case is a morphosyntactic feature that licenses the grammatical role of nominal elements (subjects, objects, indirect objects).

Ensures theta-role assignment matches surface forms
Interacts with Spec-Head agreement and feature checking
Varies parametrically across languages

This chapter examines:

Case assignment mechanisms
Structural vs inherent Case
Cross-linguistic patterns in English, Urdu, and Saraiki
Interaction with φ-features and agreement

7.2 Basic Concepts of Case

Structural vs Inherent Case

Type | Definition | Example
Structural Case | Assigned based on position in the tree | English nominative: John runs
Inherent Case | Assigned due to θ-role or lexical property | Urdu ergative: جان نے کتاب پڑھی

Case Licensing Principles

Case Filter: Every overt DP must have Case
Assignability: T or v assigns Case in structural positions
Parametric variation: Languages differ in Case distribution and assignment triggers
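The Case Filter lends itself to a direct check. A minimal sketch, assuming DPs are encoded as dictionaries (an illustrative encoding, not a standard API):

```python
# Sketch of the Case Filter: every overt DP in a structure must bear Case.
# A DP is a dict; 'case' is None when unassigned (hypothetical encoding).

def case_filter(dps):
    """Return the overt DPs that violate the Case Filter (ideally none)."""
    return [dp["form"] for dp in dps
            if dp.get("overt", True) and dp.get("case") is None]

sentence = [
    {"form": "John", "case": "NOM"},       # licensed by T (Spec-TP)
    {"form": "the book", "case": "ACC"},   # licensed by v
]
assert case_filter(sentence) == []         # no violations: converges

bad = [{"form": "Mary", "case": None}]     # Caseless overt DP
assert case_filter(bad) == ["Mary"]        # Case Filter violation
```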

7.3 English Case Patterns

Nominative-Accusative system
Function | Case | Position | Example
Subject | Nominative | Spec-TP | John runs
Direct object | Accusative | VP complement | Mary saw John
Indirect object | Dative | VP complement | She gave him a book

Tree Illustration:

[TP John_i[NOM] [T′ T [VP runs_j]]]
T assigns nominative to Spec-TP
v assigns accusative to object

7.4 Urdu Case Patterns

Ergative-Accusative system in perfective aspect
Case assignment interacts with v and T
Function | Case | Position | Example
Agent | Ergative (-نے) | Spec-vP | جان نے کتاب پڑھی
Theme | Accusative | VP complement | کتاب پڑھی
Goal/Recipient | Dative (-کو) | Indirect object | اسے کتاب دی

Observation: Ergative Case assigned by vP in perfective context

[TP جان_i [T′ T[uφ] [vP tJohn_i [VP کتاب_j پڑھی]]]]
Subject moves optionally to Spec-TP
Agreement often with object

7.5 Saraiki Case Patterns

Mirrors Urdu in verb-final constructions
Ergative marking occurs in perfective aspect
Agreement follows object φ-features
Function | Case | Example
Agent | Ergative | جان سیب کھادے
Theme | Accusative | سیب کھادے
Goal/Recipient | Dative | اسے کتاب دی

Tree Representation:

[TP جان_i [T′ T[uφ] [vP tJohn_i [VP سیب_j کھادے]]]]

7.6 Structural vs Lexical Case: Formal Distinction

Structural Case:

Governed by Spec-Head relation
English: nominative assigned to Spec-TP, accusative to object

Lexical Case:

Assigned by lexical heads (v, P, verbs)
Urdu/Saraiki ergative assigned lexically in perfective

Example:

English: She saw him → accusative assigned structurally
Urdu: جان نے کتاب پڑھی → ergative assigned lexically

7.7 Case and Feature Checking

Case features are uninterpretable (uCase)
Agree operation values uCase against iCase of head or goal
Ensures derivation converges

Urdu Example:

[vP tJohn_i[ERG] [VP کتاب_j[ACC] پڑھی[V]]]
T probes DP for φ-feature
v assigns ergative Case to subject
Object φ-feature may trigger agreement

English Example:

[TP John_i[NOM] [T′ T[uφ] [VP saw_j [DP him_ACC]]]]
T assigns nominative to subject
v assigns accusative to object

7.8 Case Assignment Trees

English: “John saw Mary”

[TP John_i[NOM] [T′ T [VP saw [DP Mary_j[ACC]]]]]

Urdu: “جان نے کتاب پڑھی”

[TP جان_i [T′ T[uφ] [vP tJohn_i[ERG] [VP کتاب_j[ACC] پڑھی[V]]]]]

Saraiki: “جان سیب کھادے”

[TP جان_i [T′ T[uφ] [vP tJohn_i[ERG] [VP سیب_j[ACC] کھادے[V]]]]]

Observation:

Hierarchical positions preserved
Surface linearization varies due to head-final vP in Urdu/Saraiki
Feature-driven Agree ensures Case and φ-feature assignment

7.9 Cross-Linguistic Observations

Feature | English | Urdu | Saraiki
Subject Case | Nominative | Ergative | Ergative
Object Case | Accusative | Accusative | Accusative
Case Assignment Trigger | Structural | Lexical (perfective) + Structural | Lexical + Structural
Agreement Target | Subject | Object (perfective) | Object (perfective)
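The table above can be restated as a parametric function. A minimal sketch, with illustrative language and case labels (a simplification of the actual alignment facts):

```python
# Parametric sketch: subject Case and agreement target as a function of
# language and aspect. Labels and the language set are illustrative only.

def case_and_agreement(language, aspect="imperfective", transitive=True):
    ergative_langs = {"urdu", "saraiki"}
    if language.lower() in ergative_langs and aspect == "perfective" and transitive:
        return {"subject_case": "ERG", "agreement_target": "object"}
    return {"subject_case": "NOM", "agreement_target": "subject"}

# English: nominative subject, subject agreement, regardless of aspect
assert case_and_agreement("english")["agreement_target"] == "subject"

# Urdu/Saraiki: ergative subject and object agreement only in the perfective
assert case_and_agreement("urdu", "perfective")["subject_case"] == "ERG"
assert case_and_agreement("saraiki")["subject_case"] == "NOM"
```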

7.10 Exercises

Identify all cases in the following sentences:

English: The boys saw the girls
Urdu: لڑکوں نے لڑکیاں دیکھا
Saraiki: لڑکے لڑکیاں ویکھے

Draw a full vP-VP tree illustrating ergative assignment in Urdu/Saraiki.
Explain why structural Case in English is sensitive to position, but lexical Case in Urdu/Saraiki is sensitive to aspect.

7.11 Summary

Case licenses DP positions and ensures θ-role mapping
English is nominative-accusative
Urdu/Saraiki display ergative marking in perfective aspect
Agree and feature checking mediate Case assignment
Cross-linguistic differences are parametric, not universal

8: Theta Theory and Argument Structure

Mapping Roles, Structure, and Cross-Linguistic Patterns in English, Urdu, and Saraiki

8.1 Introduction: The Interface of Semantics and Syntax

Theta Theory connects syntax with semantic roles. Each verb selects arguments that must appear in particular structural positions and receive a theta-role (θ-role).

Ensures thematic well-formedness
Interfaces with Case assignment, Spec-Head agreement, and Merge
Allows cross-linguistic comparisons of argument realization

This chapter examines:

Theta-role assignment
The Theta-Criterion
Argument structure in English, Urdu, and Saraiki
Interaction with Case and φ-feature agreement

8.2 Basic Concepts of Theta Theory

Definition

A theta-role is a semantic role assigned by a verb (or other predicates) to its arguments.
Common θ-roles: Agent, Theme, Experiencer, Goal, Benefactive, Instrument

Theta-Criterion (Chomsky, 1981)

Each argument bears one and only one θ-role
Each θ-role is assigned to one and only one argument

Formal Representation

AssignTheta(V, DP) → θ-role
Each DP: exactly one θ-role
Each θ-role: exactly one DP
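The Theta-Criterion amounts to a bijection between arguments and θ-roles, which can be checked directly. A minimal sketch; the pair-list encoding is an assumption for illustration:

```python
# Sketch of the Theta-Criterion as a bijection check: each argument bears
# exactly one θ-role, and each θ-role is assigned to exactly one argument.

def theta_criterion(assignments):
    """assignments: list of (DP, theta_role) pairs for one predicate."""
    dps = [dp for dp, _ in assignments]
    roles = [r for _, r in assignments]
    return len(dps) == len(set(dps)) and len(roles) == len(set(roles))

# "John kicked the ball": Agent → John, Theme → the ball
assert theta_criterion([("John", "Agent"), ("the ball", "Theme")])

# One DP with two θ-roles violates the criterion
assert not theta_criterion([("John", "Agent"), ("John", "Theme")])

# One θ-role assigned to two DPs also violates it
assert not theta_criterion([("John", "Agent"), ("Mary", "Agent")])
```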

8.3 Argument Structure in English

Example 1: Transitive Verb
John kicked the ball.

Theta Assignment:

DP | θ-role
John | Agent
the ball | Theme

Tree Representation:

[TP John_i [T′ T [VP kicked [DP the ball_j]]]]
Agent in Spec-vP
Theme in VP complement

Example 2: Ditransitive Verb

Mary gave John a book.
DP | θ-role
Mary | Agent
John | Goal
a book | Theme
Merge respects θ-roles and linear order
Spec-Head agreement with subject (Mary) ensures φ-feature checking

8.4 Argument Structure in Urdu

Example 1: Transitive Perfective
جان نے کتاب پڑھی
Jān-ne kitāb paṛhī (John-ERG book-FEM read)
DP | θ-role | Case
جان | Agent | Ergative
کتاب | Theme | Accusative

Tree Representation:

[TP جان_i [T′ T[uφ] [vP tJohn_i[ERG] [VP کتاب_j[ACC] پڑھی]]]]
Ergative Case assigned by v
φ-agreement often targets object

Example 2: Ditransitive

اس نے احمد کو کتاب دی
Us-ne Ahmad-ko kitāb di (He gave Ahmad a book)
DP | θ-role | Case
اس | Agent | Ergative
احمد | Goal | Dative
کتاب | Theme | Accusative

8.5 Argument Structure in Saraiki

Similar to Urdu with verb-final alignment

Example: Transitive Verb

جان سیب کھادے
DP | θ-role | Case
جان | Agent | Ergative
سیب | Theme | Accusative

Observation: Saraiki exhibits object agreement in perfective aspect similar to Urdu, maintaining theta-role alignment.

8.6 Theta-Role Assignment Mechanism

Theta-roles assigned at Merge:
Merge(Verb, DP) → Assign θ-role to DP
Complement position → Theme
Specifier of vP → Agent
Movement does not change θ-roles
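The positional assignment above can be sketched computationally; the merge_with_theta helper and its labels are hypothetical illustrations:

```python
# Sketch of θ-role assignment at Merge: the complement of V receives
# Theme, Spec-vP receives Agent. Positions and labels are simplified.

def merge_with_theta(verb, dp, position):
    role = {"complement": "Theme", "spec-vP": "Agent"}.get(position)
    return {"head": verb, "arg": dp, "theta": role, "position": position}

obj = merge_with_theta("kicked", "the ball", "complement")
subj = merge_with_theta("kicked", "John", "spec-vP")

assert obj["theta"] == "Theme"
assert subj["theta"] == "Agent"

# Movement does not change θ-roles: relocating the argument
# leaves the original assignment intact.
moved = dict(subj, position="spec-TP")
assert moved["theta"] == "Agent"
```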

8.7 Interaction of Theta Roles with Case

Structural Case: DP moves to satisfy θ-role and Spec-Head agreement
Lexical Case: DP receives case independently of surface position (Urdu/Saraiki ergatives)
Theta-role and Case together ensure grammaticality

8.8 Cross-Linguistic Observations

Feature | English | Urdu | Saraiki
Agent Position | Spec-vP | Spec-vP | Spec-vP
Theme Position | VP complement | VP complement | VP complement
Goal/Benefactive | Spec-VP complement | Dative-marked | Dative-marked
Case Assignment | Structural | Lexical + Structural | Lexical + Structural
Agreement Target | Subject | Object in perfective | Object in perfective

Observation: Theta-roles are universal, surface realization and agreement are parametric.

8.9 Example Derivation Trees

English Ditransitive

[TP Mary_i [T′ T [vP tMary_i [VP gave [DP John_j] [DP a book_k]]]]]

Urdu Ditransitive

[TP اس_i [T′ T[uφ] [vP tUs_i[ERG] [VP [DP احمد_j[DATIVE]] [DP کتاب_k[ACC]] دی[V]]]]]

Saraiki Ditransitive

[TP اس_i [T′ T[uφ] [vP tUs_i[ERG] [VP [DP احمد_j[DATIVE]] [DP کتاب_k[ACC]] دی[V]]]]]

8.10 Theta-Grid and Argument Structure Representations

Verb θ-grid (English: give)


Argument | θ-role | Position | Case
DP1 | Agent | Spec-vP | Nominative
DP2 | Goal | VP complement | Dative
DP3 | Theme | VP complement | Accusative

Verb θ-grid (Urdu: دینا / Saraiki: دینا)


Argument | θ-role | Position | Case
DP1 | Agent | Spec-vP | Ergative
DP2 | Goal | VP complement | Dative
DP3 | Theme | VP complement | Accusative

8.11 Exercises

Identify θ-roles in the following sentences:

English: She sent him a letter
Urdu: اس نے اسے خط بھیجا
Saraiki: اس نے اسے خط بھیجے

Draw vP-VP trees for English, Urdu, and Saraiki showing θ-role assignment.
Explain the interaction of Case assignment and θ-role in perfective vs imperfective aspects in Urdu/Saraiki.

8.12 Summary

Theta Theory governs argument structure and thematic well-formedness
Theta-Criterion ensures unique mapping of roles to arguments
English: nominative-accusative mapping
Urdu/Saraiki: ergative alignment in perfective, dative marking for goals
Case, Merge, and Agree interface with θ-theory to create well-formed sentences

PART III — MOVEMENT AND CONSTRAINTS

9: A-Movement and Argument Positioning

Structural Dynamics and Cross-Linguistic Patterns in English, Urdu, and Saraiki

9.1 Introduction: Movement and Structural Roles

A-movement (argument movement) refers to the syntactic operation in which an argument DP moves to a position that satisfies Case, agreement, or the EPP.

Distinct from A-bar movement (topic, wh-movement)
Typically involves Spec-TP, Spec-vP, or Spec-AP positions
Essential for nominative Case assignment, subject raising, and passivization

This chapter covers:

Formal definition of A-movement
Conditions: EPP, Case, and φ-feature checking
Cross-linguistic patterns in English, Urdu, Saraiki
Illustrative trees and examples

9.2 Formal Definition of A-Movement

Operation: Moves a DP from its base position (θ-position) to an A-position

Triggered by:

EPP feature on T or v
Unvalued φ-features requiring checking
Passive morphology (optional)

Representation

Move(DP_i, Target Spec-A) → satisfies uφ/EPP features

Properties:

θ-roles preserved: movement does not change semantic role
Triggered locally: minimality applies
Occurs cyclically: in embedded clauses, movement may be successive-cyclic
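These properties can be captured in a small sketch. The dictionary encoding below is hypothetical, meant only to show EPP and φ-feature satisfaction with θ-preservation:

```python
# Sketch of A-movement: a DP raises from its θ-position to Spec-TP to
# satisfy T's EPP and φ-features; its θ-role travels with it unchanged.

def a_move(dp, T):
    """Move dp to Spec-TP; value T's φ-features; mark EPP as satisfied."""
    T["spec"] = dp
    T["phi"] = dp["phi"]          # φ-feature checking under Agree
    T["epp_satisfied"] = True
    return T

# English "John left": John raises from Spec-vP to Spec-TP
john = {"form": "John", "theta": "Agent", "phi": {"person": 3, "number": "sg"}}
T = {"spec": None, "phi": None, "epp_satisfied": False}

a_move(john, T)
assert T["epp_satisfied"]
assert T["spec"]["theta"] == "Agent"     # θ-role preserved under movement
assert T["phi"] == {"person": 3, "number": "sg"}
```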

9.3 A-Movement in English

9.3.1 Subject Raising

Example: John seems to be happy

Derivation:

John assigned θ-role in embedded clause
DP John moves to Spec-TP of matrix clause to satisfy EPP

Tree Representation:

[TP John_i [T′ seems [TP tJohn_i [T′ to be happy]]]]
Movement preserves θ-role
φ-feature of embedded T not checked by matrix T

9.3.2 Passive Constructions

Example: The book was read by Mary

Analysis:

Theme moves to Spec-TP to satisfy EPP & nominative Case
Agent assigned by by-phrase, not Spec-vP
[TP The book_i [T′ was [vP tThe book_i read [PP by Mary]]]]

9.4 A-Movement in Urdu

9.4.1 Subject Movement

Example: جان نے کتاب پڑھی
Jān-ne kitāb paṛhī (John-ERG book-FEM read)

Observation:

Subject DP in Spec-vP receives ergative Case
Movement to Spec-TP optional for EPP satisfaction
Verb-final surface order preserved

Tree Representation:

[TP جان_i [T′ T[uφ] [vP tJohn_i[ERG] [VP کتاب_j[ACC] پڑھی]]]]
Subject remains in vP or raises to TP depending on syntax-parametric settings

9.4.2 Raising Verbs

Example: لگتا ہے کہ جان خوش ہے
lagta hai ke Jān khush hai (It seems that John is happy)
Embedded subject moves successively for agreement and EPP
English equivalent shows spec-TP raising, Urdu may retain base position

9.5 A-Movement in Saraiki

Similar to Urdu with head-final vP
Subject movement optional; object agreement prominent
Perfective aspect triggers ergative marking on subject

Example: جان سیب کھادے

Theme remains in VP complement
Subject may or may not raise to TP depending on clause type

9.6 Interaction with Case and Agreement

English: Movement necessary to receive nominative Case
Urdu/Saraiki: Lexical ergative case may eliminate need for TP movement
Passive: DP movement to Spec-TP universal for θ-role preservation and Case assignment

9.7 Formal Properties

Property | English | Urdu | Saraiki
Target | Spec-TP (subject) | Spec-TP optional | Spec-TP optional
Trigger | EPP + φ-features | φ-features + Case | φ-features + Case
Base Position | Spec-vP | Spec-vP | Spec-vP
Passivization | Theme moves to Spec-TP | Theme moves if marked | Theme moves if marked
Verb Position | Head-medial | Head-final | Head-final

9.8 Minimal Pair Illustrations

English

Active: John read the book → [Spec-vP John][VP read the book]
Passive: The book was read → [Spec-TP The book][VP tThe book read]

Urdu

Active: جان نے کتاب پڑھی → [Spec-vP John][VP book read]
Passivization optional: کتاب پڑھی گئی → Theme moves to TP

Saraiki

Active: جان سیب کھادے → [Spec-vP John][VP apple ate]
Passivization: سیب کھادے گئے → Theme moves to TP

9.9 A-Movement Trees

English Active/Passive

Active: [TP John_i [T′ T [vP tJohn_i [VP read [DP the book]]]]]
Passive: [TP The book_i [T′ was [vP tThe book_i [VP read [PP by John]]]]]

Urdu Active

[TP جان_i [T′ T[uφ] [vP tJohn_i[ERG] [VP کتاب_j[ACC] پڑھی]]]]

Saraiki Active

[TP جان_i [T′ T[uφ] [vP tJohn_i[ERG] [VP سیب_j کھادے]]]]

9.10 Summary

A-Movement relocates arguments to A-positions for Case, EPP, or φ-feature satisfaction
English requires movement for nominative assignment
Urdu/Saraiki optional movement due to ergative case licensing
Passive constructions universally involve A-movement of Theme
θ-roles preserved; surface order may vary parametrically

9.11 Exercises

Draw vP-VP-TP trees showing A-movement for:
English: The girl was praised by the teacher
Urdu: لڑکی کو استاد نے سراہا
Saraiki: لڑکی نوں استاد نے سراہیا
Explain the effect of ergative marking on A-movement in Urdu/Saraiki perfective clauses.
Identify minimal pairs illustrating movement differences between English active/passive.

10: A′-Movement (Wh, Focus, Topicalization)

The Mechanics of Non-Argument Movement Across English, Urdu, and Saraiki

10.1 Introduction: A-Bar vs A-Movement

While A-movement targets argument positions (Spec-TP, Spec-vP), A′-movement targets non-argument positions for focus, topic, or interrogative purposes.

Key characteristics:

Moves DPs or phrases to Spec-CP or other discourse-related positions
Triggered by wh-features, focus, or topicalization
Often involves long-distance movement
Constrained by island effects and minimality

This chapter explores:

Formal definition of A′-movement
Wh-movement in English, Urdu, Saraiki
Focus and topicalization
Island constraints
Illustrative derivational trees

10.2 Formal Definition of A′-Movement

Operation: Move(XP, Spec-C) to check an uninterpretable [wh], [focus], or [topic] feature.

A′-Move(XP_i, Spec-CP) → feature checking at C

Conditions:

Feature-driven: C has unvalued [wh] or [focus] feature
C-command: XP must be c-commanded by C
Minimality: Closest eligible XP moves (no intervening matching features)
Successive-cyclic movement: Across multiple CP layers

10.3 Wh-Movement in English

Example: What did John eat?

Derivation Steps:

VP complement “what” carries [+wh] feature
Moves to Spec-CP to check C[+wh]
T-to-C movement applies to satisfy interrogative syntax

Tree Representation:

[CP What_i [C′ did_j [TP John [T′ t_did_j [VP eat t_what_i]]]]]

Observation: Wh-movement is overt in English and obligatory for question formation.

10.3.1 Focus Movement

Example: JOHN ate the apple (contrastive focus)
XP moves to Spec-FocP to satisfy discourse prominence
[FocP JOHN_i [Foc′ Foc [TP tJOHN_i ate the apple]]]

10.3.2 Topicalization

Example: The apple, John ate t_apple
Moves object to Spec-TopP to mark topic
Optional, discourse-driven

10.4 Wh-Movement in Urdu

Example: جان نے کیا کھایا؟
Jān-ne kyā khāyā? (John-ERG what ate)

Observation:

Wh-phrase can stay in-situ (optional overt movement)
Movement may occur to Spec-CP in formal or written registers

Tree Representation (overt wh):

[CP کیا_i [C′ C[+wh] [TP جان-ne [T′ T [VP کھایا t_kyā_i]]]]]
In colloquial speech, kyā remains in the VP complement (wh-in-situ)

10.4.1 Focus in Urdu

Example: جان نے کتاب پڑھی → emphasis: کتاب جان نے پڑھی
Focus fronting moves Theme to Spec-FocP
EPP satisfied at discourse projection rather than TP

10.4.2 Topicalization in Urdu

Topic marked by left-dislocation: کتاب، جان نے پڑھی
Topic phrase moves to Spec-TopP for discourse prominence

10.5 Wh-Movement in Saraiki

Example: جان نے کیہ کھادے؟ (John-ne what ate?)
Wh-in-situ allowed, especially in colloquial Saraiki
Overt movement occurs in formal registers

Observation:

Saraiki and Urdu share flexible wh-movement patterns
Movement constrained by islands and minimality

10.6 Islands and Locality Constraints

A′-movement is restricted by syntactic islands:

Island Type | Definition | Example
Complex NP | Movement blocked out of DP/NP | Who did you hear the rumor that ___ left?
Subject | Cannot extract from subject DP | What_i did [the man who bought ___] leave?
Adjunct | Cannot extract from adjunct clauses | What_i did he leave [without reading ___]?
Coordinate Structure | Cannot extract one conjunct alone | What_i did John eat ___ and Mary drink ___?

Observation: Urdu and Saraiki obey similar island constraints with minor parametric variation in informal speech.

10.7 Feature-Driven Motivation

C has uninterpretable [wh] or [focus] → triggers movement
Topicalization triggered by discourse features [topic]
Minimal search ensures closest goal is targeted

Formal Representation:

Move(XP_i, Spec-CP/FocP/TopP) → Check [wh/focus/topic]_C

10.8 Successive-Cyclic Movement

A′-movement is cyclic through Spec-CPs of intermediate clauses
Ensures Feature checking at each CP layer
English example: What_i do you think [that John ate t_i]?

Tree Representation:

[CP What_i [C′ do_j [TP you [T′ t_do_j [CP tWhat_i [C′ that [TP John [T′ t_ate [VP tWhat_i]]]]]]]]]
Urdu/Saraiki follow similar cyclic paths in formal registers
Wh-in-situ reduces overt movement requirement

10.9 Cross-Linguistic Summary

Feature | English | Urdu | Saraiki
Wh-movement | Overt, obligatory | Optional in-situ | Optional in-situ
Focus | Contrastive fronting | Theme fronting | Theme fronting
Topicalization | Optional, discourse | Left-dislocation | Left-dislocation
Island Sensitivity | Strong | Strong | Strong
Successive-Cyclic | Mandatory | Optional overt | Optional overt

10.10 Illustrative Trees

English Wh-Question

[CP What_i [C′ did [TP John [T′ t_did [VP eat t_what_i]]]]]

Urdu Wh-Movement (overt)

[CP کیا_i [C′ C[+wh] [TP جان-ne [T′ T [VP کھایا t_kyā_i]]]]]

Saraiki Wh-Movement (overt)

[CP کیہ_i [C′ C[+wh] [TP جان-ne [T′ T [VP کھادے t_kih_i]]]]]

Focus Fronting

[FocP کتاب_i [Foc′ Foc [TP جان-ne t_کتاب_i پڑھی]]]

Topicalization

[TopP کتاب_i [Top′ Top [TP جان-ne t_کتاب_i پڑھی]]]

10.11 Summary

A′-movement targets non-argument positions for wh, focus, and topic
English: obligatory wh-movement, clear Spec-CP targeting
Urdu/Saraiki: flexible wh-in-situ, optional overt movement, focus/topicalization via left-dislocation
Island constraints govern extraction
Successive-cyclic movement ensures feature checking at each CP

10.12 Exercises

Identify wh, focus, and topic movements in these sentences:
English: Which book did John read?
Urdu: جان نے کونسی کتاب پڑھی؟
Saraiki: جان نے کیہ کتاب کھادے؟
Draw derivational trees showing successive-cyclic wh-movement in English and compare with Urdu/Saraiki in-situ derivations.
Explain island constraints with at least one English and one Urdu example.

11: Locality Constraints (Subjacency, Phases)

Structuring Movement and Syntactic Dependencies in English, Urdu, and Saraiki

11.1 Introduction: Why Locality Matters

Locality constraints regulate how far a constituent can move in a single step. They:

Prevent unlicensed long-distance dependencies
Maintain computational efficiency in the derivation
Interact with A- and A′-movement, Case checking, and feature satisfaction

Two central concepts:

Subjacency – movement cannot cross more than one bounding node per step
Phases – syntactic domains that cyclically spell out constituents

This chapter examines:

Formal definitions of Subjacency and Phases
Implications for English, Urdu, and Saraiki
Tree representations illustrating movement constraints

11.2 Subjacency

Definition

Subjacency (Chomsky, 1973) restricts movement:

A constituent cannot move across more than one bounding node (BN) at a time
Bounding nodes: NP, CP in English; DP, CP in South Asian languages

Formal Condition:

Move(XP) is blocked if it crosses two bounding nodes (BNs) without an intermediate landing site

Examples in English

Grammatical:

Who_i do you think [t_i will win]

Ungrammatical (violates Subjacency):

*Who_i do you wonder [whether Mary likes t_i]

Observation: Movement skips over only one bounding node per step; violation occurs when multiple BNs are crossed without intermediate Spec-CP landing.
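The bounding-node condition can be checked mechanically. A minimal sketch, assuming each movement step is encoded as the list of nodes it crosses (an illustrative encoding, not a parser):

```python
# Sketch of Subjacency: a single movement step may cross at most one
# bounding node (BN).

BOUNDING_NODES = {"NP", "CP"}   # English setting (DP, CP in Urdu/Saraiki)

def step_ok(nodes_crossed):
    """One movement step is licit if it crosses at most one BN."""
    return sum(1 for n in nodes_crossed if n in BOUNDING_NODES) <= 1

# "Who do you think [CP t will win]" — one CP per step, licit:
assert step_ok(["CP"])

# "*Who do you wonder [whether Mary likes t]" — Spec-CP is occupied by
# 'whether', so one step is forced to cross two CPs at once:
assert not step_ok(["CP", "CP"])
```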

11.3 Phases and Phase Theory

Definition

Phases: cyclic domains that send their complement to Spell-Out
Typical phases: vP, CP
Phase Impenetrability Condition (PIC): Only elements in the edge of a phase can be accessed by higher operations

Formalization:

Phase = vP / CP
PIC: Only the edge of the phase accessible to higher operations

Consequences:

Movement must proceed successively-cyclically through phase edges
Prevents long-distance movement from deep embedded positions
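The PIC can be sketched as a filter on accessibility. The phase encoding below is a hypothetical illustration:

```python
# Sketch of the Phase Impenetrability Condition: once a phase (vP/CP) is
# completed, only its edge remains accessible to higher probes; the
# complement has been sent to Spell-Out.

def accessible(phase):
    """Return the elements a higher probe can still reach."""
    if phase["complete"]:
        return phase["edge"]
    return phase["edge"] + phase["complement"]

cp = {"edge": ["what"], "complement": ["John", "bought"], "complete": True}
assert accessible(cp) == ["what"]    # only the phase edge survives
```

Moving a wh-phrase to the phase edge before Spell-Out is exactly what makes successive-cyclic movement possible.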

11.4 Locality in English

A′-movement respects both Subjacency and Phases

Example: Successive-Cyclic Wh-Movement

What_i do you think [that John said [that Mary bought t_i]]?

Derivation:

What_i moves to Spec-CP of embedded clause (phase edge)
Then moves to matrix Spec-CP
[CP What_i [C′ do_j [TP you [T′ t_do_j [CP tWhat_i [C′ that [TP John [T′ t_said [CP tWhat_i [C′ that [TP Mary [T′ t_bought [VP tWhat_i]]]]]]]]]]]]]
Each CP is a phase
Movement occurs cyclically through phase edges
Prevents Subjacency violations

11.5 Locality in Urdu

Similar phase structure: vP and CP
Wh-in-situ optional; when overt, follows successive-cyclic paths

Example: Embedded Question

جان نے سوچا کہ کیا احمد نے کتاب پڑھی؟
Jān-ne sochā ke kyā Ahmad-ne kitāb paṛhī?
Wh-phrase “kyā” may remain in-situ, avoiding potential Subjacency violations

Phase Observations:

CPs in Urdu act as boundaries
Edge DPs/wh-phrases can move freely
Deep embedded wh requires successive-cyclic movement in formal registers

11.6 Locality in Saraiki

Phase-based architecture similar to Urdu
Wh-in-situ predominant in spoken discourse
Formal or literary registers may show overt movement

Example: Successive-Cyclic Movement

جان نے سوچیا کہ کیہ احمد نے کتاب کھادی؟
Phases: vP for each verb, CP for embedded clauses
Movement respects Subjacency and PIC

11.7 Bounding Nodes Across Languages

Language | Bounding Nodes (BN) | Comments
English | NP, CP | Classic GB theory
Urdu | DP, CP | Ergative subject in vP also considered
Saraiki | DP, CP | Head-final, allows wh-in-situ

Observation: Subjacency violations arise when a wh/XP crosses multiple BNs without landing in a phase edge.

11.8 Interaction with A- and A′-Movement

A-movement (arguments): mostly short-distance within vP/TP
A′-movement (wh, focus, topicalization): may be long-distance, constrained by Subjacency and Phases
PIC ensures cyclic checking of features at each phase edge

Illustrative Tree: English Embedded Question

[CP What_i [C′ do_j [TP you [T′ t_do_j [CP tWhat_i [C′ think [TP John [T′ t_think [CP tWhat_i [C′ will [TP Mary [T′ t_will [VP buy tWhat_i]]]]]]]]]]]]]

11.9 Cross-Linguistic Comparisons

Feature | English | Urdu | Saraiki
Phase domains | vP, CP | vP, CP | vP, CP
Successive-cyclic movement | Obligatory | Optional (depends on overt wh) | Optional (depends on formal register)
Subjacency violations | Strong | Avoided in formal register | Avoided in formal register
Wh-in-situ | Not allowed | Allowed | Allowed
Edge accessibility | Phase edges | Phase edges | Phase edges

11.10 Exercises

Identify phase boundaries in the following sentences:
English: What do you think that John bought?
Urdu: جان نے سوچا کہ کیا احمد نے کتاب پڑھی؟
Saraiki: جان نے سوچیا کہ کیہ احمد نے کتاب کھادی؟
Explain why wh-in-situ in Urdu/Saraiki avoids Subjacency violations.
Draw successive-cyclic movement trees for English embedded questions.

11.11 Summary

Locality constraints regulate movement to avoid unlicensed dependencies
Subjacency: cannot cross more than one bounding node per step
Phase Theory: vP and CP cyclically spell out complements; only edges accessible
English requires strict successive-cyclic movement
Urdu/Saraiki allow wh-in-situ, providing a parametric relaxation
Understanding phases and locality is essential for predicting movement and extraction patterns

12: The EPP and Subjecthood

Structural, Morphological, and Cross-Linguistic Perspectives in English, Urdu, and Saraiki

12.1 Introduction: The EPP in Syntax

The Extended Projection Principle (EPP) is a core feature in Generative Grammar that requires:

Every finite clause to have a specifier in the TP/Infl projection
Subjects to occupy Spec-TP, regardless of θ-role assignment

Key roles:

Ensures clause well-formedness
Triggers A-movement of arguments
Interfaces with Case checking, φ-feature agreement, and word order

This chapter examines:

Formal definition and motivation of the EPP
Subject positions in English, Urdu, Saraiki
Interaction with A-movement, agreement, and ergativity
Cross-linguistic parametric variation

12.2 Formal Definition of the EPP

Definition:

Every finite TP must have a DP (or pro) in its Specifier position.

Triggering Mechanism:

T (or Infl) has an unvalued EPP feature: [+EPP]
Movement of a DP to Spec-TP satisfies this feature

Formal Rule:

If T[+finite, +EPP] and Spec-TP empty → Move(DP) to Spec-TP

Properties:

EPP movement preserves θ-roles
Can move overt DP or pro-drop subject depending on language
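The parametric options for EPP satisfaction can be sketched as follows; the function and language labels are illustrative assumptions:

```python
# Sketch of EPP satisfaction as a parametric choice: English fills
# Spec-TP with an overt DP or an expletive; pro-drop languages such as
# Urdu and Saraiki may use a null pronoun (pro).

def satisfy_epp(language, overt_subject=None):
    pro_drop = {"urdu", "saraiki"}
    if overt_subject is not None:
        return overt_subject                 # overt DP raises to Spec-TP
    if language.lower() in pro_drop:
        return "pro"                         # null subject satisfies EPP
    return "it"                              # English expletive insertion

assert satisfy_epp("english", "John") == "John"
assert satisfy_epp("english") == "it"        # "It is raining"
assert satisfy_epp("urdu") == "pro"          # بارش ہو رہی ہے
assert satisfy_epp("saraiki") == "pro"
```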

12.3 Subjecthood in English

English requires overt subjects in Spec-TP for finite clauses
EPP ensures nominative Case assignment

Example 1: Finite Clause

John left early.

Tree:

[TP John_i [T′ T[+EPP] [VP tJohn_i left early]]]
John moves from Spec-vP to Spec-TP
θ-role: Agent preserved
φ-features checked by T

Example 2: Expletive Subjects

It is raining.
Expletive it occupies Spec-TP to satisfy EPP
No θ-role assigned

Tree:

[TP it [T′ is [VP raining]]]

Observation: English strictly requires Spec-TP to be filled in finite clauses.

12.4 Subjecthood in Urdu

Urdu allows pro-drop in finite clauses; EPP satisfied by null pronoun (pro)

Example: Non-Expletive Subject

جان نے کتاب پڑھی
Jān-ne kitāb paṛhī (John-ERG read the book)

Derivation:

Ergative DP in Spec-vP may optionally move to Spec-TP
TP EPP can be satisfied by pro in informal sentences

Tree:

[TP pro_i [T′ T[uφ, +EPP] [vP جان_i[ERG] [VP کتاب_j[ACC] پڑھی]]]]

Example: Expletive Pro

بارش ہو رہی ہے
bārish ho rahī hai (It is raining)
pro occupies Spec-TP to satisfy EPP
No θ-role assigned

12.5 Subjecthood in Saraiki

Head-final language with ergative marking in perfective clauses
Spec-TP may remain unfilled in informal speech; EPP satisfied by pro-drop
Literary registers may show overt subject raising

Example: Transitive Clause

جان سیب کھادے
Jān sīb khāde (John ate an apple)
Ergative subject optionally moves to Spec-TP
EPP feature satisfied by null or overt subject depending on discourse context

Tree:

[TP pro_i [T′ T[uφ, +EPP] [vP جان_i[ERG] [VP سیب_j کھادے]]]]

12.6 Interaction of EPP with A-Movement

English: overt DP moves to Spec-TP to satisfy both EPP and nominative Case
Urdu/Saraiki: movement optional; EPP can be satisfied by pro-drop
Passive constructions: Theme DP may raise to Spec-TP to satisfy EPP and Case

Tree: Passive English Example

The book was read by John.
[TP The book_i [T′ was [+EPP] [vP tThe book_i read [PP by John]]]]
EPP feature satisfied by Theme DP

12.7 Parametric Variation Across Languages

Feature | English | Urdu | Saraiki
EPP satisfaction | Spec-TP, overt DP required | Spec-TP or pro | Spec-TP or pro
Pro-drop allowed? | No | Yes | Yes
Subject raising | Obligatory | Optional | Optional
Passive EPP fulfillment | Theme moves | Theme may move | Theme may move
Expletive insertion | Required (it) | Optional (pro) | Optional (pro)

Observation: EPP is universal, but language-specific strategies differ: English requires overt subjects, South Asian languages allow pro-drop.

12.8 Illustrative Trees

English Active Clause

[TP John_i [T′ T[+EPP] [VP tJohn_i left]]]

English Passive Clause

[TP The book_i [T′ was [+EPP] [vP tThe book_i read [PP by John]]]]

Urdu Transitive Clause

[TP pro_i [T′ T[uφ, +EPP] [vP جان_i[ERG] [VP کتاب_j[ACC] پڑھی]]]]

Saraiki Transitive Clause

[TP pro_i [T′ T[uφ, +EPP] [vP جان_i[ERG] [VP سیب_j کھادے]]]]

12.9 EPP and Subjecthood: Interaction Summary

EPP feature ensures clause well-formedness
English: Spec-TP filled by overt DP or expletive
Urdu/Saraiki: Spec-TP can be satisfied by pro-drop
Passive: Theme DP may raise to satisfy EPP and Case simultaneously
EPP interacts with A-movement, Case checking, and φ-agreement

12.10 Exercises

1. Identify the EPP feature in the following sentences and mark the Spec-TP position:
   English: Mary will arrive soon.
   Urdu: جان آئے گا
   Saraiki: جان آئےس
2. Draw derivational trees showing EPP satisfaction in active and passive clauses for all three languages.
3. Explain how pro-drop in Urdu and Saraiki interacts with EPP satisfaction.

PART IV — INTERFACES AND INTERPRETATION

13: Binding Theory

Principles A, B, C and Anaphora Across English, Urdu, and Saraiki

13.1 Introduction: The Nature of Binding

Binding Theory governs anaphoric relations between pronouns, reflexives, and R-expressions in syntax.

Core observations:

Certain expressions require local antecedents (reflexives)
Others are free within local domains (pronouns)
Some must avoid local antecedents (R-expressions)

Chomsky formalized these as Principles A, B, and C:

Principle A: Reflexives must be bound locally
Principle B: Pronouns must be free locally
Principle C: R-expressions must be free everywhere

This chapter explores:

Formal definitions of binding principles
Structural application in English, Urdu, and Saraiki
Interaction with A- and A′-movement, locality, and phases

13.2 Technical Definitions

Principle | Formal Definition                                                                         | Structural Rule
A         | Reflexive pronouns (e.g., himself, herself) must be bound within their governing category | Binding Condition A: DP_i binds the anaphor within the minimal TP/vP
B         | Pronouns (e.g., he, she, them) must be free in their local domain                         | Binding Condition B: no DP may bind a pronoun within the same minimal TP/vP
C         | R-expressions (e.g., John, Mary) must be free in all domains                              | Binding Condition C: no DP may c-command a coreferential R-expression

Additional Concepts:

Local Domain: Typically vP or minimal TP
Binding: DP_i c-commands DP_j and shares reference
Free: DP not bound by a co-referential DP within domain
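C-command, the structural relation underlying all three principles, is easy to compute on toy trees. In the sketch below (an expository assumption, with trees as nested lists of word strings and coreference checked by string identity), Principle A holds when the antecedent c-commands the reflexive:

```python
# Leaves of a binary constituency tree encoded as nested lists of strings.
def leaves(node):
    return [node] if isinstance(node, str) else sum((leaves(d) for d in node), [])

# x c-commands y iff some node has x as one daughter and y inside the other.
def c_commands(tree, x, y):
    if isinstance(tree, str):
        return False
    left, right = tree
    return ((left == x and y in leaves(right)) or
            (right == x and y in leaves(left)) or
            c_commands(left, x, y) or c_commands(right, x, y))

# [vP John [VP saw himself]]: antecedent c-commands the reflexive (Principle A met)
assert c_commands(["John", ["saw", "himself"]], "John", "himself")
# *[vP himself [VP saw John]]: the reflexive is unbound (Principle A violated)
assert not c_commands(["himself", ["saw", "John"]], "John", "himself")
```

Principles B and C can be checked with the same relation: B blocks c-command of a pronoun within its local domain, C blocks c-command of a coreferential R-expression anywhere.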

13.3 Reflexives: Principle A

13.3.1 English

Example: John_i saw himself_i in the mirror

Analysis:

himself must be bound by local subject (John)
vP forms the local domain

Tree Representation:

[vP John_i [VP saw [DP himself_i]]]

Observation: *Himself saw John is ungrammatical (violates Principle A).

13.3.2 Urdu Reflexives

Reflexive marker: اپنا (apnā)
Example: جان_i نے اپنی_i کتاب پڑھی
Jān-ne apnī kitāb paṛhī (John read his own book)
apnā requires antecedent in same vP/TP
[vP جان_i [VP پڑھی [DP اپنی_i کتاب]]]
*اپنی جان نے کتاب پڑھی (apnī Jān-ne) violates Principle A

13.3.3 Saraiki Reflexives

Reflexive marker: اپنو (apno)
Example: جان_i نے اپنو_i سیب کھادے
Must be bound within vP

Observation: Saraiki mirrors Urdu in local binding requirement

13.4 Pronouns: Principle B

13.4.1 English Pronouns

Example: *John_i saw him_i vs John_i saw him_j
The pronoun him cannot be bound within its local domain (the minimal vP/TP containing it)
Coreference across clauses is allowed: in John_i said he_i would leave, the pronoun is free within the embedded TP, so Principle B is respected
[vP John_i [VP saw [DP him_j]]]
Local binding is ungrammatical → *John_i saw him_i

13.4.2 Urdu Pronouns

Pronoun: وہ (wo)
Example: جان_i نے کہا کہ وہ_j آئے گا
wo may refer to matrix subject if outside local vP
Local binding restricted: Principle B respected

13.4.3 Saraiki Pronouns

Pronoun: او (o)
Principle B applies similarly to Urdu
Allows long-distance coreference; blocks local binding

13.5 R-Expressions: Principle C

13.5.1 English R-Expressions

Example: *He_i said that John_i left → ungrammatical
John cannot be c-commanded by co-referential pronoun

Tree Representation:

[TP he_i [T′ said [CP that [TP John_i left]]]]
Principle C blocks this coreference: the pronoun c-commands the R-expression John

13.5.2 Urdu R-Expressions

Proper nouns: جان (Jān), علی (Ali)
Example: *وہ_i نے کہا کہ جان_i آئے گا → violates Principle C
R-expressions must be free in all domains, including across clauses

13.5.3 Saraiki R-Expressions

Proper nouns obey Principle C identically to Urdu
Example: *او_i کہیا کہ جان_i آئےس → ungrammatical

13.6 Interaction with A- and A′-Movement

A-movement: subjects moving to Spec-TP must respect Principles A/B
A′-movement: wh-fronting, focus, or topicalization may influence binding domains
Example (English): Which picture of himself_i did John_i like t_i?
Reflexive bound locally, movement extends beyond vP
Principle A satisfied pre-movement

13.7 Cross-Linguistic Observations

Principle                 | English                            | Urdu                           | Saraiki
A                         | Reflexive bound locally (vP)       | Reflexive bound locally (apnā) | Reflexive bound locally (apno)
B                         | Pronoun free locally               | Pronoun free locally (wo)      | Pronoun free locally (o)
C                         | R-expression free globally         | R-expression free globally     | R-expression free globally
Long-distance binding     | Allowed with pronouns              | Allowed                        | Allowed
Interaction with movement | Preserved under A- and A′-movement | Preserved                      | Preserved

Observation: Principles A, B, C are universal, with minor parametric adjustments in pronoun choice and reflexive marking.

13.8 Minimal Pair Illustrations

Sentence                     | Grammatical? | Explanation
John_i saw himself_i         | Yes          | Principle A satisfied
*Himself saw John            | No           | Principle A violated
*John_i saw him_i            | No           | Principle B violated (pronoun bound locally)
John_i said he_i would leave | Yes          | Principle B satisfied (pronoun free in its local domain)
*He_i said John_i left       | No           | Principle C violated

Urdu/Saraiki equivalents illustrate similar patterns using apnā/apno and pronouns/wo/o.

13.9 Illustrative Trees

English Reflexive

[vP John_i [VP saw [DP himself_i]]]

Urdu Reflexive

[vP جان_i [VP پڑھی [DP اپنی_i کتاب]]]

Saraiki Reflexive

[vP جان_i [VP کھادے [DP اپنو_i سیب]]]

13.10 Summary

Binding Theory enforces local/global constraints on referential expressions
Principle A: reflexives → bound locally (vP)
Principle B: pronouns → free locally
Principle C: R-expressions → free everywhere
Universal across English, Urdu, Saraiki with parametric variation in reflexive form and pronoun use
Interacts with movement, EPP, and phase domains

13.11 Exercises

1. Identify Principle A, B, or C violations in the following:
   English: *Himself left the room.
   Urdu: *اپنی جان نے کتاب پڑھی
   Saraiki: *اپنو جان نے سیب کھادے
2. Draw binding trees showing reflexive and pronoun positions in vP and TP.
3. Explain how movement (A and A′) affects local binding domains in English and Urdu.

14: Information Structure

Focus, Topic, and Discourse Alignment in English, Urdu, and Saraiki

14.1 Introduction: The Syntax-Discourse Interface

Information Structure (IS) concerns how syntactic structures signal discourse roles, such as:

Topic: what the sentence is about
Focus: what is being asserted or emphasized
Contrastive elements: highlighting alternatives or correction

Interaction with syntax:

Drives A′-movement (Spec-CP/TopP/FocP)
Determines fronting, left-dislocation, or prosodic prominence
Affects word order, agreement, and null element realization

This chapter explores:

Formal modeling of focus and topic projection
Cross-linguistic strategies in English, Urdu, Saraiki
Interaction with movement, EPP, and phases

14.2 Formal Definitions

Concept           | Definition                                             | Syntactic Projection
Topic             | Information already known or backgrounded in discourse | Spec-TopP
Focus             | Information that is contrastive, new, or emphasized    | Spec-FocP
Contrastive Focus | Emphasis on an alternative set                         | Spec-FocP with [+contrast] feature

Feature-Driven Representation:

Foc/Top heads bear [+focus]/[+topic], triggering XP movement to Spec-FocP/Spec-TopP
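This feature-driven rule can be mimicked with a toy re-ordering function; the dictionary encoding of XPs and their discourse features is an illustrative assumption, not part of the theory.

```python
# Front any XP bearing [+topic] or [+focus] to the left periphery
# (Spec-TopP/Spec-FocP), leaving the remaining XPs in base order.

def front_discourse_xps(clause):
    fronted = [xp for xp in clause if xp.get("feat") in ("+topic", "+focus")]
    rest = [xp for xp in clause if xp.get("feat") not in ("+topic", "+focus")]
    return fronted + rest

# "The book, John read (it) yesterday": the object carries [+topic]
clause = [{"word": "John"}, {"word": "read"}, {"word": "the book", "feat": "+topic"}]
assert [xp["word"] for xp in front_discourse_xps(clause)] == ["the book", "John", "read"]
```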

14.3 English Information Structure

14.3.1 Topic

Example: The book, John read it yesterday.
Topicalized object moves to Spec-TopP
Optional, discourse-driven

Tree Representation:

[TopP The book_i [Top′ Top [TP John [T′ T [VP read it_i yesterday]]]]]

14.3.2 Focus

Example: JOHN read the book. (contrastive focus)
Focus moves to Spec-FocP
[FocP JOHN_i [Foc′ Foc [TP t_i read the book]]]
Focus triggers prosodic prominence

14.3.3 Wh-Questions and Focus

Wh-phrases simultaneously satisfy A′-movement and [+wh]/[+focus]
Example: What did John read?

14.4 Urdu Information Structure

Urdu employs left-dislocation and contrastive fronting

14.4.1 Topic Fronting

Example: کتاب، جان نے پڑھی
Object کتاب moves to Spec-TopP for topic prominence
[TopP کتاب_i [Top′ Top [TP جان-ne [T′ T [VP t_i پڑھی]]]]]

14.4.2 Focus Fronting

Example: کتاب جان نے پڑھی → contrastive reading: it was the BOOK that John read, not something else
Contrastive focus marked by fronting and/or stress
[FocP کتاب_i [Foc′ Foc [TP جان-ne [VP t_i پڑھی]]]]
Focus position often adjacent to T

14.4.3 Prosodic Emphasis

Urdu uses pitch accent and stress for information structure
Focused constituents receive narrow or broad focus marking

14.5 Saraiki Information Structure

Head-final language, uses topic and focus fronting

14.5.1 Topic Fronting

Example: کتاب، جان نے پڑھی (The book, John read)
Object moves to Spec-TopP, maintaining the head-final vP

14.5.2 Focus Fronting

Example: جان نے کتاب پڑھی (contrastive emphasis on جان: JOHN read the book)
Focus movement interacts with EPP and A′-movement

Observation: Saraiki shares parametric flexibility with Urdu in fronting and prosody.

14.6 Interaction with EPP and Phases

EPP triggers Spec-TP occupation; Spec-FocP/TopP movement may also satisfy discourse-driven features
Phases (vP, CP) serve as landing sites for cyclic movement
Movement is successive-cyclic, especially in embedded clauses

Example: English Embedded Focus

It was JOHN_i that Mary saw t_i yesterday.
[FocP JOHN_i [Foc′ Foc [TP Mary [T′ T [VP saw t_i yesterday]]]]]

14.7 Cross-Linguistic Comparison

Feature              | English                    | Urdu                       | Saraiki
Topic fronting       | Optional, left-dislocation | Left-dislocation, frequent | Left-dislocation, frequent
Focus fronting       | Contrastive, Spec-FocP     | Contrastive, Spec-FocP     | Contrastive, Spec-FocP
Prosody              | Pitch accent, stress       | Pitch, stress              | Pitch, stress
Wh-questions         | Overt movement             | Optional in-situ           | Optional in-situ
Interaction with EPP | Spec-TP for finite clause  | Spec-TP or pro             | Spec-TP or pro

Observation: English prefers overt movement and prosodic marking, while Urdu/Saraiki allow flexible overt vs in-situ focus/topic positioning.

14.8 Illustrative Trees

English Topic Fronting

[TopP The book_i [Top′ Top [TP John [T′ T [VP read it_i yesterday]]]]]

English Contrastive Focus

[FocP JOHN_i [Foc′ Foc [TP t_i read the book]]]

Urdu Topic Fronting

[TopP کتاب_i [Top′ Top [TP جان-ne [T′ T [VP t_i پڑھی]]]]]

Saraiki Focus Fronting

[FocP جان_i [Foc′ Foc [TP t_i کتاب پڑھی]]]

14.9 Exercises

1. Identify the topic and focus positions in the following sentences:
   English: The cake, John ate yesterday.
   Urdu: کیک، جان نے کھایا
   Saraiki: کیک، جان نے کھادے
2. Draw trees for contrastive focus in English, Urdu, and Saraiki.
3. Explain how prosody interacts with movement in Urdu and Saraiki.

14.10 Summary

Information structure drives focus and topic positioning
Interacts with A′-movement, EPP, and phases
English: rigid wh/fronting, contrastive focus
Urdu/Saraiki: flexible fronting, pro-drop, discourse-driven stress
Trees illustrate Spec-FocP and Spec-TopP projections

15: PF and LF Interfaces

Mapping Syntax to Phonology and Semantics in English, Urdu, and Saraiki

15.1 Introduction: The Concept of Interfaces

In the Minimalist Program, syntactic structures are interpreted at two major interfaces:

PF (Phonological Form): Interfaces syntax with sound/linearization
LF (Logical Form): Interfaces syntax with meaning/semantics

Key ideas:

Syntax builds abstract hierarchical structures

PF and LF interpret these structures according to language-specific rules

Operations like movement, feature checking, and agreement impact both PF and LF

This chapter explores:

Formal definitions of PF and LF
How English, Urdu, and Saraiki structures are realized
Interface effects on word order, scope, and prosody

15.2 Phonological Form (PF)

15.2.1 Definition

PF is the level of representation where syntax is mapped to sound:

Determines linear order of constituents
Accounts for intonation, stress, and prosody
Sensitive to head-initial vs head-final order

Formal Rule (Linearization):

PF: Merge(X, Y) → linear order, according to the head-complement parameter

Principles:

Head-initial languages (English): Head precedes complement
Head-final languages (Urdu/Saraiki): Head follows complement
Prosodic alignment may trigger overt movement for clarity
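The linearization rule above can be sketched as a recursive function over Merge pairs, with head_initial as the PF parameter. The (head, complement) tuple encoding is an expository assumption, not a full PF model.

```python
# Linearize a Merge structure under the head-complement parameter.
# Nodes are (head, complement) pairs; leaves are word strings.

def linearize(node, head_initial):
    if isinstance(node, str):
        return [node]
    head, comp = node
    h = linearize(head, head_initial)
    c = linearize(comp, head_initial)
    return h + c if head_initial else c + h

vp = ("read", "the book")
assert linearize(vp, head_initial=True) == ["read", "the book"]   # English: VO
assert linearize(vp, head_initial=False) == ["the book", "read"]  # Urdu/Saraiki: OV
```

The same hierarchical object yields both surface orders; only the PF parameter differs, which is the point of the sections that follow.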

15.2.2 PF in English

Example: John read the book yesterday

Linearization:

[TP John [T′ T [VP read [DP the book]]]] → linear order: John read the book
Wh-movement influences PF: What did John read?
PF constraints handle auxiliary inversion

15.2.3 PF in Urdu

Example: جان نے کتاب پڑھی
Head-final: verb follows object

Tree:

[TP جان-ne [T′ [vP t_جان [VP کتاب پڑھی]] T]]
PF realizes ergative subject, object-verb order, optional fronting for discourse

15.2.4 PF in Saraiki

Head-final order maintained: Subject-Object-Verb (SOV)
Prosody and topic/fronting may reorder for emphasis

Example:

جان نے سیب کھادے (John ate the apple)

PF operation:

Merge structures linearized according to head-final parameter

15.3 Logical Form (LF)

15.3.1 Definition

LF is the semantic interpretation level:

Maps theta-roles to arguments
Determines scope of quantifiers, negation, and focus
Sensitive to A- and A′-movement

Formal Rule (Scope Assignment):

LF: assign theta-roles, check features, evaluate quantifiers and operators

15.3.2 LF in English

Example: Every student read a book

Interpretation:

Surface scope: ∀x ∃y (x read y)
Inverse scope: ∃y ∀x (x read y) — a single book read by every student

LF resolves ambiguity through movement of quantifiers or focus operators
[TP every student_i [T′ T [VP read [DP a book]]]] → LF: quantifier raising assigns scope
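The two readings can be enumerated mechanically: each ordering of the quantifier prefix corresponds to one covert-movement outcome at LF. The sketch below is deliberately naive and the formula strings are illustrative notation:

```python
from itertools import permutations

# "Every student read a book": each permutation of the quantifier prefix
# is one scope reading derived by (covert) quantifier raising.
quantifiers = ["∀x.student(x)", "∃y.book(y)"]
readings = {" ".join(p) + " read(x, y)" for p in permutations(quantifiers)}

assert len(readings) == 2  # surface scope (∀ > ∃) and inverse scope (∃ > ∀)
```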

15.3.3 LF in Urdu

Example: ہر طالب علم نے ایک کتاب پڑھی
Quantifier scope similar to English
Word order SOV does not block inverse scope

LF operations:

Movement of quantifiers may be covert (at LF)
PRO and null subjects interpreted at LF

15.3.4 LF in Saraiki

Example: ہر طالب علم نے ایک کتاب پڑھی
Covert movement at LF allows scope ambiguity resolution
Focus and topic marking interpreted semantically

15.4 Interaction of PF and LF

Operation          | PF Effect                  | LF Effect
Wh-movement        | Aux inversion, fronting    | Scope assignment, operator binding
Focus movement     | Prosodic prominence        | Alternative-set identification
Passive            | Theme realized in Spec-TP  | θ-role re-mapping: Agent → by-phrase
Control verbs      | PRO unpronounced at PF     | Theta-role assignment at LF
Quantifier raising | No overt PF movement       | Determines logical scope

Observation: Syntax constructs hierarchical structures, PF linearizes, LF interprets meaning; movement may affect one or both interfaces.

15.5 Cross-Linguistic Observations

Language | PF Parameter      | LF Operation                      | Notes
English  | Head-initial, SVO | Covert/in-situ quantifier raising | Auxiliary inversion for wh-movement
Urdu     | Head-final, SOV   | Covert quantifier raising         | Pro-drop subjects interpreted at LF
Saraiki  | Head-final, SOV   | Covert quantifier raising         | Topic/focus affects LF operators

Observation: English linearization relies on auxiliary inversion, Urdu/Saraiki rely on head-final order, LF mechanisms largely universal.

15.6 Illustrative Trees

English Wh-Question (PF + LF)

[CP What_i [C′ did [TP John [T′ t_did [VP read t_i]]]]]
PF: fronting, auxiliary inversion
LF: binds operator, assigns scope

Urdu SOV Clause (PF + LF)

[TP جان-ne [T′ [vP t_جان [VP کتاب پڑھی]] T]]
PF: SOV linearization
LF: assigns theta roles (Agent: جان, Theme: کتاب)

Saraiki SOV Clause with Focus

[FocP سیب_i [Foc′ Foc [TP جان-ne t_i کھادے]]]
PF: Focus fronted for prosody
LF: Focus operator binds alternatives

15.7 Exercises

1. Compare PF realizations of English and Urdu SVO vs SOV clauses.
2. Identify LF ambiguities in: Every student read a book (English, Urdu, Saraiki).
3. Draw PF and LF trees for focused constituents in English, Urdu, and Saraiki.
4. Explain the effect of pro-drop subjects on LF interpretation in Urdu/Saraiki.

15.8 Summary

PF: linearizes hierarchical syntax; handles word order, prosody, inversion
LF: assigns θ-roles, resolves scope, interprets focus and topic
Cross-linguistic parameters: English (head-initial, overt movement), Urdu/Saraiki (head-final, pro-drop)
Syntax interacts with both interfaces to preserve grammaticality and meaning

PART V — CROSS-LINGUISTIC SYNTAX (CORE CONTRIBUTION)

16: English vs Urdu/Saraiki Word Order

A Minimalist Perspective on Parametric Variation

16.1 Introduction

Word order is one of the most visible typological differences across languages.

English: Subject–Verb–Object (SVO)
Urdu/Saraiki: Subject–Object–Verb (SOV)

From a Minimalist viewpoint, this variation is not superficial:

Reflects feature strength differences
Influences movement operations
Interacts with Case assignment, agreement, and information structure

This chapter examines:

Base hierarchical structure
Derivational differences in English vs Urdu/Saraiki
Feature-driven explanations
Scrambling and discourse-driven movement

16.2 Base Structure vs Surface Order

All languages share universal hierarchical assumptions:

VP is constructed first
Arguments are introduced in vP
Functional projections dominate lexical projections (T > v > V)

16.2.1 English Example

John ate apples.
Base vP: [vP John [VP eat apples]]
TP projection: T carries tense, agreement features

16.2.2 Urdu Example

جان نے سیب کھائے
Jān-ne seb khāe
Base vP: [vP John [VP apples eat]]
T assigns perfective agreement with object
Ergative marking: -ne on subject

16.2.3 Saraiki Example

جان نے سیب کھادے
Jān-ne seb khāde
Parallel to Urdu SOV
Verb-final structure maintained
Ergative alignment and object agreement present

16.3 Structural Analysis

16.3.1 English (SVO)

[TP John [T′ T [vP t_John [VP eat apples]]]]
Verb movement: V → T for tense/agreement
Subject in Spec-TP: satisfies EPP and nominative Case

16.3.2 Urdu/Saraiki (SOV)

[TP John [T′ [vP t_John [VP apples eat]] T]]
Verb remains in vP
Optional movement of object driven by discourse features
Subject moves minimally to satisfy EPP

Key difference: linearization arises from PF parameter (head-initial vs head-final)

16.4 Case and Ergative Alignment

Urdu and Saraiki display split ergativity in perfective constructions:

16.4.1 Urdu/Saraiki Example

جان نے سیب کھایا
John-ERG apple ate
Subject receives ergative Case (-ne)
Verb agrees with object (φ-features)

16.4.2 English Contrast

John ate apples
Subject receives nominative Case
Verb agrees with subject

Observation: Alignment differences explained by feature distribution in v vs T
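This observation can be made concrete as a feature-distribution rule over aspect and transitivity. The function below is an illustrative sketch, and its names are expository assumptions:

```python
# Split ergativity as a rule: in perfective transitives, v assigns ergative
# to the subject and T agrees with the object; elsewhere T assigns
# nominative and agrees with the subject (the English-like pattern).

def assign_case(aspect, transitive):
    if aspect == "perfective" and transitive:
        return {"subject": "ERG", "agreement": "object"}
    return {"subject": "NOM", "agreement": "subject"}

# جان نے سیب کھایا: perfective transitive → ergative subject, object agreement
assert assign_case("perfective", True) == {"subject": "ERG", "agreement": "object"}
# Imperfective: nominative subject, subject agreement, as in English
assert assign_case("imperfective", True) == {"subject": "NOM", "agreement": "subject"}
```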

16.5 Feature-Based Explanation

Language     | Case Assignment           | Agreement              | Movement
English      | T → nominative on subject | Subject-verb agreement | V → T for tense
Urdu/Saraiki | v → ergative on subject   | T → object agreement   | Optional object scrambling; V stays in v

Conclusion: Surface variation emerges from feature strength and checking requirements, not fundamental structure.

16.6 Scrambling

Urdu and Saraiki allow flexible word order for discourse or focus:

سیب جان نے کھائے
Apples John ERG ate
Object moves to Spec-TopP or another left-peripheral position, driven by a discourse feature
Scrambling is optional, not required by EPP

Tree Representation

[TopP apples_i [Top′ Top [TP John_j [vP t_John_j [VP t_apples_i eat]]]]]
Linear order differs, hierarchical structure constant

16.7 Theoretical Implications

Universal Grammar (UG) is invariant: hierarchical projections remain identical

Variation arises from:

Feature distribution (who receives Case, who triggers agreement)
Strength of features (strong/weak)
PF-driven movement triggers (fronting, inversion, scrambling)
Word order differences are PF effects, not deep syntactic variation

Observation: Minimalist perspective captures parametric variation elegantly.

16.8 Summary

English SVO: V moves, subject receives nominative Case
Urdu/Saraiki SOV: verb-final, subject receives ergative, object may scramble
Scrambling is discourse-motivated, optional
Universal hierarchical structure preserved; PF parameters differ
Feature-driven explanation unifies cross-linguistic variation

16.9 Exercises

1. Draw vP and TP trees for English SVO and Urdu/Saraiki SOV sentences.
2. Identify Case assignment and agreement in each structure.
3. Explain how scrambling interacts with discourse in Urdu/Saraiki.
4. Compare PF-driven surface order vs LF hierarchical interpretation.

17: Saraiki Syntax and Argument Structure

A Comprehensive Analysis of Predicate-Argument Relations, Case, Movement, and Interface Phenomena

17.1 Introduction

Saraiki, a Western Punjabi language, exhibits rich SOV word order, ergative alignment, and flexible scrambling, making it an ideal case study for generative syntax and computational modeling.

This chapter integrates:

Argument structure representation
Case assignment patterns
Verb-final constructions, light verb usage, and complex predicates
Scrambling, null subjects, and PRO
Advanced syntactic phenomena: relative clauses, clefts, heavy NP extraposition, and focus
Cross-linguistic comparisons with English and Urdu
Computational modeling of movement, agreement, and interface features

The focus is on how theta roles, Case, movement operations, and interface constraints interact to produce grammatical structures.

17.2 Basic Argument Structure in Saraiki

17.2.1 Core Arguments

Subject (Agent/Experiencer): often marked with -ne (ergative) in perfective
Object (Theme/Patient): receives default absolutive or direct object marking
Verb: clause-final (head-final)

Example (transitive verb):

جان نے سیب کھادے
Jān-ne seb khāde
Agent: جان (John-ERG)
Theme: سیب (apple)
Verb: کھادے (ate)

Theta assignment:

Verb assigns Agent theta role to subject and Theme role to object
vP mediates Case assignment

17.2.2 Intransitive Verbs

علی سویا
Alī soyā
Subject: علی (Ali-NOM)
Verb: سویا (slept)
Observation: Intransitive subjects remain nominative even in the perfective; ergative -ne appears only on transitive perfective subjects (split ergativity, see 17.3)

17.3 Ergative Alignment and Case Checking

Saraiki exhibits split ergativity:

Aspect       | Sentence          | Subject Case   | Verb Agreement
Perfective   | جان نے سیب کھادے  | Ergative (-ne) | Object agreement
Imperfective | جان سیب کھاندا اے | Nominative     | Subject agreement

Tree Representation (SVO/SOV variation):

[TP John-ne [T′ T [vP t_John [VP seb khāde]]]]
Case assignment is feature-driven:

v[+ERG] → subject
T[+ϕ] → object agreement (perfective)

17.4 Verb-Final Construction and Headedness

VPs in Saraiki are head-final, supporting flexible scrambling:

سیب جان نے کھادے
Seb John-ne khāde

Optional movement to Spec-TopP or Spec-FocP is discourse-driven.

Headedness rule:

Projection | Head
VP         | V (lexical)
vP         | v
TP         | T

17.5 Complex Predicates and Light Verbs

Light verb constructions encode aspect, tense, voice, while main verb carries lexical meaning:
جان نے کتاب پڑھی
John-ne kitāb paṛhī
v[+ERG]: assigns ergative Case to subject
T[+ϕ]: agreement with object in perfective, subject in imperfective

Feature Schema:

v[+ERG] → subject Case
T[+ϕ] → agreement
V → merges lexical meaning with complements

17.6 Scrambling and Information Structure

Discourse features ([+focus], [+topic]) trigger optional fronting:
کتاب جان نے پڑھی
Kitāb John-ne paṛhī
LF preserves hierarchical argument structure
PF linearization changes surface order without affecting theta assignment

Tree Representation:

[TopP kitāb_i [Top′ Top [TP John-ne [vP t_John [VP t_kitāb paṛhī]]]]]

17.7 Null Subjects and PRO

Saraiki allows pro-drop in embedded clauses:
جان نے کہا کہ (Ø) کتاب پڑھی
John-ne kahā ke (pro) kitāb paṛhī
The null subject pro receives the Agent theta role from the embedded verb
LF resolves its co-reference with the matrix subject; PF leaves it unpronounced

17.8 Advanced Syntactic Phenomena

17.8.1 Focus and Contrastive Constructions

کتاب_i جان نے پڑھی
Kitāb_i John-ne paṛhī
Fronted object triggers contrastive focus
Movement to Spec-FocP preserves hierarchical theta roles

17.8.2 Relative Clauses and Gap Licensing

وہ کتاب جو جان نے پڑھی
vo kitāb jo John-ne paṛhī
Relative pronoun جو (jo) introduces embedded clause
Internal gap licensed via A′-movement
LF maintains theta-role assignment

Tree:

[CP kitāb_i [C′ jo [TP John-ne [vP t_John [VP t_kitāb paṛhī]]]]]

17.8.3 Clefts and Emphasis

یہی کتاب جان نے پڑھی
Yehi kitāb John-ne paṛhī
یہی (yehi) marks emphasized constituent
Spec-CleftP/TopP hosts focus
PF linearization signals prominence, LF preserves argument structure

17.8.4 Extraposition and Heavy NP Shift

جان نے پڑھی وہ کتاب جو پیچیدہ تھی
John-ne paṛhī vo kitāb jo pechida thī
Embedded relative clause shifted rightward
Theta-role assignment remains intact
PF linearization supports processing constraints

17.9 Interface Phenomena

17.9.1 Syntax-Prosody and Syntax-Discourse

Intonation contours indicate focus/topicalization
Discourse particles interact with PF linearization
Hierarchical theta structure remains consistent

Example:

کتاب_i جان نے پڑھی، لیکن پرانی تھی
Kitāb_i John-ne paṛhī, lekin purāni thī

17.10 Computational Modeling

17.10.1 Feature-Driven Trees

Fronting, clefts, and relative clauses represented via Spec-FocP/TopP movements
PF linearization computed post-Merge, LF maintains hierarchical relations

Example Pseudocode:

def scramble(dp, target):
    # Discourse-driven fronting: only [+focus] or [+topic] DPs move
    if dp.has_feature('+focus') or dp.has_feature('+topic'):
        move(dp, target)  # move() re-merges dp in the target specifier

17.10.2 Parsing and NLP Considerations

Long-distance dependencies from relative clauses and clefts require feature propagation and A′-movement tracking
NLP parsers must handle scrambling, fronting, and complex predicates
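The filler-gap bookkeeping such parsers need can be sketched with a stack ("gap threading"): push a filler when the operator is met, pop it at the gap. The FILLER/GAP tokens below are illustrative placeholders, not a real parser interface.

```python
# Track A'-dependencies with a filler stack: a well-formed clause
# licenses every gap and discharges every filler.

def check_filler_gap(tokens):
    stack = []
    for t in tokens:
        if t == "FILLER":
            stack.append(t)
        elif t == "GAP":
            if not stack:
                return False   # gap with no licensing filler
            stack.pop()
    return not stack           # every filler must bind a gap

# vo kitāb jo John-ne __ paṛhī: the jo-phrase licenses the object gap
assert check_filler_gap(["FILLER", "John-ne", "GAP", "paṛhī"])
# A gap with no fronted operator is unlicensed
assert not check_filler_gap(["John-ne", "GAP", "paṛhī"])
```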

17.11 Cross-Linguistic Comparison

Feature                | English         | Urdu                     | Saraiki
Basic order            | SVO             | SOV                      | SOV
Subject Case           | Nominative      | Ergative/Nominative      | Ergative/Nominative
Verb agreement         | Subject         | Object/Subject           | Object/Subject
Scrambling             | Optional        | Optional                 | Optional, discourse-driven
PRO/null subjects      | Limited         | Extensive                | Extensive
Complex predicates     | Aux + Verb      | Light verb constructions | Light verb constructions
Relative clauses       | Right-branching | Similar to Saraiki       | Right-branching
Focus fronting         | Rare            | Optional                 | Extensive
Heavy NP extraposition | Limited         | Optional                 | Allowed

17.12 Illustrative Trees

17.12.1 Basic Transitive Clause

[TP John-ne [T′ T [vP t_John [VP seb khāde]]]]

17.12.2 Scrambled Object

[TopP seb_i [Top′ Top [TP John-ne [vP t_John [VP t_seb khāde]]]]]

17.12.3 Embedded Clause with PRO

[TP John-ne [T′ kahā [CP ke [TP pro_i [VP kitāb paṛhī]]]]]

17.12.4 Relative Clause

[CP kitāb_i [C′ jo [TP John-ne [vP t_John [VP t_kitāb paṛhī]]]]]

17.12.5 Cleft Construction

[CleftP yehi kitāb_i [C′ C [TP John-ne [vP t_John [VP t_kitāb paṛhī]]]]]

17.13 Feature-Based Schema for Saraiki Syntax

Projection | Feature                                                          | Function
v          | [+ERG]: assigns ergative Case to subject (perfective)            | Case assignment
T          | [+ϕ]: agrees with object (perfective) or subject (imperfective)  | Agreement
V          | lexical meaning; merges with complements                         | Head of VP
Scrambling | [+topic], [+focus]                                               | Discourse-driven fronting
PRO        | licensed by control verbs                                        | Receives theta role at LF
Spec-FocP  | [+focus]                                                         | Hosts fronted focused constituents
Spec-TopP  | [+topic]                                                         | Hosts topicalized constituents

17.14 Exercises

1. Draw basic transitive and scrambled object trees with feature annotation.
2. Represent embedded clauses with PRO and assign theta roles.
3. Model relative clauses, clefts, and heavy NP extraposition in tree format.
4. Compare Saraiki, Urdu, and English syntactic structures computationally.
5. Implement a scrambling algorithm for focus-driven movement in pseudocode.

17.15 Summary

Saraiki exhibits SOV order, ergative alignment, scrambling, null subjects, and light verb constructions
Advanced phenomena: relative clauses, clefts, heavy NP extraposition, and focus constructions
Feature-driven modeling captures Case assignment, agreement, and discourse effects
Cross-linguistic comparison highlights parametric variation relative to English and Urdu
Computational and psycholinguistic approaches facilitate processing, NLP applications, and syntactic theory validation

18: Case Systems: Nominative vs Ergative

A Comparative and Feature-Based Analysis Across English, Urdu, and Saraiki

18.1 Introduction: The Case Filter

The Case Filter is a fundamental principle of Universal Grammar:

All Determiner Phrases (DPs) must receive abstract Case in order to be grammatical.

Illustrative examples:

*She likes he → Case Filter violation (the object position requires accusative)
She likes him → grammatical
English: DPs must be assigned nominative or accusative Case
Urdu/Saraiki: DPs may receive ergative, accusative, or dative Case depending on aspect and theta roles

Formal Principle:

DP → must be assigned Case at PF/LF

Violation of the Case Filter results in ungrammaticality (*).
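The filter lends itself to a direct check over the DPs of a derivation. The dictionary representation below is an illustrative assumption for exposition:

```python
# Minimal Case Filter check: every DP must end the derivation with a
# valued Case feature, or the derivation crashes.

def case_filter(dps):
    """Return the DPs that would crash the derivation (no Case assigned)."""
    return [dp["form"] for dp in dps if dp.get("case") is None]

# *She likes he: the object has not been assigned accusative Case
clause = [{"form": "she", "case": "NOM"}, {"form": "he", "case": None}]
assert case_filter(clause) == ["he"]

# She likes him: every DP is Case-marked, so the Case Filter is satisfied
clause = [{"form": "she", "case": "NOM"}, {"form": "him", "case": "ACC"}]
assert case_filter(clause) == []
```

The same check extends to the ergative/accusative/dative values of Urdu and Saraiki discussed below; only the set of possible Case values differs.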

18.2 English Case System

Case       | Assigned By      | Example
Nominative | T (finite tense) | She runs
Accusative | v/V              | John saw her

Feature-Based Representation:

T carries [uφ] (unvalued phi-features) that probe for a DP in Spec-TP
v or V assigns accusative Case to the object

Example Tree:

[TP John_i [T′ T[uφ] [vP t_i [v′ v [VP saw her]]]]]
Subject John receives nominative Case from T
Object her receives accusative Case from v

18.3 Urdu/Saraiki Case System

Urdu and Saraiki exhibit split ergativity, mainly conditioned by perfective aspect.

Case       | Marker | Function                    | Example
Ergative   | -ne    | Subject in perfective       | جان نے سیب کھایا (John-ERG apple ate)
Accusative | -ko    | Object marking              | میں نے کتاب کو پڑھا (I read the book-ACC)
Dative     | -ko    | Experiencer/indirect object | مجھے کتاب پسند ہے (I like the book; dative experiencer مجھے)

Key Properties:

v assigns ergative Case in perfective transitive constructions
T assigns agreement with object in ergative alignment
Non-perfective constructions use nominative-subject agreement, like English

18.4 Agreement Mismatch in Urdu/Saraiki

Ergative alignment leads to apparent agreement mismatch:

Verb agrees with object, not with ergative subject
Example:
جان نے سیب کھایا
John-ERG apple ate
Subject: ergative (-ne)
Verb: agrees with object (φ-features)
Contrasts with English SVO pattern where verb agrees with subject

Feature-Driven Representation:

v[+ERG] → assigns ergative Case to subject
T[uφ] → probes object for agreement
PF linearization: SOV
LF interpretation: theta roles preserved

18.5 Computational Feature-Based Representation

18.5.1 English

Subject DP in Spec-TP:
T[uφ] → probes DP in Spec-TP → assigns nominative Case
v → assigns accusative Case to object
Minimal Pair:
*She likes he (ungrammatical)
She likes him (grammatical)

18.5.2 Urdu/Saraiki

Perfective Transitive Clause:
v[+ERG] → assigns ergative Case to subject
T[uφ] → agrees with object (φ-features)
Optional scrambling/fronting does not affect Case assignment, only PF linearization:
سیب جان نے کھایا
Seb John-ERG ate
Object fronted for discourse reasons, structure preserves theta roles

18.5.3 Feature-Based Tree Representation

[TP John-ne [T′ T[uφ] [vP t_John [VP seb khāye]]]]
v[+ERG]: assigns ergative Case to subject
T[uφ]: values φ-features via object
PF linearization: SOV or scrambled
LF interpretation: theta roles preserved

18.6 Cross-Linguistic Observations

Feature            | English                    | Urdu                         | Saraiki
Subject Case       | Nominative                 | Ergative/Nominative          | Ergative/Nominative
Object Case        | Accusative                 | Accusative                   | Accusative
Verb Agreement     | Subject                    | Object (perfective)          | Object (perfective)
Aspect Sensitivity | Minimal                    | Perfective triggers ergative | Perfective triggers ergative
Scrambling         | Optional, discourse-driven | Optional                     | Optional
PF vs LF           | SVO linearization          | SOV, topic/focus-driven      | SOV, topic/focus-driven

Observation: The core hierarchical structure is universal; surface differences reflect parametric variation of Case assignment and agreement.

18.7 Exercises

1. Identify Case assignment in the following sentences:
   English: She saw him
   Urdu: جان نے سیب کھایا
   Saraiki: جان نے سیب کھادے
2. Draw feature-based trees showing v, T, and DP interactions.
3. Explain the agreement mismatch in perfective Urdu/Saraiki.
4. Scramble the object in Urdu/Saraiki and verify theta-role preservation.

18.8 Summary

Case Filter: all DPs must receive Case
English: nominative assigned by T, accusative by v
Urdu/Saraiki: split ergativity; ergative on perfective subjects, object agreement with T
Feature-based representation captures assignment and agreement
Scrambling/fronting affects PF but preserves LF interpretations

19: Agreement Systems in South Asian Languages

Feature Checking, Verb Agreement, and Parametric Variation in English, Urdu, and Saraiki

19.1 Introduction

Agreement systems are central to syntactic theory, determining how verbs, auxiliaries, adjectives, and other functional elements interact with their arguments.

Key points:

Agreement (φ-features) ensures feature matching between a head and its arguments
South Asian languages, like Urdu and Saraiki, exhibit split ergativity, object agreement, and discourse-driven agreement patterns
English provides a simpler, nominative-accusative agreement system

This chapter examines:

Feature-driven agreement in English, Urdu, Saraiki
Ergative and accusative systems
Computational representation of agreement
Scrambling and optional agreement

19.2 Formal Definition of Agreement

Agreement: a syntactic relation whereby a head (typically T or v) probes for an argument with matching features, values its unvalued features, and optionally triggers movement.

Formal Feature-Based Rule:

Head[uφ] → probes DP[iφ] → DP[iφ] values Head[uφ]
uφ: unvalued φ-features on the head
iφ: interpretable φ-features on the DP
Probe-Goal mechanism: the head searches within its c-command domain
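The probe-goal rule above can be sketched in Python, with feature structures as plain dictionaries and `None` marking an unvalued feature; both representational choices are assumptions for illustration:

```python
def agree(probe, goal):
    """Value the probe's unvalued φ-features from the goal's interpretable ones.

    `probe` and `goal` are plain dicts standing in for a head and a DP;
    a feature whose value is None counts as unvalued (toy convention).
    """
    valued = dict(probe)
    for feat, val in goal.items():
        if feat in valued and valued[feat] is None:
            valued[feat] = val  # value transfer from goal to probe
    return valued

T = {"person": None, "number": None}                 # T[uφ]
she = {"person": 3, "number": "sg", "gender": "f"}   # DP[iφ]
```

Running `agree(T, she)` values T's person and number from the DP; features the probe does not carry (here, gender) are simply ignored.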

19.3 English Agreement System

19.3.1 Finite Verbs

T carries [uφ] features; subject in Spec-TP values features

Example:

She runs
T[uφ] probes DP: She[iφ]
Values person, number, and gender (3rd person, singular, feminine)

19.3.2 Object Agreement

Minimal in English: usually null
Example: John likes her — verb does not agree with object

19.3.3 Feature Table: English

Head | Features | Probes      | Values
T    | [uφ]     | Spec-TP     | DP subject
v    | [ ]      | complements | usually no φ-agreement

Observation: English exhibits subject-verb agreement only; object agreement is absent.

19.4 Urdu Agreement System

Urdu exhibits split ergativity and object agreement in perfective clauses.

19.4.1 Perfective Transitive Clause

جان نے سیب کھایا
John-ERG apple ate

Feature Assignment:

v[+ERG] → assigns ergative Case to subject
T[uφ] → probes object φ-features → verb agrees with object

Tree Representation:

[TP John-ne [T′ T[uφ] [vP t_John [VP seb khāya]]]]
PF: SOV linearization
LF: theta roles preserved

19.4.2 Imperfective / Non-Perfective Clause

Subject receives nominative Case
Verb agrees with subject, not object

Example:

جان سیب کھاتا ہے
John seb khātā hai
Standard nominative alignment
Agreement pattern resembles English

19.4.3 Object Agreement Optionality

Scrambled object may trigger agreement with verb
Feature-driven, discourse-sensitive

Example:

سیب جان نے کھایا
Seb John-ne khāya
PF linearization altered
LF theta assignment preserved

19.5 Saraiki Agreement System

Saraiki mirrors Urdu, but with additional dialectal variations:

v[+ERG] assigns ergative Case in perfective
T[uφ] agrees with object
Scrambling and focus influence PF, optionally LF interpretations

Example:

جان نے سیب کھادے
John-ne seb khāde
Object agreement preserved
PF: SOV, optional fronting for topicalization

Feature Table: Saraiki

Head | Features | Goal    | Effect
v    | [+ERG]   | subject | assigns ergative Case
T    | [uφ]     | object  | values φ-features
Top  | [+Topic] | DP      | optional scrambling

19.6 Computational Feature-Based Representation

19.6.1 Feature-Checking Operations

Probe-Goal: Head with unvalued features searches c-command domain
Value Transfer: Goal DP values unvalued features
Optional Movement: PF may linearize DP differently (scrambling, fronting)
Agreement Mismatch: v assigns ergative Case → T agrees with object

Minimal Pair:

John-ne seb khāya → verb agrees with object
Seb John-ne khāya → object fronted, agreement preserved

19.6.2 Algorithmic Representation

For each Head H[uφ]:
    Search the c-command domain for a DP[iφ]
    If found:
        Assign the DP's φ-features to H
        Check Case assignment (v or T)
        Linearize DPs per PF rules

This procedure encodes the English vs Urdu/Saraiki parametric variation.
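The steps above can be put together as a runnable toy derivation of the perfective clause John-ne seb khāya. The dictionary encoding, helper names, and the `ACC` label for the object (taken from the comparison table in 18.6) are assumptions for illustration:

```python
def derive_perfective(subject, obj):
    """Toy pipeline: v[+ERG] assigns Case to the subject,
    T[uφ] is valued by the object's φ-features."""
    subject = dict(subject, case="ERG")   # v[+ERG] → ergative subject
    obj = dict(obj, case="ACC")           # object Case (illustrative label)
    T = {"person": obj["person"], "number": obj["number"]}  # T[uφ] ← object
    return subject, obj, T

def linearize(subject, obj, verb="khāya", fronted=False):
    """PF step only: scrambling reorders words without touching Case/agreement."""
    s, o = subject["form"] + "-ne", obj["form"]
    words = [o, s, verb] if fronted else [s, o, verb]
    return " ".join(words)

john = {"form": "John", "person": 3, "number": "sg"}
seb  = {"form": "seb",  "person": 3, "number": "sg"}
```

`linearize(..., fronted=True)` produces the scrambled order "seb John-ne khāya" while the Case and agreement values computed by `derive_perfective` remain unchanged, mirroring the PF/LF split.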

19.7 Scrambling and Agreement Interactions

Scrambling in Urdu/Saraiki does not alter Case assignment
Agreement may still target original base-generated DP
PF realizes surface word order, LF preserves theta roles

Example:

کتاب جان نے پڑھی
Kitāb John-ne paṛhī
Scrambled object fronted
Verb agreement remains with object (PF vs LF distinction)

19.8 Cross-Linguistic Comparison Table

Feature            | English    | Urdu                         | Saraiki
Subject Case       | Nominative | Ergative/Nominative          | Ergative/Nominative
Object Case        | Accusative | Accusative                   | Accusative
Verb Agreement     | Subject    | Object (perfective)          | Object (perfective)
Scrambling         | Minimal    | Optional                     | Optional, discourse-driven
Aspect Sensitivity | No split   | Perfective triggers ergative | Perfective triggers ergative
Null Subjects      | Limited    | Extensive                    | Extensive

Observation: Agreement systems in South Asian languages show parametric variation guided by:

Feature strength
Aspectual split
Scrambling and discourse triggers

19.9 Exercises

Draw feature-based trees showing v and T agreement in:
English: She reads books
Urdu: جان نے کتاب پڑھی
Saraiki: جان نے کتاب پڑھی
Explain the effect of scrambling on agreement in Urdu/Saraiki.
Compare PF linearization vs LF theta assignment in object-fronted clauses.
Identify agreement mismatches in perfective clauses and explain feature checking.

19.10 Summary

Agreement is a feature-checking operation: T probes DP; v assigns Case
English: subject-verb agreement; accusative object agreement absent
Urdu/Saraiki: split ergativity; object agreement in perfective clauses; subject receives ergative Case
Scrambling/fronting affects PF linearization, LF theta roles remain invariant
Computational representation formalizes probe-goal operations, Case assignment, and PF linearization

PART VI — ADVANCED TOPICS

20: Phase Theory

Derivational Domains, Spell-Out, and Syntactic Locality

20.1 Introduction

Phase theory, a core concept of Minimalist syntax, formalizes locality constraints on syntactic derivations:

Proposed by Chomsky (2000, 2001)
Phrases are derived in chunks (“phases”) that are spelled out to the interfaces incrementally
Captures locality of movement, feature checking, and derivational economy

Key Concepts:

Phases: vP, CP
Spell-Out: transfer of a phase to PF (phonological form) and LF (logical form)
Phase Impenetrability Condition (PIC): defines what elements are accessible to higher heads

This chapter applies phase theory to English, Urdu, and Saraiki, highlighting movement, agreement, and Case-checking constraints.

20.2 Phases: Definition and Examples

Definition: A phase is a syntactic domain that:

Is merged as a maximal projection (XP)
Transfers to PF and LF once completed
Protects its complement domain from higher probing

Canonical Phases:

Phase | Domain                           | Function
vP    | Introduces the external argument | Assigns Case; spells out its VP complement
CP    | Clausal domain (TP complement)   | Governs wh-movement, topicalization

20.2.1 vP Phase

English Example:

[TP John [T′ T [vP t_John [VP eat apples]]]]
vP: phase boundary
DP (apples): complement of V, accessible to v for Case
Subject (John): in Spec-vP, accessible to T for φ-agreement

Saraiki/Urdu Example:

[TP جان-نے [T′ T [vP t_جان [VP سیب کھادے]]]]
vP spells out internal argument DP (سیب / seb)
Ergative subject (جان-نے / John-ERG) is accessible to T for agreement

20.2.2 CP Phase

English wh-question:

[CP What_i [C′ did [TP John [T′ t_did [vP t_John [VP eat t_i]]]]]]
CP is a phase
Movement of wh-phrase must obey Phase Impenetrability Condition (PIC)

PIC: Only Spec and head of a phase are accessible to higher probes; complements are spelled out.

20.3 Phase Impenetrability Condition (PIC)

Formal statement (Chomsky 2001):

In a phase α with head H and complement domain β:

The complement β is not accessible to operations outside α

Only H and Spec(α) are visible to higher probes

H (phase head) → accessible
Spec(α) → accessible
Complement(α) → spelled out; inaccessible

Implications:

Movement across phase boundaries must pass through Spec of the phase
Explains successive-cyclic wh-movement
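The PIC can be stated as a small accessibility check. The dictionary representation of a phase (keys `head`, `specs`, `complement`) is a toy encoding assumed here for illustration:

```python
def accessible(phase):
    """PIC sketch: only the phase head and its Spec(s) remain visible to
    higher probes; the complement domain has already been spelled out."""
    return {phase["head"]} | set(phase["specs"])

# The vP phase of "John eat apples" (toy representation)
vP = {"head": "v", "specs": ["John"], "complement": ["VP", "eat", "apples"]}
```

A higher probe querying `accessible(vP)` can see v and John but not the spelled-out VP material, which is why a wh-phrase must first move to the phase edge to remain visible.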

20.4 Application to English

20.4.1 Successive-Cyclic Movement

Wh-phrase moves via Spec-CP of each intermediate CP
[CP What_i [C′ did [TP John [T′ t_did [vP t_i′ [vP t_John [VP eat t_i]]]]]]] (t_i′ = intermediate landing site at the vP edge)
vP and CP are spelled out incrementally
Higher C probes Spec of CP
PIC enforces locality

20.4.2 Object Movement

Raising of object to Spec-TP must respect vP phase boundary
Ensures feature-checking occurs locally

20.5 Application to Urdu/Saraiki

vP and CP remain canonical phases
Scrambling and topicalization interact with phase boundaries:

20.5.1 Object Scrambling Across Phases

[TopP سیب_i [Top′ Top [TP جان-نے [T′ T [vP t_جان [VP t_سیب کھادے]]]]]]
Object (سیب / seb) moves to Spec-TopP
Movement respects vP phase boundary
LF interprets theta roles correctly

20.5.2 Embedded CPs

Wh-movement and focus movement must be successive-cyclic via CP
Example:
میں نے پوچھا کہ کون_i کتاب پڑھے گا
I asked that who_i book will-read
Who moves from Spec-vP → Spec-TP → Spec-CP
Each CP and vP spells out per phase theory

20.6 Feature-Driven Operations within Phases

vP Phase:

  • Assigns Case to internal arguments
  • Subjects in Spec-vP remain accessible to T

CP Phase:

  • Governs wh-movement, topicalization, focus
  • Scrambled DPs pass through Spec positions for PF/LF access

Probe-Goal Operations:

Head[uφ] → probe DPs within c-command domain
If DP in complement of phase → must pass through Spec of phase
Explains successive-cyclic movement and agreement locality

20.7 Computational Representation

Phase-Based Algorithm:

For each phase α:
    Spell out complement(α) → PF/LF
    Allow probing of Spec(α) and Head(α) only
    Perform local feature checking:
        v[+ERG] → subject
        T[uφ] → probe accessible DP
        C[wh] → probe Spec-CP

This ensures that agreement, Case, and movement respect locality constraints.

20.8 Cross-Linguistic Implications

Feature                    | English           | Urdu                               | Saraiki
Phases                     | vP, CP            | vP, CP                             | vP, CP
Successive-cyclic movement | Yes               | Yes                                | Yes
Object scrambling          | Minimal           | Optional, PF-driven                | Optional, PF-driven
PIC effects                | Enforces locality | Enforces locality                  | Enforces locality
Agreement                  | Subject-verb      | Object/subject depending on aspect | Object/subject depending on aspect

Observation: Phase theory unifies movement and agreement patterns across languages, explaining parametric variation while preserving universal hierarchies.

20.9 Exercises

Draw vP and CP phase trees for:

English wh-question: What did John eat?
Urdu embedded question: میں نے پوچھا کہ کون کتاب پڑھے گا
Saraiki object scrambling: کتاب جان نے پڑھی

Identify phase boundaries and which DPs are accessible to higher probes.
Explain how PIC constrains successive-cyclic movement.
Represent feature checking and agreement operations computationally for each phase.

20.10 Summary

Phases: vP and CP are minimal derivational units
Spell-Out: complements inaccessible to higher heads; Spec and head accessible
PIC enforces locality of movement and agreement
English, Urdu, Saraiki share universal phase structure
Scrambling, topicalization, wh-movement respect phase boundaries
Feature-driven operations (Case, φ-agreement) occur locally within phases

21: Minimalism and Economy Conditions

Optimizing Derivations: Feature Checking, Movement, and Economy Principles

21.1 Introduction

Minimalist syntax, pioneered by Chomsky (1995, 2000), aims to reduce syntactic derivations to essential operations.

Core Goals:

Achieve economical derivations (least computational effort)
Restrict movement and feature checking to necessary cases
Ensure interface conditions (PF and LF) are satisfied

Economy conditions formalize why certain movements occur while others are blocked.

This chapter applies Minimalist principles to English, Urdu, and Saraiki, examining:

Merge and Move
Feature checking efficiency
Locality and economy constraints

21.2 Merge and Economy

Merge: the operation combining two syntactic objects into a single constituent

External Merge: combines a head with a new argument/DP
Internal Merge (Move): re-merges an existing DP to a higher position

Economy Principle for Merge:

Merge only occurs when required for:

Feature checking (e.g., φ-features, Case)
Satisfying interface conditions

Example:

John_i [T′ T[uφ] [vP t_John [VP eat apples]]]
External Merge: John merged into Spec-vP
Internal Merge: not needed unless movement required (e.g., wh-question)

21.3 Move and Economy Conditions

Internal Merge (Move) Principles:

Shortest Move / Minimal Link Condition (MLC):

A DP moves to the closest position that satisfies its feature requirements

Avoids unnecessary long-distance movement

Example (English):

What_i did John eat t_i?

Spec-CP of each intermediate CP accessed in a successive-cyclic manner

Minimal distance ensures economy

Check Features as Early as Possible:

Move DP only if its unvalued features cannot be checked in situ

Example (Urdu/Saraiki scrambling):

سیب جان نے کھایا

Seb John-ne khāya

Object moved to Spec-TopP only if discourse features [+Focus/+Topic] present

Otherwise, remains in base position (economical derivation)
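The Shortest Move condition can be sketched as a selection over candidate landing sites. The `(name, distance, features_checked)` encoding and the structural-distance numbers are illustrative assumptions:

```python
def shortest_move(dp_features, landing_sites):
    """MLC sketch: among sites that can check one of the DP's features,
    choose the structurally closest; with nothing to check, stay in situ.

    landing_sites: iterable of (name, distance, features_checked) triples.
    """
    viable = [s for s in landing_sites if dp_features & s[2]]
    if not viable:
        return None  # economy: no feature to check → no movement
    return min(viable, key=lambda s: s[1])[0]

# A wh-DP targets the nearby vP edge first, not the distant Spec-CP directly
sites = [("Spec-CP", 3, {"wh"}), ("Spec-TP", 2, {"phi"}), ("edge-vP", 1, {"wh"})]
```

The choice of the vP edge over Spec-CP for a wh-DP models the first step of successive-cyclic movement; a DP with no unvalued features stays put, the economical outcome.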

21.4 Economy of Representation

21.4.1 Economy of Derivation

Feature-driven movement occurs only when features are unvalued
Unnecessary movement violates economy of derivation

Formal Rule (Chomsky, 1995):

Move α → β only if α has unvalued features that β can check

21.4.2 Economy of Representation

Avoids redundant structure-building
Keeps derivations minimal: do not create additional projections unless required

English Example:

*Apples_i John ate t_i (with no [+Topic/+Focus] motivation)
The base-generated object satisfies Case and θ-role in situ, so no movement is needed
The economy principle blocks this unnecessary movement

21.5 Feature-Checking Hierarchies

Heads carry unvalued features: T[uφ], v[ERG], C[uWh]

Heads enter a probe-goal relationship with DPs to check features
Movement occurs only to satisfy feature requirements
Scrambling and topicalization are optional operations triggered by [+Focus/+Topic]

Example: Urdu/Saraiki object agreement

v[+ERG] → assigns Case
T[uφ] → agrees with object
Scrambling → optional for discourse
PF realizes fronted object
LF preserves theta structure

21.6 Minimalist Conditions Across Languages

Condition        | English                           | Urdu                               | Saraiki
Movement         | Shortest Move / successive-cyclic | Shortest Move; optional scrambling | Same as Urdu
Case assignment  | Nominative/Accusative             | Ergative/Nominative, Accusative    | Ergative/Nominative, Accusative
Feature checking | Subject-verb                      | Object agreement in perfective     | Object agreement in perfective
Economy          | Move only if features unvalued    | Scramble only if [+Topic/+Focus]   | Scramble only if [+Topic/+Focus]

Observation: Economy conditions enforce minimal derivations while allowing parametric variation.

21.7 Phase Interaction and Economy

Phases constrain movement: only Spec and head accessible
Economy ensures DPs move shortest distance through phase Spec

Example:

English wh-question: What did John eat?
Moves successive-cyclic through Spec-CP
Economy prevents unnecessary movement beyond phase boundary

Urdu/Saraiki Example:

کتاب جان نے پڑھی
Kitāb John-ne paṛhī
Scrambling to Spec-TopP respects vP and CP phase boundaries
Economy blocks movement if discourse features absent

21.8 Computational Representation

Algorithm for Minimalist Economy:

For each DP:
    If the DP has unvalued features:
        Move it to the nearest head that can value them (Shortest Move)
    Else:
        Leave it in base position (economy)
    Check Case:
        If unassigned → v/T assigns Case
    Scramble if [+Topic/+Focus] → optional PF movement
Spell out each phase → transfer complements to PF/LF

This procedure captures the English, Urdu, and Saraiki parametric differences.
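The decision logic of the algorithm above condenses into one function. The feature-set encoding of a DP is an assumption for illustration:

```python
def next_operation(dp):
    """Economy sketch: Move only when forced by unvalued features;
    scramble only under [+Topic]/[+Focus]; otherwise stay in situ.

    dp: dict with 'unvalued' and 'discourse' feature sets (toy encoding).
    """
    if dp["unvalued"]:
        return "move"      # feature-driven movement (Case, wh, EPP)
    if dp["discourse"] & {"Topic", "Focus"}:
        return "scramble"  # optional, PF-oriented movement
    return "stay"          # the most economical derivation
```

A DP with an unvalued Case feature must move; an Urdu/Saraiki object with only a [+Topic] feature may scramble; and a fully valued, discourse-neutral DP stays, exactly the economy ranking described above.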

21.9 Exercises

Represent feature-checking operations computationally for:
English: What did John eat?
Urdu: جان نے کتاب پڑھی
Saraiki: کتاب جان نے پڑھی

Identify derivational steps that satisfy economy conditions.
Explain optional movement in Urdu/Saraiki under [+Focus/+Topic] triggers.
Illustrate phase boundaries and successive-cyclic movement consistent with economy principles.

21.10 Summary

Minimalism seeks economical derivations in syntax
Move operations occur only to check unvalued features
Shortest Move / Minimal Link Condition ensures locality
Economy interacts with phase theory: movement occurs through Spec of phase only if necessary
English: subject-verb agreement, minimal movement
Urdu/Saraiki: object agreement, scrambling is optional and discourse-driven
Computational representation captures feature checking, movement, and economy constraints

22: Computational Modeling of Syntax

Feature Structures, Algorithms, and Applications in English, Urdu, and Saraiki

22.1 Introduction: Syntax as an Algorithm

The formalization of syntax allows linguistic theory to interface with computational systems. Modern approaches treat syntax as a set of algorithmic operations over feature-rich representations, enabling applications in:

Natural Language Processing (NLP)
Grammar checking systems
Language acquisition modeling

Core Principles:

Feature Structures: Every lexical item carries interpretable (i) and uninterpretable/unvalued (u) features.
Operations: Merge, Move, Agree, and Scramble can be formalized as computational procedures.
Constraints: Economy, phase theory, and locality can be encoded algorithmically.

Motivation: By formalizing syntax computationally, we can simulate derivations, test predictions, and implement multilingual models including English, Urdu, and Saraiki.

22.2 Feature Structures

Definition: A feature structure is a set of attribute-value pairs associated with a lexical item or syntactic node.

English Example:

DP: She
features = { category: noun, person: 3, number: singular, gender: feminine }

Urdu Example:

DP: جان-نے (John-ERG)
features = { category: noun, person: 3, number: singular, ergative: +, gender: masculine }

Saraiki Example:

DP: سیب (seb)
features = { category: noun, number: singular, case: accusative }

Observation: Feature structures serve as the basis for algorithmic operations like Merge and Agree.
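The three feature structures above translate directly into Python dictionaries, together with a unification-style compatibility check (a deliberate simplification of full feature-structure unification):

```python
# The DPs from the examples above as attribute-value dictionaries
she  = {"category": "noun", "person": 3, "number": "singular",
        "gender": "feminine"}
john = {"category": "noun", "person": 3, "number": "singular",
        "ergative": True, "gender": "masculine"}
seb  = {"category": "noun", "number": "singular", "case": "accusative"}

def compatible(a, b):
    """Two feature structures are compatible iff every attribute they
    share carries the same value (unification-lite)."""
    return all(a[k] == b[k] for k in a.keys() & b.keys())
```

`seb` and `she` are compatible (their shared attributes agree), while `she` and `john` clash on gender; this is the kind of check Agree performs when matching a probe against a goal.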

22.3 Computational Operations

22.3.1 Merge

Definition: Merge combines two syntactic objects into a new constituent.

Pseudocode:

def Merge(X, Y):
    # Return a new syntactic object combining X and Y
    return {X, Y}

English Example:

Merge(VP, DP) → VP[eat apples]

Urdu/Saraiki Example:

Merge(vP, DP) → vP[جان-نے [VP سیب کھادے]]

Binary vs n-ary Merge:

Binary Merge: combines exactly two elements → standard in Minimalist derivations
N-ary Merge: multiple elements simultaneously → theoretical variation; generally avoided in economy-driven derivations

22.3.2 Agree

Definition: Agree is a feature-checking operation where a head with unvalued features (probe) searches its c-command domain for a goal DP with matching interpretable features.

Pseudocode:

def Agree(probe, goal):
    if goal in c_command_domain(probe):
        value_features(probe, goal)

English Example:

T[uφ] probes DP: She[i φ] → T’s φ-features valued

Urdu/Saraiki Example:

T[uφ] probes object DP: سیب[i φ] → T’s φ-features valued
v[+ERG] → assigns ergative Case to subject

Observation: Agree ensures subject/verb or object/verb agreement based on feature specifications and phase boundaries.

22.3.3 Internal Merge (Move) and Scrambling

Internal Merge: Moves a DP to a higher position for feature checking or discourse reasons.

Pseudocode:

def InternalMerge(DP, target):
    if DP.has_unvalued_features(target):
        move(DP, target)

Example (Urdu Object Scrambling):

Seb moves to Spec-TopP if [+Focus] → سیب جان نے کھادے

Example (English Wh-Movement):

What_i moves to Spec-CP → What_i did John eat t_i?

22.4 Computational Trees

Feature-based tree representation:

English:

[TP She[i φ] [T[uφ] did [vP t_She [VP eat apples]]]]

Urdu:

[TP جان-نے [T[uφ] [vP t_جان [VP سیب کھادے]]]]

Saraiki:

[TopP سیب_i [Top′ Top [TP جان-نے [vP t_جان [VP t_سیب کھادے]]]]]
Each node carries feature structures
Operations (Merge, Agree, Move) manipulate these structures algorithmically

22.5 Applications

22.5.1 Natural Language Processing (NLP)

Feature-based grammar can parse and generate sentences
Useful for machine translation (English ↔ Urdu/Saraiki)
Ensures agreement, Case assignment, and word order

22.5.2 Grammar Checking

Automatically detects violations of Case Filter, Agreement, or Movement constraints
Example:
*She likes he → flagged by computational grammar
Supports educational and computational tools

22.5.3 Language Acquisition Modeling

Simulates incremental learning of Merge and Agree
Explains parameter setting in Urdu/Saraiki learners of English and vice versa
Models scrambling, ergative agreement, and word order preferences

22.6 Algorithmic Summary

High-level pseudocode for derivation:

for lexical_item in lexicon:
    Merge(lexical_item, current_structure)
    if lexical_item.has_unvalued_features():
        for DP in c_command_domain(lexical_item):
            Agree(lexical_item, DP)
            if movement_required(DP):
                InternalMerge(DP, target_position)
    SpellOut(phase)  # at each phase boundary

Key Points:

Merge builds hierarchical structure
Agree checks features and assigns Case
Internal Merge moves DPs for feature or discourse requirements
Spell-Out ensures PF/LF accessibility

22.7 Cross-Linguistic Considerations

Operation               | English           | Urdu                     | Saraiki
Merge                   | Binary, standard  | Binary                   | Binary
Agree                   | Subject-verb only | Object/subject agreement | Object/subject agreement
Internal Merge          | Wh-movement       | Optional scrambling      | Optional scrambling
Feature-driven movement | Yes               | Yes                      | Yes
Phase awareness         | vP, CP            | vP, CP                   | vP, CP
PF/LF interfaces        | Strict            | Flexible (scrambling)    | Flexible (scrambling)

Observation: Computational modeling captures parametric variation while maintaining universal principles.

22.8 Exercises

Implement pseudocode for Merge, Agree, and Internal Merge for:

English: She reads books
Urdu: جان نے کتاب پڑھی
Saraiki: کتاب جان نے پڑھی

Draw feature-annotated trees for each sentence.
Simulate object scrambling and verify agreement assignment.
Identify phase boundaries and model PF/LF Spell-Out computationally.

22.9 Summary

Syntax can be formalized algorithmically using Merge, Agree, and Internal Merge
Feature structures encode φ-features, Case, and discourse roles
English: simple subject-verb agreement, minimal scrambling
Urdu/Saraiki: split ergativity, object agreement, optional scrambling
Computational grammar supports NLP, grammar checking, and acquisition modeling

23: Syntax and Cognition

The Interface Between Syntactic Theory, Processing, and Cognitive Architecture

23.1 Introduction

Syntax is not only a formal system of rules but also a cognitive system that interacts with processing, memory, and learning mechanisms. Understanding syntax-cognition links bridges linguistics with psycholinguistics, computational modeling, and neurolinguistics.

Key Questions:

How does the brain represent hierarchical structure?
What cognitive constraints affect movement, agreement, and word order?
How do English, Urdu, and Saraiki speakers process derivational complexity differently?

23.2 Syntax as a Cognitive Module

Hypothesis: Syntax operates as a distinct cognitive module, guided by Universal Grammar (UG) principles.

Minimalist derivations reflect economy of computation in the mind
Feature checking, Merge, and Move occur under limited working memory resources
Phase boundaries reduce processing load by chunking structures

Example:

[TP John [T′ T [vP t_John [VP eat apples]]]]
The vP phase allows the processor to compute the VP separately before integrating it into TP
This incremental processing mirrors psycholinguistic parsing strategies

23.3 Processing Movement

23.3.1 A-Movement (Subject Raising)

Subject raising occurs early in the derivation, reducing the need for long-distance retrieval
Minimal link condition (Shortest Move) reflects cognitive efficiency

Example: English

John_i [T′ T [vP t_John [VP eat apples]]]
Subject is in Spec-vP initially
Moves to Spec-TP to check φ-features and EPP
Minimizes memory retrieval distance

Urdu/Saraiki Example:

Subject may remain in Spec-vP due to ergative marking
T agrees with object instead of subject in perfective clauses
Reduces computational cost for agreement operations

23.3.2 A′-Movement (Wh, Focus, Topicalization)

Wh-movement and focus constructions impose higher processing load
Successive-cyclic movement aligns with phase theory and incremental parsing

Example:

What_i did John eat t_i?
Each intermediate Spec-CP position acts as a processing checkpoint
Cognitive system avoids long-distance dependencies in a single step

Urdu Example:

میں نے پوچھا کہ کون_i کتاب پڑھے گا
I asked that who_i book will-read
Scrambling of object (کتاب / kitāb) is optional
Cognitive system leverages topicalization for discourse salience

23.4 Working Memory and Locality

Cognitive Constraints:

Human parser prefers local dependencies
Phase theory operationalizes locality: only Spec and head of phase are accessible
Distance effects: longer movement increases processing difficulty

Example:

Complex English wh-question:
Which book_i did [the professor that John recommended] finally read t_i?
Multiple embedded phases
Each intermediate Spec-CP reduces retrieval cost

Urdu/Saraiki Processing:

Scrambled objects and topicalized constituents respect vP/CP phase boundaries
Allows incremental interpretation in real-time parsing

23.5 Agreement and Cognitive Load

Feature checking reduces ambiguity in derivation
English: minimal object agreement → low processing cost
Urdu/Saraiki: split ergativity → cognitive parser must identify perfective vs imperfective contexts

Example: Urdu Perfective

جان نے سیب کھایا
John-ERG apple ate
Parser identifies ergative subject
Computes object agreement → slightly higher processing load

23.6 Interface with Semantics

Syntax interacts with LF (Logical Form) to ensure theta-role assignment
Cognitive system relies on hierarchical structure to interpret meaning

English Example:

John_i [T′ T [vP t_John [VP eat apples]]]
Subject receives agent θ-role
Object receives patient θ-role
Minimalist derivation allows incremental semantic interpretation

Urdu/Saraiki Example:

Ergative subjects do not trigger verb agreement
Object agreement helps parser resolve θ-roles efficiently

23.7 Cross-Linguistic Cognitive Implications

Property               | English | Urdu            | Saraiki
Subject-verb agreement | Simple  | Conditional     | Conditional
Object agreement       | Minimal | Perfective only | Perfective only
Scrambling             | Rare    | Optional        | Optional
Phase-based locality   | vP, CP  | vP, CP          | vP, CP
Cognitive load         | Low     | Moderate        | Moderate
Incremental processing | High    | Moderate        | Moderate

Observation: Parametric variation in agreement, ergativity, and scrambling correlates with processing complexity and working memory demands.

23.8 Computational Modeling of Syntax-Cognition Interface

Algorithmic Representation:

for phase in derivation:
    SpellOut(phase)  # compute PF/LF incrementally
    for DP in phase:
        if DP.has_unvalued_features():
            Agree(phase.head, DP)
        if movement_required(DP):
            InternalMerge(DP, Spec_of_phase)
Interpret(LF)
Simulates incremental parsing and feature checking
Integrates Minimalist principles, phase theory, and cognitive efficiency

23.9 Exercises

Draw phase trees for English and Urdu wh-questions, annotating feature-checking points.
Explain incremental processing advantages of successive-cyclic movement.
Compare cognitive load for object scrambling in Urdu vs English simple SVO sentences.
Model agreement computations for perfective ergative sentences in Urdu/Saraiki.

23.10 Summary

Syntax reflects cognitive efficiency in addition to formal rules
Phase theory, Minimalist economy, and feature checking reduce processing load
English: subject-verb agreement, minimal movement, low load
Urdu/Saraiki: object agreement, optional scrambling, higher computational complexity
Cognitive models align with computational grammar algorithms, capturing incremental processing and derivational constraints

PART VII — PEDAGOGICAL AND RESEARCH EXTENSIONS

24: Teaching Syntax Effectively

Strategies, Pedagogy, and Multilingual Considerations for Syntax Instruction

24.1 Introduction

Teaching syntax, especially generative and minimalist frameworks, requires bridging formal theory and practical understanding. In multilingual contexts like Pakistan, students may have backgrounds in Urdu or Saraiki, which can influence comprehension of English syntactic concepts.

Goals of Syntax Instruction:

Build conceptual understanding of hierarchical structure, Merge, Move, and Agree
Develop analytic skills for tree-building and feature checking
Integrate cross-linguistic comparisons to illustrate universality and variation

24.2 Pedagogical Principles

24.2.1 From Theory to Practice

Start with basic phrase structure rules
Introduce X-bar theory to formalize hierarchy
Use examples from English, Urdu, and Saraiki

Example:

Language | Sentence          | Structure Highlight
English  | John ate apples   | SVO, VP-internal argument
Urdu     | جان نے سیب کھایا  | SOV, ergative subject
Saraiki  | جان نے سیب کھادے  | SOV, object agreement

24.2.2 Scaffolded Learning

Introduce simple clauses before embedded and complex sentences
Use incremental trees: first VP, then vP, TP, CP
Highlight feature-driven movement and phase boundaries gradually

24.3 Teaching Strategies

24.3.1 Visual Aids

Annotated trees: show Merge, Spec-Head, Agree
Color-coding features: φ-features, Case, discourse [+Topic/+Focus]
Stepwise derivations: highlight phase Spell-Out

Example (English):

[TP John[i φ] [T[uφ] did [vP t_John [VP eat apples]]]]
Highlight subject movement to Spec-TP

Example (Urdu):

[TP جان-نے [T[uφ] [vP t_جان [VP سیب کھادے]]]]
Show ergative Case assignment and object agreement

24.3.2 Cross-Linguistic Comparison

Encourage students to compare structures across languages
Emphasize parametric variation:
Feature           | English    | Urdu             | Saraiki
Word order        | SVO        | SOV              | SOV
Subject agreement | nominative | ergative context | ergative context
Object agreement  | minimal    | perfective only  | perfective only
Scrambling        | rare       | optional         | optional

Visual contrasts clarify theoretical concepts

24.3.3 Hands-On Exercises

Tree Building: Draw phrase structure trees for English, Urdu, and Saraiki sentences
Feature Annotation: Mark φ-features, Case, [+Topic/+Focus]
Movement Simulation: Move DPs for wh-questions, topicalization, scrambling
Computational Modeling: Write pseudocode for Merge, Agree, and Internal Merge

24.4 Active Learning Techniques

24.4.1 Collaborative Analysis

Group work: analyze complex sentences
Peer teaching of tree-building strategies

24.4.2 Incremental Complexity

Start with simple sentences → embedded clauses → scrambling & topicalization
Introduce phases, economy, and Minimalist operations in digestible segments

24.4.3 Use of Technology

Syntax tree editors (e.g., TreeForm, TrEd)
Feature-based grammar simulators
NLP tools to parse sentences in English, Urdu, Saraiki

24.5 Multilingual Classroom Considerations

Transfer effects: Urdu/Saraiki background may influence perception of English SVO order
Cross-linguistic scaffolding: relate ergative subject assignment in Urdu/Saraiki to nominative in English
Discourse strategies: [+Focus/+Topic] marking in L1 may aid understanding of scrambling in English
Comparative exercises: highlight universal principles versus language-specific variations

24.6 Assessment Strategies

Formative: small tree-building exercises, feature annotation, short derivations
Summative: full syntactic derivations, cross-linguistic analysis, computational pseudocode tasks
Feedback: focus on clarity of hierarchical structure, correct feature checking, and movement justification

24.7 Cognitive and Psycholinguistic Integration

Link theory to processing constraints: phases, locality, economy
Encourage incremental derivation thinking: simulate human parsing
Highlight interface with LF/PF for comprehension and production

24.8 Advanced Teaching Topics

Phase-based derivations: explain Spell-Out and PIC through examples
Minimalist operations: Merge, Internal Merge, Agree, Scramble
Feature-driven computation: φ-features, Case assignment, discourse features
Cross-linguistic parametric variation: English SVO vs Urdu/Saraiki SOV, ergative alignment

24.9 Example Lesson Plan

Lesson 1: Phrase Structure and X-bar Theory

Objective: Understand basic tree structure
Activity: Draw VP and TP trees for English sentences
Homework: Compare with Urdu/Saraiki simple clauses

Lesson 2: Feature Checking and Movement

Objective: Internal Merge and Agree
Activity: Annotate φ-features and movement in wh-questions
Homework: Simulate scrambling in Urdu/Saraiki

Lesson 3: Minimalist Economy

Objective: Shortest Move and economy constraints
Activity: Identify unnecessary movement in example sentences
Homework: Cross-linguistic derivation analysis

24.10 Summary

Effective syntax instruction combines formal theory, visualization, and active learning
Cross-linguistic examples enhance understanding of universality and parametric variation
Phase theory, Minimalist operations, and feature-driven derivations can be taught incrementally
Computational exercises reinforce understanding and prepare students for applied syntax research

25: Syntax in NLP and AI

Integrating Generative Syntax with Computational Intelligence

25.1 Introduction

The convergence of linguistic theory and artificial intelligence has transformed how syntax is modeled, processed, and applied in natural language understanding systems. Modern NLP frameworks rely heavily on formal grammatical structures, making the Minimalist and feature-driven approaches highly relevant.

Focus of the Chapter:

Mapping syntax into computational models
Applications in AI-based language systems
Cross-linguistic implications for English, Urdu, and Saraiki

25.2 Syntax as Computable Structure

Syntax is algorithmically tractable:

Merge, Internal Merge, and Agree are computational operations

Feature structures encode essential grammatical information:
φ-features (person, number, gender)
Case features (nominative, accusative, ergative)
Discourse features ([+Focus], [+Topic])

Formal Representation Example:

English:

[TP She[iφ] [T[uφ] did [vP t_She [VP eat apples]]]]

Urdu:

[TP جان-نے [T[uφ] [vP t_جان [VP سیب کھایا]]]]

Saraiki:

[TopP سیب_i [Top′ Top [TP جان-نے [vP t_جان [VP t_سیب کھادے]]]]]
Each node is a computational object with attributes (features) and pointers to children (subconstituents).
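This idea can be made concrete with a few lines of code. Below is a minimal sketch of a syntactic node as a computational object; the class and field names (`Node`, `features`, `children`) are illustrative choices, not part of any standard library:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A syntactic node: a label, a feature bundle, and child subconstituents."""
    label: str                                     # e.g. "TP", "DP", "T"
    features: dict = field(default_factory=dict)   # e.g. {"phi": "3sg", "case": "nom"}
    children: list = field(default_factory=list)   # ordered pointers to subconstituents

# The English example [TP She[iφ] [T[uφ] did [vP ... ]]], simplified, as an object tree:
tp = Node("TP", children=[
    Node("DP", {"phi": "3sg", "case": "nom"}),     # She: interpretable φ, nominative
    Node("T", {"uphi": True}),                     # did: uninterpretable φ, probes the DP
    Node("vP", children=[Node("VP")]),             # eat apples (internal structure omitted)
])

def count_nodes(n: Node) -> int:
    """Walk the tree recursively — the same traversal a parser or checker performs."""
    return 1 + sum(count_nodes(c) for c in n.children)
```

Once nodes are objects, operations like Agree reduce to tree traversals that compare feature bundles.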

25.3 NLP Applications

25.3.1 Parsing and Treebank Construction

Feature-based grammars allow automatic syntactic parsing
Trees generated reflect X-bar structure, movement, and phase boundaries
Multilingual parsing uses language-specific parameters

Example:

English parser: enforces SVO, subject-verb agreement
Urdu/Saraiki parser: allows SOV, optional scrambling, object agreement

Algorithmic Steps:

Tokenize input
Assign lexical features
Build trees via Merge operations
Check features via Agree
Apply Internal Merge for movement (if triggered by discourse or wh-features)
Output PF/LF structure
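The first four steps above can be sketched as a toy pipeline. This is not a real parser: the `LEXICON` entries and helper names are invented for illustration, and Merge is applied naively left to right.

```python
# A toy lexicon assigning features to tokens (illustrative entries only).
LEXICON = {
    "she":    {"cat": "D", "phi": "3sg", "case": "nom"},
    "eats":   {"cat": "V", "uphi": True},
    "apples": {"cat": "N", "phi": "3pl"},
}

def tokenize(sentence):
    """Step 1: split the input into tokens."""
    return sentence.lower().split()

def assign_features(tokens):
    """Step 2: look up each token's lexical feature bundle."""
    return [dict(LEXICON.get(t, {"cat": "?"}), form=t) for t in tokens]

def merge_all(items):
    """Step 3: build structure by pairwise Merge (naively left-branching here)."""
    tree = items[0]
    for item in items[1:]:
        tree = {"label": "Merge", "left": tree, "right": item}
    return tree

def agree(subj, verb):
    """Step 4: value the verb's uninterpretable φ against the subject's interpretable φ."""
    return bool(verb.get("uphi")) and "phi" in subj

items = assign_features(tokenize("She eats apples"))
tree = merge_all(items)
print(agree(items[0], items[1]))  # True: uφ on the verb is valued by the subject
```

Steps 5 and 6 (Internal Merge and PF/LF output) would operate on the `tree` object produced here.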

25.3.2 Machine Translation

Cross-linguistic syntactic knowledge is critical for English ↔ Urdu/Saraiki translation

Challenge: Different word orders (SVO vs SOV) and ergativity in L1

Feature-driven computational models handle:

Case assignment (nominative vs ergative)
φ-feature agreement
Optional scrambling

Example Pipeline:

English input: John ate apples
→ parse into a feature tree
→ convert features to the Urdu parameter set
→ Urdu output: جان نے سیب کھایا
This pipeline ensures syntactic fidelity in translation.
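The transfer step of this pipeline can be sketched as a single reordering function: linearize SVO as SOV and attach the ergative marker to the subject (the perfective transitive context of the example). The mini-dictionary and function name are hypothetical, for illustration only.

```python
# Toy transfer step for a single (subject, verb, object) triple:
# reorder English SVO to Urdu SOV and add the ergative clitic نے to the subject.
EN_UR = {"John": "جان", "ate": "کھایا", "apples": "سیب"}  # illustrative mini-dictionary

def transfer_svo_to_sov(subj, verb, obj):
    """Map an English SVO triple to an Urdu SOV string with ergative Case marking."""
    s = EN_UR[subj] + " نے"                 # ergative case marker on the subject
    return f"{s} {EN_UR[obj]} {EN_UR[verb]}"

print(transfer_svo_to_sov("John", "ate", "apples"))  # جان نے سیب کھایا
```

A full system would of course condition the ergative marker on tense/aspect features rather than applying it unconditionally.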

25.3.3 Grammar Checking and Error Detection

Syntax-based AI can detect violations of grammar rules
Example: Case Filter violation
*She likes he → flagged
Can also flag improper movement or agreement mismatches

Urdu/Saraiki Example:

*جان نے سیب کھائے (incorrect agreement)
AI tools can highlight feature mismatches computationally
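A Case Filter check of the kind described above reduces to comparing a pronoun's form against the Case its position requires. The sketch below handles only English object position; the pronoun tables are a small illustrative subset.

```python
# Minimal Case Filter check: an English pronoun in object position must be accusative.
NOMINATIVE = {"i", "he", "she", "we", "they"}
ACCUSATIVE = {"me", "him", "her", "us", "them"}

def check_object_case(subject, verb, obj):
    """Flag a nominative pronoun appearing where accusative Case is assigned."""
    if obj.lower() in NOMINATIVE:
        return f"*{subject} {verb} {obj} — Case Filter violation: '{obj}' needs accusative form"
    return "OK"

print(check_object_case("She", "likes", "he"))   # flagged
print(check_object_case("She", "likes", "him"))  # OK
```

The Urdu/Saraiki analogue would instead compare the verb's φ-features against the agreement controller, but the logic is the same: features on the tree, checked against the configuration.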

25.3.4 Language Acquisition Modeling

AI models simulate learning of Merge and Agree operations

Can mimic parameter setting for L2 acquisition:

English learners of Urdu/Saraiki must acquire ergative marking and scrambling rules
Urdu/Saraiki speakers learning English must adjust to strict SVO and nominative alignment

25.4 AI Integration

Syntax is integrated into cognitive AI models for language comprehension
Feature-driven derivations provide explainable representations for AI reasoning

Syntax-based AI modules enhance:
Question answering
Summarization
Discourse analysis

Example:

def process_sentence(sentence):
    tree = parse(sentence)          # build structure via Merge
    check_features(tree)            # Agree: value uninterpretable features
    resolve_dependencies(tree)      # movement chains, binding relations
    return interpret(tree)          # hand off to LF for interpretation

Each step leverages syntactic rules as formal constraints.

25.5 Cross-Linguistic AI Considerations

Component | English | Urdu | Saraiki
Word order | SVO | SOV | SOV
Agreement | Subject–verb | Object/subject | Object/subject
Scrambling | Rare | Optional | Optional
Case checking | Nominative/Accusative | Ergative/Nominative | Ergative/Nominative
AI complexity | Moderate | Higher | Higher
Feature-driven modeling | Standard | Mandatory for correct derivation | Mandatory

Observation: AI systems must account for parametric differences and language-specific feature interactions.
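These parametric differences can be stored as per-language settings that a multilingual system consults before parsing. The parameter names and values below are an illustrative encoding, not a standard inventory.

```python
# Parametric settings per language (illustrative keys and values).
PARAMS = {
    "English": {"order": "SVO", "agreement": "subject",
                "scrambling": False, "case": ("nom", "acc")},
    "Urdu":    {"order": "SOV", "agreement": "object/subject",
                "scrambling": True, "case": ("erg", "nom")},
    "Saraiki": {"order": "SOV", "agreement": "object/subject",
                "scrambling": True, "case": ("erg", "nom")},
}

def shared_settings(lang1, lang2):
    """Return the parameter names on which two languages agree."""
    p1, p2 = PARAMS[lang1], PARAMS[lang2]
    return sorted(k for k in p1 if p1[k] == p2[k])

print(shared_settings("Urdu", "Saraiki"))   # all four parameters match
print(shared_settings("English", "Urdu"))   # none match
```

A parser conditioned on such a table can reuse one core engine (Merge, Agree, Internal Merge) while varying only the language-specific settings.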

25.6 Computational Trees in NLP

English Example:

TP
├── DP: She[iφ]
├── T[uφ]: did
└── vP
    ├── t_She
    └── VP: eat apples

Urdu Example:

TP
├── DP: جان-نے
├── T[uφ]
└── vP
    ├── t_جان
    └── VP: سیب کھایا

Saraiki Example (Topicalization):

TopP
├── DP: سیب_i
├── Top
└── TP
    ├── DP: جان-نے
    └── vP
        ├── t_جان
        └── VP: t_سیب کھادے
Trees allow AI systems to perform feature checking, movement, and case assignment efficiently.
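Box-drawing displays like the ones above can be generated from the tree objects themselves. The sketch below assumes a simple `(label, children)` tuple format for trees; this representation is chosen for brevity, not prescribed by any framework.

```python
def render(tree):
    """Render a (label, children) tuple as box-drawing lines, one per node."""
    label, children = tree
    lines = [label]
    for i, child in enumerate(children):
        last = i == len(children) - 1
        head = "└── " if last else "├── "       # branch connector for the child itself
        tail = "    " if last else "│   "       # continuation prefix for its descendants
        sub = render(child)
        lines.append(head + sub[0])
        lines.extend(tail + line for line in sub[1:])
    return lines

english = ("TP", [
    ("DP: She[iφ]", []),
    ("T[uφ]: did", []),
    ("vP", [("t_She", []), ("VP: eat apples", [])]),
])
print("\n".join(render(english)))
```

The same function renders the Urdu and Saraiki trees unchanged; only the input tuples differ.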

25.7 Exercises

Implement a feature-based parser for English, Urdu, and Saraiki.
Simulate scrambling and ergative agreement computationally.
Build a cross-linguistic translation algorithm that respects syntactic structure.
Model movement dependencies in wh-questions and topicalization in NLP frameworks.

25.8 Summary

Syntax can be integrated into NLP and AI systems using formal operations
Feature structures, Merge, Agree, and Internal Merge provide computational tractability
Cross-linguistic variation (word order, agreement, ergativity) must be incorporated for multilingual AI
Computational modeling enables grammar checking, language acquisition simulations, parsing, and translation
Minimalist principles provide efficient, explainable representations suitable for AI

26: Research Directions in Pakistani Linguistics

Opportunities, Challenges, and Future Frontiers in Syntax, Computational Linguistics, and Multilingual Studies

26.1 Introduction

Pakistan’s linguistic landscape is uniquely diverse, with over 70 languages spoken across the country, including Urdu, Saraiki, Punjabi, Sindhi, Pashto, and Balochi. This multilingual environment offers a rich laboratory for syntactic research, computational modeling, psycholinguistics, and applied linguistics.

Objective of this chapter:

Identify high-impact research directions for Pakistani linguistics
Highlight understudied areas in syntax, morphology, and phonology
Integrate computational, psycholinguistic, and cross-linguistic methodologies

26.2 Cross-Linguistic Syntax

26.2.1 Comparative Syntactic Studies

Investigate Universal Grammar principles across Pakistani languages
Focus on feature-driven operations: Merge, Agree, Internal Merge
Study word order variation (SVO vs SOV) and scrambling patterns

Examples for Investigation:

Language | Key Focus | Research Question
Urdu | Ergative alignment | How does split ergativity impact φ-feature assignment?
Saraiki | Object agreement | What triggers optional scrambling computationally?
Punjabi | Clitic placement | How do clitics interact with verb movement and the Tense projection?
Sindhi | Dative constructions | How are θ-roles mapped across ergative and nominative alignment?

26.2.2 Minimalist Approaches

Apply Minimalist Program principles to local languages
Examine phase theory, economy, and locality constraints in Urdu/Saraiki
Document language-specific parametric settings for cross-linguistic universals

26.3 Morphosyntactic Research

Feature inventories of Pakistani languages (φ-features, Case, tense, aspect) need systematic documentation
Investigate agreement systems, particularly split ergativity and differential object marking
Study lexical vs functional categories, particularly for code-switching contexts

Research Example:

How does Saraiki verb agreement reflect underlying φ-feature hierarchy?
Do vP vs TP projections show the same hierarchical behavior as English?

26.4 Phonology-Syntax Interface

Examine prosodic effects on scrambling, topicalization, and focus
Study intonation and stress as cues to discourse prominence
Computational modeling can integrate PF features with syntactic trees

Example: Focus-marked object in Saraiki:

سیب_i جان نے کھادے
Apples_i John ERG ate
Intonation pattern reinforces scrambled object prominence
Can be modeled computationally in NLP and speech synthesis systems

26.5 Computational Linguistics and NLP

26.5.1 Treebanks and Corpora

Build feature-rich treebanks for Urdu, Saraiki, Punjabi, and Sindhi
Include morphosyntactic annotation, movement, Case marking, and discourse features
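A feature-rich treebank entry could be stored as per-token records, loosely modeled on CoNLL-style columns. The field inventory below is a hypothetical sketch for the Urdu clause جان نے سیب کھایا, not an existing annotation standard.

```python
# One annotated Urdu clause (jān-ne seb khāyā, 'John ate apples'), token by token.
# Field names and values are illustrative, loosely CoNLL-style.
sentence = [
    {"id": 1, "form": "جان",  "case": "erg", "phi": "3sg.m", "role": "subject"},
    {"id": 2, "form": "نے",   "case": None,  "phi": None,    "role": "case-marker"},
    {"id": 3, "form": "سیب",  "case": "nom", "phi": "3sg.m", "role": "object"},
    {"id": 4, "form": "کھایا", "case": None,  "phi": "3sg.m", "role": "verb"},
]

def agreement_controller(tokens):
    """In a perfective ergative clause, the verb agrees with the nominative object:
    return the object's form if its φ-features match the verb's, else None."""
    verb = next(t for t in tokens if t["role"] == "verb")
    obj = next(t for t in tokens if t["role"] == "object")
    return obj["form"] if verb["phi"] == obj["phi"] else None

print(agreement_controller(sentence))  # سیب — the object controls agreement
```

Queries of this kind (which argument controls agreement, which Case appears where) are exactly what a morphosyntactically annotated corpus makes possible at scale.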

26.5.2 Machine Learning Applications

Train AI models on local languages to improve:

Machine translation
Speech recognition
Grammar checking and educational tools

Example Project:

Feature-based parser for Urdu ergative constructions
Integrate with English-South Asian bilingual NLP applications

26.6 Psycholinguistics and Processing

Study sentence processing in Urdu/Saraiki speakers
Examine cognitive load in ergative vs nominative constructions
Explore incremental parsing and phase-based processing experimentally

Research Idea:

Compare comprehension of scrambled vs canonical SOV sentences in Saraiki
Track reaction times, memory load, and accuracy

26.7 Sociolinguistic and Educational Applications

Investigate code-switching between Urdu, English, and regional languages
Examine literacy and syntactic awareness in multilingual classrooms
Develop pedagogical tools using formal syntax for language teaching

Example Initiative:

Interactive syntax tree builder for Urdu/Saraiki learners
Feature annotation exercises aligned with Minimalist principles

26.8 Documentation and Preservation

Many regional languages are under-documented

Syntax-focused fieldwork should:
Record native speaker intuitions
Annotate movement, agreement, Case, and word order
Develop computational resources for endangered languages

Example: In Saraiki and Hindko, investigate verb-final clauses, object agreement, and topicalization patterns

26.9 Future Directions

Domain | Key Research Directions
Syntax | Cross-linguistic parameter mapping; Merge and Move operations in regional languages
Morphology | Feature hierarchies; Case assignment and split ergativity
Computational | NLP applications; feature-based parsers; machine translation
Psycholinguistics | Incremental parsing; processing of ergative constructions; memory-load studies
Pedagogy | Syntax teaching for multilingual classrooms; visual and computational tools
Language Documentation | Treebanks, corpora, fieldwork, endangered-language preservation

26.10 Recommended Methodologies

Fieldwork and Elicitation: Collect native speaker judgments on movement, agreement, and scrambling
Treebank Annotation: Use feature-driven hierarchical structures
Experimental Syntax: Reaction time studies, comprehension experiments
Computational Modeling: Merge, Agree, Internal Merge; NLP pipelines
Cross-Linguistic Comparison: Identify parametric settings for Minimalist operations

26.11 Conclusion

Pakistan offers a rich environment for syntactic, morphosyntactic, and computational studies
Combining formal theory, cognitive models, and computational approaches can accelerate research

Urgent needs:
Corpus creation for Urdu/Saraiki/Punjabi/Sindhi
Computational tools for parsing and analysis
Integration of syntax with pedagogy, NLP, and psycholinguistics

Vision: To establish Pakistan as a global hub for multilingual syntactic research, with resources, computational tools, and expertise in both theory and application.

Suggested Readings

Adger, D. (2003). Core syntax: A minimalist approach. Oxford University Press.
Adger, D. (2015). Syntax. Wiley Interdisciplinary Reviews: Cognitive Science, 6(2), 131–147. https://doi.org/10.1002/wcs.1332
Bhatia, T. K., & Ritchie, W. C. (Eds.). (2014). The handbook of bilingualism and multilingualism. John Wiley & Sons.
Büring, D. (2012). The meaning of topic and focus: The 59th Street Bridge accent. Routledge.
Carnie, A. (2011). Modern syntax: A coursebook. Cambridge University Press.
Carnie, A. (2021). Syntax: A generative introduction. John Wiley & Sons.
Chomsky, N. (1995). The minimalist program. MIT Press.
Chomsky, N. (2002). Syntactic structures. Walter de Gruyter.
Culicover, P. W. (2013). Explaining syntax: Representations, structures, and computation. Oxford University Press.
Freidin, R. (2012). Syntax: Basic concepts and applications. Cambridge University Press.
Fujita, H., & Fujita, K. (2022). Human language evolution: A view from theoretical linguistics on how syntax and the lexicon first came into being. Primates, 63, 403–415. https://doi.org/10.1007/s10329-021-00891-0
Heggie, L. (1993). The range of null operators: Evidence from clefting. Natural Language & Linguistic Theory, 11(1), 45–84.
Hirschberg, J., & Manning, C. D. (2015). Advances in natural language processing. Science, 349(6245), 261–266.
Hornstein, N. (2018). The minimalist program after 25 years. Annual Review of Linguistics, 4, 49–65.
Jurafsky, D., & Martin, J. H. Speech and language processing: An introduction to natural language processing, computational linguistics, and speech recognition.
Lewis, R. L., & Vasishth, S. (2005). An activation-based model of sentence processing as skilled memory retrieval. Cognitive Science, 29(3), 375–419.
Mahajan, A. (1994). The ergativity parameter: have–be alternation, word order and split ergativity.
Manning, C., & Schütze, H. (1999). Foundations of statistical natural language processing. MIT Press.
Matchin, W., Mancini, S., Li, J., & Zaccarella, E. (2024). Syntax, the brain, and linguistic theory: A critical reassessment. Frontiers in Language Sciences, 3, 1441948.
Patel-Grosz, P. (2021). Ergativity in Indo-Aryan. In Oxford Research Encyclopedia of Linguistics.
Radford, A. (2004). English syntax: An introduction. Cambridge University Press.
Subbārāo, K. V. (2012). South Asian languages: A syntactic typology. Cambridge University Press.