
Grammar and Syntax

Grammar and (Theoretical and Empirical) Syntax (EN-308)

Riaz Laghari, Lecturer, English Department, Quaid-i-Azam University, Islamabad

Credit Hours: 3
Contact Hours: 48
Prerequisite: Introduction to Linguistics

COURSE DESCRIPTION

This course offers a rigorous introduction to contemporary syntactic theory, combining empirical analysis with formal modeling. Students examine the architecture of phrase structure, argument structure, movement, agreement, case theory, and clause structure across languages. The course integrates generative, functional, and usage-based approaches while emphasizing data analysis, cross-linguistic comparison, and theoretical argumentation.


The course prepares students for advanced research in syntax, linguistics, computational modeling, and interdisciplinary language sciences.

COURSE OBJECTIVES 

Students will:

  • Develop formal analytical skills in syntactic theory
  • Evaluate competing syntactic frameworks
  • Conduct cross-linguistic syntactic analysis
  • Apply constituent diagnostics rigorously
  • Construct phrase structure trees using X-bar and Minimalist assumptions
  • Analyze argument structure and thematic roles
  • Evaluate movement phenomena and constraints
  • Engage with syntax–semantics and syntax–morphology interfaces
  • Produce research-style analytical papers

LEARNING OUTCOMES

By the end of the course, students will be able to:

  • Construct formal syntactic representations
  • Apply theoretical models to unseen linguistic data
  • Critically compare Generative Grammar, Systemic Functional Linguistics (SFL), and Construction Grammar
  • Analyze syntactic variation in SOV, SVO, and split-ergative systems
  • Evaluate empirical arguments in contemporary syntactic literature
  • Design small-scale syntactic research projects

ASSESSMENT STRUCTURE

  • Problem Sets (30%)
  • Midterm Analytical Exam (20%)
  • Research Paper (20%)
  • Final Exam (20%)
  • Participation & Data Workshops (10%)

COURSE CONTENTS 

MODULE 1: Foundations of Syntactic Inquiry

  • What is syntax? Formal vs functional approaches
  • Competence vs performance
  • Grammaticality judgments and methodology
  • Cross-linguistic data and typology
  • Why languages have hierarchical structure
  • Descriptive vs explanatory adequacy

Extension

  • Strong Minimalist Thesis (SMT)
  • Third Factor effects (efficiency & cognitive constraints)
  • I-language vs E-language
  • Explanatory vs evolutionary adequacy
  • The biology of Merge
  • Recursion debate (Is recursion uniquely human?)

MODULE 2: Phrase Structure & X-Bar Theory

  • Constituency tests (movement, substitution, coordination, ellipsis)
  • Binary branching
  • X-bar schema
  • Specifier–Head–Complement relations
  • Adjunction
  • Endocentricity
  • Projection principles

Extension

  • Labeling Algorithm (Chomsky 2013+)
  • Symmetry and its resolution
  • Kayne’s Antisymmetry hypothesis
  • Linearization problem
  • Why structure must be asymmetric

MODULE 3: Lexical Categories & Functional Categories

  • N, V, A, P vs T, C, D, v
  • Functional projections
  • Determiner Phrase hypothesis
  • Inflectional Phrase (IP/TP)
  • Complementizer Phrase (CP)
  • Light verbs and vP

Extension

  • Feature geometry
  • Functional sequence (f-seq)
  • Cartographic hierarchy
  • Feature inheritance
  • Phase heads

MODULE 4: Argument Structure & Theta Theory

  • Thematic roles
  • Valency
  • Subcategorization
  • Unaccusative vs unergative verbs
  • Ditransitives and alternations
  • Lexical decomposition

Extension

  • vP-shell analysis
  • Lexical decomposition (CAUSE, BECOME)
  • Alternations and syntactic derivation
  • Event structure syntax
  • Argument realization principles

MODULE 5: Case Theory & Agreement

  • Structural vs inherent case
  • Nominative–accusative vs ergative–absolutive systems
  • Split ergativity (cross-linguistic focus including South Asian languages)
  • Agreement mechanisms
  • Feature valuation

Extension

  • Dependent case theory
  • Agree mechanism in Minimalism
  • Defective intervention
  • Differential object marking
  • South Asian alignment systems (micro-analysis)
  • Case as PF vs LF phenomenon debate

MODULE 6: Movement & Transformations

  • Deep vs surface structure (historical perspective)
  • Transformational rules
  • A-movement (passives, raising)
  • A'-movement (wh-movement, topicalization)
  • Constraints on movement
  • Islands
  • Subjacency

Extension

  • Relativized Minimality
  • Intervention effects
  • The Freezing Principle
  • Anti-locality
  • Successive cyclic movement
  • Criterial positions
  • Reconstruction effects

MODULE 7: The Minimalist Program (Introductory)

  • Merge
  • Move/Internal Merge
  • Economy principles
  • Feature checking
  • Phases
  • Spell-Out

Extension

  • Phase Impenetrability Condition (PIC)
  • Escape hatches (Spec-CP / Spec-vP)
  • Transfer to PF & LF
  • Edge features
  • Cyclic derivation
  • Why phases reduce computational load

MODULE 8: Clause Structure & Left Periphery

  • CP domain
  • Force, Topic, Focus
  • Embedded clauses
  • Relative clauses
  • Complement clauses

Extension

  • Rizzi’s Cartographic Program
  • Fine structure of CP
  • Embedded Force
  • Complementizer systems cross-linguistically
  • Left-periphery in South Asian languages

MODULE 9: Head, Complement, Modifier — Advanced

  • Adjunct vs complement diagnostics
  • Selectional restrictions
  • Modification hierarchies
  • Recursive structures

Extension

  • Adjunct islands
  • Late adjunction
  • Sideward movement
  • Modification hierarchies cross-linguistically

MODULE 10: Syntax–Semantics Interface

  • Compositionality
  • Scope ambiguity
  • Quantifier raising
  • Binding theory
  • Anaphora and pronouns

Extension

  • C-command formalization
  • Reconstruction
  • LF movement
  • Binding across languages
  • Information structure & scope interaction

MODULE 11: Syntax–Morphology Interface

  • Inflectional morphology
  • Agreement morphology
  • Clitics
  • Incorporation
  • Morphosyntactic alignment

Extension

  • Distributed Morphology
  • Late insertion
  • Vocabulary insertion
  • Morphological impoverishment
  • Head movement vs lowering

MODULE 12: Functional & Usage-Based Approaches

  • Systemic Functional Linguistics
  • Construction Grammar
  • Information structure
  • Theme–Rheme
  • Discourse-syntax interface

Extension

  • Comparing derivational vs constructional models
  • Usage frequency effects
  • Emergentist syntax
  • Formal vs cognitive grammar debate

MODULE 13: Cross-Linguistic & Typological Syntax

  • SOV vs SVO word order
  • Head-initial vs head-final systems
  • Greenbergian universals
  • South Asian morphosyntax
  • Language contact effects

Extension

  • Micro-parameters
  • Dialect syntax (AAVE, Singlish)
  • Parameter setting in acquisition
  • Poverty of the Stimulus debate
  • Comparative syntax case studies

MODULE 14: Computational & Corpus Syntax

  • Treebanks
  • Annotation basics
  • Dependency vs constituency grammar
  • Introduction to parsing
  • Syntax in NLP

Extension

  • Minimalist grammars (Stabler)
  • Computational complexity of recursion
  • Formal language hierarchy
  • Syntax in large language models
  • Limitations of purely statistical parsing

MODULE 15: Research Methods in Syntax

  • Designing syntactic research questions
  • Data collection methods
  • Acceptability judgments
  • Argumentation in syntax
  • Writing a formal syntactic paper

Extension

  • Magnitude estimation
  • Likert scaling
  • Experimental syntax design
  • Statistical modeling basics
  • ERP & neurolinguistic evidence
  • Field data analysis methods

ADDITIONAL MICRO-MODULE

Ellipsis & Silent Categories

  • VP ellipsis
  • Sluicing
  • Identity conditions
  • Null subjects (pro-drop)
  • Empty Category Principle
  • PRO vs pro
  • Traces vs copies


Data Challenge (Additional Component)

Students receive raw, unglossed language data and must:

  • Identify basic word order
  • Diagnose case system
  • Determine head directionality
  • Propose phrase structure
  • Justify analysis formally

RECOMMENDED READING

Core:

  • Carnie, A. Syntax: A Generative Introduction
  • Radford, A. Minimalist Syntax
  • Adger, D. Core Syntax
  • Tallerman, M. Understanding Syntax

Advanced:

  • Chomsky, N. The Minimalist Program
  • Haegeman, L. Introduction to Government and Binding Theory
  • Biber et al. Longman Grammar of Spoken and Written English
  • Baker, M. The Atoms of Language

Notes: Grammar and (Theoretical and Empirical) Syntax

CONTENTS 

MODULE 1: Foundations of Syntactic Inquiry

1.1 What Is Syntax?

Syntax is the study of how words combine to form larger linguistic structures—phrases, clauses, and sentences—and how those structures are systematically constrained. At its core, syntax investigates structure, not merely order. The question is not simply “Which word comes next?” but rather:

What invisible architecture governs how expressions are assembled and interpreted?

A central insight of modern linguistics is that sentences are not linear strings but hierarchical objects. Consider:

The professor of linguistics from Islamabad wrote a paper.

The phrase of linguistics modifies professor, not Islamabad. This interpretation arises from structural grouping, not linear proximity. Syntax therefore studies:

  • Constituency
  • Dependency
  • Structural relations (e.g., c-command)
  • Abstract features (e.g., case, agreement)
  • Movement and displacement

Syntax asks two major types of questions:

  1. Descriptive: What patterns exist in a given language?
  2. Explanatory: Why are those patterns possible, and why are others impossible?

This distinction will shape the entire discipline.

1.2 Formal vs Functional Approaches

Modern syntactic theory broadly divides into two intellectual traditions: formal and functional approaches.

1.2.1 Formal Syntax

Formal syntax treats language as a computational system. It seeks explicit rules or operations that generate well-formed expressions.

Core assumptions:

  • Grammar is mentally represented.
  • Syntax operates independently of meaning or usage frequency.
  • The goal is formal modeling of structural principles.

Generative Grammar, particularly in the Chomskyan tradition, proposes that language arises from a minimal computational operation, Merge, which recursively combines elements into hierarchical structures.

The formal approach prioritizes:

  • Structural representation
  • Abstract features
  • Economy principles
  • Universal constraints

It asks:
What must the grammar be like in order to generate infinite expressions from finite means?

1.2.2 Functional Syntax

Functional approaches emphasize language as a tool for communication. They examine how structure emerges from discourse needs, cognition, and usage patterns.

Key assumptions:

  • Structure reflects communicative function.
  • Frequency and usage shape grammar.
  • Grammar emerges from patterns of use.

Systemic Functional Linguistics (SFL), Construction Grammar, and usage-based models prioritize:

  • Information structure
  • Theme–Rheme organization
  • Discourse functions
  • Constructional meaning

The functional tradition asks:
Why do structures look the way they do from the perspective of communication and cognition?

1.2.3 A Productive Tension

The tension between formal and functional traditions is not merely methodological—it reflects different philosophical commitments:

Formal                              Functional
------                              ----------
Internal computational system       External communicative system
Universal constraints               Usage-driven patterns
Structural autonomy                 Discourse embeddedness

Rigorous syntactic inquiry requires fluency in both traditions, along with the ability to evaluate their explanatory power.

1.3 Competence vs Performance

One of the most influential distinctions in modern linguistics is that between competence and performance.

1.3.1 Linguistic Competence

Competence refers to a speaker’s internalized knowledge of grammar—the mental system that generates and interprets sentences.

Characteristics:

  • Abstract
  • Tacit
  • Systematic
  • Infinite in generative capacity

Competence explains why speakers can judge the grammaticality of sentences they have never heard before.

1.3.2 Performance

Performance refers to actual language use in real-world conditions.

It is affected by:

  • Memory limitations
  • Processing constraints
  • Distractions
  • Hesitations
  • Errors

For example:

The book that the professor that the student admired wrote was published.

Grammatically well-formed but difficult to process. The complexity arises from performance limitations, not grammatical violation.

1.3.3 Why the Distinction Matters

If syntax studies performance alone, it risks conflating:

  • Cognitive processing limitations
  • Social factors
  • Structural principles

The competence/performance distinction allows syntacticians to isolate the underlying generative system from noise introduced by usage.

Yet, modern research also asks whether competence and performance are entirely separable. Psycholinguistics and experimental syntax increasingly examine how processing data inform syntactic theory.

1.4 Grammaticality Judgments and Methodology

Syntactic theory relies heavily on intuitive judgments.

Speakers are asked whether a sentence sounds:

  • Grammatical
  • Marginal
  • Ungrammatical

Example:

  1. Who did you see? ✓
  2. Who did you wonder whether left? ✗

Judgments provide evidence for constraints such as island effects.

1.4.1 Critiques of Judgment Data

Judgment methodology faces challenges:

  • Speaker variation
  • Context sensitivity
  • Gradient acceptability
  • Experimental bias

This has led to the rise of experimental syntax, where acceptability is measured using:

  • Likert scales
  • Magnitude estimation
  • Statistical modeling

Rigorous syntactic inquiry demands methodological awareness: data must be systematically collected and analyzed.

1.5 Cross-Linguistic Data and Typology

Syntax cannot be studied through English alone.

Cross-linguistic research reveals:

  • SOV vs SVO orders
  • Ergative vs nominative systems
  • Null-subject languages
  • Rich vs poor agreement systems

Typology identifies recurring patterns and constraints. For example:

  • Most languages prefer consistent head direction.
  • Certain word orders are statistically rare.

Cross-linguistic comparison helps distinguish:

  • Universal principles
  • Language-specific parameters

It also prevents theoretical overfitting to English.

1.6 Why Languages Have Hierarchical Structure

One of the central discoveries of twentieth-century linguistics is that human language is hierarchical rather than purely linear.

Consider:

Old men and women.

Ambiguous because of structural grouping:

  1. [Old [men and women]]
  2. [[Old men] and women]

Linear order alone cannot explain interpretation.
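The effect of grouping can be made concrete with a short sketch. The encoding and function below are purely illustrative (nested tuples standing in for trees, not any standard linguistics library): given a bracketing, the function reports which nouns the adjective modifies, and the two readings give different answers from structure alone.

```python
# Two bracketings of "Old men and women", encoded as nested tuples.
# Which nouns the adjective modifies falls out of the grouping alone.

reading1 = ("old", ("men", "and", "women"))   # [Old [men and women]]
reading2 = (("old", "men"), "and", "women")   # [[Old men] and women]

def nouns(node):
    """All nouns dominated by a node (ignoring the conjunction)."""
    if isinstance(node, str):
        return [] if node == "and" else [node]
    return [n for child in node for n in nouns(child)]

def modified_nouns(tree):
    """Nouns that 'old' scopes over: exactly its structural sister."""
    if isinstance(tree, str):
        return []
    if tree[0] == "old":
        return nouns(tree[1])
    return [n for child in tree for n in modified_nouns(child)]

print(modified_nouns(reading1))   # ['men', 'women']
print(modified_nouns(reading2))   # ['men']
```

The linear string is identical in both cases; only the hierarchical grouping differs, and the interpretation tracks the grouping.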

1.6.1 Evidence for Hierarchy

Evidence includes:

  • Movement operations
  • Agreement patterns
  • Binding relations
  • Scope interpretation

Structure determines:

  • What can move
  • What can bind
  • What can be interpreted together

Hierarchy is therefore not theoretical decoration; it is empirically required.

1.6.2 Cognitive Explanation

Why hierarchy?

Because recursive structure allows:

  • Infinite expressivity
  • Embedding
  • Displacement

Human language is characterized by recursion, the ability to embed structures within structures indefinitely.

1.7 Descriptive vs Explanatory Adequacy

A grammar can be:

Descriptively Adequate

If it correctly captures observed patterns.

But a deeper goal is:

Explanatory Adequacy

A theory achieves explanatory adequacy if it explains:

  • Why children acquire grammar rapidly
  • Why certain structures are impossible
  • Why cross-linguistic patterns cluster

In modern Minimalist thought, explanatory adequacy extends further to:

  • Evolutionary plausibility
  • Cognitive efficiency
  • Computational optimality

The ultimate question becomes:

Why is human syntax structured the way it is, and could it have been otherwise?

1.8 Conclusion: The Scope of Syntactic Inquiry

Syntax stands at the intersection of:

  • Cognitive science
  • Philosophy of language
  • Formal modeling
  • Typology
  • Neuroscience

It seeks to model the internal architecture that makes human language possible.

To study syntax at an advanced level is to pursue not only patterns of sentence formation but the deeper architecture of the human mind.

    Extension

    • Strong Minimalist Thesis (SMT)
    • Third Factor effects (efficiency & cognitive constraints)
    • I-language vs E-language
    • Explanatory vs evolutionary adequacy
    • The biology of Merge
    • Recursion debate (Is recursion uniquely human?)

    MODULE 2: Phrase Structure & X-Bar Theory

    2.1 From Linear Strings to Structural Architecture

    If syntax is the study of hierarchical structure, phrase structure theory provides its first formal blueprint. Early linguistic description treated sentences as sequences of words arranged in order. Modern syntactic inquiry, however, demonstrates that sentences are structured objects composed of nested units called constituents.

    The shift from linear description to hierarchical modeling represents one of the decisive intellectual revolutions of twentieth-century linguistics.

    Consider:

    The students from Islamabad admired the professor of linguistics.

    Surface order alone does not reveal which words group together. Structural grouping determines:

    • Interpretation
    • Agreement
    • Movement possibilities
    • Substitution possibilities

    Phrase structure theory attempts to formally model this internal organization.

    2.2 Constituency: The Building Blocks of Syntax

    A constituent is a group of words that functions as a unit within a larger structure. Constituency is not intuitively obvious; it must be diagnosed through empirical tests.

    2.2.1 Movement Tests

    If a sequence can move as a unit, it likely forms a constituent.

    Base sentence:

    The professor wrote a book on syntax.

    Fronting:

    A book on syntax, the professor wrote.

    The phrase a book on syntax moves together, indicating constituency.

    However:

    On syntax wrote the professor a book. (ungrammatical)

    Movement reveals structural grouping rather than mere adjacency.

    2.2.2 Substitution Tests

    A sequence that can be replaced by a pro-form is typically a constituent.

    The professor from Islamabad arrived.

    She arrived.

    Here, The professor from Islamabad forms a noun phrase (NP).

    Similarly:

    The professor wrote a book on syntax.

    The professor did so.

    The phrase wrote a book on syntax behaves as a verb phrase (VP).

    2.2.3 Coordination Tests

    Only constituents of the same type can be coordinated.

    The professor [wrote a book] and [reviewed an article].

    Both bracketed strings are verb phrases.

    But:

    The professor [wrote] and [a book on syntax].

    This fails because the strings are not equivalent constituents.

    2.2.4 Ellipsis Tests

    Ellipsis involves omission of a constituent recoverable from context.

    The professor wrote a book, and the student did too.

    The elided material corresponds to a verb phrase.

    Ellipsis thus provides strong evidence for underlying structure.

    2.2.5 Why Constituency Matters

    Constituency reveals:

    • The internal grouping of expressions
    • The structural domains of syntactic rules
    • The hierarchical organization underlying interpretation

    Without constituency, syntax collapses into word order description.

    2.3 From Phrase Structure Rules to Structural Principles

    Early generative grammar proposed phrase structure rules such as:

    S → NP VP
    VP → V NP
    NP → Det N

    These rules generate tree structures, but they lack generality. They list patterns rather than explain structural uniformity across categories.
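    The three rules above can be run directly as a toy context-free grammar. The sketch below (lexicon entries and helper names are illustrative assumptions, not part of the theory) derives a labelled tree for a simple sentence.

```python
# The phrase structure rules S -> NP VP, VP -> V NP, NP -> Det N as a toy
# context-free grammar, plus a tiny illustrative lexicon.

rules = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "NP": [["Det", "N"]],
}

lexicon = {
    "Det": ["the", "a"],
    "N":   ["professor", "book"],
    "V":   ["wrote"],
}

def derive(symbol, words):
    """Build a labelled (symbol, children) tree for `words` from `symbol`.
    Returns (tree, remaining_words), or None if no derivation fits."""
    if symbol in lexicon:
        if words and words[0] in lexicon[symbol]:
            return (symbol, words[0]), words[1:]
        return None
    for expansion in rules[symbol]:
        children, rest = [], words
        for part in expansion:
            result = derive(part, rest)
            if result is None:
                break
            child, rest = result
            children.append(child)
        else:
            return (symbol, children), rest
    return None

tree, leftover = derive("S", "the professor wrote a book".split())
print(tree)   # nested (label, children) structure; leftover is empty
```

    Note that each category needs its own rule here: nothing in this format expresses that NP, VP, and PP share a common internal architecture, which is exactly the generalization X-Bar Theory supplies.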

    X-Bar Theory emerged to capture the shared architecture of all phrases.

    2.4 Binary Branching

    A central claim of modern phrase structure theory is that syntactic structure is binary branching: each node divides into exactly two constituents.

    Rather than:

    VP → V NP PP

    Binary structure proposes:

    VP
     ├── V'
     │    ├── V
     │    └── NP
     └── PP

    Binary branching offers several theoretical advantages:

    • Structural uniformity
    • Simplicity
    • Predictable recursion
    • Clear hierarchical relations

    Binary branching is not arbitrary; it reflects a deeper claim about the computational operation Merge, which combines two elements at a time.
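    Binary branching follows mechanically if structure is built by an operation that takes exactly two objects at a time. A minimal sketch of Merge as recursive set formation (the frozenset modelling choice and the word tokens are illustrative assumptions):

```python
# Merge combines exactly two syntactic objects into a new one, so every
# structure it builds is binary-branching by construction.

def merge(a, b):
    """External Merge: form the unordered set {a, b}."""
    return frozenset([a, b])

# Building "wrote a book" bottom-up:
np = merge("a", "book")      # {a, book}
vp = merge("wrote", np)      # {wrote, {a, book}} -- hierarchy, not a string

def depth(obj):
    """Embedding depth: words are atoms; each Merge adds one level."""
    if isinstance(obj, str):
        return 0
    return 1 + max(depth(x) for x in obj)

print(depth(vp))   # 2: the output is a hierarchical object, not a line
```

    Because merge never takes three arguments, ternary structures like V NP PP simply cannot be generated in one step; they must be decomposed into nested binary layers.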

    2.5 The X-Bar Schema

    X-Bar Theory generalizes phrase structure across all lexical categories. Instead of separate rules for NP, VP, AP, and PP, it proposes a universal schema:

    XP
     ├── Specifier
     └── X'
          ├── X (Head)
          └── Complement

    Where:

    • X = lexical head (N, V, A, P, etc.)
    • Complement = sister to the head
    • Specifier = sister to X'

    This yields three projection levels:

    1. X (Head level)
    2. X' (Intermediate level)
    3. XP (Maximal projection)

    For example:

    The professor wrote a book.

    Verb Phrase structure:

    • Head: wrote
    • Complement: a book
    • Specifier: subject (in higher projection)
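    The category-neutrality of the schema can be made explicit in code: one builder serves every category. This is a hypothetical encoding for illustration (the Node class and the pre-DP treatment of the determiner as a specifier follow the text above, not any standard toolkit):

```python
# The X-bar schema as a generic structure: any head X projects X' with its
# complement, and XP with a specifier -- the same recipe for N, V, A, P.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    label: str
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def xp(category, head, complement=None, specifier=None):
    """Build [XP Specifier [X' [X head] Complement]] for any category."""
    x = Node(category, Node(head))                # head level: X
    xbar = Node(category + "'", x, complement)    # intermediate: X'
    return Node(category + "P", specifier, xbar)  # maximal: XP

# "wrote a book": V projects V' with its NP complement.
np_ = xp("N", "book", specifier=Node("Det", Node("a")))
vp_ = xp("V", "wrote", complement=np_)
print(vp_.right.label)   # V'
```

    One function replaces the separate NP, VP, AP, and PP rules of the earlier model: the reduction in rule redundancy is the point of the schema.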

    2.6 Specifier–Head–Complement Relations

    Structural relations matter more than position.

    2.6.1 Head

    The head determines:

    • Category of the phrase
    • Selectional properties
    • Agreement features

    In:

    The destruction of the city

    The noun destruction determines that the entire phrase is a noun phrase.

    2.6.2 Complement

    Complements are:

    • Required by the head
    • Selected by the head
    • Sisters to the head

    Example:

    wrote a book

    The verb selects a direct object.

    Without it:

    The professor wrote. (incomplete in certain contexts)

    2.6.3 Specifier

    Specifiers are:

    • Structural positions external to the head-complement relation
    • Often hosts of subjects, determiners, or degree modifiers

    In:

    The professor

    The determiner the occupies a specifier-like position in DP structure (under later analyses).

    2.6.4 Hierarchical Relations

    Specifier–head–complement relations create asymmetric structure:

    • Head and complement form a unit before combining with specifier.
    • This asymmetry explains scope, agreement, and movement possibilities.

    Hierarchy determines syntactic behavior.

    2.7 Adjunction

    Not all elements are selected. Some are optional modifiers.

    Adjuncts:

    • Are not required by the head
    • Can be iterated
    • Attach recursively

    Example:

    The professor [from Islamabad] [with extensive experience] [in syntax]

    These modifiers attach through adjunction.

    Structurally:

    Adjunction adds a new layer of projection:

    XP
     ├── XP
     └── Adjunct

    Adjunction preserves endocentricity while allowing recursion.

    2.8 Endocentricity

    A structure is endocentric if it inherits its category from one of its parts.

    In X-Bar Theory:

    • Every phrase has a head.
    • The phrase takes its category from that head.

    Thus:

    • NP is headed by N
    • VP is headed by V
    • AP is headed by A

    This principle eliminates exocentric constructions (e.g., traditional S → NP VP without a head). Later theory reanalyzes sentences as projections of functional heads (e.g., T or C), preserving endocentricity at all levels.

    Endocentricity supports the claim that syntax is uniform and hierarchical.

    2.9 Projection Principles

    Projection refers to the idea that lexical properties extend into syntax.

    The Projection Principle states:

    Lexical properties must be represented at every level of syntactic representation.

    If a verb selects a complement, that selection must be visible in structure.

    For example:

    give requires two internal arguments.

    This requirement must be syntactically represented.

    Projection ensures that:

    • Argument structure is preserved
    • Selection is respected
    • Structure reflects lexical requirements

    In Minimalist terms, projection emerges from the interaction between lexical features and Merge.

    2.10 Theoretical Significance of X-Bar Theory

    X-Bar Theory accomplished several major theoretical advances:

    1. Unified phrase structure across categories
    2. Reduced rule redundancy
    3. Established hierarchical uniformity
    4. Prepared the ground for Minimalist structure building

    Later Minimalist theory simplifies X-Bar further by deriving projections from Merge and labeling rather than stipulating levels.

    Nevertheless, X-Bar remains foundational for understanding phrase structure architecture.

    2.11 From X-Bar to Modern Minimalism

    While X-Bar explicitly encodes intermediate projections (X'), later approaches argue that:

    • Structure is built through recursive Merge
    • Projection levels are emergent
    • Labeling determines category

    Binary branching and head-complement asymmetry remain central.

    Thus, phrase structure theory is not outdated. It is the structural backbone of contemporary syntax.

    2.12 Conclusion: The Architecture of Phrases

    Phrase Structure and X-Bar Theory demonstrate that syntax is:

    • Hierarchical
    • Binary
    • Endocentric
    • Feature-driven
    • Recursively generated

    These principles explain:

    • Movement
    • Agreement
    • Binding
    • Scope
    • Ambiguity

    Understanding phrase structure is essential before approaching movement, case theory, or Minimalism.

    Syntax begins with architecture.

    Extension

    • Labeling Algorithm (Chomsky 2013+)
    • Symmetry and its resolution
    • Kayne’s Antisymmetry hypothesis
    • Linearization problem
    • Why structure must be asymmetric

    MODULE 3: Lexical Categories & Functional Categories

    3.1 Introduction: The Architecture of Categories

    Syntactic structure is built from categories. But categories are not merely labels like “noun” or “verb.” They are structural positions defined by features, selectional properties, and projectional behavior.

    Modern syntax distinguishes between two broad types:

    1. Lexical (Content) Categories
    2. Functional (Grammatical) Categories

    This distinction marks one of the most significant theoretical developments in generative syntax. Earlier phrase structure theory focused largely on lexical heads (N, V, A, P). Contemporary theory recognizes that the clausal spine is dominated by functional structure: tense, agreement, complementizers, determiners, and light verbs.

    Understanding this distinction is essential for understanding clause architecture.

    3.2 Lexical Categories: N, V, A, P

    Lexical categories carry substantive semantic content. They form the conceptual core of expressions.

    3.2.1 Nouns (N)

    Nouns denote entities, individuals, concepts, or abstract objects.

    Examples:

    • professor
    • syntax
    • city
    • destruction

    Syntactic properties:

    • Combine with determiners
    • Accept adjectival modification
    • Assign thematic roles in nominalizations

    Nouns head NPs (or DPs under later theory).

    3.2.2 Verbs (V)

    Verbs denote events, states, or processes.

    Examples:

    • write
    • admire
    • collapse
    • think

    Syntactic properties:

    • Select complements
    • Assign thematic roles
    • License arguments
    • Interact with tense and agreement

    Verbs form the structural core of predicates.

    3.2.3 Adjectives (A)

    Adjectives modify nouns or function predicatively.

    Examples:

    • intelligent
    • old
    • complex

    They:

    • Form APs
    • May take complements (proud of his work)
    • Can appear in predicative positions

    3.2.4 Prepositions (P)

    Prepositions establish relational structure.

    Examples:

    • in
    • on
    • with
    • from

    They:

    • Take complements (PP → P NP)
    • Express spatial, temporal, or abstract relations

    3.2.5 Shared Properties of Lexical Categories

    Lexical categories:

    • Carry descriptive semantic content
    • Select arguments
    • Project phrase structure
    • Are open-class (new words can be added)

    But lexical categories alone cannot account for full clause structure.

    3.3 Functional Categories: T, C, D, v

    Functional categories encode grammatical information rather than lexical meaning. They:

    • Express tense, agreement, definiteness, mood
    • Structure clauses
    • Mediate between lexical content and discourse

    They are typically closed-class and feature-rich.

    3.3.1 Tense (T)

    T (formerly I for Inflection) encodes:

    • Tense (past, present, future)
    • Agreement features
    • Sometimes mood

    Example:

    The professor wrote a book.

    The past tense is not a property of write alone. It is realized in T.

    T heads TP (Tense Phrase), which replaces older IP terminology.

    3.3.2 Complementizer (C)

    C introduces clauses and encodes:

    • Clause type (declarative, interrogative)
    • Subordination
    • Force

    Examples:

    • that
    • whether
    • if

    In:

    She believes that the professor wrote a book.

    “that” occupies C.

    C heads CP, the highest clausal projection in many analyses.

    3.3.3 Determiner (D)

    Determiners encode:

    • Definiteness
    • Specificity
    • Quantification

    Examples:

    • the
    • a
    • this
    • every

    Modern theory proposes that D, not N, is the head of nominal expressions.

    3.3.4 Light Verb (v)

    The light verb v is an abstract functional head inside the verb phrase.

    It:

    • Introduces external arguments
    • Assigns accusative case
    • Encodes transitivity

    vP sits between VP and TP in modern clause structure.

    3.4 Functional Projections: Expanding the Clause

    Earlier syntactic models proposed:

    S → NP VP

    Modern syntax proposes a layered structure:

    CP
    └── TP
      └── vP
        └── VP

    Each layer contributes grammatical information:

    • VP: lexical event
    • vP: argument introduction
    • TP: tense and agreement
    • CP: clause type and discourse force

    Functional projections form the clausal spine.
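    The spine itself is just iterated head-complement embedding: each functional head takes the next layer down as its complement. A minimal sketch (the tuple encoding and placeholder VP content are illustrative assumptions):

```python
# The clausal spine CP > TP > vP > VP as nested head-complement pairs:
# each functional head takes the next layer down as its complement.

spine = ("C", ("T", ("v", ("V", "wrote a book"))))

def layers(node, acc=None):
    """Collect head labels from CP down to VP."""
    acc = [] if acc is None else acc
    if isinstance(node, tuple):
        head, complement = node
        acc.append(head)
        layers(complement, acc)
    return acc

print(layers(spine))   # ['C', 'T', 'v', 'V']
```

    Reading the list top-down recovers the division of labour in the text: C for clause type, T for tense and agreement, v for argument introduction, V for the lexical event.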

    3.5 The Determiner Phrase (DP) Hypothesis

    One of the most influential proposals in late 20th-century syntax is the DP Hypothesis.

    Traditional view:
    NP → Det N

    Revised view:
    DP → D NP

    Under this hypothesis:

    • The determiner is the head.
    • The noun phrase is its complement.

    Why propose DP?

    3.5.1 Empirical Motivation

    Several observations motivate the DP analysis:

    • Pronouns (he, she, they) behave like full noun phrases yet contain no noun; they are analyzed as D heads without NP complements.
    • It is the determiner, not the noun, that fixes the referential properties of the phrase.
    • CP and DP are structurally parallel: C heads clauses, and D heads noun phrases.

    This structural symmetry suggests deep architectural uniformity.

    3.6 Inflectional Phrase (IP) and Tense Phrase (TP)

    Earlier Government & Binding theory proposed IP (Inflectional Phrase). Later Minimalist theory relabels this as TP.

    Structure:

    TP
    ├── Spec (Subject position)
    └── T'
      ├── T
      └── vP

    T:

    • Assigns nominative case
    • Hosts agreement
    • Anchors temporal reference

    The subject moves to Spec-TP for case checking and agreement.

    This movement-based analysis reflects the shift from surface description to feature-driven syntax.

    3.7 Complementizer Phrase (CP)

    CP sits above TP and encodes clause type.

    Structure:

    CP
    ├── Spec (wh-elements, topics)
    └── C'
      ├── C
      └── TP

    Functions of CP:

    • Introduces embedded clauses
    • Hosts interrogative movement
    • Encodes illocutionary force

    In wh-questions:

    What did the professor write?

    “What” moves to Spec-CP.

    CP thus serves as the interface between clause-internal syntax and discourse.

    3.8 Light Verbs and vP Structure

    The introduction of vP revolutionized argument structure analysis.

    Earlier theory:
    VP directly contained subject and object.

    Modern analysis:

    vP
    ├── Spec (External Argument)
    └── v'
      ├── v
      └── VP

    The light verb v:

    • Introduces the external argument
    • Assigns accusative case
    • Determines transitivity

    Example:

    The professor wrote a book.

    Structure:

    • “write” forms VP with object
    • v introduces subject
    • T assigns nominative case

    This layered analysis separates:

    • Event content (VP)
    • Argument introduction (vP)
    • Temporal anchoring (TP)
    • Clause force (CP)
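The layered analysis of "The professor wrote a book" can be rendered as a simple labeled tree. This is a hedged sketch using nested tuples of the form (label, *children); the abstract T head "PAST" and the silent v head are simplifications, not a full derivation.

```python
# Sketch: simplified TP > vP > VP tree for "The professor wrote a book".
# (label, *children); terminals are strings. "PAST" and the empty v
# head are illustrative placeholders.

tree = ("TP",
        ("DP", "the professor"),
        ("T'",
         ("T", "PAST"),
         ("vP",
          ("v", ""),
          ("VP",
           ("V", "write"),
           ("DP", "a book")))))

def leaves(node):
    """Collect terminal strings left-to-right (skipping silent heads)."""
    if isinstance(node, str):
        return [node] if node else []
    return [w for child in node[1:] for w in leaves(child)]

print(leaves(tree))   # ['the professor', 'PAST', 'write', 'a book']
```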

    3.9 Why Functional Categories Matter

    Functional categories explain:

    • Why subjects move
    • Why agreement occurs
    • Why clause types differ
    • Why determiners structure nominal reference

    Without functional projections:

    • Movement would lack motivation
    • Case assignment would be mysterious
    • Clause structure would remain flat

    Functional structure reveals syntax as a highly articulated system.

    3.10 Theoretical Consequences

    The lexical-functional distinction leads to several theoretical insights:

    1. Clauses are layered hierarchically.
    2. Grammar is feature-driven.
    3. Structure is more abstract than surface order suggests.
    4. Cross-categorial symmetry (CP and DP) reflects deep architectural principles.

    Minimalist theory further reduces these structures to the interaction of features and Merge, but the lexical-functional distinction remains foundational.

    3.11 Toward a Unified Clause Architecture

    Modern clause structure:

    CP
    └── TP
      └── vP
        └── VP

    Nominal structure mirrors it:

    DP
    └── NP

    This symmetry suggests that human syntax is not a loose collection of rules but a tightly constrained architectural system.

    3.12 Conclusion

    Lexical categories provide semantic content. Functional categories provide grammatical architecture.

    The shift from:

    S → NP VP

    to

    CP → TP → vP → VP

    represents a profound deepening of syntactic theory.

    Language is not flat.
    It is layered.
    It is feature-driven.
    It is hierarchically organized.

    Understanding lexical and functional categories prepares us to examine argument structure, case theory, and movement in their full architectural context.

      Extension

      • Feature geometry
      • Functional sequence (f-seq)
      • Cartographic hierarchy
      • Feature inheritance
      • Phase heads

      MODULE 4: Argument Structure & Theta Theory

      (Advanced Theoretical and Empirical Syntax)

      1. Introduction: What Is Argument Structure?

      Argument structure lies at the heart of syntactic theory because it determines how many participants a predicate requires and how those participants are syntactically realized. The study of argument structure bridges the lexicon, syntax, and semantics, and plays a central role in frameworks ranging from Noam Chomsky’s Government and Binding theory to contemporary Minimalism and lexicalist approaches.

      At its core, argument structure answers three fundamental questions:

      1. How many arguments does a predicate select?
      2. What semantic roles do these arguments bear?
      3. How are these arguments structurally encoded in syntax?

      Understanding argument structure allows us to explain alternations (e.g., active/passive), case marking systems, cross-linguistic variation, and the mapping between meaning and form.

      2. Thematic Roles (Theta Roles)

      2.1 Definition

      Thematic roles (θ-roles) represent the semantic relationships between predicates and their arguments. They are part of the predicate’s lexical specification.

      Common thematic roles include:

      Role          Semantic Function       Example
      Agent         Initiator of action     Sara broke the glass.
      Theme         Undergoes change        Sara broke the glass.
      Experiencer   Feels/perceives         Ali fears snakes.
      Goal          Endpoint                She sent the letter to Mary.
      Source        Starting point          She came from Lahore.
      Instrument    Means                   He cut the bread with a knife.
      Beneficiary   Recipient of benefit    She baked him a cake.

      2.2 Theta Theory

      Within Government and Binding theory (see Lectures on Government and Binding), Theta Theory formalized argument structure via:

      Theta Criterion

      Each argument receives exactly one theta role.

      Each theta role is assigned to exactly one argument.

      This principle enforces one-to-one mapping between semantic roles and syntactic positions.

      Example:

      Sara gave Ali a book.

      Sara → Agent

      Ali → Goal

      a book → Theme

      Violations:

      *Sara gave Ali. (missing Theme)

      *Sara gave Ali a book a pen. (extra Theme)
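The Theta Criterion's one-to-one requirement can be stated as a small check. This is a minimal sketch; the role inventory for "give" and the pairing-by-count logic are simplifying assumptions for illustration only.

```python
# Sketch of the Theta Criterion: each argument bears exactly one theta
# role, and each theta role is assigned to exactly one argument.
# GIVE_ROLES is an assumed lexical specification for "give".

GIVE_ROLES = ["Agent", "Goal", "Theme"]

def theta_criterion_ok(roles_required, arguments):
    """One-to-one pairing: exactly as many arguments as roles, none repeated."""
    return (len(arguments) == len(roles_required)
            and len(set(arguments)) == len(arguments))

# Sara gave Ali a book.        -> well-formed
print(theta_criterion_ok(GIVE_ROLES, ["Sara", "Ali", "a book"]))           # True
# *Sara gave Ali.              -> missing Theme
print(theta_criterion_ok(GIVE_ROLES, ["Sara", "Ali"]))                     # False
# *Sara gave Ali a book a pen. -> extra Theme
print(theta_criterion_ok(GIVE_ROLES, ["Sara", "Ali", "a book", "a pen"]))  # False
```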

      2.3 Theta Roles and Structural Position

      Modern syntactic theory links thematic roles to structural configurations:

      External argument (Agent) → Spec-vP

      Internal argument (Theme) → Complement of V

      This structural encoding allows argument interpretation to emerge from syntactic hierarchy rather than arbitrary assignment.

      3. Valency

      3.1 Definition

      Valency refers to the number of arguments a verb requires.

      Borrowed from chemistry, the term conceptualizes verbs as relational centers with varying combinatorial capacity.

      3.2 Types of Valency

      Type         Description       Example
      Avalent      No arguments      It rains.
      Monovalent   One argument      She sleeps.
      Divalent     Two arguments     She likes tea.
      Trivalent    Three arguments   She gave him a gift.

      3.3 Cross-Linguistic Variation in Valency

      Languages differ in how valency is morphologically encoded:

      • Applicatives increase valency
      • Causatives introduce new external arguments
      • Antipassives reduce valency
      • Passive constructions suppress external arguments

      South Asian languages, for example, show productive causative morphology that alters valency.

      4. Subcategorization

      4.1 Lexical Selection

      Subcategorization frames specify what complements a verb selects.

      Examples:

      • sleep → [ __ ] (no complement)
      • devour → [ __ NP ]
      • believe → [ __ CP ]
      • put → [ __ NP PP ]

      Unacceptable examples:

      • *She devoured. (missing object)
      • *She put the book. (missing locative PP)

      Subcategorization is lexically encoded and constrains syntactic structure.
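Subcategorization frames lend themselves to a lexicon-style representation. The sketch below encodes the four frames listed above and checks a complement sequence against them; the exact-match logic is a simplification (real grammars allow optional complements).

```python
# Sketch: subcategorization frames as a lexicon mapping verbs to the
# categories of the complements they select (frames from the text).

SUBCAT = {
    "sleep":   [],            # [ __ ]
    "devour":  ["NP"],        # [ __ NP ]
    "believe": ["CP"],        # [ __ CP ]
    "put":     ["NP", "PP"],  # [ __ NP PP ]
}

def frame_satisfied(verb, complements):
    """Check that the complement categories match the verb's frame exactly."""
    return SUBCAT[verb] == complements

print(frame_satisfied("devour", ["NP"]))   # True:  She devoured the cake.
print(frame_satisfied("devour", []))       # False: *She devoured.
print(frame_satisfied("put", ["NP"]))      # False: *She put the book.
```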

      4.2 C-selection vs S-selection

      C-selection (Categorical Selection)

      Selection based on syntactic category.
      Example: depend selects PP.

      S-selection (Semantic Selection)

      Selection based on semantic properties.
      Example: eat selects edible objects.

      5. Unaccusative vs Unergative Verbs

      One of the most influential developments in argument structure theory is the Unaccusative Hypothesis.

      5.1 Unergative Verbs

      • Single argument = external argument (Agent)
      • Generated in Spec-vP

      Example:

      She ran.

      Structure:
      [TP She [vP She v [VP run]]]

      5.2 Unaccusative Verbs

      • Single argument = internal argument (Theme)
      • Generated as complement of V
      • Moves to subject position

      Example:

      The glass broke.

      Underlying structure:
      [VP break the glass]

      Surface:
      [TP The glass [vP t v [VP break t ]]]

      5.3 Diagnostics for Unaccusativity

      Cross-linguistic diagnostics include:

      • Auxiliary selection (Italian, Dutch)
      • Resultative constructions
      • Participle agreement
      • There-insertion in English

      Example:

      There arrived three students.

      (Allowed with unaccusatives)

      *There ran three students.

      (Unacceptable for most speakers)

      5.4 Theoretical Importance

      Unaccusativity demonstrates that:

      • Surface subject position does not determine thematic role.
      • Argument structure must be represented hierarchically.
      • Syntax reflects deeper lexical organization.

      6. Ditransitives and Alternations

      Ditransitives involve three arguments:

      She gave Ali a book.

      Two major constructions in English:

      6.1 Double Object Construction

      She gave Ali a book.

      Structure:

      [VP give Ali a book]

      6.2 Prepositional Dative Construction

      She gave a book to Ali.

      Structure:
      [VP give a book [PP to Ali]]

      6.3 The Dative Alternation

      The alternation is constrained by:

      • Animacy
      • Information structure
      • Verb class
      • Possession semantics

      Not all verbs alternate:

      • She donated money to the hospital.
      • *She donated the hospital money.

      6.4 Theoretical Analyses

      Three major accounts:

      1. Lexical derivation
      2. Structural derivation
      3. Possession-based semantic accounts

      In Minimalism, double object constructions often involve an Applicative Phrase (ApplP) introducing the Goal argument structurally.

      7. Lexical Decomposition

      7.1 Concept

      Lexical decomposition assumes that verb meaning can be broken into primitive semantic components.

      Example:

      break = CAUSE (BECOME (broken))

      This approach is central to work by Ray Jackendoff and others in conceptual semantics.

      7.2 Causative Alternation

      The glass broke.
      She broke the glass.

      Decomposition:

      • Intransitive: BECOME (glass broken)
      • Transitive: CAUSE (BECOME (glass broken))

      The alternation reflects addition of external argument (Agent/Cause).

      7.3 Event Structure

      Event decomposition often distinguishes:

      • States
      • Activities
      • Accomplishments
      • Achievements

      These event types affect:

      • Argument realization
      • Aspectual interpretation
      • Compatibility with temporal modifiers

      Example:

      She ran for an hour. (Activity)

      She built a house in a year. (Accomplishment)

      8. Mapping Between Semantics and Syntax

      The central problem in argument structure theory:

      How do semantic roles map onto syntactic positions?

      Competing approaches:

      • Lexicalist mapping rules
      • Construction-based mapping
      • Structural configurational approaches
      • Distributed morphology approaches

      Modern Minimalism tends toward structural encoding rather than role-list assignment.

      9. Cross-Linguistic Dimensions of Argument Structure

      Argument realization varies across languages:

      • Ergative alignment
      • Differential object marking
      • Applicatives
      • Serial verb constructions
      • Voice systems (Philippine-type languages)

      Languages differ in whether argument alternations are:

      • Morphologically marked
      • Syntactically derived
      • Lexically encoded

      These differences inform universality vs parameterization debates.

      10. Argument Structure in Contemporary Theory

      10.1 vP Shell Hypothesis

      Developed in joint work by Kenneth Hale and Samuel Jay Keyser, the vP shell structure encodes argument hierarchy structurally.

      Basic template:

      [vP Agent v [VP V Theme]]

      This architecture explains:

      • External argument position
      • Causative alternations
      • Ditransitive structures
      • Event decomposition

      10.2 Phases and Argument Structure

      In Minimalism (see The Minimalist Program):

      • vP is a phase.
      • Argument structure is computed locally within vP.
      • External argument introduced by little v.

      11. Empirical Methodology in Argument Structure Research

      Advanced syntactic research requires:

      • Acceptability judgment tasks
      • Corpus investigation
      • Experimental syntax
      • Cross-linguistic elicitation
      • Argument alternation testing

      Argument structure is an ideal testing ground for theory comparison because it connects lexicon, syntax, morphology, and semantics.

      12. Descriptive vs Explanatory Power in Argument Structure

      A strong theory must:

      • Predict alternation patterns
      • Explain cross-linguistic constraints
      • Capture universals of argument realization
      • Integrate event structure and syntax
      • Avoid redundant lexical stipulations

      Explanatory adequacy demands linking:

      • Cognitive architecture
      • Semantic structure
      • Universal Grammar (if assumed)
      • Interface conditions

      Concluding Synthesis

      Argument structure is not merely about listing participants in a clause. It is a window into:

      • The architecture of grammar
      • The mapping between meaning and form
      • Cross-linguistic variation
      • The nature of syntactic universals

      From Theta Theory to vP shells and lexical decomposition, the study of argument structure has transformed modern syntactic inquiry.

      It provides one of the strongest empirical domains for testing:

      • Generative models
      • Functional explanations
      • Construction-based approaches
      • Computational syntactic representations

      In advanced syntactic research, mastery of argument structure theory is indispensable for engaging in theoretical innovation and cross-linguistic discovery.

        Extension

        • vP-shell analysis
        • Lexical decomposition (CAUSE, BECOME)
        • Alternations and syntactic derivation
        • Event structure syntax
        • Argument realization principles

        MODULE 5: Case Theory & Agreement

        (Advanced Theoretical and Empirical Syntax)

        1. Introduction: Why Case Matters

        Case theory occupies a central position in modern syntactic theory because it regulates the distribution of nominal expressions and encodes core grammatical relations. Case interacts with:

        • Argument structure
        • Agreement
        • Word order
        • Alignment systems
        • Morphology
        • Information structure

        Within generative grammar, Case became formally articulated in Lectures on Government and Binding, and was later reformulated in the checking/valuation framework of The Minimalist Program by Noam Chomsky.

        The central questions are:

        1. How are nominals licensed in syntactic structure?
        2. What distinguishes structural case from inherent case?
        3. How do case systems vary across languages?
        4. How is case related to agreement and feature valuation?

        2. Structural vs Inherent Case

        2.1 Structural Case

        Structural Case is assigned purely based on syntactic configuration.

        It does not depend on thematic role but on position.

        In nominative–accusative languages like English:

        • Nominative → Spec-TP (subject position)
        • Accusative → Complement of V

        Example:

        She saw him.

        She receives nominative from T.

        him receives accusative from v/V.

        Structural case is:

        • Predictable
        • Sensitive to syntactic movement
        • Affected by passive

        Passive alternation illustrates structural case clearly:

        She saw him.

        He was seen.

        The object moves to subject position and receives nominative case instead of accusative.

        This indicates case assignment is structural, not semantic.

        2.2 Inherent Case

        Inherent Case is assigned along with a thematic role.

        It is:

        • Lexically determined
        • Not altered by passive
        • Often semantically interpretable

        Examples:

        • Dative experiencers
        • Ergative agents (in some languages)
        • Genitive case on possessors

        Example (Icelandic-type pattern):

        He-DAT likes her-NOM.

        The dative case is tied to the experiencer role and persists across syntactic alternations.

        Inherent case is typically assigned:

        • By V (alongside the theta role)
        • By functional heads with semantic content (e.g., applicatives)

        2.3 Diagnostics for Distinguishing Case Types

        Diagnostic                 Structural Case   Inherent Case
        Affected by passive        Yes               No
        Tied to semantic role      No                Yes
        Position-based             Yes               No
        Can change via movement    Yes               Usually no

        This distinction is fundamental in generative syntax but less central in functional approaches.

        3. Alignment Systems: Nominative–Accusative vs Ergative–Absolutive

        3.1 Nominative–Accusative Systems

        In nominative–accusative languages:

        • Subjects of transitive and intransitive verbs behave alike.
        • Objects of transitive verbs are marked differently.

        Pattern:

        Clause Type    Subject   Object
        Intransitive   NOM       -
        Transitive     NOM       ACC

        English is nominative–accusative:

        She runs.

        She sees him.

        The subject is treated consistently.

        3.2 Ergative–Absolutive Systems

        In ergative systems:

        • Subjects of intransitives and objects of transitives pattern together (absolutive).
        • Transitive subjects receive ergative case.

        Pattern:

        Clause Type    Subject   Object
        Intransitive   ABS       -
        Transitive     ERG       ABS

        Example (abstracted pattern):

        She-ABS runs.

        She-ERG sees him-ABS.

        This reverses the alignment logic of nominative systems.
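The two alignment patterns can be contrasted in a small function. This is a schematic sketch of the tables above, not a formal case theory: the function names and labels are illustrative.

```python
# Sketch of the two alignment patterns: nominative-accusative vs
# ergative-absolutive marking of subject and object.

def mark(subject, obj=None, alignment="nom-acc"):
    """Return (subject_case, object_case) under the given alignment."""
    transitive = obj is not None
    if alignment == "nom-acc":
        # Subjects pattern alike; transitive objects get ACC.
        return ("NOM", "ACC" if transitive else None)
    if alignment == "erg-abs":
        # Intransitive subjects and transitive objects pattern alike (ABS).
        return ("ERG" if transitive else "ABS",
                "ABS" if transitive else None)
    raise ValueError(alignment)

print(mark("she"))                        # ('NOM', None)   She runs.
print(mark("she", "him"))                 # ('NOM', 'ACC')  She sees him.
print(mark("she", alignment="erg-abs"))   # ('ABS', None)   She-ABS runs.
print(mark("she", "him", "erg-abs"))      # ('ERG', 'ABS')  She-ERG sees him-ABS.
```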

        3.3 Theoretical Significance

        Ergative alignment challenges:

        • Traditional subject-object asymmetry
        • Universal subject-based theories
        • Uniform case assignment mechanisms

        It raises the question:

        Is ergativity syntactic, morphological, or structural?

        Competing analyses propose:

        • Ergative as inherent case
        • Ergative assigned by v
        • Ergative as dependent case
        • Ergativity emerging at PF (morphological)

        4. Split Ergativity

        Split ergativity refers to systems where alignment changes depending on:

        • Tense/aspect
        • Person hierarchy
        • Clause type
        • Nominal type

        4.1 Aspect-Based Split (South Asian Focus)

        Many Indo-Aryan languages exhibit ergativity in perfective contexts but not in imperfective contexts.

        In languages such as Hindi-Urdu:

        Imperfective:

        She-NOM eats bread.

        Perfective:

        She-ERG bread-ABS ate.

        This system reflects aspect-based ergativity.

        4.2 Why Splits Occur

        Explanations include:

        1. Structural position of external argument
        2. Aspectual head interaction
        3. Differential case licensing
        4. Morphological alignment shifts

        In some analyses, ergative case is assigned only when v is associated with perfective aspect.
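The aspect-conditioned analysis just mentioned can be stated as a one-rule sketch. This is purely illustrative of the Hindi-Urdu-type pattern described above, under the assumption that ergative appears only on perfective transitive subjects.

```python
# Sketch of aspect-based split ergativity (Hindi-Urdu-type pattern):
# ERG on the subject only when the clause is transitive AND perfective.

def subject_case(transitive, aspect):
    """Assumed split: ergative only for perfective transitives."""
    if transitive and aspect == "perfective":
        return "ERG"
    return "NOM"

print(subject_case(True, "imperfective"))  # NOM: She-NOM eats bread.
print(subject_case(True, "perfective"))    # ERG: She-ERG bread-ABS ate.
print(subject_case(False, "perfective"))   # NOM: intransitive subject
```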

        4.3 Person-Based Splits

        In some languages:

        • 1st/2nd person → nominative alignment
        • 3rd person → ergative alignment

        This suggests interaction between:

        • Person features
        • Case assignment
        • Agreement hierarchy

        4.4 Theoretical Approaches to Split Ergativity

        Major proposals:

        • Dependent case theory
        • Inherent case assignment
        • Structural case variation
        • Phase-based licensing

        Split ergativity provides critical testing ground for syntactic theory, especially in understanding how case interacts with aspect and agreement.

        5. Agreement Mechanisms

        Agreement reflects feature sharing between syntactic elements.

        Typical agreement features:

        • Person
        • Number
        • Gender
        • Case (in some languages)

        Example:

        She runs.

        They run.

        Agreement is typically between:

        • T and subject
        • v and object (in some languages)
        • Adjectives and nouns
        • Complementizers and subjects (in some languages)

        5.1 Structural Conditions on Agreement

        Agreement typically occurs under:

        • Spec-head configuration (earlier theories)
        • Probe–goal relation (Minimalism)

        Agreement does not necessarily require adjacency.

        Long-distance agreement occurs in some languages.

        5.2 Agreement in Ergative Systems

        Ergative languages vary:

        • Some agree with absolutive argument.
        • Some show split agreement.
        • Some agree with highest accessible DP.

        South Asian languages often show agreement with absolutive objects in perfective clauses.

        This interaction between case and agreement reveals deeper structural mechanisms.

        6. Feature Valuation in Minimalism

        In Minimalist syntax, case and agreement are unified under feature checking and valuation.

        Core assumptions from The Minimalist Program:

        • Functional heads carry uninterpretable features.
        • These must be valued and deleted before Spell-Out.
        • Agreement is a byproduct of feature valuation.

        6.1 Probe–Goal System

        A head with unvalued features (probe) searches downward for a matching element (goal).

        Example:

        T has:

        • [uPerson]
        • [uNumber]
        • [uCase]

        DP has:

        • [Person]
        • [Number]
        • [Case: unvalued]

        Process:

        1. T probes DP.
        2. Agreement features are valued.
        3. Case is assigned to DP.
        4. Uninterpretable features deleted.

        Thus:

        Case assignment = part of agreement.
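The probe-goal steps listed above can be simulated with two feature bundles. This is a toy sketch under simplifying assumptions: features are dictionary entries, "unvalued" is None, and case assignment piggybacks on phi-valuation exactly as the four-step process describes.

```python
# Toy probe-goal simulation: T's unvalued phi-features are valued from
# the DP, and the DP's unvalued Case is valued nominative in return.

def agree(probe, goal):
    """Value the probe's unvalued phi-features from the goal;
    value the goal's Case from the probe (feature valuation)."""
    for feat in ("person", "number"):
        if probe[feat] is None:        # unvalued on the probe
            probe[feat] = goal[feat]   # valued under Agree
    if goal["case"] is None:
        goal["case"] = probe["assigns_case"]
    return probe, goal

T  = {"person": None, "number": None, "assigns_case": "NOM"}
DP = {"person": 3, "number": "sg", "case": None}

agree(T, DP)
print(T["person"], T["number"], DP["case"])   # 3 sg NOM
```

After the derivation, T's agreement features and the DP's Case are both valued by a single Agree step, which is the sense in which case assignment is part of agreement.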

        6.2 Structural Case as Valuation

        Under this view:

        • Nominative = valued by T
        • Accusative = valued by v

        Case is not a separate module but an effect of feature interaction.

        6.3 Inherent Case Under Feature Valuation

        Inherent case may be:

        • Lexically specified
        • Assigned prior to structural case
        • Resistant to structural revaluation

        This explains why inherent case persists in passives.

        7. Dependent Case Theory (Alternative Model)

        Dependent case theory proposes:

        • Case is assigned based on relative configuration between DPs.
        • No need for abstract Case features on T/v.

        Basic idea:

        When two DPs occur in the same domain, one receives dependent case (accusative or ergative).

        This approach offers:

        • A unified account of ergative and accusative systems
        • A reduction in theoretical machinery

        It challenges earlier Case Filter assumptions.
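The core idea of dependent case can be sketched in a few lines. This is a schematic illustration only: it assumes a hierarchy-ordered list of DPs in one domain and ignores domains, obliques, and unmarked-case morphology.

```python
# Sketch of dependent case: when two DPs share a domain, one receives
# the dependent case - ACC on the lower DP (accusative systems) or
# ERG on the higher DP (ergative systems). A lone DP stays unmarked.

def dependent_case(dps, language="nom-acc"):
    """Assign case to a hierarchy-ordered list of DPs (highest first)."""
    if len(dps) < 2:
        return {dp: "unmarked" for dp in dps}
    if language == "nom-acc":
        return {dps[0]: "unmarked(NOM)", dps[1]: "dependent(ACC)"}
    return {dps[0]: "dependent(ERG)", dps[1]: "unmarked(ABS)"}

print(dependent_case(["she"]))                    # single DP: unmarked
print(dependent_case(["she", "him"]))             # ACC on the lower DP
print(dependent_case(["she", "him"], "erg-abs"))  # ERG on the higher DP
```

Note that no head (T or v) assigns case here; the configuration of the two DPs alone determines the marking, which is the theory's central departure from the valuation model.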

        8. Case, Agreement, and Hierarchy

        Cross-linguistic data shows:

        • Agreement targets the highest accessible DP.
        • Case marking may not align with the agreement target.
        • Morphological case does not necessarily coincide with abstract Case.

        Examples:

        • Object agreement in some languages
        • Differential object marking
        • Case stacking (in Australian languages)

        These phenomena suggest that:

        Case, agreement, and grammatical relations are separable components.

        9. Empirical Methods in Case & Agreement Research

        Advanced research involves:

        • Elicitation of minimal pairs
        • Corpus analysis
        • Experimental judgment studies
        • Morphosyntactic alignment comparison
        • Feature interaction modeling

        Split ergativity and agreement mismatches are particularly fruitful for research.

        10. Theoretical Implications

        Case and agreement research informs:

        • The architecture of grammar
        • The nature of functional heads
        • Universal Grammar vs typological variation
        • Interface between syntax and morphology
        • Computational modeling of feature systems

        Key theoretical tensions include:

        • Structural vs lexical case
        • Agreement as syntactic vs morphological
        • Parameterization vs universal hierarchy
        • Economy vs richness of representation

        Concluding Synthesis

        Case theory began as a licensing mechanism ensuring that nominals appear in appropriate structural configurations. It has evolved into a broader theory of:

        • Alignment systems
        • Feature valuation
        • Agreement mechanisms
        • Cross-linguistic variation

        From nominative–accusative to split-ergative systems, from structural case to probe–goal valuation, the study of case and agreement remains one of the most empirically rich and theoretically generative domains in syntax.

        For advanced scholarship, mastery of case theory is indispensable for understanding:

        • Argument structure
        • Movement
        • Agreement
        • Interface conditions
        • Cross-linguistic typology

          Extension

          • Dependent case theory
          • Agree mechanism in Minimalism
          • Defective intervention
          • Differential object marking
          • South Asian alignment systems (micro-analysis)
          • Case as PF vs LF phenomenon debate

          MODULE 6: Movement & Transformations

          (Advanced Theoretical and Empirical Syntax)

          1. Introduction: Why Movement?

          Movement is one of the most powerful and controversial hypotheses in modern syntactic theory. It attempts to explain why linguistic expressions sometimes appear displaced from the position where they receive their interpretation.

          Consider:

          What did Sara buy __ ?

          The wh-phrase what is interpreted as the object of buy, yet it appears clause-initially. Movement theory proposes that such displacement is not arbitrary but derives from systematic structural operations.

          The concept of movement was foundational in Transformational-Generative Grammar, particularly in the early work of Noam Chomsky, and was formally developed in Syntactic Structures and later in Lectures on Government and Binding. It was subsequently reformulated under the economy-driven architecture of The Minimalist Program.

          This lecture traces the evolution of movement theory from early transformational rules to modern minimalist Internal Merge, covering A-movement, A′-movement, and locality constraints such as Subjacency and island effects.

          2. Deep Structure vs Surface Structure: A Historical Perspective

          2.1 Early Transformational Grammar

          In early generative grammar, syntax was organized into levels:

          Deep Structure (D-structure)

          Encoded basic argument relations and semantic interpretation.

          Surface Structure (S-structure)

          Reflected the final word order after transformations.

          Movement was conceived as a transformational rule mapping D-structure to S-structure.

          Example:

          Deep Structure:

          You will see who.

          Transformation (Wh-Movement):

          Who will you see?

          This dual-level architecture allowed:

          • Semantic interpretation at D-structure
          • Phonological interpretation at S-structure

          2.2 Criticisms and Evolution

          Over time, this architecture was simplified:

          • D-structure and S-structure were eliminated.
          • Movement became a structure-building operation.
          • Economy principles replaced arbitrary transformations.

          In Minimalism, movement is no longer a rule applied between levels; instead, it is Internal Merge, operating within a single derivational workspace.

          3. Transformational Rules

          3.1 Classical Transformations

          Early transformational rules included:

          • Passive transformation
          • Wh-movement
          • Raising
          • Extraposition
          • Topicalization

          Passive example:

          Deep:

          Someone arrested Ali.

          Transformation:

          Ali was arrested.

          The object moved to subject position, and auxiliary insertion occurred.

          Transformations were:

          • Ordered
          • Rule-specific
          • Construction-dependent

          3.2 From Transformations to Move-α

          Government and Binding theory simplified transformations into a general rule:

          Move-α (Move anything anywhere)

          Movement was constrained not by rule type but by:

          • Structural principles
          • Case theory
          • Bounding theory
          • Binding theory

          3.3 Movement in Minimalism

          Minimalism reconceptualized movement as:

          • Internal Merge
          • Triggered by feature checking
          • Constrained by economy principles

          Movement occurs only when necessary to value uninterpretable features.

          Thus, displacement is not stylistic but driven by feature satisfaction.

          4. A-Movement (Argument Movement)

          A-movement moves DPs into argument positions (A-positions).

          Typical landing sites:

          • Spec-TP (subject position)
          • Spec-vP (external argument position)

          4.1 Passive Movement

          Active:

          Sara saw Ali.

          Passive:

          Ali was seen.

          Structure:

          1. Ali originates as object.
          2. v does not assign accusative.
          3. T probes for nominative.
          4. Ali moves to Spec-TP to receive case.

          Passive illustrates:

          • Case-driven movement
          • Interaction between movement and argument structure
          • Structural case licensing

          4.2 Raising Constructions

          Raising verbs:

          Ali seems to be tired.

          Key properties:

          • Ali originates in embedded clause.
          • It receives no theta role from seem.
          • It moves to matrix Spec-TP.

          Evidence:

          There seems to be a problem.

          Seem does not assign an external theta role.

          Thus raising differs from control.

          4.3 Properties of A-Movement

          • Changes grammatical function
          • Affects binding relations
          • Feeds case assignment
          • Does not create scope ambiguities in the same way as A′-movement

          A-movement is structurally necessary.

          5. A′-Movement (Non-Argument Movement)

          A′-movement targets non-argument positions.

          Typical landing sites:

          • Spec-CP
          • Topic position
          • Focus position

          5.1 Wh-Movement

          What did she buy?

          Steps:

          1. What originates as object.
          2. Moves to Spec-CP.
          3. Leaves trace/copy.

          Properties:

          • Creates filler–gap dependency
          • Sensitive to island constraints
          • Produces scope effects

          5.2 Topicalization

          This book, I have read.

          Topic movement:

          • Targets left periphery
          • Discourse-driven
          • Not case-related

          5.3 Differences Between A- and A′-Movement

          Property                            A-Movement          A′-Movement
          Landing site                        Argument position   Non-argument position
          Case-related                        Yes                 No
          Creates new binding possibilities   Yes                 No
          Scope effects                       Limited             Strong

          This distinction is fundamental in generative syntax.

          6. Constraints on Movement

          Movement is not unrestricted. It obeys locality constraints.

          The central question:

          Why are some extractions impossible?

          Compare:

          What did you say that she bought?

          vs

          *What did you wonder whether she bought?

          The second is degraded due to structural constraints.

          7. Island Constraints

          An island is a syntactic domain from which extraction is blocked.

          Common islands include:

          • Complex NP islands
          • Adjunct islands
          • Subject islands
          • Coordinate structure islands

          7.1 Complex NP Constraint

          *What did you hear the rumor that she bought __ ?

          Extraction from within a noun phrase is prohibited.

          7.2 Adjunct Island

          *What did she leave because you bought __ ?

          Extraction from adjunct clauses is disallowed.

          7.3 Coordinate Structure Constraint

          *What did she buy apples and __ ?

          Extraction from only one conjunct is prohibited.

          Island effects are:

          • Robust across languages
          • Highly systematic
          • Theoretically central

          8. Subjacency

          Subjacency was proposed in Government and Binding theory to formalize locality.

          It states:

          Movement cannot cross more than one bounding node at a time.

          Bounding nodes (in English):

          • NP
          • TP

          If movement crosses two bounding nodes in a single step, the derivation crashes.

          Subjacency explains why:

          *What did you hear the rumor that she bought __ ?

          is unacceptable.

          8.1 Successive Cyclic Movement

          Movement proceeds through intermediate landing sites (e.g., Spec-CP).

          Thus:

          What do you think she bought?

          is derived via successive cyclic movement:

          • Object → embedded Spec-CP
          • → matrix Spec-CP

          This avoids violating Subjacency.

          9. From Subjacency to Phases

          Minimalism replaced Subjacency with phase theory.

          Key idea:

          Certain domains are phases:

          • CP
          • vP

          Movement must proceed via phase edges.

          This reformulation maintains locality while simplifying theoretical machinery.

          10. Copies vs Traces

          Earlier theories posited traces (t).

          Minimalism proposes copies:

          Movement leaves full copies, but only one is pronounced.

          Interpretation may access lower copies for:

          • Scope
          • Reconstruction
          • Binding

          Example:

          Which picture of himself did Ali like?

          Binding interpretation relies on lower copy.

          11. Empirical Evidence for Movement

          Evidence comes from:

          • Binding theory
          • Scope ambiguity
          • Parasitic gaps
          • Cross-linguistic extraction patterns
          • Reconstruction effects

          Movement is not merely descriptive; it predicts systematic dependencies.

          12. Theoretical Debates

          Movement theory remains contested:

          • Are all dependencies derived via movement?
          • Can some be base-generated?
          • Is movement purely feature-driven?
          • Are island constraints syntactic or processing-based?

          Alternative approaches include:

          • Construction grammar
          • Dependency grammar
          • Processing-based explanations

          Yet movement remains central in generative syntax.

          Concluding Synthesis

          Movement theory evolved from rule-based transformations to feature-driven Internal Merge.

          It explains:

          • Passive constructions
          • Raising
          • Wh-questions
          • Topicalization
          • Cross-linguistic displacement patterns

          Constraints such as Subjacency and island conditions reveal that movement is:

          • Structurally local
          • Hierarchically constrained
          • Systematically regulated

          From Deep Structure and Surface Structure to phases and copies, the theory of movement represents one of the most sophisticated developments in modern syntactic thought.

          Mastery of movement theory is essential for engaging with advanced research in:

          • Clause structure
          • Agreement
          • Information structure
          • Syntax–semantics interface
          • Computational parsing

            Extension

            • Relativized Minimality
            • Intervention effects
            • The Freezing Principle
            • Anti-locality
            • Successive cyclic movement
            • Criterial positions
            • Reconstruction effects

            MODULE 7: The Minimalist Program (Introductory)

            (Advanced Theoretical and Empirical Syntax)

            1. Introduction: From Government & Binding to Minimalism

            The Minimalist Program represents a major reconceptualization of generative syntax. Developed by Noam Chomsky in the early 1990s and articulated in his 1995 book The Minimalist Program, Minimalism seeks to simplify grammatical theory by asking:

            How little structure is necessary to account for linguistic competence?

            Unlike earlier models (e.g., Government and Binding), Minimalism does not present a fully specified theory but rather a research program guided by economy, interface conditions, and computational efficiency.

            Its foundational assumptions include:

            • Language is an optimal computational system.
            • Syntax builds hierarchical structure via simple operations.
            • Linguistic expressions must satisfy interface conditions at:
              • the Conceptual–Intentional (C-I) interface (semantics)
              • the Sensory–Motor (SM) interface (phonology)

            Minimalism reduces grammatical machinery to a small set of operations and principles, the most central of which are:

            • Merge
            • Internal Merge
            • Feature checking/valuation
            • Economy principles
            • Phases
            • Spell-Out

            2. Merge: The Fundamental Structure-Building Operation

            2.1 Definition

            Merge is the basic combinatorial operation of syntax.

            It takes two syntactic objects and combines them into a new object:

            Merge(X, Y) → {X, Y}

            Example:

            Merge read and books:

            → {read, books}

            This forms a hierarchical structure (VP).

            2.2 Properties of Merge

            Merge is:

            • Binary
            • Recursive
            • Structure-building
            • Unbounded

            Recursion explains infinite generativity:

            [CP That [TP she [vP read [DP the book]]]]

            Through repeated Merge, language produces unbounded hierarchical structures.
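Since the course also targets computational modeling, the recursive character of Merge can be made concrete in code. The sketch below is only an illustration (the `merge` and `depth` helpers are invented for this example, not part of any standard library); syntactic objects are strings or frozensets, mirroring the unordered {X, Y} notation:

```python
def merge(x, y):
    """Merge(X, Y) -> {X, Y}: combine two syntactic objects into one.
    frozenset mirrors the unordered set notation of bare Merge."""
    return frozenset([x, y])

# Build {that, {she, {read, {the, book}}}} by repeated (recursive) Merge:
dp = merge("the", "book")
vp = merge("read", dp)
tp = merge("she", vp)
cp = merge("that", tp)

def depth(obj):
    """Hierarchical depth: lexical items count 0; each Merge adds a layer."""
    if isinstance(obj, str):
        return 0
    return 1 + max(depth(member) for member in obj)

print(depth(cp))  # four applications of Merge -> depth 4
```

Because Merge is binary and recursive, nothing in the sketch stops a further CP from being embedded inside the structure, which is the formal source of unbounded generativity.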

            2.3 External Merge

            External Merge combines two distinct elements:

            • Verb + Object
            • T + vP
            • C + TP

            It introduces new material into the derivation.

            3. Move as Internal Merge

            Earlier theories treated movement as a separate operation (Move-α). Minimalism reduces movement to a special case of Merge.

            3.1 Internal Merge

            Internal Merge re-merges an element already present in the structure.

            Example:

            Wh-question:

            What did she buy?

            Step 1: Merge buy + what
            Step 2: Merge T
            Step 3: Internal Merge of what to Spec-CP

            Thus, movement is simply Merge applied internally.

            3.2 Copies Instead of Traces

            Internal Merge leaves copies rather than abstract traces.

            Structure contains:

            • Higher copy (pronounced)
            • Lower copy (interpreted)

            This accounts for:

            • Reconstruction effects
            • Scope ambiguity
            • Binding phenomena

            Movement is therefore not displacement but re-merging.
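A minimal sketch of the copy theory, under simplifying assumptions (the helper names are hypothetical, did-support and intermediate landing sites are omitted): Internal Merge re-merges an element at the left edge, while its lower occurrence remains in the structure as an unpronounced copy.

```python
def internal_merge(words, element):
    """Re-merge `element` (already present in `words`) at the left edge.
    Both copies stay in the structure; the lower one is marked silent."""
    assert element in words, "Internal Merge reuses existing material"
    result = [(element, True)]                    # higher copy: pronounced
    result += [(w, w != element) for w in words]  # lower copy: silent
    return result

def spell_out(structure):
    """Pronounce only the copies marked True (the highest occurrence)."""
    return " ".join(word for word, pronounced in structure if pronounced)

# 'What did she buy?' from underlying 'she buy what' (auxiliary ignored):
derivation = internal_merge(["she", "buy", "what"], "what")
print(spell_out(derivation))          # -> what she buy
print(("what", False) in derivation)  # the silent lower copy is still there
```

The lower copy is not pronounced, but it remains available to the model, which is exactly what reconstruction and binding interpretations require.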

            4. Economy Principles

            Minimalism assumes that the computational system is optimal and economical.

            4.1 Derivational Economy

            Operations must occur only if necessary.

            No superfluous movement.

            Movement must be feature-driven.

            4.2 Representational Economy

            Structures must avoid unnecessary elements.

            No redundant projections.
            No stipulative transformations.

            4.3 Last Resort

            Movement occurs only if required to satisfy feature checking.

            Example:

            A DP moves to Spec-TP only if it must check nominative case.

            4.4 Inclusiveness Condition

            The derivation cannot introduce new features not already present in the lexical items.

            All features must originate in the lexicon.

            5. Feature Checking and Valuation

            One of Minimalism’s central innovations is treating syntactic operations as feature-driven.

            5.1 Interpretable vs Uninterpretable Features

            Lexical items contain two kinds of features:

            • Interpretable (semantic): person, number, gender
            • Uninterpretable (formal): case, agreement triggers

            Uninterpretable features must be checked and deleted before Spell-Out.

            5.2 Probe–Goal Mechanism

            A head with unvalued features (probe) searches its c-command domain for a matching goal.

            Example:

            T has: [uPerson], [uNumber]
            DP has: [Person: 3], [Number: sg]

            Agreement proceeds as follows:

            • T probes the DP.
            • The features are valued.
            • Case is assigned.
            • Movement may occur if required for feature satisfaction.

            5.3 Case as Feature Valuation

            Structural case is not an independent module.

            • Nominative → valued by T
            • Accusative → valued by v

            Case assignment is a reflex of agreement relations.
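The probe-goal relation and its case reflex can be sketched computationally. This is an illustrative toy only: the dictionaries and the `agree` helper are hypothetical, with `None` standing in for an unvalued ([u...]) feature.

```python
def agree(probe, goal):
    """Value the probe's unvalued phi-features against the goal, then
    value the goal's case feature as a reflex of the agreement relation."""
    for feature, value in probe.items():
        if value is None and feature in goal:  # unvalued: [uPerson], [uNumber]
            probe[feature] = goal[feature]
    goal["case"] = probe["assigns_case"]

T = {"person": None, "number": None, "assigns_case": "NOM"}   # the probe
dp = {"person": 3, "number": "sg", "case": None}              # the goal

agree(T, dp)
print(T["person"], T["number"], dp["case"])  # -> 3 sg NOM
```

Swapping the probe for a v head with `"assigns_case": "ACC"` would model accusative valuation in the same way, reflecting the idea that case is a by-product of agreement rather than a separate module.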

            6. Phases

            Phase theory was introduced to constrain derivations and locality.

            Certain domains are “phases”:

            • vP
            • CP

            Phases are cyclic computational units.

            6.1 Phase Impenetrability Condition (PIC)

            Once a phase is completed, its complement becomes inaccessible to higher operations.

            Only the edge (specifier/head) remains accessible.

            This explains:

            • Locality constraints
            • Successive cyclic movement
            • Island effects

            6.2 Movement Through Phases

            Example:

            What do you think she bought?

            Steps:

            1. Object originates in VP.
            2. Moves to embedded Spec-CP (phase edge).
            3. Moves to matrix Spec-CP.

            Movement must proceed via phase edges.

            This replaces Subjacency from earlier models.
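The phase-edge logic in the steps above can be modeled minimally. The code below is an assumption-laden sketch (the phase dictionary and both helpers are invented for illustration): once a phase is completed, only its edge remains accessible, so a wh-phrase must stop off at Spec-CP to move on.

```python
def complete_phase(phase):
    """Transfer the phase's complement: only the edge stays accessible."""
    phase["accessible"] = set(phase["edge"])

def can_extract(phase, position):
    """A position can feed further movement only if still accessible."""
    return position in phase["accessible"]

embedded_cp = {
    "edge": ["C", "Spec-CP"],          # the escape hatch
    "complement": ["TP", "vP", "VP"],  # transferred material
    "accessible": None,
}
complete_phase(embedded_cp)

print(can_extract(embedded_cp, "Spec-CP"))  # True: edge is still visible
print(can_extract(embedded_cp, "VP"))       # False: frozen by the PIC
```

An element that failed to reach the edge before the phase was completed is trapped, which is the phase-theoretic reconstruction of Subjacency-style locality.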

            7. Spell-Out and Interfaces

            Minimalism assumes syntax interfaces with two systems:

            • Conceptual–Intentional (meaning)
            • Sensory–Motor (sound)

            Spell-Out sends structure to these interfaces.

            7.1 Cyclic Spell-Out

            At each phase:

            • Complement of phase head is spelled out.
            • Structure becomes inaccessible to further syntax.

            Thus derivations proceed cyclically.

            7.2 Logical Form and Phonological Form

            Earlier theories distinguished:

            • LF (Logical Form)
            • PF (Phonological Form)

            Minimalism reconceptualizes these as interface outputs rather than levels of syntactic representation.

            8. Derivation Overview

            A simplified derivation:

            1. Select lexical items from Numeration.
            2. External Merge builds structure.
            3. Probe–Goal relations value features.
            4. Internal Merge occurs if required.
            5. Phase heads trigger Spell-Out.
            6. Structure reaches interfaces.

            Syntax is thus a computational procedure.

            9. Theoretical Significance

            Minimalism seeks explanatory adequacy by reducing theoretical machinery.

            It aims to explain:

            • Why movement exists
            • Why locality constraints hold
            • Why agreement systems behave systematically
            • Why hierarchical structure is universal

            Instead of adding rules, it asks:

            Are these properties necessary consequences of optimal computation?

            10. Empirical Contributions

            Minimalism has influenced:

            • Cross-linguistic parameterization
            • Agreement research
            • Ergativity analysis
            • Clause structure theory
            • Computational syntax
            • Interface research

            It provides tools for analyzing:

            • Split ergativity
            • Long-distance agreement
            • Clause typing
            • Morphosyntactic variation

            11. Critiques and Ongoing Debates

            Minimalism has faced criticisms:

            • Abstractness
            • Overreliance on invisible features
            • Empirical underdetermination
            • Alternative functional explanations

            Debates continue over:

            • Nature of features
            • Existence of phases
            • Necessity of movement
            • Learnability of feature systems

            Yet it remains the dominant formal framework in theoretical syntax.

            Concluding Synthesis

            The Minimalist Program reframes syntax as:

            • A computational system
            • Driven by Merge
            • Constrained by economy
            • Regulated by feature valuation
            • Cyclic via phases
            • Interfacing with meaning and sound through Spell-Out

            Movement becomes Internal Merge.
            Case becomes feature valuation.
            Locality becomes phase-based cyclicity.

            Minimalism reduces syntactic theory to a small set of powerful operations and principles, seeking to explain not only how language works, but why it must work that way given constraints of optimal design.

              Extension

              • Phase Impenetrability Condition (PIC)
              • Escape hatches (Spec-CP / Spec-vP)
              • Transfer to PF & LF
              • Edge features
              • Cyclic derivation
              • Why phases reduce computational load

              MODULE 8: Clause Structure & Left Periphery

              1. Clause Structure Overview

              A clause in generative syntax is often analyzed as having a hierarchical structure:

              [CP [TP [vP ... ]]]

              • CP (Complementizer Phrase): The highest projection, often called the left periphery, marking clause type and discourse features.
              • TP (Tense Phrase): Houses tense, modality, and subject features.
              • vP (little v Phrase): Contains the verb, object, and internal argument structure.

              The left periphery is especially important for encoding information structure: topic, focus, and clause force.

              2. The CP Domain / Left Periphery

              The CP domain sits at the top of the clause and can contain multiple functional projections:

              [ForceP [TopicP [FocusP [FinP [TP ... ]]]]]

              a) Force

              Marks the clause type: declarative, interrogative, exclamative, etc.

              Often realized by complementizers like that, if, whether.

              Example:

              Declarative: I think that she left. (that = Force marker)

              Interrogative: I wonder if she left. (if = Force marker)

              b) Topic

              Marks what the sentence is about (given information).

              Usually preposed to the left periphery, above focus.

              Example:

              As for the weather, it’s going to rain tomorrow. (as for the weather = Topic)

              c) Focus

              Marks new or contrastive information.
              Can trigger movement to the left periphery.

              Example:
              It is John who ate the cake. (John = Focus)

              3. Embedded Clauses

              Embedded clauses are clauses that function as arguments or modifiers within another clause. They typically occupy the CP domain as well.

              a) Relative Clauses

              • Modify a noun phrase (NP).
              • Often introduced by a relative pronoun (who, which, that).
              • Structure:

              NP [CP [RelPronoun ... [TP ...]]]

              Example:

              The book that I read was fascinating.

              that I read = relative clause modifying book

              b) Complement Clauses

              Serve as complements to verbs, adjectives, or nouns.
              Often introduced by that, whether, if.

              Types:

              Declarative complements:
              I believe that she is honest.

              Interrogative complements:
              I wonder whether she will come.

              Exclamative complements:
              I know what a mess this is!

              4. Summary Table: CP Functional Hierarchy

              Projection | Function | Example
              ForceP | Clause type / illocutionary force | that, if, whether
              TopicP | Given / known information | As for the weather
              FocusP | New / contrastive information | It is John who ate it
              TP | Tense / subject agreement | She left yesterday
              vP | Verb phrase / internal arguments | ate the cake

              5. Key Points

              • The left periphery (CP domain) encodes both clause type (Force) and information structure (Topic & Focus).
              • Embedded clauses (relative or complement) have their own CP projections.
              • Movement to the left periphery is often triggered by focus or topicalization.

                Extension

                • Rizzi’s Cartographic Program
                • Fine structure of CP
                • Embedded Force
                • Complementizer systems cross-linguistically
                • Left-periphery in South Asian languages

                MODULE 9: Head, Complement, Modifier — Advanced

                1. Head, Complement, Modifier Overview

                In X-bar theory and modern syntax:

                Head (X⁰): The core of a phrase that determines its type and properties.

                Examples:

                N in NP → book

                V in VP → eat

                A in AP → happy

                Complement: A phrase required by the head to complete its argument structure.

                Typically sisters to the head in a phrase structure.

                Example:

                eat [an apple] (an apple = complement of V)

                Modifier / Adjunct: Optional elements that provide extra information, usually added hierarchically outside the head-complement structure.

                Example:

                eat [an apple] [quickly] (quickly = modifier/adjunct)

                2. Adjunct vs. Complement Diagnostics

                a) Obligatoriness

                Complement: Required by the head.
                *John devoured __. → an object is required: devoured what?

                Adjunct: Optional; can be omitted.
                John devoured an apple (quickly). → quickly is optional

                b) Iterability

                Complement: Usually appears once per head (unless the head allows multiple arguments).
                Adjunct: Can appear multiple times.

                John devoured an apple quickly, happily, and silently. → multiple adjuncts possible

                c) Order flexibility

                Complement: Fixed position relative to head.
                Adjunct: Often flexible, can appear pre- or post-head depending on language.

                d) Modification by questions

                Complement: Answers “what?” or “who?” about the head.

                Adjunct: Answers “how?”, “when?”, “where?”, “why?”

                Examples:

                Type | Phrase | Test/Comment
                Complement | eat [an apple] | obligatory; answers “what?”
                Adjunct | eat [an apple] [quickly] | optional; answers “how?”

                3. Selectional Restrictions

                • Heads select for specific complements; this is part of their lexical properties.
                • Violating selectional restrictions results in ungrammaticality or semantic anomaly.

                Examples:

                drink [water] ✅ (natural)

                drink [a song] ❌ (violates the verb’s selection for a liquid NP)

                Adjectives and nouns also impose selectional restrictions:

                strong [coffee]

                strong [idea] ❌ (unacceptable unless metaphorical)

                4. Modification Hierarchies

                Modifiers can be stacked hierarchically. The hierarchy affects scope and interpretation:

                Noun Phrase Example:

                [NP [AdjP [very expensive]] [N' [N car]]]

                very modifies expensive (AdjP)
                expensive modifies car (N)

                Verb Phrase Example:

                [VP [V' [V eat] [NP an apple]] [AdvP quickly]]

                quickly modifies entire VP, not just V

                Key Principle: Modifiers higher in the hierarchy modify larger constituents, while lower modifiers modify smaller, closer constituents.

                5. Recursive Structures

                Recursion: A phrase can contain a phrase of the same type, creating potentially infinite embedding.

                Typical recursive structures:

                NP recursion:

                the [book [of [poems [by Shakespeare]]]]

                VP recursion:

                I think [you believe [she knows [the answer]]]

                AP recursion:

                a [very [extremely happy]] child

                Recursive structures often involve complements and adjuncts in nested configurations.

                Visualization:

                NP
                ├── Det: the
                └── N′
                    ├── N: book
                    └── PP: of
                        └── NP: poems
                            └── PP: by Shakespeare

                6. Key Principles

                1. Heads determine the category and lexical selection.
                2. Complements are required; adjuncts are optional.
                3. Selectional restrictions constrain combinations semantically and syntactically.
                4. Modification hierarchies reflect scope of modifiers.
                5. Recursion allows embedding, giving natural language its unbounded productivity.

                  Extension

                  • Adjunct islands
                  • Late adjunction
                  • Sideward movement
                  • Modification hierarchies cross-linguistically

                  MODULE 10: Syntax–Semantics Interface

                  1. Syntax–Semantics Interface

                  • This interface studies how syntactic structures map to meaning.
                  • Key principle: the meaning of a sentence is systematically determined by its syntactic structure and the meanings of its parts.
                  • Challenges arise because surface syntax may not directly reflect underlying semantic relations, e.g., scope ambiguities, pronoun binding.

                  2. Compositionality

                  Principle of Compositionality: The meaning of a complex expression is determined by:

                  The meanings of its constituents.
                  The rules used to combine them.

                  Example:

                  Sentence: Every student read a book.
                  The meaning of every student + a book + read combines systematically.

                  Key point: Syntax provides the structure along which semantic composition proceeds.

                  3. Scope Ambiguity

                  Occurs when multiple quantifiers or operators can take different hierarchical interpretations.

                  Example: Every student read a book.

                  ∀ > ∃ (surface scope): For every student, there exists a (possibly different) book that they read.

                  ∃ > ∀ (inverse scope): There exists a single book that every student read.

                  Scope assignment often requires movement operations at the semantic level (not necessarily reflected in surface syntax).
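The two readings can be written out as first-order formulas, matching the quantifier orders above:

```latex
% Surface scope (forall > exists): each student read a possibly different book
\forall x\,[\text{student}(x) \rightarrow \exists y\,[\text{book}(y) \wedge \text{read}(x, y)]]

% Inverse scope (exists > forall): one particular book was read by every student
\exists y\,[\text{book}(y) \wedge \forall x\,[\text{student}(x) \rightarrow \text{read}(x, y)]]
```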

                  4. Quantifier Raising (QR)

                  Quantifier Raising is a proposed syntactic operation that moves quantified NPs to a higher position in the clause for proper scope interpretation.

                  Example:
                  Surface: [TP [every student] [read [a book]]]
                  QR adjoins the quantified NPs at LF; raising a book above every student yields the inverse-scope reading:
                  [a book]_j [[every student]_i [t_i read t_j]]

                  This allows surface order to differ from logical scope, resolving ambiguities.

                  Helps explain inverse scope readings and binding possibilities.

                  5. Binding Theory

                  Deals with how pronouns, anaphors, and referring expressions get interpreted.

                  Core principles (Chomsky, 1981):

                  Principle | Target | Constraint
                  A | Anaphors (e.g., himself, herself) | Must be bound in its local domain
                  B | Pronouns (e.g., he, she, it) | Must be free in its local domain
                  C | R-expressions (referring NPs, e.g., John, Mary) | Must be free everywhere

                  Examples:
                  1. John_i saw himself_i. ✅ (Principle A satisfied: local binding)
                  2. John_i saw him_i. ❌ (Principle B violation: the pronoun cannot be locally bound)
                  3. He_i saw John_i. ❌ (Principle C violation: the R-expression is bound by the pronoun)
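Because the binding principles are stated over c-command, they lend themselves to a computational check. The sketch below uses hypothetical helpers, with binary-branching trees as nested tuples and leaves as strings; it tests only the c-command half of Principle A (the local-domain condition is deliberately omitted):

```python
def contains(tree, word):
    """True if `word` occurs anywhere inside `tree`."""
    if isinstance(tree, str):
        return tree == word
    return any(contains(child, word) for child in tree)

def c_commands(tree, a, b):
    """`a` c-commands `b` if `a` is the sister of a constituent
    containing `b`, at some point in the tree."""
    if isinstance(tree, str):
        return False
    left, right = tree
    if left == a and contains(right, b):
        return True
    if right == a and contains(left, b):
        return True
    return c_commands(left, a, b) or c_commands(right, a, b)

# [John [saw himself]]: the antecedent c-commands the anaphor, as
# Principle A requires; the reverse relation does not hold.
clause = ("John", ("saw", "himself"))
print(c_commands(clause, "John", "himself"))  # True
print(c_commands(clause, "himself", "John"))  # False
```

The asymmetry of the two calls mirrors why John can bind himself but not vice versa: only the subject sits in a position from which it c-commands the object.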

                  6. Anaphora and Pronouns

                  Anaphora: Expressions that refer back to another constituent (antecedent).

                  Pronouns: Can be bound by antecedents but must satisfy binding principles.

                  Types of anaphora:

                  Reflexive anaphora: himself, herself, themselves (require local binding)

                  Pronominal anaphora: he, she, it (can refer to antecedents outside local domain)

                  Discourse anaphora: Requires context for interpretation (this, that, it)

                  Interaction with syntax:

                  Position in c-command hierarchy determines possible antecedents.

                  Scope and binding interact: moving quantifiers can affect which pronouns/anaphors they can bind.

                  7. Key Insights

                  1. Syntax structures meaning: Phrase structure, movement, and hierarchy directly impact interpretation.
                  2. Compositionality is guided by syntactic order, but semantic operations (like QR) can adjust for logical meaning.
                  3. Scope ambiguities are resolved via movement or semantic hierarchy.
                  4. Binding principles govern anaphora and pronoun interpretation, closely tied to c-command relations.
                  5. Interface phenomena often reveal mismatches between surface form and underlying meaning, highlighting the subtlety of the syntax–semantics interface.

                    Extension

                    • C-command formalization
                    • Reconstruction
                    • LF movement
                    • Binding across languages
                    • Information structure & scope interaction

                    MODULE 11: Syntax–Morphology Interface

                    1. Syntax–Morphology Interface

                    The syntax–morphology interface studies how morphological forms are derived from syntactic structures.

                    Two main approaches:

                    Distributed Morphology (DM): Morphological exponence occurs after syntactic computation.

                    Lexicalist approaches: Morphology is partially computed in the lexicon before syntax.

                    Key idea: syntactic features drive morphological realization, and morphology can sometimes feed back to affect syntax.

                    2. Inflectional Morphology

                    Inflection: Morphological changes that mark grammatical features without changing lexical category.

                    Features: tense, aspect, mood, person, number, gender, case.

                    Examples:

                    walk → walks (3rd person singular present)

                    cat → cats (plural)

                    In syntax, inflection often realizes features of functional heads:

                    Tense in T⁰, agreement in Agr⁰, case on DPs.

                    Example (English):

                    [TP John [T' has [VP eaten an apple]]]

                    has carries present perfect tense (inflectional morphology).

                    3. Agreement Morphology

                    Agreement (φ-features): Morphological expression that matches features between syntactic constituents.

                    Common types:

                    Subject–verb agreement:

                    She walks. → verb agrees in number/person with subject.

                    Noun–adjective agreement:

                    Spanish: niño bonito / niña bonita → adjective agrees in gender/number.

                    Diagnostics: agreement morphology often shows feature dependency and hierarchical relationships.

                    4. Clitics

                    Clitics: Morphological elements that behave syntactically like words but phonologically depend on hosts.

                    Types:

                    Enclitics: attach after a host (-’s in English: John’s book)

                    Proclitics: attach before a host (French je t’aime)

                    Clitics often realize syntactic features like agreement or case.

                    Example:

                    Spanish: Lo vi ‘I saw him’ → the object clitic lo realizes accusative case and the person/number of the object.

                    5. Incorporation

                    Incorporation: Morphological combination of a noun or object into the verb, forming a single complex verb.

                    Properties:

                    Often reduces valency: the verb no longer requires a separate object.

                    Can be productive in polysynthetic languages.

                    Example (Mohawk):

                    wá:ta’khe ‘carried it’ (the object is incorporated into the verb).

                    6. Morphosyntactic Alignment

                    Alignment describes how languages mark the syntactic roles of arguments (subject, object, agent) morphologically.

                    Types:

                    Nominative–Accusative:

                    Subject of intransitive and transitive verbs is marked the same (S=A), object differently (O).

                    Example: English (She sees him).

                    Ergative–Absolutive:

                    Subject of intransitive verbs marked like object of transitive (S=O), transitive subject differently (A).

                    Example: Basque: gizona etorri da (the man came), gizonak emakumea ikusi du (the man saw the woman).

                    Active–Stative:

                    Subjects of intransitive verbs split based on semantic role (agentive vs. experiencer).

                    Interface relevance: Morphosyntactic alignment shapes agreement morphology and case marking.
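The S/A/O groupings above can be sketched as a small function. This is an illustrative model only: the role names (S, A, O) follow the text, but the case labels are schematic and not a claim about any particular language's morphology.

```python
# Sketch (illustrative labels): how the two major alignment types
# group S (intransitive subject), A (transitive subject), O (object).

def mark(role, alignment):
    """Return the case label a language of the given alignment type
    typically assigns to an argument role (S, A, or O)."""
    if alignment == "nominative-accusative":
        # S and A pattern together (nominative); O is accusative
        return "NOM" if role in ("S", "A") else "ACC"
    if alignment == "ergative-absolutive":
        # S and O pattern together (absolutive); A is ergative
        return "ABS" if role in ("S", "O") else "ERG"
    raise ValueError(alignment)

print(mark("S", "nominative-accusative"))  # NOM
print(mark("S", "ergative-absolutive"))    # ABS
print(mark("A", "ergative-absolutive"))    # ERG
```

The Basque pattern above falls out directly: gizona (S) and emakumea (O) share absolutive marking, while gizonak (A) is ergative.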

                    7. Key Principles

                    1. Morphology reflects syntactic features (tense, agreement, case).
                    2. Clitics and incorporation demonstrate tight interaction between syntax and morphology.
                    3. Inflectional and agreement morphology help identify functional heads in syntax.
                    4. Morphosyntactic alignment constrains argument realization and affects surface word order.
                    5. Distributed Morphology provides a formal framework linking syntactic features to morphological realization.

                      Extension

                      • Distributed Morphology
                      • Late insertion
                      • Vocabulary insertion
                      • Morphological impoverishment
                      • Head movement vs lowering

                      MODULE 12: Functional & Usage-Based Approaches

                      1. Functional & Usage-Based Approaches

                      Focus on language as a communicative tool, rather than an autonomous formal system.

                      Core assumptions:

                      Form follows function: grammatical structures emerge to serve communicative purposes.

                      Frequency and usage shape grammar: patterns used often become conventionalized.

                      Cognitive grounding: knowledge of language emerges from experience and interaction.

                      Two major frameworks:

                      Systemic Functional Linguistics (SFL)

                      Construction Grammar (CxG)

                      2. Systemic Functional Linguistics (SFL)

                      Developed by Michael Halliday.

                      Views language as a network of choices where speakers select options to convey meaning.

                      Three metafunctions of language:

                      Ideational – represents experience and processes (who did what to whom).
                      Interpersonal – enacts social relations (requests, commands, questions).
                      Textual – organizes message flow and coherence (theme, cohesion).

                      Clause structure in SFL:

                      Mood (Interpersonal) – declarative, interrogative, imperative.
                      Transitivity (Ideational) – process type, participants, circumstantial elements.
                      Theme–Rheme (Textual) – information structure and discourse flow.

                      3. Construction Grammar (CxG)

                      Developed by Charles Fillmore, Adele Goldberg, and others.

                      Key idea: language consists of constructions, form–meaning pairings at all levels.

                      Constructions can be phrasal, idiomatic, or schematic.

                      Example:

                      What’s X doing Y? → expresses surprise (What’s he doing here?)

                      Usage-Based Insights: frequency and entrenchment shape how constructions are processed and generalized.

                      Interface with syntax: Constructions encode both form and semantic-pragmatic function, blurring the strict syntax-semantics divide.

                      4. Information Structure

                      How given vs. new information is organized in a sentence.

                      Includes concepts like:

                      Topic/Theme – what the sentence is about (given or presupposed information).

                      Focus/Rheme – what is being said about the topic (new or contrastive information).

                      Helps speakers manage attention, discourse coherence, and emphasis.

                      Examples:

                      As for the exam, John passed it. → Theme = exam, Rheme = John passed it

                      It was John who passed the exam. → Focus = John

                      5. Theme–Rheme

                      Originates from Halliday’s textual function.
                      Theme: initial position of the clause; anchors discourse.
                      Rheme: remainder of the clause; conveys new information.

                      Example:

                      The weather (Theme) is terrible today (Rheme).

                      Reordering changes discourse emphasis:

Terrible is the weather today. → fronted Rheme for stylistic/discourse effect

                      Interaction with syntax:

                      Word order, clefting, and fronting can mark Theme and Focus.

                      Some languages use morphology (topic markers, focus particles).

                      6. Discourse–Syntax Interface

                      Explores how syntactic structures realize discourse functions.

                      Key phenomena:

                      Topicalization / fronting – brings known information to Theme position.

                      Cleft and pseudo-cleft constructions – mark contrastive focus.

                      Information-structural word order variation – e.g., Object–Verb–Subject order for focus in some languages.

                      Discourse particles and connectors – encode speaker stance and discourse relations.

                      Example (English):

                      John, I saw yesterday. → topicalized object for emphasis

                      It was John that I saw yesterday. → cleft for contrastive focus

                      7. Key Insights

                      1. Functional approaches link form and communicative purpose, contrasting formalist approaches.
                      2. SFL emphasizes metafunctions (ideational, interpersonal, textual) and systematic choices.
                      3. Construction Grammar emphasizes learned form–meaning pairings, including idiomatic patterns.
                      4. Theme–Rheme and information structure organize discourse and guide sentence-level choices.
                      5. Discourse-syntax interface shows that syntax is shaped by pragmatic, cognitive, and communicative constraints.

                        Extension

                        • Comparing derivational vs constructional models
                        • Usage frequency effects
                        • Emergentist syntax
                        • Formal vs cognitive grammar debate

                        MODULE 13: Cross-Linguistic & Typological Syntax

                        1. SOV vs. SVO Word Order

                        Basic word order is one of the most prominent typological parameters.

                        SOV (Subject-Object-Verb):

                        Common in languages like Japanese, Korean, Urdu, Hindi.

                        Example (Urdu/Hindi):

                        Ram-ne [kitaab] [padhii] → Ram-ERG book read

                        Verb comes at the end; postpositions and auxiliaries typical.

                        SVO (Subject-Verb-Object):

                        Common in English, Mandarin, Swahili.

                        Example (English):

                        Ram read the book. → Verb follows subject, precedes object.

                        Typological correlates:

SOV languages are often head-final in other projections (postpositions, clause-final complementizers).

SVO languages are often head-initial.

                        2. Head-Initial vs. Head-Final Systems

                        Head direction refers to whether heads precede or follow their complements.

                        Head-Initial (VO): Head precedes complement.

                        Examples: English, Swahili

                        V → NP object (eat an apple), P → NP (in the house)

                        Head-Final (OV): Head follows complement.

                        Examples: Japanese, Hindi, Korean

                        NP object → V (an apple eat), NP → P (house in)

                        Implications: Head-directionality affects:

                        Order of modifiers

Placement of auxiliaries

                        Position of relative clauses

                        3. Greenbergian Universals

                        Joseph Greenberg (1963) proposed implicational universals in word order and typology.

                        Examples:

                        If a language is VO, prepositions tend to precede noun phrases.

                        If a language is OV, postpositions tend to follow noun phrases.

Adjective–noun order correlates only weakly with head direction:

OV languages tend to place adjectives before the noun (Japanese ookii ie 'big house').

VO languages vary: English (VO) has Adj–N (big house), while French (VO) often has N–Adj (voiture rouge 'red car').

                        Greenbergian universals reveal cross-linguistic patterns and help predict syntactic behavior.
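Implicational universals of this kind can be encoded as simple predicates over typological feature records. The feature names below are illustrative labels, not a standard typological database schema, and real universals are statistical tendencies with exceptions:

```python
# Sketch: a Greenbergian implicational universal as a testable predicate.
# Feature names ("order", "adposition") are invented for illustration.

def check_adposition_universal(lang):
    """If VO then prepositions expected; if OV then postpositions.
    Returns True when the language fits the tendency."""
    if lang["order"] == "VO":
        return lang["adposition"] == "preposition"
    if lang["order"] == "OV":
        return lang["adposition"] == "postposition"
    return True  # other orders: the universal makes no prediction here

english = {"order": "VO", "adposition": "preposition"}
hindi = {"order": "OV", "adposition": "postposition"}
print(check_adposition_universal(english))  # True
print(check_adposition_universal(hindi))    # True
```

Running such checks over a sample of languages is one way to see that the universals are strong tendencies rather than exceptionless laws.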

                        4. South Asian Morphosyntax

                        South Asian languages (Hindi, Urdu, Punjabi, Bengali, Marathi) show distinctive syntactic and morphological features:

                        SOV word order with postpositions.

                        Split-ergativity:

                        Ergative marking for transitive perfective verbs; nominative–accusative elsewhere.

Example (Hindi): Ram-ne kitab padhi → Ram-ERG book read-PFV.F (the perfective verb agrees with the feminine object kitab 'book')

                        Rich agreement systems:

                        Gender, number, and sometimes honorific marking on verbs.

Light verb constructions:

Verb + light verb combinations (kar dena 'do-give', le jaana 'take-go') for aspect or causation.

                        Scrambling:

                        Flexible word order driven by information structure rather than strict syntax.

                        5. Language Contact Effects

                        South Asia is linguistically diverse, so contact phenomena are pervasive:

                        Lexical borrowing – Persian, Arabic, English words in Hindi/Urdu.

                        Calquing – semantic and syntactic patterns adopted across languages.

                        Convergence in morphosyntax

                        Example: postpositions, SOV patterns spreading across unrelated languages in the region.

                        Code-switching and diglossia – influence syntactic constructions and discourse patterns.

                        Language contact can modify typological patterns, sometimes creating hybrid syntactic structures.

                        6. Key Insights

                        1. Word order is a primary typological parameter, but interacts with head-directionality and morphology.
                        2. Head-initial vs. head-final systems predict positions of objects, modifiers, and auxiliaries.
                        3. Greenbergian universals provide predictive patterns across languages.
                        4. South Asian morphosyntax exhibits SOV, split-ergativity, rich agreement, and scrambling, showing complex interaction of syntax and morphology.
                        5. Language contact is a powerful force shaping syntax, creating borrowings, calques, and convergence across unrelated languages.

                          Extension

                          • Micro-parameters
                          • Dialect syntax (AAVE, Singlish)
                          • Parameter setting in acquisition
                          • Poverty of the Stimulus debate
                          • Comparative syntax case studies

                          MODULE 14: Computational & Corpus Syntax

                          1. Computational & Corpus Syntax

                          • Corpus syntax studies syntactic patterns using large collections of texts (corpora).
                          • Computational syntax develops formal and algorithmic methods to represent, parse, and analyze syntax.
                          • Intersection: corpora provide data-driven insights, computational models provide tools for analysis and NLP applications.

                          2. Treebanks

                          Treebanks are annotated corpora with syntactic structure marked for each sentence.

                          Two main types:

1. Constituency (phrase-structure) treebanks – hierarchical trees showing phrases (NP, VP, etc.) and their internal structure.

                          Example: Penn Treebank (English)

                          Structure:

                          (S
                             (NP John)
                             (VP (V saw) (NP Mary)))

                          2. Dependency treebanks – edges represent head-dependent relations rather than phrase structure.

                          Example: Universal Dependencies (UD)

                          Structure:

                          saw → John (subject), Mary (object)

                          Uses: syntactic parsing, NLP tasks (machine translation, question answering, grammar checking).
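The bracketed notation above can be read mechanically. The following is a minimal sketch of a reader for such Penn-style brackets, not the full Penn Treebank format (which also includes functional tags and traces):

```python
# A minimal reader for Penn-style bracketed trees (sketch): parses
# "(S (NP John) (VP (V saw) (NP Mary)))" into nested (label, children)
# tuples, where children are sub-trees or word strings.

def parse_tree(s):
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()

    def read(pos):
        assert tokens[pos] == "("
        label = tokens[pos + 1]        # node label follows the bracket
        pos += 2
        children = []
        while tokens[pos] != ")":
            if tokens[pos] == "(":     # nested subtree
                child, pos = read(pos)
            else:                      # leaf word
                child, pos = tokens[pos], pos + 1
            children.append(child)
        return (label, children), pos + 1

    tree, _ = read(0)
    return tree

tree = parse_tree("(S (NP John) (VP (V saw) (NP Mary)))")
print(tree[0])     # S
print(tree[1][0])  # ('NP', ['John'])
```

Real treebank tooling (e.g., NLTK's tree readers) follows the same recursive logic with more robust tokenization.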

                          3. Annotation Basics

                          Morphosyntactic annotation: POS tags, lemma, morphological features.

                          Syntactic annotation:

                          Phrase-structure trees: nodes = phrases, leaves = words, edges = dominance.

                          Dependency trees: nodes = words, edges = head-dependent relations.

                          Standards & conventions:

                          Penn Treebank tag set (POS)

                          Universal Dependencies (UD) for cross-linguistic compatibility

                          Challenges: ambiguous sentences, multiple valid parses, inconsistent conventions across languages.

                          4. Dependency vs. Constituency Grammar

Feature        | Constituency Grammar                      | Dependency Grammar
Basic unit     | Phrase (NP, VP, AP, etc.)                 | Word (head) & its dependents
Tree structure | Hierarchical, nested                      | Directed graph, edges = head-dependent
Example        | (S (NP John) (VP saw Mary))               | saw → John (subj), Mary (obj)
Strengths      | Captures internal phrase structure        | Captures predicate-argument relations; easier for free-word-order languages
Usage          | Traditional linguistics, parser training  | NLP tasks, cross-linguistic parsing (UD)
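A dependency analysis can be represented very directly as triples of dependent, relation, and head. The relation labels below loosely follow Universal Dependencies names (nsubj, obj) but are hand-assigned for illustration, not the output of a real UD parser:

```python
# Sketch: a dependency analysis of "John saw Mary" as
# (dependent, relation, head) triples. ROOT marks the main verb.

deps = [
    ("John", "nsubj", "saw"),
    ("saw",  "root",  "ROOT"),
    ("Mary", "obj",   "saw"),
]

def dependents_of(head, deps):
    """All words whose syntactic head is the given word."""
    return [d for d, rel, h in deps if h == head]

print(dependents_of("saw", deps))  # ['John', 'Mary']
```

Note how the flat triple format, unlike nested brackets, makes head-dependent relations directly searchable, which is one reason dependency formats dominate in NLP pipelines.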

                          5. Introduction to Parsing

                          Parsing = algorithmic assignment of syntactic structure to sentences.

                          Types of parsers:

                          Constituency parsers – output phrase structure trees.

                          E.g., CKY parser, probabilistic context-free grammar (PCFG) parsers.

                          Dependency parsers – output head-dependent graphs.

                          E.g., transition-based parsers, graph-based parsers.

                          Applications: syntax checking, semantic role labeling, machine translation, information extraction.
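The CKY algorithm mentioned above can be sketched as a recognizer over a toy grammar in Chomsky Normal Form. The grammar and lexicon here are invented for illustration; real constituency parsers add rule probabilities (PCFGs) and recover full trees rather than just accepting or rejecting:

```python
# A minimal CKY recognizer (sketch) for a toy grammar in Chomsky
# Normal Form: S -> NP VP, VP -> V NP, plus lexical rules.

binary = {("NP", "VP"): "S", ("V", "NP"): "VP"}          # A -> B C rules
lexical = {"John": {"NP"}, "Mary": {"NP"}, "saw": {"V"}}  # A -> word rules

def cky_recognize(words, start="S"):
    n = len(words)
    # chart[i][j] = set of nonterminals spanning words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(lexical.get(w, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # try every split point
                for b in chart[i][k]:
                    for c in chart[k][j]:
                        if (b, c) in binary:
                            chart[i][j].add(binary[(b, c)])
    return start in chart[0][n]

print(cky_recognize(["John", "saw", "Mary"]))  # True
print(cky_recognize(["saw", "John", "Mary"]))  # False
```

The chart-filling loop makes CKY's cubic time complexity visible: every span, start position, and split point is considered exactly once.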

                          6. Syntax in NLP

                          Syntax informs many NLP tasks:

                          Machine translation: syntactic structure helps align source-target sentences.

                          Question answering / information extraction: dependency relations identify subject, object, relations.

                          Coreference resolution: syntactic features (c-command, argument structure) guide pronoun resolution.

                          Text generation: ensuring grammatical output.

                          Challenges: ambiguity, complex sentences, cross-linguistic variation, sparse data for low-resource languages.

                          7. Key Insights

                          1. Treebanks provide structured syntactic data essential for computational analysis.
                          2. Annotation requires careful morphological and syntactic labeling.
                          3. Constituency vs. dependency frameworks suit different analytic and NLP needs.
                          4. Parsing algorithms operationalize syntactic knowledge for applications.
                          5. Computational syntax bridges theory and application, enabling data-driven models for understanding and generating human language.

                            Extension

                            • Minimalist grammars (Stabler)
                            • Computational complexity of recursion
                            • Formal language hierarchy
                            • Syntax in large language models
                            • Limitations of purely statistical parsing

                            MODULE 15: Research Methods in Syntax

                            1. Designing Syntactic Research Questions

                            Good syntactic research begins with clear, precise, and testable questions.

                            Key principles:

                            Focus on specific constructions or phenomena: e.g., word order, agreement, case marking.

                            Make the scope manageable: narrow enough for thorough analysis.

                            Link theory and empirical data: specify the hypothesis and predictions.

                            Examples:

                            Do subjects in Hindi always precede objects in embedded clauses?

                            How does plural marking interact with postpositions in Punjabi?

                            Can focus movement override canonical word order in Urdu?

                            2. Data Collection Methods

                            Native speaker intuition: direct elicitation from speakers.
                            Corpus data: examining naturally occurring sentences in large corpora.
                            Experimental methods: psycholinguistic experiments, eye-tracking, reaction time studies.
                            Interviews & fieldwork: especially for under-documented languages.

                            Advantages & limitations:

                            Intuition: precise, controlled, but limited sample size.

                            Corpus: naturalistic, large scale, but may lack specific constructions.

                            Experiments: rigorous, quantitative, but time-consuming.

                            3. Acceptability Judgments

                            Core method in theoretical syntax: participants rate sentences on a scale (grammatical vs. ungrammatical).

                            Techniques:

                            Binary judgment: acceptable / unacceptable
                            Gradient scales: Likert scale (1–7) for nuanced judgments
                            Forced-choice tasks: select the more acceptable sentence

                            Tips for reliability:

                            Use multiple speakers
                            Control for lexical frequency, plausibility, context
                            Avoid leading questions

                            Example:

John saw Mary. (acceptable)
*Saw John Mary. (unacceptable)

                            4. Argumentation in Syntax

                            Presenting a syntactic argument involves:

                            Data presentation: clear examples with glosses.

                            Pattern identification: highlight contrasts and regularities.

                            Hypothesis formulation: explain why the pattern exists.

                            Theoretical analysis: relate to existing syntactic theory (e.g., X-bar, Minimalism).

                            Counterexamples and discussion: address potential objections or alternative analyses.

                            Structure of a typical argument:

                            Example:

                            Sentence A is grammatical.

                            Sentence B is ungrammatical.

                            Difference explained by [movement / agreement / feature-checking].

                            Conclude how this supports/contradicts theory.

                            5. Writing a Formal Syntactic Paper

                            Standard structure:

                            Introduction: state research question and significance.
                            Background / Literature review: situate the problem in existing theory.
                            Methodology: data sources, participant details, judgment tasks, corpus.
                            Data & Analysis: present examples, contrasts, syntactic trees, tables.
                            Discussion: interpret results, relate to theory, address counterarguments.
                            Conclusion: summarize findings, suggest future research.

                            Style tips:

                            Use consistent glossing and notation
                            Include syntactic trees, charts, and tables for clarity
                            Be concise, precise, and formal

                            6. Key Principles

                            1. Research questions should be specific, testable, and theoretically grounded.
                            2. Data collection must balance native speaker intuition, corpus evidence, and experimental validation.
                            3. Acceptability judgments are central; reliability requires careful design and replication.
                            4. Syntactic arguments must clearly show contrasts and theoretical implications.
                            5. Formal writing demands clarity, precision, and structured presentation.

                              Experimental & Quantitative Methods in Syntax

                              • Magnitude estimation
                              • Likert scaling
                              • Experimental syntax design
                              • Statistical modeling basics
                              • ERP & neurolinguistic evidence
                              • Field data analysis methods

                              1. Magnitude Estimation

                              Magnitude Estimation (ME) is a method for collecting gradient acceptability judgments.

                              Participants assign numerical values proportional to perceived acceptability.

                              Features:

                              Not limited to fixed scales (like 1–7); can use any positive number.

                              Captures fine-grained distinctions between sentence acceptability.

                              Example:

                              Reference sentence: John saw Mary = 100

Test sentence: ?Mary, John saw = 60 (rated less acceptable than the reference)
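A common first analysis step for ME data is to divide each rating by the participant's reference-sentence score and then log-transform the ratios, so that scores from participants using different number ranges become comparable. The numbers below are invented for illustration:

```python
# Sketch: normalizing magnitude-estimation scores against a reference
# sentence, then log-transforming the ratios (a common analysis step).

import math

reference_score = 100.0
ratings = {"reference": 100.0, "test": 60.0}  # raw ME scores (invented)

# Ratio to the reference sentence: 1.0 = as acceptable as the reference.
normalized = {s: r / reference_score for s, r in ratings.items()}

# Log ratios center the reference at 0 and symmetrize ratios.
log_ratios = {s: math.log(v) for s, v in normalized.items()}

print(normalized["test"])  # 0.6
```

On this scale, a log ratio of 0 means "equal to the reference," and negative values mean "less acceptable," regardless of which raw numbers a participant happened to choose.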

                              2. Likert Scaling

                              Likert scales are commonly used for eliciting subjective ratings, often 1–5 or 1–7.

                              Typical scale:

                              1 = completely unacceptable

                              7 = completely acceptable

                              Advantages: simple to administer; easy to analyze statistically.

                              Limitation: less sensitive than magnitude estimation for fine-grained contrasts.

                              3. Experimental Syntax Design

                              Design principles:

                              Within-subject vs. between-subject design:

                              Within: same participant rates multiple conditions.
                              Between: each participant sees only one condition.

                              Counterbalancing: randomize presentation to control for order effects.

                              Control conditions: include grammatical and ungrammatical baseline sentences.

                              Minimal pairs: isolate the syntactic phenomenon of interest.

                              Example:

                              Investigate wh-movement in Urdu:

                              Condition A: Who did Ram see?
                              Condition B: Ram saw who?
                              Collect acceptability ratings using ME or Likert scale
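Counterbalancing of items across conditions is often done with a Latin-square design: each participant sees every item exactly once, and across presentation lists every item occurs in every condition. A minimal sketch, with invented item and condition names:

```python
# Sketch: Latin-square assignment of items to conditions. List k pairs
# item i with condition (i + k) mod n, so conditions rotate across lists.

def latin_square_lists(items, conditions):
    """Return one presentation list per condition, covering all items."""
    n = len(conditions)
    return [[(item, conditions[(i + k) % n]) for i, item in enumerate(items)]
            for k in range(n)]

lists = latin_square_lists(["item1", "item2", "item3", "item4"], ["A", "B"])
for presentation in lists:
    print(presentation)
# item1 appears in condition A on list 0 and condition B on list 1, etc.
```

In practice, the order of trials within each list would additionally be randomized per participant to control for order effects, as noted above.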

                              4. Statistical Modeling Basics

                              Purpose: analyze experimental syntax data rigorously.

                              Common methods:

                              ANOVA / t-tests: compare group means across conditions.

                              Mixed-effects models: account for participants and items as random effects, e.g., lme4 in R.

                              Regression: examine influence of multiple factors on acceptability ratings.

                              Key principle: syntax experiment data is often hierarchical → participants and items must be modeled as random effects.
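As a first illustration of respecting that hierarchical structure, ratings can be aggregated by participant and condition before any comparison; a full analysis would instead fit mixed-effects models (e.g., lme4 in R). The ratings below are invented:

```python
# Sketch: computing by-participant condition means, a first step toward
# analyses that treat participants as a grouping (random) factor.

from collections import defaultdict
from statistics import mean

# (participant, condition, rating) tuples -- invented example data
data = [
    ("p1", "A", 6), ("p1", "A", 7), ("p1", "B", 3),
    ("p2", "A", 5), ("p2", "B", 2), ("p2", "B", 4),
]

by_subject = defaultdict(list)
for participant, condition, rating in data:
    by_subject[(participant, condition)].append(rating)

means = {key: mean(vals) for key, vals in by_subject.items()}
print(means[("p1", "A")])  # 6.5
print(means[("p2", "B")])  # 3
```

Comparing condition means computed per participant, rather than pooling all trials, avoids treating repeated ratings from one speaker as independent observations.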

                              5. ERP & Neurolinguistic Evidence

                              Event-Related Potentials (ERP): measure brain responses to linguistic stimuli.

                              Key ERP components for syntax:

                              P600: associated with syntactic violations, garden-path sentences.

                              LAN (Left Anterior Negativity): early detection of morphosyntactic anomalies.

                              Usage: complements acceptability judgments to detect unconscious syntactic processing.

                              Other neurolinguistic methods: fMRI (activations in Broca’s/Wernicke’s area), MEG, eye-tracking.

                              6. Field Data Analysis Methods

                              Used especially for under-documented languages:

                              Elicitation sessions: record sentences from native speakers.

                              Structured questionnaires: control syntactic variables.

                              Audio/video corpora: transcribe, parse, and annotate morphosyntactic features.

                              Statistical and qualitative analysis:

                              Frequency counts, pattern identification

                              Annotated trees for syntactic structure

                              Validation: cross-check judgments with multiple informants for reliability
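Cross-checking can be quantified with a simple agreement rate between informants, as sketched below with invented binary judgments; field studies often go further and report chance-corrected measures such as Cohen's kappa:

```python
# Sketch: raw agreement rate between two informants' binary judgments
# on the same sentences. Judgment labels and sentence IDs are invented.

informant1 = {"s1": "ok", "s2": "bad", "s3": "ok"}
informant2 = {"s1": "ok", "s2": "ok",  "s3": "ok"}

shared = informant1.keys() & informant2.keys()
agreement = sum(informant1[s] == informant2[s] for s in shared) / len(shared)
print(agreement)  # 2/3: they agree on s1 and s3 but not s2
```

A low agreement rate signals that the construction needs re-elicitation, clearer contexts, or more informants before it can support a theoretical argument.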

                              7. Key Principles

                              1. Gradient acceptability can be measured using magnitude estimation or Likert scaling.
                              2. Careful experimental design ensures valid, interpretable results.
                              3. Statistical modeling is essential for rigor and generalizability.
                              4. Neurolinguistic evidence (ERP, fMRI) provides independent confirmation of syntactic processing.
                              5. Field data analysis combines elicitation, corpus annotation, and statistical tools to study less-documented languages.

                              ADDITIONAL MICRO-MODULE

                              Ellipsis & Silent Categories

                              • VP ellipsis
                              • Sluicing
                              • Identity conditions
                              • Null subjects (pro-drop)
                              • Empty Category Principle
                              • PRO vs pro
                              • Traces vs copies

                              1. Ellipsis

                              • Ellipsis occurs when a syntactic constituent is unpronounced but interpretable from context.
                              • Key property: the elided material must satisfy identity conditions with an antecedent.

                              a) VP Ellipsis

                              VP ellipsis (VPE): The verb phrase is omitted but its meaning is recoverable.

                              Example:

                              John can play the guitar, and Mary can __ too.

                              Elided VP: play the guitar

                              Identity condition: elided VP must match the antecedent VP in meaning.

                              Diagnostics: Licensing by auxiliary or modal verbs

                              Works: John has eaten, and Mary has __ too.

Fails: *John slept, and Mary __ too. ❌ (ellipsis must be licensed by an auxiliary or modal: Mary did __ too)

                              b) Sluicing

                              Sluicing: Ellipsis of everything but the wh-phrase in a question.

                              Example:

                              Someone left, but I don’t know who __.

                              Elided material: someone left

                              Identity condition: sluiced clause must correspond to a fully grammatical antecedent clause.

                              Notes: Sluicing often interacts with islands and movement constraints.

                              2. Identity Conditions

                              Ellipsis licensing principle:

                              Elided material must have a grammatical antecedent.

                              Antecedent and elided material must be identical in meaning (semantic and sometimes syntactic identity).

                              Examples:

                              Good: John can play guitar, and Mary can __ too.

Bad: *This problem was solved by Mary, and John did __ too. ❌ (voice mismatch: the elided active VP solve this problem lacks an identical antecedent)

                              3. Null Subjects (Pro-Drop)

                              Some languages allow subjects to be unpronounced (null) if recoverable from context.

                              Example (Spanish):

[pro] Come pizza. → “He/She eats pizza.”

                              Licensing: subject must be recoverable from agreement morphology or discourse context.

English is non-pro-drop, but imperatives allow an unpronounced (understood you) subject:

[you] Sit down!

                              4. Empty Category Principle (ECP)

                              Principle in Government & Binding theory: traces of movement must be properly governed.

                              Types of empty categories: traces (t), PRO, pro

                              ECP ensures:

                              Traces are licensed by movement (A-movement or A’-movement)

                              PRO occurs in subject position of infinitives and is controlled by an antecedent

                              5. PRO vs. pro

Category | Function                                        | Licensing                           | Example
PRO      | Null subject of non-finite clauses (controlled) | Must have an antecedent controller  | John wants [PRO to leave] → John controls PRO
pro      | Null subject of finite clauses (pro-drop)       | Rich agreement or discourse context | Spanish: [pro] Come pizza → he/she eats pizza

                              6. Traces vs. Copies

                              Traces (t): Empty positions left after movement; single-copy representation.

                              Example: Who_i did John see t_i?

                              Copies (Minimalist perspective): Entire moved element leaves a full copy; lower copies may be pronounced or deleted.

                              Example: Who_i did John see who_i? → lower copy deleted at PF.

                              Function: traces/copies maintain interpretation, binding, and agreement in moved structures.

                              7. Key Principles

                              1. Ellipsis (VP ellipsis, sluicing) relies on identity conditions with an antecedent.
                              2. Null subjects are either PRO (non-finite, controlled) or pro (finite, pro-drop languages).
                              3. Empty categories (traces, PRO, pro) maintain syntactic dependencies and obey licensing conditions like ECP.
                              4. Traces vs. copies: different theoretical approaches, but both preserve interpretability after movement.
                              5. Understanding ellipsis and silent categories is crucial for syntax-semantics interface, movement theory, and cross-linguistic variation.

                              Data Challenge (Additional Component)

                              Students receive raw, unglossed language data and must:

                              • Identify basic word order
                              • Diagnose case system
                              • Determine head directionality
                              • Propose phrase structure
                              • Justify analysis formally

                              Data Challenge: Step-by-Step Approach

                              Objective

                              Students are presented with raw, unglossed sentences from a language and are required to perform a full syntactic analysis. The goal is to combine observation, typology, and formal justification.

                              Step 1: Identify Basic Word Order

                              Examine sentence examples carefully.

                              Look for subjects (S), verbs (V), and objects (O):

                              SVO: Subject precedes Verb, which precedes Object (English-like).
                              SOV: Subject precedes Object, which precedes Verb (Hindi/Japanese-like).
                              VSO, VOS, etc.: Check for less common orders.

                              Tips:

                              Repeated elements across sentences help spot regular patterns.

Watch for auxiliaries, postpositions/prepositions, and adverbs; they can signal head directionality.

                              Example Table:

Sentence | Observed Order
1 | SOV
2 | SOV
3 | SOV

                              → Suggests canonical SOV language.
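Step 1 can be mimicked with a small script: tally the relative order of S, O, and V across glossed sentences and take the majority pattern as the canonical order. A minimal sketch, assuming hypothetical role-annotated data (the glosses below are schematic, not real language data):

```python
# Sketch of Step 1: tally S/O/V orders across role-glossed sentences
# to identify the canonical word order.

from collections import Counter

def word_order(roles):
    """Reduce a sequence of role labels to its S/O/V pattern, e.g. ['S','O','V'] -> 'SOV'."""
    return "".join(r for r in roles if r in {"S", "O", "V"})

glossed = [
    ["S", "O", "V"],
    ["S", "ADV", "O", "V"],  # adverbs are ignored for the basic order
    ["S", "O", "V"],
]

counts = Counter(word_order(sentence) for sentence in glossed)
canonical, _ = counts.most_common(1)[0]
print(canonical)  # SOV
```

Counting across several sentences, rather than trusting one example, guards against scrambled or topicalized orders skewing the judgment.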

                              Step 2: Diagnose Case System

                              Look for morphemes marking subjects, objects, or obliques.

                              Common types:

Nominative–Accusative: Transitive and intransitive subjects marked alike (nominative); object marked differently (accusative).

Ergative–Absolutive: Transitive subject marked differently (ergative); intransitive subject shares the object's (absolutive) marking.

                              Indicators:

                              Word endings or postpositions

                              Consistency across verbs

                              Semantic roles of arguments

                              Example Analysis:

Sentence: Ram-ne kitab padhi
Gloss: Ram-ERG book read
Translation: “Ram read the book.”

                              Ergative marking on transitive subject → Ergative–Absolutive split
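The diagnostic in Step 2 reduces to comparing the markers on A (transitive subject), S (intransitive subject), and O (object): if A and S pattern together against O, the system is nominative–accusative; if S and O pattern together against A, it is ergative–absolutive. A minimal sketch, with hypothetical simplified markers:

```python
# Sketch of Step 2: diagnose morphosyntactic alignment from the case
# markers on A (transitive subject), S (intransitive subject), O (object).

def alignment(a_marker, s_marker, o_marker):
    if a_marker == s_marker != o_marker:
        return "nominative-accusative"   # A and S pattern together against O
    if s_marker == o_marker != a_marker:
        return "ergative-absolutive"     # S and O pattern together against A
    return "other/unclear"               # e.g. tripartite or neutral marking

# Hindi perfective, simplified: A takes -ne, S and O are unmarked.
print(alignment(a_marker="ne", s_marker="", o_marker=""))  # ergative-absolutive
```

The "other/unclear" branch matters in practice: split systems (like Hindi's aspect-conditioned ergativity) will give different answers for different subsets of the data, which is itself a finding worth reporting.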

                              Step 3: Determine Head Directionality

                              Identify heads vs. complements: verbs, nouns, prepositions/postpositions, auxiliaries.

                              Tests:

                              PPs: preposition vs postposition → head-initial vs head-final

                              VPs: verb precedes or follows object

                              NPs: noun precedes or follows adjectives/demonstratives

                              Observation:

                              Head-final (typical in SOV languages)

                              Head-initial (typical in SVO languages)
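Step 3 likewise amounts to a simple count: across observed phrases, does the head precede or follow its complement? A minimal sketch, assuming hypothetical position indices extracted from the data:

```python
# Sketch of Step 3: infer head directionality by checking whether heads
# precede or follow their complements in observed phrases.

def head_direction(observations):
    """observations: list of (head_index, complement_index) position pairs."""
    initial = sum(1 for head, comp in observations if head < comp)
    final = len(observations) - initial
    return "head-initial" if initial > final else "head-final"

# SOV-style data: verbs follow objects, postpositions follow their NPs.
obs = [(2, 1), (3, 2), (1, 0)]  # the head follows its complement each time
print(head_direction(obs))  # head-final
```

Mixing categories (VP, PP, NP) in one tally is a deliberate simplification here; a fuller analysis would report directionality per category, since mixed-headedness languages exist.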

                              Step 4: Propose Phrase Structure

                              Use observed head-complement-modifier patterns to construct tree structure:

                              Label phrases: NP, VP, PP, CP as appropriate

                              Identify internal structure: head, complement, adjuncts

                              Include functional projections if observable (Tense, Aspect, Agreement)

                              Example: SOV head-final structure

[TP [NP Ram-ne] [T' [VP [NP kitab] [V padhi]] T]]

(with -ne as the ergative case marker on the subject NP, and a head-final, phonologically null T, consistent with the language's overall head-finality)
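One way to make a proposed structure checkable is to encode it as nested (label, children) tuples and read off the terminal string, confirming it matches the observed word order. A minimal sketch: the tree below treats -ne as a case marker on the subject and T as a head-final, phonologically null head (both analytical assumptions for illustration).

```python
# Sketch of Step 4: encode a head-final SOV structure as nested tuples
# and recover the terminal string to check it against the data.

tree = ("TP",
        ("NP", "Ram-ne"),              # ergative-marked subject (assumption)
        ("T'",
         ("VP",
          ("NP", "kitab"),
          ("V", "padhi")),
         ("T", "")))                   # head-final, phonologically null T

def terminals(node):
    """Collect non-empty leaf strings left-to-right."""
    if isinstance(node, str):
        return [node] if node else []
    _label, *children = node
    out = []
    for child in children:
        out.extend(terminals(child))
    return out

print(" ".join(terminals(tree)))  # Ram-ne kitab padhi
```

If the read-off string does not match the attested sentence, the bracketing (or the head-direction hypothesis behind it) needs revising.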

                              Step 5: Justify Analysis Formally

                              Support every claim with evidence from the data:

                              Word order → S, O, V positions across sentences
                              Case → morpheme marking patterns
                              Head direction → placement of postpositions, adjectives, auxiliaries
                              Phrase structure → repeated constituent combinations

                              Optional: reference typological patterns / universals (Greenbergian universals, head-direction correlations).

                              Include trees or diagrams for clarity.
                              Address exceptions or optional constructions and explain them in terms of scrambling, topicalization, or discourse effects.

                              Tips for Success

                              1. Collect multiple examples before final judgment.
                              2. Mark morphemes carefully; they are often key to case and agreement.
                              3. Use cross-sentence comparison to identify regularity and exceptions.
                              4. Be explicit: every structural decision should refer to observed data.
                              5. Optional enhancement: indicate null categories (PRO/pro/traces) if relevant.

                              RECOMMENDED READING

                              Core:

                              • Carnie, A. Syntax: A Generative Introduction
                              • Radford, A. Minimalist Syntax
                              • Adger, D. Core Syntax
                              • Tallerman, M. Understanding Syntax

                                Advanced:

                                • Chomsky, N. The Minimalist Program
                                • Haegeman, L. Introduction to Government and Binding Theory
                                • Biber et al. Longman Grammar of Spoken and Written English
                                • Baker, M. The Atoms of Language
