Generative grammar is a linguistic framework designed to reveal the fundamental rules and principles that govern the structure of human languages. It encompasses phonology, morphology, syntax, and semantics, seeking universal principles shared by all languages while also identifying language-specific rules.
One key assumption of generative grammar is the existence of an innate "language faculty" in humans: our ability to learn and produce language is guided by the built-in properties of this faculty. Children acquire language quickly and with little conscious effort, which suggests pre-existing linguistic constraints that aid acquisition.
At its foundation, generative grammar aims to create a universal grammar—a collection of rules and principles that all human languages follow. This universal grammar is thought to be fundamental in human cognition, influencing how we perceive and produce language.
Noam Chomsky introduced generative grammar in publications such as "Syntactic Structures" (1957), proposing that sentences are generated by phrase structure rules and transformations. These rules define the basic structure of sentences and how that structure can be extended into more complex forms. Over time, generative grammar has evolved through successive theories such as government and binding theory and, later, the Minimalist Program, while retaining a rigorous, abstract, and frequently mathematical approach to linguistic analysis.
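The idea of phrase structure rules can be illustrated as a small rewriting system: each rule expands a category symbol (like S, NP, or VP) into a sequence of smaller categories or words. The toy grammar below is a hypothetical illustration, not one drawn from Chomsky's work:

```python
import random

# Hypothetical mini-grammar for illustration only.
# Each nonterminal category maps to a list of possible expansions.
RULES = {
    "S":   [["NP", "VP"]],                 # a sentence is a noun phrase + verb phrase
    "NP":  [["Det", "N"]],                 # a noun phrase is a determiner + noun
    "VP":  [["V", "NP"], ["V"]],           # a verb phrase may or may not take an object
    "Det": [["the"], ["a"]],
    "N":   [["linguist"], ["sentence"]],
    "V":   [["parses"], ["sleeps"]],
}

def generate(symbol="S", rng=random):
    """Rewrite `symbol` by recursively applying phrase structure rules."""
    if symbol not in RULES:                # terminal: an actual word
        return [symbol]
    expansion = rng.choice(RULES[symbol])  # pick one rule for this category
    words = []
    for sym in expansion:
        words.extend(generate(sym, rng))
    return words

print(" ".join(generate()))                # e.g. "the linguist parses a sentence"
```

Even this tiny rule set generates many distinct sentences, which hints at the generative aspect of the framework: a finite set of rules producing an open-ended set of structures.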
Philosophical discussions about generative grammar frequently center on the innateness of grammatical principles and the relationship between syntax and semantics. Formal semanticists, in particular, seek to uncover the hidden semantic structures of language, work that complements the syntactic analyses of generative linguistics.