Generative Grammar

In theoretical linguistics, generative grammar refers to a particular approach to the study of syntax. A generative grammar of a language attempts to give a set of rules that correctly predicts which combinations of words form grammatical sentences. In most approaches to generative grammar, the rules also predict the morphology of a sentence. Generative grammar arguably originates in the work of Noam Chomsky, beginning in the late 1950s, although Chomsky has said that the first generative grammar in the modern sense was Pāṇini's Sanskrit grammar.
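As a concrete illustration (not part of the original article), the sketch below shows what such a rule system can look like in miniature: a few invented phrase-structure rules written as Python data, together with a routine that enumerates the word strings they generate. The grammar, vocabulary, and function names are assumptions made purely for illustration.

```python
from itertools import product

# A toy phrase-structure grammar, written as Python data.
# Non-terminals map to lists of possible expansions (sequences of symbols);
# any symbol not in the table is treated as a terminal word.
# The rules and vocabulary here are invented for illustration.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["chased"], ["slept"]],
}

def generate(symbol="S", depth=6):
    """Enumerate every word string the grammar derives from `symbol`."""
    if depth == 0:
        return
    if symbol not in GRAMMAR:            # terminal word
        yield [symbol]
        return
    for expansion in GRAMMAR[symbol]:
        # Expand each symbol of the rule, then combine one choice per position.
        parts = [list(generate(s, depth - 1)) for s in expansion]
        for combo in product(*parts):
            yield [word for piece in combo for word in piece]

for sentence in generate():
    print(" ".join(sentence))
# Prints, e.g., "the dog chased the cat", "the cat slept", ...
```

In this toy setting, exactly the strings that appear in the output count as grammatical with respect to the grammar; every other string of the same vocabulary does not.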

Early versions of Chomsky's theory were called transformational grammar, and this term is still used as a general term that covers his subsequent theories. A number of competing versions of generative grammar are currently practiced within linguistics. Chomsky's current theory is known as the Minimalist Program. Other prominent theories include or have included dependency grammar, head-driven phrase structure grammar, lexical functional grammar, categorial grammar, relational grammar, link grammar, and tree-adjoining grammar.

Chomsky has argued that many of the properties of a generative grammar arise from an "innate" universal grammar. Proponents of generative grammar have argued that most grammar is not the result of communicative function and is not simply learned from the environment (see the poverty of the stimulus argument). In this respect, generative grammar takes a point of view different from that of cognitive grammar and from functionalist and behaviorist theories.

Most versions of generative grammar characterize sentences as either grammatically correct (well-formed) or not. The rules of a generative grammar typically function as an algorithm that predicts grammaticality as a discrete (yes-or-no) result. In this respect, generative grammar differs from stochastic grammar, which treats grammaticality as a probabilistic variable. However, some work in generative grammar (e.g. recent work by Joan Bresnan) uses stochastic versions of optimality theory.
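To make the contrast concrete, here is a minimal sketch in Python (not from the article; the grammar, the rule probabilities, and the function names are all invented for illustration). A Viterbi-style CYK recognizer over a tiny probabilistic context-free grammar scores a sentence by the probability of its best derivation (the stochastic view), while the discrete view simply asks whether any derivation exists at all.

```python
from collections import defaultdict

# A toy probabilistic grammar in Chomsky normal form: A -> B C or A -> word.
# The rules and probabilities are invented for illustration; each left-hand
# side's rule probabilities sum to 1.
RULES = [
    ("S",   ("NP", "VP"), 1.0),
    ("NP",  ("Det", "N"), 1.0),
    ("VP",  ("V", "NP"),  0.7),
    ("VP",  ("slept",),   0.3),
    ("Det", ("the",),     1.0),
    ("N",   ("dog",),     0.5),
    ("N",   ("cat",),     0.5),
    ("V",   ("chased",),  1.0),
]

def best_parse_probability(words):
    """Viterbi CYK: probability of the most likely parse rooted in S (0 if none)."""
    n = len(words)
    best = defaultdict(float)             # (start, end, symbol) -> best probability
    for i, word in enumerate(words):      # lexical rules fill length-1 spans
        for lhs, rhs, p in RULES:
            if rhs == (word,):
                best[i, i + 1, lhs] = max(best[i, i + 1, lhs], p)
    for span in range(2, n + 1):          # combine adjacent spans bottom-up
        for start in range(n - span + 1):
            end = start + span
            for split in range(start + 1, end):
                for lhs, rhs, p in RULES:
                    if len(rhs) == 2:
                        left, right = rhs
                        score = p * best[start, split, left] * best[split, end, right]
                        best[start, end, lhs] = max(best[start, end, lhs], score)
    return best[0, n, "S"]

def grammatical(words):
    """The discrete (yes-or-no) view: does any derivation exist at all?"""
    return best_parse_probability(words) > 0.0

print(grammatical("the dog chased the cat".split()))             # True
print(grammatical("dog the chased cat the".split()))             # False
print(best_parse_probability("the dog chased the cat".split()))  # 0.175 under the toy weights
```

The same chart computation supports both readings: thresholding the result at zero gives the categorical judgment, while the raw score is the kind of graded quantity a stochastic grammar works with.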

Other articles related to "generative grammar":

Linguistic Competence - Schools of Thought - Other Generativists - Ray S. Jackendoff
... Jackendoff's model deviates from traditional generative grammar in that, unlike Chomsky's, it does not treat syntax as the sole generative component from which meaning and phonology are derived ... According to him, a generative grammar consists of five major components: the lexicon, the base component, the transformational component, the phonological ... Arguing against the syntax-centered view of generative grammar (syntactocentrism), he specifically treats phonology, syntax, and semantics as three parallel generative processes ...
Transformational Grammar - Transformations
... originally proposed in the earliest forms of generative grammar (e.g. ... they are still present in tree-adjoining grammar as the Substitution and Adjunction operations, and they have recently re-emerged in mainstream generative grammar in Minimalism, as the operations Merge ... In generative phonology, another form of transformation is the phonological rule, which describes a mapping between an underlying representation (the phoneme) and the ...
Generative Grammar - Music
... Generative grammar has been used to a limited extent in music theory and analysis since the 1980s ... More recently, such early generative approaches to music have been further developed and extended by several scholars ...

Famous quotes related to generative grammar:

    Hence, a generative grammar must be a system of rules that can iterate to generate an indefinitely large number of structures. This system of rules can be analyzed into the three major components of a generative grammar: the syntactic, phonological, and semantic components.
    Noam Chomsky (b. 1928)