An Introduction to Transformational Grammar

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages.

Grammaticality

Chomsky argued that the notions "grammatical" and "ungrammatical" could be defined in a meaningful and useful way. In contrast, an extreme behaviorist linguist would argue that language can be studied only through recordings or transcriptions of actual speech and that the role of the linguist is to look for patterns in such observed speech but not to hypothesize about why such patterns might occur or to label particular utterances as either "grammatical" or "ungrammatical". Although few linguists in the 1950s actually took such an extreme position, Chomsky was at the opposite extreme, defining grammaticality in an unusually mentalistic way (for the time).[12] He argued that the intuition of a native speaker is enough to define the grammaticality of a sentence; that is, if a particular string of English words elicits a double take or a feeling of wrongness in a native English speaker, provided that various extraneous factors affecting intuitions are controlled for, it can be said that the string of words is ungrammatical. That, according to Chomsky, is entirely distinct from the question of whether a sentence is meaningful or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example, "colorless green ideas sleep furiously".[13] However, such sentences manifest a linguistic problem that is distinct from that posed by meaningful but ungrammatical (non-)sentences such as "man the bit sandwich the", the meaning of which is fairly clear but which no native speaker would accept as well formed. Consequently, the linguist can study an idealised version of language, which greatly simplifies linguistic analysis.

Another idea central to Chomsky's work relates directly to the evaluation of theories of grammar. Chomsky distinguished between grammars that achieve descriptive adequacy and those that go further and achieve explanatory adequacy. A descriptively adequate grammar for a particular language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar that achieves explanatory adequacy has the additional property that it gives an insight into the underlying linguistic structures in the human mind. In other words, it does not merely describe the grammar of a language but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, the nature of such mental representations is largely innate, so if a grammatical theory has explanatory adequacy, it must be able to explain the various grammatical nuances of the languages of the world as relatively minor variations in the universal pattern of human language.

In Chomsky's terms, I-language is taken to be the object of study in linguistic theory; it is the mentally represented linguistic knowledge that a native speaker of a language has and so is a mental object. From that perspective, most of theoretical linguistics is a branch of psychology. E-language encompasses all other notions of what a language is, such as a body of knowledge or behavioural habits shared by a community. Thus, E-language by itself is not a coherent concept,[11] and Chomsky argues that such notions of language are not useful in the study of innate linguistic knowledge or competence, even though they may seem sensible, intuitive, and useful in other areas of study. Competence, he argues, can be studied only if languages are treated as mental objects.

Development of basic concepts

Though transformations continue to be important in Chomsky's current theories, he has now abandoned the original notion of Deep Structure and Surface Structure. Initially, two additional levels of representation were introduced (LF, or Logical Form, and PF, or Phonetic Form), but in the 1990s Chomsky sketched out a new program of research known as Minimalism, in which Deep Structure and Surface Structure are no longer featured and PF and LF remain as the only levels of representation. To complicate the understanding of the development of Chomsky's theories, the precise meanings of Deep Structure and Surface Structure have changed over time. By the 1970s, they were normally referred to simply as D-Structure and S-Structure by Chomskyan linguists. In particular, the idea that the meaning of a sentence was determined by its Deep Structure (taken to its logical conclusions by the generative semanticists during the same period) was dropped for good by Chomskyan linguists when LF took over this role (previously, Chomsky and Ray Jackendoff had begun to argue that meaning was determined by both Deep and Surface Structure).

Innate linguistic knowledge

Using a term such as "transformation" may give the impression that theories of transformational generative grammar are intended as a model for the processes through which the human mind constructs and understands sentences, but Chomsky clearly stated that a generative grammar models only the knowledge that underlies the human ability to speak and understand.

Aspects of the Theory of Syntax

Formal definition

Chomsky's advisor, Zellig Harris, took transformations to be relations between sentences such as "I finally met this talkshow host you always detested" and simpler (kernel) sentences such as "I finally met this talkshow host" and "You always detested this talkshow host." Chomsky developed a formal theory of grammar in which transformations manipulated not only the surface strings but also the parse trees associated with them, making transformational grammar a system of tree automata. This notion of transformation proved adequate for subsequent versions, including the "extended," "revised extended," and Government-Binding (GB) versions of generative grammar, but it may no longer be sufficient for the current minimalist grammar, as merge may require a formal definition that goes beyond the tree manipulation characteristic of Move α.
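To make the tree-manipulation idea concrete, the following is a minimal sketch in Python rather than any formalization from the literature: a phrase marker is encoded as nested tuples, and a toy statement-to-question transformation fronts the auxiliary by reordering whole constituents of the tree. The example sentence, the tree shape, and the rule itself are illustrative assumptions only.

    # Illustrative sketch only: the tree encoding and the toy rule below are
    # assumptions for exposition, not Chomsky's or Harris's actual formal rules.

    # A phrase marker is encoded as (label, children...) tuples; leaves are words.
    KERNEL = ("S",
              ("NP", ("Det", "the"), ("N", "talkshow host")),
              ("Aux", "will"),
              ("VP", ("V", "arrive")))

    def surface_string(tree):
        """Read the surface string off a phrase marker (its leaves, left to right)."""
        if isinstance(tree, str):
            return [tree]
        _label, *children = tree
        words = []
        for child in children:
            words.extend(surface_string(child))
        return words

    def question_transformation(tree):
        """Toy transformation: front the Aux constituent of a declarative S.

        The rule reorders whole subtrees rather than editing the word string,
        which is the sense in which transformations manipulate parse trees.
        """
        label, *children = tree
        assert label == "S"
        aux = next(child for child in children if child[0] == "Aux")
        rest = [child for child in children if child[0] != "Aux"]
        return ("S", aux, *rest)

    print(" ".join(surface_string(KERNEL)))
    # -> the talkshow host will arrive
    print(" ".join(surface_string(question_transformation(KERNEL))))
    # -> will the talkshow host arrive

The only point of the sketch is that the rule targets constituents (the Aux and NP nodes) rather than positions in a flat string, which is the intuition behind describing transformational grammar as a system of tree automata.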
Transformational grammar considers a grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language, and it involves the use of defined operations (called transformations) to produce new sentences from existing ones. A transformational-generative (or simply transformational) grammar thus involved two types of productive rules: phrase structure rules, such as "S -> NP VP" (a sentence may consist of a noun phrase followed by a verb phrase), which could be used to generate grammatical sentences with associated parse trees (phrase markers, or P-markers); and transformational rules, such as rules for converting statements to questions or active to passive voice, which acted on the phrase markers to produce other grammatically correct sentences. In this context, transformational rules are not strictly necessary for the purpose of generating the set of grammatical sentences in a language, since that can be done using phrase structure rules alone, but the use of transformations provides economy in some cases (the total number of rules can thus be reduced), and it also provides a way of representing the grammatical relations that exist between sentences, which would not otherwise be reflected in a system with phrase structure rules alone.

Noam Chomsky's 1965 book Aspects of the Theory of Syntax developed the idea that each sentence in a language has two levels of representation: a deep structure and a surface structure.[1][2] The deep structure represents the core semantic relations of a sentence and is mapped onto the surface structure, which follows the phonological form of the sentence very closely, via transformations. However, the concept of transformations had been proposed prior to the development of deep structure to increase the mathematical and descriptive power of context-free grammars; deep structure itself was developed largely for technical reasons related to early semantic theory.

Chomsky argued that because most of the knowledge underlying the human ability to speak and understand is innate, a baby can have a large body of prior knowledge about the structure of language in general and so needs to learn only the idiosyncratic features of the language(s) to which it is exposed. He quoted philosophers who posited the same basic idea several centuries ago, and he helped to make the innateness theory respectable after a period dominated by more behaviorist attitudes towards language. He made concrete and technically sophisticated proposals about the structure of language, as well as important proposals regarding how the success of grammatical theories should be evaluated.

Grammatical theories

In the 1960s, Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories. One was the distinction between competence and performance.[9] Chomsky noted the obvious fact that people, when they speak in the real world, often make linguistic errors, such as starting a sentence and then abandoning it midway through. He argued that such errors in linguistic performance are irrelevant to the study of linguistic competence, the knowledge that allows people to construct and understand grammatical sentences. The other was the distinction, discussed above, between descriptive and explanatory adequacy. Chomsky argued that even though linguists were still a long way from constructing descriptively adequate grammars, progress in terms of descriptive adequacy would come only if linguists held explanatory adequacy as their goal: real insight into the structure of individual languages can be gained only by comparative study of a wide range of languages, on the assumption that they are all cut from the same cloth.[citation needed]

"I-language" and "E-language"

In 1986, Chomsky proposed a distinction between I-language and E-language that is similar but not identical to the competence/performance distinction.[10] "I-language" refers to internal language and is contrasted with "E-language", which refers to external language.

Finally, Chomsky emphasized the importance of modern formal mathematical devices in the development of grammatical theory: "But the fundamental reason for [the] inadequacy of traditional grammars is a more technical one. Although it was well understood that linguistic processes are in some sense 'creative,' the technical devices for expressing a system of recursive processes were simply not available until much more recently. In fact, a real understanding of how a language can (in Humboldt's words) 'make infinite use of finite means' has developed only within the last thirty years, in the course of studies in the foundations of mathematics."
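The point about recursion can be given a small illustration. The grammar below is an invented toy, not a fragment of English from the literature: it uses phrase structure rules of the "S -> NP VP" kind described above, and a single recursive noun-phrase rule (an NP may contain a further S) is already enough for a finite rule inventory to license an unbounded set of sentences.

    import random

    # Toy phrase structure grammar, invented for illustration. The recursive NP
    # option ("Det N that S") is what lets finitely many rules describe an
    # unbounded set of sentences: "infinite use of finite means".
    RULES = {
        "S":   [["NP", "VP"]],
        "NP":  [["Det", "N"], ["Det", "N", "that", "S"]],   # second option is recursive
        "VP":  [["V", "NP"], ["V"]],
        "Det": [["the"], ["a"]],
        "N":   [["host"], ["sandwich"], ["idea"]],
        "V":   [["detested"], ["bit"], ["sleeps"]],
    }

    def generate(symbol="S", depth=0, max_depth=4):
        """Randomly expand a symbol into a list of words using RULES."""
        if symbol not in RULES:              # terminal word
            return [symbol]
        options = RULES[symbol]
        if depth >= max_depth:               # stop introducing new clauses so the sketch terminates
            options = [o for o in options if "S" not in o] or options
        words = []
        for sym in random.choice(options):
            words.extend(generate(sym, depth + 1, max_depth))
        return words

    for _ in range(3):
        print(" ".join(generate()))          # e.g. "the host detested a sandwich"

Raising max_depth requires no new rules; the same finite system simply licenses longer and more deeply embedded sentences, which is the sense of Humboldt's phrase quoted above.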