A subfield of linguistics, syntax is the study of the rules, or "patterned relations," that govern the way the words in a sentence come together. It concerns how words, categorized as nouns, adjectives, verbs, and so on (a classification going back to Dionysius Thrax), are combined into clauses, which in turn combine into sentences.
(Figure: Fields and subfields within linguistics.)
In the framework of transformational-generative grammar (see also transformational grammar for information on the development of the theory), the structure of a sentence is represented by phrase structure trees. Such a tree provides three types of information about the sentence it represents:
- the linear order of the words in the sentence (though not in all theories of syntax)
- the groupings of words into syntactic categories
- the hierarchical structure of the syntactic categories.
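The three types of information can be made concrete with a small sketch. Below, a phrase structure tree for the hypothetical example sentence "the dog chased a cat" is encoded as nested Python tuples (the sentence, the category labels, and the helper functions are all illustrative, not drawn from any particular grammar): the left-to-right leaves give the linear order, the node labels give the syntactic categories, and the nesting gives the hierarchy.

```python
# A phrase structure tree as nested tuples: each node is
# (category, child, child, ...); leaves are plain word strings.
tree = (
    "S",
    ("NP", ("Det", "the"), ("N", "dog")),
    ("VP",
        ("V", "chased"),
        ("NP", ("Det", "a"), ("N", "cat"))),
)

def leaves(node):
    """Recover the linear order of words by reading leaves left to right."""
    if isinstance(node, str):
        return [node]
    return [word for child in node[1:] for word in leaves(child)]

def categories(node):
    """Collect the syntactic category labels appearing in the tree."""
    if isinstance(node, str):
        return set()
    return {node[0]}.union(*(categories(child) for child in node[1:]))

print(leaves(tree))              # linear order of the words
print(sorted(categories(tree)))  # syntactic categories used
```

Reading the leaves yields `['the', 'dog', 'chased', 'a', 'cat']`, while the labels `S`, `NP`, `VP`, `Det`, `N`, `V` and their nesting capture the other two kinds of information.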
In computer science, the term syntax is used to denote the literal text of something written in a formal language or programming language, as opposed to its semantics or meaning.
The analysis of programming language syntax usually entails the transformation of a linear sequence of tokens (a token is akin to an individual word or punctuation mark in a natural language) into a hierarchical syntax tree (abstract syntax trees are one convenient form of syntax tree). This process, called parsing, is in some respects analogous to syntactic analysis in linguistics; in fact, certain concepts, such as the Chomsky hierarchy and context-free grammars, are common to the study of syntax in both linguistics and computer science. However, the applications of these concepts vary widely between the two fields, and the practical resemblances are small.