The theoretical portion is primarily concerned with the syntax, grammar and semantics of programming languages, which gives this area of computer science a strong tie with linguistics. Some courses on compiler construction include a simplified grammar of a spoken language that can be used to form valid sentences, as an analogy to help students understand how grammar works for programming languages.
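The spoken-language analogy above can be made concrete with production rules. The following is a minimal sketch in Python of a toy grammar for a tiny fragment of English (the rules and word lists are illustrative assumptions, not from the original text), written in the same production-rule style used to specify programming languages:

```python
import itertools

# Toy grammar (hypothetical, for illustration):
#   Sentence   -> NounPhrase VerbPhrase
#   NounPhrase -> Article Noun
#   VerbPhrase -> Verb NounPhrase
GRAMMAR = {
    "Sentence":   [["NounPhrase", "VerbPhrase"]],
    "NounPhrase": [["Article", "Noun"]],
    "VerbPhrase": [["Verb", "NounPhrase"]],
    "Article":    [["the"], ["a"]],
    "Noun":       [["student"], ["compiler"]],
    "Verb":       [["writes"], ["parses"]],
}

def generate(symbol):
    """Expand a symbol into every word sequence the grammar derives from it."""
    if symbol not in GRAMMAR:                    # terminal word: yields itself
        return [[symbol]]
    results = []
    for production in GRAMMAR[symbol]:
        expansions = [generate(s) for s in production]
        for combo in itertools.product(*expansions):
            results.append([word for part in combo for word in part])
    return results
```

For example, `" ".join(generate("Sentence")[0])` yields a valid sentence such as "the student writes the student"; a programming-language grammar is specified the same way, with token classes in place of words.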
The practical portion covers the actual implementation of compilers. Students will typically end up writing the front end of a compiler for a simple teaching language, such as Micro.
The first phase of compilation is lexical analysis of the source code. This phase involves grouping the characters into lexemes. Lexemes belong to token classes such as "integer", "identifier", or "whitespace". A token of the form <token-class, attribute-value> is produced for each lexeme. Lexical analysis is also called scanning.
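A minimal sketch of such a scanner in Python (the token classes and regular expressions are illustrative assumptions) groups characters into lexemes and emits a <token-class, attribute-value> pair for each one:

```python
import re

# Hypothetical token classes, each defined by a regular expression.
TOKEN_RULES = [
    ("integer",    r"\d+"),
    ("identifier", r"[A-Za-z_]\w*"),
    ("operator",   r"[+\-*/=]"),
    ("whitespace", r"\s+"),
]

def tokenize(source):
    """Group the characters of `source` into lexemes and emit tokens."""
    tokens = []
    pos = 0
    while pos < len(source):
        for token_class, pattern in TOKEN_RULES:
            match = re.match(pattern, source[pos:])
            if match:
                lexeme = match.group(0)
                if token_class != "whitespace":   # whitespace is usually discarded
                    tokens.append((token_class, lexeme))
                pos += len(lexeme)
                break
        else:
            raise SyntaxError(f"unexpected character {source[pos]!r}")
    return tokens
```

For example, `tokenize("x = 42")` produces `[("identifier", "x"), ("operator", "="), ("integer", "42")]`.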
The second phase of constructing a compiler is syntax analysis. The output of the lexical analyser is used to create a representation that shows the grammatical structure of the tokens. Syntax analysis is also called parsing.
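One common way to build such a representation is recursive descent, where each grammar rule becomes a function. A minimal Python sketch for a hypothetical arithmetic grammar (expr -> term ('+' term)*, term -> factor ('*' factor)*, factor -> integer) turns a token stream into a nested-tuple parse tree:

```python
# Each function parses one grammar rule and returns (tree, next position).

def parse_expr(tokens, pos=0):
    tree, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        right, pos = parse_term(tokens, pos + 1)
        tree = ("+", tree, right)                 # left-associative addition
    return tree, pos

def parse_term(tokens, pos):
    tree, pos = parse_factor(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "*":
        right, pos = parse_factor(tokens, pos + 1)
        tree = ("*", tree, right)                 # '*' binds tighter than '+'
    return tree, pos

def parse_factor(tokens, pos):
    token = tokens[pos]
    if token.isdigit():
        return int(token), pos + 1
    raise SyntaxError(f"unexpected token {token!r}")
```

For example, `parse_expr(["1", "+", "2", "*", "3"])[0]` yields `("+", 1, ("*", 2, 3))`, a tree that makes the grammatical structure (multiplication grouped under addition) explicit.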
- Aho, Alfred V.; Lam, Monica S.; Sethi, Ravi; Ullman, Jeffrey D. (2007), Compilers: Principles, Techniques, & Tools (2nd ed.), Pearson, ISBN 978-81-317-2101-8
- Michael Wolfe. High-Performance Compilers for Parallel Computing. ISBN 978-0-8053-2730-4
- Crenshaw, Jack, Let's Build a Compiler, a tutorial on compiler construction.