Source: Free On-Line Dictionary of Computing
lexical analysis
(Or "linear analysis", "scanning") The first
stage of processing a language. The stream of characters
making up the source program or other input is read one character at a
time and grouped into {lexeme}s (or "tokens") - word-like
pieces such as keywords, identifiers, {literal}s and
punctuation. The lexemes are then passed to the {parser}.
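
A minimal sketch of such a scanner, written in Python for a toy
language; the token names, keyword set and regular expressions are
illustrative assumptions, not part of any particular compiler:

	import re

	# Token classes for a hypothetical toy language: each entry pairs a
	# token name with the regular expression recognising its lexemes.
	TOKEN_SPEC = [
	    ("NUMBER", r"\d+"),           # integer literal
	    ("IDENT",  r"[A-Za-z_]\w*"),  # identifier or keyword
	    ("PUNCT",  r"[(){};=+*-]"),   # punctuation and operators
	    ("SKIP",   r"\s+"),           # whitespace, discarded
	]
	KEYWORDS = {"if", "else", "while", "return"}
	MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

	def tokenize(source):
	    """Read the character stream and group it into (kind, lexeme) pairs."""
	    pos = 0
	    while pos < len(source):
	        match = MASTER.match(source, pos)
	        if match is None:
	            raise SyntaxError(f"unexpected character {source[pos]!r} at {pos}")
	        kind, lexeme = match.lastgroup, match.group()
	        pos = match.end()
	        if kind == "SKIP":
	            continue
	        if kind == "IDENT" and lexeme in KEYWORDS:
	            kind = "KEYWORD"
	        yield (kind, lexeme)  # each pair is handed on to the parser

	# Example: list(tokenize("if (x) return x;")) yields
	# [('KEYWORD', 'if'), ('PUNCT', '('), ('IDENT', 'x'), ('PUNCT', ')'),
	#  ('KEYWORD', 'return'), ('IDENT', 'x'), ('PUNCT', ';')]
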
["Compilers - Principles, Techniques and Tools", by Alfred
V. Aho, Ravi Sethi and Jeffrey D. Ullman, pp. 4-5]
(1995-04-05)