The cost of lexical analysis
- 1 May 1986
- journal article
- Published by Wiley in Software: Practice and Experience
- Vol. 16 (5), 473-488
- https://doi.org/10.1002/spe.4380160508
Abstract
This paper examines a common design for a lexical analyser and its supporting modules. An implementation of the design was tuned to produce the best possible performance. In effect, many of the optimizations that one would expect of a production-quality compiler were carried out by hand. After measuring the cost of tokenizing two large programs with this version, the code was 'detuned' to remove specific optimizations and the measurements were repeated. In all cases, the basic algorithm was unchanged, so that the difference in cost is an indication of the effectiveness of the optimization. Comparisons were also made with a tool-generated lexical analyser for the same task. On the basis of the measurements, several specific design and optimization strategies are recommended. These recommendations are also valid for software other than lexical analysers.
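For readers unfamiliar with the kind of hand optimization the abstract alludes to, the sketch below shows one technique commonly applied to lexical analysers of this era: replacing repeated per-character predicate calls with a single precomputed character-class table indexed directly by the input character. This is an illustrative example only, not the paper's implementation; all names and the token grammar are assumptions.

```c
/* Illustrative sketch (not from the paper): classify characters once into a
 * table, then drive the scanning loop off table lookups instead of calling
 * isalpha()/isdigit() for every character. */
#include <stdio.h>

enum cclass { C_OTHER, C_LETTER, C_DIGIT, C_SPACE };

static unsigned char class_of[256];

static void init_classes(void)
{
    int c;
    for (c = 'a'; c <= 'z'; c++) class_of[c] = C_LETTER;
    for (c = 'A'; c <= 'Z'; c++) class_of[c] = C_LETTER;
    for (c = '0'; c <= '9'; c++) class_of[c] = C_DIGIT;
    class_of[' '] = class_of['\t'] = class_of['\n'] = C_SPACE;
}

/* Count identifier and number tokens in a buffer using the class table. */
static void count_tokens(const char *src, int *idents, int *numbers)
{
    const unsigned char *p = (const unsigned char *)src;
    *idents = *numbers = 0;
    while (*p) {
        switch (class_of[*p]) {
        case C_LETTER:                /* identifier: letter (letter|digit)* */
            (*idents)++;
            while (class_of[*p] == C_LETTER || class_of[*p] == C_DIGIT) p++;
            break;
        case C_DIGIT:                 /* number: digit+ */
            (*numbers)++;
            while (class_of[*p] == C_DIGIT) p++;
            break;
        default:
            p++;                      /* skip whitespace and punctuation */
        }
    }
}

int main(void)
{
    int idents, numbers;
    init_classes();
    count_tokens("count := count + 42", &idents, &numbers);
    printf("identifiers: %d, numbers: %d\n", idents, numbers);
    return 0;
}
```

Detuning, in the paper's sense, would mean removing an optimization like this table while keeping the same scanning algorithm, then re-measuring the tokenizing cost to isolate the optimization's contribution.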