Lex machinery for Go.
Full Changelog: https://github.com/timtadh/lexmachine/compare/v0.2.2...v0.2.3
This release includes PRs #21 and #23, both of which are minor bug fixes.
#21 corrects a long-standing mistake in both back-ends where the lexical analysis machine could, in certain circumstances, skip input bytes. This occurred when the analysis had proceeded several characters past the final character of the ultimate best match. Because this situation is quite rare, it was only recently noticed.
#23 prevents the user from specifying tokens which match the empty string. As noted in #22, there is no good way to handle this case while still ensuring the scanner makes progress. Since there is no apparent use case for tokens which match the empty string (and given the problems such tokens can cause), they are now statically disallowed.
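A minimal sketch of what the new restriction means in practice, assuming (as "statically disallowed" suggests) that a pattern which can match the empty string is rejected when the lexer is compiled; the pattern and token type below are illustrative, not taken from the PR:

```go
package main

import (
	"fmt"

	"github.com/timtadh/lexmachine"
	"github.com/timtadh/lexmachine/machines"
)

func main() {
	lexer := lexmachine.NewLexer()
	// `a*` also matches the empty string, so as of v0.2.3 compiling
	// this lexer is expected to fail rather than risk a scanner that
	// can loop forever without consuming input.
	lexer.Add([]byte(`a*`), func(s *lexmachine.Scanner, m *machines.Match) (interface{}, error) {
		return s.Token(0, string(m.Bytes), m), nil
	})
	if err := lexer.Compile(); err != nil {
		fmt.Println("rejected:", err)
	}
}
```

Using `a+` instead would compile, since it must consume at least one byte.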
Changes
This release adds a new deterministic finite automaton (DFA) backend for lexmachine. For most users this will improve the speed of tokenization, and it is now the default backend. The NFA-based engine is still available via the new function CompileNFA().
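The backend choice can be sketched as follows. This is a minimal example based on lexmachine's public API; the patterns, token types, and input are illustrative:

```go
package main

import (
	"fmt"

	"github.com/timtadh/lexmachine"
	"github.com/timtadh/lexmachine/machines"
)

// newLexer builds a small lexer with an illustrative rule set.
func newLexer() *lexmachine.Lexer {
	lexer := lexmachine.NewLexer()
	// Words become tokens of type 0.
	lexer.Add([]byte(`[a-z]+`), func(s *lexmachine.Scanner, m *machines.Match) (interface{}, error) {
		return s.Token(0, string(m.Bytes), m), nil
	})
	// Whitespace is skipped by returning a nil token.
	lexer.Add([]byte(` `), func(s *lexmachine.Scanner, m *machines.Match) (interface{}, error) {
		return nil, nil
	})
	return lexer
}

func main() {
	// Compile() now uses the faster DFA backend by default.
	dfa := newLexer()
	if err := dfa.Compile(); err != nil {
		panic(err)
	}

	// CompileNFA() keeps the previous NFA-based engine available.
	nfa := newLexer()
	if err := nfa.CompileNFA(); err != nil {
		panic(err)
	}

	scanner, err := dfa.Scanner([]byte("hello world"))
	if err != nil {
		panic(err)
	}
	for tok, err, eof := scanner.Next(); !eof; tok, err, eof = scanner.Next() {
		if err != nil {
			panic(err)
		}
		fmt.Println(tok.(*lexmachine.Token).Value)
	}
}
```

Either compile path produces the same token stream; only the matching engine underneath differs.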