Package | Description |
---|---|
org.stekikun.dolmen.debug | This package contains generic utility functions to help debug Dolmen-generated lexical analyzers and parsers. |
Modifier and Type | Interface and Description |
---|---|
static interface | TokenVisualizer.LexerInterface<L extends LexBuffer,T,Cat> This interface acts as a generic proxy for using a Dolmen-generated lexer with the static debugging functions provided in TokenVisualizer. |
Modifier and Type | Method and Description |
---|---|
static <L extends LexBuffer,T> | Tokenizer.LexerInterface.of(BiFunction<String,Reader,L> makeLexer, Function<L,T> entry, T eofToken) Typical usage of this method, when MyLexer has been generated by Dolmen with a main entry point and uses some Token.EOF token for end-of-input: LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF) (see the sketch after this table). |
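
A minimal sketch of the pattern above. Here MyLexer stands for any Dolmen-generated lexer (with a constructor taking an input name and a Reader, and a main entry point) and Token for its token type with an EOF constant; both names are placeholders, not part of the debug package:

```java
import org.stekikun.dolmen.debug.Tokenizer;

class LexerInterfaceDemo {
    // MyLexer and Token are placeholders for a Dolmen-generated lexer
    // (which extends LexBuffer) and its generated token class.
    static Tokenizer.LexerInterface<MyLexer, Token> lexer() {
        return Tokenizer.LexerInterface.of(
            MyLexer::new,   // BiFunction<String,Reader,MyLexer>: (input name, reader) -> lexer
            MyLexer::main,  // Function<MyLexer,Token>: the lexer's main entry point
            Token.EOF);     // the token that signals end-of-input
    }
}
```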
Modifier and Type | Method and Description |
---|---|
static <L extends LexBuffer,T> | Tokenizer.file(Tokenizer.LexerInterface<L,T> lexer, File input, File output, boolean positions) Uses the given lexer interface to tokenize the contents of the file input and stores the result in the output file. |
static <L extends LexBuffer,T> | Tokenizer.prompt(Tokenizer.LexerInterface<L,T> lexer, boolean positions) This method can be used to conveniently test a lexical analyzer against one-line sentences entered manually or fed from a test file. |
static <L extends LexBuffer,T> | Tokenizer.tokenize(Tokenizer.LexerInterface<L,T> lexer, String inputName, Reader reader, Writer writer, boolean positions) Initializes a lexical analyzer on the given input stream, based on the lexer interface, and repeatedly consumes tokens from the input until the halting condition in lexer is met. A combined usage sketch of these three methods follows this table. |
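
A rough illustration of how these three entry points fit together, reusing the placeholder MyLexer/Token lexer interface from the earlier sketch. The file names, the sample input string, the throws clause, and the values passed for positions are assumptions for the example, not prescribed by the API:

```java
import java.io.File;
import java.io.StringReader;
import java.io.StringWriter;

import org.stekikun.dolmen.debug.Tokenizer;

class TokenizerDemo {
    public static void main(String[] args) throws Exception {
        // Lexer interface built as in the previous sketch (MyLexer/Token are placeholders).
        Tokenizer.LexerInterface<MyLexer, Token> lexer =
            Tokenizer.LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF);

        // Interactive testing: enter one-line sentences and inspect the resulting tokens.
        Tokenizer.prompt(lexer, true);

        // Tokenize a whole file and store the token stream in an output file.
        Tokenizer.file(lexer, new File("sample.txt"), new File("sample.tokens"), true);

        // Lower-level variant: tokenize an arbitrary Reader and write the tokens to a Writer.
        StringWriter out = new StringWriter();
        Tokenizer.tokenize(lexer, "inline input", new StringReader("1 + 2 * 3"), out, false);
        System.out.println(out);
    }
}
```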