Package | Description
---|---
org.stekikun.dolmen.codegen |
org.stekikun.dolmen.debug | This package contains generic utility functions to help debug Dolmen-generated lexical analyzers and parsers.
Modifier and Type | Field and Description
---|---
protected LexBuffer | BaseParser._jl_lexbuf: The underlying lexing buffer.
Modifier and Type | Interface and Description
---|---
static interface | Tokenizer.LexerInterface<L extends LexBuffer,T>: This interface acts as a generic proxy for using a Dolmen-generated lexer in the static debugging functions provided in Tokenizer.
static interface | TokenVisualizer.LexerInterface<L extends LexBuffer,T,Cat>: This interface acts as a generic proxy for using a Dolmen-generated lexer in the static debugging functions provided in TokenVisualizer.
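For orientation, here is a minimal sketch of how these two proxy interfaces are typically obtained, using the of factory methods listed in the table below. MyLexer and Token are hypothetical stand-ins for a Dolmen-generated lexer and its token type (the same placeholder names used in the "Typical usage" snippets below), so the fragment is a template rather than code that compiles as-is.

```java
import org.stekikun.dolmen.debug.TokenVisualizer;
import org.stekikun.dolmen.debug.Tokenizer;

class LexerProxies {
    // MyLexer stands for a hypothetical Dolmen-generated lexer with a
    // (String, Reader) constructor and a main() entry rule returning Token;
    // Token.EOF is its end-of-input token.
    static final Tokenizer.LexerInterface<MyLexer, Token> TOKENIZER_PROXY =
        Tokenizer.LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF);

    // The visualizer proxy additionally takes a categorizer used to group
    // tokens for display; here each token's class serves as its category,
    // which is the choice suggested in the javadoc's own usage snippet.
    static final TokenVisualizer.LexerInterface<MyLexer, Token, Class<? extends Token>> VISUALIZER_PROXY =
        TokenVisualizer.LexerInterface.of(
            MyLexer::new, MyLexer::main, Token::getClass, Token.EOF);
}
```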
Modifier and Type | Method and Description
---|---
static <L extends LexBuffer,T> | Tokenizer.file(Tokenizer.LexerInterface<L,T> lexer, File input, File output, boolean positions): Uses the given lexer interface to tokenize the contents of the file input, and stores the result in the output file.
static <L extends LexBuffer,T,Cat> | TokenVisualizer.file(TokenVisualizer.LexerInterface<L,T,Cat> lexer, String input, String output): Outputs to the given output file a stand-alone HTML page which displays the tokenization of the given input file contents.
static <L extends LexBuffer,T,Cat> | TokenVisualizer.LexerInterface.of(BiFunction<String,Reader,L> makeLexer, Function<L,T> entry, Function<T,Cat> categorizer, T eofToken): Typical usage of this method when MyLexer has been generated by Dolmen with a main entry point and uses some Token.EOF token for the end-of-input: LexerInterface.of(MyLexer::new, MyLexer::main, Token::getClass, Token.EOF)
static <L extends LexBuffer,T> | Tokenizer.LexerInterface.of(BiFunction<String,Reader,L> makeLexer, Function<L,T> entry, T eofToken): Typical usage of this method when MyLexer has been generated by Dolmen with a main entry point and uses some Token.EOF token for the end-of-input: LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF)
static <L extends LexBuffer,T> | Tokenizer.prompt(Tokenizer.LexerInterface<L,T> lexer, boolean positions): This method can be used to conveniently test a lexical analyzer against various one-line sentences entered manually or fed from a test file.
static <L extends LexBuffer,T,Cat> | TokenVisualizer.string(TokenVisualizer.LexerInterface<L,T,Cat> lexer, String input, Writer output): Outputs to the given writer a stand-alone HTML page which displays the tokenization of the given input string's contents.
static <L extends LexBuffer,T> | Tokenizer.tokenize(Tokenizer.LexerInterface<L,T> lexer, String inputName, Reader reader, Writer writer, boolean positions): Initializes a lexical analyzer with the given input stream, based on the lexer interface, and repeatedly consumes tokens from the input until the halting condition in lexer is met.
static <L extends LexBuffer,T,Cat> | TokenVisualizer.visualize(TokenVisualizer.LexerInterface<L,T,Cat> lexer, String inputName, Reader input, Writer output): Outputs to the given writer a stand-alone HTML page which displays the tokenization of the given input contents.
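Putting the pieces together, the sketch below drives the main debugging entry points from the table above with proxies built via of. As before, MyLexer and Token are hypothetical Dolmen-generated classes, the file names and sample input are made up for illustration, and error handling is reduced to a throws clause.

```java
import java.io.File;
import java.io.IOException;
import java.io.StringWriter;

import org.stekikun.dolmen.debug.TokenVisualizer;
import org.stekikun.dolmen.debug.Tokenizer;

public class DebugMyLexer {
    public static void main(String[] args) throws IOException {
        // Proxy for the Tokenizer helpers: lexer constructor, entry point,
        // and the end-of-input token (MyLexer and Token are hypothetical).
        Tokenizer.LexerInterface<MyLexer, Token> tok =
            Tokenizer.LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF);

        // Interactively tokenize one-line sentences typed on the console,
        // printing token positions as well.
        Tokenizer.prompt(tok, true);

        // Tokenize a whole file and store the resulting token stream
        // in another file (both file names are placeholders).
        Tokenizer.file(tok, new File("input.txt"), new File("input.tokens"), false);

        // Proxy for the TokenVisualizer helpers: adds a categorizer that
        // maps each token to the category used when rendering it.
        TokenVisualizer.LexerInterface<MyLexer, Token, Class<? extends Token>> vis =
            TokenVisualizer.LexerInterface.of(
                MyLexer::new, MyLexer::main, Token::getClass, Token.EOF);

        // Render the tokenization of a small input string as a
        // stand-alone HTML page written to an in-memory buffer.
        StringWriter html = new StringWriter();
        TokenVisualizer.string(vis, "let x = 1 + 2", html);
        System.out.println(html);
    }
}
```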