public static interface TokenVisualizer.LexerInterface&lt;L extends LexBuffer,T,Cat&gt; extends Tokenizer.LexerInterface&lt;L,T&gt;

Type Parameters:
- `L` - the type of generated lexing buffer to analyze
- `T` - the type of tokens produced
- `Cat` - a category type for tokens
This interface describes a lexical analyzer suitable for use in a TokenVisualizer. It must provide a way to create a lexer, an entry point to use to tokenize input, and a halting condition used to stop the tokenization (typically, recognizing an end-of-input special token). Additionally, it must provide a way to sort tokens into various categories; tokens in the same category will be highlighted in a similar fashion. The categories can be anything; typical choices include:
- `Class` as the category type and `Object.getClass()` as the category function, to differentiate between various kinds of valued tokens on one side, and constant tokens on the other side;
- `Function.identity()` as the category function, to differentiate all distinct tokens;
- a `getKind()` method as the category function, to categorize tokens based on what terminal symbol they correspond to.
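The first two choices can be sketched in plain Java. The `Token`, `IntToken`, and `PlusToken` classes below are hypothetical stand-ins for generated token types, not part of the TokenVisualizer API:

```java
import java.util.List;
import java.util.function.Function;

// Hypothetical token classes, used only to illustrate category choices.
abstract class Token {}
final class IntToken extends Token {
    final int value;
    IntToken(int value) { this.value = value; }
}
final class PlusToken extends Token {}

public class CategoryDemo {
    // Count how many distinct categories a categorizer yields over some tokens.
    static long countCategories(List<Token> tokens, Function<Token, ?> categorizer) {
        return tokens.stream().map(categorizer).distinct().count();
    }

    public static void main(String[] args) {
        List<Token> tokens = List.of(new IntToken(1), new PlusToken(), new IntToken(2));

        // Categorize by Java class: both IntTokens share one category.
        System.out.println(countCategories(tokens, Token::getClass));      // 2

        // Categorize by identity: every token is its own category.
        System.out.println(countCategories(tokens, Function.identity())); // 3
    }
}
```

With `Token::getClass`, all valued tokens of one class collapse into a single highlighting category, whereas `Function.identity()` gives each distinct token its own.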
For convenience, a static factory `of(BiFunction, Function, Function, Object)` is provided: if `MyLexer` has been generated by Dolmen with a `main` entry point and uses some `Token.EOF` token for the end-of-input, one can simply use `LexerInterface.of(MyLexer::new, MyLexer::main, Token::getClass, Token.EOF)` to build a suitable tokenizing interface for that lexical analyzer which simply differentiates tokens based on their Java class.
Modifier and Type | Method and Description |
---|---|
`Cat` | `category(T token)` Tokens with the same category will be highlighted in a similar fashion. |
`static <L extends LexBuffer,T,Cat> TokenVisualizer.LexerInterface<L,T,Cat>` | `of(BiFunction<String,Reader,L> makeLexer, Function<L,T> entry, Function<T,Cat> categorizer, T eofToken)` Typical usage of this method when `MyLexer` has been generated by Dolmen with a `main` entry point and uses some `Token.EOF` token for the end-of-input: `LexerInterface.of(MyLexer::new, MyLexer::main, Token::getClass, Token.EOF)` |
Methods inherited from interface `Tokenizer.LexerInterface`: `entry`, `halt`, `makeLexer`, `of`
`Cat category(T token)`

Tokens with the same category will be highlighted in a similar fashion.

Parameters:
- `token` - the token to categorize
`static <L extends LexBuffer,T,Cat> TokenVisualizer.LexerInterface<L,T,Cat> of(BiFunction<String,Reader,L> makeLexer, Function<L,T> entry, Function<T,Cat> categorizer, T eofToken)`

Typical usage of this method when `MyLexer` has been generated by Dolmen with a `main` entry point and uses some `Token.EOF` token for the end-of-input: `LexerInterface.of(MyLexer::new, MyLexer::main, Token::getClass, Token.EOF)`

Parameters:
- `makeLexer`
- `entry`
- `categorizer`
- `eofToken`

Returns:
a tokenizing interface which builds lexers with `makeLexer`, uses the entry point described by `entry`, highlights tokens depending on their category as returned by the `categorizer`, and stops as soon as the token `eofToken` is encountered.
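As a non-authoritative sketch of this contract, the loop below shows how the components passed to `of` could drive tokenization. `WordLexer` and the `tokenize` helper are hypothetical stand-ins for a Dolmen-generated lexer and its driver (the categorizer, which only affects highlighting, is left out):

```java
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
import java.util.function.BiFunction;
import java.util.function.Function;

public class OfSketch {
    // Toy stand-in for a generated lexer: its "entry point" yields
    // whitespace-separated words and a sentinel once input is exhausted.
    static class WordLexer {
        final Scanner scanner;
        WordLexer(String name, Reader input) { this.scanner = new Scanner(input); }
        String next() { return scanner.hasNext() ? scanner.next() : "<eof>"; }
    }

    // Hypothetical driver loop: build a lexer with makeLexer, repeatedly
    // apply the entry point, and halt once the end-of-input token shows up.
    static <L, T> List<T> tokenize(BiFunction<String, Reader, L> makeLexer,
                                   Function<L, T> entry,
                                   T eofToken,
                                   String name, Reader input) {
        L lexer = makeLexer.apply(name, input);
        List<T> tokens = new ArrayList<>();
        T token;
        while (!(token = entry.apply(lexer)).equals(eofToken))
            tokens.add(token);
        return tokens;
    }

    public static void main(String[] args) {
        List<String> tokens = tokenize(WordLexer::new, WordLexer::next, "<eof>",
                                       "demo", new StringReader("let x = 1"));
        System.out.println(tokens); // [let, x, =, 1]
    }
}
```

The call `tokenize(WordLexer::new, WordLexer::next, "<eof>", …)` mirrors the shape of `LexerInterface.of(MyLexer::new, MyLexer::main, …, Token.EOF)` from the example above.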