public static interface Tokenizer.LexerInterface&lt;L extends LexBuffer,T&gt;

Type Parameters:
- L - the type of LexBuffer that this instance handles
- T - the type of tokens returned by the lexer's main entry

Describes the lexical analyzer used by a Tokenizer. It must provide a way to create a lexer, an entry point to use to tokenize input, and a halting condition used to stop the tokenization (typically, recognizing an end-of-input special token).

For convenience, a static factory of(BiFunction, Function, Object) is provided, so that if MyLexer has been generated by Dolmen with a main entry point and uses some Token.EOF token for the end-of-input, one can simply use:

LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF)

to build a suitable tokenizing interface for that lexical analyzer.
Modifier and Type | Method and Description |
---|---|
T | entry(L lexbuf) Calls an entry in the given lexing buffer |
boolean | halt(T token) |
L | makeLexer(String inputName, Reader reader) |
static &lt;L extends LexBuffer,T&gt; Tokenizer.LexerInterface&lt;L,T&gt; | of(BiFunction&lt;String,Reader,L&gt; makeLexer, Function&lt;L,T&gt; entry, T eofToken) Typical usage of this method when MyLexer has been generated by Dolmen with a main entry point and uses some Token.EOF token for the end-of-input: LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF) |
L makeLexer(String inputName, Reader reader)

Parameters:
- inputName
- reader

T entry(L lexbuf) throws LexBuffer.LexicalError

Calls an entry in the given lexing buffer.

Parameters:
- lexbuf

Throws:
- LexBuffer.LexicalError

boolean halt(T token)

Parameters:
- token - the last token returned by entry(LexBuffer)

static &lt;L extends LexBuffer,T&gt; Tokenizer.LexerInterface&lt;L,T&gt; of(BiFunction&lt;String,Reader,L&gt; makeLexer, Function&lt;L,T&gt; entry, T eofToken)

Typical usage of this method when MyLexer has been generated by Dolmen with a main entry point and uses some Token.EOF token for the end-of-input:

LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF)

Parameters:
- makeLexer
- entry
- eofToken

Returns:
- a lexer interface which creates lexers with makeLexer, uses the entry point described by entry, and stops as soon as the token eofToken is encountered.
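Since Dolmen's own classes are not reproduced here, the following self-contained sketch illustrates the pattern the of(...) factory implements, using a hypothetical WordLexer stand-in instead of a Dolmen-generated lexer: the returned interface delegates makeLexer and entry to the supplied functions and halts as soon as entry returns the designated end-of-input token.

```java
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
import java.util.function.BiFunction;
import java.util.function.Function;

// Stand-in with the same shape as Tokenizer.LexerInterface: three
// operations plus a static of(...) factory.
interface LexerInterface<L, T> {
    L makeLexer(String inputName, Reader reader);
    T entry(L lexbuf);
    boolean halt(T token);

    // Builds lexers with makeLexer, tokenizes with entry, and halts
    // as soon as entry returns eofToken.
    static <L, T> LexerInterface<L, T> of(BiFunction<String, Reader, L> makeLexer,
                                          Function<L, T> entry, T eofToken) {
        return new LexerInterface<L, T>() {
            @Override public L makeLexer(String inputName, Reader reader) {
                return makeLexer.apply(inputName, reader);
            }
            @Override public T entry(L lexbuf) { return entry.apply(lexbuf); }
            @Override public boolean halt(T token) { return eofToken.equals(token); }
        };
    }
}

public class Demo {
    // Toy lexer (hypothetical): yields whitespace-separated words,
    // then the special "<EOF>" token forever.
    static final class WordLexer {
        private final Scanner scanner;
        WordLexer(String inputName, Reader reader) { scanner = new Scanner(reader); }
        String next() { return scanner.hasNext() ? scanner.next() : "<EOF>"; }
    }

    static List<String> tokenize(String input) {
        LexerInterface<WordLexer, String> iface =
            LexerInterface.of(WordLexer::new, WordLexer::next, "<EOF>");
        WordLexer lexer = iface.makeLexer("demo", new StringReader(input));
        List<String> tokens = new ArrayList<>();
        for (String tok = iface.entry(lexer); !iface.halt(tok); tok = iface.entry(lexer))
            tokens.add(tok);
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(tokenize("let x = 1"));  // prints [let, x, =, 1]
    }
}
```

With a real Dolmen-generated lexer, the only change is the arguments to of(...), e.g. LexerInterface.of(MyLexer::new, MyLexer::main, Token.EOF).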