Copyright | (c) Karl Cronburg 2018 |
---|---|
License | BSD3 |
Maintainer | karl@cs.tufts.edu |
Stability | experimental |
Portability | POSIX |
Safe Haskell | None |
Language | Haskell2010 |
Synopsis
- data Token n v
- tokenName :: Token n v -> n
- tokenValue :: Token n v -> v
- tokenSize :: Token n v -> Int
- type Lexeme s = [s]
- type NDFA s i n = (n, DFA s i)
- tokenize :: forall s i n v. (Eq i, Ord s, Eq s, Show s, Show i, Show n, Show v, Hashable i, Hashable s) => [(n, DFA s i)] -> (Lexeme s -> n -> v) -> [s] -> [Token n v]
- tokenizeInc :: forall s i n v. (Eq i, Ord s, Eq n, Eq s, Show s, Show i, Show n, Show v, Hashable i, Hashable s, Hashable n) => (n -> Bool) -> [(n, DFA s i)] -> (Lexeme s -> n -> v) -> Set n -> [s] -> (Token n v, [s])
- tokenizeIncAll :: forall s i n v. (Eq i, Ord s, Eq n, Eq s, Show s, Show i, Show n, Show v, Hashable i, Hashable s, Hashable n) => (n -> Bool) -> [(n, DFA s i)] -> (Lexeme s -> n -> v) -> Set n -> [s] -> [(Token n v, [s])]
Documentation
data Token n v

Token with names @n@, values @v@, and the number of input symbols consumed to match it.
tokenValue :: Token n v -> v
Get the value of a token, ignoring its name.
tokenSize :: Token n v -> Int
Get the number of characters from the input that this token matched on.
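As a small illustration, the accessors compose directly over a token stream. This is a sketch; the helper names below are not part of this module:

```haskell
-- Total number of input symbols covered by a list of tokens.
consumed :: [Token n v] -> Int
consumed = sum . map tokenSize

-- All token values, discarding names and sizes.
values :: [Token n v] -> [v]
values = map tokenValue
```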
tokenize
:: (Eq i, Ord s, Eq s, Show s, Show i, Show n, Show v, Hashable i, Hashable s) | |
=> [(n, DFA s i)] | Association list of named DFAs. |
-> (Lexeme s -> n -> v) | Constructs the value of a token from lexeme matched. |
-> [s] | The input string. |
-> [Token n v] | The resulting list of tokens. |
Entrypoint for tokenizing an input stream given a list of named DFAs that we can match on.

@dfaTuples@: association list pairing each DFA with the name given to it in the lexer specification.

@fncn@: function for constructing the value of a token from the matched lexeme (e.g. @varName@) and the associated token name (e.g. @id@).
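A rough usage sketch follows. The DFAs (@idDFA@, @numDFA@), the Int state type, and the commented import paths are assumptions for illustration; DFA construction itself is library-specific and not shown:

```haskell
-- Assumed imports; exact module paths are a guess at this package's layout:
-- import Text.ANTLR.Lex.Tokenizer (Token, Lexeme, tokenize, ...)

-- Token values for a toy lexer. Names are plain Strings so they meet
-- the Show/Eq/Hashable constraints without any extra instances.
data Value = VId String | VNum Int deriving (Eq, Show)

-- The (Lexeme s -> n -> v) argument: build a token value from the
-- matched lexeme and the name of the DFA that matched it.
mkValue :: Lexeme Char -> String -> Value
mkValue l "num" = VNum (read l)
mkValue l _     = VId l

-- 'idDFA' and 'numDFA' stand in for DFAs built elsewhere with this
-- library's DFA machinery; here we only pair them with their names.
lexAll :: DFA Char Int -> DFA Char Int -> String -> [Token String Value]
lexAll idDFA numDFA = tokenize [("id", idDFA), ("num", numDFA)] mkValue
```

The tokens produced this way can then be inspected with tokenName, tokenValue, and tokenSize.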
tokenizeInc
:: (Eq i, Ord s, Eq n, Eq s, Show s, Show i, Show n, Show v, Hashable i, Hashable s, Hashable n) | |
=> (n -> Bool) | Function that returns True on DFA names we wish to filter out of the results. |
-> [(n, DFA s i)] | Closure over association list of named DFAs. |
-> (Lexeme s -> n -> v) | Token value constructor from lexemes. |
-> Set n -> [s] -> (Token n v, [s]) | The incremental tokenizer closure. |
Incremental tokenizer. Takes the same list of named DFAs and AST value constructor as tokenize, but instead returns an incremental tokenizing function: given the set of token names we currently expect to match and the current input stream, it produces a single token along with the remaining input stream, on which tokenizeInc can be called again.
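One way the returned closure might be driven is sketched below. Value and mkValue are the placeholder definitions from the tokenize sketch above, the DFAs and the expected-name set are taken as parameters rather than constructed (building them is library-specific), and how end-of-input or lexical errors are signalled in Token is not covered by this extract:

```haskell
-- Names on which this predicate returns True (here: whitespace) are
-- filtered out of the results by 'tokenizeInc'.
isWS :: String -> Bool
isWS = (== "ws")

-- One incremental step: match a single token against the names we
-- currently expect, returning it with the remaining input stream.
lexStep :: [(String, DFA Char Int)] -> Set String -> String -> (Token String Value, String)
lexStep dfas = tokenizeInc isWS dfas mkValue

-- Repeatedly apply the closure until the input is exhausted. A real
-- driver would also stop on the library's end-of-input / error tokens.
driveAll :: [(String, DFA Char Int)] -> Set String -> String -> [Token String Value]
driveAll _    _        []  = []
driveAll dfas expected inp =
  let (tok, rest) = lexStep dfas expected inp
  in  tok : driveAll dfas expected rest
```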