ligo/src/parser/ligodity/Lexer.mli
Christian Rinderknecht af8d1083b7 Eased the translation from Ligodity AST to Liquidity AST.
More precisely,

  * I commented out the operator "@" on lists in Ligodity (it can
    be implemented as a function, as a workaround).

  * I removed the parallel "let" construct (hence the "and" keyword).

  * I renamed the type "field_assignment" to "field_assign", in
    order to match the Pascaligo AST.

  * Command-line options are now read by calling the function
    [EvalOpt.read], instead of as an ugly side effect of loading
    the binary of the module. Options are now found in a record
    of type [EvalOpt.options].

  * I added support in the Ligodity lexer for #include CPP
    directives.
2019-05-15 16:05:03 +02:00


(* Simple lexer for the Mini-ML language *)

(* Error reporting *)

type message = string

exception Error of message Region.reg

(* Tokeniser *)
(* The call [get_token ~log] evaluates in a lexer (a.k.a
tokeniser or scanner) whose type is [Lexing.lexbuf -> Token.t].
The argument [log] is a logger. As its type shows and suggests,
it is a pair made of an output channel and a printer for
tokens. The lexer would use any logger to print the recognised
tokens to the given channel. If no logger is given to [get_token],
no printing takes place while the lexer runs.

The call [reset ~file ~line ~offset buffer] modifies in-place the
lexing buffer [buffer] so the lexing engine records that the file
associated with [buffer] is named [file], the current line is
[line] and the offset on that line is [offset]. This function is
useful when lexing a file that has been previously preprocessed by
the C preprocessor, in which case the argument [file] is the name
of the file that was preprocessed, _not_ the preprocessed file (of
which the user is not normally aware). By default, the [line]
argument is [1].
*)
type logger = out_channel * (out_channel -> Token.t -> unit)
val get_token : ?log:logger -> Lexing.lexbuf -> Token.t
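
A minimal sketch of how a client might drive the tokeniser with a logger. The helpers [Token.to_string] and [Token.is_eof] are assumptions made for illustration; substitute whatever printer and end-of-file test the actual Token module exports.

```ocaml
(* Hypothetical driver: lex a whole buffer, logging each recognised
   token to standard output via the logger pair. *)
let log : Lexer.logger =
  stdout, (fun chan token -> output_string chan (Token.to_string token))

let read_all (buffer : Lexing.lexbuf) : Token.t list =
  let rec loop acc =
    match Lexer.get_token ~log buffer with
    | token when Token.is_eof token -> List.rev (token :: acc)
    | token -> loop (token :: acc)
    | exception Lexer.Error err ->
        prerr_endline (Lexer.format_error ~kind:"Lexical" err);
        exit 1
  in loop []
```

Omitting the [~log] argument yields the same token stream with no printing, as the documentation above describes.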
val reset : ?file:string -> ?line:int -> ?offset:int -> Lexing.lexbuf -> unit
val reset_file : file:string -> Lexing.lexbuf -> unit
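
The preprocessing scenario described above can be sketched as follows (the file name is purely illustrative):

```ocaml
(* After running the C preprocessor on "contract.mligo", lex the
   expanded output but report positions against the original source. *)
let lex_preprocessed (channel : in_channel) : Lexing.lexbuf =
  let buffer = Lexing.from_channel channel in
  (* Record the *original* file name, not the preprocessed temporary:
     errors will then point into "contract.mligo". *)
  Lexer.reset ~file:"contract.mligo" ~line:1 ~offset:0 buffer;
  buffer
```

When only the file name needs correcting, [reset_file ~file buffer] does the same job without touching the line or offset.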
(* Debugging *)

type file_path = string

val iter :
  (Lexing.lexbuf -> out_channel -> Token.t -> unit) ->
  file_path option -> unit
val trace : file_path option -> unit
val prerr : kind:string -> message Region.reg -> unit
val format_error : kind:string -> message Region.reg -> string
val output_token : Lexing.lexbuf -> out_channel -> Token.t -> unit
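
The debugging entry points compose naturally: [iter] can be instantiated with [output_token] to dump the token stream of a file. The path below is illustrative, and passing [None] is assumed to fall back on standard input, per the usual convention for such options.

```ocaml
(* Dump all tokens of a source file to standard output. *)
let () = Lexer.iter Lexer.output_token (Some "contract.mligo")
```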