A modular and generalised parser combinator library for OCaml.
Install the library and its dependencies via OPAM:
opam install transept
or declare it among the dependencies in your project-name.opam file:
...
depends: [
  "transept" { >= "0.1.0" }
  ...
]
...
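If the project is built with dune (an assumption here, adapt it to your build system), the library is then listed in the libraries field of the relevant stanza, for instance:
(executable
 (name main)           ; hypothetical executable name
 (libraries transept)) ; makes the transept modules available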
This example is the traditional arithmetic expression language, which can be represented by the following abstract data types. In this first example we only care about significant items: floats, parentheses and operations.
type operation =
  | Add
  | Minus
  | Mult
  | Div

type expr =
  | Number of float
  | BinOp of operation * expr * expr
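For instance, with these types the sentence 1+(2*3) corresponds to the following value (a small illustration only, not part of the library):
(* AST of "1+(2*3)": the addition is the root, the product its right operand. *)
let sample : expr = BinOp (Add, Number 1., BinOp (Mult, Number 2., Number 3.))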
Direct style means that we parse a stream of characters directly. In this case all characters are significant, even spaces.
Transept provides modules to help with parser construction. In the next fragment, Utils contains basic functions like constant, the CharParser module is a parser dedicated to char stream analysis, and Literals is dedicated to parsing strings, floats etc.
module Utils = Transept.Utils
module CharParser = Transept.Extension.Parser.For_char_list
module Literals = Transept.Extension.Literals.Make (CharParser)
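For reference, constant simply ignores its argument and returns the value it was given; a standalone equivalent (not the library's own definition) would be:
(* constant v produces a function that discards its input and returns v. *)
let constant v = fun _ -> v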
With these modules in place, we can define a first parser dedicated to operators.
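(* operator ::= '+' | '-' | '*' | '/' *)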
let operator =
  let open Utils in
  let open CharParser in
  (atom '+' <$> constant Add)
  <|> (atom '-' <$> constant Minus)
  <|> (atom '*' <$> constant Mult)
  <|> (atom '/' <$> constant Div)
Then the simple expression (sexpr) and the full expression (expr) can be defined by the following parsers.
let expr =
  (* sexpr ::= float | '(' expr ')' *)
  let rec sexpr () =
    let open Literals in
    let open CharParser in
    float <$> (fun f -> Number f) <|> (atom '(' &> do_lazy expr <& atom ')')
  (* expr ::= sexpr (operator expr)? *)
  and expr () =
    let open CharParser in
    do_lazy sexpr <&> opt (operator <&> do_lazy expr) <$> function
      | e1, None -> e1
      | e1, Some (op, e2) -> BinOp (op, e1, e2)
  in
  expr
Finally, a sentence can be easily parsed.
let parse s =
  let open Utils in
  let open CharParser in
  parse (expr ()) @@ Stream.build @@ chars_of_string s
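As a usage sketch (assuming the definitions above are in scope; the value returned by parse is transept's parse response, left unused here):
(* Parse the arithmetic sentence "1+(2+3)". *)
let _ = parse "1+(2+3)"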
With this solution whitespace is not skipped, which means 1+(2+3) is parsed whereas 1 + (2 + 3) is not!
Since Transept is generalised, it is possible to parse streams of something other than characters. For this purpose a generic lexer is provided through the Genlex module.
Transept provides modules to help with parser construction. In the next fragment, Utils contains basic functions like constant, the CharParser module is a parser dedicated to char stream analysis, and Stream is dedicated to parsing using another parser.
module Utils = Transept.Utils
module CharParser = Transept.Extension.Parser.For_char_list
module Stream = Transept.Stream.Via_parser (CharParser)
module Genlex = Transept.Genlex.Lexer.Make (CharParser)

module Parser =
  Transept.Core.Parser.Make_via_stream
    (Stream)
    (struct
      type t = Transept.Genlex.Lexeme.t
    end)

module Token = Transept.Genlex.Lexeme.Make (Parser)
Again, we can define a parser dedicated to operators, this time consuming lexemes.
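(* operator ::= "+" | "-" | "*" | "/" *)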
let operator =
  let open Utils in
  let open Parser in
  let open Token in
  (kwd "+" <$> constant Add)
  <|> (kwd "-" <$> constant Minus)
  <|> (kwd "*" <$> constant Mult)
  <|> (kwd "/" <$> constant Div)
Then the simple expression (sexpr) and the full expression (expr) can be defined by the following parsers.
let expr =
  (* sexpr ::= float | '(' expr ')' *)
  let rec sexpr () =
    let open Parser in
    let open Token in
    float <$> (fun f -> Number f) <|> (kwd "(" &> do_lazy expr <& kwd ")")
  (* expr ::= sexpr (operator expr)? *)
  and expr () =
    let open Parser in
    do_lazy sexpr <&> opt (operator <&> do_lazy expr) <$> function
      | e1, None -> e1
      | e1, Some (op, e2) -> BinOp (op, e1, e2)
  in
  expr ()
Finally, a sentence is parsed using two parsers. The first one, CharParser, parses the char stream and is used by Genlex to create a stream of lexemes. The second one, Parser, parses this lexeme stream.
let parse s =
  let open Utils in
  let open Parser in
  let chars = CharParser.Stream.build @@ chars_of_string s in
  let lexemes = Stream.build Genlex.tokenizer chars in
  parse (expr <& eos) lexemes
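Again as a usage sketch (assuming the definitions above are in scope):
(* Parse a sentence containing spaces. *)
let _ = parse "1 + ( 2+ 3)"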
With this solution whitespace is skipped by the generic lexer, which means 1 + ( 2+ 3) is now parsed correctly.
A JSON parser has been designed with this approach, based on a low-level parser producing tokens and a high-level parser producing JSON terms from those tokens.
MIT License
Copyright (c) 2020 Didier Plaindoux
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.