Merge branch 'dev' of gitlab.com:ligolang/ligo into clean-sts-solver

Commit 1e06c24325

.gitignore (vendored)
@@ -7,3 +7,4 @@ Version.ml
 /_opam/
 /*.pp.ligo
 **/.DS_Store
+.vscode/
gitlab-pages/docs/contributors/ligo_test_guide.md (new file, 78 lines)
@@ -0,0 +1,78 @@
# Testing LIGO

Adding to the LIGO test suite is one of the more accessible ways to contribute. It exposes you to the compiler structure and primitives without necessarily demanding a deep understanding of OCaml or compiler development, and you'll probably become more familiar with LIGO itself in the process, which is helpful.

Unfortunately, LIGO itself doesn't currently have a good way to do automated testing, so the tests are written in OCaml, outside of the LIGO language. Thankfully the test code is typically less demanding than the features being tested. These tests are currently contained in [src/test](https://gitlab.com/ligolang/ligo/tree/dev/src/test), but the bulk are integration tests which rely on test contracts kept in [src/test/contracts](https://gitlab.com/ligolang/ligo/tree/dev/src/test/contracts). If you're new to LIGO, reading these contracts can be a useful introduction to a given syntax. In the future we plan to have detailed documentation for each syntax, but at the moment we only have a reference manual for [PascaLIGO](https://gitlab.com/ligolang/ligo/blob/dev/src/passes/1-parser/pascaligo/Doc/pascaligo.md).

## How To Find Good Test Cases

Your first question is probably "If I'm not already experienced, how do I know what to test?". There's a handful of things you can do to systematically find good test cases. All of them will get you more familiar with either the LIGO code base or LIGO itself.

### Extending Existing Test Cases

The fastest way to improve LIGO's test coverage is to extend existing test cases: consider the tests that already exist and think of things they don't cover or situations they'll fail on. A good deal of inference is required for this, but it demands minimal experience with the existing code.
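As a minimal sketch of what such an extension might look like, assuming the `type_file`, `expect_eq`, and `expect_eq_n_int` helpers from `src/test/test_helpers.ml` and the `e_int` combinator described under "Structure of LIGO Tests" below. The test name and the extra edge cases are hypothetical, not part of the existing suite:

```
(* Hypothetical extension of an existing integration test: the original
   sweep may only exercise a fixed range of inputs, so we add explicit
   checks for zero and a negative value. *)
let assign_edge_cases () : unit result =
  let%bind program = type_file "./contracts/assign.ligo" in
  (* assign.ligo's main adds one to its argument *)
  let make_expect = fun n -> n + 1 in
  let%bind () = expect_eq program "main" (e_int 0)    (e_int (make_expect 0))    in
  let%bind () = expect_eq program "main" (e_int (-5)) (e_int (make_expect (-5))) in
  ok ()
```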
### Studying The Parsers For Gaps In Coverage

LIGO is divided into a **front end**, which handles syntax, and a **back end**, which optimizes and compiles a core language shared between the syntaxes. You can find basic test cases for a particular LIGO syntax by studying its parser, under [src/passes/1-parser](https://gitlab.com/ligolang/ligo/tree/dev/src/passes/1-parser). One kind of useful test focuses on **coverage**: whether we have any testing at all for a particular aspect of a syntax. You can find these by carefully going over the syntax tree for a syntax (probably best read by looking at its `Parser.mly`) and comparing each branch to the test suite. While such gaps are plentiful at the time of writing, they will eventually be filled in reliably as part of writing a new syntax.

### Creating Interesting Test Cases By Using LIGO

Another kind of useful test focuses on **depth**: whether the features are put through a wide variety of complex scenarios to make sure they stand up to real-world use. One of the best ways to write these is to use LIGO for a real project. This will require some time and energy, not just to learn LIGO but to write projects complex enough to stretch the limits of what the language can do. At the same time, it will get you used to engaging with LIGO from a developer's perspective, asking how things could be better or what features are underdeveloped. If your project has practical uses, you will also be contributing to the Tezos/LIGO ecosystem while you learn. Note that because LIGO is open source, in order for us to incorporate your work as a test case it needs to be licensed in a way that's compatible with LIGO.

### Fuzzing (Speculative)

In the future you'll be able to [use fuzzing](https://en.wikipedia.org/wiki/Fuzzing) to generate test cases for LIGO. Fuzzing is often useful for finding 'weird' bugs on code paths that humans normally wouldn't stumble into, which makes it a useful supplement to human testing.

## Structure of LIGO Tests

LIGO's OCaml-based tests are written in [alcotest](https://github.com/mirage/alcotest/). However, the tests you encounter in [src/test/integration_tests.ml](https://gitlab.com/ligolang/ligo/blob/dev/src/test/integration_tests.ml) are built on top of some abstractions, currently defined in [src/test/test_helpers.ml](https://gitlab.com/ligolang/ligo/blob/dev/src/test/test_helpers.ml). The use of these can be inferred fairly well from looking at existing tests, but let's break a few of them down for analysis. We'll first analyze a short integration test for assignment:

### Assignment Test

```
let assign () : unit result =
  let%bind program = type_file "./contracts/assign.ligo" in
  let make_expect = fun n -> n + 1 in
  expect_eq_n_int program "main" make_expect
```

### assign.ligo

```
function main (const i : int) : int is
  begin
    i := i + 1 ;
  end with i
```

So what's going on here? We have a function which takes no arguments and returns a `unit result`. We then define two values: `program`, which is read from disk and fed to the LIGO compiler, and a comparison function `make_expect`, which takes an integer and adds one to it. Using `expect_eq_n_int`, the `program`'s `main` function is run and compared against the result of providing the same input to `make_expect`. This gives us some flavor of what to expect from these integration tests. Notice that the `"main"` argument given to `expect_eq_n_int` corresponds to the name of the function in `assign.ligo`. More complex tests show that we're able to pull the values of arbitrary expressions or function calls from LIGO test contracts. Consider:

### Annotation Test

```
let annotation () : unit result =
  let%bind program = type_file "./contracts/annotation.ligo" in
  let%bind () =
    expect_eq_evaluate program "lst" (e_list [])
  in
  let%bind () =
    expect_eq_evaluate program "address" (e_address "tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx")
  in
  let%bind () =
    expect_eq_evaluate program "address_2" (e_address "tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx")
  in
  ok ()
```

### annotation.ligo

```
const lst : list(int) = list [] ;

const address : address = "tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx" ;

const address_2 : address = ("tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx" : address) ;
```

Here what's going on is similar to the last program: `expect_eq_evaluate` runs a program and then pulls a particular named value from the final program state. For example, once the program stops running, the value of `address` is `"tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx"`. The *comparison*, however, is made against a constructed expression. Remember that we're testing from OCaml, but the program is written and evaluated as LIGO. In order to provide a proper comparison, we convert our expected test values into LIGO expressions and data. Constructors such as `e_list` and `e_address` provide a bridge between LIGO and OCaml. Their definitions can be found in files such as [src/stages/ast_simplified/combinators.ml](https://gitlab.com/ligolang/ligo/blob/dev/src/stages/ast_simplified/combinators.ml), or by using [Merlin's definition point finder](https://github.com/ocaml/merlin/wiki). These same functions are used during the simplification stage of LIGO compilation, so becoming familiar with them will help prepare you to work on the [front end](contributors/big-picture/front-end/).
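To show how these pieces compose when adding a new evaluation-style check, here is a minimal sketch. The contract `counters.ligo`, its constants, and the test name are hypothetical; the helpers follow the pattern above, and `e_int`/`e_nat` are assumed to exist alongside `e_list` and `e_address` in the combinators module:

```
(* Hypothetical test: evaluate two top-level constants of a contract and
   compare them against values built with the AST combinators. *)
let counters () : unit result =
  let%bind program = type_file "./contracts/counters.ligo" in
  let%bind () = expect_eq_evaluate program "initial" (e_int 0) in
  let%bind () = expect_eq_evaluate program "limit" (e_nat 100) in
  ok ()
```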
## How To Write A Test For LIGO

What if we want to write a test of our own? If the test is in the integration test vein (which it probably is if you're testing new syntax or features), then the process looks something like this:

1. Write a test contract which uses the new syntax or feature in [src/test/contracts](https://gitlab.com/ligolang/ligo/tree/dev/src/test/contracts).
2. Write an integration test in [src/test/integration_tests.ml](https://gitlab.com/ligolang/ligo/blob/dev/src/test/integration_tests.ml) in the vein of the existing tests, and make sure you add it to the test runner currently located at the bottom of the file (see the sketch after this list).
3. Write the feature, assuming it doesn't already exist, and build the resulting version of LIGO without errors.
4. Run the test suite and see if your test(s) pass. If they do, you're probably done. If not, it's time to go debugging.
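As a rough sketch of steps 1 and 2: the contract name `double.ligo`, the test name, and the exact runner wiring below are illustrative, so follow the existing entries in `integration_tests.ml` for the current shape.

```
(* Step 2 sketch: an integration test for a hypothetical contract
   src/test/contracts/double.ligo whose "main" multiplies its int
   argument by two (step 1 would add that contract). *)
let double () : unit result =
  let%bind program = type_file "./contracts/double.ligo" in
  let make_expect = fun n -> 2 * n in
  expect_eq_n_int program "main" make_expect

(* Then register it in the runner at the bottom of integration_tests.ml,
   next to the existing tests, roughly like: *)
let main = test_suite "Integration (end-to-end)" [
    (* ... existing tests ... *)
    test "double" double ;
  ]
```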
@@ -17,7 +17,7 @@ title: Cheat Sheet
 |Unit| `unit`|
 |Boolean|<pre><code>const hasDriversLicense: bool = False;<br/>const adult: bool = True;</code></pre> |
 |Boolean Logic|<pre><code>(not True) == False == (False and True) == (False or False)</code></pre>|
-|Mutez (micro tez)| `42mtz`, `7mtz` |
+|Mutez (micro tez)| `42mutez`, `7mutez` |
 |Address | `"tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx"`, `"KT1JepfBfMSqkQyf9B1ndvURghGsSB8YCLMD"`|
 |Addition |`3 + 4`, `3n + 4n`|
 |Multiplication & Division| `3 * 4`, `3n * 4n`, `10 / 5`, `10n / 5n`|
@@ -35,11 +35,12 @@ title: Cheat Sheet
 |Variants|<pre><code>type action is<br/>| Increment of int<br/>| Decrement of int</code></pre>|
 |Variant *(pattern)* matching|<pre><code>const a: action = Increment(5);<br/>case a of<br/>| Increment(n) -> n + 1<br/>| Decrement(n) -> n - 1<br/>end</code></pre>|
 |Records|<pre><code>type person is record<br/> age: int ;<br/> name: string ;<br/>end<br/><br/>const john : person = record<br/> age = 18;<br/> name = "John Doe";<br/>end<br/><br/>const name: string = john.name;</code></pre>|
-|Maps|<pre><code>type prices is map(nat, tez);<br/><br/>const prices : prices = map<br/> 10n -> 60mtz;<br/> 50n -> 30mtz;<br/> 100n -> 10mtz;<br/>end<br/><br/>const price: option(tez) = prices[50n];<br/><br/>prices[200n] := 5mtz;</code></pre>|
+|Maps|<pre><code>type prices is map(nat, tez);<br/><br/>const prices : prices = map<br/> 10n -> 60mutez;<br/> 50n -> 30mutez;<br/> 100n -> 10mutez;<br/>end<br/><br/>const price: option(tez) = prices[50n];<br/><br/>prices[200n] := 5mutez;</code></pre>|
 |Contracts & Accounts|<pre><code>const destinationAddress : address = "tz1...";<br/>const contract : contract(unit) = get_contract(destinationAddress);</code></pre>|
 |Transactions|<pre><code>const payment : operation = transaction(unit, amount, receiver);</code></pre>|
 |Exception/Failure|`fail("Your descriptive error message for the user goes here.")`|
 
 
 <!--END_DOCUSAURUS_CODE_TABS-->
 
 </div>
@@ -31,7 +31,7 @@ const dogBreed: animalBreed = "Saluki";
 type accountBalances is map(address, tez);
 
 const ledger: accountBalances = map
-("tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx": address) -> 10mtz
+("tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx": address) -> 10mutez
 end
 ```
 
@@ -60,10 +60,10 @@ end
 type accountBalances is map(account, accountData);
 
 // pseudo-JSON representation of our map
-// { "tz1...": {balance: 10mtz, numberOfTransactions: 5n} }
+// { "tz1...": {balance: 10mutez, numberOfTransactions: 5n} }
 const ledger: accountBalances = map
 ("tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx": address) -> record
-balance = 10mtz;
+balance = 10mutez;
 numberOfTransactions = 5n;
 end
 end
@@ -134,11 +134,11 @@ To confirm that our contract is valid, we can dry run it. As a result we see a *
 ligo dry-run taco-shop.ligo --syntax pascaligo --amount 1 buy_taco 1n "map
 1n -> record
 current_stock = 50n;
-max_price = 50000000mtz;
+max_price = 50000000mutez;
 end;
 2n -> record
 current_stock = 20n;
-max_price = 75000000mtz;
+max_price = 75000000mutez;
 end;
 end"
 ```
@@ -28,8 +28,8 @@ Each taco kind, has its own `max_price` that it sells for, and a finite supply f
 
 |**kind** |id |**available_stock**| **max_price**|
 |---|---|---|---|
-|el clásico | `1n` | `50n` | `50000000mtz` |
+|el clásico | `1n` | `50n` | `50000000mutez` |
-|especial del chef | `2n` | `20n` | `75000000mtz` |
+|especial del chef | `2n` | `20n` | `75000000mutez` |
 
 ### Calculating the current purchase price
 
@@ -42,16 +42,16 @@ current_purchase_price = max_price / available_stock
 #### El clásico
 |**available_stock**|**max_price**|**current_purchase_price**|
 |---|---|---|
-| `50n` | `50000000mtz` | `1tz`|
+| `50n` | `50000000mutez` | `1tz`|
-| `20n` | `50000000mtz` | `2.5tz` |
+| `20n` | `50000000mutez` | `2.5tz` |
-| `5n` | `50000000mtz` | `10tz` |
+| `5n` | `50000000mutez` | `10tz` |
 
 #### Especial del chef
 |**available_stock**|**max_price**|**current_purchase_price**|
 |---|---|---|
-| `20n` | `75000000mtz` | `3.75tz` |
+| `20n` | `75000000mutez` | `3.75tz` |
-| `10n` | `75000000mtz` | `7.5tz`|
+| `10n` | `75000000mutez` | `7.5tz`|
-| `5n` | `75000000mtz` | `15tz` |
+| `5n` | `75000000mutez` | `15tz` |
 
 ---
 
@@ -161,11 +161,11 @@ When dry-running a contract, it's crucial to provide a correct initial storage v
 map
 1n -> record
 current_stock = 50n;
-max_price = 50000000mtz;
+max_price = 50000000mutez;
 end;
 2n -> record
 current_stock = 20n;
-max_price = 75000000mtz;
+max_price = 75000000mutez;
 end;
 end
 ```
@@ -177,11 +177,11 @@ end
 ligo dry-run taco-shop.ligo --syntax pascaligo main unit "map
 1n -> record
 current_stock = 50n;
-max_price = 50000000mtz;
+max_price = 50000000mutez;
 end;
 2n -> record
 current_stock = 20n;
-max_price = 75000000mtz;
+max_price = 75000000mutez;
 end;
 end"
 ```
@@ -298,11 +298,11 @@ In order to test the `amount` sent, we'll use the `--amount` option of `dry-run`
 ligo dry-run taco-shop.ligo --syntax pascaligo --amount 1 buy_taco 1n "map
 1n -> record
 current_stock = 50n;
-max_price = 50000000mtz;
+max_price = 50000000mutez;
 end;
 2n -> record
 current_stock = 20n;
-max_price = 75000000mtz;
+max_price = 75000000mutez;
 end;
 end"
 ```
@@ -190,7 +190,7 @@ class HomeSplash extends React.Component {
 <h4 className="tagline-text">{siteConfig.tagline}</h4>
 <p className="body">{siteConfig.taglineSub}</p>
 <LinkButton
-href="https://ligolang.gitlab.io/ligo-web-ide/"
+href="https://ide.ligolang.org/"
 className="large-primary-button"
 >
 Try Online
@@ -18,20 +18,43 @@ then
 fi
 fi
 
+echo "Installing dependencies.."
+if [ -n "`uname -a | grep -i arch`" ]
+then
+sudo pacman -Sy --noconfirm \
+make \
+m4 \
+gcc \
+patch \
+bubblewrap \
+rsync \
+curl
+fi
+
+if [ -n "`uname -a | grep -i ubuntu`" ]
+then
 sudo apt-get install -y make \
 m4 \
 gcc \
 patch \
 bubblewrap \
 rsync \
-curl \
+curl
+fi
 
 if [ -n "`uname -a | grep -i ubuntu`" ]
 then
+echo "ubuntu"
 sudo add-apt-repository -y ppa:avsm/ppa
 sudo apt-get update
 sudo apt-get install opam
 else
+if [ -n "`uname -a | grep -i arch`" ]
+then
+echo "arch"
+sudo pacman -Sy --noconfirm opam
+else
+echo "unknown distro"
 #I'm going to assume here that we're on x86_64, 32-bit users should be basically
 #extinct at this point right?
 curl -L https://github.com/ocaml/opam/releases/download/2.0.4/opam-2.0.4-x86_64-linux \
@@ -63,5 +86,8 @@ else
 exit 1
 fi
 fi
+fi
 
 opam init -a --bare
 
@@ -1,6 +1,22 @@
 #!/bin/sh
 set -e
+. /etc/os-release
+
+if [ $ID = arch ]
+then
+pacman -Sy
+sudo pacman -S --noconfirm \
+libevdev \
+perl \
+pkg-config \
+gmp \
+hidapi \
+m4 \
+libcap \
+bubblewrap \
+rsync
+
+else
 apt-get update -qq
 apt-get -y -qq install \
 libev-dev \
@@ -12,3 +28,4 @@ apt-get -y -qq install \
 libcap-dev \
 bubblewrap \
 rsync
+fi
@@ -2,5 +2,5 @@
 set -e
 set -x
 
-printf '' | opam switch create . 4.07.1 # toto ocaml-base-compiler.4.06.1
+printf '' | opam switch create . ocaml-base-compiler.4.07.1 # toto ocaml-base-compiler.4.06.1
 eval $(opam config env)
@@ -75,20 +75,26 @@ let display_format =
 let docv = "DISPLAY_FORMAT" in
 let doc = "$(docv) is the format that will be used by the CLI. Available formats are 'dev', 'json', and 'human-readable' (default). When human-readable lacks details (we are still tweaking it), please contact us and use another format in the meanwhile." in
 info ~docv ~doc ["format" ; "display-format"] in
-value @@ opt string "human-readable" info
+value @@
+opt
+(enum [("human-readable", `Human_readable); ("dev", `Dev); ("json", `Json)])
+`Human_readable
+info
 
 let michelson_code_format =
 let open Arg in
 let info =
 let docv = "MICHELSON_FORMAT" in
-let doc = "$(docv) is the format that will be used by compile-contract for the resulting Michelson. Available formats are 'micheline', and 'michelson' (default). Micheline is the format used by [XXX]." in
+let doc = "$(docv) is the format that will be used by compile-contract for the resulting Michelson. Available formats are 'text' (default), 'json' and 'hex'." in
 info ~docv ~doc ["michelson-format"] in
-value @@ opt string "michelson" info
+value @@
+opt
+(enum [("text", `Text); ("json", `Json); ("hex", `Hex)])
+`Text info
 
 let compile_file =
 let f source_file entry_point syntax display_format michelson_format =
 toplevel ~display_format @@
-let%bind michelson_format = Main.Display.michelson_format_of_string michelson_format in
 let%bind contract =
 trace (simple_info "compiling contract to michelson") @@
 Ligo.Compile.Of_source.compile_file_contract_entry source_file entry_point (Syntax_name syntax) in
@@ -101,29 +107,29 @@ let compile_file =
 (term , Term.info ~docs cmdname)
 
 let compile_parameter =
-let f source_file entry_point expression syntax display_format =
+let f source_file entry_point expression syntax display_format michelson_format =
 toplevel ~display_format @@
 let%bind value =
 trace (simple_error "compile-input") @@
 Ligo.Run.Of_source.compile_file_contract_parameter source_file entry_point expression (Syntax_name syntax) in
-ok @@ Format.asprintf "%a\n" Tezos_utils.Michelson.pp value
+ok @@ Format.asprintf "%a\n" (Main.Display.michelson_pp michelson_format) value
 in
 let term =
-Term.(const f $ source_file 0 $ entry_point 1 $ expression "PARAMETER" 2 $ syntax $ display_format) in
+Term.(const f $ source_file 0 $ entry_point 1 $ expression "PARAMETER" 2 $ syntax $ display_format $ michelson_code_format) in
 let cmdname = "compile-parameter" in
 let docs = "Subcommand: compile parameters to a michelson expression. The resulting michelson expression can be passed as an argument in a transaction which calls a contract. See `ligo " ^ cmdname ^ " --help' for a list of options specific to this subcommand." in
 (term , Term.info ~docs cmdname)
 
 let compile_storage =
-let f source_file entry_point expression syntax display_format bigmap =
+let f source_file entry_point expression syntax display_format michelson_format bigmap =
 toplevel ~display_format @@
 let%bind value =
 trace (simple_error "compile-storage") @@
 Ligo.Run.Of_source.compile_file_contract_storage ~value:bigmap source_file entry_point expression (Syntax_name syntax) in
-ok @@ Format.asprintf "%a\n" Tezos_utils.Michelson.pp value
+ok @@ Format.asprintf "%a\n" (Main.Display.michelson_pp michelson_format) value
 in
 let term =
-Term.(const f $ source_file 0 $ entry_point 1 $ expression "STORAGE" 2 $ syntax $ display_format $ bigmap) in
+Term.(const f $ source_file 0 $ entry_point 1 $ expression "STORAGE" 2 $ syntax $ display_format $ michelson_code_format $ bigmap) in
 let cmdname = "compile-storage" in
 let docs = "Subcommand: compile an initial storage in ligo syntax to a michelson expression. The resulting michelson expression can be passed as an argument in a transaction which originates a contract. See `ligo " ^ cmdname ^ " --help' for a list of options specific to this subcommand." in
 (term , Term.info ~docs cmdname)
@@ -175,17 +181,17 @@ let evaluate_value =
 (term , Term.info ~docs cmdname)
 
 let compile_expression =
-let f expression syntax display_format =
+let f expression syntax display_format michelson_format =
 toplevel ~display_format @@
 (* This is an actual compiler entry-point, so we start with a blank state *)
 let state = Typer.Solver.initial_state in
 let%bind value =
 trace (simple_error "compile-input") @@
 Ligo.Run.Of_source.compile_expression expression state (Syntax_name syntax) in
-ok @@ Format.asprintf "%a\n" Tezos_utils.Michelson.pp value
+ok @@ Format.asprintf "%a\n" (Main.Display.michelson_pp michelson_format) value
 in
 let term =
-Term.(const f $ expression "" 0 $ syntax $ display_format) in
+Term.(const f $ expression "" 0 $ syntax $ display_format $ michelson_code_format) in
 let cmdname = "compile-expression" in
 let docs = "Subcommand: compile to a michelson value." in
 (term , Term.info ~docs cmdname)
@@ -1,16 +1,9 @@
 open Trace
 open Main.Display
 
-let toplevel ~(display_format : string) (x : string result) =
+let toplevel ~(display_format : display_format) (x : string result) =
-let display_format =
-try display_format_of_string display_format
-with _ -> (
-Format.printf "bad display format %s, try looking at DISPLAY_FORMAT in the man (--help)." display_format ;
-failwith "Display format"
-)
-in
 match x with
-| Ok _ -> Format.printf "%a\n%!" (formatted_string_result_pp display_format) x
+| Ok _ -> Format.printf "%a%!" (formatted_string_result_pp display_format) x
 | Error _ ->
-Format.eprintf "%a\n%!" (formatted_string_result_pp display_format) x ;
+Format.eprintf "%a%!" (formatted_string_result_pp display_format) x ;
 exit 1
@@ -1,3 +1,3 @@
 open Trace
 
-val toplevel : display_format : string -> string result -> unit
+val toplevel : display_format : Main.Display.display_format -> string result -> unit
@@ -87,13 +87,6 @@ type display_format = [
 | `Dev
 ]
 
-let display_format_of_string = fun s : display_format ->
-match s with
-| "dev" -> `Dev
-| "json" -> `Json
-| "human-readable" -> `Human_readable
-| _ -> failwith "bad display_format"
-
 let formatted_string_result_pp (display_format : display_format) =
 match display_format with
 | `Human_readable -> string_result_pp_hr
@@ -101,16 +94,12 @@ let formatted_string_result_pp (display_format : display_format) =
 | `Json -> string_result_pp_json
 
 type michelson_format = [
-| `Michelson
+| `Text
-| `Micheline
+| `Json
+| `Hex
 ]
 
-let michelson_format_of_string = fun s : michelson_format result ->
-match s with
-| "michelson" -> ok `Michelson
-| "micheline" -> ok `Micheline
-| _ -> simple_fail "bad michelson format"
-
 let michelson_pp (mf : michelson_format) = match mf with
-| `Michelson -> Michelson.pp
+| `Text -> Michelson.pp
-| `Micheline -> Michelson.pp_json
+| `Json -> Michelson.pp_json
+| `Hex -> Michelson.pp_hex
@@ -21,15 +21,12 @@ type display_format = [
 | `Dev
 ]
 
-val display_format_of_string : string -> display_format
-
 val formatted_string_result_pp : display_format -> Format.formatter -> string Simple_utils.Trace.result -> unit
 
 type michelson_format = [
-| `Michelson
+| `Text
-| `Micheline
+| `Json
+| `Hex
 ]
 
-val michelson_format_of_string : string -> michelson_format Simple_utils.Trace.result
-
 val michelson_pp : michelson_format -> Format.formatter -> Tezos_utils.Michelson.michelson -> unit
@@ -11,15 +11,15 @@ let run ?options (* ?(is_input_value = false) *) (program:compiled_program) (inp
 let (Ex_ty input_ty) = input in
 let (Ex_ty output_ty) = output in
 (* let%bind input_ty_mich =
-* Trace.trace_tzresult_lwt (simple_error "error unparsing input ty") @@
+Trace.trace_tzresult_lwt (simple_error "error unparsing input ty") @@
-* Memory_proto_alpha.unparse_michelson_ty input_ty in
+Memory_proto_alpha.unparse_michelson_ty input_ty in
-* let%bind output_ty_mich =
+let%bind output_ty_mich =
-* Trace.trace_tzresult_lwt (simple_error "error unparsing output ty") @@
+Trace.trace_tzresult_lwt (simple_error "error unparsing output ty") @@
-* Memory_proto_alpha.unparse_michelson_ty output_ty in
+Memory_proto_alpha.unparse_michelson_ty output_ty in
-* Format.printf "code: %a\n" Michelson.pp program.body ;
+Format.printf "code: %a\n" Michelson.pp program.body ;
-* Format.printf "input_ty: %a\n" Michelson.pp input_ty_mich ;
+Format.printf "input_ty: %a\n" Michelson.pp input_ty_mich ;
-* Format.printf "output_ty: %a\n" Michelson.pp output_ty_mich ;
+Format.printf "output_ty: %a\n" Michelson.pp output_ty_mich ;
-* Format.printf "input: %a\n" Michelson.pp input_michelson ; *)
+Format.printf "input: %a\n" Michelson.pp input_michelson ; *)
 let%bind input =
 Trace.trace_tzresult_lwt (simple_error "error parsing input") @@
 Memory_proto_alpha.parse_michelson_data input_michelson input_ty
@@ -260,7 +260,7 @@ and arith_expr =
 | Neg of minus un_op reg
 | Int of (string * Z.t) reg
 | Nat of (string * Z.t) reg
-| Mtz of (string * Z.t) reg
+| Mutez of (string * Z.t) reg
 
 and logic_expr =
 BoolExpr of bool_expr
@@ -391,7 +391,7 @@ let logic_expr_to_region = function
 let arith_expr_to_region = function
 Add {region;_} | Sub {region;_} | Mult {region;_}
 | Div {region;_} | Mod {region;_} | Neg {region;_}
-| Int {region;_} | Mtz {region; _}
+| Int {region;_} | Mutez {region; _}
 | Nat {region; _} -> region
 
 let string_expr_to_region = function
@@ -265,7 +265,7 @@ and arith_expr =
 | Neg of minus un_op reg (* -e *)
 | Int of (string * Z.t) reg (* 12345 *)
 | Nat of (string * Z.t) reg (* 3p *)
-| Mtz of (string * Z.t) reg (* 1.00tz 3tz *)
+| Mutez of (string * Z.t) reg (* 1.00tz 3tz *)
 
 and logic_expr =
 BoolExpr of bool_expr
@@ -82,7 +82,7 @@ type t =
 | Constr of string Region.reg
 | Int of (string * Z.t) Region.reg
 | Nat of (string * Z.t) Region.reg
-| Mtz of (string * Z.t) Region.reg
+| Mutez of (string * Z.t) Region.reg
 | Str of string Region.reg
 | Bytes of (string * Hex.t) Region.reg
 
@@ -107,7 +107,7 @@ type t =
 | Type of Region.t
 | With of Region.t
 
-(* Liquidity specific *)
+(* Liquidity-specific *)
 
 | LetEntry of Region.t
 | MatchNat of Region.t
@@ -137,23 +137,20 @@ val to_region : token -> Region.t
 
 (* Injections *)
 
-type int_err =
+type int_err = Non_canonical_zero
-Non_canonical_zero
 
 type ident_err = Reserved_name
+type nat_err = Invalid_natural
-type invalid_natural =
-| Invalid_natural
 | Non_canonical_zero_nat
+type sym_err = Invalid_symbol
 
+val mk_int : lexeme -> Region.t -> (token, int_err) result
+val mk_nat : lexeme -> Region.t -> (token, nat_err) result
+val mk_mutez : lexeme -> Region.t -> (token, int_err) result
+val mk_ident : lexeme -> Region.t -> (token, ident_err) result
+val mk_sym : lexeme -> Region.t -> (token, sym_err) result
 val mk_string : lexeme -> Region.t -> token
 val mk_bytes : lexeme -> Region.t -> token
-val mk_int : lexeme -> Region.t -> (token, int_err) result
-val mk_nat : lexeme -> Region.t -> (token, invalid_natural) result
-val mk_mtz : lexeme -> Region.t -> (token, int_err) result
-val mk_ident : lexeme -> Region.t -> (token, ident_err) result
 val mk_constr : lexeme -> Region.t -> token
-val mk_sym : lexeme -> Region.t -> token
 val eof : Region.t -> token
 
 (* Predicates *)
@@ -64,7 +64,7 @@ type t =
 | Constr of string Region.reg
 | Int of (string * Z.t) Region.reg
 | Nat of (string * Z.t) Region.reg
-| Mtz of (string * Z.t) Region.reg
+| Mutez of (string * Z.t) Region.reg
 | Str of string Region.reg
 | Bytes of (string * Hex.t) Region.reg
 
@@ -89,7 +89,7 @@ type t =
 | Type of Region.t
 | With of Region.t
 
-(* Liquidity specific *)
+(* Liquidity-specific *)
 
 | LetEntry of Region.t
 | MatchNat of Region.t
@@ -141,8 +141,8 @@ let proj_token = function
 region, sprintf "Int (\"%s\", %s)" s (Z.to_string n)
 | Nat Region.{region; value = s,n} ->
 region, sprintf "Nat (\"%s\", %s)" s (Z.to_string n)
-| Mtz Region.{region; value = s,n} ->
+| Mutez Region.{region; value = s,n} ->
-region, sprintf "Mtz (\"%s\", %s)" s (Z.to_string n)
+region, sprintf "Mutez (\"%s\", %s)" s (Z.to_string n)
 | Str Region.{region; value} ->
 region, sprintf "Str %s" value
 | Bytes Region.{region; value = s,b} ->
@@ -202,7 +202,7 @@ let to_lexeme = function
 | Constr id -> id.Region.value
 | Int i
 | Nat i
-| Mtz i -> fst i.Region.value
+| Mutez i -> fst i.Region.value
 | Str s -> s.Region.value
 | Bytes b -> fst b.Region.value
 | Begin _ -> "begin"
@@ -280,12 +280,9 @@ let reserved =
 |> add "functor"
 |> add "inherit"
 |> add "initializer"
-|> add "land"
 |> add "lazy"
-|> add "lor"
 |> add "lsl"
 |> add "lsr"
-|> add "lxor"
 |> add "method"
 |> add "module"
 |> add "mutable"
@@ -379,11 +376,10 @@ let mk_int lexeme region =
 then Error Non_canonical_zero
 else Ok (Int Region.{region; value = lexeme, z})
 
-type invalid_natural =
+type nat_err =
-| Invalid_natural
+Invalid_natural
 | Non_canonical_zero_nat
 
-
 let mk_nat lexeme region =
 match (String.index_opt lexeme 'p') with
 | None -> Error Invalid_natural
@@ -397,46 +393,52 @@ let mk_nat lexeme region =
 else Ok (Nat Region.{region; value = lexeme, z})
 )
 
-let mk_mtz lexeme region =
+let mk_mutez lexeme region =
 let z =
 Str.(global_replace (regexp "_") "" lexeme) |>
-Str.(global_replace (regexp "mtz") "") |>
+Str.(global_replace (regexp "mutez") "") |>
 Z.of_string in
-if Z.equal z Z.zero && lexeme <> "0mtz"
+if Z.equal z Z.zero && lexeme <> "0mutez"
 then Error Non_canonical_zero
-else Ok (Mtz Region.{region; value = lexeme, z})
+else Ok (Mutez Region.{region; value = lexeme, z})
 
 let eof region = EOF region
 
+type sym_err = Invalid_symbol
 
 let mk_sym lexeme region =
 match lexeme with
-"->" -> ARROW region
+(* Lexemes in common with all concrete syntaxes *)
-| "::" -> CONS region
+";" -> Ok (SEMI region)
-| "^" -> CAT region
+| "," -> Ok (COMMA region)
-| "-" -> MINUS region
+| "(" -> Ok (LPAR region)
-| "+" -> PLUS region
+| ")" -> Ok (RPAR region)
-| "/" -> SLASH region
+| "[" -> Ok (LBRACKET region)
-| "*" -> TIMES region
+| "]" -> Ok (RBRACKET region)
-| "[" -> LBRACKET region
+| "{" -> Ok (LBRACE region)
-| "]" -> RBRACKET region
+| "}" -> Ok (RBRACE region)
-| "{" -> LBRACE region
+| "=" -> Ok (EQ region)
-| "}" -> RBRACE region
+| ":" -> Ok (COLON region)
-| "," -> COMMA region
+| "|" -> Ok (VBAR region)
-| ";" -> SEMI region
+| "->" -> Ok (ARROW region)
-| "|" -> VBAR region
+| "." -> Ok (DOT region)
-| ":" -> COLON region
+| "_" -> Ok (WILD region)
-| "." -> DOT region
+| "^" -> Ok (CAT region)
-| "_" -> WILD region
+| "+" -> Ok (PLUS region)
-| "=" -> EQ region
+| "-" -> Ok (MINUS region)
-| "<>" -> NE region
+| "*" -> Ok (TIMES region)
-| "<" -> LT region
+| "/" -> Ok (SLASH region)
-| ">" -> GT region
+| "<" -> Ok (LT region)
-| "=<" -> LE region
+| "<=" -> Ok (LE region)
-| ">=" -> GE region
+| ">" -> Ok (GT region)
-| "||" -> BOOL_OR region
+| ">=" -> Ok (GE region)
-| "&&" -> BOOL_AND region
-| "(" -> LPAR region
-| ")" -> RPAR region
+| "<>" -> Ok (NE region)
+| "::" -> Ok (CONS region)
+| "||" -> Ok (BOOL_OR region)
+| "&&" -> Ok (BOOL_AND region)
 
 | a -> failwith ("Not understood token: " ^ a)
 
 (* Identifiers *)
@@ -42,7 +42,7 @@
 
 %token <(string * Z.t) Region.reg> Int
 %token <(string * Z.t) Region.reg> Nat
-%token <(string * Z.t) Region.reg> Mtz
+%token <(string * Z.t) Region.reg> Mutez
 
 (*%token And*)
 %token <Region.t> Begin
@@ -761,7 +761,7 @@ call_expr:
 
 core_expr:
 Int { EArith (Int $1) }
-| Mtz { EArith (Mtz $1) }
+| Mutez { EArith (Mutez $1) }
 | Nat { EArith (Nat $1) }
 | Ident | module_field { EVar $1 }
 | projection { EProj $1 }
@@ -321,8 +321,8 @@ and print_arith_expr buffer = function
 | Int {region; value=lex,z} ->
 let line = sprintf "Int %s (%s)" lex (Z.to_string z)
 in print_token buffer region line
-| Mtz {region; value=lex,z} ->
+| Mutez {region; value=lex,z} ->
-let line = sprintf "Mtz %s (%s)" lex (Z.to_string z)
+let line = sprintf "Mutez %s (%s)" lex (Z.to_string z)
 in print_token buffer region line
 | Nat {region; value=lex,z} ->
 let line = sprintf "Nat %s (%s)" lex (Z.to_string z)
src/passes/1-parser/pascaligo.mli (new file, 21 lines)
@@ -0,0 +1,21 @@
(* This file provides an interface to the PascaLIGO parser. *)

open Trace

module Parser = Parser_pascaligo.Parser
module AST = Parser_pascaligo.AST
module ParserLog = Parser_pascaligo.ParserLog
module LexToken = Parser_pascaligo.LexToken


(** Open a PascaLIGO filename given by string and convert into an abstract syntax tree. *)
val parse_file : string -> (AST.t result)

(** Convert a given string into a PascaLIGO abstract syntax tree *)
val parse_string : string -> AST.t result

(** Parse a given string as a PascaLIGO expression and return an expression AST.

This is intended to be used for interactive interpreters, or other scenarios
where you would want to parse a PascaLIGO expression outside of a contract. *)
val parse_expression : string -> AST.expr result
@@ -63,7 +63,6 @@ type kwd_not = Region.t
 type kwd_of = Region.t
 type kwd_or = Region.t
 type kwd_patch = Region.t
-type kwd_procedure = Region.t
 type kwd_record = Region.t
 type kwd_remove = Region.t
 type kwd_set = Region.t
@@ -163,7 +162,7 @@ and ast = t
 and declaration =
 TypeDecl of type_decl reg
 | ConstDecl of const_decl reg
-| LambdaDecl of lambda_decl
+| FunDecl of fun_decl reg
 
 and const_decl = {
 kwd_const : kwd_const;
@@ -188,7 +187,7 @@ and type_decl = {
 and type_expr =
 TProd of cartesian
 | TSum of (variant reg, vbar) nsepseq reg
-| TRecord of record_type
+| TRecord of field_decl reg ne_injection reg
 | TApp of (type_name * type_tuple) reg
 | TFun of (type_expr * arrow * type_expr) reg
 | TPar of type_expr par reg
@@ -198,11 +197,9 @@ and cartesian = (type_expr, times) nsepseq reg
 
 and variant = {
 constr : constr;
-args : (kwd_of * cartesian) option
+args : (kwd_of * type_expr) option
 }
 
-and record_type = field_decl reg injection reg
-
 and field_decl = {
 field_name : field_name;
 colon : colon;
@@ -213,10 +210,6 @@ and type_tuple = (type_expr, comma) nsepseq par reg
 
 (* Function and procedure declarations *)
 
-and lambda_decl =
-FunDecl of fun_decl reg
-| ProcDecl of proc_decl reg
-
 and fun_decl = {
 kwd_function : kwd_function;
 name : variable;
@@ -225,21 +218,10 @@ and fun_decl = {
 ret_type : type_expr;
 kwd_is : kwd_is;
 local_decls : local_decl list;
-block : block reg;
+block : block reg option;
-kwd_with : kwd_with;
+kwd_with : kwd_with option;
 return : expr;
-terminator : semi option
+terminator : semi option }
-}
 
-and proc_decl = {
-kwd_procedure : kwd_procedure;
-name : variable;
-param : parameters;
-kwd_is : kwd_is;
-local_decls : local_decl list;
-block : block reg;
-terminator : semi option
-}
-
 and parameters = (param_decl, semi) nsepseq par reg
 
@@ -284,7 +266,6 @@ and statement =
 
 and local_decl =
 LocalFun of fun_decl reg
-| LocalProc of proc_decl reg
 | LocalData of data_decl
 
 and data_decl =
@@ -302,12 +283,8 @@ and var_decl = {
 }
 
 and instruction =
-Single of single_instr
-| Block of block reg
-
-and single_instr =
 Cond of conditional reg
-| CaseInstr of instruction case reg
+| CaseInstr of if_clause case reg
 | Assign of assignment reg
 | Loop of loop
 | ProcCall of fun_call
@@ -338,14 +315,14 @@ and set_patch = {
 kwd_patch : kwd_patch;
 path : path;
 kwd_with : kwd_with;
-set_inj : expr injection reg
+set_inj : expr ne_injection reg
 }
 
 and map_patch = {
 kwd_patch : kwd_patch;
 path : path;
 kwd_with : kwd_with;
-map_inj : binding reg injection reg
+map_inj : binding reg ne_injection reg
 }
 
 and binding = {
@@ -358,7 +335,17 @@ and record_patch = {
 kwd_patch : kwd_patch;
 path : path;
 kwd_with : kwd_with;
-record_inj : record_expr
+record_inj : field_assign reg ne_injection reg
+}
+
+and cond_expr = {
+kwd_if : kwd_if;
+test : expr;
+kwd_then : kwd_then;
+ifso : expr;
+terminator : semi option;
+kwd_else : kwd_else;
+ifnot : expr
 }
 
 and conditional = {
@@ -373,7 +360,11 @@ and conditional = {
 
 and if_clause =
 ClauseInstr of instruction
-| ClauseBlock of (statements * semi option) braces reg
+| ClauseBlock of clause_block
+
+and clause_block =
+LongBlock of block reg
+| ShortBlock of (statements * semi option) braces reg
 
 and set_membership = {
 set : expr;
@@ -425,10 +416,8 @@ and for_loop =
 and for_int = {
 kwd_for : kwd_for;
 assign : var_assign reg;
-down : kwd_down option;
 kwd_to : kwd_to;
 bound : expr;
-step : (kwd_step * expr) option;
 block : block reg
 }
 
@@ -442,15 +431,24 @@ and for_collect = {
 kwd_for : kwd_for;
 var : variable;
 bind_to : (arrow * variable) option;
+colon : colon;
+elt_type : type_expr;
 kwd_in : kwd_in;
+collection : collection;
 expr : expr;
 block : block reg
 }
 
+and collection =
+Map of kwd_map
+| Set of kwd_set
+| List of kwd_list
+
 (* Expressions *)
 
 and expr =
-| ECase of expr case reg
+ECase of expr case reg
+| ECond of cond_expr reg
 | EAnnot of annot_expr reg
 | ELogic of logic_expr
 | EArith of arith_expr
@@ -481,6 +479,13 @@ and 'a injection = {
 closing : closing
 }
 
+and 'a ne_injection = {
+opening : opening;
+ne_elements : ('a, semi) nsepseq;
+terminator : semi option;
+closing : closing
+}
+
 and opening =
 Kwd of keyword
 | KwdBracket of keyword * lbracket
@@ -492,6 +497,7 @@ and closing =
 and map_expr =
 MapLookUp of map_lookup reg
 | MapInj of binding reg injection reg
+| BigMapInj of binding reg injection reg
 
 and map_lookup = {
 path : path;
@@ -541,7 +547,7 @@ and arith_expr =
 | Neg of minus un_op reg
 | Int of (Lexer.lexeme * Z.t) reg
 | Nat of (Lexer.lexeme * Z.t) reg
-| Mtz of (Lexer.lexeme * Z.t) reg
+| Mutez of (Lexer.lexeme * Z.t) reg
 
 and string_expr =
 Cat of cat bin_op reg
@@ -577,16 +583,13 @@ and selection =
 FieldName of field_name
 | Component of (Lexer.lexeme * Z.t) reg
 
-and tuple_expr =
+and tuple_expr = (expr, comma) nsepseq par reg
-TupleInj of tuple_injection
-
-and tuple_injection = (expr, comma) nsepseq par reg
 
 and none_expr = c_None
 
 and fun_call = (fun_name * arguments) reg
 
-and arguments = tuple_injection
+and arguments = tuple_expr
 
 (* Patterns *)
 
@ -596,6 +599,7 @@ and pattern =
|
|||||||
| PVar of Lexer.lexeme reg
|
| PVar of Lexer.lexeme reg
|
||||||
| PWild of wild
|
| PWild of wild
|
||||||
| PInt of (Lexer.lexeme * Z.t) reg
|
| PInt of (Lexer.lexeme * Z.t) reg
|
||||||
|
| PNat of (Lexer.lexeme * Z.t) reg
|
||||||
| PBytes of (Lexer.lexeme * Hex.t) reg
|
| PBytes of (Lexer.lexeme * Hex.t) reg
|
||||||
| PString of Lexer.lexeme reg
|
| PString of Lexer.lexeme reg
|
||||||
| PUnit of c_Unit
|
| PUnit of c_Unit
|
||||||
@ -643,14 +647,15 @@ let rec expr_to_region = function
|
|||||||
| EBytes {region; _}
|
| EBytes {region; _}
|
||||||
| EUnit region
|
| EUnit region
|
||||||
| ECase {region;_}
|
| ECase {region;_}
|
||||||
|
| ECond {region; _}
|
||||||
| EPar {region; _} -> region
|
| EPar {region; _} -> region
|
||||||
|
|
||||||
and tuple_expr_to_region = function
|
and tuple_expr_to_region {region; _} = region
|
||||||
TupleInj {region; _} -> region
|
|
||||||
|
|
||||||
and map_expr_to_region = function
|
and map_expr_to_region = function
|
||||||
MapLookUp {region; _}
|
MapLookUp {region; _}
|
||||||
| MapInj {region; _} -> region
|
| MapInj {region; _} -> region
|
||||||
|
| BigMapInj {region; _} -> region
|
||||||
|
|
||||||
and set_expr_to_region = function
|
and set_expr_to_region = function
|
||||||
SetInj {region; _}
|
SetInj {region; _}
|
||||||
@ -676,7 +681,7 @@ and comp_expr_to_region = function
|
|||||||
| Neq {region; _} -> region
|
| Neq {region; _} -> region
|
||||||
|
|
||||||
and arith_expr_to_region = function
|
and arith_expr_to_region = function
|
||||||
| Add {region; _}
|
Add {region; _}
|
||||||
| Sub {region; _}
|
| Sub {region; _}
|
||||||
| Mult {region; _}
|
| Mult {region; _}
|
||||||
| Div {region; _}
|
| Div {region; _}
|
||||||
@ -684,13 +689,13 @@ and arith_expr_to_region = function
|
|||||||
| Neg {region; _}
|
| Neg {region; _}
|
||||||
| Int {region; _}
|
| Int {region; _}
|
||||||
| Nat {region; _}
|
| Nat {region; _}
|
||||||
| Mtz {region; _} -> region
|
| Mutez {region; _} -> region
|
||||||
|
|
||||||
and string_expr_to_region = function
|
and string_expr_to_region = function
|
||||||
Cat {region; _}
|
Cat {region; _}
|
||||||
| String {region; _} -> region
|
| String {region; _} -> region
|
||||||
|
|
||||||
and annot_expr_to_region ({region; _}) = region
|
and annot_expr_to_region {region; _} = region
|
||||||
|
|
||||||
and list_expr_to_region = function
|
and list_expr_to_region = function
|
||||||
Cons {region; _}
|
Cons {region; _}
|
||||||
@ -709,30 +714,34 @@ let path_to_region = function
|
|||||||
| Path {region; _} -> region
|
| Path {region; _} -> region
|
||||||
|
|
||||||
let instr_to_region = function
|
let instr_to_region = function
|
||||||
Single Cond {region; _}
|
Cond {region; _}
|
||||||
| Single CaseInstr {region; _}
|
| CaseInstr {region; _}
|
||||||
| Single Assign {region; _}
|
| Assign {region; _}
|
||||||
| Single Loop While {region; _}
|
| Loop While {region; _}
|
||||||
| Single Loop For ForInt {region; _}
|
| Loop For ForInt {region; _}
|
||||||
| Single Loop For ForCollect {region; _}
|
| Loop For ForCollect {region; _}
|
||||||
| Single ProcCall {region; _}
|
| ProcCall {region; _}
|
||||||
| Single Skip region
|
| Skip region
|
||||||
| Single RecordPatch {region; _}
|
| RecordPatch {region; _}
|
||||||
| Single MapPatch {region; _}
|
| MapPatch {region; _}
|
||||||
| Single SetPatch {region; _}
|
| SetPatch {region; _}
|
||||||
| Single MapRemove {region; _}
|
| MapRemove {region; _}
|
||||||
| Single SetRemove {region; _}
|
| SetRemove {region; _} -> region
|
||||||
| Block {region; _} -> region
|
|
||||||
|
let clause_block_to_region = function
|
||||||
|
LongBlock {region; _}
|
||||||
|
| ShortBlock {region; _} -> region
|
||||||
|
|
||||||
let if_clause_to_region = function
|
let if_clause_to_region = function
|
||||||
ClauseInstr instr -> instr_to_region instr
|
ClauseInstr instr -> instr_to_region instr
|
||||||
| ClauseBlock {region; _} -> region
|
| ClauseBlock clause_block -> clause_block_to_region clause_block
|
||||||
|
|
||||||
let pattern_to_region = function
|
let pattern_to_region = function
|
||||||
PCons {region; _}
|
PCons {region; _}
|
||||||
| PVar {region; _}
|
| PVar {region; _}
|
||||||
| PWild region
|
| PWild region
|
||||||
| PInt {region; _}
|
| PInt {region; _}
|
||||||
|
| PNat {region; _}
|
||||||
| PBytes {region; _}
|
| PBytes {region; _}
|
||||||
| PString {region; _}
|
| PString {region; _}
|
||||||
| PUnit region
|
| PUnit region
|
||||||
@ -748,7 +757,6 @@ let pattern_to_region = function
|
|||||||
|
|
||||||
let local_decl_to_region = function
|
let local_decl_to_region = function
|
||||||
LocalFun {region; _}
|
LocalFun {region; _}
|
||||||
| LocalProc {region; _}
|
|
||||||
| LocalData LocalConst {region; _}
|
| LocalData LocalConst {region; _}
|
||||||
| LocalData LocalVar {region; _} -> region
|
| LocalData LocalVar {region; _} -> region
|
||||||
|
|
||||||
|
@@ -47,7 +47,6 @@ type kwd_not = Region.t
 type kwd_of = Region.t
 type kwd_or = Region.t
 type kwd_patch = Region.t
-type kwd_procedure = Region.t
 type kwd_record = Region.t
 type kwd_remove = Region.t
 type kwd_set = Region.t
@@ -135,8 +134,15 @@ type 'a braces = {
 rbrace : rbrace
 }

-(* The Abstract Syntax Tree *)
+(** The Abstract Syntax Tree
+
+The AST mirrors the contents of Parser.mly, which defines a tree of parsing
+productions that are used to make a syntax tree from a given program input.
+
+This file defines the concrete AST for PascaLIGO, which is used to associate
+regions of the source code text with the contents of the syntax tree.
+*)

 type t = {
 decl : declaration nseq;
 eof : eof
@@ -147,7 +153,7 @@ and ast = t
 and declaration =
 TypeDecl of type_decl reg
 | ConstDecl of const_decl reg
-| LambdaDecl of lambda_decl
+| FunDecl of fun_decl reg

 and const_decl = {
 kwd_const : kwd_const;
@@ -172,7 +178,7 @@ and type_decl = {
 and type_expr =
 TProd of cartesian
 | TSum of (variant reg, vbar) nsepseq reg
-| TRecord of record_type
+| TRecord of field_decl reg ne_injection reg
 | TApp of (type_name * type_tuple) reg
 | TFun of (type_expr * arrow * type_expr) reg
 | TPar of type_expr par reg
@@ -182,11 +188,9 @@ and cartesian = (type_expr, times) nsepseq reg

 and variant = {
 constr : constr;
-args : (kwd_of * cartesian) option
+args : (kwd_of * type_expr) option
 }

-and record_type = field_decl reg injection reg
-
 and field_decl = {
 field_name : field_name;
 colon : colon;
@@ -195,11 +199,7 @@ and field_decl = {

 and type_tuple = (type_expr, comma) nsepseq par reg

-(* Function and procedure declarations *)
+(* Function declarations *)

-and lambda_decl =
-FunDecl of fun_decl reg
-| ProcDecl of proc_decl reg
-
 and fun_decl = {
 kwd_function : kwd_function;
@@ -209,21 +209,10 @@ and fun_decl = {
 ret_type : type_expr;
 kwd_is : kwd_is;
 local_decls : local_decl list;
-block : block reg;
-kwd_with : kwd_with;
+block : block reg option;
+kwd_with : kwd_with option;
 return : expr;
-terminator : semi option
-}
-
-and proc_decl = {
-kwd_procedure : kwd_procedure;
-name : variable;
-param : parameters;
-kwd_is : kwd_is;
-local_decls : local_decl list;
-block : block reg;
-terminator : semi option
-}
+terminator : semi option }

 and parameters = (param_decl, semi) nsepseq par reg

@@ -268,7 +257,6 @@ and statement =

 and local_decl =
 LocalFun of fun_decl reg
-| LocalProc of proc_decl reg
 | LocalData of data_decl

 and data_decl =
@@ -286,12 +274,8 @@ and var_decl = {
 }

 and instruction =
-Single of single_instr
-| Block of block reg
-
-and single_instr =
 Cond of conditional reg
-| CaseInstr of instruction case reg
+| CaseInstr of if_clause case reg
 | Assign of assignment reg
 | Loop of loop
 | ProcCall of fun_call
@@ -322,14 +306,14 @@ and set_patch = {
 kwd_patch : kwd_patch;
 path : path;
 kwd_with : kwd_with;
-set_inj : expr injection reg
+set_inj : expr ne_injection reg
 }

 and map_patch = {
 kwd_patch : kwd_patch;
 path : path;
 kwd_with : kwd_with;
-map_inj : binding reg injection reg
+map_inj : binding reg ne_injection reg
 }

 and binding = {
@@ -342,7 +326,17 @@ and record_patch = {
 kwd_patch : kwd_patch;
 path : path;
 kwd_with : kwd_with;
-record_inj : field_assign reg injection reg
+record_inj : field_assign reg ne_injection reg
 }

+and cond_expr = {
+kwd_if : kwd_if;
+test : expr;
+kwd_then : kwd_then;
+ifso : expr;
+terminator : semi option;
+kwd_else : kwd_else;
+ifnot : expr
+}
+
 and conditional = {
@@ -357,7 +351,11 @@ and conditional = {

 and if_clause =
 ClauseInstr of instruction
-| ClauseBlock of (statements * semi option) braces reg
+| ClauseBlock of clause_block
+
+and clause_block =
+LongBlock of block reg
+| ShortBlock of (statements * semi option) braces reg

 and set_membership = {
 set : expr;
@@ -409,10 +407,8 @@ and for_loop =
 and for_int = {
 kwd_for : kwd_for;
 assign : var_assign reg;
-down : kwd_down option;
 kwd_to : kwd_to;
 bound : expr;
-step : (kwd_step * expr) option;
 block : block reg
 }

@@ -426,15 +422,24 @@ and for_collect = {
 kwd_for : kwd_for;
 var : variable;
 bind_to : (arrow * variable) option;
+colon : colon;
+elt_type : type_expr;
 kwd_in : kwd_in;
+collection : collection;
 expr : expr;
 block : block reg
 }

+and collection =
+Map of kwd_map
+| Set of kwd_set
+| List of kwd_list
+
 (* Expressions *)

 and expr =
-| ECase of expr case reg
+ECase of expr case reg
+| ECond of cond_expr reg
 | EAnnot of annot_expr reg
 | ELogic of logic_expr
 | EArith of arith_expr
@@ -465,6 +470,13 @@ and 'a injection = {
 closing : closing
 }

+and 'a ne_injection = {
+opening : opening;
+ne_elements : ('a, semi) nsepseq;
+terminator : semi option;
+closing : closing
+}
+
 and opening =
 Kwd of keyword
 | KwdBracket of keyword * lbracket
@@ -476,6 +488,7 @@ and closing =
 and map_expr =
 MapLookUp of map_lookup reg
 | MapInj of binding reg injection reg
+| BigMapInj of binding reg injection reg

 and map_lookup = {
 path : path;
@@ -525,7 +538,7 @@ and arith_expr =
 | Neg of minus un_op reg
 | Int of (Lexer.lexeme * Z.t) reg
 | Nat of (Lexer.lexeme * Z.t) reg
-| Mtz of (Lexer.lexeme * Z.t) reg
+| Mutez of (Lexer.lexeme * Z.t) reg

 and string_expr =
 Cat of cat bin_op reg
@@ -561,16 +574,13 @@ and selection =
 FieldName of field_name
 | Component of (Lexer.lexeme * Z.t) reg

-and tuple_expr =
-TupleInj of tuple_injection
-
-and tuple_injection = (expr, comma) nsepseq par reg
+and tuple_expr = (expr, comma) nsepseq par reg

 and none_expr = c_None

 and fun_call = (fun_name * arguments) reg

-and arguments = tuple_injection
+and arguments = tuple_expr

 (* Patterns *)

@@ -580,6 +590,7 @@ and pattern =
 | PVar of Lexer.lexeme reg
 | PWild of wild
 | PInt of (Lexer.lexeme * Z.t) reg
+| PNat of (Lexer.lexeme * Z.t) reg
 | PBytes of (Lexer.lexeme * Hex.t) reg
 | PString of Lexer.lexeme reg
 | PUnit of c_Unit
@@ -332,18 +332,18 @@ There are three kinds of native numerical types in PascaLIGO: `int`,
 other, for example `00` is invalid. Also, for the sake of convenience,
 underscores are allowed in the literals, like `1_000_000`.

-* The second numerical type is the type of the natural numbers,
-e.g., `0n` or `13n`. Note that the `nat` literals must be annotated
-with the suffix `n`, which distinguishes them from `int` literals. The
-same convenient use of underscores as with integer literals is allowed
-too and the canonical form of zero is `0n`.
+* The second numerical type is the type of the natural numbers, e.g.,
+`0n` or `13n`. Note that the `nat` literals must be annotated with the
+suffix `n`, which distinguishes them from `int` literals. The same
+convenient use of underscores as with integer literals is allowed too
+and the canonical form of zero is `0n`.

-* The last kind of native numerical type is `tez`, which is a unit
-of measure of the amounts (fees, accounts). Beware: the literals of
-the type `tez` are annotated with the suffix `mtz`, which stands for
-millionth of Tez, for instance, `0mtz` or `1200000mtz`. The same handy
-use of underscores as in natural literals help in the writing, like
-`1_200_000mtz`.
+* The last kind of native numerical type is `tez`, which is a unit of
+measure of the amounts (fees, accounts). Beware: the literals of the
+type `tez` are annotated with the suffix `mutez`, which stands for
+millionth of Tez, for instance, `0mutez` or `1200000mutez`. The same
+handy use of underscores as in natural literals help in the writing,
+like `1_200_000mutez`.

 To see how numerical types can be used in expressions see the sections
 "Predefined operators" and "Predefined values".
@@ -533,14 +533,13 @@ in terse style (see section "Predefined types and values/Lists").

 Given a tuple `t` with _n_ components, the `i`th component is

-t.(i)
+t.i

-where `t.(0)` is the first component. For example, given the
-declaration
+where `t.0` is the first component. For example, given the declaration

 const t : int * string = (4, "four")

-the expression `t.(1)` has the value `"four"`.
+the expression `t.1` has the value `"four"`.

 #### Records

@@ -833,7 +832,7 @@ example, in verbose style:
 A value of that type could be

 record
-goal = 10mtz;
+goal = 10mutez;
 deadline = "...";
 backers = map end;
 funded = False
@@ -331,7 +331,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | var
 | String
 | Bytes
@@ -337,7 +337,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | var
 | String
 | Bytes
@@ -317,7 +317,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -295,7 +295,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -289,7 +289,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -292,7 +292,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -279,7 +279,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -284,7 +284,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -288,7 +288,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -283,7 +283,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | String
 | Bytes
@@ -281,7 +281,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) brackets(expr) (* lookup *)
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -285,7 +285,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -270,7 +270,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident (* var *)
 | Ident (* var *) brackets(expr) (* lookup *)
 | Ident (* struct_name *) DOT nsepseq(selection,DOT) option(brackets(expr))
@@ -291,7 +291,7 @@ unary_expr ::=
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident option(core_suffix)
 | String
 | Bytes
@@ -349,7 +349,7 @@ XXX
 core_expr ::=
 Int
 | Nat
-| Mtz
+| Mutez
 | Ident option(core_suffix)
 | String
 | Bytes
|
|||||||
(* This signature defines the lexical tokens for LIGO
|
(** This signature defines the lexical tokens for LIGO
|
||||||
|
|
||||||
_Tokens_ are the abstract units which are used by the parser to
|
_Tokens_ are the abstract units which are used by the parser to
|
||||||
build the abstract syntax tree (AST), in other words, the stream of
|
build the abstract syntax tree (AST), in other words, the stream of
|
||||||
@ -35,7 +35,7 @@ type t =
|
|||||||
| Bytes of (lexeme * Hex.t) Region.reg
|
| Bytes of (lexeme * Hex.t) Region.reg
|
||||||
| Int of (lexeme * Z.t) Region.reg
|
| Int of (lexeme * Z.t) Region.reg
|
||||||
| Nat of (lexeme * Z.t) Region.reg
|
| Nat of (lexeme * Z.t) Region.reg
|
||||||
| Mtz of (lexeme * Z.t) Region.reg
|
| Mutez of (lexeme * Z.t) Region.reg
|
||||||
| Ident of lexeme Region.reg
|
| Ident of lexeme Region.reg
|
||||||
| Constr of lexeme Region.reg
|
| Constr of lexeme Region.reg
|
||||||
|
|
||||||
@ -53,13 +53,13 @@ type t =
|
|||||||
| VBAR of Region.t (* "|" *)
|
| VBAR of Region.t (* "|" *)
|
||||||
| ARROW of Region.t (* "->" *)
|
| ARROW of Region.t (* "->" *)
|
||||||
| ASS of Region.t (* ":=" *)
|
| ASS of Region.t (* ":=" *)
|
||||||
| EQUAL of Region.t (* "=" *)
|
| EQ of Region.t (* "=" *)
|
||||||
| COLON of Region.t (* ":" *)
|
| COLON of Region.t (* ":" *)
|
||||||
| LT of Region.t (* "<" *)
|
| LT of Region.t (* "<" *)
|
||||||
| LEQ of Region.t (* "<=" *)
|
| LE of Region.t (* "<=" *)
|
||||||
| GT of Region.t (* ">" *)
|
| GT of Region.t (* ">" *)
|
||||||
| GEQ of Region.t (* ">=" *)
|
| GE of Region.t (* ">=" *)
|
||||||
| NEQ of Region.t (* "=/=" *)
|
| NE of Region.t (* "=/=" *)
|
||||||
| PLUS of Region.t (* "+" *)
|
| PLUS of Region.t (* "+" *)
|
||||||
| MINUS of Region.t (* "-" *)
|
| MINUS of Region.t (* "-" *)
|
||||||
| SLASH of Region.t (* "/" *)
|
| SLASH of Region.t (* "/" *)
|
||||||
@ -137,23 +137,20 @@ val to_region : token -> Region.t
|
|||||||
|
|
||||||
(* Injections *)
|
(* Injections *)
|
||||||
|
|
||||||
type int_err =
|
type int_err = Non_canonical_zero
|
||||||
Non_canonical_zero
|
|
||||||
|
|
||||||
type ident_err = Reserved_name
|
type ident_err = Reserved_name
|
||||||
|
type nat_err = Invalid_natural
|
||||||
type invalid_natural =
|
|
||||||
| Invalid_natural
|
|
||||||
| Non_canonical_zero_nat
|
| Non_canonical_zero_nat
|
||||||
|
type sym_err = Invalid_symbol
|
||||||
|
|
||||||
|
val mk_int : lexeme -> Region.t -> (token, int_err) result
|
||||||
|
val mk_nat : lexeme -> Region.t -> (token, nat_err) result
|
||||||
|
val mk_mutez : lexeme -> Region.t -> (token, int_err) result
|
||||||
|
val mk_ident : lexeme -> Region.t -> (token, ident_err) result
|
||||||
|
val mk_sym : lexeme -> Region.t -> (token, sym_err) result
|
||||||
val mk_string : lexeme -> Region.t -> token
|
val mk_string : lexeme -> Region.t -> token
|
||||||
val mk_bytes : lexeme -> Region.t -> token
|
val mk_bytes : lexeme -> Region.t -> token
|
||||||
val mk_int : lexeme -> Region.t -> (token, int_err) result
|
|
||||||
val mk_nat : lexeme -> Region.t -> (token, invalid_natural) result
|
|
||||||
val mk_mtz : lexeme -> Region.t -> (token, int_err) result
|
|
||||||
val mk_ident : lexeme -> Region.t -> (token, ident_err) result
|
|
||||||
val mk_constr : lexeme -> Region.t -> token
|
val mk_constr : lexeme -> Region.t -> token
|
||||||
val mk_sym : lexeme -> Region.t -> token
|
|
||||||
val eof : Region.t -> token
|
val eof : Region.t -> token
|
||||||
|
|
||||||
(* Predicates *)
|
(* Predicates *)
|
||||||
|
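Every smart constructor that can reject its lexeme now returns a `result` against one of the error types declared above instead of asserting. A hedged sketch of how a caller might consume the new signatures; the `token` type and the constructors here are simplified stand-ins, not the real LexToken ones:

```ocaml
(* Stand-in types: only the shapes matter for this illustration. *)
type token = Nat of string | Mutez of string
type nat_err = Invalid_natural | Non_canonical_zero_nat
type int_err = Non_canonical_zero

(* Hypothetical constructors with the new result-returning signatures. *)
let mk_nat lexeme =
  if lexeme = "00n" then Error Non_canonical_zero_nat else Ok (Nat lexeme)

let mk_mutez lexeme =
  if lexeme = "00mutez" then Error Non_canonical_zero else Ok (Mutez lexeme)

(* The lexer driver can now report a diagnostic instead of hitting an assert. *)
let scan_nat lexeme =
  match mk_nat lexeme with
    Ok tok                       -> tok
  | Error Invalid_natural        -> failwith ("not a natural: " ^ lexeme)
  | Error Non_canonical_zero_nat -> failwith ("write 0n instead of: " ^ lexeme)
```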
@@ -33,7 +33,7 @@ type t =
 | Bytes of (lexeme * Hex.t) Region.reg
 | Int of (lexeme * Z.t) Region.reg
 | Nat of (lexeme * Z.t) Region.reg
-| Mtz of (lexeme * Z.t) Region.reg
+| Mutez of (lexeme * Z.t) Region.reg
 | Ident of lexeme Region.reg
 | Constr of lexeme Region.reg

@@ -51,13 +51,13 @@ type t =
 | VBAR of Region.t
 | ARROW of Region.t
 | ASS of Region.t
-| EQUAL of Region.t
+| EQ of Region.t
 | COLON of Region.t
 | LT of Region.t
-| LEQ of Region.t
+| LE of Region.t
 | GT of Region.t
-| GEQ of Region.t
-| NEQ of Region.t
+| GE of Region.t
+| NE of Region.t
 | PLUS of Region.t
 | MINUS of Region.t
 | SLASH of Region.t
@@ -160,8 +160,8 @@ let proj_token = function
 | Nat Region.{region; value = s,n} ->
 region, sprintf "Nat (\"%s\", %s)" s (Z.to_string n)

-| Mtz Region.{region; value = s,n} ->
-region, sprintf "Mtz (\"%s\", %s)" s (Z.to_string n)
+| Mutez Region.{region; value = s,n} ->
+region, sprintf "Mutez (\"%s\", %s)" s (Z.to_string n)

 | Ident Region.{region; value} ->
 region, sprintf "Ident \"%s\"" value
@@ -183,13 +183,13 @@ let proj_token = function
 | VBAR region -> region, "VBAR"
 | ARROW region -> region, "ARROW"
 | ASS region -> region, "ASS"
-| EQUAL region -> region, "EQUAL"
+| EQ region -> region, "EQ"
 | COLON region -> region, "COLON"
 | LT region -> region, "LT"
-| LEQ region -> region, "LEQ"
+| LE region -> region, "LE"
 | GT region -> region, "GT"
-| GEQ region -> region, "GEQ"
-| NEQ region -> region, "NEQ"
+| GE region -> region, "GE"
+| NE region -> region, "NE"
 | PLUS region -> region, "PLUS"
 | MINUS region -> region, "MINUS"
 | SLASH region -> region, "SLASH"
@@ -258,7 +258,7 @@ let to_lexeme = function
 | Bytes b -> fst b.Region.value
 | Int i
 | Nat i
-| Mtz i -> fst i.Region.value
+| Mutez i -> fst i.Region.value
 | Ident id
 | Constr id -> id.Region.value

@@ -276,13 +276,13 @@ let to_lexeme = function
 | VBAR _ -> "|"
 | ARROW _ -> "->"
 | ASS _ -> ":="
-| EQUAL _ -> "="
+| EQ _ -> "="
 | COLON _ -> ":"
 | LT _ -> "<"
-| LEQ _ -> "<="
+| LE _ -> "<="
 | GT _ -> ">"
-| GEQ _ -> ">="
-| NEQ _ -> "=/="
+| GE _ -> ">="
+| NE _ -> "=/="
 | PLUS _ -> "+"
 | MINUS _ -> "-"
 | SLASH _ -> "/"
@@ -480,8 +480,8 @@ let mk_int lexeme region =
 then Error Non_canonical_zero
 else Ok (Int Region.{region; value = lexeme, z})

-type invalid_natural =
-| Invalid_natural
+type nat_err =
+Invalid_natural
 | Non_canonical_zero_nat

 let mk_nat lexeme region =
@@ -497,46 +497,53 @@ let mk_nat lexeme region =
 else Ok (Nat Region.{region; value = lexeme, z})
 )

-let mk_mtz lexeme region =
+let mk_mutez lexeme region =
 let z =
 Str.(global_replace (regexp "_") "" lexeme) |>
-Str.(global_replace (regexp "mtz") "") |>
+Str.(global_replace (regexp "mutez") "") |>
 Z.of_string in
-if Z.equal z Z.zero && lexeme <> "0mtz"
+if Z.equal z Z.zero && lexeme <> "0mutez"
 then Error Non_canonical_zero
-else Ok (Mtz Region.{region; value = lexeme, z})
+else Ok (Mutez Region.{region; value = lexeme, z})

 let eof region = EOF region

+type sym_err = Invalid_symbol
+
 let mk_sym lexeme region =
 match lexeme with
-";" -> SEMI region
-| "," -> COMMA region
-| "(" -> LPAR region
-| ")" -> RPAR region
-| "{" -> LBRACE region
-| "}" -> RBRACE region
-| "[" -> LBRACKET region
-| "]" -> RBRACKET region
-| "#" -> CONS region
-| "|" -> VBAR region
-| "->" -> ARROW region
-| ":=" -> ASS region
-| "=" -> EQUAL region
-| ":" -> COLON region
-| "<" -> LT region
-| "<=" -> LEQ region
-| ">" -> GT region
-| ">=" -> GEQ region
-| "=/=" -> NEQ region
-| "+" -> PLUS region
-| "-" -> MINUS region
-| "/" -> SLASH region
-| "*" -> TIMES region
-| "." -> DOT region
-| "_" -> WILD region
-| "^" -> CAT region
-| _ -> assert false
+(* Lexemes in common with all concrete syntaxes *)
+";" -> Ok (SEMI region)
+| "," -> Ok (COMMA region)
+| "(" -> Ok (LPAR region)
+| ")" -> Ok (RPAR region)
+| "[" -> Ok (LBRACKET region)
+| "]" -> Ok (RBRACKET region)
+| "{" -> Ok (LBRACE region)
+| "}" -> Ok (RBRACE region)
+| "=" -> Ok (EQ region)
+| ":" -> Ok (COLON region)
+| "|" -> Ok (VBAR region)
+| "->" -> Ok (ARROW region)
+| "." -> Ok (DOT region)
+| "_" -> Ok (WILD region)
+| "^" -> Ok (CAT region)
+| "+" -> Ok (PLUS region)
+| "-" -> Ok (MINUS region)
+| "*" -> Ok (TIMES region)
+| "/" -> Ok (SLASH region)
+| "<" -> Ok (LT region)
+| "<=" -> Ok (LE region)
+| ">" -> Ok (GT region)
+| ">=" -> Ok (GE region)
+
+(* Lexemes specific to PascaLIGO *)
+| "=/=" -> Ok (NE region)
+| "#" -> Ok (CONS region)
+| ":=" -> Ok (ASS region)
+
+(* Invalid lexemes *)
+| _ -> Error Invalid_symbol

 (* Identifiers *)
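The renamed `mk_mutez` keeps the old `mk_mtz` normalisation: drop the `_` separators, drop the suffix, and reject a non-canonical zero. A self-contained sketch of that normalisation using only the OCaml standard library (the real code uses `Str` and `Z` as shown above); `normalize_mutez` is an illustrative name, not part of the codebase:

```ocaml
(* Strip '_' separators and the trailing "mutez" suffix from a literal like
   "1_200_000mutez"; reject the non-canonical spellings of zero. *)
let normalize_mutez lexeme =
  let no_sep = String.concat "" (String.split_on_char '_' lexeme) in
  let digits =
    if Filename.check_suffix no_sep "mutez"
    then Filename.chop_suffix no_sep "mutez"
    else no_sep in
  match int_of_string_opt digits with
    None -> Error "not a mutez literal"
  | Some 0 when lexeme <> "0mutez" -> Error "non-canonical zero"
  | Some n -> Ok n

(* normalize_mutez "1_200_000mutez" = Ok 1200000
   normalize_mutez "00mutez"        = Error "non-canonical zero" *)
```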
@@ -632,13 +639,13 @@ let is_sym = function
 | VBAR _
 | ARROW _
 | ASS _
-| EQUAL _
+| EQ _
 | COLON _
 | LT _
-| LEQ _
+| LE _
 | GT _
-| GEQ _
-| NEQ _
+| GE _
+| NE _
 | PLUS _
 | MINUS _
 | SLASH _
@@ -9,7 +9,7 @@
 %token <(LexToken.lexeme * Hex.t) Region.reg> Bytes
 %token <(LexToken.lexeme * Z.t) Region.reg> Int
 %token <(LexToken.lexeme * Z.t) Region.reg> Nat
-%token <(LexToken.lexeme * Z.t) Region.reg> Mtz
+%token <(LexToken.lexeme * Z.t) Region.reg> Mutez
 %token <LexToken.lexeme Region.reg> Ident
 %token <LexToken.lexeme Region.reg> Constr

@@ -27,13 +27,13 @@
 %token <Region.t> VBAR (* "|" *)
 %token <Region.t> ARROW (* "->" *)
 %token <Region.t> ASS (* ":=" *)
-%token <Region.t> EQUAL (* "=" *)
+%token <Region.t> EQ (* "=" *)
 %token <Region.t> COLON (* ":" *)
 %token <Region.t> LT (* "<" *)
-%token <Region.t> LEQ (* "<=" *)
+%token <Region.t> LE (* "<=" *)
 %token <Region.t> GT (* ">" *)
-%token <Region.t> GEQ (* ">=" *)
-%token <Region.t> NEQ (* "=/=" *)
+%token <Region.t> GE (* ">=" *)
+%token <Region.t> NE (* "=/=" *)
 %token <Region.t> PLUS (* "+" *)
 %token <Region.t> MINUS (* "-" *)
 %token <Region.t> SLASH (* "/" *)
@@ -51,7 +51,6 @@
 %token <Region.t> Case (* "case" *)
 %token <Region.t> Const (* "const" *)
 %token <Region.t> Contains (* "contains" *)
-%token <Region.t> Down (* "down" *)
 %token <Region.t> Else (* "else" *)
 %token <Region.t> End (* "end" *)
 %token <Region.t> For (* "for" *)
@@ -68,12 +67,10 @@
 %token <Region.t> Of (* "of" *)
 %token <Region.t> Or (* "or" *)
 %token <Region.t> Patch (* "patch" *)
-%token <Region.t> Procedure (* "procedure" *)
 %token <Region.t> Record (* "record" *)
 %token <Region.t> Remove (* "remove" *)
 %token <Region.t> Set (* "set" *)
 %token <Region.t> Skip (* "skip" *)
-%token <Region.t> Step (* "step" *)
 %token <Region.t> Then (* "then" *)
 %token <Region.t> To (* "to" *)
 %token <Region.t> Type (* "type" *)
@@ -116,7 +116,7 @@ contract:
 declaration:
 type_decl { TypeDecl $1 }
 | const_decl { ConstDecl $1 }
-| lambda_decl { LambdaDecl $1 }
+| fun_decl { FunDecl $1 }

 (* Type declarations *)

@@ -137,23 +137,27 @@ type_decl:
 }

 type_expr:
-cartesian { TProd $1 }
-| sum_type { TSum $1 }
+sum_type { TSum $1 }
 | record_type { TRecord $1 }
+| cartesian { $1 }

 cartesian:
-nsepseq(function_type,TIMES) {
-let region = nsepseq_to_region type_expr_to_region $1
-in {region; value=$1}}
+function_type TIMES nsepseq(function_type,TIMES) {
+let value = Utils.nsepseq_cons $1 $2 $3 in
+let region = nsepseq_to_region type_expr_to_region value
+in TProd {region; value}
+}
+| function_type { ($1 : type_expr) }

 function_type:
 core_type {
 $1
 }
 | core_type ARROW function_type {
-let region = cover (type_expr_to_region $1)
-(type_expr_to_region $3)
-in TFun {region; value = ($1, $2, $3)} }
+let start = type_expr_to_region $1
+and stop = type_expr_to_region $3 in
+let region = cover start stop in
+TFun {region; value = $1,$2,$3} }

 core_type:
 type_name {
@@ -200,7 +204,7 @@ sum_type:

 variant:
 Constr Of cartesian {
-let region = cover $1.region $3.region
+let region = cover $1.region (type_expr_to_region $3)
 and value = {constr = $1; args = Some ($2, $3)}
 in {region; value}
 }
@@ -209,21 +213,21 @@ variant:

 record_type:
 Record sep_or_term_list(field_decl,SEMI) End {
-let elements, terminator = $2 in
+let ne_elements, terminator = $2 in
 let region = cover $1 $3
 and value = {
 opening = Kwd $1;
-elements = Some elements;
+ne_elements;
 terminator;
 closing = End $3}
 in {region; value}
 }
 | Record LBRACKET sep_or_term_list(field_decl,SEMI) RBRACKET {
-let elements, terminator = $3 in
+let ne_elements, terminator = $3 in
 let region = cover $1 $4
 and value = {
 opening = KwdBracket ($1,$2);
-elements = Some elements;
+ne_elements;
 terminator;
 closing = RBracket $4}
 in {region; value} }
@@ -235,11 +239,7 @@ field_decl:
 and value = {field_name = $1; colon = $2; field_type = $3}
 in {region; value} }

-(* Function and procedure declarations *)
+(* Function declarations *)

-lambda_decl:
-fun_decl { FunDecl $1 }
-| proc_decl { ProcDecl $1 }
-
 fun_decl:
 Function fun_name parameters COLON type_expr Is
@@ -259,30 +259,31 @@ fun_decl:
 ret_type = $5;
 kwd_is = $6;
 local_decls = $7;
-block = $8;
-kwd_with = $9;
+block = Some $8;
+kwd_with = Some $9;
 return = $10;
 terminator = $11}
 in {region;value}}
-
-proc_decl:
-Procedure fun_name parameters Is
-seq(local_decl)
-block option(SEMI)
-{
+| Function fun_name parameters COLON type_expr Is
+expr option(SEMI) {
 let stop =
-match $7 with
+match $8 with
 Some region -> region
-| None -> $6.region in
+| None -> expr_to_region $7 in
 let region = cover $1 stop
 and value = {
-kwd_procedure = $1;
+kwd_function = $1;
 name = $2;
 param = $3;
-kwd_is = $4;
-local_decls = $5;
-block = $6;
-terminator = $7}
+colon = $4;
+ret_type = $5;
+kwd_is = $6;
+local_decls = [];
+block = None;
+kwd_with = None;
+return = $7;
+terminator = $8;
+}
 in {region;value}}

 parameters:
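The second `fun_decl` production above builds the record with `block = None` and `kwd_with = None`, which is why those two fields became options in the AST. A hedged sketch of the case analysis that consumers of the record now need; the record type below is a pared-down stand-in, not the real `AST.fun_decl`:

```ocaml
(* Only the two fields relevant here; the real fun_decl record has more. *)
type 'a opt_block = { block : 'a option; kwd_with : unit option }

let describe_body fd =
  match fd.block, fd.kwd_with with
    Some _, Some _ -> "block body: ... is <decls> begin ... end with <expr>"
  | None, None     -> "expression body: ... is <expr>"
  | _              -> "ill-formed: block and `with` should appear together"
```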
@ -310,7 +311,7 @@ param_decl:
|
|||||||
in ParamConst {region; value}}
|
in ParamConst {region; value}}
|
||||||
|
|
||||||
param_type:
|
param_type:
|
||||||
cartesian { TProd $1 }
|
cartesian { $1 }
|
||||||
|
|
||||||
block:
|
block:
|
||||||
Begin sep_or_term_list(statement,SEMI) End {
|
Begin sep_or_term_list(statement,SEMI) End {
|
||||||
@ -342,7 +343,7 @@ open_data_decl:
|
|||||||
| open_var_decl { LocalVar $1 }
|
| open_var_decl { LocalVar $1 }
|
||||||
|
|
||||||
open_const_decl:
|
open_const_decl:
|
||||||
Const unqualified_decl(EQUAL) {
|
Const unqualified_decl(EQ) {
|
||||||
let name, colon, const_type, equal, init, stop = $2 in
|
let name, colon, const_type, equal, init, stop = $2 in
|
||||||
let region = cover $1 stop
|
let region = cover $1 stop
|
||||||
and value = {
|
and value = {
|
||||||
@ -371,7 +372,6 @@ open_var_decl:
|
|||||||
|
|
||||||
local_decl:
|
local_decl:
|
||||||
fun_decl { LocalFun $1 }
|
fun_decl { LocalFun $1 }
|
||||||
| proc_decl { LocalProc $1 }
|
|
||||||
| data_decl { LocalData $1 }
|
| data_decl { LocalData $1 }
|
||||||
|
|
||||||
data_decl:
|
data_decl:
|
||||||
@ -398,10 +398,6 @@ var_decl:
|
|||||||
| open_var_decl { $1 }
|
| open_var_decl { $1 }
|
||||||
|
|
||||||
instruction:
|
instruction:
|
||||||
single_instr { Single $1 }
|
|
||||||
| block { Block $1 }
|
|
||||||
|
|
||||||
single_instr:
|
|
||||||
conditional { Cond $1 }
|
conditional { Cond $1 }
|
||||||
| case_instr { CaseInstr $1 }
|
| case_instr { CaseInstr $1 }
|
||||||
| assignment { Assign $1 }
|
| assignment { Assign $1 }
|
||||||
@ -437,7 +433,7 @@ map_remove:
|
|||||||
in {region; value}}
|
in {region; value}}
|
||||||
|
|
||||||
set_patch:
|
set_patch:
|
||||||
Patch path With injection(Set,expr) {
|
Patch path With ne_injection(Set,expr) {
|
||||||
let region = cover $1 $4.region in
|
let region = cover $1 $4.region in
|
||||||
let value = {
|
let value = {
|
||||||
kwd_patch = $1;
|
kwd_patch = $1;
|
||||||
@ -447,7 +443,7 @@ set_patch:
|
|||||||
in {region; value}}
|
in {region; value}}
|
||||||
|
|
||||||
map_patch:
|
map_patch:
|
||||||
Patch path With injection(Map,binding) {
|
Patch path With ne_injection(Map,binding) {
|
||||||
let region = cover $1 $4.region in
|
let region = cover $1 $4.region in
|
||||||
let value = {
|
let value = {
|
||||||
kwd_patch = $1;
|
kwd_patch = $1;
|
||||||
@ -495,6 +491,28 @@ injection(Kind,element):
|
|||||||
closing = RBracket $3}
|
closing = RBracket $3}
|
||||||
in {region; value}}
|
in {region; value}}
|
||||||
|
|
||||||
|
ne_injection(Kind,element):
|
||||||
|
Kind sep_or_term_list(element,SEMI) End {
|
||||||
|
let ne_elements, terminator = $2 in
|
||||||
|
let region = cover $1 $3
|
||||||
|
and value = {
|
||||||
|
opening = Kwd $1;
|
||||||
|
ne_elements;
|
||||||
|
terminator;
|
||||||
|
closing = End $3}
|
||||||
|
in {region; value}
|
||||||
|
}
|
||||||
|
| Kind LBRACKET sep_or_term_list(element,SEMI) RBRACKET {
|
||||||
|
let ne_elements, terminator = $3 in
|
||||||
|
let region = cover $1 $4
|
||||||
|
and value = {
|
||||||
|
opening = KwdBracket ($1,$2);
|
||||||
|
ne_elements;
|
||||||
|
terminator;
|
||||||
|
closing = RBracket $4}
|
||||||
|
in {region; value}
|
||||||
|
}
|
||||||
|
|
||||||
binding:
|
binding:
|
||||||
expr ARROW expr {
|
expr ARROW expr {
|
||||||
let start = expr_to_region $1
|
let start = expr_to_region $1
|
||||||
@ -507,7 +525,7 @@ binding:
|
|||||||
in {region; value}}
|
in {region; value}}
|
||||||
|
|
||||||
record_patch:
|
record_patch:
|
||||||
Patch path With record_expr {
|
Patch path With ne_injection(Record,field_assignment) {
|
||||||
let region = cover $1 $4.region in
|
let region = cover $1 $4.region in
|
||||||
let value = {
|
let value = {
|
||||||
kwd_patch = $1;
|
kwd_patch = $1;
|
||||||
@ -522,7 +540,7 @@ proc_call:
|
|||||||
conditional:
|
conditional:
|
||||||
If expr Then if_clause option(SEMI) Else if_clause {
|
If expr Then if_clause option(SEMI) Else if_clause {
|
||||||
let region = cover $1 (if_clause_to_region $7) in
|
let region = cover $1 (if_clause_to_region $7) in
|
||||||
let value = {
|
let value : conditional = {
|
||||||
kwd_if = $1;
|
kwd_if = $1;
|
||||||
test = $2;
|
test = $2;
|
||||||
kwd_then = $3;
|
kwd_then = $3;
|
||||||
@ -533,19 +551,22 @@ conditional:
|
|||||||
in {region; value} }
|
in {region; value} }
|
||||||
|
|
||||||
if_clause:
|
if_clause:
|
||||||
instruction {
|
instruction { ClauseInstr $1 }
|
||||||
ClauseInstr $1
|
| clause_block { ClauseBlock $1 }
|
||||||
}
|
|
||||||
|
clause_block:
|
||||||
|
block {
|
||||||
|
LongBlock $1 }
|
||||||
| LBRACE sep_or_term_list(statement,SEMI) RBRACE {
|
| LBRACE sep_or_term_list(statement,SEMI) RBRACE {
|
||||||
let region = cover $1 $3 in
|
let region = cover $1 $3 in
|
||||||
let value = {
|
let value = {
|
||||||
lbrace = $1;
|
lbrace = $1;
|
||||||
inside = $2;
|
inside = $2;
|
||||||
rbrace = $3} in
|
rbrace = $3} in
|
||||||
ClauseBlock {value; region} }
|
ShortBlock {value; region} }
|
||||||
|
|
||||||
case_instr:
|
case_instr:
|
||||||
case(instruction) { $1 instr_to_region }
|
case(if_clause) { $1 if_clause_to_region }
|
||||||
|
|
||||||
case(rhs):
|
case(rhs):
|
||||||
Case expr Of option(VBAR) cases(rhs) End {
|
Case expr Of option(VBAR) cases(rhs) End {
|
||||||
@ -616,38 +637,42 @@ while_loop:
|
|||||||
in While {region; value}}
|
in While {region; value}}
|
||||||
|
|
||||||
for_loop:
|
for_loop:
|
||||||
For var_assign Down? To expr option(step_clause) block {
|
For var_assign To expr block {
|
||||||
let region = cover $1 $7.region in
|
let region = cover $1 $5.region in
|
||||||
let value = {
|
let value = {
|
||||||
kwd_for = $1;
|
kwd_for = $1;
|
||||||
assign = $2;
|
assign = $2;
|
||||||
down = $3;
|
kwd_to = $3;
|
||||||
kwd_to = $4;
|
bound = $4;
|
||||||
bound = $5;
|
block = $5}
|
||||||
step = $6;
|
|
||||||
block = $7}
|
|
||||||
in For (ForInt {region; value})
|
in For (ForInt {region; value})
|
||||||
}
|
}
|
||||||
| For var option(arrow_clause) In expr block {
|
| For var option(arrow_clause) COLON type_expr
|
||||||
let region = cover $1 $6.region in
|
In collection expr block {
|
||||||
|
let region = cover $1 $9.region in
|
||||||
let value = {
|
let value = {
|
||||||
kwd_for = $1;
|
kwd_for = $1;
|
||||||
var = $2;
|
var = $2;
|
||||||
bind_to = $3;
|
bind_to = $3;
|
||||||
kwd_in = $4;
|
colon = $4;
|
||||||
expr = $5;
|
elt_type = $5;
|
||||||
block = $6}
|
kwd_in = $6;
|
||||||
|
collection = $7;
|
||||||
|
expr = $8;
|
||||||
|
block = $9}
|
||||||
in For (ForCollect {region; value})}
|
in For (ForCollect {region; value})}
|
||||||
|
|
||||||
|
collection:
|
||||||
|
Map { Map $1 }
|
||||||
|
| Set { Set $1 }
|
||||||
|
| List { List $1 }
|
||||||
|
|
||||||
var_assign:
|
var_assign:
|
||||||
var ASS expr {
|
var ASS expr {
|
||||||
let region = cover $1.region (expr_to_region $3)
|
let region = cover $1.region (expr_to_region $3)
|
||||||
and value = {name = $1; assign = $2; expr = $3}
|
and value = {name = $1; assign = $2; expr = $3}
|
||||||
in {region; value}}
|
in {region; value}}
|
||||||
|
|
||||||
step_clause:
|
|
||||||
Step expr { $1,$2 }
|
|
||||||
|
|
||||||
arrow_clause:
|
arrow_clause:
|
||||||
ARROW var { $1,$2 }
|
ARROW var { $1,$2 }
|
||||||
|
|
||||||
@ -658,8 +683,22 @@ interactive_expr:
|
|||||||
|
|
||||||
expr:
|
expr:
|
||||||
case(expr) { ECase ($1 expr_to_region) }
|
case(expr) { ECase ($1 expr_to_region) }
|
||||||
|
| cond_expr { $1 }
|
||||||
| disj_expr { $1 }
|
| disj_expr { $1 }
|
||||||
|
|
||||||
|
cond_expr:
|
||||||
|
If expr Then expr option(SEMI) Else expr {
|
||||||
|
let region = cover $1 (expr_to_region $7) in
|
||||||
|
let value : cond_expr = {
|
||||||
|
kwd_if = $1;
|
||||||
|
test = $2;
|
||||||
|
kwd_then = $3;
|
||||||
|
ifso = $4;
|
||||||
|
terminator = $5;
|
||||||
|
kwd_else = $6;
|
||||||
|
ifnot = $7}
|
||||||
|
in ECond {region; value} }
|
||||||
|
|
||||||
disj_expr:
|
disj_expr:
|
||||||
disj_expr Or conj_expr {
|
disj_expr Or conj_expr {
|
||||||
let start = expr_to_region $1
|
let start = expr_to_region $1
|
||||||
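With the `cond_expr` rule above, a conditional can now occur as an expression (`ECond`) and not only as an instruction. A toy sketch of a consumer of that shape, with a deliberately pared-down expression type standing in for the real AST:

```ocaml
(* Pared-down stand-ins for AST.expr and AST.cond_expr. *)
type expr =
  EInt  of int
| ECond of cond_expr

and cond_expr = { test : expr; ifso : expr; ifnot : expr }

let rec eval = function
    EInt n -> n
  | ECond {test; ifso; ifnot} ->
      (* toy model: any non-zero test counts as true *)
      if eval test <> 0 then eval ifso else eval ifnot

let _ = eval (ECond {test = EInt 1; ifso = EInt 10; ifnot = EInt 20})  (* = 10 *)
```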
@ -701,7 +740,7 @@ comp_expr:
|
|||||||
and value = {arg1 = $1; op = $2; arg2 = $3}
|
and value = {arg1 = $1; op = $2; arg2 = $3}
|
||||||
in ELogic (CompExpr (Lt {region; value}))
|
in ELogic (CompExpr (Lt {region; value}))
|
||||||
}
|
}
|
||||||
| comp_expr LEQ cat_expr {
|
| comp_expr LE cat_expr {
|
||||||
let start = expr_to_region $1
|
let start = expr_to_region $1
|
||||||
and stop = expr_to_region $3 in
|
and stop = expr_to_region $3 in
|
||||||
let region = cover start stop
|
let region = cover start stop
|
||||||
@ -715,21 +754,21 @@ comp_expr:
|
|||||||
and value = {arg1 = $1; op = $2; arg2 = $3}
|
and value = {arg1 = $1; op = $2; arg2 = $3}
|
||||||
in ELogic (CompExpr (Gt {region; value}))
|
in ELogic (CompExpr (Gt {region; value}))
|
||||||
}
|
}
|
||||||
| comp_expr GEQ cat_expr {
|
| comp_expr GE cat_expr {
|
||||||
let start = expr_to_region $1
|
let start = expr_to_region $1
|
||||||
and stop = expr_to_region $3 in
|
and stop = expr_to_region $3 in
|
||||||
let region = cover start stop
|
let region = cover start stop
|
||||||
and value = {arg1 = $1; op = $2; arg2 = $3}
|
and value = {arg1 = $1; op = $2; arg2 = $3}
|
||||||
in ELogic (CompExpr (Geq {region; value}))
|
in ELogic (CompExpr (Geq {region; value}))
|
||||||
}
|
}
|
||||||
| comp_expr EQUAL cat_expr {
|
| comp_expr EQ cat_expr {
|
||||||
let start = expr_to_region $1
|
let start = expr_to_region $1
|
||||||
and stop = expr_to_region $3 in
|
and stop = expr_to_region $3 in
|
||||||
let region = cover start stop
|
let region = cover start stop
|
||||||
and value = {arg1 = $1; op = $2; arg2 = $3}
|
and value = {arg1 = $1; op = $2; arg2 = $3}
|
||||||
in ELogic (CompExpr (Equal {region; value}))
|
in ELogic (CompExpr (Equal {region; value}))
|
||||||
}
|
}
|
||||||
| comp_expr NEQ cat_expr {
|
| comp_expr NE cat_expr {
|
||||||
let start = expr_to_region $1
|
let start = expr_to_region $1
|
||||||
and stop = expr_to_region $3 in
|
and stop = expr_to_region $3 in
|
||||||
let region = cover start stop
|
let region = cover start stop
|
||||||
@@ -817,7 +856,7 @@ unary_expr:
 core_expr:
 Int { EArith (Int $1) }
 | Nat { EArith (Nat $1) }
-| Mtz { EArith (Mtz $1) }
+| Mutez { EArith (Mutez $1) }
 | var { EVar $1 }
 | String { EString (String $1) }
 | Bytes { EBytes $1 }
@@ -826,6 +865,7 @@ core_expr:
 | C_Unit { EUnit $1 }
 | annot_expr { EAnnot $1 }
 | tuple_expr { ETuple $1 }
+| par(expr) { EPar $1 }
 | list_expr { EList $1 }
 | C_None { EConstr (NoneExpr $1) }
 | fun_call { ECall $1 }
@@ -859,6 +899,7 @@ set_expr:
 map_expr:
 map_lookup { MapLookUp $1 }
 | injection(Map,binding) { MapInj $1 }
+| injection(BigMap,binding) { BigMapInj $1 }

 map_lookup:
 path brackets(expr) {
@@ -888,7 +929,7 @@ record_expr:
 Record sep_or_term_list(field_assignment,SEMI) End {
 let elements, terminator = $2 in
 let region = cover $1 $3
-and value = {
+and value : field_assign AST.reg injection = {
 opening = Kwd $1;
 elements = Some elements;
 terminator;
@@ -898,7 +939,7 @@ record_expr:
 | Record LBRACKET sep_or_term_list(field_assignment,SEMI) RBRACKET {
 let elements, terminator = $3 in
 let region = cover $1 $4
-and value = {
+and value : field_assign AST.reg injection = {
 opening = KwdBracket ($1,$2);
 elements = Some elements;
 terminator;
@@ -906,7 +947,7 @@ record_expr:
 in {region; value} }

 field_assignment:
-field_name EQUAL expr {
+field_name EQ expr {
 let region = cover $1.region (expr_to_region $3)
 and value = {
 field_name = $1;
@@ -920,13 +961,14 @@ fun_call:
 in {region; value = $1,$2}}

 tuple_expr:
-tuple_inj { TupleInj $1 }
+par(tuple_comp) { $1 }

-tuple_inj:
-par(nsepseq(expr,COMMA)) { $1 }
+tuple_comp:
+expr COMMA nsepseq(expr,COMMA) {
+Utils.nsepseq_cons $1 $2 $3}

 arguments:
-tuple_inj { $1 }
+par(nsepseq(expr,COMMA)) { $1 }

 list_expr:
 injection(List,expr) { List $1 }
@@ -935,14 +977,18 @@ list_expr:
 (* Patterns *)

 pattern:
-nsepseq(core_pattern,CONS) {
-let region = nsepseq_to_region pattern_to_region $1
-in PCons {region; value=$1}}
+core_pattern CONS nsepseq(core_pattern,CONS) {
+let value = Utils.nsepseq_cons $1 $2 $3 in
+let region = nsepseq_to_region pattern_to_region value
+in PCons {region; value}}
+| core_pattern { $1 }

 core_pattern:
 var { PVar $1 }
 | WILD { PWild $1 }
 | Int { PInt $1 }
+| Nat { PNat $1 }
+| Bytes { PBytes $1 }
 | String { PString $1 }
 | C_Unit { PUnit $1 }
 | C_False { PFalse $1 }
File diff suppressed because it is too large
@@ -12,3 +12,5 @@ val tokens_to_string : AST.t -> string
 val path_to_string : AST.path -> string
 val pattern_to_string : AST.pattern -> string
 val instruction_to_string : AST.instruction -> string
+
+val pp_ast : Buffer.t -> AST.t -> unit
@@ -103,6 +103,14 @@ let () =
 try
 let ast = Parser.contract tokeniser buffer in
 if Utils.String.Set.mem "ast" options.verbose
+then let buffer = Buffer.create 131 in
+begin
+ParserLog.offsets := options.offsets;
+ParserLog.mode := options.mode;
+ParserLog.pp_ast buffer ast;
+Buffer.output_buffer stdout buffer
+end
+else if Utils.String.Set.mem "ast-tokens" options.verbose
 then let buffer = Buffer.create 131 in
 begin
 ParserLog.offsets := options.offsets;
@@ -312,7 +312,7 @@ and unary_expr = parser
 and core_expr = parser
 [< 'Int _ >] -> ()
 | [< 'Nat _ >] -> ()
-| [< 'Mtz _ >] -> ()
+| [< 'Mutez _ >] -> ()
 | [< 'Ident _; _ = opt core_suffix >] -> ()
 | [< 'String _ >] -> ()
 | [< 'Bytes _ >] -> ()
@@ -39,7 +39,7 @@ let help language extension () =
 print " -q, --quiet No output, except errors (default)";
 print " --columns Columns for source locations";
 print " --bytes Bytes for source locations";
-print " --verbose=<stages> cmdline, cpp, ast (colon-separated)";
+print " --verbose=<stages> cmdline, cpp, ast-tokens, ast (colon-separated)";
 print " --version Commit hash on stdout";
 print " -h, --help This help";
 exit 0
@@ -62,20 +62,20 @@ module type TOKEN =

 type int_err = Non_canonical_zero
 type ident_err = Reserved_name
-type invalid_natural =
-| Invalid_natural
+type nat_err = Invalid_natural
 | Non_canonical_zero_nat
+type sym_err = Invalid_symbol

 (* Injections *)

+val mk_int : lexeme -> Region.t -> (token, int_err) result
+val mk_nat : lexeme -> Region.t -> (token, nat_err) result
+val mk_mutez : lexeme -> Region.t -> (token, int_err) result
+val mk_ident : lexeme -> Region.t -> (token, ident_err) result
+val mk_sym : lexeme -> Region.t -> (token, sym_err) result
 val mk_string : lexeme -> Region.t -> token
 val mk_bytes : lexeme -> Region.t -> token
-val mk_int : lexeme -> Region.t -> (token, int_err) result
-val mk_nat : lexeme -> Region.t -> (token, invalid_natural) result
-val mk_mtz : lexeme -> Region.t -> (token, int_err) result
-val mk_ident : lexeme -> Region.t -> (token, ident_err) result
 val mk_constr : lexeme -> Region.t -> token
-val mk_sym : lexeme -> Region.t -> token
 val eof : Region.t -> token

 (* Predicates *)
@@ -103,20 +103,20 @@ module type TOKEN =

 type int_err = Non_canonical_zero
 type ident_err = Reserved_name
-type invalid_natural =
-| Invalid_natural
+type nat_err = Invalid_natural
 | Non_canonical_zero_nat
+type sym_err = Invalid_symbol

 (* Injections *)

+val mk_int : lexeme -> Region.t -> (token, int_err) result
+val mk_nat : lexeme -> Region.t -> (token, nat_err) result
+val mk_mutez : lexeme -> Region.t -> (token, int_err) result
+val mk_ident : lexeme -> Region.t -> (token, ident_err) result
+val mk_sym : lexeme -> Region.t -> (token, sym_err) result
 val mk_string : lexeme -> Region.t -> token
 val mk_bytes : lexeme -> Region.t -> token
-val mk_int : lexeme -> Region.t -> (token, int_err) result
-val mk_nat : lexeme -> Region.t -> (token, invalid_natural) result
-val mk_mtz : lexeme -> Region.t -> (token, int_err) result
-val mk_ident : lexeme -> Region.t -> (token, ident_err) result
 val mk_constr : lexeme -> Region.t -> token
-val mk_sym : lexeme -> Region.t -> token
 val eof : Region.t -> token

 (* Predicates *)
@@ -343,6 +343,7 @@ module Make (Token: TOKEN) : (S with module Token = Token) =
 type Error.t += Broken_string
 type Error.t += Invalid_character_in_string
 type Error.t += Reserved_name
+type Error.t += Invalid_symbol
 type Error.t += Invalid_natural

 let error_to_string = function
@@ -386,6 +387,9 @@ module Make (Token: TOKEN) : (S with module Token = Token) =
 | Reserved_name ->
 "Reserved named.\n\
 Hint: Change the name.\n"
+| Invalid_symbol ->
+"Invalid symbol.\n\
+Hint: Check the LIGO syntax you use.\n"
 | Invalid_natural ->
 "Invalid natural."
 | _ -> assert false
@@ -432,9 +436,9 @@ module Make (Token: TOKEN) : (S with module Token = Token) =
 | Error Token.Invalid_natural ->
 fail region Invalid_natural

-let mk_mtz state buffer =
+let mk_mutez state buffer =
 let region, lexeme, state = sync state buffer in
-match Token.mk_mtz lexeme region with
+match Token.mk_mutez lexeme region with
 Ok token -> token, state
 | Error Token.Non_canonical_zero ->
 fail region Non_canonical_zero
@@ -443,7 +447,7 @@ module Make (Token: TOKEN) : (S with module Token = Token) =
 let region, lexeme, state = sync state buffer in
 let lexeme = Str.string_before lexeme (String.index lexeme 't') in
 let lexeme = Z.mul (Z.of_int 1_000_000) (Z.of_string lexeme) in
-match Token.mk_mtz (Z.to_string lexeme ^ "mtz") region with
+match Token.mk_mutez (Z.to_string lexeme ^ "mutez") region with
 Ok token -> token, state
 | Error Token.Non_canonical_zero ->
 fail region Non_canonical_zero
@@ -457,9 +461,9 @@ module Make (Token: TOKEN) : (S with module Token = Token) =
 let num = Z.of_string (integral ^ fractional)
 and den = Z.of_string ("1" ^ String.make (len-index-1) '0')
 and million = Q.of_string "1000000" in
-let mtz = Q.make num den |> Q.mul million in
-let should_be_1 = Q.den mtz in
-if Z.equal Z.one should_be_1 then Some (Q.num mtz) else None
+let mutez = Q.make num den |> Q.mul million in
+let should_be_1 = Q.den mutez in
+if Z.equal Z.one should_be_1 then Some (Q.num mutez) else None
 | exception Not_found -> assert false

 let mk_tz_decimal state buffer =
@@ -467,7 +471,7 @@ module Make (Token: TOKEN) : (S with module Token = Token) =
 let lexeme = Str.string_before lexeme (String.index lexeme 't') in
 match format_tz lexeme with
 | Some tz -> (
-match Token.mk_mtz (Z.to_string tz ^ "mtz") region with
+match Token.mk_mutez (Z.to_string tz ^ "mutez") region with
 Ok token ->
 token, state
 | Error Token.Non_canonical_zero ->
@@ -487,8 +491,10 @@ module Make (Token: TOKEN) : (S with module Token = Token) =
 in Token.mk_constr lexeme region, state

 let mk_sym state buffer =
-let region, lexeme, state = sync state buffer
-in Token.mk_sym lexeme region, state
+let region, lexeme, state = sync state buffer in
+match Token.mk_sym lexeme region with
+Ok token -> token, state
+| Error Token.Invalid_symbol -> fail region Invalid_symbol

 let mk_eof state buffer =
 let region, _, state = sync state buffer
@@ -518,11 +524,16 @@ let byte_seq = byte | byte (byte | '_')* byte
 let bytes = "0x" (byte_seq? as seq)
 let esc = "\\n" | "\\\"" | "\\\\" | "\\b"
 | "\\r" | "\\t" | "\\x" byte
-let symbol = ';' | ',' | '(' | ')'| '[' | ']' | '{' | '}'
-| '#' | '|' | "->" | ":=" | '=' | ':'
-| '<' | "<=" | '>' | ">=" | "=/=" | "<>"
-| '+' | '-' | '*' | '/' | '.' | '_' | '^'
-| "::" | "||" | "&&"
+let pascaligo_sym = "=/=" | '#' | ":="
+let cameligo_sym = "<>" | "::" | "||" | "&&"
+
+let symbol =
+';' | ',' | '(' | ')'| '[' | ']' | '{' | '}'
+| '=' | ':' | '|' | "->" | '.' | '_' | '^'
+| '+' | '-' | '*' | '/'
+| '<' | "<=" | '>' | ">="
+| pascaligo_sym | cameligo_sym

 let string = [^'"' '\\' '\n']* (* For strings of #include *)

 (* RULES *)
@@ -548,7 +559,7 @@ and scan state = parse
 | bytes { (mk_bytes seq) state lexbuf |> enqueue }
 | natural 'n' { mk_nat state lexbuf |> enqueue }
 | natural 'p' { mk_nat state lexbuf |> enqueue }
-| natural "mtz" { mk_mtz state lexbuf |> enqueue }
+| natural "mutez" { mk_mutez state lexbuf |> enqueue }
 | natural "tz" { mk_tz state lexbuf |> enqueue }
 | decimal "tz" { mk_tz_decimal state lexbuf |> enqueue }
 | natural { mk_int state lexbuf |> enqueue }
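For orientation, the following is a hypothetical LIGO (PascaLIGO-style) fragment, written only to illustrate the literal shapes the scan rules above tokenize; it is not taken from this commit, and the declarations and type annotations are illustrative assumptions.

```
// Hypothetical snippet showing literals matched by the rules above
// (not part of this commit).
const count : nat = 42n       // natural 'n'
const index : nat = 7p        // natural 'p'
const fee : tez = 500mutez    // natural "mutez" (formerly the "mtz" suffix)
const price : tez = 2tz       // natural "tz"
const half : tez = 0.5tz      // decimal "tz"
const answer : int = 42       // plain natural, lexed by mk_int
```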
@@ -1,12 +1,11 @@
-(* This module defines the sorts of markup recognised by the LIGO
+(** This module defines the sorts of markup recognised by the LIGO
 lexer *)

 module Region = Simple_utils.Region

-(* A lexeme is piece of concrete syntax belonging to a token. In
+(** A lexeme is piece of concrete syntax belonging to a token. In
 algebraic terms, a token is also a piece of abstract lexical
 syntax. Lexical units emcompass both markup and lexemes. *)

 type lexeme = string

 type t =
|
|||||||
|
|
||||||
type markup = t
|
type markup = t
|
||||||
|
|
||||||
(* Pretty-printing of markup
|
(** Pretty-printing of markup
|
||||||
|
|
||||||
The difference between [to_lexeme] and [to_string] is that the
|
The difference between [to_lexeme] and [to_string] is that the
|
||||||
former builds the corresponding concrete syntax (the lexeme),
|
former builds the corresponding concrete syntax (the lexeme),
|
||||||
|
@ -6,6 +6,7 @@
|
|||||||
tezos-utils
|
tezos-utils
|
||||||
parser
|
parser
|
||||||
ast_simplified
|
ast_simplified
|
||||||
|
self_ast_simplified
|
||||||
operators)
|
operators)
|
||||||
(modules ligodity pascaligo simplify)
|
(modules ligodity pascaligo simplify)
|
||||||
(preprocess
|
(preprocess
|
||||||
|
@@ -49,28 +49,6 @@ module Errors = struct
 ] in
 error ~data title message

-let unsupported_arith_op expr =
-let title () = "arithmetic expressions" in
-let message () =
-Format.asprintf "this arithmetic operator is not supported yet" in
-let expr_loc = Raw.expr_to_region expr in
-let data = [
-("expr_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ expr_loc)
-] in
-error ~data title message
-
-let unsupported_string_catenation expr =
-let title () = "string expressions" in
-let message () =
-Format.asprintf "string concatenation is not supported yet" in
-let expr_loc = Raw.expr_to_region expr in
-let data = [
-("expr_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ expr_loc)
-] in
-error ~data title message
-
 let untyped_fun_param var =
 let title () = "function parameter" in
 let message () =
|
|||||||
let n = Z.to_int @@ snd @@ n in
|
let n = Z.to_int @@ snd @@ n in
|
||||||
return @@ e_literal ~loc (Literal_nat n)
|
return @@ e_literal ~loc (Literal_nat n)
|
||||||
)
|
)
|
||||||
| EArith (Mtz n) -> (
|
| EArith (Mutez n) -> (
|
||||||
let (n , loc) = r_split n in
|
let (n , loc) = r_split n in
|
||||||
let n = Z.to_int @@ snd @@ n in
|
let n = Z.to_int @@ snd @@ n in
|
||||||
return @@ e_literal ~loc (Literal_mutez n)
|
return @@ e_literal ~loc (Literal_mutez n)
|
||||||
)
|
)
|
||||||
| EArith _ as e ->
|
| EArith (Neg e) -> simpl_unop "NEG" e
|
||||||
fail @@ unsupported_arith_op e
|
|
||||||
| EString (String s) -> (
|
| EString (String s) -> (
|
||||||
let (s , loc) = r_split s in
|
let (s , loc) = r_split s in
|
||||||
let s' =
|
let s' =
|
||||||
@ -446,8 +423,11 @@ let rec simpl_expression :
|
|||||||
in
|
in
|
||||||
return @@ e_literal ~loc (Literal_string s')
|
return @@ e_literal ~loc (Literal_string s')
|
||||||
)
|
)
|
||||||
| EString (Cat _) as e ->
|
| EString (Cat c) ->
|
||||||
fail @@ unsupported_string_catenation e
|
let (c, loc) = r_split c in
|
||||||
|
let%bind string_left = simpl_expression c.arg1 in
|
||||||
|
let%bind string_right = simpl_expression c.arg2 in
|
||||||
|
return @@ e_string_cat ~loc string_left string_right
|
||||||
| ELogic l -> simpl_logic_expression l
|
| ELogic l -> simpl_logic_expression l
|
||||||
| EList l -> simpl_list_expression l
|
| EList l -> simpl_list_expression l
|
||||||
| ECase c -> (
|
| ECase c -> (
|
||||||
|
@@ -8,7 +8,6 @@ open Combinators

 let nseq_to_list (hd, tl) = hd :: tl
 let npseq_to_list (hd, tl) = hd :: (List.map snd tl)
-let npseq_to_nelist (hd, tl) = hd, (List.map snd tl)
 let pseq_to_list = function
 | None -> []
 | Some lst -> npseq_to_list lst
@@ -36,26 +35,6 @@ module Errors = struct
 ] in
 error ~data title message

-let unsupported_proc_decl decl =
-let title () = "procedure declarations" in
-let message () =
-Format.asprintf "procedures are not supported yet" in
-let data = [
-("declaration",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ decl.Region.region)
-] in
-error ~data title message
-
-let unsupported_local_proc region =
-let title () = "local procedure declarations" in
-let message () =
-Format.asprintf "local procedures are not supported yet" in
-let data = [
-("declaration",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ region)
-] in
-error ~data title message
-
 let corner_case ~loc message =
 let title () = "corner case" in
 let content () = "We don't have a good error message for this case. \
@@ -89,79 +68,6 @@ module Errors = struct
 ] in
 error ~data title message

-let unsupported_proc_calls call =
-let title () = "procedure calls" in
-let message () =
-Format.asprintf "procedure calls are not supported yet" in
-let data = [
-("call_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ call.Region.region)
-] in
-error ~data title message
-
-let unsupported_for_loops region =
-let title () = "bounded iterators" in
-let message () =
-Format.asprintf "only simple for loops are supported for now" in
-let data = [
-("loop_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ region)
-] in
-error ~data title message
-
-let unsupported_empty_record_patch record_expr =
-let title () = "empty record patch" in
-let message () =
-Format.asprintf "empty record patches are not supported yet" in
-let data = [
-("record_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ record_expr.Region.region)
-] in
-error ~data title message
-
-let unsupported_map_patches patch =
-let title () = "map patches" in
-let message () =
-Format.asprintf "map patches (a.k.a. functional updates) are \
-not supported yet" in
-let data = [
-("patch_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ patch.Region.region)
-] in
-error ~data title message
-
-let unsupported_set_patches patch =
-let title () = "set patches" in
-let message () =
-Format.asprintf "set patches (a.k.a. functional updates) are \
-not supported yet" in
-let data = [
-("patch_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ patch.Region.region)
-] in
-error ~data title message
-
-(* let unsupported_set_removal remove =
-let title () = "set removals" in
-let message () =
-Format.asprintf "removal of elements in a set is not \
-supported yet" in
-let data = [
-("removal_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ remove.Region.region)
-] in
-error ~data title message *)
-
-let unsupported_deep_set_rm path =
-let title () = "set removals" in
-let message () =
-Format.asprintf "removal of members from embedded sets is not supported yet" in
-let data = [
-("path_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ path.Region.region)
-] in
-error ~data title message
-
 let unsupported_non_var_pattern p =
 let title () = "pattern is not a variable" in
 let message () =
@@ -221,13 +127,14 @@ module Errors = struct
 ] in
 error ~data title message

-let unsupported_sub_blocks b =
-let title () = "block instructions" in
+let unsupported_deep_access_for_collection for_col =
+let title () = "deep access in loop over collection" in
 let message () =
-Format.asprintf "Sub-blocks are not supported yet" in
+Format.asprintf "currently, we do not support deep \
+accesses in loops over collection" in
 let data = [
-("block_loc",
-fun () -> Format.asprintf "%a" Location.pp_lift @@ b.Region.region)
+("pattern_loc",
+fun () -> Format.asprintf "%a" Location.pp_lift @@ for_col.Region.region)
 ] in
 error ~data title message
@ -309,7 +216,7 @@ let rec simpl_type_expression (t:Raw.type_expr) : type_expression result =
|
|||||||
let%bind lst = bind_list
|
let%bind lst = bind_list
|
||||||
@@ List.map aux
|
@@ List.map aux
|
||||||
@@ List.map apply
|
@@ List.map apply
|
||||||
@@ pseq_to_list r.value.elements in
|
@@ npseq_to_list r.value.ne_elements in
|
||||||
let m = List.fold_left (fun m (x, y) -> SMap.add x y m) SMap.empty lst in
|
let m = List.fold_left (fun m (x, y) -> SMap.add x y m) SMap.empty lst in
|
||||||
ok @@ T_record m
|
ok @@ T_record m
|
||||||
| TSum s ->
|
| TSum s ->
|
||||||
@ -317,10 +224,11 @@ let rec simpl_type_expression (t:Raw.type_expr) : type_expression result =
|
|||||||
let args =
|
let args =
|
||||||
match v.value.args with
|
match v.value.args with
|
||||||
None -> []
|
None -> []
|
||||||
| Some (_, product) ->
|
| Some (_, t_expr) ->
|
||||||
npseq_to_list product.value in
|
match t_expr with
|
||||||
let%bind te = simpl_list_type_expression
|
TProd product -> npseq_to_list product.value
|
||||||
@@ args in
|
| _ -> [t_expr] in
|
||||||
|
let%bind te = simpl_list_type_expression @@ args in
|
||||||
ok (v.value.constr.value, te)
|
ok (v.value.constr.value, te)
|
||||||
in
|
in
|
||||||
let%bind lst = bind_list
|
let%bind lst = bind_list
|
||||||
@ -389,8 +297,7 @@ let rec simpl_expression (t:Raw.expr) : expr result =
|
|||||||
let (x' , loc) = r_split x in
|
let (x' , loc) = r_split x in
|
||||||
return @@ e_literal ~loc (Literal_bytes (Bytes.of_string @@ fst x'))
|
return @@ e_literal ~loc (Literal_bytes (Bytes.of_string @@ fst x'))
|
||||||
| ETuple tpl ->
|
| ETuple tpl ->
|
||||||
let (Raw.TupleInj tpl') = tpl in
|
let (tpl' , loc) = r_split tpl in
|
||||||
let (tpl' , loc) = r_split tpl' in
|
|
||||||
simpl_tuple_expression ~loc @@ npseq_to_list tpl'.inside
|
simpl_tuple_expression ~loc @@ npseq_to_list tpl'.inside
|
||||||
| ERecord r ->
|
| ERecord r ->
|
||||||
let%bind fields = bind_list
|
let%bind fields = bind_list
|
||||||
@ -442,7 +349,7 @@ let rec simpl_expression (t:Raw.expr) : expr result =
|
|||||||
let n = Z.to_int @@ snd @@ n in
|
let n = Z.to_int @@ snd @@ n in
|
||||||
return @@ e_literal ~loc (Literal_nat n)
|
return @@ e_literal ~loc (Literal_nat n)
|
||||||
)
|
)
|
||||||
| EArith (Mtz n) -> (
|
| EArith (Mutez n) -> (
|
||||||
let (n , loc) = r_split n in
|
let (n , loc) = r_split n in
|
||||||
let n = Z.to_int @@ snd @@ n in
|
let n = Z.to_int @@ snd @@ n in
|
||||||
return @@ e_literal ~loc (Literal_mutez n)
|
return @@ e_literal ~loc (Literal_mutez n)
|
||||||
@ -463,6 +370,12 @@ let rec simpl_expression (t:Raw.expr) : expr result =
|
|||||||
| ELogic l -> simpl_logic_expression l
|
| ELogic l -> simpl_logic_expression l
|
||||||
| EList l -> simpl_list_expression l
|
| EList l -> simpl_list_expression l
|
||||||
| ESet s -> simpl_set_expression s
|
| ESet s -> simpl_set_expression s
|
||||||
|
| ECond c ->
|
||||||
|
let (c , loc) = r_split c in
|
||||||
|
let%bind expr = simpl_expression c.test in
|
||||||
|
let%bind match_true = simpl_expression c.ifso in
|
||||||
|
let%bind match_false = simpl_expression c.ifnot in
|
||||||
|
return @@ e_matching expr ~loc (Match_bool {match_true; match_false})
|
||||||
| ECase c -> (
|
| ECase c -> (
|
||||||
let (c , loc) = r_split c in
|
let (c , loc) = r_split c in
|
||||||
let%bind e = simpl_expression c.expr in
|
let%bind e = simpl_expression c.expr in
|
||||||
@ -489,6 +402,18 @@ let rec simpl_expression (t:Raw.expr) : expr result =
|
|||||||
bind_map_list aux lst in
|
bind_map_list aux lst in
|
||||||
return @@ e_map ~loc lst
|
return @@ e_map ~loc lst
|
||||||
)
|
)
|
||||||
|
| EMap (BigMapInj mi) -> (
|
||||||
|
let (mi , loc) = r_split mi in
|
||||||
|
let%bind lst =
|
||||||
|
let lst = List.map get_value @@ pseq_to_list mi.elements in
|
||||||
|
let aux : Raw.binding -> (expression * expression) result =
|
||||||
|
fun b ->
|
||||||
|
let%bind src = simpl_expression b.source in
|
||||||
|
let%bind dst = simpl_expression b.image in
|
||||||
|
ok (src, dst) in
|
||||||
|
bind_map_list aux lst in
|
||||||
|
return @@ e_big_map ~loc lst
|
||||||
|
)
|
||||||
| EMap (MapLookUp lu) -> (
|
| EMap (MapLookUp lu) -> (
|
||||||
let (lu , loc) = r_split lu in
|
let (lu , loc) = r_split lu in
|
||||||
let%bind path = match lu.path with
|
let%bind path = match lu.path with
|
||||||
@ -594,8 +519,7 @@ and simpl_local_declaration : Raw.local_decl -> _ result = fun t ->
|
|||||||
let (f , loc) = r_split f in
|
let (f , loc) = r_split f in
|
||||||
let%bind (name , e) = simpl_fun_declaration ~loc f in
|
let%bind (name , e) = simpl_fun_declaration ~loc f in
|
||||||
return_let_in ~loc name e
|
return_let_in ~loc name e
|
||||||
| LocalProc d ->
|
|
||||||
fail @@ unsupported_local_proc d.Region.region
|
|
||||||
and simpl_data_declaration : Raw.data_decl -> _ result = fun t ->
|
and simpl_data_declaration : Raw.data_decl -> _ result = fun t ->
|
||||||
match t with
|
match t with
|
||||||
| LocalVar x ->
|
| LocalVar x ->
|
||||||
@ -630,11 +554,13 @@ and simpl_fun_declaration :
|
|||||||
fun ~loc x ->
|
fun ~loc x ->
|
||||||
let open! Raw in
|
let open! Raw in
|
||||||
let {name;param;ret_type;local_decls;block;return} : fun_decl = x in
|
let {name;param;ret_type;local_decls;block;return} : fun_decl = x in
|
||||||
(match npseq_to_list param.value.inside with
|
let statements =
|
||||||
| [] ->
|
match block with
|
||||||
fail @@
|
| Some block -> npseq_to_list block.value.statements
|
||||||
corner_case ~loc:__LOC__ "parameter-less function should not exist"
|
| None -> []
|
||||||
| [a] -> (
|
in
|
||||||
|
(match param.value.inside with
|
||||||
|
a, [] -> (
|
||||||
let%bind input = simpl_param a in
|
let%bind input = simpl_param a in
|
||||||
let name = name.value in
|
let name = name.value in
|
||||||
let (binder , input_type) = input in
|
let (binder , input_type) = input in
|
||||||
@ -642,7 +568,7 @@ and simpl_fun_declaration :
|
|||||||
bind_map_list simpl_local_declaration local_decls in
|
bind_map_list simpl_local_declaration local_decls in
|
||||||
let%bind instructions = bind_list
|
let%bind instructions = bind_list
|
||||||
@@ List.map simpl_statement
|
@@ List.map simpl_statement
|
||||||
@@ npseq_to_list block.value.statements in
|
@@ statements in
|
||||||
let%bind result = simpl_expression return in
|
let%bind result = simpl_expression return in
|
||||||
let%bind output_type = simpl_type_expression ret_type in
|
let%bind output_type = simpl_type_expression ret_type in
|
||||||
let body = local_declarations @ instructions in
|
let body = local_declarations @ instructions in
|
||||||
@ -655,6 +581,7 @@ and simpl_fun_declaration :
|
|||||||
ok ((name , type_annotation) , expression)
|
ok ((name , type_annotation) , expression)
|
||||||
)
|
)
|
||||||
| lst -> (
|
| lst -> (
|
||||||
|
let lst = npseq_to_list lst in
|
||||||
let arguments_name = "arguments" in
|
let arguments_name = "arguments" in
|
||||||
let%bind params = bind_map_list simpl_param lst in
|
let%bind params = bind_map_list simpl_param lst in
|
||||||
let (binder , input_type) =
|
let (binder , input_type) =
|
||||||
@ -672,7 +599,7 @@ and simpl_fun_declaration :
|
|||||||
bind_map_list simpl_local_declaration local_decls in
|
bind_map_list simpl_local_declaration local_decls in
|
||||||
let%bind instructions = bind_list
|
let%bind instructions = bind_list
|
||||||
@@ List.map simpl_statement
|
@@ List.map simpl_statement
|
||||||
@@ npseq_to_list block.value.statements in
|
@@ statements in
|
||||||
let%bind result = simpl_expression return in
|
let%bind result = simpl_expression return in
|
||||||
let%bind output_type = simpl_type_expression ret_type in
|
let%bind output_type = simpl_type_expression ret_type in
|
||||||
let body = tpl_declarations @ local_declarations @ instructions in
|
let body = tpl_declarations @ local_declarations @ instructions in
|
||||||
@ -703,13 +630,11 @@ and simpl_declaration : Raw.declaration -> declaration Location.wrap result =
|
|||||||
ok @@ Declaration_constant (name.value , type_annotation , expression)
|
ok @@ Declaration_constant (name.value , type_annotation , expression)
|
||||||
in
|
in
|
||||||
bind_map_location simpl_const_decl (Location.lift_region x)
|
bind_map_location simpl_const_decl (Location.lift_region x)
|
||||||
| LambdaDecl (FunDecl x) -> (
|
| FunDecl x -> (
|
||||||
let (x , loc) = r_split x in
|
let (x , loc) = r_split x in
|
||||||
let%bind ((name , ty_opt) , expr) = simpl_fun_declaration ~loc x in
|
let%bind ((name , ty_opt) , expr) = simpl_fun_declaration ~loc x in
|
||||||
ok @@ Location.wrap ~loc (Declaration_constant (name , ty_opt , expr))
|
ok @@ Location.wrap ~loc (Declaration_constant (name , ty_opt , expr))
|
||||||
)
|
)
|
||||||
| LambdaDecl (ProcDecl decl) ->
|
|
||||||
fail @@ unsupported_proc_decl decl
|
|
||||||
|
|
||||||
and simpl_statement : Raw.statement -> (_ -> expression result) result =
|
and simpl_statement : Raw.statement -> (_ -> expression result) result =
|
||||||
fun s ->
|
fun s ->
|
||||||
@ -717,7 +642,7 @@ and simpl_statement : Raw.statement -> (_ -> expression result) result =
|
|||||||
| Instr i -> simpl_instruction i
|
| Instr i -> simpl_instruction i
|
||||||
| Data d -> simpl_data_declaration d
|
| Data d -> simpl_data_declaration d
|
||||||
|
|
||||||
and simpl_single_instruction : Raw.single_instr -> (_ -> expression result) result =
|
and simpl_single_instruction : Raw.instruction -> (_ -> expression result) result =
|
||||||
fun t ->
|
fun t ->
|
||||||
match t with
|
match t with
|
||||||
| ProcCall x -> (
|
| ProcCall x -> (
|
||||||
@ -743,17 +668,35 @@ and simpl_single_instruction : Raw.single_instr -> (_ -> expression result) resu
|
|||||||
let%bind body = simpl_block l.block.value in
|
let%bind body = simpl_block l.block.value in
|
||||||
let%bind body = body None in
|
let%bind body = body None in
|
||||||
return_statement @@ e_loop cond body
|
return_statement @@ e_loop cond body
|
||||||
| Loop (For (ForInt {region; _} | ForCollect {region ; _})) ->
|
| Loop (For (ForInt fi)) ->
|
||||||
fail @@ unsupported_for_loops region
|
let%bind loop = simpl_for_int fi.value in
|
||||||
|
let%bind loop = loop None in
|
||||||
|
return_statement @@ loop
|
||||||
|
| Loop (For (ForCollect fc)) ->
|
||||||
|
let%bind loop = simpl_for_collect fc.value in
|
||||||
|
let%bind loop = loop None in
|
||||||
|
return_statement @@ loop
|
||||||
| Cond c -> (
|
| Cond c -> (
|
||||||
let (c , loc) = r_split c in
|
let (c , loc) = r_split c in
|
||||||
let%bind expr = simpl_expression c.test in
|
let%bind expr = simpl_expression c.test in
|
||||||
let%bind match_true = match c.ifso with
|
let%bind match_true = match c.ifso with
|
||||||
| ClauseInstr i -> simpl_instruction_block i
|
ClauseInstr i ->
|
||||||
| ClauseBlock b -> simpl_statements @@ fst b.value.inside in
|
simpl_single_instruction i
|
||||||
|
| ClauseBlock b ->
|
||||||
|
match b with
|
||||||
|
LongBlock {value; _} ->
|
||||||
|
simpl_block value
|
||||||
|
| ShortBlock {value; _} ->
|
||||||
|
simpl_statements @@ fst value.inside in
|
||||||
let%bind match_false = match c.ifnot with
|
let%bind match_false = match c.ifnot with
|
||||||
| ClauseInstr i -> simpl_instruction_block i
|
ClauseInstr i ->
|
||||||
| ClauseBlock b -> simpl_statements @@ fst b.value.inside in
|
simpl_single_instruction i
|
||||||
|
| ClauseBlock b ->
|
||||||
|
match b with
|
||||||
|
LongBlock {value; _} ->
|
||||||
|
simpl_block value
|
||||||
|
| ShortBlock {value; _} ->
|
||||||
|
simpl_statements @@ fst value.inside in
|
||||||
let%bind match_true = match_true None in
|
let%bind match_true = match_true None in
|
||||||
let%bind match_false = match_false None in
|
let%bind match_false = match_false None in
|
||||||
return_statement @@ e_matching expr ~loc (Match_bool {match_true; match_false})
|
return_statement @@ e_matching expr ~loc (Match_bool {match_true; match_false})
|
||||||
@ -784,10 +727,19 @@ and simpl_single_instruction : Raw.single_instr -> (_ -> expression result) resu
|
|||||||
let (c , loc) = r_split c in
|
let (c , loc) = r_split c in
|
||||||
let%bind expr = simpl_expression c.expr in
|
let%bind expr = simpl_expression c.expr in
|
||||||
let%bind cases =
|
let%bind cases =
|
||||||
let aux (x : Raw.instruction Raw.case_clause Raw.reg) =
|
let aux (x : Raw.if_clause Raw.case_clause Raw.reg) =
|
||||||
let%bind i = simpl_instruction_block x.value.rhs in
|
let%bind case_clause =
|
||||||
let%bind i = i None in
|
match x.value.rhs with
|
||||||
ok (x.value.pattern, i) in
|
ClauseInstr i ->
|
||||||
|
simpl_single_instruction i
|
||||||
|
| ClauseBlock b ->
|
||||||
|
match b with
|
||||||
|
LongBlock {value; _} ->
|
||||||
|
simpl_block value
|
||||||
|
| ShortBlock {value; _} ->
|
||||||
|
simpl_statements @@ fst value.inside in
|
||||||
|
let%bind case_clause = case_clause None in
|
||||||
|
ok (x.value.pattern, case_clause) in
|
||||||
bind_list
|
bind_list
|
||||||
@@ List.map aux
|
@@ List.map aux
|
||||||
@@ npseq_to_list c.cases.value in
|
@@ npseq_to_list c.cases.value in
|
||||||
@ -797,30 +749,72 @@ and simpl_single_instruction : Raw.single_instr -> (_ -> expression result) resu
|
|||||||
| RecordPatch r -> (
|
| RecordPatch r -> (
|
||||||
let r = r.value in
|
let r = r.value in
|
||||||
let (name , access_path) = simpl_path r.path in
|
let (name , access_path) = simpl_path r.path in
|
||||||
let%bind inj = bind_list
|
|
||||||
|
let head, tail = r.record_inj.value.ne_elements in
|
||||||
|
|
||||||
|
let%bind tail' = bind_list
|
||||||
@@ List.map (fun (x: Raw.field_assign Region.reg) ->
|
@@ List.map (fun (x: Raw.field_assign Region.reg) ->
|
||||||
let (x , loc) = r_split x in
|
let (x , loc) = r_split x in
|
||||||
let%bind e = simpl_expression x.field_expr
|
let%bind e = simpl_expression x.field_expr
|
||||||
in ok (x.field_name.value, e , loc)
|
in ok (x.field_name.value, e , loc)
|
||||||
)
|
)
|
||||||
@@ pseq_to_list r.record_inj.value.elements in
|
@@ List.map snd tail in
|
||||||
|
|
||||||
|
let%bind head' =
|
||||||
|
let (x , loc) = r_split head in
|
||||||
|
let%bind e = simpl_expression x.field_expr
|
||||||
|
in ok (x.field_name.value, e , loc) in
|
||||||
|
|
||||||
let%bind expr =
|
let%bind expr =
|
||||||
let aux = fun (access , v , loc) ->
|
let aux = fun (access , v , loc) ->
|
||||||
e_assign ~loc name (access_path @ [Access_record access]) v in
|
e_assign ~loc name (access_path @ [Access_record access]) v in
|
||||||
let assigns = List.map aux inj in
|
|
||||||
match assigns with
|
let hd, tl = aux head', List.map aux tail' in
|
||||||
| [] -> fail @@ unsupported_empty_record_patch r.record_inj
|
|
||||||
| hd :: tl -> (
|
|
||||||
let aux acc cur = e_sequence acc cur in
|
let aux acc cur = e_sequence acc cur in
|
||||||
ok @@ List.fold_left aux hd tl
|
ok @@ List.fold_left aux hd tl
|
||||||
)
|
|
||||||
in
|
in
|
||||||
return_statement @@ expr
|
return_statement @@ expr
|
||||||
)
|
)
|
||||||
| MapPatch patch ->
|
| MapPatch patch -> (
|
||||||
fail @@ unsupported_map_patches patch
|
let (map_p, loc) = r_split patch in
|
||||||
| SetPatch patch ->
|
let (name, access_path) = simpl_path map_p.path in
|
||||||
fail @@ unsupported_set_patches patch
|
let%bind inj = bind_list
|
||||||
|
@@ List.map (fun (x:Raw.binding Region.reg) ->
|
||||||
|
let x = x.value in
|
||||||
|
let (key, value) = x.source, x.image in
|
||||||
|
let%bind key' = simpl_expression key in
|
||||||
|
let%bind value' = simpl_expression value
|
||||||
|
in ok @@ (key', value')
|
||||||
|
)
|
||||||
|
@@ npseq_to_list map_p.map_inj.value.ne_elements in
|
||||||
|
let expr =
|
||||||
|
match inj with
|
||||||
|
| [] -> e_skip ~loc ()
|
||||||
|
| _ :: _ ->
|
||||||
|
let assigns = List.fold_right
|
||||||
|
(fun (key, value) map -> (e_map_add key value map))
|
||||||
|
inj
|
||||||
|
(e_accessor ~loc (e_variable name) access_path)
|
||||||
|
in e_assign ~loc name access_path assigns
|
||||||
|
in return_statement @@ expr
|
||||||
|
)
|
||||||
|
| SetPatch patch -> (
|
||||||
|
let (setp, loc) = r_split patch in
|
||||||
|
let (name , access_path) = simpl_path setp.path in
|
||||||
|
let%bind inj =
|
||||||
|
bind_list @@
|
||||||
|
List.map simpl_expression @@
|
||||||
|
npseq_to_list setp.set_inj.value.ne_elements in
|
||||||
|
let expr =
|
||||||
|
match inj with
|
||||||
|
| [] -> e_skip ~loc ()
|
||||||
|
| _ :: _ ->
|
||||||
|
let assigns = List.fold_right
|
||||||
|
(fun hd s -> e_constant "SET_ADD" [hd ; s])
|
||||||
|
inj (e_accessor ~loc (e_variable name) access_path) in
|
||||||
|
e_assign ~loc name access_path assigns in
|
||||||
|
return_statement @@ expr
|
||||||
|
)
|
||||||
| MapRemove r -> (
|
| MapRemove r -> (
|
||||||
let (v , loc) = r_split r in
|
let (v , loc) = r_split r in
|
||||||
let key = v.key in
|
let key = v.key in
|
||||||
@ -837,12 +831,16 @@ and simpl_single_instruction : Raw.single_instr -> (_ -> expression result) resu
|
|||||||
)
|
)
|
||||||
| SetRemove r -> (
|
| SetRemove r -> (
|
||||||
let (set_rm, loc) = r_split r in
|
let (set_rm, loc) = r_split r in
|
||||||
let%bind set = match set_rm.set with
|
let%bind (varname, set, path) = match set_rm.set with
|
||||||
| Name v -> ok v.value
|
| Name v -> ok (v.value, e_variable v.value, [])
|
||||||
| Path path -> fail @@ unsupported_deep_set_rm path in
|
| Path path ->
|
||||||
|
let(name, p') = simpl_path set_rm.set in
|
||||||
|
let%bind accessor = simpl_projection path in
|
||||||
|
ok @@ (name, accessor, p')
|
||||||
|
in
|
||||||
let%bind removed' = simpl_expression set_rm.element in
|
let%bind removed' = simpl_expression set_rm.element in
|
||||||
let expr = e_constant ~loc "SET_REMOVE" [removed' ; e_variable set] in
|
let expr = e_constant ~loc "SET_REMOVE" [removed' ; set] in
|
||||||
return_statement @@ e_assign ~loc set [] expr
|
return_statement @@ e_assign ~loc varname path expr
|
||||||
)
|
)
|
||||||
|
|
||||||
and simpl_path : Raw.path -> string * Ast_simplified.access_path = fun p ->
|
and simpl_path : Raw.path -> string * Ast_simplified.access_path = fun p ->
|
||||||
@ -886,7 +884,7 @@ and simpl_cases : type a . (Raw.pattern * a) list -> a matching result = fun t -
|
|||||||
| [] -> ok x'
|
| [] -> ok x'
|
||||||
| _ -> ok t
|
| _ -> ok t
|
||||||
)
|
)
|
||||||
| _ -> fail @@ corner_case ~loc:__LOC__ "unexpected pattern" in
|
| pattern -> ok pattern in
|
||||||
let get_constr (t: Raw.pattern) =
|
let get_constr (t: Raw.pattern) =
|
||||||
match t with
|
match t with
|
||||||
| PConstr v -> (
|
| PConstr v -> (
|
||||||
@ -951,18 +949,9 @@ and simpl_cases : type a . (Raw.pattern * a) list -> a matching result = fun t -
|
|||||||
bind_map_list aux lst in
|
bind_map_list aux lst in
|
||||||
ok @@ Match_variant constrs
|
ok @@ Match_variant constrs
|
||||||
|
|
||||||
and simpl_instruction_block : Raw.instruction -> (_ -> expression result) result =
|
|
||||||
fun t ->
|
|
||||||
match t with
|
|
||||||
| Single s -> simpl_single_instruction s
|
|
||||||
| Block b -> simpl_block b.value
|
|
||||||
|
|
||||||
and simpl_instruction : Raw.instruction -> (_ -> expression result) result =
|
and simpl_instruction : Raw.instruction -> (_ -> expression result) result =
|
||||||
fun t ->
|
fun t ->
|
||||||
trace (simplifying_instruction t) @@
|
trace (simplifying_instruction t) @@ simpl_single_instruction t
|
||||||
match t with
|
|
||||||
| Single s -> simpl_single_instruction s
|
|
||||||
| Block b -> fail @@ unsupported_sub_blocks b
|
|
||||||
|
|
||||||
and simpl_statements : Raw.statements -> (_ -> expression result) result =
|
and simpl_statements : Raw.statements -> (_ -> expression result) result =
|
||||||
fun ss ->
|
fun ss ->
|
||||||
@ -979,5 +968,206 @@ and simpl_statements : Raw.statements -> (_ -> expression result) result =
|
|||||||
and simpl_block : Raw.block -> (_ -> expression result) result = fun t ->
|
and simpl_block : Raw.block -> (_ -> expression result) result = fun t ->
|
||||||
simpl_statements t.statements
|
simpl_statements t.statements
|
||||||
|
|
||||||
|
and simpl_for_int : Raw.for_int -> (_ -> expression result) result = fun fi ->
|
||||||
|
(* cond part *)
|
||||||
|
let var = e_variable fi.assign.value.name.value in
|
||||||
|
let%bind value = simpl_expression fi.assign.value.expr in
|
||||||
|
let%bind bound = simpl_expression fi.bound in
|
||||||
|
let comp = e_annotation (e_constant "LE" [var ; bound]) t_bool
|
||||||
|
in
|
||||||
|
(* body part *)
|
||||||
|
let%bind body = simpl_block fi.block.value in
|
||||||
|
let%bind body = body None in
|
||||||
|
let step = e_int 1 in
|
||||||
|
let ctrl = e_assign
|
||||||
|
fi.assign.value.name.value [] (e_constant "ADD" [ var ; step ]) in
|
||||||
|
let rec add_to_seq expr = match expr.expression with
|
||||||
|
| E_sequence (_,a) -> add_to_seq a
|
||||||
|
| _ -> e_sequence body ctrl in
|
||||||
|
let body' = add_to_seq body in
|
||||||
|
let loop = e_loop comp body' in
|
||||||
|
return_statement @@ e_let_in (fi.assign.value.name.value, Some t_int) value loop
|
||||||
|
|
||||||
|
(** simpl_for_collect
|
||||||
|
For loops over collections, like
|
||||||
|
|
||||||
|
``` concrete syntax :
|
||||||
|
for x : int in set myset
|
||||||
|
begin
|
||||||
|
myint := myint + x ;
|
||||||
|
myst := myst ^ "to" ;
|
||||||
|
end
|
||||||
|
```
|
||||||
|
|
||||||
|
are implemented using a MAP_FOLD, LIST_FOLD or SET_FOLD:
|
||||||
|
|
||||||
|
``` pseudo Ast_simplified
|
||||||
|
let #COMPILER#folded_record = list_fold( mylist ,
|
||||||
|
record st = st; acc = acc; end;
|
||||||
|
lamby = fun arguments -> (
|
||||||
|
let #COMPILER#acc = arguments.0 in
|
||||||
|
let #COMPILER#elt = arguments.1 in
|
||||||
|
#COMPILER#acc.myint := #COMPILER#acc.myint + #COMPILER#elt ;
|
||||||
|
#COMPILER#acc.myst := #COMPILER#acc.myst ^ "to" ;
|
||||||
|
#COMPILER#acc
|
||||||
|
)
|
||||||
|
) in
|
||||||
|
{
|
||||||
|
myst := #COMPILER#folded_record.myst ;
|
||||||
|
myint := #COMPILER#folded_record.myint ;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
We are performing the following steps:
|
||||||
|
1) Simplifying the for body using ̀simpl_block`
|
||||||
|
|
||||||
|
2) Detect the free variables and build a list of their names
|
||||||
|
(myint and myst in the previous example)
|
||||||
|
|
||||||
|
3) Build the initial record (later passed as 2nd argument of
|
||||||
|
`MAP/SET/LIST_FOLD`) capturing the environment using the
|
||||||
|
free variables list of (2)
|
||||||
|
|
||||||
|
4) In the filtered body of (1), replace occurences:
|
||||||
|
- free variable of name X as rhs ==> accessor `#COMPILER#acc.X`
|
||||||
|
- free variable of name X as lhs ==> accessor `#COMPILER#acc.X`
|
||||||
|
And, in the case of a map:
|
||||||
|
- references to the iterated key ==> variable `#COMPILER#elt_key`
|
||||||
|
- references to the iterated value ==> variable `#COMPILER#elt_value`
|
||||||
|
in the case of a set/list:
|
||||||
|
- references to the iterated value ==> variable `#COMPILER#elt`
|
||||||
|
|
||||||
|
5) Append the return value to the body
|
||||||
|
|
||||||
|
6) Prepend the declaration of the lambda arguments to the body which
|
||||||
|
is a serie of `let .. in`'s
|
||||||
|
Note that the parameter of the lambda ̀arguments` is a tree of
|
||||||
|
tuple holding:
|
||||||
|
* In the case of `list` or ̀set`:
|
||||||
|
( folding record , current list/set element ) as
|
||||||
|
( #COMPILER#acc , #COMPILER#elt )
|
||||||
|
* In the case of `map`:
|
||||||
|
( folding record , current map key , current map value ) as
|
||||||
|
( #COMPILER#acc , #COMPILER#elt_key , #COMPILER#elt_value )
|
||||||
|
|
||||||
|
7) Build the lambda using the final body of (6)
|
||||||
|
|
||||||
|
8) Build a sequence of assignments for all the captured variables
|
||||||
|
to their new value, namely an access to the folded record
|
||||||
|
(#COMPILER#folded_record)
|
||||||
|
|
||||||
|
9) Attach the sequence of 8 to the ̀let .. in` declaration
|
||||||
|
of #COMPILER#folded_record
|
||||||
|
|
||||||
|
**)
|
||||||
|
and simpl_for_collect : Raw.for_collect -> (_ -> expression result) result = fun fc ->
|
||||||
|
(* STEP 1 *)
|
||||||
|
let%bind for_body = simpl_block fc.block.value in
|
||||||
|
let%bind for_body = for_body None in
|
||||||
|
(* STEP 2 *)
|
||||||
|
let%bind captured_name_list = Self_ast_simplified.fold_expression
|
||||||
|
(fun (prev : type_name list) (ass_exp : expression) ->
|
||||||
|
match ass_exp.expression with
|
||||||
|
| E_assign ( name , _ , _ ) ->
|
||||||
|
if (String.contains name '#') then
|
||||||
|
ok prev
|
||||||
|
else
|
||||||
|
ok (name::prev)
|
||||||
|
| _ -> ok prev )
|
||||||
|
[]
|
||||||
|
for_body in
|
||||||
|
(* STEP 3 *)
|
||||||
|
let add_to_record (prev: expression type_name_map) (captured_name: string) =
|
||||||
|
SMap.add captured_name (e_variable captured_name) prev in
|
||||||
|
let init_record = e_record (List.fold_left add_to_record SMap.empty captured_name_list) in
|
||||||
|
(* STEP 4 *)
|
||||||
|
let replace exp =
|
||||||
|
match exp.expression with
|
||||||
|
(* replace references to fold accumulator as rhs *)
|
||||||
|
| E_assign ( name , path , expr ) -> (
|
||||||
|
match path with
|
||||||
|
| [] -> ok @@ e_assign "#COMPILER#acc" [Access_record name] expr
|
||||||
|
(* This fails for deep accesses, see LIGO-131 LIGO-134 *)
|
||||||
|
| _ ->
|
||||||
|
(* ok @@ e_assign "#COMPILER#acc" ((Access_record name)::path) expr) *)
|
||||||
|
fail @@ unsupported_deep_access_for_collection fc.block )
|
||||||
|
| E_variable name -> (
|
||||||
|
if (List.mem name captured_name_list) then
|
||||||
|
(* replace references to fold accumulator as lhs *)
|
||||||
|
ok @@ e_accessor (e_variable "#COMPILER#acc") [Access_record name]
|
||||||
|
else match fc.collection with
|
||||||
|
(* loop on map *)
|
||||||
|
| Map _ ->
|
||||||
|
let k' = e_variable "#COMPILER#collec_elt_k" in
|
||||||
|
if ( name = fc.var.value ) then
|
||||||
|
ok @@ k' (* replace references to the the key *)
|
||||||
|
else (
|
||||||
|
match fc.bind_to with
|
||||||
|
| Some (_,v) ->
|
||||||
|
let v' = e_variable "#COMPILER#collec_elt_v" in
|
||||||
|
if ( name = v.value ) then
|
||||||
|
ok @@ v' (* replace references to the the value *)
|
||||||
|
else ok @@ exp
|
||||||
|
| None -> ok @@ exp
|
||||||
|
)
|
||||||
|
(* loop on set or list *)
|
||||||
|
| (Set _ | List _) ->
|
||||||
|
if (name = fc.var.value ) then
|
||||||
|
(* replace references to the collection element *)
|
||||||
|
ok @@ (e_variable "#COMPILER#collec_elt")
|
||||||
|
else ok @@ exp
|
||||||
|
)
|
||||||
|
| _ -> ok @@ exp in
|
||||||
|
let%bind for_body = Self_ast_simplified.map_expression replace for_body in
|
||||||
|
(* STEP 5 *)
|
||||||
|
let rec add_return (expr : expression) = match expr.expression with
|
||||||
|
| E_sequence (a,b) -> e_sequence a (add_return b)
|
||||||
|
| _ -> e_sequence expr (e_variable "#COMPILER#acc") in
|
||||||
|
let for_body = add_return for_body in
|
||||||
|
(* STEP 6 *)
|
||||||
|
let for_body =
|
||||||
|
let ( arg_access: Types.access_path -> expression ) = e_accessor (e_variable "arguments") in
|
||||||
|
( match fc.collection with
|
||||||
|
| Map _ ->
|
||||||
|
(* let acc = arg_access [Access_tuple 0 ; Access_tuple 0] in
|
||||||
|
let collec_elt_v = arg_access [Access_tuple 1 ; Access_tuple 0] in
|
||||||
|
let collec_elt_k = arg_access [Access_tuple 1 ; Access_tuple 1] in *)
|
||||||
|
(* The above should work, but not yet (see LIGO-131) *)
|
||||||
|
let temp_kv = arg_access [Access_tuple 1] in
|
||||||
|
let acc = arg_access [Access_tuple 0] in
|
||||||
|
let collec_elt_v = e_accessor (e_variable "#COMPILER#temp_kv") [Access_tuple 0] in
|
||||||
|
let collec_elt_k = e_accessor (e_variable "#COMPILER#temp_kv") [Access_tuple 1] in
|
||||||
|
e_let_in ("#COMPILER#acc", None) acc @@
|
||||||
|
e_let_in ("#COMPILER#temp_kv", None) temp_kv @@
|
||||||
|
e_let_in ("#COMPILER#collec_elt_k", None) collec_elt_v @@
|
||||||
|
e_let_in ("#COMPILER#collec_elt_v", None) collec_elt_k (for_body)
|
||||||
|
| _ ->
|
||||||
|
let acc = arg_access [Access_tuple 0] in
|
||||||
|
let collec_elt = arg_access [Access_tuple 1] in
|
||||||
|
e_let_in ("#COMPILER#acc", None) acc @@
|
||||||
|
e_let_in ("#COMPILER#collec_elt", None) collec_elt (for_body)
|
||||||
|
) in
|
||||||
|
(* STEP 7 *)
|
||||||
|
let%bind collect = simpl_expression fc.expr in
|
||||||
|
let lambda = e_lambda "arguments" None None for_body in
|
||||||
|
let op_name = match fc.collection with
|
||||||
|
| Map _ -> "MAP_FOLD" | Set _ -> "SET_FOLD" | List _ -> "LIST_FOLD" in
|
||||||
|
let fold = e_constant op_name [collect ; init_record ; lambda] in
|
||||||
|
(* STEP 8 *)
|
||||||
|
let assign_back (prev : expression option) (captured_varname : string) : expression option =
|
||||||
|
let access = e_accessor (e_variable "#COMPILER#folded_record")
|
||||||
|
[Access_record captured_varname] in
|
||||||
|
let assign = e_assign captured_varname [] access in
|
||||||
|
match prev with
|
||||||
|
| None -> Some assign
|
||||||
|
| Some p -> Some (e_sequence p assign) in
|
||||||
|
let reassign_sequence = List.fold_left assign_back None captured_name_list in
|
||||||
|
(* STEP 9 *)
|
||||||
|
let final_sequence = match reassign_sequence with
|
||||||
|
(* None case means that no variables were captured *)
|
||||||
|
| None -> e_skip ()
|
||||||
|
| Some seq -> e_let_in ("#COMPILER#folded_record", None) fold seq in
|
||||||
|
return_statement @@ final_sequence
|
||||||
|
|
||||||
let simpl_program : Raw.ast -> program result = fun t ->
|
let simpl_program : Raw.ast -> program result = fun t ->
|
||||||
bind_list @@ List.map simpl_declaration @@ nseq_to_list t.decl
|
bind_list @@ List.map simpl_declaration @@ nseq_to_list t.decl
|
@@ -1,81 +1,22 @@
+(** Converts PascaLIGO programs to the Simplified Abstract Syntax Tree. *)
+
 open Trace
 open Ast_simplified

 module Raw = Parser.Pascaligo.AST
 module SMap = Map.String

-(*
-val nseq_to_list : 'a * 'a list -> 'a list
-val npseq_to_list : 'a * ( 'b * 'a ) list -> 'a list
-*)
-val npseq_to_nelist : 'a * ( 'b * 'c ) list -> 'a * 'c list
-(*
-val pseq_to_list : ('a * ('b * 'a) list) option -> 'a list
-val get_value : 'a Raw.reg -> 'a
-*)
-
-module Errors : sig
+module Errors :
+sig
-  (*
-  val unsupported_cst_constr : Raw.pattern -> unit -> error
-  val unsupported_ass_None : Raw.wild -> unit -> error
-  val unsupported_entry_decl : 'a Raw.reg -> unit -> error
-  val unsupported_proc_decl : 'a Raw.reg -> unit -> error
-  *)
   val bad_bytes : Location.t -> string -> unit -> error
-  (*
-  val unsupported_local_proc : Raw.wild -> unit -> error
-  val corner_case : loc:string -> string -> unit -> error
-  val unknown_predefined_type : string Raw.reg -> unit -> error
-  *)
   val unsupported_arith_op : Raw.expr -> unit -> error
-  (*
-  val unsupported_set_expr : Raw.expr -> unit -> error
-  *)
   val unsupported_proc_calls : 'a Raw.reg -> unit -> error
-  (*
-  val unsupported_for_loops : Raw.wild -> unit -> error
-  val unsupported_deep_map_assign : 'a Raw.reg -> unit -> error
-  val unsupported_empty_record_patch : 'a Raw.reg -> unit -> error
-  val unsupported_map_patches : 'a Raw.reg -> unit -> error
-  val unsupported_set_patches : 'a Raw.reg -> unit -> error
-  val unsupported_deep_map_rm : 'a Raw.reg -> unit -> error
-  val unsupported_set_removal : 'a Raw.reg -> unit -> error
-  val unsupported_non_var_pattern : Raw.pattern -> unit -> error
-  val only_constructors : Raw.pattern -> unit -> error
-  val unsupported_tuple_pattern : Raw.pattern -> unit -> error
-  val unsupported_deep_Some_patterns : Raw.pattern -> unit -> error
-  val unsupported_deep_list_patterns : 'a Raw.reg -> unit -> error
-  val unsupported_sub_blocks : 'a Raw.reg -> unit -> error
-  val simplifying_instruction : Raw.instruction -> unit -> error
-  *)
 end

-(*
-val r_split : 'a Raw.reg -> 'a * Location.t
-val return : expr -> ( expr option -> expr result ) result
-val return_let_in : ?loc:Location.t -> string * type_expression option -> expr -> ( expr option -> expr result ) result
-val simpl_type_expression : Raw.type_expr -> type_expression result
-val simpl_list_type_expression : Raw.type_expr list -> type_expression result
-*)
+(** Convert a concrete PascaLIGO expression AST to the simplified expression AST
+    used by the compiler. *)
 val simpl_expression : Raw.expr -> expr result

-(*
-val simpl_logic_expression : Raw.logic_expr -> expression result
-val simpl_list_expression : Raw.list_expr -> expression result
-val simpl_set_expression : Raw.set_expr -> expression result
-val simpl_binop : string -> Raw.wild Raw.bin_op Region.reg -> expression result
-val simpl_unop : string -> Raw.wild Raw.un_op Region.reg -> expression result
-val simpl_tuple_expression : ?loc:Location.t -> Raw.expr list -> expression result
-val simpl_local_declaration : Raw.local_decl -> ( expr option -> expr result) result
-val simpl_data_declaration : Raw.data_decl -> ( expr option -> expr result ) result
-val simpl_param : Raw.param_decl -> (type_name * type_expression) result
-val simpl_fun_declaration : loc:Location.t -> Raw.fun_decl -> ((name * type_expression option) * expression) result
-val simpl_declaration : Raw.declaration -> declaration Location.wrap result
-val simpl_single_instruction : Raw.single_instr -> (expression option -> expression result) result
-val simpl_path : Raw.path -> string * Ast_simplified.access_path
-val simpl_cases : (Raw.pattern * 'a) list -> 'a matching result
-val simpl_instruction_block : Raw.instruction -> (expression option -> expression result) result
-val simpl_instruction : Raw.instruction -> (expression option -> expression result) result
-val simpl_statements : Raw.statements -> (expression option -> expression result) result
-val simpl_block : Raw.block -> (expression option -> expression result) result
-*)
+(** Convert a concrete PascaLIGO program AST to the simplified program AST used
+    by the compiler. *)
 val simpl_program : Raw.ast -> program result
@@ -1,8 +1,93 @@
 open Ast_simplified
 open Trace

-type mapper = expression -> expression result
+type 'a folder = 'a -> expression -> 'a result
+
+let rec fold_expression : 'a folder -> 'a -> expression -> 'a result = fun f init e ->
+  let self = fold_expression f in
+  let%bind init' = f init e in
+  match e.expression with
+  | E_literal _ | E_variable _ | E_skip -> ok init'
+  | E_list lst | E_set lst | E_tuple lst | E_constant (_ , lst) -> (
+      let%bind res' = bind_fold_list self init' lst in
+      ok res'
+    )
+  | E_map lst | E_big_map lst -> (
+      let%bind res' = bind_fold_list (bind_fold_pair self) init' lst in
+      ok res'
+    )
+  | E_look_up ab | E_sequence ab | E_loop ab | E_application ab -> (
+      let%bind res' = bind_fold_pair self init' ab in
+      ok res'
+    )
+  | E_lambda { binder = _ ; input_type = _ ; output_type = _ ; result = e }
+  | E_annotation (e , _) | E_constructor (_ , e) -> (
+      let%bind res' = self init' e in
+      ok res'
+    )
+  | E_assign (_ , path , e) | E_accessor (e , path) -> (
+      let%bind res' = fold_path f init' path in
+      let%bind res' = self res' e in
+      ok res'
+    )
+  | E_matching (e , cases) -> (
+      let%bind res = self init' e in
+      let%bind res = fold_cases f res cases in
+      ok res
+    )
+  | E_record m -> (
+      let aux init'' _ expr =
+        let%bind res' = fold_expression self init'' expr in
+        ok res'
+      in
+      let%bind res = bind_fold_smap aux (ok init') m in
+      ok res
+    )
+  | E_let_in { binder = _ ; rhs ; result } -> (
+      let%bind res = self init' rhs in
+      let%bind res = self res result in
+      ok res
+    )
+
+and fold_path : 'a folder -> 'a -> access_path -> 'a result = fun f init p -> bind_fold_list (fold_access f) init p
+
+and fold_access : 'a folder -> 'a -> access -> 'a result = fun f init a ->
+  match a with
+  | Access_map e -> (
+      let%bind e' = fold_expression f init e in
+      ok e'
+    )
+  | _ -> ok init
+
+and fold_cases : 'a folder -> 'a -> matching_expr -> 'a result = fun f init m ->
+  match m with
+  | Match_bool { match_true ; match_false } -> (
+      let%bind res = fold_expression f init match_true in
+      let%bind res = fold_expression f res match_false in
+      ok res
+    )
+  | Match_list { match_nil ; match_cons = (_ , _ , cons) } -> (
+      let%bind res = fold_expression f init match_nil in
+      let%bind res = fold_expression f res cons in
+      ok res
+    )
+  | Match_option { match_none ; match_some = (_ , some) } -> (
+      let%bind res = fold_expression f init match_none in
+      let%bind res = fold_expression f res some in
+      ok res
+    )
+  | Match_tuple (_ , e) -> (
+      let%bind res = fold_expression f init e in
+      ok res
+    )
+  | Match_variant lst -> (
+      let aux init' ((_ , _) , e) =
+        let%bind res' = fold_expression f init' e in
+        ok res' in
+      let%bind res = bind_fold_list aux init lst in
+      ok res
+    )
+
+type mapper = expression -> expression result
 let rec map_expression : mapper -> expression -> expression result = fun f e ->
   let self = map_expression f in
   let%bind e' = f e in
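Aside (illustration, not part of the patch): the new `fold_expression` threads an accumulator through every sub-expression. A minimal usage sketch in the same style as this file, assuming the same `Trace` helpers (`ok`, `let%bind`) are in scope:

(* Sketch: count E_variable occurrences in an expression. *)
let count_variables : expression -> int result = fun e ->
  let counter acc e' =
    match e'.expression with
    | E_variable _ -> ok (acc + 1)
    | _ -> ok acc in
  fold_expression counter 0 e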
@@ -4,6 +4,27 @@ open Trace
 let peephole_expression : expression -> expression result = fun e ->
   let return expression = ok { e with expression } in
   match e.expression with
+  | E_constant ("BIG_MAP_LITERAL" , lst) -> (
+      let%bind elt =
+        trace_option (simple_error "big_map literal expects a single parameter") @@
+        List.to_singleton lst
+      in
+      let%bind lst =
+        trace (simple_error "big_map literal expects a list as parameter") @@
+        get_e_list elt.expression
+      in
+      let aux = fun (e : expression) ->
+        trace (simple_error "big_map literal expects a list of pairs as parameter") @@
+        let%bind tpl = get_e_tuple e.expression in
+        let%bind (a , b) =
+          trace_option (simple_error "of pairs") @@
+          List.to_pair tpl
+        in
+        ok (a , b)
+      in
+      let%bind pairs = bind_map_list aux lst in
+      return @@ E_big_map pairs
+    )
   | E_constant ("MAP_LITERAL" , lst) -> (
       let%bind elt =
         trace_option (simple_error "map literal expects a single parameter") @@
@@ -25,6 +46,13 @@ let peephole_expression : expression -> expression result = fun e ->
       let%bind pairs = bind_map_list aux lst in
       return @@ E_map pairs
     )
+  | E_constant ("BIG_MAP_EMPTY" , lst) -> (
+      let%bind () =
+        trace_strong (simple_error "BIG_MAP_EMPTY expects no parameter") @@
+        Assert.assert_list_empty lst
+      in
+      return @@ E_big_map []
+    )
   | E_constant ("MAP_EMPTY" , lst) -> (
       let%bind () =
         trace_strong (simple_error "MAP_EMPTY expects no parameter") @@
@@ -21,3 +21,7 @@ let all_program =
 let all_expression =
   let all_p = List.map Helpers.map_expression all in
   bind_chain all_p
+
+let map_expression = Helpers.map_expression
+
+let fold_expression = Helpers.fold_expression
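Aside (illustration, not part of the patch): this pass rewrites the `BIG_MAP_LITERAL` / `BIG_MAP_EMPTY` constants emitted by the front ends (e.g. CameLIGO's `Big_map.literal` / `Big_map.empty`) into `E_big_map` nodes. A schematic sketch; the `e_list`, `e_pair` and `e_int` combinators are assumed here purely for the example and may not exist under these names:

(* Big_map.literal [ (23, 0) ; (42, 0) ] reaches this pass roughly as: *)
let before = e_constant "BIG_MAP_LITERAL"
    [ e_list [ e_pair (e_int 23) (e_int 0) ; e_pair (e_int 42) (e_int 0) ] ]
(* peephole_expression rewrites it to E_big_map [ (23, 0) ; (42, 0) ],
   and E_constant ("BIG_MAP_EMPTY", []) to E_big_map []. *)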
@@ -707,6 +707,42 @@ and type_expression : environment -> Solver.state -> I.expression -> (O.annotate
   * let%bind (name', tv) =
   *   type_constant name tv_lst tv_opt ae.location in
   * return (E_constant (name' , lst')) tv *)
+  | E_constant ( ("LIST_FOLD"|"MAP_FOLD"|"SET_FOLD") as opname ,
+                 [ collect ;
+                   init_record ;
+                   ( { expression = (I.E_lambda { binder = (lname, None) ;
+                                                  input_type = None ;
+                                                  output_type = None ;
+                                                  result }) ;
+                       location = _ }) as _lambda
+                 ] ) ->
+    let _TODO = (opname, collect, init_record, lname, result) in
+    failwith "TODO: E_constant merge"
+    (* ******************************************************************************************************************************************************** *)
+    (*
+    (* this special case is here force annotation of the untyped lambda
+       generated by pascaligo's for_collect loop *)
+    let%bind (v_col , v_initr ) = bind_map_pair (type_expression e) (collect , init_record ) in
+    let tv_col = get_type_annotation v_col in (* this is the type of the collection *)
+    let tv_out = get_type_annotation v_initr in (* this is the output type of the lambda*)
+    let%bind input_type = match tv_col.type_value' with
+      | O.T_constant ( ("list"|"set") , t) -> ok @@ t_tuple (tv_out::t) ()
+      | O.T_constant ( "map" , t) -> ok @@ t_tuple (tv_out::[(t_tuple t ())]) ()
+      | _ ->
+        let wtype = Format.asprintf
+            "Loops over collections expect lists, sets or maps, got type %a" O.PP.type_value tv_col in
+        fail @@ simple_error wtype in
+    let e' = Environment.add_ez_binder lname input_type e in
+    let%bind body = type_expression ?tv_opt:(Some tv_out) e' result in
+    let output_type = body.type_annotation in
+    let lambda' = make_a_e (E_lambda {binder = lname ; body}) (t_function input_type output_type ()) e in
+    let lst' = [v_col; v_initr ; lambda'] in
+    let tv_lst = List.map get_type_annotation lst' in
+    let%bind (opname', tv) =
+      type_constant opname tv_lst tv_opt ae.location in
+    return (E_constant (opname' , lst')) tv
+    *)
+    (* ******************************************************************************************************************************************************** *)
   | E_application (f, arg) ->
     let%bind (f' , state') = type_expression e state f in
     let%bind (arg , state'') = type_expression e state' arg in
@@ -1,3 +1,7 @@
+(* The Transpiler is a function that takes as input the Typed AST, and outputs expressions in a language that is basically a Michelson with named variables and first-class-environments.
+
+   For more info, see back-end.md: https://gitlab.com/ligolang/ligo/blob/dev/gitlab-pages/docs/contributors/big-picture/back-end.md *)
+
 open! Trace
 open Helpers

@@ -125,6 +129,7 @@ let rec transpile_type (t:AST.type_value) : type_value result =
 | T_constant (Type_name "timestamp", []) -> ok (T_base Base_timestamp)
 | T_constant (Type_name "unit", []) -> ok (T_base Base_unit)
 | T_constant (Type_name "operation", []) -> ok (T_base Base_operation)
+| T_constant (Type_name "signature", []) -> ok (T_base Base_signature)
 | T_constant (Type_name "contract", [x]) ->
     let%bind x' = transpile_type x in
     ok (T_contract x')

@@ -296,21 +301,7 @@ and transpile_annotated_expression (ae:AST.annotated_expression) : expression re
 | E_application (a, b) ->
     let%bind a = transpile_annotated_expression a in
     let%bind b = transpile_annotated_expression b in
-    let%bind contains_closure =
-      Self_mini_c.Helpers.fold_type_value
-        (fun contains_closure exp ->
-           ok (contains_closure
-               || match exp with
-               | T_deep_closure _ -> true
-               | _ -> false))
-        false
-        b.type_value in
-    if contains_closure
-    then
-      let errmsg = Format.asprintf "Cannot apply closure in function arguments: %a\n"
-          Mini_c.PP.expression_with_type b in
-      fail @@ simple_error errmsg
-    else return @@ E_application (a, b)
+    return @@ E_application (a, b)
 | E_constructor (m, param) -> (
     let%bind param' = transpile_annotated_expression param in
     let (param'_expr , param'_tv) = Combinators.Expression.(get_content param' , get_type param') in
@@ -14,31 +14,46 @@ let get : environment -> string -> michelson result = fun e s ->
       error title content in
     generic_try error @@
     (fun () -> Environment.get_i s e) in
-  let rec aux = fun n ->
+  let rec aux_bubble = fun n ->
     match n with
     | 0 -> i_dup
     | n -> seq [
-        dip @@ aux (n - 1) ;
+        dip @@ aux_bubble (n - 1) ;
        i_swap ;
      ]
   in
-  let code = aux position in
+  let aux_dig = fun n -> seq [
+      dipn n i_dup ;
+      i_dig n ;
+    ]
+  in
+  let code =
+    if position < 2
+    then aux_bubble position
+    else aux_dig position in
+
   ok code

 let set : environment -> string -> michelson result = fun e s ->
   let%bind (_ , position) =
-    generic_try (simple_error "Environment.get") @@
+    generic_try (simple_error "Environment.set") @@
     (fun () -> Environment.get_i s e) in
-  let rec aux = fun n ->
+  let rec aux_bubble = fun n ->
     match n with
     | 0 -> dip i_drop
     | n -> seq [
        i_swap ;
-        dip (aux (n - 1)) ;
+        dip (aux_bubble (n - 1)) ;
      ]
   in
-  let code = aux position in
+  let aux_dug = fun n -> seq [
+      dipn (n + 1) i_drop ;
+      i_dug n ;
+    ] in
+  let code =
+    if position < 2
+    then aux_bubble position
+    else aux_dug position in
+
   ok code

@@ -73,5 +88,12 @@ let pack_closure : environment -> selector -> michelson result = fun e lst ->
   ok code

 let unpack_closure : environment -> michelson result = fun e ->
+  match e with
+  | [] -> ok @@ seq []
+  | _ :: tl -> (
       let aux = fun code _ -> seq [ i_unpair ; dip code ] in
-      ok (List.fold_right' aux (seq []) e)
+      let unpairs = (List.fold_right' aux (seq []) tl) in
+      ok @@ seq [ i_unpiar ; dip unpairs ]
+    )
+  (* let aux = fun code _ -> seq [ i_unpair ; dip code ] in
+   * ok (List.fold_right' aux (seq []) e) *)
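Aside (illustration, not part of the patch): with the change above, environment access keeps the DUP/SWAP bubbling only for stack positions 0 and 1 and switches to the DIG/DUG family for deeper positions. Worked instances of what the generators emit, written with the same combinators used in this file:

(* What `get` produces for a few positions: *)
let get_examples =
  [ (* position 0 *) i_dup ;
    (* position 1 *) seq [ dip i_dup ; i_swap ] ;
    (* position 3 *) seq [ dipn 3 i_dup ; i_dig 3 ] ]

(* What `set` produces for the same positions: *)
let set_examples =
  [ (* position 0 *) dip i_drop ;
    (* position 1 *) seq [ i_swap ; dip (dip i_drop) ] ;
    (* position 3 *) seq [ dipn 4 i_drop ; i_dug 3 ] ]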
@@ -151,14 +151,13 @@ and translate_expression (expr:expression) (env:environment) : michelson result
       return @@ seq [
         closure_pack_code ;
         i_push lambda_ty lambda_body_code ;
-        i_pair ;
+        i_swap ;
+        i_apply ;
       ]
     )
   | _ -> simple_fail "expected closure type"
   )
 | E_application (f , arg) -> (
-    match Combinators.Expression.get_type f with
-    | T_function _ -> (
     trace (simple_error "Compiling quote application") @@
     let%bind f = translate_expression f env in
     let%bind arg = translate_expression arg env in

@@ -168,17 +167,6 @@ and translate_expression (expr:expression) (env:environment) : michelson result
       prim I_EXEC ;
     ]
   )
-  | T_deep_closure (_ , _ , _) -> (
-      let%bind f_code = translate_expression f env in
-      let%bind arg_code = translate_expression arg env in
-      return @@ seq [
-        arg_code ;
-        dip (seq [ f_code ; i_unpair ; i_swap ]) ; i_pair ;
-        prim I_EXEC ;
-      ]
-    )
-  | _ -> simple_fail "E_applicationing something not appliable"
-  )
 | E_variable x ->
     let%bind code = Compiler_environment.get env x in
     return code
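Aside (illustration, not part of the patch): the first hunk above switches closure construction to Michelson's APPLY. The packed environment is partially applied to the pushed lambda, so what is left on the stack is an ordinary lambda, and a later E_application can go through the plain EXEC path; that is why the separate T_deep_closure application branch in the second hunk is removed. A condensed sketch with the same combinators:

(* Leaves `lambda arg ret` on the stack, with the environment already captured. *)
let build_closure pack_env_code lambda_ty lambda_body_code =
  seq [ pack_env_code ; i_push lambda_ty lambda_body_code ; i_swap ; i_apply ]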
@@ -32,24 +32,24 @@ module Ty = struct
   let mutez = Mutez_t None
   let string = String_t None
   let key = Key_t None
-  let list a = List_t (a, None)
+  let list a = List_t (a, None , has_big_map a)
   let set a = Set_t (a, None)
   let address = Address_t None
-  let option a = Option_t ((a, None), None, None)
+  let option a = Option_t (a, None , has_big_map a)
   let contract a = Contract_t (a, None)
   let lambda a b = Lambda_t (a, b, None)
   let timestamp = Timestamp_t None
-  let map a b = Map_t (a, b, None)
-  let pair a b = Pair_t ((a, None, None), (b, None, None), None)
-  let union a b = Union_t ((a, None), (b, None), None)
+  let map a b = Map_t (a, b, None , has_big_map b)
+  let pair a b = Pair_t ((a, None, None), (b, None, None), None , has_big_map a || has_big_map b)
+  let union a b = Union_t ((a, None), (b, None), None , has_big_map a || has_big_map b)

   let field_annot = Option.map (fun ann -> `Field_annot ann)

   let union_ann (anna, a) (annb, b) =
-    Union_t ((a, field_annot anna), (b, field_annot annb), None)
+    Union_t ((a, field_annot anna), (b, field_annot annb), None , has_big_map a || has_big_map b)

   let pair_ann (anna, a) (annb, b) =
-    Pair_t ((a, field_annot anna, None), (b, field_annot annb, None), None)
+    Pair_t ((a, field_annot anna, None), (b, field_annot annb, None), None , has_big_map a || has_big_map b)

   let not_comparable name () = error (thunk "not a comparable type") (fun () -> name) ()
   let not_compilable_type name () = error (thunk "not a compilable type") (fun () -> name) ()

@@ -68,6 +68,7 @@ module Ty = struct
   | Base_timestamp -> return timestamp_k
   | Base_bytes -> return bytes_k
   | Base_operation -> fail (not_comparable "operation")
+  | Base_signature -> fail (not_comparable "signature")

 let comparable_type : type_value -> ex_comparable_ty result = fun tv ->
   match tv with

@@ -97,6 +98,7 @@ module Ty = struct
   | Base_timestamp -> return timestamp
   | Base_bytes -> return bytes
   | Base_operation -> return operation
+  | Base_signature -> return signature

 let rec type_ : type_value -> ex_ty result =
   function

@@ -115,11 +117,10 @@ module Ty = struct
     let%bind (Ex_ty arg) = type_ arg in
     let%bind (Ex_ty ret) = type_ ret in
     ok @@ Ex_ty (lambda arg ret)
-  | T_deep_closure (c, arg, ret) ->
-    let%bind (Ex_ty capture) = environment_representation c in
+  | T_deep_closure (_, arg, ret) ->
     let%bind (Ex_ty arg) = type_ arg in
     let%bind (Ex_ty ret) = type_ ret in
-    ok @@ Ex_ty (pair (lambda (pair arg capture) ret) capture)
+    ok @@ Ex_ty (lambda arg ret)
  | T_map (k, v) ->
     let%bind (Ex_comparable_ty k') = comparable_type k in
     let%bind (Ex_ty v') = type_ v in

@@ -183,6 +184,7 @@ let base_type : type_base -> O.michelson result =
 | Base_timestamp -> ok @@ O.prim T_timestamp
 | Base_bytes -> ok @@ O.prim T_bytes
 | Base_operation -> ok @@ O.prim T_operation
+| Base_signature -> ok @@ O.prim T_signature

 let rec type_ : type_value -> O.michelson result =
   function

@@ -219,10 +221,10 @@ let rec type_ : type_value -> O.michelson result =
     let%bind arg = type_ arg in
     let%bind ret = type_ ret in
     ok @@ O.prim ~children:[arg;ret] T_lambda
- | T_deep_closure (c , arg , ret) ->
-    let%bind capture = environment_closure c in
-    let%bind lambda = lambda_closure (c , arg , ret) in
-    ok @@ O.t_pair lambda capture
+ | T_deep_closure (_ , arg , ret) ->
+    let%bind arg = type_ arg in
+    let%bind ret = type_ ret in
+    ok @@ O.prim ~children:[arg;ret] T_lambda

 and annotated : type_value annotated -> O.michelson result =
   function

@@ -243,7 +245,7 @@ and lambda_closure = fun (c , arg , ret) ->
   let%bind capture = environment_closure c in
   let%bind arg = type_ arg in
   let%bind ret = type_ ret in
-  ok @@ O.t_lambda (O.t_pair arg capture) ret
+  ok @@ O.t_lambda (O.t_pair capture arg) ret

 and environment_closure =
   function

@@ -8,16 +8,16 @@ open Script_ir_translator

 let rec translate_value ?bm_opt (Ex_typed_value (ty, value)) : value result =
   match (ty, value) with
-  | Pair_t ((a_ty, _, _), (b_ty, _, _), _), (a, b) -> (
+  | Pair_t ((a_ty, _, _), (b_ty, _, _), _ , _), (a, b) -> (
      let%bind a = translate_value ?bm_opt @@ Ex_typed_value(a_ty, a) in
      let%bind b = translate_value ?bm_opt @@ Ex_typed_value(b_ty, b) in
      ok @@ D_pair(a, b)
    )
-  | Union_t ((a_ty, _), _, _), L a -> (
+  | Union_t ((a_ty, _), _, _ , _), L a -> (
      let%bind a = translate_value ?bm_opt @@ Ex_typed_value(a_ty, a) in
      ok @@ D_left a
    )
-  | Union_t (_, (b_ty, _), _), R b -> (
+  | Union_t (_, (b_ty, _), _ , _), R b -> (
      let%bind b = translate_value ?bm_opt @@ Ex_typed_value(b_ty, b) in
      ok @@ D_right b
    )

@@ -47,16 +47,16 @@ let rec translate_value ?bm_opt (Ex_typed_value (ty, value)) : value result =
      ok @@ D_string s
  | (Bytes_t _), b ->
      ok @@ D_bytes (Tezos_stdlib.MBytes.to_bytes b)
- | (Address_t _), s ->
+ | (Address_t _), (s , _) ->
      ok @@ D_string (Alpha_context.Contract.to_b58check s)
  | (Unit_t _), () ->
      ok @@ D_unit
  | (Option_t _), None ->
      ok @@ D_none
- | (Option_t ((o_ty, _), _, _)), Some s ->
+ | (Option_t (o_ty, _, _)), Some s ->
      let%bind s' = translate_value @@ Ex_typed_value (o_ty, s) in
      ok @@ D_some s'
- | (Map_t (k_cty, v_ty, _)), m ->
+ | (Map_t (k_cty, v_ty, _ , _)), m ->
      let k_ty = Script_ir_translator.ty_of_comparable_ty k_cty in
      let lst =
        let aux k v acc = (k, v) :: acc in

@@ -95,7 +95,7 @@ let rec translate_value ?bm_opt (Ex_typed_value (ty, value)) : value result =
        | None -> ok orig_rem in
      bind_fold_list aux original_big_map lst in
      ok @@ D_big_map lst'
- | (List_t (ty, _)), lst ->
+ | (List_t (ty, _ , _)), lst ->
      let%bind lst' =
        let aux = fun t -> translate_value (Ex_typed_value (ty, t)) in
        bind_map_list aux lst

@@ -113,7 +113,7 @@ let rec translate_value ?bm_opt (Ex_typed_value (ty, value)) : value result =
      in
      ok @@ D_set lst''
    )
- | (Operation_t _) , op ->
+ | (Operation_t _) , (op , _) ->
      ok @@ D_operation op
  | ty, v ->
      let%bind error =
@@ -14,13 +14,44 @@ open Tezos_utils.Michelson
    without effects other than gas consumption. It must never fail. *)

 let arity : prim -> int option = function
+ (* stack things *)
+ | I_DIP -> None
+ | I_DROP -> None
+ | I_DUP -> None
+ | I_SWAP -> None
+ | I_DIG -> None
+ | I_DUG -> None
+ (* control *)
+ | I_FAILWITH -> None
+ | I_EXEC -> None
+ | I_IF -> None
+ | I_IF_CONS -> None
+ | I_IF_LEFT -> None
+ | I_IF_NONE -> None
+ | I_LOOP -> None
+ | I_MAP -> None
+ | I_ITER -> None
+ | I_LOOP_LEFT -> None
+ (* internal ops *)
+ | I_CREATE_ACCOUNT -> None
+ | I_CREATE_CONTRACT -> None
+ | I_TRANSFER_TOKENS -> None
+ | I_SET_DELEGATE -> None
+ (* tez arithmetic (can fail) *)
+ | I_ADD -> None
+ | I_MUL -> None
+ | I_SUB -> None (* can fail for tez *)
+ (* etc *)
+ | I_CONCAT -> None (* sometimes 1, sometimes 2 :( *)
+ | I_CAST -> None
+ | I_RENAME -> None
+ (* stuff *)
 | I_PACK -> Some 1
 | I_UNPACK -> Some 1
 | I_BLAKE2B -> Some 1
 | I_SHA256 -> Some 1
 | I_SHA512 -> Some 1
 | I_ABS -> Some 1
-| I_ADD -> None (* can fail for tez *)
 | I_AMOUNT -> Some 0
 | I_AND -> Some 2
 | I_BALANCE -> Some 0

@@ -28,39 +59,24 @@ let arity : prim -> int option = function
 | I_CDR -> Some 1
 | I_CHECK_SIGNATURE -> Some 3
 | I_COMPARE -> Some 2
-| I_CONCAT -> None (* sometimes 1, sometimes 2 :( *)
 | I_CONS -> Some 2
-| I_CREATE_ACCOUNT -> None (* effects, kind of *)
-| I_CREATE_CONTRACT -> None (* effects, kind of *)
 | I_IMPLICIT_ACCOUNT -> Some 1
-| I_DIP -> None
-| I_DROP -> None
-| I_DUP -> None
 | I_EDIV -> Some 2
 | I_EMPTY_MAP -> Some 0
 | I_EMPTY_SET -> Some 0
 | I_EQ -> Some 1
-| I_EXEC -> None (* effects *)
-| I_FAILWITH -> None
 | I_GE -> Some 1
 | I_GET -> Some 2
 | I_GT -> Some 1
 | I_HASH_KEY -> Some 1
-| I_IF -> None
-| I_IF_CONS -> None
-| I_IF_LEFT -> None
-| I_IF_NONE -> None
 | I_INT -> Some 1
 | I_LAMBDA -> Some 0
 | I_LE -> Some 1
 | I_LEFT -> Some 1
-| I_LOOP -> None
 | I_LSL -> Some 1
 | I_LSR -> Some 1
 | I_LT -> Some 1
-| I_MAP -> None
 | I_MEM -> Some 2
-| I_MUL -> None (* can fail for tez *)
 | I_NEG -> Some 1
 | I_NEQ -> Some 1
 | I_NIL -> Some 0

@@ -78,21 +94,17 @@ let arity : prim -> int option = function
 | I_SELF -> Some 0
 | I_SLICE -> Some 3
 | I_STEPS_TO_QUOTA -> Some 0
-| I_SUB -> None (* can fail for tez *)
-| I_SWAP -> None
-| I_TRANSFER_TOKENS -> None (* effects, kind of *)
-| I_SET_DELEGATE -> None (* effects, kind of *)
 | I_UNIT -> Some 0
 | I_UPDATE -> Some 3
 | I_XOR -> Some 2
-| I_ITER -> None
-| I_LOOP_LEFT -> None
 | I_ADDRESS -> Some 1
 | I_CONTRACT -> Some 1
 | I_ISNAT -> Some 1
-| I_CAST -> None
-| I_RENAME -> None
+| I_CHAIN_ID -> Some 0
+| I_EMPTY_BIG_MAP -> Some 0
+| I_APPLY -> Some 2

+(* not instructions *)
 | K_parameter
 | K_storage
 | K_code

@@ -126,7 +138,9 @@ let arity : prim -> int option = function
 | T_timestamp
 | T_unit
 | T_operation
-| T_address -> None
+| T_address
+| T_chain_id
+  -> None

 let is_nullary_op (p : prim) : bool =
   match arity p with

@@ -264,15 +278,15 @@ let rec iterate_optimizer (f : michelson -> bool * michelson) : michelson -> mic

 let opt_drop2 : peep2 = function
   (* nullary_op ; DROP ↦ *)
-  | Prim (_, p, _, _), Prim (_, I_DROP, _, _) when is_nullary_op p -> Some []
+  | Prim (_, p, _, _), Prim (_, I_DROP, [], _) when is_nullary_op p -> Some []
   (* DUP ; DROP ↦ *)
-  | Prim (_, I_DUP, _, _), Prim (_, I_DROP, _, _) -> Some []
+  | Prim (_, I_DUP, _, _), Prim (_, I_DROP, [], _) -> Some []
   (* unary_op ; DROP ↦ DROP *)
-  | Prim (_, p, _, _), Prim (_, I_DROP, _, _) when is_unary_op p -> Some [i_drop]
+  | Prim (_, p, _, _), Prim (_, I_DROP, [], _) when is_unary_op p -> Some [i_drop]
   (* binary_op ; DROP ↦ DROP ; DROP *)
-  | Prim (_, p, _, _), Prim (_, I_DROP, _, _) when is_binary_op p -> Some [i_drop; i_drop]
+  | Prim (_, p, _, _), Prim (_, I_DROP, [], _) when is_binary_op p -> Some [i_drop; i_drop]
   (* ternary_op ; DROP ↦ DROP ; DROP ; DROP *)
-  | Prim (_, p, _, _), Prim (_, I_DROP, _, _) when is_ternary_op p -> Some [i_drop; i_drop; i_drop]
+  | Prim (_, p, _, _), Prim (_, I_DROP, [], _) when is_ternary_op p -> Some [i_drop; i_drop; i_drop]
   | _ -> None

 let opt_drop4 : peep4 = function

@@ -280,7 +294,7 @@ let opt_drop4 : peep4 = function
   | Prim (_, I_DUP, _, _),
     (Prim (_, p, _, _) as unary_op),
     Prim (_, I_SWAP, _, _),
-    Prim (_, I_DROP, _, _)
+    Prim (_, I_DROP, [], _)
     when is_unary_op p ->
     Some [unary_op]
   | _ -> None

@@ -294,19 +308,6 @@ let opt_dip1 : peep1 = function
   (* DIP { unary_op } ↦ SWAP ; unary_op ; SWAP *)
   | Prim (_, I_DIP, [Seq (_, [(Prim (_, p, _, _) as unary_op)])], _) when is_unary_op p ->
     Some [i_swap ; unary_op ; i_swap]
-  (* saves 5 bytes *)
-  (* DIP { DROP } ↦ SWAP ; DROP *)
-  | Prim (_, I_DIP, [Seq (_, [Prim (_, I_DROP, _, _)])], _) ->
-    Some [i_swap; i_drop]
-  (* saves 3 bytes *)
-  (* DIP { DROP ; DROP } ↦ SWAP ; DROP ; SWAP ; DROP *)
-  | Prim (_, I_DIP, [Seq (_, [Prim (_, I_DROP, _, _) ; Prim (_, I_DROP, _, _)])], _) ->
-    Some [i_swap; i_drop; i_swap; i_drop]
-  (* still saves 1 byte *)
-  (* DIP { DROP ; DROP ; DROP } ↦ SWAP ; DROP ; SWAP ; DROP ; SWAP ; DROP *)
-  | Prim (_, I_DIP, [Seq (_, [Prim (_, I_DROP, _, _) ; Prim (_, I_DROP, _, _) ; Prim (_, I_DROP, _, _)])], _) ->
-    Some [i_swap; i_drop; i_swap; i_drop; i_swap; i_drop]
-  (* after this, DIP { DROP ; ... } is smaller *)
   | _ -> None

 let opt_dip2 : peep2 = function

@@ -316,16 +317,16 @@ let opt_dip2 : peep2 = function
   | Prim (_, I_DIP, [Seq (_, code1)], _), Prim (_, I_DIP, [Seq (_, code2)], _) ->
     Some [Prim (0, I_DIP, [Seq (0, code1 @ code2)], [])]
   (* DIP { code } ; DROP ↦ DROP ; code *)
-  | Prim (_, I_DIP, code, _), (Prim (_, I_DROP, _, _) as drop) ->
+  | Prim (_, I_DIP, [Seq (_, code)], _), (Prim (_, I_DROP, [], _) as drop) ->
     Some (drop :: code)
   (* nullary_op ; DIP { code } ↦ code ; nullary_op *)
   | (Prim (_, p, _, _) as nullary_op), Prim (_, I_DIP, [Seq (_, code)], _) when is_nullary_op p ->
     Some (code @ [nullary_op])
   (* DIP { code } ; unary_op ↦ unary_op ; DIP { code } *)
-  | (Prim (_, I_DIP, _, _) as dip), (Prim (_, p, _, _) as unary_op) when is_unary_op p ->
+  | (Prim (_, I_DIP, [Seq _], _) as dip), (Prim (_, p, _, _) as unary_op) when is_unary_op p ->
     Some [unary_op; dip]
   (* unary_op ; DIP { code } ↦ DIP { code } ; unary_op *)
-  (* | (Prim (_, p, _, _) as unary_op), (Prim (_, I_DIP, _, _) as dip) when is_unary_op p ->
+  (* | (Prim (_, p, _, _) as unary_op), (Prim (_, I_DIP, [Seq _], _) as dip) when is_unary_op p ->
    *   Some [dip; unary_op] *)
   | _ -> None

@@ -371,6 +372,24 @@ let rec opt_tail_fail : michelson -> michelson =
     Prim (l, p, List.map opt_tail_fail args, annot)
   | x -> x

+let rec opt_combine_drops (x : michelson) : michelson =
+  let rec combine : michelson list -> michelson list = function
+    | [] -> []
+    | Prim (_, I_DROP, [], []) :: xs ->
+      let xs' = combine xs in
+      begin match xs' with
+        | [] -> [Prim (-1, I_DROP, [], [])]
+        | Prim (_, I_DROP, [], []) :: xs' -> Prim (-1, I_DROP, [Int (-1, Z.of_int 2)], []) :: xs'
+        | Prim (_, I_DROP, [Int (_, n)], []) :: xs' -> Prim (-1, I_DROP, [Int (-1, Z.of_int (1 + Z.to_int n))], []) :: xs'
+        | x' :: xs' -> Prim (-1, I_DROP, [], []) :: x' :: xs'
+      end
+    | x :: xs -> x :: combine xs in
+  match x with
+  | Seq (l, args) -> Seq (l, combine (List.map opt_combine_drops args))
+  | Prim (l, p, args, annot) ->
+    Prim (l, p, List.map opt_combine_drops args, annot)
+  | x -> x
+
 let optimize : michelson -> michelson =
   fun x ->
   let x = use_lambda_instr x in

@@ -384,4 +403,5 @@ let optimize : michelson -> michelson =
     peephole @@ peep2 opt_swap2 ;
   ] in
   let x = iterate_optimizer (sequence_optimizers optimizers) x in
+  let x = opt_combine_drops x in
   x
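Aside (illustration, not part of the patch): `opt_combine_drops` collapses runs of `DROP` into a single `DROP n`. A small worked example using the same Micheline constructors, with dummy locations as in the code above:

let before =
  Seq (-1, [ Prim (-1, I_DROP, [], []) ;
             Prim (-1, I_DROP, [], []) ;
             Prim (-1, I_DROP, [], []) ;
             Prim (-1, I_UNIT, [], []) ])
(* opt_combine_drops before
   = Seq (-1, [ Prim (-1, I_DROP, [Int (-1, Z.of_int 3)], []) ;
                Prim (-1, I_UNIT, [], []) ]) *)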
@@ -85,6 +85,7 @@ module Simplify = struct
       ("list_iter" , "LIST_ITER") ;
       ("list_fold" , "LIST_FOLD") ;
       ("list_map" , "LIST_MAP") ;
+      (*ici*)
       ("map_iter" , "MAP_ITER") ;
       ("map_map" , "MAP_MAP") ;
       ("map_fold" , "MAP_FOLD") ;

@@ -154,6 +155,7 @@ module Simplify = struct
       ("Set.add" , "SET_ADD") ;
       ("Set.remove" , "SET_REMOVE") ;
       ("Set.fold" , "SET_FOLD") ;
+      ("Set.size", "SIZE") ;

       ("Map.find_opt" , "MAP_FIND_OPT") ;
       ("Map.find" , "MAP_FIND") ;

@@ -167,6 +169,18 @@ module Simplify = struct
       ("Map.literal" , "MAP_LITERAL" ) ;
       ("Map.size" , "SIZE" ) ;

+      ("Big_map.find_opt" , "MAP_FIND_OPT") ;
+      ("Big_map.find" , "MAP_FIND") ;
+      ("Big_map.update" , "MAP_UPDATE") ;
+      ("Big_map.add" , "MAP_ADD") ;
+      ("Big_map.remove" , "MAP_REMOVE") ;
+      ("Big_map.literal" , "BIG_MAP_LITERAL" ) ;
+      ("Big_map.empty" , "BIG_MAP_EMPTY" ) ;
+
+      ("Bitwise.lor" , "OR") ;
+      ("Bitwise.land" , "AND") ;
+      ("Bitwise.lxor" , "XOR") ;
+
       ("String.length", "SIZE") ;
       ("String.size", "SIZE") ;
       ("String.slice", "SLICE") ;

@@ -456,7 +470,10 @@ module Typer = struct
     let%bind () = assert_eq_1 op_lst (t_list (t_operation ()) ()) in
     ok @@ (t_pair (t_operation ()) (t_address ()) ())

-  let get_contract = typer_1_opt "CONTRACT" @@ fun _ tv_opt ->
+  let get_contract = typer_1_opt "CONTRACT" @@ fun addr_tv tv_opt ->
+    if not (type_value_eq (addr_tv, t_address ()))
+    then fail @@ simple_error (Format.asprintf "get_contract expects an address, got %a" PP.type_value addr_tv)
+    else
     let%bind tv =
       trace_option (simple_error "get_contract needs a type annotation") tv_opt in
     let%bind tv' =

@@ -497,11 +514,15 @@ module Typer = struct
     then ok @@ t_int () else
     if eq_1 a (t_mutez ()) && eq_1 b (t_nat ())
     then ok @@ t_mutez () else
+    if eq_1 a (t_mutez ()) && eq_1 b (t_mutez ())
+    then ok @@ t_nat () else
     simple_fail "Dividing with wrong types"

   let mod_ = typer_2 "MOD" @@ fun a b ->
     if (eq_1 a (t_nat ()) || eq_1 a (t_int ())) && (eq_1 b (t_nat ()) || eq_1 b (t_int ()))
     then ok @@ t_nat () else
+    if eq_1 a (t_mutez ()) && eq_1 b (t_mutez ())
+    then ok @@ t_mutez () else
     simple_fail "Computing modulo with wrong types"

   let add = typer_2 "ADD" @@ fun a b ->
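Aside (illustration, not part of the patch): the typer rules added above allow two mutez amounts to be divided (yielding a dimensionless nat) and taken modulo one another (yielding mutez). A plain-integer illustration of the intended semantics, in ordinary OCaml with the integers standing in for mutez amounts:

let () =
  assert (5_000_000 / 1_500_000 = 3) ;           (* mutez / mutez : nat *)
  assert (5_000_000 mod 1_500_000 = 500_000)     (* mutez mod mutez : mutez *)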
|
@ -25,7 +25,7 @@ let literal ppf (l:literal) = match l with
 | Literal_int n -> fprintf ppf "%d" n
 | Literal_nat n -> fprintf ppf "+%d" n
 | Literal_timestamp n -> fprintf ppf "+%d" n
-| Literal_mutez n -> fprintf ppf "%dmtz" n
+| Literal_mutez n -> fprintf ppf "%dmutez" n
 | Literal_string s -> fprintf ppf "%S" s
 | Literal_bytes b -> fprintf ppf "0x%s" @@ Bytes.to_string @@ Bytes.escaped b
 | Literal_address s -> fprintf ppf "@%S" s

@@ -1,3 +1,5 @@
+(** Pretty printer for the Simplified Abstract Syntax Tree *)
+
 open Types
 open Format

@@ -32,7 +34,7 @@ val matching_variant_case : (formatter -> 'a -> unit) -> formatter -> (construct
 val matching : (formatter -> 'a -> unit) -> formatter -> 'a matching -> unit
 *)

-(* Shows the type expected for the matched value *)
+(** Shows the type expected for the matched value *)
 val matching_type : formatter -> 'a matching -> unit

 (*

@@ -41,4 +43,5 @@ val matching_variant_case_type : formatter -> ( ( constructor_name * name) * 'a)
 val declaration : formatter -> declaration -> unit

 *)
+(** Pretty print a full program AST *)
 val program : formatter -> program -> unit

@@ -70,7 +70,7 @@ and literal ppf (l:literal) : unit =
 | Literal_int n -> fprintf ppf "%d" n
 | Literal_nat n -> fprintf ppf "+%d" n
 | Literal_timestamp n -> fprintf ppf "+%d" n
-| Literal_mutez n -> fprintf ppf "%dmtz" n
+| Literal_mutez n -> fprintf ppf "%dmutez" n
 | Literal_string s -> fprintf ppf "%s" s
 | Literal_bytes b -> fprintf ppf "0x%s" @@ Bytes.to_string @@ Bytes.escaped b
 | Literal_address s -> fprintf ppf "@%s" s

@@ -20,6 +20,7 @@ let type_base ppf : type_base -> _ = function
 | Base_timestamp -> fprintf ppf "timestamp"
 | Base_bytes -> fprintf ppf "bytes"
 | Base_operation -> fprintf ppf "operation"
+| Base_signature -> fprintf ppf "signature"

 let rec type_ ppf : type_value -> _ = function
 | T_or(a, b) -> fprintf ppf "(%a) | (%a)" annotated a annotated b

@@ -53,7 +54,7 @@ let rec value ppf : value -> unit = function
 | D_int n -> fprintf ppf "%d" n
 | D_nat n -> fprintf ppf "+%d" n
 | D_timestamp n -> fprintf ppf "+%d" n
-| D_mutez n -> fprintf ppf "%dmtz" n
+| D_mutez n -> fprintf ppf "%dmutez" n
 | D_unit -> fprintf ppf "unit"
 | D_string s -> fprintf ppf "\"%s\"" s
 | D_bytes x ->

@@ -6,7 +6,7 @@ type type_base =
   | Base_int | Base_nat | Base_tez
   | Base_timestamp
   | Base_string | Base_bytes | Base_address
-  | Base_operation
+  | Base_operation | Base_signature

 type 'a annotated = string option * 'a

@@ -1,3 +1,5 @@
+(* Test that a string is cast to an address given a type annotation *)
+
 const lst : list(int) = list [] ;

 const address : address = ("tz1KqTpEZ7Yob7QbPE4Hy4Wo8fHG8LhKxZSx" : address) ;

src/test/contracts/arithmetic.mligo (new file, 30 lines)
@@ -0,0 +1,30 @@
+// Test CameLIGO arithmetic operators
+
+let mod_op (n : int) : nat =
+  n mod 42
+
+let plus_op (n : int) : int =
+  n + 42
+
+let minus_op (n : int) : int =
+  n - 42
+
+let times_op (n : int) : int =
+  n * 42
+
+let div_op (n : int) : int =
+  n / 2
+
+(* TODO (?): Support conversion from nat to int and back
+
+let int_op (n : nat) : int =
+  Int n
+
+*)
+
+let neg_op (n : int) : int =
+  -n
+
+let foo (n : int) : int = n + 10
+
+let neg_op_2 (b: int) : int = -(foo b)
@@ -1,30 +1,36 @@
 type storage_ is big_map(int, int) * unit
+type foo is big_map(int, int)

 function main(const p : unit; const s : storage_) : list(operation) * storage_ is
-  var r : big_map(int, int) := s.0 ;
   var toto : option (int) := Some(0);
   block {
-    toto := r[23];
-    r[2] := 444;
-    s.0 := r;
+    toto := s.0[23];
+    s.0[2] := 444;
   }
   with ((nil: list(operation)), s)

-function set_ (var n : int ; var m : storage_) : storage_ is block {
-  var tmp : big_map(int,int) := m.0 ;
-  tmp[23] := n ;
-  m.0 := tmp ;
+function set_ (var n : int ; var m : foo) : foo is block {
+  m[23] := n ;
 } with m

-function rm (var m : storage_) : storage_ is block {
-  var tmp : big_map(int,int) := m.0 ;
-  remove 42 from map tmp;
-  m.0 := tmp;
+function rm (var m : foo) : foo is block {
+  remove 42 from map m;
 } with m

-function gf (const m : storage_) : int is begin skip end with get_force(23, m.0)
+function gf (const m : foo) : int is begin skip end with get_force(23, m)

-function get (const m : storage_) : option(int) is
-  begin
-    skip
-  end with m.0[42]
+function get (const m : foo) : option(int) is begin skip end with m[42]
+
+const empty_big_map : big_map(int,int) = big_map end
+
+const big_map1 : big_map(int,int) = big_map
+  23 -> 0 ;
+  42 -> 0 ;
+end
+
+function mutimaps (const m : foo ; const n : foo) : foo is block
+{
+  var bar : foo := m ;
+  bar[42] := 0 ;
+  n[42] := get_force(42, bar) ;
+} with n
@@ -1,12 +1,21 @@
-type storage_ = ((int, int) big_map * unit)
+type foo = (int, int) big_map

-let set_ (n : int) (m : storage_) : storage_ =
-  (Map.update 23 (Some(n)) m.(0), ())
+let set_ (n : int) (m : foo) : foo = Big_map.update 23 (Some(n)) m

-let rm (m : storage_) : storage_ =
-  (Map.remove 42 m.(0), ())
+let rm (m : foo) : foo = Big_map.remove 42 m

-let gf (m : storage_) : int = Map.find 23 m.(0)
+let gf (m : foo) : int = Big_map.find 23 m

-let get (m: storage_): int option =
-  Map.find_opt 42 m.(0)
+let get (m: foo): int option = Big_map.find_opt 42 m
+
+let empty_map : foo = Big_map.empty
+
+let map1 : foo = Big_map.literal
+  [ (23 , 0) ; (42, 0) ]
+
+let map1 : foo = Big_map.literal
+  [ (23 , 0) ; (42, 0) ]
+
+let mutimaps (m : foo) (n : foo) : foo =
+  let bar : foo = Big_map.update 42 (Some(0)) m in
+  Big_map.update 42 (get(bar)) n
10
src/test/contracts/bitwise_arithmetic.mligo
Normal file
10
src/test/contracts/bitwise_arithmetic.mligo
Normal file
@ -0,0 +1,10 @@
|
|||||||
|
(* Test CameLIGO bitwise operators *)
|
||||||
|
|
||||||
|
let or_op (n : nat) : nat =
|
||||||
|
Bitwise.lor n 4p
|
||||||
|
|
||||||
|
let and_op (n : nat) : nat =
|
||||||
|
Bitwise.land n 7p
|
||||||
|
|
||||||
|
let xor_op (n : nat) : nat =
|
||||||
|
Bitwise.lxor n 7p
|
2
src/test/contracts/blockless.ligo
Normal file
2
src/test/contracts/blockless.ligo
Normal file
@ -0,0 +1,2 @@
|
|||||||
|
function blockless (const n: int) : int is
|
||||||
|
n + 10;
|
16
src/test/contracts/boolean_operators.mligo
Normal file
@ -0,0 +1,16 @@
// Test CameLIGO boolean operators

let or_true (b : bool) : bool =
  b || true

let or_false (b : bool) : bool =
  b || false

let and_true (b : bool) : bool =
  b && true

let and_false (b : bool) : bool =
  b && false

let not_bool (b: bool) : bool =
  not b
5
src/test/contracts/condition-annot.mligo
Normal file
@ -0,0 +1,5 @@
let%entry main (i : int) =
  if (i = 2 : bool) then
    (42 : int)
  else
    (0 : int)
9
src/test/contracts/condition-shadowing.mligo
Normal file
@ -0,0 +1,9 @@
(* TODO : make a test using mutation, not shadowing *)
let%entry main (i : int) =
  let result = 0 in
  if i = 2 then
    let result = 42 in
    result
  else
    let result = 0 in
    result
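The point of this contract is that the inner `let result` bindings shadow the outer one rather than mutating it (hence the TODO about a mutation-based test). A minimal OCaml equivalent of the same shape, for illustration only:

(* OCaml analogue of condition-shadowing.mligo: each inner let introduces a
   new binding, so the function returns 42 or 0 regardless of the outer one.
   The compiler may warn that the outer binding is unused. *)
let main (i : int) =
  let result = 0 in
  if i = 2 then
    let result = 42 in result
  else
    let result = 0 in result

let () = assert (main 2 = 42 && main 3 = 0)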
@ -8,3 +8,9 @@ function main (const i : int) : int is
  else
    result := 0
  end with result
+
+function foo (const b : bool) : int is
+  var x : int := 41 ;
+  begin
+    x := 1 + (if b then x else main(x)) ;
+  end with x
7
src/test/contracts/condition.mligo
Normal file
@ -0,0 +1,7 @@
// Test if conditional in CameLIGO

let%entry main (i : int) =
  if i = 2 then
    42
  else
    0
8
src/test/contracts/fibo.mligo
Normal file
@ -0,0 +1,8 @@
type storage = unit


let%entry main (p:unit) storage =
  (fun (f : (int * int) -> int) (x : int) (y : int) -> f (y, x))
  (fun (x : int) (y : int) -> x + y)
  0
  1
src/test/contracts/fibo2.mligo
Normal file
7
src/test/contracts/fibo2.mligo
Normal file
@ -0,0 +1,7 @@
|
|||||||
|
type storage = unit
|
||||||
|
|
||||||
|
let%entry main (p:unit) storage =
|
||||||
|
(fun (f : int -> int) (x : int) (y : int) -> (f y))
|
||||||
|
(fun (x : int) -> x)
|
||||||
|
0
|
||||||
|
1
|
7
src/test/contracts/fibo3.mligo
Normal file
@ -0,0 +1,7 @@
type storage = unit

let%entry main (p:unit) storage =
  (fun (f : int -> int -> int) (x : int) (y : int) -> (f y) (x + y))
  (fun (x : int) (y : int) -> x + y)
  0
  1
6
src/test/contracts/fibo4.mligo
Normal file
@ -0,0 +1,6 @@
type storage = unit

let%entry main (p:unit) storage =
  (fun (f : int -> int) (x : int) -> (f x))
  (fun (x : int) -> x)
  1
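Despite their names, these fibo*.mligo contracts exercise higher-order function application rather than a full Fibonacci computation. For instance, the body of fibo4 above applies a wrapper to the identity function and 1, so it reduces to 1; the following OCaml sketch of the same expression (illustration only, the name fibo4_body is mine) shows the reduction:

(* Same shape as the fibo4 body: apply a wrapper to the identity and 1. *)
let fibo4_body =
  (fun (f : int -> int) (x : int) -> f x)
    (fun (x : int) -> x)
    1

let () = assert (fibo4_body = 1)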
7
src/test/contracts/function-shared.mligo
Normal file
@ -0,0 +1,7 @@
(* Test use of multiple subroutines in a CameLIGO function *)

let foo (i: int) : int = i + 20

let bar (i: int) : int = i + 50

let foobar (i: int) : int = (foo i) + (bar i)
@ -17,12 +17,11 @@ function foobar2 (const i : int) : int is
  block { skip } with i;
  block { skip } with higher2(i,foo2)

-// This is not supported yet:
-// const a : int = 123;
-// function foobar3 (const i : int) : int is
-// function foo2 (const i : int) : int is
-// block { skip } with (a+i);
-// block { skip } with higher2(i,foo2)
+const a : int = 0;
+function foobar3 (const i : int) : int is
+  function foo2 (const i : int) : int is
+    block { skip } with (a+i);
+  block { skip } with higher2(i,foo2)

function f (const i : int) : int is
  block { skip }
@ -35,3 +34,16 @@ function g (const i : int) : int is
function foobar4 (const i : int) : int is
  block { skip }
  with g(g(i))
+
+function higher3(const i: int; const f: int -> int; const g: int -> int): int is
+  block {
+    const ii: int = f(g(i));
+  } with ii
+
+function foobar5 (const i : int) : int is
+  const a : int = 0;
+  function foo (const i : int) : int is
+    block { skip } with (a+i);
+  function goo (const i : int) : int is
+    block { skip } with foo(i);
+  block { skip } with higher3(i,foo,goo)
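The added higher3/foobar5 pair checks that nested functions can capture a local constant and be passed as arguments to another function. An OCaml sketch of the same structure (illustration only, not LIGO code) shows why foobar5 should return its argument unchanged, since a is 0:

(* OCaml model of higher3/foobar5: foo and goo capture a = 0, and higher3
   composes them, so foobar5 i = i. *)
let higher3 (i : int) (f : int -> int) (g : int -> int) : int = f (g i)

let foobar5 (i : int) : int =
  let a = 0 in
  let foo (i : int) : int = a + i in
  let goo (i : int) : int = foo i in
  higher3 i foo goo

let () = assert (foobar5 41 = 41)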