<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<span id="result_box" class="" lang="en"><span class="">When the
termination checking fails you can complete it again on a
normalized part of the program. Then the speed is completely
preserved because we will rarely need normalization passe .
Therefore, normalized parts could be stored and used during
coding generation.<br>
<br>
</span></span>Kiam la sintaksa finiĝokontrolo fiaskas oni povas
denove plenumi ĝin sur normalita parto de la programo. Tiam la
rapideco estas tute konservita pro oni malofte bezonos nomaladan
fazon. Sekve, normalitaj partoj povus esti konservi kaj uzitaj dum
la kodokonstuado.<br>
<br>
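For illustration, here is a minimal Haskell sketch of the idea (it is
not the actual Agda or Coq termination checker, and every name in it
is made up): a purely syntactic structural-decrease check rejects a
recursive call whose decreasing argument is hidden behind a wrapper
function, but accepts the same definition once the body has been
normalized by inlining the wrapper; the normalized body could then be
cached for later reuse.<br>
<pre>
-- Hypothetical sketch: a syntactic termination check that succeeds
-- only after a normalization (inlining) pass.  Not Agda's or Coq's
-- actual algorithm; all names below are invented for illustration.
module Main where

-- First-order, one-argument terms.
data Term
  = Arg               -- the structurally smaller variable, e.g. n in (suc n)
  | Call String Term  -- application of a top-level function
  deriving (Eq, Show)

-- A top-level definition: its name and the body of its recursive clause.
data Def = Def { defName :: String, defBody :: Term }

-- Non-recursive helpers that normalization may unfold; `wrap` is the identity.
inlinable :: [(String, Term -> Term)]
inlinable = [("wrap", \t -> t)]

-- Purely syntactic check: every recursive call must be applied to the
-- structurally smaller variable `Arg` directly, with nothing in between.
check :: Def -> Bool
check (Def fn b) = go b
  where
    go Arg = True
    go (Call g t)
      | g == fn   = t == Arg   -- recursive call: argument must literally be Arg
      | otherwise = go t

-- Normalization pass: unfold the inlinable helpers inside a body.
normalize :: Term -> Term
normalize Arg        = Arg
normalize (Call g t) = case lookup g inlinable of
  Just unfold -> unfold (normalize t)
  Nothing     -> Call g (normalize t)

-- f (suc n) = f (wrap n): the decrease is hidden behind `wrap`.
f :: Def
f = Def "f" (Call "f" (Call "wrap" Arg))

main :: IO ()
main = do
  print (check f)   -- False: rejected by the purely syntactic check
  -- Re-run the check on the normalized body; the normalized body could
  -- also be cached and reused later, e.g. during code generation.
  let f' = f { defBody = normalize (defBody f) }
  print (check f')  -- True: after inlining wrap, the call is f Arg
</pre>
<br>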
<div class="moz-cite-prefix">On 2018-07-31 03:19, Jason -Zhong
Sheng- Hu wrote:<br>
</div>
<blockquote type="cite"
cite="mid:YTXPR0101MB1472AFFA79F85E7958CDC7DCAF2E0@YTXPR0101MB1472.CANPRD01.PROD.OUTLOOK.COM">
<p style="margin-top:0;margin-bottom:0">I am wondering if there is
any implementation of dependent type theory performs
normalization before termination checking?</p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">It sounds theoretical
doable, but somehow termination checking is commonly
syntactical, like agda and coq. However, those who care about
compilation performance might not be too happy about it.<br>
</p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
<p style="margin-top:0;margin-bottom:0">I am wondering if there is
an example makes normalization here looks non-trivial?<br>
</p>
<p style="margin-top:0;margin-bottom:0"><br>
</p>
</blockquote>
<br>
<div class="moz-signature">-- <br>
Serge Leblanc
<hr>
gpg --search-keys 0x67B17A3F
<br>
Fingerprint = 2B2D AC93 8620 43D3 D2C2 C2D3 B67C F631 67B1 7A3F</div>
</body>
</html>