% source: Papers/polymorphic-variants-2012/polymorphic-variants.tex@2525
% Last change: r2525, checked in by sacerdot
\documentclass[smallextended]{svjour3}

\smartqed

\usepackage{amsmath}
\usepackage[english]{babel}
\usepackage{listings}
\usepackage{microtype}
\usepackage{prooftree}

\newcommand{\cons}{{::}}
\newcommand{\ent}{\vdash}
\newcommand{\fall}[2]{\forall{#1}.#2}
\newcommand{\funarr}{\rightarrow}
\newcommand{\lam}[2]{\lambda{#1}.{#2}}
\newcommand{\letin}[3]{\mathbf{let}\ {#1}\ {:=}\ {#2}\ \mathbf{in}\ {#3}}
\newcommand{\match}[3]{\mathbf{match}\ #1\ \mathbf{with}\ #2\ \Rightarrow #3}
\newcommand{\pair}[2]{\langle #1, #2 \rangle}
\newcommand{\polytag}[1]{{}#1}
\newcommand{\rulefont}[1]{\ensuremath{(\mathbf{#1})}}

\renewcommand{\verb}{\lstinline}
\def\lstlanguagefiles{lst-grafite.tex}
\lstset{language=Grafite}

\bibliographystyle{spphys}

\author{Dominic P. Mulligan \and Claudio Sacerdoti Coen}
\title{Polymorphic variants in dependent type theory\thanks{The project CerCo acknowledges the financial support of the Future and Emerging Technologies (FET) programme within the Seventh Framework Programme for Research of the European Commission, under FET-Open grant number: 243881.}}

\institute{
  Dominic P. Mulligan \at
  Computer Laboratory,\\
  University of Cambridge.
  \email{dominic.p.mulligan@gmail.com} \and
  Claudio Sacerdoti Coen \at
  Dipartimento di Scienze dell'Informazione,\\
  Universit\`a di Bologna.
  \email{sacerdot@cs.unibo.it}
}

\begin{document}

\maketitle

\begin{abstract}

Big long abstract introducing the work

\keywords{Polymorphic variants \and dependent type theory \and Matita theorem prover}

\end{abstract}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Section
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Introduction}
\label{sect.introduction}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Section
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Polymorphic variants}
\label{sect.polymorphic.variants}

In this section we provide a self-contained \emph{pr\'ecis} of polymorphic variants for the unfamiliar reader.
Readers who wish for a more complete introduction to the subject are referred to Garrigue's original publications~\cite{garrigue:programming:1998,garrigue:code:2000}.

Most mainstream functional programming languages, such as OCaml and Haskell, have mechanisms for defining new inductive types from old through the use of algebraic data type definitions.
Each algebraic data type may be described as a sum-of-products.
The programmer provides a fixed number of distinct \emph{constructors}, each of which expects a (potentially empty) product of arguments.
Values of a given inductive data type are built using the data type's constructors.
Quotidian data structures---such as lists, trees, heaps, zippers, and so forth---can all be introduced using this now familiar mechanism.

Having built data from various combinations of constructors, to complete the picture we need some facility for picking said data apart, in order to build functions that operate over that data.
Functional languages almost uniformly employ \emph{pattern matching} for this task.
Given any inhabitant of an algebraic data type, by the aforementioned sum-of-products property, we know that it must consist of some constructor applied to various arguments.
Using pattern matching we can therefore deconstruct algebraic data by performing a case analysis on the constructors of a given type.

The combination of algebraic data types and pattern matching is powerful, and is arguably the main branching mechanism for most functional programming languages.
Furthermore, using pattern matching it is easy to define new functions that consume algebraic data---the set of operations that can be defined for any given algebraic type is practically without bound.

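The mechanism can be made concrete with a small OCaml sketch (ours, not taken from the cited literature): an algebraic data type, a value built from its constructors, and a function defined by pattern matching.

```ocaml
(* A closed algebraic data type: a sum of two constructors, each
   carrying a (possibly empty) product of arguments. *)
type tree =
  | Leaf
  | Node of tree * int * tree

(* Values are built from the constructors... *)
let t = Node (Node (Leaf, 1, Leaf), 2, Leaf)

(* ...and consumed by case analysis on those same constructors. *)
let rec size = function
  | Leaf -> 0
  | Node (l, _, r) -> size l + 1 + size r

let () = assert (size t = 2)
```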
Unfortunately, in practical programming, we often want to expand a previously defined algebraic data type, adding more constructors.
When it comes to extending algebraic data types with new constructors in this way, we hit a problem: these types are `closed'.
In order to circumvent this restriction, we must introduce a new algebraic type with the additional constructor, lifting the old type---and any functions defined over it---into this new type.
This manual extension, coupled with the lifting of any existing functions into the new type, is cumbersome and error-prone.

\begin{figure}[ht]
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}
data Term
  = Lit Int
  | Add Term Term

eval :: Term -> Int
eval (Lit i)   = i
eval (Add l r) = eval l + eval r
\end{lstlisting}
\end{minipage}
\hspace{0.5cm}
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}
interface Term {
  int eval();
}

class Lit implements Term {
  private int v = 0;
  ...
  int eval() { return v; }
}

class Add implements Term {
  private Term left, right = null;
  ...
  int eval() {
    return left.eval() + right.eval();
  }
}
\end{lstlisting}
\end{minipage}
\caption{A simple language of integer arithmetic embedded as an algebraic data type and as a class hierarchy.}
\label{fig.pattern-matching.vs.oop}
\end{figure}

However, we can look abroad to see what other families of programming languages do to solve this problem.
For instance, we can compare and contrast functional programming languages' use of algebraic data and pattern matching with the approach taken by object-oriented languages (see Figure~\ref{fig.pattern-matching.vs.oop} for a concrete example).
In mainstream, class-based object-oriented languages---such as Java, for example---algebraic data types can be emulated by interfaces, coupled with a number of subclasses corresponding to constructors, implementing the `root' interface.
The base interface of the object hierarchy specifies the permitted operations defined for the type.
Branching by pattern matching is emulated using the language's dynamic dispatch mechanism.

As all permitted operations must be declared in the base interface, it is hard to enlarge the set of operations defined over a given type without altering the entire class hierarchy.
If the interface changes, so must every class implementing it.
However, note that it is easy to extend the hierarchy with new cases, corresponding to the introduction of a new constructor in the functional world, by merely adding another class, corresponding to that constructor, implementing the interface.

\begin{figure}[ht]
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}
...
data Term'
  = Lift Term
  | Mul Term' Term'

eval' :: Term' -> Int
eval' (Lift t)  = eval t
eval' (Mul l r) = eval' l * eval' r
\end{lstlisting}
\end{minipage}
\hspace{0.5cm}
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}
...
class Mul implements Term {
  private Term left, right = null;
  ...
  int eval() {
    return left.eval() * right.eval();
  }
}
\end{lstlisting}
\end{minipage}
\caption{Extending our simple language of arithmetic with a new grammatical element.}
\label{fig.extending.base.type}
\end{figure}

\begin{figure}[ht]
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}
...
show :: Term -> String
show (Lit i)   = show i
show (Add l r) =
  show l ++ " + " ++ show r
\end{lstlisting}
\end{minipage}
\hspace{0.5cm}
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}
interface Term {
  int eval();
  String show();
}

class Lit implements Term {
  private int v = 0;
  ...
  int eval() { return v; }
  String show() {
    return Integer.toString(v);
  }
}

class Add implements Term {
  private Term left, right = null;
  ...
  int eval() {
    return left.eval() + right.eval();
  }
  ...
  String show() {
    return left.show() +
      " + " + right.show();
  }
}
\end{lstlisting}
\end{minipage}
\caption{Extending the permitted operations over our simple language of arithmetic.}
\label{fig.extending.operations}
\end{figure}

[dpm: reword the above --- hard to phrase precisely ]


\begin{itemize}
 \item General introduction, motivations
 \item Bounded vs not-bounded.
\end{itemize}
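Before fixing notation, it may help to see polymorphic variants in their natural habitat. The following OCaml sketch (our own illustration) shows that variant constructors need no prior type declaration, and that the inferred types carry row bounds:

```ocaml
(* A polymorphic variant value; no type declaration is needed.
   Its inferred type is the open row [> `Num of int ]. *)
let three = `Num 3

(* A function defined by matching on tags.  Its inferred domain is
   the bounded row ([< `Num of int | `Plus of 'a * 'a ] as 'a):
   it accepts any variant built from at most these two tags. *)
let rec eval = function
  | `Num n -> n
  | `Plus (l, r) -> eval l + eval r

let () = assert (eval (`Plus (three, `Num 4)) = 7)
```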

\begin{definition}
\label{defn.garrigues.calculus}
Inductively define a
\end{definition}

\begin{figure}
\begin{gather*}
\begin{prooftree}
(\Gamma(x) = \sigma)
\justifies
K; \Gamma \ent x : \sigma
\using\rulefont{{\Gamma}x}
\end{prooftree}
\begin{prooftree}
K; \Gamma, x : \phi \ent r : \psi
\justifies
K; \Gamma \ent \lam{x}r : \phi \funarr \psi
\using\rulefont{{\Gamma}\lambda}
\end{prooftree}
\begin{prooftree}
K;\Gamma \ent r : \phi \funarr \psi \quad K;\Gamma \ent s : \phi
\justifies
K;\Gamma \ent rs : \psi
\using\rulefont{{\Gamma}rs}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K; \Gamma \ent r : \sigma \quad K; \Gamma, x:\sigma \ent s : \psi
\justifies
K; \Gamma \ent \letin{x}{r}{s} : \psi
\using\rulefont{{\Gamma}{:=}}
\end{prooftree}
\begin{prooftree}
K;\Gamma \ent r : \sigma \quad (a \not\in ftv(\Gamma))
\justifies
K;\Gamma \ent r : \fall{a}{\sigma}
\using\rulefont{{\Gamma}{\forall}{a}}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K;\Gamma \ent r : \fall{a}{\sigma} \quad K;\Gamma \ent s : \psi
\justifies
K;\Gamma \ent rs : \sigma[a := \psi]
\using\rulefont{{\Gamma}inst{a}}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
(L' \subseteq L \quad U \subseteq H')
\justifies
K \oplus i_{{\geq \pair{L}{U}}} \ent i \geq \pair{L'}{H'}
\using\rulefont{{K}i}
\end{prooftree}
\begin{prooftree}
K;\Gamma \ent r : \phi \quad K \ent i \geq \pair{tag}{\top}
\justifies
K;\Gamma \ent \polytag{tag(r)} : [ i \mid tag : \phi \cons T ]
\using\rulefont{{K}variant}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K;\Gamma \ent r : [ i \mid \{ tag_k : \phi_k \}^n_1 \cons T ] \quad K \ent i \geq \pair{\bot}{\{tag_k\}^n_1} \quad
K;\Gamma, x_k : \phi_k \ent r_k : \psi \quad (1 \leq k \leq n)
\justifies
K;\Gamma \ent \mathbf{match}\ r\ \mathbf{with}\ \{ \polytag{tag_k(x_k)} \Rightarrow r_k \}^n_1 : \psi
\using\rulefont{{K}match}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K;\Gamma \ent r : \sigma \quad (\rho \not\in ftv(\Gamma))
\justifies
K;\Gamma \ent r : \fall{\rho}{\sigma}
\using\rulefont{{K}{\forall}{\rho}}
\end{prooftree}
\begin{prooftree}
K;\Gamma \ent r : \fall{\rho}{\sigma} \quad K;\Gamma \ent s : T
\justifies
K;\Gamma \ent rs : \sigma[\rho := T]
\using\rulefont{{K}inst{\rho}}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K \oplus i_{\geq \pair{L}{U}};\Gamma \ent r : \sigma
\justifies
K;\Gamma \ent r : \fall{i_{\geq \pair{L}{U}}}{\sigma}
\using\rulefont{{K}{\forall}{i}}
\end{prooftree}
\begin{prooftree}
K;\Gamma \ent r : \fall{i_{\geq \pair{L}{U}}}{\sigma} \quad K \ent i' \geq \pair{L}{U}
\justifies
K;\Gamma \ent r : \sigma[i := i']
\using\rulefont{{K}inst{i}}
\end{prooftree}
\end{gather*}
\caption{Garrigue's typing relation for polymorphic variants}
\label{fig.garrigue.typing.relation}
\end{figure}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Section
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\subsection{Matita}
\label{subsect.matita}

Matita~\cite{asperti:user:2007} is a dependently-typed proof assistant developed in Bologna, implementing the Calculus of (Co)inductive Constructions, a type theory similar to that of Coq.
Throughout this paper, all running examples are provided in Matita's proof script vernacular.
This vernacular, similar to the syntax of OCaml or Coq, should be mostly straightforward.

One possible source of confusion is our use of $?$ and $\ldots$, which correspond, respectively, to a single term and a sequence of terms to be inferred automatically by unification.
Any terms that cannot be inferred by unification from the surrounding context are left for the user to provide as proof obligations.

\subsection{Subtyping as instantiation \emph{vs} subtyping as a safe static cast}

\subsection{Syntax and type checking rules}
The ones of Garrigue + casts, but also for the bounded case?
Casts support both styles of subtyping.

\subsection{Examples}
The weird function types that only work in subtyping as instantiation

\subsection{Solution to the expression problem}
Our running example in pseudo-OCaml syntax
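As a preview, the intended shape of such a solution in actual OCaml (a sketch of the standard polymorphic-variant technique with names of our own choosing; the dependently typed encoding comes later) is:

```ocaml
(* Base language: open recursion over literals and addition. *)
type 'a base = [ `Num of int | `Plus of 'a * 'a ]

let eval_base eval : 'a base -> int = function
  | `Num n -> n
  | `Plus (l, r) -> eval l + eval r

(* Extension with multiplication: eval_base is reused unchanged. *)
type 'a ext = [ 'a base | `Mul of 'a * 'a ]

let eval_ext eval : 'a ext -> int = function
  | `Mul (l, r) -> eval l * eval r
  | #base as e -> eval_base eval e

(* Tie the recursive knot at each level of the extension. *)
let rec eval1 e = eval_base eval1 e
let rec eval2 e = eval_ext eval2 e

let () =
  assert (eval1 (`Plus (`Num 1, `Num 2)) = 3);
  assert (eval2 (`Mul (`Plus (`Num 1, `Num 2), `Num 3)) = 9)
```

Both the data type and the set of operations can be extended without touching existing code, which is exactly the expression problem.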

\section{Bounded polymorphic variants via dependent types}
The topic of this paper, and this section in particular, is to present an
\emph{encoding} of a subclass of polymorphic variants into dependent type
theory. The final aim is to provide a solution to the expression problem
that is implementable in user space in interactive theorem provers
based on dependent type theories, in particular Coq, Agda and Matita.
Moreover, we are only interested in an encoding that satisfies a number
of constraints, listed below. Essentially, the constraints ask that code
automatically extracted from the system should compile efficiently
without any change to the extraction machinery. The encoding has been tested
in the interactive theorem prover Matita, but we believe that it could be
easily ported to the other cited systems.

\subsection{Code extraction}
Since our requirements have code extraction in mind, we briefly recall here
what code extraction is in our context.

Code extraction, as implemented in Coq, Matita and other systems, is the
procedure that, given a proof term, extracts a program in a functional
programming language that implements the computational content of the proof
term. In particular, all proof parts that only establish correctness of the
answer, totality of functions, etc.\ are erased. For instance, given
a precondition $P$ and a postcondition $Q$, from a
constructive proof of a $\forall x:A.P(x) \to \exists y:B. Q(x,y)$ statement
one extracts a (partial) function $f$ from the encoding of $A$ to the encoding
of $B$ that satisfies $\forall x. P(x) \to Q(x,f(x))$.

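To make the erasure concrete, here is an illustrative sketch (ours, not the output of any actual extractor): from a constructive proof of $\forall a,b. b > 0 \to \exists q,r.\ a = b \times q + r \wedge r < b$ one would obtain, roughly, the following OCaml code.

```ocaml
(* The propositional parts (b > 0, the equation, r < b) have no
   computational content and are erased; what survives is the
   computation of the witnesses q and r. *)
let div_mod a b = (a / b, a mod b)

let () = assert (div_mod 7 2 = (3, 1))
```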
Code extraction, as implemented by the above systems, does not attempt any
optimization of the representation of data types. The only modification allowed
is that parts of the data that are propositional and thus bear no content
(being isomorphic to the unit type) are simplified to the unit type and
possibly completely erased using the categorical properties of unit types
($1 * T \simeq T$, $1 \to T \simeq T$, etc.). For more information, one can
consult the literature on code extraction.

Thus, if a polymorphic variant is encoded in type theory as an n-ary sum of
n-ary products, these being inefficiently represented using binary sums and
products, the same holds for the extracted code.

\subsection{The requirements}

We impose the following requirements.
\begin{enumerate}
 \item The representation of a polymorphic variant type in the extracted code
   should be as efficient, in memory and time, as the isomorphic inductive
   type. Efficiency should here be considered up to constants, not
   asymptotically.
 \item Polymorphic variant types in the subtyping relation should be encoded
   using data types that, once extracted, are in the subtyping relation too.
   Before code extraction, instead, it is permitted to use explicit non-identity
   functions to coerce inhabitants of the sub-type into the super-type.
   What matters is that the computational content of these coercions should be
   the identity.
\end{enumerate}
The first requirement prevents the encoding of polymorphic variants
using binary sums and products. For example, a constructor of an inductive
type with four constructors requires one memory cell, but the same constructor
for a binary sum of binary sums requires two memory cells, and twice the time
to inhabit the data type. The solution is to force the code extracted from the
polymorphic variant $\tau$ to be an inductive type. In turn this forces the
encoding of polymorphic variants to be based on inductive types.
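The cost gap of the nested representation can be seen in a small OCaml sketch (ours): a four-constructor type represented flat versus as a binary sum of binary sums, using the standard library's `Either.t` (available since OCaml 4.12).

```ocaml
(* Flat: one tag decides among the four cases directly. *)
type flat = A | B | C | D

(* Nested: every value is built from two Either constructors, so it
   costs two allocated blocks and two tag inspections per value. *)
type nested = ((unit, unit) Either.t, (unit, unit) Either.t) Either.t

let encode : flat -> nested = function
  | A -> Either.Left (Either.Left ())
  | B -> Either.Left (Either.Right ())
  | C -> Either.Right (Either.Left ())
  | D -> Either.Right (Either.Right ())

let () = assert (encode C = Either.Right (Either.Left ()))
```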

The second requirement is in tension with the possibility of extending a
polymorphic variant type: when a type $\tau$ is extended to $\tau + a:T$,
an inhabitant of the extraction of $\tau$ should already be an inhabitant of
the extraction of $\tau + a:T$. This would be possible if polymorphic variants
were extracted to polymorphic variants in the target language, but this is not
actually the case in any of the considered systems (e.g.\ Coq or Matita).
Indeed, because of the first requirement, $\tau$ and $\tau + a:T$ are to be
extracted to two inductive types. The only way to obtain subtyping is to
already use the same representation: we need to anticipate, or \emph{bound},
the extension.

Thus we accept the following limitation, together with an additional requirement
to mitigate the limitation:
\begin{enumerate}
 \item[3] We only encode \emph{bounded polymorphic variants}, whose typing
  rules are given in Figure~\ref{???}.
 \item[4] Bounded polymorphic variants are characterized by a universe, i.e.\
  a set of fields together with their types. Every bounded polymorphic variant
  is a sub-variant of the universe, obtained by selecting only certain fields.
  We ask that adding a new field to the universe should have no impact on
  pre-existing code that does not use it. More precisely, it should be possible
  to add a field to the universe and recompile all pre-existing developments
  without further modifications.
\end{enumerate}

\subsection{Discussion of the requirements and alternative approaches}
The set of requirements we have imposed drastically restricts the set of possible
solutions for the encoding. Moreover, the requirements clearly represent a
trade-off between flexibility and efficiency. We discuss here alternative
approaches based on different sets of requirements.

Maximal flexibility is only obtained by dropping the boundedness limitation.
One of the benefits of open polymorphic variants is separate compilation:
the developers of libraries can use their own sets of constructors and the
user can freely mix them when the types of the different fields are coherent.
Moreover, separate compilation can be made efficient if one accepts that
correct programs can be rejected with very low probability at linking
time~\cite{guarriguexxx}.

Requirement 4 has been imposed to recover the flexibility of open variants.
Suppose that every library starts with a file that declares its own universe.
Then, to mix two libraries, one needs to merge the two universe declarations,
which is only possible if the universes are coherent. Once this is
done, the library files will keep compiling because of the requirement. What
is irremediably lost is separate compilation. We believe that this loss can
be tolerated in many practical situations, because 1) the merging of different
libraries that need to share at least some constructors is rare and needs to
be done just once; and 2) the addition of new fields to the universe of a library
under development usually requires a significant effort to fix the library
itself, which should make the recompilation time of the other libraries
negligible.

If separate compilation is considered fundamental, then the only possible
approach seems to be that of introducing a syntactic data type (a universe) of
descriptions of polymorphic variant types, which is then interpreted by
dependent functions (type formers) into the appropriate sums of types.
The problem, generalized to that of giving a universe to the full syntax of
inductive types, is being studied by the groups of Altenkirch and
Ghani~\cite{aa,bb,cc,dd}. This approach, which is mathematically
very elegant, is in conflict with requirements 1 and 2. The conflict can be
resolved by devising an ad-hoc extraction procedure which targets a programming
language that already implements polymorphic variants efficiently. As far as
we know, this has not been done yet. The next generation of Epigram, however,
is supposed to be implemented around this idea. Instead of an ad-hoc extraction
procedure, Epigram will be directly compiled to executable code, optimizing the
poor encodings that are generated by the syntactic universe.

\subsection{The encoding}
In order to encode bounded polymorphic variants in dependent type theory,
we encode 1) the universe; 2) the polymorphic variant types; 3) the
constructors of the polymorphic variant type; 4) the pattern matching operator;
and 5) an elimination proof principle for polymorphic variants. The pattern
matching operator and the elimination principle have the same behaviour,
according to the Curry-Howard isomorphism. We assume that the latter will be
used to prove statements without taking care of the computational content of
the proof (proof irrelevance). Thus only the pattern matching operator, and not
the elimination principle, will need to satisfy requirement 1.

The encoding is a careful mix of a deep and a shallow encoding, with
user-provided connections between them. The shallow part of the encoding is
there for code extraction to satisfy requirement 1. In particular, the
shallow encoding of the universe is just an ordinary inductive type, which is
extracted to an algebraic type. The encoding of a polymorphic variant type is
just a $\Sigma$-type whose carrier is the universe, which is extracted
exactly as the universe (so satisfying requirement 2). The deep part of the
encoding allows us to perform dependent computations on the list of fields that
are allowed in a polymorphic variant type. For example, from the list of fields
a dependently typed function can generate the type of the elimination principle.
The links between the two encodings are currently user-provided, but it is
simple to imagine a program that automates the task.

\begin{definition}[Universe encoding]
A parametric universe
$$(\alpha_1,\ldots,\alpha_m) u := K_1:T_1 ~|~ \ldots ~|~ K_n:T_n$$
(where $\alpha_1,\ldots, \alpha_m$ are type variables bound in
$T_1,\ldots,T_n$) is encoded by means of:
\begin{itemize}
 \item Its shallow encoding, the corresponding inductive type
  $$inductive~u~(\alpha_1:Type[0],\ldots,\alpha_m:Type[0]) : Type[0] := K_1:T_1 \to u~\alpha_1~\ldots~\alpha_m ~|~ \ldots ~|~ K_n:T_n \to u~\alpha_1~\ldots~\alpha_m$$
 \item A disjoint sum (an inductive type) of names for the tags of $u$:
  $$inductive~tag\_~ : Type[0] := k_1: tag\_ ~|~ \ldots ~|~ k_n: tag\_$$
 \item An equality test $eq\_tag: tag\_ \to tag\_ \to bool$ that returns true
  iff the two tags are equal. This is boilerplate code that is often
  automated by interactive theorem provers. We write $tag$ for the type
  $tag\_$ coupled with its decidable equality test $eq\_tag$ and
  a user-provided proof that $\forall x,y:tag. eq\_tag~x~y = true \iff x=y$.
 \item A simple function $$tag\_of\_expr: \forall \alpha_1,\ldots,\alpha_m. u~\alpha_1~\ldots~\alpha_m \to tag$$
   that performs a case analysis on the shallow encoding of a universe
   inhabitant and returns the name of the corresponding constructor.
   This boilerplate code is the bridge from the shallow to the deep
   encoding of the universe.
 \item A type former
   $$Q\_holds\_for\_tag: \forall \alpha_1,\ldots,\alpha_m. \forall Q: u~\alpha_1~\ldots~\alpha_m \to Prop. \forall t:tag. Prop$$
   that performs pattern matching over the tag and returns the proof obligation
   for the proof of $Q$ in the case corresponding to the tag.
 \item A proof of exhaustivity of pattern matching over the tag. This
   boilerplate proof is the bridge from the deep to the shallow encoding of the
   universe. The statement is:
   $$Q\_holds\_for\_tag\_spec: \forall \alpha_1,\ldots,\alpha_m. \forall Q: u~\alpha_1~\ldots~\alpha_m \to Prop. \forall x: u~\alpha_1~\ldots~\alpha_m. Q\_holds\_for\_tag~\ldots~Q~\ldots~(tag\_of\_expr~\ldots~x) \to Q~x$$
\end{itemize}
\end{definition}

\begin{example}[The universe for our running example]
The following lines of Matita code implement the universe encoding for our
running example. They constitute the file~\texttt{test.ma}.
\begin{lstlisting}
inductive expr (E: Type[0]) : Type[0] ≝
   Num : nat -> expr E
 | Plus : E -> E -> expr E
 | Mul : E -> E -> expr E.

inductive tag_ : Type[0] := num : tag_ | plus : tag_ | mul : tag_.

definition eq_tag : tag_ -> tag_ -> bool :=
 λt1,t2.
  match t1 with
  [ num => match t2 with [num => true | _ => false]
  | plus => match t2 with [plus => true | _ => false]
  | mul => match t2 with [mul => true | _ => false]].

definition tag : DeqSet ≝ mk_DeqSet tag_ eq_tag ?.
 ** normalize /2/ % #abs destruct
qed.

definition tag_of_expr : ∀E:Type[0]. expr E -> tag :=
 λE,e.
   match e with
   [ Num _ => num
   | Plus _ _ => plus
   | Mul _ _ => mul ].

definition Q_holds_for_tag:
 ∀E:Type[0]. ∀Q: expr E → Prop. ∀t:tag. Prop ≝
 λE,Q,t.
  match t with
    [ num  ⇒ ∀n.   Q (Num … n)
    | plus ⇒ ∀x,y. Q (Plus … x y)
    | mul  ⇒ ∀x,y. Q (Mul … x y)
    ].

theorem Q_holds_for_tag_spec:
 ∀E:Type[0]. ∀Q: expr E → Prop.
  ∀x: expr E. Q_holds_for_tag … Q … (tag_of_expr … x) → Q x.
#E #Q * normalize //
qed.
\end{lstlisting}
\end{example}

We are now ready to lift the universe encoding to the encoding of polymorphic
variant types, constructors, case analysis and the elimination principle.

The idea consists in following quite literally the description of the syntax
of bounded polymorphic variants: a bounded polymorphic variant is deeply
encoded by the list of constructors that are allowed. Row polymorphism is
achieved by quantifying over the list, which can be bounded from above and
below using normal hypotheses. A dependently typed semantic function coerces
the deep encoding to the shallow one, which we will soon describe. We
handle the function as an implicit coercion, so that we may simply write
$i$ to represent both the list of tag names $i$ and the type former which is
the polymorphic variant type described by $i$. The context always allows us to
disambiguate between the two uses and to insert the coercion every time
a list $i$ is used as a type former.
For example, the list $[num;plus]$ for our
running example will represent the polymorphic variant type
$[i_{\geq \pair{[num;plus]}{[num;plus]}}]$ and a function that expects at
least $num$ but that does not handle $mul$ will be typed as either
$f : \forall E. \forall i. [num] \subseteq i \subseteq [num;plus] \to i~E \to \ldots$
or $f: \forall E. \forall i. i \subseteq [plus] \to (num::i)~E \to \ldots$.
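For comparison, OCaml's surface syntax expresses the same two bounds directly in a row type. In the annotation below (our sketch), the row ``[< ... > `Num ]`` reads as $[num] \subseteq i \subseteq [num;plus]$.

```ocaml
(* The argument row must contain at least `Num (the lower bound,
   after ">") and at most {`Num, `Plus} (the listed tags). *)
let describe (e : [< `Num of int | `Plus of int * int > `Num ]) =
  match e with
  | `Num _ -> "a literal"
  | `Plus _ -> "an addition"

let () = assert (describe (`Num 3) = "a literal")
```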

We define the shallow encoding of the bounded polymorphic variant type
specified by a list $i$ of constructors as a $\Sigma$-type whose carrier
is the universe data type $u$ and whose property rules out all constructors
whose tag is not in $i$. Code extraction will simplify the $\Sigma$-type to
its carrier, satisfying both requirements 1 and 2.

\begin{definition}[Polymorphic variant type encoding]
A bounded polymorphic variant $i$ where $i_{\geq \pair{L}{U}}$ is deeply encoded
by a variable $i$ of type list of tags ($i: list~tag$) under the hypothesis
that $L \subseteq i \subseteq U$.
The corresponding shallow encoding is
$\Sigma x:u~\alpha_1~\ldots~\alpha_m. tag\_of\_expr \ldots x \in i$.
An implicit coercion maps $i : list~tag \mapsto \Sigma x:u~\alpha_1~\ldots~\alpha_m. tag\_of\_expr \ldots x \in i$.
\end{definition}

Subtyping is not necessary for polymorphic variants \`a la Garrigue~\cite{???}:
the slogan, borrowed from~\cite{???}, is \emph{structural subtyping as
instantiation}.
Subtyping as instantiation fits very well with the Hindley-Milner type system
because it supports type inference. Since we work with a type system more
expressive than Hindley-Milner, where type inference is undecidable, we are
not restricted to subtyping as instantiation, but can also exploit the
natural subtyping rule between polymorphic variant types~\ref{???}.

\begin{definition}[Subtyping]
There exists a coercion of type
$\forall i_1,i_2: list~tag. i_1 \subseteq i_2 \to \forall \alpha_1,\ldots,\alpha_m. i_1~\alpha_1~\ldots~\alpha_m \to i_2~\alpha_1~\ldots~\alpha_m$
\end{definition}
In the encoding of the universe we required an equality test over tags.
The test is exploited here to implement a computational version of the
$\subseteq$ relation between lists of tags. That is, it is possible to define
$\subseteq$ in such a way that, for closed terms $i_1$ and $i_2$ with $i_1$
included in $i_2$, the proposition $i_1 \subseteq i_2$ reduces to $True$. As we
will see, this suggests a different style where polymorphic variant types are
represented using the smallest number of row quantifications, and type
membership and subtyping are automatically
decided by mere computation, without any user intervention.
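The computable subset test itself is nothing exotic; in OCaml (an illustration of the idea only, since the real test lives inside the type theory) it is just:

```ocaml
(* Tags as a first-class enumeration with decidable equality. *)
type tag = Num | Plus | Mul

(* i1 ⊆ i2 iff every tag of i1 occurs in i2; on closed lists this
   evaluates outright, mirroring how the encoded side conditions
   reduce to True (or False) by mere computation. *)
let subset i1 i2 = List.for_all (fun t -> List.mem t i2) i1

let () =
  assert (subset [Num; Plus] [Num; Plus; Mul]);
  assert (not (subset [Mul] [Num; Plus]))
```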

The subtyping coercion takes apart the inhabitant of the polymorphic variant
$i_1$, obtaining both an inhabitant $x$ of the universe and a proof that its
constructors are all in $i_1$. Then it builds an inhabitant of the polymorphic
variant $i_2$ by coupling $x$ with the proof that all of its constructors are
in $i_2 \supseteq i_1$. The computational content of the coercion is just
the identity function and code extraction erases it: requirement 2 is satisfied.

\begin{definition}[Constructor encoding]
A constructor $K_i : T_i \to u~\alpha_1~\ldots~\alpha_m$ of the universe is
lifted to a polymorphic variant constructor
$T_i \to i~\alpha_1~\ldots~\alpha_m$ when
$\forall x:T_i. tag\_of\_expr (K_i x) \in i$.
The lifting is obtained by inhabiting the sigma type
$\Sigma x:u~\alpha_1~\ldots~\alpha_m. tag\_of\_expr \ldots x \in i$
with a pair whose first component is the applied constructor and whose second
component is the proof of the condition above.
\end{definition}

We can again exploit the equality test over tag types to implement a
computational version of the predicate $tag\_of\_expr (K_i x) \in i$.
Therefore, when $i$ is a closed term, all tests reduce to $True$.
Matita implements a generalized system of coercions that allows us to declare
the lifting of polymorphic variant constructors as an implicit coercion. Since
the coercion requires a proof of the condition, a new proof obligation is
opened. The proof obligation is just $True$ when $i$ is closed.

The computational content of the lifted constructor is just the identity
function: the $n$-th constructor of a polymorphic variant type is represented
after code extraction by the $n$-th constructor of the universe inductive type,
satisfying requirement 1.

\subsection{Simulation (reduction + type checking)}
\subsection{Examples}
The weird function types redone
\subsection{Subtyping as instantiation vs subtyping as safe static cast}
Here we show/discuss how our approach supports both styles at once.
\subsection{Solution to the expression problem, I}
Using subtyping as cast, the file I have produced
\subsection{Solution to the expression problem, II}
Using subtyping as instantiation, comparisons, pros vs cons
\subsection{Negative encoding (??)}
The negative encoding and application to the expression problem
\subsection{Other encodings (??)}
Hints to other possible encodings

\section{Extensible records (??)}

\section{Comparison to related work and alternatives}
\begin{itemize}
 \item Disjoint unions: drawbacks
 \item Encoding the unbounded case: drawbacks
\end{itemize}

\section{Appendix: interface of library functions used to implement everything}

We will explain our technique with a simple running example: a trivial language of arithmetic including numeric literals and two binary operations corresponding to addition and multiplication.
Our language is captured by the following inductive data type:
\begin{lstlisting}
inductive expression (E: Type[0]) : Type[0] :=
 | Num : nat $\rightarrow$ expression E
 | Plus : E $\rightarrow$ E $\rightarrow$ expression E
 | Mul : E $\rightarrow$ E $\rightarrow$ expression E.
\end{lstlisting}
Here, \texttt{expression} is parameterised by a type, \texttt{E}.
Paired with our inductive type representing the AST of natural number expressions, we have a type of tags:
\begin{lstlisting}
inductive tag_ : Type[0] :=
  | num : tag_
  | plus : tag_
  | mul : tag_.
\end{lstlisting}
Tags, and the constructors of the inductive type they are shadowing, are kept in one-to-one correspondence with each other; for each constructor in the \texttt{expression} inductive type we have a single corresponding constructor in the \texttt{tag\_} inductive type, and \emph{vice versa}.

On the \texttt{tag\_} inductive type, we define the `obvious' decidable equality function:
\begin{lstlisting}
definition eq_tag_: tag_ $\rightarrow$ tag_ $\rightarrow$ bool :=
  $\lambda$t1, t2.
    match t1 with
    [ num $\Rightarrow$
      match t2 with
      [ num $\Rightarrow$ true
      | _ $\Rightarrow$ false
      ]
    $\ldots$
    ].
\end{lstlisting}
Using \texttt{eq\_tag\_} we lift the \texttt{tag\_} inductive type into the Matita standard library's \texttt{DeqSet}.
The \texttt{DeqSet} abstraction represents types with a decidable equality.
Many generic functions in the Matita library are defined to work with this abstraction.
The lifting is straightforward:
\begin{lstlisting}
definition tag : DeqSet :=
  mk_DeqSet tag_ eq_tag_ ?.
  * * normalize /2/ %
  #abs destruct
qed.
\end{lstlisting}
Here, \texttt{mk\_DeqSet} is the constructor of the \texttt{DeqSet} record.
By convention in Matita, for a record type named \texttt{Foo}, there exists a constructor for that record type named \texttt{mk\_Foo}.
We pass the constructor three arguments: the type that possesses a decidable equality, in this case \texttt{tag\_}

\bibliography{polymorphic-variants}

\end{document}