\documentclass[smallextended]{svjour3}

\smartqed

\usepackage{amsmath}
\usepackage[english]{babel}
\usepackage[colorlinks]{hyperref}
\usepackage{listings}
\usepackage{microtype}
\usepackage{prooftree}

\newcommand{\cons}{{::}}
\newcommand{\ent}{\vdash}
\newcommand{\fall}[2]{\forall{#1}.#2}
\newcommand{\funarr}{\rightarrow}
\newcommand{\lam}[2]{\lambda{#1}.{#2}}
\newcommand{\letin}[3]{\mathbf{let}\ {#1}\ {:=}\ {#2}\ \mathbf{in}\ {#3}}
\newcommand{\match}[3]{\mathbf{match}\ #1\ \mathbf{with}\ #2\ \Rightarrow #3}
\newcommand{\pair}[2]{\langle #1, #2 \rangle}
\newcommand{\polytag}[1]{{`}#1}
\newcommand{\rulefont}[1]{\ensuremath{(\mathbf{#1})}}

\renewcommand{\verb}{\lstinline}
\def\lstlanguagefiles{lst-grafite.tex}
\lstset{language=Grafite}

\bibliographystyle{spphys}

\author{Dominic P. Mulligan \and Claudio Sacerdoti Coen}
\title{Polymorphic variants in dependent type theory\thanks{The project CerCo acknowledges the financial support of the Future and Emerging Technologies (FET) programme within the Seventh Framework Programme for Research of the European Commission, under FET-Open grant number: 243881.}}

\institute{
Dominic P. Mulligan \at
Computer Laboratory,\\
University of Cambridge.
\email{dominic.p.mulligan@gmail.com} \and
Claudio Sacerdoti Coen \at
Dipartimento di Scienze dell'Informazione,\\
Universit\`a di Bologna.
\email{sacerdot@cs.unibo.it}
}

\begin{document}

\maketitle

\begin{abstract}

Big long abstract introducing the work

\keywords{Polymorphic variants \and dependent type theory \and Matita theorem prover}

\end{abstract}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Section
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Introduction}
\label{sect.introduction}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Section
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Polymorphic variants}
\label{sect.polymorphic.variants}

In this section we provide a self-contained \emph{pr\'ecis} of polymorphic variants for the unfamiliar reader.
Readers wishing for a more complete introduction to the subject are referred to Garrigue's original publications~\cite{garrigue:programming:1998,garrigue:code:2000}.

Most mainstream functional programming languages, such as OCaml and Haskell, provide mechanisms for defining new inductive types from old through algebraic data type definitions.
Each algebraic data type may be described as a sum-of-products: the programmer provides a fixed number of distinct \emph{constructors}, each of which expects a (potentially empty) product of arguments.
Values of a given inductive data type are built using that type's constructors.
Quotidian data structures---such as lists, trees, heaps, zippers, and so forth---can all be introduced using this now familiar mechanism.

Having built data from various combinations of constructors, to complete the picture we need some facility for taking said data apart again, in order to define functions that operate over that data.
Functional languages almost uniformly employ \emph{pattern matching} for this task.
Given any inhabitant of an algebraic data type, by the aforementioned sum-of-products property we know that it must consist of some constructor applied to various arguments.
Using pattern matching we can therefore deconstruct algebraic data by performing a case analysis on the constructors of a given type.

The combination of algebraic data types and pattern matching is powerful, and is arguably the main branching mechanism in most functional programming languages.
Furthermore, using pattern matching it is easy to define new functions that consume algebraic data---the set of operations that can be defined over any given algebraic type is practically without bound.

Unfortunately, in practical programming we often want to extend a previously defined algebraic data type with additional constructors.
Here we hit a problem: algebraic data types are `closed'.
To circumvent this restriction, we must introduce a new algebraic type containing the additional constructor, lifting the old type---and any functions defined over it---into the new one.
This manual extension, coupled with the lifting of any existing functions into the new type, is cumbersome and error-prone.
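
Polymorphic variants were designed precisely to lift this restriction. As a taste, the following OCaml sketch (all names here are ours, purely illustrative) shows an evaluator written with open recursion over polymorphic variant tags surviving the addition of a new constructor with no change to the original code:

```ocaml
(* Base tags, with the recursive occurrences left open as 'a. *)
type 'a base = [ `Lit of int | `Add of 'a * 'a ]

(* The evaluator abstracts over the recursive call. *)
let eval_base (recur : 'a -> int) : 'a base -> int = function
  | `Lit i -> i
  | `Add (l, r) -> recur l + recur r

(* Extension with `Mul reuses [eval_base] verbatim: no new wrapper
   type, no lifting of old values into it. *)
type 'a ext = [ 'a base | `Mul of 'a * 'a ]

let eval_ext (recur : 'a -> int) : 'a ext -> int = function
  | `Mul (l, r) -> recur l * recur r
  | #base as t -> eval_base recur t

(* Tie the recursive knot over the extended tag set. *)
type closed = closed ext
let rec eval (t : closed) : int = eval_ext eval t

let () = assert (eval (`Mul (`Add (`Lit 2, `Lit 3), `Lit 4)) = 20)
```

The old code (\texttt{eval\_base}) is consumed unchanged by the extended evaluator; only the final knot-tying mentions the full set of tags.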

\begin{figure}[ht]
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}[language=Haskell]
data Term
  = Lit Int
  | Add Term Term

eval :: Term -> Int
eval (Lit i) = i
eval (Add l r) = eval l + eval r
\end{lstlisting}
\end{minipage}
\hspace{0.5cm}
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}[language=Java]
interface Term {
  int eval();
}

class Lit implements Term {
  private int v = 0;
  ...
  int eval() { return v; }
}

class Add implements Term {
  private Term left, right = null;
  ...
  int eval() {
    return left.eval() + right.eval();
  }
}
\end{lstlisting}
\end{minipage}
\caption{A simple language of integer arithmetic embedded as an algebraic data type and as a class hierarchy.}
\label{fig.pattern-matching.vs.oop}
\end{figure}

However, we can look abroad to see how other families of programming languages solve this problem.
For instance, we can compare and contrast functional programming languages' use of algebraic data types and pattern matching with the approach taken by object-oriented languages (see Figure~\ref{fig.pattern-matching.vs.oop} for a concrete example).
In mainstream, class-based object-oriented languages---such as Java---an algebraic data type can be emulated by an interface, coupled with a number of subclasses implementing that `root' interface, one per constructor.
The base interface of the object hierarchy specifies the permitted operations defined for the type.
Branching by pattern matching is emulated using the language's dynamic dispatch mechanism.

As all permitted operations on the type are fixed in advance by the interface, it is hard to enlarge the set of operations defined over a given type without altering the entire class hierarchy: if the interface changes, so must every class implementing it.
However, note that it is easy to extend the hierarchy to new cases---corresponding to the introduction of a new constructor in the functional world---by merely adding another class, implementing the interface, for that constructor.

\begin{figure}[ht]
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}[language=Haskell]
...
data Term'
  = Lift Term
  | Mul Term' Term'

eval' :: Term' -> Int
eval' (Lift t) = eval t
eval' (Mul l r) = eval' l * eval' r
\end{lstlisting}
\end{minipage}
\hspace{0.5cm}
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}[language=Java]
...
class Mul implements Term {
  private Term left, right = null;
  ...
  int eval() {
    return left.eval() * right.eval();
  }
}
\end{lstlisting}
\end{minipage}
\caption{Extending our simple language of arithmetic with a new grammatical element.}
\label{fig.extending.base.type}
\end{figure}

\begin{figure}[ht]
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}[language=Haskell]
...
show :: Term -> String
show (Lit i) = Prelude.show i
show (Add l r) =
  show l ++ " + " ++ show r
\end{lstlisting}
\end{minipage}
\hspace{0.5cm}
\begin{minipage}[b]{0.45\linewidth}
\begin{lstlisting}[language=Java]
interface Term {
  int eval();
  String show();
}

class Lit implements Term {
  private int v = 0;
  ...
  int eval() { return v; }
  String show() {
    return Integer.toString(v);
  }
}

class Add implements Term {
  private Term left, right = null;
  ...
  int eval() {
    return left.eval() + right.eval();
  }
  ...
  String show() {
    return left.show() +
      " + " + right.show();
  }
}
\end{lstlisting}
\end{minipage}
\caption{Extending the permitted operations over our simple language of arithmetic.}
\label{fig.extending.operations}
\end{figure}
\begin{itemize}
\item General introduction, motivations
\item Bounded vs not-bounded.
\end{itemize}

\begin{definition}
\label{defn.garrigues.calculus}
Inductively define a
\end{definition}

\begin{figure}
\begin{gather*}
\begin{prooftree}
(\Gamma(x) = \sigma)
\justifies
K; \Gamma \ent x : \sigma
\using\rulefont{{\Gamma}x}
\end{prooftree}
\qquad
\begin{prooftree}
K; \Gamma, x : \phi \ent r : \psi
\justifies
K; \Gamma \ent \lam{x}r : \phi \funarr \psi
\using\rulefont{{\Gamma}\lambda}
\end{prooftree}
\qquad
\begin{prooftree}
K;\Gamma \ent r : \phi \funarr \psi \quad K;\Gamma \ent s : \phi
\justifies
K;\Gamma \ent rs : \psi
\using\rulefont{{\Gamma}rs}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K; \Gamma \ent r : \sigma \quad K; \Gamma, x:\sigma \ent s : \psi
\justifies
K; \Gamma \ent \letin{x}{r}{s} : \psi
\using\rulefont{{\Gamma}{:=}}
\end{prooftree}
\qquad
\begin{prooftree}
K;\Gamma \ent r : \sigma \quad (a \not\in ftv(\Gamma))
\justifies
K;\Gamma \ent r : \fall{a}{\sigma}
\using\rulefont{{\Gamma}{\forall}{a}}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K;\Gamma \ent r : \fall{a}{\sigma} \quad K;\Gamma \ent s : \psi
\justifies
K;\Gamma \ent rs : \sigma[a := \psi]
\using\rulefont{{\Gamma}inst{a}}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
(L \subseteq L' \quad U \subseteq U')
\justifies
K \oplus i_{\geq \pair{L}{U}} \ent i \geq \pair{L'}{U'}
\using\rulefont{{K}i}
\end{prooftree}
\qquad
\begin{prooftree}
K;\Gamma \ent r : \phi \quad K \ent i \geq \pair{tag}{\top}
\justifies
K;\Gamma \ent \polytag{tag(r)} : [ i \mid tag : \phi \cons T ]
\using\rulefont{{K}variant}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K;\Gamma \ent r : [ i \mid \{ tag_k : \phi_k \}^n_1 \cons T ] \quad K \ent i \geq \pair{\bot}{\{tag_k\}^n_1} \quad
K;\Gamma, x_k : \phi_k \ent r_k : \psi \quad (1 \leq k \leq n)
\justifies
K;\Gamma \ent \mathbf{match}\ r\ \mathbf{with}\ \{ \polytag{tag_k(x_k)} \Rightarrow r_k \}^n_1 : \psi
\using\rulefont{{K}match}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K;\Gamma \ent r : \sigma \quad (\rho \not\in ftv(\Gamma))
\justifies
K;\Gamma \ent r : \fall{\rho}{\sigma}
\using\rulefont{{K}{\forall}{\rho}}
\end{prooftree}
\qquad
\begin{prooftree}
K;\Gamma \ent r : \fall{\rho}{\sigma} \quad K;\Gamma \ent s : T
\justifies
K;\Gamma \ent rs : \sigma[\rho := T]
\using\rulefont{{K}inst{\rho}}
\end{prooftree}
\\[1.5ex]
\begin{prooftree}
K \oplus i_{\geq \pair{L}{U}};\Gamma \ent r : \sigma
\justifies
K;\Gamma \ent r : \fall{i_{\geq \pair{L}{U}}}{\sigma}
\using\rulefont{{K}{\forall}{i}}
\end{prooftree}
\qquad
\begin{prooftree}
K;\Gamma \ent r : \fall{i_{\geq \pair{L}{U}}}{\sigma} \quad K \ent i' \geq \pair{L}{U}
\justifies
K;\Gamma \ent r : \sigma[i := i']
\using\rulefont{{K}inst{i}}
\end{prooftree}
\end{gather*}
\caption{Garrigue's typing relation for polymorphic variants.}
\label{fig.garrigue.typing.relation}
\end{figure}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Section
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\subsection{Matita}
\label{subsect.matita}

Matita~\cite{asperti:user:2007} is a dependently-typed proof assistant developed in Bologna, implementing the Calculus of (Co)inductive Constructions, a type theory similar to that of Coq.
Throughout this paper, all running examples are provided in Matita's proof script vernacular.
This vernacular, similar in syntax to that of OCaml and Coq, should be mostly straightforward to read.

One possible source of confusion is our use of $?$ and $\ldots$, which denote a single term and a sequence of terms, respectively, each to be inferred automatically by unification.
Any terms that cannot be inferred by unification from the surrounding context are left for the user to provide as proof obligations.
\subsection{Subtyping as instantiation vs subtyping as safe static cast}

\subsection{Syntax and type checking rules}
The ones of Garrigue + casts, but also for the bounded case?
Casts support both styles of subtyping.

\subsection{Examples}
The weird function types that only work in subtyping as instantiation

\subsection{Solution to the expression problem}
Our running example in pseudo-OCaml syntax
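
In real OCaml, the running example of Section~\ref{sect.polymorphic.variants} admits the following sketch (names are ours, chosen to mirror the running example): both axes of the expression problem are covered, since a new operation over the same tags is added without touching the existing evaluator or any type declaration.

```ocaml
(* Evaluation over the three tags of the running example. *)
let rec eval = function
  | `Num i -> i
  | `Plus (l, r) -> eval l + eval r
  | `Mul (l, r) -> eval l * eval r

(* A second operation over the same tags, added later and entirely
   separately: no declaration is modified, nothing is lifted. *)
let rec show = function
  | `Num i -> string_of_int i
  | `Plus (l, r) -> show l ^ " + " ^ show r
  | `Mul (l, r) -> show l ^ " * " ^ show r

let t = `Mul (`Plus (`Num 1, `Num 2), `Num 3)

let () =
  assert (eval t = 9);
  assert (show t = "1 + 2 * 3")
```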

\section{Bounded polymorphic variants via dependent types}
The topic of this paper, and of this section in particular, is an
\emph{encoding} of a subclass of polymorphic variants into dependent type
theory. The final aim is to provide a solution to the expression problem
that is implementable in user space in interactive theorem provers
based on dependent type theories, in particular Coq, Agda and Matita.
Moreover, we are only interested in an encoding that satisfies a number
of constraints, listed below. Essentially, the constraints ask that code
automatically extracted from the system can be compiled efficiently
without any change to the extraction machinery. The encoding has been tested
in the interactive theorem prover Matita, but we believe that it could be
easily ported to the other systems cited.

\subsection{Code extraction}
Since our requirements have code extraction in mind, we briefly recall here
what code extraction is in our context.

Code extraction, as implemented in Coq, Matita and other systems, is the
procedure that, given a proof term, extracts a program in a functional
programming language implementing the computational content of that proof
term. In particular, all proof parts that only establish correctness of the
answer, totality of functions, etc.\ are erased. For example, given
a precondition $P$ and a postcondition $Q$, from a
constructive proof of a statement $\forall x:A.\, P(x) \to \exists y:B.\, Q(x,y)$
one extracts a (partial) function $f$ from the encoding of $A$ to the encoding
of $B$ that satisfies $\forall x.\, P(x) \to Q(x,f(x))$.
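
As a concrete illustration (an assumed example, not the output of any actual extraction run): from a constructive proof of $\forall n.\, \exists q\, r.\ n = 3q + r \wedge r < 3$, extraction would keep only the computation of the witnesses, producing something like the following OCaml function, with the proofs of the two propositional conjuncts erased.

```ocaml
(* Hypothetical extracted witness computation for
   forall n. exists q r. n = 3*q + r /\ r < 3
   (assuming n >= 0, the erased precondition). Only the (q, r)
   computation survives extraction; the equations do not. *)
let rec divmod3 (n : int) : int * int =
  if n < 3 then (0, n)
  else let (q, r) = divmod3 (n - 3) in (q + 1, r)

(* The erased postcondition still holds of the extracted function. *)
let () =
  let (q, r) = divmod3 14 in
  assert (14 = 3 * q + r && r < 3)
```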

Code extraction, as implemented by the above systems, does not attempt any
optimization of the representation of data types. The only modification allowed
is that parts of the data that are propositional, and thus bear no content
(being isomorphic to the unit type), are simplified to the unit type and
possibly completely erased using the categorical properties of unit types
($1 \times T \simeq T$, $1 \to T \simeq T$, etc.). For more information, one can
read~\cite{berardi,christine-mohring,letouzey}.

Thus, if a polymorphic variant is encoded in type theory as an $n$-ary sum of
$n$-ary products, these being inefficiently represented using binary sums and
products, the same inefficiency carries over to the extracted code.

\subsection{The requirements}

We impose the following requirements:
\begin{enumerate}
\item The representation of a polymorphic variant type in the extracted code
should be as efficient, in memory and time, as the isomorphic inductive
type. Efficiency here is to be considered up to constants, not
asymptotically.
\item Polymorphic types in the subtyping relation should be encoded using
data types that, once extracted, are in the subtyping relation too.
Before code extraction, instead, it is permitted to use explicit non-identity
functions to coerce inhabitants of the sub-type into the super-type.
What matters is that the computational content of these coercions is
the identity.
\end{enumerate}
The first requirement prevents the encoding of polymorphic variants
using binary sums and products. For example, a constructor of an inductive
type with four constructors requires one memory cell, but the same constructor
for a binary sum of binary sums requires two memory cells, and twice the time
to inhabit the data type. The solution is to force the code extracted from the
polymorphic variant $\tau$ to be an inductive type. In turn, this forces the
encoding of polymorphic variants to be based on inductive types.
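
The memory-cell argument can be made concrete with a small OCaml sketch (all names ours, purely illustrative): a four-constructor type represented directly, versus the same four cases as a binary sum of binary sums.

```ocaml
(* The flat inductive type: each value is one constructor block
   (tag plus payload). *)
type flat = A of int | B of int | C of int | D of int

(* The same four cases as a binary sum of binary sums: each value now
   carries two nested constructor blocks, doubling the allocations
   and the matching work. *)
type ('a, 'b) sum = Left of 'a | Right of 'b
type nested = ((int, int) sum, (int, int) sum) sum

let to_nested : flat -> nested = function
  | A n -> Left (Left n)
  | B n -> Left (Right n)
  | C n -> Right (Left n)
  | D n -> Right (Right n)

let () = assert (to_nested (C 5) = Right (Left 5))
```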

The second requirement is in tension with the possibility of extending a
polymorphic variant type: when a type $\tau$ is extended to $\tau + a:T$,
an inhabitant of the extraction of $\tau$ should already be an inhabitant of
the extraction of $\tau + a:T$. This would be possible if polymorphic variants
were extracted to polymorphic variants in the target language, but this is not
actually the case in any of the considered systems (e.g. Coq or Matita).
Indeed, because of the first requirement, $\tau$ and $\tau + a:T$ are to be
extracted to two inductive types. The only way to obtain subtyping is to
use the same representation for both from the start: we need to anticipate,
or \emph{bound}, the extension.

Thus we accept the following limitation, together with an additional requirement
to mitigate it:
\begin{enumerate}
\item[3] We only encode \emph{bounded polymorphic variants}, whose typing
rules are given in Figure~\ref{???}.
\item[4] Bounded polymorphic variants are characterized by a universe, i.e.
a set of fields together with their types. Every bounded polymorphic variant
is a sub-variant of the universe, obtained by selecting only certain fields.
We ask that adding a new field to the universe have no impact on
pre-existing code that does not use it. More precisely, it should be possible
to add a field to the universe and recompile all pre-existing developments
without further modifications.
\end{enumerate}

\subsection{Discussion of the requirements and alternative approaches}
The set of requirements we have imposed drastically restricts the set of possible
encodings. Moreover, the requirements clearly represent a trade-off between
flexibility and efficiency. We discuss here alternative approaches based on
different sets of requirements.

Maximal flexibility is only obtained by dropping the boundedness limitation.
One of the benefits of open polymorphic variants is separate compilation:
the developers of libraries can use their own sets of constructors, and the
user can freely mix them when the types of the different fields are coherent.
Moreover, separate compilation can be made efficient if one accepts that
correct programs can be rejected, with very low probability, at linking
time~\cite{guarriguexxx}.

Requirement 4 has been imposed to recover some of the flexibility of open
variants. Suppose that every library starts with a file that declares its own
universe. Then, to mix two libraries, one needs to merge the two universe
declarations, which is only possible if the universes are coherent. Once this is
done, the library files will keep compiling, because of the requirement. What
is irremediably lost is separate compilation. We believe that this loss can
be withstood in many practical situations, because 1) merging different
libraries that need to share at least some constructors is rare, and needs to
be done just once; and 2) the addition of new fields to the universe of a library
under development usually requires a significant effort to fix the library
itself, which should make the recompilation time of the other libraries
negligible by comparison.

If separate compilation is considered fundamental, then the only possible
approach seems to be that of introducing a syntactic data type (a universe) of
descriptions of polymorphic variant types, which is then interpreted by
dependent functions (type formers) into the appropriate sums of types.
The problem, generalized to that of giving a universe to the full syntax of
inductive types, is being studied by the groups of Altenkirch and
Ghani~\cite{aa,bb,cc,dd}. This approach, while mathematically
very elegant, is in conflict with requirements 1 and 2. The conflict can be
solved by devising an ad-hoc extraction procedure targeting a programming
language that already implements polymorphic variants efficiently. As far as
we know, this has not been done yet. The next generation of Epigram, however,
is supposed to be implemented around this idea: instead of an ad-hoc extraction
procedure, Epigram will be compiled directly to executable code, optimizing the
poor encodings generated by the syntactic universe.

\subsection{The encoding}
In order to encode bounded polymorphic variants in dependent type theory,
we encode 1) the universe; 2) the polymorphic variant types; 3) the
constructors of the polymorphic variant type; 4) the pattern matching operator;
and 5) an elimination proof principle for polymorphic variants. The pattern
matching operator and the elimination principle have the same behaviour,
according to the Curry-Howard isomorphism. We assume that the latter will be
used to prove statements without taking care of the computational content of
the proof (proof irrelevance). Thus only the pattern matching operator, and not
the elimination principle, needs to satisfy requirement 1.

The encoding is a careful mix of a deep and a shallow encoding, with
user-provided connections between them. The shallow part of the encoding exists
so that code extraction satisfies requirement 1. In particular, the
shallow encoding of the universe is just an ordinary inductive type, which is
extracted to an algebraic type. The encoding of a polymorphic variant type is
just a $\Sigma$-type whose carrier is the universe, which is extracted
exactly as the universe is (so satisfying requirement 2). The deep part of the
encoding allows us to perform dependent computations on the list of fields that
are allowed in a polymorphic variant type. For example, from the list of fields
a dependently-typed function can generate the type of the elimination principle.
The links between the two encodings are currently user-provided, but it is
simple to imagine a program that automates the task.

\begin{definition}[Universe encoding]
A parametric universe $(\alpha_1,\ldots,\alpha_m) u := K_1:T_1 ~|~ \ldots ~|~ K_n:T_n$ (where $\alpha_1,\ldots, \alpha_m$ are type variables bound in
$T_1,\ldots,T_n$) is encoded by means of:
\begin{itemize}
\item Its shallow encoding, the corresponding inductive type
$$\mathit{inductive}~u~(\alpha_1:Type[0],\ldots,\alpha_m:Type[0]) : Type[0] :=
K_1:T_1 \to u~\alpha_1~\ldots~\alpha_m ~|~ \ldots ~|~ K_n:T_n \to u~\alpha_1~\ldots~\alpha_m$$
\item A disjoint sum (an inductive type) of names for the tags of $u$:
$$\mathit{inductive}~tag\_ : Type[0] := k_1: tag\_ ~|~ \ldots ~|~ k_n: tag\_$$
\item An equality test $eq\_tag: tag\_ \to tag\_ \to bool$ that returns true
iff the two tags are the same. This is boilerplate code that is often
automated by interactive theorem provers. We write $tag$ for the type
$tag\_$ coupled with its decidable equality test $eq\_tag$; this is just
a minor technicality.
\item A simple function $$tag\_of\_expr: \forall \alpha_1,\ldots,\alpha_m.\
u~\alpha_1 \ldots \alpha_m \to tag$$
that performs a case analysis on the shallow encoding of a universe
inhabitant and returns the name of the corresponding constructor.
This boilerplate code is the bridge from the shallow to the deep
encoding of the universe.
\item A dual function that is the bridge from the deep to the shallow encoding.
\end{itemize}
\end{definition}
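
For intuition, the shallow/deep pair and the shallow-to-deep bridge can be rendered in plain OCaml for the running example universe (constructor and tag names are ours, purely illustrative; the actual development is in Matita):

```ocaml
(* Shallow side: an ordinary variant, extracted as-is. *)
type 'e expr = Num of int | Plus of 'e * 'e | Mul of 'e * 'e

(* Deep side: a type of names for the tags of the universe. *)
type tag = TNum | TPlus | TMul

(* The bridge from the shallow to the deep encoding: case analysis
   returning the name of the head constructor. *)
let tag_of_expr : 'e expr -> tag = function
  | Num _ -> TNum
  | Plus _ -> TPlus
  | Mul _ -> TMul

(* The decidable equality test on tags. *)
let eq_tag (t1 : tag) (t2 : tag) : bool = t1 = t2

let () =
  assert (tag_of_expr (Plus (Num 1, Num 2)) = TPlus);
  assert (eq_tag TNum TNum && not (eq_tag TNum TMul))
```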

\subsection{Simulation (reduction + type checking)}
\subsection{Examples}
The weird function types redone
\subsection{Subtyping as instantiation vs subtyping as safe static cast}
Here we show/discuss how our approach supports both styles at once.
\subsection{Solution to the expression problem, I}
Using subtyping as cast, the file I have produced
\subsection{Solution to the expression problem, II}
Using subtyping as instantiation, comparisons, pros vs cons
\subsection{Negative encoding (??)}
The negative encoding and application to the expression problem
\subsection{Other encodings (??)}
Hints to other possible encodings

\section{Extensible records (??)}

\section{Comparison to related work and alternatives}
\begin{itemize}
\item Disjoint unions: drawbacks
\item Encoding the unbounded case: drawbacks
\end{itemize}

\section{Appendix: interface of library functions used to implement everything}

We will explain our technique with a simple running example: a trivial language of arithmetic including numeric literals and two binary operations corresponding to addition and multiplication.
Our language is captured by the following inductive data type:
\begin{lstlisting}
inductive expression (E: Type[0]) : Type[0] :=
  | Num : nat $\rightarrow$ expression E
  | Plus : E $\rightarrow$ E $\rightarrow$ expression E
  | Mul : E $\rightarrow$ E $\rightarrow$ expression E.
\end{lstlisting}
Here, \texttt{expression} is parameterised by a type, \texttt{E}.
We will say more about this later.
Paired with our inductive type representing the AST of natural number expressions, we have a type of tags:
\begin{lstlisting}
inductive tag_ : Type[0] :=
  | num : tag_
  | plus : tag_
  | mul : tag_.
\end{lstlisting}
Tags, and the constructors of the inductive type they shadow, are kept in one-to-one correspondence: for each constructor of the \texttt{expression} inductive type we have a single corresponding constructor of the \texttt{tag\_} inductive type, and \emph{vice versa}.

On the \texttt{tag\_} inductive type, we define the `obvious' decidable equality function:
\begin{lstlisting}
definition eq_tag_ : tag_ $\rightarrow$ tag_ $\rightarrow$ bool :=
  $\lambda$t1, t2.
    match t1 with
    [ num $\Rightarrow$
      match t2 with
      [ num $\Rightarrow$ true
      | _ $\Rightarrow$ false
      ]
    $\ldots$
    ].
\end{lstlisting}
Using \texttt{eq\_tag\_} we lift the \texttt{tag\_} inductive type into the Matita standard library's \texttt{DeqSet}.
The \texttt{DeqSet} abstraction represents types with a decidable equality.
Many generic functions in the Matita library are defined to work with this abstraction.
The lifting is straightforward:
\begin{lstlisting}
definition tag : DeqSet :=
  mk_DeqSet tag_ eq_tag_ ?.
  * * normalize /2/ %
  #abs destruct
qed.
\end{lstlisting}
Here, \texttt{mk\_DeqSet} is the constructor of the \texttt{DeqSet} record.
By convention in Matita, a record type named \texttt{Foo} has a constructor named \texttt{mk\_Foo}.
We pass the constructor three arguments: the type that possesses a decidable equality, in this case \texttt{tag\_}

\bibliography{polymorphic-variants}

\end{document}