\documentclass{amsart}
\usepackage{amssymb}
\usepackage[all]{xy}
\usepackage{verbatim}
\usepackage{ifthen}
\usepackage{xargs}
\usepackage{bussproofs}
\providecommand\WarningsAreErrors{false}
\ifthenelse{\equal{\WarningsAreErrors}{true}}{\renewcommand{\GenericWarning}[2]{\GenericError{#1}{#2}{}{This warning has been turned into a fatal error.}}}{}
\newcommand{\newref}[4][]{
\ifthenelse{\equal{#1}{}}{\newtheorem{h#2}[hthm]{#4}}{\newtheorem{h#2}{#4}[#1]}
\expandafter\newcommand\csname r#2\endcsname[1]{#3~\ref{#2:##1}}
\expandafter\newcommand\csname R#2\endcsname[1]{#4~\ref{#2:##1}}
\newenvironmentx{#2}[2][1=,2=]{
\ifthenelse{\equal{##2}{}}{\begin{h#2}}{\begin{h#2}[##2]}
\ifthenelse{\equal{##1}{}}{}{\label{#2:##1}}
}{\end{h#2}}
}
\newref[section]{thm}{Theorem}{Theorem}
\newref{lem}{Lemma}{Lemma}
\newref{prop}{Proposition}{Proposition}
\newref{cor}{Corollary}{Corollary}
\theoremstyle{definition}
\newref{defn}{Definition}{Definition}
\newref{example}{Example}{Example}
\theoremstyle{remark}
\newref{remark}{Remark}{Remark}
\newcommand{\cat}[1]{\mathbf{#1}}
\newcommand{\C}{\cat{C}}
\newcommand{\sSet}{\cat{sSet}}
\newcommand{\bs}{\beta\sigma}
\newcommand{\red}{\Rightarrow}
\newcommand{\elim}{\Leftarrow}
\newcommand{\deq}{\Leftrightarrow}
\numberwithin{figure}{section}
\begin{document}
\title{Homotopy Type Theory with an interval type}
\author{Valery Isaev}
% \begin{abstract}
% Abstract
% \end{abstract}
\maketitle
\section{Introduction}
We define a version of homotopy type theory based on a notion of the interval type.
We give a simple formulation of the univalence axiom and describe a few extensions of the system.
The first extension is called data types with conditions.
Data types with conditions generalize ordinary data types, allowing us to define not only coproducts of types, but also pushouts of certain functions (which we can call cofibrations).
Using data types with conditions, we will show how to define higher inductive types which have a simple description of elimination principles.
The second extension is called records with conditions, and it is dual to the first one.
Using this extension, we can define path types for which the usual elimination principle J holds.
\section{Syntax}
In this section, we will describe the inference rules of the basic system, which has $\Pi$-types and path types.
We denote by $\deq$ the computational equality, that is, the reflexive, symmetric, and transitive closure of the union of $\deq_\eta$ and $\red$, where $\red$ is the union of $\red_\beta$ and $\red_\sigma$.
Inference and reduction rules are defined in table~\ref{table:inf-rules}.
The main ingredient of the system is the interval type $I$, which allows us to give a simple computational description of path types, univalence, and higher inductive types.
The interval type has two constructors ($left$ and $right$) and one elimination rule ($coe$).
The behaviour of $coe$ can be described as follows:
given a fibration $\lambda x \red A$ over $I$ and a point $a$ in the fibre over $left$, $\lambda i \red coe_{\lambda x \red A}\ a\ i$ gives us a section of this fibration.
Path types are heterogeneous, that is, we can construct paths between points of two different types (which are themselves connected by a path).
We write $a =_A a'$ (or simply $a = a'$) for usual homogeneous paths, which are defined as $Path\ (\lambda i \red A)\ a\ a'$.
A path between $a$ and $a'$ over $\lambda x \red A$ is a function $p : (x : I) \to A$ such that $p\ left \deq a$ and $p\ right \deq a'$.
Nevertheless, we can still define the usual operations on path types using these constructions.
\begin{table}
\medskip
\begin{center}
\AxiomC{}
\UnaryInfC{$\varnothing \vdash$}
\DisplayProof
\quad
\AxiomC{$\Gamma \vdash A$}
\RightLabel{, $x \notin \Gamma$}
\UnaryInfC{$\Gamma, x : A \vdash$}
\DisplayProof
\quad
\AxiomC{$\Gamma \vdash$}
\RightLabel{, $x : A \in \Gamma$}
\UnaryInfC{$\Gamma \vdash x : A$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma \vdash a : A$}
\AxiomC{$\Gamma \vdash B$}
\RightLabel{, $A \deq B$}
\BinaryInfC{$\Gamma \vdash a : B$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma \vdash A$}
\AxiomC{$\Gamma, x : A \vdash B$}
\BinaryInfC{$\Gamma \vdash (x : A) \to B$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma, x : A \vdash b : B$}
\UnaryInfC{$\Gamma \vdash \lambda x \red b : (x : A) \to B$}
\DisplayProof
\quad
\AxiomC{$\Gamma \vdash f : (x : A) \to B$}
\AxiomC{$\Gamma \vdash a : A$}
\BinaryInfC{$\Gamma \vdash f\ a : B[x := a]$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma \vdash$}
\UnaryInfC{$\Gamma \vdash I$}
\DisplayProof
\quad
\AxiomC{$\Gamma \vdash$}
\UnaryInfC{$\Gamma \vdash left : I$}
\DisplayProof
\quad
\AxiomC{$\Gamma \vdash$}
\UnaryInfC{$\Gamma \vdash right : I$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma, x : I \vdash A$}
\AxiomC{$\Gamma \vdash a : A[x := left]$}
\AxiomC{$\Gamma \vdash j : I$}
\TrinaryInfC{$\Gamma \vdash coe_{\lambda x \red A}\ a\ j : A[x := j]$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma, x : I \vdash A$}
\AxiomC{$\Gamma \vdash a : A[x := left]$}
\AxiomC{$\Gamma \vdash a' : A[x := right]$}
\TrinaryInfC{$\Gamma \vdash Path\ (\lambda x \red A)\ a\ a'$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma, x : I \vdash a : A$}
\UnaryInfC{$\Gamma \vdash path\ (\lambda x \red a) : Path\ (\lambda x \red A)\ a[x := left]\ a[x := right]$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\AxiomC{$\Gamma \vdash p : Path\ (\lambda x \red A)\ a\ a'$}
\AxiomC{$\Gamma \vdash i : I$}
\BinaryInfC{$\Gamma \vdash p\ @_{a,a'}\ i : A[x := i]$}
\DisplayProof
\end{center}
\begin{align*}
& (\lambda x \red b)\ a \red_\beta b[x := a] \\
& coe_{\lambda k \red A}\ a\ left \red_\beta a \\
& coe_{\lambda k \red A}\ a\ i \red_\sigma a \text{ if } k \notin FV(A) \\
& path\ (\lambda x \red t)\ @_{a,a'}\ i \red_\beta t[x := i] \\
& p\ @_{a,a'}\ left \red_\beta a \\
& p\ @_{a,a'}\ right \red_\beta a' \\
& (\lambda x \red f\ x) \deq_\eta f \text{ if } x \notin FV(f) \\
& path\ (\lambda x \red p\ @\ x) \deq_\eta p \text{ if } x \notin FV(p)
\end{align*}
\caption{Inference and reduction rules.}
\label{table:inf-rules}
\end{table}
The identity path (reflexivity) is the constant function $\lambda i \red x$.
\begin{align*}
& idp : (x : A) \to x = x \\
& idp = \lambda x \red path\ (\lambda i \red x)
\end{align*}
The path map (congruence) is defined in terms of function composition.
If we have a function $p : I \to A$ and a function $f : A \to B$, then $f \circ p : I \to B$ is a path between $f\ (p\ left)$ and $f\ (p\ right)$.
\begin{align*}
& pmap : (f : A \to B) \to (a\ a' : A) \to a = a' \to f\ a = f\ a' \\
& pmap = \lambda f\ a\ a'\ p \red path\ (\lambda i \red f\ (p\ @\ i))
\end{align*}
It is well known that functional extensionality can be defined in terms of an interval type.
\begin{align*}
& funExt : (f\ g : (a : A) \to B[x := a]) \to ((a : A) \to f\ a = g\ a) \to f = g \\
& funExt = \lambda f\ g\ p \red path\ (\lambda i\ a \red p\ a\ @\ i)
\end{align*}
Transporting along a path (substitution) is defined as a coercion along a path in types:
\begin{align*}
& transport : (a\ a' : A) \to a = a' \to B[x := a] \to B[x := a'] \\
& transport = \lambda a\ a'\ p\ b \red coe_{\lambda i \red B[x := p\ @\ i]}\ b\ right
\end{align*}
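As a sanity check (a short derivation using only the rules of table~\ref{table:inf-rules} and unfolding of definitions), transporting along the identity path does nothing up to computational equality:
\begin{align*}
& transport\ a\ a\ (idp\ a)\ b = coe_{\lambda i \red B[x := idp\ a\ @\ i]}\ b\ right \\
& \indent \deq coe_{\lambda i \red B[x := a]}\ b\ right \red_\sigma b
\end{align*}
since $idp\ a\ @\ i$ reduces to $a$, after which $i$ no longer occurs freely in the type and the $\sigma$-rule applies.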
The definition of $J$ is a little bit more complicated.
First, we need to define a function $squeeze$, satisfying the following rules:
\begin{align*}
& squeeze : I \to I \to I \\
& squeeze\ left\ j \red left \\
& squeeze\ right\ j \red j \\
& squeeze\ i\ left \red left \\
& squeeze\ i\ right \red i
\end{align*}
The idea is that for each point $i : I$ of the interval, the function $squeeze\ i : I \to I$ maps $left$ to $left$ and $right$ to $i$, so it squeezes the interval.
Using $squeeze$, we can define a useful function $psqueeze$, which squeezes paths.
\begin{align*}
& psqueeze : (x\ y : A) (p : x = y) (i : I) \to x = p\ @\ i \\
& psqueeze = \lambda x\ y\ p\ i \red path\ (\lambda j \red p\ @\ squeeze\ i\ j)
\end{align*}
Using $psqueeze$ we can define $J$ as follows:
\begin{align*}
& J : ((a : A) \to B[x := a, y := a, p := idp\ a]) \to (x\ y : A) (p : x = y) \to B \\
& J = \lambda d\ x\ y\ p \red coe_{\lambda i \red B[y := p\ @\ i, p := psqueeze\ x\ y\ p\ i]}\ (d\ x)\ right
\end{align*}
This definition type checks since $psqueeze$ satisfies the following rules:
\begin{align*}
& psqueeze\ x\ y\ p\ left \deq idp\ x \\
& psqueeze\ x\ y\ p\ right \deq p
\end{align*}
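Indeed, unfolding the definitions and using the rules for $squeeze$ together with the $\beta$- and $\eta$-rules for path types, we get:
\begin{align*}
& psqueeze\ x\ y\ p\ left = path\ (\lambda j \red p\ @\ squeeze\ left\ j) \deq path\ (\lambda j \red p\ @\ left) \deq idp\ x \\
& psqueeze\ x\ y\ p\ right = path\ (\lambda j \red p\ @\ squeeze\ right\ j) \deq path\ (\lambda j \red p\ @\ j) \deq_\eta p
\end{align*}
In particular, $J\ d\ x\ x\ (idp\ x) \deq coe_{\lambda i \red B[y := x, p := idp\ x]}\ (d\ x)\ right \deq d\ x$ by the $\sigma$-rule, so $J$ computes as expected on reflexivity.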
\subsection{Cubical fillers}
The theory we are describing has many similarities to the theory of cubical sets.
The reason is that we already have a $1$-cube, namely the interval type, and we can define an $n$-cube to be the product of intervals.
We do not have the product type yet, but we can form a context $x_1 : I, \ldots x_n : I$, and a term of type $A$ in this context is simply an $n$-cube in $A$.
Another reason is that $coe$ gives us all $n$-dimensional fillers for cubical horns.
We will use this to define $squeeze$ in a way similar to how it is done in the appendix of \cite{cubical}.
Given an $n$-dimensional cubical horn we can always find a filler for it as follows:
\begin{align*}
& fill^1_{\lambda x_1 \red A}\ a_1\ j_1 = coe_{\lambda x_1 \red A}\ a_1\ j_1 \\
& fill^{n+1}_{\lambda \overline{x}_{n+1} \red A}\ (\lambda \overline{x}_n \red a_{n+1})\ (\lambda \overline{x}_n \red a'_{n+1}) \ldots \\
& \indent (\lambda \overline{x}_{\hat{2},n+1} \red a_2)\ (\lambda \overline{x}_{\hat{2},n+1} \red a'_2)\ (\lambda \overline{x}_{\hat{1},n+1} \red a_1) = \\
& \indent \lambda \overline{j}_{n+1} \red fill^n_{\lambda \overline{x}_n \red Path\ (\lambda x_{n+1} \red A)\ a_{n+1}\ a'_{n+1}} \\
& \indent \indent (\lambda \overline{x}_{n-1} \red path\ (\lambda x_{n+1} \red a_n))\ (\lambda \overline{x}_{n-1} \red path\ (\lambda x_{n+1} \red a'_n)) \ldots \\
& \indent \indent (\lambda \overline{x}_{\hat{2},n} \red path\ (\lambda x_{n+1} \red a_2))\ (\lambda \overline{x}_{\hat{2},n} \red path\ (\lambda x_{n+1} \red a'_2)) \\
& \indent \indent (\lambda \overline{x}_{\hat{1},n} \red path\ (\lambda x_{n+1} \red a_1))\ j_1 \ldots j_n\ @\ j_{n+1}
\end{align*}
Here, $\overline{x}_k$ denotes the sequence $x_1 \ldots x_k$, and $\overline{x}_{\hat{i},k}$ denotes the sequence $x_1 \ldots x_{i-1}\ x_{i+1} \ldots x_k$.
The idea is that $a_i$ and $a'_i$ are hyperfaces of an $n$-dimensional cube over $A$ that form a cubical horn, and $fill^n$ gives us a filler of this horn.
In order for this to make sense, $A$, $a_i$, and $a'_i$ must satisfy the following rules:
\[ \overline{x}_{n} \vdash A \qquad \overline{x}_{\hat{i},n} \vdash a_i : A[x_i := left] \qquad \overline{x}_{\hat{i},n} \vdash a'_i : A[x_i := right] \]
\[ a_{i_1}[x_{i_2} := left] \deq a_{i_2}[x_{i_1} := left] \qquad a'_{i_1}[x_{i_2} := left] \deq a_{i_2}[x_{i_1} := right] \]
\[ a_{i_1}[x_{i_2} := right] \deq a'_{i_2}[x_{i_1} := left] \qquad a'_{i_1}[x_{i_2} := right] \deq a'_{i_2}[x_{i_1} := right] \]
In this case $fill^n$ satisfies the following typing rule:
\begin{align*}
& \vdash fill^n_{\lambda \overline{x} \red A}\ (\lambda \overline{x}_{n-1} \red a_n)\ (\lambda \overline{x}_{n-1} \red a'_n) \ldots \\
& \indent (\lambda \overline{x}_{\hat{2},n} \red a_2)\ (\lambda \overline{x}_{\hat{2},n} \red a'_2)\ (\lambda \overline{x}_{\hat{1},n} \red a_1) \\
& : (\overline{j}_n : I) \to A[x_1 := j_1] \ldots [x_n := j_n].
\end{align*}
If $j_k$ is equal to $left$ or $right$ for some $k$, then $fill^n$ applied to $j_1 \ldots j_n$ is equal to $a_k[x_1 := j_1] \ldots [x_n := j_n]$ or $a'_k[x_1 := j_1] \ldots [x_n := j_n]$, respectively.
So $fill^n$ indeed returns an $n$-cube with given hyperfaces.
Now, we can define the function $squeeze$ which satisfies the required rules.
These rules can be stated as follows.
We need to find a function $squeeze : I \to I \to I$, which is given on borders by the following diagram:
\[ \xymatrix{ left \ar[r]^{\lambda i \red left} \ar[d]_{\lambda j \red left} & left \ar[d]^{\lambda j \red j} \\
left \ar[r]_{\lambda i \red i} & right
}\]
We can define a function $sq$, which satisfies three of the four rules, by filling the following horn:
\[ \xymatrix{ left \ar[r]^{\lambda i \red left} \ar[d]_{\lambda j \red left} & left \ar[d]^{\lambda j \red j} \\
left & right
}\]
\begin{align*}
& sq : I \to I \to I \\
& sq = fill^2_{\lambda i\ j \red I}\ (\lambda j \red left)\ (\lambda j \red j)\ (\lambda i \red left)
\end{align*}
Now, we can construct $squeeze$, which satisfies all of the rules, by filling the following horn:
\[ \xymatrix @C=0.5pc @R=0.5pc
{ left \ar[rrrr] \ar[dddd] & & & & left \ar[dddd] \\
& left \ar[rr] \ar[dd] \ar[ul] & & left \ar[dd] \ar[ur] & \\
& & & & \\
& left \ar[rr] \ar[dl] & & left \ar[dr] & \\
left \ar[rrrr] & & & & right
}\]
The inner, left, and top squares are $\lambda i\ j \red left$, the bottom and right squares are $sq$, and the filler gives us the outer square, which is the required function $squeeze$.
\begin{align*}
& squeeze : I \to I \to I \\
& squeeze = fill^3_{\lambda x_1\,x_2\,x_3 \red I}\ (\lambda x_1\ x_2 \red left)\ sq\ (\lambda x_1\ x_3 \red left)\ sq\ (\lambda x_2\ x_3 \red left)\ right
\end{align*}
\subsection{Univalent universes}
Now, we show how to add a univalent universe to the system.
\medskip
\begin{center}
\AxiomC{$\Gamma \vdash$}
\UnaryInfC{$\Gamma \vdash Type$}
\DisplayProof
\quad
\AxiomC{$\Gamma \vdash$}
\UnaryInfC{$\Gamma \vdash I : Type$}
\DisplayProof
\quad
\AxiomC{$\Gamma \vdash A : Type$}
\AxiomC{$\Gamma, x : A \vdash B : Type$}
\BinaryInfC{$\Gamma \vdash (x : A) \to B : Type$}
\DisplayProof
\end{center}
\medskip
\begin{center}
\Axiom$\fCenter \Gamma \vdash A : Type$
\noLine
\UnaryInf$\fCenter \Gamma \vdash B : Type$
\def\extraVskip{1pt}
\Axiom$\fCenter \Gamma \vdash f : A \to B$
\noLine
\UnaryInf$\fCenter \Gamma \vdash g : B \to A$
\Axiom$\fCenter \Gamma \vdash p : (a : A) \to g\ (f\ a) = a$
\noLine
\UnaryInf$\fCenter \Gamma \vdash q : (b : B) \to f\ (g\ b) = b$
\def\extraVskip{2pt}
\AxiomC{$\Gamma \vdash i : I$}
\QuaternaryInfC{$\Gamma \vdash iso\ A\ B\ f\ g\ p\ q\ i : Type$}
\DisplayProof
\end{center}
\medskip
We also add the following reduction rules:
\begin{itemize}
\item $iso\ A\ B\ f\ g\ p\ q\ left \red_\beta A$
\item $iso\ A\ B\ f\ g\ p\ q\ right \red_\beta B$
\item $iso\ A\ B\ (\lambda x \red x)\ (\lambda x \red x)\ idp\ idp\ i \red_\beta A$
\item $coe_{\lambda k \red iso\,A\,B\,f\,g\,p\,q\,k}\ a\ right \red_\beta f\ a$ if $k \notin FV(A\ B\ f\ g\ p\ q)$
\end{itemize}
Usually, univalence is stated in the form of an axiom as follows:
\begin{align*}
& isContr\ A = \Sigma\ (a : A)\ ((a' : A) \to a = a') \\
& \\
& isEquiv\ f = (b : B) \to isContr\ (\Sigma\ (a : A)\ (f\ a = b)) \\
& \\
& pte : (A\ B : Type) \to A = B \to \Sigma\ (f : A \to B)\ (isEquiv\ f) \\
& pte = \lambda A\ B\ p \red (transport\ A\ B\ p, \_) \\
& \\
& univalence = (A\ B : Type) \to isEquiv\ (pte\ A\ B)
\end{align*}
Instead of $\_$ there should be a proof that $transport\ A\ B\ p$ is an equivalence, which we omit because of its length.
We can show that $univalence$ holds using $iso$.
First, we construct the following function
\begin{align*}
& etp : (A\ B : Type) \to \Sigma\ (f : A \to B)\ (isEquiv\ f) \to A = B \\
& etp = \lambda A\ B\ p \red path\ (\lambda i \red iso\ A\ B\ p.proj_1\ \_\ \_\ \_\ i)
\end{align*}
Omitted terms can be constructed using a proof of $isEquiv\ f$.
To prove that $pte\ A\ B$ is an equivalence it is enough to show that $etp\ A\ B$ is its inverse.
The first components of the pairs $pte\ A\ B\ (etp\ A\ B\ (f, e))$ and $(f, e)$ are definitionally equal, and their second components are equal since $isEquiv\ f$ is a proposition.
Finally, we need to show that for each $p : A = B$ we have a path $etp\ A\ B\ (pte\ A\ B\ p) = p$.
After applying $J$ we only need to show this for $p = idp\ A$, but this holds definitionally.
\subsection{Data types with conditions}
In this section, we will describe an extension of ordinary inductive data types which allows us to define higher inductive types.
To do this, we need to describe a way of defining functions over inductive data types.
Traditionally such functions are defined either by pattern matching or through eliminators.
The former is more convenient when working inside the theory, but the latter is easier to describe formally.
Our approach combines both of these methods.
To allow definitions of data types and functions we extend the system by introducing the notion of signature.
A signature consists of declarations of data types and functions.
We need to modify our typing judgements:
\begin{itemize}
\item $\Sigma; \Gamma \vdash$ means that $\Gamma$ is a well-typed context in signature $\Sigma$.
\item $\Sigma; \Gamma \vdash A$ means that $A$ is a well-typed type in context $\Gamma$ and signature $\Sigma$.
\item $\Sigma; \Gamma \vdash a : A$ means that $a$ is a well-typed term of type $A$ in context $\Gamma$ and signature $\Sigma$.
\end{itemize}
We omit $\Sigma$ from the notation if it is clear which signature we are using.
An inductive data type $D$ with parameters $a_1 : A_1, \ldots a_n : A_n$ is described by a list of its constructors, and for each constructor a list of types of its arguments:
\[ c_1\ (x_1 : B^1_1)\ \ldots\ (x_{k_1} : B^1_{k_1}) \]
\[ \vdots \]
\[ c_m\ (x_1 : B^m_1)\ \ldots\ (x_{k_m} : B^m_{k_m}) \]
We require data types to be \emph{strictly positive}, which means that if $D$ appears in $B^i_j$, then $B^i_j = F(D\ a_1\ \ldots\ a_n)$ for some $a_1, \ldots a_n$ and some strictly positive $F$.
A function $F$ is strictly positive if it is inductively generated by the following rules:
\begin{itemize}
\item $F(Z) = Z$
\item $F(Z) = Path\ (\lambda i \red F(Z))\ z_1\ z_2$
\item $F(Z) = (e : E) \to F(Z)$
\end{itemize}
We say that a data type is \emph{simple} if it doesn't use the last rule in the description of $F$.
Of course, a data type definition must be well typed, which means that the following holds:
\begin{itemize}
\item $\Sigma; a_1 : A_1, \ldots a_n : A_n \vdash$.
\item $\Sigma, D'; x_1 : B^i_1, \ldots x_{j-1} : B^i_{j-1} \vdash B^i_j$ for each constructor $c_i$ and each $1 \leq j \leq k_i$, where $D'$ is a data type with the same parameters as $D$, but without constructors.
\end{itemize}
A function definition consists of a name of the function with its type and a body of the function:
\begin{align*}
& f : (x_1 : A_1) \ldots (x_n : A_n) \to B \\
& f\ x_1\ \ldots\ x_n \red b
\end{align*}
The definition is well typed if $x_1 : A_1, \ldots x_n : A_n \vdash b : B$.
We also introduce another way of defining a function, which is written like this:
\begin{align*}
& f' : (x_1 : A_1) \ldots (x_n : A_n) \to B \\
& f'\ x_1\ \ldots\ x_n \elim b
\end{align*}
Such definitions are well typed if the following holds:
\[ x_1 : A_1, \ldots x_n : A_n \vdash b : B \]
The difference between these definitions lies in the computation rules.
For the former definition we add usual computation rules:
\[ f\ a_1\ \ldots\ a_n \red_\beta b[x_1 := a_1] \ldots [x_n := a_n] \]
For the latter, the computation rule is given by the following inference rule:
\begin{center}
\AxiomC{$b[x_1 := a_1] \ldots [x_n := a_n] \red_t b'$}
\UnaryInfC{$f'\ a_1\ \ldots\ a_n \red_\beta b'$}
\DisplayProof
\end{center}
where $\red_t$ is the union of the basic reduction rules that we gave in table~\ref{table:inf-rules} and of the rules for function definitions given here.
That is, $\red_t$ reduces only at the top level of a term, not inside it.
Such definitions are useful when the right-hand side is a large term, usually one that performs some sort of case analysis by pattern matching,
and we want a function application to reduce only when this case analysis succeeds.
To allow definitions by pattern matching, we introduce a new primitive operator $elim$:
\[ elim\ e\ \{\ c_1\ x_1\ \ldots\ x_{k_1} = b_1;\ \ldots;\ c_m\ x_1\ \ldots\ x_{k_m} = b_m\ \}, \]
where each of the $=$ signs is either $\red$ or $\elim$, $x_i$ are variables, $c_i$ are constructors of some data type, and $e$ and $b_i$ are some terms.
The set of rules inside $\{$ $\}$ is unordered, and each constructor $c_i$ occurs exactly once.
Reduction rules are the following:
\begin{center}
\AxiomC{$b_i[x_1 := a_1] \ldots [x_{k_i} := a_{k_i}] \red_t b'$}
\RightLabel{,}
\UnaryInfC{$elim\ c_i\ a_1\ \ldots\ a_{k_i}\ \{\ c_i\ x_1\ \ldots\ x_{k_i} \elim b_i;\ \ldots\ \} \red_\beta b' $}
\DisplayProof
\end{center}
\[ elim\ c_i\ a_1\ \ldots\ a_{k_i}\ \{\ c_i\ x_1\ \ldots\ x_{k_i} \red b_i;\ \ldots\ \} \red_\beta b_i[x_1 := a_1] \ldots [x_{k_i} := a_{k_i}]. \]
Inference rules for $elim$ are the following:
\begin{center}
\def\extraVskip{1pt}
\Axiom $\fCenter \Gamma, y : D\ p_1\ \ldots\ p_n \vdash B$
\noLine
\UnaryInf $\fCenter \Gamma \vdash e : D\ p_1\ \ldots\ p_n$
\noLine
\UnaryInf $\fCenter \Gamma, x_1 : A^i_1, \ldots x_{k_i} : A^i_{k_i} \vdash b_i : B[y := c_i\ x_1\ \ldots\ x_{k_i}] \text{ for each $1 \leq i \leq m$}$
\def\extraVskip{2pt}
\RightLabel{,}
\UnaryInf $\fCenter \Gamma \vdash elim\ e\ \{\ c_1\ x_1\ \ldots\ x_{k_1} = b_1;\ \ldots;\ c_m\ x_1\ \ldots\ x_{k_m} = b_m\ \} : B[y := e]$
\DisplayProof
\end{center}
where $c_i\ (x_1 : A^i_1)\ \ldots\ (x_{k_i} : A^i_{k_i})$ are constructors of data type $D$.
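As a small illustration (the data type $Bool$ with constructors $true$ and $false$ is not part of the development above and is assumed here only for the sake of the example), we can define
\begin{align*}
& not : Bool \to Bool \\
& not\ b \elim elim\ b\ \{\ true \red false;\ false \red true\ \}
\end{align*}
Then $not\ true \red_\beta false$ since $elim\ true\ \{\ \ldots\ \}$ reduces at the top level, but $not\ x$ does not reduce for a variable $x$, which illustrates the difference between $\red$ and $\elim$ in function definitions.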
To allow recursive definitions we introduce a new context, which consists of recursive calls to the function being defined.
Now, judgements look like this: $\Gamma; \rho \vdash a : A$.
Here, $\Gamma$ is the usual context, and $\rho$ is the recursive calls context.
A recursive function $f$ as before is well-typed if the following holds:
\[ x_1 : A_1, \ldots x_n : A_n; f\ x_1\ \ldots\ x_n : B \vdash b : B \]
We add the following inference rule, which allows us to use recursive calls:
\begin{center}
\AxiomC{}
\UnaryInfC{$\Gamma; t : T, t_1 : T_1, \ldots t_n : T_n \vdash t_i : T_i$}
\DisplayProof
\end{center}
The first entry $t : T$ is always the recursive call to the function with the same arguments, so we do not allow it to be used.
We add an inference rule for $elim$, which extends the recursive calls context appropriately:
\begin{center}
\AxiomC{$\fCenter \Gamma \vdash B$}
\AxiomC{$\fCenter \Gamma'_i; \rho'_i \vdash b_i : B[z_j := c_i\ x_1\ \ldots\ x_{k_i}] \text{ for each $1 \leq i \leq m$}$}
\RightLabel{,}
\BinaryInfC{$\fCenter \Gamma; \rho \vdash elim\ z_j\ \{\ c_1\ x_1\ \ldots\ x_{k_1} = b_1;\ \ldots;\ c_m\ x_1\ \ldots\ x_{k_m} = b_m\ \} : B$}
\DisplayProof
\end{center}
where $z_j$ is a variable in context $\Gamma$, and contexts $\Gamma'_i$ and $\rho'_i$ are defined as follows:
\begin{itemize}
\item If $\Gamma = z_1 : Z_1, \ldots z_s : Z_s$, then $\Gamma'_i = z_1 : Z_1, \ldots z_{j-1} : Z_{j-1}, x_1 : A^i_1, \ldots x_{k_i} : A^i_{k_i}, z_{j+1} : Z_{j+1}[z_j := c_i\ x_1\ \ldots\ x_{k_i}], \ldots z_s : Z_s[z_j := c_i\ x_1\ \ldots\ x_{k_i}]$.
\item
\end{itemize}
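For instance (a sketch, assuming a data type $\mathbb{N}$ with constructors $zero$ and $suc$, and assuming that the recursive calls context $\rho'_i$ makes calls on the constructor arguments available), addition can be defined recursively:
\begin{align*}
& plus : \mathbb{N} \to \mathbb{N} \to \mathbb{N} \\
& plus\ n\ m \red elim\ n\ \{\ zero \red m;\ suc\ k \red suc\ (plus\ k\ m)\ \}
\end{align*}
Here the recursive call $plus\ k\ m$ is justified by the recursive calls context of the $suc$ branch.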
Data types with conditions additionally allow us to put conditions on constructors.
A condition on a constructor is simply a partial definition of a function with the type of this constructor.
Syntactically, this means that a constructor can now be evaluated if its arguments match one of the clauses of this partially defined function.
Also, this means that when we define a function over such a data type, we need to check that this definition respects the conditions.
Semantically, ordinary data types allow us to define coproducts of types, and data types with conditions allow us to define pushouts.
One of the maps in the pushout diagram must be given by a collection of constructors of a data type, which means that it is (or at least can often be interpreted as) a cofibration.
This implies that this pushout is always a correct homotopy pushout.
Let us give an example. We can define integers as the data type with two constructors:
\[ negative : \mathbb{N} \to \mathbb{Z} \]
\[ positive : \mathbb{N} \to \mathbb{Z}. \]
The problem is that we get two different zeros: $negative\ 0$ and $positive\ 0$.
Of course, we can define $-1$ to be $negative\ 0$, $-2$ to be $negative\ 1$, and so on, but this can easily lead to confusion.
It is better to identify the positive and the negative zeros, that is to define $\mathbb{Z}$ as the pushout of $1 \overset{0}\to \mathbb{N}$ with itself.
So, we add the following condition to the definition of $\mathbb{Z}$:
\[ negative\ 0 = positive\ 0. \]
Now, when we define a function $f$ over $\mathbb{Z}$, we need to check that $f\ (negative\ 0) \deq f\ (positive\ 0)$.
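For example, the successor function on $\mathbb{Z}$ might be defined as follows (a sketch, assuming $\mathbb{N}$ has constructors $zero$ and $suc$ and writing $0$ for $zero$):
\begin{align*}
& succ : \mathbb{Z} \to \mathbb{Z} \\
& succ\ z \red elim\ z\ \{\ positive\ n \red positive\ (suc\ n); \\
& \indent negative\ n \red elim\ n\ \{\ zero \red positive\ (suc\ zero);\ suc\ m \red negative\ m\ \}\ \}
\end{align*}
The condition is respected since both $succ\ (negative\ 0)$ and $succ\ (positive\ 0)$ reduce to $positive\ (suc\ zero)$.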
Using data types with conditions, we can define higher inductive types.
For example, the circle $S^1$ can be defined as follows: it has two constructors
\[ base : S^1 \]
\[ loop : I \to S^1 \]
and two conditions
\[ loop\ left = base \]
\[ loop\ right = base. \]
After we add path types, it will be possible to describe elimination rules for such data types,
but it is more convenient to simply define functions over them.
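For example, given $b : A$ and a loop $l : Path\ (\lambda i \red A)\ b\ b$, we might define a function out of the circle as follows (a sketch using the path types of table~\ref{table:inf-rules}):
\begin{align*}
& f : S^1 \to A \\
& f\ x \red elim\ x\ \{\ base \red b;\ loop\ i \red l\ @\ i\ \}
\end{align*}
The conditions are respected since $f\ (loop\ left)$ and $f\ base$ must agree, and indeed $l\ @\ left \deq b$; similarly for $right$.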
\subsection{Records with conditions}
Records with conditions are dual to data types with conditions.
Ordinary records allow us to define products of types, and records with conditions allow us to define pullbacks.
A record is given by a list of fields, and a record with conditions can additionally put conditions on some fields.
A condition on a field is a partially defined function with the type of this field.
Semantically, such a record is given by a pullback of a function $(B \to X) \to (A \to X)$ for some cofibration $A \to B$, which means that it is still fibrant.
Let us show how to define path types as a record with conditions.
It will be a record $Path$ with three parameters $A : I \to Type$, $a : A\ left$, and $a' : A\ right$.
We will define a ``heterogeneous path type'' because it seems impossible to define useful elimination rules for higher inductive types using only homogeneous path types.
We abbreviate $Path\ (\lambda i \red A)\ a\ a'$ to $a = a'$.
The record $Path\ (A : I \to Type)\ (a : A\ left)\ (a' : A\ right)$ has one constructor
\[ path : ((i : I) \to A\ i) \to Path\ A\ a\ a', \]
one field
\[ at : (i : I) \to A\ i, \]
and two conditions
\[ at\ left = a \]
\[ at\ right = a'. \]
When we define an element of a record with conditions, we need to check that it satisfies them.
And when we access a field with conditions, it might evaluate if its arguments match one of the conditions on it.
For example, if $p : a = a'$, then $p.at\ left$ evaluates to $a$, $p.at\ right$ evaluates to $a'$,
and if we want to construct a path of type $a = a'$, then we need to specify a function $f : I \to A$ such that $f\ left \deq a$ and $f\ right \deq a'$.
We can define the $J$ rule as follows:
\begin{itemize}
\item[] $idp : (A : Type) (a : A) \to Path\ (\lambda i \red A)\ a\ a$
\item[] $idp\ A\ a = path (\lambda i \red a)$
\item[]
\item[] $J : (A : Type) (C : (x\ y : A) \to x = y \to Type) \to$
\item[] \qquad $((a : A) \to C\ a\ a\ (idp\ A\ a)) \to (a\ a' : A) (p : a = a') \to C\ a\ a'\ p$
\item[] $J\ A\ C\ d\ a\ a'\ p = coe_{\lambda i \red \,C\,a\,(p.at\,i)\,(path\,(\lambda j \red \,p.at\,(squeeze\,i\,j)))}\ (d\ a)\ right$
\end{itemize}
Now, we can prove the univalence axiom in its usual form, which states that a certain function $F$ is an equivalence.
The function $F$ maps the path type $A = B$ to the type of functions $A \to B$ which are equivalences.
We can construct an inverse $G$ to the function $F$: given an equivalence $A \to B$, $G$ constructs a path $A = B$ using $iso$.
If $f : A \to B$ is an equivalence, then $F\ (G\ f)$ is definitionally equal to $f$ because of the computation rule for $iso$.
If $p : A = B$, then we can show that there is a path between $G\ (F\ p)$ and $p$ using $J$.
Actually, we can't use $J$ directly because of size issues, but we can expand its definition, and then it works fine.
\section{A model of data types and records with conditions}
In this section, we will show how to interpret data types and records with conditions in simplicial sets.
\subsection{Data types with conditions}
Each constructor of a data type gives us an accessible functor $C_i : \sSet \to \sSet$.
% If a data type without conditions has $n$ constructors, then we can interpret it as an algebra for endofunctor $\coprod_{1 \leq i \leq n} C_i$.
\subsection{Records with conditions}
\bibliographystyle{amsplain}
\bibliography{ref}
\end{document}