[Agda] Coinductive families
Dan Doel
dan.doel at gmail.com
Mon Oct 4 04:47:55 CEST 2010
On Sunday 03 October 2010 6:10:34 pm Nils Anders Danielsson wrote:
> On 2010-10-01 15:09, Iain Lane wrote:
> > record Tailspin : Set where
> >   field
> >     Kaboom : Tailspin
> >
> > test : Tailspin
> > test = ?
>
> I think the problem here is that you are using a recursive record type.
> Such types are not really supported, but for some reason they are not
> disallowed, maybe to allow experiments with coinductive records:
>
> record Stream (A : Set) : Set where
>   constructor _∷_
>   field
>     head : A
>     tail : ∞ (Stream A)
I posted this in response to an issue on the bug tracker* (so sorry if you're
seeing this twice), but there may be some eyes here that it wouldn't reach
otherwise, and it's related to the above.
I was thinking not long ago about notations for defining codata. One nice
special case is when the functor in question is a product. Since the
observation is then T -> FT*GT*..., you can write it like:
codata Stream A = Head :: A & Tail :: Stream A
But, I was thinking about a more GADT/inductive family-like syntax, and the
obvious choice is:
codata Stream (A : Set) : Set where
  head :: Stream A -> A
  tail :: Stream A -> Stream A
where the list specifies the destructors, and the type being defined is a
product thereof, instead of a sum as in the inductive case (and this is
probably not surprising; inductives and sums are colimits, and coinductives
and products are limits). Charity handles type definitions in this way.
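To make that concrete, here is a minimal sketch of the two destructors and of the product-shaped observation in current Agda, using the constructor-based ∞/♯ encoding from the standard library's Coinduction module rather than the hypothetical codata syntax (head, tail, out and unfold are just names I picked):

module StreamSketch where
  open import Coinduction
  open import Data.Product using (_×_; _,_)

  infixr 5 _∷_

  data Stream (A : Set) : Set where
    _∷_ : A → ∞ (Stream A) → Stream A

  -- The two Charity-style destructors:
  head : ∀ {A : Set} → Stream A → A
  head (x ∷ _) = x

  tail : ∀ {A : Set} → Stream A → Stream A
  tail (_ ∷ xs) = ♭ xs

  -- Packaged together they are the single observation Stream A → A × Stream A:
  out : ∀ {A : Set} → Stream A → A × Stream A
  out s = head s , tail s

  -- The anamorphism: a pair of observations on S determines a map into
  -- Stream A; the corecursive call is guarded by ♯_.
  unfold : ∀ {A S : Set} → (S → A) → (S → S) → S → Stream A
  unfold h t s = h s ∷ ♯ unfold h t (t s)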
But, in the ordinary coinductive case, the first argument is always the same,
so we could think about eliding it, which of course produces the suggestive:
codata Stream (A : Set) : Set where
  head :: A
  tail :: Stream A
and this looks exactly like records. And, in fact, it doesn't just look like
records, it *behaves* a lot like (Agda) records in several ways.
1) Both records and codata are defined by the observations that can be
performed on them, rather than how they're constructed, as inductives are.
2) The eta rule for records is very much like proper extensional equality for
coinductive types: two values are equal if all observations/projections of the
two are equal. For instance, for the empty record:
record ⊤ : Set where
with no projections. Read as a codata, since no observations need to be
supported, we'd expect there to be a canonical map from any type into the
empty record:
open import Relation.Binary.PropositionalEquality using (_≡_; refl)

ana : ∀{A : Set} → A → ⊤
ana _ = _

term : ∀{A : Set} → (f : A → ⊤) → f ≡ ana
term f = refl
And lo and behold, there is. Not that this isn't equally justified by
induction for non-recursive records (inductive and coinductive records should
coincide when non-recursive), but the focus on equality of
projections/observations seems like a coinductive take.
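As a further small check of that reading, here is a sketch of the eta rule for a two-field record, assuming the standard library's propositional equality (Pair and eta are just names I picked):

module EtaSketch where
  open import Relation.Binary.PropositionalEquality using (_≡_; refl)

  record Pair (A B : Set) : Set where
    field
      fst : A
      snd : B
  open Pair

  -- Eta for records: any p is definitionally the record rebuilt from its
  -- projections, so "equal on all observations" holds by refl.
  eta : ∀ {A B : Set} (p : Pair A B) → p ≡ record { fst = fst p ; snd = snd p }
  eta p = refl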
3) In the bug in question, there was a problem with the termination checker
not accepting code involving something like:
foo ... (x , y) = ... (foo ... x) ... (foo ... y) ...
because the match on (x , y) is translated to projections from the record.
However, Nils also pointed out that the following code should not check:
record R : Set where
  constructor c
  field
    p : R

foo : R -> R
foo (c r) = foo r
because it translates into:
foo : R -> R
foo r = foo (p r)
which will not normalize (foo r => foo (p r) => foo (p (p r)) => ...). The
solution found was to make 'p r <= r', so that the checker is aware that the
argument isn't getting *bigger*, but it isn't necessarily getting smaller,
either (the latter part invalidates foo). However, is this situation not
rather strange?
If R were indeed inductive, we'd expect it to be empty. For instance:
data R : Set where
  c : R -> R

¬R : R → ⊥
¬R (c r) = ¬R r
but this is exactly the function we've just ruled out, because it won't
normalize with records. And in the inductive case, p r must be smaller than r,
but we are treating it as if it isn't.
However, if records are coinductive, the above makes perfect sense:
a) p r <= r, because coinductive types admit circular definitions, and so
projecting out of one may yield something of equal size, but not greater.
b) foo is obviously not a productive corecursive definition, so it's natural
to rule it out. There's no longer a strange situation where we must rule
out an ostensibly inductively justified definition because it would fail
to normalize.
c) It's okay that we can't prove that R is uninhabited, because it *is*
inhabited by c (c (c (c ...))).
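Here is a quick check of (c), written as a sketch with the ∞/♯ encoding and a data type in place of the record, since that is the form of corecursion the productivity checker accepts today (R∞ and inhabitant are my names):

module RSketch where
  open import Coinduction

  data R∞ : Set where
    c : ∞ R∞ → R∞

  -- The infinite value c (c (c ...)); the corecursive call is guarded by ♯_.
  inhabitant : R∞
  inhabitant = c (♯ inhabitant)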
4) I haven't bothered to think this through much yet, but: coinductives had
problems (and still do in Coq, I think) with subject reduction when they were
initially conceived of as being built out of constructors. Taking the view
that values of coinductive type are opaque things with observation functions
seems important to a well-working theory.
When constructors and matching thereon were added for records, I believe there
were similar issues with subject reduction, which is why the matches are now
translated to projections.
I suspect these two situations are related.
So, how about it? Is it possible that Agda has had non-recursive coinductive
types all along, and that codata would have been better added by fixing the
way that recursive records work?
On the topic specified in the subject line, I've been thinking about what
coinductive families might look like. Inductive families are initial algebras
in a slice category (which makes them a way of specifying how an entire family
T i is constructed at once), so a coinductive family should be a final
coalgebra in the same. I'm not sure I have the chops to specify what that is,
exactly, *but*, I can speculate wildly.
We allow the specification of inductive families by allowing constructors to
target arbitrary values of the index:
data Fin : ℕ → Set where
  zero : ∀{n} → Fin (suc n)
  suc : ∀{n} → Fin n → Fin (suc n)
So, it stands to reason that we should be able to specify coinductive families
by allowing projections to specify arbitrary starting indices:
codata CoFin : ℕ → Set where
  pred : CoFin (suc n) → Maybe (CoFin n)
  empty : CoFin 0 → ⊥
Of course, where the 'n' comes from in the pred destructor is something I
haven't figured out yet. :) Perhaps the presence of the n at all is an
illusion in both cases. Edwin Brady et al. tell us that inductive families
need not store their indices**. Perhaps this is not merely an optimization,
but a recognition that quantification over indices is part of a schema for
defining many constructors, not actually a field of the constructor, and:
pred : ∀{n} → CoFin (suc n) → Maybe (CoFin n)
is a similarly sensible schema (apologies if the paper actually makes this
observation; it's been a while since I read it).
If this definition is a little surprising, note that we could also write Fin
as:
data Fin : ℕ → Set where
  mk : ∀{n} → Maybe (Fin n) → Fin (suc n)
  empty : ⊥ → Fin 0
which makes it a little clearer that we're dealing with the same functors
here.
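To make the "same functors" point checkable, here is a small sketch converting between the usual Fin from Data.Fin and the Maybe-based presentation (Fin′, to and from are just names I picked):

module FinSketch where
  open import Data.Nat   using (ℕ; suc)
  open import Data.Fin   using (Fin; zero; suc)
  open import Data.Maybe using (Maybe; just; nothing)
  open import Data.Empty using (⊥; ⊥-elim)

  -- The Maybe-based presentation of Fin from above.
  data Fin′ : ℕ → Set where
    mk    : ∀ {n} → Maybe (Fin′ n) → Fin′ (suc n)
    empty : ⊥ → Fin′ 0

  -- The two presentations are interconvertible.
  to : ∀ {n} → Fin′ n → Fin n
  to (mk nothing)  = zero
  to (mk (just i)) = suc (to i)
  to (empty b)     = ⊥-elim b

  from : ∀ {n} → Fin n → Fin′ n
  from zero    = mk nothing
  from (suc i) = mk (just (from i))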
Thoughts?
-- Dan
[*] http://code.google.com/p/agda/issues/detail?id=334
[**] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.62.3849
Also, here's a fancier proof than the top one: the anamorphism into the
product record is unique. Of course, I don't expect eta rules to extend nicely
into providing extensional equality of recursive records-as-codata.
module Prod where
  open import Function using (_∘_)
  open import Relation.Binary.PropositionalEquality

  record _×_ (A B : Set) : Set where
    field
      fst : A
      snd : B
  open _×_

  ana : ∀{A B C : Set} → (C → A) → (C → B) → C → A × B
  ana f g x = record { fst = f x ; snd = g x }

  comm₁ : ∀{A B C : Set} → (f : C → A) (g : C → B) → (fst ∘ ana f g) ≡ f
  comm₁ f g = refl

  comm₂ : ∀{A B C : Set} → (f : C → A) (g : C → B) → (snd ∘ ana f g) ≡ g
  comm₂ f g = refl

  open ≡-Reasoning

  univ : ∀{A B C : Set} → (f : C → A) (g : C → B)
       → (h : C → A × B)
       → (fst ∘ h) ≡ f
       → (snd ∘ h) ≡ g
       → h ≡ ana f g
  univ f g h eq₁ eq₂ = begin
    (λ x → h x)
      ≡⟨ refl ⟩
    (λ x → record { fst = (fst ∘ h) x
                  ; snd = (snd ∘ h) x })
      ≡⟨ cong (λ k → λ x → record { fst = k x
                                  ; snd = (snd ∘ h) x })
              eq₁ ⟩
    (λ x → record { fst = f x
                  ; snd = (snd ∘ h) x })
      ≡⟨ cong (λ k → λ x → record { fst = f x
                                  ; snd = k x })
              eq₂ ⟩
    (λ x → record { fst = f x ; snd = g x })
      ≡⟨ refl ⟩
    (λ x → ana f g x)
    ∎