My Dad got his PhD in 1970, three years before he got me: he added
pattern matching to a dialect of LISP, and he used it to develop
software for symbolic mathematics. The header of this page comes from
page 24 of his thesis. It's correspondingly unremarkable that, as a
little boy, I learned the λ-calculus and the differential
calculus at the same time. That program has been with me all my life.
This page is about the way I use it.
I've written or co-written a number of articles on this topic. Some have
even been published! Others have been rejected, or are still in the pipeline.
Here they are:
- Dependently Typed Programs and their
Proofs. PhD thesis, LFCS, Edinburgh, 2000.
I hadn't had the Platform Four moment at that point, but I was
interested in one-hole contexts because I wanted to talk about the
occur-check in unification. Sections 7.3.6 and 7.3.7 (pp230-236) are
the relevant bits. You can probably tell that I was cracking up quite
badly at the time.
- The Derivative of a Regular Type is its
Type of One-Hole Contexts. Rejected from LICS 2001.
One referee complained that it had nothing to do with regular
expressions, which will teach me to borrow words more carefully: I
was thinking of the Bird-Meertens notion of ‘regular’
(contrasted with ‘nested’) types. Another complained
that it said nothing more than Huet had in ‘The
Zipper’. The general consensus, which is fair enough, was
that the paper contained an interesting observation, but
was short on
explanation. One thing I understated was that I'd given a universe
construction for the regular types,
closed under μ in dependent type theory. This kicked off a line of
work in generic programming.
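The observation is easiest to see for lists: a one-hole context for an element of a list is just the pair of lists to either side of the hole, which is Huet's zipper for that case. A minimal Python sketch of the idea (the function names are mine, not the paper's):

```python
# A one-hole context for a list element is (prefix, suffix): the list
# with one element removed, remembering what sat on either side.

def contexts(xs):
    """All (context, element) pairs obtained by puncturing a list."""
    return [((xs[:i], xs[i + 1:]), xs[i]) for i in range(len(xs))]

def plug(ctx, x):
    """Fill the hole of a context with an element, rebuilding the list."""
    prefix, suffix = ctx
    return prefix + [x] + suffix

# Plugging each element back into its own context recovers the list.
xs = [1, 2, 3]
for ctx, x in contexts(xs):
    assert plug(ctx, x) == xs
```

The count of contexts matches the differential-calculus intuition: a list of length n has n one-hole contexts, just as the derivative of X^n has coefficient n.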
- Derivatives of Containers,
with Michael Abbott, Thorsten Altenkirch and Neil Ghani. in
proceedings of TLCA 2003.
I was working for Durham but living in Nottingham, popping in to see
Thorsten on Friday afternoons. He, Neil and Michael had been working on
containers, which they'd presented at FOSSACS (revised for TCS). This
way of looking at data structures had enabled them to give a uniform
definition of the derivative, with respect to which we were able to verify
the calculation I'd presented. They asked me to help write it up. At the
time, I was even less of a category theorist than I am now, so it's easy
to tell which bits are mine.
- Exploring the Regular Tree Types,
with Peter Morris and Thorsten Altenkirch. In proceedings of TYPES 2004.
I'd just started working for Nottingham, and we were just getting
the hang of programming in Epigram. Thorsten and I gave Peter a copy of
‘the reject’ and he got cracking. As you can see, we were still
in some angst about what to call the types: I think this variant is due
to Uday Reddy. In this paper, the universe construction comes first and
calculus provides the finale.
- ∂ for Data, with Michael
Abbott, Thorsten Altenkirch and Neil Ghani. In Fundamenta Informaticae,
(65, 1&2), 2005.
I made much more of a nuisance of myself when we came to write
the journal version of ‘der cont’. I also made a lot more
pictures. The story about inductive types is much more algorithmic.
Some interesting notational stuff comes in here: pattern matching
quantifiers and the like. Also, some amusing use of views.
- Why walk when you can take the tube?,
with Lucas Dixon and Peter Hancock. Work in progress.
I was visiting Hank in Edinburgh. We'd just spent the afternoon in
his garden, talking about indexed containers, but that's
another story. Exhausted, we went for a pint at wee Bennet's, where Hank
suggested writing a ‘sort of “down” operation’. Some time later, Lucas
was visiting us in Nottingham: on the way to the bus stop, he suggested
representing
proof terms in a way which made it easier to get hold of the
occurrences of metavariables, for greater rapidity of progress. Could we
use derivatives to represent paths to metavariables, he asked?
I started cooking. Turned out, Hank's down operator was just the
tool for the job. The draft is little more than the lhs2TeXing of the code.
More chat required.
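The kind of machinery at stake can be sketched for the simplest case, a Huet-style zipper on binary trees: going down records the one-hole context of the step taken, and going up plugs the focus back in. A toy Python version (the names `go_down_left`, `go_up` and the breadcrumb encoding are my illustration, not the paper's):

```python
# Huet-style zipper for binary trees: Node(left, right) or a leaf value.
# Each breadcrumb records which way we went and the sibling we left behind,
# i.e. a one-hole context for a single Node.

class Node:
    def __init__(self, left, right):
        self.left, self.right = left, right

    def __eq__(self, other):
        return (isinstance(other, Node)
                and self.left == other.left
                and self.right == other.right)

def go_down_left(focus, crumbs):
    # The hole is on the left; remember the right sibling.
    return focus.left, [("L", focus.right)] + crumbs

def go_down_right(focus, crumbs):
    return focus.right, [("R", focus.left)] + crumbs

def go_up(focus, crumbs):
    # Plug the focus back into the most recent one-hole context.
    (side, sibling), rest = crumbs[0], crumbs[1:]
    node = Node(focus, sibling) if side == "L" else Node(sibling, focus)
    return node, rest

t = Node(Node(1, 2), 3)
f, cs = go_down_left(t, [])    # focus on Node(1, 2)
f, cs = go_down_right(f, cs)   # focus on 2
assert f == 2
f, cs = go_up(f, cs)
f, cs = go_up(f, cs)
assert f == t and cs == []
```

The list of breadcrumbs is exactly a path from the root to the focus, which is the role derivatives would play for paths to metavariables.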
- Clowns to the left of me, jokers to the
right. Work in progress, 2007.
I've been murmuring about dissecting data structures for
ages. Back in 2002, I'd done this gig in Nottingham about universes,
with division of regular expressions as the example, capturing
the notion of leftmost hole. My subsequent
attempts to express division for you-name-it ran into trouble at
composition: a leftmost hole in an F⋅G does not necessarily
sit in the leftmost F position: you can have a bunch of closed
Gs to the left of the G with the leftmost element. What to do?
Round about the same time, I wrote mirror for binary trees
tail-recursively, on a beer mat in the Elm Tree in Durham
while James McKinna and I were waiting for the quiz. But how to write
any old fold? Dissection! I did a bit of a turn about it at the
CODS Workshop, Nottingham, June 2005, but it's high time I wrote it up.
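The beer-mat program can be reconstructed roughly like this: drive the traversal from an explicit stack whose frames hold either a subtree still to be visited (a joker) or results already computed (clowns), so the recursion becomes a tail-recursive loop. This is my reconstruction in Python, not the code from any of the papers:

```python
# Tail-recursive mirror of a binary tree, where a tree is either a leaf
# value or a pair (left, right). The stack dissects the work in progress:
# "todo" frames are jokers (unvisited subtrees); mirrored subtrees pile up
# on `done` as clowns, and a "build" frame pairs the two most recent ones.

def mirror(t):
    stack = [("todo", t)]
    done = []  # mirrored subtrees, most recent on top
    while stack:
        tag, payload = stack.pop()
        if tag == "todo":
            if isinstance(payload, tuple):
                left, right = payload
                # Visit both children, then rebuild with them swapped.
                stack.append(("build", None))
                stack.append(("todo", left))
                stack.append(("todo", right))
            else:
                done.append(payload)
        else:  # "build"
            mirrored_left = done.pop()
            mirrored_right = done.pop()
            done.append((mirrored_right, mirrored_left))
    return done[0]

assert mirror((1, (2, 3))) == ((3, 2), 1)
```

The stack plays the part of the dissection: at every moment it separates the clowns already to the left of the current position from the jokers still to its right.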
I collect one-hole contexts in strings which always yield words
when a vowel is plugged in the hole. Examples (in various dialects of English)
include b.g, b.ll, b.t,
d.ll, d.n, d.ne, g.t, h.m, h.t, l.g, m.te, n.t, p.t, t.t, w.n, but even
some of these
may be debatable. Yes, the whole thing is S.lly.
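The game mechanises easily: treat the dotted string as a one-hole context over characters and plug each vowel into the hole. A toy Python sketch, with a deliberately tiny stand-in lexicon (the word list here is mine, not a claim about any dialect):

```python
# Plug each vowel into the '.' hole of a string context and keep the
# results found in a small, illustrative word list.

VOWELS = "aeiou"
WORDS = {"bag", "beg", "big", "bog", "bug",
         "pat", "pet", "pit", "pot", "put"}

def plugs(context):
    """All lexicon words obtained by plugging a vowel into the hole."""
    return sorted(w for v in VOWELS
                  if (w := context.replace(".", v, 1)) in WORDS)

assert plugs("b.g") == ["bag", "beg", "big", "bog", "bug"]
```

A context that yields a word for every vowel, like b.g above, is a full house; most, as noted, fall short or depend on whose English you ask.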