This year I finally made it to ICFP after years of wanting to go. Here are notes on a few of the talks I was most interested in.
This paper presents an idea I’ve been very excited about: describing
transformations between algebraic data types rather than the traditional diff
approach of comparing textual representations line by line.
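To give a flavor of what that means, here is a tiny sketch of my own (the paper’s actual encoding is type-indexed and far more general): a diff over a toy tree type that speaks in terms of constructors rather than lines of text.

```haskell
-- Toy tree type and a structural diff over it. This is my own
-- illustration of the idea, not the paper's representation.
data Tree = Leaf Int | Node Tree Tree
  deriving (Eq, Show)

data TreeDiff
  = Same                          -- subtrees already identical
  | ChangeLeaf Int Int            -- leaf value changed: old, new
  | ChangeNode TreeDiff TreeDiff  -- recurse into both children
  | Replace Tree Tree             -- shape changed: wholesale replacement
  deriving Show

diff :: Tree -> Tree -> TreeDiff
diff a b
  | a == b = Same
diff (Leaf x)   (Leaf y)     = ChangeLeaf x y
diff (Node l r) (Node l' r') = ChangeNode (diff l l') (diff r r')
diff a b                     = Replace a b
```

The point is that the output is itself a value of a data type, so it can be inspected, transformed, and type-checked like any other value.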
Some thoughts:
diff is an insufficient tool and just begging to be superseded by better (structural, type-aware) tools.

As an aside, James mentioned Quantified Class Constraints. The introduction of that paper is an easy read, mostly historical. A few highlights:
And (Higher Order) Hereditary Harrop Formulas:
Hereditary Harrop formulas are an extension to Horn clauses in which the body of clauses can contain implications and universal quantifiers… We have designed and built a logic programming system which implements hohh in much the same way Prolog implements first-order Horn clauses. This language and its interpreter, collectively called λProlog, will be described.
As for the main idea of the talk, James proposes the following:
There’s of course a lot more to it, but for that I’ll direct you to James’ abstract.
This piqued my interest as I’ve been doing a lot of generic programming. Abstract:
Generic programming libraries such as Scrap Your Boilerplate eliminate the need to write repetitive code, but typically introduce significant performance overheads. This leaves programmers with the regrettable choice between writing succinct but slow programs and writing tedious but efficient programs. Applying structured multi-stage programming techniques transforms Scrap Your Boilerplate from an inefficient library into a typed optimising code generator, bringing its performance in line with hand-written code, and so combining high-level programming with uncompromised performance.
Great paper.
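For context on the “succinct but slow” trade-off, this is the kind of traversal SYB makes trivial to write (a standard SYB-style example of my own, not code from the paper); the staged approach generates specialized code for traversals like this instead of interpreting them generically at run time.

```haskell
{-# LANGUAGE DeriveDataTypeable #-}
import Data.Generics (Data, everywhere, mkT)

-- A small nested structure (hypothetical example data).
data Company  = Company [Dept]          deriving (Data, Show)
data Dept     = Dept String [Employee]  deriving (Data, Show)
data Employee = Employee String Float   deriving (Data, Show)

-- One generic traversal instead of a hand-written recursive function
-- per type. The cost is that the traversal machinery (run-time type
-- tests, generic recursion) is paid on every node at run time.
raise :: Float -> Company -> Company
raise k = everywhere (mkT bump)
  where
    bump (Employee name salary) = Employee name (salary * (1 + k))
```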
To me this was some of the most impressive work at ICFP. The common thread seems to be tooling to compile functional programs to hardware. As always, Conal does it in the most elegant way possible, and explains his work very well.
Carter and I presented at the Haskell Implementors’ Workshop on some work we’ve done related to modeling financial contracts. The talk had four main themes, as I see it:
Thinking about GHC Core as a compiler target. We may or may not be serious about doing this. If we do, we should use GHC Core rather than inventing our own compiler intermediate language: Core embodies decades’ worth of engineering experience and active use, so by building on it we avoid repeating old mistakes.
n-ary versions of the linear operators are neat. As an example, the par operator can be thought of as a fork that, in order to typecheck, guarantees that each of its children cleans up its resources. The 0-ary version (forking zero children) is equivalent to halt.
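A rough sketch of how I picture this (my own toy encoding, not what we presented): the 0-ary par falls out of the n-ary one for free.

```haskell
-- Toy process syntax (my own sketch): an n-ary par node forks any
-- number of children. A real linear type system would split the
-- available resources among the children so each one is forced to
-- clean up its share; that checking is elided here.
data Process r
  = Par [Process r]  -- fork n children
  | Consume r        -- use up a single resource
  deriving Show

-- Forking zero children leaves nothing to run and nothing to clean up:
-- the 0-ary par is exactly halt.
halt :: Process r
halt = Par []
```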
GHC Core has a notion of demand analysis, which does both strictness and absence analysis. This looks a lot like linearity checking! We should figure out how exactly they’re related.
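A concrete (if simplistic) illustration of what demand analysis reports, using an example of my own:

```haskell
-- GHC's demand analysis would see that `f` always evaluates its first
-- argument (strictness) and never touches its second (absence).
-- Usage information like this is close in spirit to the "used exactly
-- once / not at all" judgments a linearity checker makes.
f :: Int -> Int -> Int
f x _ = x + 1
```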
Closely related to the last: GHC Core is based on System F-omega (well, System FC). We’ve started some work on integrating F-omega with linear types. For similar work, see Do Be Do Be Do and Quantitative Type Theory.