Tuesday, December 30, 2014
Higher order functions and interface files.
Work on DDC progresses, though recently it's been straightforward compiler engineering rather than anything novel, so I haven't been blogging.
DDC now drops interface files when compiling modules, multi-module compilation works, and the front-end supports Unicode string literals. I've also implemented a lambda lifter and support for higher order functions. This part is well baked enough to implement simple stream functions, as used in the Haskell Data.Vector library, though the compiler doesn't do the associated fusion yet.
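For background, Data.Vector-style stream fusion represents a stream as a stepper function plus a seed state, so combinators rewrite the stepper rather than allocating intermediate structures, and a consumer drives everything in a single loop. Here is a minimal Haskell sketch of the idea (illustrative only, not the actual Data.Vector implementation):

    -- A stream is a stepper function plus a starting state.
    data Stream s a = MkStream (s -> Step s a) s
    data Step   s a = Yield a s | Skip s | Done

    -- Mapping just wraps the stepper, so nested maps compose
    -- without building an intermediate stream.
    smap :: (a -> b) -> Stream s a -> Stream s b
    smap f (MkStream step s0) = MkStream step' s0
      where
        step' s = case step s of
                        Yield x s' -> Yield (f x) s'
                        Skip s'    -> Skip s'
                        Done       -> Done

    -- A consumer drives the stepper in one tail-recursive loop.
    sfoldl :: (b -> a -> b) -> b -> Stream s a -> b
    sfoldl g z (MkStream step s0) = go z s0
      where
        go acc s = case step s of
                        Yield x s' -> go (g acc x) s'
                        Skip s'    -> go acc s'
                        Done       -> acc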
For example, here are the stream versions of map and take in Disciple Source Tetra:

    data Stream (s a : Data) where
            MkStream : (s -> Step s a) -> s -> Stream s a

    data Step (s a : Data) where
            Yield : a -> s -> Step s a
            Skip  : s -> Step s a
            Done  : Step s a

    -- | Apply a function to every element of a stream.
    smap (f : a -> b) (ss : Stream s a) : Stream s b
     = case ss of
        MkStream stepA sA0
         -> let stepB q
                 = case stepA q of
                        Yield x sA1 -> Yield (f x) sA1
                        Skip sA2    -> Skip sA2
                        Done        -> Done
            in MkStream stepB sA0

    -- | Take the given number of elements from the front of a stream.
    stake (n : Nat) (ss : Stream s a) : Stream (Tup2 s Nat) a
     = case ss of
        MkStream fA sA0
         -> let stepB q
                 = case q of
                        T2 sA ix
                         -> if ix >= n
                             then Done
                             else case fA sA of
                                    Yield x sA2 -> Yield x (T2 sA2 (ix + 1))
                                    Skip sA3    -> Skip (T2 sA3 ix)
                                    Done        -> Done
            in MkStream stepB (T2 sA0 0)

As we can see, work on larger demos is starting to be hampered by the lack of pattern matching sugar in the source language, so I'm going to add that next. After adding pattern matching, I'm going to spend a couple of weeks bug fixing, and should get the DDC 0.5 release out in early February 2015.

More info at The Disco Discus Compiler (DDC) Homepage.
Sunday, March 2, 2014
Civilized error messages
I spent the weekend fixing bugs in preparation for the upcoming 0.4 release. I've finished all the necessary compiler hacking, but still need to update the cabal files, tutorial, and wiki before releasing it proper.
I've made sure to fix all the bugs that would result in a poor user experience for people who just want to spend an afternoon kicking the tyres. Even though DDC still has lots of missing features (garbage collection, compilation of higher order functions, etc.), the stuff that is implemented is reasonably well baked. If one tries to do something unsupported, it should at least give a civilized error message.
For example:
Trying to use an anonymous type function (higher order functions are not supported yet):

    module Test with letrec
     id [a : Data] (x : a) : a
      = x

     foo (_ : Unit) : Unit
      = do   id (/\a. \(x : a). x)
             ()

    Fragment violation when converting Tetra module to Salt module.
      Cannot convert expression.
        Cannot convert type abstraction in this context.
        The program must be lambda-lifted before conversion.
      with: /\(a : Data). \(x : a). x
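For background, lambda lifting names each nested abstraction as a new top-level binding, passing its free variables as extra parameters. A small Haskell illustration of the transformation (a hypothetical example, not DDC output):

    -- Before lifting: a nested lambda that captures 'n'.
    before :: Int -> [Int]
    before n = map (\x -> x + n) [1, 2, 3]

    -- After lifting: the lambda becomes a top-level function and
    -- its free variable 'n' is passed explicitly.
    lifted :: Int -> Int -> Int
    lifted n x = x + n

    after :: Int -> [Int]
    after n = map (lifted n) [1, 2, 3]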
Trying to use higher kinded type arguments (which need an 'Any' region to implement safely, in general):

    module Test
    data List (a : Data) where
            Nil  : List a
            Cons : a -> List a -> List a

    with letrec
     foo [a : Data ~> Data] [b : Data] (x : b) : b
      = x

     bar (_ : Unit) : Nat#
      = foo [List] [Nat#] 5#

    Cannot convert expression.
      Unsupported type argument to function or constructor.
      In particular, we don't yet handle higher kinded type arguments.
      See [Note: Salt conversion for higher kinded type arguments]
      in the implementation of the Tetra to Salt conversion.
      with: [List]
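For comparison, here is what a higher kinded type parameter looks like in Haskell, where f plays the role of the (Data ~> Data) parameter being instantiated with a type constructor (an illustrative sketch only):

    -- 'f' has kind * -> *, the analogue of DDC's (Data ~> Data).
    pairUp :: f Int -> f Int -> (f Int, f Int)
    pairUp x y = (x, y)

    -- Instantiating 'f' at the list constructor.
    usePair :: ([Int], [Int])
    usePair = pairUp [1] [2]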
Trying to partially apply a primitive operator:

    module Test with letrec
     thing (_ : Unit) : Nat# -> Nat#
      = add# [Nat#] 5#

    Fragment violation when converting Tetra module to Salt module.
      Cannot convert expression.
        Partial application of primitive operators is not supported.
      with: add# [Nat#] 5#
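The usual workaround for this sort of restriction is to eta-expand so the primitive appears fully applied. In Haskell terms (an analogy, not DDC syntax):

    -- Partially applied primitive, analogous to 'add# [Nat#] 5#'.
    partial :: Int -> Int
    partial = (+) 5

    -- Eta-expanded form: the primitive is now fully applied inside
    -- an ordinary lambda, which may be passed around freely.
    expanded :: Int -> Int
    expanded = \x -> 5 + x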
In contrast, the old ddc-alpha compiler (which I wrote for my PhD work) would signal its displeasure with partially applied primops by producing a program that segfaulted at runtime. We turn our backs on the bad old days.
Nested Data Types
Pleasingly, the new type inferencer does seem to work with some non-standard programs -- even though these don't compile all the way to object code yet. Here is a Core Tetra program using nested data types:

    module Test
    data Tuple2 (a b : Data) where
            T2 : a -> b -> Tuple2 a b

    data Nest (a : Data) where
            NilN  : Nest a
            ConsN : a -> Nest (Tuple2 a a) -> Nest a

    with letrec
     thing (_ : Unit)
      = ConsN 7# (ConsN (T2 1# 2#) (ConsN (T2 (T2 6# 7#) (T2 7# 4#)) NilN))

This example is based on one from the Bird and Meertens 1998 paper. Note that the second argument of the ConsN constructor takes a Nest where the element type is more complex than the original parameter. The type inference algorithm in the alpha compiler would have diverged on this program.
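The same datatype can be written in Haskell, where any function over it needs polymorphic recursion and therefore an explicit type signature. A small sketch:

    -- The recursive occurrence of Nest is at a larger element
    -- type than the parameter, so the recursion is non-uniform.
    data Nest a = NilN | ConsN a (Nest (a, a))

    -- Counting elements needs polymorphic recursion: the signature
    -- is mandatory, since assuming monomorphic recursion would
    -- force Nest a ~ Nest (a, a) and fail the occurs check.
    size :: Nest a -> Int
    size NilN         = 0
    size (ConsN _ ns) = 1 + 2 * size ns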
Higher Rank Types
I've also tried out some simple examples with higher ranked types; here is one:

    module Test with letrec
     thing1 (blerk : ([a : Data]. a -> a) -> Nat#) : Nat#
      = blerk (/\a. \(x : a). x)

     thing2 (u : Unit)
      = thing1 (\(f : [a : Data]. a -> a). f 5#)

thing1 has a rank-3 type because it is a function that takes a function that takes a function which is polymorphic. There is a quantifier that appears at depth 3, contravariantly. Writing the type of thing1 in full makes this easier to see:
    thing1 :: (([a : Data]. a -> a) -> Nat#) -> Nat#

Again, I can't compile this example to object code yet because code generation for higher order functions isn't finished. However, type inference is a separate concern, and I don't know of any remaining problems in this part.
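For reference, the analogous program in Haskell needs the RankNTypes extension; a direct translation (a sketch, with Int standing in for Nat#):

    {-# LANGUAGE RankNTypes #-}

    -- Rank-3: a quantifier sits at depth 3 in the type,
    -- in a contravariant position.
    thing1 :: ((forall a. a -> a) -> Int) -> Int
    thing1 blerk = blerk (\x -> x)

    thing2 :: Int
    thing2 = thing1 (\f -> f 5)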
Tuesday, February 11, 2014
Bidirectional type inference for DDC
DDC now includes a bidirectional type inferencer based on Joshua Dunfield and Neelakantan Krishnaswami's recent ICFP paper "Complete and Easy Bidirectional Typechecking for Higher-Rank Polymorphism". I've extended the base algorithm to infer the kinds of type parameters, as well as to automatically insert type abstractions and applications into the result.
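The heart of the bidirectional approach is to split typing into an inference mode, where the type is an output, and a checking mode, where an expected type is pushed into the term. A toy Haskell sketch for a simply typed fragment gives the flavour, though it is far simpler than the algorithm in the paper:

    data Ty = TUnit | TFun Ty Ty
      deriving (Eq)

    data Exp
      = Var String
      | Unit
      | Lam String Exp
      | App Exp Exp
      | Ann Exp Ty        -- an annotation switches from checking to inference

    type Ctx = [(String, Ty)]

    -- Inference mode: synthesise a type from the term.
    infer :: Ctx -> Exp -> Maybe Ty
    infer ctx (Var x)   = lookup x ctx
    infer _   Unit      = Just TUnit
    infer ctx (Ann e t) = check ctx e t >> Just t
    infer ctx (App f a) = do
        TFun t1 t2 <- infer ctx f
        check ctx a t1
        Just t2
    infer _ (Lam _ _)   = Nothing    -- bare lambdas are only checkable

    -- Checking mode: push the expected type into the term.
    check :: Ctx -> Exp -> Ty -> Maybe ()
    check ctx (Lam x e) (TFun t1 t2) = check ((x, t1) : ctx) e t2
    check ctx e t = do
        t' <- infer ctx e
        if t == t' then Just () else Nothing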
Type inference for Disciple Source Tetra
With both the type inferencer and new coeffect system now working, Disciple source programs are looking significantly less offensive. For example, some simple list functions:

    data List (a : Data) where
            Nil  : List a
            Cons : a -> List a -> List a

    singleton (x : a) : List a
     = Cons x Nil

    append (xx : List a) (yy : List a) : List a
     = case xx of
            Nil       -> yy
            Cons x xs -> Cons x (append xs yy)

    mapS [a b : Data] [e : Effect]
         (f : a -> S e b) (xx : List a) : S e (List b)
     = box case xx of
            Nil       -> Nil
            Cons x xs -> Cons (run f x) (run mapS f xs)

etc. etc. The above 'mapS' function is the version using the coeffect system I described in my last post. Using effects currently requires the user to add explicit 'box' and 'run' casts, though these will be automatically inserted in the next DDC release.
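As a loose analogy for readers coming from Haskell: 'S e b' behaves like a suspended effectful computation, with 'box' delaying and 'run' forcing, so mapS corresponds roughly to a monadic map (an illustrative sketch only; the two effect systems differ):

    -- A rough Haskell analogue of mapS, with IO standing in for
    -- the effect 'e'. The binds play the role of 'run', and
    -- building the returned computation plays the role of 'box'.
    mapS :: (a -> IO b) -> [a] -> IO [b]
    mapS _ []       = pure []
    mapS f (x : xs) = do
        y  <- f x           -- run f x
        ys <- mapS f xs     -- run (mapS f xs)
        pure (y : ys)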
The DDC command-line interface allows one to apply the type inferencer to such a source file, producing a core file with explicit type annotations, like so:
    $ bin/ddc -infer -load demo/Tetra/Lists/Lists.dst
    ...
    module Lists
    data List (a : Data) where {
            Nil  : List a;
            Cons : a -> List a -> List a;
    }
    with letrec {
      singleton : [a : Data]. a -> List a
        = /\(a : Data). \(x : a).
          Cons [a] x (Nil [a]);

      append : [a : Data]. List a -> List a -> List a
        = /\(a : Data). \(xx yy : List a).
          case xx of {
            Nil -> yy;
            Cons (x : a) (xs : List a)
              -> Cons [a] x (append [a] xs yy)
          };

      mapS : [a b : Data]. [e : Effect]. (a -> S e b) -> List a -> S e (List b)
        = /\(a b : Data). /\(e : Effect).
          \(f : a -> S e b). \(xx : List a).
          box
          case xx of {
            Nil -> Nil [b];
            Cons (x : a) (xs : List a)
              -> Cons [b] (run f x) (run mapS [a] [b] [e] f xs)
          }
    }
    ...

Such a file can then be converted to C or LLVM for compilation to object code. In the current implementation higher-order programs will type-check, but cannot be compiled all the way to object code. I need to finish the runtime support for that.
Type inference for Disciple Core Salt
In practical terms, work on the runtime system is already being helped by the new type inferencer. The DDC runtime is written in a first-order imperative fragment named 'Disciple Core Salt', which is compiled with DDC itself. Here is the code that allocates a boxed heap object on a 64-bit platform:

    allocBoxed [r : Region] (tag : Tag#) (arity : Nat#) : Ptr# r Obj
     = do
        -- Multiply arity by 8 bytes-per-pointer to get the size of the payload.
        bytesPayload = shl# arity (size2# [Addr#])
        bytesObj     = add# (size# [Word32#])
                            (add# (size# [Word32#]) bytesPayload)

        -- Check there is enough space on the heap.
        case check# bytesObj of
         True#  -> allocBoxed_ok tag arity bytesObj
         False# -> fail#

    allocBoxed_ok [r : Region] (tag : Tag#) (arity : Nat#) (bytesObj : Nat#) : Ptr# r Obj
     = do
        addr    = alloc# bytesObj

        tag32   = promote# [Word32#] [Tag#] tag
        format  = 0b00100001w32#
        header  = bor# (shl# tag32 8w32#) format
        write# addr 0# header

        -- Truncate arity to 32 bits.
        arity32 = truncate# [Word32#] [Nat#] arity
        write# addr 4# arity32

        makePtr# addr

Some type annotations are still required to specify data formats and the like, but all the day-to-day annotations can be inferred.
    $ bin/ddc -infer -load packages/ddc-code/salt/runtime64/Object.dcs
    [slightly reformatted]
    ...
    allocBoxed : [r : Region]. Tag# -> Nat# -> Ptr# r Obj
      = /\(r : Region). \(tag : Tag#). \(arity : Nat#).
        let bytesPayload : Nat#
              = shl# [Nat#] arity (size2# [Addr#]) in
        let bytesObj : Nat#
              = add# [Nat#] (size# [Word32#])
                     (add# [Nat#] (size# [Word32#]) bytesPayload) in
        case check# bytesObj of {
          True#  -> allocBoxed_ok [r] tag arity bytesObj;
          False# -> fail# [Ptr# r Obj]
        };

    allocBoxed_ok : [r : Region]. Tag# -> Nat# -> Nat# -> Ptr# r Obj
      = /\(r : Region). \(tag : Tag#). \(arity bytesObj : Nat#).
        let addr    : Addr#   = alloc# bytesObj in
        let tag32   : Word32# = promote# [Word32#] [Tag#] tag in
        let format  : Word32# = 33w32# in
        let header  : Word32# = bor# [Word32#] (shl# [Word32#] tag32 8w32#) format in
        let _       : Void#   = write# [Word32#] addr 0# header in
        let arity32 : Word32# = truncate# [Word32#] [Nat#] arity in
        let _       : Void#   = write# [Word32#] addr 4# arity32 in
        makePtr# [r] [Obj] addr;
    ...
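To make the layout concrete: a boxed object starts with a 32-bit header word holding the tag and format bits, then a 32-bit arity word, then arity pointer-sized payload slots. A small Haskell sketch of the same arithmetic (illustrative only, not part of the runtime):

    import Data.Bits (shiftL, (.|.))
    import Data.Word (Word32)

    -- Header word: tag in the upper 24 bits, format byte below.
    header :: Word32 -> Word32
    header tag = (tag `shiftL` 8) .|. 0x21    -- 0x21 = 0b00100001

    -- Size of a boxed object on a 64-bit platform: two 32-bit
    -- header words plus 8 bytes per payload pointer.
    bytesObj :: Int -> Int
    bytesObj arity = 4 + 4 + arity * 8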
Upcoming 0.4 Release
There are still some bugs to fix before I can enable type inference by default, but once that is done I will release DDC 0.4, hopefully by the end of the month.