I have been reading the Haskell wiki page on type arithmetic and am having a little trouble with the section on embedding the lambda calculus in the type system. From that section, I gathered that the following code should not work with GHC/GHCi: the constraint in g's signature encodes the divergent term Ω = (λx. x x) (λx. x x), so GHC supposedly shouldn't be able to determine a type for g.
{-# OPTIONS -fglasgow-exts #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE UndecidableInstances #-}

-- Type-level lambda terms: X is the sole variable, App is application,
-- and Lam binds X in its body.
data X
data App t u
data Lam t

-- Subst s t u: substituting t for X in s yields u. Substitution does not
-- descend into a Lam, since the X inside is bound.
class Subst s t u | s t -> u
instance Subst X u u
instance (Subst s u s', Subst t u t') => Subst (App s t) u (App s' t')
instance Subst (Lam t) u (Lam t)

-- Apply s t u: beta-reduce the application (s t) and evaluate the result.
class Apply s t u | s t -> u
instance (Subst s t u, Eval u u') => Apply (Lam s) t u'

-- Eval t u: t evaluates to normal form u.
class Eval t u | t -> u
instance Eval X X
instance Eval (Lam t) (Lam t)
instance (Eval s s', Apply s' t u) => Eval (App s t) u

-- (λx. x) x, which should evaluate to x:
f :: Eval (App (Lam X) X) u => u
f = undefined

-- Ω = (λx. x x) (λx. x x), whose evaluation should never terminate:
g :: Eval (App (Lam (App X X)) (Lam (App X X))) u => u
g = undefined
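If I unfold g's constraint by hand using the instances above, I end up right back at the constraint I started with, which is exactly why I expected the type checker to loop. This trace is my own reading of the code, not something from the wiki page:

-- Eval (App (Lam (App X X)) (Lam (App X X))) u
--   by the Eval (App s t) instance, this needs:
--     Eval (Lam (App X X)) s'                -- s' = Lam (App X X)
--     Apply (Lam (App X X)) (Lam (App X X)) u
--   by the Apply (Lam s) instance, this needs:
--     Subst (App X X) (Lam (App X X)) v      -- v = App (Lam (App X X)) (Lam (App X X))
--     Eval v u                               -- the original constraint again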
Note that I had to add FlexibleContexts and UndecidableInstances, since the given code doesn't compile without them. However, when I load this into GHCi (version 8.0.2) and query the types, I get the following results:
*Main> :t f
f :: X
*Main> :t g
g :: u
This is especially strange to me, because u isn't a type defined anywhere: it is just the type variable from g's signature, yet GHCi reports it with no constraint attached at all. Is this a result of the two language extensions above interacting with each other and with glasgow-exts? If so, how?
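In case it matters for the question: as far as I can tell, the pieces of the deprecated -fglasgow-exts bundle that this snippet actually relies on are MultiParamTypeClasses and FunctionalDependencies, so I'd expect the header to be equivalent to spelling the extensions out like this:

{-# LANGUAGE MultiParamTypeClasses  #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE FlexibleContexts       #-}
{-# LANGUAGE UndecidableInstances   #-}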