The types that are inferred by Hindley/Milner type inference can become exponentially large if you compose them in the right way. For example:
fun f x = (x, x, x)
val t = f (f (f (f (f 0))))
Here, t is a nested triple whose nesting depth corresponds to the number n of invocations of f. Consequently, the overall type has size 3^n.
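To see the growth concretely, here is a hand-unrolled version of the same chain (t1, t2, t3 are just illustrative names for the intermediate results; the inferred types are written as comments):
val t1 = f 0     (* : int * int * int -- 3 leaves *)
val t2 = f t1    (* : (int * int * int) * (int * int * int) * (int * int * int) -- 9 leaves *)
val t3 = f t2    (* : three copies of t2's type -- 27 leaves, and so on: 3^n after n applications *)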
However, this is not actually the worst case, since the type has a regular structure and can be represented efficiently as a graph in linear space (on each level, all three constituent types are identical and can be shared).
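As a rough sketch of what such a shared (DAG) representation looks like, assuming a toy type datatype (not the representation of any particular compiler):
datatype ty = Int | Tuple of ty * ty * ty

(* Type of n nested applications of f, built with sharing:
   each level reuses a single reference to the previous level,
   so only n Tuple nodes are allocated even though the unfolded
   tree would have 3^n leaves. *)
fun nest 0 = Int
  | nest n = let val t = nest (n - 1) in Tuple (t, t, t) end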
The real worst case uses polymorphic instantiation to defeat this:
fun f x y z = (x, y, z)
val p1 = (f, f, f)
val p2 = (p1, p1, p1)
val p3 = (p2, p2, p2)
In this case, the type is again exponentially large, but unlike above, all constituent types involve distinct, fresh type variables, so that even a graph representation grows exponentially (in the number of pN declarations).
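Concretely, because every occurrence of a polymorphic binding is instantiated with fresh type variables, a checker infers something like the following (the variable names are whatever the implementation picks; the counts are the point):
(* f  : 'a -> 'b -> 'c -> 'a * 'b * 'c                      3 type variables *)
(* p1 : ('a -> 'b -> 'c -> 'a * 'b * 'c)
      * ('d -> 'e -> 'f -> 'd * 'e * 'f)
      * ('g -> 'h -> 'i -> 'g * 'h * 'i)                    9 distinct variables *)
(* p2 : 27 distinct variables, p3 : 81, and pN : 3^(N+1) in general,
   none of which can be shared in a graph representation. *)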
So yes, Hindley/Milner-style type inference is worst-case exponential (in both space and time). It is worth pointing out, however, that the exponential cases can only occur where the types themselves become exponentially large -- i.e., in cases that you could not even realistically express without type inference.