38
votes

I have a weird scenario where type inference isn't working as I'd expect when using a lambda expression. Here's an approximation of my real scenario:

static class Value<T> {
}

@FunctionalInterface
interface Bar<T> {
  T apply(Value<T> value); // Change here resolves error
}

static class Foo {
  public static <T> T foo(Bar<T> callback) {
    return callback.apply(new Value<>()); // minimal body so the example compiles
  }
}

void test() {
  Foo.foo(value -> true).booleanValue(); // Compile error here
}

The compile error I get on the second to last line is

The method booleanValue() is undefined for the type Object

if I cast the lambda to Bar<Boolean>:

Foo.foo((Bar<Boolean>)value -> true).booleanValue();

or if I change the method signature of Bar.apply to use raw types:

T apply(Value value);

then the problem goes away. The way I'd expect this to work is that:

  • The Foo.foo call should infer a return type of Boolean.
  • value in the lambda should be inferred as Value<Boolean>.

Why doesn't this inference work as expected and how can I change this API to make it work as expected?

6
Which Java version do you use? - aw-think
@NwDx Java 1.8.0_25 - Josh Stone
You simply have a type mismatch, because Boolean is not a Value<Boolean>. How should the compiler know that T should be a Value<T>? Since it can't, it assumes Object. Look at the Function interface's apply method. Value is a concrete type in the context of your Bar interface. - aw-think
@NwDx I think what you're describing is the basis for @fukanchik's answer below. The problem there is that the compiler infers Value<Object> instead of Value<Boolean>, which is what I'm after. Please feel free to share any ideas on how to get it to work. - Josh Stone
If you were not throwing away the return type of Foo.foo, the compiler could use the target type to derive bounds for T... - Brian Goetz

6 Answers

32
votes

Under the Hood

Using some hidden javac features, we can get more information about what's happening:

$ javac -XDverboseResolution=deferred-inference,success,applicable LambdaInference.java 
LambdaInference.java:16: Note: resolving method foo in type Foo to candidate 0
    Foo.foo(value -> true).booleanValue(); // Compile error here
       ^
  phase: BASIC
  with actuals: <none>
  with type-args: no arguments
  candidates:
      #0 applicable method found: <T>foo(Bar<T>)
        (partially instantiated to: (Bar<Object>)Object)
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: Deferred instantiation of method <T>foo(Bar<T>)
    Foo.foo(value -> true).booleanValue(); // Compile error here
           ^
  instantiated signature: (Bar<Object>)Object
  target-type: <none>
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: error: cannot find symbol
    Foo.foo(value -> true).booleanValue(); // Compile error here
                          ^
  symbol:   method booleanValue()
  location: class Object
1 error

This is a lot of information, so let's break it down.

LambdaInference.java:16: Note: resolving method foo in type Foo to candidate 0
    Foo.foo(value -> true).booleanValue(); // Compile error here
       ^
  phase: BASIC
  with actuals: <none>
  with type-args: no arguments
  candidates:
      #0 applicable method found: <T>foo(Bar<T>)
        (partially instantiated to: (Bar<Object>)Object)
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)

phase: method applicability phase
actuals: the actual arguments passed in
type-args: explicit type arguments
candidates: potentially applicable methods

actuals is <none> because our implicitly typed lambda is not pertinent to applicability.

The compiler resolves your invocation of foo to the only method named foo in Foo. It has been partially instantiated to Foo.<Object> foo (since there were no actuals or type-args), but that can change at the deferred-inference stage.

LambdaInference.java:16: Note: Deferred instantiation of method <T>foo(Bar<T>)
    Foo.foo(value -> true).booleanValue(); // Compile error here
           ^
  instantiated signature: (Bar<Object>)Object
  target-type: <none>
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)

instantiated signature: the fully instantiated signature of foo. It is the result of this step (at this point no more type inference will be made on the signature of foo).
target-type: the context the call is being made in. If the method invocation is a part of an assignment, it will be the left hand side. If the method invocation is itself part of a method invocation, it will be the parameter type.

Since your method invocation is dangling, there is no target-type. Since there is no target-type, no more inference can be done on foo and T is inferred to be Object.


Analysis

The compiler does not use implicitly typed lambdas during inference. To a certain extent, this makes sense. In general, given param -> BODY, you will not be able to compile BODY until you have a type for param. If you did try to infer the type for param from BODY, it might lead to a chicken-and-egg type problem. It's possible that some improvements will be made on this in future releases of Java.
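
To make the chicken-and-egg point concrete, here is a small sketch (the call helper and example method are mine, not part of the question's API): the lambda's parameter type has to come from somewhere other than the body before the body can be checked.

import java.util.function.Function;

class ChickenAndEgg {

  // Hypothetical helper, not from the question: T can be inferred from the
  // second argument, so the implicitly typed lambda gets its parameter type
  // "from the outside" and the body is only checked afterwards.
  static <T, R> R call(Function<T, R> f, T arg) {
    return f.apply(arg);
  }

  void example() {
    // T = String is inferred from "hello"; only then can the body
    // 'x.length()' be type-checked (x is known to be a String).
    Integer len = call(x -> x.length(), "hello");
    System.out.println(len); // 5
  }
}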


Solutions

Foo.<Boolean> foo(value -> true)

This solution provides an explicit type argument to foo (note the with type-args section below). This changes the partial instantiation of the method signature to (Bar<Boolean>)Boolean, which is what you want.

LambdaInference.java:16: Note: resolving method foo in type Foo to candidate 0
    Foo.<Boolean> foo(value -> true).booleanValue(); // Compile error here
       ^
  phase: BASIC
  with actuals: <none>
  with type-args: Boolean
  candidates:
      #0 applicable method found: <T>foo(Bar<T>)
        (partially instantiated to: (Bar<Boolean>)Boolean)
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: resolving method booleanValue in type Boolean to candidate 0
    Foo.<Boolean> foo(value -> true).booleanValue(); // Compile error here
                                    ^
  phase: BASIC
  with actuals: no arguments
  with type-args: no arguments
  candidates:
      #0 applicable method found: booleanValue()

Foo.foo((Value<Boolean> value) -> true)

This solution explicitly types your lambda, which allows it to be pertinent to applicability (note with actuals below). This changes the partial instantiation of the method signature to (Bar<Boolean>)Boolean, which is what you want.

LambdaInference.java:16: Note: resolving method foo in type Foo to candidate 0
    Foo.foo((Value<Boolean> value) -> true).booleanValue(); // Compile error here
       ^
  phase: BASIC
  with actuals: Bar<Boolean>
  with type-args: no arguments
  candidates:
      #0 applicable method found: <T>foo(Bar<T>)
        (partially instantiated to: (Bar<Boolean>)Boolean)
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: Deferred instantiation of method <T>foo(Bar<T>)
    Foo.foo((Value<Boolean> value) -> true).booleanValue(); // Compile error here
           ^
  instantiated signature: (Bar<Boolean>)Boolean
  target-type: <none>
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: resolving method booleanValue in type Boolean to candidate 0
    Foo.foo((Value<Boolean> value) -> true).booleanValue(); // Compile error here
                                           ^
  phase: BASIC
  with actuals: no arguments
  with type-args: no arguments
  candidates:
      #0 applicable method found: booleanValue()

Foo.foo((Bar<Boolean>) value -> true)

Same as above, but with a slightly different flavor.

LambdaInference.java:16: Note: resolving method foo in type Foo to candidate 0
    Foo.foo((Bar<Boolean>) value -> true).booleanValue(); // Compile error here
       ^
  phase: BASIC
  with actuals: Bar<Boolean>
  with type-args: no arguments
  candidates:
      #0 applicable method found: <T>foo(Bar<T>)
        (partially instantiated to: (Bar<Boolean>)Boolean)
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: Deferred instantiation of method <T>foo(Bar<T>)
    Foo.foo((Bar<Boolean>) value -> true).booleanValue(); // Compile error here
           ^
  instantiated signature: (Bar<Boolean>)Boolean
  target-type: <none>
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: resolving method booleanValue in type Boolean to candidate 0
    Foo.foo((Bar<Boolean>) value -> true).booleanValue(); // Compile error here
                                         ^
  phase: BASIC
  with actuals: no arguments
  with type-args: no arguments
  candidates:
      #0 applicable method found: booleanValue()

Boolean b = Foo.foo(value -> true)

This solution provides an explicit target for your method call (see target-type below). This allows the deferred-instantiation to infer that the type parameter should be Boolean instead of Object (see instantiated signature below).

LambdaInference.java:16: Note: resolving method foo in type Foo to candidate 0
    Boolean b = Foo.foo(value -> true);
                   ^
  phase: BASIC
  with actuals: <none>
  with type-args: no arguments
  candidates:
      #0 applicable method found: <T>foo(Bar<T>)
        (partially instantiated to: (Bar<Object>)Object)
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: Deferred instantiation of method <T>foo(Bar<T>)
    Boolean b = Foo.foo(value -> true);
                       ^
  instantiated signature: (Bar<Boolean>)Boolean
  target-type: Boolean
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)

Disclaimer

This is the behavior that's occurring. I don't know if this is what is specified in the JLS. I could dig around and see if I could find the exact section that specifies this behavior, but type inference notation gives me a headache.

This also doesn't fully explain why changing Bar to use a raw Value would fix this issue:

LambdaInference.java:16: Note: resolving method foo in type Foo to candidate 0
    Foo.foo(value -> true).booleanValue();
       ^
  phase: BASIC
  with actuals: <none>
  with type-args: no arguments
  candidates:
      #0 applicable method found: <T>foo(Bar<T>)
        (partially instantiated to: (Bar<Object>)Object)
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: Deferred instantiation of method <T>foo(Bar<T>)
    Foo.foo(value -> true).booleanValue();
           ^
  instantiated signature: (Bar<Boolean>)Boolean
  target-type: <none>
  where T is a type-variable:
    T extends Object declared in method <T>foo(Bar<T>)
LambdaInference.java:16: Note: resolving method booleanValue in type Boolean to candidate 0
    Foo.foo(value -> true).booleanValue();
                          ^
  phase: BASIC
  with actuals: no arguments
  with type-args: no arguments
  candidates:
      #0 applicable method found: booleanValue()

For some reason, changing it to use a raw Value allows the deferred instantiation to infer that T is Boolean. If I had to speculate, I would guess that when the compiler tries to fit the lambda to Bar<T>, it can infer that T is Boolean by looking at the body of the lambda. This implies that my earlier analysis is incorrect: the compiler can perform type inference on the body of a lambda, but only for type variables that appear solely in the return type.
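
For reference, here is a sketch of that raw-Value variant (the same change mentioned in the question, wrapped in a class named RawVariant here so it can sit alongside the original code): with the parameter raw, T occurs only in apply's return type, and the implicit lambda's body is enough to pin it to Boolean.

class RawVariant {

  static class Value<T> {
  }

  @FunctionalInterface
  interface Bar<T> {
    T apply(Value value); // raw Value: T no longer appears in the parameter type
  }

  static class Foo {
    public static <T> T foo(Bar<T> callback) {
      return callback.apply(new Value<T>());
    }
  }

  void test() {
    // Compiles: the body 'true' gives the constraint Boolean <: T,
    // so T is inferred as Boolean even though the call is "dangling".
    Foo.foo(value -> true).booleanValue();
  }
}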

5
votes

Inference on the lambda parameter type cannot depend on the lambda body.

The compiler faces a tough job trying to make sense of an implicitly typed lambda expression like

    foo( value -> GIBBERISH )

The type of value must be inferred before GIBBERISH can be compiled, because in general the interpretation of GIBBERISH depends on the type of value.

(In your special case, GIBBERISH happens to be a simple constant independent of value.)
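
A hypothetical illustration of that dependency (the question's Value has no methods, so the get() accessor below is invented purely for this sketch):

class Gibberish {

  // Same shape as the question's classes, plus an invented accessor.
  static class Value<T> {
    private T field;
    T get() { return field; }
  }

  @FunctionalInterface
  interface Bar<T> {
    T apply(Value<T> value);
  }

  static <T> T foo(Bar<T> callback) {
    return callback.apply(new Value<>());
  }

  void example() {
    // Whether this body means anything depends on the type chosen for
    // 'value': fine if T is String, gibberish if T is Boolean. So 'value'
    // must be typed before the body can be looked at, not the other way
    // around; that is why the line below does not compile:
    // foo(value -> value.get().trim());
  }
}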

Javac must first infer Value<T> for the parameter value; there are no constraints in context, so T=Object. Then the lambda body true is compiled and recognized as a Boolean, which is compatible with T.

After you make the change to the functional interface, the lambda parameter type does not require inference; T remains uninferred at that point. Next, the lambda body is compiled, its type turns out to be Boolean, and Boolean is set as a lower bound for T.


Another example demonstrating the issue

<T> void foo(T v, Function<T,T> f) { ... }

foo("", v -> 42);  // Error. Why can't javac infer T=Object?

T is inferred to be String; the body of the lambda did not participate in the inference.

In this example, javac's behavior seems very reasonable to us; it likely prevented a programming error. You don't want inference to be too powerful; if everything we write somehow compiles, we lose confidence in the compiler's ability to find errors for us.


There are other examples where the lambda body appears to provide unequivocal constraints, yet the compiler cannot use that information. In Java, the lambda parameter types must be fixed before the body can be looked at; this is a deliberate decision. In contrast, C# is willing to try different parameter types and see which one makes the code compile. Java considers that approach too risky.

In any case, when an implicitly typed lambda fails, which happens rather frequently, provide explicit types for the lambda parameters; in your case, (Value<Boolean> value) -> true.

4
votes

The easy way to fix this is to supply an explicit type argument on the call to foo:

Foo.<Boolean>foo(value -> true).booleanValue();

Edit: Like pretty much everyone else, I can't find specific documentation about why this is necessary. I suspected it might be because of primitive types, but that wasn't right. Regardless, this syntax is called using a target type (see also target typing in lambdas). The reasons still elude me; I can't find documentation anywhere on why this particular use case requires it.

Edit 2: I found this relevant question:

Generic type inference not working with method chaining?

It looks like it's because you're chaining the methods here. According to the JSR comments referenced in the accepted answer there, this was a deliberate omission because the compiler doesn't have a way to pass inferred generic type information between chained method calls in both directions. As a result, the type information is lost by the time the call to booleanValue() is checked. Adding the target type removes this behavior by providing the constraint manually instead of letting the compiler decide using the rules outlined in JLS §18, which doesn't seem to mention this case at all. This is the only info I could come up with; if anyone finds anything better, I'd love to see it.
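
Consistent with that, breaking the chain also sidesteps the problem without an explicit type witness, because the assignment supplies the target type (a quick sketch against the question's Foo and Bar):

void test() {
  // Splitting the chain gives the foo(...) call a target type (Boolean),
  // so T is inferred correctly with no explicit <Boolean> needed.
  Boolean result = Foo.foo(value -> true);
  result.booleanValue(); // resolves fine now
}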

3
votes

Like the other answers, I'm also hoping someone smarter can point out why the compiler isn't able to infer that T is Boolean.

One way to help the compiler do the right thing, without requiring any changes to your existing class/interface design, is to explicitly declare the formal parameter's type in your lambda expression. In this case, explicitly declare that the type of the value parameter is Value<Boolean>:

void test() {
  Foo.foo((Value<Boolean> value) -> true).booleanValue();
}

1
votes

I don't know why, but adding a separate type parameter for the return type fixes it:

public class HelloWorld {

  static class Value<T> {
  }

  @FunctionalInterface
  interface Bar<T, R> {
    R apply(Value<T> value); // Separate return type parameter added
  }

  static class Foo {
    public static <T, R> R foo(Bar<T, R> callback) {
      return callback.apply(new Value<T>());
    }
  }

  void test() {
    System.out.println(Foo.foo(value -> true).booleanValue()); // No compile error here
  }

  public static void main(String[] args) {
    new HelloWorld().test();
  }
}

Someone smarter than me can probably explain why.

1
votes

Problem

value is inferred as Value<Object> because you're interpreting the lambda the wrong way. Think of it as if you were implementing the apply method directly with the lambda. So what you effectively write is:

Boolean apply(Value value);

and this is correctly inferred to:

Boolean apply(Value<Object> value);

since you haven't given a type argument for Value.
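
Roughly speaking, once T has defaulted to Object the call behaves as if you had written the anonymous class below (an illustrative sketch only, not what javac literally generates):

void test() {
  // Illustrative only: what the call effectively amounts to once T = Object.
  Object result = Foo.foo(new Bar<Object>() {
    @Override
    public Object apply(Value<Object> value) {
      return true; // a Boolean is a perfectly good Object, so no error here...
    }
  });
  // ...the error only appears when booleanValue() is asked of an Object.
}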

Simple Solution

Write the lambda with an explicit parameter type:

Foo.foo((Value<Boolean> value) -> true).booleanValue();

this will be inferred to:

Boolean apply(Value<Boolean> value);

(My) Recommended Solution

Your design could be a bit clearer. If you want a callback, you need a type parameter for the value that will be returned.

I've made a generic Callback interface, a generic Value class, and a UsingClass to show how to use them.

Callback interface

/**
 *
 * @param <P> The parameter to call
 * @param <R> The return value you get
 */
@FunctionalInterface
public interface Callback<P, R> {

  public R call(P param);
}

Value class

public class Value<T> {

  private final T field;

  public Value(T field) {
    this.field = field;
  }

  public T getField() {
    return field;
  }
}

UsingClass class

public class UsingClass<T> {

  public T foo(Callback<Value<T>, T> callback, Value<T> value) {
    return callback.call(value);
  }
}

TestApp with main

public class TestApp {

  public static void main(String[] args) {
    Value<Boolean> boolVal = new Value<>(false);
    Value<String> stringVal = new Value<>("false");

    Callback<Value<Boolean>, Boolean> boolCb = (v) -> v.getField();
    Callback<Value<String>, String> stringCb = (v) -> v.getField();

    UsingClass<Boolean> usingClass = new UsingClass<>();
    boolean val = usingClass.foo(boolCb, boolVal);
    System.out.println("Boolean value: " + val);

    UsingClass<String> usingClass1 = new UsingClass<>();
    String val1 = usingClass1.foo(stringCb, stringVal);
    System.out.println("String value: " + val1);

    // this will give you a clear and understandable compiler error
    //boolean val = usingClass.foo(boolCb, stringVal);
  }
}