This code compiles (for clarity, the lifetimes are written out explicitly rather than elided):
struct Foo<'a> {
    _field: &'a i32,
}

fn test<'a, 'b, 'c>(_x: &'a mut Foo<'c>, _y: &'b bool) { // case 1
}

fn main() {
    let f = &mut Foo { _field: &0 };
    {
        let p = false;
        test(f, &p);
    }
}
If I use 'b instead of 'c in test's definition like so:
fn test<'a, 'b>(_x: &'a mut Foo<'b>, _y: &'b bool) { // case 2
}
the code fails to compile ("p does not live long enough")!
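(For what it's worth, the case 2 signature seems to be accepted if I declare p before f, so that p outlives the borrow of the Foo. Here is a variation of main, my own experiment using the same Foo and the case 2 test, that I believe compiles:)

fn main() {
    let p = false;                    // p now outlives f and the Foo
    let f = &mut Foo { _field: &0 };
    test(f, &p);                      // accepted with the case 2 signature
}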
What I would expect to happen at the call of test in case 2 is: 'a is set to the actual lifetime of f, 'b is set to the intersection of the Foo's actual lifetime and &p's actual lifetime, which is &p's lifetime, and everything should be fine, as in case 1.
Instead, what actually seems to happen in case 2 is that 'b is forced to become the lifetime of the Foo, which is too big for &p's lifetime, hence the compiler error "p does not live long enough". True?
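If that is true, I think I can see why the compiler has to be strict here. Consider a made-up variant of test (store is my own name, not from the code above) whose second argument shares the lifetime of the Foo's field and actually gets stored into it:

// Same Foo as above. store compiles, because both the field and y use 'b.
fn store<'a, 'b>(x: &'a mut Foo<'b>, y: &'b i32) {
    x._field = y;
}
// If 'b were allowed to shrink to the lifetime of a short-lived &i32 at the
// call site, store could leave the longer-lived Foo holding a dangling
// reference, so behind a &mut the compiler pins 'b to the Foo's own lifetime.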
Even stranger (case 3): this only fails if test takes a &mut. If I leave the <'b> in but remove the mut, like so:
fn test<'a, 'b>(_x: &'a Foo<'b>, _y: &'b bool) { // case 3
}
the code passes again.
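That would match Foo<'b> being freely shrinkable behind a shared reference but not behind a mutable one. A small made-up pair of helpers (shorten and shorten_mut are my own names) seems to show exactly this split:

// Accepted: behind &, Foo<'long> can be used as Foo<'short> when 'long: 'short.
fn shorten<'short, 'long: 'short>(x: &'short Foo<'long>) -> &'short Foo<'short> {
    x
}

// Rejected by the compiler: behind &mut, the same shrinking is not allowed.
// fn shorten_mut<'short, 'long: 'short>(x: &'short mut Foo<'long>) -> &'short mut Foo<'short> {
//     x
// }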
Can anyone shed light on this?
Cheers.