What the term *standard conversion* means is covered in the following clause of the C++ Standard:
> **4 Standard conversions [conv]**
>
> Standard conversions are implicit conversions with built-in meaning. Clause 4 enumerates the full set of such conversions. A standard conversion sequence is a sequence of standard conversions in the following order:
>
> — Zero or one conversion from the following set: lvalue-to-rvalue conversion, array-to-pointer conversion, and function-to-pointer conversion.
>
> — Zero or one conversion from the following set: integral promotions, floating point promotion, integral conversions, floating point conversions, floating-integral conversions, pointer conversions, pointer to member conversions, and boolean conversions.
>
> — Zero or one qualification conversion.
>
> [ Note: A standard conversion sequence can be empty, i.e., it can consist of no conversions. — end note ]
>
> A standard conversion sequence will be applied to an expression if necessary to convert it to a required destination type.
In other words, a standard conversion is one of a set of built-in rules the compiler can apply when converting one type to another. Those built-in conversions are the following (several of them are demonstrated in the sketch after this list):
- No conversions
- Lvalue-to-rvalue conversion
- Array-to-pointer conversion
- Function-to-pointer conversion
- Qualification conversions
- Integral promotions
- Floating point promotion
- Integral conversions
- Floating point conversions
- Floating-integral conversions
- Pointer conversions
- Pointer to member conversions
- Boolean conversions
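Here is a minimal sketch showing a few of these conversions in isolation (the variable names are illustrative only):

```cpp
int main()
{
    int arr[4] = {};
    int* p = arr;        // array-to-pointer conversion
    const int* cp = p;   // qualification conversion (int* -> const int*)
    char c = 'x';
    int i = c;           // integral promotion (char -> int), preceded by an
                         // lvalue-to-rvalue conversion that reads the value of c
    double d = i;        // floating-integral conversion (int -> double)
    bool b = p;          // boolean conversion (int* -> bool)
}
```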
A standard conversion sequence can appear up to twice within a user-defined conversion sequence: before and/or after the user-defined conversion itself:
> **§ 13.3.3.1.2 User-defined conversion sequences [over.ics.user]**
>
> A user-defined conversion sequence consists of an initial standard conversion sequence followed by a user-defined conversion (12.3) followed by a second standard conversion sequence. If the user-defined conversion is specified by a constructor (12.3.1), the initial standard conversion sequence converts the source type to the type required by the argument of the constructor. If the user-defined conversion is specified by a conversion function (12.3.2), the initial standard conversion sequence converts the source type to the implicit object parameter of the conversion function.
>
> The second standard conversion sequence converts the result of the user-defined conversion to the target type for the sequence. Since an implicit conversion sequence is an initialization, the special rules for initialization by user-defined conversion apply when selecting the best user-defined conversion for a user-defined conversion sequence (see 13.3.3 and 13.3.3.1).
That said, for the following conversion:
```cpp
A a;
B b = a;
```
the compiler will search for a converting constructor in `B` that can take an instance of `A` (the source type) after some initial standard conversion sequence, perform the user-defined conversion through the selected constructor, and then apply a second standard conversion sequence to convert the result of the user-defined conversion to the target type;
or:
the compiler will search for a conversion function in `A` that is callable after some initial standard conversion sequence applied to the implicit object argument, and whose result is convertible through a second standard conversion sequence to the target type `B`.
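For the converting-constructor path, here is a minimal sketch in which all three stages are visible (the types are assumptions chosen for illustration, not taken from the question):

```cpp
struct B
{
    B(long) {}  // converting constructor: performs the user-defined conversion
};

int main()
{
    int i = 42;
    B b = i;  // initial standard conversion:  integral conversion int -> long
              // user-defined conversion:      B::B(long)
              // second standard conversion:   identity (B -> B)
}
```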
As a tangible example, consider the following conversion:
```cpp
struct A
{
    operator int() const;
};

A a;
bool b = a;
```
The compiler considers the following user-defined conversion sequence:
- Initial standard conversion: qualification conversion of `A*` to `const A*`, in order to call the `const`-qualified `operator int() const`.
- User-defined conversion: conversion of `A` to `int`, through the user-defined conversion function.
- Second standard conversion: boolean conversion of `int` to `bool`.
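The net effect is as if the three steps were written out by hand. A sketch of the equivalent explicit steps (the definition of `operator int()` is an assumption added so the example is self-contained):

```cpp
struct A
{
    operator int() const { return 42; }
};

int main()
{
    A a;
    const A& ca = a;            // qualification: the implicit object argument
                                // is accessed through a const-qualified path
    int i = ca.operator int();  // user-defined conversion: A -> int
    bool b = i;                 // boolean conversion: int -> bool
    // bool b2 = a; performs the same three steps implicitly
}
```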
The case you are asking about can be split as follows:
```cpp
struct A
{
    operator int&();
};

int&& b = A();
```
- The source type is `A`.
- The target type is `int&&`.
- The user-defined conversion sequence is the conversion of `A` to `int&&`.
- The initial standard conversion sequence is empty (no conversion at all).
- The user-defined conversion is the conversion of `A` to `int&`.
- The second standard conversion sequence (converting the result of the user-defined conversion to the target type) would here be the standard conversion of `int&` to `int&&`: an lvalue-to-rvalue conversion. That conversion is considered since `int&` and `int&&` are reference-compatible. However, according to §8.5.3 [dcl.init.ref]/p5:

> [...] if the reference is an rvalue reference and the second standard conversion sequence of the user-defined conversion sequence includes an lvalue-to-rvalue conversion, the program is ill-formed.

that conversion is not allowed in the overall user-defined conversion sequence, so the initialization `int&& b = A();` is ill-formed.
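This is straightforward to confirm: with `operator int&()` the initialization is rejected, whereas a conversion function returning by value produces a prvalue, so no lvalue-to-rvalue conversion is needed in the second standard conversion sequence. A minimal sketch (struct `C` is an assumption added for contrast):

```cpp
struct A
{
    int value = 0;
    operator int&() { return value; }  // yields an lvalue of type int
};

struct C
{
    operator int() { return 0; }       // yields a prvalue of type int
};

int main()
{
    // int&& bad = A();  // ill-formed: the second standard conversion
                         // sequence would include an lvalue-to-rvalue conversion
    int&& ok = C();      // OK: the rvalue reference binds directly to the
                         // prvalue produced by C::operator int()
}
```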