Here is my theory on that. I think it has much to do with which operators are syntactically valid for a declared name. Consider:
int a[1]; // a[1] is valid (syntactically)
int *a; // *a is valid
int a(char, bool); // a(<a char>, <a bool>) is valid (function call)
int C::*a; // <a C>.*a is valid
Conceptually, in those declarations whatever is named with a type (`C`, `char`, `bool`) is later substituted with an expression of that type. The intention, of course, was to reuse as much of the existing language as possible. So I think he used `&`:
int &a; // &a is valid
The important point is that `&` is only valid on the kind of expression a reference denotes: lvalues. References are lvalues (as are named variables), and only to lvalues can `&` be applied:
int &g(); // &g() is valid (takes the address of the referred-to integer)
int g();  // &g() is *not* valid (can't apply & to a temporary int)