I need some advice about this strange behavior. Let's have this code:
int **p;
This compiles without any trouble:
p++;
but this one does not compile:
((int **)p)++;
Old versions of gcc supported an extension called "lvalue casts": if you cast an expression that is an lvalue, the result is also an lvalue and can be treated as such. Its main use was incrementing a pointer by an amount corresponding to a different pointee size:
int *p;
++(char *)p; /* increment p by one byte, resulting in an unaligned pointer */
This extension was deprecated some time around gcc 3.0 and removed in gcc 4.0.
To do the equivalent in more recent versions of gcc, you need to use an addition and an assignment (instead of an increment), casting the pointer to the other type for the addition and back to its own type for the assignment:
p = (int *)((char *)p + 1);
Note that trying to dereference the pointer after this is undefined behavior, so don't count on it doing anything useful.
Why isn't the result of this cast an lvalue?
I draw your attention to section 6.5.4, paragraph 4 of the C99 specification, footnote 86, which states:
A cast does not yield an lvalue.
You have a cast.
The result is not an lvalue.
The ++ operator requires an lvalue.
Therefore, your program is in error.
In the C language, all conversions (including explicit casts) produce rvalues. No exceptions. The fact that you are casting to the same type does not exempt the expression from that rule. (Actually, it would be strange to expect such an inconsistent exception.)
In fact, one of the fundamental properties of the C language is that it converts lvalues to rvalues in expressions as quickly as possible. Lvalues in C expressions are like the 115th element of the periodic table: they typically live a very short life, quickly decaying into rvalues. This is a major difference between C and C++, since C++ tries to preserve lvalues in expressions as long as possible (although in C++ this specific cast would also produce an rvalue).
When you typecast an expression, the result is an rvalue rather than an lvalue. Intuitively, a typecast says "give me the value this expression would have if it had some other type," so typecasting a variable to its own type still produces an rvalue, not an lvalue. Consequently, it's not legal to apply the ++ operator to the result of a typecast, since ++ requires an lvalue and you're providing an rvalue.
That said, it is in principle possible to define the C language so that casting an expression to its own type yields an lvalue when the original expression is one, but for simplicity's and consistency's sake I suppose the language designers chose not to.
Hope this helps!