I was wondering if there is a difference between the two forms of increment. Some links say that i++ is faster than i = i + 1.
Also, my observation is the following:
If you run both versions through a code profiler, the result is the same for i++ and i = i + 1 with current versions of the gcc compiler. This depends purely on the compiler's optimizations.
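You can check this yourself by compiling both forms to assembly and comparing the output. A minimal sketch (the file and function names are just illustrative):

    /* increment.c -- compile with: gcc -O2 -S increment.c
       and compare the assembly generated for the two functions */

    unsigned int with_postfix(unsigned int i)
    {
        i++;            /* postfix increment */
        return i;
    }

    unsigned int with_add(unsigned int i)
    {
        i = i + 1;      /* explicit addition */
        return i;
    }

With gcc -O2 on x86-64, both functions typically compile to the same instruction sequence (for example leal 1(%rdi), %eax followed by ret); whether the compiler emits inc, add, or lea depends on the target tuning, not on which C form you wrote.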
If you are talking specifically about the processor, you can see the chart of machine cycles below:
INC - Increment
Usage: INC dest
Modifies flags: AF OF PF SF ZF
Adds one to the destination unsigned binary operand.
                    Clocks                 Size
    Operands      808x   286   386   486  Bytes
    reg8            3     2     2     1     2
    reg16           3     2     2     1     1
    reg32           3     2     2     1     1
    mem           15+EA   7     6     3    2-4   (W88=23+EA)
ADD - Arithmetic Addition
Usage: ADD dest,src
Modifies flags: AF CF OF PF SF ZF
Adds "src" to "dest", replacing the original contents of "dest".
Both operands are binary.
                    Clocks                 Size
    Operands      808x   286   386   486  Bytes
    reg,reg         3     2     2     1     2
    mem,reg       16+EA   7     7     3    2-4   (W88=24+EA)
    reg,mem        9+EA   7     6     2    2-4   (W88=13+EA)
    reg,immed       4     3     2     1    3-4
    mem,immed     17+EA   7     7     3    3-6   (W88=23+EA)
    accum,immed     4     3     2     1    2-3
Above you can see the machine cycles taken by the ADD and INC instructions on various processors (8086, 80286, 80386, 80486); these figures are documented in the Intel processor manuals. The fewer the machine cycles, the faster the instruction executes.
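If you would rather measure than read tables, a rough timing sketch like the one below works. The iteration count N and the volatile qualifier are my own choices to keep the loops from being optimized away, and the absolute numbers will vary by machine and compiler flags:

    #include <stdio.h>
    #include <time.h>

    #define N 100000000UL   /* iteration count -- an arbitrary choice */

    int main(void)
    {
        volatile unsigned long i;   /* volatile keeps the empty loops from being removed */
        clock_t t0, t1;

        t0 = clock();
        for (i = 0; i < N; i++)         /* postfix increment */
            ;
        t1 = clock();
        printf("i++      : %.2f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

        t0 = clock();
        for (i = 0; i < N; i = i + 1)   /* explicit addition */
            ;
        t1 = clock();
        printf("i = i + 1: %.2f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

        return 0;
    }

On any recent gcc the two timings should come out the same within measurement noise, which is what the identical generated code predicts.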