Why do Java and C# have bitshift operators?
Is the performance difference between integer multiplication and bit shifting (temporarily forgetting about division) still in favor of shifting, and if so, how big is the difference? It simply seems like such a low-level optimization; even if you wanted it, shouldn't the (C#/Java) source-to-bytecode compiler or the JIT catch it in most cases?

Note: I tested the compiled output for C# (with gmcs, the Mono C# compiler, version 2.6.7.0) and the multiply examples didn't use a shift for multiplying, even when multiplying by a power of 2.

C# http:/
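Since the question doesn't include the code that was compiled, here is a minimal sketch of the kind of test case described, assuming a simple power-of-two multiplication; the method names are made up for illustration. Compiling it with gmcs and disassembling the result (e.g. with monodis or ildasm) lets you check whether a `mul` or `shl` IL opcode is emitted:

```csharp
using System;

class ShiftVsMultiply
{
    // Multiplication by a power of two: a compiler could lower this to a
    // left shift (x << 3), but per the observation above, the C# compiler
    // emits a plain `mul` and leaves such strength reduction to the JIT.
    static int MultiplyByEight(int x)
    {
        return x * 8;
    }

    // The equivalent explicit shift, for comparison in the disassembly.
    static int ShiftLeftByThree(int x)
    {
        return x << 3;
    }

    static void Main()
    {
        Console.WriteLine(MultiplyByEight(5));   // 40
        Console.WriteLine(ShiftLeftByThree(5));  // 40
    }
}
```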