Question:
I'm using the miscutil library to communicate between a Java and a C# application over a socket. I am trying to figure out the difference between the following code (this is Groovy, but the Java result is the same):
import java.io.*
def baos = new ByteArrayOutputStream();
def stream = new DataOutputStream(baos);
stream.writeInt(5000)
baos.toByteArray().each { println it }
/* outputs - 0, 0, 19, -120 */
and C#:
using (var ms = new MemoryStream())
using (EndianBinaryWriter writer = new EndianBinaryWriter(EndianBitConverter.Big, ms, Encoding.UTF8))
{
    writer.Write(5000);
    ms.Position = 0;
    foreach (byte bb in ms.ToArray())
    {
        Console.WriteLine(bb);
    }
}
/* outputs - 0, 0, 19, 136 */
As you can see, the last byte is -120 in the Java version and 136 in C#.
Answer 1:
This has to do with the fact that bytes in Java (the JVM) are signed, and in C# they are not. It has nothing to do with big or little endian representation.
In other words, Java's bytes range from -128 to 127, while C# bytes range from 0 to 255.
Both languages write the same bit pattern for the last byte (0x88); Java simply interprets it as a signed two's complement value. A value above 127 therefore wraps around: 136 - 256 = -120.
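To see the wraparound directly, here is a minimal Java sketch (the class name is just illustrative) showing that the bit pattern 0x88 prints as -120, and that masking with 0xFF recovers the unsigned value:

```java
public class SignedByteDemo {
    public static void main(String[] args) {
        // 5000 = 0x00001388; the last big-endian byte is 0x88 = 136 unsigned
        byte last = (byte) 136;            // wraps: 136 - 256 = -120
        System.out.println(last);          // prints -120

        // Masking with 0xFF promotes to int, keeping only the low 8 bits
        System.out.println(last & 0xFF);   // prints 136
    }
}
```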
From the Java tutorials:
byte: The byte data type is an 8-bit signed two's complement integer. It has a minimum value of -128 and a maximum value of 127 (inclusive). The byte data type can be useful for saving memory in large arrays, where the memory savings actually matters. They can also be used in place of int where their limits help to clarify your code; the fact that a variable's range is limited can serve as a form of documentation.
From the MSDN:
The Byte value type represents unsigned integers with values ranging from 0 to 255.
Source: https://stackoverflow.com/questions/2597715/difference-between-c-sharp-and-java-big-endian-bytes-using-miscutil