I have been programming in Java since 2004, mostly enterprise and web applications. But I have never used short or byte, other than in a toy program written just to see how they work.
I have used shorts and bytes in Java apps communicating with custom USB or serial microcontrollers, to receive 10-bit values wrapped in 2 bytes as shorts.
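For illustration, a minimal sketch of how that unpacking might look. The helper name unpack10Bit and the big-endian byte order are assumptions for the example, not details from the original post:

```java
public class TenBitUnpack {
    // Hypothetical helper: combine two bytes into a 10-bit value,
    // assuming big-endian order (high byte first).
    static int unpack10Bit(byte high, byte low) {
        // Mask each byte to undo sign extension, then keep the low 10 bits.
        return (((high & 0xFF) << 8) | (low & 0xFF)) & 0x3FF;
    }

    public static void main(String[] args) {
        System.out.println(unpack10Bit((byte) 0x03, (byte) 0xFF)); // prints 1023
    }
}
```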
The Java language itself makes it unreasonably difficult to use the byte or short types. Whenever you perform any operation on a byte or short value, Java promotes it to an int first, and the result of the operation is returned as an int. Also, they're signed, and there are no unsigned equivalents, which is another frequent source of frustration.

So you end up using byte a lot because it's still the basic building block of all things cyber, but the short type might as well not exist.
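A small example of what that promotion means in practice:

```java
public class PromotionDemo {
    public static void main(String[] args) {
        byte a = 10;
        byte b = 20;
        // byte c = a + b;        // compile error: a + b is promoted to int
        byte c = (byte) (a + b);  // an explicit cast is required

        short s = 1;
        // s = s + 1;             // compile error for the same reason
        s += 1;                   // compound assignment inserts the cast for you

        System.out.println(c + " " + s); // prints "30 2"
    }
}
```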
Until today I hadn't noticed how seldom I use them.
I've used byte for network-related stuff, but most of the time it was for my own tools/learning. In work projects these things are handled by frameworks (JSP, for instance).
Short? Almost never.
Long? Neither.
My preferred integer literals are always int: for loops, counters, etc. When data comes from somewhere else (a database, for instance) I use the proper type, but for literals I always use int.
I use byte a lot, usually in the form of byte arrays or ByteBuffer, for network communication of binary data.
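As a sketch of that kind of use, here is a framing example (the length-prefixed message layout is made up for illustration):

```java
import java.nio.ByteBuffer;

public class ByteBufferDemo {
    public static void main(String[] args) {
        // Pack a small binary message: a 4-byte length header plus the payload.
        byte[] payload = {0x01, 0x02, 0x03};
        ByteBuffer buf = ByteBuffer.allocate(4 + payload.length);
        buf.putInt(payload.length);
        buf.put(payload);
        buf.flip();                       // switch from writing to reading
        System.out.println(buf.getInt()); // prints 3, the payload length
    }
}
```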
I rarely use float or double, and I don't think I've ever used short.
I have used bytes when saving state while doing model checking. In that application the space savings are worth the extra work. Otherwise I never use them.
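The savings follow directly from the per-element sizes (1 byte vs. 4 bytes); a trivial illustration:

```java
public class StateStorageDemo {
    public static void main(String[] args) {
        // A million small state values stored as bytes take ~1 MB;
        // the same values stored as ints take ~4 MB.
        byte[] compact = new byte[1_000_000];
        int[] roomy = new int[1_000_000];
        System.out.println(compact.length + " vs " + roomy.length * 4 + " bytes");
    }
}
```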
I use bytes in lots of different places, mostly involving low-level data processing. Unfortunately, the designers of the Java language made bytes signed. I can't think of any situation in which having negative byte values has been useful. Having a 0-255 range would have been much more helpful.
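The usual workaround is to mask with & 0xFF, which undoes the sign extension (since Java 8, Byte.toUnsignedInt does the same thing):

```java
public class UnsignedByteDemo {
    public static void main(String[] args) {
        byte raw = (byte) 0xC8;    // the bit pattern for 200 reads as -56 when signed
        int unsigned = raw & 0xFF; // mask to recover the 0-255 value
        System.out.println(raw);                     // prints -56
        System.out.println(unsigned);                // prints 200
        System.out.println(Byte.toUnsignedInt(raw)); // prints 200 (Java 8+)
    }
}
```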
I don't think I've ever used shorts in any proper code. I also never use floats (if I need floating point values, I always use double).
I agree with Tom. Ideally, in high-level languages we shouldn't be concerned with the underlying machine representations. We should be able to define our own ranges or use arbitrary precision numbers.
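Within Java as it stands, the closest thing to arbitrary-precision integers is BigInteger; a minimal example:

```java
import java.math.BigInteger;

public class BigIntegerDemo {
    public static void main(String[] args) {
        BigInteger big = BigInteger.valueOf(Long.MAX_VALUE);
        // Unlike long * long, this multiplication cannot overflow.
        System.out.println(big.multiply(big));
    }
}
```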