When studying programming for the 8085, 8086, and microprocessors in general, we always encounter hexadecimal representation. I understand that binary numbers are important in computers, but why are these hexadecimal numbers important? Do they have any historical importance?
It would also be nice if someone could point to some historical papers.
EDIT:
How do computers handle hexadecimal numbers? For example, what happens in the 8085 when a hexadecimal number is given as input?
Hexadecimal has a closer visual mapping to the various bytes used to store a number than decimal does.
For example, you can tell from the hexadecimal number 0x12345678 that the most significant byte will hold 0x12 and the least significant byte will hold 0x78. The decimal equivalent, 305419896, tells you nothing.
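To illustrate (a minimal C sketch, not part of the original answer): shifting and masking pull the individual bytes out of the value, and each byte lines up with exactly one pair of hex digits.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t value = 0x12345678;

    /* Each pair of hex digits maps to one byte: shift and mask to pull them out. */
    uint8_t msb = (value >> 24) & 0xFF; /* most significant byte: 0x12 */
    uint8_t lsb = value & 0xFF;         /* least significant byte: 0x78 */

    printf("value = 0x%08X = %u decimal\n", (unsigned)value, (unsigned)value);
    printf("most significant byte  = 0x%02X\n", (unsigned)msb);
    printf("least significant byte = 0x%02X\n", (unsigned)lsb);
    return 0;
}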
From a historical perspective, it's worth mentioning that octal was more commonly used when working with certain older computers that employed a different number of bits per word than modern 16/32-bit computers. From the Wikipedia article on octal:
Octal became widely used in computing when systems such as the PDP-8, ICL 1900 and IBM mainframes employed 12-bit, 24-bit or 36-bit words. Octal was an ideal abbreviation of binary for these machines because their word size is divisible by three.
As for how computers handle hexadecimal numbers: by the time the computer is dealing with a number, the base originally used to input it is completely irrelevant. The computer is just dealing with bits and bytes.
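A small C sketch of that point: whether the value is parsed from a hexadecimal string or a decimal string, the bits stored in memory end up identical.

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* The base only matters while parsing the text; the stored bits are the same. */
    unsigned long from_hex = strtoul("0x12345678", NULL, 16);
    unsigned long from_dec = strtoul("305419896", NULL, 10);

    printf("equal: %s\n", from_hex == from_dec ? "yes" : "no"); /* prints: yes */
    return 0;
}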
Hexadecimal numbers can be very easily converted to binary numbers and vice versa.
Basically everyone who has to work with binary numbers has a cheat sheet on the monitor that says:
0000 = 0
0001 = 1
...
1111 = F
You convert one hex digit to four binary digits. Example:
0x1A5F = 0001 1010 0101 1111
Hexadecimal is the easiest way to write down binary numbers in a compact format.
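A short C sketch of this digit-by-digit expansion; print_nibbles is a hypothetical helper name chosen for the example, not a standard function:

#include <stdio.h>

/* Hypothetical helper: prints a value as groups of four binary digits,
 * one group per hex digit, most significant nibble first. */
static void print_nibbles(unsigned value, int hex_digits) {
    for (int d = hex_digits - 1; d >= 0; d--) {
        unsigned nibble = (value >> (d * 4)) & 0xF; /* isolate one hex digit */
        for (int b = 3; b >= 0; b--)
            putchar((nibble >> b) & 1 ? '1' : '0');
        putchar(d > 0 ? ' ' : '\n');
    }
}

int main(void) {
    print_nibbles(0x1A5F, 4); /* prints: 0001 1010 0101 1111 */
    return 0;
}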
One important reason is that hex is a lot shorter and easier for humans to read than binary.
Hexadecimal numbers are also very easy to convert into binary, octal, and decimal, which is the main reason we use the hexadecimal form.
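For instance, C's printf can already format the same value in hexadecimal, octal, and decimal (binary takes a loop, as sketched earlier), so checking the conversions costs one line:

#include <stdio.h>

int main(void) {
    unsigned n = 0x1A5F;
    /* prints: hex: 1A5F, octal: 15137, decimal: 6751 */
    printf("hex: %X, octal: %o, decimal: %u\n", n, n, n);
    return 0;
}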
Source: https://stackoverflow.com/questions/16513806/importance-of-hexadecimal-numbers-in-computer-science