If entropy measures the value of data, that is, how much information it contains, then why does random information have higher entropy than a constant value?
Doesn't conc
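To make the question concrete, here is a minimal Python sketch (the helper name `shannon_entropy` is mine) that computes the empirical Shannon entropy, H(X) = -∑ p(x) log₂ p(x), of a string's symbol distribution. A constant string scores 0 bits per symbol, while a string of all-distinct symbols scores the maximum:

```python
import math
from collections import Counter

def shannon_entropy(data):
    # Empirical Shannon entropy in bits: H = -sum over symbols of p * log2(p).
    counts = Counter(data)
    n = len(data)
    return 0.0 - sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # constant data: 0.0 bits per symbol
print(shannon_entropy("abcdefgh"))  # 8 distinct symbols: 3.0 bits per symbol
```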