I happened to see some strange behaviour while checking the size (`minBound`, `maxBound`) and "length in decimal representation" of different integral types.
Confusingly, this is still the monomorphism restriction at play (or rather the lack thereof in GHCi). Since GHCi doesn't have the monomorphism restriction enabled, your definitions of `mi` and `ma` don't get specialized to `Int` as you think they will - instead they stay general as `mi, ma :: Bounded a => a`, and the type variable `a` gets instantiated twice:

- `()` in `fromIntegral $ length $ show ma` (as you observed, this is a default)
- `Int` in `[mi,ma,le] :: [Int]`
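A small self-contained sketch of that defaulting (`()` is the first type in GHCi's extended default list, which is why a `Bounded`-plus-`Show` constraint lands there; the names `lenAtUnit` and `lenAtInt` are just for illustration):

```haskell
-- Each use of a polymorphic `maxBound` picks its own type.

-- At (), the type extended defaulting chooses: show () is "()", length 2.
lenAtUnit :: Int
lenAtUnit = length (show (maxBound :: ()))

-- At Int, the type the list annotation forces:
-- show (maxBound :: Int) is "9223372036854775807", length 19.
lenAtInt :: Int
lenAtInt = length (show (maxBound :: Int))
```

So a single polymorphic binding can yield a 2 at one use site and a 19 at another, which is exactly the "strange behaviour" observed.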
If you want `mi` and `ma` to actually be of type `Int`, annotate them as such directly:
```
Prelude> :{
Prelude| let mi, ma :: Int
Prelude|     mi = minBound
Prelude|     ma = maxBound
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le]
Prelude| :}
[-9223372036854775808,9223372036854775807,19]
```
Or turn on the monomorphism restriction manually in GHCi:
```
Prelude> :set -XMonomorphismRestriction
Prelude> :{
Prelude| let mi = minBound
Prelude|     ma = maxBound
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le] :: [Int]
Prelude| :}
[-9223372036854775808,9223372036854775807,19]
```
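For comparison, the same definitions behave as expected in a compiled module, where the monomorphism restriction is on by default. A sketch (the `result` helper is mine, added so the bindings have a use site that pins them to `Int`):

```haskell
-- With the monomorphism restriction on, each of these pattern bindings
-- gets a single monomorphic type inferred from all its use sites, so
-- the annotation on `result` fixes mi, ma and le at Int.
mi = minBound
ma = maxBound
le = fromIntegral $ length $ show ma

result :: [Int]
result = [mi, ma, le]

main :: IO ()
main = print result
```

Running this prints the same list as the annotated GHCi session, because here the restriction shares one type across every use of each binding instead of generalizing them.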