Question
I happened to see some strange behaviour while checking the size (minBound, maxBound) and the "length in decimal representation" of different integral types. Using GHCi:
Prelude> :{
Prelude| let mi = minBound
Prelude|     ma = maxBound
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le] :: [Int]
Prelude| :}
[-9223372036854775808,9223372036854775807,2]
                                          ^ in the last place I would expect 19.
My first guess is that maxBound defaults to () and thus yields 2, but I don't understand that, because ma should be an Int by the explicit type annotation (:: [Int]) - and by referential transparency all symbols named ma should be equal.
If I put the statement above in a file and load it into GHCi, I get the correct result.
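For reference, this is roughly what the file version looks like (a minimal sketch; the module name Bounds is mine):

module Bounds (result) where

-- In a compiled module the monomorphism restriction is on by default,
-- so mi, ma and le are each forced to a single monomorphic type.
result :: [Int]
result =
  let mi = minBound
      ma = maxBound
      le = fromIntegral $ length $ show ma
  in [mi, ma, le]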
So why do I get a wrong result?
Answer 1:
Confusingly, this is still the monomorphism restriction at play (or rather the lack thereof in GHCi). Since GHCi doesn't have the monomorphism restriction enabled, your definitions of mi and ma don't get specialized to Int as you think they will - instead they stay general as mi, ma :: Bounded a => a, and the type variable a gets instantiated twice:
- once as () in fromIntegral $ length $ show ma (as you observed, this is a default)
- once as Int in [mi,ma,le] :: [Int]
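GHCi defaults the ambiguous type variable to () because it enables extended defaulting rules (ExtendedDefaultRules), which put () at the front of the default list; compiled modules don't do this. As a minimal sketch of the two instantiations in a standalone file (the module layout is mine, and both types must be written out explicitly, since there is no () defaulting outside GHCi):

{-# LANGUAGE NoMonomorphismRestriction #-}
module Main where

-- Without the monomorphism restriction, ma keeps the general type
-- that GHCi infers for it, so every use site may pick its own type.
ma :: Bounded a => a
ma = maxBound

main :: IO ()
main = do
  print (length (show (ma :: ())))  -- "()" has length 2
  print (length (show (ma :: Int))) -- 19, on a platform with a 64-bit Int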
If you want mi and ma to actually be of type Int, annotate them as such directly:
Prelude> :{
Prelude| let mi, ma :: Int
Prelude|     mi = minBound
Prelude|     ma = maxBound
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le]
Prelude| :}
[-9223372036854775808,9223372036854775807,19]
Or turn on the monomorphism restriction manually in GHCi:
Prelude> :set -XMonomorphismRestriction
Prelude> :{
Prelude| let mi = minBound
Prelude|     ma = maxBound
Prelude|     le = fromIntegral $ length $ show ma
Prelude| in [mi,ma,le] :: [Int]
Prelude| :}
[-9223372036854775808,9223372036854775807,19]
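You can also watch the generalization and the defaulting directly at the prompt (a short session sketch):

Prelude> let ma = maxBound
Prelude> :type ma
ma :: Bounded a => a
Prelude> length (show maxBound)
2

Without the restriction the binding stays general at Bounded a => a; in the bare length (show maxBound) call, GHCi's extended defaulting picks (), whose show form "()" has length 2.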
Source: https://stackoverflow.com/questions/42233527/ghci-defaulting-confusion