I have this code for generating 1D noise in Objective-C, and it works perfectly well:
- (float)makeNoise1D:(int)x {
x = (x >> 13) ^ x;
x = (x * (x * x * (int)_seed + 19990303) + 1376312589) & RAND_MAX;
return ( 1.0 - ( (x * (x * x * 15731 + 789221) + 1376312589) & RAND_MAX) / 1073741824.0);
}
Now I'm trying to reproduce it in Swift, but it always fails with EXC_BAD_INSTRUCTION on the return. This is how it looks now; I had to split the final expression, but I'm pretty sure that's not the problem.
func makeNoise1D(var x : Int) -> Float{
x = (x >> 13) ^ x;
x = (x * (x * x * seed! + 19990303) + 1376312589) & 0x7fffffff
var inner = (x * (x * x * 15731 + 789221) + 1376312589) & 0x7fffffff
return ( 1.0 - ( Float(inner) ) / 1073741824.0)
}
I've already tried many different casts and splitting into sub-expressions, but it still fails. The only thing I've figured out is that the first and the last lines work. (In most of my test cases, x was set to 20 and seed to 10, just to keep it simple.)
Thanks for the help!
The exception is caused by an arithmetic overflow, which occurs if the result of one of your calculations cannot be represented as an Int.
Unlike (Objective-)C, adding and multiplying integers in Swift does not "wrap around" or "truncate"; it traps at runtime if the result does not fit into the data type.
But you can use the Swift "overflow operators" &* and &+ instead, which always truncate the result:
func makeNoise1D(x : Int) -> Float{
var x = x
x = (x >> 13) ^ x;
x = (x &* (x &* x &* seed! &+ 19990303) &+ 1376312589) & 0x7fffffff
let inner = (x &* (x &* x &* 15731 &+ 789221) &+ 1376312589) & 0x7fffffff
return ( 1.0 - ( Float(inner) ) / 1073741824.0)
}
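As a minimal illustration (not part of the original answer), the difference between the plain operators and the overflow operators can be seen with Int32, where the overflow boundary is easy to hit:

```swift
// Plain + and * trap on overflow (EXC_BAD_INSTRUCTION), while the
// overflow operators &+ and &* wrap around like integer arithmetic in C.
let big = Int32.max          // 2147483647
// let crash = big + 1       // would trap at runtime
let wrapped = big &+ 1       // wraps around to Int32.min
print(wrapped)               // -2147483648
```

One caveat worth noting: Swift's Int is 64 bits on modern platforms, whereas int in the Objective-C version is 32 bits, so the two functions truncate at different points and may produce different noise values even with overflow operators. If bit-for-bit identical output matters, using Int32 throughout the Swift version should match the original.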
Source: https://stackoverflow.com/questions/26313978/perlin-noise-generator-in-swift