From a paper I'm reading right now:
... S(t+1, k) = S(t, k) + ... + C*∆ ... ∆ is a standard random variable with mean 0 and variance 1. ...
You can use the Box-Muller transform.
Suppose U1 and U2 are independent random variables that are uniformly distributed in the interval (0, 1]. Let

Z0 = sqrt(-2 ln U1) * cos(2π U2)

and

Z1 = sqrt(-2 ln U1) * sin(2π U2).

Then Z0 and Z1 are independent random variables with a standard normal distribution (mean 0, variance 1), so either one can serve as the ∆ in your update rule.
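A minimal sketch of this in Python (the function name box_muller is my own; only the standard library is used):

```python
import math
import random

def box_muller():
    """Return two independent samples from the standard normal distribution."""
    # U1 and U2 must be uniform on (0, 1]; random.random() returns values
    # in [0, 1), so use 1 - u to avoid taking log(0).
    u1 = 1.0 - random.random()
    u2 = 1.0 - random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    z0 = r * math.cos(2.0 * math.pi * u2)
    z1 = r * math.sin(2.0 * math.pi * u2)
    return z0, z1

delta, _ = box_muller()  # use as the ∆ in S(t+1, k) = S(t, k) + ... + C*∆
```

In practice you can also just call random.gauss(0, 1) or numpy.random.standard_normal(), which give you such samples directly.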