I need to train a network to multiply or add 2 inputs, but it doesn't seem to approximate well for all points after 20000 iterations. More specifically, I train it on the whole
A network consisting of a single neuron with weights = {1, 1}, bias = 0, and a linear activation function performs exact addition of the two input numbers.
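As a sanity check, that single neuron can be written out directly (a minimal NumPy sketch; the function name is just for illustration):

```python
import numpy as np

# One linear neuron: output = w1*a + w2*b + bias.
# With weights (1, 1) and bias 0 it computes exact addition.
weights = np.array([1.0, 1.0])
bias = 0.0

def add_neuron(a, b):
    # Linear activation: the weighted sum is returned unchanged.
    return float(np.dot(weights, [a, b]) + bias)

print(add_neuron(3, 4))  # 7.0
```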
Multiplication may be harder. Here are two approaches that a net can use:
a*b = a*(b0*2^0 + b1*2^1 + ... + bk*2^k) = a*b0*2^0 + a*b1*2^1 + ... + a*bk*2^k

This approach is simple, but it requires a number of neurons proportional to the length (i.e., the logarithm) of the input b.
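For integer b, the decomposition above is just shift-and-add multiplication. A plain-Python sketch (the function name is mine) of what such a net would have to compute:

```python
def mul_binary(a, b):
    # a*b = sum over the set bits b_k of b of a * 2^k.
    # Assumes b is a non-negative integer.
    result = 0
    k = 0
    while b >> k:
        if (b >> k) & 1:
            result += a * (1 << k)  # add a * 2^k when bit b_k is 1
        k += 1
    return result

print(mul_binary(7, 13))  # 91
```

The number of terms in the sum, and hence the number of neurons, grows with the bit length of b, which is why this approach does not scale to inputs of arbitrary size with a fixed architecture.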
a*b = exp(ln(a) + ln(b))
This network can handle numbers of any magnitude, as long as it approximates the logarithm and the exponential well enough.
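A minimal numerical check of the identity (note that it only holds for positive inputs, since ln is undefined otherwise):

```python
import math

def mul_logexp(a, b):
    # a*b = exp(ln(a) + ln(b)); valid only for a, b > 0.
    return math.exp(math.log(a) + math.log(b))

print(mul_logexp(3.0, 4.0))  # close to 12.0 (up to float rounding)
```

A net using this trick only has to learn two one-dimensional functions (log and exp) instead of a two-dimensional surface, which is what makes it generalize better.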