Question
I have a large n-dimensional array of shape (2, 2, 2, ..., 2), where the number of dimensions often varies.
However, I am also receiving incoming data, which is always a 1D array of shape (2,).
Now I want to multiply my nD array by the 1D array via reshape, and I also have an 'index' specifying which particular dimension I want to broadcast along and modify.
Thus I'm doing the following (within a loop):
matrix_nd *= array_1d.reshape(1 if i!=index else dimension for i, dimension in enumerate(matrix_nd.shape))
However, this generator does not seem to be valid input for reshape. Note that the dimension would always equal 2 and only be placed once within the sequence.
For example, if we have a 5D array of shape (2, 2, 2, 2, 2) and an index of 3, we would want to reshape the 1D array to (1, 1, 1, 2, 1).
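For reference, a sketch of what I am trying to end up with, materializing that shape as a tuple (which reshape does accept), would be something like this, with placeholder values:

import numpy as np

matrix_nd = np.ones((2, 2, 2, 2, 2))   # placeholder 5D array
array_1d = np.array([2.0, 3.0])        # placeholder incoming (2,) data
index = 3

# Build the target shape as a tuple: 1 everywhere except at `index`
newshape = tuple(dim if i == index else 1
                 for i, dim in enumerate(matrix_nd.shape))
matrix_nd *= array_1d.reshape(newshape)  # newshape == (1, 1, 1, 2, 1)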
Any ideas?
Thanks in advance.
EDIT:
So it turns out my entire approach is wrong: even after getting the tuple I was after, the (2,) 1D array still seems to be broadcast to all dimensions.
For example:
I have a NumPy array test_nd of shape (2, 2, 2), which looks like this:
array([[[1, 1],
        [1, 1]],

       [[1, 1],
        [1, 1]]])
I then reshape a (2,) 1D array so that it will be broadcast along the 3rd dimension only:
toBroadcast = numpy.asarray([0,0]).reshape(1,1,2)
Where toBroadcast has the form array([[[0, 0]]])
However... test_nd * toBroadcast returns the following result:
array([[[0, 0],
        [0, 0]],

       [[0, 0],
        [0, 0]]])
It seems to have been broadcasting to all the dimensions. Any ideas?
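For reference, here is the same experiment sketched with distinct values instead of [0, 0] (the values 2 and 1 are arbitrary choices, just so the two entries can be told apart):

import numpy as np

test_nd = np.ones((2, 2, 2), dtype=int)
toBroadcast = np.asarray([2, 1]).reshape(1, 1, 2)

# The (1, 1, 2) array is replicated across the first two axes, while the
# [2, 1] pattern is applied along the last axis of test_nd.
print(test_nd * toBroadcast)
# [[[2 1]
#   [2 1]]
#
#  [[2 1]
#   [2 1]]]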
Answer 1:
You can define a function like
def broadcast_axis(data, ndims, axis):
    # All-ones shape, except -1 (inferred size) at the target axis
    newshape = [1] * ndims
    newshape[axis] = -1
    return data.reshape(*newshape)
and use it like
vector = broadcast_axis(vector, matrix.ndim, 3)
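A self-contained sketch of using this on the shapes from the question (the concrete values of matrix and vector are illustrative assumptions, not part of the answer):

import numpy as np

def broadcast_axis(data, ndims, axis):
    newshape = [1] * ndims
    newshape[axis] = -1
    return data.reshape(*newshape)

matrix = np.ones((2, 2, 2, 2, 2))   # 5D array, as in the question
vector = np.array([2.0, 3.0])       # incoming (2,) data

# vector is reshaped to (1, 1, 1, 2, 1), so it scales along axis 3 only
matrix *= broadcast_axis(vector, matrix.ndim, 3)
print(matrix[0, 0, 0, :, 0])        # [2. 3.]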
Answer 2:
One way would be to permute axes. So, we could push the relevant axis of matrix_nd to the last position, let it be multiplied by the 1D array, and finally permute the axes back. Hence, with a given axis of matrix_nd along which we need to multiply the 1D array, it would be -
np.moveaxis(np.moveaxis(matrix_nd,axis,-1)*array_1d,-1,axis)
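A quick runnable sketch of this approach, with placeholder values and axis = 3 as in the question:

import numpy as np

matrix_nd = np.ones((2, 2, 2, 2, 2))
array_1d = np.array([2.0, 3.0])
axis = 3

# Push `axis` to the end, multiply (broadcast over the last axis), move it back
out = np.moveaxis(np.moveaxis(matrix_nd, axis, -1) * array_1d, -1, axis)
print(out.shape)             # (2, 2, 2, 2, 2)
print(out[0, 0, 0, :, 0])    # [2. 3.]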
Again, we don't need to reshape the 1D array to (1,1,1,2,1). We can reshape it to cover just the relevant axis and the axes after it, i.e. (2,1), and broadcasting would still work, as the leading axes are broadcast automatically. Hence, another way would be -
matrix_nd*array_1d.reshape((-1,)+(1,)*(matrix_nd.ndim-axis-1))
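And a corresponding sketch of the reshape-based version, again with placeholder values:

import numpy as np

matrix_nd = np.ones((2, 2, 2, 2, 2))
array_1d = np.array([2.0, 3.0])
axis = 3

# The 1D array is reshaped to (2, 1): the target axis plus one trailing
# singleton; the three leading axes are added automatically by broadcasting.
out = matrix_nd * array_1d.reshape((-1,) + (1,) * (matrix_nd.ndim - axis - 1))
print(out[0, 0, 0, :, 0])    # [2. 3.]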
Source: https://stackoverflow.com/questions/58979588/broadcasting-a-1d-array-to-a-particular-dimension-of-a-varying-nd-array-via-res