I am new to NumPy and I would like to ask how to calculate the Euclidean distance between points stored in a vector.
Let's assume that we have a numpy.array where each row is a point, plus a single point stored in another array, and I want the distance from each row to that single point.
While you can use vectorize, @Karl's approach will be rather slow with numpy arrays.
The easier approach is to just do np.hypot(*(points - single_point).T). (The transpose assumes that points is an Nx2 array, rather than a 2xN one. If it's 2xN, you don't need the .T.)
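As a minimal runnable sketch of that one-liner, borrowing the canned example data used further down:

import numpy as np

single_point = np.array([3, 4])
points = np.arange(20).reshape((10, 2))   # 10 points, one (x, y) pair per row

# Broadcast-subtract the point, transpose to get the dx and dy arrays,
# and let hypot compute sqrt(dx**2 + dy**2) elementwise
dist = np.hypot(*(points - single_point).T)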
However, this is a bit unreadable, so you can write it out more explicitly like this (using some canned example data...):
import numpy as np

single_point = [3, 4]
points = np.arange(20).reshape((10, 2))   # 10 points, one (x, y) pair per row

dist = (points - single_point)**2          # squared differences per coordinate
dist = np.sum(dist, axis=1)                # sum the squares for each point
dist = np.sqrt(dist)                       # Euclidean distance to single_point
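For comparison, a one-call alternative (reusing points and single_point from above) is to take the vector norm of each row of differences with np.linalg.norm; this should give the same distances:

# Same computation via the norm of each row of differences
dist = np.linalg.norm(points - single_point, axis=1)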