I am new to NumPy and I would like to ask how to calculate the Euclidean distance between points stored in a vector.
Let's assume that we have a numpy.array where each row is a point, and I want to calculate the distance from a single point to each row.
To get the distance you can use the norm function from NumPy's linalg module:
np.linalg.norm(x - y)
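norm also takes an axis argument, so the distance from one point to every row of an array can be computed in a single call. A minimal sketch, assuming points is an Nx2 array (the names here are illustrative):

import numpy as np

single_point = np.array([3, 4])
points = np.arange(20).reshape((10, 2))
# norm of each row of the difference array = distance to each point
distances = np.linalg.norm(points - single_point, axis=1)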
To apply a function to each element of a numpy array, try numpy.vectorize.
To do the actual calculation, we need the square root of the sum of squares of differences (whew!) between pairs of coordinates in the two vectors.
We can use zip to pair the coordinates, and sum with a comprehension to sum up the results. That looks like:
sum((x - y) ** 2 for (x, y) in zip(singlePoint, pointFromArray)) ** 0.5
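Since numpy.vectorize maps over scalars by default, applying that per-point formula row by row needs a signature. A minimal sketch under that assumption (euclid and the sample data are illustrative, not from the original answer):

import numpy as np

single_point = [3, 4]

def euclid(point_from_array):
    # square root of the sum of squared coordinate differences
    return sum((x - y) ** 2 for (x, y) in zip(single_point, point_from_array)) ** 0.5

# signature='(n)->()' makes vectorize pass one row (point) at a time
per_point_dist = np.vectorize(euclid, signature='(n)->()')
points = np.arange(20).reshape((10, 2))
distances = per_point_dist(points)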
While you can use vectorize, @Karl's approach will be rather slow with numpy arrays.
The easier approach is to just do np.hypot(*(points - single_point).T). (The transpose assumes that points is an Nx2 array, rather than 2xN. If it's 2xN, you don't need the .T.)
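To make the unpacking concrete: the subtraction broadcasts, the transpose yields one array of x-differences and one of y-differences, and hypot combines them elementwise. A small runnable sketch (the sample data is illustrative):

import numpy as np

single_point = np.array([3, 4])
points = np.arange(20).reshape((10, 2))
# (points - single_point).T has shape (2, N); unpacking gives dx and dy
dx, dy = (points - single_point).T
dist = np.hypot(dx, dy)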
However, this is a bit unreadable, so you can write it out more explicitly like this (using some canned example data...):
import numpy as np
single_point = [3, 4]
points = np.arange(20).reshape((10, 2))
dist = (points - single_point)**2   # squared difference per coordinate
dist = np.sum(dist, axis=1)         # sum over the coordinates of each point
dist = np.sqrt(dist)                # Euclidean distance from each point
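As a quick check with that canned data, the first point [0, 1] lies sqrt(18) ≈ 4.243 from [3, 4]:

print(dist)
# approximately [ 4.243  1.414  1.414  4.243  7.071  9.899 12.728 15.556 18.385 21.213]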
import numpy as np

def euclid_dist(t1, t2):
    # t1 broadcasts against each row of t2; sum over the coordinate axis
    return np.sqrt(((t1 - t2)**2).sum(axis=1))

single_point = [3, 4]
points = np.arange(20).reshape((10, 2))
distance = euclid_dist(single_point, points)
import numpy as np

def distance(v1, v2):
    # scalar Euclidean distance between two equal-length vectors
    return np.sqrt(np.sum((v1 - v2) ** 2))
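Note that this variant sums over every element, so it returns a single scalar for a pair of vectors; passing a whole Nx2 array would collapse everything into one number. A quick usage example (values are illustrative):

v1 = np.array([3, 4])
v2 = np.array([0, 0])
distance(v1, v2)  # 5.0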