When you want to plot a numpy array with imshow, this is what you normally do:
import numpy as np
import matplotlib.pyplot as plt
A = np.array([[3, 2, 5], [8, 1, 2], [6, 6, 7], [3, 5, 1]])  # the array to plot
im = plt.imshow(A, origin="upper", interpolation="nearest", cmap=plt.cm.gray_r)
plt.colorbar(im)
Which gives us this simple image:
In this image, the x and y coordinates are simply extracted from the position of each value in the array. Now, let's say that A is an array of values that refer to some specific coordinates:
real_x = np.array([[15, 16, 17], [15, 16, 17], [15, 16, 17], [15, 16, 17]])
real_y = np.array([[20, 20, 20], [21, 21, 21], [22, 22, 22], [23, 23, 23]])
These values are made up just to illustrate my case. Is there a way to force imshow to assign each value in A the corresponding pair of coordinates (real_x, real_y)?
PS: I am not looking for adding or subtracting an offset to the array-based x and y to make them match real_x and real_y, but for something that reads the coordinates from the real_x and real_y arrays themselves. The intended outcome is an image with the real_x values on the x-axis and the real_y values on the y-axis.
Setting the extent
Assuming you have
real_x=np.array([15,16,17])
real_y=np.array([20,21,22,23])
you would set the image extent as
# Pad by half a sample step so the real coordinates fall on the pixel centers
dx = (real_x[1] - real_x[0]) / 2.
dy = (real_y[1] - real_y[0]) / 2.
extent = [real_x[0] - dx, real_x[-1] + dx, real_y[0] - dy, real_y[-1] + dy]
plt.imshow(data, extent=extent)
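A minimal end-to-end sketch of this approach, assuming the 4×3 array and the origin="upper" convention from the question:
import numpy as np
import matplotlib.pyplot as plt
data = np.array([[3, 2, 5], [8, 1, 2], [6, 6, 7], [3, 5, 1]])
real_x = np.array([15, 16, 17])
real_y = np.array([20, 21, 22, 23])
# Half-step padding so ticks at 15..17 and 20..23 land on the pixel centers
dx = (real_x[1] - real_x[0]) / 2.
dy = (real_y[1] - real_y[0]) / 2.
extent = [real_x[0] - dx, real_x[-1] + dx, real_y[0] - dy, real_y[-1] + dy]
im = plt.imshow(data, origin="upper", interpolation="nearest",
                cmap=plt.cm.gray_r, extent=extent)
plt.colorbar(im)
plt.show()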
Changing ticklabels
An alternative would be to keep the pixel coordinates and just change the tick labels:
real_x=np.array([15,16,17])
real_y=np.array([20,21,22,23])
plt.imshow(data)
plt.gca().set_xticks(range(len(real_x)))
plt.gca().set_yticks(range(len(real_y)))
plt.gca().set_xticklabels(real_x)
plt.gca().set_yticklabels(real_y)
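A runnable sketch of the same idea, again assuming the question's 4×3 array. The axes keep their pixel indices internally and only the displayed labels change, so anything else drawn on the same axes must use pixel coordinates:
import numpy as np
import matplotlib.pyplot as plt
data = np.array([[3, 2, 5], [8, 1, 2], [6, 6, 7], [3, 5, 1]])
real_x = np.array([15, 16, 17])
real_y = np.array([20, 21, 22, 23])
plt.imshow(data, origin="upper", interpolation="nearest", cmap=plt.cm.gray_r)
# Ticks sit on the pixel indices; only the labels shown are replaced
plt.gca().set_xticks(range(len(real_x)))
plt.gca().set_yticks(range(len(real_y)))
plt.gca().set_xticklabels(real_x)
plt.gca().set_yticklabels(real_y)
plt.show()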
If I understand correctly, this is about producing a raster for imshow: given image coordinates X and values y, build the input matrix for imshow. I am not aware of a standard function for that, so I implemented one:
import numpy as np

def to_raster(X, y):
    """
    :param X: (n, 2) array of image coordinates for the values in y
    :param y: vector of n scalar values, or (n, k) array of vector values
    :return: A, extent
    """
    def deduce_raster_params():
        """
        Computes the raster dimensions from the min/max coordinates in X;
        the sample step is the gap between the two smallest values per axis.
        """
        uniques = [np.unique(v) for v in X.T]
        d_min = np.array([u[0] for u in uniques])           # x min, y min
        d_max = np.array([u[-1] for u in uniques])          # x max, y max
        d_step = np.array([u[1] - u[0] for u in uniques])   # x step, y step
        nsamples = (np.round((d_max - d_min) / d_step) + 1).astype(int)
        return d_min, d_max, d_step, nsamples

    d_min, d_max, d_step, nsamples = deduce_raster_params()
    # Allocate the raster; keep a trailing channel axis so that y may be
    # a vector per point (e.g. RGB triplets)
    A = np.full((*nsamples, 1 if y.ndim == 1 else y.shape[-1]), np.nan)
    # Compute the raster index of each point in X
    ind = np.round((X - d_min) / d_step).T.astype(int)
    # Scalar/vector values are assigned over the outer dimensions
    A[tuple(ind)] = y.reshape(len(X), -1)
    # Prepare the extent in imshow format
    extent = np.vstack((d_min, d_max)).T.ravel()
    return A, extent
This can then be used with imshow as:
import matplotlib.pyplot as plt
A, extent = to_raster(X, y)
plt.imshow(A.squeeze(), extent=extent)  # drop the singleton channel axis for scalar y
Note that A's first axis is x, while imshow draws the first axis vertically, so depending on your convention you may want to swap the first two axes and pass origin="lower".
Note that deduce_raster_params() runs in O(n log n) rather than O(n) because of the sort inside np.unique(). This simplifies the code and shouldn't be a problem for data sizes that are reasonable to send to imshow.
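As a concrete, hedged check that ties this back to the question: here X is built by flattening the question's coordinate arrays into one (x, y) pair per value, which is my assumption about the intended layout. Note also that this extent runs from the minimum to the maximum coordinate, without the half-step padding used in the extent answer above:
import numpy as np
import matplotlib.pyplot as plt
values = np.array([[3, 2, 5], [8, 1, 2], [6, 6, 7], [3, 5, 1]])
real_x = np.array([[15, 16, 17]] * 4)
real_y = np.array([[20, 20, 20], [21, 21, 21], [22, 22, 22], [23, 23, 23]])
X = np.column_stack((real_x.ravel(), real_y.ravel()))  # one (x, y) pair per value
y = values.ravel()
A, extent = to_raster(X, y)
# A's first axis is x, so swap axes and use origin="lower" for display
plt.imshow(A.squeeze(-1).T, extent=extent, origin="lower", cmap=plt.cm.gray_r)
plt.colorbar()
plt.show()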
Source: https://stackoverflow.com/questions/44260491/matplotlib-how-to-make-imshow-read-x-y-coordinates-from-other-numpy-arrays