I want to find the most efficient (fastest) method to calculate the distances between pairs of lat/long coordinates.
A not-so-efficient solution has been presented (here) using sapply and spDistsN1{sp}. I believe this could be made much faster by using spDistsN1{sp} inside a data.table with the := operator, but I haven't been able to do that. Any suggestions?
Here is a reproducible example:
# load libraries
library(data.table)
library(dplyr)
library(sp)
library(rgeos)
library(UScensus2000tract)
# load data and create an Origin-Destination matrix
data("oregon.tract")
# get centroids as a data.frame
centroids <- as.data.frame(gCentroid(oregon.tract,byid=TRUE))
# Convert row names into first column
setDT(centroids, keep.rownames = TRUE)[]
# create Origin-destination matrix
orig <- centroids[1:754, ]
dest <- centroids[2:755, ]
odmatrix <- bind_cols(orig,dest)
colnames(odmatrix) <- c("origi_id", "long_orig", "lat_orig", "dest_id", "long_dest", "lat_dest")
My failed attempt using data.table:
odmatrix[ , dist_km := spDistsN1(as.matrix(long_orig, lat_orig), as.matrix(long_dest, lat_dest), longlat=T)]
Here is a solution that works (but probably less efficiently):
odmatrix$dist_km <- sapply(1:nrow(odmatrix),function(i)
spDistsN1(as.matrix(odmatrix[i,2:3]),as.matrix(odmatrix[i,5:6]),longlat=T))
head(odmatrix)
>   origi_id long_orig lat_orig  dest_id long_dest lat_dest dist_km
>      (chr)     (dbl)    (dbl)    (chr)     (dbl)    (dbl)   (dbl)
> 1 oregon_0   -123.51   45.982 oregon_1   -123.67   46.113 19.0909
> 2 oregon_1   -123.67   46.113 oregon_2   -123.95   46.179 22.1689
> 3 oregon_2   -123.95   46.179 oregon_3   -123.79   46.187 11.9014
> 4 oregon_3   -123.79   46.187 oregon_4   -123.83   46.181  3.2123
> 5 oregon_4   -123.83   46.181 oregon_5   -123.85   46.182  1.4054
> 6 oregon_5   -123.85   46.182 oregon_6   -123.18   46.066 53.0709
I wrote my own version of geosphere::distHaversine so that it would more naturally fit into a data.table := call, and it might be of use here:
dt.haversine <- function(lat_from, lon_from, lat_to, lon_to, r = 6378137){
  # convert degrees to radians
  radians <- pi/180
  lat_to <- lat_to * radians
  lat_from <- lat_from * radians
  lon_to <- lon_to * radians
  lon_from <- lon_from * radians
  dLat <- (lat_to - lat_from)
  dLon <- (lon_to - lon_from)
  # haversine formula; r is the Earth radius in metres, so the result is in metres
  a <- (sin(dLat/2)^2) + (cos(lat_from) * cos(lat_to)) * (sin(dLon/2)^2)
  return(2 * atan2(sqrt(a), sqrt(1 - a)) * r)
}
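As a usage sketch (assuming the odmatrix from the question; setDT() converts it to a data.table first, the dist_km_hav column name is just illustrative, and dt.haversine() returns metres, hence the division by 1000):
setDT(odmatrix)
odmatrix[, dist_km_hav := dt.haversine(lat_orig, long_orig, lat_dest, long_dest) / 1000]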
Update 18/07/2019
You can also write a C++ version through Rcpp.
#include <Rcpp.h>
using namespace Rcpp;

double inverseHaversine(double d){
  return 2 * atan2(sqrt(d), sqrt(1 - d)) * 6378137.0;
}

double distanceHaversine(double latf, double lonf, double latt, double lont,
                         double tolerance){
  double d;
  double dlat = latt - latf;
  double dlon = lont - lonf;
  d = (sin(dlat * 0.5) * sin(dlat * 0.5)) + (cos(latf) * cos(latt)) * (sin(dlon * 0.5) * sin(dlon * 0.5));
  if(d > 1 && d <= tolerance){
    d = 1;
  }
  return inverseHaversine(d);
}

double toRadians(double deg){
  return deg * 0.01745329251; // PI / 180
}

// [[Rcpp::export]]
Rcpp::NumericVector rcpp_distance_haversine(Rcpp::NumericVector latFrom, Rcpp::NumericVector lonFrom,
                                            Rcpp::NumericVector latTo, Rcpp::NumericVector lonTo,
                                            double tolerance) {
  int n = latFrom.size();
  NumericVector distance(n);
  double latf;
  double latt;
  double lonf;
  double lont;
  double dist = 0;

  for(int i = 0; i < n; i++){
    latf = toRadians(latFrom[i]);
    lonf = toRadians(lonFrom[i]);
    latt = toRadians(latTo[i]);
    lont = toRadians(lonTo[i]);
    dist = distanceHaversine(latf, lonf, latt, lont, tolerance);
    distance[i] = dist;
  }

  return distance;
}
Save this file somewhere and use Rcpp::sourceCpp("distance_calcs.cpp") to load the functions into your R session.
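A minimal call sketch, assuming odmatrix has already been converted with setDT(); the very large tolerance just lets any rounding-induced value slightly above 1 be clamped back to 1 inside distanceHaversine(), dist_m is an illustrative column name, and results are in metres:
Rcpp::sourceCpp("distance_calcs.cpp")
odmatrix[, dist_m := rcpp_distance_haversine(lat_orig, long_orig, lat_dest, long_dest, tolerance = 1e10)]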
Here are some benchmarks on how they perform against the original geosphere::distHaversine and geosphere::distGeo. I've made the objects ~85k rows just so it's more meaningful:
dt <- rbindlist(list(odmatrix, odmatrix, odmatrix, odmatrix, odmatrix, odmatrix))
dt <- rbindlist(list(dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt, dt))
dt1 <- copy(dt); dt2 <- copy(dt); dt3 <- copy(dt); dt4 <- copy(dt)
library(microbenchmark)
library(geosphere)   # for distHaversine() and distGeo()

microbenchmark(
  rcpp = {
    dt4[, dist := rcpp_distance_haversine(lat_orig, long_orig, lat_dest, long_dest, tolerance = 10000000000.0)]
  },
  dtHaversine = {
    dt1[, dist := dt.haversine(lat_orig, long_orig, lat_dest, long_dest)]
  },
  haversine = {
    dt2[, dist := distHaversine(matrix(c(long_orig, lat_orig), ncol = 2),
                                matrix(c(long_dest, lat_dest), ncol = 2))]
  },
  geo = {
    dt3[, dist := distGeo(matrix(c(long_orig, lat_orig), ncol = 2),
                          matrix(c(long_dest, lat_dest), ncol = 2))]
  },
  times = 5
)
# Unit: milliseconds
# expr min lq mean median uq max neval
# rcpp 5.622847 5.683959 6.208954 5.925277 6.036025 7.776664 5
# dtHaversine 9.024500 12.413380 12.335681 12.992920 13.590566 13.657037 5
# haversine 30.911136 33.628153 52.503700 36.038927 40.791089 121.149197 5
# geo 83.646104 83.971163 88.694377 89.548176 90.569327 95.737117 5
Naturally, due to the way the distances are calculated in the two different techniques (geo & haversine), the results will differ slightly.
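A quick sketch of the size of that discrepancy, assuming the dt2 (haversine) and dt3 (geodesic) objects from the benchmark above, with distances in metres:
summary(abs(dt2$dist - dt3$dist))             # absolute difference in metres
summary(abs(dt2$dist - dt3$dist) / dt3$dist)  # relative difference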
Thanks to @chinsoon12's comment I found a quite fast solution combining distGeo{geosphere} and data.table. On my laptop the fast solution was more than 120 times faster than the alternative.
Let's make the data set larger to compare speed performance.
# Replicate the observations 1000 times
odmatrix <- odmatrix[rep(seq_len(nrow(odmatrix)), 1000), ]
Slow solution:
system.time(
odmatrix$dist_km <- sapply(1:nrow(odmatrix),function(i)
spDistsN1(as.matrix(odmatrix[i,2:3]),as.matrix(odmatrix[i,5:6]),longlat=T))
)
> user system elapsed
> 222.17 0.08 222.84
Fast solution:
# load library
library(geosphere)
# convert the data.frame to a data.table
setDT(odmatrix)
system.time(
odmatrix[ , dist_km2 := distGeo(matrix(c(long_orig, lat_orig), ncol = 2),
matrix(c(long_dest, lat_dest), ncol = 2))/1000]
)
> user system elapsed
> 1.76 0.03 1.79
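As a sanity check, the two columns can be compared directly; they won't match exactly, since spDistsN1 computes a spherical great-circle distance while distGeo uses an ellipsoidal geodesic:
# both columns are in km; small systematic differences are expected
summary(abs(odmatrix$dist_km - odmatrix$dist_km2))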
Source: https://stackoverflow.com/questions/36817423/how-to-efficiently-calculate-distance-between-pair-of-coordinates-using-data-tab