Question
I'm running a series of queries against the DBpedia SPARQL endpoint (from inside a loop). The code looks more or less like this:
for (String citySplit : citiesSplit) {
    RepositoryConnection conn = dbpediaEndpoint.getConnection();
    String sparqlQueryLat = " SELECT ?lat ?lon WHERE { "
            + "<http://dbpedia.org/resource/" + citySplit.trim().replaceAll(" ", "_") + "> <http://www.w3.org/2003/01/geo/wgs84_pos#lat> ?lat . "
            + "<http://dbpedia.org/resource/" + citySplit.trim().replaceAll(" ", "_") + "> <http://www.w3.org/2003/01/geo/wgs84_pos#long> ?lon ."
            + "}";
    TupleQuery queryLat = conn.prepareTupleQuery(QueryLanguage.SPARQL, sparqlQueryLat);
    TupleQueryResult resultLat = queryLat.evaluate();
}
The problem is that, after a few iterations, I get a 503 message:
httpclient.wire.header - << "HTTP/1.1 503 Service Temporarily Unavailable[\r][\n]"
(...)
org.openrdf.query.QueryInterruptedException
at org.openrdf.http.client.HTTPClient.getTupleQueryResult(HTTPClient.java:1041)
at org.openrdf.http.client.HTTPClient.sendTupleQuery(HTTPClient.java:438)
at org.openrdf.http.client.HTTPClient.sendTupleQuery(HTTPClient.java:413)
at org.openrdf.repository.http.HTTPTupleQuery.evaluate(HTTPTupleQuery.java:41)
If I understand correctly, this 503 message comes from DBpedia. Am I right? The number of consecutive queries that succeed varies: sometimes the loop runs for 13 seconds before getting the message, sometimes for 15 minutes. In any case, I don't think this is normal. What could be happening?
Answer 1:
The Accessing the DBpedia Data Set over the Web page of the DBpedia wiki says, in section 1.1, Public SPARQL Endpoint:
Fair Use Policy: Please read this post for information about restrictions on the public DBpedia endpoint. These might also be usefull [sic]: 1, 2.
The linked post says that the public DBpedia SPARQL endpoint implements rate limiting.
The http://dbpedia.org/sparql endpoint has both rate limiting on the number of connections/sec you can make and restrictions on result set size and query time, as per the following settings:
[SPARQL]
ResultSetMaxRows           = 2000
MaxQueryExecutionTime      = 120
MaxQueryCostEstimationTime = 1500
These are in place to make sure that everyone has an equal chance to de-reference data from dbpedia.org, as well as to guard against badly written queries/robots.
I think that it is likely that you are hitting that limit.
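To stay within those limits, one option is to slow down the loop and clean up resources between requests. Below is a minimal sketch, not a definitive fix: it opens one connection instead of one per city, closes each result, and pauses between queries, backing off after a failure. The Repository named dbpediaEndpoint, the helper method name, and the delay values are assumptions for illustration; the actual rate you need to stay under is not documented.
import org.openrdf.query.QueryEvaluationException;
import org.openrdf.query.QueryLanguage;
import org.openrdf.query.TupleQuery;
import org.openrdf.query.TupleQueryResult;
import org.openrdf.repository.Repository;
import org.openrdf.repository.RepositoryConnection;

public class ThrottledDbpediaQueries {

    static void queryCities(Repository dbpediaEndpoint, Iterable<String> citiesSplit)
            throws Exception {
        // Open the connection once for the whole loop instead of once per city.
        RepositoryConnection conn = dbpediaEndpoint.getConnection();
        try {
            for (String citySplit : citiesSplit) {
                String resource = "<http://dbpedia.org/resource/"
                        + citySplit.trim().replaceAll(" ", "_") + ">";
                String sparqlQueryLat = "SELECT ?lat ?lon WHERE { "
                        + resource + " <http://www.w3.org/2003/01/geo/wgs84_pos#lat> ?lat . "
                        + resource + " <http://www.w3.org/2003/01/geo/wgs84_pos#long> ?lon . }";
                TupleQuery queryLat = conn.prepareTupleQuery(QueryLanguage.SPARQL, sparqlQueryLat);
                try {
                    TupleQueryResult resultLat = queryLat.evaluate();
                    try {
                        while (resultLat.hasNext()) {
                            resultLat.next(); // consume/process the binding set here
                        }
                    } finally {
                        resultLat.close(); // always release the result stream
                    }
                } catch (QueryEvaluationException e) {
                    // Likely throttled (503); back off before trying the next city.
                    Thread.sleep(10000);
                }
                Thread.sleep(1000); // space out requests to respect the fair-use policy
            }
        } finally {
            conn.close();
        }
    }
}
The pause lengths above are guesses at a polite rate, not published limits. Another way to reduce the number of requests would be to ask for several cities in a single SPARQL query rather than one query per city.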
Source: https://stackoverflow.com/questions/7447052/repeating-503s-messages-when-querying-dbpedia