Question
I am writing a public domain calculator, whose code is available at: https://github.com/okfn/pdcalc/blob/master/pd/map.rdf
The code is currently unable to properly determine the public domain status of a work because of an issue I have run into with SPARQL 1.0: it does not appear to be possible to perform arithmetic operations on dates, which means the calculator cannot determine, e.g., whether or not 70 years have passed since the death of the author. Unfortunately, none of the standard Python RDF libraries have implemented support for SPARQL 1.1 yet. Hence, I was wondering whether anyone has any suggestion as to how to overcome this limitation, or perhaps knows of a Python library with better SPARQL support?
Looking forward to your feedback!
Answer 1:
Even SPARQL 1.1 does not support arithmetic operations on dates by default. See the section on SPARQL operator mapping: arithmetic operations are only defined on numeric datatypes.
There may be some SPARQL 1.1 implementations that offer an extension for this purpose, but I'm not immediately aware of any that have this built-in now, certainly not in Python.
Your best bet is to get in touch with the developers of the SPARQL engine of your choice and pester them to implement such an extension, or of course roll your own.
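One way to roll your own today: recent rdflib releases let you register a custom SPARQL function under a URI of your choosing and call it from a FILTER. A minimal sketch (the function URI, property name, and input file are hypothetical placeholders, and this API postdates the original question):

import datetime

from rdflib import Graph, Literal, URIRef
from rdflib.namespace import XSD
from rdflib.plugins.sparql.operators import register_custom_function

# Hypothetical URI under which the custom function is exposed to SPARQL.
YEARS_SINCE_DEATH = URIRef("http://example.org/fn#yearsSinceDeath")

def years_since_death(death):
    """Whole years elapsed since a death date given as an xsd:date/xsd:dateTime literal."""
    d = death.toPython()
    if isinstance(d, datetime.datetime):
        d = d.date()
    today = datetime.date.today()
    return Literal(
        today.year - d.year - ((today.month, today.day) < (d.month, d.day)),
        datatype=XSD.integer,
    )

register_custom_function(YEARS_SINCE_DEATH, years_since_death)

g = Graph()
g.parse("works.rdf")  # hypothetical input data

# pd:authorDiedOn is a placeholder property name.
results = g.query("""
    PREFIX fn: <http://example.org/fn#>
    PREFIX pd: <http://example.org/pd#>
    SELECT ?work WHERE {
        ?work pd:authorDiedOn ?death .
        FILTER ( fn:yearsSinceDeath(?death) > 70 )
    }
""")
for (work,) in results:
    print(work)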
As a workaround, most SPARQL engines (even 1.0) do support comparison operations on dates, so you can do things like sorting and comparing, but you'll have to do some custom post-processing on your query results; a sketch of that approach follows below.
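For instance, the query can stay plain SPARQL 1.0 (just return the raw dates) while the arithmetic happens on the Python side. A minimal sketch, assuming an rdflib version with SPARQL support (recent releases, or older ones with the rdfextras plugin), death dates stored as xsd:date/xsd:dateTime literals, and placeholder file and property names:

import datetime

from rdflib import Graph

g = Graph()
g.parse("works.rdf")  # hypothetical input data

# SPARQL 1.0: only select the raw dates, no arithmetic in the query itself.
rows = g.query("""
    PREFIX pd: <http://example.org/pd#>
    SELECT ?work ?death WHERE {
        ?work pd:authorDiedOn ?death .
    }
""")

cutoff_year = datetime.date.today().year - 70
for work, death in rows:
    death_year = death.toPython().year  # assumes an xsd:date or xsd:dateTime literal
    status = "public domain" if death_year < cutoff_year else "still protected"
    print(work, status)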
Update: I just realized I overlooked something rather important: SPARQL 1.1 does of course support functions like year(), month(), etc., which return the year and month components of a datetime value as integers, and which you could conceivably use to do roundabout arithmetic on dates.
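With a SPARQL 1.1 capable engine (recent rdflib releases, for example), the whole check can then stay inside the query. A sketch under that assumption, again with placeholder file and property names:

from rdflib import Graph

g = Graph()
g.parse("works.rdf")  # hypothetical input data

# YEAR() and NOW() are SPARQL 1.1 built-ins, so this needs a 1.1-capable engine.
results = g.query("""
    PREFIX pd: <http://example.org/pd#>
    SELECT ?work WHERE {
        ?work pd:authorDiedOn ?death .
        FILTER ( YEAR(NOW()) - YEAR(?death) > 70 )
    }
""")
for (work,) in results:
    print(work)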
Answer 2:
While you cannot do arithmetic operations on dates in SPARQL 1.0, if an implementation follows the specification you should at least be able to compare dates:
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
SELECT *
WHERE
{
  # Your triple patterns here
  FILTER ( ?date > "2011-11-20T00:00:00Z"^^xsd:dateTime )
}
Now that still doesn't get around your problem that you need to take the author's date of death and add 70 years to it. What you probably need to do is calculate that part in your client code and inject it into your SPARQL queries. This may mean doing two queries: one to get the information and one to work out whether the work is in the public domain. To be honest, you can probably do that second calculation entirely in client code and save yourself the extra query, as in the sketch below.
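A sketch of that client-side approach: compute a rough calendar-year cut-off in Python and inject it as a literal, so the FILTER only needs the date comparison that SPARQL 1.0 already supports. The file and property names are placeholders, and ?death is assumed to be an xsd:dateTime:

import datetime

from rdflib import Graph

g = Graph()
g.parse("works.rdf")  # hypothetical input data

# Compute "70 years before now" in Python and inject it into the query,
# keeping the FILTER itself to a plain SPARQL 1.0 comparison.
cutoff_year = datetime.date.today().year - 70
query = """
    PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
    PREFIX pd: <http://example.org/pd#>
    SELECT ?work WHERE {
        ?work pd:authorDiedOn ?death .
        FILTER ( ?death < "%d-01-01T00:00:00Z"^^xsd:dateTime )
    }
""" % cutoff_year

for (work,) in g.query(query):
    print(work, "appears to be in the public domain")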
While this is not ideal, until there is a good SPARQL 1.1 compliant Python library it's what you're stuck with.
Source: https://stackoverflow.com/questions/8201012/how-to-perform-arithmetic-operations-in-sparql-with-python