Pellet Reasoner with Jena


Question


I am running Pellet with Jena as follows:

public void storeInferredModel(Data data) {
    System.out.println("creating inferred dataset ");
    Dataset dataset = TDBFactory.createDataset(data.getInferredResultsPath());
    System.out.println("creating OntModel ");
    OntModel Infmodel = ModelFactory.createOntologyModel(
                          PelletReasonerFactory.THE_SPEC, 
                          dataset.getNamedModel(this.URL));
    System.out.println("adding schema (OWL) to OntModel");
    Infmodel.add(this.owl);
    System.out.println("adding data (RDF) to OntModel ");
    Infmodel.add(data.tdb);
    System.out.println("creating ModelExtractor ");
    ModelExtractor ext = new ModelExtractor(Infmodel);
    System.out.println("replacing OntModel by the Extracted Model");
    dataset.replaceNamedModel(this.URL, ext.extractModel());
    System.out.println("saving inferred model");
    Infmodel.close();
    System.out.println("closing inferred dataset");
    dataset.close();
}

My TDB store (the raw data) is 2.7 GB. I have been running the reasoner against the TDB, but I get a "GC overhead limit exceeded" error from the JVM even though I give the program around 45 GB of memory; the reasoner had used only about 30 GB when it crashed. In other words, it did not reach the machine's memory limit.
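
To double-check that the JVM really received the heap I asked for, I can print the heap limits from inside the process. This is only a minimal sketch using the standard Runtime API; the class name HeapCheck is just for illustration and is not part of my actual program.

public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        // maxMemory() reflects the -Xmx limit; totalMemory()/freeMemory() show current usage.
        System.out.println("max heap   (MB): " + rt.maxMemory() / mb);
        System.out.println("total heap (MB): " + rt.totalMemory() / mb);
        System.out.println("used heap  (MB): " + (rt.totalMemory() - rt.freeMemory()) / mb);
    }
}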

Another question about Pellet: when I run the code above on a small dataset, I get far too many owl:sameAs, owl:disjointWith, etc. statements in the result. Is that a bug, or am I doing something wrong with Pellet?
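
For illustration, a plain Jena-level filter like the one below could strip those statements from the extracted model before it is written back to the dataset. This is only a sketch of a workaround, not a Pellet-specific fix; it assumes owl:sameAs and owl:disjointWith are the statements to suppress, and the class and method names are made up for the example.

import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.RDFNode;
import com.hp.hpl.jena.vocabulary.OWL;

public class InferredModelFilter {
    // Removes inferred owl:sameAs and owl:disjointWith statements from the
    // extracted model. This is purely a Jena-level post-processing step.
    public static Model stripNoise(Model extracted) {
        extracted.removeAll(null, OWL.sameAs, (RDFNode) null);
        extracted.removeAll(null, OWL.disjointWith, (RDFNode) null);
        return extracted;
    }
}

In storeInferredModel this would be applied to ext.extractModel() just before the call to dataset.replaceNamedModel.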

Source: https://stackoverflow.com/questions/17687838/pellet-reasoner-with-jena
