I'm using Hermit Reasoner with OWL-API 5 as follows:
OWLOntologyManager manager = OWLManager.createOWLOntologyManager(); // create the manager
OWLOntology ontology = manager.loadOntologyFromOntologyDocument(new File("ontologies/E1G1.owl"));
OWLDataFactory dataFactory = manager.getOWLDataFactory();
Configuration config = new Configuration();
Reasoner reasoner = new Reasoner(config, ontology);
reasoner.classifyClasses();
reasoner.classifyDataProperties();
reasoner.classifyObjectProperties();
System.out.println(reasoner.isConsistent());
Now I would like to execute SPARQL queries over the inferred ontology, analogous to what the Protégé SPARQL plugin does. I'm experimenting with Jena ARQ, but it is not clear to me how to integrate the two. Any suggestions?
I do not think there is an existing integration between Jena and HermiT. OpenPellet, a reasoner built on top of Pellet, has Jena integration.
The question is whether you indeed need an external reasoner. If not, you can use the OWL reasoners provided as part of Jena. See Jena OWL Reasoners.
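If one of Jena's built-in OWL rule reasoners is sufficient, the usual pattern is to wrap the base model in an InfModel and run ARQ queries against that. A minimal sketch (the file path follows the question; the query is illustrative):

```java
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.ResultSetFormatter;
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.Reasoner;
import org.apache.jena.reasoner.ReasonerRegistry;
import org.apache.jena.riot.RDFDataMgr;

public class SparqlOverInferredModel {
    public static void main(String[] args) {
        // Load the base ontology into a plain model
        Model base = RDFDataMgr.loadModel("ontologies/E1G1.owl");

        // Attach one of Jena's built-in OWL rule reasoners
        Reasoner reasoner = ReasonerRegistry.getOWLReasoner();
        InfModel inf = ModelFactory.createInfModel(reasoner, base);

        // ARQ queries against the InfModel see asserted *and* entailed triples
        String query = "SELECT ?s ?type WHERE { ?s a ?type } LIMIT 10";
        try (QueryExecution qe = QueryExecutionFactory.create(query, inf)) {
            ResultSetFormatter.out(System.out, qe.execSelect());
        }
    }
}
```

Note that Jena's rule reasoners are not complete for OWL DL, so the entailments may differ from what HermiT would produce.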
Related
I am using Protégé to explore the SSN ontology, but a lot of classes like survivalRange seem to be missing from the main file. How could I import all the classes and entities of ssn-system without importing them manually, one by one, by IRI?
You can import http://www.w3.org/ns/ssn/systems/, which itself imports http://www.w3.org/ns/ssn/.
I am trying to use the Jena API to create a hierarchy of an OWL ontology, similar to the one shown by Protégé. I have tried two methods to get the subclasses of owl:Thing and then recurse for the next levels:
Using the listSubClasses(true)
Using the listHierarchyRootClasses()
Both work for OWL classes that have rdfs:subClassOf owl:Thing. However, for complex class expressions (owl:unionOf, owl:intersectionOf, and owl:complementOf), the first method lists nothing, while the second method's result does not match Protégé: it often includes more subclasses of owl:Thing than Protégé does.
Someone said that this is a limitation of the Jena API. Is that true? Should I switch to the OWL API instead of Jena? I would like your advice.
What's missing is a reasoner. You cannot get complete results without using a reasoner to infer subclass relationships; this is true for both Jena and the OWL API.
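In Jena, attaching a reasoner means creating the OntModel with an inference-enabled OntModelSpec. A sketch, assuming a placeholder file path (beware that equivalent classes produce mutual subclass links, so a real traversal should track visited classes):

```java
import org.apache.jena.ontology.OntClass;
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.util.iterator.ExtendedIterator;

public class HierarchyWithReasoner {
    public static void main(String[] args) {
        // OWL_MEM_MICRO_RULE_INF attaches a rule reasoner that
        // infers subclass relationships, including those implied
        // by unionOf/intersectionOf/complementOf expressions
        OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM_MICRO_RULE_INF);
        model.read("file:ontologies/myontology.owl"); // placeholder path

        // With inference enabled, the hierarchy roots are much
        // closer to what Protégé displays
        ExtendedIterator<OntClass> roots = model.listHierarchyRootClasses();
        while (roots.hasNext()) {
            printTree(roots.next(), 0);
        }
    }

    static void printTree(OntClass c, int depth) {
        if (c.isAnon()) return; // skip anonymous class expressions
        System.out.println("  ".repeat(depth) + c.getLocalName());
        c.listSubClasses(true).forEachRemaining(sub -> printTree(sub, depth + 1));
    }
}
```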
I am loading a JSON-LD document using Jena:
Model mj = RDFDataMgr.loadModel([filename]);
The actual content being loaded is here: http://build.fhir.org/account-example.jsonld
Jena goes off and resolves the context, and returns an error (LOADING_REMOTE_CONTEXT_FAILED - lovely suppression of the actual cause in the Jena code :-(). But I need to override the context anyway and use a different source, because I am doing the build for what will be posted at build.fhir.org, and I need to use my local versions instead. I can't see how to override the context resolution.
Alternatively, I could use the loading method documented here: https://github.com/jsonld-java/jsonld-java#code-example - but I have no idea how to get to a Jena graph from there (and I haven't figured out how to make the custom resolution work in my Eclipse context either).
How can I get to a Jena graph using a context defined in code somewhere?
I think the Jena devs are subscribed to the relevant tag RSS feeds. They might weigh in on the clarity of the LOADING_REMOTE_CONTEXT_FAILED error, but it seems pretty clear to me.
In order to override the context, you can use the read(Model model, String uri, Context context) method. ModelFactory.createDefaultModel() will create an instance of a Model that you can pass as the first argument. See more examples here: https://github.com/apache/jena/tree/master/jena-arq/src-examples/arq/examples/riot
The alternative library is not Jena-compatible (nor RDF4J-compatible, which strikes me as rather silly), so there is no easy way to use it with Jena-dependent code.
Finally, you provided a code example for getting a model but now mention a graph; there is a method for that as well: read(Graph graph, String uri, Context context).
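The two call shapes described above, as a skeleton sketch. Which symbols you put into the Context to customize JSON-LD context resolution depends on the Jena version, so the Context here is left unpopulated; the URI is the example document from the question:

```java
import org.apache.jena.graph.Graph;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.sparql.graph.GraphFactory;
import org.apache.jena.sparql.util.Context;

public class JsonLdLoad {
    public static void main(String[] args) {
        String uri = "http://build.fhir.org/account-example.jsonld";

        // Settings controlling JSON-LD parsing go into this Context;
        // the exact symbols are version-dependent (left empty here)
        Context ctx = new Context();

        // Model variant: create an empty default model and let RDFDataMgr fill it
        Model model = ModelFactory.createDefaultModel();
        RDFDataMgr.read(model, uri, ctx);

        // Graph variant: same idea, one level lower
        Graph graph = GraphFactory.createDefaultGraph();
        RDFDataMgr.read(graph, uri, ctx);
    }
}
```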
In Jena I'm loading an ontology into a Model, using this code:
Model model = FileManager.get().loadModel("/path/myontology.owl");
My problem is that "myontology.owl" imports another ontology with owl:imports. In pseudocode, let's just say that "myontology.owl" imports other files to complete the ontology, since several individuals are declared in external files, e.g.:
In myontology.owl
Import → myontologywithindividuals.owl
My problem is that I can't import the ontology with its individuals into a single Model in Jena. That is,
Model model = FileManager.get().loadModel("/path/myontology.owl");
doesn't seem to work. Any idea why? How can I import this correctly?
Plain models in Jena don't do any processing of owl:imports, because plain RDF has no notion of importing other documents. Ontology imports are an OWL concept, and you'll need to use an OntModel if you want imports processing. You may need to call setDynamicImports() to enable import processing. If the imports statements refer to ontologies by their ontology IRI, but you want to retrieve them from local files, you may also need to set up the OntModel's OntDocumentManager and FileManager to take care of the appropriate mapping from IRIs to local files.
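A sketch of the OntModel route (the file paths are placeholders and the imported ontology's IRI is hypothetical, since the question doesn't give it):

```java
import org.apache.jena.ontology.OntDocumentManager;
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;

public class LoadWithImports {
    public static void main(String[] args) {
        // OWL_MEM: an OWL model with no reasoner attached
        OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);

        // Map the imported ontology's IRI to a local copy, so the import
        // is resolved from disk instead of over the network
        OntDocumentManager dm = model.getDocumentManager();
        dm.addAltEntry("http://example.org/myontologywithindividuals",
                       "file:/path/myontologywithindividuals.owl");

        // read() on an OntModel follows owl:imports automatically
        model.read("file:/path/myontology.owl");

        System.out.println("Loaded " + model.size() + " statements, including imports");
    }
}
```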
In the Inferring a JVM Model section of the Xtext documentation (http://www.eclipse.org/Xtext/documentation.html#_17) it starts by saying:
"In many cases, you will want your DSLs concepts to be usable as Java elements. E.g. an Entity will become a Java class and should be usable as such".
In the example above, how can I use the generated Entity class outside of Xbase, i.e. in real Java code in a different project from the Xtext one?
What I am essentially asking is whether the Java classes created by my model inferrer can actually be used as real Java classes, whose methods can be called and whose fields can be accessed from Java code in an altogether different project, and if so, how this can be done.
My reading of the documentation has led me to fear that the generated "Java classes" are only Xbase types, referenceable only in an Xtext context, and are therefore not real Java classes...
The Xbase compiler can compile all Xbase expressions to plain Java code, usable everywhere Java code is available.
If you add your own elements to the language, you have to extend the generator to also support these elements - for this reason you define your own JVMModelInferrer.
The basic Xtext compiler then executes the JVMModelInferrer and calculates the JVM model, which might (or might not) contain Xbase expressions as well; this JVM model is then generated into Java-compilable (and thus Java-reusable) code.
If you want to test this functionality, simply generate the Xtext Domain Model example (available from the New... wizards in the Xtext/Examples category) and inspect the results: when you edit your domain model, Xtext automatically generates usable Java code (provided the required dependencies are set up).
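To make the "Java-reusable" point concrete: the inferrer's output is ordinary Java source, so any project that has the generated source folder on its classpath can use it like hand-written code. A hypothetical sketch of what might be generated for a simple entity declaration, and how plain Java client code in another project would call it (class and member names are illustrative, not actual Xtext output):

```java
// What a JvmModelInferrer might generate for an
// "entity Person { name: String }" declaration (hypothetical)
class Person {
    private String name;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

// Ordinary Java client code, using the generated class directly
public class Client {
    public static void main(String[] args) {
        Person p = new Person();
        p.setName("Alice");
        System.out.println(p.getName()); // prints "Alice"
    }
}
```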