For the following model, I need to create individuals for class1 and set literal values for property4 and property5 on the created individuals.
To do this, I create an individual for the anonymous class2 (in1) and set the property values on it. Then I create an individual for the anonymous class1 (in0) and call addProperty(property2, in1); finally I create an individual for class1 (in) and call addProperty(property1, in0).
String ns = "url.com";
OntModel model = ModelFactory.createOntologyModel(OntModelSpec.RDFS_MEM);
OntClass class1 = model.createClass(ns+"class1");
OntClass Aclass1= model.createClass();
OntClass Aclass2= model.createClass();
OntProperty pro1 = model.createOntProperty(ns + "pro1");
OntProperty pro2 = model.createOntProperty(ns + "pro2");
OntProperty pro3 = model.createOntProperty(ns + "pro3");
DatatypeProperty pro4 = model.createDatatypeProperty(ns + "pro4");
DatatypeProperty pro5 = model.createDatatypeProperty(ns + "pro5");
Individual in1 = Aclass2.createIndividual(ns + "in1");
in1.addProperty( pro4, model.createTypedLiteral( 50 ) )
.addProperty( pro5, model.createTypedLiteral( 60) );
Individual in0=Aclass1.createIndividual(ns+"in2");
in0.addProperty(pro2,in1);
Individual in=class1.createIndividual(ns+"indi");
in.addProperty(pro1, in0);
This gives the following exception when run:
Exception in thread "main" com.hp.hpl.jena.ontology.ProfileException: Attempted to use language construct DATATYPE_PROPERTY that is not supported in the current language profile: RDFS
at com.hp.hpl.jena.ontology.impl.OntModelImpl.checkProfileEntry(OntModelImpl.java:3058)
at com.hp.hpl.jena.ontology.impl.OntModelImpl.createDatatypeProperty(OntModelImpl.java:1395)
at com.hp.hpl.jena.ontology.impl.OntModelImpl.createDatatypeProperty(OntModelImpl.java:1375)
at test1.Hello.main(Hello.java:46)
What am I doing wrong, and is there a better way of doing this?
The spec is wrong: RDFS_MEM does not support owl:DatatypeProperty (or a lot of other OntModel constructs), only the RDFS vocabulary.
Try OntModelSpec.OWL_DL_MEM. It should eliminate the exception.
But note: OntModelSpec#OWL_DL_MEM covers OWL 1 DL, not OWL 2 DL. Jena does not support OWL 2 at all.
If you want to use the full OWL 2 DL specification with Jena, you can take a look at ONT-API, which is a Jena-based implementation of the OWL-API.
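For illustration, a minimal sketch of the corrected setup, reusing the names from the question (only the model spec changes):
// OWL_DL_MEM supports owl:DatatypeProperty and the other OWL constructs used above
OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_DL_MEM);
OntClass class1 = model.createClass(ns + "class1");
DatatypeProperty pro4 = model.createDatatypeProperty(ns + "pro4"); // no ProfileException now
DatatypeProperty pro5 = model.createDatatypeProperty(ns + "pro5");
The rest of the individual and property creation code can stay exactly as it is.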
Related
I generally keep my ontologies in two different files.
The first ontology file contains the classes, subclasses, data properties and object properties.
The second file contains all the individuals and the relationships between the individuals.
So I need to merge these two files in order to have a complete model. I wonder how this could be achieved using the OWL API?
In Jena, I do this as follows:
OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM, null);
try {
    model.read(new FileInputStream(MyOntologyFile), "...");
    model.read(new FileInputStream(MyOntologyWithIndividualsFile), "...");
} catch (Exception e) {
    log.error("Loading Model failed:" + e);
}
In a similar fashion, when I try to load my ontology files using the OWL API, I get an error:
OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
OWLObjectRenderer renderer = new DLSyntaxObjectRenderer();
File file = new File(MyOntologyFile);
File fileIndividuals = new File(MyOntologyWithIndividualsFile);
OWLOntology localOntology = null;
// Now load the local copy
try {
    localOntology = manager.loadOntologyFromOntologyDocument(file);
    localOntology = manager.loadOntologyFromOntologyDocument(fileIndividuals);
} catch (OWLOntologyCreationException ex) {
    ex.printStackTrace();
}
Error:
org.semanticweb.owlapi.model.OWLOntologyAlreadyExistsException: Ontology already exists. OntologyID(OntologyIRI(<http://www.semanticweb.org/lp4220/ontologies/2014/4/untitled-ontology-35>))
at uk.ac.manchester.cs.owl.owlapi.OWLOntologyManagerImpl.loadOntology(OWLOntologyManagerImpl.java:880)
at uk.ac.manchester.cs.owl.owlapi.OWLOntologyManagerImpl.loadOntologyFromOntologyDocument(OWLOntologyManagerImpl.java:806)
at uk.ac.manchester.cs.owl.owlapi.OWLOntologyManagerImpl.loadOntologyFromOntologyDocument(OWLOntologyManagerImpl.java:821)
Update:
As it turns out, merging of ontologies is only possible for ontologies with different IRIs, and hence I presume it is not acceptable to divide an ontology into two files with the same IRI. A solution for this (as commented by Joshua) may be to read all the individuals and axioms from one ontology and then add them to an already loaded ontology.
For ontologies with distinct IRIs, merging can be done as follows (example courtesy of Ignazio's OWLED 2011 slides, slide no. 27):
OWLOntologyManager m = create();
OWLOntology o1 = m.loadOntology(pizza_iri);
OWLOntology o2 = m.loadOntology(example_iri);
// Create our ontology merger
OWLOntologyMerger merger = new OWLOntologyMerger(m);
// Merge all of the loaded ontologies, specifying an IRI for the new ontology
IRI mergedOntologyIRI = IRI.create("http://www.semanticweb.com/mymergedont");
OWLOntology merged = merger.createMergedOntology(m, mergedOntologyIRI);
assertTrue(merged.getAxiomCount() > o1.getAxiomCount());
assertTrue(merged.getAxiomCount() > o2.getAxiomCount());
Your problem is not that the data shares the same IRI, but that ontologies with the same IRI are loaded into the same manager. Load the ontologies into separate managers and add all the axioms from one to the other; that will give you a merged ontology.
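As a rough sketch of that suggestion, assuming the same file variables as in the question and the OWL API 3.x style getAxioms()/addAxioms() calls (exception handling omitted):
// Each manager loads one file, so the identical ontology IRIs no longer clash
OWLOntologyManager managerA = OWLManager.createOWLOntologyManager();
OWLOntologyManager managerB = OWLManager.createOWLOntologyManager();
OWLOntology schemaOntology = managerA.loadOntologyFromOntologyDocument(new File(MyOntologyFile));
OWLOntology individualsOntology = managerB.loadOntologyFromOntologyDocument(new File(MyOntologyWithIndividualsFile));
// Copy every axiom from the individuals ontology into the schema ontology
managerA.addAxioms(schemaOntology, individualsOntology.getAxioms());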
In general, you do not make "Individuals and Relationships" part of the ontology, unless they are required for classification - say, to define the class "American Company" you need an individual "US". Otherwise, that other part should be plain RDF triples that refer to the ontology.
Recently I started my adventure with F#. I'm trying to create an F# library that I will use in my C# projects.
Now I'm facing the problem that I have two type definitions that (as I wish) should be able to use each other (I'm trying to create a fluent API for C# usage).
This is how I want to use it in C# (simplified example):
Shopping shopping = new Shopping();
Stuff[] stuff = shopping.GoTo("Wallmart").Buy(new [] { "Candies", "Ice cream", "Milk" }).GoTo("Drug store").Buy(new [] { "Anvil" }).GetStuff();
Now I have two types (in separate files):
type ShopResult(someContext: ShoppingContext) =
//some logic
member this.GoTo shopName = new ToDoResult(someContext)
type ToDoResult(someContext: ShoppingContext) =
//some logic
member this.Buy what = new ShopResult(someContext)
Now the file order causes a compilation error, and I'm wondering if there's any solution for my case, or do I have to drop the fluent API idea?
Put both types in the same file and change the definitions to the following:
type ShopResult(someContext: ShoppingContext) =
//some logic
member this.GoTo shopName = new ToDoResult(someContext)
and ToDoResult(someContext: ShoppingContext) =
//some logic
member this.Buy what = new ShopResult(someContext)
For more information, see the section 'Mutually Recursive Types' in the language reference on MSDN.
I have an OWL ontology file as RDF and want to store my data in TDB and use reasoning. Actually this sounds simple so far :)
But here is the point where I'm confused:
I created a TDB store and added some statements via SPARQL. Then I tried to load the TDB store via a model and an OWL reasoner:
OntModelSpec ontModelSpec = OntModelSpec.OWL_MEM;
Reasoner reasoner = ReasonerRegistry.getOWLReasoner();
ontModelSpec.setReasoner(reasoner);
Model schemaModel = FileManager.get().loadModel("D:/Users/jim/Desktop/ontology/schema.rdf");
OntModel schema = ModelFactory.createOntologyModel( ontModelSpec, schemaModel);
Location location = new Location("D:/Users/jim/Desktop/jena-fuseki-0.2.5/DB");
Dataset dataset = TDBFactory.createDataset(location);
Model model = dataset.getDefaultModel();
OntModel ontModel = ModelFactory.createOntologyModel(ontModelSpec, model);
When I now create new resources via the API, they are not stored in the TDB store, and I'm not able to see the statements I have added via SPARQL?!
The SPARQL query shows me only the entries I've added with SPARQL:
QueryExecution qExec = QueryExecutionFactory.create(
        StrUtils.strjoinNL("SELECT ?s ?p ?prop",
                           "WHERE {?s ?p ?prop}"),
        dataset);
ResultSet rs = qExec.execSelect();
try {
    ResultSetFormatter.out(rs);
} finally {
    qExec.close();
    System.out.println("closed connection");
}
and this returns only the Resource added with the API
System.out.print("instance: " + ontModel.getResource(NS + "TestItem"));
And when I call this:
ExtendedIterator<Statement> iter = ontModel.listStatements();
I get the following exception:
org.openjena.atlas.lib.InternalErrorException: Invalid id node for subject (null node): ([0000000000000067], [0000000000000093], [00000000000000C8])
Is someone able to explain that behavior? Or could someone please give me a hint on how to separate schema and data with TDB in the right way, using the OntModel?
Partial answer:
org.openjena.atlas.lib.InternalErrorException: Invalid id node for subject (null node): ([0000000000000067], [0000000000000093], [00000000000000C8])
You are using TDB without transactions - try adding TDB.sync before exiting to flush changes to the disk.
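For example, a minimal sketch under the non-transactional setup from the question (TDB here is com.hp.hpl.jena.tdb.TDB):
// ... create resources / add statements on ontModel or model via the API ...
TDB.sync(dataset); // flush pending changes in the TDB-backed dataset to disk
dataset.close();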
I just started to use Neo4j with Spring Data and I'm not able to recover graph objects and convert them back to domain objects. I must say I have no previous experience with that kind of database.
To that end, I'm using Spring Data repositories. For standard queries the repository code is auto-generated, but I would like to define some custom methods, so I created my custom repository as explained here.
For example, I would like to be able to update a certain property value (the currentValue property in this case) on a given edge between two given nodes (searchByUserName is a previously defined index in my node entity, which represents a user). I'm using the query method from the Neo4j template in my custom repository implementation as follows:
public class TwitterUserRepositoryImpl implements TwitterUserRepositoryCustom {

    @Autowired
    private Neo4jOperations neo4jTemplate;

    public void updateRelationshipValueByUserName(
            String userAUserName, String userBUserName, double value) {
        HashedMap params = new HashedMap();
        params.put("userAUserName", userAUserName);
        params.put("userBUserName", userBUserName);
        params.put("value", value);

        String query = "START x=node:searchByUserName(userName = {userAUserName}), " +
                "y=node:searchByUserName(userName = {userBUserName})" +
                " MATCH (x)-[r:FOLLOWS]->(y)" +
                " SET r.currentValue = {value}" +
                " RETURN r";

        Result<Map<String, Object>> relationships = neo4jTemplate.query(query, params);
        /* let's try to recover the relationship entity and do some more stuff */
    }
}
The Cypher query returns an "edge" between two users, where the relationship type is "FOLLOWS", simulating a Twitter-like network of users. I have no idea how to convert this query result back to my RelationshipEntity object. Is that possible?
Just use the result-dsl: http://static.springsource.org/spring-data/data-graph/snapshot-site/reference/html/#d5e1118
relationships.to(MyRelationshipEntity.class)
will return a Result<MyRelationshipEntity>, which is an Iterable.
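A rough usage sketch (MyRelationshipEntity stands in for your own @RelationshipEntity class, and the getter name is hypothetical):
Result<MyRelationshipEntity> follows = relationships.to(MyRelationshipEntity.class);
for (MyRelationshipEntity rel : follows) {
    // rel is the mapped relationship entity; read the updated property from it
    System.out.println(rel.getCurrentValue());
}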
I need to use OGNL to read some properties from a Java object. OGNL is a completely new thing to me. The documentation available on OGNL's website is really confusing to me.
Can anyone provide a simple HelloWorld example of using OGNL? (Any link to a tutorial is also helpful.)
Try this:
// java.awt.Dimension used as a simple bean with a "width" property
Dimension d = new Dimension(2, 2);
String expressionString = "width";
// parse the expression once...
Object expr = Ognl.parseExpression(expressionString);
OgnlContext ctx = new OgnlContext();
// ...then evaluate it against the root object d
Object value = Ognl.getValue(expr, ctx, d);
System.out.println("Value: " + value);
If the intention is only to read properties from an object, then PropertyUtils.getProperty (from commons-beanutils) may suffice. However, if the intention is to evaluate conditionals and such, then OGNL may be of more benefit.
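For comparison, a minimal commons-beanutils sketch for the same simple read (reusing the Dimension bean from above; exception handling omitted):
// Reads the "width" property via its getter; no expression language involved
Dimension d = new Dimension(2, 2);
Object width = PropertyUtils.getProperty(d, "width");
System.out.println("Width: " + width);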
Here is the same Dimension example with a boolean:
Dimension d = new Dimension();
d.setSize(100, 200); // width and height
Map<String,Object> map = new HashMap<String,Object>();
map.put("dimension", d);
String expression = "dimension.width == 100 && dimension.height == 200";
Object exp = Ognl.parseExpression(expression);
Boolean b = (Boolean) Ognl.getValue(exp,map);
// b would evaluate to true in this case
OGNL allows you to access an object's fields and methods via string expressions, which becomes very useful when you have a loosely coupled architecture between the data and its consumers. It uses reflection under the hood, but it definitely speeds up development compared to a pure reflection approach.
Some one-line examples:
System.out.println(Ognl.getValue("x", new Point(5,5)));
System.out.println(Ognl.getValue("size", new ArrayList<Object>()));
The documentation already has a number of basic and more advanced OGNL expressions.
Here is an example hello world in Jython (Python that compiles to Java):
from ognl import Ognl, OgnlContext
from java.lang import String
exp = Ognl.parseExpression("substring(2, 5)")
print Ognl.getValue(exp, OgnlContext(), String("abcdefghj"))