Modify Jena model returned by D2RQ - jena

D2RQ creates an RDF representation of the DB using a Jena model.
Model m = new ModelD2RQ("file:outfile2.ttl");
I know that the returned model is a "read-only" model.
Hence, if I try to add a resource to the model I get a "jena.shared.AddDeniedException" exception.
Resource r1=m.createResource("http://www.stackoverflow.com#34");
r1.addProperty(RDF.type, ...); // <- throws the exception
How can I decouple the model m from the database so that I can modify it? I don't want to write the model back; I just use D2RQ to get an RDF-based DB dump, which I want to process further. (I know that extensions like D2RQ Update enable modification of the database by modifying the RDF graph, but I don't want to modify the DB.)
Thanks

Take a copy to disconnect the model from the database:
Model m = new ModelD2RQ("file:outfile2.ttl");
Model mCopy = ModelFactory.createDefaultModel() ;
mCopy.add(m) ;

// mCopy is a plain in-memory model, so additions are allowed:
Resource r1 = mCopy.createResource("http://www.stackoverflow.com#34");
r1.addProperty(RDF.type, ...);
Another way is to have a union model, where the in-memory part is the first, and update-able, part of the union.
Model m = new ModelD2RQ("file:outfile2.ttl");
Model extra = ModelFactory.createDefaultModel() ;
Model m2 = ModelFactory.createUnion(extra, m) ;
...
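Either way, additions only ever touch the in-memory model; the D2RQ-backed model is never written to. A minimal sketch continuing the union snippet above (the URI is the one from the question, and the added label is just illustrative):
// Additions go to the in-memory part; m (the D2RQ model) stays untouched.
Resource r1 = extra.createResource("http://www.stackoverflow.com#34");
r1.addProperty(RDFS.label, "added locally");

// Queries against the union m2 see both the extra statements and the DB-derived ones.
StmtIterator it = m2.listStatements(r1, null, (RDFNode) null);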

Related

Raw SQL Mapping Stored Procedure Results to POCO/DTO Using Entity Framework Core 2

I've pretty much looked everywhere I can, and I'm having a hard time finding the solution. It took me a week to create an immensely complex calculation query using a stored procedure, and I'd like to fetch the results from this query and place them into a POCO class, similar to what I've done before using EF 6.
Map Stored Procedure Column names to POCO DTO
Basically this:
var p1 = new SqlParameter { ParameterName = "Param1", Value = 1 };
var p2 = new SqlParameter { ParameterName = "Param2", Value = 2 };
string query = "EXEC Calcs_Selections @Param1, @Param2";
var calcs = context.Database.SqlQuery<CalcViewPOCO>(query, p1, p2).ToList();
I've read literature found here:
EF Core Querying Raw SQL
Raw SQL Query without DbSet - Entity Framework Core
And discovered that there is no "free SQL" in EF Core anymore. FromSql basically projects the results into a real entity found in the database, which I don't want, because I don't have a table with the same columns. Instead, one solution is to extend the DbContext and create a new DbSet.
But I'm really not sure how to do this. I have a database first model, and used Scaffold-DbContext:
Getting Started with EF Core on ASP.NET Core with an Existing Database
I don't want to add to the context that was created automatically, in my case ProjectAContext, since if I make any more changes to the database, and run scaffolding again, it will overwrite my changes.
Since I couldn't wait any longer for newer versions of EF Core to add the functionality I asked about, I found a library that helped me solve this problem.
https://github.com/snickler/EFCore-FluentStoredProcedure
It allowed me to take a result set and map it to a list of DTOs.
Sample from the repo:
var dbContext = GetDbContext();
dbContext.LoadStoredProc("dbo.SomeSproc")
    .WithSqlParam("fooId", 1)
    .ExecuteStoredProc((handler) =>
    {
        var fooResults = handler.ReadToList<FooDto>();
        // do something with your results.
    });
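Adapted to the stored procedure from the question, the same pattern would presumably look something like this (Calcs_Selections, the parameter names and CalcViewPOCO come from the question; the dbo schema is an assumption):
var dbContext = GetDbContext();
dbContext.LoadStoredProc("dbo.Calcs_Selections")   // schema name assumed
    .WithSqlParam("Param1", 1)
    .WithSqlParam("Param2", 2)
    .ExecuteStoredProc((handler) =>
    {
        // maps each result row onto the CalcViewPOCO DTO by column name
        var calcs = handler.ReadToList<CalcViewPOCO>();
        // use calcs here
    });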

Add custom data to breeze metadata on server

In an app I'm developing, I create some UI widgets that are generated based on Breeze metadata and attributes on my classes and properties. So on the server I read the attributes and add them to Breeze's generated metadata. On the client, after I load the metadata, I add those attributes to the type.
An example of this
(on the client, after the enriched metadata loads):
function finishMetadata(md) {
    for (var i = 0; i < md.schema.entityType.length; i++) {
        var et = md.schema.entityType[i];
        var etype = manager.metadataStore.getEntityType(et.name);
        // isExportable is a class attribute
        etype.isExportable = et.isExportable;
        for (var j = 0; j < et.property.length; j++) {
            var p = et.property[j];
            var prop = etype.getProperty(p.name);
            // displayName is a property attribute
            prop.displayName = p.displayName ? p.displayName : p.name;
        }
    }
}
With this, when I call manager.metadataStore.getEntityType(entityName), I get all the entityType data, including the properties and all the attributes I added on the server.
This worked fine until today. I have added inheritance (TPT) on some classes (Customer : Person), and since the generated metadata for the Customer entity does not include the Person properties, I cannot add them to the metadataStore type. When I call metadataStore.getEntityType for Person I get all the attributes, but when I call it for Customer I do not get my custom attributes (this is because, in the code above, Customer does not list the parent Person properties, so I never get the chance to plug in my custom attributes).
Anyway, this feels hacky and messy, even to explain. So here we are: Breeze is at version 1.4.7, and I wonder if there is an easier way of adding custom data to the metadata that would not break with TPT?
PS.: I know that I can hand-craft the metadata but I would like to stick with the default as much as possible to avoid problems with future changes. So, basically all metadata changes should be minimal and automatic, based on the classes.
Well, I ended up going further down the path I was already on. Instead of just using manager.metadataStore.getEntityType to get the type data (which does not bring in my custom metadata), I check whether my entityType has a baseType and, if it does, I load that base type's data and merge it with the child class. In my case, I basically get the dataProperties of Person and use them instead of the dataProperties of Customer. It works, but I'm still looking for a cleaner and simpler way of doing something like this.

Jena throwing ConversionException when trying to cast to OntClass

I have a particular Class URI for which I am trying to get an OntClass. The model is a regular model.
I wrote some code to find out whether the right statements were in the model, and it seems that they are so I can't understand why it won't let me view this as an OntClass. (tblURI is a String passed as a method parameter)
Resource tblR = m.createResource(tblURI);
List<Statement> prp = tblR.listProperties().toList();
for (Statement s : prp)
    System.out.println(s);
System.out.println(tblR.canAs(OntClass.class));
OntClass tbl = tblR.as(OntClass.class);
This is the output:
[kps:datasource/EnsembleMS#translation_stable_id, http://www.w3.org/1999/02/22-rdf-syntax-ns#type, http://www.w3.org/2002/07/owl#Class]
[kps:datasource/EnsembleMS#translation_stable_id, http://www.w3.org/1999/02/22-rdf-syntax-ns#type, http://www.w3.org/2000/01/rdf-schema#Class]
[kps:datasource/EnsembleMS#translation_stable_id, http://www.w3.org/2000/01/rdf-schema#isDefinedBy, kps:datasource/EnsembleMS]
[kps:datasource/EnsembleMS#translation_stable_id, http://www.w3.org/2000/01/rdf-schema#label, "translation_stable_id"]
false
com.hp.hpl.jena.ontology.ConversionException: Cannot convert node kps:datasource/EnsembleMS#translation_stable_id to OntClass: it does not have rdf:type owl:Class or equivalent
at com.hp.hpl.jena.ontology.impl.OntClassImpl$1.wrap(OntClassImpl.java:81)
at com.hp.hpl.jena.enhanced.EnhNode.convertTo(EnhNode.java:155)
at com.hp.hpl.jena.enhanced.EnhNode.convertTo(EnhNode.java:34)
at com.hp.hpl.jena.enhanced.Polymorphic.asInternal(Polymorphic.java:66)
at com.hp.hpl.jena.enhanced.EnhNode.as(EnhNode.java:110)
at com.KPS.myApp.exampleMethod(myApp.java:123)
Why is it throwing an exception and how can I get an OntClass for the resource with uri tblURI?
Thanks for any pointers
You don't say what kind of model m is. In particular, if m was created with the RDFS language profile, the OntModel will be looking for an rdf:type of rdfs:Class, not owl:Class. If that's not the issue, then a complete minimal (i.e. runnable) example would help.
By the way, there's another problem I can see: resource URIs in the model should be in absolute form, not abbreviated form. The fact that you've got q-name URIs in your model, like kps:datasource/EnsembleMS#translation_stable_id, suggests that something is going wrong with your prefix handling. That won't by itself cause the problem you've reported, but it's a red flag to investigate.
Update
Responding to questions:
yes, you need to be using an OntModel, otherwise it's not possible for the OntClass to know which language profile to use. Either create the model as an OntModel in the first place:
OntModel m = ModelFactory.createOntologyModel( OntModelSpec.OWL_MEM );
or wrap your plain model as an OntModel:
OntModel om = ModelFactory.createOntologyModel( OntModelSpec.OWL_MEM, m );
Of course, you may use any of the model specifications as you please; OWL_MEM is just one option.
createResource will not expand prefixes for you. So, you should expand them yourself before creating the resource:
m.createResource( m.expandPrefix( "foo:bar" ) );
Of course, this requires the prefix "foo" to be registered as a prefix. This happens automatically if you read an RDF document that defines the prefix in its syntax, but otherwise can be done manually with setNsPrefix.
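Putting both points together, a minimal runnable sketch (the kps namespace URI here is made up for illustration; substitute the real one from your data):
import com.hp.hpl.jena.ontology.OntClass;
import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.rdf.model.Resource;
import com.hp.hpl.jena.vocabulary.OWL;
import com.hp.hpl.jena.vocabulary.RDF;

public class OntClassExample {
    public static void main(String[] args) {
        OntModel m = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);

        // register the prefix, then expand it before creating the resource
        m.setNsPrefix("kps", "http://example.org/kps/");   // assumed namespace URI
        String uri = m.expandPrefix("kps:datasource/EnsembleMS#translation_stable_id");

        Resource tblR = m.createResource(uri);
        tblR.addProperty(RDF.type, OWL.Class);

        // with the OWL profile and an absolute URI, the conversion now succeeds
        OntClass tbl = tblR.as(OntClass.class);
        System.out.println(tbl.getURI());
    }
}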

How to move from untyped DataSets to POCO\LINQ2SQL in legacy application

Good day!
I've a legacy application where data access layer consists of classes where queries are done using SqlConnection/SqlCommand and results are passed to upper layers wrapped in untyped DataSets/DataTable.
Now I'm working on integrating this application into a newer one, written in ASP.NET MVC 2, where LINQ2SQL is used for data access. I don't want to rewrite in LINQ2SQL the fancy logic that generates the complex queries passed to SqlConnection/SqlCommand (and I don't have permission to do so), but I'd like to have the results of these queries as strongly typed object collections instead of untyped DataSets/DataTables.
The basic idea is to wrap the old data access code in something that looks like a proper ASP.NET MVC "Model".
What is the fast\easy way of doing this?
In addition to the answer below, here is a nice solution based on AutoMapper: http://elegantcode.com/2009/10/16/mapping-from-idatareaderidatarecord-with-automapper/
An approach you could take is to use a DataReader and transfer objects. For every object you want to work with, define the class in a data-transfer-object folder (or however your project is structured), then in your data access layer have something along the lines of the code below.
We used something very similar to this in a project with a highly normalized database; in the code we did not need that normalization, so we used procedures to put the data into more usable objects. If you need to be able to save these objects as well, you will need to handle translating the objects into database commands.
What is the fast\easy way of doing this?
Depending on the number of classes etc. this may not be the fastest approach, but it will allow you to use the objects very similarly to the LINQ objects, and depending on the type of collections used (IList, IEnumerable etc.) you will be able to use the extension methods on those types of collections.
public IList<NewClass> LoadNewClasses(string abc)
{
    List<NewClass> newClasses = new List<NewClass>();
    using (DbCommand command = /* Get the command */)
    {
        // Add parameters
        command.Parameters["@Abc"].Value = abc;

        // Could also put the DataReader in a using block
        IDataReader reader = /* Get Data Reader */;
        while (reader.Read())
        {
            NewClass newClass = new NewClass();
            newClass.Id = (byte)reader["Id"];
            newClass.Name = (string)reader["Name"];
            newClasses.Add(newClass);
        }
        reader.Close();
    }
    return newClasses;
}
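For completeness, the NewClass transfer object assumed by the snippet above is just a plain class; the property types here are inferred from the casts in the reader loop:
// Plain DTO used by LoadNewClasses above; extend with whatever columns your query returns.
public class NewClass
{
    public byte Id { get; set; }
    public string Name { get; set; }
}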

SubSonic and Stored Procedures

When using SubSonic, do you return the data as a dataset or do you put that in a strongly typed custom collection or a generic object?
I ran through the subsonic project and for the four stored procs I have in my DB, it gave me a Sps.cs with 4 methods which return a StoredProcedure object.
If you used a MVC, do you usually use the StoredProcedure object or wrap that around your business logic and return a dataset, list, collection or something else?
Are DataSets still the norm, or have they been replaced by something else?
If the results of the stored procedure has the same schema as one of your tables, you can build a collection using this code (SubSonic 2.1):
ProductCollection coll = new ProductCollection();
coll.LoadAndCloseReader(SPs.GetProducts(1).GetReader());
ExecuteTypedList<> is your best friend in this case:
IList<Product> list=SPs.GetProducts().ExecuteTypedList<Product>();
If my stored procedure returns all the fields from one of the tables for which I have a SubSonic object then I do a LoadAndCloseReader on the result of the stored procedure. If my stored procedure returns data that does not match a SubSonic object then I just work with it as a dataset.
Perhaps return a DataReader and then iterate it to populate some custom objects. Alternatively, the quick and dirty way (since you're not using domain-driven design): create a view in the DB with the same structure as the stored proc, then load the result into your ViewObjectCollection, similar to John's code.
You can do data readers, but that's so 1999. Returning objects is a breeze with SubSonic, and easier to use than a data reader. You can retrieve objects like so:
Dim Charts As Generic.List(Of MusicDB.Billboard) = _
    New SubSonic.Select(MusicDB.DB.Repository.Provider, New String() _
        {"Prefix", "Artist", "Track", "ArtistNarrowToken", "TrackNarrowToken", "ArtistId", "TrackId", "TrackYear"}). _
    From(MetadataTagger.MusicDB.Tables.Billboard). _
    Where(MusicDB.Billboard.Columns.ArtistNarrowToken).IsLessThan(10). _
    Or(MusicDB.Billboard.Columns.TrackId).IsNull(). _
    OrderAsc(New String() {"TrackYear"}).ExecuteTypedList(Of MetadataTagger.MusicDB.Billboard)()
