Spatial querydsl - GeometryExpressions: find entities whose range contains a given "Point"

Used tools and their versions:
Spring Boot 2.2.6
Hibernate / hibernate-spatial 5.3.10, with the dialect set to org.hibernate.spatial.dialect.mysql.MySQL56SpatialDialect
querydsl-spatial 4.2.1
com.vividsolutions.jts 1.13
jscience 4.3.1
Problem description:
I have an entity that represents a medical clinic:
import com.vividsolutions.jts.geom.Polygon;
import javax.persistence.Column;
import javax.persistence.Entity;

@Entity
public class Clinic {
    @Column(name = "range", columnDefinition = "Polygon")
    private Polygon range;
}
The range is a circle calculated earlier from the clinic's GPS location and a radius. It represents the operating area for that clinic: the clinic treats only patients whose home address lies within that circle. Let's assume that the circle above is correct.
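For reference, one common way to build such a circular polygon with JTS 1.13 is GeometricShapeFactory; this is only an illustrative sketch (the centre, diameter and point count are made-up values, not my actual code):
import com.vividsolutions.jts.geom.Coordinate;
import com.vividsolutions.jts.geom.Polygon;
import com.vividsolutions.jts.util.GeometricShapeFactory;

// Approximate the clinic's operating area as a circle around its GPS location.
GeometricShapeFactory shapeFactory = new GeometricShapeFactory();
shapeFactory.setCentre(new Coordinate(45.75, 4.84)); // clinic location (made up)
shapeFactory.setSize(2 * 0.05);                      // diameter in coordinate units (made up)
shapeFactory.setNumPoints(64);                       // smoothness of the circle approximation
Polygon range = shapeFactory.createCircle();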
My goal (question):
I have a GPS point with a patient's location: 45.7602322 4.8444941. I would like to find all clinics that are able to treat that patient, i.e. all clinics whose range field contains the point 45.7602322 4.8444941.
My solution (partially correct, I think):
To get it done, I have created a simple "Predicate"/"BooleanExpression":
GeometryExpressions.asGeometry(QClinic.clinic.range)
    .contains(Wkt.fromWkt("Point(45.7602322 4.8444941)"))
and it actually works, because I can see the proper SQL query in the console:
select (...) where
ST_Contains(clinic0_.range, ?)=1 limit ?
first binding param: POINT(45.7602322 4.8444941)
But I have two problems with that:
First, QClinic.clinic.range is flagged with a warning in IntelliJ: "Unchecked assignment: 'com.querydsl.spatial.jts.JTSPolygonPath' to 'com.querydsl.core.types.Expression<org.geolatte.geom.Geometry>'". Indeed, in QClinic the range field is a com.querydsl.spatial.jts.JTSPolygonPath.
Second, using the debugger and IntelliJ's "evaluate" on the line above (the one that creates the expression), I can see an error message: "unknown operation with operator CONTAINS and args [clinic.range, POINT(45.7602322 4.8444941)]"

You can ignore the second warning. The spatial-specific operations are simply not registered in the serializer used for the toString of the operation; they reside in their own module.
The first warning indicates that you're mixing Geolatte and JTS expressions. From your mapping it seems you intend to use JTS. In that case you need to use com.querydsl.spatial.jts.JTSGeometryExpressions instead of com.querydsl.spatial.GeometryExpressions in order to get rid of that warning.
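For illustration, a minimal sketch of the JTS-typed predicate, assuming querydsl-spatial 4.2.1's JTSGeometryExpressions.asJTSGeometry factory and your generated QClinic metamodel; parsing the point with the JTS WKTReader is just one option (read() throws a checked ParseException that the surrounding code has to handle):
import com.querydsl.core.types.dsl.BooleanExpression;
import com.querydsl.spatial.jts.JTSGeometryExpressions;
import com.vividsolutions.jts.geom.Geometry;
import com.vividsolutions.jts.io.WKTReader;

// Build the predicate with the JTS variant so it matches the JTSPolygonPath in QClinic.
Geometry patientLocation = new WKTReader().read("POINT(45.7602322 4.8444941)");
BooleanExpression clinicCoversPatient =
        JTSGeometryExpressions.asJTSGeometry(QClinic.clinic.range).contains(patientLocation);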

Can I visualize a Multibody pose without explicitly calculating every body's full transform?

In the examples/quadrotor/ example, a custom QuadrotorPlant is specified and its output is passed into QuadrotorGeometry where the QuadrotorPlant state is packaged into FramePoseVector for the SceneGraph to visualize.
The relevant code segment in QuadrotorGeometry that does this:
...
builder->Connect(
    quadrotor_geometry->get_output_port(0),
    scene_graph->get_source_pose_port(quadrotor_geometry->source_id_));
...
void QuadrotorGeometry::OutputGeometryPose(
    const systems::Context<double>& context,
    geometry::FramePoseVector<double>* poses) const {
  DRAKE_DEMAND(frame_id_.is_valid());
  const auto& state = get_input_port(0).Eval(context);
  math::RigidTransformd pose(
      math::RollPitchYawd(state.segment<3>(3)),
      state.head<3>());
  *poses = {{frame_id_, pose.GetAsIsometry3()}};
}
In my case, I have a floating-base multibody system (think a quadrotor with a pendulum attached) for which I've created a custom plant (a LeafSystem). The minimal coordinates for such a system would be 4 (quaternion) + 3 (x, y, z) + 1 (joint angle) = 8. If I were to follow the QuadrotorGeometry example, I believe I would need to specify the full RigidTransformd of the quadrotor and the full RigidTransformd of the pendulum.
Question
Is it possible to set up the visualization / specify the pose such that I only need to supply those 8 minimal state coordinates (floating-base pose of the quadrotor plus the joint angle) and have an internal MultibodyPlant handle the computation of each individual body's (quadrotor and pendulum) full RigidTransform, which can then be passed to the SceneGraph for visualization?
I believe this was possible with the "attic-ed" (which I take to mean "to be deprecated") RigidBodyTree, as was done in examples/compass_gait:
lcm::DrakeLcm lcm;
auto publisher = builder.AddSystem<systems::DrakeVisualizer>(*tree, &lcm);
publisher->set_name("publisher");
builder.Connect(compass_gait->get_floating_base_state_output_port(),
                publisher->get_input_port(0));
where get_floating_base_state_output_port() outputs the CompassGait state with only 7 values (3 rpy + 3 xyz + 1 hip angle).
What is the MultibodyPlant, SceneGraph equivalent of this?
Update (using MultibodyPositionToGeometryPose from Russ's deleted answer)
I created the following function, which attempts to create a MultibodyPlant from the given model_file and connect the given pose_output_port through MultibodyPositionToGeometryPose.
The pose_output_port I'm using is the 4 (quaternion) + 3 (xyz) + 1 (joint angle) minimal state.
void add_plant_visuals(
    systems::DiagramBuilder<double>* builder,
    geometry::SceneGraph<double>* scene_graph,
    const std::string model_file,
    const systems::OutputPort<double>& pose_output_port)
{
  multibody::MultibodyPlant<double> mbp;
  multibody::Parser parser(&mbp, scene_graph);
  auto model_id = parser.AddModelFromFile(model_file);
  mbp.Finalize();
  auto source_id = *mbp.get_source_id();
  auto multibody_position_to_geometry_pose =
      builder->AddSystem<systems::rendering::MultibodyPositionToGeometryPose<double>>(mbp);
  builder->Connect(pose_output_port,
                   multibody_position_to_geometry_pose->get_input_port());
  builder->Connect(
      multibody_position_to_geometry_pose->get_output_port(),
      scene_graph->get_source_pose_port(source_id));
  geometry::ConnectDrakeVisualizer(builder, *scene_graph);
}
The above fails with the following exception:
abort: Failure at multibody/plant/multibody_plant.cc:2015 in get_geometry_poses_output_port(): condition 'geometry_source_is_registered()' failed.
So, there's a lot going on here. I have a suspicion there's a simple answer, but we may have to converge on it.
First, my assumptions:
You've got an "internal" MultibodyPlant (MBP). Presumably, you also have a Context for it, allowing you to perform meaningful state-dependent calculations.
Furthermore, I presume the MBP was responsible for registering the geometry (that probably happened when you parsed the model).
Your LeafSystem will connect directly to the SceneGraph to provide poses.
Given your state, you routinely set that state in the MBP's Context to do the evaluation.
Option 1 (Edited):
In your custom LeafSystem, declare the FramePoseVector output port and the calc callback for it. Inside that callback, set the state in the locally owned Context of the internal MBP that your LeafSystem owns, then simply invoke Eval() on the pose output port of that MBP, passing in the pointer to the FramePoseVector that your LeafSystem's callback was given.
Essentially (in a very coarse way):
MySystem::MySystem() {
  this->DeclareAbstractOutputPort("geometry_pose",
                                  &MySystem::OutputGeometryPose);
}

void MySystem::OutputGeometryPose(
    const Context& context, FramePoseVector* poses) const {
  // Push the current state into the locally owned context of the internal MBP...
  mbp_context_.get_mutable_continuous_state()
      .SetFromVector(my_state_vector);
  // ...then let the MBP's pose output port fill in the frame poses.
  mbp_.get_geometry_poses_output_port().Eval(mbp_context_, poses);
}
Option 2:
Rather than implementing a LeafSystem that has an internal plant, you could have a Diagram that contains an MBP and exports the MBP's FramePoseVector output port directly as a Diagram port to connect.
This answer specifically addresses your edit, where you are attempting the MultibodyPositionToGeometryPose approach. It doesn't address the larger design issues.
Your problem is that the MultibodyPositionToGeometryPose system takes a reference to an MBP and keeps a reference to that same MBP. That means the MBP must be alive and well for at least as long as the MPTGP is. However, in your code snippet the MBP is local to the add_plant_visuals() function, so it is destroyed as soon as the function returns.
You'll need to create something that is persisted and owned by someone else.
(This is tightly related to my option 2 - now edited for improved clarity.)

Neo4j - Discriminating relationships based on end node subclass type

I notice that when inheritance is used in a spring-data-neo4j 4 data model, the superclass is used as the discriminator when loading and discriminating relationships based on end node type. Is there any way I can force spring-data-neo4j-4 to use a subclass as the discriminator instead?
For example, let's say we have a data model (class diagram) and its Neo4j database representation, shown side by side in an image in the original post.
Currently, if we have an Owner entity with code along the lines of:
@NodeEntity
class Owner extends BaseNodeEntity {
    ...
    @Relationship(type = "OWNS")
    private Set<Dog> dogs; // both dogs and cats are mapped here
    @Relationship(type = "OWNS")
    private Set<Cat> cats; // both dogs and cats are mapped here
    ...
}
the spring-data-neo4j-4 framework will automatically map both Dog and Cat into the set of cats and into the set of dogs, so I then have two 'dogs' (although one of them is actually a cat) and two 'cats' (although one of them is actually a dog) loaded from the database. If I remove the superclasses Pet and BaseNodeEntity from the data model, then Dog and Cat are automatically mapped to their respective sets correctly.
Is there a way that I can force spring-data-neo4j-4 to map my data model correctly, or will I be forced to change my data model? I'm not keen to remove superclasses from my data model, as I have a lot of re-use going on, and adding extra relationships (i.e. splitting OWNS into CAT_OWNER and DOG_OWNER, as sketched below) would be similarly annoying.
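To be concrete, the split-relationship workaround I would like to avoid would look roughly like this (illustrative sketch only):
@NodeEntity
class Owner extends BaseNodeEntity {
    @Relationship(type = "DOG_OWNER")
    private Set<Dog> dogs;
    @Relationship(type = "CAT_OWNER")
    private Set<Cat> cats;
}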
Update
I've now noticed this behaviour is not consistent. I've written a unit test that tests the mappings. Strangely, sometimes it passes and sometimes it fails (sometimes it erroneously maps the cat as a dog and the dog as a cat, and sometimes it doesn't). Manually running the unit test 10 times gave me the following pass/fail results.
Run, Result (pass / fail)
1 P
2 F
3 F
4 P
5 F
6 P
7 F
8 F
9 P
10 F
Surely mapping behaviour should be consistent. Could this be a bug in SDN-4?
Update 08 May 2016
Sorry about the delay in updating this. I've only just had the opportunity to start working on this project again. I have retested this and am getting the same results with the latest stable neo4j release.
<neo4j.version>2.3.2</neo4j.version>
<sdn.version>4.1.1.RELEASE</sdn.version>
<java.version>1.8</java.version>
<neo4j-ogm.version>2.0.1</neo4j-ogm.version>
<spring-data-commons.version>1.12.1.RELEASE</spring-data-commons.version>
I've attached below the actual domain model used in our test case. The test case details can be seen at https://github.com/johndeverall/thescene-spa/issues/142. The test case code is:
@Test
public void saveAndLoadTags() {
    log.info("Given we have a member with some events");
    Member member = createMember();
    Event event1 = eventService.createEvent("Event1", member);
    Event event2 = eventService.createEvent("Event2", member);

    log.info("When I tag my events with some tags");
    contentService.tag("Tag1", "", event1, member);
    contentService.tag("Tag1", "", event1, member); // should create no new tags or relationships
    contentService.tag("Tag2", "", event1, member);
    contentService.tag("Tag2", "", event2, member);

    log.info("Then my events should be appropriately tagged");
    event1 = eventService.loadEventBySceneId(event1.getSceneId());
    assertThat(event1.getTags().size(), is(equalTo(2)));
    assertThat(event2.getTags().size(), is(equalTo(1)));

    log.info("And I should have created two tags total");
    Iterator<Tag> iterator = contentService.getAllTags().iterator();
    List<Tag> tags = StreamSupport.stream(
            Spliterators.spliteratorUnknownSize(iterator, Spliterator.ORDERED), false)
            .collect(Collectors.<Tag>toList());
    assertThat(tags.size(), is(equalTo(2)));

    log.info("And our member should have created two tags");
    member = memberService.loadMemberByEmailAddressPasswordAccount(emailAddress);
    // **************************************************************
    // member.getCreatedTags().size() more often than not returns 3, causing the test failure!
    // **************************************************************
    assertThat(member.getCreatedTags().size(), is(equalTo(2)));
}
The problem shown by the test case
The following method:
public Member loadMemberByEmailAddressPasswordAccount(String emailAddress) {
    Filter filter = new Filter("email", emailAddress);
    Collection<EmailAddressPasswordAccount> emailAddressPasswordAccounts =
            session.loadAll(EmailAddressPasswordAccount.class, filter, 2);
    return emailAddressPasswordAccounts.isEmpty()
            ? null
            : emailAddressPasswordAccounts.iterator().next().getMember();
}
returns a Member object whose createdTags set contains two Tags plus one 'tag' that is actually an Event.
My actual domain model (not simplified to Dogs and Cats) is shown in a class diagram in the original post.
Additional info:
I have tried annotating all model objects apart from BaseNodeEntity with the @NodeEntity annotation. Properties are annotated.
Update 09 May 2016
I get the same intermittent results if I swap out the OGM code in loadMemberByEmailAddressPasswordAccount for a derived finder on a repository.
The case above, which involves collections of relationships with the same type but different end node types, was a bug: http://github.com/neo4j/neo4j-ogm/issues/161
This has been fixed, and the fix is available in Neo4j OGM 2.0.2.
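If you manage versions through Maven properties like the ones listed above, picking up the fix should just be a matter of bumping the OGM version (assuming the rest of your SDN 4.1.x stack is compatible with it):
<neo4j-ogm.version>2.0.2</neo4j-ogm.version>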

Properties of an entity sent from iOS are set to null when Objectify is used to store the entity in the datastore

I send an entity from an iOS client and it is processed by the following backend API method:
@ApiMethod(name = "dataInserter.insertData", path = "insertData", httpMethod = "post")
public customEntity insertData(customEntity userInput) {
    ofy().save().entity(userInput).now();
    return userInput;
}
customEntity is defined within customEntity.java as follows:
// import statements here
@Entity
public class customEntity {
    @Id public String someID;
    @Index String providedData;
}
After the above code runs, datastore contains the following entry:
ID/Name providedData
id=5034... <null>
If I add the following lines to my method:
customEntity badSoup = new customEntity();
badSoup.providedData = "I am exhausted";
ofy().save().entity(badSoup).now();
I see the following in the datastore after I run the code:
ID/Name providedData
id=5034... I am exhausted
In a post almost identical to this one, the poster, Drux, concludes: "...assignments to @Indexed properties only have actual effects on indices (and hence queries) if they are carried out directly with Objectify on the server (not indirectly on iOS clients and then passed to the server with Google Cloud Endpoints)." stickfigure then responds: "It sounds like what you're saying is 'cloud endpoints is not reconstituting your SomeEntity object correctly'. Objectify is not involved; it just saves whatever you give it."
It's hard to tell whether stickfigure is correct, especially given that the same problem described above still occurs when I exercise my API through Google's APIs Explorer.
Is anyone able to explain what's causing this or is Drux's conclusion correct?
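One thing I am wondering about, though this is only a guess on my part: Endpoints reconstitutes the request body into the bean through public fields or getter/setter pairs, and providedData above is package-private with no setter, so it may simply never be populated before Objectify saves the entity. Would exposing accessors along these lines make a difference?
@Entity
public class customEntity {
    @Id public String someID;
    @Index private String providedData;

    // Hypothetical accessors so the JSON mapper can populate the field.
    public String getProvidedData() { return providedData; }
    public void setProvidedData(String providedData) { this.providedData = providedData; }
}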

ELKI DBSCAN R* tree index

In the MiniGUI, I can see db.index. How do I set it to tree.spatial.rstarvariants.rstar.RStarTreeFactory via Java code?
I have implemented:
params.addParameter(AbstractDatabase.Parameterizer.INDEX_ID, tree.spatial.rstarvariants.rstar.RStarTreeFactory);
but for the second parameter of the addParameter() function, the class tree.spatial...RStarTreeFactory is not found. So I then tried:
// Setup parameters:
ListParameterization params = new ListParameterization();
params.addParameter(
    FileBasedDatabaseConnection.Parameterizer.INPUT_ID,
    fileLocation);
params.addParameter(AbstractDatabase.Parameterizer.INDEX_ID,
    RStarTreeFactory.class);
I am getting a NullPointerException. Did I use RStarTreeFactory.class correctly?
The ELKI command line (and the MiniGUI, which is a command line builder) allows you to specify shorthand class names, leaving out the package prefix of the implemented interface.
The full command line documentation yields:
-db.index <object_1|class_1,...,object_n|class_n>
Database indexes to add.
Implementing de.lmu.ifi.dbs.elki.index.IndexFactory
Known classes (default package de.lmu.ifi.dbs.elki.index.):
-> tree.spatial.rstarvariants.rstar.RStarTreeFactory
-> ...
I.e. for this parameter, the class prefix de.lmu.ifi.dbs.elki.index. may be omitted.
The full class name thus is:
de.lmu.ifi.dbs.elki.index.tree.spatial.rstarvariants.rstar.RStarTreeFactory
or you can just type RStarTreeFactory and let Eclipse auto-repair the import:
params.addParameter(AbstractDatabase.Parameterizer.INDEX_ID,
    RStarTreeFactory.class);
// Bulk loading static data yields much better trees and is much faster, too.
params.addParameter(RStarTreeFactory.Parameterizer.BULK_SPLIT_ID,
    SortTileRecursiveBulkSplit.class);
// Page size should fit your dimensionality.
// For 2-dimensional data, use page sizes less than 1000.
// Rule of thumb: 15...20 * (dim * 8 + 4) is usually reasonable
// (for in-memory bulk-loaded trees).
params.addParameter(AbstractPageFileFactory.Parameterizer.PAGE_SIZE_ID, 300);
See also: Geo Indexing example in the tutorial folder.
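As a rough follow-up sketch (ELKI 0.7.x style, assuming the default StaticArrayDatabase; this is not part of the original answer), the parameters are then used to instantiate and initialize the database. A database that is never initialize()d is one typical source of NullPointerExceptions later on:
// Instantiate the database with the parameters configured above,
// then load the data (and build the R*-tree index) via initialize().
Database db = ClassGenericsUtil.parameterizeOrAbort(StaticArrayDatabase.class, params);
db.initialize();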

Is there anything like "CheckSum" in Dart (on Objects)?

For testing purposes I'm trying to design a way to verify that the results of statistical tests are identical across versions, platforms, and so on. There is a lot going on, involving ints, nums, dates, Strings, and more inside our collections of objects.
In the end I want to 'know' that the whole set of instantiated objects sums to the same value (by doing something like adding up the checksums of all internal properties).
I can write low-level code for each internal value to return a checksum, but I was thinking that perhaps something like this already exists.
Thanks!
_swarmii
This sounds like you should be using the serialization library (install via Pub).
Here's a simple example to get you started:
import 'package:serialization/serialization.dart';

class Address {
  String street;
  int number;
}

main() {
  var address = new Address()
    ..number = 5
    ..street = 'Luumut';
  var serialization = new Serialization()
    ..addRuleFor(address);
  Map output = serialization.write(address, new SimpleJsonFormat());
  print(output);
}
Then, depending on what exactly you want to do, I'm sure you can fine-tune the code for your purpose.
