Comparison method violates its general contract, with long comparison

Collections.sort(cells, new Comparator<MyCell>() {
    @Override
    public int compare(MyCell o1, MyCell o2) {
        if (o1.getX() <= o2.getX() && o1.getY() <= o2.getY()) {
            return -1;
        } else {
            return 1;
        }
    }
});
Here is the full stack trace:
Exception in thread "main" java.lang.IllegalArgumentException: Comparison method violates its general contract!
at java.util.TimSort.mergeHi(Unknown Source)
at java.util.TimSort.mergeAt(Unknown Source)
at java.util.TimSort.mergeCollapse(Unknown Source)
at java.util.TimSort.sort(Unknown Source)
at java.util.TimSort.sort(Unknown Source)
at java.util.Arrays.sort(Unknown Source)
at java.util.Collections.sort(Unknown Source)
I know there are many questions like this one, but I don't get why my comparison is wrong. getX() and getY() both return a long. So how can I fix this?
I already searched for it but didn't find an answer.
Thanks in advance.

Here is a good answer on the topic. A comparator must obey the contract: compare(a, b) and compare(b, a) must have opposite signs, the ordering must be transitive, and equal elements must compare to 0. Your comparator never returns 0, and when o1.getX() < o2.getX() but o1.getY() > o2.getY(), both compare(o1, o2) and compare(o2, o1) return 1, which is why TimSort rejects it. Without a lawful comparator you would not get the same results from different orderings of the initial items.
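As a sketch of a fix (assuming the intended order is by x first, then by y), delegate to Long.compare so the result is antisymmetric, transitive, and 0 for equal cells:
Collections.sort(cells, new Comparator<MyCell>() {
    @Override
    public int compare(MyCell o1, MyCell o2) {
        // Long.compare also avoids the overflow risk of subtracting two longs.
        int byX = Long.compare(o1.getX(), o2.getX());
        return byX != 0 ? byX : Long.compare(o1.getY(), o2.getY());
    }
});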

Related

KStream-GlobalKTable-Join using Spring-Cloud-Stream - How to check the content of the GlobalKTable?

I'm implementing a KStream-GlobalKTable join using Spring Cloud Stream, and I'm facing the problem that the join operation doesn't get any matches, although it definitely should. The code looks as follows:
@Component
@EnableBinding(CustomProcessor.class)
public class MyProcessor {
    private static final Log LOGGER = LogFactory.getLog(MyProcessor.class);

    @Autowired
    private InteractiveQueryService interactiveQueryService;

    ReadOnlyKeyValueStore<Object, Object> keyValueStore;

    @StreamListener
    @SendTo(CustomProcessor.OUTPUT)
    public KStream<EventKey, EventEnriched> process(
            @Input(CustomProcessor.INPUT) KStream<EventKey, EventEnriched> inputStream,
            @Input(CustomProcessor.LOOKUP) GlobalKTable<LookupKey, LookupData> lookupStore
    ) {
        keyValueStore = interactiveQueryService.getQueryableStore("lookupStore", QueryableStoreTypes.keyValueStore());
        LOGGER.info("Lookup: " + keyValueStore.get(new LookupKey("google.de")));
        return inputStream.leftJoin(
                lookupStore,
                (inputKey, inputValue) -> {
                    return new LookupKey(inputValue.getDomain().replace("www.", ""));
                },
                this::enrichData
        );
    }

    public EventEnriched enrichData(EventEnriched input, LookupData lookupRecord) {
        ...
    }
}
Here is the CustomProcessor:
public interface CustomProcessor extends KafkaStreamsProcessor {
    String INPUT = "input";
    String OUTPUT = "output";
    String LOOKUP = "lookupTable";

    @Input(CustomProcessor.LOOKUP)
    GlobalKTable<LookupKey, ?> lookupTable();
}
Without calling the line in MyProcessor
keyValueStore.get(...)
the code runs fine, but the GlobalKTable seems to be null. But if I call
LOGGER.info("Lookup: " + keyValueStore.get(new LookupKey("google.de")));
in order to inspect the GlobalKTable, running the application fails with:
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2019-06-26T09:04:00.000 [ERROR] [main-858] [org.springframework.boot.SpringApplication] [reportFailure:858] Application run failed
org.springframework.beans.factory.BeanInitializationException: Cannot setup StreamListener for public org.apache.kafka.streams.kstream.KStream MyProcessor.process(org.apache.kafka.streams.kstream.KStream,org.apache.kafka.streams.kstream.GlobalKTable); nested exception is java.lang.reflect.InvocationTargetException
at org.springframework.cloud.stream.binder.kafka.streams.KafkaStreamsStreamListenerSetupMethodOrchestrator.orchestrateStreamListenerSetupMethod(KafkaStreamsStreamListenerSetupMethodOrchestrator.java:214)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.doPostProcess(StreamListenerAnnotationBeanPostProcessor.java:226)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.lambda$postProcessAfterInitialization$0(StreamListenerAnnotationBeanPostProcessor.java:196)
at java.base/java.lang.Iterable.forEach(Iterable.java:75)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.injectAndPostProcessDependencies(StreamListenerAnnotationBeanPostProcessor.java:330)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.afterSingletonsInstantiated(StreamListenerAnnotationBeanPostProcessor.java:113)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:866)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:142)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:775)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:316)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1260)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1248)
at Transformer.main(Transformer.java:31)
Caused by: java.lang.reflect.InvocationTargetException: null
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.cloud.stream.binder.kafka.streams.KafkaStreamsStreamListenerSetupMethodOrchestrator.orchestrateStreamListenerSetupMethod(KafkaStreamsStreamListenerSetupMethodOrchestrator.java:179)
... 15 common frames omitted
Caused by: java.lang.NullPointerException: null
at MyProcessor.process(MyProcessor.java:62)
... 20 common frames omitted
Process finished with exit code 1
Does anybody see a problem in the code? How can I inspect the content of the GlobalKTable?
Best regards
Martin
Now I'm getting closer to the problem. I have tried to query the lookupStore. If I use
final ReadOnlyKeyValueStore<LookupKey, LookupData> lookupStore =
        interactiveQueryService.getQueryableStore("myStore", QueryableStoreTypes.<LookupKey, LookupData>keyValueStore());
then
lookupStore.get(key)
never returns a value. But if I build a HashMap like this:
final KeyValueIterator<LookupKey, LookupData> lookups = lookupStore.all();
Map<LookupKey, LookupData> lookupMap = new HashMap<>();
while (lookups.hasNext()) {
    KeyValue<LookupKey, LookupData> nextLookup = lookups.next();
    lookupMap.put(nextLookup.key, nextLookup.value);
}
lookups.close();
the hashMap contains the correct data and returns the correct value for each key. But the GlobalKTable itself cannot be joined for some reason; it never gets any matches.
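A hypothetical debugging step (not in the original code): log the join key derived for each record right before the join, so it can be compared with the keys returned by the store iteration above. KStream.peek() forwards records unchanged, and this assumes LookupKey has a readable toString():
return inputStream
        .peek((key, value) ->
                LOGGER.info("Derived join key: " + new LookupKey(value.getDomain().replace("www.", ""))))
        .leftJoin(
                lookupStore,
                (inputKey, inputValue) -> new LookupKey(inputValue.getDomain().replace("www.", "")),
                this::enrichData
        );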

Apply Side input to BigQueryIO.read operation in Apache Beam

Is there a way to apply a side input to a BigQueryIO.read() operation in Apache Beam?
Say, for example, I have a value in a PCollection that I want to use in a query to fetch data from a BigQuery table. Is this possible using a side input? Or should something else be used in such a case?
I used NestedValueProvider in a similar case, but I guess we can use that only when a certain value depends on a runtime value. Or can I use the same thing here? Please correct me if I'm wrong.
The code that I tried:
Bigquery bigQueryClient = start_pipeline.newBigQueryClient(options.as(BigQueryOptions.class)).build();
Tabledata tableRequest = bigQueryClient.tabledata();

PCollection<TableRow> existingData = readData.apply("Read existing data", ParDo.of(new DoFn<String, TableRow>() {
    @ProcessElement
    public void processElement(ProcessContext c) throws IOException {
        List<TableRow> list = c.sideInput(bqDataView);
        String tableName = list.get(0).get("table").toString();
        TableDataList table = tableRequest.list("projectID", "DatasetID", tableName).execute();
        for (TableRow row : table.getRows()) {
            c.output(row);
        }
    }
}).withSideInputs(bqDataView));
The error that I get is:
Exception in thread "main" java.lang.IllegalArgumentException: unable to serialize BeamTest.StarterPipeline$1@86b455
at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:53)
at org.apache.beam.sdk.util.SerializableUtils.clone(SerializableUtils.java:90)
at org.apache.beam.sdk.transforms.ParDo$SingleOutput.<init>(ParDo.java:569)
at org.apache.beam.sdk.transforms.ParDo.of(ParDo.java:434)
at BeamTest.StarterPipeline.main(StarterPipeline.java:158)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.Bigquery$Tabledata
at java.io.ObjectOutputStream.writeObject0(Unknown Source)
at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
at java.io.ObjectOutputStream.writeObject0(Unknown Source)
at java.io.ObjectOutputStream.writeObject(Unknown Source)
at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:49)
... 4 more
The Beam model does not currently support this kind of data-dependent operation very well.
A way of doing it is to code your own DoFn that receives the side input and connects directly to BQ. Unfortunately, this would not give you any parallelism, as the DoFn would run completely on the same thread.
Once Splittable DoFns are supported in Beam, this will be a different story.
In the current state of the world, you would need to use the BQ client library to add code that would query BQ as if you were not in a Beam pipeline.
Given the code in your question, a rough idea on how to implement this is the following:
class ReadDataDoFn extends DoFn<String, TableRow> {
    // The side input view must be passed in explicitly; a DoFn cannot capture it implicitly.
    private final PCollectionView<List<TableRow>> bqDataView;

    private Tabledata tableRequest;
    private Bigquery bigQueryClient;

    ReadDataDoFn(PCollectionView<List<TableRow>> bqDataView) {
        this.bqDataView = bqDataView;
    }

    private Bigquery createBigQueryClientWithinDoFn() {
        // I'm not sure how you'd implement this, but you had the right idea
        return null; // placeholder so the sketch compiles
    }

    @Setup
    public void setup() {
        // Creating the client here, on the worker, keeps the non-serializable
        // Bigquery/Tabledata objects out of the DoFn's serialized state, which is
        // what caused the NotSerializableException above.
        bigQueryClient = createBigQueryClientWithinDoFn();
        tableRequest = bigQueryClient.tabledata();
    }

    @ProcessElement
    public void processElement(ProcessContext c) throws IOException {
        List<TableRow> list = c.sideInput(bqDataView);
        String tableName = list.get(0).get("table").toString();
        TableDataList table = tableRequest.list("projectID", "DatasetID", tableName).execute();
        for (TableRow row : table.getRows()) {
            c.output(row);
        }
    }
}

PCollection<TableRow> existingData = readData.apply("Read existing data", ParDo.of(new ReadDataDoFn(bqDataView)).withSideInputs(bqDataView));

Cannot set model for Listbox from EventQueue.subscribe method after refreshing page

I cannot set a model from an EventQueue.subscribe method after refreshing the page.
I have two pages: my main .zul file and an included .zul file. There are separate controllers for each zul. I publish an event from the included page's controller when a user clicks on the listbox on the included page, and pass a customer object.
eq = EventQueues.lookup("CLIENTS", EventQueues.DESKTOP, true);
eq.publish(new Event("onClick", null, customer));
In my main .zul page's controller I receive the event and retrieve the customer object. Then, based on its id, I provide the main listbox with the corresponding data.
eq = EventQueues.lookup("CLIENTS", EventQueues.DESKTOP, true);
eq.subscribe(new EventListener() {
    public void onEvent(Event event) throws Exception {
        if (!Executions.getCurrent().getDesktop().isAlive()) {
            eq.unsubscribe(this);
            return;
        }
        Customer customer = (Customer) event.getData();
        if (customer != null) {
            id = customer.getId();        // Needed to identify what data to retrieve from the database
            crm_div.setVisible(false);    // Listbox from the included page
            dataListbox.setVisible(true); // Listbox on the main page
            dataListbox.setModel(new DataListboxModel()); // Go to the database and extract the relevant data
        } else {
            alert("No client");
        }
    }
});
The first time, it works fine: I receive the event, get the object, and successfully provide the listbox with a model. However, when I go to another page and return, I get a NullPointerException. In the log file, I noticed that the session is the same and the page was destroyed, but the desktop is alive. I am using ZK 5.0.10.
at org.zkoss.zk.ui.AbstractComponent.getAttachedUiEngine(AbstractComponent.java:387)
at org.zkoss.zk.ui.AbstractComponent.smartUpdate(AbstractComponent.java:1487)
at org.zkoss.zk.ui.AbstractComponent.smartUpdate(AbstractComponent.java:1462)
at org.zkoss.zk.ui.AbstractComponent.smartUpdate(AbstractComponent.java:1495)
at org.zkoss.zul.Listbox.resetDataLoader(Listbox.java:2982)
at org.zkoss.zul.Listbox.setModel(Listbox.java:2377)
at com.is.sdbooks.controller.ComposerTest.refreshModel(ComposerTest.java:169)
at com.is.sdbooks.controller.ComposerTest.onDoubleClick$dataGrid(ComposerTest.java:180)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.zkoss.zk.ui.event.GenericEventListener.onEvent(GenericEventListener.java:81)
at org.zkoss.zk.ui.impl.EventProcessor.process0(EventProcessor.java:192)
at org.zkoss.zk.ui.impl.EventProcessor.process(EventProcessor.java:138)
at org.zkoss.zk.ui.event.Events.sendEvent(Events.java:306)
at org.zkoss.zk.ui.event.Events.sendEvent(Events.java:329)
at org.zkoss.zk.ui.AbstractComponent$ForwardListener.onEvent(AbstractComponent.java:3052)
at org.zkoss.zk.ui.impl.EventProcessor.process0(EventProcessor.java:192)
at org.zkoss.zk.ui.impl.EventProcessor.process(EventProcessor.java:138)
at org.zkoss.zk.ui.impl.UiEngineImpl.processEvent(UiEngineImpl.java:1626)
at org.zkoss.zk.ui.impl.UiEngineImpl.process(UiEngineImpl.java:1410)
at org.zkoss.zk.ui.impl.UiEngineImpl.execUpdate(UiEngineImpl.java:1134)
at org.zkoss.zk.au.http.DHtmlUpdateServlet.process(DHtmlUpdateServlet.java:562)
at org.zkoss.zk.au.http.DHtmlUpdateServlet.doGet(DHtmlUpdateServlet.java:457)
at org.zkoss.zk.au.http.DHtmlUpdateServlet.doPost(DHtmlUpdateServlet.java:465)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:852)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Unknown Source)
Problem solved. I just added a condition to check whether the current page is alive:
if (!Executions.getCurrent().getDesktop().isAlive()) {
    eq.unsubscribe(this);
    return;
}
if (!self.getPage().isAlive()) {
    eq.unsubscribe(this);
    return;
}
Customer customer = (Customer) event.getData();

Neo4j TimeTree REST API Previous and Next Navigation

I am currently using the Neo4j TimeTree REST API. Is there any way to navigate to the time before and after a given timestamp? My resolution is Second, and I just realized that if the minute has changed, there is no NEXT relationship bridging the last Second of the previous Minute to the current Second. This makes the Cypher query quite complicated, and I just don't want to reinvent the wheel if it's already available.
Thanks in advance and your response would be really appreciated!
EDIT
I've managed to reproduce the missing NEXT relationship issue again, as you can see in the picture below. This starts to happen from the third time I add a new Second time instant.
I actually create a NodeEntity to work with the Second nodes. The class looks like this:
@NodeEntity(label = "Second")
public class TimeTreeSecond {
    @GraphId
    private Long id;

    private Integer value;

    @Relationship(type = "CREATED_ON", direction = Relationship.INCOMING)
    private FilterVersionChange relatedFilterVersionChange;

    @Relationship(type = "NEXT", direction = Relationship.OUTGOING)
    private TimeTreeSecond nextTimeTreeSecond;

    @Relationship(type = "NEXT", direction = Relationship.INCOMING)
    private TimeTreeSecond prevTimeTreeSecond;

    public TimeTreeSecond() {
    }

    public Long getId() {
        return id;
    }

    public void next(TimeTreeSecond nextTimeTreeSecond) {
        this.nextTimeTreeSecond = nextTimeTreeSecond;
    }

    public FilterVersionChange getRelatedFilterVersionChange() {
        return relatedFilterVersionChange;
    }
}
The problem here is the incoming NEXT relationship. When I omit that, everything works fine.
Sometimes I even get this kind of exception in my console when I create the time instants repeatedly with a short delay:
Exception in thread "main" org.neo4j.ogm.session.result.ResultProcessingException: Could not initialise response
at org.neo4j.ogm.session.response.GraphModelResponse.<init>(GraphModelResponse.java:38)
at org.neo4j.ogm.session.request.SessionRequestHandler.execute(SessionRequestHandler.java:55)
at org.neo4j.ogm.session.Neo4jSession.load(Neo4jSession.java:108)
at org.neo4j.ogm.session.Neo4jSession.load(Neo4jSession.java:100)
at org.springframework.data.neo4j.repository.GraphRepositoryImpl.findOne(GraphRepositoryImpl.java:50)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.executeMethodOn(RepositoryFactorySupport.java:452)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:437)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:409)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:281)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:136)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy32.findOne(Unknown Source)
at de.rwthaachen.service.core.FilterDefinitionServiceImpl.createNewFilterVersionChange(FilterDefinitionServiceImpl.java:100)
at sampleapp.FilterLauncher.main(FilterLauncher.java:50)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: org.neo4j.ogm.session.result.ResultProcessingException: "errors":[{"code":"Neo.ClientError.Statement.InvalidType","message":"Expected a numeric value for empty iterator, but got null"}]}
at org.neo4j.ogm.session.response.JsonResponse.parseErrors(JsonResponse.java:128)
at org.neo4j.ogm.session.response.JsonResponse.parseColumns(JsonResponse.java:102)
at org.neo4j.ogm.session.response.JsonResponse.initialiseScan(JsonResponse.java:46)
at org.neo4j.ogm.session.response.GraphModelResponse.initialiseScan(GraphModelResponse.java:66)
at org.neo4j.ogm.session.response.GraphModelResponse.<init>(GraphModelResponse.java:36)
... 27 more
2015-05-23 01:30:46,204 INFO ork.data.neo4j.config.Neo4jConfiguration: 62 - Intercepted exception
Below is one REST call example which I use to create the time instant nodes:
http://localhost:7474/graphaware/timetree/1202/single/1432337658713?resolution=Second&timezone=Europe/Amsterdam
Here is the method that I use to create the data:
public FilterVersionChange createNewFilterVersionChange(String projectName,
                                                        String filterVersionName,
                                                        String filterVersionChangeDescription,
                                                        Set<FilterState> filterStates) {
    Long filterVersionNodeId = filterVersionRepository.findFilterVersionByName(projectName, filterVersionName);
    FilterVersion newFilterVersion = filterVersionRepository.findOne(filterVersionNodeId, 2);

    // Populate all the existing filters in the current project
    Map<String, Filter> existingFilters = new HashMap<String, Filter>();
    try {
        for (Filter filter : newFilterVersion.getProject().getFilters()) {
            existingFilters.put(filter.getMatchingString(), filter);
        }
    }
    catch (Exception e) {}

    // Map the filter states to the populated filters, if any. Otherwise, create a new filter for each.
    for (FilterState filterState : filterStates) {
        Filter filter = existingFilters.get(filterState.getMatchingString());
        if (filter == null) {
            filter = new Filter(filterState.getMatchingString(), filterState.getMatchingType(), newFilterVersion.getProject());
        }
        filterState.stateOf(filter);
    }

    Long now = System.currentTimeMillis();
    TimeTreeSecond timeInstantNode = timeTreeSecondRepository.findOne(timeTreeService.getFilterTimeInstantNodeId(projectName, now));
    FilterVersionChange filterVersionChange = new FilterVersionChange(filterVersionChangeDescription, now, filterStates, filterStates, newFilterVersion, timeInstantNode);
    FilterVersionChange addedFilterVersionChange = filterVersionChangeRepository.save(filterVersionChange);
    return addedFilterVersionChange;
}
Leaving aside for a moment the specific use of TimeTree, I'd like to describe how to generally manage a doubly-linked list using SDN 4, specifically for the case where the underlying graph uses a single relationship type between nodes, e.g.
(post1:Post)-[:NEXT]->(post2:Post)
What you can't do
Due to limitations in the mapping framework, it is not possible to reliably declare the same relationship type twice in two different directions in your object model, i.e. this (currently) will not work:
class Post {
    @Relationship(type = "NEXT", direction = Relationship.OUTGOING)
    Post next;

    @Relationship(type = "NEXT", direction = Relationship.INCOMING)
    Post previous;
}
What you can do
Instead we can combine the @Transient annotation with the use of annotated setter methods to obtain the desired result:
class Post {
    Post next;

    @Transient
    Post previous;

    @Relationship(type = "NEXT", direction = Relationship.OUTGOING)
    public void setNext(Post next) {
        this.next = next;
        if (next != null) {
            next.previous = this;
        }
    }
}
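A hypothetical usage of the setter-based mapping above (the session variable is assumed to be a Neo4j OGM Session; a Spring Data repository save behaves the same way):
Post first = new Post();
Post second = new Post();
first.setNext(second); // wires both directions in memory: first.next and second.previous
session.save(first);   // persists a single (:Post)-[:NEXT]->(:Post) relationship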
As a final point, if you then wanted to be able to navigate forwards and backwards through the entire list of Posts from any starting Post without having to continually refetch them from the database, you can set the fetch depth to -1 when you load the post, e.g.:
findOne(post.getId(), -1);
Bear in mind that an infinite depth query will fetch every reachable object in the graph from the matched one, so use it with care!
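For example (hypothetical repository and id, assuming the Post mapping above):
Post head = postRepository.findOne(headId, -1); // depth -1 loads the whole chain
Post second = head.next;           // forward navigation, no further queries
Post backToHead = second.previous; // backward navigation via the transient field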
Hope this is helpful
The Seconds are linked to each other via a NEXT relationship, even across minutes.
Hope this is what you meant

Grails and Quartz: Bad value for type long

I'm trying to save Quartz jobs into the database. I've set up the tables and created the quartz.properties file, but when I try to run the app, this exception shows up:
2012-02-01 17:36:23,708 [main] ERROR context.GrailsContextLoader - Error executing bootstraps: org.quartz.JobPersistenceException: Couldn't store trigger 'expirationTrigger' for 'com.pldtglobal.svngateway.ExpirationCheckerJob' job:Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000 [See nested exception: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000]
org.codehaus.groovy.runtime.InvokerInvocationException: org.quartz.JobPersistenceException: Couldn't store trigger 'expirationTrigger' for 'com.pldtglobal.svngateway.ExpirationCheckerJob' job:Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000 [See nested exception: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000]
at org.grails.tomcat.TomcatServer.start(TomcatServer.groovy:212)
at grails.web.container.EmbeddableServer$start.call(Unknown Source)
at _GrailsRun_groovy$_run_closure5_closure12.doCall(_GrailsRun_groovy:158)
at _GrailsRun_groovy$_run_closure5_closure12.doCall(_GrailsRun_groovy)
at _GrailsSettings_groovy$_run_closure10.doCall(_GrailsSettings_groovy:280)
at _GrailsSettings_groovy$_run_closure10.call(_GrailsSettings_groovy)
at _GrailsRun_groovy$_run_closure5.doCall(_GrailsRun_groovy:149)
at _GrailsRun_groovy$_run_closure5.call(_GrailsRun_groovy)
at _GrailsRun_groovy.runInline(_GrailsRun_groovy:116)
at _GrailsRun_groovy.this$4$runInline(_GrailsRun_groovy)
at _GrailsRun_groovy$_run_closure1.doCall(_GrailsRun_groovy:59)
at RunApp$_run_closure1.doCall(RunApp:33)
at gant.Gant$_dispatch_closure5.doCall(Gant.groovy:381)
at gant.Gant$_dispatch_closure7.doCall(Gant.groovy:415)
at gant.Gant$_dispatch_closure7.doCall(Gant.groovy)
at gant.Gant.withBuildListeners(Gant.groovy:427)
at gant.Gant.this$2$withBuildListeners(Gant.groovy)
at gant.Gant$this$2$withBuildListeners.callCurrent(Unknown Source)
at gant.Gant.dispatch(Gant.groovy:415)
at gant.Gant.this$2$dispatch(Gant.groovy)
at gant.Gant.invokeMethod(Gant.groovy)
at gant.Gant.executeTargets(Gant.groovy:590)
at gant.Gant.executeTargets(Gant.groovy:589)
Caused by: org.quartz.JobPersistenceException: Couldn't store trigger 'expirationTrigger' for 'com.pldtglobal.svngateway.ExpirationCheckerJob' job:Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000 [See nested exception: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000]
at org.quartz.impl.jdbcjobstore.JobStoreSupport.storeTrigger(JobStoreSupport.java:1241)
at org.quartz.impl.jdbcjobstore.JobStoreSupport$5.execute(JobStoreSupport.java:1147)
at org.quartz.impl.jdbcjobstore.JobStoreSupport$40.execute(JobStoreSupport.java:3670)
at org.quartz.impl.jdbcjobstore.JobStoreCMT.executeInLock(JobStoreCMT.java:242)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.executeInLock(JobStoreSupport.java:3666)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.storeTrigger(JobStoreSupport.java:1143)
at org.quartz.core.QuartzScheduler.scheduleJob(QuartzScheduler.java:790)
at org.quartz.impl.StdScheduler.scheduleJob(StdScheduler.java:254)
at org.quartz.Scheduler$scheduleJob.call(Unknown Source)
at QuartzGrailsPlugin$_closure5_closure24.doCall(QuartzGrailsPlugin.groovy:223)
at QuartzGrailsPlugin$_closure5.doCall(QuartzGrailsPlugin.groovy:218)
at QuartzGrailsPlugin.invokeMethod(QuartzGrailsPlugin.groovy)
at QuartzGrailsPlugin$_closure3_closure21.doCall(QuartzGrailsPlugin.groovy:169)
at QuartzGrailsPlugin$_closure3.doCall(QuartzGrailsPlugin.groovy:167)
... 23 more
Caused by: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000
at org.postgresql.jdbc2.AbstractJdbc2ResultSet.toLong(AbstractJdbc2ResultSet.java:2796)
at org.postgresql.jdbc2.AbstractJdbc2ResultSet.getLong(AbstractJdbc2ResultSet.java:2019)
at org.postgresql.jdbc4.Jdbc4ResultSet.getBlob(Jdbc4ResultSet.java:52)
at org.postgresql.jdbc2.AbstractJdbc2ResultSet.getBlob(AbstractJdbc2ResultSet.java:335)
at org.quartz.impl.jdbcjobstore.StdJDBCDelegate.getObjectFromBlob(StdJDBCDelegate.java:3462)
at org.quartz.impl.jdbcjobstore.StdJDBCDelegate.selectJobDetail(StdJDBCDelegate.java:904)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.storeTrigger(JobStoreSupport.java:1197)
... 36 more
Application context shutting down...
Application context shutdown.
I don't have any idea where the actual problem is. The code was fine and ran when the jobs weren't being saved in the database.
In your grails-app/conf/quartz.properties, replace
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.StdJDBCDelegate
with
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
The stack trace shows why: StdJDBCDelegate reads the serialized JobDataMap with getBlob(), and the PostgreSQL driver implements getBlob() by first reading a large-object OID as a long, which fails on the bytea column holding the serialized map. PostgreSQLDelegate reads the bytes directly.
I'm getting the same error even using the correct delegate, so no promises.
For Spring Boot, you can also specify the PostgreSQL delegate using the following property in application.properties:
spring.quartz.properties.org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
For anyone using Quartz and Spring Boot, I had the same problem after migrating from using Quartz in Tomcat to Spring Boot. In Tomcat, we were using a quartz properties file and manually loading it when creating the Scheduler. One of those properties was:
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
In Spring Boot, the scheduler is created automatically through an auto config, and therefore our properties weren't being applied.
Our solution was to use a SchedulerFactoryBeanCustomizer and set the Quartz properties. This customizer is applied before the scheduler is created so it's a good place to configure Quartz.
@Bean
public SchedulerFactoryBeanCustomizer schedulerFactoryBeanCustomizer() {
    return new SchedulerFactoryBeanCustomizer() {
        @Override
        public void customize(SchedulerFactoryBean bean) {
            bean.setQuartzProperties(createQuartzProperties());
        }
    };
}

private Properties createQuartzProperties() {
    // Could also load from a file
    Properties props = new Properties();
    props.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
    return props;
}
And for reference here is the full quartz.properties we migrated from:
org.quartz.scheduler.instanceName=ProcessAutomation
org.quartz.scheduler.instanceId=AUTO
org.quartz.scheduler.jmx.export=true
org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount=10
org.quartz.threadPool.threadPriority=5
org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreCMT
org.quartz.jobStore.dataSource=QuartzDS
org.quartz.jobStore.nonManagedTXDataSource=springNonTxDataSource.ProcessAutomation
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.misfireThreshold=60000
org.quartz.jobStore.isClustered=true
org.quartz.jobStore.clusterCheckinInterval=20000
Another way is to build the Quartz Properties bean yourself and override the delegate there:
@Bean
public Properties quartzProperties() throws IOException {
    PropertiesFactoryBean propertiesFactoryBean = new PropertiesFactoryBean();
    propertiesFactoryBean.setLocation(new ClassPathResource("application.properties"));
    Properties props = new Properties();
    props.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
    propertiesFactoryBean.setProperties(props);
    propertiesFactoryBean.afterPropertiesSet();
    return propertiesFactoryBean.getObject();
}
Alternatively, if you want to set all the Quartz properties (clustering, thread pool, etc.), instead of typing them out in this method, create a quartz.properties file and use the configuration below:
@Autowired
private QuartzProperties quartzProperties;

@Autowired
DataSource dataSource;

@Bean
public SchedulerFactoryBean schedulerFactoryBean() throws IOException {
    SchedulerFactoryBean factory = new SchedulerFactoryBean();
    factory.setOverwriteExistingJobs(true);
    factory.setDataSource(dataSource);
    factory.setQuartzProperties(quartzProperties());
    // AutowiringSpringBeanJobFactory and applicationContext are assumed to be
    // defined elsewhere in this configuration class.
    AutowiringSpringBeanJobFactory jobFactory = new AutowiringSpringBeanJobFactory();
    jobFactory.setApplicationContext(applicationContext);
    factory.setJobFactory(jobFactory);
    return factory;
}

@Bean
public Properties quartzProperties() throws IOException {
    PropertiesFactoryBean propertiesFactoryBean = new PropertiesFactoryBean();
    propertiesFactoryBean.setLocation(new ClassPathResource("/application.properties"));
    Properties props = new Properties();
    props.putAll(quartzProperties.getProperties());
    propertiesFactoryBean.setProperties(props);
    propertiesFactoryBean.afterPropertiesSet(); // it's important
    return propertiesFactoryBean.getObject();
}
An example quartz.properties file is shown below:
org.quartz.scheduler.instanceName=springBootQuartzApp
org.quartz.scheduler.instanceId=AUTO
org.quartz.threadPool.threadCount=50
org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.useProperties=true
#org.quartz.jobStore.misfireThreshold=60000
org.quartz.jobStore.isClustered=true
org.quartz.plugin.shutdownHook.class=org.quartz.plugins.management.ShutdownHookPlugin
org.quartz.plugin.shutdownHook.cleanShutdown=TRUE
I also faced this issue, and I just added:
properties.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
The full bean configuration is below:
@Bean
public SchedulerFactoryBean scheduler(Trigger... triggers) {
    SchedulerFactoryBean schedulerFactory = new SchedulerFactoryBean();
    Properties properties = new Properties();
    properties.setProperty("org.quartz.scheduler.instanceName", "MY_INSTANCE_NAME");
    properties.setProperty("org.quartz.scheduler.instanceId", "INSTANCE_ID_01");
    properties.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");

    schedulerFactory.setOverwriteExistingJobs(true);
    schedulerFactory.setAutoStartup(true);
    schedulerFactory.setQuartzProperties(properties);
    schedulerFactory.setDataSource(dataSource);
    schedulerFactory.setJobFactory(springBeanJobFactory());
    schedulerFactory.setWaitForJobsToCompleteOnShutdown(true);

    if (ArrayUtils.isNotEmpty(triggers)) {
        schedulerFactory.setTriggers(triggers);
    }
    return schedulerFactory;
}
