LocalDateTime misformatted with @Query, the seconds are removed - spring-data-elasticsearch

I have a mapping similar to this one:
@Field(type = FieldType.Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss")
private LocalDateTime startTime;
@Field(type = FieldType.Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss")
private LocalDateTime endTime;
And a query method in a repository like this:
@Query("{" +
    " \"nested\": {" +
    " \"path\" : \"panel.channels\"," +
    " \"query\" : {" +
    " \"bool\" : {" +
    " \"must\" : [" +
    " { \"range\" : { \"panel.channels.startTime\" : { \"lte\": \"?1\" } } }," +
    " { \"range\" : { \"panel.channels.endTime\" : { \"gte\": \"?0\" } } } " +
    " ]" +
    " }" +
    " }" +
    " }" +
    "}")
Flux<Document> search(LocalDateTime from, LocalDateTime to);
If I provide date times with the seconds at zero, at start of day for instance, like 2021-03-15T00:00:00, the date is not formatted correctly and the request returns no results, with this error in the logs (in the ES response):
"caused_by":{"type":"illegal_argument_exception","reason":"failed to parse date field [2021-03-15T00:00] with format [yyyy-MM-dd'T'HH:mm:ss]"
When I check, the generated request is indeed wrong:
{
  "nested": {
    "path": "panel.channels",
    "query": {
      "bool": {
        "must": [
          {
            "range": {
              "panel.channels.startTime": {
                "lte": "2021-03-15T00:00"
              }
            }
          },
          {
            "range": {
              "panel.channels.endTime": {
                "gte": "2020-01-01T00:00"
              }
            }
          }
        ]
      }
    }
  }
}
But if the LocalDateTime has non-zero seconds, like 2021-03-15T00:00:01, it works!

I did some debugging and testing; so what is happening here?
You have defined a custom date format with the @Field annotation on your properties. Internally, this creates a custom converter for each of these properties that can convert a LocalDateTime to a String in the desired format and back. Important: this converter is attached to the property of the entity and is not a generally available converter for LocalDateTime (you might have different formats for different properties). The converter is then used when Spring Data Elasticsearch stores or retrieves an entity.
Things get more complicated when doing a query. When you create, for example, a CriteriaQuery, Spring Data Elasticsearch knows which properties are used in the query and can use the corresponding converters to correctly convert the property values to query parameters. This is not possible with a query that is defined by the @Query annotation on a repository method. These queries are just used as strings, and the parameter parts (?0, ?1) are replaced by the converted parameter values of the method. Converted here means converted by the default conversions that are available, not by the converters attached to the property, because we don't know which property of an entity a parameter might relate to. We would need to implement a query parser that finds the corresponding property - in your case panel.channels.startTime - to get the converter, and I don't think that we will implement such a parser.
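Just to illustrate the difference: with a CriteriaQuery the property names are explicit, so the per-property converters can be applied. A minimal sketch (assuming an injected ReactiveElasticsearchOperations and your Document entity; it only illustrates the conversion aspect, not the full nested query):

import java.time.LocalDateTime;
import org.springframework.data.elasticsearch.core.ReactiveElasticsearchOperations;
import org.springframework.data.elasticsearch.core.SearchHit;
import org.springframework.data.elasticsearch.core.query.Criteria;
import org.springframework.data.elasticsearch.core.query.CriteriaQuery;
import reactor.core.publisher.Flux;

// sketch: "operations" is an injected ReactiveElasticsearchOperations bean
Flux<SearchHit<Document>> searchOverlapping(ReactiveElasticsearchOperations operations,
                                            LocalDateTime from, LocalDateTime to) {
    Criteria criteria = new Criteria("panel.channels.startTime").lessThanEqual(to)
            .and(new Criteria("panel.channels.endTime").greaterThanEqual(from));
    // the property names are known here, so the per-property date converters are used
    // and the seconds are kept in the generated query
    return operations.search(new CriteriaQuery(criteria), Document.class);
}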
So in your case the default conversion service is searched for a converter for a LocalDateTime instance, and there is none registered. Then a fallback is used which calls toString() on the object to be converted, and with zero seconds this returns the strings that you see - containing just the hours and minutes; that is how LocalDateTime.toString() is implemented.
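You can see this behaviour with plain Java, nothing Spring specific:

import java.time.LocalDateTime;

System.out.println(LocalDateTime.of(2021, 3, 15, 0, 0, 1)); // 2021-03-15T00:00:01
System.out.println(LocalDateTime.of(2021, 3, 15, 0, 0, 0)); // 2021-03-15T00:00 - seconds are omitted when they are zero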
The first workaround that came to my mind is to define a converter for LocalDateTime -> String and register it in Spring Data Elasticsearch, but I found that it is not used by the @Query parameter resolution - which definitely is a bug that I will fix tomorrow. The next round of maintenance releases for Spring Data Elasticsearch 4.1, 4.2 and the current main branch is next Friday, so you could get that change pretty fast. I already tried this locally and the produced query is built correctly.
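For reference, registering such a converter would look roughly like this - a sketch only, assuming a configuration class that extends AbstractReactiveElasticsearchConfiguration, and it relies on the @Query parameter resolution actually picking up these conversions once the fix mentioned above is released:

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Collections;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.elasticsearch.core.convert.ElasticsearchCustomConversions;

// overridden in the configuration class; the pattern must match the one from the @Field annotations
@Override
public ElasticsearchCustomConversions elasticsearchCustomConversions() {
    return new ElasticsearchCustomConversions(
            Collections.singletonList(new LocalDateTimeToStringConverter()));
}

static class LocalDateTimeToStringConverter implements Converter<LocalDateTime, String> {
    private static final DateTimeFormatter FORMATTER =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss");

    @Override
    public String convert(LocalDateTime source) {
        return FORMATTER.format(source);
    }
}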
Another possible solution might be to introduce additional property hint annotations on the method parameters; that would be something like
Flux<Document> search(
    @Property("panel.channels.endTime") LocalDateTime from,
    @Property("panel.channels.startTime") LocalDateTime to);
but to be honest, I don't like this.
When I have implemented the fix and the fixed versions are released, I will edit this answer with additional information about how to add the converter to your setup and get this running.

Related

Context dependent ANTLR4 ParseTreeVisitor implementation

I am working on a project where we are migrating a massive number (more than 12000) of views from Oracle to Hadoop/Impala. I have written a small Java utility to extract the view DDL from Oracle and would like to use ANTLR4 to traverse the AST and generate an Impala-compatible view DDL statement.
Most of the work is relatively simple and only involves rewriting some Oracle-specific syntax quirks to Impala style. However, I am facing an issue where I am not sure I have the best answer yet: we have a number of special cases where values from a date field are extracted in multiple nested function calls. For example, the following extracts the day from a Date field:
TO_NUMBER(TO_CHAR(d.R_DATE , 'DD' ))
I have an ANTLR4 grammar declared for Oracle SQL and hence get the visitor callbacks when it reaches TO_NUMBER and TO_CHAR as well, but I would like to have special handling for this particular case.
Is there any other way than implementing the handler method for the outer function and then resorting to manual traversal of the nested structure to see whether it matches this special case?
I have something like this in the generated Visitor class:
@Override
public String visitNumber_function(PlSqlParser.Number_functionContext ctx) {
    // FIXME: seems to be dodgy code, can it be improved?
    String functionName = ctx.name.getText();
    if (functionName.equalsIgnoreCase("TO_NUMBER")) {
        final int childCount = ctx.getChildCount();
        if (childCount == 4) {
            final int functionNameIndex = 0;
            final int openRoundBracketIndex = 1;
            final int encapsulatedValueIndex = 2;
            final int closeRoundBracketIndex = 3;
            ParseTree encapsulated = ctx.getChild(encapsulatedValueIndex);
            if (encapsulated instanceof TerminalNode) {
                throw new IllegalStateException("TerminalNode is found at: " + encapsulatedValueIndex);
            }
            String customDateConversionOrNullOnOtherType =
                    customDateConversionFromToNumberAndNestedToChar(encapsulated);
            if (customDateConversionOrNullOnOtherType != null) {
                // the child node contained our expected child element, so return the converted value
                return customDateConversionOrNullOnOtherType;
            }
            // otherwise the child was something unexpected, signalled by null,
            // so simply fall back to the default handler
        }
    }
    // some other numeric function, default handling
    return super.visitNumber_function(ctx);
}

private String customDateConversionFromToNumberAndNestedToChar(ParseTree parseTree) {
    // ...
}
For anyone hitting the same issue, the way to go seems to be:
Changing the grammar definition and introducing custom sub-types for the encapsulated expression of the nested function. Then it is possible to hook into the processing at precisely the desired location of the parse tree.
Using a second custom ParseTreeVisitor that captures the values of the function call and delegates processing of the rest of the sub-tree back to the main, "outer" ParseTreeVisitor.
Once the second custom ParseTreeVisitor has finished visiting all the sub-ParseTrees, I had the context information I required and all the sub-trees had been visited properly.
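A rough sketch of the delegating-visitor idea (PlSqlParserBaseVisitor is the visitor base class that ANTLR generates for the grammar; the handled rule and the translation helper below are illustrative assumptions):

class NestedToCharVisitor extends PlSqlParserBaseVisitor<String> {

    private final PlSqlParserBaseVisitor<String> outer;

    NestedToCharVisitor(PlSqlParserBaseVisitor<String> outer) {
        this.outer = outer;
    }

    @Override
    public String visitNumber_function(PlSqlParser.Number_functionContext ctx) {
        // handle the TO_NUMBER(TO_CHAR(col, 'DD')) shape here ...
        if ("TO_NUMBER".equalsIgnoreCase(ctx.name.getText())) {
            return translateDayExtraction(ctx); // illustrative helper
        }
        // ... and hand everything else back to the main visitor
        return ctx.accept(outer);
    }

    private String translateDayExtraction(PlSqlParser.Number_functionContext ctx) {
        // build the Impala-compatible expression for the special case
        return null; // placeholder
    }
}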

FLUTTER How to get variable based on passed string name?

I have stored variables in a class with their code names.
Suppose I want to get XVG from that class, I want to do
String getIconsURL(String symbol) {
  var list = new URLsList();
  //symbol = 'XVG'
  return list.(symbol);
}

class URLsList {
  var XVG = 'some url';
  var BTC = 'some url';
}
Can someone help me achieve this or provide me with a better solution?
Dart, when used in Flutter, doesn't support reflection.
If it's text that you want to have directly in your code for some reason, I'd advise using a text replace (with your favourite tool or IntelliJ's find + replace with regex) to change it into a map, i.e.
final Map<String, String> whee = {
  'XVG': 'url 1',
  'BTC': 'url 2',
};
Another alternative is saving it as a JSON file in your assets, and then loading it and reading it when the app opens, or even downloading it from a server on first run / when needed (in case the URLs need updating more often than you plan on updating the app). Hardcoding a bunch of data like that isn't necessarily always a good idea.
EDIT: how to use.
final Map<String, String> whee = .....

String getIconsURL(String symbol) {
  //symbol = 'XVG'
  return whee[symbol];
}
If you define it in a class, make sure you make it static as well so that another map isn't created each time the class is instantiated.
Also, if you want to iterate through them, you have the option of using entries, keys, or values - see the Map class documentation.
I'd just implement a getProperty(String name) method or the [] operator, like this:
class URLsList {
  var XVG = 'some url';
  var BTC = 'some url';

  String operator [](String key) {
    switch (key) {
      case 'XVG': return XVG;
      case 'BTC': return BTC;
    }
    return null;
  }
}

String getIconsURL(String symbol) {
  var list = new URLsList();
  return list[symbol];
}
You can also use the reflectable package, which enables you to use reflection-like code via code generation.
Assuming that the class is being created from a JSON object, you can always use objectName.toJSON() and then use the variable names as array indices to do your computations.

Groovy getProperties() call invoking getter for non-existent attribute over 1000 times

Ran into this while doing a refactor. Calls to getProperties() were causing our CPU usage to spike. What we discovered is that if you have a getter without an associated attribute, that getter is called over 1000 times when you make a call to getProperties(). The fix/workaround is obvious, and we know it has something to do with metaprogramming, but why is this happening (at what point in the Groovy source)? See the Groovy script code below:
class tester {
    int count = 0

    public getVar() {
        println count++ + " getVar() called!"
        return var
    }
}

def t = new tester()
t.getProperties()
println "done!"
You should see getVar() called over 1000 times. 1068 to be exact for us.
The question has probably already been answered in the comments, but I dug a little deeper to also answer the "what point in the groovy source" part.
When you call getProperties() on the instance of tester, Groovy will do its magic and finally call DefaultGroovyMethods#getProperties(Object), which (in Groovy 2.4.7) looks like this:
public static Map getProperties(Object self) {
    List<PropertyValue> metaProps = getMetaPropertyValues(self); // 1
    Map<String, Object> props = new LinkedHashMap<String, Object>(metaProps.size());
    for (PropertyValue mp : metaProps) {
        try {
            props.put(mp.getName(), mp.getValue()); // 2
        } catch (Exception e) {
            LOG.throwing(self.getClass().getName(), "getProperty(" + mp.getName() + ")", e);
        }
    }
    return props;
}
First, Groovy determines the meta properties of the given object (see 1). This will return three properties:
var: getter only (getVar()), no setter, no field
class: getter only (inherited from Object), no setter, no field
count: getter, setter (both generated by Groovy) and field
You can easily verify this by calling t.getMetaPropertyValues().
Next, Groovy tries to get the current value of each property and puts it in a map (see 2). When it reaches var, it remembers that var has a getter (namely getVar()) and calls it. getVar(), however, returns var again. For Groovy, this is the exact same property as determined in the first step. Once again it calls its getter getVar(), and the endless loop begins.
At some point, depending on the JVM, this results in a StackOverflowError, which is exactly what this site is all about :-D

Spray Returning JSON

I've just started using Spray and I'm building a sample API that simply returns some JSON.
Take the following example. I have the alphabet stored in a singleton...
class Alphabet {}

object Alphabet {
  final val alphabet = Array('a', 'b', 'c', ...)
}
I then have a simple spray route...
path("list") {
get {
respondWithMediaType(`application/json`) {
complete(Alphabet.alphabet)
}
}
}
This works fine and seems to return an "application/json" response with the correct data. But is this valid? i.e. is this a correctly formatted response that would be expected by an end user?
The reason I ask is I have looked at a number of Spray examples and most seem to use case classes and specify custom JSON formatters similar to this...
object CustomJsonProtocol extends DefaultJsonProtocol {
  implicit val responseFormat = jsonFormat3(CaseClassHere)
}
...
complete {
  CaseClassHere("Test", "Test");
}
What's the correct approach?
Thanks

Quartz CronExpression get all expression parameters info

Following on from my previous question, I subclassed CronExpression and changed getSet to be public. This method takes an int, and I have a String containing the cron expression. How do I get the info about this expression (hours/days/etc.)? What do I need to pass to the getSet method? Or maybe I should use another method? This is very unclear to me.
The problem with CronExpression is that even though its documentation states:
Provides a parser and evaluator for unix-like cron expressions.
the API is obscure and hidden behind protected methods. It is by far not a general-purpose CRON expression parser. However, with a few tweaks you can easily take advantage of the parsing logic:
class MyCronExpression extends CronExpression {

    public MyCronExpression(String cronExpression) throws ParseException {
        super(cronExpression);
    }

    public TreeSet<Integer> getSeconds() {
        return super.getSet(CronExpression.SECOND);
    }

    public TreeSet<Integer> getMinutes() {
        return super.getSet(CronExpression.MINUTE);
    }

    public TreeSet<Integer> getHours() {
        return super.getSet(CronExpression.HOUR);
    }

    //...
}
Usage:
final MyCronExpression cronExpression = new MyCronExpression("0 30 9,12,15 * * ?");
System.out.println(cronExpression.getSeconds()); //0
System.out.println(cronExpression.getMinutes()); //30
System.out.println(cronExpression.getHours()); //9, 12, 15
You might be tempted to parse CRON expression manually using regular expressions... Here is a regex from job_scheduling_data_2_0.xsd Quartz schema:
(((([0-9]|[0-5][0-9]),)*([0-9]|[0-5][0-9]))|(([\*]|[0-9]|[0-5][0-9])(/|-)([0-9]|[0-5][0-9]))|([\?])|([\*]))[\s](((([0-9]|[0-5][0-9]),)*([0-9]|[0-5][0-9]))|(([\*]|[0-9]|[0-5][0-9])(/|-)([0-9]|[0-5][0-9]))|([\?])|([\*]))[\s](((([0-9]|[0-1][0-9]|[2][0-3]),)*([0-9]|[0-1][0-9]|[2][0-3]))|(([\*]|[0-9]|[0-1][0-9]|[2][0-3])(/|-)([0-9]|[0-1][0-9]|[2][0-3]))|([\?])|([\*]))[\s](((([1-9]|[0][1-9]|[1-2][0-9]|[3][0-1]),)*([1-9]|[0][1-9]|[1-2][0-9]|[3][0-1])(C)?)|(([1-9]|[0][1-9]|[1-2][0-9]|[3][0-1])(/|-)([1-9]|[0][1-9]|[1-2][0-9]|[3][0-1])(C)?)|(L(-[0-9])?)|(L(-[1-2][0-9])?)|(L(-[3][0-1])?)|(LW)|([1-9]W)|([1-3][0-9]W)|([\?])|([\*]))[\s](((([1-9]|0[1-9]|1[0-2]),)*([1-9]|0[1-9]|1[0-2]))|(([1-9]|0[1-9]|1[0-2])(/|-)([1-9]|0[1-9]|1[0-2]))|(((JAN|FEB|MAR|APR|MAY|JUN|JUL|AUG|SEP|OCT|NOV|DEC),)*(JAN|FEB|MAR|APR|MAY|JUN|JUL|AUG|SEP|OCT|NOV|DEC))|((JAN|FEB|MAR|APR|MAY|JUN|JUL|AUG|SEP|OCT|NOV|DEC)(-|/)(JAN|FEB|MAR|APR|MAY|JUN|JUL|AUG|SEP|OCT|NOV|DEC))|([\?])|([\*]))[\s]((([1-7],)*([1-7]))|([1-7](/|-)([1-7]))|(((MON|TUE|WED|THU|FRI|SAT|SUN),)*(MON|TUE|WED|THU|FRI|SAT|SUN)(C)?)|((MON|TUE|WED|THU|FRI|SAT|SUN)(-|/)(MON|TUE|WED|THU|FRI|SAT|SUN)(C)?)|(([1-7]|(MON|TUE|WED|THU|FRI|SAT|SUN))?(L|LW)?)|(([1-7]|MON|TUE|WED|THU|FRI|SAT|SUN)#([1-7])?)|([\?])|([\*]))([\s]?(([\*])?|(19[7-9][0-9])|(20[0-9][0-9]))?| (((19[7-9][0-9])|(20[0-9][0-9]))(-|/)((19[7-9][0-9])|(20[0-9][0-9])))?| ((((19[7-9][0-9])|(20[0-9][0-9])),)*((19[7-9][0-9])|(20[0-9][0-9])))?)
Or maybe someone knows a better general-purpose CRON expression parser for Java?
