search_after with spring-data-elasticsearch

Has anyone successfully used search_after with spring-data-elasticsearch?
I've added _uid to my sort, which is correctly getting put into my Pageable by the @Controller. I turned on the slow query log and validated that if I manually run the query the @Repository generates, I get this in the response:
"sort": [
1522270372773,
"log#AWJuYn7SAKReCIGzMYda"
]
I added the following to my POJO (the class returned by the @Repository):
@JsonProperty("sort")
String[] sort;
However, sort always comes back as null. I've tried a couple of different things and cannot figure out how to get it set. Any ideas?

I've solved the problem but discovered some disappointing details.
The culprit is here: https://github.com/spring-projects/spring-data-elasticsearch/blob/3.0.x/src/main/java/org/springframework/data/elasticsearch/core/DefaultResultMapper.java
In the mapResults method, the code populates my POJO from SearchHit.sourceAsString(). sourceAsString provides a subset of the original JSON which does not include the sort array, i.e.
"sort": [
1522270147602,
"log#AWJuXxJ_AKReCIGzMYdV"
]
The access modifiers (private fields with no getters, and final members) did not allow for an elegant enhancement. I ended up copying DefaultResultMapper and implementing a method, similar to setPersistentEntityId, which sets the sort values into my POJO. The implementation of this method is as follows:
private <T> void setSearchSortValues(T result, Object[] sortValues, Class<T> clazz) {
    if (SortAware.class.isAssignableFrom(clazz)) {
        ((SortAware) result).setSortValues(sortValues);
    }
}
My POJO implements the SortAware interface which I defined as follows:
public interface SortAware {
    Object[] getSortValues();
    void setSortValues(Object[] sortValues);
}

Related

Security Context in terms of QueryDslPredicateExecutor and Spring Data Rest

I'm building a REST API on top of Spring Data Rest. Initially all repositories were extending JpaRepository. Lately a decision has been made to take a more flexible approach and use QueryDslPredicateExecutor<T> along with QuerydslBinderCustomizer<Q>.
Pretty much all findAll methods exposed in repositories should address two scenarios:
if the principal has the role ROLE_ADMIN, no filtering should be applied apart from Pageable/Sort;
if the principal does not have the role ROLE_ADMIN, only those entities which belong to the current user should be returned.
Getting that done was as simple as annotating the findAll method as below.
@Query("select e from Entity e where e.field = ?#{principal} or 1=?#{hasRole('ROLE_ADMIN') ? 1 : 0}")
Page<Entity> findAll(Pageable pageable);
Now I want our findAll to be something similar to the below:
Page<Entity> findAll(Predicate predicate, Pageable pageable)
The Predicate is built from request parameters (courtesy of @QuerydslPredicate) and passed into RepositoryEntityController, which is all managed by spring-data-rest, which is great.
@ResponseBody
@RequestMapping(value = BASE_MAPPING, method = RequestMethod.GET)
public Resources<?> getCollectionResource(@QuerydslPredicate RootResourceInformation resourceInformation,
        DefaultedPageable pageable, Sort sort, PersistentEntityResourceAssembler assembler)
        throws ResourceNotFoundException, HttpRequestMethodNotSupportedException {
I want to tweak that predicate (to address the two scenarios above). It would be something similar to the below.
BooleanBuilder builder = new BooleanBuilder(predicateBuildFromHttpRequest);
builder.and(predicateAddressingOurRequirements);
builder.getValue();
@PostFilter won't be an option, as the return type for all repos is Page<Entity>.
The use case I want to address seems quite common to me. Having said that, I had a look at the spring-data and spring-data-rest documentation and could not find anything related to my question.
The question is: am I missing something obvious here and there is a quick win for it, or would I need to implement a custom solution myself? Any comments very much appreciated!
The Querydsl predicates are constructed by QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver which is sadly package private and can't be directly extended.
However, you can make a copy of that, add your security predicate logic and then drop in your implementation instead of the former resolver.
public class MyQueryDslRootResourceArgumentResolver extends RootResourceInformationHandlerMethodArgumentResolver {

    // Most of the code is omitted; the content is identical to
    // QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver.
    // The important part is the postProcess method, where you can modify the predicate.
    @Override
    @SuppressWarnings({"unchecked"})
    protected RepositoryInvoker postProcess(MethodParameter parameter, RepositoryInvoker invoker,
            Class<?> domainType, Map<String, String[]> parameters) {
        Object repository = repositories.getRepositoryFor(domainType);
        if (!QueryDslPredicateExecutor.class.isInstance(repository)
                || !parameter.hasParameterAnnotation(QuerydslPredicate.class)) {
            return invoker;
        }
        ClassTypeInformation<?> type = ClassTypeInformation.from(domainType);
        QuerydslBindings bindings = factory.createBindingsFor(null, type);
        // modify your predicate here
        Predicate predicate = predicateBuilder.getPredicate(type, toMultiValueMap(parameters), bindings);
        return new QuerydslRepositoryInvokerAdapter(invoker, (QueryDslPredicateExecutor<Object>) repository, predicate);
    }
}
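For illustration, the two lines around "modify your predicate here" could be expanded along these lines, folding in the security requirement from the question with a BooleanBuilder. This is a sketch, not part of the original resolver: QEntity is the hypothetical Querydsl-generated type for Entity, and the role check via Spring Security's SecurityContextHolder is an assumed approach:
// Combine the request-built predicate with a security predicate:
// admins see everything, other users only their own entities.
Authentication auth = SecurityContextHolder.getContext().getAuthentication();
boolean isAdmin = auth.getAuthorities().stream()
        .anyMatch(a -> "ROLE_ADMIN".equals(a.getAuthority()));
BooleanBuilder builder = new BooleanBuilder(
        predicateBuilder.getPredicate(type, toMultiValueMap(parameters), bindings));
if (!isAdmin) {
    builder.and(QEntity.entity.field.eq(auth.getName())); // hypothetical generated path
}
Predicate predicate = builder.getValue();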
Then add your own configuration class with the custom resolver implementation.
public class CustomRepositoryRestMvcConfiguration extends RepositoryRestMvcConfiguration {

    @Autowired
    ApplicationContext applicationContext;

    @Override
    public RootResourceInformationHandlerMethodArgumentResolver repoRequestArgumentResolver() {
        QuerydslBindingsFactory factory = applicationContext.getBean(QuerydslBindingsFactory.class);
        QuerydslPredicateBuilder predicateBuilder = new QuerydslPredicateBuilder(defaultConversionService(),
                factory.getEntityPathResolver());
        return new MyQueryDslRootResourceArgumentResolver(repositories(),
                repositoryInvokerFactory(defaultConversionService()), resourceMetadataHandlerMethodArgumentResolver(),
                predicateBuilder, factory);
    }
}
Here is an example project that modifies the Predicate (produced from the URL parameters) before passing it to the repository, demonstrating what David Siro explained above:
https://github.com/yeldarxman/QueryDslPredicateModifier

Dataflow output parameterized type to avro file

I have a pipeline that successfully outputs an Avro file as follows:
@DefaultCoder(AvroCoder.class)
class MyOutput_T_S {
    T foo;
    S bar;
    Boolean baz;
    public MyOutput_T_S() {}
}

@DefaultCoder(AvroCoder.class)
class T {
    String id;
    public T() {}
}

@DefaultCoder(AvroCoder.class)
class S {
    String id;
    public S() {}
}
...
PCollection<MyOutput_T_S> output = input.apply(myTransform);
output.apply(AvroIO.Write.to("/out").withSchema(MyOutput_T_S.class));
How can I reproduce this exact behavior, except with a parameterized output MyOutput<T, S> (where T and S are both Avro-codable using reflection)?
The main issue is that Avro reflection doesn't work for parameterized types. So based on these responses:
Setting Custom Coders & Handling Parameterized types
Using Avrocoder for Custom Types with Generics
1) I think I need to write a custom CoderFactory, but I am having difficulty figuring out exactly how this works (I'm having trouble finding examples). Oddly enough, a completely naive coder factory appears to let me run the pipeline and inspect proper output using DataflowAssert:
cr.registerCoder(MyOutput.class, new CoderFactory() {
    @Override
    public Coder<?> create(List<? extends Coder<?>> componentCoders) {
        Schema schema = new Schema.Parser().parse("{\"type\":\"record\","
                + "\"name\":\"MyOutput\","
                + "\"namespace\":\"mypackage\","
                + "\"fields\":[]}");
        return AvroCoder.of(MyOutput.class, schema);
    }

    @Override
    public List<Object> getInstanceComponents(Object value) {
        MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
        List components = new ArrayList();
        return components;
    }
});
While I can successfully assert against the output now, I expect this will not cut it for writing to a file. I haven't figured out how I'm supposed to use the provided componentCoders to generate the correct schema, and if I try to just shove the schema of T or S into fields I get:
java.lang.IllegalArgumentException: Unable to get field id from class null
2) Assuming I figure out how to encode MyOutput, what do I pass to AvroIO.Write.withSchema? If I pass either MyOutput.class or the schema, I get type mismatch errors.
I think there are two questions (correct me if I am wrong):
How do I enable the coder registry to provide coders for various parameterizations of MyOutput<T, S>?
How do I write values of MyOutput<T, S> to a file using AvroIO.Write?
The first question is to be solved by registering a CoderFactory as in the linked question you found.
Your naive coder is probably allowing you to run the pipeline without issues because serialization is being optimized away. Certainly an Avro schema with no fields will result in those fields being dropped in a serialization+deserialization round trip.
But assuming you fill in the schema with the fields, your approach to CoderFactory#create looks right. I don't know the exact cause of the message java.lang.IllegalArgumentException: Unable to get field id from class null, but the call to AvroCoder.of(MyOutput.class, schema) should work for an appropriately assembled schema. If there is an issue with this, more details (such as the rest of the stack trace) would be helpful.
However, your override of CoderFactory#getInstanceComponents should return a list of values, one per type parameter of MyOutput. Like so:
@Override
public List<Object> getInstanceComponents(Object value) {
    MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
    return ImmutableList.of(myOutput.foo, myOutput.bar);
}
The second question can be answered using some of the same support code as the first, but otherwise is independent. AvroIO.Write.withSchema always explicitly uses the provided schema. It does use AvroCoder under the hood, but this is actually an implementation detail. Providing a compatible schema is all that is necessary - such a schema will have to be composed for each value of T and S for which you want to output MyOutput<T, S>.
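To make that last point concrete, here is one way such a schema could be composed for concrete T and S, using Avro's ReflectData and SchemaBuilder. This is a sketch: the helper name is illustrative, and the field names foo/bar/baz simply mirror the POJO above:
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.reflect.ReflectData;

// Sketch: build a schema for MyOutput<T, S> by splicing in the reflected
// schemas of the concrete T and S classes.
static Schema myOutputSchema(Class<?> tClass, Class<?> sClass) {
    return SchemaBuilder.record("MyOutput").namespace("mypackage")
            .fields()
            .name("foo").type(ReflectData.get().getSchema(tClass)).noDefault()
            .name("bar").type(ReflectData.get().getSchema(sClass)).noDefault()
            .name("baz").type().booleanType().noDefault()
            .endRecord();
}
The resulting schema could then be handed to AvroIO.Write.withSchema for each concrete parameterization you want to write.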

e4 dependency reinjection order: field vs. method

I was quite surprised to see that there is no deterministic behavior for the order in which objects get reinjected.
public class Test {

    @Inject
    private Boolean testBool;

    @Inject
    public void checkNewObject(Boolean testBoolNew) {
        if (!testBoolNew.equals(this.testBool)) {
            System.out.println("Out of sync!");
        } else {
            System.out.println("In sync!");
        }
    }
}
And this is how I use the class:
context.set(Boolean.class, new Boolean(true));
Test test = ContextInjectionFactory.make(Test.class, context);
context.set(Boolean.class, new Boolean(false));
So, sometimes I get the output:
In sync!
In sync!
And sometimes I get:
In sync!
Out of sync!
Is this really non-deterministic, or am I just overlooking something?
The documentation clearly states that the injection order should be:
Constructor injection: the public or protected constructor annotated with @Inject with the greatest number of resolvable arguments is selected
Field injection: values are injected into fields annotated with @Inject and that have a satisfying type
Method injection: values are injected into methods annotated with @Inject and that have satisfying arguments
See: https://wiki.eclipse.org/Eclipse4/RCP/Dependency_Injection#Injection_Order
I'm not sure why this doesn't work as expected in your case.
How is equals() implemented in MyContent?
Is MyContent annotated with @Creatable and/or @Singleton?
As a side note: is this a practical or just an academic problem? Why is it necessary to inject the same instance into a field and into a method on the same target instance? If you want to have a field variable to cache the value, you can set it from the method.
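A minimal sketch of that suggestion, assuming the method is the single injection point and the field is just a cache set from it:
public class Test {

    private Boolean testBool; // plain field, no @Inject: only written by the method below

    @Inject
    public void checkNewObject(Boolean testBoolNew) {
        this.testBool = testBoolNew; // cache the injected value; field and method can no longer diverge
        System.out.println("In sync!");
    }
}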
If you feel this is a bug, please file it here: https://bugs.eclipse.org/bugs/enter_bug.cgi?product=Platform

How to override Grails' default (binding) conversion of parameter values with commas to String array?

I'm using Grails 2.3.7 and have a controller method which binds the request to a command class. One of the fields in the command class is a String array, as it expects multiple params with the same name to be specified - e.g. ?foo=1&foo=2&foo=3&bar=0.
This works fine most of the time. The one case where it fails is if the param value contains a comma - e.g. ?foo=val1,val2&foo=val3,val4. In this case I'm getting an array with 4 values: ["val1","val2","val3","val4"], not the intended output of ["val1,val2","val3,val4"]. URL escaping/encoding the comma does not help, and in my case the param value is surrounded by quote characters as well (foo=%22a+string+with+commas,+etc%22).
I had a similar problem with Spring 3.x which I was able to solve: How to prevent parameter binding from interpreting commas in Spring 3.0.5?. I've attempted one of the answers by adding the following method to my controller:
@InitBinder
public void initBinder(WebDataBinder binder) {
    binder.registerCustomEditor(String[].class, new StringArrayPropertyEditor(null));
}
However, this didn't work.
I also tried specifying a custom conversion service, based on the comment in https://jira.grails.org/browse/GRAILS-8997.
Config/spring/resources.groovy:
beans = {
    conversionService(org.springframework.context.support.ConversionServiceFactoryBean) {
        converters = [new CustomStringToArrayConverter()]
    }
}
and
import org.springframework.core.convert.converter.Converter
import org.springframework.util.StringUtils

class CustomStringToArrayConverter implements Converter<String, String[]> {
    @Override
    public String[] convert(String source) {
        return StringUtils.delimitedListToStringArray(source, ";");
    }
}
but I couldn't get this to work either.
For now, I've come up with a workaround: my controller method has an extra line to set the troublesome field explicitly with:
commandObj.foo = params.list('foo') as String[]
I'm still open to suggestions on how to configure Grails not to split on a comma...

How to clear SessionMap() in JSF without getting null in reference to this map

I use Tomcat 7.0 and JSF 2.1, and I have a problem when I call something like this in my .xhtml page: #{homePage.get("userName")}
I get: javax.el.ELException: Caused by: java.lang.NullPointerException
at mainPacket.HomePageBean.get(HomePageBean.java:35)
I have a managed bean like below:
@ManagedBean(name = "homePage")
@ViewScoped
public class HomePageBean {

    private Map<String, Object> map;

    public HomePageBean() {
        map = FacesContext.getCurrentInstance().getExternalContext().getSessionMap();
        // remove unnecessary values from the session map
        FacesContext.getCurrentInstance().getExternalContext().getSessionMap().clear();
    }

    public String get(Object s) {
        return map.get(s).toString();
    }
}
When I don't call clear(), everything works OK. But I want to clear the session map. How can I resolve this?
Thanks
Java is an object-oriented language. It doesn't give you a copy of the object every time you request it. No, it gives you a reference to the object instance in memory. At the moment you invoke Map#clear() on the session map, the map reference which you obtained just beforehand is emptied as well, because it points to exactly the same map instance which you just emptied!
Your concrete functional requirement is nowhere mentioned in the question, and the design in the code posted so far honestly makes no sense (I can't imagine any sensible real-world use case for this), so it's hard to propose the right solution. The best advice to give is to add a null check:
public String get(Object s) {
    Object value = map.get(s);
    return (value != null) ? value.toString() : null;
}
You should, by the way, be very careful with abruptly emptying the session map this way. JSF stores view-scoped and session-scoped managed beans in there, and it's also used by the flash scope.
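If the intent is to keep the values around after clearing, a defensive copy sidesteps the shared-reference problem explained above. A minimal sketch, assuming java.util.HashMap is imported:
public HomePageBean() {
    Map<String, Object> sessionMap =
            FacesContext.getCurrentInstance().getExternalContext().getSessionMap();
    map = new HashMap<>(sessionMap); // independent snapshot of the current entries
    sessionMap.clear(); // clearing the live session map no longer affects the snapshot
}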
