How do I construct the query JSON so that, while filtering, it checks for the presence of external params?
{"query": {
"filtered": {
"query": {
"match_all": {}
}}},
"filter": {
"and": {
"filters": [
{
"term": {
"locality_name": params[:locality_name] if params[:locality_name].present?
}
}
]
}
}}:
The if clause inside the JSON is invalid syntax for the query DSL.
I think you can combine an exists filter and a term filter with an and filter like this.
This will retrieve documents for which the locality_name field exists and whose locality_name value is equal to your specified value.
"filter" : {
"and" : [
{
"exists" : { "field" : "locality_name" }
},
{
"term" : { "locality_name" : "your_locality_name" }
}
]
}
http://www.elasticsearch.org/guide/reference/query-dsl/exists-filter/
http://www.elasticsearch.org/guide/reference/query-dsl/and-filter/
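Since the conditional cannot live inside the query DSL itself, one option is to build the filter list in Ruby before serializing the body. A minimal sketch, assuming a Rails-style params hash (the surrounding search call is up to you):
filters = [{ exists: { field: "locality_name" } }]
if params[:locality_name].present?
  filters << { term: { locality_name: params[:locality_name] } }
end

body = {
  query:  { filtered: { query: { match_all: {} } } },
  filter: { and: filters }
}
# pass `body` to your Elasticsearch client's search call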
I have a Customer collection in MongoDB with a status field; several documents can share the same id fields.
I need to find the first value that changed to 'Guest' and push its ids into a pipeline named 'guests'.
Customers with status 'Member', whose ids equal the ids from the 'guests' aggregation pipeline, need to be pushed into another pipeline named 'members'.
This is done in order to obtain the number of elements in 'guests' and 'members'.
This is a member item:
{"_id"=>{"$oid"=>"5ce2ecb3ad71852e7fa9e73f"},
"status"=>"member",
"duration"=>nil,
"is_deleted"=>false,
"customer_id"=>"17601",
"customer_journal_item_id"=>"62769",
"customer_ids"=>"17601",
"customer_journal_item_ids"=>"62769",
"self_customer_status_id"=>"21078",
"self_customer_status_created_at"=>"2017-02-01T00:00:00.000Z",
"self_customer_status_updated_at"=>"2017-02-01T00:00:00.000Z",
"updated_at"=>"2019-05-20T18:06:43.655Z",
"created_at"=>"2019-05-20T18:06:43.655Z"}}
My aggregation
{
'$sort': {'self_customer_status_created_at': 1}
},
{'$match':
{
'self_customer_status_created_at':
{
"$gte": Time.parse('2017-01-17').beginning_of_month,
"$lte": Time.parse('2017-01-17').end_of_month
}
}
},
{
"$facet": {
"guests":
[
{
"$group": {
"_id": "$_id",
"data": {
'$first': '$$ROOT'
}
}
},
{
"$match": {
"data.status": "guest"
}
}, {
"$group": {
"_id":nil,
"array":{
"$push": "$data.self_customer_status_id"
}
}
},
{
"$project":{
"array": 1,
"_id":0
}
}
], "members":
[
{
"$group": {
"_id": "$_id", "data": {
'$last': '$$ROOT'
}
}
},
{
"$match": {
"data.status": "member",
"data.self_customer_status_id": {
"$in": [
"$guests.array"
]
}
}
}
]
}
}, {
"$project":
{
"members": 1,
"guests.array": 1
}
}
]
).as_json
Instead "guests.array" array? I have error:
Mongo::Error::OperationFailure: $in needs an array (2)
What am I doing wrong?
Sorry for my English!
The second pipeline inside the $facet does not see the results of the first one.
You need to delete this fragment from the members $match:
,
"data.self_customer_status_id": {
    "$in": {
        "$arrayElemAt": [
            "$guests.array",
            0
        ]
    }
}
and paste this line before the $project:
{"$match": {"data.self_customer_status_id": { "$in": ["guests.array"] } } }
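The underlying point is that each $facet sub-pipeline receives the same input documents and cannot reference the other facet's output; whatever a facet produces only becomes visible as a field of the single result document, in stages placed after the $facet. A minimal sketch of that shape (hypothetical collection handle, counts only):
pipeline = [
  { "$facet" => {
      "guests"  => [ { "$match" => { "status" => "guest" } },  { "$count" => "n" } ],
      "members" => [ { "$match" => { "status" => "member" } }, { "$count" => "n" } ]
  } },
  # "guests" and "members" exist as array fields only from here on
  { "$project" => { "guests" => 1, "members" => 1 } }
]
# e.g. collection.aggregate(pipeline).to_a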
I've got a question I can't seem to resolve on my own.
Along with the basic Query, Mutation, and similar types, I've made the following type definition:
module Types
UserType = GraphQL::ObjectType.define do
name 'User'
description 'A user'
implements GraphQL::Relay::Node.interface
global_id_field :id
field :email, !types.String, 'Email address'
connection :docs, DocType.connection_type, 'Available docs'
end
end
And I then try to query it with:
query FileListQuery(
$after: String
$first: Int
) {
viewer {
currentUser {
docs(first: $first, after: $after) {
edges {
node {
id
name
__typename
}
cursor
}
pageInfo {
endCursor
hasNextPage
hasPreviousPage
startCursor
}
}
id
}
id
}
}
And I pass the following as query variables:
{
"first": 1,
"after": null
}
The problem is it bails out with the following:
{
"errors": [
{
"message": "Int isn't a defined input type (on $first)",
"locations": [
{
"line": 3,
"column": 3
}
],
"fields": [
"query FileListQuery"
]
}
]
}
I honestly have no clue why it complains about the Int type…
If I get rid of the problematic $first query variable in the request, it works fine.
This:
query FileListQuery(
$after: String
) {
viewer {
currentUser {
docs(first: 10, after: $after) {
edges {
node {
id
name
__typename
}
cursor
}
pageInfo {
endCursor
hasNextPage
hasPreviousPage
startCursor
}
}
id
}
id
}
}
Produces this:
{
"data": {
"viewer": {
"currentUser": {
"docs": {
"edges": [
{
"node": {
"id": "1",
"name": "First Doc",
"__typename": "Doc"
},
"cursor": "MQ=="
}
],
"pageInfo": {
"endCursor": "MQ==",
"hasNextPage": false,
"hasPreviousPage": false,
"startCursor": "MQ=="
}
},
"id": "1"
},
"id": "VIEWER"
}
}
}
Any hints, ideas on how to fix this? I use the graphql gem v1.6.3.
Currently, there seems to be a bug in graphql-ruby that prevents types not explicitly used in a schema from being propagated. Check out this issue on GitHub: https://github.com/rmosolgo/graphql-ruby/issues/788#issuecomment-308996229
To fix the error one has to include an Int field somewhere in the schema. It turns out I didn't have one. Yikes.
This fixed it for me:
# Make sure Int is included in the schema:
field :testInt, types.Int
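For instance, the dummy field can live on any object type that is already part of the schema; a sketch assuming it is placed on the UserType from above (the :testInt name is arbitrary):
module Types
  UserType = GraphQL::ObjectType.define do
    name 'User'
    description 'A user'
    implements GraphQL::Relay::Node.interface
    global_id_field :id
    field :email, !types.String, 'Email address'
    # Dummy field whose only purpose is to get Int into the schema's type map
    field :testInt, types.Int
    connection :docs, DocType.connection_type, 'Available docs'
  end
end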
I'm trying to do a search where I look for "test" in any field while filtering for a specific client in the client_id field. I can't seem to figure this one out. This is how far I got (but it's not working):
{
query: {
filtered: {
query: "test",
filter: {
term: {client_id: #client.id}
}
}
}
}
This is the right syntax:
{
"query": {
"filtered": {
"query": {
"match": {
"_all": "test"
}
},
"filter": {
"term": {
"client_id": #client.id
}
}
}
}
}
From the ES docs: "The _all field allows you to search for values in documents without knowing which field contains the value."
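If you are building this in Ruby, the same body can be written as a hash and passed to the client; a minimal sketch, assuming an @client object with an id and Elasticsearch 1.x (where the filtered query exists):
body = {
  query: {
    filtered: {
      query:  { match: { _all: "test" } },
      filter: { term:  { client_id: @client.id } }
    }
  }
}
# e.g. Elasticsearch::Client.new.search(index: 'my_index', body: body)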
Below is JSON I translated from a Ruby hash with hash.to_json, for ease of representation in this question. Notice how the range key is repeated because the values in the nested docs are different. How do I merge the ranges so that, for the weight key, both "gt": 2232 and "lt": 4444 fall under one weight key inside range? Is there some union or collapse method in Ruby to sort of "compactify" hashes?
{
"must": [
{
"match": {
"status_type": "good"
}
},
{
"range": {
"created_date": {
"lte": 43252
}
}
},
{
"range": {
"created_date": {
"gt": "42323"
}
}
},
{
"range": {
"created_date": {
"gte": 523432
}
}
},
{
"range": {
"weight": {
"gt": 2232
}
}
},
{
"range": {
"weight": {
"lt": 4444
}
}
}
],
"should": [
{
"match": {
"product_age": "old"
}
}
]
}
Want to change the above to this:
{
"must": [
{
"range": {
"created_date": {
"gte": 523432,
"gt": "42323"
}
}
},
{
"range": {
"weight": {
"gt": 2232,
"lt": 4444
}
}
}
],
"should": [
{
"match": {
"product_age": "old"
}
}
]
}
I don't know of a built-in way to handle something like this, but you could write a method along these lines:
def collapse(array, key)
  # Pull out the hashes that have the given key (e.g. :range)
  to_collapse = array.select { |elem| elem.has_key? key }
  uncollapsed = array - to_collapse
  # Get the inner hashes that the key points to, e.g. {"weight" => {"gt" => 2232}}
  to_collapse = to_collapse.flat_map(&:values)
  collapsed = {}
  # Merge the sub-hashes field by field, so every condition for the same
  # field ("created_date", "weight", ...) ends up under a single entry
  to_collapse.each do |elem|
    elem.each do |k, v|
      collapsed[k] ||= {}
      collapsed[k].merge!(v)
    end
  end
  # Re-wrap each merged field under the original key
  uncollapsed + collapsed.map { |k, v| { key => { k => v } } }
end
hash[:must] = collapse(hash[:must], :range)
Note that this is a specific solution that's mainly applicable to the presented problem. It only works for the hash/array depths specified here. You could probably write a recursive solution that could potentially work at any level of depth with a bit more work.
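For instance, a small self-contained check, assuming string keys as in the JSON shown above:
must = [
  { "match" => { "status_type" => "good" } },
  { "range" => { "weight" => { "gt" => 2232 } } },
  { "range" => { "weight" => { "lt" => 4444 } } }
]
collapse(must, "range")
# => [{"match"=>{"status_type"=>"good"}},
#     {"range"=>{"weight"=>{"gt"=>2232, "lt"=>4444}}}]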
I am new to Elasticsearch. I have a filtered query which gives me correct results when run from the console:
GET _search
{
"query": {
"filtered": {
"query": {
"bool" : {
"should" : [
{
"match" : { "name" : "necklace" }
},
{
"match" : { "skuCode" : "necklace" }
}
]
}
},
"filter": {
"bool" : {
"must" : [
{
"term" : { "enabled" : true }
},
{
"term" : { "type" : "SIMPLE" }
},
{
"term" : { "tenantCode" : "Triveni" }
}
]
}
}
}
}
}
I am unable to get the corresponding spring-data version going. Here is what I tried:
SearchQuery searchQuery = new NativeSearchQueryBuilder()
    .withQuery(boolQuery().should(matchQuery("skuCode", keyword)).should(matchQuery("name", keyword)))
    .withFilter(boolFilter().must(termFilter("enabled", true), termFilter("type", "SIMPLE"), termFilter("tenantCode", "Triveni")))
    .build();
This query gives me no results.
Can somebody please help me with this?
NativeSearchQueryBuilder.withFilter is converted to a so-called post_filter. See Post Filter for more details. So the query you ran on the console differs from the one that is generated by spring-data elasticsearch. To mimic the query from the console you have to use a filtered query instead.
Change your query building to this:
QueryBuilder boolQueryBuilder = boolQuery().should(matchQuery("skuCode", keyword)).should(matchQuery("name", keyword));
FilterBuilder filterBuilder = boolFilter().must(termFilter("enabled", true), termFilter("type", "SIMPLE"), termFilter("tenantCode", "Triveni"));
SearchQuery searchQuery = new NativeSearchQueryBuilder().withQuery(QueryBuilders.filteredQuery(boolQueryBuilder, filterBuilder)).build();
Although as long as you do not use aggregations, this should not affect the (hits) results.