How to return a response object based on OR criteria in Open Policy Agent (OPA)? Getting the error "complete rules must not produce multiple outputs" - open-policy-agent

I am trying to return a response object from an OPA policy based on the conditions that triggered the rule, but I get the error "Error Evaluating
policy.rego:11: eval_conflict_error: complete rules must not produce multiple outputs" because both OR conditions evaluate to true.
For example, here is a sample OPA policy:
package play
allow = response {
    mainRule
    OptionalRule
    response := {
        "field": OptionalRule.field
    }
}

OptionalRule = resp {
    input.a == "01"
    input.b == "C"
    resp := {
        "field": "OptionalRule1"
    }
}

OptionalRule = resp {
    input.c != 3
    resp := {
        "field": "OptionalRule2"
    }
}

mainRule {
    input.d > 50
    input.e < 5
}
With following Input:
{
"a": "01",
"b": "C",
"c": 4,
"d": 55,
"e": 1
}
Here, I am trying to implement OptionalRule as an OR condition and to return which OptionalRule condition triggered it, but it gives the above error. Any idea how this can be implemented?

You can leverage rules with incremental definitions (partial set rules) to implement your policy. For example:
package authz
allow = response {
    mainRule
    response := {
        "field": OptionalRule
    }
}

OptionalRule[resp] {
    input.a == "01"
    input.b == "C"
    resp := "OptionalRule1"
}

OptionalRule[resp] {
    input.c != 3
    resp := "OptionalRule2"
}

mainRule {
    input.d > 50
    input.e < 5
}
Now with an input like { "a": "01", "b": "C", "c": 4, "d": 55, "e": 1 }, the allow rule would return
{
"field": [
"OptionalRule2",
"OptionalRule1"
]
}
Similarly for the input { "a": "01", "b": "C", "c": 3, "d": 55, "e": 1 }, the allow rule would return
{
"field": [
"OptionalRule1"
]
}

Related

Merge object and add keys from JSON format using Ruby on Rails

I have an array of queried records that need to be merged by location_id.
Here is the code that generates the array:
filled = Product.on_hand_location(pid).to_a
empty = Product.on_hand_location_empty_cylinder(pid).to_a
data = filled + empty
result = data.map { |k|
  {
    details: {
      'location_id' => k['location_id'],
      'location_name' => k['location_name'],
      'onhandcylynder' => k['onhand'].nil? ? 0 : k['onhand'],
      'emptycylynder' => k['emptyonhand'].nil? ? 0 : k['emptyonhand']
    }
  }
}
respond_with [onhand: result]
The JSON below is the output of the code above; it contains duplicate location_id entries that need to be merged.
[{
"onhand": [{
"details": {
"location_id": 1,
"location_name": "Capitol Drive",
"onhandcylynder": "4.0",
"emptycylynder": 0
}
},
{
"details": {
"location_id": 2,
"location_name": "SM City Butuan",
"onhandcylynder": "5.0",
"emptycylynder": 0
}
},
{
"details": {
"location_id": 1,
"location_name": null,
"onhandcylynder": 0,
"emptycylynder": "2.0"
}
}
]
}]
My desired output
[{
"onhand": [{
"details": {
"location_id": 1,
"location_name": "Capitol Drive",
"onhandcylynder": "4.0",
"emptycylynder": 0
}
},
{
"details": {
"location_id": 2,
"location_name": "SM City Butuan",
"onhandcylynder": "5.0",
"emptycylynder": "2.0"
}
}
]
}]
I think that instead of data = filled + empty you should try:
data = Product.on_hand_location(pid).to_a
empty = Product.on_hand_location_empty_cylinder(pid).to_a
empty.each do |location|
  data.push(location) if data.none? { |item| item['location_id'] == location['location_id'] }
end
result = ...
or
hash = {}
Product.on_hand_location_empty_cylinder(pid).map { |l| hash[l['location_id']] = l }
Product.on_hand_location(pid).map { |l| hash[l['location_id']] = l }
data = hash.values
and if you need data from both of the queries, you should try
hash = {}
Product.on_hand_location_empty_cylinder(pid).map { |l| hash[l['location_id']] = l }
Product.on_hand_location(pid).map { |l| hash[l['location_id']].merge!(l) }
data = hash.values
to merge in place
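The hash-based merge above can be sketched end to end with plain hashes (no Rails models needed); the filled/empty arrays and key names below are hypothetical stand-ins mirroring the question's data:

```ruby
# Stand-ins for the two query results; plain hashes instead of Product records.
filled = [
  { 'location_id' => 1, 'location_name' => 'Capitol Drive', 'onhand' => '4.0' },
  { 'location_id' => 2, 'location_name' => 'SM City Butuan', 'onhand' => '5.0' }
]
empty = [
  { 'location_id' => 1, 'emptyonhand' => '2.0' }
]

# Index the first result set by location_id, then merge the second set
# into it in place; rows sharing a location_id collapse into one hash.
hash = {}
empty.each { |l| hash[l['location_id']] = l.dup }
filled.each { |l| (hash[l['location_id']] ||= {}).merge!(l) }
data = hash.values
```

After this, data holds exactly one hash per location_id, with keys from both queries combined.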
I refactored my code and was able to get my desired result:
filled = Product.on_hand_location(pid).to_a
emptyTank = Product.on_hand_location_empty_cylinder(pid).to_a
data = filled + emptyTank
cylinder = data.map(&:dup)
               .group_by { |e| e.delete('location_id') }
finalResult = cylinder.map { |_, records|
  records.reduce({}) { |merged, record| merged.merge(record) }
}
respond_with [onhand: finalResult]
The results are now merged by location_id, and the emptyonhand keys are merged into their corresponding locations:
[
{
"onhand": [
{
"onhand": "1.0",
"location_name": "Capitol Drive",
"emptyonhand": "5.0"
},
{
"onhand": "5.0",
"location_name": "SM City Butuan"
}
]
}
]

How to find ids in an array created by the $facet operator

I have a Customer collection in MongoDB with a status field; several documents can share the same id fields.
I need to find the first value changed to 'guest' and push its ids into a pipeline named 'guests'.
Then I need to push customers with status 'member', whose ids equal ids from the 'guests' pipeline, into another pipeline named 'members'.
This is done in order to obtain the number of elements in 'guests' and 'members'.
Here is a member item:
{"_id"=>{"$oid"=>"5ce2ecb3ad71852e7fa9e73f"},
"status"=>"member",
"duration"=>nil,
"is_deleted"=>false,
"customer_id"=>"17601",
"customer_journal_item_id"=>"62769",
"customer_ids"=>"17601",
"customer_journal_item_ids"=>"62769",
"self_customer_status_id"=>"21078",
"self_customer_status_created_at"=>"2017-02-01T00:00:00.000Z",
"self_customer_status_updated_at"=>"2017-02-01T00:00:00.000Z",
"updated_at"=>"2019-05-20T18:06:43.655Z",
"created_at"=>"2019-05-20T18:06:43.655Z"}
My aggregation (the pipeline is passed to the driver's aggregate call):
Customer.collection.aggregate([
{
'$sort': {'self_customer_status_created_at': 1}
},
{'$match':
{
'self_customer_status_created_at':
{
"$gte": Time.parse('2017-01-17').beginning_of_month,
"$lte": Time.parse('2017-01-17').end_of_month
}
}
},
{
"$facet": {
"guests":
[
{
"$group": {
"_id": "$_id",
"data": {
'$first': '$$ROOT'
}
}
},
{
"$match": {
"data.status": "guest"
}
}, {
"$group": {
"_id":nil,
"array":{
"$push": "$data.self_customer_status_id"
}
}
},
{
"$project":{
"array": 1,
"_id":0
}
}
], "members":
[
{
"$group": {
"_id": "$_id", "data": {
'$last': '$$ROOT'
}
}
},
{
"$match": {
"data.status": "member",
"data.self_customer_status_id": {
"$in": [
"$guests.array"
]
}
}
}
]
}
}, {
"$project":
{
"members": 1,
"guests.array": 1
}
}
]
).as_json
Isn't "$guests.array" an array? I get the error:
Mongo::Error::OperationFailure: $in needs an array (2)
What am I doing wrong?
Sorry for my English!
The second expression in the $facet does not see the results of the first expression; facet sub-pipelines run independently over the same input. You need to delete this from the members $match inside the $facet:

,
"data.self_customer_status_id": {
"$in": [
"$guests.array"
]
}

and instead paste a $match stage like this before the $project, using $arrayElemAt to unwrap the facet output:

{ "$match": { "data.self_customer_status_id": { "$in": { "$arrayElemAt": [ "$guests.array", 0 ] } } } }

How to check if a particular JSON key has the same value across all elements

For example, I have a JSON response of array type where one key has the same value in each element. The length of the array is dynamic, but every time, if that particular key's value is the same in all elements, I need to hide a label.
"loadable": [
    {
        "position": {
            "positionType": "XXX",
            "thirdKey": 1,
            "fourthKey": 1
        }
    },
    {
        "position": {
            "positionType": "XXX",
            "thirdKey": 1,
            "fourthKey": 1
        }
    },
    {
        "position": {
            "positionType": "XXX",
            "thirdKey": 1,
            "fourthKey": 1
        }
    },
    {
        "position": {
            "positionType": "XXX",
            "thirdKey": 1,
            "fourthKey": 1
        }
    }
]
Here I want to check whether all the values of the key positionType equal "XXX"; if so, I need to hide a label. Please provide the answer in Swift.
If you want to check whether each value in the array is equal, you can do it like this.
let searchValue = "XXX"
if loadables.firstIndex(where: { $0.position?.positionType != searchValue }) != nil {
    // positionType for at least one object is not equal to searchValue
    footer.labelTitle.isHidden = false
} else {
    // positionType for all objects is equal to searchValue
    footer.labelTitle.isHidden = true
}
This is less of a JSON problem and more a question of "Do all elements in my array have the same value?", where the value is the key you are looking for.
func checkValues(array: [Loadable]) -> Bool {
    guard let myValue = array.first?.position?.positionType else {
        return false
    }
    for element in array {
        if element.position?.positionType != myValue {
            return false
        }
    }
    return true
}

Simulating a join in ElasticSearch

Assume the documents in an ES index have two fields, user_id and action_id. How do I count users for which there are documents both with action_id = 1 and action_id = 2?
Equivalent SQL would be
SELECT COUNT(DISTINCT `a`.`uuid`)
FROM `action` AS `a`
JOIN `action` AS `b` ON `a`.`user_id` = `b`.`user_id`
WHERE `a`.`action_id` = 1
AND `b`.`action_id` = 2
The only way I have found so far: request all unique user_ids for each of these action_ids and intersect the resulting sets on the ES client. But this approach transfers megabytes of data from ES, so I am looking for an alternative.
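For reference, the two-request intersection described in the question can be sketched with plain Ruby sets; the id lists below are hypothetical stand-ins for the two ES responses:

```ruby
require 'set'

# Hypothetical unique user_id lists, as would be returned by two separate
# terms aggregations: one filtered on action_id = 1, one on action_id = 2.
users_with_action_1 = Set.new([101, 102, 103, 105])
users_with_action_2 = Set.new([102, 105, 109])

# Users present in both sets performed both actions; the size of the
# intersection matches the COUNT(DISTINCT ...) of the SQL self-join above.
both_actions = users_with_action_1 & users_with_action_2
count = both_actions.size
```

This is exactly the part that becomes expensive at scale, which is what the server-side aggregation below avoids.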
You can do it like this:
first, you have a query that filters your documents to actions 1 and 2 only (I have no idea whether other action types exist)
then the magic is in the aggregations:
the first aggregation is a terms one on user_id, so that you can do individual calculations per user
then you use a cardinality sub-aggregation to count the number of distinct actions per user; since the query is for actions 1 and 2, that number can only be 1 or 2
then you use a bucket_selector sub-aggregation to keep only those users whose cardinality result is 2
{
"size": 0,
"query": {
"bool": {
"should": [
{
"terms": {
"action_id": [
1,
2
]
}
}
]
}
},
"aggs": {
"users": {
"terms": {
"field": "user_id",
"size": 10
},
"aggs": {
"actions": {
"cardinality": {
"field": "action_id"
}
},
"actions_count_bucket_filter": {
"bucket_selector": {
"buckets_path": {
"totalActions": "actions"
},
"script": "totalActions >= 2"
}
}
}
}
}
}
The result will look like this:
"aggregations": {
"users": {
"doc_count_error_upper_bound": 0,
"sum_other_doc_count": 0,
"buckets": [
{
"key": 1,
"doc_count": 2,
"actions": {
"value": 2
}
},
{
"key": 5,
"doc_count": 2,
"actions": {
"value": 2
}
}
]
}
}
The keys are the user_ids who performed both actions 1 and 2. The bucket_selector aggregation is available in ES 2.x and later.

Fuseki charset issue

There is a strange problem: I insert data encoded as UTF-8:
INSERT DATA { <http://onto.pro/Test> <http://www.w3.org/2000/01/rdf-schema#label> "Проверка" }
then retrieve it:
SELECT * WHERE { <http://onto.pro/Test> <http://www.w3.org/2000/01/rdf-schema#label> ?a }
and the encoding is broken:
{ "head": { "vars": [ "a" ] } , "results": { "bindings": [ { "a": { "type": "literal" , "value": "????????" } } ] } }
As can be seen, "????????" is returned instead of "Проверка". The problem occurs suddenly and disappears after restarting Fuseki. Where should I dig?
