Kong: migrating plugin from 2.8.0 to 3.x - lua

I am migrating from Kong 2.8.0 to 3.0.0.
I have a few custom plugins which are giving me trouble while migrating.
Once I start the migration, I get this error:
[error] 1#0: init_by_lua error: /usr/local/share/lua/5.1/kong/init.lua:560: error loading plugin schemas: on plugin 'file-log-extended': [postgres] 2 schema violations (fields: expected an array; name: field required for entity check)
stack traceback:
[C]: in function 'assert'
/usr/local/share/lua/5.1/kong/init.lua:560: in function 'init'
init_by_lua:3: in main chunk
So the problem seems to be related to schema.lua:
local typedefs = require "kong.db.schema.typedefs"
local pl_utils = require "pl.utils"

return {
  fields = {
    path = { required = true, type = "string" },
    log_bodies = { type = "boolean", default = true }
  }
}
What I've done is change the schema to:
...
return {
  fields = {{
    config = {
      type = "record",
      fields = {
        path = { required = true, type = "string" },
        log_bodies = { type = "boolean", default = true }
      }
    }
  }}
}
But now when I start Kong I get the following error:
[error] 1#0: init_by_lua error: /usr/local/share/lua/5.1/kong/init.lua:543: error
loading plugin schemas: on plugin 'file-log-extended': failed converting legacy schema for file-log-extended: unknown legacy field attribute: "config"
stack traceback:
[C]: in function 'assert'
/usr/local/share/lua/5.1/kong/init.lua:543: in function 'init'
init_by_lua:3: in main chunk
Can someone help me understand why I can't migrate this plugin properly?
Thanks!

Actually, the format was incorrect: the new-style schema needs a top-level name, each entry in fields must be its own single-key table (so that fields is an array), and the plugin's options go under a config record.
This version is accepted:
local typedefs = require "kong.db.schema.typedefs"

return {
  name = "file-log-extended",
  fields = {
    {
      -- this plugin will only be applied to Services or Routes
      consumer = typedefs.no_consumer
    },
    {
      config = {
        type = "record",
        fields = {
          -- Describe your plugin's configuration's schema here.
          {
            path = {
              required = true,
              type = "string"
            }
          },
          {
            log_bodies = {
              type = "boolean",
              default = true
            }
          }
        }
      }
    }
  }
}
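For completeness, the handler still receives these values through its conf argument; a minimal Kong 3.x-style handler skeleton could look like the sketch below (the priority, version, and the body of the log phase are placeholders, not this plugin's actual implementation):
-- handler.lua (sketch): Kong 3.x handlers are plain tables, no BasePlugin
local FileLogExtendedHandler = {
  PRIORITY = 9,        -- placeholder priority
  VERSION = "1.0.0",   -- placeholder version
}

function FileLogExtendedHandler:log(conf)
  -- `conf` has been validated against the `config` record in schema.lua
  local path = conf.path
  local log_bodies = conf.log_bodies
  -- ... append the serialized log entry to `path`,
  -- including request/response bodies when `log_bodies` is true ...
end

return FileLogExtendedHandler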

Related

Column_limit for yapf and pylsp on neovim

I'm running Neovim 0.9 with a config I took from kickstart.nvim, so nvim-lspconfig, mason, plus other stuff.
I configured yapf based on how I understand the LSP docs and kickstart.nvim, yet it is not respecting the custom column_limit; it seems to be stuck at a 79-character line length, if yapf is actually the one doing the formatting.
Here is the Format command:
vim.api.nvim_buf_create_user_command(bufnr, 'Format', function(_)
  vim.lsp.buf.format()
end, { desc = 'Format current buffer with LSP' })
And the config for pylsp (autopep8 switched off as the docs say):
pylsp = {
  plugins = {
    autopep8 = {
      enabled = false
    },
    yapf = {
      enabled = true,
      args = '--style={based_on_style: google column_limit: 120}'
    },
    pylint = {
      enabled = true,
      maxLineLength = 120
    },
  }
}
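For context, that pylsp table is what gets handed to nvim-lspconfig as the server's settings; a trimmed sketch of the kickstart.nvim-style wiring is below (the settings key and the pylsp server name follow standard nvim-lspconfig usage; the rest of the setup is omitted):
-- Rough wiring only: the table above is passed as `settings` for pylsp
local lspconfig = require('lspconfig')

lspconfig.pylsp.setup {
  settings = {
    pylsp = {
      plugins = {
        autopep8 = { enabled = false },
        yapf = {
          enabled = true,
          args = '--style={based_on_style: google column_limit: 120}',
        },
        pylint = { enabled = true, maxLineLength = 120 },
      },
    },
  },
}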
I'm new to Lua; I know I'm missing something, but I can't figure out where, or find a good search hit on it.

Task :shared:linkDebugFrameworkIos FAILED

I'm trying to play with Kotlin Multiplatform and can't get it to compile for an iOS project.
My build.gradle.kts file:
plugins {
    kotlin("multiplatform")
    // kotlin("native.cocoapods") //version "1.5.10"
    id("co.touchlab.native.cocoapods")
    id("kotlinx-serialization")
}

kotlin {
    // ios()
    // Revert to just ios() when gradle plugin can properly resolve it
    val onPhone = System.getenv("SDK_NAME")?.startsWith("iphoneos") ?: false
    if (onPhone) {
        iosArm64("ios")
    } else {
        iosX64("ios")
    }
    version = "1.1"

    sourceSets { ... }

    cocoapodsext {
        summary = "Common library for the KaMP starter kit"
        homepage = "https://github.com/touchlab/KaMPKit"
        // isStatic = false
        framework {
            isStatic = false
            transitiveExport = true
        }
    }
}
I have tried using both the co.touchlab.native.cocoapods and native.cocoapods plugins, and I always get the same error whichever way I choose.
The error:
> Task :kotlin-api-client:compileKotlinIosX64 UP-TO-DATE
> Task :shared:generateIosMainKaMPKitDbInterface UP-TO-DATE
> Task :shared:compileKotlinIos UP-TO-DATE
> Task :shared:linkDebugFrameworkIos FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':shared:linkDebugFrameworkIos'.
> 'void org.jetbrains.kotlin.konan.target.Distribution.<init>(java.lang.String, boolean, java.lang.String, java.util.Map, int, kotlin.jvm.internal.DefaultConstructorMarker)'
What could be wrong? Any ideas, at least, on where to look?

Handlebars - own defined custom format throwing uncaught reference error "Could not find Intl object"

I'm facing an issue when defining my own custom format using handlebars.js and handlebars-intl.js.
According to Handlebars, you define the formats like this:
var intlData = {
    "locales": "de-CH",
    "formats": {
        "number": {
            "CHF": {
                "style": "currency",
                "currency": "CHF"
            }
        }
    }
};
and then use the Handlebars expression:
{{formatNumber Amount "CHF"}}
The console tells me that there is a reference error: "Could not find Intl object: formats.number.CHF".
And the formatNumber function throws an error saying "A number must be provided to {{formatNumber}}".
This is what the function looks like when it throws this exception:
function formatNumber(num, format, options) {
    assertIsNumber(num, 'A number must be provided to {{formatNumber}}');

    if (!options) {
        options = format;
        format = null;
    }

    var locales = options.data.intl && options.data.intl.locales;
    var formatOptions = getFormatOptions('number', format, options);

    return $$helpers$$getNumberFormat(locales, formatOptions).format(num);
}
num is filled with my sample data (100000), format with "CHF", and options contains all the data provided by the model.
When I write the format and style directly into the expression, it works properly, and I can't really see what I'm missing or doing wrong, since I'm sure I'm following the guidelines.
This is how you would define the format directly in the expression:
{{formatNumber price style="currency" currency="USD"}}
Any help is appreciated. Thanks in advance.
Update
This is my JS:
$(document).ready(function () {
    HandlebarsIntl.registerWith(Handlebars);
    renderTemplate(@Html.Raw(Json.Encode(Model)));
});

var templateSource = $('#deals-template').html();
var handlebarsTemplate = Handlebars.compile(templateSource);

var intlData = {
    "locales": "de-CH",
    "formats": {
        "number": {
            "CHF": {
                "style": "currency",
                "currency": "CHF"
            },
            "percentage": {
                "style": "percent"
            }
        }
    }
};

function renderTemplate(data) {
    $('#template').append(handlebarsTemplate(item, { data: { intlData } }));
}
I believe I'm passing the intlData variable correctly, so this can't be the issue.
Example from the handlebars-intl documentation:
var html = template(context, {
    data: { intl: intlData }
});
I've finally figured out what caused this exception.
As I've mentioned, the handlebars-intl example looks like this:
var html = template(context, { data: {intl: intlData} });
And this is what I passed:
$('#template').append(handlebarsTemplate(item, { data: { intlData }}));
I had not defined intl.
So this is the correct way:
$('#template').append(handlebarsTemplate(item, { data: { intl: intlData }}));
Maybe this answer will save someone a few hours of searching, but honestly it's pretty obvious.

Icinga2 check_mem plugin doesn't accept parameters

Hello,
I've created a custom command in Icinga 2 using this plugin:
https://github.com/justintime/nagios-plugins/blob/master/check_mem/check_mem.pl
check_command
object CheckCommand "memory" {
  import "plugin-check-command"

  command = [ PluginDir + "/check_mem" ]

  arguments = {
    "-w" = {
      required = true
      value = "$mem_warning$"
    }
    "-c" = {
      required = true
      value = "$mem_critical$"
    }
    "-u" = {
      required = true
      value = "$mem_used$"
    }
    "-C" = "$mem_cache$"
  }

  vars.mem_used = true
  vars.mem_cache = true
  vars.mem_warning = 85
  vars.mem_critical = 95
}
service
apply Service "Memory" {
  import "generic-service"

  check_command = "memory"

  assign where host.address
}
However, the plugin cannot check the memory and gives the following output in the Icinga Web 2 interface:
Plugin Output
*** You must define WARN and CRITICAL levels!
\ncheck_mem.pl v1.0 - Nagios Plugin\n\nusage:\n check_mem.pl -\ncheck_mem.pl comes with absolutely NO WARRANTY either implied or explicit\nThis program is licensed under the terms of the\nMIT License (check source code for details)
Could you please help? What is wrong with this check?
This works with your service
object CheckCommand "memory" {
  import "plugin-check-command"

  command = [ PluginDir + "/check_mem.pl" ]

  arguments = {
    "-w" = {
      value = "$mem_warning$"
    }
    "-c" = {
      value = "$mem_critical$"
    }
    "-u" = {
      set_if = "$mem_used$"
    }
    "-C" = {
      set_if = "$mem_cache$"
    }
  }

  vars.mem_warning = 85
  vars.mem_critical = 95
  vars.mem_used = true
  vars.mem_cache = true
}
Defined like this, your command will get its values from the service at runtime.
apply Service "Memory" {
  import "generic-service"

  check_command = "memory"

  vars.mem_used = true
  vars.mem_cache = true
  vars.mem_warning = 85
  vars.mem_critical = 95

  assign where host.address
}
These values will be substituted at execution time. If you are using NRPE, please update your question to say so; the answer may differ in that case, so refer to how arguments are passed from Icinga to NRPE.

Grails - ElasticSearch - QueryParsingException[[index] No query registered for [query]]; with elasticSearchHelper; JSON via curl works fine though

I have been working on a Grails project combined with ElasticSearch (v 20.6), with a custom build of the elasticsearch-grails-plugin (to support geo_point indexing: v 20.6).
I have been trying to do a filtered search while using script_fields (to calculate distance). Following are the Closure and the generated JSON from the GXContentBuilder:
Closure
records = Domain.search(searchType: 'dfs_query_and_fetch') {
    query {
        filtered = {
            query = {
                if (queryTxt) {
                    query_string(query: queryTxt)
                } else {
                    match_all {}
                }
            }
            filter = {
                geo_distance = {
                    distance = "${userDistance}km"
                    "location" {
                        lat = latlon[0] ?: 0.00
                        lon = latlon[1] ?: 0.00
                    }
                }
            }
        }
    }
    script_fields = {
        distance = {
            script = "doc['location'].arcDistanceInKm($latlon)"
        }
    }
    fields = ["_source"]
}
GXContentBuilder-generated query JSON:
{
    "query": {
        "filtered": {
            "query": {
                "match_all": {}
            },
            "filter": {
                "geo_distance": {
                    "distance": "5km",
                    "location": {
                        "lat": "37.752258",
                        "lon": "-121.949886"
                    }
                }
            }
        }
    },
    "script_fields": {
        "distance": {
            "script": "doc['location'].arcDistanceInKm(37.752258, -121.949886)"
        }
    },
    "fields": ["_source"]
}
The JSON query works perfectly fine the curl way. But when I try to execute it from Groovy code, I mean with this (taken from ElasticSearchService.groovy), where request is a SearchRequest instance:
elasticSearchHelper.withElasticSearch { Client client ->
    def response = client.search(request).actionGet()
}
It throws the following error:
Failed to execute phase [dfs], total failure; shardFailures {[1][index][3]: SearchParseException[[index][3]: from[0],size[60]: Parse Failure [Failed to parse source [{"from":0,"size":60,"query_binary":"eyJxdWVyeSI6eyJmaWx0ZXJlZCI6eyJxdWVyeSI6eyJtYXRjaF9hbGwiOnt9fSwiZmlsdGVyIjp7Imdlb19kaXN0YW5jZSI6eyJkaXN0YW5jZSI6IjVrbSIsImNvbXBhbnkuYWRkcmVzcy5sb2NhdGlvbiI6eyJsYXQiOiIzNy43NTIyNTgiLCJsb24iOiItMTIxLjk0OTg4NiJ9fX19fSwic2NyaXB0X2ZpZWxkcyI6eyJkaXN0YW5jZSI6eyJzY3JpcHQiOiJkb2NbJ2NvbXBhbnkuYWRkcmVzcy5sb2NhdGlvbiddLmFyY0Rpc3RhbmNlSW5LbSgzNy43NTIyNTgsIC0xMjEuOTQ5ODg2KSJ9fSwiZmllbGRzIjpbIl9zb3VyY2UiXX0=","explain":true}]]]; nested: QueryParsingException[[index] No query registered for [query]]; }
The above Closure works if I only use filtered = { ... } and script_fields = { ... }, but then it doesn't return the calculated distance.
Has anyone had a similar problem?
Thanks in advance :)
It's possible that I'm being dim and missing the obvious here :P
