Override JSON file values with environment variables (Docker)

Assume that I have a complex JSON file that is used to configure my project, like the one below:
{
  "apis": {
    "payment": {
      "base_url": "https://example.com/"
    },
    "order": {
      "base_url": "https://example.com/"
    }
  },
  "features": {
    "authentication": {
      "authProviders": true,
      "registration": false
    }
  },
  "availableLocales": [
    "en",
    "es"
  ]
}
.NET has a feature that lets configuration values be overridden by environment variables: to override apis.payment.base_url I could pass an environment variable named APIS__PAYMENT__BASE_URL and the value would be replaced.
Since I'm currently not using .NET, are there any alternatives?
This is what I'm using right now, but it does not fit my needs:
FROM code as prepare-build
ENV JQ_VERSION=1.6
RUN wget --no-check-certificate \
https://github.com/stedolan/jq/releases/download/jq-${JQ_VERSION}/jq-linux64 \
-O /tmp/jq-linux64
RUN cp /tmp/jq-linux64 /usr/bin/jq
RUN chmod +x /usr/bin/jq
WORKDIR /code/public
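# Rewrite every scalar in settings.json into a "$"-prefixed placeholder built from its path (e.g. "$apis_payment_base_url") so envsubst can fill it in later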
RUN jq 'reduce path(recurse | scalars) as $p (.;setpath($p; "$" + ($p | join("_"))))' \
./configurations/settings.json > ./configurations/settings.temp.json && \
yes | cp ./configurations/settings.temp.json ./configurations/settings.json
WORKDIR /code/deploy
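# Assemble start.sh: at container start, run envsubst over every file in $CONFIGURATIONS_FOLDER and then launch nginx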
RUN echo "#!/usr/bin/env sh" | tee -a /code/deploy/start.sh > /dev/null && \
echo 'export EXISTING_VARS=$(printenv | awk -F= '\''{print $1}'\'' | sed '\''s/^/\$/g'\'' | paste -sd,);' | tee -a /code/deploy/start.sh > /dev/null && \
echo 'for file in $CONFIGURATIONS_FOLDER;' | tee -a /code/deploy/start.sh > /dev/null && \
echo 'do' | tee -a /code/deploy/start.sh > /dev/null && \
echo ' cat $file | envsubst $EXISTING_VARS | tee $file' | tee -a /code/deploy/start.sh > /dev/null && \
echo 'done' | tee -a /code/deploy/start.sh > /dev/null && \
echo 'nginx -g '\''daemon off;'\''' | tee -a /code/deploy/start.sh > /dev/null
WORKDIR /code
This way I have a problem: I need to pass every JSON path as an environment variable for the override to work correctly. If a variable is missing, the value is simply left as its placeholder path (e.g. base_url stays as the literal string "$apis_payment_base_url").
I think the best approach would be to read the environment variables, create a JSON file from their values, and then override the existing JSON file with the values of the generated one (see the sketch after the table below).
Does anyone have anything that could help me achieve this?
To summarize: to make it easy to identify which environment variables should be picked up, let's assume they all have the prefix SETTINGS.
Example of how I would override values:

JSON path                Equivalent environment variable
APIS.PAYMENT.BASE_URL    SETTINGS__APIS__PAYMENT__BASE_URL
AVAILABLELOCALES[0]      SETTINGS__AVAILABLELOCALES__0
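A minimal sketch of that idea, assuming jq is available in the image (my own untested attempt; settings.tmp.json is just a scratch name, and this version lowercases every path segment, so it cannot handle camelCase keys such as availableLocales or array indices):

overrides=$(jq -n '
  [ $ENV
    | to_entries[]
    | select(.key | startswith("SETTINGS__"))            # keep only the prefixed variables
    | .key |= ((. / "__")[1:] | map(ascii_downcase))     # SETTINGS__APIS__PAYMENT__BASE_URL -> ["apis","payment","base_url"]
  ]
  | reduce .[] as $e ({}; setpath($e.key; $e.value))     # turn the variables into a nested JSON fragment
')
jq --argjson overrides "$overrides" '. * $overrides' settings.json \
  > settings.tmp.json && mv settings.tmp.json settings.json   # deep-merge the fragment over the existing file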

The task can be solved using jq. The version below is robust against settings that do not match any path in the document.
Variables
SETTINGS__APIS__PAYMENT__BASE_URL=https://example2.com
SETTINGS__AVAILABLELOCALES__0=cs
SETTINGS__UNAVAILABLE__PATH=1
Code
jq 'def settings:
      def prepareVariables:
        [$ENV | to_entries[] | select(.key | startswith("SETTINGS__"))]   # select all variables that start with "SETTINGS__"
        | map(.key |= (. / "__" | map(tonumber? // .))[1:]);              # convert variable names to path arrays
      [paths(scalars) | [., map(ascii_upcase? // .)]] |                   # collect all leaf paths from the input file and add an uppercased copy
      reduce .[] as $leafPath                                             # add leaf paths to the corresponding settings
        (prepareVariables; map(select($leafPath[1] == .key) |= . + {path: $leafPath[0]})) |
      map(select(has("path")));                                           # drop settings for unknown paths
    . as $input |
    reduce settings[] as $setting                                         # apply the new settings from the variables to the input file
      ($input; . | setpath($setting["path"]; $setting["value"]))
' input.json
Output
{
  "apis": {
    "payment": {
      "base_url": "https://example2.com"
    },
    "order": {
      "base_url": "https://example.com/"
    }
  },
  "features": {
    "authentication": {
      "authProviders": true,
      "registration": false
    }
  },
  "availableLocales": [
    "cs",
    "es"
  ]
}
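To wire this into a container, the filter above could be saved to a file (say apply_settings.jq, a name I'm making up) and applied once in the entrypoint, along the lines of:

jq -f /code/apply_settings.jq settings.json > settings.tmp.json && \
  mv settings.tmp.json settings.json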

I'm a jq novice, and I'd be very interested in a better jq script, but here's one way to use environment variables to modify a settings.json file.
$ cat settings.json
{
  "apis": {
    "payment": {
      "base_url": "https://example.com/"
    },
    "order": {
      "base_url": "https://example.com/"
    }
  },
  "features": {
    "authentication": {
      "authProviders": true,
      "registration": false
    }
  },
  "availableLocales": [
    "en",
    "es"
  ]
}
$ printenv|grep SETTINGS__
SETTINGS__APIS__PAYMENT__BASE_URL=https://example2.com
SETTINGS__AVAILABLELOCALES__0=cs
$ jq -n '
  inputs as $i
  | [ $i
      | ..
      | keys_unsorted?
      | .[]
      | strings
    ]
  | unique as $allKeys
  |
  def fixCase:
    . as $w
    | reduce ($allKeys[] | select(length == ($w|length))) as $k
        (""; . + $k | match($w;"i").string)
    ;
  def envpaths:
    [
      $ENV
      | to_entries[]
      | select(.key | startswith("SETTINGS__"))
      | [ [ (.key|split("__"))[1:][]
            | if test("^[0-9]+$") then tonumber else fixCase end
          ],
          .value
        ]
    ]
    ;
  reduce envpaths[] as $p ($i; . | setpath($p[0];$p[1]))' settings.json
# the output
{
  "apis": {
    "payment": {
      "base_url": "https://example2.com"
    },
    "order": {
      "base_url": "https://example.com/"
    }
  },
  "features": {
    "authentication": {
      "authProviders": true,
      "registration": false
    }
  },
  "availableLocales": [
    "cs",
    "es"
  ]
}
See it work on jqplay.org.
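One caveat that applies to both answers (my own observation, not from either post): environment variable values are always strings, so a boolean or numeric setting arrives as "false" or "8080" rather than false or 8080. If that matters, the value can be coerced before setpath by adding a step like:

.value |= (tonumber? // (if . == "true" then true elif . == "false" then false else . end))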

Related

special character conversion in bash output

Hi :P I'm a beginner here and I was doing a small project in bash to automatically create prewritten text files, when at the last moment I realized that characters like " (for example, "format_version" comes out as format_version) were deleted when writing the output to my file. Please help me ('-')
python ID_1.py
python ID_2.py
echo "bash(...)"
ID1=$(cat link1.txt)
ID2=$(cat link2.txt)
echo "Name of your pack=$NP"
read NP
echo "This description=$DC"
read DC
cd storage/downloads
echo "{
("format_version"): 1,
"header": {
"name": "$NP",
"description": "$DC",
"uuid": "$ID1",
"version": [0, 0, 1]
},
"modules": [
{
"type": "resources",
"uuid": "$ID2",
"version": [0, 0, 1]
}
]
}" > manisfest.json

how to make scriptText in xml human readable

use case:
adding a pre-action is readable through Xcode,
but gibberish in the XML
question:
is there a way to make scriptText human readable in the XML?
example:
code from Pre-actions
function decode() { echo "${*}" | base64 --decode; }
KEY="FLAVOR"
FILE=${SRCROOT}/Flutter/DartDefineFlavor.xcconfig
test -f $FILE || touch $FILE
IFS=',' read -r -a ENCODED_ITEMS <<< "$DART_DEFINES"
for ENCODED_ITEM in ${ENCODED_ITEMS[@]}
do
DECODED_ITEM=$(decode "$ENCODED_ITEM")
IFS=' ' read -r K V <<< ${DECODED_ITEM//[=]/ };
if [ $K = $KEY ]; then
echo "DART_DEFINE_BUNDLE_SUFFIX=.$V" >> $FILE
break
fi
done
expectation:
<ActionContent
title = "Load Dart Define"
scriptText =
"
function decode() { echo "${*}" | base64 --decode; }
KEY="FLAVOR"
FILE=${SRCROOT}/Flutter/DartDefineFlavor.xcconfig
test -f $FILE || touch $FILE
IFS=',' read -r -a ENCODED_ITEMS <<< "$DART_DEFINES"
for ENCODED_ITEM in ${ENCODED_ITEMS[@]}
do
DECODED_ITEM=$(decode "$ENCODED_ITEM")
IFS=' ' read -r K V <<< ${DECODED_ITEM//[=]/ };
if [ $K = $KEY ]; then
echo "DART_DEFINE_BUNDLE_SUFFIX=.$V" >> $FILE
break
fi
done
">
<EnvironmentBuildable>
reality:
<ActionContent
title = "Load Dart Define"
scriptText = "#!/bin/bash
function decode() { echo "${*}" | base64 --decode; }
KEY=FLAVOR
FILE=${SRCROOT}/Flutter/DartDefineFlavor.xcconfig
test -f $FILE || touch $FILE
IFS=&apos;,&apos; read -r -a ENCODED_ITEMS <<< "$DART_DEFINES"
for ENCODED_ITEM in ${ENCODED_ITEMS[@]}
do
DECODED_ITEM=$(decode "$ENCODED_ITEM")
IFS=&apos;=&apos; read -r K V <<< $DECODED_ITEM;
if [ $K = $KEY ]; then
NAME_SUFFIX="DART_DEFINE_NAME_SUFFIX= $V"
BUNDLE_SUFFIX="DART_DEFINE_BUNDLE_SUFFIX=.$V"
echo $NAME_SUFFIX >> $FILE
echo $BUNDLE_SUFFIX >> $FILE
break
fi
done
">
<EnvironmentBuildable>
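One workaround I'd suggest (not from the original post, untested with this scheme): keep the pre-action body in a script committed with the project and let scriptText contain only the call to it, so the .xcscheme XML stays a single readable line regardless of how quotes are escaped. A sketch, assuming a hypothetical path Flutter/load_dart_define.sh:

#!/bin/bash
# Flutter/load_dart_define.sh -- holds the decode loop from the pre-action above.
# The scheme's scriptText is then reduced to the one-line call:
#   "${SRCROOT}/Flutter/load_dart_define.sh"
decode() { echo "${*}" | base64 --decode; }
KEY="FLAVOR"
FILE="${SRCROOT}/Flutter/DartDefineFlavor.xcconfig"
test -f "$FILE" || touch "$FILE"
IFS=',' read -r -a ENCODED_ITEMS <<< "$DART_DEFINES"
for ENCODED_ITEM in "${ENCODED_ITEMS[@]}"; do
  DECODED_ITEM=$(decode "$ENCODED_ITEM")
  IFS='=' read -r K V <<< "$DECODED_ITEM"
  if [ "$K" = "$KEY" ]; then
    echo "DART_DEFINE_BUNDLE_SUFFIX=.$V" >> "$FILE"
    break
  fi
done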

How can I add a nix package when using buildPythonPackage?

The Python Part
I have a python application with multiple entrypoints, json_out and json_in. I can run them both with this default.nix
with import <nixpkgs> {};
(
  let jsonio = python37.pkgs.buildPythonPackage rec {
    pname = "jsonio";
    version = "0.0.1";
    src = ./.;
  };
  in python37.withPackages (ps: [ jsonio ])
).env
Like so:
$ nix-shell --run "json_out"
{ "a" : 1, "b", 2 }
$ nix-shell --run "echo { \"a\" : 1, \"b\", 2 } | json_in"
keys: a,b
values: 1,2
The System Part
I want to also invoke jq in the nix shell, like this:
$ nix-shell --run --pure "json_out | jq '.a' | json_in"
But I can't because it is not included. I know that I can include jq into the nix shell using this default.nix
with import <nixpkgs> {};
stdenv.mkDerivation rec {
  name = "jsonio-environment";
  buildInputs = [ pkgs.jq ];
}
And it works on its own:
$ nix-shell --run --pure "echo { \"a\" : 1, \"b\", 2 } | jq '.a'"
{ "a" : 1 }
But now I don't have my application:
$ nix-shell --run "json_out | jq '.a'"
/tmp/nix-shell-20108-0/rc: line 1: json_out: command not found
The Question
What default.nix file can I provide that will include both my application and the jq package?
My preferred way to achieve this is to use .overrideAttrs to add additional dependencies to the environment like so:
with import <nixpkgs> {};
(
  let jsonio = python37.pkgs.buildPythonPackage rec {
    pname = "jsonio";
    version = "0.0.1";
    src = ./.;
  };
  in python37.withPackages (ps: [ jsonio ])
).env.overrideAttrs (drv: {
  buildInputs = [ jq ];
})
I needed to:
provide the output of buildPythonPackage as part of the input of mkDerivation
omit the env. Based on a hint from an error message:
Python 'env' attributes are intended for interactive nix-shell
sessions, not for building!
Here's what I ended up with:
with import <nixpkgs> {};
let jsonio_installed = (
  let jsonio_module = (
    python37.pkgs.buildPythonPackage rec {
      pname = "jsonio";
      version = "0.0.1";
      src = ./.;
    }
  );
  in python37.withPackages (ps: [ jsonio_module ])
);
in stdenv.mkDerivation rec {
  name = "jsonio-environment";
  buildInputs = [ pkgs.jq jsonio_installed ];
}
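With that derivation, the combined pipeline from the question should be available inside one shell (untested sketch, assuming the entry points install as shown above):

nix-shell --pure --run "json_out | jq '.a' | json_in"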

Adding numbers that are not in array format? Or how to filter to array so I can sum up

In previous versions of jq I was able to run the following:
cat pull_requests.json | jq '.data.organization.repositories.nodes[] | .pullRequests.totalCount | add'
On this sample data:
{
  "data": {
    "organization": {
      "repositories": {
        "nodes": [
          {
            "pullRequests": {
              "totalCount": 2
            }
          },
          {
            "pullRequests": {
              "totalCount": 8
            }
          },
          {
            "pullRequests": {
              "totalCount": 23
            }
          }
        ]
      }
    }
  }
}
And I would get the correct result.
But currently on jq-1.6 I am getting the following error:
jq: error (at <stdin>:24): Cannot iterate over number (2)
What I noticed from the output without the add filter is that it is not an array:
➤ cat pull_requests.json | jq '.data.organization.repositories.nodes[] | .pullRequests.totalCount'
2
8
23
So my question is how to add these numbers up?
I also tried casting it to an array by using [.pullRequests.totalCount], but I was unable to merge, meld, or join the arrays to get the final count.
You are mistaken in thinking that the jq filter as shown used to work on the JSON as shown.
There are fortunately two simple fixes:
[ .data.organization.repositories.nodes[]
| .pullRequests.totalCount ]
| add
or:
.data.organization.repositories.nodes
| map(.pullRequests.totalCount)
| add
Using sigma/1
Another option is to use a stream-oriented summation function:
def sigma(s): reduce s as $s (null; .+$s);
.data.organization.repositories.nodes
| sigma(.[].pullRequests.totalCount)
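Either filter, applied to the sample above, adds the three counts (2 + 8 + 23):

$ jq '[.data.organization.repositories.nodes[].pullRequests.totalCount] | add' pull_requests.json
33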

Limit of 4Kbytes when using print in AWK?

I'm trying to replace a blank line in a set of text files (*.txt) with "--" if the previous line matches a pattern. My code is:
awk 'BEGIN{$headerfound=0} { if (/pattern/) {print> FILENAME ; $headerfound=1} else { if((/^\s*$/) && ($headerfound == 1)) { $headerfound=0; print "--" > FILENAME } else {print > FILENAME} } }' *.txt
But for some reason the output is limited to 4 kbytes per file (if a file is larger, it gets clipped). Do you know where the limitation comes from?
Thanks,
Ariel
See @glennjackman's comments for problems in your script.
Since you are using GNU awk (you used \s, which is gawk-specific) you can use in-place editing and write your script as (spread out with white space to improve readability):
awk -i inplace '{
  if (/pattern/) {
    print
    headerfound=1
  } else {
    if ((/^\s*$/) && (headerfound == 1)) {
      headerfound=0
      print "--"
    } else {
      print
    }
  }
}' *.txt
but you can do the same thing much more concisely (and awk-ishly) as:
awk -i inplace '
/pattern/ { headerfound=1 }
headerfound && !NF { $0="--"; headerfound=0 }
1' *.txt
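As a quick sanity check on a throw-away file (my own example; -i inplace needs gawk 4.1+):

$ printf 'some pattern here\n\nmore text\n' > sample.txt
$ awk -i inplace '
  /pattern/ { headerfound=1 }
  headerfound && !NF { $0="--"; headerfound=0 }
  1' sample.txt
$ cat sample.txt
some pattern here
--
more text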
If you don't have inplace editing then do it this way:
for file in *.txt; do
  awk '
    /pattern/ { headerfound=1 }
    headerfound && !NF { $0="--"; headerfound=0 }
  1' "$file" > tmp$$ &&
  mv tmp$$ "$file"
done
You can probably get away with:
suffix=".$$.tmp"
awk -v suf="$suffix" '
FNR == 1 {outfile = FILENAME suf}
/pattern/ {headerfound = 1}
headerfound && /^[[:blank:]]*$/ {$1 = "--"}
{ print > outfile }
' *.txt
for f in *.txt; do
echo mv "${f}$suffix" "$f"
done
Remove the echo from the for loop if you're satisfied it's working.
Missed the "just after" requirement (using Ed's use of NF to find a blank line):
awk -v suf="$suffix" '
FNR == 1 {outfile = FILENAME suf}
/pattern/ {lineno = FNR}
FNR == lineno+1 && NF == 0 {$0 = "--"}
{ print > outfile }
' *.txt
