Persist variables using Newman?

I know that when using the Collection Runner in Postman you can check a 'Persist variables' checkbox. Is there any way to achieve this when using their CLI tool, Newman?

I eventually found that you can get the desired outcome by adding --export-environment, like so:
newman run coll.json -e env.json --export-environment env.json
There is also a GitHub issue discussing this.
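A minimal sketch of how this persists values across runs, assuming the collection's test scripts store values with pm.environment.set() (the file names are just the ones from the question):
# first run writes any environment changes back into env.json ...
newman run coll.json -e env.json --export-environment env.json
# ... so a second run picks up whatever the first run persisted
newman run coll.json -e env.json --export-environment env.json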

Setting custom system-property for Payara Docker image

I am new to Docker and trying to create a Payara image for my application.
In it, I need to set a number of custom system properties as server config, just as I have them in my Payara domain.xml:
<configs>
<config name="server-config">
<system-property name="com.myorg.config.propertyA" value="abc"></system-property>
<system-property name="com.myorg.config.propertyB" value="def"></system-property>
.....
......
.......
So far, the Dockerfile I wrote looks like this.
For now I am trying to set just one system property to experiment with, and it's not working:
FROM payara/server-full
COPY myapp.war $DEPLOY_DIR
RUN echo 'set configs.config.server-config.system-property.com.myorg.config.propertyA=abc' > $POSTBOOT_COMMANDS
If I look at the post-boot-commands.asadmin inside the running container, it looks like this:
set configs.config.server-config.system-property.com.myorg.config.propertyA=abc
deploy /opt/payara/deployments/myapp.war
My application WAR ultimately fails to deploy because it cannot find the property 'com.myorg.config.propertyA'.
I think I am trying to set the system property in the wrong way. Can anybody please advise? TIA
I found that this works in the Dockerfile (so yes, I was setting it the wrong way initially):
RUN echo 'create-system-properties com.myorg.config.propertyA=abc' > $POSTBOOT_COMMANDS
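For several properties, the full Dockerfile could look something like the sketch below (the property names and values are just the placeholders from the question; appending with >> keeps anything already written to the post-boot commands file):
FROM payara/server-full
COPY myapp.war $DEPLOY_DIR
# one create-system-properties line per property (names/values are placeholders)
RUN echo 'create-system-properties com.myorg.config.propertyA=abc' >> $POSTBOOT_COMMANDS \
 && echo 'create-system-properties com.myorg.config.propertyB=def' >> $POSTBOOT_COMMANDS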

conemu pass env var to WSL bash terminal

I'm trying to get a task defined in ConEmu to run multiple instances of Ubuntu bash using the WSL layer of Windows 10.
I followed the examples to set up a task to split the UI the way I want, and that part works great. My problem is that I'm trying to use environment variables to pass through commands to run after logging in, and I want different things to run in each panel.
Here is the task command I'm using:
set "STARTUP_CMD='gfp && make server' " & set "PATH=%ConEmuBaseDirShort%\wsl;%PATH%" & %ConEmuBaseDirShort%\conemu-cyg-64.exe --wsl -cur_console:p -cur_console:d:C:\xxx\yyy
On the Linux side I have code in my ~/.bash_aliases file that looks for the STARTUP_CMD env var and tries to execute it. I found code that can pull env vars from the Windows side, which is where the 'set' commands appear to be storing things. Problem is, Windows doesn't know what to do with these, and it tries to expand them when they are read, so it all blows up.
I had this working before, but had to wipe and rebuild my machine recently, and unfortunately didn't have the working command backed up anywhere.
I thought this was the recommended way to run bash with WSL, but I would rather have a way to send stuff directly to the Linux layer as env vars (or if someone has a better way to queue up different commands for each pane, I'm all for that too). Any help would be much appreciated.
Thanks!
Of course I find the answer right after posting the question... posting it here to help others who hit the same issue (or my future self if I forget and have to wipe my machine again).
set "PATH=%ConEmuBaseDirShort%\wsl;%PATH%" & %ConEmuBaseDirShort%\conemu-cyg-64.exe --wsl -eSTARTUP_CMD="gfp && make server" -cur_console:p -cur_console:d:C:\xxx\yyy
You just have to prefix the env var you want with -e and pass it as a param to conemu-cyg. It goes through without any modification on the Windows side and you can read it just like any other env var on the Linux side.
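For completeness, the ~/.bash_aliases hook the question refers to could look something like this minimal sketch (the variable name STARTUP_CMD matches the command above; eval is used so compound commands like 'gfp && make server' work):
# ~/.bash_aliases -- run a one-off command handed in by ConEmu, if any
if [ -n "$STARTUP_CMD" ]; then
    eval "$STARTUP_CMD"
    unset STARTUP_CMD   # avoid re-running it in nested shells
fi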

How to change the screenshot path in Rails 5.1 system test

Using Rails 5.1.2.
I'm creating a system test and using the take_screenshot method.
How do I change the location where these screenshots are saved?
It looks like the image path is hardcoded, so you won't be able to change it currently. It probably wouldn't be too difficult to change if you wanted to open an issue over there or create a pull request for them.
If you want to do this on CI, here's the solution I came up with. In my setup I already had a "test-runner.sh" script with the rspec invocation at the end. There's probably also some sort of after_script: setting available in the yml config, but I didn't look into it.
rspec ......
status=$?   # remember rspec's exit code so the build still fails on test failures
# /tmp/test-results is where CircleCI looks for "artifacts" which it makes
# available for download after a test run
mkdir -p /tmp/test-results   # guard in case the artifacts dir doesn't exist yet
[ -d "tmp/screenshots" ] && cp -a tmp/screenshots /tmp/test-results/
exit $status

Protect Jenkins build from cleanup via API

I want to protect some Jenkins builds from the automatic cleanup. I have found http://ci.jenkins.com/job/[job_name]/[build_v]/toggleLogKeep, but this requires me to check the current state first. Are there any other endpoints I can use? Ideally there would be something like /keepBuildForever and /dontKeepBuildForever.
It looks like there is no ideal solution. The best approach is to list all keep-forever builds, check whether your build is already in that list, and if not, hit the /toggleLogKeep endpoint:
buildsXml=$(curl -sg 'http://ci.jenkins.com/api/xml?depth=2&xpath=/hudson/job/build[keepLog=%22true%22]/url&wrapper=forever')   # -g stops curl from treating the [] as a glob
# check whether your build's URL appears in buildsXml
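The check-and-toggle step could then look something like the sketch below. The job name, build number, and credentials are placeholders; add the same -u option to the curl call above if your server requires authentication, and note that newer Jenkins versions may also require a CSRF crumb on the POST:
# placeholders -- adjust for your server, job, build, and credentials
JENKINS_URL=http://ci.jenkins.com
JOB=my-job
BUILD=123
if ! echo "$buildsXml" | grep -q "/job/$JOB/$BUILD/"; then
    curl -s -u user:api_token -X POST "$JENKINS_URL/job/$JOB/$BUILD/toggleLogKeep"
fi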
After accidentally deleting an important build, I found this alternative solution:
# if running inside a job, the following vars are already populated:
#JOB_NAME=yourjobname
#BUILD_NUMBER=123 #your build number
#JENKINS_HOST=192.168.1.11
#JENKINS_PORT=8080
#JENKINS_URL=http://${JENKINS_HOST}:${JENKINS_PORT}
wget --no-check-certificate "${JENKINS_URL}/jnlpJars/jenkins-cli.jar"
java -jar jenkins-cli.jar -s "$JENKINS_URL" keep-build "$JOB_NAME" "$BUILD_NUMBER"

Analysing a shell script

This is part of a reverse-engineering project.
To determine and document what a shell script (ksh, bash, sh) does, it helps to know what other programs/scripts it calls.
How could one automate this task? Do you know of any program or framework that can parse a shell script? That way I could, for instance, recognize external command calls -- a step in the right direction.
For bash/sh/ksh, I think you can easily modify their source to log what has been executed. That would be a solution.
How about the following? (A rough sketch of this heuristic is shown below.)
Get a list of distinct words in the script
Search $PATH for a hit for each of them
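A minimal sketch of that idea, assuming the script to analyse is passed as the first argument. It will report false positives for words that merely happen to match a command name, but it's a quick first pass:
#!/bin/bash
# list-candidate-commands.sh -- crude heuristic: report every distinct word in
# the script that also resolves to an executable on $PATH
script="$1"
tr -cs 'A-Za-z0-9_./-' '\n' < "$script" | sort -u | while read -r word; do
    if type -P "$word" > /dev/null 2>&1; then
        echo "$word"
    fi
done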
What about bash -v script.sh?
Bash's xtrace is your friend.
You can invoke it with:
set -x at the top of your script,
by calling your script with bash -x (or even bash --debugger -x),
or recursively by doing (set -x; export SHELLOPTS; your-script; )
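To turn the trace into a rough list of what was executed, capture stderr (where xtrace writes) and filter it; a small sketch, assuming the script can actually be run and that your-script.sh is a placeholder for it:
# run the script with xtrace and keep the trace output (it goes to stderr)
bash -x your-script.sh 2> trace.log
# every executed simple command appears on a line starting with '+'
grep '^+' trace.log | sort -u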
If you can't actually run the script, try loading it into a text editor that supports syntax highlighting for Bash. It will color-code all of the text and should help indicate what is a reserved word, variable, external command, etc.
