Required Newman report to be displayed when running multiple collections through Newman - newman

The HTML report is not getting generated using newman-reporter-htmlextra. I have created a batch file, and I want the run to be captured in an HTML file. The batch executes, but the HTML report is not generated. Can you help me?
Below is my code:
SET postman_collection=Newman.postman_collection.json
SET postman_environment=JGestab.postman_environment.json
SET postman_folder="NewManExecution"
SET postman_data="A252ST_InputSheet_Pari_V0.1.csv"
SET postman_data1="A515ST_InputSheet_Pari_V0.1.csv"
call newman run %postman_collection% -r htmlextra,cli -e %postman_environment% -d %postman_data% --insecure
call newman run %postman_collection% -r htmlextra,cli -e %postman_environment% -d %postman_data1% --insecure

The htmlextra reporter is external to the newman package and requires a separate installation:
npm install newman-reporter-htmlextra
Once this is complete, re-run the batch and the report will be generated.
The htmlextra report is a Postman/Newman community-maintained reporter; more details can be found here: Newman - Using External Reporters
The newman application package only gets bundled with the cli, json and junit reporters by default - see here: GitHub - postmanlabs/newman/lib/reporters/
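If the reporter is installed but you still can't find the file, note that htmlextra writes its report into a newman folder under the current working directory by default. You can also point each run at an explicit output file; a minimal sketch of the batch (the -g flag assumes newman itself is installed globally, and the report paths are just examples):
npm install -g newman-reporter-htmlextra
call newman run %postman_collection% -r htmlextra,cli -e %postman_environment% -d %postman_data% --insecure --reporter-htmlextra-export reports\run1.html
call newman run %postman_collection% -r htmlextra,cli -e %postman_environment% -d %postman_data1% --insecure --reporter-htmlextra-export reports\run2.html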

Related

Anydesk installation by bash script using wget

I'm trying to write a bash script to automate the installation of AnyDesk via wget, using the following code:
echo -e "[ - ] Installing AnyDesk..."
wget --max-redirect 1 --trust-server-names 'https://anydesk.com/en/downloads/thank-you?dv=deb_64' -O anydesk.deb
sudo apt install ./anydesk.deb
echo -e "[ ✔ ] AnyDesk ➜ INSTALLED\n"
The problem is that https://anydesk.com/en/downloads/thank-you?dv=deb_64 returns an HTML page, not a Debian package.
How can I parse the HTML page to find the download link to the Debian package?
I examined the page source of https://anydesk.com/en/downloads/thank-you?dv=deb_64: the download is triggered by JavaScript depending on the browser's User-Agent. wget does not support JavaScript execution, so you are actually getting the HTML page source, not the actual .deb file. Use a tool that supports JavaScript execution to get the actual file.
You can run the following command:
wget -O anydesk.deb https://download.anydesk.com/linux/anydesk_6.2.1-1_amd64.deb
This will let you download AnyDesk directly via wget, though it pins a specific version.
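If you'd rather not hardcode the version, here is a rough sketch of the parsing approach the question asks about. It assumes the AnyDesk Linux downloads page embeds direct links of the form https://download.anydesk.com/linux/anydesk_<version>_amd64.deb; both the page URL and the link pattern are assumptions and may change:
#!/bin/bash
# Sketch: scrape the first direct amd64 .deb link from AnyDesk's Linux page.
DEB_URL=$(wget -qO- 'https://anydesk.com/en/downloads/linux' \
  | grep -oE 'https://download\.anydesk\.com/linux/anydesk_[0-9.-]+_amd64\.deb' \
  | head -n 1)
[ -n "$DEB_URL" ] || { echo "No .deb link found"; exit 1; }
wget -O anydesk.deb "$DEB_URL"
sudo apt install ./anydesk.deb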

Using JMeter plugins with justb4/jmeter Docker image results in error

Goal
I am using Docker to run JMeter in Azure DevOps. I am trying to use Blazemeter's Parallel Controller, which is not native to JMeter. So, following the justb4/jmeter image documentation, I used the following command to start the image and run the JMeter test:
docker run --name jmetertest -i -v /home/vsts/work/1/s/plugins:/plugins -v $ROOTPATH:/test -w /test justb4/jmeter ${@:2}
Error
However, it produces the following error while trying to accommodate for the plugin (I know the plugin makes the difference due to testing without the plugin):
cp: can't create '/test/lib/ext': No such file or directory
As far as I understand, this is an error produced when one of the parent directories of the directory you are trying to make does not exist. Is there something I am doing wrong, or is there actually something wrong with the image?
References
For reference, I will include links to the image documentation and the repository.
Image: https://hub.docker.com/r/justb4/jmeter
Repository: https://github.com/justb4/docker-jmeter
Looking into the Dockerfile:
ENV JMETER_HOME /opt/apache-jmeter-${JMETER_VERSION}
Looking into entrypoint.sh:
if [ -d /plugins ]
then
    for plugin in /plugins/*.jar; do
        cp $plugin $(pwd)/lib/ext
    done;
fi
It basically copies the plugins from the /plugins folder (if present) into the lib/ext folder relative to the current working directory.
I don't know why you added the -w /test stanza to your command line, but it explicitly "tells" the container that the working directory is /test, not /opt/apache-jmeter-xxxx; that's why the script fails to copy the files.
In general I don't think the approach is valid, because:
In Azure DevOps you won't have your "local" folder (unless you want to put the plugin binaries under version control)
Some JMeter plugins have other .jars as dependencies, so when you're installing a plugin you should:
put the plugin itself under /lib/ext folder of your JMeter installation
put the plugin dependencies under /lib folder of your JMeter installation
So I would recommend amending the Dockerfile to download the JMeter Plugins Manager and install the plugin(s) you need from the command line.
Something like:
RUN wget https://jmeter-plugins.org/get/ -O /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar
RUN wget https://repo1.maven.org/maven2/kg/apc/cmdrunner/2.2/cmdrunner-2.2.jar -P /opt/apache-jmeter-${JMETER_VERSION}/lib/
RUN java -cp /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar org.jmeterplugins.repository.PluginManagerCMDInstaller
RUN /opt/apache-jmeter-${JMETER_VERSION}/bin/PluginsManagerCMD.sh install bzm-parallel
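Put together, a minimal sketch of such a Dockerfile (the base image tag and the re-declared JMETER_VERSION build arg are assumptions; align them with the image version you actually use):
FROM justb4/jmeter:5.4.1
# Re-declare the version arg so the paths below resolve
ARG JMETER_VERSION=5.4.1
# Download the Plugins Manager and its cmdrunner dependency, generate the
# PluginsManagerCMD wrapper scripts, then install the Parallel Controller plugin
RUN wget https://jmeter-plugins.org/get/ -O /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar \
 && wget https://repo1.maven.org/maven2/kg/apc/cmdrunner/2.2/cmdrunner-2.2.jar -P /opt/apache-jmeter-${JMETER_VERSION}/lib/ \
 && java -cp /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/jmeter-plugins-manager.jar org.jmeterplugins.repository.PluginManagerCMDInstaller \
 && /opt/apache-jmeter-${JMETER_VERSION}/bin/PluginsManagerCMD.sh install bzm-parallel
Plugins installed this way land in the image's own lib and lib/ext folders, so the /plugins volume mount and the -w /test workaround are no longer needed.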

How to generate and display coverage when running tests with Pongo for custom Kong API Gateway plugins written in Lua

I am writing a few custom Kong plugins in Lua. I am using Kong 2.3.3 and Lua 5.1.
I have some test cases (unit tests + integration tests) and I am running them with the pongo run -coverage option. I have already installed luacov (and also cluacov, both with luarocks install) and all my tests are passing, but no luacov files are being generated with coverage data. I am not running Pongo from Docker; I have installed and configured it on my local machine (Linux Ubuntu 20.04).
I have already tried a few things:
my .busted file sets coverage = true, verbose = true and output = "gtest" (already tried utfTerminal, tap and json too)
tried adding luacov as a dependency to my rockspec file... the build does not fail but no coverage file is generated
I even tried running the tests without Pongo, using busted directly, but this is a very bad option because things like spec.helpers or the cjson lib are not on my LUA_PATH
A quick way to do this is to modify Pongo.
Edit your pongo.sh file to:
add the coverage flag to busted (--coverage)
call luacov to generate the report
display the report (cat luacov.report.out)
Locate where busted is called (line 959 for me) and change it to:
"/bin/sh" "-c" "bin/busted --coverage --helper=bin/busted_helper.lua ${busted_params[*]} ${busted_files[*]};luacov;cat luacov.report.out"
Install luacov by editing assets/Dockerfile:
after the busted installation, add luacov:
&& luarocks install busted-htest \
&& luarocks install luacov \
pongo run will then give you:
[...]
==============================================================================
Summary
==============================================================================
File                                            Hits  Missed  Coverage
-------------------------------------------------------------------------------------------------------
/kong-plugin/kong/plugins/myplugin/schema.lua   105   1       99.06%
/kong-plugin/spec/myplugin/01-schema_spec.lua   199   5       97.55%
[...]
You can create a docker image based on pongo
spec/unit/docker/Dockerfile
FROM kong-pongo-test:2.3.2
USER root
RUN luarocks install luacov
WORKDIR /kong-plugin
COPY . .
spec/unit/docker/run.sh
#!/bin/sh
busted --coverage spec/unit
luacov
cat luacov.report.out
Run
docker build -f spec/unit/docker/Dockerfile -t my-coverage .
docker run my-coverage sh spec/unit/docker/run.sh
Pongo has gained some support for this (still a PR at the time of writing). Note that it only covers unit tests, not integration ones.
See https://github.com/Kong/kong-pongo/pull/184
btw: the other answers are too complex imo; you can add a .pongo/pongo-setup.sh to install LuaCov, and move the .luacov file from /kong-plugin to /kong. That should be all that is necessary.
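A minimal sketch of that setup script, based purely on the description above (whether and when Pongo runs this hook is Pongo's behaviour, so treat the semantics as an assumption):
#!/bin/sh
# .pongo/pongo-setup.sh - install LuaCov inside the test image
luarocks install luacov
# Move the LuaCov config so coverage is written from Kong's working directory
mv /kong-plugin/.luacov /kong/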
Running the tests with coverage can simply be done by passing the flag, without any need to edit Pongo or the Dockerfile. Try pongo run -- --coverage, for example.

How to Activate Dataflow Shuffle Service through gcloud CLI

I am trying to activate Dataflow Shuffle [DS] through the gcloud command-line interface.
I am using this command:
gcloud dataflow jobs run ${JOB_NAME_STANDARD} \
--project=${PROJECT_ID} \
--region=us-east1 \
--service-account-email=${SERVICE_ACCOUNT} \
--gcs-location=${TEMPLATE_PATH}/template \
--staging-location=${PIPELINE_FOLDER}/staging \
--parameters "experiments=[shuffle_mode=\"service\"]"
The job starts, and the Dataflow UI reflects it. However, the logs show an error when parsing the value:
Failed to parse SDK pipeline options: json: cannot unmarshal string into Go struct
field sdkPipelineOptions.experiments of type []string
What am I doing wrong?
This question is indeed related to an existing question:
How to activate Dataflow Shuffle service?
However, the original question covers the Python API, while my problem is with the gcloud CLI.
[DS] https://cloud.google.com/dataflow/docs/guides/deploying-a-pipeline#cloud-dataflow-shuffle
P.S. Update: I have also tried a few other variations of the parameter; no luck.
There's currently no way (that I know of) to enable the shuffle service for an existing template.
You have two options:
a) run the job not from a template
b) create a template that already has the shuffle service enabled
The unmarshalling issue is most likely because templates only support a fixed set of parameters, and the template does not support an "experiments" parameter.
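For illustration, the error message suggests the worker expects experiments to arrive as an array of strings in its pipeline options, roughly of this shape (a sketch, not the exact schema):
{"sdkPipelineOptions": {"experiments": ["shuffle_mode=service"]}}
whereas --parameters passes the value through as a single string, which cannot be unmarshalled into []string.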
----UPD----
I was asked how to create a template with the shuffle service enabled.
Here are the sample steps I took.
Followed the WordCountTutorial to create a project with a pipeline definition.
Created the template with the following command:
mvn -Pdataflow-runner compile exec:java -Dexec.mainClass=org.apache.beam.examples.WindowedWordCount -Dexec.args="--project={project-name} --stagingLocation=gs://{staging-location} --inputFile=gs://apache-beam-samples/shakespeare/* --output=gs://{output-location} --runner=DataflowRunner --experiments=shuffle_mode=service --region=us-central1 --templateLocation=gs://{resulting-template-location}"
Note the --experiments=shuffle_mode=service argument.
Invoked the template from the UI or via the command:
gcloud dataflow jobs run {job-name} --project={project-name} --region=us-central1 --gcs-location=gs://{resulting-template-location}

How to use swagger code-generator

I am working on creating a REST client, and I will be calling an API which returns a big JSON output. I want to know how to create the POJO classes automatically by feeding this JSON to swagger-codegen and letting it generate the POJO classes for me, which will save manual time. Here is what I have tried.
To generate a PHP client for http://petstore.swagger.io/v2/swagger.json, please run the following:
git clone https://github.com/swagger-api/swagger-codegen
cd swagger-codegen
mvn clean package
java -jar modules/swagger-codegen-cli/target/swagger-codegen-cli.jar generate \
-i http://petstore.swagger.io/v2/swagger.json \
-l php \
-o /var/tmp/php_api_client
(if you're on Windows, replace the last command with java -jar modules\swagger-codegen-cli\target\swagger-codegen-cli.jar generate -i http://petstore.swagger.io/v2/swagger.json -l php -o c:\temp\php_api_client)
I could not get past mvn clean package; it gives this error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.19.1:test (default-test) on project swagger-codegen: Execution default-test of goal org.apache.maven.plugins:maven-surefire-plugin:2.19.1:test failed: There was an error in the forked process
[ERROR] java.lang.NoClassDefFoundError: io/swagger/models/properties/Property
Has anyone successfully used this swagger codegen? Or, if you can suggest any other framework which provides this functionality, that would be of great help. Thanks in advance.
I have seen the following link:
Update code generated by Swagger code-gen
and was able to run the application. Can anyone explain if I can use this to get the POJO classes created for the JSON input?
Your problem is not swagger itself. Your problem comes from Maven: it says it can't find a certain class. I downloaded the repo and it compiles on my machine with mvn validate package. Make sure you have .m2\repository\io\swagger\swagger-models... in your standard Maven cache directory. That's the dependency which contains the Property class.
Maven should actually download it right before compiling. Check the Maven output for connection errors, unreachable downloads, etc.
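As a side note, if the local build keeps failing, a possible workaround (a sketch; the version number is an assumption, so check Maven Central for a current release) is to skip compiling the repo and use a pre-built CLI jar instead:
wget https://repo1.maven.org/maven2/io/swagger/swagger-codegen-cli/2.4.8/swagger-codegen-cli-2.4.8.jar -O swagger-codegen-cli.jar
java -jar swagger-codegen-cli.jar generate -i http://petstore.swagger.io/v2/swagger.json -l java -o ./java_api_client
The generated Java client includes model classes (the POJOs) for every schema in the spec, which is what you are after.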
