How to run integration tests with bazel and docker compose - docker

I am writing a client SDK library and use Bazel as the build system. I want to write a test which spins up two Docker containers: one runs a server that accepts client requests, and the other runs the test binary of the SDK library, so this is an integration test. My goal is that this test can be run with bazel test //sdk/gtest_binary. I know that Bazel has a rule called sh_test, which could be used to trigger Docker Compose to start the two containers. I am wondering whether this is achievable and whether there is a better way to run such integration tests. Should I use the Bazel Docker rules?
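One way this can look: make //sdk/gtest_binary an sh_test whose script drives Docker Compose and then executes the real client test binary. Below is a minimal sketch of such a wrapper script; the compose file location, service name, port, binary path, and flag are all assumptions, and the target would typically need a tag such as local or requires-network so it can reach the host Docker daemon.

#!/usr/bin/env bash
set -euo pipefail

# Paths are hypothetical; they depend on what the BUILD file lists in the
# sh_test's srcs/data attributes (runfiles).
COMPOSE_FILE="sdk/itest/docker-compose.yml"
CLIENT_TEST="sdk/sdk_client_test"

docker compose -f "$COMPOSE_FILE" up -d server           # start the server container
trap 'docker compose -f "$COMPOSE_FILE" down -v' EXIT    # always tear it down

# Wait until the server accepts connections before running the client (assumed endpoint).
for _ in $(seq 1 30); do
  if curl -fsS http://localhost:8080/health >/dev/null; then break; fi
  sleep 1
done

"$CLIENT_TEST" --server_address=localhost:8080           # run the SDK integration tests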

Related

VueJS, run e2e tests using Cypress into a Jenkins pipeline using a docker image

I created a VueJs project with some unit tests (using Jest) and integration tests using Cypress.
I have also a Jenkins pipeline in order to build, test and deploy the application.
I integrated the unit tests as a test stage, but I would also like to integrate Cypress in order to run the integration tests in a dedicated pipeline step.
Is it possible to have this without installing any additional Cypress Jenkins plugin?
I mean, Is it possible to use a docker image to run the tests using Cypress?
Can you point me to some examples?
You should be able to use this with Jenkins Docker capabilities. Also here is another example you can refer to.
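For example, a pipeline stage can call the official cypress/included image from a plain sh step, with no extra Cypress Jenkins plugin; a minimal sketch (the image tag is just an example, pin whatever version you use):

# Run the Cypress suite inside the official image; the project is mounted at /e2e
# and the image's default entrypoint runs `cypress run`.
docker run --rm -v "$PWD":/e2e -w /e2e cypress/included:13.6.0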

Spinning up Karate test client in a docker container

I am setting up the integration test framework for a Java REST API in our project, and we want to run the integration tests in the GitLab pipeline. Since these tests live in the same project as the API, we are wondering a couple of things:
We don't want to run the Karate tests during the Maven build process; we want to run them only at the integration test stage, after the application deployment stage is complete. How do we do that, given that the Maven build process runs both the JUnit unit tests and the Karate tests?
Since the API requires authentication, we need to run the Karate tests in a Docker container, because we can only inject our credentials in the container (we use HashiCorp Vault to store the credentials). How do we launch a container with the Karate client?
There are ways to run only a subset using Maven. What I do is define a different JUnit test, and call that from the command-line. Read the docs for more: https://github.com/karatelabs/karate#command-line
As long as you can pass environment variables (which you certainly can in Docker) you are good. Refer: https://stackoverflow.com/a/52821230/143475
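A rough sketch combining both points, where the runner class, image name, and credential variable names are all assumptions:

# Run only the Karate suite by pointing Maven at a dedicated JUnit runner class
# (KarateIntegrationRunner is a hypothetical name for the runner you define).
mvn test -Dtest=KarateIntegrationRunner

# In the integration-test stage, run the same command inside a container and pass
# the Vault-injected credentials as environment variables (image name is made up).
docker run --rm \
  -e API_USER="$API_USER" \
  -e API_PASSWORD="$API_PASSWORD" \
  registry.example.com/my-api/karate-tests:latest \
  mvn test -Dtest=KarateIntegrationRunner

Inside karate-config.js the tests can then read those variables, e.g. via java.lang.System.getenv('API_USER').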

How can I create a temporary instance of a docker image and execute a command on the instance within an Azure Pipeline?

I have multiple build and deploy pipelines for my application (User Interface, Internal APIs, External APIs, etc...). I have another build pipeline for my automated tests (which use Node JS, Nightwatch-API, Cucumber, etc..) that builds a Docker image and pushes it to the Container registry.
I want to be able to pull the testing image into my deployment pipelines and execute the appropriate test script command (i.e. npm run test:InternalAPIs). My test scripts will publish the results to a separate system. I am trying to find the best way to execute the automated testing from within the deployment pipeline.
This seems like it should be an easy task within the pipeline build; I just cannot find the task that does what I need. Any assistance would be greatly appreciated.
I'd probably write a bash script for it, and then run the script with the Bash@3 task.
Alternatively, you could make use of built-in tasks, such as Docker@2 and npm@1.
Refer to Microsoft's documentation for more details.
Edit: You can create a temporary instance of the docker image with the docker run command.
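A minimal sketch of such a step, assuming the test image lives in an Azure Container Registry and the target URL comes in as a pipeline variable (both names are made up); it could be executed from a Bash@3 task:

# Pull the previously pushed test image and run one suite in a throwaway container;
# --rm removes the container again once the tests have finished.
docker pull myregistry.azurecr.io/automated-tests:latest
docker run --rm \
  -e TEST_BASE_URL="$TEST_BASE_URL" \
  myregistry.azurecr.io/automated-tests:latest \
  npm run test:InternalAPIs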

Gitlab CI: How to configure cypress e2e tests with multiple server instances?

My goal is to run a bunch of e2e tests every night to check if the code changes made the day before break core features of our app.
Our platform is an Angular app which calls 3 separate Node.js backends (auth-backend, old- and new-backend). Also we use a MongoDB as Database.
Let's assume each of the 4 projects has a branch called develop, which is the only one that should be tested.
My approach would be the following:
I am running every backend plus the database in a separate docker container.
Therefore I need to either get the latest build of that project from GitLab using SSH,
or clone the repo into the Docker container and run a build inside it.
After all projects are running on the right ports (which I'd specify somewhere), I start the npm script that runs the Cypress e2e tests.
All of that should be defined in some file. Is that even possible?
I do not have experience with GitLab CI, but I know that other CI systems provide the possibility to run e.g. bash scripts.
So I guess you can do the following (roughly sketched in the script after this list):
Write a local bash script that pulls all the repos (since gitlab can provide secret keys, you can use these in order to authenticate against your gitlab repos)
After all of these repos were pulled, you can run all your build commands for your different repos
Since you have some repos working together and depending on each other, you possibly have to add a build command for exactly this use case, so that you always have a production-like state, or whatever you need
After you have pulled and built your repos, you should start your servers for your backends
I guess your angular app uses some kind of environment variables to define the servers to send the request to, so you also have to define them in your build command/script for your app
Then you should be able to run your tests
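A very rough sketch of such a script; repo names, ports, and start commands are pure assumptions, and a real version would poll the services instead of sleeping:

#!/usr/bin/env bash
set -euo pipefail

# Pull and build every backend from its develop branch (names and commands are assumptions).
for repo in auth-backend old-backend new-backend; do
  git clone --branch develop "git@gitlab.example.com:group/${repo}.git"
  (cd "$repo" && npm ci && npm run build)
done

# Start the database and the three backends on fixed ports.
docker run -d --name mongo -p 27017:27017 mongo
(cd auth-backend && PORT=3001 npm start &)
(cd old-backend  && PORT=3002 npm start &)
(cd new-backend  && PORT=3003 npm start &)
sleep 15   # crude wait; better to poll each health endpoint

# Build the Angular app against those servers, then run the e2e suite.
npm ci && npm run build
npx cypress run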
Personally I think that Docker is kind of overkill for this use case. Possibly you should define and run a pipeline that always creates a new develop state of your backend and pushes the Docker image to your server. Then you should be able to create your test pipeline, which first starts the Docker containers on your own server (so you do not have an "in-pipeline server"). This will have started all your backends, so that your test pipeline can then run your e2e tests against those backend servers.
I would also advise not running this pipeline every night, but whenever the develop state of one of those linked repos changes.
If you need help setting this up, feel free to contact me.

Steps to run Test framework in Docker and Jenkins

Background:
I am a newbie to docker.
I have 2 automation frameworks on my local PC - one for mobile and the other for a web application. I have integrated the test frameworks with Jenkins.
Both test frameworks have open Jar dependencies mentioned in Maven pom.xml.
Now I want my tests to run in a Docker container when I trigger the Jenkins job.
Can anyone please give me steps to:
Configure Docker in this complete integrated framework
Push my dependencies into Docker
Integrate Jenkins and Docker
Run tests of the web and mobile apps in Docker when the Jenkins job is triggered
I'm not a Jenkins professional, but from my experience, there are many possible setups here:
Assumptions:
By "Automation Framework", I understand that there is some java module (built by maven, I believe for gradle it will be pretty much the same) that has some tests that in turn call various APIs that should exist "remotely". It can be HTTP calls, working with selenium servers and so forth.
Currently, your Jenkins job looks like this (it doesn't really matter whether it's an "old-school" step-by-step job definition or a Groovy script, i.e. pipelines):
Checkout from GIT
run mvn test
publish test results
If so, you need to prepare a docker image that will run your test suite (preferably with maven) to take advantage of surefire reports.
So you'll need to build this docker image once (see the docker build command) and make it available in a private repository / Docker Hub, depending on what your organization prefers. Technically, for this docker image you can use a Java image as a base, get Maven (download and unzip + configure), and then issue the git pull command. You might want to pass credentials as environment variables to the docker process itself (see the '-e' flag).
The main point here is that Maven inside the docker image will run the build, so it will resolve the dependencies automatically (you might want to configure custom repositories in Maven's settings.xml if you have them). This effectively answers the second question.
One subtle point is the results, which should somehow be shown in Jenkins:
You might want to share the volume with surefire-results folder with the Jenkins "host machine" so that Jenkins's plugins that are supposed to show the results of tests will work. The same idea is applicable if you're using something like allure reports, spock reports and so forth.
Now when the image is ready, the integration with Jenkins might be as simple as running a docker run command and waiting till it's done (a concrete sketch follows the outline below). So now the Jenkins job will look like:
docker run pre-defined image -e <credentials for git>
show reports
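A hedged sketch of that docker run step, where the image name, report path, and variable names are assumptions:

# Start a throwaway container from the pre-built test image, hand it the git
# credentials, and share the surefire output with the Jenkins workspace so the
# report plugins can pick the results up afterwards.
docker run --rm \
  -e GIT_USER="$GIT_USER" -e GIT_TOKEN="$GIT_TOKEN" \
  -v "$WORKSPACE/test-reports:/usr/src/app/target/surefire-reports" \
  registry.example.com/test-suite:latest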
This is one example of possible integration.
One slightly different option is running docker build as part of the job definition. This might be beneficial if the image should be significantly different for each build, but it will make the build slower.
The following approach can be followed to achieve your goal:
Create a Dockerfile with all your setup as well as dependencies (refer)
Install the Docker plugin on Jenkins to integrate Docker support (refer)
Use the Jenkinsfile approach to pull the Docker image (or build it from the Dockerfile) and run the tests within Docker.
Below is sample code, just for reference:
node {
    checkout scm
    // Registry URL and credentials ID are placeholders for your own values.
    docker.withRegistry('https://registry.example.com', 'credentials-id') {
        // Build the test image from the Dockerfile in the workspace.
        def customImage = docker.build("my-image")
        // Run the tests inside a container started from that image.
        customImage.inside {
            sh 'mvn test'   // replace with whatever command runs your test suite
        }
    }
}
