Implement JMeter/Taurus with OpenShift - Jenkins

I am implementing JMeter/Taurus for performance testing of microservices. We use OpenShift as our PaaS to run all microservices. I am able to deploy JMeter/Taurus inside OpenShift using a Jenkins pipeline and generate the Taurus report from the JMX file in the container. My requirement is to publish the Taurus report to Jenkins rather than storing it in cloud storage or Nexus. Can someone advise on the best approach to publish the performance report for developers on Jenkins, or any other optimal way to publish it?
While googling I found an approach where a Jenkins agent is deployed inside OpenShift and the test suite's Git repo is checked out into the agent's workspace; I just want to make sure this is the best approach for my scenario. Our Jenkins master is running on Google Cloud Platform VMs with some dynamic agents.
Thanks in Advance!

According to the Dump Summary for Jenkins Plugins chapter of the Taurus User Manual, you just need to add a reporting module definition to your YAML configuration file, like:
reporting:
- module: final-stats
  dump-xml: stats.xml
And "feed" this stats.xml file to the Jenkins Performance Plugin.
That's it, you should get a Performance Report added to your build dashboard. Check out the How to Run Taurus with the Jenkins Performance Plugin article for more information if needed.
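For illustration, the pipeline wiring could look roughly like this (a minimal sketch: the config file name test.yml is an assumption, and perfReport is the pipeline step exposed by the Jenkins Performance Plugin):

```groovy
// Minimal sketch, assuming Taurus (bzt) is available on the agent and
// the YAML config (test.yml, a hypothetical name) contains the
// final-stats reporting module shown above.
pipeline {
    agent any
    stages {
        stage('Performance Test') {
            steps {
                sh 'bzt test.yml'  // runs Taurus, which writes stats.xml
            }
        }
    }
    post {
        always {
            // perfReport comes from the Jenkins Performance Plugin
            perfReport sourceDataFiles: 'stats.xml'
        }
    }
}
```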

Related

How to add SonarQube into Jenkins

I have installed SonarQube and Jenkins. Now I want to add SonarQube into Jenkins, but under Manage Plugins it doesn't show me SonarQube.
SonarQube is a standalone server. It offers a web user interface to visualize bugs, code smells, and vulnerabilities. You cannot embed this SonarQube web UI in Jenkins.
However, you can trigger a scan as part of your Jenkins job. This scan can "send" its findings to a SonarQube installation - either hosted on your own infrastructure (on-premise) or using the hosted offering at sonarcloud.io.
There are a couple of different ways to include the scanner in your job, but the setup is specific to your programming language and build tools (Maven, Visual Studio, command line, ...). Check the SonarCloud docs for the approach that fits your situation best.
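For example, for a Maven build the scan could be triggered from a pipeline stage roughly like this (a sketch assuming the SonarQube Scanner plugin is installed and a server connection named 'MySonarQube', a hypothetical name, is configured under Manage Jenkins):

```groovy
// Sketch of a SonarQube analysis stage; 'MySonarQube' is the name of a
// hypothetical server configuration defined in Jenkins' global settings.
stage('SonarQube Analysis') {
    steps {
        withSonarQubeEnv('MySonarQube') {
            // injects the server URL and auth token into the Maven run
            sh 'mvn clean verify sonar:sonar'
        }
    }
}
```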

What is the best practice for CI development?

We are starting to develop a CI workflow for our systems at my company.
Currently we just run a few basic tasks like build, tests, and upload to Nexus.
The tech stack is a Java project built with Gradle, and Jenkins runs our builds.
Currently I'm working with some basic Groovy scripts to do what we need, but each time I copy and paste my updated code into Jenkins and run the job from the Jenkins UI to see the results, which doesn't seem like a very good approach for developing such automation code.
My question is: what is the best practice for building and running Jenkins jobs?
Is it possible to run them straight from IntelliJ?
Do we need to create a Jenkins project which is kept in a repository and then deployed to the Jenkins machine?
Do we need to use some IntelliJ plugins in order to work with Jenkins?
More best practices are welcome :)
Jenkins has an API - so you can do whatever you want!
But in general, for small to medium teams it's better to use a Jenkinsfile and let Jenkins pull code changes (or pull requests) from SCM and trigger builds. You can also configure hooks to trigger builds if your SCM supports this (GitHub and Bitbucket both do).
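As a rough illustration for the Gradle/Nexus flow described in the question (a sketch: the stage layout and the publish task are assumptions about your build):

```groovy
// Minimal declarative Jenkinsfile kept in the project repository, so the
// pipeline itself is version-controlled instead of pasted into the UI.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh './gradlew build' }
        }
        stage('Test') {
            steps { sh './gradlew test' }
        }
        stage('Upload to Nexus') {
            // assumes a 'publish' task configured with your Nexus repo
            steps { sh './gradlew publish' }
        }
    }
}
```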
If you are eventually pushing your artifacts into a Docker image, I would highly recommend Docker multi-stage builds.
If you are completely new to CI/CD, Atlassian has a lot of good resources: https://www.atlassian.com/continuous-delivery/principles/continuous-integration-vs-delivery-vs-deployment

Automate configuration of Jenkins & SonarQube

I am trying to find a solution to automate the installation and configuration of Jenkins & SonarQube. The idea is to provide an easy-to-use provisioning utility for setting up CI. Ideally I would love to automate the following:
Installation
Set up users, build, unit testing, and code coverage
Is there an SDK, CLI, or similar which can be used from a batch script?
Thanks
You can use the Jenkins Docker image for the installation part - even if you're not using Docker you can still copy the installation procedure:
https://github.com/jenkinsci/docker
For the setup of jobs I would recommend the Job DSL:
https://github.com/jenkinsci/job-dsl-plugin
For the rest you can use the Jenkins CLI or you can manually configure it once and then extract the corresponding XML file from the Jenkins home and copy it into other installations.
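As a rough illustration of the Job DSL approach mentioned above (the repository URL and job name are hypothetical):

```groovy
// Job DSL seed script that provisions a Gradle build-and-test job;
// coverage publishing would be added via the relevant plugin's DSL block.
job('ci-build') {
    scm {
        git('https://example.com/your/repo.git')  // placeholder URL
    }
    triggers {
        scm('H/5 * * * *')  // poll SCM roughly every five minutes
    }
    steps {
        gradle('clean build test')
    }
}
```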

Jenkins Puppet integration

My development setup is such that for every SVN check-in the code is built, unit tested, packaged, and published to Artifactory. Now I want to automate my deployment process and run integration (Selenium) tests as part of this process. I am thinking of using Puppet to manage the deployment.
Is Puppet the correct tool for this?
What process should I use to trigger the Puppet master to initiate a fresh installation on the agents? I couldn't find any Jenkins plugin that would actually trigger Puppet. One option is to call
puppet apply ...
as a Jenkins post-build task.
Any suggestions welcome, thank you.
Have a look at this Selenium Jenkins article from Sauce Labs, a service that automates cross-browser testing. Though they are a vendor with a service to sell, the article covers how to do Selenium testing yourself with Jenkins. It also exposes common pain points you are likely to run into with this approach.
A Puppet master doesn't serve the function of orchestrating client convergences. Take a look at MCollective. This is a tool that will allow you to trigger Puppet runs on target systems from a Jenkins agent via script commands.
Some MCollective getting-started material:
http://www.slideshare.net/PuppetLabs/presentation-16281121
http://puppetlabs.com/mcollective
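For example, a deploy job could shell out to MCollective from a pipeline step, roughly like this (a hedged sketch: it assumes the mco CLI and the mcollective-puppet-agent plugin are installed on the Jenkins agent, and the role=webserver fact is purely hypothetical):

```groovy
// Sketch of triggering Puppet runs on target nodes after a deploy;
// the fact filter below (role=webserver) is a hypothetical example.
stage('Trigger Puppet run') {
    steps {
        sh 'mco puppet runonce --with-fact role=webserver'
    }
}
```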

Do I specify binary artifact settings in the build scripts or the CI server?

I'm prototyping a new build system using Jenkins, Gradle, and Artifactory. There seem to be conflicting, or rather overlapping, features in these tools when it comes to specifying the build artifacts and their destination. I see three paths going forward:
Specify the artifact settings on the particular task in Jenkins, using the Jenkins Artifactory plugin.
Specify the artifact settings in the Gradle build scripts, using the Gradle Artifactory plugin.
Specify generic maven repo settings in the Gradle build scripts, using the standard Gradle "maven" plugin.
I see pros and cons to all of these approaches, but none of them is missing a critical feature for our builds, as far as I can see.
To further my confusion, the Gradle Artifactory plugin wiki states:
Build Server Integration - When running Gradle builds in your continuous integration build server, it is recommended to use one of the Artifactory Plugins for Jenkins, TeamCity or Bamboo to configure resolution and publishing to Artifactory with build-info capturing, via your build server UI.
So, some questions to get the conversation going:
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
What about version control for artifact changes done through the CI interface?
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI, rather than the POMs. Is this just a bad build practice?
Is there a killer tool integration feature that separates one of these approaches from the other?
How useful is the build info object? Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
I am really hoping to hear from existing users of these tools and what pitfalls/requirements may have led them to one of the approaches above (or perhaps even a better one that I haven't considered yet).
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
I'd say that's the way to go. Your build server is the single point of truth, and only artifacts built in the build server should be deployed.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
That one is simple - you shouldn't deploy while your CI server is down. Building on a local machine might produce wrong artifacts, which shouldn't be deployed.
What about version control for artifact changes done through the CI interface?
Not sure I understood your question.
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI, rather than the POMs. Is this just a bad build practice?
This configuration ignores Maven's ability to deploy, and I am not sure I can find a good scenario to justify it. The only thing I can think of is a deferred deploy, but the Artifactory plugin can take care of that.
Is there a killer tool integration feature that separates one of these approaches from the other?
Now we got to the essence :)
Well, defining what you deploy in your build script (in the case of Gradle) gives you the flexibility to fine-tune every aspect of the deployment (think about the dynamic properties you might want to add in certain cases). Another very serious advantage is that your build is source code, which means it is versionable in your version control.
The advantage of defining the deployment details in the build server configuration is that the build server is the only place the deployment should occur. So, if you don't have the deployment details in your build script, you know for sure it won't be deployed standalone.
So, how can you combine the two to get the best of both worlds?
Code your deployment logic in your Gradle script using the Artifactory plugin DSL. Provide details like username and password from properties, which exist on the build server only.
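A rough sketch of what that could look like with the Gradle Artifactory plugin DSL (the context URL, repo key, property names, and publication name below are all placeholders):

```groovy
// Deployment logic lives in the build script; the credentials come from
// Gradle properties (hypothetical names) defined on the CI server only,
// e.g. in the build server's ~/.gradle/gradle.properties.
artifactory {
    contextUrl = 'https://artifactory.example.com/artifactory'  // placeholder
    publish {
        repository {
            repoKey = 'libs-release-local'  // placeholder repo key
            username = project.findProperty('artifactory_user')
            password = project.findProperty('artifactory_password')
        }
        defaults {
            publications('mavenJava')  // hypothetical publication name
        }
    }
}
```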
How useful is the build info object?
Extremely useful. The information in buildInfo is harvested during the build process, and buildInfo is the only place it exists. Having this information is the only way you will be able to reproduce this build in the future.
Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
The 'artifactory' and 'artifactory-publish' Gradle plugins both generate the buildInfo object, regardless of where they are running (be it your local machine or a Jenkins build server).
