Background
After a lot of hard work we finally have Jenkins CI pulling code from our GitHub repositories, and we are now doing Continuous Integration as well as Deployment.
We get the code and only deploy it if all the tests pass, as usual.
Now, I have seen that there are a number of plugins for Java that, besides running the tests, also report test coverage, such as Cobertura.
But we don't use Java. We use Elixir.
In the Elixir world we have excoveralls, which is a facade for the Coveralls API. The Coveralls API supports Jenkins, so it stood to reason that I would find a Coveralls plugin for Jenkins.
I was wrong. There is nothing.
Questions
So now I have a test coverage metric that is basically useless because I can't integrate it with Jenkins.
Are there any Erlang/Elixir plugins one can use with Jenkins for code coverage?
I also created an issue in the project (which seems to be abandoned ...): https://github.com/parroty/excoveralls/issues/167
I have a stage in my Jenkinsfile to publish the coverage. I'm not sure if that is the metric you want, but...
stage('Publish Coverage') {
    when {
        branch 'master'
    }
    steps {
        publishHTML target: [
            allowMissing: true,
            alwaysLinkToLastBuild: true,
            keepAll: true,
            reportDir: 'cover',
            reportFiles: 'excoveralls.html',
            reportName: 'Coverage Report'
        ]
    }
}
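For this stage to have something to publish, the test stage has to generate the HTML report first. A minimal sketch, assuming excoveralls' defaults (the `coveralls.html` task writes its report to `cover/excoveralls.html`) and that the HTML Publisher plugin is installed for `publishHTML`:

```groovy
stage('Test') {
    steps {
        // mix coveralls.html runs the tests and writes cover/excoveralls.html
        sh 'MIX_ENV=test mix coveralls.html'
    }
}
```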
I have found two ways of doing this:
Using the Hex package junit_formatter together with the junit post pipeline step
Using covertool together with the Cobertura Jenkins plugin
Option 1
This solution works and is quite nice. It forces me to change test_helper.exs, but that is a minor inconvenience overall. However, it only offers the most basic of reports, and for me this is where it falls short.
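The test_helper.exs change is small. A sketch following junit_formatter's documented usage (the report lands under `_build/test/lib/<your_app>/` by default, which is where the junit step would pick it up):

```elixir
# test/test_helper.exs
# Register JUnitFormatter alongside the default CLI formatter
ExUnit.configure(formatters: [JUnitFormatter, ExUnit.CLIFormatter])
ExUnit.start()
```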
Option 2
The option I decided to go with. Yes, making the Jenkinsfile work with Cobertura was a nightmare, especially because in previous versions it was not even possible, and because there is contradictory information scattered all over the place.
However, once you get that Jenkinsfile going, you get to rip those sweet reports from Cobertura. Cobertura was made with Java in mind, there is no two ways about it. In the reports you see things like class coverage and such, but you can easily translate that to modules. The interface offers a lot more information and tracks coverage over time, which is something I actually want.
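Wiring covertool in means pointing mix's coverage tooling at it. A sketch of the mix.exs changes, following covertool's README (the app name here is hypothetical; check the README for your version's exact options):

```elixir
# mix.exs
def project do
  [
    app: :my_app,                       # hypothetical app name
    test_coverage: [tool: :covertool],  # emit a Cobertura-style coverage.xml
    deps: deps()
  ]
end

defp deps do
  [
    {:covertool, "~> 2.0", only: :test}
  ]
end
```

With that in place, `mix test --cover` produces the coverage.xml that the Cobertura plugin consumes.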
For future reference, here is my Jenkinsfile:
pipeline {
    agent any
    environment {
        SOME_VAR = "/home/deployer"
    }
    stages {
        stage("Build") {
            steps {
                sh "MIX_ENV=test mix do deps.get, deps.compile"
            }
        }
        stage("Test") {
            steps {
                sh "mix test --cover"
            }
        }
        stage("Credo") {
            steps {
                sh "mix credo --strict"
            }
        }
        stage("Deploy") {
            when {
                expression {
                    env.BRANCH_NAME == "master"
                }
            }
            steps {
                sh '''
                    echo "Deploy with AWS or GCP or whatever"
                '''
            }
        }
    }
    post {
        always {
            cobertura coberturaReportFile: "coverage.xml"
        }
    }
}
Of note:
1. I am extremely strict with my code, so I also use Credo. You can configure it so it doesn't blow up the entire pipeline because you missed a newline at the end of a file, but as I said, I am quite strict with my code.
2. The Deploy stage only runs if the pushed branch is master. There are other ways of doing this, but I found that this approach was good enough for a small project.
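As a sketch of that Credo tuning, assuming you want to keep --strict but silence a single check (check names vary between Credo versions, so verify against your installed version):

```elixir
# .credo.exs
%{
  configs: [
    %{
      name: "default",
      checks: [
        # disable the trailing-blank-lines readability check
        {Credo.Check.Readability.TrailingBlankLines, false}
      ]
    }
  ]
}
```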
Overall I like covertool for now, but I don't know if the first solution has the same potential. At least I didn't see it.
Hope this post helps!
Original thread:
https://elixirforum.com/t/excoveralls-plugin-for-jenkins-ci/18842
Another way to post coverage from Jenkins for an Elixir project is to use the ExCoveralls task mix coveralls.post. This allows you to post the coverage from any host, including your Jenkins server. Based on the example on this Jenkins tutorial page, you can write your Jenkinsfile like this:
pipeline {
    agent any
    stages {
        // Assuming all environment variables are set beforehand
        stage('run unit test') {
            steps {
                sh 'echo "Run Unit Test and Post coverage"'
                sh '''
                    MIX_ENV=test mix coveralls.post --token $COVERALLS_REPO_TOKEN --sha $GIT_COMMIT --branch $GIT_BRANCH --name "jenkins" --message $GIT_COMMIT_MSG
                '''
            }
        }
    }
}
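The snippet assumes GIT_COMMIT_MSG is already set; Jenkins provides GIT_COMMIT and GIT_BRANCH in multibranch jobs, but not the commit message. One way to derive it, as a sketch, is an environment block in the same pipeline:

```groovy
environment {
    // GIT_COMMIT_MSG is not a built-in Jenkins variable; derive it from git
    GIT_COMMIT_MSG = sh(returnStdout: true, script: 'git log -1 --pretty=%B').trim()
}
```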
Related
I am currently a Jenkins user and am exploring TeamCity.
In Jenkins we have the concept of shared libraries, which basically lets you extract generic Groovy code and reuse it across different Jenkins pipelines. This avoids re-writing the same functionality in each Jenkinsfile, following DRY (don't repeat yourself), hides implementation complexity, and keeps pipelines short and easier to understand.
Example:
There could be a repository having all the Groovy functions like:
Repo: http://github.com/DEVOPS/Utilities.git (repo Utilities)
Sample Groovy script ==>> GitUtils.groovy with the functions below:
public void setGitConfig(String userName, String email) {
    sh "git config --global user.name ${userName}"
    sh "git config --global user.email ${email}"
}

public void gitPush(String branchName) {
    sh "git push origin ${branchName}"
}
In a Jenkinsfile we can just call these functions like below (of course we need to configure Jenkins so it knows the repo URL for the shared library, and give the library a name):
Pipeline
//name of shared library given in jenkins
@Library('utilities') _
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                // log.info 'Starting'
                script {
                    def gitUtils = new GitUtils()
                    gitUtils.setGitConfig("Ray", "Ray@rayban.com")
                }
            }
        }
    }
}
And that's it: anyone wanting the same function just has to include the library in their Jenkinsfile and use it in the pipeline.
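For context, a shared library repository typically follows Jenkins' documented layout; a sketch (file names here are illustrative):

```
utilities/
├── src/               # Groovy classes, instantiated via `new` in a script block
│   └── GitUtils.groovy
└── vars/              # global steps callable directly from pipelines
    └── log.groovy
```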
Questions:
Can we migrate the same thing over to TeamCity, and if yes, how can it be done? We do not want to spend a lot of time re-writing.
Jenkins also supports stashing and unstashing the workspace between stages; is a similar concept present in TeamCity?
Example:
pipeline {
    agent any
    stages {
        stage('Git checkout') {
            steps {
                stash includes: '/root/hello-world/*', name: 'mysrc'
            }
        }
        stage('maven build') {
            agent { label 'slave-1' }
            steps {
                unstash 'mysrc'
                sh label: '', script: 'mvn clean package'
            }
        }
    }
}
As for reusing common TeamCity Kotlin DSL libraries, this can be done via Maven dependencies: you mention the library in the pom.xml file within your DSL code. You can also consider using JitPack if your DSL library code is hosted on GitHub, for example, and you do not want to handle building it separately and publishing its Maven artifacts.
That said, with a migration from Jenkins to TeamCity you will most likely have to rewrite the common library (if you still need one at all), as the TeamCity project model and DSL are quite different from what you have in Jenkins.
Speaking of stashing/unstashing workspaces, it may be covered either by artifact rules and artifact dependencies (as described here: https://www.jetbrains.com/help/teamcity/artifact-dependencies.html) or by repository clone mirroring on agents.
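As a sketch of the artifact-dependency approach in TeamCity's Kotlin DSL (build type names and artifact rules below are hypothetical; see the artifact-dependencies documentation for the exact API of your TeamCity version):

```kotlin
// settings.kts fragment: Build publishes artifacts, Deploy consumes them
object Build : BuildType({
    name = "Build"
    artifactRules = "target/*.jar"   // publish built jars as build artifacts
})

object Deploy : BuildType({
    name = "Deploy"
    dependencies {
        artifacts(Build) {
            artifactRules = "*.jar => lib"   // pull Build's jars into ./lib
        }
    }
})
```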
In a nutshell: how do you get an automated build server, with Jenkins or another tool, to build several Delphi projects using MSBuild?
I am currently a trainee in a company. I have managed to find a solution to migrate the old SCM software, PVCS, to SVN. But they are using old shell scripts and Cygwin to build, with several options to compile/release all or certain Delphi projects and produce DLLs and EXEs. I first wanted to use Jenkins to try to reproduce the same mechanism, but I am not sure it is the best way to deal with this. I have tried to set up a free-style job and a multibranch pipeline. The first is OK to build one project, but the latter was not a success; I don't know Groovy...
I am not interested in the test part of continuous integration; I just want an automated build for several Delphi projects.
I don't know how to deal with this. Maybe the best way is to make as many Jenkins jobs as there are Delphi projects? But how would I control them afterwards?
I have read about Maven and Ant, but I am not sure they are relevant in my case.
Any advice is welcome.
You can create simple "free-style" jobs or "pipelines". Pipelines are more powerful, but more complicated if you are starting out.
You can start by creating a job for each project. Then you can chain the jobs with different Jenkins options, so that when one job finishes, the next job starts. See the following image.
You can also use the existing RAD Studio plugin for Jenkins to compile; use it in a "free-style job".
The other option is to use pipelines, but you should know something about Groovy.
For example, a simple pipeline with several steps would be this:
pipeline {
    agent any
    stages {
        stage('Stage: Show message Hello World') {
            steps {
                echo 'Step 1. Hello World'
            }
        }
        stage('Download source from GIT') {
            steps {
                echo 'Downloading...'
                git([url: 'https://XXX_repository_xxxx.git/gitProject', branch: 'master', credentialsId: 'a234234a-344e-2344-9440-423444xxxxxx'])
            }
        }
        stage('Executing MSDOS file (BAT)') {
            steps {
                echo '-- Sample Executing BAT file'
                bat '"c:\\Program Files (x86)\\Embarcadero\\Studio\\19.0\\bin\\rsvars.bat"'
            }
        }
        stage('MSBuild a Delphi project') {
            steps {
                println("************ EXECUTING MSBUILD ******************")
                echo '-- Launch the execution of rsVars ---------'
                bat '"c:\\Program Files (x86)\\Embarcadero\\Studio\\19.0\\bin\\rsvars.bat"'
                echo '-- MSBuild of the TestLauncher project -------'
                bat '"c:\\local\\AutomaticTestsProject\\compilar.bat"'
            }
        }
        stage('Execute a test project (EXE)') {
            steps {
                bat 'c:\\local\\AutomaticTestsProject\\BIN\\AutomaticTestsProject.exe'
            }
        }
        stage('Send email') {
            steps {
                emailext (
                    subject: "Job '${env.JOB_NAME} ${env.BUILD_NUMBER}'",
                    body: """<p>Check console output at ${env.JOB_NAME}</p>""",
                    to: "destinatary@hotmail.com",
                    from: "JenkinsMachine@mail.com" )
            }
        }
    }
}
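One gotcha worth noting: each bat step runs in its own cmd session, so environment variables set by rsvars.bat in one step do not carry over to the next. A sketch of chaining rsvars.bat and msbuild in a single step (the .dproj path here is hypothetical):

```groovy
stage('MSBuild in one session') {
    steps {
        // call rsvars.bat and msbuild in the same cmd session so the
        // Embarcadero environment variables are still set when msbuild runs
        bat '''call "c:\\Program Files (x86)\\Embarcadero\\Studio\\19.0\\bin\\rsvars.bat" && msbuild "c:\\local\\MyProject\\MyProject.dproj" /t:Build /p:Config=Release'''
    }
}
```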
I have a scenario where I have a frontend repository with multiple branches.
Here's my repo vs application structure.
I have a single Jenkinsfile like below:
parameters {
    string(name: 'CUSTOMER_NAME', defaultValue: 'customer_1')
}
stages {
    stage('Build') {
        steps {
            sh '''
                yarn --mutex network
                /usr/local/bin/grunt fetch_and_deploy:$CUSTOMER_NAME -ac test
                /usr/local/bin/grunt collect_web'''
        }
    }
}
The above Jenkinsfile is the same for all customers, so I would like to understand the best way to have multiple customers build with the same Jenkinsfile, and to build different pipelines based on the $CUSTOMER_NAME parameter.
I am not sure if I understood your problem, but I guess you could use a shared pipeline library: https://jenkins.io/doc/book/pipeline/shared-libraries/
You can put the build step in the library and call it with CUSTOMER_NAME as a parameter.
(Please note: a shared pipeline library must be stored in a separate Git repository!)
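A sketch of what that could look like, using the build commands from the question (the step name buildCustomer is hypothetical):

```groovy
// vars/buildCustomer.groovy in the shared library repository
def call(String customerName) {
    sh """
        yarn --mutex network
        /usr/local/bin/grunt fetch_and_deploy:${customerName} -ac test
        /usr/local/bin/grunt collect_web
    """
}
```

Each customer's Jenkinsfile then reduces to loading the library and calling buildCustomer(params.CUSTOMER_NAME).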
I am currently working on a basic deployment pipeline in Jenkins (with Pipeline). I am looking for the best way of doing the following:
When the developer pushes to the development branch, all stages but deploy are executed.
When the developer pushes to the master branch, all stages including deploy are executed.
I have read about matching patterns, but I am not sure this is the right way, as the information I read was dated.
My Jenkins pipeline file
node {
    stage('Preparation') {
        git 'git@bitbucket.org:foo/bar.git'
    }
    stage('Build') {
        sh 'mkdir -p app/cache app/logs web/media/cache web/uploads'
        sh 'composer install'
    }
    stage('Test') {
        sh 'codecept run'
    }
    stage('Deploy') {
        sh 'mage deploy to:prod'
    }
}
There's no magic here. This is just Groovy code. In a multibranch pipeline, the branch being built is available as env.BRANCH_NAME. Inside the "stage" block, add an "if" check comparing the branch name with whatever logic you need, and either execute the body or not, depending on which branch is in scope.
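Applied to the pipeline above, a sketch (assuming a multibranch job, so env.BRANCH_NAME is populated):

```groovy
stage('Deploy') {
    // only deploy from master; all other branches skip this body
    if (env.BRANCH_NAME == 'master') {
        sh 'mage deploy to:prod'
    } else {
        echo "Skipping deploy on branch ${env.BRANCH_NAME}"
    }
}
```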
It looks like support for the Test Results Analyzer was added for the Pipeline plugin, per the JIRA issue below. I'm having trouble figuring out how to actually use the plugin from a pipeline script.
https://issues.jenkins-ci.org/browse/JENKINS-30522
Regardless of the JIRA issue, how can I run the Test Results Analyzer from my pipeline?
When you add test reports to your pipeline script this works automatically. The "Test Results Analyzer" button shows up right away for jobs that have tests, including those that use the pipeline plugin.
For example when using the standard "junit" report plugin like this, it should work out of the box:
stage('Unit tests') {
    steps {
        sh 'unit-tests.sh'
    }
    post {
        always {
            junit 'path/to/report.xml'
        }
    }
}