I have an application which needs to be tested on several architectures (CentOS 5, CentOS 6, CentOS 7); I implemented a Jenkinsfile in which I run the set of tests for a chosen target architecture.
Now I want to somehow run these tests for all target architectures. How can I achieve this?
I was told that I need to look into Jenkins multi-configuration projects, but all the examples I could find are about Java-based projects only. And if this is the approach to use, how can I call my Jenkins script with different input parameter values?
I would appreciate it if someone could give me some hints on where to start and which Jenkins plugins to use.
Thanks
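For what it's worth, recent versions of Declarative Pipeline offer a matrix directive that covers exactly this kind of fan-out. A minimal sketch, assuming agents labeled 'centos5'/'centos6'/'centos7' and a hypothetical run-tests.sh wrapper (both are assumptions, not from the question):

```groovy
// Minimal sketch: run the same test stage once per target OS.
// Requires a reasonably recent Pipeline plugin (matrix directive).
pipeline {
    agent none
    stages {
        stage('Test all targets') {
            matrix {
                axes {
                    axis {
                        name 'TARGET'
                        values 'centos5', 'centos6', 'centos7'
                    }
                }
                // each axis value is exposed as an environment variable,
                // here used to pick an agent with a matching label
                agent { label "${TARGET}" }
                stages {
                    stage('Test') {
                        steps {
                            sh "./run-tests.sh ${TARGET}"  // hypothetical test script
                        }
                    }
                }
            }
        }
    }
}
```

On older Jenkins installations without the matrix directive, the same effect can be had by making the Jenkinsfile parameterized and triggering one build per target value.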
I have written a Jenkins pipeline which builds an application.
Now, I want to achieve a matrix based approach, namely build
I am in the process of configuring Jenkins to deploy artifacts. I only need Apache Ant and Java to create the artifacts (both are available on the host machine) and no other external libraries. So I think using Maven would make things unnecessarily complex, as I have only 2 Ant files. I want to keep it as simple as possible.
What I want to achieve is:
1. Trigger a Jenkins job 'A' to build the artifact and deploy it to nexus repository.
2. Trigger another Jenkins job 'B' to take the same artifact generated in the step above and deploy it to the target environment.
Can anyone please help me identify challenges with my approach and share some useful links to achieve what I have described?
The short answer is: yes, you can. Each of the components you mentioned can be used individually and integrated into your build pipeline. To be honest, your use case isn't a one-off and can easily be done if you start here.
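As a rough illustration of step 1, a Jenkinsfile for job 'A' might build with Ant and publish via the Nexus Artifact Uploader plugin. All URLs, credential IDs, coordinates and paths below are placeholders, not values from the question:

```groovy
// Sketch of job 'A': build with Ant (already on the host, per the question)
// and upload the artifact to Nexus. Assumes the Nexus Artifact Uploader plugin.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'ant dist'   // your existing Ant build file
            }
        }
        stage('Publish') {
            steps {
                nexusArtifactUploader(
                    nexusVersion: 'nexus3',
                    protocol: 'http',
                    nexusUrl: 'nexus.example.com:8081',   // placeholder
                    repository: 'releases',
                    credentialsId: 'nexus-creds',          // placeholder
                    groupId: 'com.example',
                    version: "1.0.${BUILD_NUMBER}",
                    artifacts: [[artifactId: 'myapp', type: 'jar',
                                 file: 'dist/myapp.jar']]  // placeholder path
                )
            }
        }
    }
}
```

Job 'B' can then resolve the same groupId/artifactId/version coordinates from Nexus (for example with curl against the repository) and deploy the file to the target environment.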
I'm new to working with Jenkins and could use some help.
Right now, I have a project I want to build with Jenkins, and I have a rough idea of how to build a simple project. What I'm wondering is: can I build the project with certain compiler flags, and then automatically build it again with different flags?
My goal with all this is to be able to submit a program to Jenkins and have it compile the program, run some tests, and then restart, this time with different compiler settings. Then I check the results to see under which compiler settings the code runs fastest. I need to use Jenkins, and I need to do this testing.
My current strategy was to setup a master/agent system, and have the Master server go through a pipeline where each step it compiles the code a certain way and pushes it to the appropriate agent queue where it will be executed. Is this feasible? How should I go about this?
I don't know if I understood you correctly, but as I read it, you want to run the same compile/tests with different flags.
I would build a single Jenkins pipeline with everything together, running the different stages sequentially, like:
Stages:
CheckSCM (git clone)
Build with flag1
CleanWS (clean workspace)
CheckSCM
Build with flag2
CleanWS
CheckSCM
Build with flag3
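The stage sequence above can be sketched as a loop in a Jenkinsfile. The flag values and make targets are examples only, and cleanWs() assumes the Workspace Cleanup plugin is installed:

```groovy
// Sketch: clean, check out, and build/test once per flag set, sequentially.
pipeline {
    agent any
    stages {
        stage('Build and test per flag') {
            steps {
                script {
                    for (flags in ['-O0', '-O2', '-O3']) {   // example flags
                        stage("Build ${flags}") {
                            cleanWs()        // Workspace Cleanup plugin
                            checkout scm     // fresh checkout each round
                            sh "make CFLAGS='${flags}' && make test"
                        }
                    }
                }
            }
        }
    }
}
```

Archiving the timing results of each round (for example with archiveArtifacts) would let you compare which flag set runs fastest afterwards.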
I am working with Jenkins, and we have quite a few projects that all use the same tasks, i.e. we set a few variables, change the version, restore packages, start sonarqube, build the solution, run unit/integration tests, stop sonarqube etc. The only difference would be like {Solution_Name}, everything else is exactly the same.
My question is: is there a way to create one 'shared' job that does all that work, while the job for building the project passes its variables down to that shared job? What I'm looking for is the ability not to have to create all the tasks for every one of our services/components. It would be really nice if each of our services/components could have only 2 tasks: one to set the variables, another to run the shared job.
Is this possible?
Thanks in advance.
You could potentially benefit from looking into the new pipelines as code feature.
https://jenkins.io/doc/book/pipeline/
Using this pattern, you define your build pipeline in a Groovy script rather than in the Jenkins UI. That script is then kept in the codebase of the project it builds, in a file called Jenkinsfile.
By checking this pipeline into a git repository, you can create a minimal configuration on the Jenkins side and simply tell it to look at a specific repo and do what the pipeline says.
There are a few benefits to this approach if it works for your setup. The big one is that your build pipeline is fully versioned, just like the project it builds. The repository also becomes portable: it can easily be built on any Jenkins installation, across as many jobs as you like, as long as the pipeline plugins are installed.
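On top of per-project Jenkinsfiles, Jenkins shared libraries let you define the common stages exactly once and call them from every project, which matches the "one shared job" idea. A sketch, where the library name 'build-common', the step buildSolution, and the build commands are all hypothetical:

```groovy
// vars/buildSolution.groovy in a shared library named 'build-common'
// (both names hypothetical): the common steps live here exactly once.
def call(Map cfg) {
    node {
        checkout scm
        bat "nuget restore ${cfg.solutionName}"
        bat "msbuild ${cfg.solutionName} /p:Configuration=Release"
        // ... start SonarQube, run unit/integration tests, stop SonarQube ...
    }
}

// Each project's Jenkinsfile then shrinks to two lines:
// @Library('build-common') _
// buildSolution(solutionName: 'MyService.sln')
```

Changing the library changes the behavior of every project that uses it, which is exactly the "edit once, apply everywhere" property you're after.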
I only have experience doing builds with Jenkins; deployment was via UrbanCode. Now, if I want to use Jenkins for release and deployment automation to replace UrbanCode, where should I start? I see workflow, pipeline, etc., but I don't know what Jenkins' exact solution for deployment automation is.
After spending a day or two, I am still struggling, this time with so many different plugins. I don't know exactly which plugins should be used, e.g. what's the relationship between this: https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin and this: https://wiki.jenkins-ci.org/display/JENKINS/Build+Pipeline+Plugin? I am desperately looking for a well-organised document on how to use the Pipeline plugin to build a complete build and deployment automation solution.
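For orientation: the Pipeline plugin (the first link) is the modern suite behind Jenkinsfile-based builds, while the Build Pipeline Plugin (the second link) is an older visualization that chains existing freestyle jobs. A minimal build-and-deploy pipeline with the former might look like the sketch below; the build command and scp targets are placeholders:

```groovy
// Sketch of a build + deploy pipeline with a manual gate before production.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make package' }   // placeholder build command
        }
        stage('Deploy to staging') {
            steps { sh 'scp build/app.tar.gz deploy@staging:/opt/app/' }
        }
        stage('Deploy to production') {
            input { message 'Promote to production?' }   // manual approval
            steps { sh 'scp build/app.tar.gz deploy@prod:/opt/app/' }
        }
    }
}
```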
I would like to be able to configure centrally something like "build profiles" which I can apply to multiple projects in Jenkins.
For instance, I want to setup a compile, email, deploy chain to be used by several projects. When I change something in this chain, I want to automatically apply the changes to all linked projects.
Is there a convenient way to do this? I am also open to suggestions for other build systems, as long as they can deal with sbt projects.
I see there is an sbt plugin for Jenkins which looks popular, though I haven't used it.
I have used the Jenkins Job DSL, which covers sbt out of the box. It works via a build step in a job that creates/regenerates other jobs (with an optional template).
The problem with having one generic job build separate projects is that all the job history gets merged together. I think it is better to use stand-alone jobs for each task, and the Job DSL will let you do that.
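A small Job DSL seed script along those lines, stamping out one stand-alone job per project; the project names, repo URL and sbt invocation are examples:

```groovy
// Job DSL seed script: generates one independent job per project,
// so each keeps its own build history.
['projectA', 'projectB', 'projectC'].each { name ->
    job("${name}-build") {
        scm {
            git("https://git.example.com/${name}.git")  // placeholder URL
        }
        steps {
            shell('sbt clean test')   // or the DSL's dedicated sbt step
        }
    }
}
```

Rerunning the seed job regenerates all the build jobs, so a change to this one script is applied to every project at once.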
TeamCity supports build configuration templates out of the box and recently added basic sbt support.