Migrating CruiseControl to Jenkins

I want to migrate my jobs from CruiseControl to Jenkins.
What are the steps I need to follow to achieve this?

1) Install Jenkins.
2) Use the Jenkins docs to create some simple jobs (in other words, learn how Jenkins works by using it for some simple demo projects).
3) Examine your existing CruiseControl jobs and break them up into groups whose builds are similar.
4) Migrate one group at a time by first creating a Jenkins job from one of the CruiseControl projects, then create the other jobs based on the first (Jenkins lets you create new jobs based on an existing job; a Script Console sketch of this follows below).
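For step 4, copying an already-migrated job can also be scripted from the Jenkins Script Console. A minimal sketch in Groovy; both job names here are placeholders, not anything from the question:

import jenkins.model.Jenkins

// Copy an existing, already-migrated job to use as the starting point for the next one.
def template = Jenkins.instance.getItem('first-migrated-job')   // placeholder name
if (template != null) {
    Jenkins.instance.copy(template, 'next-migrated-job')        // placeholder name
    println 'Created next-migrated-job as a copy of first-migrated-job'
}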

If you don't have specific needs, the migration is easy, as CruiseControl.NET simply executes tasks and builds reports, like all other continuous integration tools.
Simply create new jobs in Jenkins,
fill them with the executable tasks you had in CC.NET,
and (eventually) improve the dashboard if you want specific information.
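As a rough illustration of that mapping, an executable task from CC.NET can become a single build step of a freestyle job. A minimal Script Console sketch, where the job name and the build.cmd command are placeholders:

import jenkins.model.Jenkins
import hudson.model.FreeStyleProject
import hudson.tasks.BatchFile

// Create a freestyle job and give it one batch build step, mirroring a CC.NET exec task.
def job = Jenkins.instance.createProject(FreeStyleProject, 'migrated-from-ccnet')   // placeholder name
job.buildersList.add(new BatchFile('build.cmd'))   // placeholder for whatever CC.NET was executing
job.save()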

You did not specify your Build/Test/Deploy steps or any special needs,
but a vanilla CI flow would be:
1) Build using MSBuild
2) Test using MSTest
3) Deploy using MSBuild/NAnt/Ant/PowerShell or even batch scripts.
All those steps can easily be replicated in Jenkins, because there are dedicated plugins for all of them.
I would suggest selecting a simple Build > Test > Deploy flow from CC.NET and recreating it in Jenkins, then copying it for the other projects (a sketch of such a flow follows below).
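Assuming a Windows agent with the tools on its PATH, the same flow could also be written as a declarative pipeline. The solution, test container and deploy script names below are placeholders, not details from the question:

pipeline {
    agent { label 'windows' }   // assumes an agent labelled 'windows' exists
    stages {
        stage('Build') {
            steps { bat 'msbuild MySolution.sln /p:Configuration=Release' }
        }
        stage('Test') {
            steps { bat 'mstest /testcontainer:Tests\\bin\\Release\\MyTests.dll' }
        }
        stage('Deploy') {
            steps { bat 'powershell -File deploy.ps1' }
        }
    }
}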

Related

Creating a single GitHub repository with Jenkins for each DataStage job

I need to create a repository for each job present in the DataStage folder.
Previously I was doing this by creating a repository for every single job manually, one by one.
The problem is I have 1000 jobs in DataStage, so doing this manually is a time-consuming task.
Is there any way to automate this process in Jenkins?
Check out mettleci.com - they've built a whole suite of Continuous DevOps tools around DataStage, including interaction with Jenkins.

Jenkins with Shared jobs

I am working with Jenkins, and we have quite a few projects that all use the same tasks, i.e. we set a few variables, change the version, restore packages, start SonarQube, build the solution, run unit/integration tests, stop SonarQube, etc. The only difference would be something like {Solution_Name}; everything else is exactly the same.
My question is: is there a way to create one 'shared' job that does all that work, while the job for building the project passes the variables down to that shared worker job? What I'm looking for is the ability to not have to create all the tasks for all of our services/components. It would be really nice if each of our services/components could have only two tasks: one to set the variables, another to run the shared job.
Is this possible?
Thanks in advance.
You could potentially benefit from looking into the new pipeline-as-code feature.
https://jenkins.io/doc/book/pipeline/
Using this pattern, you define your build pipeline in a Groovy script rather than in the Jenkins UI. This script is then kept in the codebase of the project it builds, in a file called Jenkinsfile.
By checking this pipeline into a git repository, you can create a minimal configuration on the Jenkins side and simply tell it to look at a specific repo and do what the pipeline says to do.
There are a few benefits to this approach if it works for your setup. The big one is that your build pipeline is fully versioned, just like the project it builds. And the repository becomes portable: it can easily be built on any Jenkins installation, across as many jobs as you like, as long as the pipeline plugins are installed.
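A sketch of what such a Jenkinsfile might look like for the flow in the question; each service would carry an almost identical file, with only the solution name differing. The tool commands and names below are assumptions, not something the answer prescribes:

pipeline {
    agent any
    environment {
        SOLUTION_NAME = 'ServiceA.sln'   // the only per-project difference
    }
    stages {
        stage('Restore packages') {
            steps { bat 'nuget restore %SOLUTION_NAME%' }
        }
        stage('Build') {
            steps { bat 'msbuild %SOLUTION_NAME% /p:Configuration=Release' }
        }
        stage('Unit tests') {
            steps { bat 'vstest.console Tests\\bin\\Release\\Tests.dll' }
        }
    }
}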

Jenkins: how to test the slaves

I am creating a list of Jenkins jobs for sanity test of our Jenkins build environment. I want to create layers of jobs. The first layer of jobs will check the environment, e.g. if all slaves are up, the 2nd layer then can check the integration to other tools such as GitHub, TFS, SonarQube, then the 3rd layer can run some typical build projects. This sanity test can also be used to verify the environment after any major changes to the Jenkins servers.
We have about 10 slaves created on two servers, one Windows and one Linux. I know I can create a job to run on a specific slave, and therefore test whether that slave is online, but this way I would need to create 10 jobs just to test all the slaves. Is there a better approach to check whether all slaves are online?
One option is to use Jenkins Groovy scripting for a task like this. The Groovy plugin provides the Jenkins Script Console (a useful way to experiment) and the ability to run Groovy scripts as build steps. If you're going to use the Script Console for periodic maintenance, you'll also want the Scriptler plugin, which allows you to manage the scripts that you run.
From Manage Jenkins -> Script Console, you can write a Groovy script that iterates through the slaves and checks whether they are online:
for (node in Jenkins.instance.nodes) {
    // Basic identity and capacity of the slave
    println "${node.name}, ${node.numExecutors}"
    def computer = node.computer
    // Connection state, connect time, and the offline cause (if any)
    println "Online: ${computer.online}, ${computer.connectTime} (${computer.offlineCauseReason})"
}
Once you have the basic checks worked out, you can create either a standalone script in Scriptler, or a special build to run these checks periodically.
It often takes some iteration to figure out the right set of properties to examine. As I describe in another answer, you can write functions to introspect the objects available to scripting. And so, with some trial and error, you can develop a script that performs the checks you want to run.
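Building on that idea, one possible variant collects the offline slaves and throws, so a periodic sanity build fails visibly when any of them is down. A minimal sketch:

import jenkins.model.Jenkins

// Collect every node whose computer reports itself as offline.
def offline = Jenkins.instance.nodes.findAll { it.toComputer()?.offline }
if (offline) {
    def names = offline.collect { it.name }.join(', ')
    // Throwing makes the surrounding build (or Scriptler run) fail visibly.
    throw new RuntimeException("Offline slaves: ${names}")
} else {
    println "All ${Jenkins.instance.nodes.size()} slaves are online"
}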

Run a Jenkins job on another Jenkins instance from a Jenkins job

I want to create a Jenkins job that starts other Jenkins jobs. That would be quite easy, because the Jenkins Template Project Plugin allows us to create a build step of type "use builders from another project". However, what makes my situation harder is that I have to start Jenkins jobs on other machines. Is there any standard way to do that?
In case you only want to trigger a new build of the job, you have plenty of ways to accomplish it.
You can use the remote access API and send a request to build the target job from the source job:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Or you can use https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Remote+Trigger+Plugin
which is handy for handling server details and other settings. You should ensure SSH keys are shared by both servers.
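For the remote access API route, a rough sketch of the trigger from a Groovy build step; the URL, job name, user and API token are all placeholders, and the assumption is that authenticating with an API token avoids the CSRF crumb requirement:

// Trigger a build on another Jenkins instance via its remote access API.
def jenkinsUrl = 'https://other-jenkins.example.com'   // placeholder
def jobName    = 'downstream-job'                      // placeholder
def user       = 'automation-user'                     // placeholder
def apiToken   = 'xxxxxxxxxxxxxxxx'                    // placeholder

def conn = new URL("${jenkinsUrl}/job/${jobName}/build").openConnection()
conn.requestMethod = 'POST'
def auth = "${user}:${apiToken}".toString().bytes.encodeBase64().toString()
conn.setRequestProperty('Authorization', "Basic ${auth}")
println "Remote Jenkins responded with HTTP ${conn.responseCode}"   // 201 means the build was queued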

Centrally managed build profiles in Jenkins

I would like to be able to centrally configure something like "build profiles" which I can apply to multiple projects in Jenkins.
For instance, I want to set up a compile, email, deploy chain to be used by several projects. When I change something in this chain, I want the changes to be applied automatically to all linked projects.
Is there a convenient way to do this? I am also open to suggestions for other build systems, as long as they can deal with sbt projects.
I see there is an sbt plugin for Jenkins which looks popular, though I haven't used it.
I have used the Jenkins Job DSL, which covers sbt out of the box. This works via a build step in a job that creates/regenerates other jobs (with an optional template); see the seed-script sketch below.
The problem with having one generic job building separate projects is that all the job history gets merged together. I think it is better to use stand-alone jobs for each task, and the Job DSL will allow you to do that.
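A sketch of such a Job DSL seed script, generating one sbt build job per project; the project names, repository URLs and the sbt installation name ('sbt-default') are placeholders:

// Seed script run by the Job DSL build step: regenerate one build job per sbt project.
def projects = ['service-a', 'service-b', 'service-c']   // placeholder names

projects.each { name ->
    job("${name}-build") {
        scm {
            git("https://git.example.com/${name}.git", 'master')   // placeholder repository
        }
        steps {
            // Requires the sbt plugin and an sbt installation configured as 'sbt-default'.
            sbt('sbt-default', 'clean test package')
        }
    }
}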
TeamCity supports build configuration templates out of the box and recently added basic sbt support.
