Migration from Jenkins to TeamCity

How can I migrate all build logs, jobs, and config files from Jenkins to TeamCity?
Migrating from TeamCity to TeamCity is possible, since it is the same CI server. But what are the steps to migrate from Jenkins to TeamCity?

There is no direct migration path.
You must manually recreate your projects, build configurations, and build steps based on the technology you use.
It is not possible to convert build history, artifacts, or other data from Jenkins.

Related

How to get an old build in Jenkins

I just tried to find an old successful build in Jenkins, but I could not find a way.
Can anyone help me find an old build in Jenkins?
Jenkins has a setting that controls how long, or for how many builds, the build history and the build artefacts are kept.
If you still have the build in the history but you lost the artefacts, you can rebuild it from the same commit hash. If the build definition and the build environment (the Jenkins slave) are the same, you should obtain the same build result.
This is the moment when all the build engineering good practices you kept will pay off.
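As a minimal sketch (the repository URL, tag name, and build command are placeholders, not from the question), rebuilding from the recorded hash looks like this:

    # Hypothetical rebuild of a lost artefact from the commit it was built from.
    # Repo URL, tag, and build command are placeholders.
    git clone https://example.com/your/repo.git
    cd repo
    git checkout v1.2.3   # or the raw commit hash recorded in the build artefact
    ant clean dist        # run the exact build command the original Jenkins job ran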
To help you more, you need to tell us:
is it a UI (traditional) build definition, a pipeline defined in Jenkins, or a pipeline based on a Jenkinsfile committed to the repository?
do you tag the source code repository (hopefully Git) with the version of the build, or at least store in the build artefact the hash it was built from?
do you store your build artefacts in a repository so you can trace the history (Artifactory, Nexus, a Docker registry, a GCP repository, etc.)?

From TFS + TeamBuild to Jenkins + Perforce

Our current CI build process is all around TFS and its TeamBuild. We're evaluating Jenkins + Perforce as our future solution.
My question is: how do I translate everything I have configured in the TFS build definitions to Jenkins in order to make it build my projects? My understanding is that Jenkins uses MSBuild's config file to build. If that's true, does that mean I'll somehow have to put all the information currently in the TFS build definitions into MSBuild's config file?
Jenkins does not use any MSBuild files by default. It might be possible using plugins, though, or by simply calling msbuild directly, which has the same effect. If you know how to build from the command line, it is trivial to do it from Jenkins. Simply create a freestyle project and add an "Execute Windows batch command" build step.
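A minimal sketch of such a step, assuming a solution file named MySolution.sln at the workspace root (the file name and configuration are placeholders):

    rem Build the solution in Release configuration; adjust the path to your checkout layout
    msbuild MySolution.sln /t:Rebuild /p:Configuration=Release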

Do I specify binary artifact settings in the build scripts or the CI server?

I'm prototyping a new build system using Jenkins, Gradle, and Artifactory. There seem to be conflicting, or rather overlapping, features in these tools with regard to specifying the build artifacts and their destination. I see three paths going forward:
Specify the artifact settings on the particular task in Jenkins, using the Jenkins Artifactory plugin.
Specify the artifact settings in the Gradle build scripts, using the Gradle Artifactory plugin.
Specify generic Maven repo settings in the Gradle build scripts, using the standard Gradle "maven" plugin (see the sketch after this list).
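For concreteness, a minimal sketch of the third option using the old "maven" plugin's uploadArchives task (the URL and credential property names are placeholders, not taken from any of these tools' docs verbatim):

    // build.gradle - generic Maven-style deployment, no Artifactory-specific plugin
    apply plugin: 'java'
    apply plugin: 'maven'

    uploadArchives {
        repositories {
            mavenDeployer {
                repository(url: 'https://artifactory.example.com/artifactory/libs-release-local') {
                    // placeholder property names; define them in gradle.properties
                    authentication(userName: project.findProperty('repoUser'),
                                   password: project.findProperty('repoPassword'))
                }
            }
        }
    }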
I see pros and cons to all of these approaches, but as far as I can see, none of them is missing a critical feature for our builds.
To further my confusion, the Gradle Artifactory plugin wiki states:
Build Server Integration - When running Gradle builds in your continuous integration build server, it is recommended to use one of the Artifactory Plugins for Jenkins, TeamCity or Bamboo to configure resolution and publishing to Artifactory with build-info capturing, via your build server UI.
So, some questions to get the conversation going:
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
What about version control for artifact changes done through the CI interface?
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI rather than the POMs. Is this just bad build practice?
Is there a killer tool integration feature that separates one of these approaches from the other?
How useful is the build info object? Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
I am really hoping to hear from existing users of these tools and what pitfalls/requirements may have led them to one of the approaches above (or perhaps even a better one that I haven't considered yet).
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
I'd say that's the way to go. Your build server is the single point of truth, and only artifacts built in the build server should be deployed.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
That one is simple: you shouldn't deploy while your CI server is down. Building on a local machine might produce wrong artifacts, which shouldn't be deployed.
What about version control for artifact changes done through the CI interface?
Not sure I understood your question.
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI, rather than the pom's. Is this just a bad build practice?
This configuration ignores Maven's ability to deploy, and I am not sure I can find a good scenario to justify it. The only thing I can think of is deferred deployment, but the Artifactory plugin can take care of that.
Is there a killer tool integration feature that separates one of these approaches from the other?
Now we get to the essence :)
Well, defining what you deploy in your build script (in the case of Gradle) gives you the flexibility to fine-tune every aspect of the deployment (think about the dynamic properties you might want to add in certain cases). Another very serious advantage is that your build script is source code, which means it is versionable in your version control.
The advantage of defining the deployment details in the build server configuration is that the build server is the only place the deployment should occur. So, if you don't have the deployment details in your build script, you know for sure it won't be deployed standalone.
So, how can you combine the two to get the advantages of both worlds?
Code your deployment logic in your Gradle script using the Artifactory plugin DSL, and provide details like the username and password from properties that exist on the build server only.
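A minimal sketch of what that could look like (the repository key, publication name, and property keys are assumptions for illustration, not taken from the question):

    // build.gradle - the deployment logic lives in source control,
    // but the credentials live only in the CI server's gradle.properties
    apply plugin: 'artifactory-publish'

    artifactory {
        contextUrl = 'https://artifactory.example.com/artifactory'
        publish {
            repository {
                repoKey = 'libs-release-local'
                // resolves to null on a developer machine, so a standalone deploy fails
                username = project.findProperty('artifactory_user')
                password = project.findProperty('artifactory_password')
            }
            defaults {
                publications('mavenJava')   // assumes a maven-publish publication of this name
            }
        }
    }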
How useful is the build info object?
Extremely useful. The information in buildInfo is harvested during the build process, and buildInfo is the only place it exists. Having this information is the only way you will be able to reproduce this build in the future.
Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
The 'artifactory' and 'artifactory-publish' Gradle plugins both generate the buildInfo object, regardless of where they are running (be it your local machine or the Jenkins build server).

Publish to Artifactory copied from another project artifacts in Jenkins

So I have a few separate jobs in Jenkins. The first one gets the project from a Git repository, builds it, and produces artifacts. Another one has to copy certificates from the first job and publish them to Artifactory (I tried to do this using the Artifactory plugin). But the thing is that the Artifactory plugin is only available in the build job; there is nothing like "Generic-Artifactory integration" in the second job's configuration.
Does anyone know what the requirements are for making the plugin work in the publish job?
You can write a small shell script leveraging the Artifactory REST API and execute it in your second, non-build job.
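A minimal sketch, assuming the host, repository, and target path shown here (all placeholders), with the credentials injected by the job rather than hard-coded:

    # Upload (PUT) a file to an Artifactory repository via the REST API
    curl -u "$ART_USER:$ART_PASSWORD" \
         -T certs/server.crt \
         "https://artifactory.example.com/artifactory/generic-local/certs/server.crt"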
I have done a similar thing with Maven and a zip file. I deployed the zip with a build step in Maven calling deploy:deploy-file, setting my Artifactory repository in settings.xml and deploying directly to my Artifactory repository.
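Something along these lines (the coordinates, file name, and URL are placeholders; repositoryId must match a <server> entry in settings.xml that holds the Artifactory credentials):

    mvn deploy:deploy-file \
        -Dfile=dist/package.zip \
        -DgroupId=com.example \
        -DartifactId=package \
        -Dversion=1.0.0 \
        -Dpackaging=zip \
        -DrepositoryId=artifactory \
        -Durl=https://artifactory.example.com/artifactory/libs-release-local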

Retrieving old builds for re-deployment in Jenkins

I have built an automated deployment system using Jenkins, Subversion and ANT on a set of environments. It all works, allowing me to deploy old tagged releases to a set of environments or automatically deploy the latest build using the Subversion Release Manager within Jenkins.
The problem is that on the client site we have to utilise Perforce (which does not currently have a Release Manager plugin for Jenkins; I don't really want to write one, but it is possible). What is the best way to set up Jenkins to be able to deploy specific releases to environments? I started looking at Ivy and Artifactory as possibilities.
If anyone has any suggestions, or any guides online, that would be great!
We currently build a WAR file for each Subversion checkin and make it available with the Copy Artifact plugin. Redeploying is merely a matter of executing the deploy task for this WAR, as sketched below.
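As a hedged sketch of that redeploy job (the ANT target and property names are assumptions): the Copy Artifact plugin first places the chosen build's WAR into the workspace, and a shell step then hands it to the existing deploy target:

    # $WORKSPACE is set by Jenkins; the target and property names are placeholders
    ant deploy -Dwar.file="$WORKSPACE/app.war" -Denv=staging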
