Publish latest build artifact from "LOCAL" Jenkins to Azure DevOps Release Pipeline? - jenkins

I have a local Jenkins server running on one of my spare computers (Win10). Note that it is not exposed through any public-facing server and hence is only available within my local network. I have set it up so that it continuously fetches from my remote Git repo, builds the artifacts, and archives them on a successful build. I would like to publish these archives to my Azure DevOps release pipeline. How do I do this? (And yes, I have looked through all the tutorials, but they assume that I have Jenkins running on a VM somewhere in the cloud.)
So far I have had no luck with the tutorials on the web, since I do not really have a URL to this instance of Jenkins; it is only available on my local network. I cannot really build these artifacts on a remote Jenkins server, so I am restricted to using this setup for running the builds.
I am looking to have the archives that Jenkins builds be directly available within my Azure DevOps release pipeline on every successful build. Thanks for the help!

Since nobody else has answered this, I am going to detail what I ended up doing (maybe not the best approach, but it works for my setup; suggestions are welcome!).
To interface with the Azure DevOps platform from a local machine, you will need to configure a self-hosted agent (based on your specific OS), which will allow you to trigger builds, and to archive and upload the build artifacts to the Azure DevOps platform. This way you also do not have to poll for SCM changes (which I find inelegant anyway).
1. First, go through the setup outlined here for your local self-hosted agent:
Windows: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/v2-windows?view=azure-devops
Linux: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/v2-linux?view=azure-devops
MacOS: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/v2-osx?view=azure-devops
NOTE: I have chosen to run the agent as a service on Windows for my setup.
2. Next, set up your Jenkins build job as you normally would, with your usual repo access. Things to keep in mind:
Under "Build Triggers", select the "Poll SCM" option but leave the schedule blank; this makes sure that the trigger coming from your post-commit hook (via the agent) works. Example setup shown below:
Under "Post-build Actions", make sure that you are archiving the artifacts as required. Example shown below:
3. Now it is time to set up your project's "Jenkins Service Connection"; this can be accessed from the Project Settings tab on the bottom left of your project view in Azure DevOps. Note that this is what lets your self-hosted agent locate and communicate with the Jenkins instance running locally (or at any other network-accessible location!). Go to Pipelines -> Service Connections and add a new service connection for Jenkins. The trick here is to use the URL for the connection as seen by your local self-hosted agent, which means it can be any IP (including localhost) that the agent can reach normally. The username and password are the same as the ones you set up in Jenkins. Example shown below:
NOTE: You can try "Verify and Save", but it will throw an error, so either ignore the error or just go ahead and "Save without verification". Also, you will have to do this per project, unlike the self-hosted agent setup, which is per machine.
4. Now you just need to configure your build pipeline to hand jobs to the right agent and point it at the right service endpoint. Under your build pipeline settings, use the agent pool that contains the self-hosted agent(s) that can access your build servers, and choose the Jenkins service connection that you just created in the step above. The rest of the setup is identical to how you would normally set up your project's build pipeline. An example would be as follows:
NOTE: The key here is the correct "Job name" (this should be the same as the one you have set up in your Jenkins instance) and the correct "Jenkins service connection".
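If you prefer a YAML pipeline over the classic designer, this step would look roughly like the sketch below. The pool name, service connection name, and job name are placeholders for whatever you created in steps 1 and 3 (not values from my actual setup), and you should check the task reference for the exact inputs of the "Jenkins queue job" task.

```yaml
# azure-pipelines.yml (sketch) -- run on the self-hosted agent and queue the Jenkins job.
# Pool, connection, and job names below are hypothetical; replace them with your own.
pool:
  name: Default                      # assumption: the pool your self-hosted agent registered to in step 1

steps:
- task: JenkinsQueueJob@2            # the "Jenkins queue job" task
  inputs:
    serverEndpoint: 'LocalJenkins'   # hypothetical name of the Jenkins service connection from step 3
    jobName: 'my-build-job'          # must match the job name configured on your Jenkins instance
    captureConsole: true             # stream the Jenkins console log into the Azure DevOps build
    capturePipeline: true
```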
5. The rest is straightforward: you now just need to make sure that you have a "Download artifacts" step (NOT necessary if you do not want the artifacts on the DevOps platform) and a "Publish Artifacts" step (this is needed for your release pipeline to see the build artifact, and to trigger the release too if you want) after your Jenkins queue job step. Make sure to set the correct job directories for download from your local self-hosted agent. Example setup for both steps:
NOTE: If you are having trouble with the paths for download and publish, refer to this link for the predefined variables available to self-hosted agents: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
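Purely as a hedged sketch again, the YAML equivalents of these two steps might look something like the following. The input names for the Jenkins download task are my assumptions and can differ between task versions, so double-check them against the task documentation; the paths and artifact name are illustrative.

```yaml
# azure-pipelines.yml (sketch, continued) -- pull the archived artifacts down from Jenkins,
# then publish them to Azure DevOps so the release pipeline can pick them up.
steps:
- task: JenkinsDownloadArtifacts@1               # input names are assumptions; verify per task version
  inputs:
    jenkinsServerConnection: 'LocalJenkins'      # same hypothetical service connection as above
    jobName: 'my-build-job'
    saveTo: '$(Build.ArtifactStagingDirectory)/jenkinsArtifacts'
    jenkinsBuild: 'LastSuccessfulBuild'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/jenkinsArtifacts'
    ArtifactName: 'drop'                         # the artifact name your release pipeline will reference
```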
6. Now, in your release pipeline, you should be able to add the artifact source from your build pipeline. Example shown below:
With that, you should be able to get your local artifacts into the cloud on the Azure DevOps platform, in case you cannot use the build agents provided by Microsoft for any reason!

Related

Is it possible to integrate SonarQube, Jenkins and GitLab (all in dockers)?

Currently, I am working on a quality process to ensure that the code is acceptable. For that, I'm integrating Jenkins, SonarQube and GitLab, which are running on different servers (actually, they are in different Docker containers).
The idea is to check the code with SonarQube every time it is pushed to GitLab, and to block commits, merges, and so on, if the SonarQube check has not passed.
I have already integrated Jenkins with SonarQube, but Jenkins checks the code inside its workspace, so imagine a situation where a developer needs to push changes from their laptop.
My conceptual question is simple: is it possible to integrate these technologies to do this? And, if the answer is yes, which steps are necessary?
PS: I don't need to see code, configuration files, and so on. I just need something like:
Configure SonarQube to work with Jenkins
Write a script to copy that file into that folder,
...
First, "in Docker" means each tool is in its own container.
They only need to see each other over the network, which is where a Docker Engine in Swarm mode comes in.
Second, "configure Jenkins to work with SonarQube": that is what I have done in my shop, and there isn't much to it.
Once the Jenkins SonarQube plugin is installed, and the address for the SonarQube server entered, you can configure your job and call sonar (for instance with maven: $SONAR_MAVEN_GOAL -Dsonar.host.url=$SONAR_HOST_URL)
The analysis done in the Jenkins workspace will then be published to the SonarQube server.
A swarm server is the more modern version of this 2015 docker-compose.yml file from the marcelbirkner/docker-ci-tool-stack project.
The idea remains the same though: each element is isolated in its own container.
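To make the idea concrete, a minimal compose sketch in that spirit might look like the following. The image tags, ports, and network name here are illustrative assumptions rather than the exact stack from that project.

```yaml
# docker-compose.yml (sketch) -- one container per tool, all on a shared network,
# in the spirit of the marcelbirkner/docker-ci-tool-stack project mentioned above.
version: "3"

services:
  jenkins:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"     # Jenkins web UI
    networks:
      - ci

  sonarqube:
    image: sonarqube:lts
    ports:
      - "9000:9000"     # SonarQube web UI / API
    networks:
      - ci

  gitlab:
    image: gitlab/gitlab-ce:latest
    ports:
      - "80:80"
      - "443:443"
    networks:
      - ci

networks:
  ci:                   # shared network so the three containers can reach each other by service name
```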
I haven't tried it myself, but https://gitlab.talanlabs.com/gabriel-allaigre/sonar-gitlab-plugin could be interesting in your setup.

Communicate between Jenkins server without setting up master slave relation

I would like to set up a Jenkins server that would run test scripts based on successful build deployments on other Jenkins servers. For example, if the QA Jenkins server is named JQA1OnMachine1 and I have three others named J2OnMachine2, J3OnMachine3, and J4OnMachine4 (different Jenkins servers on different boxes), can JQA1OnMachine1 (the QA Jenkins) poll the others at a regular interval to see if a build was deployed successfully? If so, can anyone tell me how?
Jenkins master/slave along with the Jenkins Pipeline plugin would be one of the better ways to implement this; however, since you don't want to use that approach, you can explore PSTools to remotely capture processes or files on a different server.
Your builds could update a file on the build server once the build completes, and your QA machine can run a script with PSTools that monitors that file and triggers the QA testing based on its content.

How to modify jenkins Web Interface to pick artifacts to publish?

I have a project to upgrade my Jenkins. The problem is that I use a Jenkins plugin called "Publish Over SSH". As I understand it, that plugin can publish our artifact files to another server over an SSH connection, but in its configuration it always runs after the job finishes. So every job that I run always publishes to the server.
So I want to ask: how can I pick which artifacts from the build history I want to publish, and publish them only when I want to? I had the idea of adding a button to the Jenkins web interface to trigger adding a job configuration, building it, and publishing it over SSH, but I don't know how to modify the Jenkins web interface. My server uses Tomcat, which runs the Java ".war" file.
So, do you have any suggestions for my problem: how to modify the Jenkins web interface, or how to pick certain build artifacts?
Thanks...

Jenkins- Create Jobs on different servers

I want to configure Jenkins to build my code on one server and then deploy it on another server, also using Jenkins. Both servers are running Linux. I want to automate the entire process as much as possible. I went through some of the plugins like Pipeline, the Job Import Plugin, etc.
Can anyone guide me on how to go about it? Which plugins will be useful? Any example or tutorial somewhere would be useful. The configuration of the Build Pipeline plugin on Jenkins was not seamless for me.
Thanks,
Bhargav
I would work it this way:
1. Install Jenkins on your first server.
2. Install the following plugins: SSH Credentials, SSH Slaves, and Copy To Slave, and restart Jenkins.
3. Go to Manage Jenkins -> Manage Credentials, and add SSH credentials for your second server.
4. Go to Manage Jenkins -> Manage Nodes, and create a passive slave. The launch method should be "Launch slave agents on Unix machines via SSH". You should use the credentials that you added in step 3.
5. Create a job to build your code. In the advanced options of the job, indicate that the job must only be built on the master node.
6. Create a job to deploy your code on the second server. In the advanced options of the job, indicate that the job must only be built on the slave node.
7. In the "Build Environment" section of the deploy job, check the "Copy files into workspace before building" box and configure which files you want to copy from the first server (https://wiki.jenkins-ci.org/display/JENKINS/Copy+To+Slave+Plugin).
The code will be copied into the Jenkins slave's workspace.

Need help on automating QA, Stage, Prod deploy using Jenkins\Hudson

We are using Hudson as a CI tool. At present we need to use Jenkins to deploy the build to the Stage and Prod environments. What is the best approach we should follow?
I know about the Promoted Builds plugin, but the issue is authentication. I want that whenever we need to promote a build to deploy to Stage or Prod, it should ask for network credentials first, and then the promote job should execute the batch command using the credentials supplied. At present the promote plugin runs using the credentials with which the Tomcat server is configured to run.
Same issue with the Build Pipeline plugin.
I want that neither a dev nor even the Hudson admin should be able to execute the promote build unless credentials are supplied. (We have Windows Server 2008 R2.)
Can you please help me resolve this issue, so that whenever a user clicks Promote Build to QA\Stage\Prod, the plugin asks for credentials (or uses the logged-on user's credentials) and executes the batch script using only the logged-on user's credentials, and not the credentials of the account with which the Tomcat server is configured?
Can you please help me?
Please suggest the best approach for making automated builds on prod\stage.
For deployment I normally use SSH; private/public keys take care of the authentication problems normally associated with running commands on other servers.
SSH is normally associated with Unix-based systems, but it does support Windows.
Finally, I would recommend decoupling your build system (Jenkins) from the system performing the deployment by using an intermediate repository. See the following answer for more details:
Jenkins: how to check out an artifact from Nexus and deploy it on Tomcat
