Jenkins build pipeline not reflecting changes - jenkins

We have a Jenkins build pipeline for CI/CD. The build is successful, and when the build is deployed to UAT it also reports as successfully completed. However, the changes are not reflected in UAT once the application is up, and I don't see any errors in the logs.
The correct build and instance are being used.
Any idea what the issue could be?
regards,
R

Related

What should be mentioned in the build stage and the deploy stage in a Jenkins script? What is the difference between the deploy and build stages?

I tried searching on Google for a better understanding, but no luck.
Deploy should mean: take all of my artifacts and either copy them to a server or execute them on a server. It should truly be a simple process.
Build means: process all of my code/artifacts and prepare them for deployment, i.e. compile, generate code, package, etc.
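As a rough illustration, a minimal declarative Jenkinsfile might look like the sketch below; the Maven command, WAR name, server and path are placeholders, not anything from the question:

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    // Build: compile, test and package the code into a deployable artifact
                    sh 'mvn clean package'
                }
            }
            stage('Deploy') {
                steps {
                    // Deploy: copy the packaged artifact onto the target server
                    // (hypothetical host and path -- replace with your own)
                    sh 'scp target/myapp.war deployer@uat-server:/opt/tomcat/webapps/'
                }
            }
        }
    }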

Build fails on VSTS with error connect ETIMEDOUT 52.173.242.81:8080, while on Jenkins the build executes successfully

I'm trying out DevOps with VSTS, Docker and Java, and I'm stuck with failing builds (screenshots attached).
The VSTS build fails at the "Queue Jenkins job" task.
The job executes successfully on Jenkins and the .war file is produced.
This was set up on Azure following this document:
https://github.com/msdevno/hol-oss-devops
Another approach is to have the commit to VSTS trigger the build in Jenkins, and then have Jenkins trigger the release in VSTS (see Setting up CI/CD with the TFS Plugin for Jenkins). This would not require using VSTS for the build.
I was able to resolve the above issue; my guess turned out to be correct.
Changing the public IP from dynamic to static caused this issue.
I looked into the Jenkins configuration (Manage Jenkins > Configure System > Jenkins Location > Jenkins URL). It still contained the old dynamic IP; after changing it to the new static IP, it works fine. The build succeeds and the status is updated on VSTS as well :)
Thanks for sparing your time to look into my issue #Starain-MSFT & #Donovan

Run script before removing job in Jenkins Pipelines

I'm setting up a development environment where I have Jenkins as the CI server (using pipelines), and the last build step in the Jenkinsfile is a deployment to staging. The idea is to have a staging environment for each branch that is pushed.
Whenever someone deletes a branch (sometimes after merging), Jenkins automatically removes its respective job.
I wonder if there is a way to run a custom script before the automatic job removal, so that I could connect to the staging server and stop or remove all services that are running for the job that is about to be deleted.
The multibranch-action-triggers-plugin might be worth a look.
This plugin enables building/triggering other jobs when a Pipeline job is created or deleted, or when a Run (also known as a Build) is deleted by a Multibranch Pipeline job.
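I haven't gone beyond that, but the rough idea would be to point the plugin's delete event at a separate cleanup job that tears down the matching staging environment. A minimal sketch of such a cleanup job follows; the BRANCH_NAME parameter and the docker-compose project naming are assumptions about how the staging services are organised, not part of the plugin itself:

    // Hypothetical cleanup job that the plugin could trigger when a branch's
    // Pipeline job is deleted (parameter name and docker-compose layout are assumed)
    pipeline {
        agent any
        parameters {
            string(name: 'BRANCH_NAME', description: 'Branch whose staging environment should be removed')
        }
        stages {
            stage('Tear down staging') {
                steps {
                    // Assumes each branch's staging services run as a docker-compose
                    // project named after the branch -- adjust to your setup
                    sh "docker-compose -p staging-${params.BRANCH_NAME} down --volumes"
                }
            }
        }
    }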

Jenkins deleting latest build when SCM polling is configured

I have a Maven Jenkins job which builds to a directory called 'build.x86_64'. All of the artifacts are built into this directory. For some reason, if I enable SCM polling, this directory gets deleted after the build completes. I can't see anything in the console output which says it is deleting the target.
Jenkins does, however, keep the build artifacts in its own configured directory:
/var/lib/jenkins/jobs/[my job]/builds
I have a downstream job which needs the artifacts, but they keep getting deleted.
If I turn off SCM polling and use the 'Build Now' option in the GUI, it doesn't delete the build directory. I can't see anything in the configuration which could cause this. The Jenkins job is cloned from one with the same configuration, and the problem does not occur in the job it was cloned from.
This was caused by a misconfiguration of the Source Code Management section of the Jenkins config. Under Additional Behaviours I had added 'Clean before checkout'; it should have been set to 'Clean after checkout'.
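For reference, if the checkout were driven from a Pipeline rather than the job UI, the equivalent Git plugin behaviours would look roughly like this (extension class names as I recall them from the Git plugin, so verify against your plugin version; the repository URL is a placeholder):

    // 'Clean after checkout' -- wipes untracked files right after checkout,
    // so the next polled build does not remove the previous build's output
    checkout([$class: 'GitSCM',
              branches: [[name: '*/master']],
              userRemoteConfigs: [[url: 'https://example.com/my-repo.git']],
              extensions: [[$class: 'CleanCheckout']]])

    // 'Clean before checkout' (what was configured here) would instead be
    //     extensions: [[$class: 'CleanBeforeCheckout']]
    // which wipes the workspace -- including build.x86_64 -- before each checkout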

Jenkins automation

Is there any way in Jenkins to detect a failed build, have the job revert the Perforce code to the last successful build's changelist, and fire a build again?
Flow:
1. As soon as we have a failed build, a notification is sent out to the dev team with the possible check-ins that caused the build failure.
2. Revert the recent code back to the last working code and submit it.
3. Initiate a build.
It is possible, but I don't see any reason or use case to do it, as it is not a correct workflow and can be confusing.
But if you decide to do it, the following steps are required. Here is an example using Perforce source control with the Perforce plugin for Jenkins.
Steps inside the job settings:
1. Before the build triggers, save the previous changelist number, $P4_CHANGELIST - 1.
2. Build.
3. Get the last error code from the batch step.
4. If the code != 0, check out and build changelist $P4_CHANGELIST - 1 (see the sketch below).
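A very rough pipeline-flavoured sketch of that last step; P4_CHANGELIST is set by the Perforce plugin, while the depot path and build script are placeholders for whatever the job actually runs:

    // Sketch only: if the build of the current changelist fails, sync back
    // one changelist and try again (depot path and build.sh are assumed)
    def prevChange = (env.P4_CHANGELIST as Integer) - 1
    def status = sh(returnStatus: true, script: './build.sh')  // capture the exit code instead of failing
    if (status != 0) {
        sh "p4 sync //depot/project/...@${prevChange}"
        sh './build.sh'
    }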
Jenkins is not a production server. It runs tasks and, as far as I know, does not have options for that purpose.
What is your source code? A webapp? Something else?
What steps are you performing?
Are you running any automated tests?
My assumption is that you have some tests that may invalidate the build.
These tests should be run:
* on a mock server, to prevent deploying to your real server
* or somewhere else
That way, if the build fails, nothing is deployed.
If the build succeeds, you can deploy your project normally.
If this doesn't answer your question, please provide the requested information so we can understand your job process a bit better.
If you're using an artifact repository like Nexus or Artifactory to manage your project artifacts, then you could always redeploy the previous working version of your application when a failure is detected.
You're not backing out any checked-in code that potentially broke the build, but you are preserving your test environment. You can configure Jenkins to notify the user who checked in the latest erroneous change set, and they can work on resolving the issue.
Jenkins also provides a rich API which allows you to delete a job, start a job, and get information about previously run jobs. You could leverage some of these services along with your artifact repo to achieve the experience you described.
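For example, when a failure is detected you could hit the Jenkins remote API to start a job that redeploys the previous known-good artifact. Something along these lines, where the host, job name and credentials ID are all placeholders:

    // Hypothetical snippet: call the Jenkins REST API to trigger a redeploy job
    withCredentials([usernamePassword(credentialsId: 'jenkins-api-token',
                                      usernameVariable: 'JUSER',
                                      passwordVariable: 'JTOKEN')]) {
        // POST /job/<name>/build queues a build of that job
        sh 'curl -X POST -u "$JUSER:$JTOKEN" https://jenkins.example.com/job/redeploy-previous-version/build'
    }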
