Commit a file to SVN using Jenkins Job - jenkins

I am new to Jenkins. I want to build a Jenkins job that replaces files in an SVN repository and commits them. The repository contains folders and files that we currently replace and commit to SVN by hand. Using the Jenkins job itself I can place the files on a server, but I'm not sure whether that helps Jenkins replace and commit those files to SVN.
Is this possible with Jenkins? (We are using Jenkins Enterprise.) I have mostly seen posts and questions about triggering a build after a commit in SVN, so I'm not sure whether this is even possible.

Jenkins allows you to execute shell commands in a pipeline, so you can run the same commands you would on your local machine:
sh 'svn commit -m "my message"'
or, if you are on a Windows node, via the bat (or powershell) step:
bat 'svn commit -m "my message"'
Personally I have no experience with SVN, but this approach works well with Git and I strongly assume it also works with SVN.
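As a minimal sketch, a declarative pipeline along those lines could look like the following. It assumes the job's workspace already contains an SVN working copy under wc/, that the svn CLI is on the agent's PATH, and that the staging folder build/ and the SVN_USER/SVN_PASS credentials are placeholders:
pipeline {
    agent any
    stages {
        stage('Replace and commit') {
            steps {
                // Overwrite the working copy with the new files
                // (both paths are placeholders).
                sh 'cp -r build/* wc/'
                sh '''
                    cd wc
                    # Schedule any brand-new files, then commit.
                    svn add --force .
                    svn commit -m "Automated file replacement" \
                        --username "$SVN_USER" --password "$SVN_PASS"
                '''
            }
        }
    }
}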

Related

How to copy only changed files with Publish over SSH

I'm setting up Jenkins to build and then send changed files to a remote server using SSH. However, using the Publish over SSH plugin, I can only find an option to specify files to send over. I only want to send over files that have changed on GitHub. Is there a way to achieve this?
What you want to do might be outside of the scope of the Publish Over SSH plugin, but it is doable as a shell script.
You can run a command like this to get the files changed between the current commit and the last commit: git diff --name-only $GIT_PREVIOUS_COMMIT $GIT_COMMIT
Then using the results of that you can run a shell scp command.
You can do this in a pipeline or in an execute-script post-build action.
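A rough sketch of that script, assuming the Git plugin's GIT_PREVIOUS_COMMIT and GIT_COMMIT environment variables are available and that deploy@example.com:/var/www/app is a placeholder destination:
# List the files changed since the previous build's commit.
git diff --name-only "$GIT_PREVIOUS_COMMIT" "$GIT_COMMIT" > changed.txt
# rsync with --files-from preserves relative paths, so nested files
# land in the right place on the remote (scp in a loop works too).
rsync -av --files-from=changed.txt . deploy@example.com:/var/www/app/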

How to run Jenkins pipeline automatically when "git push" happens for specific folder in bitbucket

I have started using Jenkins recently and there is only one scenario where I am stuck. I need to run a Jenkins pipeline automatically when a git push happens for a specific folder on the master branch: the pipeline should run only if something is added to that specific folder.
I have already tried SCM with a sparse checkout path and listed my folder there, but that's not working.
I am using a GUI freestyle project; I don't know Groovy.
I had the same issue and I resolved it by configuring Git polling.
I used "Poll SCM" to trigger the build, together with the additional behavior of the Jenkins Git plugin named "Polling ignores commits in certain paths" > "Included Regions": my_specific_folder/.*
By the way, a sparse checkout path only makes Jenkins check out the folder you mentioned; it does not restrict which commits trigger a build.
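For anyone doing the same from a pipeline job rather than the GUI, here is a sketch of the equivalent checkout configuration. The repository URL and folder name are placeholders; PathRestriction is the Git plugin extension behind "Polling ignores commits in certain paths":
// Restrict polling to commits touching my_specific_folder.
checkout([$class: 'GitSCM',
    branches: [[name: '*/master']],
    extensions: [[$class: 'PathRestriction',
                  includedRegions: 'my_specific_folder/.*']],
    userRemoteConfigs: [[url: 'https://example.com/repo.git']]])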

Jenkins is checking out the entire SVN repo twice

I have a Jenkins Pipeline setup, and a Jenkins file which has the below content:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Hey'
            }
        }
    }
}
A post-commit hook triggers the Jenkins build successfully, and I can see it starting from the Jenkins UI. It states that it is checking out the repo to read the Jenkinsfile, and it stores that checkout in the workspace#script folder on the server.
Checking out svn https://<svn_server>/svn/Test/Core into C:\Program Files (x86)\Jenkins\jobs\CI_Build\workspace#script to read JenkinsPipeline/Jenkinsfile
Checking out a fresh workspace because C:\Program Files (x86)\Jenkins\jobs\CI_Build\workspace#script doesn't exist
Cleaning local Directory .
After this is complete, I make a change to a file in the repo and the build triggers via the post-commit hook as expected, but it then checks out the entire code base again into a folder called workspace. I would have expected the checkout to happen once and the "Use svn update as much as possible" option to kick in afterwards, updating only the changed files. Or is my logic wrong?
SVN version - 1.9.7
Jenkins version - 2.84
Jenkins has to know what is in your pipeline script before it knows whether it should check out your code. It is possible that your pipeline says not to check out the code, and that you step into a subdirectory and fire off the checkout yourself, or that you check out multiple repos in different places. Until Jenkins sees your Jenkinsfile, it can't know what you want. So it has to check out the repo once to read your pipeline, then again to do the work.
With Git (and maybe some versions of other SCM plugins), lightweight or sparse checkouts are supported, so Jenkins only grabs the Jenkinsfile instead of the entire repo. I don't think this is a supported option for SVN yet.
Update: lightweight checkouts are now supported by the SVN plugin, which added this feature in version 2.12.0 - see https://wiki.jenkins.io/display/JENKINS/Subversion+Plugin.
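If the duplicate full checkout is the real pain point and lightweight checkout is not available to you, one workaround (a sketch, not SVN-specific) is to suppress the implicit workspace checkout and run it explicitly, so you control when and where that second checkout happens:
pipeline {
    agent any
    options {
        // Suppress the automatic checkout at the start of the stages;
        // the initial checkout that reads the Jenkinsfile still happens.
        skipDefaultCheckout()
    }
    stages {
        stage('Build') {
            steps {
                // Explicitly check out the SCM the job is configured with.
                checkout scm
                echo 'Hey'
            }
        }
    }
}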

How to take backup of Jenkins jobs to remote machine

I want to take a scheduled backup of Jenkins jobs from the Jenkins server to some remote machine. I tried exploring several Jenkins plugins for that, but none of them were able to back up to a remote machine.
I can successfully back up the Jenkins workspace using a shell script, but in that case it's hard to restore the backup.
Any suggestions?
If I can suggest a different approach: if you're using any kind of source control, it's better to back up your files and configuration there. For example, if you work with Git, you can create a repository for your Jenkins configuration.
Back up your:
jobs folder
nodes folder
parent folder files (config.xml, all plugins configurations, etc.)
Then it is only a matter of a scheduled Jenkins job running every 12 hours (see the pipeline sketch at the end of this answer):
cd $JENKINS_HOME
git add --all
git commit -m "Automated backup commit: $BUILD_TIMESTAMP"
git push
* Make sure you have the right permissions to run those commands on the master
This will enable you to:
Keep backups of your Jenkins configuration
Manage versions of backups
View history of changes you made to your configurations
Hope it helps.
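For completeness, here is a sketch of that scheduled job as a declarative pipeline. It assumes $JENKINS_HOME is already a Git working copy with a remote configured and that the job runs on the master; the $BUILD_TIMESTAMP above comes from the Build Timestamp plugin, so plain date is used here instead:
pipeline {
    agent { label 'master' }          // assumption: backup runs on the master
    triggers { cron('H */12 * * *') } // roughly every 12 hours
    stages {
        stage('Backup') {
            steps {
                sh '''
                    cd "$JENKINS_HOME"
                    git add --all
                    # "|| true" keeps the job green when nothing changed.
                    git commit -m "Automated backup commit: $(date +%F-%H%M)" || true
                    git push
                '''
            }
        }
    }
}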

Get Git commit from "upstream" build in manually triggered Jenkins job

I have a Build job in Jenkins that checks out a specific Git commit and packages it for deployment as artifacts.
There is a later Deployment job that takes the built artifacts and actually deploys the code. It also does a sparse Git checkout of a specific directory containing deployment scripts. After successful completion, we write a Git tag.
The problem is that the tag is being written to the HEAD of master, not to the hash of the commit used for the original upstream build. (master is the branch defined in the job configuration.)
Is there a way to get the upstream SCM information if it's not passed directly through a parameterized trigger? I can see commits listed in the build.xml file that Jenkins generates in the build directory; is there a way to read this information from the downstream job?
I realize that it's not really "downstream", since it's manually triggered. We do have a selector that defines UPSTREAM_BUILD and UPSTREAM_PROJECT, though.
If you are using the Copy Artifact plugin, you could write a file with the commit hash during the Build job and read it back in during the Deployment job:
# Build
echo ${GIT_COMMIT} > COMMIT_HASH
# Deployment, after copying COMMIT_HASH into the workspace
git checkout $(cat COMMIT_HASH)
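As a pipeline sketch of the same idea, assuming the Copy Artifact plugin's copyArtifacts step and that UPSTREAM_PROJECT and UPSTREAM_BUILD arrive as build parameters:
// In the Build job: record and archive the commit that was built.
sh 'echo "$GIT_COMMIT" > COMMIT_HASH'
archiveArtifacts artifacts: 'COMMIT_HASH'

// In the Deployment job: fetch the file from the selected upstream
// build, then check out that exact commit instead of the HEAD of master.
copyArtifacts(projectName: env.UPSTREAM_PROJECT,
              selector: specific(env.UPSTREAM_BUILD))
sh 'git checkout "$(cat COMMIT_HASH)"'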
