Perform Jenkins build on local files

We have a Jenkins server where I have already defined my job. It uses Perforce as SCM.
I would like to replicate all the steps that Jenkins takes to build the project, but use the files in my local workspace instead. Basically, I would like to run a Jenkins build locally based on a job defined on another server.
How would I go about doing that?

Something like what I created for my Perforce users might work for you -- I added a job in Jenkins that will grab shelved files (so, the user would need to shelve the files), create a build from there, then let the user know if it was successful (they also have the option of running tests or creating a deployable build). The gist of it is to request the shelved changelist #, then do this: "p4 unshelve -s %SHELVEDCL% " and proceed as usual. They use it when they feel like it; it's been useful. But it does require access to Jenkins.
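A minimal sketch of what that job's shell step might look like, assuming a string parameter named SHELVEDCL and a hypothetical build script build.sh standing in for your usual build command:

```shell
# Sketch of the "build from a shelved changelist" job's shell step.
# Assumes the Perforce SCM step has already synced the workspace, the
# job defines a string parameter SHELVEDCL, and build.sh is a
# placeholder for your real build command.
build_shelved() {
    shelvedcl="$1"
    if [ -z "$shelvedcl" ]; then
        echo "SHELVEDCL parameter is required" >&2
        return 1
    fi
    # Pull the user's shelved files into this workspace...
    p4 unshelve -s "$shelvedcl" || return 1
    # ...then proceed with the normal build.
    ./build.sh
}
```

In the job configuration this would simply be an "Execute shell" build step that calls `build_shelved "$SHELVEDCL"`.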

1) Install Jenkins on your local workstation (if you have not already done so).
2) Copy the job's directory from the server's /Jenkins/jobs/ directory into the /Jenkins/jobs/ directory on your local workstation.
3) Fire it up and edit the Perforce workspace (and any other settings) as necessary.
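Step 2 boils down to copying the job's folder (at minimum its config.xml). A sketch, with illustrative paths, shown as a function you could adapt:

```shell
# Copy a job definition from one JENKINS_HOME to another. Paths are
# illustrative; adjust them to where Jenkins lives on each machine.
# Only config.xml is needed to recreate the job definition itself.
copy_job_config() {
    src_home="$1"   # e.g. /var/lib/jenkins on the server
    dst_home="$2"   # e.g. /var/lib/jenkins on your workstation
    job="$3"        # job name
    mkdir -p "$dst_home/jobs/$job"
    cp "$src_home/jobs/$job/config.xml" "$dst_home/jobs/$job/config.xml"
}
# Across machines you would wrap the same idea in scp/rsync, e.g.:
#   rsync -a server:/var/lib/jenkins/jobs/myjob/config.xml \
#         "$JENKINS_HOME/jobs/myjob/"
# then restart Jenkins or use "Reload Configuration from Disk".
```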

IMO, you should probably take these steps:
1) Create a new Jenkins job from the existing one.
2) Modify the job to be a string "parameterized" job where you pass the branch name as the parameter. You can do this using the "This build is parameterized" option in the job's configuration.
3) In the job's configuration, under the Source Code Management section, change the Branch Specifier to use the string parameter variable name created in #2 above.
4) Create your feature branch on Perforce and make the intended changes there.
5) Run the newly created job with your branch as the parameter.
Hope this helps.
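Once the job is parameterized, step 5 can also be done without the UI, via Jenkins's standard buildWithParameters endpoint. A sketch, where the URL, job name, parameter name, and user:token credentials are all placeholders:

```shell
# Trigger a parameterized Jenkins job for a given branch.
# JENKINS_URL, job name, and the user:api-token pair are placeholders;
# BRANCH must match the string parameter name configured in the job.
trigger_branch_build() {
    jenkins_url="$1"
    job="$2"
    branch="$3"
    curl -fsS -X POST \
        --user "user:api-token" \
        "$jenkins_url/job/$job/buildWithParameters?BRANCH=$branch"
}
```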


How can Jenkins read a polling text file checked in GIT to trigger a deployment?

Current scenario: Build and deployment happen in the development environment; the code is checked in to Git and the JAR file is placed in Nexus. Then a change request (CR) is raised to deploy the same artifact to the QA environments. The CR has two parameterized text files attached (one containing the Nexus path, the other the website URL), which act as input to the parameterized build; the environment is selected and the deploy is run.
Target scenario: We want to remove the CR part. Instead, a file containing the parameters that used to be attached to the CR would be pushed to Git, and its values should be fed into the respective parameters of the parameterized Jenkins job, with the environment selected from the dropdown.
What is the best way to achieve this: creating another Jenkins job that reads the parameters from the file, or is there some other way?
P.S. We don't want to make any editing in the existing Parameterized Jenkins jobs.
Using the Jenkins GitHub Plugin, you can create a separate job with a GitHub build trigger. By adding the GitHub repo (where the parameter file is pushed) to this Jenkins job, you can process the file to get the parameters you want in order to kick off the appropriate Jenkins jobs.
For Jenkins to process the parameters, one option is to use the EnvInject Plugin. (As suggested in this answer.) Another suggestion: Extended Choice Parameter Plugin (from this answer).
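A sketch of what that separate job's shell step could do: read a simple key=value parameter file committed to the repo and forward the values to the existing parameterized job through its buildWithParameters endpoint. The file format, URL, job name, and credentials are assumptions for illustration:

```shell
# Read a key=value parameter file committed alongside the code and
# trigger the existing parameterized job with those values.
# Assumed file format (values assumed URL-safe here):
#   NEXUS_PATH=http://nexus.example.com/...
#   SITE_URL=http://qa.example.com
trigger_from_file() {
    param_file="$1"
    jenkins_url="$2"   # placeholder Jenkins URL
    job="$3"           # the existing parameterized job's name
    query=""
    while IFS='=' read -r key value; do
        # Skip blank lines and comments.
        case "$key" in ''|'#'*) continue ;; esac
        query="${query}${query:+&}${key}=${value}"
    done < "$param_file"
    curl -fsS -X POST --user "user:api-token" \
        "$jenkins_url/job/$job/buildWithParameters?$query"
}
```

This keeps the existing parameterized jobs untouched; only the new reader job knows about the file.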

Jenkins Pipeline per branch environment variable configuration

I have several Jenkins Pipeline jobs set up on my Jenkins installation all of them with a Jenkinsfile inside the repository.
These pipelines run for all branches and contain all the steps necessary to build and deploy the branch. However, there are some differences between the branches with regard to building and deploying them, and I would like to be able to configure different environment variables for the different branches.
Is that possible with Jenkins, or do I need to reevaluate my approach or use another CI system?
@rednax's answer works if you're using a branch-per-environment Git strategy. But if you're using git-flow (or any strategy where you assume changes will be propagated up, possibly without human intervention, to master/production), you'll run into headaches where a merge overwrites scripts/variables.
We use a set of folders that match the environment names: infrastructure/Jenkinsfile contains the common steps, and infrastructure/test/Jenkinsfile contains the steps specific to the test environment (the folders also contain Dockerfiles and CloudFormation scripts). You could make this very complex with cascading includes or file merges, or simply keep almost-identical copies of each file in each folder.
When configuring the job, you can tell Jenkins to grab the script (Jenkinsfile) from the branch on which you are running. This means that, technically, you can adjust the script on each of your branches to set up parameters there. Or you can grab the script from the same source-control location, but commit a configuration file in each of your branches and have the script read that file after the checkout.
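The "configuration file per branch" variant might look like this in a shell step run right after checkout; the file name and variable names are illustrative:

```shell
# After checkout, load branch-specific settings from a file whose
# contents differ on each branch. The file name (ci/env.properties)
# and the variables inside it are illustrative assumptions.
load_branch_env() {
    env_file="${1:-ci/env.properties}"
    if [ ! -f "$env_file" ]; then
        echo "no $env_file on this branch, using defaults" >&2
        return 0
    fi
    # Export every key=value pair in the file for later build steps.
    set -a
    . "$env_file"
    set +a
}
```

Because the file lives on the branch itself, a merge carries the branch's own settings along instead of overwriting them centrally.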

Access Jenkins host drive, beside the job workspace

I would like to share the byproducts of one Jenkins job with another job that runs after it.
I am aware that I can set "use custom workspace", but that would merge the jobs together, which is not what I want. I just need to move a few files to a location that the next job reads from.
So far I can't find out how to tell a Jenkins job to look in a specific folder; it does not seem to have a concept of the filesystem beyond what is going on in the job's workspace folder.
Is there a way to access the host filesystem, or to declare a shared folder inside Jenkins (like in the main workspace folder, which contains all the other jobs?), so I can copy and read files in it from different jobs?
Where possible I would like to avoid plugins and extras; I would like to use what is included with Jenkins base.
I realize you want to avoid plugins, but the Jenkins-y way to accomplish this is to use the Copy Artifacts plugin, which does exactly what you want.
There are a variety of problems that you may run into when trying to manage the filesystem yourself. (How do you publish to a common location when running on different build nodes? How do you handle unsuccessful builds?) This solution uses Jenkins to track builds and artifacts. In the absence of a separate artifact repository, it's a lot better than trying to manage it yourself.
To use Copy Artifacts:
As a Post-Build step, choose "Archive Artifacts" in the first job and enter the path(s) to the generated files.
Then in the second job, add a "Copy Artifacts from another project" build step to grab some or all files marked as artifacts in your first job. (By default, Jenkins will re-create the paths of the generated files in the second job's workspace, which may or may not be what you want, but you can change this behavior.)
Configure Jenkins to run a Maven build and deploy your artifacts with "mvn clean deploy". This will push them to an "artifact server", which you probably have (or, if not, need to add and configure).
Then in your downstream job, also a Maven job, configure a dependency on the same artifact that was published by the upstream job. This will trigger a download of the artifact from the artifact server and make it available to the build.

How do I trigger deploy after the successful build of a specific branch?

I have a Jenkins task that triggers on any changes made to a GitLab project.
There are a few situations I'd like to be able to set up, however I'm not sure how to best accomplish them. Most of it centers around being able to do the following:
Once the job is complete, I'd like to trigger another job that takes the contents of the first job's workspace (emptying out the initial one).
I'd like for a way to only run certain other jobs when the workspace contains a specific branch (automatically deploy develop branch to a preview environment).
"to trigger another job that takes the contents of the first job's workspace" see Shared workspace plugin:
This plugin allows Jenkins jobs that use the same SCM repos to share a workspace.

Way to clone a job from one jenkins to another

I have two Jenkins instances, both masters, each with 5 slave nodes. I have one job on the first Jenkins that needs to be cloned.
I can clone the job on the first Jenkins and its slaves, but not on the second master. Is there a way to clone a job from one Jenkins to another?
One more question: can I archive the job at some defined location other than the master Jenkins, maybe on a slave?
I assume you have a job called "JOB" on "Jenkins1" and you want to copy it to "Jenkins2":
curl JENKINS1_URL/job/JOB/config.xml | java -jar jenkins-cli.war -s JENKINS2_URL create-job
You might need to add username and password if you have turned on security in Jenkins. The jenkins-cli.war is available from your $JENKINS_URL/cli.
Ideally you should make sure you have the same plugins installed on both Jenkins1 and Jenkins2. The more similar you can make the two Jenkins masters, the fewer problems you will have importing the job.
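Put together with authentication, the transfer might look like the sketch below. The URLs, job name, and user:api-token pairs are placeholders, and jenkins-cli.war is assumed to have been downloaded from your own Jenkins at $JENKINS_URL/cli:

```shell
# Clone a job definition from one Jenkins master to another using the
# Jenkins CLI: fetch the job's config.xml from the source and feed it
# to create-job on the destination. All URLs and credentials are
# placeholders.
clone_job() {
    src_url="$1"; dst_url="$2"; job="$3"
    curl -fsS --user "user:api-token" "$src_url/job/$job/config.xml" |
        java -jar jenkins-cli.war -s "$dst_url" \
            -auth user:api-token create-job "$job"
}
```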
For the second part of your question: slaves don't store any Jenkins configuration. All configuration is done on the master. There are many backup plugins: some back up the whole Jenkins instance, some back up just job configuration, and some back up individual jobs, export them to files, or even store/track changes in an SCM such as SVN.
So "archiving job configuration to a slave" simply makes no sense. But at the end of the day, a job configuration is simply an .xml file, and you can take that file and copy it anywhere you want.
As for the first part of the question, it's unclear what you want. Do you want to clone a job automatically (as part of another job's process), programmatically (through some script) or manually (through the UI, other means)?
Edit:
1) Go to your JENKINS_HOME directory on the server's filesystem and navigate to the jobs folder, then select the specific job folder that you want.
2) Copy the config.xml to the other server; this will create the same job with the same configuration (make sure your plugins are the same).
3) Copy the whole job_name folder if you want to preserve history, builds, artifacts, etc.
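For example, copying the whole folder (paths are illustrative; restart the target Jenkins or use "Reload Configuration from Disk" afterwards so it picks up the job):

```shell
# Copy an entire job folder (configuration plus builds/ history)
# from one JENKINS_HOME to another. Paths are illustrative.
copy_job_with_history() {
    src_home="$1"; dst_home="$2"; job="$3"
    mkdir -p "$dst_home/jobs"
    cp -R "$src_home/jobs/$job" "$dst_home/jobs/"
}
# Across hosts the same idea with rsync:
#   rsync -a server1:/var/lib/jenkins/jobs/JOB/ \
#            server2:/var/lib/jenkins/jobs/JOB/
```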
