I am trying to use the sshagent and credentials plugins to execute a couple of git commands on a remote repo in a Jenkins pipeline script.
I am trying to do this to check out some files from a large repo. This runs on a Docker slave.
sshagent(['a5f11347-dff6-4586-ae77-34adeffb0063']) {
script {
sh 'git archive --remote=ssh://git@git.comp.com/something.git HEAD Jenkinsfile | tar -x'
sh 'git archive --remote=ssh://git@git.comp.com/something.git HEAD:pipeline/ | tar -x;'
}
}
But it fails with a "host key verification failed" error. Any insight on why this is failing?
If I do a
checkout([$class: 'GitSCM', branches: [[name: "${BRANCH_NAME}"]],
doGenerateSubmoduleConfigurations: false,
extensions: [[$class: 'CloneOption', depth: 0, noTags: false, reference: '', shallow: true],
[$class: 'CloneOption', noTags: true, reference: '', shallow: true],
[$class: 'SparseCheckoutPaths', sparseCheckoutPaths: [[path: 'Jenkinsfile'], [path: 'pipeline/*']]]],
submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'a5f11347-dff6-4586-ae77-34adeffb0063',
url: 'ssh://git#git.comp.com/something.git']]])
it works fine, but the whole large repo is fetched, which I want to avoid.
Regarding the host key verification problem, you need to add the host key for git.comp.com to your SSH known_hosts file. Try integrating the following command in your Jenkins script before the sshagent block:
ssh-keyscan git.comp.com >> ~/.ssh/known_hosts
This worked for me.
See Jenkins ssh-agent starts and then stops immediately in pipeline build for more info.
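Putting the pieces together, a minimal sketch of how the keyscan could be combined with the sshagent block from the question (the credentials ID, host name, and git archive command come from the question; the ~/.ssh path assumes the default SSH configuration inside the Docker slave):
script {
    // Add the server's host key so SSH does not prompt for verification
    sh 'mkdir -p ~/.ssh && ssh-keyscan git.comp.com >> ~/.ssh/known_hosts'
    sshagent(['a5f11347-dff6-4586-ae77-34adeffb0063']) {
        sh 'git archive --remote=ssh://git@git.comp.com/something.git HEAD Jenkinsfile | tar -x'
    }
}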
I'm new to Jenkins and I'm trying to understand the following steps in a Jenkins pipeline line by line:
checkout scm
dir("some_directory") {
checkout(
changelog: false,
poll: false,
scm: [
$class : 'GitSCM',
branches : [[name: SOME_BRANCH_NAME]],
doGenerateSubmoduleConfigurations: false,
extensions : [[$class: 'CloneOption', depth: 0, honorRefspec: true, reference: '', shallow: false]],
submoduleCfg : [],
userRemoteConfigs : [[url: SOME_URL]]
]
)
sh 'pwd; ls'
}
From the research I've done, I understand that
checkout scm
dir("some_directory")
'dir' creates a folder in the workspace if it doesn't exist, and the git project gets checked out into this directory.
checkout(
changelog:false,
poll:false,
scm: [...]
)
This block of code specifies the git parameters of the project that is being checked out into the directory specified above.
This is what I have understood so far; can someone please let me know if my understanding is correct, and possibly add more details?
Also, I am confused by the current code syntax. Would it make any difference if I rewrote the top few lines as:
checkout scm dir("some_directory")(
changelog: false,
poll: false,
etc.
)
instead of using 'checkout' twice?
I have a Jenkins multibranch pipeline with the Jenkins Git plugin.
When a new pull request is created, a new PR job starts and the repository is checked out automatically. The problem is that it sometimes hits a timeout (networking).
I try to retry the checkout in the pipeline by using GitSCM code with some conditionals:
checkout([
$class: 'GitSCM',
branches: scm.branches,
doGenerateSubmoduleConfigurations: scm.doGenerateSubmoduleConfigurations,
extensions: scm.extensions + [[$class: 'CloneOption', noTags: false, reference: '', shallow: false]],
submoduleCfg: [],
userRemoteConfigs: scm.userRemoteConfigs
])
It repeats the checkout just fine, but I still need to disable the first default checkout done by the plugin (if it fails, the job fails). How do I do that? How do I override the built-in checkout?
The skipDefaultCheckout option should disable the default checkout, e.g.:
options { skipDefaultCheckout() }
Read more here about it: https://www.jenkins.io/doc/book/pipeline/syntax/#available-options
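A minimal sketch of how skipDefaultCheckout could be combined with a retried checkout in a declarative multibranch pipeline (the retry count of 3 is an arbitrary example; the CloneOption values are taken from the question):
pipeline {
    agent any
    options {
        skipDefaultCheckout()   // do not let the multibranch plugin run its own checkout first
    }
    stages {
        stage('Checkout') {
            steps {
                retry(3) {   // retry the checkout on transient network failures
                    checkout([
                        $class: 'GitSCM',
                        branches: scm.branches,
                        extensions: scm.extensions + [[$class: 'CloneOption', noTags: false, reference: '', shallow: false]],
                        userRemoteConfigs: scm.userRemoteConfigs
                    ])
                }
            }
        }
    }
}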
I created a Jenkinsfile to configure the Jenkins pipeline.
An error occurred when doing git pull.
The cause was that the Groovy file did not have the 'Git LFS pull after checkout' setting.
I do not know how to write the 'Git LFS pull after checkout' setting in Groovy.
git(
    url: "git@...",
    branch: "master",
    credentialsId: "abcdefg"
)
// Git LFS pull after checkout setting??
Here's how I was able to use the Git plugin in the pipeline. Refer to the Git plugin documentation for more info:
checkout([ $class: "GitSCM",
branches: [[name: "refs/heads/${your branch name}"]],
extensions: [
[$class: "GitLFSPull"]
],
userRemoteConfigs: [
[credentialsId: "${your git credential ID}",
url: "${your git URL}"]
]
])
Considering you are trying to download a large file, you may want to increase the timeout limit too (the default is 10 minutes):
checkout([$class: 'GitSCM',
    branches: [[name: "refs/heads/${branch_or_tag}"]],
    extensions: [[$class: 'GitLFSPull'],
                 [$class: 'CloneOption', timeout: 30]],
    userRemoteConfigs: [
        [credentialsId: "${your git credential ID}",
         url: "${your git URL}"]
    ]
])
I have a Java project at http://localhost:7990/scm/bout/boutique-a.git
I want to have 2 Jenkins pipeline jobs:
Job 1: triggered on a commit to */develop
Job 2: triggered on a commit to any */feature branch
Each job will do a basic mvn install, mvn test, sonar...
A simple script with
node {
checkout([$class: 'GitSCM',
branches: [[name: 'develop']],
doGenerateSubmoduleConfigurations: false,
extensions: [[$class: 'SubmoduleOption', disableSubmodules: false,
parentCredentials: false, recursiveSubmodules: true, reference: '',
trackingSubmodules: false]], submoduleCfg: [],
userRemoteConfigs: [[credentialsId: 'admin',
url: 'http://localhost:7990/scm/bout/boutique-a.git']]])
}
works if a commit is done on develop, or if I explicitly give the branch name like feature/test-a, but how do I configure the script for any feature/** branch?
It seems that what I'm asking is not possible using a pipeline job.
I found a workaround for "feature/**": I created a parameter BRANCH_NAME in the job, and the branch name is sent by Bitbucket when a push is made on "feature/**", through a basic POST request:
http://user:token#localhost:8081/jenkins/job/MY_JOB_NAME/buildWithParameters?token=U1C1yQo7x3&BRANCH_NAME=feature/branche-test
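On the Jenkins side, the parameterized job can then feed BRANCH_NAME into the checkout. A minimal sketch, assuming a string parameter named BRANCH_NAME and reusing the credentials and URL from the question:
node {
    checkout([$class: 'GitSCM',
        branches: [[name: "${params.BRANCH_NAME}"]],   // e.g. feature/branche-test, supplied by the Bitbucket hook
        doGenerateSubmoduleConfigurations: false,
        extensions: [],
        submoduleCfg: [],
        userRemoteConfigs: [[credentialsId: 'admin',
            url: 'http://localhost:7990/scm/bout/boutique-a.git']]])
}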
I am new to Jenkins and I have 4 repos in Bitbucket, say A, B, C, D.
I have to fetch the A, B & C repos and build them using gradle build, which will generate wars.
Now I have to copy those wars into D\warsFolder.
I have created a multibranch pipeline and generated the pipeline syntax that fetches A, B & C from git and builds them. It looks something like this:
node {
checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'A']], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'id', url: 'http://.../A.git']]])
dir('A') {
bat 'gradle build -i --info --stacktrace --debug'
}
checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'B']], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'id', url: 'http://.../B.git']]])
dir('B') {
bat 'gradle build -i --info --stacktrace --debug'
}
checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'C']], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'id', url: 'http://.../C.git']]])
dir('C') {
bat 'gradle build -i --info --stacktrace --debug'
}
}
I added the above script to a Jenkinsfile, which I placed in the A repo.
Now I have created a multibranch pipeline Fetch_all, and in branch sources -> Single repository & branch -> Repository URL I have added http://.../A.git (which has the Jenkinsfile).
Up to here everything is working fine: I am able to fetch the sources and build them.
I have created a new Freestyle job where Source Code Management -> Git -> Repository URL is http://.../D.git.
I am trying to copy the wars generated by the Fetch_all pipeline, but in Build -> Copy artifacts from another project, the Project Name field is not accepting the multibranch pipeline. It throws an error like:
ERROR: Unable to find project for artifact copy:
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
Any help is appreciated.
Finally got it: when I gave pipeline_name/branchname, i.e., Fetch_all/%00, it worked fine.
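For reference, if the copy is done from a pipeline job instead of the Freestyle build step, a rough equivalent would be (the branch name 'master' and the war filter are illustrative assumptions):
copyArtifacts(
    projectName: 'Fetch_all/master',   // multibranch job name plus branch name
    selector: lastSuccessful(),        // take the artifacts from the last successful build
    filter: '**/*.war',                // illustrative: the wars produced by the Gradle builds
    target: 'warsFolder/'              // copied into the current workspace
)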
It took some time to find out the correct syntax. The documentation of the Copyartifact plugin is a little bit confusing, as it mentions the encoding of special characters. Actually, spaces don't have to be encoded, but slashes do.
The Jenkinsfile that copies artifacts is located at 'Other-folder/Multi branch Pipeline Test/'. Put in this content to copy the artifact of the last successful build of the 'Folder/Multi branch Pipeline/feature%2Fallow-artifact-copy' project:
copyArtifacts(
    projectName: 'Folder/Multi branch Pipeline/feature%2Fallow-artifact-copy', // the name of the project, as you find it from the root of Jenkins
    selector: lastSuccessful(), // selector to pick the build to copy from; if not specified, the latest stable build is used
    filter: 'projects/Output/myzip.zip', // ant-expression to filter the artifacts to copy. Attention! The filter is case sensitive
    target: 'sources/deploy/', // target directory to copy to; intermediate folders will be created
    flatten: true, // ignore the directory structure of the artifacts; the artifact will be placed at 'sources/deploy/myzip.zip'. If the option were false, you would find it at 'sources/deploy/projects/Output/myzip.zip'
    optional: false, // if set to true, the step does not fail even when no appropriate build is found
    fingerprintArtifacts: true, // fingerprint artifacts to track builds using those artifacts
)
And don't forget to allow artifact copying in the project you want to take the artifact from. Add this to the Jenkinsfile of 'Folder/Multi branch Pipeline/feature%2Fallow-artifact-copy'. Use absolute paths to avoid issues if you move projects around.
options {
disableConcurrentBuilds()
timeout(time: 30, unit: 'MINUTES')
copyArtifactPermission('/Other-folder/Multi branch Pipeline Test/*, /second Folder/*') // allow all the projects or branches of 'Other-folder/Multi branch Pipeline Test' and 'second Folder' to copy artifacts of this job
} // end of options