We are migrating our source code repository to a cloud bucket, and all the source code that Jenkins uses will be read/downloaded from a bucket such as S3.
This also involves rewriting our Jenkins pipeline that reads from SCM (git). The Jenkins pipeline project configuration doesn't allow any independent script execution (say, wget or downloading a file from the bucket using shell).
I would like to do the following, if possible:
1) Download the Jenkinsfile from the S3 bucket to the workspace
2) Choose None for SCM in Pipeline section
3) Give the path to the downloaded Jenkinsfile in the script path
My question is, how can I make #1 possible? Image attached.
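One possible bootstrap, sketched below as a variant of steps 2-3: keep a tiny inline script in the Pipeline section (Definition: Pipeline script), pull the real Jenkinsfile from S3, and load it. This assumes the Pipeline: AWS Steps plugin is installed, that the downloaded file is a scripted (not declarative) pipeline, and that the region, credentials ID, bucket name, and key below are placeholders:

node {
    // Fetch the real Jenkinsfile from S3 into the workspace.
    withAWS(region: 'us-east-1', credentials: 'aws-creds-id') {
        s3Download(file: 'Jenkinsfile.groovy', bucket: 'my-bucket', path: 'pipelines/Jenkinsfile', force: true)
    }
    // Evaluate and run the downloaded pipeline script.
    load 'Jenkinsfile.groovy'
}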
Related
I have some Terraform code that uses a map of tfvars to deploy multiple Lambdas to AWS. It all works fine, except I want to run the script in a Jenkins pipeline, which would need to download the jars from Nexus first for each Lambda. Is there a way I can read the tfvars file in the Jenkins pipeline to get the names of the jars to download from Nexus, copy them into the working directory on Jenkins, and then upload them using Terraform?
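A hedged sketch of one way to do that: read the tfvars file with readFile, pull the jar names out with plain string parsing, and fetch each one from Nexus with curl before running Terraform. The tfvars key name, the Nexus URL, and the repository layout are all assumptions here:

node {
    // Assumes lambdas.tfvars contains lines like: jar_name = "my-lambda-1.0.jar"
    def jars = readFile('lambdas.tfvars').readLines()
        .findAll { it.trim().startsWith('jar_name') }
        .collect { it.split('"')[1] }
    jars.each { jar ->
        // Hypothetical Nexus URL and repository; adjust to your instance.
        sh "curl -sSfL -o ${jar} https://nexus.example.com/repository/releases/${jar}"
    }
    sh 'terraform init && terraform apply -var-file=lambdas.tfvars -auto-approve'
}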
I created a Jenkinsfile and pushed it to the master branch.
In the Jenkins pipeline job, I selected Pipeline script from SCM and filled in all the other details.
When I build this job, it runs properly as expected.
Console log says 'Obtained Jenkinsfile from git <repo url>'.
I am using Windows; Jenkins has a .jenkins folder in C:\Users\<Username>.
I looked in the workspace folder, which was empty until the Jenkinsfile started downloading the repo.
Where is Jenkins storing the Jenkinsfile it downloaded in the very first step?
The console log output should show the directory where the Jenkinsfile was checked out. It is usually named something similar to the workspace, but with @script appended, e.g. C:\Users\<Username>\.jenkins\workspace\<job name>@script.
I have what I think is a simple use case: Jenkins builds a static website, so at the end of the build I have a folder like $WORKSPACE/site-result.
Now I want to upload this folder to S3 (and clean the bucket if something is already there). How can I do it?
I'm using a pipeline, but can switch to a freestyle project if necessary. So far I have installed the S3 Plugin (S3 publisher plugin), created an IAM user, and added credentials in the "Configure System" section, and I can't find any further info. Thanks!
If the answer suggesting the Pipeline AWS Plugin doesn't work, you could always have an upload step in your pipeline where you use sh to call the AWS CLI:
aws s3 cp "$WORKSPACE/site-result" s3://your/bucket --recursive --include "*"
Source: http://docs.aws.amazon.com/cli/latest/reference/s3/
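If you're staying in a pipeline, that call can live in a stage like the sketch below (the bucket name is a placeholder, and the agent is assumed to have the AWS CLI and credentials available):

node {
    stage('Upload to S3') {
        // Copy the built site into the bucket recursively.
        sh 'aws s3 cp "$WORKSPACE/site-result" s3://your-bucket/site --recursive'
        // If the bucket should be cleaned first, 'aws s3 sync ... --delete'
        // is one way to mirror the folder instead.
    }
}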
You have to use the s3Upload plugin and set the sourceFile parameter to '*/*'.
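For comparison, the Pipeline AWS plugin's s3Upload step (a different step with the same name) would look roughly like the sketch below; the region, credentials ID, bucket, and folder names are placeholders:

withAWS(region: 'us-east-1', credentials: 'aws-creds-id') {
    // Upload everything under site-result/ to the bucket.
    s3Upload(bucket: 'your-bucket', workingDir: 'site-result', includePathPattern: '**/*')
}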
I want to use a config file provided by the Config File Provider plugin in a pipeline project.
However, when I run a build step on a slave, I get a "PermissionDenied" exception; the same runs fine on the master.
So the question is: what's the best way to share files between the master and slaves? I may not be able to use the Copy To Slave plugin, as there doesn't seem to be pipeline support.
If you want to share files between stages or nodes, you can use the stash/unstash steps; see the sketch below.
If you want to share files between builds, you can use the archive step and the Copy Artifact plugin.
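A minimal sketch of stash/unstash between two nodes (the node labels and file paths are made up):

node('master') {
    // Capture the config files produced or fetched on this node.
    stash name: 'configs', includes: 'config/**'
}
node('linux-slave') {
    // Restore them into this node's workspace.
    unstash 'configs'
}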
My environment uses Gradle for builds, Jenkins for CI, and Artifactory for a repository. I use the Artifactory plugin for Jenkins.
Jenkins successfully builds my main jar file and uploads it to Artifactory. The build script has a second target for creating a distribution zip file under build/distributions. Jenkins creates the zip file successfully, but I don't know how to tell it to upload that artifact to Artifactory, too.
Is this something I should be able to specify in the Jenkins Artifactory plugin config, or something I should define in the Gradle build script? Thanks for any pointers.
You should configure the archives configuration to include all the archives you intend to publish, as described in Gradle's user manual. Not only will Artifactory pick up all the files to deploy automatically (without messing with the Published Artifacts configuration), you won't even need to run the second task: all the archives will be created by running the build task.
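In the Gradle build script that would look something like this, assuming the older Groovy DSL with the archives configuration and that the zip is produced by the standard distZip task:

// build.gradle
artifacts {
    // Register the distribution zip as a publishable archive so the
    // Artifactory plugin deploys it alongside the jar.
    archives distZip
}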
I assume you have configured the Artifactory server correctly in the Manage Jenkins section, and that your job is set up as a Freestyle Project.
Select your job and click Configure. Check Generic Artifactory Integration in the Build Environment section. Select your Artifactory Server and Target Repository from the drop-downs, and check Override default deployer credentials if required. Under Published Artifacts, enter the pattern for the zip file to be published, e.g. ${WORKSPACE}/distr/*.zip (where WORKSPACE is the Jenkins project's current workspace and distr/*.zip is your distribution zip file). If required, check Capture and publish build info, Include environment variables, etc. Save your job. The next time you build it, the zip file will be uploaded and will be available in the Builds section in Artifactory.