I am working on a pipeline in AWS CodePipeline, using Jenkins as the
build provider. Jenkins has a plugin (AWS CodePipeline Plugin) that connects to and
polls the pipeline.
Flow of the pipeline:
Source - CodeCommit
Build - Jenkins
Deploy - CloudFormation
Jenkins produces an output artifact (testart, which contains imagedefinitions.json) that is uploaded to S3
by the plugin. For some reason, CloudFormation is able to find the artifact, but not the imagedefinitions.json file inside it.
The error that I get in the deploy stage:
"File (imagedefinitions.json) does not exist in artifact (testart)".
PS: The pipeline has full permissions to access S3.
Any help is appreciated :)
An artifact in CodePipeline is a zipped directory. You refer to the files inside that directory:
.
└── JenkinsArtifact
    └── imagedefinitions.json
So you just need to put imagedefinitions.json into a directory and have Jenkins zip that directory.
The CloudFormation action expects a zip file, so configure Jenkins to output a directory rather than a single file.
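A minimal sketch of what the build step could look like, assuming the job runs a shell step and the plugin's output location points at the JenkinsArtifact directory (the container name and the IMAGE_URI variable are placeholders):
node {
    stage('Build') {
        // write imagedefinitions.json into a subdirectory so the whole
        // directory gets zipped as the output artifact
        sh '''
            mkdir -p JenkinsArtifact
            printf '[{"name":"my-container","imageUri":"%s"}]' "$IMAGE_URI" > JenkinsArtifact/imagedefinitions.json
        '''
    }
}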
I have a Bitbucket pipeline that runs Google Lighthouse. I want to access the JSON output that is generated at the end of the pipeline and echo one of its values. I understand that I can use artifacts, but I am unsure how to access them.
Here is my bitbucket-pipelines.yml file:
script:
- lhci collect
- lhci upload
- echo "===== Lighthouse has completed running ====="
artifacts: # defining the artifacts to be passed to each future step.
- .lighthouseci/*.json
Quoting the official doc:
Artifacts are files that are produced by a step. Once you've defined them in your pipeline configuration, you can share them with a following step or export them to keep the artifacts after a step completes. For example, you might want to use reports or JAR files generated by a build step in a later deployment step. Or you might like to download an artifact generated by a step, or upload it to external storage.
If your JSON file has already been generated by the time the echo "===== Lighthouse has completed running =====" line runs, you don't have to define a separate step for echoing its contents. Do it right there, in the same script. You don't even need artifacts if that's the only thing you want to do with your JSON.
I've created a Jenkinsfile in my really straightforward repository:
├── Jenkinsfile
└── README.md
My question is: how is this file processed by Jenkins?
I mean, how does Jenkins know that it has to pick up the Jenkinsfile, and where to find it?
When you create a Pipeline-type job in Jenkins, it gives you two options under "Pipeline Definition". Choose "Pipeline script from SCM". Here you can define the repository location (Git) and the "Script Path" - the path to your Jenkinsfile in the repository.
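For reference, a minimal Jenkinsfile that Jenkins would pick up from the configured Script Path could look like this (just a sketch; the stage name and command are placeholders):
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // placeholder build command
                sh 'echo "Building..."'
            }
        }
    }
}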
Locally, I know how to download the failed tests' screenshots:
scp -P 2222 vagrant@127.0.0.1:/tmp/features_article_feature_817.png ~/Downloads/.
How do we download the screenshots from Travis CI?
For people who get here via Google, there is an alternative approach.
You can run a (failing) job/build in debug mode, which gives you access to an interactive session via SSH. See the Travis docs for more information on how to do this.
Once in your interactive environment, you can run your build phases and find info on failing specs in your tmp folder.
You can't really SSH into Travis CI. What you can do is upload your build artifacts (like screenshots) to Amazon S3. Here's an example config that would result in uploading all PNG files found in the /tmp directory:
# .travis.yml
addons:
  artifacts: true
  paths:
    - $(ls /tmp/*.png | tr "\n" ":")
You'll also have to configure some Amazon-specific environment variables:
ARTIFACTS_KEY=(AWS access key id)
ARTIFACTS_SECRET=(AWS secret access key)
ARTIFACTS_BUCKET=(S3 bucket name)
Environment variables can be encrypted and securely defined in your .travis.yml with the travis tool.
Read more about the Amazon S3 uploader and secure variables in the Travis CI docs:
https://docs.travis-ci.com/user/uploading-artifacts/
https://docs.travis-ci.com/user/environment-variables/#Defining-encrypted-variables-in-.travis.yml
There's a bit of an error in the YAML here: paths should be indented under artifacts. The .travis.yml file would be:
# .travis.yml
addons:
  artifacts:
    paths:
      - $(ls /tmp/*.png | tr "\n" ":")
I have some Windows slaves on my Jenkins, so I need to copy files to them in a pipeline. I heard about the Copy To Slave and Copy Artifact plugins, but they don't have a pipeline syntax manual, so I don't know how to use them in a pipeline.
A direct copy doesn't work:
def inputFile = input message: 'Upload file', parameters: [file(name: 'parameters.xml')]
new hudson.FilePath(new File("${ENV:WORKSPACE}\\parameters.xml")).copyFrom(inputFile)
This code returns an error:
Caused: java.io.IOException: Failed to copy /var/lib/jenkins/jobs/_dev/jobs/(TEST)job/builds/107/parameters.xml to d:\Jenkins\workspace\_dev\(TEST)job\parameters.xml
Is there any way to copy file from master to slave in Jenkins Pipeline?
As I understand it, copyFrom is executed on your Windows node, therefore the source path (on the master) is not accessible.
I think you want to look into the stash/unstash steps (Jenkins Pipeline: Basic Steps), which work across different nodes. Also this example might be helpful.
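A rough sketch of how stash/unstash could work across nodes (the node labels are placeholders, and it assumes parameters.xml already exists in the workspace of the first node):
node('master') {
    // make the file available to later node blocks, regardless of agent
    stash name: 'params', includes: 'parameters.xml'
}
node('windows') {
    // restore the stashed file into this node's workspace
    unstash 'params'
    bat 'type parameters.xml'
}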
The Pipeline DSL context runs on the master node even if you write node('someAgentName') in your pipeline.
Try stash/unstash, but it is bad for large files.
Try the External Workspace Manager Plugin. It has
pipeline steps and is good for large files.
Try an intermediate storage: archive() and sh("wget $url") will be helpful (see the sketch below).
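A rough sketch of that intermediate-storage idea, assuming the agent can reach the master over HTTP with anonymous read access (otherwise pass credentials to wget); the file name is a placeholder:
node('master') {
    // archive the file so the Jenkins master serves it under the build URL
    archiveArtifacts artifacts: 'parameters.xml'
}
node('someAgentName') {
    // download the archived artifact from the master
    sh "wget ${env.BUILD_URL}artifact/parameters.xml"
}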
If the requirement is to copy an executable to the test slave and to publish the test results, this is easy to do without the Copy to Slave plugin.
A shared folder should be created on each test slave (normal Windows shared folder).
After build: the build script copies the executable to the shared directory on each slave. A simple batch script using the copy command is sufficient for this.
stage ('Copy to slaves') {
    steps {
        bat 'call "copy-to-slave.bat"'
    }
}
During test: The test script copies the executable to another directory and runs it.
After test: Post-build action "Publish Robot Framework test results" can be used to report the test results. It is not necessary to copy the test result files back to the master first.
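If you'd rather not maintain a separate batch file, the same copy could be done inline in the pipeline step (the share path and file name below are hypothetical):
stage ('Copy to slaves') {
    steps {
        // copy the built executable to the Windows share on the test slave
        bat 'copy /Y dist\\myapp.exe \\\\test-slave-01\\BuildShare\\'
    }
}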
I recommend the Pipeline: Phoenix AutoTest plugin.
Jenkins plugin website:
https://plugins.jenkins.io/phoenix-autotest/#documentation
GitHub repository of plugin:
https://github.com/jenkinsci/phoenix-autotest-plugin
I'm trying to upload artifacts to an S3 bucket after a successful build, but I can't find any working example that can be implemented in a stage/node block.
Any ideas? (S3 plugin installed, Jenkins v2.32)
node {
    sh 'echo "" > 1.jar'
    archiveArtifacts artifacts: '1.jar', fingerprint: true
    // upload to s3 bucket ???
}
Detailed steps:
Install Pipeline AWS Plugin.
Go to Manage Jenkins -> Manage Plugins -> Available tab -> Filter by 'Pipeline AWS'.
Install the plugin.
Add Credentials as per your environment. Example here:
Jenkins > Credentials > System > Global credentials (unrestricted) -> Add
Kind = AWS Credentials
and add your AWS credentials
Note the ID
Then in your Pipeline project (Similar to the code I use)
node {
    stage('Upload') {
        dir('path/to/your/project/workspace') {
            pwd() // log current directory
            withAWS(region: 'yourS3Region', credentials: 'yourIDfromStep2') {
                def identity = awsIdentity() // log AWS credentials
                // upload files from working directory 'dist' in your project workspace
                s3Upload(bucket: "yourBucketName", workingDir: 'dist', includePathPattern: '**/*')
            }
        }
    }
}
Looking at the Pipeline Steps documentation on the Jenkins website, it shows that the Pipeline AWS Plugin provides an s3Upload step.
Try this:
s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')
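Applied to the original snippet, that could look roughly like this (a sketch, assuming the Pipeline AWS plugin is installed; the region, bucket name, and credentials ID 'aws-creds' are placeholders):
node {
    sh 'echo "" > 1.jar'
    archiveArtifacts artifacts: '1.jar', fingerprint: true
    // upload the jar to the S3 bucket via the Pipeline AWS plugin
    withAWS(region: 'eu-west-1', credentials: 'aws-creds') {
        s3Upload(file: '1.jar', bucket: 'my-bucket', path: '1.jar')
    }
}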
I think it is easier to show the direct plugin documentation URL.
You can find the plugin documentation here.
As you are looking for a way to upload files to S3, here are some examples.