I'm trying to upload artifacts to an S3 bucket after a successful build, but I can't find a working example that I can drop into a stage/node block.
Any ideas? (S3 plugin installed, Jenkins v2.32)
node {
    sh 'echo "" > 1.jar'
    archiveArtifacts artifacts: '1.jar', fingerprint: true
    // upload to s3 bucket ???
}
Detailed steps:
Step 1: Install the Pipeline AWS Plugin. Go to Manage Jenkins -> Manage Plugins -> Available tab, filter by 'Pipeline AWS', and install the plugin.
Step 2: Add credentials for your environment, for example: Jenkins > Credentials > System > Global credentials (unrestricted) -> Add, with Kind = AWS Credentials. Add your AWS credentials and note the ID.
Step 3: Then, in your Pipeline project, use something similar to the code I use:
node {
    stage('Upload') {
        dir('path/to/your/project/workspace') {
            pwd() // Log current directory
            withAWS(region: 'yourS3Region', credentials: 'yourIDfromStep2') {
                def identity = awsIdentity() // Log the AWS credentials in use
                // Upload files from working directory 'dist' in your project workspace
                s3Upload(bucket: 'yourBucketName', workingDir: 'dist', includePathPattern: '**/*')
            }
        }
    }
}
The Pipeline Steps documentation on the Jenkins website shows that the Pipeline AWS Plugin provides an s3Upload step.
Try this:
s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')
I think it is easier to point to the plugin documentation directly: the Pipeline AWS Steps plugin page documents the s3Upload step and its parameters.
As you are looking for a way to upload files to S3, here are some examples.
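For instance, a minimal sketch (assuming the Pipeline AWS Plugin is installed; the credentials ID, region, bucket and paths below are placeholders):
node {
    withAWS(region: 'us-east-1', credentials: 'my-aws-creds') {
        // Upload a single file to a specific key in the bucket
        s3Upload(file: '1.jar', bucket: 'my-bucket', path: 'builds/1.jar')
        // Upload everything under the local 'dist' directory, keeping its structure
        s3Upload(workingDir: 'dist', includePathPattern: '**/*', bucket: 'my-bucket')
    }
}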
I'm a little bit of a newb with this CI/CD container stuff, so please correct me anywhere I'm wrong.
I can't seem to find out how to send my npm build files, created on my Jenkins instance (workspace), to a remote server. I have a pipeline that successfully pulls in my GitHub repo and does all my fun npm stuff (npm install, test, build). I see my build dir in my Jenkins instance's workspace.
My environment is as follows. We have a server where Docker (with Portainer) is installed. Jenkins is running in a container with a volume mounted (my React build dir goes here). There are no issues with the pipeline or building etc.; I just can't figure out how to push my artifacts from my Jenkins workspace directory to my 'remote' dev server.
I can successfully open a console in my Jenkins container (via Portainer, as the jenkins user) and scp files from the workspace directory using my remote server creds (but a password is necessary).
I installed and used the "Publish Over SSH" Jenkins plugin and get a successful "Test Configuration" from my setup.
I created my RSA keys on the REMOTE machine (that I'm trying to push my build files to).
I then pasted the private key (created without a password) into the plugin at the 'Use password authentication, or use a different key' section. Again, I get a successful test connection.
In my pipeline the last step is deploying, and I use this command:
sh 'scp -r build myusername@xx.xx.xx.xx:/var/files/react-tester'
I get a 'Permission denied (publickey,password).' error. I have no password associated with the RSA key. I tried both ways: creating the RSA key on the remote machine as my remote user, and on the Jenkins machine as the jenkins user. I've read examples of people creating the keys both ways, but I'm not sure on which user/machine combo to create the keys, or into which section of the 'Publish Over SSH' plugin to paste them.
I'm out of ideas.
First, go to "Manage Jenkins" > "Credentials", add a new SSH credential of type "SSH Username with private key" and fill the "Username" and your private key (generate one if you haven't done it yet) fields (you can also upload one). Don't forget that you have to copy the generated public key to the ${SSH_USERNAME}/.ssh/authorized_keys file on the remote server.
I'm assuming you're using a scripted or DSL pipeline here. In your code, after you've builded your application, you can push it to your server adding this step to your pipeline:
pipeline {
    agent any
    stages {
        stage("Pushing changes to remote server") {
            steps {
                script {
                    def remote_server = "1.2.3.4"
                    withCredentials([sshUserPrivateKey(credentialsId: 'my-key', keyFileVariable: 'SSH_KEY', passphraseVariable: '', usernameVariable: 'SSH_USERNAME')]) {
                        // Copy the build directory to the remote server using the stored key
                        sh "scp -r -i \${SSH_KEY} build/ \${SSH_USERNAME}@${remote_server}:/var/files/react-tester/"
                    }
                }
            }
        }
    }
}
Best regards.
I want to add a stage with one step to my Jenkinsfile pipeline that uploads an APK to Google Drive and then gets the shareable link of the uploaded file.
Have you checked this? Looks like it could be of some use.
Here are some general steps to add a stage that uploads a file to Google Cloud Storage:
Step 1: Install the Jenkins Google OAuth Credentials plugin and the Google Cloud Storage plugin:
https://plugins.jenkins.io/google-oauth-plugin/
https://plugins.jenkins.io/google-storage-plugin/
Step 2: Create a credential (under the Jenkins credentials configuration) of type "Google Service Account from private key". If you do not have a service account for the project, create one using a credentials JSON obtained as described here: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating
Step 3: Create a GCP bucket: https://cloud.google.com/storage/docs/creating-buckets
Step 4: In Jenkins, go to "Pipeline Syntax" and generate the upload step; instructions can be found at: https://plugins.jenkins.io/google-storage-plugin/
Example of the Jenkinsfile upload code (generated by step 4):
googleStorageUpload bucket: 'gs://my-jenkins-bucket-from-part-3', credentialsId: 'my-jenkins-credentials-from-part-2', pattern: 'my_file_to_upload.zip'
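Since the question asks for a stage with a single step, a rough sketch could look like the following (assuming a declarative pipeline; the bucket and credentials IDs are the placeholders from steps 2 and 3 above, and the APK path is only an example location):
stage('Upload APK to GCS') {
    steps {
        // Upload the built APK to the bucket from step 3,
        // authenticating with the service account credential from step 2
        googleStorageUpload bucket: 'gs://my-jenkins-bucket-from-part-3',
                            credentialsId: 'my-jenkins-credentials-from-part-2',
                            pattern: 'app/build/outputs/apk/release/*.apk'
    }
}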
Note that you might have issues with bucket permissions; configure them correctly, or open the bucket for public access (not recommended). In GCP, activate Cloud Shell and run:
gsutil defacl ch -u allUsers:R gs://my-jenkins-bucket-from-part-3
gsutil acl ch -u allUsers:O gs://my-jenkins-bucket-from-part-3
I am working on a pipeline with AWS CodePipeline, using Jenkins as a build provider. Jenkins has a plugin (AWS CodePipeline Plugin) to connect/poll with the pipeline.
Flow of the pipeline:
Source - CodeCommit
Build - Jenkins
Deploy - CloudFormation
Jenkins produces an output artifact (testart, which contains imagedefinitions.json) that is uploaded to S3 using the plugin. For some reason, CloudFormation is able to find the artifact, but not the imagedefinitions.json file inside it.
The error that I get in the deploy stage:
"File (imagedefinitions.json) does not exist in artifact (testart)".
PS: The pipeline has full permissions to access s3.
Any help is appreciated :)
An artifact in CodePipeline is a zipped directory. You refer to the files inside this directory:
.
└── JenkinsArtifact
└── imagedefinitions.json
So you just need to put the imagedefinitions.json into a directory and have Jenkins zip it.
The CloudFormation action expects a zip file, so you should configure Jenkins with a directory instead of a file.
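As an illustration, a minimal sketch of the Jenkins side might be the following (the directory name testart and the file come from the question; everything else is an assumption about your build):
node {
    stage('Prepare CodePipeline artifact') {
        // Put imagedefinitions.json inside a directory; the AWS CodePipeline
        // plugin's output location is then pointed at that directory, so the
        // file ends up inside the zipped artifact it uploads to S3
        sh 'mkdir -p testart && cp imagedefinitions.json testart/'
    }
}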
I am migrating from Jenkins 1.x to Jenkins 2. I want to build and deploy an application using a Jenkinsfile.
I am able to build the Gradle application, but I am confused about how to deploy the application via AWS CodeDeploy using the Jenkinsfile.
Here is my Jenkinsfile:
node {
    // Mark the code checkout 'stage'....
    stage 'Checkout'
    // Get some code from a GitHub repository
    git branch: 'master',
        credentialsId: 'xxxxxxxx-xxxxx-xxxxx-xxxxx-xxxxxxxx',
        url: 'https://github.com/somerepo/someapplication.git'

    // Mark the code build 'stage'....
    stage 'Build'
    // Run the gradle build
    sh '/usr/share/gradle/bin/gradle build -x test -q buildZip -Pmule_env=aws-dev -Pmule_config=server'

    stage 'Deploy via Codedeploy'
    // Run using codedeploy agent
}
I have searched many tutorials, but they all use the AWS CodeDeploy plugin instead.
Could you help me deploy the application via AWS CodeDeploy using a Jenkinsfile?
Thank you.
Alternatively, you can use AWS CLI commands to do the code deployment. This involves two steps.
Step 1 - Push the deployment bundle to S3 bucket. See the following command:
aws --profile {profile_name} deploy push --application-name {code_deploy_application_name} --s3-location s3://<s3_file_path>.zip
Where:
profile_name = name of AWS profile (if using multiple accounts)
code_deploy_application_name = name of AWS code deployment application.
s3_file_path = S3 file path for deployment bundle zip file.
Step 2 - Initiate code deployment
The second command is used to trigger the code deployment. See the following command:
aws --profile {profile} deploy create-deployment --application-name {code_deploy_application_name} --deployment-group-name {code_deploy_group_name} --s3-location bucket={s3_bucket_name},bundleType=zip,key={s3_bucket_zip_file_path}
Where:
profile = name of your AWS profile (if using multiple accounts)
code_deploy_application_name = same as step 1.
code_deploy_group_name = name of code deployment group. This is associated with your code deploy application.
s3_bucket_name = name of S3 bucket which will store your deployment artefacts. (Make sure that your role that performs code deploy has permissions to s3 bucket.)
s3_bucket_zip_file_path = same as step 1.
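Since the question asks for this inside a Jenkinsfile, a rough sketch of the 'Deploy via Codedeploy' stage could simply wrap these two CLI calls in sh steps (the application, deployment group, bucket and bundle names below are placeholders, and the build node is assumed to have AWS credentials available, e.g. via an instance role or a configured profile):
stage 'Deploy via Codedeploy'
// Bundle the build output and push it to S3 as a CodeDeploy revision
sh 'aws deploy push --application-name my-codedeploy-app --s3-location s3://my-deploy-bucket/someapplication.zip --source build/distributions'
// Trigger a deployment of the uploaded revision
sh 'aws deploy create-deployment --application-name my-codedeploy-app --deployment-group-name my-deploy-group --s3-location bucket=my-deploy-bucket,bundleType=zip,key=someapplication.zip'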
I have an incredibly basic Gradle build file:
plugins {
    id "base"
    id "com.jfrog.artifactory" version "4.3.0"
}

configurations {
    batchConfig
}

artifacts {
    file("dist").eachFile { zipFile ->
        batchConfig zipFile
    }
}
println "BatchConfig Artifacts: " + configurations.batchConfig.allArtifacts
This is executed via Jenkins and appears to work fine:
Archives Artifacts: [DefaultPublishArtifact_Decorated module-0.0.post0.dev6+n4c62094-py2.7:egg:egg:null]
[buildinfo] Properties file found at '/tmp/buildInfo65481565498521.properties'
:artifactoryPublish
Deploying build descriptor to: https://ourArtifactoryServer/artifactory/api/build
Build successfully deployed. Browse it in Artifactory under https://ourArtifactoryServer/artifactory/webapp/builds/testGradleBuild/34
BUILD SUCCESSFUL
However, the artifact is not actually uploaded to Artifactory at all.
SSL cert configuration appears to be working fine, as I had to address that first. Any suggestions as to what I'm missing here?
Looks like you do still need to use the artifactory closure outlined in the Gradle Artifactory Plugin documentation. Switching back to using "archives" instead of a custom configuration, and then adding this to my build, sorted it:
artifactory {
    publish {
        defaults {
            publishConfigs('archives')
        }
    }
}
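For completeness, the "switch back to archives" part could look something like this in the artifacts block (a sketch based on the build file from the question; the archives configuration is the one provided by the base plugin):
artifacts {
    file("dist").eachFile { zipFile ->
        // Attach each file in dist/ to the built-in 'archives' configuration
        // instead of the custom batchConfig one
        archives zipFile
    }
}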