How can I upload to Repo as part of a custom release build task - azure-devops-extensions

I have built a Release pipeline custom task extension, and this has been successful: we are downloading a file from one location (the Dev environment) and then deploying it to the next (the Test environment). I now want the build task to capture the file and store it in the repo as part of the task.
I've researched through all the Microsoft documentation, samples, etc., but I am unable to find anything. Can anyone recommend some blogs/tutorials/examples that will let me do what I am after?
I also have a few other queries around custom build tasks and would love an end-to-end general tutorial; I have found disparate information on the Microsoft Docs, but piecing it all together is an issue.

How can I upload to Repo as part of a custom release build task
There is no out-of-the-box method to upload a file to the repo as part of a custom release build task, because it is not recommended to use a script/build task to upload files directly to the repository without any checks.
If you are interested, you can try the following method:
Steps:
Use a task to download the file and copy it into your local repo directory. Then set the Working Folder of all Command Line tasks to that local repo directory.
Run the build agent with an account that can push commits to TFS/Azure DevOps.
Detailed steps (a combined sketch follows the note below):
Run the git add <filename> command.
Run git commit -m "Commit message".
Run git push origin master.
Note: Make sure you have installed Git on your build agent machine.
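Putting those steps together, a minimal sketch of the script a Command Line task could run might look like this. The file path, commit identity, message and branch are assumptions, and the agent account (or a PAT in the remote URL) must have push rights:

# Run inside the local repo directory that the earlier task copied the file into.
set -e
git config user.email "build-agent@example.com"   # commit identity (assumption)
git config user.name "Build Agent"
git add artifacts/downloaded-file.zip             # hypothetical file name
git commit -m "Add file captured by the release task"
git push origin master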
Hope this helps.

Related

`gcloud builds submit` for Cloud Run

I have this situation because the documentation was not clear. The gcloud builds submit --tag gcr.io/[PROJECT-ID]/helloworld command will
archive the contents of my source folder and then run the docker build on the Google build server.
Also, it only looks at the .gitignore file for the contents to archive. If it is a Docker build, it should honor the .dockerignore file.
Also, there is no word about how to compile the application. It has to be compiled before it is dockerized if it is not a precompiled application.
The quick guide only assumes that the application is precompiled and that all the contents of the folder, as filtered by .gitignore, are required to run the application. People will not be aware of all that for a new technology; I have just figured it out by myself.
So the alternative way of doing all that is either to include the build steps in the Dockerfile (which will make my image heavy), or to create a Docker image locally (manually), then submit the image to the repository (manually), and then publish to Cloud Run (using the second documented command, or manually).
Is there anything I am missing over here?
Cloud Build respects .dockerignore. It will upload all files that are not in .gitignore, but once uploaded, it will respect .dockerignore regarding which files to use for the build.
Compiling your application is usually done at the same time as "containerizing" it. For example, for a Node.js app, the Dockerfile must run npm install --production. I recommend looking at the many examples in the quickstart.
I think you've got it, essentially your options are:
Building using Cloud Build
Building locally and pushing using Docker
Generally, if you need additional build steps, I would recommend including them in your Dockerfile. Ideally, you should be able to go from source + Dockerfile to a complete image in either case.
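To make both options concrete, a minimal shell sketch might look like this; the project ID, service name and region are placeholders, not values from the question:

# Option 1: build remotely with Cloud Build (source is uploaded, .dockerignore is honored)
gcloud builds submit --tag gcr.io/my-project/helloworld .

# Option 2: build locally and push with Docker, then deploy to Cloud Run
gcloud auth configure-docker        # one-time setup so docker can push to gcr.io
docker build -t gcr.io/my-project/helloworld .
docker push gcr.io/my-project/helloworld
gcloud run deploy helloworld --image gcr.io/my-project/helloworld --platform managed --region us-central1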

Ruby on Rails Capistrano to update the code without deployment

I am looking for a Capistrano expert to help reduce the deployment time and process.
Everyone knows how Capistrano works: it always clones the code to the target server, keeps the code in a release directory, and creates a symlink to the current directory.
Here I am looking for a git pull based approach. In a Rails app, if I make small changes such as changing a caption or updating some text, I don't want to deploy the whole application again.
I simply need to update the code, which has minimal changes.
For that I have to use git pull to update the changes, but git pull is not working with Capistrano.
I ran git pull directly in the release path and got only an error.
Could anyone who has a solution for this please post it? My sample code is shown below.
desc "Update the deployed code."
task :update_code
execute "/usr/bin/git pull origin #{fetch(:release_path)}")
end
end
Capistrano uses git archive to create the release copy of the repo. This does not include the .git/ directory, so further git commands will not work.
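If all you need is to push a couple of changed files without a full deploy, one workaround (not part of Capistrano itself; the host and paths below are hypothetical) is to copy the files straight into the current release over SSH and then reload the app server:

# Copy only the changed file into the current release (hypothetical paths/host)
rsync -av app/views/home/index.html.erb deploy@example.com:/var/www/myapp/current/app/views/home/
# Tell the app server to pick up the change, e.g. for Passenger:
ssh deploy@example.com 'touch /var/www/myapp/current/tmp/restart.txt'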

How do you put your source code into Kubernetes?

I am new to Kubernetes, so I'm wondering what the best practices are when it comes to putting your app's source code into a container run in Kubernetes or a similar environment.
My app is PHP, so I have PHP-FPM and Nginx containers (running on Google Container Engine).
At first, I had a git volume, but there was no way of changing app versions like this, so I switched to emptyDir, keeping my source code in a zip archive in one of the images, which would unzip it into this volume on start. Now I have the source code separately in both images via git, with a separate git directory, so I have /app and /app-git.
This is good because I do not need to share or configure volumes (fewer resources and less configuration), the app's layer is reused in both images so there is no impact on space, and since it is git, the "base" is built in, so I can simply adjust my Dockerfile command at the end and switch to a different branch or tag easily.
I wanted to download an archive with the source code directly from the repository by providing credentials as arguments during the build process, but that did not work because my repo host, Bitbucket, creates archives with the last commit id appended to the directory name, so there was no way of knowing what unpacking the archive would result in. So I got stuck with git itself.
What are your ways of handling the source code?
Ideally, you would use continuous delivery patterns, which means using Travis CI, Bitbucket Pipelines or Jenkins to build the image on code change.
That is, every time your code changes, your automated build will get triggered and build a new Docker image, which will contain your source code. Then you can trigger a Deployment rolling update to update the Pods with the new image.
If you have dynamic content, you likely put this in persistent storage, which will be re-mounted on Pod update.
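As a rough sketch, the commands such a CI job would run on every commit might look like this; the image and Deployment names are placeholders, and $GIT_COMMIT stands in for whatever commit/build identifier your CI provides:

# Build and push an image tagged with the commit being built
docker build -t gcr.io/my-project/my-php-app:$GIT_COMMIT .
docker push gcr.io/my-project/my-php-app:$GIT_COMMIT
# Trigger a rolling update of the Deployment to the new image
kubectl set image deployment/my-php-app my-php-app=gcr.io/my-project/my-php-app:$GIT_COMMIT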
What we've done traditionally with PHP is an overlay on runtime. Basically the container will have a volume mounted to it with deploy keys to your git repo. This will allow you to perform git pull operations.
The more buttoned up approach is to have custom, tagged images of your code extended from fpm or whatever image you're using. That way you would run version 1.3 of YourImage where YourImage would contain code version 1.3 of your application.
Try to leverage continuous integration and continuous deployment. You can use Jenkins as a CI/CD server and create jobs for building the image, pushing the image, and deploying the image.
I recommend putting your source code into the Docker image instead of a git repo. You can also extract configuration files from the Docker image: Kubernetes v1.2 provides a new feature, ConfigMap, so we can put configuration files in a ConfigMap. When running a pod, the configuration files will be mounted automatically. It's very convenient.
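As a small sketch of the ConfigMap part (the file and resource names are assumptions):

# Store a configuration file in a ConfigMap instead of baking it into the image
kubectl create configmap my-php-app-config --from-file=config.php
# The ConfigMap can then be mounted as a volume (or exposed as environment variables) in the Pod spec.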

where to get build file to deploy to server

I've set up the config to tell CircleCI what to build and how to build it.
After the build, I want to send all the built files to my FTP server, which is a shared host (HostGator).
Can I instruct CircleCI to do so?
There are two separate things here. If the built files that you want to upload are your application itself, then this is considered a deployment. You can do this in the deployment phase in circle.yml. More info can be found here: https://circleci.com/docs/configuration/#deployment
If the build is "other" files that you want to upload for record keeping, debugging, or basically a deployment for someday in the future, you can utilize what are called build artifacts: https://circleci.com/docs/build-artifacts/

Jenkins: Testing on every commit AND releasing on click with the same project

1) I want to run my test suite on every commit, so Jenkins should poll SVN, update, and run the tests. That's possible without problems, BUT...
2) I also want to be able to create builds, which additionally involves creating zip archives and transferring them via scp to a remote server.
How is this possible within the same project?
Have you checked the Jenkins SCP Plugin? You can add an additional build step in the existing project itself which creates your builds, generates zip archives, etc., and then, as a post-build action, publish the artifacts to an SCP repository. Will that help?
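If you prefer to script it yourself instead of (or in addition to) the SCP plugin's post-build action, the extra build step could be a shell step along these lines; the archive name, user and host are hypothetical, and BUILD_NUMBER is the standard Jenkins environment variable:

# Package the build output and copy it to the remote server
zip -r myapp-build-${BUILD_NUMBER}.zip dist/
scp myapp-build-${BUILD_NUMBER}.zip deploy@releases.example.com:/opt/releases/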
