I have a Jenkins job that gets the code from version control and builds it (like a normal pipeline does). What I was doing is that after building the project, I download the build and use FTP to transfer it to the client's server, where I unzip it and copy the whole build over. Because I copy the whole build, my application's downtime is very high. (I have to use FTP because, as a service provider, we have some limitations and can't change this policy.)
What I want is for Jenkins to know what has changed while it is building, so that it creates a package containing only the changes, with the correct path where each file should go. Then I can download that package, copy it over, and just apply it, so that only what was changed gets updated.
Is that possible? Is there any plugin that I can use?
This really depends on the build tool/language you are using to build your application. I don't think there is a generic Jenkins plugin for this.
Another idea would be to upload your package to a local Nexus server. After the next build, download the previous package and compare the files from the old and new builds. With this information you can create a patch package for your client's server.
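For the comparison step, a minimal PowerShell sketch could look like the following; it assumes both builds have been unzipped into hypothetical local folders and simply collects every new or changed file into a patch folder:
# Hypothetical locations of the previous and current unzipped builds
$oldBuild = "C:\builds\1.0"
$newBuild = "C:\builds\1.1"
$patch    = "C:\builds\patch-1.1"

Get-ChildItem $newBuild -Recurse -File | ForEach-Object {
    $relative = $_.FullName.Substring($newBuild.Length + 1)
    $oldFile  = Join-Path $oldBuild $relative
    # Copy a file into the patch only if it is new or its content hash changed
    if (-not (Test-Path $oldFile) -or
        (Get-FileHash $_.FullName).Hash -ne (Get-FileHash $oldFile).Hash) {
        $target = Join-Path $patch $relative
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item $_.FullName $target
    }
}
# Zip the patch folder so it can be transferred over FTP and extracted on the client's server
Compress-Archive -Path "$patch\*" -DestinationPath "$patch.zip"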
Each time I generate my build through Jenkins, my existing jar file in the target folder is overwritten by Maven. For example: I have an existing version 1.0 in the Jenkins target folder; now if I create a new build with version 1.1, the previous version in my target folder gets overwritten.
I don't want that to happen; I want to archive all the versions (because we might provide some of the old features to a certain set of customers). I am just trying to understand whether there is a way to do this in a Jenkins pipeline. I'd prefer not to use plugins; it would be nice to do it the declarative way using a Jenkinsfile.
First of all, it's not the best solution to store your artifacts only in the target folder without copying them anywhere else. Usually all needed build artifacts are stored in Nexus or Artifactory repositories (of course, you can also copy them to some local directory); a sketch of the latter option is below. You can do that in a pipeline Jenkinsfile as well, but you still need to install the corresponding plugin. For example, to publish artifacts to a Nexus repo you can use the Nexus Platform Plugin, see this answer for details.
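A minimal sketch of the copy-to-a-local-directory option, assuming a Windows agent and called for example from a powershell/bat step after mvn package (the archive path is a hypothetical placeholder):
# Keep every built jar in a version-specific sub-folder outside the workspace
$archiveRoot = "C:\jenkins\artifact-archive\MyApp"

Get-ChildItem "target\*.jar" | ForEach-Object {
    # e.g. ...\MyApp-1.1\MyApp-1.1.jar, so later builds never overwrite it
    $versionDir = Join-Path $archiveRoot $_.BaseName
    New-Item -ItemType Directory -Path $versionDir -Force | Out-Null
    Copy-Item $_.FullName $versionDir
}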
About your target folder being overwritten: I'm not sure it's cleaned by Jenkins by default. To clean the workspace, you need to specify the Discard old builds option in the job configuration first.
It seems more likely that you are just executing an mvn clean ... command, which is why the target folder is cleaned, so I would recommend checking that first.
I need to regularly download a complete set of the latest code for a particular project from a VSTS account (server workspace) to a folder on a file server, for read-only archiving.
Currently I log on to the web portal and click Download as ZIP for the selected project and save this to the file server.
But I'd like a more automated way, preferably something I can schedule to run from the file server itself which won't have Visual Studio installed or cached credentials for the online account.
Any of the following solutions would be OK:
A permanent URL to download the latest code as a zip file
A REST URL to get all the latest files
A command line tool to connect to the VSTS account and download all the latest files for a particular project to a specified local folder, not the default local folder
Nice to have:
Option to download as ZIP or recursive folder of files
Set files modified date as check-in time
Remove source control binding information from the downloaded files
Provide user credentials as part of the command line, rather than assuming the default cached credentials on the machine
You could use our tools in Visual Studio, Eclipse, or from the command line to keep a local copy of your source code on your machine.
For more details, please refer to the official tutorial: Download (get) files from the Server
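As a rough sketch of the command-line route, the tf.exe client could be driven from PowerShell roughly like this; it assumes tf.exe is on the PATH (it ships with Visual Studio/Team Explorer, so this would have to run from a machine that has it), and that alternate credentials or a PAT can be supplied via /login. All names and paths are hypothetical:
# Create a server workspace, map the project to a local archive folder, and get the latest files
tf workspace /new ArchiveWs /collection:https://myaccount.visualstudio.com/DefaultCollection /login:user,password /noprompt
tf workfold /map '$/MyProject' 'D:\Archive\MyProject' /workspace:ArchiveWs
tf get 'D:\Archive\MyProject' /recursive /login:user,password /noprompt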
Also, if you want to download your code as a zip:
You can click on any ellipsis to find the menu which contains the Download as Zip option.
If you want an automated way, I suggest you use a build pipeline. You could disable the default get-sources step in the build definition and use your own PowerShell script to get the sources/pull the files into the workspace. For how to do this, please follow: Is it able to ignore/disable the first step Get source in vNext Build?
This will download the files to your build agent, if that's not the machine you are working on. You could combine the Archive Files and Windows Machine File Copy tasks and select a Scheduled trigger in your build definition.
You might consider using an agent + build definition to download the source code (this could happen either on a schedule or triggered after every check-in). This could easily include compression to a ZIP file and some copy commands.
An additional benefit would be that the build definition doesn't have to re-download the entire source code repository each time it runs; instead, it can be configured to only get the changes that have occurred.
PowerShell
# Download the folder contents as a zip through the _api/_versioncontrol/itemContentZipped endpoint, then extract it
$tfsurl = "https://tfs.alogent.com/tfs"
$collection = "/defaultcollection"
$project = "/MyProject"
$api = "/_api/_versioncontrol/itemContentZipped?repositoryId=&path="
$path = "$/MyProject/Source/Datafolder"
# -UseDefaultCredentials passes the current Windows credentials to TFS
Invoke-WebRequest -UseDefaultCredentials -Uri "$tfsurl$collection$project$api$path" -OutFile ".\DataFolder.zip"
Expand-Archive .\DataFolder.zip
Is there any plugin similar to this one:
https://wiki.jenkins.io/display/JENKINS/Config+File+Provider+Plugin
in TFS Build & Release?
I want to provide a configuration.json file which is not included in the git source.
Unfortunately, there is no such extension in TFS/VSTS Build & Release.
According to your tag tfs2013, it seems you are working with XAML builds.
Just like you need a workspace on your dev machine to develop your app, you must specify the workspace that the build agent uses to build and test your app; the source files are then gotten/pulled from the TFS server side. It's not possible to achieve the following features of that plugin in the TFS UI:
Adds the ability to provide configuration files (i.e., settings.xml for maven, XML, groovy, custom files, etc.) loaded through the Jenkins UI which will be copied to the job's workspace.
As a workaround, you could keep the configuration.json files on an FTP server instead of in the git source, and then use a PowerShell script to download the files into the build agent's workspace. If you create such a PowerShell FTP download script, you can have it called by the build template (customize the workflow); a sketch is below.
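A minimal sketch of that download script, where the server, credentials, and paths are hypothetical placeholders (TF_BUILD_SOURCESDIRECTORY is assumed here as the XAML build's sources directory variable; adjust for your build system):
# Download configuration.json from the FTP server into the agent's sources directory
$ftpUri = "ftp://ftp.example.com/configs/configuration.json"
$target = Join-Path $env:TF_BUILD_SOURCESDIRECTORY "configuration.json"

$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("ftpUser", "ftpPassword")
$client.DownloadFile($ftpUri, $target)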
We have an automated build process in place which creates the release artifacts for us.
These are copied to an FTP location and, after certain processes, the packages are available for deployment to val, customer dev, UAT and prod.
I want to create a Release in TFS where the release simply uses the package from the FTP location instead of triggering a new build.
The process of moving the artifacts to the FTP location using a detached build process is legacy and I'm afraid cannot be changed.
I would like to trigger a release (at the moment I'm testing this using VSTS) which will use the artifact from an FTP location instead of triggering a build.
My build server / process is not in TFS and it's a large application with multiple components.
Triggering a release that uses artifacts from an FTP location instead of triggering a build is supported. But there is no default step for downloading an FTP file, so you need to use a command or PowerShell script to download it.
In the release definition, delete all artifacts under the Artifacts tab. Then it won't download any files from your builds.
Create a bat file or PowerShell script file and write the command/script to download the files you want (a sketch follows the links below). Here are some methods you could have a look at:
How to script FTP upload and download?
How to download files from FTP site in one command line without user interaction (Windows)
https://tecadmin.net/download-upload-files-using-ftp-command-line/
Check the bat/script file into TFS and run it in your release definition.
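For example, a minimal PowerShell sketch that downloads a release package from FTP into the agent's artifacts folder and unzips it; the server, credentials, and file names are hypothetical, and SYSTEM_ARTIFACTSDIRECTORY is assumed as the release's artifacts directory variable:
# Download the package over FTP using FtpWebRequest
$ftpUri  = "ftp://ftp.example.com/releases/MyApp-1.2.3.zip"
$zipFile = Join-Path $env:SYSTEM_ARTIFACTSDIRECTORY "MyApp-1.2.3.zip"

$request = [System.Net.FtpWebRequest]::Create($ftpUri)
$request.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
$request.Credentials = New-Object System.Net.NetworkCredential("ftpUser", "ftpPassword")

# Stream the response straight into the local zip file
$response = $request.GetResponse()
$stream   = $response.GetResponseStream()
$file     = [System.IO.File]::Create($zipFile)
$stream.CopyTo($file)
$file.Close(); $response.Close()

# Unzip next to it so the following deployment tasks can pick the files up
Expand-Archive $zipFile -DestinationPath (Join-Path $env:SYSTEM_ARTIFACTSDIRECTORY "MyApp")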
Currently I have my whole automation source code (scripts and test data) on the Jenkins server, and whenever I want to change my test data, I need to go to the Jenkins server machine and change it there.
The problem is that if I want to change the test data, I have to wait a long time to get access from the admin team. Also, I have a huge amount of test data in my project, so I am not interested in creating a Jenkins project with parameterized builds. So if there is any option available in Jenkins to import files (Excel) before the build, that would be helpful.
Please consider this a priority.
The most common way to transfer files to a Jenkins server is to use a version control system like Git or Subversion:
Commit the files to the version control system
Configure the Jenkins job to detect changes in the version control system and check out a working directory for the build or test (see the sketch after this list)
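A minimal sketch of that workflow from your own machine, assuming the test data lives in a Git repository under a hypothetical testdata folder:
# Commit and push the updated Excel test data; Jenkins then polls (or is notified by a webhook)
# and checks the new files out into its workspace before running the tests
git add testdata\accounts.xlsx
git commit -m "Update test data"
git push origin master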
If your files are so big they cannot fit into a version control system (some of them do not perform well with files in the gigabyte range), you could use a shared disk drive which you have permission to write to.