TFS Build and Release - configuration file provider

Is there a plugin similar to the Jenkins Config File Provider Plugin
https://wiki.jenkins.io/display/JENKINS/Config+File+Provider+Plugin
available in TFS Build & Release?
I want to provide a configuration.json file which is not included in the Git source.

Unfortunately, there is no such extension in TFS/VSTS Build & Release.
According to your tfs2013 tag, it seems you are working with a XAML build.
Just like you need a workspace on your dev machine to develop your app, you must specify the workspace that the build agent uses to build and test your app; the agent then gets/pulls the source files from the TFS server. The following Jenkins feature has no equivalent in the TFS UI:
Adds the ability to provide configuration files (i.e., settings.xml
for maven, XML, groovy, custom files, etc.) loaded through the Jenkins
UI which will be copied to the job's workspace.
As a workaround, you could put the configuration.json file on an FTP server instead of in the Git source, and then use a PowerShell script to download the file into the build agent workspace. If you create a PowerShell script that downloads over FTP, you can have it called by the build template (customized workflow).
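A minimal PowerShell sketch of such a download script, assuming WebClient over FTP and the TF_BUILD_SOURCESDIRECTORY variable of a XAML build agent (the host, credentials and paths are placeholders, not values from the question):

    # Download configuration.json from an FTP server into the XAML build agent's workspace.
    $ftpUrl      = "ftp://ftp.example.com/config/configuration.json"              # hypothetical FTP location
    $destination = Join-Path $env:TF_BUILD_SOURCESDIRECTORY "configuration.json"  # XAML build sources folder

    $client = New-Object System.Net.WebClient
    $client.Credentials = New-Object System.Net.NetworkCredential("buildUser", "buildPassword")  # placeholder account
    $client.DownloadFile($ftpUrl, $destination)
    Write-Output "Downloaded configuration.json to $destination"

The script can then be wired into the customized workflow (for example as a pre-build activity) so the file is in place before compilation starts.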

Related

TFS 2017 release management artifact files from version control

I would like to access some files from source control (TFVC) during release management.
The artifact sources I found are either a single build (type "Build") or the whole source tree (type "Team Foundation Version Control").
The type "Team Foundation Version Control" seems to match, but it does not allow selecting a subfolder, e.g. "$/MyApp/branches/V2/scripts".
Do I need to create an artifact for the script files?
Instead of linking in a separate repository, I'd strongly recommend either publishing them as a build artifact (as the other answer mentions) or publishing them as a versioned NuGet package.
The reason is that everything that goes into a deployment should be versioned together. Scripts that change out of sync with everything else can cause abrupt deployment failures for unknown reasons. Let's say you linked those scripts in as an artifact and started a deployment along your pipeline from Dev -> Production. Dev deployment is fine. QA deployment is fine. Staging deployment is fine. Production deployment... fails? Because of an error in the scripts?
Whoops, someone committed a change to those scripts and introduced a bug. But the scripts weren't versioned, so you had no way of guaranteeing that the scripts being used in prior stages were the same as the scripts being used in your production stage.
You can save your source code as an artifact in your build process: use the "Publish Artifact" step to publish it to TFS or to a UNC path. Release Management then downloads your artifacts as the first step of the release.
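If only the script folder is needed, a small PowerShell step run before the Publish Artifact step can stage just that folder; the local paths below are assumptions based on the $/MyApp/branches/V2/scripts folder mentioned in the question:

    # Copy the deployment scripts into the artifact staging folder so the
    # subsequent "Publish Artifact" step drops them with the build output.
    $source      = Join-Path $env:BUILD_SOURCESDIRECTORY "scripts"            # local mapping of $/MyApp/branches/V2/scripts (assumed)
    $destination = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "scripts"

    New-Item -ItemType Directory -Path $destination -Force | Out-Null
    Copy-Item -Path (Join-Path $source "*") -Destination $destination -Recurse -Force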

Release Management in TFS without build automation

We have an automated build process in place which creates the release artifacts for us.
These are copied to an FTP location and, after certain processes, the packages are made available for deployment to val, customer dev, UAT and prod.
I want to create a Release in TFS where the release should simply use the package from the ftp location instead of triggering a new build.
The process of moving the artifacts to the ftp using a detached build process is legacy and I'm afraid cannot be changed.
I would like to trigger a release ( at the moment I'm testing this using VSTS ) which will use the artifact from an ftp instead of triggering a build.
My build server / process is not in TFS and it's a large application with multiple components.
Triggering a release that uses artifacts from an FTP location instead of triggering a build is supported, but there is no built-in step to download files from FTP. You need to use a command or a PowerShell script to download them.
In the release definition, delete all artifacts under the Artifacts tab. Then it won't download any files from your builds.
Create a .bat file or PowerShell script that downloads the files you want. Here are some methods you could look at:
How to script FTP upload and download?
How to download files from FTP site in one command line without user interaction (Windows)
https://tecadmin.net/download-upload-files-using-ftp-command-line/
Check the .bat/PowerShell script into TFS and run it in your release definition.
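A minimal PowerShell sketch of such a download script, using FtpWebRequest to pull one package into the release agent's working folder (the host, credentials and package name are placeholders, not values from the question):

    # Download one release package from FTP into the release agent's working directory.
    $ftpUri     = "ftp://ftp.example.com/releases/MyApp.zip"                                 # hypothetical package location
    $localPath  = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY "MyApp.zip"
    $credential = New-Object System.Net.NetworkCredential("releaseUser", "releasePassword")  # placeholder account

    $request             = [System.Net.WebRequest]::Create($ftpUri)
    $request.Method      = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $request.Credentials = $credential

    $response = $request.GetResponse()
    try {
        $ftpStream  = $response.GetResponseStream()
        $fileStream = [System.IO.File]::Create($localPath)
        $ftpStream.CopyTo($fileStream)
        $fileStream.Dispose()
    }
    finally {
        $response.Close()
    }
    Write-Output "Downloaded package to $localPath"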

TFS Build: How to get EXE file deployed to website

I have a TFS Build Definition that builds/deploys a web project to our internal IIS server. That works fine. However, I would also like to build/deploy a WinForms app (.exe) to the same web site.
I did add the WinForms solution to the Build Definition. TFS builds the .exe and copies it to the drops folder. But it's not in the _PublishedWebsites folder.
I've been manually copying the file over to the web site. Is there a way to automate this?
Thanks in advance!
In a XAML build, you can check in a script and specify a post-build script path in your XAML build definition. The script gathers binaries from the typical output locations and copies them to the folder from which TFBuild copies and drops to your staging location. For more information about running a script in your XAML build process, see: https://msdn.microsoft.com/library/dn376353%28v=vs.120%29.aspx
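A minimal post-build PowerShell sketch for that approach; the executable and web site names below are placeholders, not taken from the question:

    # Post-build script for a XAML build: copy the WinForms executable into the
    # web site's _PublishedWebsites output so it is dropped together with the site.
    $binaries = $env:TF_BUILD_BINARIESDIRECTORY
    $exe      = Join-Path $binaries "MyWinFormsApp.exe"                        # hypothetical exe name
    $target   = Join-Path $binaries "_PublishedWebsites\MyWebSite\downloads"   # hypothetical target folder

    New-Item -ItemType Directory -Path $target -Force | Out-Null
    Copy-Item -Path $exe -Destination $target -Force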
If you upgrade to TFS 2015, in the new build system you can simply add a Publish Build Artifacts task to your build definition. For how to use this task, see: https://www.visualstudio.com/en-us/docs/build/steps/utility/publish-build-artifacts

Plugin to Copy Files from Desktop to Jobs Workspace in Jenkins

I have a scenario where users who have created some configuration files need to upload them from the desktop where they access Jenkins into the job's workspace, in order to build and execute tests.
I tried the Config File Provider plugin as described at https://wiki.jenkins-ci.org/display/JENKINS/Config+File+Provider+Plugin. It works fine for copying the configuration file into the Jenkins UI, which can later be synced to the slave via the given path in the Build Environment section of the respective job. But the users who wish to upload these files don't have administrative rights, so they cannot access Configuration File Management, which sits under the Manage Jenkins tab. Is there any way to expose Configuration File Management in the Jenkins sidebar and allow users to edit the files?
Are there any other plugins that will help me achieve this? I also tried the Copy to Slave plugin, but it only copies files under $JENKINS_HOME/userContent to the job's workspace, so we would first have to copy the files from the desktop to $JENKINS_HOME/userContent and then use the plugin.
Wouldn't a parameterized job with a File Parameter work? See https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Build

Jenkins - Is it necessary to have a repository for running a multi configuration project

I am new to Jenkins. I am trying to configure Jenkins to run my Selenium tests on multiple browsers. So I thought multi-configuration project would be a best choice. I am using Sauce labs for performing cross-browser testing.
My Selenium source code is in a local directory on my system; I have not uploaded it to any repository. I have configured a multi-configuration project with a custom workspace pointing to my local source code, and selected "None" in the Source Code Management section.
Now, when I build the job, it creates a workspace for each browser combination, e.g. <project workspace>\SELENIUM_DRIVER\Windows 2003firefox16 and <project workspace>\SELENIUM_DRIVER\Windows 2003internet explorer8. But the files are not copied to each of these workspaces automatically; I need to copy them into these directories manually for it to work.
Is it necessary to have a repository like SVN, CVS or Git for this to work? Or is there a way I can run the build from my local system?
For this to work, a repository is not required, but you do need a good way to access your artifacts and Selenium code. As a suggestion, drop the artifacts on a shared drive as a preliminary step, and keep your Selenium source code on a shared drive as well; this will allow you to run multiple tests from multiple machines. A small copy step like the one sketched below can pull the sources into each configuration's workspace.
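A minimal PowerShell sketch of such a copy step, assuming a Windows agent and a hypothetical UNC share (Jenkins sets the WORKSPACE environment variable for each configuration):

    # Pre-build step: copy the Selenium sources from a shared drive into the
    # current configuration's workspace. The UNC path is a placeholder.
    $share     = "\\fileserver\selenium\src"   # hypothetical shared location
    $workspace = $env:WORKSPACE                # set by Jenkins per configuration

    Copy-Item -Path (Join-Path $share "*") -Destination $workspace -Recurse -Force
    Write-Output "Copied Selenium sources from $share to $workspace"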
Cheers
