I need to regularly download a complete set of the latest code for a particular project from a VSTS account (server workspace) to a folder on a file server, for read-only archiving.
Currently I log on to the web portal and click Download as ZIP for the selected project and save this to the file server.
But I'd like a more automated way, preferably something I can schedule to run from the file server itself which won't have Visual Studio installed or cached credentials for the online account.
Any of the following solutions would be OK:
A permanent URL to download the latest code as a zip file
A REST URL to get all latest files
A command line tool to connect to the VSTS account and download all latest files for a particular project to a specified local folder (not the default local folder)
Nice to have:
Option to download as ZIP or recursive folder of files
Set files modified date as check-in time
Remove source control binding information from the downloaded files
Provide user credentials as part of the command line, rather than assuming the default cached credentials on the machine
You could use our tools in Visual Studio, in Eclipse, or from the command line to keep a local copy of your source code on your machine.
For more details, please refer to the official tutorial: Download (get) files from the Server
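For the scheduled, credentials-on-the-command-line scenario in the question, the command-line route could look roughly like the sketch below. It uses tf.exe (the Team Explorer command-line client); the collection URL, workspace name, local folder, and user,password values are placeholders, not values from your account:

tf workspace /new ArchiveWS /collection:https://youraccount.visualstudio.com/DefaultCollection /login:user,password /noprompt
tf workfold /map $/MyProject D:\Archive\MyProject /workspace:ArchiveWS
cd /d D:\Archive\MyProject
tf get $/MyProject /recursive /noprompt /login:user,password

Note that tf.exe requires a Visual Studio or Team Explorer installation (or the cross-platform Team Explorer Everywhere client), which may rule it out for a bare file server.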
Also, if you want to download your code as a zip:
You can click on any ellipsis to find the menu which contains the Download as Zip option.
If you want an automated way, I suggest using the build pipeline. You could disable the default Get Sources step in the build definition and use your own PowerShell script to get/pull the files into the workspace. For how to do this, see: Is it able to ignore/disable the first step Get source in vNext Build?
This will download the files onto your build agent, which may not be the machine you are working on. You could combine the Archive Files and Windows Machine File Copy tasks, and select a Scheduled trigger in your build definition.
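A minimal sketch of what that PowerShell get-source step could run, assuming tf.exe (from Team Explorer / Visual Studio Build Tools) is available on the agent and a workspace is already mapped to the build's sources directory (the project path is a placeholder):

cd $env:BUILD_SOURCESDIRECTORY
& tf get '$/MyProject' /recursive /noprompt

The Archive Files and Windows Machine File Copy tasks can then zip and push the result, as described above.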
You might consider using an agent plus a build definition to download the source code (this could happen either on a schedule or triggered after every check-in). This could easily include compression to a ZIP file and some copy commands.
An additional benefit is that the build definition doesn't have to re-download the entire source code repository each time it is run; instead, it can be configured to get just the changes that occurred.
PowerShell
# Build the URL for the TFVC "download folder as zip" endpoint used by the web portal
$tfsurl = "https://tfs.alogent.com/tfs"
$collection = "/defaultcollection"
$project = "/MyProject"
$api = "/_api/_versioncontrol/itemContentZipped?repositoryId=&path="
$path = "$/MyProject/Source/Datafolder"
# Download the zipped folder using the credentials of the account running the script
Invoke-WebRequest -UseDefaultCredentials -Uri "$tfsurl$collection$project$api$path" -OutFile ".\Datafolder.zip"
# Extract the archive next to the script
Expand-Archive .\Datafolder.zip
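If the machine running the script has no cached credentials for the account (as in the original question), the default credentials can be swapped for an explicit credential object; this is a sketch and assumes alternate authentication credentials (or a personal access token usable for basic authentication) are enabled on the account:

$cred = Get-Credential
Invoke-WebRequest -Credential $cred -Uri "$tfsurl$collection$project$api$path" -OutFile ".\Datafolder.zip"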
I have a Jenkins job that gets the code from version control and builds it (like a normal pipeline does). What I was doing is: after building the project, I download the build, use FTP to transfer it to the client's server, unzip it, and then copy the whole build. Because I copy the whole build, my application's downtime is very high. (I have to use FTP because, as a service provider, we have some limitations and can't change this policy.)
What I want is for Jenkins to know what changed when it builds, so that it creates a package containing only the changes, with the correct path for each file. I can then download that package, copy it over, and apply it, so that only the files that changed get updated.
Is that possible? Is there any plugin that I can use?
This really depends on the build tool/language you are using to build your application. I don't think there is a generic Jenkins plugin for this.
Another idea would be to upload your package to a local Nexus server. After the next build, download the previous package and compare the files from the old and the new build. With this information you can create a patch package for your client's server.
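A minimal sketch of that comparison in PowerShell, assuming the old and new builds have already been extracted to .\old and .\new (the folder names are placeholders); changed or added files are copied into .\patch with their relative paths preserved:

$old = Resolve-Path .\old
$new = Resolve-Path .\new
Get-ChildItem $new -Recurse -File | ForEach-Object {
    # path of the file relative to the new build folder
    $rel = $_.FullName.Substring($new.Path.Length + 1)
    $counterpart = Join-Path $old $rel
    # copy the file if it is new or its content hash differs from the old build
    if (-not (Test-Path $counterpart) -or
        (Get-FileHash $_.FullName).Hash -ne (Get-FileHash $counterpart).Hash) {
        $dest = Join-Path .\patch $rel
        New-Item -ItemType Directory -Path (Split-Path $dest) -Force | Out-Null
        Copy-Item $_.FullName $dest
    }
}

The resulting .\patch folder can then be zipped and shipped over FTP instead of the full build.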
I'm using TFS 2018 to automatically build a Cypress Creator project via command-line tools. After a successful build I want to copy only my .hex file to another server, not the whole directory.
I followed the instructions here:
My Publish Artifact looks like this:
Error message: "PathToPublish is not found."
If I change the PathToPublish like this:
It works without any problem. So I think there is a syntax issue here.
Question: How to publish just a specific file instead of a whole directory?
Thanks in advance!
You can't specify individual files in the Publish Build Artifacts task in TFS 2018. For your requirement, you can add a Copy Files task to copy the necessary files to a folder, then publish this folder:
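For example, the two tasks could be configured roughly like this (the .hex location and artifact name are assumptions; adjust the Contents pattern to your project layout):

Copy Files task
  Source Folder:   $(Build.SourcesDirectory)
  Contents:        **\*.hex
  Target Folder:   $(Build.ArtifactStagingDirectory)

Publish Build Artifacts task
  Path to publish: $(Build.ArtifactStagingDirectory)
  Artifact name:   drop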
We are moving to TFS 2018 from 2012 and I'm working on migrating the builds. One of the builds has a few mtbwa:DownloadFiles activities in it but I don't see an equivalent way to do this in the new build system. We have a few utilities in a different branch that are used to build installers. So I need to download those utils before completing the build. How would I do this in the new build system?
If the files are in source control, then you can map the source directly in the Get Sources step.
The files will then be automatically downloaded to $(build.sourcesDirectory) on the agent machine by default.
After that you can also add a Copy Files task to copy the files to any location as needed.
If the files are not in source control, you can also use the Copy Files task to copy them, but you need to make sure that the service account has the proper permission to access the source folder.
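For example, the two steps could look roughly like this (the branch and folder names are assumptions for illustration only):

Get Sources mapping
  Server path: $/MyTeamProject/UtilsBranch/InstallerTools
  Local path:  InstallerTools

Copy Files task
  Source Folder: $(build.sourcesDirectory)\InstallerTools
  Contents:      **
  Target Folder: $(build.binariesDirectory)\InstallerTools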
UPDATE:
If the team projects are in the same collection, mapping sources in the Get Sources step is also available. You need to enter the server path manually (clicking ... only lets you navigate to the root path of the current team project).
e.g.:
In the screenshot below I entered the server path $/2017ScrumProjectFromVS/WpfTest
This also works with the Copy Files task, which means you can copy files directly from another team project in the same collection.
When I started writing this question, my problem was that after a successful VSTS Build, I wasn't able to see the files relating to my web application project for release. Only the files from certain other projects in the solution were present. However, I just came across this question, which has helped.
I can now see the compiled .dll files for my web application project, after altering the configuration of the Content setting in the Build - that is, the contents of the Bin folder under that project. But I can't see anywhere the other files I need to copy the built web application to my server - the views, the scripts, the css, etc.
I'm finding the power and flexibility of VSTS's Build and Release functionality very confusing as it's complete overkill for our requirements. Up until now, I've just right-clicked on the web app project in Visual Studio selected Publish and used the File System publish method. Easy. Now that I want to automate the building and deploying of the application, it's many times more complicated!
So, can anybody tell me how I can get the solution to build in VSTS in such a way that I can then use a Copy Files task in the Release Definition to copy the files to our web server (the server isn't visible to the Internet so I'm using a locally-hosted Agent)?
In a vNext build, you publish your build artifacts with the Copy Files or Publish Build Artifacts steps. Use the local path on the agent where artifacts are copied to before being pushed to their destination. For example:
Add the /p:DeployOnBuild=true /p:OutDir="$(build.artifactstagingdirectory)\" arguments in the Visual Studio Build step;
Change "Path to Publish" of the Publish Build Artifacts task to $(build.artifactstagingdirectory)\_PublishedWebsites\ProjectName:
For details, please check the screenshot of the build step in this question: How do I get the expected output from a TFS 2015 build (to match my XAML build)?
Based on your comments, you have published the web app from Visual Studio. Usually, this action generates a publish profile under the Project/Properties/PublishProfiles folder. The settings you used to publish the web app are stored in that profile. So you just need to make sure this publish profile is checked into source control, and then, in the TFS build, add the following MSBuild arguments:
/p:DeployOnBuild=true /p:PublishProfile="publishprofile.pubxml"
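The same arguments can be tested locally from a Developer Command Prompt before wiring them into the Visual Studio Build step (the project file name here is just an example):

msbuild MyWebApp.csproj /p:DeployOnBuild=true /p:PublishProfile="publishprofile.pubxml"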
Here is the use-case. There is a very large file in the TFS branch, let's say it is 50GB. When I try to get this specific file with a command line similar to this:
tf get $/Branch/very-large-file.dat
The operation fails because the time required for the download is longer than the time the VPN stays connected, and of course TFS is behind a VPN. This is why I have downloaded the file manually using a different approach. The problem is that once the file is in place in my local directory and I check which files need to be updated with the following command:
tf get $/Branch/ /recursive /preview
I see that the very-large-file.dat will be downloaded from TFS. And if I go again with:
tf get $/Branch/very-large-file.dat
This will just create the partial file in the directory and start downloading the file from scratch.
Is there a way to update the local version table on the server, so that TFS knows that I have the file locally without having to download it?
In TFS 2012, local workspaces were added, in which case TFS will recognize the file and compare it to the server version. In 2010 and earlier, the server keeps a list of the files in your workspace stored on the server at all times, and that list will say that you didn't download the file. The server workspace is also cached on your client. I don't know of a way to tell TFS from the command line, or in another simple way, that the file is up to date.
As a workaround you could 'cloak' the large file to tell TFS you don't want to download it at all.
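For example, cloaking is done with the tf workfold command. Note that cloaking applies to server folders rather than individual files, so this sketch assumes the large file can live in a folder of its own (the folder and workspace names are placeholders):

tf workfold /cloak $/Branch/LargeFiles /workspace:MyWorkspace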