Artifactory: Download exploded archive - jenkins

I cannot help thinking I am missing the point.
I have a Jenkins system that stores builds in Artifactory Pro. I would like to be able to compare builds on the Artifactory server and download a set of files as a zip file.
The output of my build is a few thousand files in a folder structure.
As I understand it, Artifactory uses de-duplicating storage, so the efficient way to do this is to upload each file individually, as only a few change each build.
When I do this it takes 20 minutes, but the result on the server is good: I can compare builds and see the changes. However, I cannot download the whole release; I have to click on each file and fetch them one by one.
If I upload a zip file it is quicker and I can download it, but I lose the ability to see the files inside, and presumably this will eat up disk space as there can be no de-duplication.
Ah, the explode option: this unpacks the zip file on the Artifactory server. Brilliant, except that if I diff builds it just shows me the original archive and says it changed, and I still have to download every file individually.
Has anyone cracked this? I want fast uploads, file diffs (with efficient storage) and single-click downloads.

To download files in a "single click" you can either:
Use the REST API to download a folder as a compressed file
Use the JFrog CLI to download files in a single command using file specs.
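For example, a rough sketch of both routes from PowerShell (assuming Artifactory Pro with the "Enable Folder Download" option turned on; the server URL, repository path and token below are placeholders):
# Option 1: REST API - retrieve a whole folder as a single zip (placeholder server/repo/token)
$server = 'https://artifactory.example.com/artifactory'
$token  = '<api-token>'
Invoke-WebRequest -Uri "$server/api/archive/download/my-repo/builds/1.2.3?archiveType=zip" -Headers @{ Authorization = "Bearer $token" } -OutFile 'build-1.2.3.zip'
# Option 2: JFrog CLI - download the folder contents in one command
jfrog rt dl "my-repo/builds/1.2.3/" "build-1.2.3/"
Either way you get the whole build back in one operation while still uploading individual files, so de-duplication and per-file diffs keep working.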
HTH,
Yinon

Related

Is there a way to see the size of files in Plastic Cloud Edition?

I am trying to do an audit of the files I have in Plastic Cloud to see if there's anything in there that is hogging space we don't need, and I'm not sure exactly how to do that. What I have in my local workspace is not the same as what's in the cloud (because of ignore lists, if I'm understanding correctly?), so is there any way I can view what's in the cloud, or am I thinking about this all the wrong way?
The working copy downloaded to your machine is just the latest version of the project, but the cloud storage includes the whole history of the project (every previous revision of each file on different branches...).
From the Branch Explorer, you can right-click any changeset and select "Browse repository in this changeset". This way you can check the tree of that specific changeset (including file paths and file sizes).
There is a recent feature that allows you to archive old file revisions in the cloud and thereby reduce database usage. Please check: https://forum.plasticscm.com/topic/21799-trimming-to-reduce-size-of-the-database/?do=findComment&comment=45397
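If you prefer the command line, something along these lines can be a starting point (a loose sketch: it assumes the cm client is installed and is run from inside a workspace connected to the cloud repository; the size threshold is arbitrary and the query syntax may vary slightly between versions):
# List file revisions larger than ~10 MB to spot what is using the space
cm find "revisions where size > 10485760"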

TFS - not able to download files

In our project solution we changed 5 files (content changes only), and the changeset number is NNNNN; we want to download only those 5 files. We can get the entire set of solution files up to this changeset, but since only the content changed we want just those files rather than everything else, such as DLLs or *.cs files.
In other source control tools, the view-history feature displays the affected files and there is a way to export just those files to a folder rather than the entire solution.
You could write a little PowerShell/.NET code to do this (a sketch is below), or you can use the TFS Power Tools: there is a command-line tool, tfpt.exe, and you can use the getcs command:
tfpt getcs /changeset:12
There is a decent write-up of the process here.
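If you go the PowerShell/.NET route instead, a rough sketch could look like this (assuming the Team Explorer client assemblies are installed; the collection URL, changeset number and output folder are placeholders):
# Load the TFS client assemblies (installed with Visual Studio / Team Explorer)
Add-Type -AssemblyName 'Microsoft.TeamFoundation.Client'
Add-Type -AssemblyName 'Microsoft.TeamFoundation.VersionControl.Client'

$collectionUrl = [Uri]'http://tfsserver:8080/tfs/DefaultCollection'   # placeholder
$tpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($collectionUrl)
$vcs = $tpc.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])

$cs     = $vcs.GetChangeset(12345)      # replace with your changeset number
$outDir = 'C:\ChangedFiles'             # placeholder output folder
New-Item -ItemType Directory -Force -Path $outDir | Out-Null

foreach ($change in $cs.Changes) {
    $item = $change.Item
    if ($item.ItemType -eq 'File') {
        # Download the exact revision of each changed file into a flat output folder
        $target = Join-Path $outDir ([System.IO.Path]::GetFileName($item.ServerItem))
        $item.DownloadFile($target)
    }
}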

Moving TFS workspace to another PC without re-downloading

I have a TFS workspace which I need to move to my new PC. I have copied the whole folder structure over and ensured that the workspace is mapping to the correct folders. However the "Latest" column for every file displays as "Not downloaded". How can I reconcile this such that TFS is aware that the files match the server version?
The standard answer seems to be to re-download the whole thing. Unfortunately the repository is huge, my connection is unreliable, and I have monthly download quotas. Is there anything in the command-line tools or power tools that can make it compare file hashes or similar and realise that the files are identical?
Thanks.
There's a binary metadata file inside the working copy that stores the mapping of every path in your repo to its path on the filesystem.
It uses absolute paths, so unless your new project folder occupies the exact same location as it did on the original computer, they won't match.
Because it's a binary format, you can't do something simple like mass-replacing the paths with a text editor or sed.

TFS 2010 Build, constant drop location, random access issue

We are using TFS 2010 Build to deliver libraries on a fixed location. ( \\server\product-R0\latest )
Other team projects reference the library from this location.
In my build process I check whether the build and unit tests passed; if everything is OK, I:
Transform web/app.config
Delete the latest folder using a "DeleteDirectory" activity
Create the latest folder using a "CreateDirectory" activity
Copy the binaries into the folder using a "CopyDirectory" activity
I delete the folder first because otherwise, if we rename an assembly, the old one would never be removed.
The issue is random and happens 40% of the time:
TF270002 : An error occurred copying files from
'D:\Builds\1\FooTeam\BarService\Binaries' to
'\\nas\Builds\BarService-R0\Latest'.
Details : Access to the path
'\\nas\Builds\BarService-R0\Latest\SomeFile.dll'
is denied.
If you launch the build several times, it works.
I've tried the usual dumb idea of putting sleeps between steps to see what happens, but it doesn't solve the problem; it just seems to reduce the probability of it happening.
It's as if TFS tries to copy while the directory is still being deleted; sometimes it hangs on the directory-creation step.
Anyone? Thank you!
The most elegant solution is to create a link instead of copying, something like
mklink /J D:\Drops\MyBuild_LatestGood D:\Drops\MyBuild_2014-06-13
Plus: No copy involved, same ACLs.
Caveats: this command only works locally, i.e. when the drop share is located on the build server. There are also options in the case of a NAS, as long as you are allowed to execute remote commands (e.g. via SSH).
Another option is to create a network share on the desired folder, even if the disk is remote, as long as it resides on a Windows server.
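A minimal sketch of how the junction flip could be scripted from the build (for example via an InvokeProcess activity); the drop paths below are placeholders:
# Repoint a 'Latest' junction at this build's drop folder instead of copying files
$latest  = 'D:\Drops\BarService-R0\Latest'         # placeholder
$newDrop = 'D:\Drops\BarService-R0\2014-06-13'     # placeholder: this build's drop folder
if (Test-Path $latest) {
    cmd /c rmdir "$latest"       # removes only the junction itself, not its target
}
cmd /c mklink /J "$latest" "$newDrop"
Because the flip is nearly instantaneous, there is no long window where the Latest folder is half-deleted and half-copied for other teams to stumble into.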

TFS MSBuild Copy Files from Network Location Into Build Directory

We are using TFS to build our solutions. We have some help files that we don't include in our projects as we don't want to grant our document writer access to the source. These files are placed in a folder on our network.
When the build kicks off we want the process to grab the files from the network location and place them into a help folder that is part of source.
I have found an activity in the xaml for the build process called CopyDirectory. I think this may work but I'm not sure what values to place into the Destination and Source properties. After each successful build the build is copied out to a network location. We want to copy the files from one network location into the new build directory.
I may be approaching this the wrong way, but any help would be much appreciated.
Thanks.
First, you might want to consider your documentation author placing his documents in TFS. You can give him access to a separate folder or project without granting access to your source code. The advantages of this are:
Everything is in source control. Files dropped in a network folder are easily misplaced or corrupted, and you have no history of changes to them. The ideal for any project is that everything related to the project is captured in source control so you can lift out a complete historical version whenever one is needed.
You can map the documentation to a different local folder on your build server such that simply executing the "get" of the source code automatically copies the documentation exactly where it's needed.
The disadvantage is that you may need an extra CAL for him to be able to do this.
Another (more laborious) approach is to let him save to the network location, and have a developer check the new files into TFS periodically. If the docs aren't updated often this may be an acceptable compromise.
However, if you wish to copy the docs from the network during your build, you can use one of the MSBuild Copy commands (as you are already aware), or you can use Exec. The copy commands are more complicated to use because they are often populated with filename lists generated from the outputs of other build targets, and are usually used with solution-relative pathnames. But if you're happy with DOS commands (xcopy/robocopy), then you may find it much easier just to use Exec to run an xcopy/robocopy command. You can then "develop" and test the xcopy command outside the MSBuild environment and then just paste it into the MSBuild script with confidence that it will work - much easier than trialling copy settings as part of your full build process.
Exec is documented here. The example shows pretty well how to do what you want, but in your case you can probably just replace the Command attribute with the entire xcopy/robocopy command (or even the name of a batch file) you want to use, so you won't need to set up the ItemGroup etc.
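For instance, here is the kind of command you might develop and test on its own at a PowerShell prompt before pasting it into the Exec task's Command attribute (the share and destination paths are placeholders):
# Copy the help files from the network share into the build's Help folder
robocopy '\\fileserver\TeamDocs\Help' 'C:\Builds\1\MyProduct\Sources\Help' /E /R:2 /W:5

# robocopy exit codes 0-7 indicate success; 8 and above indicate failure
if ($LASTEXITCODE -ge 8) { throw "robocopy failed with exit code $LASTEXITCODE" }
One caveat: robocopy returns 1 when it copies files, and Exec treats any non-zero exit code as a failure by default, so if you call robocopy directly from Exec you may want to set IgnoreExitCode, or wrap the call in a script that maps 0-7 to success as sketched above.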
