Move workdir to another drive and set max size - tfs

Documentation is extremely scarce on this (https://learn.microsoft.com/en-us/vsts/build-release/actions/agents/v2-windows?view=vsts#download-configure)
How to install VSTS agent on C, but keep workdir on other drive?
How to force VSTS agent to reclaim space or limit agent to X gb?

It's possible to change the working directory to another drive.
For the new build agent, when you run the .\config.cmd command, there is a step:
Enter work folder (press enter for _work):
You can change the path here.
The default work folder location is a _work folder directly under
the agent installation directory. You can change the location to be
outside of the agent installation directory, for example: D:\_work.
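For unattended setups, the same choice can be made on the command line with the --work flag. A configuration sketch only; the URL, PAT, pool, and agent name below are placeholders, and the exact flags vary by agent version (check .\config.cmd --help):

```bat
REM Sketch only: all values below are placeholders for your own setup.
.\config.cmd --unattended ^
  --url https://your-account.visualstudio.com ^
  --auth pat --token YOUR_PAT ^
  --pool default --agent myagent ^
  --work D:\agent\_work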
For an existing build agent, you can remove and re-configure the agent. Deleting the old working folder will not break the agent.
You can refer to this blog as a tutorial (it targets TFS, but the process is similar for VSTS).
How to force VSTS agent to reclaim space or limit agent to X gb?
You cannot do this for a local agent; there is no such setting in the agent capabilities.
If you need this, hosted agents offer 10 GB of space.
If you are worried about disk space:
Setting the Build.Clean variable to all clears the directories before pulling sources.
Use the "Delete Files" utility task. The $(Agent.BuildDirectory) and $(Build.SourcesDirectory) variables are useful for this. It deletes files after a successful build.
For more info, see: Clear the work folder after each build?
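As a sketch, the two suggestions can be combined in a definition like this (YAML shown for illustration; DeleteFiles@1 and the Build.Clean variable come from the VSTS docs, but the folder choice and step names are assumptions):

```yaml
# Illustrative only: clean before sources are pulled, and delete
# build output after the build steps.
variables:
  Build.Clean: all              # clears directories before pulling sources
steps:
- script: echo building...
- task: DeleteFiles@1           # runs after the build succeeds
  inputs:
    SourceFolder: '$(Agent.BuildDirectory)'
    Contents: '**'
```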

A simple way to change it: go to the agent folder and show all hidden files.
You'll find a JSON file named ".agent". Open it and set the path to your desired directory in "workFolder".
Don't forget to escape the backslashes (\\); also, the agent must be stopped before doing this.
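For illustration, the relevant part of an edited .agent file might look like this (other fields are omitted and the names here are placeholders; note the doubled backslashes in the Windows path):

```json
{
  "agentName": "myagent",
  "workFolder": "D:\\agent\\_work"
}
```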

When you install the agent it will ask you for the working directory. The default is _work under the agent install folder, but you can change this at setup time.
You cannot limit the agent to use X space, but you could add a build/release task that checks available space using PowerShell and fails the process if it is under Y.
Updated
This task could be your first build/release task, so you would not go far in the process if there was not enough space.
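The answer proposes a PowerShell task on a Windows agent; as a portable illustration of the same idea, here is a minimal shell sketch (the path and threshold are placeholders; a real task would pass the agent's work folder and your actual limit):

```shell
# Sketch only: fail the build early when free space is below a threshold.
check_free_space() {
    path="$1"; min_gb="$2"
    # df -Pk: POSIX output, available space in KB in column 4
    avail_kb=$(df -Pk "$path" | awk 'NR==2 {print $4}')
    avail_gb=$((avail_kb / 1024 / 1024))
    if [ "$avail_gb" -lt "$min_gb" ]; then
        echo "Only ${avail_gb} GB free on ${path}; need ${min_gb} GB" >&2
        return 1
    fi
    echo "OK: ${avail_gb} GB free on ${path}"
}

check_free_space . 0
```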

One can also use junction links, e.g. by running this in CMD (replace the folders with the real locations for your setup):
mklink /J "C:\agent\_work" "D:\agent\_work"
Now the _work folder will actually be on the D: drive, but the agent will think it is a real folder on C:.
I tested it myself; it works fine.

Related

Where will Build Go

I am thinking of using TeamCity for a TFS project. I created TeamCity configurations using the TFS server. It has started and is creating builds as usual. I would like the builds to go to my local TFS checked-out directory before putting them live. Can anyone suggest where my current builds are going, and whether I can do what I have in mind?
There is a working directory, similar to the TFS build agent working directory.
Agent work directory is the directory on a build agent that is used as
a containing directory for the default checkout directories. By
default, this is the <Build agent home>/work directory.
To modify the default directory location, see workDir parameter in
Build Agent Configuration.
Source: Agent Work Directory
To change this checkout location, you just need to change workDir=../work in the buildAgent.properties file (which is stored in /conf/).
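For illustration, the relevant line in conf/buildAgent.properties; the drive and folder below are placeholders, and note that backslashes and colons in Windows paths must be escaped in .properties files:

```properties
# conf/buildAgent.properties -- the default is relative to the agent home:
#   workDir=../work
# To move it, point it at another location, e.g. (escaped for .properties):
workDir=D\:\\agent\\work
```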

Is it possible to import file from local machine to Jenkins server during start of the build

Currently I have my whole automation source code (scripts and test data) on the Jenkins server, and whenever I want to change my test data I need to go to the Jenkins server machine and change it there.
The problem is that when I want to change the test data, I have to wait a long time to get access from the admin team. I also have a huge amount of test data in my project, so I am not interested in creating a Jenkins project with parameterized builds. If there is any option in Jenkins to import files (Excel) before the build, that would be helpful.
The most common way to transfer files to Jenkins server is to use a version control system like git or subversion:
Commit files to version control system
Configure the Jenkins job to detect changes in the version control system and check out the working directory for the build or test
If your files are so big they cannot fit into a version control system (some do not perform well with files in the gigabyte range), you could use a shared disk drive that you have permission to write to.
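As a self-contained sketch of the commit step above (the repository, file names, and commit message are all illustrative; a throwaway local repo stands in for your real one):

```shell
# Illustrative only: put the test data under version control so the
# Jenkins job can detect the change and check it out.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email "you@example.com"
git config user.name "Your Name"
mkdir -p testdata
echo "case1,expected1" > testdata/cases.csv   # stands in for the Excel file
git add testdata/cases.csv
git commit -q -m "Update test data"
git log --oneline     # Jenkins would now detect this commit and build
```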

Is there a way to prevent a project from having access to another project's files in Jenkins CI?

I am trying to setup a Jenkins/Hudson CI in a distributed environment. I am curious about the following questions:
1) Does the slave account need to be a root/administrator account? If it can run with lower privileges, what is the minimum access needed?
2) On a slave node, do one project's jobs have access to another project's files previously built on the same node? How would you prevent this?
3) How do you prevent someone from formatting your disk with a bat file running in a pre- or post-build script?
1) The slave account does not have to be root or administrator. It only needs full access to the folder you give in the "Remote FS root" field of the slave configuration.
2) Yes it does. Each project folder is owned by the user that is used to run the slave. You can access other project folders using relative paths: $WORKSPACE/../OTHER_PROJECT/. I'm not sure if there is a default way to prevent this. However, you have two options:
Delete the workspace after your build (use plugin Workspace Cleanup Plugin)
Create a separate slave/user combination for each project - the slave can be the same, but you'd have to create a separate user for each project.
3) Formatting a disk completely would require privileged access. You should not give your slave user those rights. I'm not sure whether your slave is Unix or Windows based, but either way, you should be able to prevent your user from being allowed to do any such task. Like stated in A1, the slave user only needs enough access to be able to read/write/execute in its "Remote FS root" folder.
Just out of curiosity - what OS are you running on your slave?
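The sibling-workspace escape described in A2, and the cleanup mitigation, can be sketched like this (directory names are illustrative; mktemp stands in for the "Remote FS root"):

```shell
# Illustrative only: why one job can read a sibling workspace, and
# what deleting the workspace afterwards buys you.
set -e
fsroot=$(mktemp -d)                          # stands in for "Remote FS root"
mkdir -p "$fsroot/PROJECT_A" "$fsroot/OTHER_PROJECT"
echo "db-password" > "$fsroot/OTHER_PROJECT/settings.txt"

WORKSPACE="$fsroot/PROJECT_A"
# Both workspaces are owned by the same slave user, so a relative
# path is enough to reach the neighbour:
leaked=$(cat "$WORKSPACE/../OTHER_PROJECT/settings.txt")
echo "read from sibling workspace: $leaked"

# Mitigation: wipe a project's workspace after its build (what the
# Workspace Cleanup Plugin automates), so there is nothing to read.
rm -rf "$fsroot/OTHER_PROJECT"
```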

how to make jenkins use an already checked out codebase?

I am just starting with Jenkins 1.487 and wanted to integrate Jenkins into my Ant project. But while configuring it, I can't find any way to make Jenkins reuse an already checked-out codebase instead of downloading a fresh copy relative to its workspace root. Is there a way to do that?
I tried specifying a custom workspace manually (where my codebase was already checked out) and clicked 'Build Now'. The result was that it wiped out my currently checked-out code, saying
"Checking out a fresh workspace because there's no workspace at /home/daud/Work
Cleaning local Directory ."
Not even a warning.
If you really want to build from an existing checkout somewhere on the file system, then do not use "Source Code Management" section of Jenkins. Leave it as "none"
Go straight to the "Build" section
Click "Add Build Step"
Select "Invoke Ant"
Click "Advanced"
And under "Build File", provide a full path to the ant build file on your file system. You would have to include the drive letter (if on Windows) or a leading / (if on Linux) to break from the Workspace (by default, this path is relative to Workspace). Or use a lot of ../../../.. if needed.
But like others have said, this is not the way a CI system is supposed to be used
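For illustration, those clicks end up as roughly the following in the job's config.xml; the build file path is the asker's, the target name is a placeholder, and the exact elements depend on the Ant plugin version:

```xml
<!-- Sketch only; structure may vary by Ant plugin version -->
<builders>
  <hudson.tasks.Ant>
    <targets>build</targets>
    <buildFile>/home/daud/Work/build.xml</buildFile>
  </hudson.tasks.Ant>
</builders>
```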
The idea behind Jenkins and CI is that it works on a fresh copy of the codebase. Every build done by Jenkins should not depend on any external preconditions and it should be reproducible.
You might want to try using the Clone Workspace SCM Plugin for Jenkins. It will allow you to zip up the workspace from one job and use it to create the workspace for another one. I've used this for downstream jobs that need to act on the work from a previous job.
This is also helpful if you're using something like Git for source control and want to avoid a second Git clone (or SVN checkout). Furthermore, you can limit the content of the zip file that is used to recreate the workspace, for example to avoid carrying unnecessary files (e.g. the .git or .svn directories) downstream.

How to keep the subfolder in DropLocation constant in TFS build

I have a build definition set up with a drop location. The binaries are moved into this location, but under a new directory (named after the build number) every time. Is there a way to have the same location overwritten every time? We have some batch files that copy the binaries out to multiple servers that will be accessed by the end users. We need the location to remain constant so that the batch files work correctly.
If this is not possible, is there a way for the batch files to pick the latest location that contains our exe (sometimes the folder is created even when the build failed)?
Having a unique name for the drop location is something you cannot (and don't want to) change. To solve your issue, you can either:
1) start the batch files with arguments (so the directory is %1), specifying the name of the directory
2) Add a task in the build to copy all the files to a file share. If you are using TFS 2008, you can follow the steps provided at http://blogs.msdn.com/b/msbuild/archive/2005/11/07/490068.aspx to copy the files.
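For the fallback asked about in the question (picking the latest drop that actually contains the exe), a sketch; folder and file names are illustrative, and on a Windows build box the same idea translates to a PowerShell Get-ChildItem | Sort-Object LastWriteTime loop:

```shell
# Sketch only: walk drop folders newest-first and return the first one
# that actually contains the binary (skipping failed builds that left
# an empty folder behind).
latest_drop_with() {
    droproot="$1"; exe="$2"
    for d in $(ls -1t "$droproot"); do      # newest first by mtime
        if [ -f "$droproot/$d/$exe" ]; then
            echo "$droproot/$d"
            return 0
        fi
    done
    return 1
}

# Demo with throwaway folders: the newest build failed and has no exe.
drops=$(mktemp -d)
mkdir "$drops/Build_100"
echo bin > "$drops/Build_100/app.exe"
sleep 1
mkdir "$drops/Build_101"                    # failed build: folder, no exe
found=$(latest_drop_with "$drops" app.exe)
echo "deploy from: $found"
```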
If you are using TFS 2005/2008, then look at TFS Deployer. It flat out rocks when doing deployments.
TFS 2010 has a new build deployment model that is pretty good.
