I'm trying to reduce build times, and right now Source Indexing and Symbol Publishing with TFS 2015 takes about an hour. Maybe indexing sources and publishing symbols is just heavy on disk I/O and bottlenecked there; I'm not sure. I want sources to continue to be indexed and symbols to continue to be published for this particular build, as it makes debugging exponentially easier.
Are there any ways to make source indexing and symbol publishing with TFS 2015 faster?
It's hard to reduce the time of the "Source Indexing/Symbol Publishing" task itself.
However, there are other ways to reduce the build time, such as setting Clean workspace to None. Changing the workspace setting from recreating a fresh workspace every time to an incremental one means the agent downloads only the changed source files to the build workspace.
During the build process, the build agent compiles and does other work with your source files. Before the build agent can do this work, it downloads the files from folders on your version control server into a local working directory. To facilitate downloading these files, the build agent creates a version control workspace, which maps the folders on the server to the local folders in the agent's working directory. If Clean workspace is set, the agent deletes the old files and downloads all the sources on every triggered build, so setting Clean workspace to None can reduce the build time.
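To make the difference concrete, here is a minimal sketch of what the agent effectively does with tf.exe under each setting, assuming a Windows agent with tf.exe on the PATH and an existing workspace mapping; the folder path is a placeholder, not taken from the question:

rem Clean workspace = None: an incremental get downloads only the files that changed since the last build
cd /d C:\Builds\1\MyDefinition\src
tf get /recursive

rem Clean workspace = All (roughly): delete the old files first, then download everything again
rem rmdir /s /q C:\Builds\1\MyDefinition\src
rem tf get /recursive /force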
It's also related to the hardware of your server: improving the server's performance will also reduce your build times.
Related
Initially the code was built and deployed. After some updates, a developer changed the code and checked in the files. Now I want to build only those files or folders and get only those files as an artifact in the drop folder, and then deploy only those files, in TFS 2017, so that it automatically builds only the checked-in files.
What you would like is an incremental build. When you add a Visual Studio Build / MSBuild task to build the project, just uncheck the Clean option. It will then sync the source and get only the changed files from the second build onward. See the build task arguments for details.
Clean option: Set to False if you want to make this an incremental build. This setting might reduce your build time, especially if your codebase is large. This option has no practical effect unless you also set Clean repository to False.
Set to True if you want to rebuild all the code in the code projects. This is equivalent to the MSBuild /target:clean argument.
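To make the two settings concrete, here is a hedged command-line equivalent; the solution name is a placeholder and the exact arguments the build task passes may differ:

rem Clean = False: an incremental build, MSBuild recompiles only projects whose inputs changed
msbuild MySolution.sln /p:Configuration=Release /m

rem Clean = True: rebuild everything, equivalent to cleaning first and then building
msbuild MySolution.sln /t:Rebuild /p:Configuration=Release /m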
However, for artifacts, TFS always delivers all files, changed and unchanged.
It does copy all of the project output (subsequent projects that depend on it may depend on these assemblies and files being there). This causes incremental builds to be much faster, but it doesn't "only deliver the changed files". It always delivers all files, whether they are changed or unchanged.
You could take a look at jessehouwing's reply in this question about this part: Incremental Builds issue in Team Foundation Server
Currently my team is using local build agents for our on-premises TFS 2015 installation. We've installed these build agents on our own development machines. However, we are having trouble limiting the disk space required for the continuous integration builds.
Our disk space is limited. Consequently, we have to remove old builds (including sources and artifacts) manually to free up disk space.
Is there some way to automate this? Preferably by telling TFS to automatically remove older builds.
You can specify build retention policies, which will automatically delete old completed builds to minimize clutter. You modify these policies on the Retention tab of your build definition. Retention policies will delete the items below:
The build record
Logs
Published artifacts
Automated test results
Published symbols
Currently, server drops are deleted when a build is deleted, but drops to UNC shares are not. This has been fixed in Team Services and is in Team Foundation Server "15", which is currently in prerelease. Check: https://connect.microsoft.com/VisualStudio/feedback/details/1513256/build-preview-drop-folder-not-deleted-when-build-is-deleted
The working folder on your machine won't be deleted. In order to delete UNC drops and the working folder, you can add a Delete Files task to your build definition to delete the working folder and the drop folder.
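If you would rather script the cleanup yourself, a rough sketch using a command-line build step could look like this; the UNC share, definition folder, and 30-day threshold are assumptions, not values from the question:

rem Remove drop folders on the UNC share that are older than 30 days
forfiles /p "\\fileserver\drops\MyDefinition" /d -30 /c "cmd /c if @isdir==TRUE rmdir /s /q @path"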
I'm using TFS 2013 on premises. I have four build agents configured on a Build machine. Several build definitions compile ASP .NET websites. I configured the msbuild parameters to deploy the IIS application to the integration server, which sits out there in Rackspace.
By default webdeploy does differential deployments by comparing file dates. In my case that's a big plus because copying files from our network to Rackspace takes quite some time. Now, in order to preserve file dates the build agent has to compile the same base set of source code. On every build only the differential source code yields a new DLL, minimizing the number of files deployed.
All of that works fine, with a caveat: a given build definition has to be assigned to a specific build agent (by agent name or tag). The problem is that this creates a lot of contention when all the builds assigned to the same agent are queued up; they wait in line until the previous build is done.
In an ideal world any agent should be able to take care of any build, but the source code being compiled has to be the same, regardless of the agent.
I tried changing the working folder of all agents to point to the same location but I get an error because two agents can't be mapped to the same folder. I guess there is one workspace per agent.
Any ideas?
Finally I found a way to do this. Here are all the changes you need to do:
By default the working folder of each agent is $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath). That means there's one working folder per BuildAgentId. I changed it so that all Agents share the same folder: $(SystemDrive)\Builds\WorkingFolder\$(BuildDefinitionPath)
By default, at runtime the workflow creates a workspace named like "[BuildDefinitionId][AgentId][MachineName]". Because all agents share the same working folder, there's an error when each separate workspace is created. The solution is in the build definition: edit the XAML and look for an activity called "Get sources from Team Foundation Version Control". There's a property called WorkspaceName. Since I want one workspace per build definition, I set that property to BuildDetail.BuildDefinition.Name. (If stale per-agent workspaces get in the way, see the tf.exe sketch after these steps.)
Save your customized build template and create a build that uses it.
Make sure the option "1. TF VersionControl/1. Clean workspace" is set to False. Otherwise the build will wipe out all the source code on every build.
Make sure the option "2. Build/3. Clean build" is set to false. Otherwise the build will wipeout the output binaries on every build.
With this setup you can queue the same build on any agent, and all of them will point to the same source code and bin output. When the source code changes, only the affected binaries are recompiled. I have a custom step in the template that deploys the output files to IIS, to all the servers in our web farm, using msdeploy.exe. Now my builds and deployments take one or two minutes, because only the DLLs or content that changed during the build are synchronized to the servers.
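For illustration, such an msdeploy step might look roughly like the following; the local path, site name, and server name are assumptions, not details from the answer:

rem Sync the published site to one web farm server; by default msdeploy skips files whose timestamps and sizes match
msdeploy.exe -verb:sync -source:contentPath="C:\Builds\WorkingFolder\MyDefinition\Binaries\_PublishedWebsites\MySite" -dest:contentPath="MySite",computerName="web01"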
You can't run two build agents in the same folder. The point of build agents is to run multiple builds in parallel, usually on separate PCs. If you try to run them on the same source code, then (a) it's pointless as two build of exactly the same source should produce identical results, and (b) they are almost certainly going to trip over each other and cause the builds to fail or produce unexpected results.
If you want to be able to build and then deploy a series of versions of your codebase, then there are two options:
If you queue up multiple builds, the last one will "win", so the intermediate builds are of no real value. So if you check in new code before your first build completes, you may as well stop the active build and start a new one. You should be asking yourself why the build is so slow, or why you are checking in changes so often that this is necessary.
If each build produces an incremental update to the deployed result, then you need to pass the output of your builds to some deployment agent that is able to diff it against the deployed version and send only the changes to be deployed. This could be set up to gather results from multiple build agents if that would be beneficial.
However, I wonder if your build is slow because you are doing a complete build each time (which cleans the build folder, gets all the sources, and does a full rebuild), when what you want is an incremental build (which gets the latest changes, compiles only what is affected, and completes quickly). Perhaps you should investigate making your build incremental.
Earlier I kept all the project contents in my local workspace. Now I have moved them to a remote server. I created an empty folder (remote_ws) on my local PC and mount it onto the remote workspace (ws). Before I open Eclipse, I run the following command in a terminal, which mounts the remote workspace folder (ws) onto the local workspace folder (remote_ws). After running that command, all the remote contents are available on the local PC.
The command is:
sshfs -o nonempty hera@192.168.1.83:/projects/project_hera/ws /external/remote_ws/
I am using an Ant build to build this project. My problem is that it used to take around 2 minutes to build, but now it takes a very long time (around 15 minutes) to build the same project. Copying content to the mounted local folder (remote_ws) takes a very long time.
How can I speed up this build process? Please help.
A build (compilation, creation of JARs, WARs, and so on) involves a lot of I/O. You cannot make it fast without improving overall I/O performance: a faster disk, a faster network, and so on.
The build will be faster on the machine where the source code exists. I would advise using a CI server (say, Jenkins) and doing the build on the remote server. It can check out, build, test, and deploy without any manual intervention.
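If the sources have to stay on the remote server, one option worth trying is to run the Ant build there over SSH instead of building through the sshfs mount, so the heavy I/O stays local to that machine. The host and path below are taken from the question; whether the default Ant target builds your project is an assumption:

# Run the build on the machine that holds the sources, avoiding sshfs round trips for every file
ssh hera@192.168.1.83 'cd /projects/project_hera/ws && ant'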
My build agent working directories are starting to take too much space on the disk. I wonder if it is okay for me to delete some old folders, or if I should back them up.
What is the impact of deleting a TFS build agent working directory?
Are the labels affected?
Is the build history affected?
You can do it if your builds are not incremental (using an incremental get). If your builds always get all source files for every build, it is OK to delete the working directories. Build history and labels are not affected. Your build logs are in the drop location, so that shouldn't be a problem either.
It is a problem for incremental builds (at least those using an incremental get). These builds get only the latest changes from TFS source control before every build, not the whole workspace as defined in the build definition.
Check your build definition's Process → CleanWorkspace settings. If it is set to All, it should not be a problem to delete the build directory.
Assuming I understand your question correctly, make sure you delete the builds through the TFS interface. Don't just delete the folders off the disk if you can help it. The TFS 2010 Build Deletion dialog gives you some options about what to keep and what to delete.
http://blogs.msdn.com/b/jpricket/archive/2009/12/09/tfs-2010-how-about-those-build-delete-options.aspx
You can also set up build retention policies so that old/unused builds automatically get deleted.