I am working with an older version of TFS and am trying to save time by only running npm install if the packages have changed, and only running webpack if the JavaScript files have changed. There are a few projects in the repository that are built every time, but if we can skip the unchanged projects it will save a lot of time. After a bit of research I think the answer is an incremental build.
The build's initial Get Sources step has Clean set to false.
I don't see any other place to toggle a clean build besides the initial Get Sources step, or any option on the npm build steps to check for changes before doing an install or webpack.
I found this documentation note in the Clean build options:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/vsbuild-v1?view=azure-pipelines-2018
This option has no practical effect unless you also set the Clean repository to false.
But I don't understand where the Clean repository setting is, or whether the Get Sources Clean is the Clean repository setting and I need to find a different Clean option.
If it isn't obvious, I'm pretty new to TFS builds, so breaking down the answer will be helpful.
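To make the question concrete, this is roughly the kind of check I was hoping to wrap around the npm steps; the variable, paths and script names here are placeholders I made up, not anything TFS provides out of the box:

```bash
#!/usr/bin/env bash
# Hypothetical pre-step: only run npm install / webpack when the relevant files changed.
# Assumes the agent keeps the previous sources (Get Sources Clean = false) and git is on PATH.
set -euo pipefail

# Placeholder: the commit that was built last time (HEAD~1 is just a fallback).
PREVIOUS_COMMIT="${PREVIOUS_BUILD_COMMIT:-HEAD~1}"

if ! git diff --quiet "$PREVIOUS_COMMIT" HEAD -- package.json package-lock.json; then
  echo "package files changed - running npm install"
  npm install
else
  echo "package files unchanged - skipping npm install"
fi

if ! git diff --quiet "$PREVIOUS_COMMIT" HEAD -- src/; then
  echo "JavaScript sources changed - running webpack"
  npx webpack --mode production
else
  echo "sources unchanged - skipping webpack"
fi
```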
Related
There are a few things done in monorepos/monobuilds (you can do a monorepo with no monobuild) that make things very nice, but I don't see how yarn workspaces solves them just yet. One of the main ones: I do not see how yarn workspaces can do this part of a monobuild process (very typical at scale):
git status to figure out which files changed
map those files to projects that have changed
build those projects, plus the projects that depend on them, and so on transitively
I am a little confused there. As a monobuild scales up, we really want the build time for a server change to stay under 3 minutes, while a change to a library that may affect all projects would take a long time as it rebuilds the entire repo (unless we split the work across different machines, in which case the build time goes way down again).
I don't think there is necessarily one answer here, but there are a number of things to consider in the context of your project:
If your project is really humongously large, consider something like Bazel, which is a bit complex but allows for incremental building and testing.
There are some specific tools to help with building large projects quickly. For instance, for JavaScript, there are Turborepo and Nx.
Yarn workspaces or npm workspaces can generally help enable better monorepo build processes by allowing us to run build scripts only for a subset of workspaces. They won't solve the problem of figuring out what to build and when, though; they just provide the basic building block of running scripts selectively.
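As a small illustration of that building block, both tools can run a script for a single workspace; the workspace names and paths below are made up:

```bash
# Yarn (v1) workspaces: run the build script of one workspace only
yarn workspace @myorg/server run build

# npm 7+ workspaces: the equivalent selective run
npm run build --workspace=packages/server
```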
Finally, a bit of Bash/Git/Makefile magic will probably be required. For instance, the git command git diff --quiet HEAD~1 HEAD -- [paths] can help us determine whether files in particular paths have changed since the last commit. Note though that this can create a few annoying edge cases, especially if builds fail and we risk missing out on building projects that we should build.
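As a rough sketch of how that command can be wired up, assuming a packages/ layout and yarn workspaces (the paths and workspace names are placeholders, and dependents of a changed package are not handled here):

```bash
#!/usr/bin/env bash
# Build only the workspaces whose files changed since the last commit.
# Caveat from above: diffing against HEAD~1 assumes every commit gets built;
# if a build fails or is skipped, projects that should be rebuilt can be missed.
set -euo pipefail

build_if_changed() {
  local path="$1" workspace="$2"
  if git diff --quiet HEAD~1 HEAD -- "$path"; then
    echo "no changes under $path - skipping $workspace"
  else
    echo "changes under $path - building $workspace"
    yarn workspace "$workspace" run build
  fi
}

# Hypothetical path-to-workspace mapping:
build_if_changed packages/server @myorg/server
build_if_changed packages/web    @myorg/web
```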
There are plugins for some CI/CD platforms that wrap the Git commands in a somewhat easier-to-use way. For instance, I have used the GitHub action has-changed-path, and I think there was a plugin for Buildkite too, but I cannot find the link to it.
Generally I think it will be challenging to have a monorepo setup that avoids installing dependencies for all modules/workspaces and compiling all code. But I think it is possible to scale up to a few hundred thousand lines of code and hundreds of dependencies and keep install and compile times under 2-3 minutes using TypeScript and Yarn, when making good use of TypeScript project references and something like Yarn Zero-Installs.
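For example, once project references are set up, a single tsc --build invocation only recompiles the projects whose inputs changed; the package path below is hypothetical and assumes typescript is a dependency of the workspace root:

```bash
# Each package has its own tsconfig.json with "composite": true and
# "references" pointing at the packages it depends on.
# Building one package also rebuilds its out-of-date references, and nothing else:
yarn tsc --build packages/server --verbose
```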
We're experiencing problems with a build that had been working fine up until last Thursday. Now we get this error whenever it runs:
Not found SourceFolder:
C:\agent_work\34\s\PharmacyWarehouse\PharmacyWarehouse\bin\release\app.publish
This is in our test build environment. It fails on the Copy Files task. Both the source and destination folders are on the build server, so I would expect Copy Files to work. Here's our environment for test:
According to your description and error message, this may be related to the cache for the build agent/server.
When you add a Visual Studio Build / MSBuild task to build the project, make sure you have checked the Clean option or set Clean=True. This will delete all previously fetched sources and the build output generated by prior builds.
Clean option: Set to False if you want to make this an incremental build. This setting might reduce your build time, especially if your codebase is large. This option has no practical effect unless you also set Clean repository to False.
Set to True if you want to rebuild all the code in the code projects. This is equivalent to the MSBuild /target:clean argument.
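For reference, if you drive MSBuild from the command line yourself, the Clean option corresponds roughly to the following; the solution name and configuration are placeholders:

```bash
# Clean only (what the Clean option maps to):
msbuild YourSolution.sln /t:Clean /p:Configuration=Release
# Clean and build in one go:
msbuild YourSolution.sln /t:Rebuild /p:Configuration=Release
```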
Also check whether there is enough drive space on your build agent/server.
Besides, you could also reinstall the build agent or upgrade your build agent version. This will force the working folder to be deleted and recreated, which may do the trick.
I am new to the world of scripting with TFS2015. I created a script that builds all of the projects within my solution (it is a rather large solution) and puts the output in a shared folder (where each project has its own subfolder).
I would like to create a separate script for each project that simply copies the bin folder from the share onto my Test environment. I rarely need to deploy everything, so the idea is one build...multiple deploys.
However, when I run my deploy script using the Copy Files step, it does another build. It copies the files that I expect, but only after a full build that creates the folder structure for the build.
Am I able to make the Copy Files step NOT do a Build?
Here are the steps that my script is currently doing:
As you can see, there is only one step (Copy Files), but it still does the Get sources and copies everything into a new folder on the build box (where the number keeps incrementing with each run of the script):
I just want to copy the files from the Source to the Target and not do a build or Get Sources.
It looks like you're still on TFS 2015 RTM or Update 1, which is already pretty old technology if you compare it to the lifetime of the new build system that was introduced with this version.
TFS 2015 Update 2 introduced a system similar to the Build pipelines to orchestrate Releases. This doesn't require you to map any workspaces or git repositories and can act on the artefacts of your builds or simply on the contents of file shares.
It makes sense that a Build has to build something and in order to build something, it has to get the things to build. If you're actually not building something, then you're probably deploying or releasing or packaging something else. Hence the distinction between Build and Release pipelines.
TFS 2017+ has an option to disable the syncing of sources, primarily to allow people to get the sources themselves in creative ways (e.g. a custom PowerShell script that invokes git.exe).
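For illustration, such a custom sync step can be as small as the sketch below (shown as a shell command, though on a Windows agent you would more likely write it in PowerShell); the repository URL and branch are placeholders:

```bash
# Minimal manual sync: shallow-clone only the branch being built.
git clone --depth 1 --branch master https://tfs.example.local/DefaultCollection/Project/_git/Repo sources
```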
My primary advice would be to upgrade to TFS 2018 Update 3, or at least TFS 2017 Update 3.1, worst case TFS 2015 Update 4.1. The fact that versions older than 2015 Update 4.1 have a known XSS security bug may be reason enough to convince your organisation to perform this update.
Barring that option you're left with one solution:
Link your build definition either to a git repository with only a single commit (if I remember correctly, the 2015 agent still crashes when syncing an empty Git repo), or link it to a TFVC repository and set the workspace settings to cloak everything. This essentially causes the build to sync an empty folder, which it can cache, before calling your PowerShell script.
I have our Solution on CI build. That works.
When devs check in changes, the solution builds, but only for changes to that solution.
How do I get the build definition to build on changes to OTHER folder changes outside of the solution?
Yes, I can add a workspace in the workspace sources tab. But that means all that code is downloaded on every build.
Our solution has over a dozen dependencies. I would like to trigger a build if any of those dependencies change. We don't need the dependency source code to download into the build workspace at all. That's just pointless.
Yes, we have a folder in TFS for our NuGet packages. We check them in automatically on build (thank you, TFS).
I could just add the NuGet packages folder to the solution's workspace list, BUT that would result in every version of every dependency getting downloaded into the build workspace.
How can I trigger a build on a change that I do NOT list in the workspaces list of the build definition?
Btw, we are using TFS 2012
I was hoping the Cloaking feature would allow for this, but if the folder is cloaked, the automated build does not trigger. The automated build only fires if the workspace folder is set to Active, which also means downloading every NuGet package in that same folder!
It's not possible to trigger a CI build on a change that is not listed in the workspace mappings of the build definition.
A few other things to know
Make sure the folders you include in your trigger are also included in your mappings on the Repository tab (these are the same as the workspace mappings).
Source Link
As a workaround you could set Clean workspace to false, so the unchanged files are not downloaded again every time.
If your build process does not require a clean workspace or repository, you can significantly reduce the time that is required to run the build by setting this parameter value to False.
I have made a checkout of an SVN repository in my directory. The project takes a lot of time to be completely checked out. So while creating my Hudson job I need these steps, in this order:
Clean up the directory (this resolves some ambiguous problems, such as "Hudson workspace locked while building")
Revert the changes
Update
The choices that I have for Check-out Strategy, in the Hudson job creation form, are:
Use svn update as much as possible, with 'svn revert' before update
Use 'svn switch' as much as possible
Use 'svn update' as much as possible
Clean checkout folders and then checkout
Emulate clean checkout by first deleting unversioned/ignored files, then 'svn update'
Clean workspace and then checkout (Eliminated)
What is the right option for my case?
Thank you a lot!
If your build is done correctly, you should be able to simply use 'svn update' as much as possible. This is the fastest way to update your files. It means not modifying committed files and not placing build artifacts in directories where they will interfere with the build process. In a Java shop, simply keeping all built objects in a subdirectory (we use target to match Maven, but others use build or dist) keeps them out of the way of the rest of the process.
Most people do a clean as part of their build step. This, in theory, should not be necessary, and doing so will lengthen build times. The idea of the build is not to do any unnecessary work: if a source file hasn't changed, the corresponding object file should not need to be rebuilt. However, Java is fast enough at compiling that most Java projects simply wipe the build directory clean. In C projects, not deleting old objects is better since it really reduces build time.
If there is a problem with your build process where 'Use svn update as much as possible' can't work, you should fix your build process. However, there are a couple of projects on our old Jenkins server that do have problems, and they simply aren't updated enough to worry about it. For those, I do a clean checkout of a fresh copy every time. This takes the longest, but if you're having problems with your build process, I wouldn't bother with 'Emulate clean checkout by first deleting unversioned/ignored files' or with running 'svn revert' before the update. Those can cause update conflicts and cause problems with your build. Either get the build working correctly, or do a clean checkout.
I would go with "Use svn update as much as possible, with 'svn revert' before update". If that is not sufficient, check out the EnvInject Plugin. It can run a script before the SCM checkout happens. You can use it to run an svn cleanup for your job before the Subversion plugin takes over with the revert and the update. Caveat: you need to install some kind of SVN command-line client on your build server.
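Such a pre-SCM script can be as simple as the sketch below; it assumes the svn command-line client is on the PATH and uses the job's WORKSPACE variable:

```bash
# Runs before the Subversion plugin touches the workspace:
# release any stale locks, then throw away local modifications.
svn cleanup "$WORKSPACE"
svn revert --recursive "$WORKSPACE"
```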