At my work we are just starting to use TFS with our team of 4 developers, and are at the same time transitioning from single-developer projects to team projects. We are mostly using the default settings in TFS.
I was the first to push up a simple Silverlight MVVM project consisting of a solution with a Silverlight and a web project.
When my teammate pulled down my code and tried to compile, he was faced with many missing references (.dlls): the Expression Blend SDK, RIA Services Toolkit, Telerik controls, Simple MVVM Toolkit, Silverlight Toolkit, etc.
What do we need to do to add projects to TFS so that everything needed to compile is included when the next developer pulls them down?
There isn't a really good way to do this all automatically. What you'd generally do is this:
In your branch, create a bin folder next to your src folder.
In the bin folder, create a folder for each component you rely on.
In each folder, place the setup (or a link to the setup).
In each folder, place the binary files you're using in your solution.
In each folder, place a readme with any manual steps that must be completed.
Optionally, create a PowerShell script or batch file which installs all the required components, as sketched below. It isn't too hard to detect whether or not an application is already installed using PowerShell and WMI.
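A minimal PowerShell sketch of that detection idea (the component name and installer path are placeholders; Win32_Product queries are slow, but simple):

    # Check whether a component is already installed before running its setup.
    # The display name and the installer path below are examples only.
    $name = "Expression Blend SDK"
    $installed = Get-WmiObject -Class Win32_Product |
        Where-Object { $_.Name -like "*$name*" }
    if (-not $installed) {
        Write-Host "$name not found - running installer..."
        Start-Process -FilePath ".\bin\BlendSDK\setup.exe" -Wait
    }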
Now you'll have to fix a few things in your solution:
Make sure your references don't point to the GAC, but to the assemblies inside the bin folder of your branch.
Make sure all the paths are relative to the solution. Any absolute c:\... paths will not carry over from one system to another.
I found that the easiest way to do this is to unload the project in Visual Studio and then edit it. You can then quickly add a HintPath such as ..\..\..\bin\component to each reference. There are a few blog posts on this subject which offer different solutions to this same issue.
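For illustration, a reference in the unloaded project file might end up looking roughly like this (the assembly name and the relative path are examples only):

    <Reference Include="Telerik.Windows.Controls">
      <!-- point at the copy checked into the branch's bin folder instead of the GAC -->
      <HintPath>..\..\..\bin\Telerik\Telerik.Windows.Controls.dll</HintPath>
      <Private>True</Private>
    </Reference>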
This setup allows you to at least get the latest version of any solution and build it without having to install any tools. If some of your components rely on Visual Studio add-ins, then the designers for those tools usually won't work, but at least you're able to build.
An often-used alternative is to create a Virtual Machine base image for your project and install all the required components onto it. Then copy the image to each developer's workstation and sysprep it to ensure they all have unique names and identifiers. When the project needs to update its dependencies, let one developer create a new clean machine and redistribute that to all team members.
If you're using Windows Server Virtualization or VMware, it's quite easy to create differencing disks and allow developers to access these images remotely.
Another approach would be to use NuGet and drive it from a PowerShell script for your solution, as sketched below. This will work for most cases, but products like Expression Blend still need to be installed separately.
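A rough sketch of such a bootstrap script, assuming a copy of NuGet.exe is checked in under tools and packages are restored into a packages folder (both paths are placeholders):

    # Restore every packages.config found under the solution folder.
    Get-ChildItem -Recurse -Filter packages.config |
        ForEach-Object {
            & .\tools\NuGet.exe install $_.FullName -OutputDirectory .\packages
        }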
Related
I have set up TeamCity partly. Now it downloads the code from TFS and tries to build it using MSBuild, which was not successful. I know that I am doing something wrong. I have some libraries added to my code (an ASP.NET website). I know that it is not a good idea to add dll files to version control (TFS), but if I don't check them in, then when TeamCity downloads the code it does not have those libraries, so MSBuild cannot build it successfully. I was wondering what would be the best practice to solve that issue?
For dependency management in .NET I would recommend that you take a look at TeamCity's built-in NuGet feeds. You can use a feed directly from within TeamCity, acting as a server. As you state, committing dependencies to (any) VCS should really be avoided...
It depends on what type of dlls you're dealing with.
If they are available on NuGet.org, use NuGet and the Package Manager Console to add the references to your solution. Then just put NuGet.exe on your Build Server, and run
NuGet.exe restore YourSolution.sln
as your first build step.
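For example, adding a reference this way from the Package Manager Console looks like the following (the package name is just an illustration):

    PM> Install-Package Newtonsoft.Json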
If they are in-house dlls, then you have a few options. The first, as TeNGiL mentioned, is setting up a private NuGet repository, publishing the in-house dlls to that feed, and pulling from it within your build server.
The other option is just to create a 'References' directory in source control, which holds dlls, reference them in your solution from the source controlled directory, and then pull them down as part of your Build Configuration. This really isn't as bad as it sounds, within reason, and is a perfectly acceptable interim solution to incorporate until everyone is on board with using a private NuGet feed, or something of that nature.
Open the code in the checkout directory of TeamCity in Visual Studio and try to build. I am pretty sure that Visual Studio will give you the exact error message of what's going wrong.
Missing packages have to be restored. Use a NuGet Installer build step to restore your packages.
Just installed the latest Umbraco (7.2.1) package via NuGet. My development environment is as follows:
Umbraco is installed on IIS8 and is all up and running.
My Visual Studio project is set up as follows (for the sake of clarity, any folder/file excluded from the project is not included in my source control):
The content folder houses all scripts, images & CSS.
On build, the bin, config, content, masterpages, usercontrols, Views and xslt folders, plus default.aspx, Global.asax & the transformed Web.config, are copied to the IIS instance, roughly as sketched below (I don't like running Umbraco in the same place as my project; it just seems messy).
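A rough sketch of the kind of copy step involved (the project and site paths are placeholders, and the robocopy flags are just one reasonable choice):

    # Copy only the deployable folders and files from the project to the IIS site.
    $projectDir = "C:\Dev\MyUmbracoSite"      # placeholder
    $siteDir    = "C:\inetpub\wwwroot\MySite" # placeholder
    foreach ($folder in "bin","config","content","masterpages","usercontrols","Views","xslt") {
        robocopy (Join-Path $projectDir $folder) (Join-Path $siteDir $folder) /MIR | Out-Null
    }
    foreach ($file in "default.aspx","Global.asax","Web.config") {
        Copy-Item (Join-Path $projectDir $file) $siteDir -Force
    }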
Is this an appropriate way of developing for Umbraco? Am I missing anything? My biggest concern is whether or not I should include the umbraco & umbraco_client folders in version control and in the post-build action. Any suggestions would be great.
There is some debate over what should and shouldn't be in your repository, and ultimately it comes down to personal preference. I used to add only custom files and files that I changed from the Umbraco install, such as the config files; however, since the introduction of the NuGet package I put all but the binaries into source control, because when I upgrade via NuGet later on I can easily see the changes and merge customisations back in.
Running Umbraco directly saves a lot of hassle (IMO), especially if you make any changes via the UI. If you're not running it directly, then there is little point in using the NuGet package, because you will end up with a bunch of unused files in your project. In your situation you might as well keep your project clean, do a manual install into the location IIS is using for the site, and keep only the files in your project that you have created.
This is only my opinion so take from it what you wish but hopefully it is of some help.
Simon
This is a problem we have been living with for a while already. Suppose that I have three files:
an FxCop ruleset, containing our basic Code Analysis rules
a ReSharper .DotSettings file, with company-defined naming conventions, for instance
a StyleCop settings file, with some of the default StyleCop rules disabled
How do I share these kinds of company wide settings files across multiple TFS Team Projects?
At the moment, we have these replicated in a Resources folder in each project, but this is quite a maintenance nightmare, since when we decide to update a few rules on any of these files, we have to update them in a lot of different places.
One approach I've seen is to create a team project specifically to store these files on TFS, for instance $/Core, and by some means share the files this way, either using workspace mappings or branching the project into the other projects.
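For reference, the workspace-mapping variant of that approach boils down to something like this (the server path, local path and workspace name are all placeholders):

    tf workfold /map "$/Core/Settings" "C:\src\MyTeamProject\Core\Settings" /workspace:MyWorkspace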
I don't like this primarily because it requires manual intervention and hardcoded paths. Ideally, I'd like an approach that is not intrusive, i.e. a developer gets the project from source control and compiles without any problems. No need for separate mappings into specific folders, setting environment variables, or anything like that.
At the same time, I'd like to keep the history of these files, so it would be nice if they could still be source controlled. Since they are central to the company, it would be ideal to have limited permissions on them and to have each change documented appropriately (changesets provide that).
Another approach that crossed my mind was to share these settings via company-internal NuGet packages. Say, for instance, that I do have this $/Core project, but instead of using workspace mappings or branching it inside the other projects, I publish a NuGet package (or even more than one) containing the configuration files, and add these packages to each project that needs them. I can then use relative paths to the package folder when referencing them, and it would require no manual intervention on the developers' part.
Although using NuGet would probably work fine, this solution seems weird to me, since NuGet packages are meant to be project specific, but this would be "whole team project" specific instead. Again, using this approach, I'd probably have to add the package to one of the projects inside one of the solutions inside each team project. This is actually quite similar to how test adapters are shared now; for instance, NUnit already supports this approach. I feel I'd have to do something very similar if I went with an internal NuGet package: each solution would have to load the package at least once.
Is there some other way to share these kinds of things across the whole company, while still maintaining them on source control? What if I loosened this constraint, and accepted that they don't need to be source controlled? Would it open up other options to share them?
If you go down the NuGet route, you could create packages for your FxCop and ReSharper files and then store them in a local feed.
For StyleCop, you could use the existing NuGet package for StyleCop.MSBuild and repackage it with your own settings file.
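A rough sketch of what publishing such a package to a simple folder feed could look like (the .nuspec name, version and feed path are placeholders; the .nuspec would list the ruleset, .DotSettings and StyleCop settings files):

    # Package the shared settings files and drop the result into a local folder feed.
    .\NuGet.exe pack .\CompanyRules.nuspec -Version 1.0.0
    Copy-Item .\CompanyRules.1.0.0.nupkg \\fileserver\NuGetFeed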
I'm a Fossil (and CVS configuration) novice attempting to create and manage a set of distributed Fossil repositories for a Delphi project.
I have the following directory tree on my development machine:
Projects
    Some Project
    Delphi Components
        LookupListView
    Some Client
        Some Project For Client
        Some Other Project For Client
            Source Code
            Project Resources
            Project Database
I am setting up Fossil version control in order to version and share Projects\Some Client\Some Other Project For Client\Source Code, which contains Delphi 2010 source for a database project.
This project makes use of Projects\Delphi Components\LookupListView which is a Delphi component. I need this code to be included in the versioning system for my project. I will, in theory, need to include it in other Fossil repositories in the future, as well.
If I create my Fossil repository at the Source Code or Some Other Project For Client level, I cannot add any code above that level to my repository. What is the proper way to deal with this? The two solutions that occur to me are
1) Creating a separate repository for LookupListView and making sure that everyone who uses a repository for a project that references it "knows" that they must also get the current version of this component as well. This seems to defeat the purpose of being able to obtain a complete, current version of the project with a single checkout. The problem is magnified because there are other common component dependencies in this project.
2) Establishing my Fossil repository in the Projects directory, so I can check in files from various subfolders. This seems to me to involve an awful lot of extra path-typing when doing adds, and also to impose my directory structure (Some Client\Some Other Project For Client\Source) on the other users of the repository -- in this case, the actual client.
Any suggestions appreciated.
I use Git, but my approach can be applied in your situation.
I have one repository for my whole components folder. This gives me the ability to get all of them with only a few console commands (in case I reinstall my OS or move to another computer, etc.).
Also, I have one repository per project. If a project uses 3rd-party controls, I create a "components" sub-folder and make symbolic links (junctions) to each component set, as sketched below.
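For example, on Windows such a junction could be created roughly like this (both paths are placeholders; on older systems, cmd's mklink /J does the same thing):

    # Create a junction inside the project's components folder that points at the
    # shared LookupListView working copy (paths are examples only).
    New-Item -ItemType Junction `
        -Path ".\components\LookupListView" `
        -Target "C:\Projects\Delphi Components\LookupListView"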
This approach has some disadvantages (when you "go back" in the commit history of a project, the components may have been modified in the meantime, and if many projects use the same components this could cause some trouble), but I have had no issues yet :)
We are using TFS 2008 for web app development. The WebApp is a large project, so we do not want every developer to see all the source code, which means lots of libraries or sub-apps need to be referenced. If I put all the libraries and sub-apps in one VS2008 project or VS2008 Team Project, all the source code will be exposed to each person.
Do I have to make the WebApp reference other Team Projects to solve this problem? What is the best way of doing so?
Consider each isolated section to be an independent project (in both the physical and management sense). Ship releases from those shared components/projects and deliver them as binaries to be pulled into the others. You can use the output from trunk or release-branch builds of the shared components to deliver new "releases".
This affords you the option of full branching, work item tracking, reporting, etc. for each logical project in your organization.
If you let someone be a contributor/developer on a project, then that individual has access to the entire project. If you want to keep someone out of certain files, then those files will need to be under their own TFS project. You would then reference the output assemblies from the parent project in the child project.