Orchard CMS module development and continuous integration with TeamCity - asp.net-mvc

I've been developing with Orchard CMS for a few months now (and love it), and the time to launch my website is fast approaching. So far I've just been developing solo out of my BitBucket repo, forking where necessary, not doing anything too fancy. Once I have released, though, I really need to have a handle on exactly which versions of my modules and themes are in use in production. I figure versioned packages out of my build server are the best way to achieve this.
Currently my repo consists of the source for the entire Orchard instance (minus the App_Data folder), with a solution file that includes the projects that are my modules and themes. My modules take dependencies on other modules via the App_Data/Dependencies folder.
My question is, is this the best approach to achieve Continuous Integration?
I have my solution building under TeamCity, but since I don't include the App_Data folder in my repo I need to at least load the setup page so that the Dependencies directory gets populated (which doesn't happen on my build server for some reason; it seems dynamic compilation kicks in for everything instead).
Any thoughts or assistance would be greatly appreciated.
UPDATE:
I have decided I will add a lib folder to my solution and store all dependent assemblies there. I will then have my repo consist of only the projects required for my modules / themes. The CI server will then have no problems building the solution, and I can just clone the repo into an Orchard instance for easy development (this means my solution will have to contain a Modules and Themes directory).
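As a rough sketch, the references in a module's .csproj could then point at that lib folder instead of App_Data/Dependencies (the assembly names and relative paths below are illustrative, not my actual layout):

    <!-- Illustrative only: reference Orchard assemblies from a solution-level lib folder.
         Private=False keeps them out of the module's output, since Orchard provides them at runtime. -->
    <ItemGroup>
      <Reference Include="Orchard.Framework">
        <HintPath>..\..\lib\Orchard.Framework.dll</HintPath>
        <Private>False</Private>
      </Reference>
      <Reference Include="Orchard.Core">
        <HintPath>..\..\lib\Orchard.Core.dll</HintPath>
        <Private>False</Private>
      </Reference>
    </ItemGroup>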

I used the following approach.
add Orchard binaries to the repository without any source code, just in the form they are distributed on the Orchard project site
create my solution and all related projects in a separate directory, so at the moment the directory structure looks like this:
then place your module projects under the orchard/Modules folder with all sources and project files
add references from your module projects to orchard/bin for any Orchard-specific assemblies
add module binaries manually to the App_Data/Dependencies folder so that they can be referenced
One possible improvement to this approach is to turn off dynamic compilation and store only the module binaries, but this requires configuring the output bin path and adding extra steps to the build script.
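If you go that route, the output path part can be as simple as an MSBuild property override in each module project; a minimal sketch (the module name and relative path are assumptions):

    <!-- Illustrative: build the module's binaries straight into its bin folder
         inside the Orchard instance, so dynamic compilation is not needed. -->
    <PropertyGroup>
      <OutputPath>..\..\orchard\Modules\MyModule\bin\</OutputPath>
    </PropertyGroup>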
Benefits
You don't have any Orchard sources in your repository except for your own modules (and even that can be addressed by turning off dynamic compilation).
You can upgrade Orchard binaries and modules almost independently of each other.
The build takes less time.

Related

Should umbraco & umbraco_client be checked in to source control?

Just installed the latest Umbraco (7.2.1) package via NuGet. My development environment is as follows:
Umbraco is installed on IIS8 as shown below and is all up and running.
My Visual Studio project is set up as shown below (for the sake of clarity, any folder/file excluded from the project is not included in my source control).
The content folder houses all scripts, images & CSS.
On build, the bin, config, content, masterpages, usercontrols, Views, xslt, default.aspx, Global.asax & the transformed Web.config are copied to the IIS instance (I don't like running Umbraco in the same place as my project, it just seems messy).
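For illustration, that copy step can be expressed as a post-build MSBuild target roughly like this (the destination path and file list are assumptions, not my actual values):

    <!-- Illustrative post-build copy to the IIS site folder. -->
    <Target Name="CopyToIis" AfterTargets="Build">
      <ItemGroup>
        <DeployFiles Include="bin\**\*.*;config\**\*.*;content\**\*.*;masterpages\**\*.*;usercontrols\**\*.*;Views\**\*.*;xslt\**\*.*;default.aspx;Global.asax"/>
      </ItemGroup>
      <Copy SourceFiles="@(DeployFiles)" DestinationFiles="@(DeployFiles->'C:\inetpub\umbraco-site\%(RelativeDir)%(Filename)%(Extension)')"/>
    </Target>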
Is this an appropriate way of developing for Umbraco? Am I missing anything? My biggest concern is whether or not I should include the umbraco & umbraco_client folders in version control and in the post-build action. Any suggestions would be great.
There is some debate over what should and shouldn't be in your repository, and ultimately it comes down to personal preference. I used to add only custom files and files that I changed from the Umbraco install, such as the config files. However, since the introduction of the NuGet package I put everything but the binaries into source control, because when I upgrade via NuGet later on I can easily see the changes and merge customisations back in.
Running Umbraco directly saves a lot of hassle (IMO), especially if you make any changes via the UI. If you're not running it directly, there is little point in using the NuGet package, because you will end up with a bunch of unused files in your project. In your situation you might as well keep your project clean, do a manual install into the location IIS is using for the site, and keep only the files you have created in your project.
This is only my opinion so take from it what you wish but hopefully it is of some help.
Simon

TFS Online/VSO Build with Common Assemblies

I was wondering if anyone could help. We have the following project structure in our company:
Code/Common
Code/Project1
Code/Project2
etc...
When the Common project builds, it has a post-build event that copies all the relevant files into the Code/Common/Binaries folder. All the other projects then reference the Common components in this folder.
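For concreteness, that kind of post-build event might look roughly like this in the Common project file (the exact command and destination are illustrative, not our actual setup):

    <!-- Illustrative post-build event: copy Common's output into the shared Binaries folder. -->
    <PropertyGroup>
      <PostBuildEvent>xcopy /Y /I "$(TargetDir)*.dll" "$(SolutionDir)Binaries\"</PostBuildEvent>
    </PropertyGroup>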
However, what we are struggling with is that when TFS Online checks out the solution it does so to c:\a\src, and the Common binaries end up in c:\a\src\Binaries. When the other projects (Project1, etc.) then build, they cannot find the Common assemblies: not only have they been removed, but the paths are different from what the projects expect, namely c:\a\src\Common\Binaries instead of c:\a\src\Binaries.
Is there any way to tell the build server not to delete the files in the "Binaries" directory and to specify the folder location to check out to? Or how would one go about solving such a problem?
Thanks very much
A build server is a transient thing; you cannot rely on files being there.
You need to either create NuGet packages for your common output and consume them in your other projects (the 'proper' way), or check your dependencies into source control after each build so you can reference them in subsequent builds (the 'really frowned upon' way).
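As a minimal sketch of the NuGet route (the package id, version and paths are hypothetical), you would describe the common output in a .nuspec, produce the package with nuget pack as a build step, and push it to a private feed that the other projects restore from:

    <!-- Common.nuspec (hypothetical): package the shared assemblies for consumption by Project1, Project2, etc. -->
    <package>
      <metadata>
        <id>MyCompany.Common</id>
        <version>1.0.0</version>
        <authors>MyCompany</authors>
        <description>Shared common assemblies</description>
      </metadata>
      <files>
        <file src="Binaries\*.dll" target="lib\net45" />
      </files>
    </package>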

Ant/Ivy for project building

I am considering switching a Maven project that I manage to Apache-Ant/Ivy. I need more control over the build process and am getting very frustrated with Maven. Please no comments about how great Maven is. My question is about Ivy.
I would like to set up a "standard" Ant build template that can later be used for other projects with minimal changes.
I will set up a central "enterprise" repository where we can place third-party libraries that are not available in the public Maven repositories (e.g. commercial libraries, Sun libraries, proprietary libraries, etc.). This enterprise repository will be available on our local LAN, but may not be available from outside the office.
Each developer will have a private repository in ~/.ivy/repository. I would like the Ant build to automatically update this private repository with changed versions of libraries from the enterprise repository.
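The kind of resolver setup I have in mind would look something like the following, checking the private repository first and falling back to the enterprise one (the enterprise URL and repository patterns are placeholders):

    <!-- Illustrative ivysettings.xml: private filesystem repo chained before the enterprise repo. -->
    <ivysettings>
      <settings defaultResolver="main"/>
      <resolvers>
        <chain name="main" returnFirst="true">
          <filesystem name="private">
            <ivy pattern="${user.home}/.ivy/repository/[organisation]/[module]/[revision]/ivy.xml"/>
            <artifact pattern="${user.home}/.ivy/repository/[organisation]/[module]/[revision]/[artifact].[ext]"/>
          </filesystem>
          <url name="enterprise">
            <ivy pattern="http://ivy.example.local/repo/[organisation]/[module]/[revision]/ivy.xml"/>
            <artifact pattern="http://ivy.example.local/repo/[organisation]/[module]/[revision]/[artifact].[ext]"/>
          </url>
        </chain>
      </resolvers>
    </ivysettings>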
In ~/.ivy/ant, I plan on placing "standard" modules to be included in the individual project build.xml files using the include task in Ant 1.8. These modules will provide things like Scala and Clojure compilation targets, with separate files for different Scala and Clojure versions (e.g. scala-compile-2.9.1.xml, clojure-compile-1.3.xml, etc.). The build modules will be available in the enterprise repository and should be updated automatically in the private repositories if they change.
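The per-project include would then be something along these lines (this assumes the shared file exposes a compile target; names are illustrative):

    <!-- Illustrative: pull a shared compilation module into a project's build.xml (Ant 1.8+ include task). -->
    <project name="my-project" default="build">
      <include file="${user.home}/.ivy/ant/scala-compile-2.9.1.xml" as="scala"/>
      <target name="build" depends="scala.compile"/>
    </project>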
Each project will follow a standard Maven directory structure: ${project}/src/main/java, ${project}/target/classes, etc.
In the past, I tried using Ivy but the Ant build files got to be very large (> 500 lines for the template build file) and hard to manage/edit. I am hoping that by putting standard targets in their own build modules in the ~/.ivy/ant directory, I can avoid that code bloat.
Can this be done? Am I way off base? The only documentation I can find on Ivy is at the Apache web site (http://ant.apache.org/ivy). Is there any other documentation available, including books?
Dividing the template build file into includable helper files is a rather sensible idea. I'm currently switching a really large project from plain Ant (no dependency management at all, only copying files from FTP) to an Ant/Ivy solution, and this is how I did it. I have a file with milestone targets, e.g. ready-to-compile, compiled, ready-to-archiving, archived, and so on; I think you get the idea. I've configured dependencies between these targets (dependencies in the Ant sense, don't get me wrong): compiled depends on ready-to-compile, ready-to-compile depends on initialized, and so forth. These targets have no body; they are meant to be included in the build file of every module of your multi-module project. Their sole purpose is to maintain the state of the build, because with all the import machinery things get rather tricky and it becomes hard to know which target was overridden and when it will run. With this file I can easily hook into the build at every sensible milestone. Say I want one module to compile help files with an external exe: no problem, in that project ready-to-archiving simply depends on the target that compiles the help. And since the milestone targets are included, I can override only some of them; all the others preserve the usual way of building the project.
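A stripped-down sketch of such a milestones file, using the target names from the description above:

    <!-- milestones.xml: bodiless "state" targets shared by every module's build file.
         Modules add their own dependencies to these milestones
         (e.g. making ready-to-archiving depend on a help-compilation target). -->
    <project name="milestones">
      <target name="initialized"/>
      <target name="ready-to-compile" depends="initialized"/>
      <target name="compiled" depends="ready-to-compile"/>
      <target name="ready-to-archiving" depends="compiled"/>
      <target name="archived" depends="ready-to-archiving"/>
    </project>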
Another part of my strategy is mixin build files, one for each specific area. For example, I have a file for Ivy, where I put initializing, resolving, publishing and so on. When I want to use Ivy, I just include that file and manage dependencies through my milestone targets. If the build is typical, including this one file gives me convention-over-configuration behaviour, all out of the box. How? By combining it with other mixins. Mixins may include other mixins they depend on, so each mixin is a reusable, single-concern part of my build strategy, much like in OOP. In your case it would be a Scala mixin with targets specific to Scala.
Then I have a delegate.xml that delegates common build activities to the child projects: dist, all, test and whatever else you want for a multi-module project. The build order is computed with the Ivy Ant task buildlist.
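The ordering part of delegate.xml might look roughly like this (the modules directory layout is assumed, and the ivy namespace must be declared as antlib:org.apache.ivy.ant):

    <!-- Illustrative: compute the module build order from their ivy.xml dependencies,
         then run the same target in each module in that order. -->
    <target name="dist">
      <ivy:buildlist reference="ordered-builds">
        <fileset dir="modules" includes="*/build.xml"/>
      </ivy:buildlist>
      <subant buildpathref="ordered-builds" target="dist"/>
    </target>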
There are also some other files, but these are the strategically important ones that gave me a reusable and maintainable build for this big and very conservative project. So if you are interested in the details, don't be shy, contact me. I will be very pleased to help, because the Ivy docs really are complicated and incomplete.
EDIT: About books: Ant in Action may help you. I took several ideas from that book and highly recommend that everyone read it; it covers Ivy as well. As for the Ivy docs, sorry, that is all that is available. But when I was struggling with this cumbersome Ivy+Ant combination I found several interesting articles on personal blogs, so those may fill the gap to some extent.

Managing common components with Fossil CVS

I'm a Fossil (and CVS configuration) novice attempting to create and manage a set of distributed Fossil repositories for a Delphi project.
I have the following directory tree on my development machine:
Projects
  Some Project
  Delphi Components
    LookupListView
  Some Client
    Some Project For Client
    Some Other Project For Client
      Source Code
      Project Resources
      Project Database
I am setting up Fossil version control in order to version and share Projects\Some Client\Some Other Project For Client\Source Code, which contains Delphi 2010 source for a database project.
This project makes use of Projects\Delphi Components\LookupListView which is a Delphi component. I need this code to be included in the versioning system for my project. I will, in theory, need to include it in other Fossil repositories in the future, as well.
If I create my Fossil repository at the Source Code or Some Other Project For Client level, I cannot add any code above that level to my repository. What is the proper way to deal with this? The two solutions that occur to me are
1) Creating a separate repository for LookupListView and making sure that everyone who uses a repository for a project that references it "knows" that they must also get the current version of that project as well. This seems to defeat the purpose of being able to obtain a complete, current version of the project with a single checkout. The problem is magnified because there are other common component dependencies in this project.
2) Establishing my Fossil repository in the Projects directory, so I can check in files from various subfolders. This seems to me to involve an awful lot of extra path-typing when doing adds, and also to impose my directory structure (Some Client\Some Other Project For Client\Source) on the other users of the repository -- in this case, the actual client.
Any suggestions appreciated.
I use Git, but my approach can be applied in your situation.
I have one repository for my entire components folder. This gives me the ability to get all of them with only a few console commands (in case I reinstall my OS, move to another computer, etc.).
I also have one repository per project. If a project uses 3rd-party controls, I create a "components" sub-folder and make symbolic links (junctions) to each component set.
This approach has some disadvantages (when you "go back" in the commit history of a project, the components may have changed in the meantime, and if many projects use the same components this can cause some trouble). But I've had no issues yet :)

TFS 2008 and Common libraries folder structure

TFS 2008 and Common Libraries
I have created a Team Project called "Common Library" that will host code used in numerous different Team Projects throughout TFS. For the sake of argument, let's say we have 2 distinct libraries under the "Common Library" Team Project: MailProject and LoggingProject. Other projects throughout TFS will be using the binary representation of these projects via branching, not the actual source code.
What is the best way to set up the folder structure for this Team Project? Do I add the project to the "Common Library" and simply "include" the bin/release folder as part of the project?
I have seen some examples of people creating a separate "Deploy" folder. I assume this is synonymous with the bin/release folder?
We do not want the source code available in other solutions.
Currently, each project has the DLL included in the project. Using a mailing module as an example, many projects need the ability to send mail. The common module is very stable and mostly static.
However, what if there is a change in the mail module? It seems there should be a better way than checking out each project and updating the DLL. Is it possible to have TFS grab the latest mail module any time a 'get latest' is called, either explicitly or implicitly?
Unless you really require the library source code to be available in the other solutions, my advice would be to include the library binaries in the projects that use them, without any explicit link between the two in TFS. Custom labelling of the library builds can make it easy to return to and rebuild any chosen version of the shared libraries.
If the shared libraries require different versions for different projects, then the obvious solution would be to create a separate branch for every version of the libraries that needs to be customised for a particular project.
TFS does not have a concept similar to SVN's 'externals', though, so if you include a branch from the shared libraries in a project and then branch that project, it is very difficult to propagate changes correctly.
I suppose you could also use the Get task in the build to pull the latest version of the DLLs into the current project from another one, but verify whether you can point it at the workspace of another project (I have not tried it and MSDN is somewhat vague here). You might need a separate workspace for the shared project.
Yet another alternative would be to publish the DLLs for the common components to a known location (a network share) on every build of the shared libs, and have individual builds get whatever version is available from that location, for example via the Copy task. This is simplistic and may cause problems with versioning of the common components, but should work well enough in simple cases.
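A simple sketch of that last option, assuming a hypothetical share path and a BeforeBuild hook in each consuming project:

    <!-- Illustrative: pull the latest common DLLs from a network share before each build.
         The share path and destination folder are assumptions; place this after the
         standard targets import so it overrides the empty BeforeBuild target. -->
    <Target Name="BeforeBuild">
      <ItemGroup>
        <CommonDlls Include="\\buildserver\CommonDrop\*.dll"/>
      </ItemGroup>
      <Copy SourceFiles="@(CommonDlls)" DestinationFolder="$(ProjectDir)lib\Common\"/>
    </Target>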
