Using WiX to generate an installer for an ASP.Net MVC website - asp.net-mvc

Has anyone used WiX to generate an installer for an ASP.Net MVC website? Do you harvest files from the web project? I can't find any good examples of this being done. There doesn't seem to be a documented way to include all the right files, only the right files, and put them in the right place.
If you add the website project as a reference in the installer project, and set harvest=True in the properties, then all the website files are captured, but there are issues:
Some files that should not be copied are included, e.g. packages.config and Web.Debug.config, and there doesn't seem to be any clear or simple way to exclude them (as per this discussion).
The website .dll file ends up in the wrong place: in the root rather than the bin folder (as per this discussion).
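As a sketch of the kind of automated harvest-and-filter step I have in mind, heat.exe can harvest a directory and run the generated fragment through an XSLT transform to drop files like packages.config (every name below is a made-up placeholder, and the Filter.xslt would still have to be written by hand):
rem Hypothetical pre-build harvest; paths, group name and Filter.xslt are placeholders.
"%WIX%bin\heat.exe" dir ..\MyMvcSite\publish -cg WebSiteComponents -dr INSTALLFOLDER -gg -srd -sreg -t Filter.xslt -out WebSiteFiles.wxs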
However, if you do not use harvesting, you have a lot of files to reference manually (e.g. under \Content\ alone I have 58 files in 5 folders; most of that is jQuery UI), they change from time to time, and errors and omissions could easily go unnoticed in a hand-maintained WiX file list. So it really should be kept in sync automatically.
I disagree with the idea that the list of files should be specified explicitly in WiX and not generated dynamically (which is what seems to be suggested at the first link, though the wording isn't very clear). If I need to remove a file I will remove it from the source control system; there is no need to do the extra work of maintaining two parallel but different catalogues – one set of files in source control, and the same files listed in WiX. There should be one version of the truth. All files in the website's source tree (with certain known exceptions that are not used at runtime, e.g. packages.config) should be included in the deployment.
For corporate reasons I don't have much choice about using WiX for this project.

In our MVC 3 project we use Paraffin to harvest files for the installer. For example, you can use "-ext <extension>" to ignore files with a given extension, "-regExExclude <expression>" to ignore file names matching a regular expression, and so on.
Paraffin also keeps the proper structure: all your files end up in the same folders in which they appear in your project.
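A rough sketch of such a command line (only -ext and -regExExclude come from the above; the directory switch, paths and file names are assumptions, so check Paraffin's own help for the exact syntax):
rem Hypothetical Paraffin harvest: skip .pdb files and the Web.*.config transforms.
Paraffin.exe -dir ..\MyMvcSite -ext .pdb -regExExclude "Web\..+\.config" WebSiteFiles.wxs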

I use a program that I wrote called ISWIX that makes authoring wxs merge modules a simple drag and drop operation like InstallShield. I then consume that merge module in an installer that handles the UI and IIS configuration.
I also have postbuild automation that extracts the content of the MSI and compares it against what the project published. If there is a delta I fail the build and you have to either a) add it to the wxs or b) remove it from the publish.
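A rough sketch of that kind of check, in case it helps (the installer name, folders and sub-paths below are placeholders; an administrative install just lays the MSI's files out on disk without registering anything):
rem Extract the MSI's file layout with an administrative install.
msiexec /a MyInstaller.msi /qn TARGETDIR=%TEMP%\msi-extract
rem List-only robocopy reports files that are new, changed or extra between the two
rem trees without copying anything; the sub-folder under the extract root depends on
rem the MSI's directory table.
robocopy "%TEMP%\msi-extract\MyWebSite" "..\MyWebSite\publish" /E /L /NJH /NJS
A robocopy exit code above 0 means there were differences (or errors), which the post-build step can turn into a build failure.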
I find that the file-count churn from build to build is minimal and that this system is not difficult to maintain. The upside is that everything remains 100% intentionally authored, and files never magically appear in or disappear from the installer unless you intended them to. Dynamic installer generation isn't worth the risk, and most people who argue that it is don't even know what those risks are.

Related

TFS Build controller and extensions

I am setting up a TFS 2012 Build server. I am using some extensions (NUnit Test Adapter). Per the instructions, I have added the dlls to a common folder in TFS, and configured the "Version Control Path to Custom Assemblies" on the build controller to reference the correct folder in TFS.
Everything works, BUT: According to the docs, I can create subfolders under my "Path to custom assemblies" folder, and the controller should pick them up.
That doesn't seem to be working for me. If I put the NUnit support in the root, it works, in subfolders, it does not.
I would like to use the subfolders feature so that I can keep each set of extensions, custom build targets, etc. separate.
Does this just not work, or am I missing something?
Edit
As requested, here is a reference to the documentation where I found the information:
http://msdn.microsoft.com/en-us/library/vstudio/ee330987(v=vs.120).aspx#custom_process
Here's the passage:
To enable your build processes to leverage these kinds of code, check the binaries in to the folder (or any of its descendant folders) that you specify in the Version control path to custom assemblies box.
It turns out that the documentation is just wrong -- all of the dlls must go into 1 folder.
http://social.msdn.microsoft.com/Forums/en-US/0059bc66-d3c9-42e6-8d8a-dd22f3416e07/version-control-path-to-custom-assemblies-doesnt-use-subfolders?forum=tfsbuild
I do understand that this helps to ensure you don't get duplicate dependencies with different versions -- makes sense. It just makes it a little tougher to know what depends on what if you add a few extensions. It seems to work fine using a single folder, though.

How do you share scripts among multiple projects in one solution?

In case the question wasn't clear: I have 3 MVC projects in one solution. Every time I create a new project it adds the "Scripts" folder with all the .js files I'll ever need. I don't want this created every time for every application. Is there a way to reference scripts from a central folder in the solution, so that all applications/projects can share one common script folder containing the scripts common among them?
Edit:
Please explain the pros and cons of doing this if there are any...now I'm curious.
Here is what I would recommend:
Right-click the solution and create a new solution folder called Common Javascript Files (or whatever you feel like calling it).
Right-click the solution and click Open Folder in Windows Explorer, or navigate there manually in versions of Visual Studio that don't have that command :(
In the solution directory, create a directory with the same name as the solution folder (solution folders do not normally match directories at the file-system level, but this one will, for sanity's sake).
In this new directory, add files that need to be shared between solutions.
In Visual Studio, click the solution folder and select Add - Existing Item.
In the file selection dialog, navigate to the directory previously created, select the file(s) added to the directory, and click Add.
In each Project that needs a shared file, right click on the project (or directory within the project) and click Add - Existing Item.
Navigate to the shared directory, select the files, click the drop-down arrow on the Add button, and then click Add As Link.
Now the files in the projects are essentially shortcuts to the files in the solution folder, but they are treated as actual files in the project (this includes .cs or Visual Basic files; they will be compiled as though they actually existed in the project).
PROS
Files are truly shared across projects at Design time
Only the files needed by each project have to be added; it's not all or nothing.
Does not require any configuration in IIS (virtual directory etc)
If the solution is in TFS Source control, you can add the Directory to the TFS Source and the shared files will be source controlled.
Editing a file by selecting it in the project edits the actual file.
Deleting a Linked file does not delete the file.
This is not limited to JS files, linked files can be ANY file you might need (Images, Css, Xml, CS, CSHTML, etc)
CONS
Each deployment gets its own copy of the file.
There is a small learning curve in understanding that solution folders are not directories that exist inside a solution directory.
The best thing to do, imo, is to roll your own CDN... Basically just create another site in IIS and give it its own binding, e.g. "http://cdn.somedomain.com".
Then store all of your css/js/fonts/shared images etc on the CDN site and link to them from your other sites.
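If you script your IIS setup, creating that site could look something like this (the site name, host name and path are just examples):
rem Create a separate IIS site bound to the cdn host name; its content folder
rem holds the shared css/js/fonts/images.
%windir%\system32\inetsrv\appcmd add site /name:"CDN" /bindings:http/*:80:cdn.somedomain.com /physicalPath:"C:\inetpub\cdn"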
Doing so solves 2 problems:
All of your stuff is shared when it needs to be, and you only have to manage 1 revision per file.
Your users' browsers can cache them in 1 single location instead of downloading copies for every site that uses them.
I added this answer because I see a lot of people suggesting virtual directories. While that does indeed share the files, it creates multiple download paths for them, which is an extreme waste of bandwidth. Why make your users download jquery.js (1 * number of sites) when you can let them download it once from cdn.somedomain.com?
Also, when I say waste of bandwidth, I'm not just talking about server bandwidth; I'm talking about mobile users on data plans... As an example, I hit our company's HR site (insurance etc.) on my phone the other day and it consumed 25 MB right out of the gate, downloading jQuery and a bunch of other stuff 5 times each... On a 2 GB a month data plan, websites that do that really annoy me.
Here it goes; IMO this is the best and easiest solution. I spent a week trying to find the best and easiest way, and the alternatives always had more cons than pros:
Resources (DLL)
    Shared
        images
            image.png
        css
            shared.css
        scripts
            jquery.js
MvcApp1
    Images
    Content
    Shared  <- We want to get files from the above dll here
    ...
MvcApp2
    Images
    Content
    Shared  <- We want to get files from the above dll here
    ...
Add the following to MvcApp1 -> Project -> MvcApp1 Properties -> Build Events -> Post-build event:
start xcopy "$(SolutionDir)Resources\Shared\*" "$(SolutionDir)MvcApp1\Shared" /r /s /i /y
Here is an explanation of what it does: Including Build action content files directory from referenced assembly at same level as bin directory
Do the same for MvcApp2. Now, after every build, fresh static files will be copied to your app, and you can access them with paths like "~/Shared/css/site.css".
If you want, you can adjust the above command to copy scripts from the .dll project into the Scripts folder of every app; that way you could move some scripts into the .dll without having to change any paths. Here is an example:
If you want to copy only scripts from Resources/Shared/scripts into MvcApp1/scripts after each build:
start xcopy "$(SolutionDir)Resources\Shared\Scripts\*" "$(SolutionDir)MvcApp1\Scripts" /r /s /i /y
This is a late answer, but Microsoft has added a project type called Shared Project starting with Visual Studio 2013 Update 2 that can do exactly what you want without having to link files.
The shared project reference shows up under the References node in the Solution Explorer, but the code and assets in the shared project are treated as if they were files linked into the main project.
"In previous versions of Visual Studio, you could share source code between projects by Add -> Existing Item and then choosing to Link. But this was kind of clunky and each separate source file had to be selected individually. With the move to supporting multiple disparate platforms (iOS, Android, etc), they decided to make it easier to share source between projects by adding the concept of Shared Projects."
https://blogs.msdn.microsoft.com/somasegar/2014/04/02/visual-studio-2013-update-2-rc-windows-phone-8-1-tools-shared-projects-and-universal-windows-apps/
Info from this thread:
What is the difference between a Shared Project and a Class Library in Visual Studio 2015?
https://stackoverflow.com/a/30638495/3850405
A suggestion that will allow you to debug your scripts without re-compiling the project:
Pick one "master" project (which you will use for debugging) and add the physical files to it
Use "Add As Link" feature as described in Eric's answer to add the script files to the other projects in solution
Use the CopyLinkedContentFiles task on build, as suggested in Mac's comment, to copy the files over to your additional projects
This way you can modify the scripts in the "master" project without restarting the debugger, which to me makes the world of difference.
In IIS, create a virtual folder pointing to the same scripts folder in each of the 3 applications. Then you'll only need to keep the scripts in a single application. There are other alternatives, but it really depends on how your applications are structured.
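With appcmd that is one command per application (the application names and paths below are just examples):
rem Repeat for each of the 3 applications, all pointing at the same physical folder.
%windir%\system32\inetsrv\appcmd add vdir /app.name:"Default Web Site/MvcApp1" /path:/Scripts /physicalPath:"C:\Shared\Scripts"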
Edit
A scarier idea is to use Areas. In a common area have a scripts directory with the scripts set to be compiled. Then serve them up yourself by getting them out of the dll. This might be a good idea if you foresee the common Area having more functionality later.
Most of the files that are included by default are also available via various CDNs.
If you're not adding your own custom scripts, you may not even need a scripts directory.
Microsoft's CDN for scripts: http://www.asp.net/ajaxlibrary/cdn.ashx

Is there any simple automated way of finding out all the source files associated with a Delphi project?

I like to back up the source code set for a project when I release a version. I use GExperts project backups, which seems to gather up all the files in the project manager into the ZIP file. You can also add arbitrary files to this file set, but I'm always conscious of the fact that I haven't necessarily got all the files. Unless I specifically go through the uses clauses and add all the units I have sources for to the project, I'll never be sure of storing all the files necessary to recreate the installable/executable.
I've thought about rolling an app to traverse a project, following all the units used and looking down all the search paths and seeing if there is a source file available for that unit, and building a list of files to back up that way, but hey - maybe someone has already done the work?
You should look into version control; I highly recommend it.
e.g. SVN (Subversion) or CVS.
This will allow you to control revisions of all of your source. It will allow you to add or remove source files, roll back, merge, and do all the other nice things related to managing project sources.
This WILL save your a$%# one day.
You can interpret your question in two ways:
How can I make sure that I back up at least enough files so that I can build the project?
How can I make sure that I don't back up more files than needed while still being able to build the project?
The first is to make sure you can build the system at all; the second is to allow you to clean up unused files.
For both, a version control system plus a separate build system is the way to go.
You then - for each new set of changes - can use these steps to assure that both conditions hold:
On your daily development system, check in the new revision of your source code into your version control system.
On your separate build system, get the latest version from your version control system.
Build the project on the build system; if this fails, go back to the first step and add the missing files to your version control system from your development system.
Start removing files (one by one) from the project that you suspect are not needed, then rebuild until it fails.
When the build fails, restore that particular file from the version control system, then continue removing with the next candidate.
When the build succeeds, you have the minimum set of files.
Now make a difference overview of the files in your version control system versus the files on the build machine.
Mark the files that are in your version control system but not on your build machine as deprecated or deleted.
Most version control systems have good ways of generating a difference between the files on your development or build system and the files in the version control system (usually fine-grained for each historic point in time at which you added/removed/updated files in your version control system).
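A minimal sketch of that loop on the build machine, assuming Subversion and purely made-up repository, path and project names:
rem Get a clean working copy and try to build it from scratch.
svn checkout https://svnserver/svn/MyProject/trunk C:\build\MyProject
cd /d C:\build\MyProject
dcc32 MyProject.dpr
rem If the compile stops on a missing unit, add and commit that file from the
rem development machine, then repeat the checkout and build.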
The reason you want a separate build system (or two separate development systems) is that you want them to be independent: you use one for developing, and the other for checking if the build is still OK.
This is the first step; in the future you might want to extend it into a continuous integration system (one that runs unit tests, automatically creates product setups and much more).
--jeroen
I'm not sure if you're asking about version control or how to be sure you've got all the files.
One useful utility I run occasionally is a program that makes a DirList of all of the files in my dcu output folder. Changing the extensions from .dcu to .pas gives me a list of all of the source code files.
Of course it misses .inc files and other non-.pas files, but perhaps this line of thinking would be helpful to you in some way?
The value of this utility to me is that a second housekeeping utility program then makes a list of all .pas files in my source tree that do not have corresponding .dcu files. This (after a full compile of all programs) generally reveals some "junk" .pas files that are no longer in use.
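The same kind of list can be produced from a small batch file (the folder names here are hypothetical):
rem For every compiled unit, write the corresponding .pas name, giving a rough
rem inventory of the source files the compiler actually used.
cd /d C:\MyProject\dcu
for %%F in (*.dcu) do @echo %%~nF.pas >> ..\used-sources.txt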
For getting a list of all units compiled into an executable, you could let the compiler generate a MAP file. This file will contain entries for all the units used.
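If I remember correctly, the command-line compiler produces a detailed map file with the -GD switch (the project name below is a placeholder); in the IDE the same setting is under the linker options:
rem -GD asks the linker for a detailed map file (MyProject.map), which lists
rem every unit linked into the executable.
dcc32 -GD MyProject.dpr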

Managing common components with Fossil CVS

I'm a Fossil (and CVS configuration) novice attempting to create and manage a set of distributed Fossil repositories for a Delphi project.
I have the following directory tree on my development machine:
Projects
    Some Project
    Delphi Components
        LookupListView
    Some Client
        Some Project For Client
        Some Other Project For Client
            Source Code
            Project Resources
            Project Database
I am setting up Fossil version control in order to version and share Projects\Some Client\Some Other Project For Client\Source Code, which contains Delphi 2010 source for a database project.
This project makes use of Projects\Delphi Components\LookupListView which is a Delphi component. I need this code to be included in the versioning system for my project. I will, in theory, need to include it in other Fossil repositories in the future, as well.
If I create my Fossil repository at the Source Code or Some Other Project For Client level, I cannot add any code above that level to my repository. What is the proper way to deal with this? The two solutions that occur to me are
1) Creating a separate repository for LookupListView and making sure that everyone who uses a repository for a project that references it "knows" that they must also get the current version of that component. This seems to defeat the purpose of being able to obtain a complete, current version of the project with a single checkout. The problem is magnified because there are other common component dependencies in this project.
2) Establishing my Fossil repository in the Projects directory, so I can check in files from various subfolders. This seems to me to involve an awful lot of extra path-typing when doing adds, and also to impose my directory structure (Some Client\Some Other Project For Client\Source) on the other users of the repository -- in this case, the actual client.
Any suggestions appreciated.
I use Git, but my approach can be applied in your situation.
I have one repository for my whole components folder. This gives me the ability to get all of them with only a few console commands (in case I reinstall my OS, move to another computer, etc.).
Also, I have one repository for each of my projects. If a project uses 3rd-party controls, I create a "components" sub-folder and make symbolic links (junctions) to each component set it needs.
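On Windows those junctions can be created with mklink; the paths below are only an example:
rem Junction the shared component folder into the project's components sub-folder.
mklink /J C:\Projects\MyApp\components\LookupListView C:\Components\LookupListView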
This approach has some disadvantages (when you "go back" in the commit history of a project, the components may have been modified in the meantime, and if many projects use the same components this could cause some trouble). But I've had no issues yet :)

Where does TFS keep information on how projects within a solution are structured?

Is it just in the .csproj files and the solution file, or are there other files or locations where the information is stored? What are those .vssscc files doing?
TFS source control generally does not care how the content put in it is stored or structured. The .vssscc files are just a hold-over from the previous Visual Studio integration, and are only used by TFS to list file exclusions from source control (as far as I know, anyway...there could be undocumented uses as well.) Your solution and project structure is defined by your solution and project files. TFS Source control simply tracks the different versions of any file put in it, and provides ways to find and analyze that version data.
JRista's answer is pretty close. I'm not aware of any uses for the .vssscc file other than exclusions. Even within the .sln/.csproj files, there are no explicit bindings, just a bunch of "SAK" entries that tell VS not to do anything special. In the end, the layout of the projects on disk is determined by workspace mappings; the structure of the solution itself is determined by the paths in the .sln file.
Obviously the two need to match. If you have nonstandard workspace mappings, some projects may not load. If some of the paths referred to in the .sln are not mapped at all, VS will create explicit mappings during Add To SCC and Open From SCC operations. This can be problematic -- people who don't map their code correctly in the first place often have unusual relative paths between their projects that can't be easily replicated by people trying to Get their code from source control.
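For example, the on-disk layout comes from workspace mappings along these lines (the server path, local path and workspace name are placeholders):
rem Map a server folder to a local folder in an existing workspace.
tf workfold /map "$/MyTeamProject/Main/MySolution" C:\src\MySolution /workspace:MyWorkspace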
