Configuring SpecFlow feature code generation to use "locally" sequential table suffixes - specflow

Whenever I add or remove a table in one feature file, it seems to affect the code generation of the other features. This causes a lot of unnecessary files to be added to commits.
For example, this diff was caused by a change in a completely different feature:
Is there a way to configure the code generation to use locally sequential suffixes? That is, I want all suffixes for a particular feature to start at table1 or table0 instead of continuing from the value in the "previous" feature. That way, changing a table in one feature has no impact on the code generation of another.
I am using SpecFlow v3.70

SpecFlow 3 generates the code-behind files using MSBuild. It is recommended to remove all *.feature.cs files from version control for precisely the reason you describe. These auto-generated files exhibit a lot of churn, so versioning them is not beneficial. What does belong in version control is the feature files themselves; the .feature.cs files become an artifact of the build process rather than an asset that requires tracking.
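For example, a minimal ignore rule (assuming Git; a .tfignore entry works the same way if you are on TFVC) could be:

    # SpecFlow code-behind files are regenerated by MSBuild on every build
    *.feature.cs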

Related

What's a better way to manage translation in a solution with multiple feature branches?

situation:
we have multiple feature branches, one main branch;
one particular project has multiple XML files containing key-value pairs for different languages;
only one person does translation; not a programmer, but an analyst;
we use feature branches to separate features and the scope of testing; a different QA engineer is responsible for testing each branch for each user story;
when a user story is done, we merge the feature branch up and sync down to the other sub-branches;
those language XML files are large, say, roughly a few hundred key-value pairs in each.
challenges:
challenges come when multiple feature branches need translation, and the translation changes are done in the same file, say Japanese;
QA occasionally files a translation bug that was already fixed in another branch;
the analyst is confused which branch to work on;
there is a very high chance that, when merging, one version overwrites another, because when we merge from a branch the default logic in TFS takes the newer version and overwrites with it. Most of the time this works fine, but in some cases it fails, which increases the complexity of merging.
of course those challenges are manageable.
But ideally, I really think those static files should live in one place instead of in each feature branch, so all feature branches share the same translation pack.
I could use an internal NuGet source to host the language files, but that would add work every time we make a small translation change.
Or I could set up TFS to use a relative path, but then I would need to update the build definition on the build server and make sure local builds can grab the correct files.
Is there any other recommendation?
Thanks
If you have any shared internal libraries, then creating NuGet packages is a good way to go, and you can also create a VS solution containing just that library.
For XAML build, add a post-build command to create the NuGet package, or extend your TFS build template to do it (there are a number of templates out there that already do this).
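As a sketch, the post-build command can be as simple as this (assuming nuget.exe is on the PATH; the project name is a placeholder):

    nuget pack LanguagePack.csproj -Properties Configuration=Release -OutputDirectory ..\LocalPackages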
For vNext build, there is a new VSTS task called "NuGet Installer" which allows you to check in your NuGet.config file and specify the different package sources. Run this task before you run MSBuild. For more details, see: How to get TFS2015 Build (Build.vnext) and NuGet package restore to use custom package sources
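A minimal NuGet.config along these lines should work (the internal source name and share path are made up for illustration):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <add key="nuget.org" value="https://www.nuget.org/api/v2/" />
        <add key="InternalTranslations" value="\\fileshare\nuget-packages" />
      </packageSources>
    </configuration>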
Moreover, if your product or app hasn't been published to users yet, translation changes shouldn't need to be so frequent. After all, it is just translations and language packages; you could update them once or twice during an iteration.

Using WiX to generate an installer for an ASP.NET MVC website

Has anyone used WiX to generate an installer for an ASP.NET MVC website? Do you harvest files from the web project? I can't find any good examples of this being done. There doesn't seem to be a documented way to include all the right files, only the right files, and put them in the right place.
If you add the website project as a reference in the installer project, and set harvest=True in the properties, then all the website files are captured, but there are issues:
Some files that should not be copied are included, e.g. packages.config and Web.Debug.config. There doesn't seem to be any clear or simple way to exclude them (as per this discussion).
The .website dll file is in the wrong place, in the root rather than the bin folder (as per this discussion)
However, if you do not use harvesting, you have a lot of files to reference manually (e.g. under \Content\ alone I have 58 files in 5 folders; most of that is jQuery UI), and they change from time to time, so errors and omissions could easily be missed in a WiX file list. So it really should be kept in sync automatically.
I disagree with the idea that the list of files should be specified explicitly in WiX and not generated dynamically (which is what seems to be suggested at the first link; the wording isn't very clear). If I need to remove a file I will remove it from the source control system; there is no need to do the extra work of maintaining two parallel but different catalogues, one set of files in source control and the same files listed in WiX. There should be one version of the truth. All files in the website's source tree (with certain known exceptions that are not used at runtime, e.g. packages.config) should be included in the deployment.
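For what it's worth, WiX's own harvester (heat.exe) can regenerate the file list on every build; a rough sketch, where the directory, component group, variable, and transform names are placeholders, and the XSLT transform is where files like packages.config would be filtered out:

    heat.exe dir ..\MyWebsite\publish -cg WebsiteComponents -dr INSTALLFOLDER ^
        -gg -srd -scom -sreg -sfrag -var var.PublishDir ^
        -t ExcludePackagesConfig.xslt -out WebsiteFiles.wxs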
For corporate reasons I don't have much choice about using WiX for this project.
In our MVC 3 project we use Paraffin to harvest files for the installer. For example, you can use "-ext" to ignore files with a given extension, "-regExExclude" to ignore file names matching a regular expression, and so on.
Paraffin also keeps the proper structure, all your files would be in the correct folder as they appear in your project.
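A hypothetical invocation (the directory, group, and output names are made up; check Paraffin's -? output for the exact switch spellings):

    Paraffin.exe -dir ..\MyMvcSite -GroupName SiteFiles -ext .pdb -regExExclude packages\.config SiteFiles.wxs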
I use a program that I wrote called ISWIX that makes authoring .wxs merge modules a simple drag-and-drop operation, like InstallShield. I then consume that merge module in an installer that handles the UI and IIS configuration.
I also have post-build automation that extracts the content of the MSI and compares it against what the project published. If there is a delta, I fail the build, and you have to either a) add it to the wxs or b) remove it from the publish.
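As a minimal sketch of that compare step in C# (the paths are placeholders; the MSI payload can be extracted beforehand with an administrative install, e.g. msiexec /a Setup.msi TARGETDIR=C:\build\msi-extract /qn):

    using System;
    using System.IO;
    using System.Linq;

    class CompareFileSets
    {
        static int Main()
        {
            // Placeholder paths: the project's publish output and the extracted MSI payload.
            var published = RelativePaths(@"C:\build\publish");
            var installed = RelativePaths(@"C:\build\msi-extract");

            var missingFromMsi = published.Except(installed).ToList();
            var unexpectedInMsi = installed.Except(published).ToList();

            foreach (var f in missingFromMsi) Console.WriteLine("Not in MSI: " + f);
            foreach (var f in unexpectedInMsi) Console.WriteLine("Not published: " + f);

            // A non-zero exit code fails the build on any delta.
            return missingFromMsi.Count + unexpectedInMsi.Count == 0 ? 0 : 1;
        }

        // Collect paths relative to the root, case-folded for comparison.
        static string[] RelativePaths(string root) =>
            Directory.GetFiles(root, "*", SearchOption.AllDirectories)
                     .Select(p => p.Substring(root.Length).ToLowerInvariant())
                     .ToArray();
    }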
I find that the file-count churn from build to build is minimal and that this system is not difficult to maintain. The upside is that everything remains 100% intentionally authored, and files never magically appear in or disappear from the installer unless you intended them to. Dynamic installer generation isn't worth the risk, and most people who argue that it is don't even know what those risks are.

TFS and one-up releases

I apologize for the length of this post but I needed to include a lot of information for proper answers. I hope this does not discourage responses...
Our shop has historically coded web sites using Classic ASP, with some newer ASP.NET sites configured as web sites. As everyone knows, this means that the source files (*.asp, *.aspx, and *.aspx.vb or *.aspx.cs) are deployed to development and production servers as-is.
The configuration management process was (and still is) entirely manual and includes the following steps (requirements):
Taking copies of the modified files and storing them in a "release" folder for archiving.
Taking copies of the production files that will be replaced and storing them in an "archive" folder for easier rollback.
Generating a diff report of before and after source files for code review or general reference when diagnosing a post-release issue.
The developer who coded the changes is not the person who performs the production release. The original developer is required to hand off the source files to another developer for some additional testing and production deployment.
To make the situation more difficult (not with the above..but with what I talk about below) we do not follow a formal release schedule. As individual bugs or enhancements are completed they are released. This means we could easily be making several releases to a site a week. It is even possible that a given site gets two different releases to individual pages on the same day!
Since I came on board I have been trying to transition the team to newer technologies like ASP.NET web applications and ASP.NET MVC. (We have also taken on responsibility for stand-alone applications and console utilities used for non-web processes...so my dilemma still applies.)
The difference between these technologies and the legacy ones is pre-compiling. Instead of deploying the code-behind files (*.aspx.vb or *.aspx.cs), a dll or exe gets deployed. This type of deployment package has raised several questions (issues?).
Generating difference reports when the source has been compiled. While the newly modified source files are sitting on the developer's system, the production copy is a compiled copy.
Making sure that changes related to other bugs or enhancements are not included in the particular release. This would apply to both the original developer and the person performing the release.
Allowing the original developer to pass along the changed files to another developer for build, testing, and deployment.
Up to now I was the only developer on the team working on these types of sites and applications, so the conflicts and issues mentioned above were non-existent. (I skip the difference report step and I do my own deployments.) However, I am trying to push the rest of the team to embrace this, plus allow for better distribution of bug and enhancement tasks.
We are currently using VSS, but I am pushing (and will most likely succeed) to get us moved over to TFS. Some ideas I have are:
Setting up a separate build system for use by the developer doing the deployment. This will solve two problems: (1) different versions/patches of Visual Studio and other libraries between developers, and (2) instances where the person performing the release has checked out files locally for another change. (Of course this does not guarantee there are no differences between the build system and the original developer's machine, but at least it means the release is built from a consistent configuration.)
Using labels to tag just the modified files. My problem is that while I can identify (and pull down for a build) the modified files, how do I identify the files that need to be included in the build but have not changed? Again, the idea is to not include checked-in files that are related to un-released changes.
Using labels to tag all the files for the release (the modified files and the unchanged files). My problem with this is similar to the last one: how do I make sure that a file checked in by another developer (say they went on vacation) for an unrelated change is not labelled and included in this build?
Using the labels, I could probably write a script to generate difference reports for the labelled version against the previously labelled version. If the process works properly, that should show exactly which changes are included in that particular release?
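For example, something along these lines with the tf command-line client should produce such a report (the label names are hypothetical):

    tf diff $/MySite /version:LRelease_041~LRelease_042 /recursive /format:unified > Release_042.diff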
Any other ideas, concerns, or points of interest? While I do have some flexibility in the process, some of the requirements (like the difference report, or some way to easily view differences, and having a separate developer/deployer) are most likely untouchable.
Thank you so much for any help you can provide on this.
To keep track of different versions of the code, and to help you manage very fast (daily) release cycles alongside long-term enhancements, you can use branches in TFS.
There is a ton of information out there on branching, but in general I like to try to keep things simple. For example, have one branch called "release" and another "development". Everybody works on the development branch but the code to be deployed to production is merged into the release branch right before release.
This blog post describes the process:
http://team-foundation-server.blogspot.com/2008/01/how-we-branch-our-code-in-tfs.html
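At the command line, the one-time setup and the per-release merge look roughly like this (the server paths and comment are placeholders):

    rem one-time: create the release branch from development
    tf branch $/MySite/development $/MySite/release

    rem before each release: merge development into release, then check in
    tf merge /recursive $/MySite/development $/MySite/release
    tf checkin /comment:"Release 4.2"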
Well, based on my experience with VS2003 vs. VS2010, for example, the project structures are different, and allowing VS to do a conversion often results in a solution that either requires a lot of refactoring or is unusable. Having said that, if you can transition everything over to TFS 2010, then one way to handle it is to set up different projects for each solution and use the TFS built-in version handling for the different releases. You can also set up a build server and schedule nightly builds. If the build is OK, you can push this version into testing and ultimately production. You should really read up on TFS because it's totally different from VSS and is definitely a huge upgrade in allowing you to do team-focused development.
P.S. TFS has really good SharePoint integration, which will help you and your team keep track of all the bugs and tasks.

Is there any simple automated way of finding out all the source files associated with a Delphi project?

I like to back up the source code set for a project when I release a version. I use GExperts project backups, which seems to gather up all the files in the project manager into the ZIP file. You can also add arbitrary files to this file set, but I'm always conscious of the fact that I haven't necessarily got all the files. Unless I specifically go through the uses clauses and add all the units I have sources for to the project, I'll never be sure of storing all the files necessary to recreate the installable/executable.
I've thought about rolling an app to traverse a project, following all the units used, looking down all the search paths to see if there is a source file available for each unit, and building a list of files to back up that way, but hey, maybe someone has already done the work?
You should look into version control; I highly recommend it.
e.g. SVN (Subversion) or CVS.
This will allow you to control revisions of all of your source. It will let you add or remove source files, roll back, merge, and do all the other nice things related to managing project sources.
This WILL save your a$%# one day.
You can interpret your question in two ways:
How can I make sure that I back up at least enough files so I can build the project?
How can I make sure that I don't back up too many files, while still being able to build the project?
The first is about making sure you can build the system at all; the second allows you to clean up unused files.
For both, a version control system including a separate build system is the way to go.
You then - for each new set of changes - can use these steps to assure that both conditions hold:
On your daily development system, check in the new revision of your source code into your version control system.
On your separate build system, get the latest version from your version control system.
Build the project on the build system; if this fails, go back to step 1 and add the missing files to your version control system from your development system.
Start removing files (one by one) from the project that you suspect are not needed, then rebuild until it fails.
When the build fails, restore that particular file from the version control system, then continue step 3 with the next candidate.
When the build succeeds, you have the minimum set of files.
Now make a difference overview of the files in your version control system, and the build machine.
Mark the files that are in your version control system but not on your build machine as deprecated or deleted.
Most version control systems have good ways of generating a difference between the files on your development or build system and the files in the version control system (usually fine-grained, for each point in history where you added/removed/updated files).
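With Subversion, for instance, that overview is a single command run on the build machine (assuming the build tree is a working copy):

    svn status
    # '?' marks files on disk but not in the repository
    # '!' marks files in the repository but missing from disk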
The reason you want a separate build system (or two separate development systems) is that you want them to be independent: you use one for developing, and the other for checking if the build is still OK.
This is also the first step toward extending this, in the future, into a continuous integration system (one that runs unit tests, automatically creates product setups, and much more).
--jeroen
I'm not sure if you're asking about version control or how to be sure you've got all the files.
One useful utility I run occasionally is a program that makes a DirList of all of the files in my dcu output folder. Changing the extensions from .dcu to .pas gives me a list of all of the source code files.
Of course it misses .inc files and other non-.pas files, but perhaps this line of thinking would be helpful to you in some way?
The value of this utility to me is that a second housekeeping utility program then makes a list of all .pas files in my source tree that do not have corresponding .dcu files. This (after a full compile of all programs) generally reveals some "junk" .pas files that are no longer in use.
For getting a list of all units compiled into an executable, you could let the compiler generate a MAP file. This file will contain entries for all the units used.
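With the command-line compiler, a detailed map file can be requested via the -GD switch (the project name here is a placeholder); in the IDE the equivalent setting is under Project Options > Linking > Map file:

    dcc32 -GD MyProject.dpr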

TFS: Labels vs Changesets

I am trying to come up with best practices regarding use of TFS source control. Right now, anytime we do a build, we label the files that are checked into the TFS with the version number. Is this approach better or worse than simply checking the files in and having the version number in the comments?
Can you then use the changeset to go back if necessary, or are labels still more versatile?
Thanks!
They have two different purposes. Changesets are for when the files have actually changed and you wish to keep a permanent record of that change; labels mark a certain version of the files so that you can easily go back to that point. Unless your build actually changes files under source control and you wish to record those changes, you should be labeling.
Also, labeling is much less resource intensive. And you can have multiple labels on the same version of a file.
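Applying a label at a known-good point is a single command (the label name, path, and changeset are hypothetical):

    tf label "Build_2.3.0.118" $/MyProject /recursive /version:C4242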
You should label the versions of source files that make up your build. If you're using TeamBuild, it does that for you automatically: it combines the name of your build definition, the date, and the build number. So you don't need to do anything.
Your other option is not very conventional and requires a lot of unnecessary work. If I understand it correctly, you would check out your source files during the build process and then check them back in with a version number specified in the check-in comments. As Alex mentioned, this is very resource intensive in terms of your build process and also your source control repository. Moreover, how would you get the source files for a particular version if the version information is embedded in the comments? It would be very hard; you would have to sit down and write your own application that uses the TFS source control API to download the source files to a workspace by searching for the version number in the check-in comments. This creates unnecessary complexity and headaches.
If you use labels instead, you can do a Get by Label in the VS IDE to download the source files that make up that label. You can even tell TeamBuild to use a label instead of downloading the latest source files during build automation; that way you can build previous versions of your application easily. With labels, you can also apply later changesets to an existing label if there were code changes, by simply getting that label, then getting the specific changesets, and then doing a quick label or creating a brand-new label.
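For instance, rebuilding an old release can start with a get by label (the label name and path are hypothetical):

    tf get $/MyProject /version:LRelease_2.1 /recursive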
Labeling is very powerful, convenient to use, and is a part of TFS. Rather than coming up with your custom solution that requires a lot of effort to make it work and maintain, just try to use what's already available.
Right now, anytime we do a build, we label the files that are checked into the TFS with the version number
You don't need to do this. TFS can refer to a state of the codebase in numerous ways, of which labels are indeed one - but so are builds and even changesets. You can see the available ways to reconstruct a particular point in time by doing a Get Specific Version... and examining the options in the Type dropdown:
Changeset
Date
Label
Latest Version
Workspace Version
Changeset allows you to get the state just after any changeset; Date is obvious; Label is too, except that builds automatically* create labels (choose Label from this dropdown, then have a look in the Find Label dialog).
*I think it's automatic! Unless it's something we've set up specially where I am at the moment...
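The same options are available from the command line; each entry in that dropdown maps to a versionspec prefix (the paths and values below are illustrative, with the meaning noted in parentheses):

    tf get $/MySite /recursive /version:C1234          (a changeset)
    tf get $/MySite /recursive /version:D2010-06-01    (state as of a date)
    tf get $/MySite /recursive /version:LSprint12      (a label)
    tf get $/MySite /recursive /version:T              (the latest version)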
StackOverflow won't let me comment on the answers above, so I'm writing this as a new "answer". I want to clarify some of the misconceptions listed above.
First, using TFVC labels is MORE resource intensive than using changesets. A lot more. Commands such as Branch, Merge, and Get by Label are slower. For enterprise servers with huge databases you do not want to be using labels.
Second, Builds don't automatically create labels, although the default build steps include a step to create a label.
Third, as others already mentioned, labels can be moved or deleted, so they are much less dependable than changesets which are immutable.
Overall I recommend you NOT use labels. The simplest alternative is to just remember the changeset number for your builds. Or if you want to isolate different release versions, you should create release branches.
Labels are OK for small systems, but are not good for large enterprises.
