Retain changes to existing files when publishing - tfs

I'm stuck in a scenario where my published file is initially correct, but is later replaced by its original version. I suspect the order of events is wrong, coupled with what's in the root directory.
Essentially, I have successfully set up my publishing environment so that it executes a custom command which creates some JavaScript (generated outside of my project). Because a file in source control needs to reference this newly created JavaScript, I simply copy that file (MyControl.ascx) to a temp location (so it loses the TFS read lock) and have the custom command update its references to the JS. Once this is done, I gather all the files (the custom JavaScript, plus the edited MyControl.ascx in its temp location) and publish.
It publishes everything first, so I see the new JS as well as the updated MyControl.ascx, but a few minutes later the publish finishes and the control looks the way it did in the root directory.
I think what's happening is that it pushes out what's in the root (which includes the original MyControl.ascx) on top of my custom control (which is in another directory).
<PropertyGroup>
  <PipelineCollectFilesPhaseDependsOn>
    CustomCollectFiles;
    $(PipelineCollectFilesPhaseDependsOn);
  </PipelineCollectFilesPhaseDependsOn>
</PropertyGroup>
<Target Name="CustomCollectFiles">
  <Exec Command="MyCommand.bat" />
  <ItemGroup>
    <_BundledJS Include="$(MSBuildThisFileDirectory)..\..\Includes\javascript\*.js" />
    <FilesForPackagingFromProject Include="%(_BundledJS.Identity)">
      <DestinationRelativePath>Includes\javascript\%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
  <ItemGroup>
    <_UpdatedControl Include="$(MSBuildThisFileDirectory)..\..\TempArea\MyControl.ascx" />
    <FilesForPackagingFromProject Include="%(_UpdatedControl.Identity)">
      <DestinationRelativePath>Controls\%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>
So you can see above I'm taking something that's in ..\..\TempArea\MyControl.ascx and pushing it out to Controls\%(Filename)%(Extension).
Any ideas how I can tell it to retain my ..\..\TempArea\MyControl.ascx without then overwriting it with the original MyControl.ascx from the project (..\..\Controls\MyControl.ascx)?
Thanks so much!

So I actually ended up figuring it out, and wanted to share:
<ItemGroup>
  <_UpdatedControl Include="$(MSBuildThisFileDirectory)..\..\TempArea\MyControl.ascx" />
  <FilesForPackagingFromProject Remove="Controls\MyControl.ascx" />
  <FilesForPackagingFromProject Include="%(_UpdatedControl.Identity)">
    <DestinationRelativePath>Controls\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
  </FilesForPackagingFromProject>
</ItemGroup>
So you can see here it actually removes the original control from the collected files, then adds the updated copy in its place.
I also changed how the target is hooked in:
<PropertyGroup>
  <OnAfterPipelineCollectFilesPhase>
    CustomCollectFiles;
    $(OnAfterPipelineCollectFilesPhase);
  </OnAfterPipelineCollectFilesPhase>
</PropertyGroup>
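Putting it together, the complete CustomCollectFiles target then ends up looking roughly like this (a sketch assembled from the snippets above; MyCommand.bat and the paths are the ones from the question):
<Target Name="CustomCollectFiles">
  <!-- Generates the JavaScript and updates the copy of MyControl.ascx sitting in TempArea -->
  <Exec Command="MyCommand.bat" />
  <ItemGroup>
    <_BundledJS Include="$(MSBuildThisFileDirectory)..\..\Includes\javascript\*.js" />
    <FilesForPackagingFromProject Include="%(_BundledJS.Identity)">
      <DestinationRelativePath>Includes\javascript\%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
  <ItemGroup>
    <_UpdatedControl Include="$(MSBuildThisFileDirectory)..\..\TempArea\MyControl.ascx" />
    <!-- Drop the project's copy first so it cannot overwrite the edited one -->
    <FilesForPackagingFromProject Remove="Controls\MyControl.ascx" />
    <FilesForPackagingFromProject Include="%(_UpdatedControl.Identity)">
      <DestinationRelativePath>Controls\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>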
Let me know if you have any questions; I'd be happy to help!

Related

Publish an MVC application with a web.config for each production environment

I have an MVC application with a Dev, Staging, and Production environment. Dev and Staging are essentially the same thing (same VM, IIS, DB, etc.); however, Production is hosted on 4 VMs behind a load balancer. Each VM has its own DB. For example, the instance deployed to VM1 communicates with the PROD1 DB, VM2->PROD2, etc.
For deployment to Dev and Staging, I do a simple File System deployment from VS2013 to the VM using Debug/Release web.config transforms. For Production deployments, a SysAdmin will copy the bits deployed and tested in Staging to each Production VM. This is to ensure that what was tested and verified by QA in Staging is what we promote to Production -- I don't want to do another build between Staging and Production. Because of this, our SysAdmin is responsible for (with DevOps guidance) editing each web.config between Staging and Production. This basically consists of changing connectionString values from "Data Source=STAGINGDB" to "Data Source=PROD1" (and PROD2, PROD3, PROD4).
What I ultimately want is when I publish to Staging, I want to deploy my web.config using standard Release web.config transform; however, alongside this file I want to also create and drop 4 additional files (web.config.PROD1, .PROD2, etc.). This will allow us to create scripts which ignore the existing web.config (with Staging settings) and copy/rename the appropriate .PROD config.
I am able to (sort of) achieve this with MSBuild:
<Project ToolsVersion="12.0" DefaultTargets="Build">
  ...
  <Target Name="Build">
    <TransformXml Source="Web.config" Transform="Web.PROD1.config" Destination="Web.config.PROD1" />
    <TransformXml Source="Web.config" Transform="Web.PROD2.config" Destination="Web.config.PROD2" />
    ...
  </Target>
</Project>
My main issue with this approach is that I have to create 4 essentially redundant solution configurations to wire up to the Transform. Every setting is the same except the DB connectionString. Seems like there should be a more efficient way.
Can I execute individual transforms without solution configurations by simply calling the appropriate transform via MSBuild, like:
<add name="connectionString" connectionString="PROD1" xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
Should I be using another process altogether? I'd rather not use a third-party NuGet solution if I can stay away from it. Should I be using a .wpp.targets file? XmlPoke?
My desired workflow
Right-click my MVC app and choose "Publish" (File System)
Let the Release transforms do their thing and generate the web.config. I have basic configurations. Debug = Dev, Release = Staging.
Add a custom step that generates 4 additional web.config files
Package everything up and publish to the Staging server, so that on the VM I see the transformed web.config plus the four additional .PROD config files
Everything I've read leads me to believe that I should be writing custom MSBuild steps, but I don't know what I should be doing (or how). Here's some pseudo-code:
<Project ToolsVersion="12.0" DefaultTargets="Build">
  ...
  <Target Name="Build">
    <TransformXml Source="Web.config" Transform="[Do-Basic-Transform-On-Connection-String]" Destination="Web.config.PROD1" />
    <TransformXml Source="Web.config" Transform="[Do-Basic-Transform-On-Connection-String]" Destination="Web.config.PROD2" />
    <IncludeFilesInPublish>
      <FileToInclude>Web.config.PROD1</FileToInclude>
      <FileToInclude>Web.config.PROD2</FileToInclude>
    </IncludeFilesInPublish>
  </Target>
</Project>
Can I [Do-Basic-Transform-On-Connection-String] inline here without a solution configuration? I'll only be changing 2 connectionString values. If I need to create a solution config, that's fine... I just don't think it's totally necessary especially if I can do it inline. Maybe I'm wrong?
How do I accomplish the <IncludeFilesInPublish> bit so that whatever I generate gets packaged up during the Publish, and my Staging deployment has my release-candidate code and web.configs ready for promotion?
I think your question is twofold: 1) how do I pass environment variables or parameters (e.g. PROD1) into my xdt transformation file so I only have to use one transformation file? and 2) how do I get MSBuild to iterate over a set of known, named items to produce outputs distinguished by each item in this set?
For the first part, the only reason I think you might be asking this is that you said "I have to create 4 essentially redundant solution configurations to wire up to the Transform", so if your transform took "PROD1" as a parameter you could ideally use just one transform. But I'm not sure you can do this without creating your own TransformXml task. The xdt transformation tooling is really limited. The nice thing about MSBuild, though, is that it's flexible enough that you could theoretically come up with your own transformation task that extends/subclasses or behaves like the one out of the box.
using System;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;
using Microsoft.Web.Publishing.Tasks;

namespace Thanks.IllWriteMyOwnTasks
{
    public class MyCustomTransformXml : TransformXml // no idea if you can do this
    {
        public override bool Execute()
        {
            // do stuff here,
            // maybe declare parameters that you can pass down to base.Execute()
            return true;
        }
    }
}
..
<!--<UsingTask TaskName="TransformXml" AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v11.0\Web\Microsoft.Web.Publishing.Tasks.dll" />-->
<UsingTask TaskName="MyCustomTransformXml" AssemblyFile="Thanks.IllWriteMyOwnTasks.dll" />
For the second part, I think you can use an ItemGroup.
<ItemGroup>
  <MyEnvironments Include="PROD1" />
  <MyEnvironments Include="PROD2" />
  <MyEnvironments Include="PROD3" />
  <MyEnvironments Include="PROD4" />
</ItemGroup>

<Target Name="BeforeBuild">
  <TransformXml Source="Web.config"
                Transform="Web.%(MyEnvironments.Identity).config"
                Destination="Web.config.%(MyEnvironments.Identity)" />
</Target>
I haven't tested this, but I think based on what I see over here it will automatically repeat the same task as it iterates over MyEnvironments in this example.
You can add the extra transform files to your solution without adding a new solution configuration, and run the transform as in your own example (except for the target name; 'Build' didn't work for me):
<Project ToolsVersion="12.0" DefaultTargets="Build">
  ...
  <Target Name="BeforeBuild">
    <TransformXml Source="Web.config" Transform="Web.PROD1.config" Destination="Web.config.PROD1" />
    <TransformXml Source="Web.config" Transform="Web.PROD2.config" Destination="Web.config.PROD2" />
    ...
  </Target>
</Project>
If you are only changing a connectionString, your transform file will be pretty minimal:
<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="MyDB"
         connectionString="Data Source=PROD1SQLServer;Initial Catalog=MyReleaseDB;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
  </connectionStrings>
</configuration>
After adding the above and compiling, you will need to add the newly generated Web.config.PRODX files to your solution. After adding them, simply open the properties for each file and ensure that its Build Action is set to 'Content'. This means they will be included in your deployments.
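In the .csproj this corresponds roughly to an item group like the following (a sketch; the file names are the ones generated in the example above):
<ItemGroup>
  <Content Include="Web.config.PROD1" />
  <Content Include="Web.config.PROD2" />
  <Content Include="Web.config.PROD3" />
  <Content Include="Web.config.PROD4" />
</ItemGroup>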
As the web.PRODX.config transform files are not part of a solution configuration, you could stick them in a folder to reduce clutter.

Ant tasks don't run IN IzPack installer

Hello, everyone.
I'm studying IzPack as a tool to be used in a future project and I'm really enjoying it. It's as flexible as I need and makes the process much easier. I have even submitted a small pull request on GitHub with a modification I needed for my purposes. Who knows?
Although I don't find it particularly complicated, I've been stuck for some days trying to use one resource. I need certain Ant tasks to be executed at certain points of the installation process (the one that really matters is right before everything is unpacked), and that is not working, despite all the effort. :(
My current setup, which seems right judging by the examples, is the following.
(My current use of this is based on an example I found here; the docs aren't very clear when it comes to this kind of action.)
In my definitions XML file, I included some things.
First, the AntActionsSpec.xml resource and the .jars, followed by the listeners:
<resources>
  ...
  <res id="AntActionsSpec.xml" src="specs/AntActionsSpec.xml" />
  ...
</resources>

<jar src="libs/ant/ant.jar" stage="both" />
<jar src="libs/ant/ant-launcher.jar" stage="both" />

<listeners>
  <listener classname="AntActionInstallerListener" stage="install" />
  <listener classname="AntActionUninstallerListener" stage="uninstall" />
</listeners>

<pack name="test_app" required="yes" installGroups="Application Core">
...
In the specs/AntActionsSpec.xml file, I have the following:
<pack name="test_app">
  <antcall order="beforepacks" quiet="no" verbose="yes" buildfile="$INSTALL_PATH/ant-tasks.xml">
    <property name="INSTALL_PATH" value="$INSTALL_PATH" />
    <target name="touch_beforepacks" />
  </antcall>
</pack>
And the ant-tasks.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project>
  <target name="touch_beforepacks">
    <touch file="$INSTALL_PATH/beforepacks.txt"/>
  </target>
</project>
Nothing special here, just creating a dumb file.
The ant-tasks.xml file is unpacked before anything else. Everything builds with no errors, even if I deliberately introduce a mistake in AntActionsSpec.xml or ant-tasks.xml, which suggests to me that they aren't even being loaded; however, if I break the path to them in the definitions file, the build does fail.
I would like some help addressing that. I'm probably making some stupid little error and just can't see it myself. If any of you could provide an example of a working build, that would be sweet.
If I can give any more information, please let me know so I can update the question.
Thank you very much.
Just found the answer in a Google Groups discussion: [izpack-user] Quick question on variable substitution.
Unfortunately, I have to conclude that the docs are misleading. The docs for "AntActionInstallerListener and AntActionUninstallerListener" still state, as of this date, that I should use this listener configuration:
<listeners>
  <listener classname="AntActionInstallerListener" stage="install" />
  <listener classname="AntActionUninstallerListener" stage="uninstall" />
</listeners>
That is what I have above, in the question. Comparing my XML with the one in the Google Groups discussion, I found a different usage:
<listeners>
<listener installer="AntActionInstallerListener"
uninstaller="AntActionUninstallerListener" />
</listeners>
In fact, that is the instruction given in the other wiki, Ant Actions (InstallerListener and UninstallerListener), which suggests that something may be wrong under the hood, but that is a story for another episode.
That just works. The Ant tasks are now executed properly. :)
I just could not find where Codehaus will let me register a login and edit the docs wiki. >:( If someone could back me up with some testing and then adjust the wiki for future happiness, or just send a link to this tired programmer, I'd be glad.

JaCoCo report looks correct but cannot view source

I am new to JaCoCo and trying to figure out why the HTML report I am generating is not linked to my source.
The coverage numbers look correct and I can browse down to each class and then each method, but I cannot see the source. I have tried many different things inside the sourcefiles tag, but nothing is working. Has anyone else had this issue? Here is a snippet of my Ant script:
...
      <test name="test.fw.UITestSuite" todir="${logdir}"/>
    </junit>
  </jacoco:coverage>
  <fail if="TestFailed" status="1" message="UI junit test failure detected"/>
  <echo message="${src}"/>
  <jacoco:report>
    <executiondata>
      <file file="jacoco.exec"/>
    </executiondata>
    <structure name="UI">
      <classfiles>
        <fileset dir="${build}/fw"/>
      </classfiles>
      <sourcefiles encoding="UTF-8">
        <fileset dir="fw" includes="**./*.java"/>
      </sourcefiles>
    </structure>
    <html destdir="report"/>
  </jacoco:report>
</target>
...
Your fileset definition seems odd.
The include must be (the first . is misplaced):
includes="**/*.java"
Or try simply pointing it to the root of your src dir (there is no need for the includes):
<fileset dir="fw" />
But fw has to be the root of your sources, i.e. it must contain the package folders, like:
fw
  - org
    - module
      - MyClass1.java
      - MyClass2.java
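With that layout, the sourcefiles section from the question's report task would become something like this (a sketch, assuming fw is the source root as described):
<sourcefiles encoding="UTF-8">
  <fileset dir="fw" />
</sourcefiles>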
I’ve seen this break when using Scala-style package directory names, e.g.,
src/main/java/com.example.foo.bar/Foo.java
for fewer levels of nesting, faster autocompletion, &c., compared to the standard
src/main/java/com/example/foo/bar/Foo.java
Most tools support the first version just fine, but if you try it out and everything seems to work, then by the time you notice something like the JaCoCo report no longer showing the source, you have long forgotten the directory-name change.

How to strip one folder during Ant copy

I have a file which has file paths like "LibraryX/A/Stuff/FileY.txt", which I'm using as an includesfile in my Ant build. However, I need to remove the "LibraryX/A/" part of the path during the copy process: the file gets copied from "LibraryX/A/Stuff/FileY.txt" and should land in "Stuff/FileY.txt". I've looked into a few regexp mappers but haven't had any success with them at all. :/
The purpose of this is that the target folder can have custom files in "Stuff/MoreStuff" that must not be overwritten, and I want to use overwrite="false" to keep disk access to a minimum and keep the custom files intact.
Ant:
<copy todir="C:/targetdir/" overwrite="false">
  <fileset dir="C:/sourcedir/">
    <includesfile name="C:/targetdir/includes.file" />
  </fileset>
</copy>
Includes.file:
LibraryX/A/Stuff/FileA.txt
LibraryX/A/Stuff/FileB.txt
LibraryX/A/Stuff/FileC.txt
LibraryX/A/Stuff/FileY.txt
Sourcedir:
sourcedir/LibraryX/A/Stuff/FileA.txt
sourcedir/LibraryX/A/Stuff/FileB.txt
sourcedir/LibraryX/A/Stuff/FileC.txt
sourcedir/LibraryX/A/Stuff/FileY.txt
Target dir:
targetdir/Stuff/FileY.txt
Now, all the files in the Stuff folder under sourcedir should end up in the Stuff folder under targetdir. But how?
Bonus: if I move the files from "targetdir/LibraryX/A/Stuff" myself, they overwrite everything in the "targetdir/Stuff" folder, even with overwrite="false", presumably because they are newer than the files currently in the Stuff folder.
Note: I could, of course, move the custom files away from the target directory, copy the stuff over and then move the custom files back, overwriting the new ones. But this accesses the disk quite a lot, slowing down the process.
Starting with Ant v1.8.2 you can use the cutdirsmapper to strip some number of leading directories from file paths. See the very bottom of the mapper type docs.
<copy todir="C:/targetdir/" overwrite="false">
  <fileset dir="C:/sourcedir/">
    <includesfile name="C:/targetdir/includes.file" />
  </fileset>
  <cutdirsmapper dirs="2"/>
</copy>
Bonus: You could use the touch ant task to make all the files in targetdir newer than all the source files and therefore prevent them from being overwritten.
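For example, something along these lines, run before the copy (a sketch; it simply refreshes the timestamps of everything already in targetdir):
<touch>
  <fileset dir="C:/targetdir/" />
</touch>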

Getting error message: "unknown resolver XYZ"

While resolving my ivy.xml, I get a long list of errors, all stating "unknown resolver XYZ". I know the resolver; it is used in the same project, but in a different task.
As far as I understand, the resolver used to create the cache entry is stored and then cannot be found by the follow-up resolver.
The question is: how can I avoid this? It seems like this is not really an error, more like a warning, since I am able to resolve all dependencies and continue compiling.
Within the same project, the build resolver will not change because it's defined in your ivysettings.xml file.
This is more likely to be a problem with a stale ivy cache. I'd suggest adding an extra target that purges your cache. Useful when encountering this type of problem:
<target name="clean-all" depends="clean" description="Purge ivy cache">
  <ivy:cleancache/>
</target>
Run your ant build with the verbose flag (-v). This will give you clear insight into which settings files are being used throughout the resolve process. My wager is you will find your problem fairly easily and it will be along the lines of the settings file you thought you were using is actually not being used.
In my projects, I find this type of thing often happens when a post-resolve task (such as retrieve) triggers a resolve "automatically" and uses the default ivy settings instead of the one I want it to use at the moment. Chances are, your default settings file does not contain the resolvers you're expecting.
To solve these issues, I make an ivysettings-common.xml containing only resolvers. Then, in each of my settings files, I import the common settings and reference the resolvers in the main chain. That looks like:
<ivysettings>
  <settings defaultResolver="all-repositories" />
  <include file="ivysettings-common.xml" />
  <resolvers>
    <chain name="all-repositories" returnFirst="true" >
      <resolver ref="project" />
      <resolver ref="local" />
      <resolver ref="hibernate" />
      <resolver ref="ibibilo" />
    </chain>
  </resolvers>
</ivysettings>
From there, I make the common file my default settings, so that, just "in case of emergency", I know all my resolvers can be found (by adding the following to ivy.properties):
ivy.settings.file = ${basedir}/path/to/ivysettings-common.xml
But I explicitly point all my Ivy calls to the appropriate settings file and try never to rely on the default, because the whole reason I use Ivy + Ant is that I prefer precise control over my build process.
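For example, the explicit wiring can look something like this (a sketch; the settings id and paths are illustrative):
<ivy:settings id="project.settings" file="${basedir}/path/to/ivysettings-project.xml" />
<ivy:resolve file="ivy.xml" settingsRef="project.settings" />
<ivy:retrieve settingsRef="project.settings" pattern="lib/[conf]/[artifact]-[revision].[ext]" />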
I hope all that helps you or someone else.
~gMale
