I have installed the thinBackup plugin to back up Jenkins, but as I left the exclude option empty it is including all the files I never wanted to back up. I now know that if I use an expression like
^.*\.(log)$
it will not include files with the .log extension. But now the situation is that I want to exclude a whole folder, and I am struggling to get the regular expression for it.
What I have tried is
/jobs or jobs/*
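For illustration, if the exclude pattern is matched against full file paths (I am not sure it is), I would expect a pattern more like the following to be needed:
^.*/jobs/.*$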
Need help!!
Thanks in advance
I don't think you can do that with the thinBackup plugin. However, the Backup Plugin does what you need and much more. Once you've installed it, just go to Jenkins > Backup manager and enable Configuration files (.xml) only. Above that you will see a box named Custom exclusions, where you can specify a comma- or space-separated list of file/directory names to exclude from the backup. See the snapshot below:
You should also check the Backup content section shown above to include the job workspace, if required. Even that has an option to include/exclude files/directories.
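For instance, an exclusion list there might look like the following (these entries are purely illustrative placeholders, not required values):
*.log workspace builds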
I'm using TFS 2012 to automate a build of a solution which contains multiple Windows services and two web applications.
I've used the guide I found here to customize the build process template so that the windows services are put in a folder structure that I like. Specifically:
\dropserver\droproot\MyApp\BuildNumber\
\Service1
\Service2
\Service3
\Service4
This works great, but unfortunately it doesn't work for web applications. If I use the same strategy for those, I just get the contents of /bin for each web app, rather than the full site contents.
MSBuild typically uses the web application targets to handle this, but for some reason this doesn't work when you customize the build as I have. I no longer get the _PublishedWebSites folder in the build output. (I'm guessing that's because I cleared out the OutDir property of the MSBuild task.)
Has anybody done something like this and gotten it to work with web applications as well?
I think I can help with this. It looks like, in the build targets, the published websites folder isn't created if OutDir is the same as OutputPath.
So this isn't perfect, but if you add the following into the csproj file, in the first property group, you'll get everything deployed into "\bin\deploy\", including the _PublishedWebsites folder:
<DeployOnBuild>True</DeployOnBuild>
<OutDir>bin\deploy\</OutDir>
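For context, a minimal sketch of where those properties could sit (the surrounding PropertyGroup content shown here is illustrative, not required):
<PropertyGroup>
  <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
  <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
  <!-- Trigger the publishing pipeline, and write output somewhere other than OutputPath
       so that the _PublishedWebsites folder is generated (see the note above) -->
  <DeployOnBuild>True</DeployOnBuild>
  <OutDir>bin\deploy\</OutDir>
</PropertyGroup>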
With a bit of customization, this solution ended up working for me:
http://www.edsquared.com/2011/01/31/Customizable+Output+Directories+For+TFS+2010+Build.aspx
Basically, I did what that link recommended, but also leveraged a new solution configuration (which I called TeamBuild) rather than conditional property definitions.
I believe the key to making this all work was passing the outputDirectory as the TeamBuildOutDir argument to MSBuild. Embedding this variable reference in the OutDir or OutputPath variable allowed Team Build to build to the correct staging location and then automatically copy files from that location to the drop folder.
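As a rough sketch of the idea (the configuration name, project name and paths below come from my setup and the linked article, so treat them as placeholders), the build workflow passes the staging folder to MSBuild as a property:
/p:TeamBuildOutDir=\\dropserver\droproot\MyApp\<BuildNumber>\
and each project's TeamBuild configuration embeds that property in its output path:
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'TeamBuild|AnyCPU' ">
  <!-- When Team Build supplies TeamBuildOutDir, send output straight to the per-service staging folder -->
  <OutDir>$(TeamBuildOutDir)\Service1\</OutDir>
</PropertyGroup>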
I'm going to take this a little further and get rid of the whole _PublishedWebSites thing, but that will be done entirely in the build workflow.
EDIT: TFS 2013 supports this natively with a simple build configuration option.
Take a look at this thread and this post as well:
Team Build: Publish locally using MSDeploy
Since you need all the files for your web projects, you need to trigger the publishing process, and by tweaking the destination of that process, you can have all of your files copied where you need them.
I think option (2) from his answer will work for you.
I hope that helps.
As far as I can see from your reference link, it will just compile and package the binaries. It does not deploy the website using the steps mentioned there.
If you want to get the .html, .css, .js etc. under the _PublishedWebSites folder, you need to do a web deployment. You can do this manually by clicking the Publish option in the right-click menu of your VS project and selecting File System as the publish method.
But since you need to automate this in your build and drop it in a custom drop folder, you may need to manipulate your MSBuild script by calling an AspNetCompiler task. You can get more information on this at the MSDN link. By specifying the TargetPath when you call this task, you can get your web files deployed to the appropriate custom drop folder.
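For illustration, a minimal sketch of such a task call (the target name, virtual path and drop path are placeholders, not values from your setup):
<Target Name="PrecompileWeb" AfterTargets="Build">
  <!-- Precompile the web application and copy the full site content (views, scripts, static files, bin) to the drop folder -->
  <AspNetCompiler
      VirtualPath="/MyWebApp"
      PhysicalPath="$(MSBuildProjectDirectory)\"
      TargetPath="\\dropserver\droproot\MyApp\MyWebApp\"
      Force="true"
      Debug="false" />
</Target>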
Happy Scripting.
Have you checked this blog? It solved my problem when I wanted a customized Team Build output directory.
Customizable O/P with TFS 2013
Customizable O/P with TFS 2012 and .NET Framework 4.5
Our team is sharing a Jenkins server with other teams, and this currently means that we are sharing the same OS-level build-user account. The different teams' OS-level build-user settings (Maven settings, bash settings, user-level Ant libraries, etc.) have collided a few times; "fixing" the settings for one team's jobs inadvertently "breaks" another team's jobs. The easiest solution that occurs to me is giving each team its own OS-level build-user account with which to execute its Jenkins jobs, but I cannot find a way to do this.
I have checked with Google, and also here
https://wiki.jenkins-ci.org/display/JENKINS/Use+Jenkins
and here
https://wiki.jenkins-ci.org/display/JENKINS/Plugins
to no avail.
Is there a way to do this? If not, can you recommend any best practices for segregating sets of builds from one another?
Maven Specific
Two options come to mind:
1. Add additional installations of Maven in your Jenkins global configuration, each using its own home directory and therefore its own settings files. This will allow you to use totally different versions of Maven, selected based on job requirements (you are given the option to select which "version" of Maven you wish to use on the job itself).
2. Similar to (1), but specify particular settings files using Maven command-line arguments. It's a little less "obvious" but may be quicker to implement; see the sketch below.
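For example (the per-team settings path is hypothetical), a job's Maven goals and options could point at a team-specific settings file:
mvn clean install --settings /var/lib/jenkins/team-a/settings.xml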
Multi-slave
You could possibly make use of multiple slaves on each machine. It increases the overhead of the builds quite significantly, and the implementation is such that you'd have multiple user accounts on a machine, each set up as needed, and then one slave instance for each user.
I'm not sure these solutions will totally answer your problem; I'll have a think and see if anything else comes to mind, but this might give you some starting points.
Key builds to a specific team directory that contains that team's settings. For example, provide a parameter 'TEAM' to every build, set its default value to the appropriate team name, and use that parameter as a key to a directory that contains the team's settings (so instead of using ${HOME} as in what you want to do, you'll use something like ${TEAM_SETTINGS}/${TEAM}).
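A minimal sketch of that, assuming TEAM is the build parameter and /opt/build-settings is a hypothetical per-team settings root exposed as TEAM_SETTINGS:
#!/bin/bash
# Use per-team settings instead of the shared build user's ${HOME} configuration
TEAM_SETTINGS=/opt/build-settings
source "${TEAM_SETTINGS}/${TEAM}/env.sh"
mvn --settings "${TEAM_SETTINGS}/${TEAM}/settings.xml" clean install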
You can set per-job users (who has access to/can build a particular job).
Under "Manage Jenkins" > "Configure System" >
Click on Enable Security
Check Project-based Matrix Authorization Strategy
However, I do not think there is a "per-build" option for a single job.
If you have the same project that you are sharing between teams, you could (and probably should) create two jobs for this project, and have different libraries/scripts be used in each.
You could also parametrize the build (on the job page, "Configure" > This build is parametrized) and supply the library versions, etc. via string parameters.
You could also use a parameter to be the team's name, and in your build script change libraries based on the parameter:
For example, have a parameter called "TEAM", with choices: TEAM_A and TEAM_B, and in your script, have
if [ "$TEAM" = "TEAM_A" ]
then
    # Team A's Ant installation
    ANT_HOME=/opt/ant/libA
else
    # Team B's Ant installation
    ANT_HOME=/opt/ant/libB
fi
export ANT_HOME
======================================================================
Have you considered sourcing your settings? In Linux, you could do this by saving your OS settings (for example paths, etc.) in a script file and running source /path/to/settings/file; in Windows it would be call /path/to/settings/batch/file.
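For instance (the file name and values are illustrative only), the sourced file could simply export the team's environment:
# /path/to/settings/file -- per-team environment, kept in source control
export ANT_HOME=/opt/ant/libA
export MAVEN_OPTS="-Xmx1024m"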
Can you give examples of OS-level settings that you would require a per-build user for?
Your problem is a common one.
Whenever something nonstandard is installed on a build server, something will break for someone.
The only solutions I know are
Set up a separate build slave for each team or product. Then they can install whatever they want on the build slave and any mess they create is all their own fault.
Any dependencies required by a job need to come with the job. This is my preferred way of working. For example: if a job needs a library or a tool, the library or tool is not installed on the build server but kept in the source tree, and the build uses it from there.
Sometimes the latter way is more work. You need to set up the tool or library so that it works when it is installed in the source tree. Some tools have hard-coded paths and will not work that way; in that case you can put the tool's source in the tree and compile the tool during the build.
An even better solution is to set up separate Jenkins jobs for all the tools and libraries and the jobs that need a library or tool will download them from the Jenkins jobs.
This way you can control all your dependencies and different jobs do not conflict when e.g. one needs an older version of a library and one a newer version. And if someone upgrades the library, it is immediately visible in the version control who did what.
Has anyone used WiX to generate an installer for an ASP.NET MVC website? Do you harvest files from the web project? I can't find any good examples of this being done. There doesn't seem to be a documented way to include all the right files, only the right files, and put them in the right place.
If you add the website project as a reference in the installer project, and set harvest=True in the properties, then all the website files are captured, but there are issues:
Some files that should not be copied are included, e.g. packages.config and Web.Debug.config, and there doesn't seem to be any clear or simple way to exclude them (as per this discussion).
The website DLL is in the wrong place, in the root rather than the bin folder (as per this discussion).
However, if you do not use harvesting, you have a lot of files to reference manually (e.g. under \Content\ alone I have 58 files in 5 folders, most of it jQuery UI), they change from time to time, and errors and omissions could easily be missed in a WiX file list. So it really should be kept in sync automatically.
I disagree with the idea that the list of files should be specified explicitly in WiX and not generated dynamically (which is what seems to be suggested at the first link; the wording isn't very clear). If I need to remove a file I will remove it from the source control system; there is no need to do the extra work of maintaining two parallel but different catalogues, one set of files in source control and the same files listed in WiX. There should be one version of the truth. All files in the website's source tree (with certain known exceptions that are not used at runtime, e.g. packages.config) should be included in the deployment.
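For reference, the harvesting described above can also be driven from the command line with heat.exe plus an exclusion transform (a sketch only; the component group, preprocessor variable and transform file names are hypothetical):
heat.exe dir .\_PublishedWebsites\MyWebApp -cg WebAppComponents -dr INSTALLFOLDER -srd -gg -var var.PublishDir -t ExcludeConfigFiles.xslt -out WebAppFiles.wxs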
For corporate reasons I don't have much choice about using WiX for this project
In our MVC 3 project we use Paraffin to harvest files for the installer. For example, you can use "-ext" to ignore files with a given extension, "-regExExclude" to ignore file names matching a regular expression, etc.
Paraffin also keeps the proper structure; all your files end up in the correct folders, as they appear in your project.
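A rough example of such an invocation (a sketch; switch spellings vary between Paraffin versions, and the group name and output file are placeholders):
paraffin.exe -dir .\MyWebApp -GroupName WebAppFiles -ext .pdb -regExExclude packages\.config WebAppFiles.wxs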
I use a program that I wrote called ISWIX that makes authoring wxs merge modules a simple drag and drop operation like InstallShield. I then consume that merge module in an installer that handles the UI and IIS configuration.
I also have postbuild automation that extracts the content of the MSI and compares it against what the project published. If there is a delta I fail the build and you have to either a) add it to the wxs or b) remove it from the publish.
I find that the file count churn from build to build is minimal and that this system is not difficult to maintain. The upside is that everything remains 100% intentionally authored, and files never magically appear in or disappear from the installer unless you intended them to. Dynamic installer generation isn't worth the risk, and most people who argue that it is don't even know what those risks are.
I'm just getting started with the team build functionality and I'm finding the sheer amount of things required to do something pretty simple a bit overwhelming. My setup at the moment is a solution with a web app, an assembly project and a test project. The web app has a PublishProfile set up which publishes via the file system.
I have a TFS build definition set up which currently builds the entire solution nightly and drops it onto a network share as a backup of old builds. All I want to do now is have the PublishProfile I've already set up publish the web app for me. I'm sure this is really simple, but I've been playing with MSBuild commands for a full day now with no luck. Help!
Unfortunately, sharing of the publish profile is not supported or implemented in MSBuild; the logic to publish from the profile is contained in VS itself. Fortunately the profile doesn't contain much information, so there are ways to achieve what you are looking for. Our targets do not specifically support the exact same steps as followed by the publish dialog, but to achieve the same result from team build you have a few choices, which I will outline here.
When you set up your Team Build definition, in order to deploy you need to pass in some values for the MSBuild Arguments of the build process. See the image below where I have highlighted this.
Option 1:
Pass in the following arguments:
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
Let me explain these parameters a bit and show you the result, then explain the next option.
DeployOnBuild=true: This tells the project to execute the target(s) defined in the DeployTarget property.
DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder: This specifies the DeployTarget target.
PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish": This specifies the location where the package files will be written. This is the location where the files are written before they are packaged.
AutoParameterizationWebConfigConnectionStrings=false: This tells the Web Publishing Pipeline (WPP) to not parameterize the connection strings in the web.config file. If you do not specify this then your connection string values will be replaced with placeholders like $(ReplacableToken_dummyConStr-Web.config Connection String_0)
After you do this you can kick off a build; then, inside the PackageTempRootDir location, you will find a PackageTmp folder, and this contains the content that you are looking for.
Option 2:
With the previous option you probably noticed that it creates a folder named PackageTmp; if you do not want that, you can use the following arguments instead.
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;_PackageTempDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
The difference here is that instead of PackageTempRootDir you would pass in _PackageTempDir. The reason why I don't suggest that to begin with is that MSBuild properties starting with _ signify that the property is essentially "internal", in the sense that in a future version it may mean something else or not exist at all. So use it at your own risk.
Option 3:
With all that said, you could just use the build to package your web. If you want to do this then use the following arguments.
/p:DeployOnBuild=true;DeployTarget=Package
When you do this, in the drop folder for your build you will find the _PublishedWebsites folder as you normally would; inside of that there will be a folder {ProjectName}_Package, where {ProjectName} is the name of the project. This folder will contain the package, the .cmd file, the parameters file and a couple of others. You can use these files to deploy your web.
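For example, the generated .cmd can be run like this (a sketch; the project name, server URL and credentials are placeholders; /T does a what-if run and /Y performs the actual deployment):
rem Preview the changes first, then deploy for real to a remote Web Deploy endpoint
MyWebApp.deploy.cmd /T
MyWebApp.deploy.cmd /Y /M:https://myserver:8172/msdeploy.axd /U:deployUser /P:secret /A:Basic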
I hope that wasn't information overload.
The ability to publish web sites, configure IIS and push schema changes for the DEV -> QA -> RELEASE cycle has required either custom configuration to imitate publishing or custom code where IIS settings are involved.
As of Visual Studio 2013.2, Microsoft has added a third-party product that manages deployment of web sites, configuration changes and database deployment with Windows Workflow, and it would be the recommended solution for automating deployment from a TFS build.
More information can be found here:
http://www.visualstudio.com/en-us/explore/release-management-vs.aspx
You can use Publish/Deploy in Visual Studio 2010.
See http://www.ewaldhofman.nl/post/2010/04/12/Auto-deployment-of-my-web-application-with-Team-Build-2010-to-add-Interactive-Testing.aspx for more information