Can I run TFS Build Controller and Unit Tests on one machine and TFS Build Agents on another machine? - tfs

I have successfully set up two machines. The first machine serves as the TFS Build Controller and the other as a TFS Build Agent.
There are two things I want to do.
First, run SQL scripts on the TFS Build Controller machine. I have all the SQL script files in TFS, but I have no idea how to get them to run.
Second, upload the output (DLL and EXE files) from the TFS Build Agent machine back to the TFS Build Controller machine and then run the tests there. (I want to run the tests only after all the SQL scripts have run.)
Please let me know if this is possible or not. You can just give me a link, since I know it might take a long explanation, but I would appreciate it if you could write down the answer. :)

Yes, that is all possible, but I would strongly suggest not using the Build Controller for that. Build Controllers are meant to just dole builds out to the Agents, which do all the work. Build Controllers are also typically shared between Team Projects, so if a single Team Project abuses the Controller it can affect the others, whereas Build Agents are usually dedicated to a Team Project. You should be able to run your SQL scripts and tests on the Build Agent; that is what most people do.
Having said that, if you're still set on doing it, you can modify the build workflow template to accomplish it. Anything inside the Run On Agent activity runs on the Build Agent; anything outside of it runs on the Build Controller. See Ewald Hofman's blog series on customizing TFS Build if you've never done it before: http://www.ewaldhofman.nl/post/2010/04/20/Customize-Team-Build-2010-e28093-Part-1-Introduction.aspx
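If you do end up running the SQL scripts on the agent, a minimal sketch of what that step could look like (assuming sqlcmd.exe is available on the agent; the server name and folder layout are placeholders, and the script would be launched from an InvokeProcess activity in the template):

    # Run every checked-in SQL script, in name order, against a target server
    param($SourcesDirectory, $SqlServer = "MYSQLSERVER")   # supplied by the build; names are placeholders
    Get-ChildItem "$SourcesDirectory\SqlScripts" -Filter *.sql | Sort-Object Name | ForEach-Object {
        # -b makes sqlcmd return a non-zero exit code on SQL errors, so the build can fail fast
        & sqlcmd -S $SqlServer -E -b -i $_.FullName
        if ($LASTEXITCODE -ne 0) { throw "SQL script failed: $($_.Name)" }
    }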

Related

Jenkins tests with Ranorex

I'm just getting started with Jenkins and I have a few questions that are probably silly, but I'm stuck on them.
After I build my project, does Jenkins save the build output to some specific path?
Using Ranorex for test automation, is it better to keep my files locally on the server or push them to a repository?
Note: I have only just started trying this; at the moment I can check for changes on Bitbucket, build the project, build the Ranorex tests and run them.
Jenkins is quite a versatile application that lets you set the system up to the specific needs and requirements of the test project, so I'd say go with the way that seems most logical/easiest. It's a learning process as well, so you will come to understand the workflow of Jenkins itself.
But to answer your 2 questions:
1) By build files I believe you mean the test reports? For these I actually use the Jenkins UserContent folder. This requires the "Copy to slave" plugin to be installed; with it you get an additional post-build action where you can specify the files to copy over to the UserContent folder. Don't forget to specify a common naming layout for the report files through the Ranorex run parameters ("/rf"). The UserContent folder is served over HTTP, so you can link report URLs directly in emails (e.g. "http://Jenkins-server.com/UserContent/Regression-Client-Test-#1.html").
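For example, the report name can be fixed in the Jenkins build step that launches the suite, so the post-build copy step always finds it (the suite executable and paths here are hypothetical; BUILD_NUMBER is a standard Jenkins environment variable):

    # Jenkins build step: run the Ranorex suite with a predictable report file name
    & ".\bin\Debug\RegressionClientTests.exe" /rf:"Reports\Regression-Client-Test-$Env:BUILD_NUMBER.html"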
2) This totally depends on the system setup, but I can give you an example of how our system is currently set up. We have Jenkins running on a Linux machine; it is only used to manage and run the tests, and the machine itself does not hold the automation test project. Then we have the test machine, which runs Windows and holds the actual automation tests; it is connected to Jenkins through the slave functionality. So basically, when someone starts a test job, Jenkins on the Linux machine sends a command to the slave to start the automated tests. When the test run has finished, post-build actions take over and copy the report files from the slave machine to the Linux machine's UserContent folder.
As for managing the test project itself: it's a good idea to use a git repository, which adds another layer of a kind of "security". But if you have a small team (or you are the only test developer), there is no real need for it; you just copy the project to a specified folder on the test machine whenever it is needed/updated and you are ready to run it.
Regards,
Martin

How to organize the execution of each test on a separate virtual machine?

I would like to ask how to organize complex integration tests where you need to start a virtual machine to run each test.
First, let me describe how our tests are organized now and what I do not like about it.
The process is organized using TFS Process Templates and Custom Actions.
1. The project is built on the build agent.
2. A custom action (using the VMware API) starts the virtual machine.
3. The assembled project DLLs are copied to the virtual machine.
4. mstest.exe is run on the virtual machine with the necessary parameters.
5. The test results are copied back to the build agent and analyzed manually in the custom action.
So we repeat steps 2-5 for each test; a sketch of what that orchestration looks like follows.
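In our custom action, steps 2-5 boil down to something like this per test (a sketch only; VM paths, snapshot and machine names are placeholders, vmrun is the VMware command-line tool we drive, and PowerShell remoting is assumed to be enabled on the VM):

    # Revert + start the VM, copy the dlls, run mstest remotely, fetch the results
    $buildDrop = "C:\Drops\CurrentBuild"                 # placeholder
    $vmx = "C:\VMs\TestVM\TestVM.vmx"                    # placeholder
    & vmrun -T ws revertToSnapshot $vmx "CleanState"     # known-good snapshot
    & vmrun -T ws start $vmx
    Copy-Item "$buildDrop\*.dll" "\\TESTVM\c$\Tests\"
    Invoke-Command -ComputerName TESTVM -ScriptBlock {
        & "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\mstest.exe" `
            /testcontainer:C:\Tests\IntegrationTests.dll /resultsfile:C:\Tests\results.trx
    }
    Copy-Item "\\TESTVM\c$\Tests\results.trx" $buildDrop   # analyzed manually afterwards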
The disadvantage of this approach is its complexity. We have to analyze the test results manually, and if the format of those files ever changes we will have to rewrite a lot of code.
It would be much better if the TFS build agent itself were a virtual machine and TFS reverted it automatically before running each test.
How do you organize this kind of test? In this case we are specifically talking about Coded UI, but similar problems come up frequently. Maybe my question is too general, but any ideas would be very helpful...
If you were to move to Lab Management and Release Management, much of what you are trying to do comes out of the box with Visual Studio ALM. You can create and build out environments dynamically, either in Hyper-V or in Azure.
I would recommend using Release Management and PowerShell to orchestrate this. The RM team has a good demo:
http://blogs.msdn.com/b/visualstudioalm/archive/2014/11/11/deploying-and-testing-web-applications-using-release-management.aspx
There they both deploy the application and then run Coded UI tests.

Set up Team Foundation Server Build service to do automatic builds and testing

Our plan is to use the Team Foundation Build service to do automatic builds, then use the testing facility to run tests automatically on the build server, and then release that build onto the application server.
So far we have:
Team Foundation Server with the TF Build Controller configured
Build server with Windows Server 2012, Visual Studio 2013 and a Build Agent configured
SQL Server with SQL 2013 installed
Application server with Windows Server 2012 and the .NET Framework installed
My question is: what do I need to do to set up automatic builds and to execute the unit test harness once compilation is successful?
Also, the deployment target machine will initially be DEV; however, we would like to be able to quickly build for the test and prod environments etc. as well.
This is what I have got so far:
Build Controller (already set up, I believe)
Build Agent (already installed on the build server)
Build Process Template (do I need to do anything with this? Is this what controls the whole lot?)
Team Build Definition (I had a look at this, and it seems to use the build process template)
Drop Folder (I am assuming this is where the executables will be dropped)
At the moment I have bits and pieces of information; what I would like to know is how the whole thing hangs together, from the moment the developer kicks off a build to the moment the exe is placed on DEVAPPSERV (the development application server).
Is anyone able to point me in the right direction or give a summary of what I need to make this happen?
Many thanks,
Dalibor
1. Install TFS Server (TFS disk). Create a Team Project Collection and any desired Team Projects.
2. Install the TFS Controller + Agents onto a dedicated machine (TFS disk). Configure only the build options if it is on a different machine to the TFS Server.
3. Configure the Build Controller to connect to a specific Team Project Collection on your TFS Server.
4. Install VS Premium or higher on the build machine if you want code coverage results for your tests.
5. Add some code to TFS Source Control.
6. Create a Build Definition using the default template.
7. Configure the build definition:
   - Set the working folder for the build; include only what you need, as this will speed up the process.
   - Point the definition at your .sln or project file.
   - Ensure testing is enabled and that your test assembly names match the regex used to identify test DLLs, i.e. name your test assemblies with the word "test".
   - Set the trigger to CI or whatever flavour of build you require, e.g. a gated build.
8. Save the build definition.
9. Trigger a manual build and debug any issues.
After that you should have the basics done and a repeatable build created.
That should cover the basics. You may want to customise the build template (see Ewald Hofman's guide for tips), and you may want to narrow down your code coverage (look for .runsettings file info).
If you follow these steps you should be able to get a basic build created and running; if you hit any issues, come back and ask specific questions about a particular area.
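That manual build does not have to be queued from Visual Studio; on any machine with Team Explorer installed you can also queue it from the command line (the collection URL and definition name below are placeholders, and the exact syntax varies slightly between TFS versions):

    # Queue a build from the command line with TFSBuild.exe (ships with Team Explorer)
    & TFSBuild start /collection:http://tfsserver:8080/tfs/DefaultCollection /buildDefinition:"\MyTeamProject\My Build Definition"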
In order to do automatic builds you should check the CI build option (under the trigger option), and third-party automated testing can be executed by a post-build script.
See the following TFS article about post-build scripts:
http://msdn.microsoft.com/en-us/library/dn376353.aspx
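As a sketch, a post-build script checked into source control and referenced from the build definition could hand the compiled output to a third-party runner using the TF_BUILD_* environment variables described in that article (the runner and its arguments are placeholders):

    # Post-build script: run a third-party test tool over the compiled binaries.
    # TF_BUILD_BINARIESDIRECTORY and TF_BUILD_DROPLOCATION are set by TFS 2013 builds.
    & "C:\Tools\SomeTestRunner.exe" "$Env:TF_BUILD_BINARIESDIRECTORY\MyApp.Tests.dll" "$Env:TF_BUILD_DROPLOCATION\TestResults"
    if ($LASTEXITCODE -ne 0) { exit $LASTEXITCODE }   # propagate failure so the build fails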

Using TFS build definitions on a local machine

I have created a lightly customized TFS build process template and an appropriate TFS build definition. It builds fine on the TFS build server.
Is there any way I can let developers reuse the same build process XAML and definition to do full builds on their local machines? Maybe there is some utility that can run TFS build process XAML files?
I really would like to avoid maintaining a separate copy of the build script for full local rebuilds.
The build templates can only be run by the TFS Build service, and installing that on each developer's machine is probably not the best idea.
An alternative is to set up a share on each developer's machine and grant access to the TFS build account (the account that TFSBuildServiceHost.exe runs as on the server). The developer can then queue a build and have the server drop the files onto their machine.
The downside to this is that a lot of builds end up being run on the Agents.
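Creating that share and granting the build account access could look like this on the developer's machine (a sketch; the account name and path are placeholders, and New-SmbShare requires Windows 8/Server 2012 or later):

    # Share a local folder so the TFS build service account can write drops into it
    New-Item -ItemType Directory -Path C:\BuildDrops -Force | Out-Null
    New-SmbShare -Name BuildDrops -Path C:\BuildDrops -FullAccess "DOMAIN\tfsbuild"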

TFS Build - Powershell or custom activity?

I understand I can write my own custom activity (in C#) to execute custom logic during the build process. My understanding is that PowerShell can also be used, but I am not sure where it fits in. I understand PowerShell is used for executing command-line commands, but how and where would I use it to customize the build process?
Thanks
Whether to use PowerShell or a custom activity is, for me, based on who is responsible. If an activity is created by the build master (for TFS) and is therefore reusable by all the teams in the organization, I create a custom activity.
If the project team is responsible (for example, for a deployment script), then I use PowerShell. I create an argument where the team can enter the path of the PowerShell script that should be executed to deploy. The project team can optionally enter a value in that argument, and they can maintain their PowerShell deployment script themselves without the help of the build master.
So in short:
A reusable activity: custom activity
An activity for the team only: PowerShell
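A sketch of the team-only pattern: the template exposes a (hypothetical) DeploymentScriptPath argument, and the workflow only invokes the script when the team has filled the argument in:

    # Inside the template, the PowerShell step runs roughly this
    param($DeploymentScriptPath)   # optional argument set in the build definition
    if (-not [string]::IsNullOrWhiteSpace($DeploymentScriptPath)) {
        & $DeploymentScriptPath    # the project team maintains this script themselves
    }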
For me, PowerShell is the way to go. Here are my reasons why:
Script independence:
You get script independence with this approach. For example, I have a number of scripts that run after the build (i.e. the compile) process has completed:
Instantiate Database
Deploy Database Code
Deploy Web Applications
Verify Deployment
Run Acceptance Tests
All of the above can be launched, debugged and tested independently without the need to queue a new build.
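Because each step is a stand-alone script, the build simply chains them, and a dev can run any single one from a console (the script names here are illustrative):

    # Post-build driver: run each step script in order, stop at the first failure
    $steps = "InstantiateDatabase.ps1", "DeployDatabaseCode.ps1",
             "DeployWebApplications.ps1", "VerifyDeployment.ps1", "RunAcceptanceTests.ps1"
    foreach ($step in $steps) {
        & ".\$step"
        if ($LASTEXITCODE -ne 0) { throw "Step failed: $step" }
    }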
PowerShell is easy to work with:
Custom assemblies tend to add a lot of complexity and flakiness to the solution. For example, upgrading from TFS 2010 to TFS 2012 was very painful, because all of the build templates broke. We had to recompile all of our custom assemblies, and only one dev on the team knew how TFS Build was set up to run our custom activities. I have since removed all custom assemblies from our build templates and now use PowerShell exclusively.
I have customised my process templates to call user-defined PowerShell scripts after the TFS build has completed. I do this using a paths argument in the build definition, which is simply an array of strings pointing to the scripts. I agree with Ewald, above, that TFS does not pass the build arguments to the scripts. To solve this, my workflow template parses each script path in the string array and replaces well-known tokens with the build arguments, e.g. #(BuildNumber), #(SourcesDirectory) etc. I find this a very easy and solid solution.
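The token replacement itself is only a couple of lines inside the workflow; conceptually it does something like this (all variable names and values are illustrative, and in the real template they come from the workflow's arguments):

    # Values below would come from the build workflow's variables
    $buildNumber      = "MyProduct_20140101.1"
    $sourcesDirectory = "C:\Builds\1\Sources"
    $scriptPath       = "C:\Scripts\Deploy.ps1"
    $scriptArguments  = "#(SourcesDirectory)\Deploy #(BuildNumber)"

    # Expand well-known tokens in the user-supplied argument string
    $expanded = $scriptArguments `
        -replace [regex]::Escape('#(BuildNumber)'), $buildNumber `
        -replace [regex]::Escape('#(SourcesDirectory)'), $sourcesDirectory
    & $scriptPath $expanded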
