How to organize the execution of each test on a separate virtual machine? (TFS)

I would like to ask how people organize complex integration tests when a virtual machine has to be started to run each test.
First, let me describe how our tests are organized now and what I do not like about it.
The process is organized using TFS Process Templates and Custom Actions.
1. The project is built on the build agent.
2. A custom action starts the virtual machine (using the VMware API).
3. The compiled project DLLs are copied to the virtual machine.
4. mstest.exe is run on the virtual machine with the necessary parameters.
5. The test results are copied back to the build agent and analyzed manually in the custom action.
Steps 2-5 are repeated for each test, roughly as in the sketch below.
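Concretely, the custom action's loop looks roughly like this (a simplified sketch using VMware's vmrun command line from PowerShell; the paths, snapshot name, credentials, and test list are placeholders):

```powershell
# Simplified per-test loop, assuming VMware Workstation's vmrun CLI.
# Paths, snapshot name, guest credentials, and the test list are placeholders.
$vmrun = 'C:\Program Files (x86)\VMware\VMware Workstation\vmrun.exe'
$vmx   = 'C:\VMs\TestAgent\TestAgent.vmx'

foreach ($test in Get-Content '.\tests.txt') {
    # Step 2: revert to a clean snapshot and boot the VM
    & $vmrun -T ws revertToSnapshot $vmx 'Clean'
    & $vmrun -T ws start $vmx nogui

    # Step 3: copy the build output into the guest (vmrun copies one file per call)
    foreach ($dll in Get-ChildItem '.\bin\*.dll') {
        & $vmrun -T ws -gu tester -gp secret copyFileFromHostToGuest $vmx $dll.FullName "C:\Tests\$($dll.Name)"
    }

    # Step 4: run a single test inside the guest with mstest.exe
    & $vmrun -T ws -gu tester -gp secret runProgramInGuest $vmx `
        'C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\mstest.exe' `
        '/testcontainer:C:\Tests\MyTests.dll' "/test:$test" '/resultsfile:C:\Tests\result.trx'

    # Step 5: pull the .trx file back to the build agent for manual analysis, then power off
    & $vmrun -T ws -gu tester -gp secret copyFileFromGuestToHost $vmx 'C:\Tests\result.trx' ".\Results\$test.trx"
    & $vmrun -T ws stop $vmx
}
```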
The main disadvantage of this approach is complexity: we have to analyze the test result files manually, and if their format ever changes we will have to rewrite a lot of code.
It would be much better if the TFS build agent itself were a virtual machine and TFS reverted it to a clean snapshot before running each test.
How do you organize this kind of testing? In this case we are specifically talking about Coded UI tests, but similar problems come up frequently. Maybe my question is too general, but any ideas would be very helpful...

If you were to move to Lab Management and Release Management, much of what you are trying to do comes out of the box with Visual Studio ALM. You can create and build out environments dynamically, either in Hyper-V or in Azure.
I would recommend using Release Management and PowerShell to orchestrate this. The RM team has a good demo:
http://blogs.msdn.com/b/visualstudioalm/archive/2014/11/11/deploying-and-testing-web-applications-using-release-management.aspx
There they both deploy the application and then run Coded UI tests.
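A minimal sketch of what such an RM PowerShell step might look like (the drop share, site path, Visual Studio version, and file names are my assumptions, not taken from the demo):

```powershell
# Minimal RM-style PowerShell step: deploy the build drop, then run the
# Coded UI suite with mstest.exe. All paths and names are placeholders.
param(
    [string]$DropPath = '\\buildserver\drop\WebApp',
    [string]$SitePath = 'C:\inetpub\wwwroot\WebApp',
    [string]$TestDll  = '\\buildserver\drop\Tests\CodedUITests.dll'
)

# Deploy: copy the build output onto the target IIS folder
Copy-Item -Path "$DropPath\*" -Destination $SitePath -Recurse -Force

# Test: run the Coded UI tests and fail the release step on a non-zero exit code
$mstest = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\mstest.exe"
& $mstest "/testcontainer:$TestDll" '/resultsfile:C:\Results\CodedUI.trx'
if ($LASTEXITCODE -ne 0) { throw "Coded UI run failed with exit code $LASTEXITCODE" }
```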

Related

Jenkins tests with Ranorex

I'm just getting started with Jenkins and I have a few questions that are probably silly, but I'm stuck on them.
After I build my project, does Jenkins save the build output to some specific path?
Using Ranorex for test automation, is it better to keep my files locally on the server or push them to a repository?
Note: I have only just started with this; at the moment I can check for changes on Bitbucket, build the project, build the Ranorex tests, and run them.
Jenkins is quite a versatile application that can be set up to fit the specific needs and requirements of a test project, so I'd say go with the approach that seems most logical and easiest to you. It is a learning process as well, so you will come to understand the workflow of Jenkins itself.
But to answer your two questions:
1) By build files I believe you mean the test reports? For these I actually use the Jenkins UserContent folder. This requires the "Copy to slave" plugin to be installed; with it you get an additional post-build action where you can specify the files to be copied over to the UserContent folder. Don't forget to specify a common naming layout for the report files through the Ranorex run parameters ("/rf"). The UserContent folder is served over the web, so you can link the report URLs directly in email reports (e.g. "http://Jenkins-server.com/UserContent/Regression-Client-Test-#1.html").
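For illustration, the build step on the Windows slave could look something like this (the suite exe and report layout are hypothetical; $env:BUILD_NUMBER is supplied by Jenkins):

```powershell
# Hypothetical Jenkins build step: give every run a predictable report name
# via Ranorex's "/rf" parameter so the post-build copy step can find it.
$report = "C:\Reports\Regression-Client-Test-$env:BUILD_NUMBER"
& 'C:\Tests\RegressionSuite.exe' "/rf:$report"
if ($LASTEXITCODE -ne 0) { Write-Warning 'Ranorex run reported test failures' }
```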
2) This depends entirely on the system setup, but I can give you an example of how our system is currently set up. We have Jenkins running on a Linux machine; it is only used to manage and run the tests, and does not itself hold the automation test project. Then we have the test machine, which runs Windows and holds the actual automation tests. This machine is connected to Jenkins through the slave functionality. So when someone starts a test job, Jenkins on the Linux machine sends a command to the slave to start the automated tests. When the test run has finished, post-build actions take over and copy the report files from the slave machine to the Linux machine's UserContent folder.
Now, about managing the test project itself: it's a good idea to use a git repository, which adds another layer of "security" of sorts. But if you have a small team (or you are the only test developer), there is no real need for it; you just copy the project to a specified folder on the test machine whenever it is updated, and you are ready to run.
Regards,
Martin

Jenkins master/slave with ASP.NET MVC

Right now we have Jenkins up and running on a single VM. It fetches the code from GitHub, builds, and runs all unit and E2E tests.
This takes 3+ hours.
The thing is, we have some integration tests that restore a test database. There are a lot of tests so it takes a long time.
We want to speed this up significantly.
So I created a Template Slave VM on Azure. It has Visual Studio, IIS, SQL Express, Git and everything else we would need for a deployment.
I can now clone this template to 5-6 identical slaves. Each will be instructed by the Jenkins master to build, deploy, and test a chunk (a set of suites) of the codebase, roughly as sketched below.
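For illustration, each slave's test step would be something like this (a sketch; the mstest /category filter and a TEST_CATEGORY job parameter are my assumptions):

```powershell
# Sketch of one slave's test step: run only the slice of the suite assigned
# to this node, selected via an assumed TEST_CATEGORY job parameter.
$mstest = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\mstest.exe"
& $mstest '/testcontainer:C:\Build\Tests\IntegrationTests.dll' `
    "/category:$env:TEST_CATEGORY" `
    "/resultsfile:C:\Results\$($env:TEST_CATEGORY).trx"
```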
Is this really the best infrastructure for this? It's a LOT of work setting this up. I have trouble finding good reading material on this subject.
I think what you are describing is not just Continuous Integration but Release Management. Microsoft's Visual Studio Team Services platform can be used in integration with Jenkins, and/or you can create a build definition and a release management definition there and automate the whole flow.
Sure, it will involve some work (maybe a lot), but with VSTS + Azure (which are tightly integrated) you can automate it once and reuse it in the future.

Should Jenkins be run inside development/deployment environment or on standalone box

I am using Vagrant to provide 'synchronised' and standardised development, test, UAT, staging and production environments.
I am now looking at how to standardise my CI build process. I like the look of Jenkins, but I am confused as to the best way to deploy it. Should I have it deployed on a stand-alone CI box or install it in all the various environments?
I guess I am a little confused here. Any help much appreciated, thanks.
The standard approach is a stand-alone CI server shared by the development team. This common server (at a well-known URL) provides the development dashboard for the team and the only authorized way to publish into the release repository (developers are not allowed to publish directly).
You could go for extra credit and also set up an instance of Sonar, which in my opinion is much better suited as a development dashboard, providing a richer set of metrics and also serving as a historical record for development.
Finally, Jenkins is so simple to set up that there is nothing stopping developers from running their own instances. I find that with Sonar it matters less and less where a build is actually run, once the release credentials are properly controlled. In fact, this attitude is important, as it prevents the build server from turning into a delicate snowflake :-)
Update
There's a Vagrant plugin for Jenkins which might prove useful in running your current processes.
You're likely better off running Jenkins as a shared stand-alone server.
However, I highly recommend that you set up your builds in such a way that they can be run on each developer's machine locally as well. This is particularly key with unit-tests.
In our setup, we have a shared Jenkins server that executes all of our builds using NAnt. Each developer also has NAnt installed and can run the build and unit-test portions of the build freely. Ideally integration tests could also be run, but we're not quite there yet and having them execute on the CI server still gives us that proper feedback even if it takes a little longer to get.

Can I run TFS Build Controller and Unit Tests on one machine and TFS Build Agents on another machine?

I have successfully set up two machines. The first machine serves as the TFS Build Controller and the other as a TFS Build Agent.
There are two things I want to do.
1. Run SQL scripts on the TFS Build Controller machine. I have all the SQL script files in TFS, but I have no idea how to get them to run.
2. Upload the output (DLL and EXE files) from the TFS Build Agent machine back to the TFS Build Controller machine, then run the tests on that machine (I want to run the tests after all the SQL scripts have run).
Please let me know whether this is possible. You can just give me a link, since I know it might take a long explanation, but I would appreciate it if you could write the answer down. :)
Yes, that is all possible, but I would strongly suggest not using the Build Controller for it. Build Controllers are meant to just dole builds out to the Agents, which do all the work. Build Controllers are also typically shared between Team Projects, so if a single Team Project abuses the Controller it can affect others, whereas Build Agents are usually dedicated to a Team Project. You should be able to run your SQL scripts and tests on the Build Agent; that is what most people do (see the sketch below).
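For example, running the scripts could be a simple PowerShell step on the agent (a sketch assuming sqlcmd is installed there; the server, database, and script folder are placeholders):

```powershell
# Run the checked-in SQL scripts on the Build Agent, assuming sqlcmd is
# installed. Server, database, and script folder are placeholders.
$scripts = Get-ChildItem 'C:\Build\Sources\SqlScripts\*.sql' | Sort-Object Name
foreach ($script in $scripts) {
    # -b makes sqlcmd return a non-zero exit code when a script errors out
    sqlcmd -S 'localhost\SQLEXPRESS' -d 'TestDb' -b -i $script.FullName
    if ($LASTEXITCODE -ne 0) { throw "Script failed: $($script.Name)" }
}
```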
Having said that, if you're set on doing it anyway, you can modify the build workflow template to accomplish it. Anything inside the Run On Agent activity runs on the Build Agent; anything outside of it runs on the Build Controller. See Ewald Hofman's blog series on customizing TFS Build if you've never done it before: http://www.ewaldhofman.nl/post/2010/04/20/Customize-Team-Build-2010-e28093-Part-1-Introduction.aspx

Automated Build and Deploy of Windows Services

How would you implement an automated build and deploy system for Windows services? Things to keep in mind:
- The service will have to be stopped on the target machine.
- The service entry in the Windows registry might need to be created/updated.
- Some, but not all, of the services might need to be automatically started.
I am willing to use TFS for this, but it isn't a requirement. The target machines will always be development machines, we won't be doing this for production servers.
The automated build part can be done in multiple ways - TFS, TeamCity (what we use), CruiseControl.NET, etc. That in turn could call a build script in NAnt (again, what we use), MSBuild, etc.
As for stopping and installing a service remotely, see How to create a Windows service by using Sc.exe. Note that you could shell/exec out to this from your build script if there isn't a built-in task. (I haven't tried this recently, so do a quick spike first to make sure it works in your environment.)
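For illustration, the shell-out from a PowerShell build step could look like this (machine and service names are placeholders; note sc.exe's required space after "binPath=" and "start="):

```powershell
# Shelling out to sc.exe from a build step; machine and service names are
# placeholders. The space after "binPath=" and "start=" is required by sc.exe.
$target = '\\devbox01'
sc.exe $target stop MyService        # stop the old instance (fails harmlessly if not running)
Start-Sleep -Seconds 5               # give the service a moment to shut down
sc.exe $target delete MyService      # remove the old registry entry, if any
sc.exe $target create MyService binPath= 'C:\Services\MyService.exe' start= auto
sc.exe $target start MyService       # only for the services that should auto-start
```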
Alternatively, it's probably possible (and likely more elegant) in Windows PowerShell 2.0.
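A rough PowerShell 2.0 equivalent using WMI, which can target remote machines; the computer name, share path, and service name are placeholders:

```powershell
# WMI-based alternative (works in PowerShell 2.0 against remote machines).
# Computer name, share path, and service name are placeholders.
$svc = Get-WmiObject Win32_Service -ComputerName 'devbox01' -Filter "Name='MyService'"
$null = $svc.StopService()

# Copy the new binaries over the admin share while the service is stopped
Copy-Item '.\bin\MyService.exe' '\\devbox01\c$\Services\MyService.exe' -Force

$null = $svc.StartService()   # skip this for services that should stay stopped
```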
