When I first put my Rails project in Jenkins, my settings were off and I generated reports for all code in vendor/. I've fixed my settings so new reports don't include that code, but even after wiping out my Jenkins workspace it still includes the vendor/ code in every report.
How can I erase the old rcov statistics and track just my own code?
The coverage statistics will fix themselves when all old builds that included them go away.
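If you would rather not wait for them to cycle out naturally, you can make the old builds go away sooner by discarding them. Below is a minimal sketch of what that could look like in the job's config.xml, assuming a freestyle job; the retention numbers are purely illustrative, and the same thing can be configured from the UI via the "Discard Old Builds" option.
<!-- hypothetical excerpt from Jenkins\jobs\[projectName]\config.xml -->
<!-- keep only the 10 most recent builds so reports that still include vendor/ age out -->
<logRotator class="hudson.tasks.LogRotator">
  <daysToKeep>-1</daysToKeep>
  <numToKeep>10</numToKeep>
  <artifactDaysToKeep>-1</artifactDaysToKeep>
  <artifactNumToKeep>-1</artifactNumToKeep>
</logRotator>
Once the last build that still measured vendor/ has been discarded, the aggregated rcov trend should reflect only your own code.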
I have a TFS 2017 instance running on Windows Server 2012 R2, with a test box running Windows 10.
I am running into a very odd issue. Most of my automated CodedUI tests are running pretty well. However, I have noticed that code changes are not always picked up by TFS when it performs a new build, at least not in the testing code area.
The builds themselves work well, and new code always gets incorporated for those. However, when the latest build triggers a release containing CodedUI tests, those tests do not always grab the latest build.
I have noticed this primarily in my App.config file which contains connection strings that are not being updated. In one case I had three tests that ran apparently successfully, but then they ran again using the values from the old App.config file.
I also have found that changes to the [TestCategory()] attribute are not always picked up either. I use that category to specify which tests I want run in a particular release build. I use variations on the same word for my categories: CodedUI, CodedUIExtended, and CodedUIStage. At first I thought the system was doing some sort of StartsWith match and picking up the other names, but when I tell it to run CodedUI, it runs both the CodedUI and CodedUIStage categories.
[TestCategory("CodedUI"), TestMethod]
public void UI_Login_AdminAuthenticate()
{
...
}
Because the CodedUIStage categories were recently renamed and used to be CodedUI, that has led me in the direction of suspecting some sort of caching being used in TFS.
Can anyone shed some light on why my category and app.config changes are not being picked up correctly? What is causing this, and could it be happening to the code itself as well when I attempt to deploy a fix/correction?
EDIT:
As suggested, I tried checking the clean option on my TFS build configuration; however, it had no effect.
The release, which is triggered immediately upon build, does show that it is using the newly finished build number.
And looking at the artifacts, the test dll has the current date modified, so it looks like it was just created.
The test.dll.config seems to be the issue. In the artifacts it has a very old Date Modified and is not the current version that is checked into TFS. Typically this would feel like a clean issue; however, TFS always deletes and re-copies all destination files when doing a build or Release. I have verified that by watching the files being deleted and re-created on the file system during the process.
C:\agent_work\r6\a\[artifact_name]\bin
EDIT2:
With a little more exploration, the build artifacts are correct. It is when the Release copies those artifacts into the release process that the problem happens.
A week ago I renamed the folder in TFS containing my test project inside the solution. The old folder name is showing up in the artifacts the Release copies into itself. The new folder name is also showing up, which means I now have two DLLs, and that is causing problems.
I am not sure where the Release is finding this copy of the old folder. I explicitly deleted it from the Release's copy of the build artifacts and re-ran the build and release and it showed back up.
Thanks to the suggestions from Daniel, I eventually figured out that after creating the artifacts, my build process was then publishing those to a separate place on the file system.
Unfortunately, the Copy and Publish Build Artifacts task does not have a clean feature like the basic Copy Files task does.
As such, whenever files are removed from the build, they still exist in that location when the Release process goes to grab what it thinks the artifacts are, and so it ends up grabbing extra files.
Manually deleting the old files from that secondary artifact destination location solved the issue.
I use TeamCity 10.x (Enterprise), as well as TFS for source control.
I recently committed a changeset which did not break the build, but ended up causing hundreds of unit tests to fail across 10+ test projects. Those tests started failing immediately after my check-in, so naturally I assumed that I was at fault.
The changeset included these changes:
Changing the namespaces of various files (e.g., moving all test stubs to a "Stubs" folder and updating their namespace to end with the ".Stubs" suffix);
Updating the using statements in other files to reference these new namespaces where necessary;
Rewriting some unit tests using the Moq library, replacing TypeMock.
I rolled back the entire changeset, in the hopes that all the unit tests would pass again.
Unfortunately, most of the test projects continue to have a lot of failing tests. Additionally, these failing tests are causing all of the remaining tests to be skipped by the VS Test Engine. This is new behaviour that I hadn't seen before.
Questions:
Why are the tests still failing even though I have rolled back the offending changeset?
What can I do to fix this?
If I haven't provided enough information in this post, please let me know and I shall update.
Please try the items below to narrow down the issue:
Check whether you have indeed rolled back the changeset.
Check how TeamCity gets the sources: does it get the latest changeset or a specific changeset?
Try cleaning the caches and previous sources on the agent machine (Clean all files before build), then try again.
Reference: Clean Checkout and Clean Sources
I've used Jenkins for quite a few years but have never set it up myself, which I did at my new job. There are a couple of questions and issues that I ran into.
Default workspace location - It seems like the latest Jenkins has the default workspace in Jenkins\jobs\[projectName]\workspace, which is overwritten (or wiped if selected) for every build. I thought that it should instead be in Jenkins\jobs\[projectName]\builds\[build_id]\ so that it would be able to store the workspace state of every build for future reference?
Displaying workspace on the project>Build_ID page - This goes along with the previous point, as I expected each 'workspace' for previous builds to show up here. Currently in my setup this individual page gives you nothing except the Git revision, what repo changes triggered the build, and the console output. Where are the artifacts? Where is the link to this build's workspace that was used?
Archiving Artifacts in builds - When choosing artifacts, the filter doesn't seem to work. My build creates a file structure with the artifacts in it inside the workspace. I want to store this, and the artifacts filter says it starts at the workspace. So I put in 'artifacts' and nothing gets stored (also, where would this get stored?). I have also tried '/artifacts' and 'artifacts/*'.
Any help would be great! Thanks!
It does seem like you are confused about several aspects of Jenkins. I think your question basically boils down to the following.
What is a difference between a workspace and a build?
So, here are some thoughts on this topic:
Builds are historical data. They (usually) don't change like a workspace does during building/checkout.
Builds contain information about a run (e.g. its status, build number, change log, etc) and any artifacts that you tell it to archive (logs, test results, etc). They (usually) don't contain source code like a workspace.
Builds are stored in the Jenkins\jobs\[projectName]\builds\[build_id]\ directory. This is a directory managed by Jenkins and you (usually) do not need to modify anything in it. However, workspaces are directories meant for the build, and you can do pretty much anything with them and place them anywhere (it does not need to be in the default Jenkins\jobs\[projectName]\workspace directory).
Workspaces should be able to be wiped at any given time. To restore it, just rebuild the job with the same parameters/revision. If you need to keep something after a build, tell Jenkins to archive it before the build is done.
In regard to saving the entire state, I don't think you need to do that. As mentioned above, you should be able to reproduce the same build by kicking off the same revision/parameters as the build in question. If you cannot get back to the original state from the same revision/parameters, then that might be something to strive for, as debugging is otherwise going to be a nightmare. :)
A workspace is an aspect of the project and not a build and that is why there is no link to the workspace from that page. Again, a build is just saved data from a previous run. A project uses the workspace to build stuff and that is why you can get to the workspace from that page.
In regard to how to save artifacts, you must specify the names of the files you want to save. Unless you are trying to save a file called "artifacts", you should probably use something else. How about **/*.log for all log files, or **/*.xml for all XML files?
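Since the question's build output lands in a directory called artifacts inside the workspace, an Ant-style glob rooted at the workspace should pick it up. Here is a rough sketch of how that could look in a freestyle job's config.xml; the pattern itself is the only important part and can just as well be typed into the "Files to archive" box.
<!-- hypothetical excerpt from the job's config.xml -->
<publishers>
  <hudson.tasks.ArtifactArchiver>
    <!-- archive everything under the workspace's artifacts/ directory -->
    <artifacts>artifacts/**/*</artifacts>
  </hudson.tasks.ArtifactArchiver>
</publishers>
Whatever matches is copied into Jenkins\jobs\[projectName]\builds\[build_id]\archive\ and listed on that build's page, which also answers the "where would this get stored?" part of the question.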
Hope this helps.
I set up a JMeter job in Jenkins, which is supposed to publish *.jtl results and then display them in a nice trend graph.
But despite seeing that they're published under the builds//performance-results/JMeter folders, the trend always shows only the current day's results. So if I run this build three times during a day, I'll see a graph with three points. If there was just one run today, I'll see one run on that graph. I don't see yesterday's (and older) results on the graph. I'd like this trend to display all the data from all the previous builds, including yesterday's and earlier.
What should I check? How does the perf plugin decide which *.jtl data to use to display the data?
In the settings of the job I have this regexp for the jtl source: **/*.jtl, so I would expect all the builds' data to be displayed on the trend ...
Apparently the solution is very simple. Found it myself!
By default, all jtl files had a timestamp at the beginning of their names, thanks to the jmeter-maven-plugin; the pattern was yyyyMMdd. The trend report in Jenkins displayed the last build's results, and because of that pattern, the jtl file names were the same for all builds run on a given day but different from the previous day's.
So the easiest solution was to remove that timestamp from the results file name, by setting
<testResultsTimestamp>false</testResultsTimestamp>
in the configuration section of the jmeter-maven-plugin in the pom file.
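For context, here is roughly where that setting lives in the pom. This is only a sketch; the version number is illustrative, so keep whatever version of the plugin you are already using.
<!-- hypothetical pom.xml excerpt: stop prefixing the .jtl results file name -->
<!-- with a yyyyMMdd timestamp so every build writes a file with the same name -->
<plugin>
  <groupId>com.lazerycode.jmeter</groupId>
  <artifactId>jmeter-maven-plugin</artifactId>
  <version>2.1.0</version>
  <configuration>
    <testResultsTimestamp>false</testResultsTimestamp>
  </configuration>
</plugin>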
The annoying part is that the Performance plugin guys haven't put it into the documentation - the requirement for the results file to have the same name across builds in order to be displayed on the graph...
Apart from that, there is an issue with the Performance plugin (versions 1.12 and 1.13). Due to that, the Last Report (its image doesn't show) and other reports are showing missing info.
To fix it, you can either download / git clone the latest code from the Performance plugin GitHub repo and build it locally (using mvn clean install, which gives you the performance.hpi Jenkins plugin file), or revert back to Performance plugin version 1.11.
As 1.12/1.13 have some other enhancements over 1.11, I chose to build it myself until someone fixes the Performance plugin and comes up with a new release version (i.e. a 1.14 containing the fix for this issue).
Issue: https://issues.jenkins-ci.org/browse/JENKINS-27100
I have set up a build controller etc. and the builds were failing. I have fixed these now and the build failed properly - as in because of an error.
I have fixed the error and checked the code back in but now the code is not being extracted, although sometimes one folder of many is.
I have deleted the code from the build machine and requeued a build but it keeps failing. It complains that it cannot find the solution that I specified as the build solution.
I have checked the check box to build even if nothing has changed.
Have I missed a setting somewhere for extracting the code?
TFS version is 2012 Express
Visual Studio version is 2010 Professional
I had this issue recently with TFS 2012. I think it boils down to this:
In the latest build definition files, it appears that it performs a Clean task before updating the workspace. This means that if you do something that causes the Clean part of the build to fail, it will never download the new files needed to fix it.
Recently, I was making big changes to my build file and inevitably made a lot of mistakes. I found that if one of these mistakes caused the Clean to break, I had to go onto the build server and change the file manually to get it working again.
Does this sound like it might be the same issue?
There are several properties in your build definition you can check. I would start with setting the "Clean Workspace" to All to ensure the correct code is being pulled down and built.
There are other settings you can look at as well, like the agent set for the build and the "GetVersion" property. Check out the link below; it should be able to help you in more detail.
Define a Build Process that is Based on the Default Template