How can I add a custom Build Summary section from a script?

I have a TFS 2013 XAML build process template that runs a PowerShell script (which pushes packages to NuGet).
The WriteCustomSummaryInformation build activity was added in TFS 2012 for XAML builds. I'd like to use this same activity, or implement the same functionality somehow, from my script (so that I can show which packages were published). How can I do this?

I figured it out by running the activity and looking at what it added to the build information:
Function New-CustomSummaryInformation($Build, $Message, $SectionHeader, $SectionName, $SectionPriority = 0)
{
    $CustomSummaryInformationType = 'CustomSummaryInformation'
    $root = $Build.Information.Nodes | ? { $_.Type -eq $CustomSummaryInformationType } | select -First 1
    if (!$root)
    {
        $root = $Build.Information.CreateNode()
        $root.Type = $CustomSummaryInformationType
    }
    $node = $root.Children.CreateNode()
    $node.Type = $CustomSummaryInformationType
    $node.Fields['Message'] = $Message
    $node.Fields['SectionHeader'] = $SectionHeader
    $node.Fields['SectionName'] = $SectionName    # was $SectionKeyName, an undefined variable
    $node.Fields['SectionPriority'] = $SectionPriority
}
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Client')
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.VersionControl.Client')
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Build.Client')

$workspaceInfo = [Microsoft.TeamFoundation.VersionControl.Client.Workstation]::Current.GetLocalWorkspaceInfo($env:TF_BUILD_SOURCESDIRECTORY)
$tpc = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection $workspaceInfo.ServerUri
$vcs = $tpc.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])
$buildServer = $tpc.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])

$buildDef = $buildServer.GetBuildDefinition("MyProject", "MyBuildDefn")
$build = $buildServer.GetBuild($buildDef.LastBuildUri)    # was $def, an undefined variable

New-CustomSummaryInformation $build -Message "This is a test message" -SectionHeader "This is the header displayed" -SectionName "ThisIsAnInternalKey"
$build.Information.Save()
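A note on the build lookup: when the script runs as part of the build itself, TFS 2013 XAML builds set the TF_BUILD_BUILDURI environment variable on the agent, so you can target the running build directly instead of fetching the definition's last build. A small sketch, assuming the standard XAML build environment variables are present (the message and section names here are just examples):
# Sketch: update the build that is currently running, using its own URI.
$build = $buildServer.GetBuild([Uri]$env:TF_BUILD_BUILDURI)
New-CustomSummaryInformation $build -Message "Published MyPackage 1.2.3" -SectionHeader "Published NuGet packages" -SectionName "NuGetPublishSummary"
$build.Information.Save()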


How to integrate AxoCover (NUnit) with Jenkins

I've not been able to find an article or tutorial that describes this topic.
Thus, my question is how to integrate AxoCover with Jenkins:
Mark the build failed or successful according to the code coverage percentage
Send email notifications to specific users
Thanks in advance
In general, I found a solution by creating a PowerShell script and executing it during the Jenkins job:
<#
.SYNOPSIS
Runs NUnit unit tests in given Visual Studio solution and provides code coverage result
Duplicate for CodeCoverageRunner fix for NUnit 3.
https://github.com/nunit/nunit-console/issues/117
.PARAMETER TestConfig
{
"SolutionFile": "\\project.sln",
"BuildConfiguration": "Debug",
"TestResultDirectory": "\\tempFolder",
"TestRunnerPath": "\\nunit3-console.exe",
"CodeCoverageToolPath" : "\\OpenCover.Console.exe",
"ReportGeneratorPath": "\\ReportGenerator.exe",
"AssemblyFilters": "+[*]*"
}
#>
param (
    # JSON string described in the help above
    [Parameter(Mandatory=$true)]
    [string]$TestConfig
)

function GetTestProjects
{
    param (
        [Parameter(Mandatory=$true)]
        [string]$SolutionFile,
        [Parameter(Mandatory=$true)]
        [string]$BuildConfiguration
    )
    $BaseDir = (Get-Item $SolutionFile).Directory
    $Projects = @()
    Get-Content -Path $SolutionFile | % {
        if ($_ -match '\s*Project.+=\s*.*,\s*\"\s*(.*Tests\.Unit.*proj)\s*\"\s*,\s*') {
            $currentName = $matches[1].Split("\")[0]
            $currentDll = $matches[1].Split("\")[1].Replace(".csproj", ".dll")
            #Write-Host "current $currentName"
            $Projects += "`"", $(Join-Path -Path $BaseDir $currentName), "\bin\$BuildConfiguration\$currentDll", "`"" -join ""
        }
    }
    return $Projects
}

$config = ConvertFrom-Json $TestConfig
$testRunnerArgs = GetTestProjects `
    -SolutionFile $config.SolutionFile `
    -BuildConfiguration $config.BuildConfiguration
$workingDirectory = $config.TestResultDirectory
$testRunner = $config.TestRunnerPath
$codeCoverageTool = $config.CodeCoverageToolPath
$reportGenerator = $config.ReportGeneratorPath
$filters = $config.AssemblyFilters
$coverageResult = Join-Path $workingDirectory "codecoverage.xml"

& $codeCoverageTool -target:"""$testRunner""" `
    -targetargs:"""$testRunnerArgs --inprocess $("-result","test_result.xml")""" `
    -register:Administrator `
    -mergebyhash `
    -skipautoprops `
    -output:"""$coverageResult""" `
    -filter:"""$filters""" `
    -returntargetcode

if ($LASTEXITCODE -ne 0) {
    exit $LASTEXITCODE
}

$targetDir = Join-Path $workingDirectory "CodeCoverage"
$historyDir = Join-Path $workingDirectory "CoverageHistory"

& $reportGenerator `
    -reports:$coverageResult `
    -targetDir:"""$targetDir""" `
    -historydir:"""$historyDir"""
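For completeness, here is a hypothetical way to wire the script into a Jenkins job as a "Windows PowerShell" build step; the script and config file paths are placeholders. Because the script exits with the test runner's exit code, a test failure makes the step exit non-zero and Jenkins marks the build failed, which covers the first requirement; email notifications can then be handled by any Jenkins mailer plugin.
# Hypothetical invocation from a Jenkins "Windows PowerShell" build step.
# $env:WORKSPACE is set by Jenkins; the ci\ paths are placeholders.
$testConfig = Get-Content -Raw "$env:WORKSPACE\ci\testconfig.json"
& "$env:WORKSPACE\ci\Run-UnitTestsWithCoverage.ps1" -TestConfig $testConfig
exit $LASTEXITCODE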

AWS Codepipeline: works "by hand" but trouble getting terraform to set up stages

I got a sample AWS codepipeline working via the console but need to get it set up via Terraform.
I have two problems, one minor and one major:
The GitHub stage fails until I go in and edit it via the console, even though I end up not changing anything I had already set up in "Owner" or "Repo".
The more major item is that I keep getting CannotPullContainerError on the build step, which keeps anything else from happening. It says "repository does not exist or may require 'docker login'".
The repository DOES exist; from the command line on my Linux instance I verified the same 'docker login' and 'docker pull' commands that don't work from AWS CodePipeline.
(I know: the buildspec.yml is stupidly insecure but I wanted to get the prototype I had working the same way before I put in kms.)
My buildspec.yml is simple:
version: 0.2
phases:
  pre_build:
    commands:
      - $(aws ecr get-login --no-include-email --region us-west-2)
      - docker pull 311541007646.dkr.ecr.us-west-2.amazonaws.com/agverdict-next:latest
  build:
    commands:
      - sudo apt install curl
      - curl -sL https://deb.nodesource.com/setup_8.x | sudo bash -
      - sudo apt install nodejs -y
      - mkdir /root/.aws
      - cp ./deployment/credentials /root/.aws/credentials
      - cd ./deployment
      - bash ./DeployToBeta.sh
Here's the terraform that creates the pipeline. (No 'deploy' step as the 'build' shell script does that from a previous incarnation.)
locals {
  github_owner           = "My-Employer"
  codebuild_compute_type = "BUILD_GENERAL1_LARGE"
  src_action_name        = "projectname-next"
  codebuild_environment  = "int"
}

data "aws_caller_identity" "current" {}

provider "aws" {
  region = "us-west-2"
}

variable "aws_region" {
  default = "us-west-2"
}

variable "github_token" {
  default     = "(omitted)"
  description = "GitHub OAuth token"
}

resource "aws_iam_role" "codebuild2" {
  name               = "${var.codebuild_service_role_name}"
  path               = "/projectname/"
  assume_role_policy = "${data.aws_iam_policy_document.codebuild_arpdoc.json}"
}

resource "aws_iam_role_policy" "codebuild2" {
  name   = "codebuild2_service_policy"
  role   = "${aws_iam_role.codebuild2.id}"
  policy = "${data.aws_iam_policy_document.codebuild_access.json}"
}

resource "aws_iam_role" "codepipeline2" {
  name               = "${var.codepipeline_service_role_name}"
  path               = "/projectname/"
  assume_role_policy = "${data.aws_iam_policy_document.codepipeline_arpdoc.json}"
}

resource "aws_iam_role_policy" "codepipeline" {
  name   = "codepipeline_service_policy"
  role   = "${aws_iam_role.codepipeline2.id}"
  policy = "${data.aws_iam_policy_document.codepipeline_access.json}"
}

resource "aws_codebuild_project" "projectname_next" {
  name           = "projectname-next"
  description    = "projectname_next_codebuild_project"
  build_timeout  = "60"
  service_role   = "${aws_iam_role.codebuild2.arn}"
  encryption_key = "arn:aws:kms:${var.aws_region}:${data.aws_caller_identity.current.account_id}:alias/aws/s3"

  artifacts {
    type = "CODEPIPELINE"
    name = "projectname-next-bld"
  }

  environment {
    compute_type    = "${local.codebuild_compute_type}"
    image           = "311541007646.dkr.ecr.us-west-2.amazonaws.com/projectname-next:latest"
    type            = "LINUX_CONTAINER"
    privileged_mode = false

    environment_variable {
      name  = "PROJECT_NAME"
      value = "projectname-next"
    }

    environment_variable {
      name  = "PROJECTNAME_ENV"
      value = "${local.codebuild_environment}"
    }
  }

  source {
    type = "CODEPIPELINE"
  }
}

resource "aws_codepipeline" "projectname-next" {
  name     = "projectname-next-pipeline"
  role_arn = "${aws_iam_role.codepipeline2.arn}"

  artifact_store {
    location = "${var.aws_s3_bucket}"
    type     = "S3"
  }

  stage {
    name = "Source"

    action {
      name             = "Source"
      category         = "Source"
      owner            = "ThirdParty"
      provider         = "GitHub"
      version          = "1"
      output_artifacts = ["projectname-webapp"]

      configuration {
        Owner                = "My-Employer"
        Repo                 = "projectname-webapp"
        OAuthToken           = "${var.github_token}"
        Branch               = "deploybeta_bash"
        PollForSourceChanges = "false"
      }
    }
  }

  stage {
    name = "Build"

    action {
      name             = "projectname-webapp"
      category         = "Build"
      owner            = "AWS"
      provider         = "CodeBuild"
      input_artifacts  = ["projectname-webapp"]
      output_artifacts = ["projectname-webapp-bld"]
      version          = "1"

      configuration {
        ProjectName = "projectname-next"
      }
    }
  }
}
Thanks much for any insight whatsoever!
Both issues sound like permission problems.
CodePipeline's console is likely replacing the GitHub OAuth token (with one that works): https://docs.aws.amazon.com/codepipeline/latest/userguide/GitHub-authentication.html
Make sure the CodeBuild role (likely ${aws_iam_role.codebuild2.arn} in the code you provided) has permission to pull from ECR; a sketch of the statements it needs follows.
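As a rough sketch only (the data source name is hypothetical; you would fold these statements into your existing codebuild_access policy document or attach them as an additional role policy), the role needs roughly the following ECR permissions:
data "aws_iam_policy_document" "codebuild_ecr_pull" {
  statement {
    actions   = ["ecr:GetAuthorizationToken"]
    resources = ["*"]
  }

  statement {
    actions = [
      "ecr:BatchCheckLayerAvailability",
      "ecr:GetDownloadUrlForLayer",
      "ecr:BatchGetImage",
    ]

    resources = ["arn:aws:ecr:us-west-2:311541007646:repository/projectname-next"]
  }
}
Depending on how CodeBuild pulls the custom image, the ECR repository's own resource policy may also need to grant pull access to CodeBuild.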

Connecting Swift to a web service

Hey guys, I'm kind of new to Swift programming. I'm currently making an iOS app and facing a problem.
I have one text field and one text view in my view controller. I'm supposed to connect the application to a web service, so that when I input a number in the text field, the app prints the returned value in the text view. But I don't know what code to write in Xcode so the app can communicate with the server.
This is the code of the web service
<?php
error_reporting(E_ERROR | E_PARSE);
include 'koneksi.php';

$nama = $_REQUEST['nama'];
$query = "select * from pegawai where nip='$nama' or no_kpe='$nama'";
$exe = mysql_query($query);

if ($row = mysql_fetch_assoc($exe)) {
    $c = trim($row['nama']);
    $a = number_format($row['tht']);
    $b = number_format($row['pensiun']);
    if ($a == 0) {
        echo "Nama = ".$c."\n";
        echo "THT = Sedang Dilakukan Peremajaan Data Keluarga"."\n";
        echo "Pensiun = Rp. ".$b;
    } else {
        echo "Nama = ".$c."\n";
        echo "THT = Rp. ".$a."\n";
        echo "Pensiun = Rp. ".$b;
    }
} else {
    echo "Mohon Periksa Kembali Notas atau No. KPE anda."."\n";
}
?>
I'm using Xcode 7 and running the app on iOS 9.0.
I'm new to this language and also to web services, so a step-by-step answer would be much appreciated.
You need to use the NSURLSession API to connect to a remote web service. I suggest using AFNetworking; it is an easy-to-use framework that handles network requests and serialization for you.
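For what it's worth, here is a minimal sketch of the plain NSURLSession approach in Swift 2 (Xcode 7 syntax). The textField/textView outlet names and the service URL are placeholders; the "nama" parameter matches the PHP script above. Note that on iOS 9, App Transport Security blocks plain http:// URLs by default, so you may need an ATS exception while developing.
// Minimal sketch, Swift 2 syntax. URL and outlet names are placeholders.
let nama = textField.text ?? ""
let encoded = nama.stringByAddingPercentEncodingWithAllowedCharacters(
    NSCharacterSet.URLQueryAllowedCharacterSet()) ?? ""
let url = NSURL(string: "http://yourserver.example.com/service.php?nama=\(encoded)")!
let task = NSURLSession.sharedSession().dataTaskWithURL(url) { data, response, error in
    guard let data = data where error == nil else { return }
    if let text = NSString(data: data, encoding: NSUTF8StringEncoding) {
        // UI updates must happen on the main thread.
        dispatch_async(dispatch_get_main_queue()) {
            self.textView.text = text as String
        }
    }
}
task.resume()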

How to get Jenkins to exclude entire folders from code coverage?

I'm trying to figure out how to exclude a list of folders from the code coverage report generated by JaCoCo, which is launched by Jenkins.
It seems possible to exclude classes, but not folders, which is annoying for me since I've started using a pretty big library for an online payment system. Running those unit tests means constantly creating test accounts on that platform and having to delete them again, every single time Jenkins runs.
It would be far simpler to just have the folders excluded than to exclude every single one of the classes.
To exclude entire directories by changing the Jenkins JaCoCo plugin configuration you would need to add an entry to the 'Exclusions' field.
For instance, if you want to exclude any files under any directory named 'test' you would add the following exclusion:
**/test/**
Keep in mind that if you want to add multiple exclusions you have to separate each one by a comma and there can be no spaces (due to a bug with the plugin).
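For instance, excluding everything under any 'test' directory plus a second, hypothetical 'generated' directory would be entered as:
**/test/**,**/generated/**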
Here is my JaCoCo plugin configuration (screenshot of the plugin's Exclusions field omitted).
If you are using pipelines and a Jenkinsfile, you can use the following as an example of the settings (assumes Gradle):
stage("Check code quality and coverage") {
steps{
sh "./gradlew jacocoTestReport sonarqube -x check"
step( [$class: 'JacocoPublisher',
exclusionPattern: '**/*Exception*,**/*Configuration*,**/ApiApplication*,**/*Test*'] )
}
}
Of note: the exclusionPattern is comma-separated, with NO SPACES between the multiple exclusion patterns.
The easiest way to see the full list of potential settings is to look at the code:
https://github.com/jenkinsci/jacoco-plugin/blob/master/src/main/java/hudson/plugins/jacoco/JacocoPublisher.java
and check out the @DataBoundSetters:
public JacocoPublisher() {
    this.execPattern = "**/**.exec";
    this.classPattern = "**/classes";
    this.sourcePattern = "**/src/main/java";
    this.inclusionPattern = "";
    this.exclusionPattern = "";
    this.skipCopyOfSrcFiles = false;
    this.minimumInstructionCoverage = "0";
    this.minimumBranchCoverage = "0";
    this.minimumComplexityCoverage = "0";
    this.minimumLineCoverage = "0";
    this.minimumMethodCoverage = "0";
    this.minimumClassCoverage = "0";
    this.maximumInstructionCoverage = "0";
    this.maximumBranchCoverage = "0";
    this.maximumComplexityCoverage = "0";
    this.maximumLineCoverage = "0";
    this.maximumMethodCoverage = "0";
    this.maximumClassCoverage = "0";
    this.changeBuildStatus = false;
    this.deltaInstructionCoverage = "0";
    this.deltaBranchCoverage = "0";
    this.deltaComplexityCoverage = "0";
    this.deltaLineCoverage = "0";
    this.deltaMethodCoverage = "0";
    this.deltaClassCoverage = "0";
    this.buildOverBuild = false;
}
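Drawing on those setters, a sketch of a publisher step that also fails the build below a line-coverage threshold might look like this (the threshold and pattern values are illustrative, not recommendations):
step([$class: 'JacocoPublisher',
      exclusionPattern: '**/test/**',
      changeBuildStatus: true,
      minimumLineCoverage: '70'])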
Exclude classes from Sonar analysis by specifying the sonar.jacoco.excludes parameter like this:
sonar.jacoco.excludes=*/exceptions/*:*/dto/*

Find all locked files in TFS

I would like to see all files that are locked. So far, the only way I've found is to use tf.exe status and look for anything with '!', because locked files are not reported as "lock, edit" the way they are in the UI. Any ideas? Thanks.
If you have the power tools installed, it's a one-liner:
tfstatus . -r -user * | % { $_.pendingchanges } | ? { $_.islock } | select -unique serveritem
If you prefer GUIs to scripts, try TFS Sidekicks.
If you are trying to use TFS Sidekicks and can't figure out how, it is under Tools > Team Foundation Sidekicks > Status Sidekick. You will need to expand that window, but you will then be able to search for locks by username.
I don't think this is possible using tf.exe or even tfpt.exe (the Power Tools command line). You'll need to look through the pending change sets for changes that are locks. You could do this in PowerShell using the Power Tools cmdlets, or you could do it using the following bit of .NET code that exercises the TFS API:
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

namespace TfsApiExample
{
    class Program
    {
        static void Main(string[] args)
        {
            GetLockedFiles("http://tfsserver:8080", "$/TeamProject");
        }

        private static void GetLockedFiles(string serverUrl, string serverPath)
        {
            TeamFoundationServer tfs = new TeamFoundationServer(serverUrl);
            VersionControlServer vcServer =
                (VersionControlServer)tfs.GetService(typeof(VersionControlServer));

            // Search for pending sets for all users in all
            // workspaces under the passed path.
            PendingSet[] pendingSets = vcServer.QueryPendingSets(
                new string[] { serverPath },
                RecursionType.Full,
                null,
                null);

            Console.WriteLine(
                "Found {0} pending sets under {1}. Searching for Locks...",
                pendingSets.Length,
                serverPath);

            foreach (PendingSet changeset in pendingSets)
            {
                foreach (PendingChange change in changeset.PendingChanges)
                {
                    if (change.IsLock)
                    {
                        // We have a lock, display details about it.
                        Console.WriteLine(
                            "{0} : Locked for {1} by {2}",
                            change.ServerItem,
                            change.LockLevelName,
                            changeset.OwnerName);
                    }
                }
            }
        }
    }
}
From your command prompt:
>powershell
Then from PowerShell do:
PS > tf info * -recursive | & {
    begin {
        $out = @{}
        $prefix = "loc"
    }
    process {
        if ($_ -match "Local information") {
            if ($out.Count -gt 0) {
                [pscustomobject]$out
                $out = @{}
                $prefix = "loc"
            }
        } elseif ($_ -match "Server information") {
            $prefix = "svr"
        } else {
            $parts = $_.Split(':')
            if ($parts.Length -eq 2) {
                $out.Add($prefix + $parts[0].Trim(), $parts[1].Trim())
            }
        }
    }
    end {
        if ($out.Count -gt 0) {
            [pscustomobject]$out
        }
    }
} | where { !($_.svrLock -eq 'none') }
I've found a GUI option.
Start Visual Studio
Open file
Go to source control
Then workspaces
Enter your credentials
Check show remote workspaces
Remove all unwanted workspaces
That simple :)
