How to exclude execution of certain SpecFlow tests marked with the @ignore tag in Azure Pipelines - specflow

I have many SpecFlow feature files in my solution, containing multiple UI test cases.
At the feature level, I have defined a tag, e.g. @Feature1 for the first file and @Feature2 for the second. These are passed as parameters in the YAML file; I pass the tag to my pipeline YAML. Now I am in a situation where I also have a few test cases marked with @ignore.
When the pipeline runs, these test cases are not excluded and eventually fail.
I want to skip the test cases marked with the @ignore attribute/tag.
Here is a snippet from my pipeline:
parameters:
- name: 'featuresToRun'
  type: object
  default:
  - Performance
  - AutoComplete
  - Benches
  - CATMI
  - Export
  - GemIntegration
  - Keyboard
  - MainMenu
  - NewVoyages
  - ReferenceData
  - Settings
  - SimilarVoyages
  - Validation
  - Views
  - VolumeConversion
  - Voyages
  - LaycanRanges
trigger: none

jobs:
- job: startVM
  timeoutInMinutes: 10
  pool:
    vmImage: 'windows-latest'
  steps:
  - checkout: none

- job: runTests
  timeoutInMinutes: 1800
  dependsOn: startVM
  condition: not(canceled())
  pool:
    name: 'UI Automation'
  steps:
  - task: ScreenResolutionUtility@1
    inputs:
      displaySettings: 'optimal'
  - task: VisualStudioTestPlatformInstaller@1
    inputs:
      packageFeedSelector: 'nugetOrg'
      versionSelector: 'latestStable'
  - task: NuGetCommand@2
    inputs:
      command: 'restore'
      restoreSolution: '**/*.sln'
      feedsToUse: 'config'
  - task: MSBuild@1
    inputs:
      solution: 'UIAutomation.sln'
      msbuildArchitecture: 'x64'
      clean: true
  - ${{ each feature in parameters.featuresToRun }}:
    - task: VSTest@2
      displayName: ${{ feature }} Tests
      inputs:
        testSelector: 'testAssemblies'
        testAssemblyVer2: |
          UIAutomation.Specs\bin\Debug\UIAutomation.Specs.dll
          !**\*TestAdapter.dll
          !**\obj\**
        searchFolder: '$(System.DefaultWorkingDirectory)'
        uiTests: true
        testRunTitle: '${{ feature }}'
        testFiltercriteria: 'Category=${{ feature }}'
        rerunFailedTests: true
        rerunMaxAttempts: 2
        rerunFailedThreshold: 80
        codeCoverageEnabled: true
      continueOnError: true

Modify the testFiltercriteria to exclude that category:

testFiltercriteria: 'Category=${{ feature }}&Category!=ignore'

The testFiltercriteria property translates to the --filter argument of dotnet test, or the /TestCaseFilter:<Expression> argument of vstest.console.exe.
The @ignore tag in Gherkin is translated to a [TestCategory(...)] attribute on the generated test methods (when using MSTest); other unit test providers have a similar mapping.
More info:
https://github.com/Microsoft/vstest-docs/blob/main/docs/filter.md
https://learn.microsoft.com/en-us/visualstudio/test/vstest-console-options
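Applied to the pipeline above, the per-feature VSTest task only needs its filter expression changed. A minimal sketch (other inputs unchanged from the original task):

```yaml
- ${{ each feature in parameters.featuresToRun }}:
  - task: VSTest@2
    displayName: ${{ feature }} Tests
    inputs:
      testSelector: 'testAssemblies'
      # run only this feature's category, but skip anything tagged @ignore
      testFiltercriteria: 'Category=${{ feature }}&Category!=ignore'
```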


How to deploy webapp and webjob in same pipeline for .net framework project

I have an ASP.NET MVC application (not .NET Core) which I am deploying to an Azure App Service using Azure DevOps and a YAML file. It's working fine, but I have another console application in the same solution and I want to deploy it as a WebJob in the same App Service.
I can't find any good help for .NET Framework; all the links I found demonstrate .NET Core.
Here is my YAML:
trigger:
- master

pool:
  vmImage: 'windows-2019'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

stages:
- stage: Build
  displayName: Build and Test Package
  jobs:
  - job: Build_Test_Publish
    displayName: Build_Test_Publish
    steps:
    - task: NuGetToolInstaller@1
    - task: VisualStudioTestPlatformInstaller@1
      displayName: 'Install Visual Studio Test Platform'
      inputs:
        packageFeedSelector: 'nugetOrg'
        versionSelector: 'latestStable'
    - task: NuGetCommand@2
      displayName: 'Restore NuGet packages'
      inputs:
        command: 'restore'
        restoreSolution: '$(solution)'
        feedsToUse: 'config'
        nugetConfigPath: './'
    - task: VSBuild@1
      displayName: 'Build Solution'
      inputs:
        solution: '$(solution)'
        msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
        platform: '$(buildPlatform)'
        configuration: '$(buildConfiguration)'
    - task: VSTest@2
      displayName: 'Run Unit Tests'
      inputs:
        platform: '$(buildPlatform)'
        configuration: '$(buildConfiguration)'
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'
        publishLocation: 'Container'

- stage: Deploy
  displayName: Deploy
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'PROD'
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: none
          - task: DownloadBuildArtifacts@0
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'drop'
              downloadPath: '$(System.ArtifactsDirectory)'
          - task: AzureRmWebAppDeployment@4
            inputs:
              ConnectionType: 'AzureRM'
              azureSubscription: 'mySubscription'
              appType: 'webApp'
              WebAppName: 'myWebApp'
              packageForLinux: '$(System.ArtifactsDirectory)/**/Web.zip'
To deploy a .NET Framework WebJob, you need to put it in a specific folder: App_Data/jobs/continuous for continuous WebJobs, or App_Data/jobs/triggered for scheduled or on-demand WebJobs.
The following tasks must be added to the YAML file:
- task: DotNetCoreCLI@2
  inputs:
    command: 'publish'
    publishWebProjects: true
    arguments: '--output $(Build.BinariesDirectory)/publish_output'
    zipAfterPublish: false

- task: DotNetCoreCLI@2
  inputs:
    command: 'publish'
    publishWebProjects: false
    projects: '**/*webjob.csproj'
    arguments: '--output $(Build.BinariesDirectory)/publish_output/App_Data/jobs/continuous/YoutubeWebJob'
    zipAfterPublish: false
    modifyOutputPath: false
The first task publishes the web app; the second one adds the WebJob, in this case a continuous WebJob.
Refer to this tutorial to deploy a WebJob and web app together in one pipeline.
Refer to this documentation on how to deploy .NET Framework WebJobs.
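Since the question is about .NET Framework, where dotnet publish may not apply, the same folder convention can also be met by copying the console project's build output into the WebJob path before publishing the artifact. A minimal sketch, assuming a hypothetical MyWebJob console project built by the earlier VSBuild step:

```yaml
- task: CopyFiles@2
  inputs:
    SourceFolder: 'MyWebJob/bin/Release'   # hypothetical console project output folder
    Contents: '**'
    # continuous WebJob convention; use App_Data/jobs/triggered for on-demand jobs
    TargetFolder: '$(build.artifactStagingDirectory)/App_Data/jobs/continuous/MyWebJob'
```

The resulting App_Data folder then travels inside the same package the deployment task pushes to the App Service.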

Azure DevOps Pipepine with YAML for solution with many projects

I have a .NET MVC solution in Visual Studio 2019 that contains 3 projects:
AdminWebApp
SharedCode (which is set as a dependency of both other projects in VS)
FrontWebApp
In Azure DevOps Pipelines I want to create separate builds for AdminWebApp and FrontWebApp, both of which will contain SharedCode because it contains helpers etc. I want to do it the YAML way.
Should I create 1 or 2 pipelines (each artifact will later be published to its own Azure App Service)? What is the YAML code to achieve this?
It really depends on what the management and release cycle looks like. The purist way would be to redeploy everything every time; the realistic way is to group deployment pipelines together where it makes sense.
In terms of the "YAML way", look at using YAML templates.
The template parameters would at least include the directory of the project to build. This is an example of a .NET Core template, but it will give you an idea of the thought process. For example purposes this YAML file would be called something like build-corewebapp.yml:
parameters:
  SolutionPath: ''
  BuildConfiguration: 'Release'
  projectName: ''
  DependsOn: []
  publish: 'false'

jobs:
- job: Build_${{ parameters.projectName }}
  dependsOn: ${{ parameters.DependsOn }}
  steps:
  - task: DotNetCoreCLI@2
    displayName: 'dotnet restore'
    inputs:
      command: 'restore'
      projects: '$(Build.SourcesDirectory)/${{ parameters.SolutionPath }}/${{ parameters.projectName }}**/*.csproj'
  - task: DotNetCoreCLI@2
    displayName: 'dotnet build'
    inputs:
      projects: '$(Build.SourcesDirectory)/${{ parameters.SolutionPath }}/${{ parameters.projectName }}**/*.csproj'
      arguments: '--configuration ${{ parameters.BuildConfiguration }}'
  - task: DotNetCoreCLI@2
    displayName: 'dotnet test'
    inputs:
      command: test
      projects: '$(Build.SourcesDirectory)/${{ parameters.SolutionPath }}/${{ parameters.projectName }}.Tests/*.csproj'
      arguments: '--configuration ${{ parameters.BuildConfiguration }} --collect "Code coverage"'

- job: Publish_${{ parameters.projectName }}
  dependsOn: Build_${{ parameters.projectName }}
  condition: and(succeeded(), eq('${{ parameters.publish }}', 'true'))
  steps:
  - task: DotNetCoreCLI@2
    displayName: 'dotnet publish'
    inputs:
      command: publish
      publishWebProjects: false
      projects: '$(Build.SourcesDirectory)/${{ parameters.SolutionPath }}/${{ parameters.projectName }}**/*.csproj'
      arguments: '--configuration ${{ parameters.BuildConfiguration }} --output $(build.artifactstagingdirectory)'
      zipAfterPublish: true
  - task: PublishBuildArtifacts@1
    displayName: 'Publish Artifact: drop'
A template would be called by something similar to:

jobs:
- template: build-corewebapp.yml
  parameters:
    projectName: ${{ variables.appProjectName }}
    solutionPath: $(solutionPath)
    publish: 'true'

For maximum reusability I would recommend that any build template live in a separate repository so it can be used by other repos. This would be set up in your pipeline by referencing the repo, similar to:

resources:
  repositories:
  - repository: repositoryTemplate
    type: git
    name: ProjectName/YAMLTEMPLATERepoName
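With a repository resource like the one above in place, the template is referenced with an @ suffix naming that resource. A minimal sketch (projectName and solutionPath values are hypothetical):

```yaml
jobs:
- template: build-corewebapp.yml@repositoryTemplate  # template file lives in the other repo
  parameters:
    projectName: AdminWebApp   # hypothetical project name
    solutionPath: src          # hypothetical solution folder
    publish: 'true'
```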
The advantage of using templates is that updating a task version or changing a build/deployment strategy can then be done in one place and picked up everywhere it is referenced.
You can use conditions to create separate builds, so that you can put all build steps in one pipeline. Here's a similar topic.
A simple sample for your build steps:
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $files=$(git diff HEAD HEAD~ --name-only)
      $temp=$files -split ' '
      $count=$temp.Length
      For ($i=0; $i -lt $temp.Length; $i++)
      {
        $name=$temp[$i]
        echo "this is $name file"
        if ($name -like "AdminWebApp/*")
        {
          Write-Host "##vso[task.setvariable variable=RunAdminWebApp]True"
        }
        if ($name -like "SharedCode/*")
        {
          Write-Host "##vso[task.setvariable variable=RunSharedCode]True"
        }
        if ($name -like "FrontWebApp/*")
        {
          Write-Host "##vso[task.setvariable variable=RunFrontWebApp]True"
        }
      }
- task: MSBuild@1
  inputs:
    solution: '**/AdminWebApp.csproj'
    msbuildArguments: 'xxx'
  condition: or(variables['RunAdminWebApp'], variables['RunSharedCode'])
- task: MSBuild@1
  inputs:
    solution: '**/FrontWebApp.csproj'
    msbuildArguments: 'xxx'
  condition: or(variables['RunFrontWebApp'], variables['RunSharedCode'])
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
If any file in your AdminWebApp project changes, then only the AdminWebApp and SharedCode projects are built (first build task).
If any file in your FrontWebApp project changes, then only the FrontWebApp and SharedCode projects are built (second build task).
And if a file in SharedCode changes, since both projects depend on it, both build tasks will run.
You should specify the msbuild arguments (/t:publish...) so that the build task generates a zip package to deploy (otherwise you need an additional task to zip the output files).
Since you'll get two published zip files once the SharedCode project changes, your release pipeline should have at least two deploy tasks. For your release: one PowerShell task (determine whether A.zip/B.zip exists and set custom variables DeployA/DeployB) and two conditional deploy tasks based on the values of DeployA/DeployB. (Just a suggestion; it's not your original issue, so I won't go into detail here.)
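That suggested release-side check could be sketched as an inline PowerShell step (the A.zip/B.zip artifact names and drop path are hypothetical):

```yaml
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Set DeployA/DeployB only when the corresponding package exists in the artifact
      if (Test-Path "$(System.ArtifactsDirectory)/drop/A.zip") {
        Write-Host "##vso[task.setvariable variable=DeployA]True"
      }
      if (Test-Path "$(System.ArtifactsDirectory)/drop/B.zip") {
        Write-Host "##vso[task.setvariable variable=DeployB]True"
      }
```

Each deploy task then gets a condition such as eq(variables['DeployA'], 'True'), mirroring the build-side pattern above.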

Azure pipelines UI to accept parameters (like Jenkins)

Jenkins has a UI concept with dropdown lists, etc. that allows users to specify variables at run time. This has proven essential in our builds for making decisions in the pipeline (i.e. which agent to run on, which code base to choose, etc.). By allowing parameters we are able to have a single pipeline/definition handle the same task for many clients/releases/environments.
I have watched many people over the past year ask for this - to eliminate the number of almost identical build definitions - is there a best practice to handle it? It would be nice to have a single build definition for a specific task that is smart enough to handle parameters.
Edit: an example of possible pseudo-code to build on levi-lu-MSFT's suggestion:
parameters:
- name: ClientName
  displayName: Pool Image
  default: Select client
  values: powershell
  valuesScript: [
    assemble curl request to http://myUrl.com/Clients/GetAll
  ]
- name: TargetEnvironment
  displayName: Client Environment
  type: string
  values: powershell
  valuesScript: [
    assemble curl request using above parameter value to
    https://myUrl.com/Clients/$(ClientName)/GetEnvironments
  ]

trigger: none

jobs:
- job: build
  displayName: Run pipeline job
  pool:
    vmImage: windows-latest
  parameters:
    ClientName: $(ClientName)
    TargetEnvironment: $(TargetEnvironment)
  steps:
  - script: echo building $(Build.BuildNumber)
Runtime parameters are available now. You can set runtime parameters at the beginning of your pipeline YAML using parameters, as in the example below:
parameters:
- name: image
  displayName: Pool Image
  default: ubuntu-latest
  values:
  - windows-latest
  - vs2017-win2016
  - ubuntu-latest
  - ubuntu-16.04
  - macOS-latest
  - macOS-10.14
- name: test
  displayName: Run Tests?
  type: boolean
  default: false

trigger: none

jobs:
- job: build
  displayName: Build and Test
  pool:
    vmImage: ${{ parameters.image }}
  steps:
  - script: echo building $(Build.BuildNumber)
  - ${{ if eq(parameters.test, true) }}:
    - script: echo "Running all the tests"
The example above is from the official Microsoft documentation; click here to learn more about runtime parameters.
When you run the above YAML pipeline, you will be able to select the parameter's value from a dropdown list.
Update: to set variables dynamically at runtime, you can use the task.setvariable logging command in scripts.
In the example below, $resultValue is the value returned from a REST API call, and it is assigned to the variable VariableName:
- powershell: |
    # hypothetical endpoint; replace with your real REST API
    $resultValue = Invoke-RestMethod -Uri 'https://example.com/api/values'
    echo "##vso[task.setvariable variable=VariableName]$resultValue"
Check document here for more information.
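A later step in the same job can then read the variable that the logging command set, using normal macro syntax. A minimal sketch:

```yaml
- powershell: |
    # VariableName was set by the earlier task.setvariable logging command;
    # it is only available in steps that run after the one that set it
    Write-Host "Fetched value: $(VariableName)"
```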

Azure pipeline building image from wrong Dockerfile

My project contains 2 Dockerfiles, one for the backend and one for a mock database. I have a build pipeline in Azure using the following script:
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1
- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:DesktopBuildPackageLocation="$(build.artifactStagingDirectory)\WebApp.zip" /p:DeployIisAppPath="Default Web Site"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: CopyFiles@2
  inputs:
    SourceFolder: './MyProject'
    Contents: '**'
    TargetFolder: '$(build.artifactStagingDirectory)'
    flattenFolders: true
- task: CopyFiles@2
  inputs:
    SourceFolder: './MyProject/Database'
    Contents: '**'
    TargetFolder: '$(build.artifactStagingDirectory)/Database'
    flattenFolders: true
- task: ArchiveFiles@2
  displayName: "Archive files"
  inputs:
    rootFolderOrFile: "$(build.artifactStagingDirectory)"
    includeRootFolder: true
    archiveFile: "$(System.DefaultWorkingDirectory)/build$(Build.BuildId).zip"
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(build.artifactStagingDirectory)'
    ArtifactName: 'backend'
I put 2 CopyFiles steps in there because I have 2 Dockerfiles: one in /MyProject for the backend and one in /MyProject/Database for a mock database. This way I can choose between both Dockerfiles later in my release pipeline.
The problem is that even though I select the Dockerfile in the backend folder for the release step, the pipeline uses the Dockerfile for the database. Presumably this is because it is the first Dockerfile encountered, even though it is located in a subdirectory of what I have specified. How can I make my pipeline use the correct Dockerfile?
Found the problem: setting flattenFolders: true meant that the Dockerfile in the subdirectory overwrote the top-level Dockerfile. By setting it to false I got the complete folder structure and could choose the correct Dockerfile.
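In the pipeline above, the fix amounts to turning flattening off on the copy steps so each Dockerfile keeps its directory, e.g.:

```yaml
- task: CopyFiles@2
  inputs:
    SourceFolder: './MyProject'
    Contents: '**'
    TargetFolder: '$(build.artifactStagingDirectory)'
    flattenFolders: false   # keep folder structure so both Dockerfiles survive
```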

Triggering different downstream jobs via templates in Jenkins Job Builder

I'm trying to trigger one or more downstream jobs using a job template. A summary of my definition:
- job-template:
    name: something-{app}-build
    project-type: freestyle
    defaults: global
    block-downstream: true
    scm:
    - git:
        url: url
        branches:
        - 'master'
        excluded-users:
        - Jenkins
    builders:
      !include: templates/build-and-publish.yml
    publishers:
    - postbuildscript:
        builders:
          !include: templates/docker-build-and-push-to-ecr.yml
        script-only-if-succeeded: True
        mark-unstable-if-failed: True
    - trigger-parameterized-builds:
      - project: 'deploy-dev-ecs-service'
        condition: SUCCESS
        predefined-parameters: |
          service={app}
          envparams={envparams}

- project:
    name: release-to-ecr
    type: app
    envparams: ''
    app:
    - app-1
    - app-2
    - app-3:
        envparams: 'FOO=42'
    jobs:
    - 'something-{app}-build'
Now this works, but I need to trigger different downstream jobs based on the app. This means triggering deploy-dev-ecs-service multiple times with multiple parameters. For example:
app:
- app-1:
  - project: deploy-dev-ecs-service
    service: 'app-1'
    envparams: 'foo=bar'
- app-2:
  - project: deploy-dev-ecs-service
    service: 'app-2.2'
    envparams: 'x=2'
  - project: deploy-dev-ecs-service
    service: 'app-2.3'
    envparams: 'x=3'
Essentially, I need to control which downstream job(s) get triggered based on
the project parameters. Is there a way to do this?
