SpecFlow test runsettings & user secrets

I have some SpecFlow tests that use runsettings files to pass parameters through to the test cases, and some of these parameters are secrets.
In our build and release pipelines on Azure DevOps I have figured out that I can choose the runsettings file and override its parameters with pipeline variables, but I am trying to figure out a way that we can still run these tests locally.
I want to be able to create user secrets using the .NET CLI tool and somehow have them passed into the tests at runtime, replacing anything in the runsettings with a matching ID.
I don't want anything written into the runsettings files, as that still poses the risk of someone committing a runsettings file with the secrets in it.
Any idea how I can accomplish this? I'm struggling to find anything.
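One possible shape for this, sketched here only as an assumption since the question has no answer in this thread: `dotnet user-secrets` stores values outside the repository, in `~/.microsoft/usersecrets/<UserSecretsId>/secrets.json` on Linux/macOS (under `%APPDATA%\Microsoft\UserSecrets` on Windows). A local wrapper script can read that file and export each secret as an environment variable before running `dotnet test`, so a test-run hook can pick the values up and the runsettings file never contains real secrets. The UserSecretsId, secret name, and value below are placeholders:

```shell
# Placeholder UserSecretsId; in a real project this comes from the
# <UserSecretsId> element of the test .csproj.
SECRETS_ID="aa11bb22-placeholder"
SECRETS_FILE="$HOME/.microsoft/usersecrets/$SECRETS_ID/secrets.json"

# Simulate what `dotnet user-secrets set "ApiKey" "local-value"` would write:
mkdir -p "$(dirname "$SECRETS_FILE")"
printf '{ "ApiKey": "local-value" }\n' > "$SECRETS_FILE"

# Export each secret before invoking `dotnet test`, so the runsettings
# file itself never has to contain the real values:
ApiKey="$(sed -n 's/.*"ApiKey": *"\([^"]*\)".*/\1/p' "$SECRETS_FILE")"
export ApiKey
echo "$ApiKey"
```

A [BeforeTestRun] hook (or a ConfigurationBuilder using AddUserSecrets) could then prefer the environment variable over the runsettings value when it is present.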

Related

What is the standard way of preconfiguring Jenkins?

I have a significant amount of pre-configuration that I want to automate for Jenkins, e.g. pre-configuring Gerrit for the Gerrit Trigger plugin, pre-configuring SAML, libraries, etc.
I'm aware of two methods typically used to do similar tasks:
Configuration as code plugin + yaml configuration
Groovy scripts executed from the init.groovy.d directory of the Jenkins home at Jenkins startup
My users want to be able to update the Jenkins configuration from the UI without needing to update YAML, which suggests the Configuration as Code plugin isn't fit for our purpose, as I believe it reapplies the configuration when the Jenkins container is restarted.
My hunch is to use Groovy scripts that remove themselves after the first execution, so that they don't reapply themselves on restart.
Is there a more standard way of pre-configuring Jenkins, or is Groovy my best bet?
TL;DR: Use the file system
Why? There is no "standard" way to achieve what you intend; the two approaches you suggest are certainly viable options.
From an operational point of view, however, it is good to select a solution that is
generic (so it can cover all aspects of Jenkins configuration) and
"simple" to use
Now,
"Configuration as Code" makes you depend on the corresponding plugin -- it may or may not support a specific configuration option.
With Groovy, it is sometimes quite difficult to find out how to set a Jenkins configuration option (and how to store the setting permanently).
Since all Jenkins configuration data is stored on-disk, another option for bootstrapping Jenkins with a well-defined configuration is to pre-fill those configuration files with proper content right away:
You can be sure that this works in all cases, including all border cases (like secret/encrypted data)
Users can change the data later on as needed
Usually, it's quite easy to find the proper configuration file
On the downside, there is a risk that the configuration file format might change with newer versions of the core or of some plugin. However, a similar risk exists for the two other solutions that you suggested.
Tip: for rolling out such pre-configured Jenkins setups, it is helpful to disable the Jenkins setup wizard by setting jenkins.install.runSetupWizard to false.
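A minimal sketch of that file-system approach, seeding a JENKINS_HOME directory before first start. The directory name, version string, and config content here are illustrative assumptions; real file formats depend on your core and plugin versions:

```shell
# Seed a Jenkins home with configuration files before the first start.
JENKINS_HOME=./jenkins_home_seed
mkdir -p "$JENKINS_HOME"

# One way to skip the setup wizard is to pin the "last executed version"
# marker to your Jenkins version (alternatively, pass
# -Djenkins.install.runSetupWizard=false to the JVM at startup).
echo "2.426.1" > "$JENKINS_HOME/jenkins.install.InstallUtil.lastExecVersion"

# Pre-fill a top-level config file; the real content depends on the
# core/plugin versions you deploy.
cat > "$JENKINS_HOME/config.xml" <<'EOF'
<?xml version='1.1' encoding='UTF-8'?>
<hudson>
  <numExecutors>2</numExecutors>
</hudson>
EOF
```

In a Docker-based rollout, the same seeding step would typically run in the image build or an entrypoint script before Jenkins itself starts.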
Combining terms like pre-configuring Jenkins, init.groovy.d, Jenkins home, Jenkins startup, etc. can sound confusing.
Once Jenkins is ready to use, most people just need to create jobs or pipelines, and for that you usually only need to install and configure some plugins. Very few of them need Groovy, because the goal is ease of use.
Advanced users can create their own plugins in Java, but almost everything is already available as a plugin.
You can use Groovy in scripted or declarative pipelines.
So if your question is really "What is the best way to create and configure jobs or pipelines?", I can advise you:
Use scripted or declarative pipelines as much as possible.
Use only verified and supported plugins.
Stop calling shell scripts stored on the hard drive.
Stop using complicated configurations; almost all requirements are already implemented and documented.
If you have a requirement that no plugin seems to cover, ask here on Stack Overflow, or develop your own plugin with a focus on configurability and release it for the benefit of the Jenkins community.

TFS build definitions: convert to a script or version the build definitions

In TFS, I want to treat build definitions as code that I can check in to a repository. This would help track deltas in case a build definition gets updated and is no longer in a working state. Having it as a script would also give me the option to run it locally.
Is that possible in TFS? In practice, I prefer writing the build, packaging, and deployment code in PowerShell. I am able to reuse my PowerShell scripts, and I find creating build definitions sometimes easy, but I was hoping that after I create the build definitions/steps, I could somehow extract them to a PowerShell script.
There is no built-in capability for that at the moment. The closest you can get is to use the REST API to extract the build definition as JSON, and then set up a CI/CD trigger to update the build definition through the REST API. It's not ideal. I agree with your approach.
There is a proposal to use YAML for exactly this purpose: https://github.com/Microsoft/vsts-tasks/blob/master/docs/yaml.md
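For illustration, a sketch of that REST round-trip with curl. The collection URL, project, definition ID, and credentials are placeholders, and api-version=2.0 is assumed here as the build-definitions API version for on-premises TFS 2015/2017:

```shell
# Placeholders: point these at your own collection and project.
COLLECTION_URL="https://tfs.example.com/DefaultCollection"
PROJECT="MyProject"
DEFINITION_ID=42

# Build the REST endpoint for a single build definition.
URL="$COLLECTION_URL/$PROJECT/_apis/build/definitions/$DEFINITION_ID?api-version=2.0"
echo "$URL"

# Fetch the definition as JSON and check it in (commented out here,
# since it needs a live server and a personal access token):
# curl -s -u user:PAT "$URL" > "build-definition-$DEFINITION_ID.json"
# git add "build-definition-$DEFINITION_ID.json"
# git commit -m "Snapshot build definition $DEFINITION_ID"
```

Running such a script on a schedule or trigger gives you the version history of the definition, even though the JSON itself is not directly runnable the way a PowerShell script would be.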

Jenkins Global Pipeline Library - what's a sane development workflow?

What is a sane development workflow for writing Jenkins global pipeline libraries and Jenkinsfiles? It's kind of a pain to check my changes in to the global pipeline library and then run a build with retry to modify the Jenkinsfile, then save the diff if it takes a couple of iterations.
Anybody have any recommendations? What do you do?
There is a third-party unit-testing framework for Jenkins pipelines: lesfurets/JenkinsPipelineUnit. It also covers shared libraries and allows you to verify the call stack of your pipeline scripts.
Just based on the small amount of context from your question, I can share what I've learned. YMMV.
Source-control the library and make your changes on a branch. You might need another repo to act as a guinea pig to test the new changes: add @branchname to the library declaration in its Jenkinsfile.
Run and test your code on a Jenkins instance running on your local machine. It may not match your production instance exactly, but it is much closer to your working copy, and therefore gives faster feedback. Also, you don't have to push your changes to test: commit locally, test, un-commit, change, commit, test, repeat. When you are done, or need to test on the production instance, push.
Use the Script Console as a Groovy scratchpad to test code in. It's missing a lot of plugins/features, but it's great for throwing together some test code to make sure the basics work.
Create small throwaway pipelines to test bits of functionality that you want to iterate quickly on before putting it in a real pipeline, and that won't work in the script console. This lets you focus on the functionality you're building and not worry about the other bits.
Important note: I know of at least one function that doesn't work in the Script Console but works fine in a real pipeline: readJSON. It throws an error as if you were doing something wrong, but it is simply broken in the console. I'm sure there are others.
I'll come back and add more as I think of it.

Configure Jenkins to generate fake dll

We are trying to run all the VSTest unit test cases as part of our CI.
I want to generate the Fakes dll once, after the actual dll is built in Jenkins.
Is there an MSBuild command to generate a Fakes dll from a C# dll that I can configure in Jenkins?
Right now we actually generate the Fakes dll from Visual Studio 2013 and copy it to the Jenkins machine (a predefined location).
This would be really helpful, as I'm trying to get rid of manually generating the Fakes dll and copying it to the Jenkins machine every time there is a change in the actual dll.
Could you elaborate on your question a little bit more? I think that will help us narrow down a good answer for you.
In general, Jenkins can easily execute two compile steps in succession to generate both of your dlls before testing: for example, as two different build steps, as a batch/shell/bash script, as downstream/pipeline jobs, or within a build scripting language like Ant/Maven/Gradle.

TFS/MSBuild Generic test - how to specify project-relative path

I'm trying to get Silverlight unit tests integrated into a TFS build, using a "Generic Test" on a test project to wrap StatLight.
When I hard-code the paths to everything it works fine, but I'm tearing my hair out trying to work out how to pass project-relative paths to the generic test so that it works in the TFS build environment.
The closest I've come is to have the build process workflow set environment variables pointing to StatLight and the xap files containing the SL unit tests so that I can reference those in the command line arguments to statlight.
It feels hacky, though. Is there a better way of getting path information into generic tests based on the current solution/build configuration?
The MSDN documentation uses a couple of environment variables (%TestDeploymentDir%, for one) that I can't find documented anywhere, so I'm wondering whether there are any more magic variables I can use to infer paths from, rather than having to set my own.
Here are some references on MSBuild properties:
Common MSBuild Project Properties
MSBuild Reserved Properties
Team Build 2008 Property Reference
Hope that helps.
