I'm currently trying to learn BDD techniques and trying out frameworks for it. The idea is to go from tests to code, but I want to use it the other way around if possible.
I'm currently using a SpecFlow, Selenium and NUnit combination. I want to open my work to anyone (non-developers) for adding new examples or test cases. Let's say I have implemented all use cases; changing step orders or modifying/adding examples would still enrich my test cases.
The problem is that I have to build the project for each change in the feature files. Is it possible to use this technique that way?
Given I have scenarios
And I have step definitions
When I want to add new tests
Then I should be able to load .feature file as input
If that's not possible, I'm thinking about some parsing operation to map the feature files to my methods, but I guess there is a better way to implement this idea.
From what I understand, you want to avoid having to rebuild your test project for changes in your feature files.
The problem with what you are suggesting is that you are assuming your mapping is as simple as:
Feature file <---> StepDefs
But in reality what's happening is:
.feature file <---> FeatureFile.feature.cs <---> StepDefs
The features are mapped to the step definitions using a third file that is auto-generated when your test project builds. This file maps the two together. So even if you were simply passing in a .feature file, you would still have to do some kind of build in order to generate the .cs file and map the two files together.
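To make that concrete: the step definitions themselves are just plain C# bindings, and it is the generated .feature.cs file (an NUnit fixture in this combination) that actually drives them at run time. A minimal, purely illustrative sketch of a binding class (class name and step text are made up):

    using TechTalk.SpecFlow;

    [Binding]
    public class CartSteps
    {
        // SpecFlow matches the Gherkin text to this method via the attribute,
        // but only scenarios that were compiled into the generated
        // .feature.cs fixture are ever executed by NUnit.
        [Given(@"I have (\d+) items in my cart")]
        public void GivenIHaveItemsInMyCart(int count)
        {
            // arrange test state here
        }
    }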
Whenever I add or remove a table in one feature file, it seems to affect the code generation of the other features. This causes a lot of unnecessary file changes to be added to commits.
For example, this diff was caused by a change in a completely different feature:
Is there a way to configure the code generation to use locally sequential suffixes? I.e. I want all suffixes for a particular feature to start at table1 or table0 instead of continuing from the value in the "previous" feature. That way, changing a table in one feature has no impact on the code generation of another.
I am using SpecFlow v3.70
SpecFlow 3 generates the code-behind files using MSBuild. It is recommended to remove all *.feature.cs files from version control for precisely the reason you describe: these auto-generated files exhibit a lot of churn, so keeping them under version control is not beneficial. The real value is in tracking the feature files themselves. The .feature.cs files become an artifact of the build process rather than an asset that requires tracking.
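In practice that usually just means adding a pattern like *.feature.cs to your .gitignore (or the equivalent ignore rules of whatever version control system you use) while keeping the .feature files themselves checked in.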
I'm very new to SpecFlow and working on evaluating it. I was able to write a scenario and step definitions and execute the test. But now I'm stuck on integrating the feature file with TFS.
I want to know if there is a way to integrate a SpecFlow feature file with TFS (MTM).
The following is the workflow I want to accomplish:
A feature file is created with multiple scenarios
If the feature file is checked in, scenarios are automatically generated in TFS under the corresponding area (maybe using tags?)
Or I would appreciate it if you could share any other integration suggestions you may have.
Thank you in advance!
To export scenarios, the TFS API can be used.
Example: Create work item in Team Project (TFS) using C# code
https://social.msdn.microsoft.com/Forums/vstudio/en-US/1ff5415e-0ef2-4c65-b0b7-a109187adf51/create-work-item-in-team-project-tfs-using-c-code?forum=tfsgeneral
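As a rough illustration of that approach (not the exact code from the linked thread), the sketch below scans a feature file for scenario titles and creates a Test Case work item for each one using the classic TFS client object model. The collection URL, project name, file path and area path are placeholders, and a real implementation would be better off using the Gherkin parser than this naive line-based scan:

    using System;
    using System.IO;
    using System.Linq;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.WorkItemTracking.Client;

    class FeatureToTfs
    {
        static void Main()
        {
            // Placeholder values - adjust for your environment.
            var collection = new TfsTeamProjectCollection(
                new Uri("http://tfs:8080/tfs/DefaultCollection"));
            var store = collection.GetService<WorkItemStore>();
            Project project = store.Projects["MyProject"];
            WorkItemType testCaseType = project.WorkItemTypes["Test Case"];

            // Naive scan for scenario titles in one feature file.
            var scenarios = File.ReadAllLines(@"Features\Ordering.feature")
                .Select(line => line.Trim())
                .Where(line => line.StartsWith("Scenario:") ||
                               line.StartsWith("Scenario Outline:"));

            foreach (string scenario in scenarios)
            {
                // One Test Case work item per scenario; the area path could
                // instead be derived from tags in the feature file.
                var workItem = new WorkItem(testCaseType)
                {
                    Title = scenario,
                    AreaPath = @"MyProject\MyArea"
                };
                workItem.Save();
            }
        }
    }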
So, at work, I frequently have to create virtually identical Ant scripts. Basically, the application we provide to our clients is designed to be easily extensible, and we offer a service of designing and creating custom modules for it. Because of the complexity of our application, with lots of cross-dependencies, I tend to develop the module within our core dev environment, compile it using IntelliJ, and then run a basic Ant script that does the following tasks:
1) Clean build directory
2) Create build directory and directory hierarchy based on package paths.
3) Copy class files (and source files to a separate sources directory).
4) Jar it up.
The thing is, to do this I need to go through the script line by line and change a bunch of property names, so it works for the new use case. I also save all the scripts in case I need to go back to them.
This isn't the worst thing in the world, but I'm always looking for a better way to do things. Hence my idea:
For each specific implementation I would provide an Ant script (or other file) of just properties: key-value pairs, with a specific prefix on each key based on what it's used for. I would then want my Ant script to run the various tasks, executing each one for the key-value pairs that are appropriate.
For example, copying the class files. I would have a property with a name like "classFile.filePath". I would want the script to call the task for every property it detects that starts with "classFile...".
Honestly, from my current research so far, I'm not confident that this is possible. But... I'm super stubborn, and always looking for new creative options. So, what options do I have? Or are there none?
It's possible to dynamically generate Ant scripts; for example, the following does this using an XML input file:
Use pure Ant to search if list of files exists and take action based on condition
Personally I would always try and avoid this level of complexity. Ant is not a programming language.
Looking at what you're trying to achieve, it does appear you could benefit from packaging your dependencies as jars and using a Maven repository manager like Nexus or Artifactory for storage. This would simplify each sub-project build. When building projects that depend on these published libraries, you can use a dependency management tool like Apache Ivy to download them.
Hope that helps; your question is fairly broad.
I have been working with Pex (IntelliTest) for some time now, and I wondered whether it is possible to create the tests via some sort of command (or .exe) rather than through the IDE (VS2015) by right-clicking the function.
I have an automated process which builds my projects and then runs my tests. However, if the IntelliTests are not regenerated for the new implementation, they are rather useless.
This may seem like a basic question but unfortunately I could not find any information on the internet.
A command line for IntelliTest is not yet supported. If you would like to see it supported, kindly add your vote here: https://visualstudio.uservoice.com/forums/330519-team-services/suggestions/8623015-enable-intellitest-to-run-in-the-build-pipeline
The TFS build flow is defined in TFS 2010's build template (which is in fact a Windows Workflow Foundation file with a *.xaml extension).
That was pretty convenient for dealing with a single build definition in a simple project, but in the near future we'll have a more complicated project with many very different build definitions, some of which will nevertheless share significant common parts of their logic.
We have no wish to have the common logic replicated in each build template, and on the other hand having one super-smart, parametrizable build is not considered the best idea either.
Long story short, the question is:
Is there any possibility to put common logic into another build template (or whatever) and reuse it?
If not, do you have any approaches/recommendations for such a situation?
UPDATE
As K.Hoff mentioned, there is a possibility to create custom activities, but I want to go deeper and reuse not only activities but sequences as well (put simply, similar to what Ant or NAnt do: include one file in another, call one sequence from another, etc.).
I would recommend checking whether it is possible to write a code activity which executes a workflow (.xaml file) containing the common build functionality. Such a code activity could then be put into several "master" build templates, making it possible to reuse the common flow.
Here is an example of how to dynamically load and execute a workflow - http://msdn.microsoft.com/en-us/vs2010trainingcourse_introtowf_topic8.aspx.
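To give an idea of what such an activity could look like, here is a minimal sketch, assuming the common logic lives in a shared .xaml file that is available on the build agent; the file path and the "SourcesDirectory" argument are purely illustrative:

    using System.Activities;
    using System.Activities.XamlIntegration;
    using System.Collections.Generic;

    // Custom code activity that loads and runs a shared workflow file.
    public sealed class RunSharedWorkflow : CodeActivity
    {
        // Path to the common .xaml sequence on the build agent.
        public InArgument<string> WorkflowPath { get; set; }

        protected override void Execute(CodeActivityContext context)
        {
            string path = context.GetValue(WorkflowPath);

            // Load the shared sequence and execute it in-process,
            // passing whatever arguments the shared workflow declares.
            Activity common = ActivityXamlServices.Load(path);
            WorkflowInvoker.Invoke(common, new Dictionary<string, object>
            {
                { "SourcesDirectory", @"C:\Builds\Sources" }
            });
        }
    }

Dropping this activity into each "master" template then gives you a single place to maintain the shared sequence.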
We have a similar situation, but since most of our build scenarios are similar (i.e. get->build->test->deploy) we have mostly solved it with one big definition and custom activities. But we also make use of the ExecuteWorkflow activity available from the Community TFS Build Extensions.
This works well for "simple" scenarios; the reason we don't use it more extensively is that it's quite complicated to pass parameters between workflow executions. Here's a link to a problem I had with this (and, further down, the solution I found).
You can create custom code activities as explained here and reuse them in other build templates.
Another way is to implement good old MSBuild scripts and invoke them from MSBuild execution activities so they can be reused in many build process templates.
I can't find a quick way to reuse complete sequences; the only way we found is to write the activities as generically as possible and inject parameters to get them to run.
But I don't think it's a TFS problem, it's a Workflow problem.