I have problems setting up my SpecFlow project. I want to have some step definitions in a shared assembly so that I can reuse them.
I have two assemblies: A and B.
Assembly A contains step definitions which should be used by a feature file located in Assembly B.
Assembly B references Assembly A.
Assembly A contains the following specflow.json file:
{
  "language": {
    "feature": "de-CH"
  }
}
Assembly B contains the following specflow.json file:
{
  "language": {
    "feature": "de-CH"
  },
  "stepAssemblies": [
    { "assembly": "A" }
  ]
}
This works perfectly fine as long as both assemblies have their output path set to bin\Debug\
However, we generally have set up our solution so that all projects are built into a shared directory.
This speeds up compilation a lot, and we need this because our solution is huge.
So if I go to the Project Property Page of both projects A and B, open the Build tab, and change the Output Path to a shared path, e.g. ..\DebugBuild, then SpecFlow can no longer find the step definitions. The test in Assembly B is now Inconclusive and the following output is shown:
No matching step definition found for one or more steps.
How can I use external step definitions when using a shared output directory?
Best Regards
Matthias
I'm using a macro which generates *.kt files in the bazel-bin folder.
My plan was to encapsulate those .kt files in a kt_jvm_library.
I am trying this, but it's not working:
kt_jvm_library(
name = "generated-stuff",
srcs = ["bazel-bin/src/main/java/com/example/Hello.kt"],
deps = [
...
],
)
In Gradle I could just do:
sourceSets {
main {
java {
srcDir("${buildDir.absolutePath}/generated/...")
}
resources {
srcDir ('config')
}
}
}
Trying to find an equivalent in Bazel.
You need to use a label to refer to outputs from other rules, so that Bazel understands where they're coming from and can set up the correct dependencies. Depending on your BUILD file layout, that would be something like //src/main/java/com/example:Hello.kt or //src/main/java:com/example/Hello.kt. If it's in the same BUILD file, then just :Hello.kt would work too. It depends on where the package boundary is (basically where the deepest-nested BUILD file is).
Also, depending on the rule, you may not be able to refer to Hello.kt directly, you may have to use the label of the rule instead. Using the label of a rule in srcs will typically use its default outputs.
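For example, assuming the macro instantiates a rule in the same BUILD file as the library (the generate_hello name below is hypothetical), the generated file can be referenced by label rather than by its bazel-bin path:
# src/main/java/com/example/BUILD -- hypothetical layout
# Stand-in for whatever macro/rule actually generates Hello.kt.
generate_hello(
    name = "generate_hello",
    out = "Hello.kt",
)

kt_jvm_library(
    name = "generated-stuff",
    # Same-package label for the generated file; ":generate_hello"
    # would also work and uses the rule's default outputs.
    srcs = [":Hello.kt"],
    deps = [
        ...
    ],
)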
There are two projects in the same solution. The first project is a .NET Core project, and it has all the code (controllers, models, etc.) related to packages. I need to get the information (id, name, description) of the packages and display it in the second project (a .NET Core web app with Razor). Is it possible to do this without changing the first project? I only want to show the package list on a single web page.
I tried calling the first project's controller, but it didn't work. Maybe I missed a point. Any help is appreciated.
This requirement can be achieved.
Tips
If you want to call another project's controller from a project in the same solution, you need to make sure there isn't a HomeController in both projects. That is, the name of any controller class should be unique across the two projects.
Otherwise you will face the same issue I did on my homepage.
Test Code:
public List<PackageReference> GetPackageList5(string projectname)
{
    List<PackageReference> list = new List<PackageReference>();
    // Load the referenced project's .csproj file from the content root.
    XDocument doc = XDocument.Load(_webHostEnvironment.ContentRootPath + "/" + projectname + ".csproj");
    // Select all <PackageReference> elements and map them onto our model.
    var packageReferences = doc.XPathSelectElements("//PackageReference")
        .Select(pr => new PackageReference
        {
            Include = pr.Attribute("Include").Value,
            Version = pr.Attribute("Version").Value
        });
    Console.WriteLine($"Project file contains {packageReferences.Count()} package references:");
    foreach (var packageReference in packageReferences)
    {
        list.Add(packageReference);
        //Console.WriteLine($"{packageReference.Include}, version {packageReference.Version}");
    }
    return list;
}
My Test Steps:
create two projects, Net5MVC and Net6MVC
add a project reference
My .NET 6 project references the .NET 5 project. So in my HomeController (.NET 6), I add the following:
using Net5MVC.ForCore6;
using Net5MVC.Models;
Suggestion
When we reference the .net5 project in the .net6 project, the build succeeds, but publishing always fails. The reason is that there are multiple publish output files with the same relative path:
Found multiple publish output files with the same relative path:
D:\..\Net6\Net6\Net5MVC\appsettings.Development.json,
D:\..\Net6\Net6\Net6MVC\appsettings.Development.json,
D:\..\Net6\Net6\Net5MVC\appsettings.json,
D:\..\Net6\Net6\Net6MVC\appsettings.json.
Usually you would add a class library to the current project, not a web project.
As we know, the package info can be found in the .csproj file, so we need to copy the .csproj file into the publish folder.
I still recommend using the GetPackageList5 method above as an interface for your project, calling it with HttpClient.
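A minimal sketch of such an HttpClient call, assuming the first project exposes GetPackageList5 as an action that returns the list as JSON (the URL, port, and class name below are hypothetical):
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

public class PackageClient
{
    private readonly HttpClient _http = new HttpClient();

    public async Task<List<PackageReference>> GetPackagesAsync(string projectName)
    {
        // Deserialize the JSON list returned by the other project's action.
        return await _http.GetFromJsonAsync<List<PackageReference>>(
            $"https://localhost:5001/Home/GetPackageList5?projectname={projectName}");
    }
}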
I'm trying to extract info from C++ source code.
One piece of that is a field's type.
Given source code like the one below, I want to extract the type of info when info.call() is called.
Info info;
//skip
info.call(); //<- from here
By making a visitor which visits IASTName nodes, I tried to extract the type info as below.
public class CDTVisitor extends ASTVisitor {
    public CDTVisitor(boolean visitNodes) {
        super(true);
    }

    public int visit(IASTName node) {
        if (node.resolveBinding().getName().toString().equals("info"))
            System.out.println(((IField) node.getBinding()).getType());
        // This does not work properly.
        // The result is "org.eclipse.cdt.internal.core.dom.parser.ProblemType@86be70a"
        return 3;
    }
}
Assuming the code is in fact valid, a variable's type resolving to a ProblemType is an indication of a configuration problem in whatever tool or plugin is running this code, or in the project/workspace containing the code on which it is run.
In this case, the type of the variable info is Info, which is presumably a class or structure type, or a typedef. To resolve it correctly, CDT needs to be able to see the declaration of this type.
If this type is not declared in the same file that's being analyzed, but rather in a header file included by that file, CDT needs to use the project's index to find the declaration. That means:
The AST must be index-based. For example, if using ITranslationUnit.getAST to create the AST, the overload that takes an IIndex parameter must be used, and a non-null argument must be provided for it.
Since an IIndex is associated with a CDT project, the code being analyzed needs to be part of a CDT project, and the project needs to be indexed.
In order for the indexer to resolve #include directives correctly, the project's include paths need to be configured correctly, so that the indexer can actually find the right header files to parse.
Any one of these not being the case can lead to a type resolving to a ProblemType.
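Putting these requirements together, a minimal sketch of the index-based route could look like this (tu is assumed to be an ITranslationUnit from an indexed CDT project; exception handling is omitted):
// Obtain an index-based AST so that types declared in included headers resolve.
IIndex index = CCorePlugin.getIndexManager().getIndex(tu.getCProject());
index.acquireReadLock();
try {
    IASTTranslationUnit ast = tu.getAST(index, ITranslationUnit.AST_SKIP_INDEXED_HEADERS);
    // ... analyze the AST while still holding the read lock ...
} finally {
    index.releaseReadLock();
}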
Self response.
The reason I couldn't get a binding object was the type of the AST.
When parsing C++ source code, I should have used ICPPASTTranslationUnit.
The code above doesn't show it, but I had used IASTTranslationUnit as the return type of the AST.
After using ICPPASTTranslationUnit instead of IASTTranslationUnit, I solved this problem.
Yes, I figured it out! Here is the entire code, which can index all files in the "src" folder of a C++ project and output the resolved type binding for all code expressions, including the return value of low-level APIs such as memcpy. Note that the project variable in the following code is created by programmatically importing an existing, manually configured C++ project. I often manually create an empty C++ project and programmatically import it as a general project (once imported, Eclipse will automatically detect the project type and complete the relevant CPP project configuration). This is much more convenient than creating and configuring a C++ project from scratch programmatically. When importing the project, it is better not to copy the project or containment structures into the workspace, because this may lead to infinitely copying the same project into subfolders (infinite folder depth). The code works in the Eclipse 2021-12 version. I downloaded Eclipse-For-cpp and installed the plugin-development and JDT plugins. Then I created an Eclipse plugin project and extended the "org.eclipse.core.runtime.applications" extension point.
In other words, it is an Eclipse-Application plugin project which can use nearly all features of Eclipse but does not start the graphical interface (UI) of Eclipse. You should add all CDT-related non-UI plugins as dependencies, because new versions of Eclipse no longer automatically add missing plugins.
ICProject cproject = CoreModel.getDefault().getCModel().getCProject(project.getName());
// This code creates an index for the entire project.
IIndex index = CCorePlugin.getIndexManager().getIndex(cproject);
IFolder folder = project.getFolder("src");
IResource[] rcs = folder.members();
// Iterate all source files in the src folder and visit all expressions
// to print the resolved type binding.
for (IResource rc : rcs) {
    if (rc instanceof IFile) {
        IFile f = (IFile) rc;
        ITranslationUnit tu = (ITranslationUnit) CoreModel.getDefault().create(f);
        index.acquireReadLock(); // we need a read-lock on the index
        ICPPASTTranslationUnit ast = null;
        try {
            ast = (ICPPASTTranslationUnit) tu.getAST(index, ITranslationUnit.AST_SKIP_INDEXED_HEADERS);
        } finally {
            index.releaseReadLock();
        }
        if (ast != null) {
            // "true" makes the visitor visit all node kinds,
            // including expressions (sets shouldVisitExpressions etc.).
            ast.accept(new ASTVisitor(true) {
                @Override
                public int visit(IASTExpression expression) {
                    // Get the resolved type binding of the expression.
                    IType etp = expression.getExpressionType();
                    System.out.println("IASTExpression type:" + etp + "#expr_str:" + expression.toString());
                    return super.visit(expression);
                }
            });
        }
    }
}
I'd like to modify some of my Grails domain classes at compilation time. I initially thought this was a job for Groovy's global ASTTransformation, since I don't want to annotate my domain classes (which local transformers require). What's the best way to do this?
I also tried mimicking DefaultGrailsDomainClassInjector.java by creating my own class in the same package and implementing the same interfaces, but I probably just didn't know how to package it up in the right place, because I never saw my methods get invoked.
On the other hand, I was able to manually create a JAR which contained a compiled AST transformation class, along with the META-INF/services artifacts that plain Groovy global transformations require. I threw that JAR into my project's "lib" dir and visit() was successfully invoked. Obviously this was a sloppy job, because I am hoping to keep the source code of my AST transformation in a Grails plugin and not require a separate JAR artifact if I don't have to. Plus, I couldn't get this approach to work by putting the JAR in my Grails plugin's "lib"; I had to put it into the Grails app's "lib" instead.
This post helped a bit too: Grails 2.1.1 - How to develop a plugin with an AstTransformer?
The thing about global transforms is that the transform code should be available when compilation starts. Having the transformer in a jar was what I did first! But as you said, it is a sloppy job.
What you want to do is have your AST-transforming class compiled before the others get to the compilation phase. Here is what you do!
Preparing the transformer
Create a directory called precompiled in the src folder, and add the transformation class and the classes the transformer uses (such as annotations) to this directory with the correct package structure.
Then create a file called org.codehaus.groovy.transform.ASTTransformation in precompiled/META-INF/services, and you will have the following structure.
precompiled
--amanu
----LoggingASTTransformation.groovy
--META-INF
----services
------org.codehaus.groovy.transform.ASTTransformation
Then write the fully qualified name of the transformer in the org.codehaus.groovy.transform.ASTTransformation file; for the example above, the fully qualified name would be amanu.LoggingASTTransformation.
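That is, the services file contains just one line:
amanu.LoggingASTTransformation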
Implementation
package amanu

import org.codehaus.groovy.transform.GroovyASTTransformation
import org.codehaus.groovy.transform.ASTTransformation
import org.codehaus.groovy.control.CompilePhase
import org.codehaus.groovy.ast.ASTNode
import org.codehaus.groovy.ast.ClassNode
import org.codehaus.groovy.control.SourceUnit

@GroovyASTTransformation(phase = CompilePhase.CANONICALIZATION)
class TeamDomainASTTransformation implements ASTTransformation {

    public void visit(ASTNode[] nodes, SourceUnit sourceUnit) {
        println("*********************** VISIT ************")
        sourceUnit.getAST()?.getClasses()?.each { classNode ->
            // classNode is a class contained in the file being compiled
            classNode.addProperty("field", ClassNode.ACC_PUBLIC, new ClassNode(Class.forName("java.lang.String")), null, null, null)
        }
    }
}
Compilation
After implementing this, you can go one of two ways! The first approach is to put it in a jar, like you did, and the other is to use a groovy script to compile it before the others. To do this in Grails, we use the _Events.groovy script.
You can do this from a plugin or the main project; it doesn't matter. If it doesn't exist, create a file called _Events.groovy and add the following content.
The code is copied from reinhard-seiler.blogspot.com, with modifications.
eventCompileStart = { target ->
    ...
    compileAST(pluginBasedir, classesDirPath)
    ...
}

def compileAST(def srcBaseDir, def destDir) {
    ant.sequential {
        echo "Precompiling AST Transformations ..."
        echo "src ${srcBaseDir} ${destDir}"
        path id: "grails.compile.classpath", compileClasspath
        def classpathId = "grails.compile.classpath"
        mkdir dir: destDir
        groovyc(destdir: destDir,
                srcDir: "$srcBaseDir/src/precompiled",
                classpathref: classpathId,
                stacktrace: "yes",
                encoding: "UTF-8")
        copy(toDir: "$destDir/META-INF") {
            fileset(dir: "$srcBaseDir/src/precompiled/META-INF")
        }
        echo "done precompiling AST Transformations"
    }
}
The previous script will compile the transformer before the others are compiled! This enables the transformer to be available for transforming your domain classes.
Don't forget
If you use any class other than those already on your classpath, you will have to precompile those too. The above script will compile everything in the precompiled directory, and you can also put classes that don't need AST transformation, but are needed by it, in that directory!
If you want to use domain classes in the transformation, you might want to do the precompilation in the eventCompileEnd block! But this will make things slower!
Update
@Douglas Mendes mentioned there is a simpler way to trigger precompilation, which is more concise:
eventCompileStart = { target ->
    projectCompiler.srcDirectories.add(0, "./src/precompiled")
}
I have a simple dart class I am trying to test.
To test it I need to open a txt file, feed the content to an instance of the class and check that the output is correct.
Where do I place this txt file? The txt file is useless outside of testing.
Also, related: how do I access its directory consistently? I tried placing it in the test folder, but the problem is that:
System.currentDirectory
returns a different directory depending on whether I am running the test on its own or via the script that runs all the other test dart files at once.
I check if System.currentDirectory is the directory containing the pubspec.yaml file; if not, I move the current directory upwards until I find the directory containing the pubspec.yaml file, and then continue with the test code.
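A minimal sketch of that walk-upwards approach (the helper name is my own):
import 'dart:io';

/// Walks upwards from the current directory until it reaches the
/// directory containing pubspec.yaml (the package root).
Directory packageRoot() {
  var dir = Directory.current;
  while (!File('${dir.path}/pubspec.yaml').existsSync()) {
    if (dir.parent.path == dir.path) {
      throw StateError('pubspec.yaml not found in any parent directory');
    }
    dir = dir.parent;
  }
  return dir;
}

// Usage in a test:
// final input = File('${packageRoot().path}/test/data/input.txt');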
Looks like the package https://pub.dev/packages/resource is also suitable for this now.
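For example, something along these lines (untested sketch; note that package: URIs resolve into a package's lib/ directory, so the data file would need to live under lib/, and the path below is hypothetical):
import 'package:resource/resource.dart' show Resource;

Future<String> loadTestData() async {
  // A package: URI works regardless of the current working directory.
  final resource = Resource('package:my_package/testdata/input.txt');
  return resource.readAsString();
}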
I have still not found a definitive answer to this question. I've been looking for something similar to the testdata directory in Go and the src/test/resources directory in Java.
I'm using Android Studio and have settled on using a test_data.dart file at the top of my test directory. In there I define my test data (mostly JSON) and then import it into my individual tests. This doesn't help if you need to deal with binary files, but it has been useful for my JSON data. I'll also inject the JSON language with //language=json so I can open the fragment in a separate window to format it.
//language=json
const consolidatedWeatherJson = '''{
"consolidated_weather": [
{
"id": 4907479830888448,
"weather_state_name": "Showers",
"weather_state_abbr": "s",
"wind_direction_compass": "SW",
"created": "2020-10-26T00:20:01.840132Z",
"applicable_date": "2020-10-26",
"min_temp": 7.9399999999999995,
"max_temp": 13.239999999999998,
"the_temp": 12.825,
"wind_speed": 7.876886316914553,
"wind_direction": 246.17046093256732,
"air_pressure": 997.0,
"humidity": 73,
"visibility": 11.037727173307882,
"predictability": 73
}
]
}
''';
Using the Alt + Enter key combination will bring up the Edit JSON Fragment option. Selecting that opens the fragment in a new editor, and any changes made there (formatting, for example) will be reflected in the fragment.
Not perfect, but it solves my issues.