Strange behaviour with SpecFlow+ Runner using an older version of Newtonsoft.Json - how do you update it? - specflow

I have a dotnet core 3.1 class library, providing some service implementations. I wanted to use Specflow to create some BDD tests to test various scenarios with my service.
I created a SpecFlow tests project which uses the SpecFlow+ Test Runner. If I inspect the "deps.json" file present in my bin folder, all references to Newtonsoft.Json are set to version 12.0.3 - currently the latest version.
However, when the solution is built, some files are copied to this location \Specflow.Tests\bin\Debug\netcoreapp3.1\SpecFlowPlusRunner\ - one of them being \Specflow.Tests\bin\Debug\netcoreapp3.1\SpecFlowPlusRunner\netcoreapp3.1\Newtonsoft.Json.dll
This version of Newtonsoft.Json is only 11.0.2
If I look at "Manage NuGet Packages for Solution" and switch to the Consolidate tab, all referenced versions are 12.0.3 - I manually added Newtonsoft.Json to all projects in an attempt to align the versions.
When I run the tests, I get this error:
System.InvalidCastException: '[A]Newtonsoft.Json.Linq.JObject cannot be cast to [B]Newtonsoft.Json.Linq.JObject.
Type A originates from 'Newtonsoft.Json, Version=11.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' in the context 'Default' at location 'C:\Source\Repos\BSP_Infrastructure\deployment\CleanUp\src\WTW.ICT.BSP.CleanUp.Specflow.Tests\bin\Debug\netcoreapp3.1\SpecFlowPlusRunner\netcoreapp3.1\Newtonsoft.Json.dll'.
Type B originates from 'Newtonsoft.Json, Version=12.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' in the context 'Default' at location 'C:\Users\ian64639\.nuget\packages\newtonsoft.json\12.0.3\lib\netstandard2.0\Newtonsoft.Json.dll'.'
My code does some casting, internal to the class library:
public async Task<string[]> GetSpaUris(string id)
{
    var app = await GetApplicationRegistrationAsync(id).ConfigureAwait(false);
    return GetSpaUris(app);
}

public string[] GetSpaUris(Application application) =>
    ((JArray)((JObject)application.AdditionalData["spa"])["redirectUris"]).ToObject<string[]>();
And it returns a string array. I don't return any JArray or JObject objects/references, so this should not "leak" any Newtonsoft.Json dependencies. I am not trying to cast JObject to JObject across projects, so no cross-assembly casting should occur.
I created a small console app:
var host = new HostBuilder().Build();
var svc = host.Services.GetService<ApplicationRegistrationService>();
var uris = await svc.GetSpaUris("b3340c1e-c37a-471f-8c90-4e25f27990e8");
And this has no such casting issues. So it seems it's the version of Newtonsoft.Json that the SpecFlow+ Runner uses that is likely to be the issue.
I return a string[] precisely to try to avoid dependency crossover issues, though.
I looked at the Specflow docs, and it says you can create a specflow.json config file, and you can specify "Custom" dependencies.
dependencies - Specifies custom dependencies for the SpecFlow runtime. See Plugins for details. Default: not specified
https://docs.specflow.org/projects/specflow/en/latest/Extend/Plugins.md
However, this link just returns a "This page does not exist yet" error page.
Anyone got any ideas how I can resolve it? Is custom dependencies my salvation? If so, does anyone know how?

Custom dependencies are only for changing the behavior of SpecFlow itself. They have nothing to do with this.
Newtonsoft.Json 11 is used by the SpecFlow+ Runner internally. Dependencies of test runners are not resolved through normal NuGet dependency resolution; they have to be shipped inside the test runner's NuGet package. That's why you still see the v11 assembly even after updating all your projects to v12.
And because of how assembly loading works in .NET, this gets complicated and you end up with this error.
At the moment the only thing I can think of that will unblock you is to also use Newtonsoft.Json v11 in your own projects. I hope this is possible.
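To sketch that workaround concretely: pin the version the runner ships (11.0.2, per the error message above) in every project of the solution. This is only an illustrative .csproj fragment, not a SpecFlow-documented fix:

```xml
<!-- Hypothetical fragment of each .csproj in the solution (test project,
     class library, console app): force Newtonsoft.Json down to the same
     version the SpecFlow+ Runner bundles, so only one version loads. -->
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
</ItemGroup>
```

Note that .NET Core ignores app.config binding redirects, so aligning the package version everywhere is the practical lever here.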
Please open an issue at https://github.com/SpecFlowOSS/SpecFlow/ about this, so we can have a look at it and fix this.
Full disclosure: I am the community manager of SpecFlow and SpecFlow+

Related

How to access node_modules folder after running yarn_install (or npm_install) in rules_nodejs bazel?

I'm relatively new to Bazel but this has taken longer than I felt it should. I am doing yarn_install in my workspace, and I am just trying to reference the installed node_modules so that I can put them in my new docker container.
Workspace
yarn_install(
    name = "npm",
    package_json = "//:package.json",
    yarn_lock = "//:yarn.lock",
)
BUILD.bazel
load("@io_bazel_rules_docker//nodejs:image.bzl", "nodejs_image")

nodejs_image(
    name = "webapi_image",
    # gets all the files in my directory
    data = glob(
        ["**/*"],
        # references the node modules, but doesn't work :(
    ) + ["@npm//node_modules"],
    entry_point = "//:app.js",
)
I've been able to get specific packages (i.e. @npm//express), but if I try to access node_modules then I just get:
no such package '@npm//node_modules': Package is considered deleted due to --deleted_packages and referenced by '//:webapi_image'
I'm not sure I totally understand why I can access individual packages (i.e. @npm//express) but not node_modules (i.e. @npm//node_modules).
But after bumping around, I found that if I just use the label @npm//:node_modules, it finally works.
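The difference is just where the colon goes: "@npm//node_modules" names a package, while "@npm//:node_modules" names the node_modules target that rules_nodejs generates in the @npm repository's root package. A minimal sketch of the working rule, reusing the names from the question:

```starlark
# BUILD.bazel sketch (target names taken from the question above).
# "@npm//:node_modules" is the filegroup covering all installed modules.
nodejs_image(
    name = "webapi_image",
    data = glob(["**/*"]) + ["@npm//:node_modules"],
    entry_point = "//:app.js",
)
```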

bazel with protobuf / gRPC-gateway / golang - getting started

So I am trying to convert a monorepo of microservices (C#, Go, NodeJS) to use Bazel. Just playing with it for now.
I focused on one Go service to get started and isolated it as a WORKSPACE.
The Go service is a gRPC service that uses protobuf (obviously), grpc-gateway with protoc-gen-swagger, and also protoc-gen-gorm (this one does not support Bazel).
The code builds using a command like go build cmd/server/server.go
I am hoping to get some guidance on how to get started building this project with all the dependencies.
I see several rules available for protobuf/Go, and I am not yet comfortable browsing through them or deciding which is better (I cannot get any of them to work because of grpc-gateway or protoc-gen-gorm):
- https://github.com/stackb/rules_proto
- https://github.com/bazelbuild/rules_go
- https://github.com/stackb/rules_proto/tree/master/github.com/grpc-ecosystem/grpc-gateway
Code structure looks like this:
/repo
    svc1
    svc2
    svc3
        cmd/server
            BUILD.bazel
            server.go
        pkg
            (folders with some Go files and a BUILD.bazel in each)
        proto
            BUILD.bazel
            test.proto
        WORKSPACE
        BUILD.bazel
Right now I only work on svc3. Later I will probably move the WORKSPACE to the parent folder.
My WORKSPACE looks like this:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "io_bazel_rules_go",
    sha256 = "96b1f81de5acc7658e1f5a86d7dc9e1b89bc935d83799b711363a748652c471a",
    urls = [
        "https://storage.googleapis.com/bazel-mirror/github.com/bazelbuild/rules_go/releases/download/0.19.2/rules_go-0.19.2.tar.gz",
        "https://github.com/bazelbuild/rules_go/releases/download/0.19.2/rules_go-0.19.2.tar.gz",
    ],
)

load("@io_bazel_rules_go//go:deps.bzl", "go_register_toolchains", "go_rules_dependencies")

go_rules_dependencies()

go_register_toolchains()

http_archive(
    name = "bazel_gazelle",
    sha256 = "be9296bfd64882e3c08e3283c58fcb461fa6dd3c171764fcc4cf322f60615a9b",
    urls = [
        "https://storage.googleapis.com/bazel-mirror/github.com/bazelbuild/bazel-gazelle/releases/download/0.18.1/bazel-gazelle-0.18.1.tar.gz",
        "https://github.com/bazelbuild/bazel-gazelle/releases/download/0.18.1/bazel-gazelle-0.18.1.tar.gz",
    ],
)

load("@bazel_gazelle//:deps.bzl", "gazelle_dependencies", "go_repository")

gazelle_dependencies()

load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

git_repository(
    name = "com_google_protobuf",
    commit = "09745575a923640154bcf307fba8aedff47f240a",
    remote = "https://github.com/protocolbuffers/protobuf",
    shallow_since = "1558721209 -0700",
)

load("@com_google_protobuf//:protobuf_deps.bzl", "protobuf_deps")

protobuf_deps()

...plus a bunch of go_repository() rules created by Gazelle.
Running Gazelle created a bunch of BUILD.bazel files for my Go project, one in each folder.
Next to the .proto file, I have a generated BUILD.bazel file:
load("@io_bazel_rules_go//go:def.bzl", "go_library")
load("@io_bazel_rules_go//proto:def.bzl", "go_proto_library")

proto_library(
    name = "svc_proto",
    srcs = ["test.proto"],
    visibility = ["//visibility:public"],
    deps = [
        # the two github repos below are referenced as go_repository
        "@com_github_infobloxopen_protoc_gen_gorm//options:proto_library",  # not sure what to put after the colon
        "@com_github_grpc_ecosystem_grpc_gateway//protoc-gen-swagger/options:proto_library",
        "@go_googleapis//google/api:annotations_proto",
    ],
)

go_proto_library(
    name = "svc_go_proto",
    compilers = ["@io_bazel_rules_go//proto:go_grpc"],
    importpath = "src/test/proto/v1",
    proto = ":svc_proto",
    visibility = ["//visibility:public"],
    deps = [
        "//github.com/infobloxopen/protoc-gen-gorm/options:go_default_library",
        "//github.com/grpc-ecosystem/grpc-gateway/protoc-gen-swagger/options:go_default_library",
        "@go_googleapis//google/api:annotations_go_proto",
    ],
)

go_library(
    name = "go_default_library",
    embed = [":svc_go_proto"],
    importpath = "src/test/proto/v1",
    visibility = ["//visibility:public"],
)
Now the questions:
1. I am not sure what to put to reference other proto files: "@com_github_infobloxopen_protoc_gen_gorm//options:proto_library"? And I am not sure this is the best way to reference other external libraries from git.
2. If I build the above using bazel build //proto/v1:svc_proto, I get: no such target '@com_github_grpc_ecosystem_grpc_gateway//protoc-gen-swagger/options:proto_library': target 'proto_library' not declared in package 'protoc-gen-swagger/options'. Probably linked to 1.
3. I am not sure which rule set to use. As I need grpc-gateway, I guess I need to use https://github.com/stackb/rules_proto/tree/master/github.com/grpc-ecosystem/grpc-gateway exclusively, but I can't make those work either.
4. I use statik (https://github.com/rakyll/statik) to package the swagger file in Go to serve the swagger. Is there any alternative, or if not, how can I call a custom bash command as part of the build process in the chain?
In summary, I am pretty sure my BUILD.bazel file for building the proto and the library is structured wrong, and I would appreciate some up-to-date guidance (GitHub is full of repos that are outdated, use outdated rules, or simply don't work).
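On the statik question (running a custom command as part of the build), Bazel's built-in genrule can wrap an arbitrary command. A hedged sketch, where the //tools:statik label, the input file name, and the exact flags are all assumptions rather than anything from this project:

```starlark
# Hypothetical genrule invoking the statik code generator during the build.
# Assumes a statik binary is available as the target //tools:statik and the
# swagger JSON sits next to this BUILD file; adjust names to your layout.
genrule(
    name = "swagger_statik",
    srcs = ["service.swagger.json"],
    outs = ["statik/statik.go"],
    # $(location ...) expands to the sandboxed path of the input/tool;
    # $(RULEDIR) is this package's output directory.
    cmd = "$(location //tools:statik) -src=$$(dirname $(location service.swagger.json)) -dest=$(RULEDIR)",
    tools = ["//tools:statik"],
)
```

Anything downstream (a go_library, for instance) can then depend on the generated statik/statik.go like any other source file.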

How can I use SimpleSC with Dependencies in NSIS

I have this project structure (example):
Installer\Dependencies\Myservice.exe
Installer\Dependencies\dependenci.dll
Installer\Dependencies\js\file.js
Installer\Dependencies\resources\folder\file.js
Each of these folders contains dependencies my service needs in order to install correctly.
How can I use SimpleSC to install the service with its dependencies?
I know the statement:
SimpleSC::InstallService [name_of_service] [display_name] [service_type][start_type] [binary_path] [dependencies] [account] [password]
and I already tried this, but it isn't working:
SimpleSC::InstallService "LprService" "LprService" "272" "2" "$INSTDIR\GeneteLPRService.exe" "Dependencies" "" ""
P.s.: Using InstallUtil.exe, it works
The SimpleSC dependencies parameter is a list of other services that have to be started before your service can start.
The Wiki page has an example:
; Depends on "Windows Time Service" (w32time) and "WWW Publishing Service" (w3svc):
SimpleSC::InstallService "MyService" "My Display Name" "16" "2" "$InstDir\MyService.exe" "w32time/w3svc" "" ""
Pop $0
If you don't have any service dependencies then you can just pass an empty string, like the other unused parameters.
Files required by your service can be installed normally with File or File /r.
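Putting the two together, a minimal sketch of an install section that copies the question's Dependencies folder and then registers the service with no service-to-service dependencies (the section name and error handling are illustrative, not from the original script):

```nsis
Section "Install"
  SetOutPath $INSTDIR
  ; recursively copy the service binary and everything under Dependencies\
  File /r "Installer\Dependencies\*.*"
  ; dependencies argument is "" because no other service must start first
  SimpleSC::InstallService "LprService" "LprService" "272" "2" "$INSTDIR\GeneteLPRService.exe" "" "" ""
  Pop $0 ; returns 0 on success, a Windows error code otherwise
SectionEnd
```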

erlang lager_syslog driver failure

I'm trying to use lager_syslog in my project but it seems like a driver is missing.
This is in my rebar.config:
{deps, [
    ...
    {lager_syslog, {git, "https://github.com/basho/lager_syslog.git", {branch, "master"}}}
]}.
My test handler:
{lager_syslog_backend, ["test", local1, info]},
The error:
19:29:09.981 [error] Lager failed to install handler {lager_syslog_backend,{"test",local1}} into lager_event, retrying later : {error,
{{shutdown,
{failed_to_start_child,
syslog,
"could not load driver syslog_drv: \"cannot open shared object file: No such file or directory\""}},
{syslog_app,
start,
[normal,
[]]}}}
Any suggestion?
Thanks to Kenneth Lakin, who answered my question on the mailing list:
IIRC, rebar3 moved the port compiler out to a rebar3 plugin, rather than
packing it in with the core project. From what I've seen, rebar2
projects that relied on it will fail to load their port drivers.
Add
{overrides,
[{override, syslog, [
{plugins, [pc]},
{artifacts, ["priv/syslog_drv.so"]},
{provider_hooks, [
{post,
[
{compile, {pc, compile}},
{clean, {pc, clean}}
]
}]
}
]}
]}.
to the rebar.config in your project, then clean and rebuild. (The syslog project
is where lager_syslog's port driver lives.)
See also: https://github.com/blt/port_compiler#use-with-existing-dependency

How can I integrate Webmachine into an Erlang Application?

I read and re-read the docs and tutorials, but my understanding of how to create Erlang Applications, and Rebar for that matter, still has enough holes to resemble Swiss cheese. Very simple stuff throws me.
I'm working toward an Erlang release that will eventually include several applications of my own, plus Webmachine and maybe a NoSQL db of one flavor or another. Using Rebar I've successfully compiled and tested my applications: zzz and zzz_lib. My directory structure is shown below. I'm not confident that it's optimal, but it works.
I've installed Webmachine under the ...learn1/apps directory.
My next step has been to integrate Webmachine with the very simple webmachine_demo_resource shown below, under the name test_resource.erl:
http://webmachine.basho.com/example_resources.html
But when I try to compile, I get this error message:
src/test_resource.erl:3: can't find include lib "webmachine/include/webmachine.hrl"
Here's the offending line in test_resource.erl:
-include_lib("webmachine/include/webmachine.hrl").
I've tried to set both ERL_LIBS (which I don't fully understand) and PATH with no success. So, clearly, I don't understand how to set the proper path or how best to integrate Webmachine.
Any and all guidance would be gratefully welcomed.
LRP
* Directory structure
learn1$ ls
apps rebar rebar.config
learn1/apps$ ls
webmachine zzz zzz_lib
learn1/apps/zzz_lib/src$ ls
yada yada test_resource.erl yada yada
* rebar.config
{sub_dirs,
["apps/zzz",
"apps/zzz/src",
"apps/zzz_lib",
"apps/zzz_lib/src"
]
}.
* zzz_lib.app.src
{application, zzz_lib,
[
{description, ""},
{vsn, "1"},
{modules, [
yada yada
]},
{applications, [
kernel,
stdlib,
webmachine
]},
{mod, { zzz_lib_app, []}},
{env, []}
]}.
You most likely will end up happier including it as a dependency, not as a contained app. See for example how Riak Core does it: https://github.com/basho/riak_core/blob/master/rebar.config
For more insight, you might find asking the mailing lists to be worthwhile:
http://lists.therestfulway.com/mailman/listinfo/webmachine_lists.therestfulway.com
http://lists.basho.com/mailman/listinfo/rebar_lists.basho.com
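As a sketch of what "include it as a dependency" looks like in rebar2-era syntax (the version pattern and branch here are assumptions; pin whichever Webmachine release you actually need):

```erlang
%% rebar.config -- fetch webmachine as a dependency instead of vendoring
%% it under apps/. The ".*" version regex and "master" branch are
%% illustrative placeholders.
{deps, [
    {webmachine, ".*",
        {git, "git://github.com/basho/webmachine.git", {branch, "master"}}}
]}.
```

After rebar get-deps and compile, webmachine lands in deps/ on the code path, and -include_lib("webmachine/include/webmachine.hrl") should resolve without any ERL_LIBS fiddling.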
Using ERL_LIBS in your case, you'd need to set it to /.../learn1/apps.
When compiling, you may also add a {i, Dir} option.
Note, however, that the documentation only mentions -include and -include_dir, not -include_lib:
{i,Dir} Add Dir to the list of directories to be searched when
including a file. When encountering an -include or -include_dir
directive, the compiler searches for header files in the following
directories:
".", the current working directory of the file server;
the base name of the compiled file;
the directories specified using the i option. The directory specified
last is searched first.
