Spring .Net Configuration Fluently - dependency-injection

I need to use Spring.NET in a project and am exploring configuration options. All I can find about configuring Spring.NET is config-file stuff. Does Spring support configuration in code? I have used Castle and Ninject, and both seem to offer this natively. I have found projects that claim to add support, but I don't want some knock-off project that will die in 6 months. I have found references in blogs that seem to indicate Spring supports this, but I can't find any documentation.
Part 2 of this might be: would you recommend Spring.NET over Windsor, knowing it can't support fluent configuration? I know both are great IoC containers, but I have worked on projects with massive config files for Spring configuration and I hate it.

No, the current version (1.3) of Spring.NET only supports XML configuration. There has been talk about supporting Code as Configuration in future versions, but this has not yet materialized.
In my opinion, Castle Windsor is far superior to Spring.NET. I can't think of a single feature of Spring.NET that Castle Windsor doesn't have. On the other hand, Castle Windsor has the following features that are not available in Spring.NET:
Code as Configuration
Convention-based configuration
More lifetimes
Custom lifetimes
Object graph decommissioning
Explicit mapping of interfaces/base classes to concrete types
Type-based resolution
Modular configuration (Installers)
Built-in support for Decorators
Typed Factories
There are probably other features I forgot about...
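To make a couple of the items above concrete, here is a rough sketch of code-as-configuration combined with an installer in Castle Windsor (Windsor 3.x-style API; IEmailSender/SmtpEmailSender are made-up example types):
using Castle.MicroKernel.Registration;
using Castle.MicroKernel.SubSystems.Configuration;
using Castle.Windsor;

public interface IEmailSender { void Send(string to, string body); }
public class SmtpEmailSender : IEmailSender { public void Send(string to, string body) { /* ... */ } }

// An installer groups related registrations into a module (code as configuration).
public class MessagingInstaller : IWindsorInstaller
{
    public void Install(IWindsorContainer container, IConfigurationStore store)
    {
        container.Register(
            Component.For<IEmailSender>()
                     .ImplementedBy<SmtpEmailSender>()
                     .LifestyleTransient());
    }
}

// At application start-up:
// var container = new WindsorContainer();
// container.Install(new MessagingInstaller());
// var sender = container.Resolve<IEmailSender>();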
It appears I was a bit too quick on the trigger here, although in my defense, the Spring.NET documentation also states that there's only XML configuration in the current version.
However, it turns out that for certain contexts a very primitive API is available that enables you to configure a context without XML. Here's an example:
var context = new GenericApplicationContext();
context.RegisterObjectDefinition("EggYolk",
    new RootObjectDefinition(typeof(EggYolk)));
context.RegisterObjectDefinition("OliveOil",
    new RootObjectDefinition(typeof(OliveOil)));
context.RegisterObjectDefinition("Mayonnaise",
    new RootObjectDefinition(typeof(Mayonnaise),
        AutoWiringMode.AutoDetect));
Notice how this API very closely mirrors the XML configuration schema. Thus, you don't get any fluent API from the IObjectDefinitionRegistry interface, but at least there's an API which is decoupled from XML. Building a fluent API on top of this is at least theoretically possible.
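For illustration, a thin fluent helper could be layered on top of this API along the lines of the sketch below; the Register extension method is invented here for the example, and only GenericApplicationContext, RegisterObjectDefinition, RootObjectDefinition and AutoWiringMode are Spring.NET members used exactly as in the snippet above.
using Spring.Context.Support;
using Spring.Objects.Factory.Config;
using Spring.Objects.Factory.Support;

// Hypothetical fluent wrapper over IObjectDefinitionRegistry; only the Spring.NET
// calls inside it are taken from the example above.
public static class FluentRegistrationExtensions
{
    public static GenericApplicationContext Register<T>(
        this GenericApplicationContext context,
        string name,
        AutoWiringMode wiring = AutoWiringMode.No)
    {
        context.RegisterObjectDefinition(
            name ?? typeof(T).FullName,
            new RootObjectDefinition(typeof(T), wiring));
        return context;
    }
}

// Usage mirroring the example above:
// var context = new GenericApplicationContext();
// context.Register<EggYolk>("EggYolk")
//        .Register<OliveOil>("OliveOil")
//        .Register<Mayonnaise>("Mayonnaise", AutoWiringMode.AutoDetect);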

You will find a fully working spring fluent API for spring.net on github:
https://github.com/thenapoleon/Fluent-API-for-Spring.Net
This API brings fluent configuration, and will soon support convention-based configuration.

In answer to the first part of your question: the SpringSource team appears to be working on a code-configuration project on GitHub: https://github.com/SpringSource/spring-net-codeconfig. It was announced with (but not included in) the 1.3.1 (December 2010) release.
From the MovieFinder example:
[Configuration]
public class MovieFinderConfiguration
{
    [Definition]
    public virtual MovieLister MyMovieLister()
    {
        MovieLister movieLister = new MovieLister();
        movieLister.MovieFinder = FileBasedMovieFinder();
        return movieLister;
    }

    [Definition]
    public virtual IMovieFinder FileBasedMovieFinder()
    {
        return new ColonDelimitedMovieFinder(new FileInfo("movies.txt"));
    }
}
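For completeness, a sketch of bootstrapping such a [Configuration] class, based on the codeconfig project's documentation; the CodeConfigApplicationContext type and its scanning method names should be verified against that project, and the object name "MyMovieLister" assumes definitions default to the method name:
// Assumed bootstrap API from spring-net-codeconfig; verify the exact member names.
var ctx = new CodeConfigApplicationContext();
ctx.ScanWithTypeFilter(type => type.Name.EndsWith("Configuration"));
ctx.Refresh();

// Definition methods are assumed to become object names, e.g. "MyMovieLister":
var lister = (MovieLister)ctx.GetObject("MyMovieLister");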

There is another option using Spring.AutoRegistration, the same concept used with Unity AutoRegistration.
https://www.nuget.org/packages/Spring.AutoRegistration
http://autoregistration.codeplex.com/
var context = new GenericApplicationContext();
context.Configure()
    .IncludeAssembly(x => x.FullName.StartsWith("Company.ApplicationXPTO"))
    .Include(x => x.ImplementsITypeName(), Then.Register().UsingSingleton()
        .InjectByProperty(If.DecoratedWith<InjectAttribute>))
    .ApplyAutoRegistration();

It is also possible to use Spring.FluentContext project.
With it, the configuration of MovieFinder would look as follows:
// Configuration
private static IApplicationContext Configure()
{
    var context = new FluentApplicationContext();
    context.RegisterDefault<MovieLister>()
        .BindProperty(l => l.MovieFinder).ToRegisteredDefaultOf<ColonDelimitedMovieFinder>();
    context.RegisterDefault<ColonDelimitedMovieFinder>()
        .UseConstructor((FileInfo fileInfo) => new ColonDelimitedMovieFinder(fileInfo))
        .BindConstructorArg().ToValue(new FileInfo("movies.txt"));
    return context;
}

// Usage
static void Main(string[] args)
{
    IApplicationContext context = Configure();
    var movieLister = context.GetObject<MovieLister>();
    foreach (var movie in movieLister.MoviesDirectedBy("Roberto Benigni"))
        Console.WriteLine(movie.Title);
    Console.ReadLine();
}
It does not require hardcoded literal IDs for objects (but allows them), it is type-safe, and it has documentation with samples on the GitHub wiki.

Using the Fluent-API-for-Spring.Net, the configuration could look something like:
private void ConfigureMovieFinder()
{
    FluentApplicationContext.Clear();
    FluentApplicationContext.Register<ColonDelimitedMovieFinder>("ColonDelimitedMovieFinder")
        .BindConstructorArgument<FileInfo>().To(new FileInfo("movies.txt"));

    // By default, fluent spring will create an identifier (Type.FullName) when using Register<T>()
    FluentApplicationContext.Register<MovieLister>()
        .Bind(x => x.MovieFinder).To<IMovieFinder>("ColonDelimitedMovieFinder");
}


Trying to add AutoMapper to Asp.net Core 2?

I worked on an ASP.NET Core 1.1 project a while ago and used AutoMapper in that project.
In ASP.NET Core 1.1, I added services.AddAutoMapper() in the Startup file.
Startup file in ASP.NET Core 1.1:
public void ConfigureServices(IServiceCollection services)
{
    //Some Code
    services.AddMvc();
    services.AddAutoMapper();
}
And I used AutoMapper in the controller easily.
Controller:
public async Task<IActionResult> AddEditBook(AddEditBookViewModel model)
{
    Book bookmodel = AutoMapper.Mapper.Map<AddEditBookViewModel, Book>(model);
    context.books.Add(bookmodel);
    context.SaveChanges();
}
And everything was fine.
But I'm currently working on an ASP.NET Core 2 project, and I get the following error with services.AddAutoMapper() in the Startup file:
Error CS0121 The call is ambiguous between the following methods or properties: 'ServiceCollectionExtensions.AddAutoMapper(IServiceCollection, params Assembly[])' and 'ServiceCollectionExtensions.AddAutoMapper(IServiceCollection, params Type[])'
What is the reason for this error?
Also, services.AddAutoMapper in ASP.NET Core 2 takes some parameters. What should I pass to them?
If you are using ASP.NET Core 2.2 and AutoMapper.Extensions.Microsoft.DependencyInjection v6.1,
you need to use the following in the Startup file:
services.AddAutoMapper(typeof(Startup));
You likely updated your ASP.NET Core dependencies but are still using an outdated AutoMapper.Extensions.Microsoft.DependencyInjection package.
For ASP.NET Core you need at least Version 3.0.1 from https://www.nuget.org/packages/AutoMapper.Extensions.Microsoft.DependencyInjection/3.0.1
Which references AutoMapper 6.1.1 or higher.
AutoMapper (>= 6.1.1)
Microsoft.Extensions.DependencyInjection.Abstractions (>= 2.0.0)
Microsoft.Extensions.DependencyModel (>= 2.0.0)
The older packages depend on Microsoft.Extensions.DependencyInjection.Abstractions 1.1.0 and can't be used with ASP.NET Core 2, since there have been breaking changes between Microsoft.Extensions.DependencyInjection.Abstractions 1.1.0 and 2.0.
In the new version (6.1) of the AutoMapper.Extensions.Microsoft.DependencyInjection NuGet package you should use it as follows:
services.AddAutoMapper(Type assemblyTypeToSearch);
// OR
services.AddAutoMapper(params Type[] assemblyTypesToSearch);
e.g.:
services.AddAutoMapper(typeof(YourClass));
Install package:
Install-Package AutoMapper.Extensions.Microsoft.DependencyInjection -Version 7.0.0
Nuget:
https://www.nuget.org/packages/AutoMapper.Extensions.Microsoft.DependencyInjection/
In Startup Class:
services.AddAutoMapper(typeof(Startup));
None of these worked for me. I have a .NET Core 2.2 project, and the complete code for configuring the mapper looks like this (part of the ConfigureServices() method):
// Auto Mapper Configurations
var mappingConfig = new MapperConfiguration(mc =>
{
    mc.AddProfile(new SimpleMappings());
});

IMapper mapper = mappingConfig.CreateMapper();
services.AddSingleton(mapper);
Then I have my Mappings class which I've placed in the BL project:
public class SimpleMappings : Profile
{
    public SimpleMappings()
    {
        CreateMap<DwUser, DwUserDto>();
        CreateMap<DwOrganization, DwOrganizationDto>();
    }
}
And finally the usage of the mapper looks like this:
public class DwUserService : IDwUserService
{
    private readonly IDwUserRepository _dwUserRepository;
    private readonly IMapper _mapper;

    public DwUserService(IDwUserRepository dwUserRepository, IMapper mapper)
    {
        _dwUserRepository = dwUserRepository;
        _mapper = mapper;
    }

    public async Task<DwUserDto> GetByUsernameAndOrgAsync(string username, string org)
    {
        var dwUser = await _dwUserRepository.GetByUsernameAndOrgAsync(username, org).ConfigureAwait(false);
        var dwUserDto = _mapper.Map<DwUserDto>(dwUser);
        return dwUserDto;
    }
}
Here is a similar link on the same topic:
How to setup Automapper in ASP.NET Core
If you are using ASP.NET Core 2.2, try changing your code
from:
services.AddAutoMapper();
to:
services.AddAutoMapper(typeof(Startup));
It worked for me.
In .NET 6, you can do it like this:
builder.Services.AddAutoMapper(typeof(Program).Assembly); // Since there is no Startup file
OR
builder.Services.AddAutoMapper(AppDomain.CurrentDomain.GetAssemblies());
Basically, it requires the assembly (or assemblies) to scan for profiles.
I solved this by creating a class that inherits AutoMapper.Profile
public class model_to_resource_profile : Profile
{
    public model_to_resource_profile()
    {
        CreateMap<your_model_class, your_model_resource_class>();
    }
}
And adding this line in the Startup.cs:
services.AddAutoMapper(typeof(model_to_resource_profile));
Try this; it works with 2.1 and up. I have not used any previous version, so I can't tell.
services.AddAutoMapper(AppDomain.CurrentDomain.GetAssemblies());
The official docs:
https://automapper.readthedocs.io/en/latest/Dependency-injection.html#asp-net-core
You define the configuration using profiles. And then you let AutoMapper know in what assemblies are those profiles defined by calling the IServiceCollection extension method AddAutoMapper at startup:
services.AddAutoMapper(profileAssembly1, profileAssembly2 /*, ...*/);
or marker types:
services.AddAutoMapper(typeof(ProfileTypeFromAssembly1), typeof(ProfileTypeFromAssembly2) /*, ...*/);
If you are having issues adding AutoMapper, it is better to check the type and version of the package you added.
If it is not "AutoMapper.Extensions.Microsoft.DependencyInjection", then you won't be able to use "services.AddAutoMapper()".
Sometimes you might mistakenly add the plain "AutoMapper" package instead.
Dec 6th 2019. Based upon an initial attempt following the Pluralsight course Building an API with ASP.NET Core by Shawn Wildermuth, I got the error "...ambiguous 'ServiceCollectionExtensions.AddAutoMapper(IServiceCollection, params Assembly[])...'".
I started researching the proper syntax to implement AddAutoMapper in Core 2.2. My NuGet reference is version 7.0.0. The tutorial had me create the Profile class in my data repository directory, which additionally referenced my model. nir weiner's and dev-siberia's answers above led me to referencing the profile class in Startup.ConfigureServices() by name:
services.AddAutoMapper(typeof(CampProfile));
The content of the profile class is just a (no pun intended) old-school map of the data class and the model in its constructor:
this.CreateMap<Camp, CampModel>();
This addressed the lack of documentation for this current version.
Respectfully,
ChristianProgrammer

Use ASP.NET MVC Unity for Data Caching

I have an ASP.NET MVC project running on the .NET Framework 4.6.1.
I have recently added the Unity.Mvc5 IoC framework for dependency injection.
In order to have flexibility for unit testing and other scenarios, I moved my Unity configuration to a separate class library so that I can call the Unity register methods from unit test projects and others as needed.
Here is my high-level solution design.
I would like to use the same class library to implement application cache.
When I installed Unity.Mvc5 from the NuGet package, it added the following references (I added some of them manually):
Microsoft.Practices.EnterpriseLibrary.Caching 5.0.505.0
Enterprise Library Shared Library 5.0.505.0
Microsoft.Practices.ServiceLocation 1.3.0.0
Microsoft.Practices.Unity 4.0.0.0
Microsoft.Practices.Unity.Configuration 4.0.0.0
Microsoft.Practices.Unity.Interception 2.1.505.0
Microsoft.Practices.Unity.Interception.Configuration 2.1.505.0
Microsoft.Practices.Unity.RegistrationByConvention 4.0.0.0
I tried a few articles on implementing the Caching Application Block so that I can cache data in my service implementation layers, but all of that documentation shows code examples expecting Unity 2.x.
Here is my Unity Configuration
public static class UnityConfig
{
    public static void RegisterComponents()
    {
        var container = new UnityContainer();

        container.RegisterType<UserManager<User>>(new HierarchicalLifetimeManager());
        container.RegisterType<IUserStore<User>, UserStore<User>>(new HierarchicalLifetimeManager());
        container.RegisterType<DbContext, OfficeGxDbContext>(new HierarchicalLifetimeManager());
        container.RegisterType<IAppSetting, AppSettingService>();
        container.RegisterType<ISubscription, SubscriptionService>();

        DependencyResolver.SetResolver(new UnityDependencyResolver(container));
    }
}
In my AppSettingService.cs I have a get-all method:
public List<AppSetting> All()
{
    using (var context = new MyDbContext())
    {
        //CachecKeyItem.AppSettingsAll
        return context.AppSettings.Where(x => !x.IsDeleted)
            .Include(x => x.Module).ToList();
    }
}
I want to store this data in the cache and reuse it, and do the same across all projects in my solution. If any DB record is added, updated, or deleted, I want the cached object to refresh so that it is always in sync with the DB data.
I ended up doing something like this
public interface ICacheService
{
    T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class;
}
Service Implementor
public class InMemoryCache : ICacheService
{
    public T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class
    {
        if (MemoryCache.Default.Get(cacheKey) is T item) return item;
        item = getItemCallback();
        MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(8));
        return item;
    }
}
Use it like this:
_cacheService.GetOrSet(CachecKeyItem.AppSettingsAll, () => context.AppSettings
.Where(x => !x.IsDeleted)
.Include(x => x.Module).ToList());
Now my question is: when there is any change to the data (add/edit/delete), how do I refresh the cache in the most efficient way? I know deleting the key would be one option; is there a better way?
It depends slightly on whether you're running a single-server or a multi-server system. It's generally better to always design for multi-server; that way, if you ever decide to scale out, you're already sorted.
So, assuming you can fire a message off to all the servers without issue when your cache invalidates, the easiest way is to delete your cache key... that also brings up a few areas of optimization that can be looked at.
Is this cached data utilized heavily? In that case you may benefit from a pre-fetch cache, where you either send the updated cache entry to all servers or require all servers to ask for it. Is it used very little? In that case you don't really want to re-populate until it's requested, otherwise you're needlessly bloating your application.
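If you do stay single-server for now, one pragmatic option is to add an explicit invalidation method to the cache abstraction from the question and call it right after a successful add/edit/delete; a minimal sketch, assuming the MemoryCache-based ICacheService shown above:
using System;
using System.Runtime.Caching;

public interface ICacheService
{
    T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class;
    void Invalidate(string cacheKey);
}

public class InMemoryCache : ICacheService
{
    public T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class
    {
        if (MemoryCache.Default.Get(cacheKey) is T item) return item;
        item = getItemCallback();
        MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(8));
        return item;
    }

    // Drop the entry so the next GetOrSet call repopulates it from the database.
    public void Invalidate(string cacheKey)
    {
        MemoryCache.Default.Remove(cacheKey);
    }
}

// After a successful write:
// context.SaveChanges();
// _cacheService.Invalidate(CachecKeyItem.AppSettingsAll);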

Switching from Rhino to Nashorn

I have a Java 7 project which makes a lot of use of Javascript for scripting various features. Until now I was using Rhino as script engine. I would now like to move to Java 8, which also means that I will replace Rhino by Nashorn.
How compatible is Nashorn to Rhino? Can I use it as a drop-in replacement, or can I expect that some of my scripts will not work anymore and will need to be ported to the new engine? Are there any commonly-used features of Rhino which are not supported by Nashorn?
One problem is that Nashorn can no longer by default import whole Java packages into the global scope by using importPackage(com.organization.project.package);
There is, however, a simple workaround: By adding this line to your script, you can enable the old behavior of Rhino:
load("nashorn:mozilla_compat.js");
Another problem I ran into is that certain type conversions when passing data between Java and JavaScript work differently. For example, the object which arrives when you pass a JavaScript array to Java can no longer be cast to List, but it can be cast to a Map<String, Object>. As a workaround you can convert the JavaScript array to a Java List in the JavaScript code using Java.to(array, Java.type("java.util.List")).
To use the importClass method on JDK 8, we need to add the following command:
load("nashorn:mozilla_compat.js");
However, this change affects execution on JDK 7 (JDK 7 does not support the load method).
To maintain compatibility with both JDKs, I solved this problem by adding a try/catch clause:
try {
    load("nashorn:mozilla_compat.js");
} catch (e) {
}
Nashorn cannot access an inner class when that inner class is declared private, which Rhino was able to do:
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class Test {

    public static void main(String[] args) {
        Test test = new Test();
        test.run();
    }

    public void run() {
        ScriptEngineManager factory = new ScriptEngineManager();
        ScriptEngine engine = factory.getEngineByName("JavaScript");
        Inner inner = new Inner();
        engine.put("inner", inner);
        try {
            engine.eval("function run(inner){inner.foo(\"test\");} run(inner);");
        } catch (ScriptException e) {
            e.printStackTrace();
        }
    }

    private class Inner {
        public void foo(String msg) {
            System.out.println(msg);
        }
    }
}
Under Java 8 this code throws the following exception:
javax.script.ScriptException: TypeError: kz.test.Test$Inner#117cd4b has no such function "foo" in <eval> at line number 1
at jdk.nashorn.api.scripting.NashornScriptEngine.throwAsScriptException(NashornScriptEngine.java:564)
at jdk.nashorn.api.scripting.NashornScriptEngine.evalImpl(NashornScriptEngine.java:548)
I noticed that Rhino didn't have a problem with a function called 'in()' (although 'in' is a reserved JavaScript keyword).
Nashorn, however, raises an error.
Nashorn cannot call static methods on instances! Rhino did this, therefore we had to backport Rhino to Java 8 (Here's a short summary: http://andreas.haufler.info/2015/04/using-rhino-with-java-8.html)
Nashorn on Java 8 does not support AST. So if you have Java code that inspects the JS source tree using Rhino's AST mechanism, you may have to rewrite it (using regex, maybe) once you port your code to use Nashorn.
I am talking about this API: https://mozilla.github.io/rhino/javadoc/org/mozilla/javascript/ast/AstNode.html
Nashorn on Java 9 supports AST, though.
One feature that is in Rhino and not Nashorn: exposing static members through instances.
From http://nashorn-dev.openjdk.java.narkive.com/n0jtdHc9/bug-report-can-t-call-static-methods-on-a-java-class-instance : "My conviction is that exposing static members through instances is a sloppy mashing together of otherwise separate namespaces, hence I chose not to enable it."
I think this is deeply wrong. As long as we have to use two different constructs to access the same Java object and use package declarations unnecessarily in JavaScript, code becomes harder to read and write because the cognitive load increases. I would rather stick with Rhino, then.
I have not found a workaround for this obvious "design bug" yet.

How can we support modular and testable patterns with ASP.NET MVC 4 and MEF 2?

We're trying to use MEF 2 with ASP.NET MVC 4 to support an extensible application. There are really 2 parts to this question (hope that's okay SO gods):
How do we use Microsoft.Composition and the MVC container code (MEF/MVC demo source) to replace Ninject as our DI for ICoreService, ICoreRepository, IUnitOfWork, and IDbContext?
It looks like we can't use both Ninject and the MVC container at the same time (I'm sure many are saying "duh"), so we'd like to go with MEF, if possible. I tried removing Ninject and setting [Export] attributes on each of the relevant implementations, spanning two assemblies in addition to the web project, but Save() failed to persist with no errors. I interpreted that as a singleton issue, but could not figure out how to sort it out (incl. [Shared]).
How do we load multiple assemblies dynamically at runtime?
I understand how to use CompositionContainer.AddAssemblies() to load specific DLLs, but for our application to be properly extensible, we require something more akin to how I (vaguely) understand catalogs in "full" MEF, which have been stripped out from the Microsoft.Composition package (I think?); to allow us to load all IPluggable (or whatever) assemblies, which will include their own UI, service, and repository layers and tie in to the Core service/repo too.
EDIT 1
A little more reading solved the first problem which was, indeed, a singleton issue. Attaching [Shared(Boundaries.HttpRequest)] to the CoreDbContext solved the persistence problem. When I tried simply [Shared], it expanded the 'singletonization' to the Application level (cross-request) and threw an exception saying that the edited object was already in the EF cache.
EDIT 2
I used the iterative assembly loading "meat" from Nick Blumhardt's answer below to update my Global.asax.cs code. The standard MEF 2 container from his code did not work in mine, probably because I'm using the MEF 2(?) MVC container. Summary: the code listed below now works as desired.
CoreDbContext.cs (Data.csproj)
[Export(typeof(IDbContext))]
[Shared(Boundaries.HttpRequest)]
public class CoreDbContext : IDbContext { ... }
CoreRepository.cs (Data.csproj)
[Export(typeof(IUnitOfWork))]
[Export(typeof(ICoreRepository))]
public class CoreRepository : ICoreRepository, IUnitOfWork
{
    [ImportingConstructor]
    public CoreRepository(IInsightDbContext context)
    {
        _context = context;
    }
    ...
}
CoreService.cs (Services.csproj)
[Export(typeof(ICoreService))]
public class CoreService : ICoreService
{
    [ImportingConstructor]
    public CoreService(ICoreRepository repository, IUnitOfWork unitOfWork)
    {
        _repository = repository;
        _unitOfWork = unitOfWork;
    }
    ...
}
UserController.cs (Web.csproj)
public class UsersController : Controller
{
    [ImportingConstructor]
    public UsersController(ICoreService service)
    {
        _service = service;
    }
    ...
}
Global.asax.cs (Web.csproj)
public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        CompositionProvider.AddAssemblies(
            typeof(ICoreRepository).Assembly,
            typeof(ICoreService).Assembly);

        // EDIT 2 --
        // updated code to answer my 2nd question based on Nick Blumhardt's answer
        foreach (var file in System.IO.Directory.GetFiles(Server.MapPath("Plugins"), "*.dll"))
        {
            try
            {
                var name = System.Reflection.AssemblyName.GetAssemblyName(file);
                var assembly = System.Reflection.Assembly.Load(name);
                CompositionProvider.AddAssembly(assembly);
            }
            catch
            {
                // You'll need to craft exception handling to
                // your specific scenario.
            }
        }
    }
}
If I understand you correctly, you're looking for code that will load all assemblies from a directory and load them into the container; here's a skeleton for doing that:
var config = new ContainerConfiguration();

foreach (var file in Directory.GetFiles(@".\Plugins", "*.dll"))
{
    try
    {
        var name = AssemblyName.GetAssemblyName(file);
        var assembly = Assembly.Load(name);
        config.WithAssembly(assembly);
    }
    catch
    {
        // You'll need to craft exception handling to
        // your specific scenario.
    }
}

var container = config.CreateContainer();
// ...
Hammett discusses this scenario and shows a more complete version in F# here: http://hammett.castleproject.org/index.php/2011/12/a-decent-directorycatalog-implementation/
Note, this won't detect assemblies added to the directory after the application launches - Microsoft.Composition isn't intended for that kind of use, so if the set of plug-ins changes your best bet is to detect that with a directory watcher and prompt the user to restart the app. HTH!
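A directory watcher for that restart prompt can be as simple as the sketch below (standard FileSystemWatcher; how you actually prompt or restart is application-specific, so the handler body is only a placeholder):
using System;
using System.IO;

// Watch the plug-in folder and flag that a restart is needed when new DLLs appear.
var pluginPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Plugins");
var watcher = new FileSystemWatcher(pluginPath, "*.dll")
{
    EnableRaisingEvents = true
};
watcher.Created += (sender, e) =>
    Console.WriteLine("New plug-in detected: {0}. Restart the application to load it.", e.Name);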
MEF is not intended to be used as a DI framework. This means that you should separate your "plugins" (whatever they are) composition from your infrastructure dependencies, and implement the former via MEF and the latter via whatever DI framework you prefer.
I think there are a little misunderstandings on what MEF can and can't do.
Originally MEF was conceived as purely an extensibility architecture, but as the framework evolved up to its first release, it became fully usable as a DI container also. MEF will handle dependency injection for you, and does so through its ExportProvider architecture. It is also entirely possible to use other DI frameworks with MEF. So in reality there are a number of ways things could be achieved:
Build a NinjectExportProvider that you can plug into MEF, so when MEF is searching for available exports, it will be able to interrogate your Ninject container.
Use an implementation of the Common Services Locator pattern to bridge between MEF and Ninject or vice versa.
Because you are using MEF for the extensibility, you'll probably want to use the former, as this exposes your Ninject components to MEF, which in turn exposes them to your plugins.
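For the second option, a Common Service Locator bridge could look roughly like this sketch; NinjectServiceLocator is assumed to come from the Ninject adapter for CommonServiceLocator, so verify the exact package and type name:
using Microsoft.Practices.ServiceLocation;
using Ninject;
// using CommonServiceLocator.NinjectAdapter; // assumed adapter namespace

var kernel = new StandardKernel();
kernel.Bind<ICoreService>().To<CoreService>();

// Anything written against the service locator (including MEF glue code) now resolves via Ninject.
ServiceLocator.SetLocatorProvider(() => new NinjectServiceLocator(kernel));
var service = ServiceLocator.Current.GetInstance<ICoreService>();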
The other thing to consider, which is a bit disappointing, is in reality there isn't a lot of room for automagically plugging in of features ala Wordpress on ASP.NET. ASP.NET is a compiled and managed environment, and because of that you either resort to late-binding by loading assemblies manually at runtime, or you restart the application to pick up the new plugins, which sort of defeats the object of being able to plug new extensions in through the application.
My advice is to plan your architecture to pick up any extensibility points at startup and assume that any core changes will require a deployment and an application restart.
In terms of the direct questions asked:
The CompositionProvider accepts an instance of ContainerConfiguration, which is used internally to create the CompositionContainer used by the provider. So you could use this as the point at which you customise how you want your container to be instantiated. ContainerConfiguration supports a WithProvider method:
var configuration = new ContainerConfiguration().WithProvider(new NinjectExportDescriptorProvider(kernel));
CompositionProvider.SetConfiguration(configuration);
Where NinjectExportDescriptorProvider might be:
public class NinjectExportDescriptorProvider : ExportDescriptorProvider
{
    private readonly IKernel _kernel;

    public NinjectExportDescriptorProvider(IKernel kernel)
    {
        if (kernel == null) throw new ArgumentNullException("kernel");
        _kernel = kernel;
    }

    public override IEnumerable<ExportDescriptorPromise> GetExportDescriptors(
        CompositionContract contract, DependencyAccessor dependencyAccessor)
    {
        var type = contract.ContractType;

        if (!_kernel.GetBindings(type).Any())
            return NoExportDescriptors;

        return new[] {
            new ExportDescriptorPromise(
                contract,
                "Ninject Kernel",
                true, // Hmmm... need to consider this; setting it to true will create it as a shared part, false as a new instance each time
                NoDependencies,
                _ => ExportDescriptor.Create((c, o) => _kernel.Get(type), NoMetadata)) };
    }
}
Note: I have not tested this, this is all theory, and is based on the example AppSettingsExportDescriptorProvider at: http://mef.codeplex.com/wikipage?title=ProgrammingModelExtensions
It's different from using the standard ExportProvider, because the CompositionProvider is built around lightweight composition. But essentially you're wrapping up access to your Ninject kernel and making it available to your CompositionContainer.
As with adding a specific new provider (see above), you can use the ContainerConfiguration to read the available assemblies, probably something like:
var configuration = new ContainerConfiguration().WithAssemblies(AppDomain.CurrentDomain.GetAssemblies());
Again, I haven't tested all of this, but I hope it at least points you in the right direction.

Why getting a 202 in two equal setup structuremap code paths

In the C# language, using StructureMap 2.5.4, targeting .NET Framework 3.5 libraries.
I've taken the step of supporting multiple profiles in a StructureMap DI setup, using the ServiceLocator model with Bootstrapper activation. The first setup loaded the default registry, using the scanner.
Now I'd like to determine at runtime which registry configuration to use, scanning and loading multiple assemblies with registries.
It seems this is not working for the actual implementation (I'm getting the 202, default instance not found), but a stripped-down test version does work, with the following setup:
Two assemblies containing registries and implementations
Scanning them in the running AppDomain, providing the shared interface, and requesting creation of an instance, using the interfaces in the constructor (which get dealt with thanks to the profile on invocation)
Working code sample below (same structure as the other setup, but with more complex stuff, which gets a 202):
What kinds of causes are possible for a 202 that specifically names the System.Uri type as not being handled by a default instance? (Uri makes no sense here.)
// Let StructureMap create an instance of class Tester, providing the registered
// interfaces in the registries to the constructor of Tester.
public class Tester<TPOCO>
{
    private ITestMe<TPOCO> _tester;

    public Tester(ITestMe<TPOCO> some)
    {
        _tester = some;
    }

    public string Exec()
    {
        return _tester.Execute();
    }
}

public static class Main
{
    public static void ExecuteDIFunction()
    {
        ObjectFactory.GetInstance<Tester<string>>().Exec();
    }
}
public class ImplementedTestMe<TSome> : ITestMe<TSome>
{
    public string Execute()
    {
        return "Special Execution";
    }
}

public class RegistryForSpecial : Registry
{
    public RegistryForSpecial()
    {
        CreateProfile("Special",
            gc =>
            {
                gc.For(typeof(ITestMe<>)).UseConcreteType(typeof(ImplementedTestMe<>));
            });
    }
}
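For reference, activating that profile at runtime before resolving looks roughly like the sketch below in StructureMap 2.5; the scanner call names follow the 2.5 registry DSL and should be double-checked against the articles linked next:
ObjectFactory.Initialize(x => x.Scan(scan =>
{
    scan.TheCallingAssembly();   // or point the scanner at the plug-in assemblies
    scan.LookForRegistries();
}));

// Select the registry profile at runtime, then resolve as usual.
ObjectFactory.Profile = "Special";
Console.WriteLine(ObjectFactory.GetInstance<Tester<string>>().Exec());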
Background articles on Profiles I used.
How to setup named instances using StructureMap profiles?
http://devlicio.us/blogs/derik_whittaker/archive/2009/01/07/setting-up-profiles-in-structuremap-2-5.aspx
http://structuremap.sourceforge.net/RegistryDSL.htm
EDIT:
It turned out the missing interface was actually the one being determined at runtime. So here is the next challenge (now solved):
I provided a default object for StructureMap to use whenever it needs to create the object, like:
x.ForRequestedType<IConnectionContext>()
.TheDefault.Is.Object(new WebServiceConnection());
This way I got rid of the 202 error, because now a real instance could be used whenever StructureMap needed the type.
Next was the override at runtime. That did not work out at first using the ObjectFactory.Configure method. Instead I used the ObjectFactory.Inject method to override the default instance. Works like a charm.
ObjectFactory.Inject(typeof(IConnectionContext), context);
Loving the community effort.
Error code 202 means a default instance could not be built for the requested type. Your test code is apparently not equal to your real code that fails. If you are getting an error about Uri, you likely have a dependency that requires a Uri in its constructor. It may not be the class you are asking for - it may be one of that class's dependencies, or one of the dependencies' dependencies... somewhere down the line something is asking StructureMap to resolve a Uri, which it cannot do without some help from you.
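If a Uri really is needed somewhere down the chain, the same technique the question's EDIT uses for IConnectionContext applies: hand StructureMap a ready-made instance. A sketch reusing the 2.5-era syntax shown above, with a placeholder address:
ObjectFactory.Initialize(x =>
    x.ForRequestedType<Uri>()
     .TheDefault.Is.Object(new Uri("http://localhost/")));

// Or override it later at runtime:
// ObjectFactory.Inject(typeof(Uri), new Uri("http://localhost/"));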
