Hangfire job on Console/Web App solution? - asp.net-mvc

I'm new to Hangfire and I'm trying to understand how this works.
So I have an MVC 5 application and a Console application in the same solution. The console application is a simple one that just updates some data in the database (originally I planned to run it with the Windows Task Scheduler).
Where exactly do I install Hangfire? In the Web app or the console? Or should I convert the console into a class on the Web app?

If I understand it correctly, the console in your solution is acting like a "pseudo" HangFire, since, as you said, it performs some database operations over time and you plan to execute it using the Task Scheduler.
HangFire Overview
HangFire was designed to do exactly what you want your console app to do, but with far more power and functionality, so you avoid the overhead of building all of that yourself.
HangFire Installation
HangFire is commonly installed alongside ASP.NET applications, but if you carefully read the docs, you will surprisingly find this:
Hangfire project consists of a couple of NuGet packages available on NuGet Gallery site. Here is the list of basic packages you should know about:
Hangfire – bootstrapper package that is intended to be installed only for ASP.NET applications that uses SQL Server as a job storage. It simply references to Hangfire.Core, Hangfire.SqlServer and Microsoft.Owin.Host.SystemWeb packages.
Hangfire.Core – basic package that contains all core components of Hangfire. It can be used in any project type, including ASP.NET application, Windows Service, Console, any OWIN-compatible web application, Azure Worker Role, etc.
As you can see, HangFire can be used in any type of project, including console applications, but you will then need to add and manage the libraries yourself depending on which kind of job storage you use (the docs cover the details).
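For illustration, hosting the HangFire server inside a console application could look roughly like this (a minimal sketch, assuming the Hangfire.Core and Hangfire.SqlServer packages and a placeholder connection string):

using System;
using Hangfire;

class Program
{
    static void Main()
    {
        // Point HangFire at its job storage (the connection string is a placeholder)
        GlobalConfiguration.Configuration
            .UseSqlServerStorage(@"Server=.;Database=HangFire;Trusted_Connection=True");

        // BackgroundJobServer starts processing jobs on background threads
        using (var server = new BackgroundJobServer())
        {
            Console.WriteLine("HangFire server started. Press any key to exit...");
            Console.ReadKey();
        }
    }
}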
Once HangFire is installed you can configure it to use the Dashboard, a web interface where you can find all the information about your background jobs. In the company I work for, we have used HangFire several times, mostly with recurring jobs: importing users, synchronizing information across applications, and performing operations that would be too costly to run during business hours. The Dashboard proved very useful whenever we wanted to know whether a certain job was running or not. HangFire also uses CRON expressions to schedule the operations.
A sample of what we are using right now:
Startup.cs
public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Get the connection string of the HangFire database (reading it from config here is just an example)
        var connection = ConfigurationManager.ConnectionStrings["HangFire"].ConnectionString;
        GlobalConfiguration.Configuration.UseSqlServerStorage(connection);

        // Enable the Dashboard and start the HangFire server
        app.UseHangfireDashboard();
        app.UseHangfireServer();

        // Start the HangFire recurring jobs
        HangfireServices.Instance.StartSend();
        HangfireServices.Instance.StartDeleteDetails();
    }
}
HangfireServices.cs
public class HangfireServices
{
    //.. dependency injection and other definitions

    // IDs of the recurring jobs
    public static string SEND_SERVICE = "Send";
    public static string DELETE_SERVICE = "Delete";

    public void StartSend()
    {
        RecurringJob.AddOrUpdate(SEND_SERVICE,
            () => Business.Send(), // this is my class that does the actual process
            HangFireConfiguration.Instance.SendCron.Record); // a simple class that reads a CRON configuration file
    }

    public void StartDeleteDetails()
    {
        RecurringJob.AddOrUpdate(DELETE_SERVICE,
            () => Business.SendDelete(), // this is my class that does the actual process
            HangFireConfiguration.Instance.DeleteCron.Record); // a simple class that reads a CRON configuration file
    }
}
HangFireConfiguration.cs
public sealed class HangFireConfiguration : ConfigurationSection
{
    private static HangFireConfiguration _instance;

    public static HangFireConfiguration Instance
    {
        get { return _instance ?? (_instance = (HangFireConfiguration)WebConfigurationManager.GetSection("hangfire")); }
    }

    [ConfigurationProperty("send_cron", IsRequired = true)]
    public CronElements SendCron
    {
        get { return (CronElements)base["send_cron"]; }
        set { base["send_cron"] = value; }
    }

    [ConfigurationProperty("delete_cron", IsRequired = true)]
    public CronElements DeleteCron
    {
        get { return (CronElements)base["delete_cron"]; }
        set { base["delete_cron"] = value; }
    }
}
hangfire.config
<hangfire>
  <send_cron record="0,15,30,45 * * * *"></send_cron>
  <delete_cron record="0,15,30,45 * * * *"></delete_cron>
</hangfire>
The CRON expression above runs at minutes 0, 15, 30, and 45 of every hour, every day.
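If you do not need the expression to be configurable, you can also pass it (or one of HangFire's Cron helper methods) inline; a short sketch reusing the classes above, with placeholder job IDs:

// Passing the CRON expression directly
RecurringJob.AddOrUpdate("send", () => Business.Send(), "0,15,30,45 * * * *");

// Or using the Cron helper class for common patterns (here, every 15 minutes)
RecurringJob.AddOrUpdate("delete", () => Business.SendDelete(), Cron.MinuteInterval(15));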
Web.config
<configSections>
  <!-- Points to the HangFireConfiguration class -->
  <section name="hangfire" type="MyProject.Configuration.HangFireConfiguration" />
</configSections>

<!-- Points to the .config file -->
<hangfire configSource="Configs\hangfire.config" />
Conclusion
Given the scenario you described, I would probably install HangFire in your ASP.NET MVC application and remove the console application, simply because it is one less project to worry about. Even though you can install it in a console application, I would rather not follow that path, because if you hit a brick wall (and you will, trust me), chances are you'll find help mostly for cases where it was installed in an ASP.NET application.

There is no need for another console application to update the database. You can use Hangfire in your MVC application itself.
http://docs.hangfire.io/en/latest/configuration/index.html
After adding the Hangfire configuration, you can use a normal MVC method to do the console operations, like updating the DB.
Based on your requirement you can use:
BackgroundJob.Enqueue --> immediate update to the DB
BackgroundJob.Schedule --> delayed update to the DB
RecurringJob.AddOrUpdate --> recurring update to the DB, like a Windows service
Below is an example:
public class MyController : Controller
{
    public void MyMVCMethod(int Id)
    {
        BackgroundJob.Enqueue(() => UpdateDB(Id));
    }

    public void UpdateDB(int Id)
    {
        // Code to update the database.
    }
}
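For the other two cases, a hedged sketch (the job ID, delay, and schedule are placeholders):

// Delayed update: runs once, 30 minutes from now
BackgroundJob.Schedule(() => UpdateDB(42), TimeSpan.FromMinutes(30));

// Recurring update: runs every day at midnight, like a scheduled Windows task
RecurringJob.AddOrUpdate("daily-db-update", () => UpdateDB(42), Cron.Daily());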

Related

Can an Azure Function be Executed for Multiple Environments

I've encountered a dependency injection scenario which I cannot find a way through.
We currently have an Azure function.
We are using dependency injection via the FunctionsStartup attribute.
That all works fine, until I get asked to make it work for multiple environments.
The tester found it too onerous to deploy to 7 different environments, so I was asked to re-jig the function so that it runs (in a loop) for those environments.
That means 7 different IConfigurations and somehow having 7 separate compartmentalised IOC registrations of services.
I can't think of a way of doing that without significantly restructuring the way abstractions are resolved. Even if you set up registrations in a loop and inject an IEnumerable of a service, when it goes to resolve a child dependency it just pulls the last one registered, rather than the one meant to correlate with the current item being iterated.
So, something like this (using Autofac):
Registration
foreach (var configuration in configurations)
{
    containerBuilder.Register<ICosmosDbService<AccountUsage>>(sp =>
    {
        var dBConfig = CosmosDBHelper.GetProjectDatabaseConfig(configuration.Value, Project.Jupiter);
        return CosmosClientInitializer<AccountUsage>.Initialize(dBConfig);
    }).As<ICosmosDbService<AccountUsage>>();
}
Usage
private readonly IEnumerable<IAccountUsageService> _accountUsageService;

public JobScheduler(IEnumerable<IAccountUsageService> accountUsageService)
{
    _accountUsageService = accountUsageService;
}

[FunctionName("JobScheduler")]
public async Task Run([TimerTrigger("0 */2 * * * *")] TimerInfo myTimer, ILogger log)
{
    log.LogInformation($"Job Scheduler Timer trigger function executed at: {DateTime.Now}");
    try
    {
        foreach (var usageService in _accountUsageService)
        {
            var logs = await usageService.GetCurrentAccountUsage("gfkjdsasjfa");
            // ...
        }
    }
I realise this kind of DI usage is not ideal (and does not even work).
Is there a way to structure an Azure Function such that it can execute for different configurations in a compartmentalised manner? Or is this really just fighting against the technology?
You've got a couple of ways to do this - either inject the right dependencies into the function constructor, or resolve them dynamically using a service-locator type approach with a named instance.
Let's consider the second approach and what it would mean for your implementation. As you demonstrated, you'd loop through your instances, resolve the dependency you want to use, then invoke it:
foreach (var usageService in _accountUsageService)
{
    var logs = await usageService.GetCurrentAccountUsage("named-instance");
    logs.DoSomething();
}
This is technically possible, but now you're doing batch processing - you're doing more than one piece of work triggered by a single event (the timer), which means you have to deal with a couple of extra problems: what should you do if there's a failure with one of the instances, and what should you do if one of the instances is running slowly?
Ideally, you want functions to do the smallest piece of work they can, and to complete quickly - you don't want failure or slowness in one particular instance impacting the others. By breaking the work down to the smallest piece (think: one event trigger does one piece of work), you can take advantage of the functions runtime for things like retries on failure, and threading and concurrency are then handled for you by the runtime.
You could then do this in a couple of ways: a) multiple function signatures and a service-resolver approach, e.g.
public class JobScheduler
{
    private readonly IEnumerable<IAccountUsageService> _accountUsageService;

    public JobScheduler(IEnumerable<IAccountUsageService> accountUsageService)
    {
        _accountUsageService = accountUsageService;
    }

    [FunctionName("FirstInstance")]
    public async Task FirstInstance([TimerTrigger("%MetricPoller:Schedule%")] TimerInfo myTimer)
    {
        var logs = await _accountUsageService.GetNamedInstance("instance-a");
        logs.DoSomething();
    }

    [FunctionName("SecondInstance")]
    public async Task SecondInstance([TimerTrigger("%MetricPoller:Schedule%")] TimerInfo myTimer)
    {
        var logs = await _accountUsageService.GetNamedInstance("instance-b");
        logs.DoSomething();
    }
}
or b), multiple classes with the necessary dependencies injected
public class JobSchedulerFirstInstance
{
    private readonly ILogs _logs;

    public JobSchedulerFirstInstance(ILogs logs)
    {
        _logs = logs;
    }

    [FunctionName("FirstInstance")]
    public Task FirstInstance([TimerTrigger("%MetricPoller:Schedule%")] TimerInfo myTimer)
    {
        _logs.DoSomething();
        return Task.CompletedTask;
    }
}
I'd personally lean towards the multiple-classes approach and register named instances with my container. A bit of extra wire-up work is needed, but you'll end up with lots of small, similar classes that are basically just plumbing for the functions runtime to execute.
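For completeness, a minimal sketch of that named-instance wire-up in Autofac (reusing the types from the question; the environment names and the Key property are placeholders):

// Register one Cosmos service per environment under a name
foreach (var configuration in configurations)
{
    var envName = configuration.Key; // e.g. "instance-a"
    containerBuilder.Register(c =>
    {
        var dbConfig = CosmosDBHelper.GetProjectDatabaseConfig(configuration.Value, Project.Jupiter);
        return CosmosClientInitializer<AccountUsage>.Initialize(dbConfig);
    }).Named<ICosmosDbService<AccountUsage>>(envName);
}

// Each small function class then resolves exactly the instance it needs
var serviceA = scope.ResolveNamed<ICosmosDbService<AccountUsage>>("instance-a");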

Use ASP.NET MVC Unity for Data Caching

I have an ASP.NET MVC project running on .NET Framework 4.6.1.
I have recently added the Unity.Mvc5 IoC framework for dependency injection.
In order to have flexibility for unit testing and other needs, I moved my Unity configuration to a separate class library so that I can call the Unity register methods from unit test projects and elsewhere as needed.
Here is my high-level solution design.
I would like to use the same class library to implement application cache.
When I installed Unity.Mvc5 from the NuGet package it added the following references (I added some of them manually):
Microsoft.Practices.EnterpriseLibrary.Caching 5.0.505.0
Enterprise Library Shared Library 5.0.505.0
Microsoft.Practices.ServiceLocation 1.3.0.0
Microsoft.Practices.Unity 4.0.0.0
Microsoft.Practices.Unity.Configuration 4.0.0.0
Microsoft.Practices.Unity.Interception 2.1.505.0
Microsoft.Practices.Unity.Interception.Configuration 2.1.505.0
Microsoft.Practices.Unity.RegistrationByConvention 4.0.0.0
I tried a few articles to implement Application Block Cache Management so that I can cache data in my service-implementer layers, but all of that documentation shows code examples expecting a Unity 2.x version.
Here is my Unity configuration:
public static class UnityConfig
{
    public static void RegisterComponents()
    {
        var container = new UnityContainer();

        container.RegisterType<UserManager<User>>(new HierarchicalLifetimeManager());
        container.RegisterType<IUserStore<User>, UserStore<User>>(new HierarchicalLifetimeManager());
        container.RegisterType<DbContext, OfficeGxDbContext>(new HierarchicalLifetimeManager());
        container.RegisterType<IAppSetting, AppSettingService>();
        container.RegisterType<ISubscription, SubscriptionService>();

        DependencyResolver.SetResolver(new UnityDependencyResolver(container));
    }
}
In my AppSettingService.cs I have a get-all method:
public List<AppSetting> All()
{
    using (var context = new MyDbContext())
    {
        //CachecKeyItem.AppSettingsAll
        return context.AppSettings.Where(x => !x.IsDeleted)
            .Include(x => x.Module).ToList();
    }
}
I want to store this data in the cache and reuse it, and do the same across all the projects in my solution. If any DB record is added, updated, or deleted, I want the cached object to refresh so that it is always in sync with the DB data.
I ended up doing something like this:
public interface ICacheService
{
    T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class;
}
Service Implementor
public class InMemoryCache : ICacheService
{
    public T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class
    {
        if (MemoryCache.Default.Get(cacheKey) is T item) return item;
        item = getItemCallback();
        MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(8));
        return item;
    }
}
used like this:
_cacheService.GetOrSet(CachecKeyItem.AppSettingsAll, () => context.AppSettings
    .Where(x => !x.IsDeleted)
    .Include(x => x.Module).ToList());
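For completeness, wiring this into the existing UnityConfig could look like the following sketch (the singleton lifetime means all consumers share one cache):

// ContainerControlledLifetimeManager makes this a singleton registration
container.RegisterType<ICacheService, InMemoryCache>(new ContainerControlledLifetimeManager());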
Now my question is: when there is any change to the data (add/edit/delete), how do I refresh the cache in the most efficient way? I know deleting the key would be one way; is there a better one?
It depends slightly on whether you're running a single-server or multi-server system. It's generally better to design for multi-server; that way, if you ever decide to scale out, you're already sorted.
So, assuming you can fire a message off to all servers without issue when your cache invalidates, the easiest way is to delete your cache key... that also opens up a few areas for optimization.
Is the cached data used heavily? In that case you may benefit from a pre-fetch cache, where you either send the updated cache entry to all servers or require all servers to ask for it. Is it used very little? Then you don't want to re-populate until it's requested; otherwise you're needlessly bloating your application.
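As a concrete illustration of the delete-the-key approach, the InMemoryCache above could grow an invalidation method (a sketch; the method name is my own, and it would also need adding to ICacheService):

public void Invalidate(string cacheKey)
{
    // The next GetOrSet call for this key will repopulate from the database
    MemoryCache.Default.Remove(cacheKey);
}

// e.g. after saving an AppSetting:
_cacheService.Invalidate(CachecKeyItem.AppSettingsAll);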

How do I setup Windsor container on a console application to inject to external library

I have a console app and web API both referencing the same data layer which is a separate project.
In that data layer, I have a class that requires a repository that we are grabbing from the container when that class is instantiated.
That class has a base class which does the following in its constructor to set up the repository:
IContainerAccessor containerAccessor = HttpContext.Current.ApplicationInstance as IContainerAccessor;
Repository = containerAccessor.Container.Resolve<IRepository>();
What would be the best way to set this up? This is obviously a problem for our console application as it has no HttpContext.
If I'm correct, you want to set up your console app so it can inject classes from the shared data layer.
To do so, you need to create an installer for the console app and tell it to run the installers in the shared library, but modify the lifestyle from 'PerWebRequest' to 'Singleton' or 'Transient'.
For more information read this article:
http://blog.ploeh.dk/2010/04/26/ChangingWindsorlifestylesafterthefact/
Be aware that changing this may cause problems.
E.g.: if multiple components configured as "PerWebRequest" require a unit of work to be injected, that UoW will be different for each component if you change the lifestyle to Transient.
Changing it to Singleton causes the same problem in reverse: objects created now will be shared across different requests...
If you are okay with those trade-offs, this code should get you started:
public class ConsoleAppInstaller : IWindsorInstaller
{
    public void Install(IWindsorContainer container, IConfigurationStore store)
    {
        // 1) make sure we do not use PerWebRequest lifestyle types
        var convertWebToTransient = new WebToTransientConvertor();
        container.Kernel.ComponentModelBuilder.AddContributor(convertWebToTransient);

        // 2) call installers on all libraries we use ...
        container.Install(FromAssembly.Containing<SharedDataLayerInstaller>());

        // 3) link internal services ...
        container.Register(Component.For<IXxxxFactory>().AsFactory());
        container.Register(Component.For<IYyyyFactory>().AsFactory());
        container.Register(Classes.FromThisAssembly().Where(c => typeof(Form).IsAssignableFrom(c)).LifestyleTransient());
    }

    public static IWindsorContainer Bootstrap()
    {
        return new WindsorContainer().Install(FromAssembly.This());
    }
}
/// <summary>
/// This class intercepts installers using PerWebRequest lifestyles and replaces that lifestyle with another one.
/// <code>container.Kernel.ComponentModelBuilder.AddContributor(new WebToTransientConvertor())</code>
/// </summary>
public class WebToTransientConvertor : IContributeComponentModelConstruction
{
    //http://blog.ploeh.dk/2010/04/26/ChangingWindsorlifestylesafterthefact/
    public void ProcessModel(IKernel kernel, ComponentModel model)
    {
        if (model.LifestyleType == LifestyleType.PerWebRequest)
            //model.LifestyleType = LifestyleType.Transient;
            model.LifestyleType = LifestyleType.Singleton;
    }
}
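Wiring this into the console app's entry point could then look like the following sketch (IRepository comes from the shared data layer):

class Program
{
    static void Main(string[] args)
    {
        // Build the container once at startup, running all installers
        using (var container = ConsoleAppInstaller.Bootstrap())
        {
            // Resolve from the container root instead of HttpContext
            var repository = container.Resolve<IRepository>();

            // ... do work with the repository ...
        }
    }
}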

How does EF7 bootstrap dependency injection (IServiceCollection) if you're not in ASP.NET 5?

I'm still trying to get my head around what's what with ASP.NET 5 / EF 7. I'm using DNX projects (.xproj).
Startup is used by OWIN/ASP.NET for configuring, loading services, etc. But it's also used for EF 7 migrations (to set your DbContextOptions, for example).
My main goal is to understand how EF7 (and ASP.NET 5) bootstrap with Startup: who creates the startup class, initializes the DI container, etc.
An example of what I need to do, for context: in my xUnit unit tests (which are in their own assembly and reference my data assembly, which doesn't have a Startup class), I need to call AddDbContext to set my connection.
I have the sample startup class:
namespace Radar.Data
{
    using Microsoft.AspNet.Builder;
    using Microsoft.AspNet.Hosting;
    using Microsoft.Data.Entity;
    using Microsoft.Extensions.Configuration;
    using Microsoft.Extensions.DependencyInjection;
    using Microsoft.Extensions.PlatformAbstractions;

    public class Startup
    {
        public IConfigurationRoot Configuration { get; set; }

        public Startup(IHostingEnvironment env, IApplicationEnvironment appEnv)
        {
            var builder = new ConfigurationBuilder()
                .SetBasePath(appEnv.ApplicationBasePath)
                .AddJsonFile("appsettings.json");
            Configuration = builder.Build();
        }

        public void ConfigureServices(IServiceCollection services)
        {
            services.AddEntityFramework()
                .AddSqlServer()
                .AddDbContext<RadarDbContext>(options =>
                    options.UseSqlServer(Configuration["Data:DefaultConnection:ConnectionString"]));
        }

        public void Configure(IApplicationBuilder app)
        {
        }
    }
}
This is currently in my data assembly, not my unit test assembly. I tried adding the app setting (I know it's OWIN, but I thought I'd give it a shot):
<appSettings>
  <add key="owin:appStartup" value="Radar.Data.Startup, Radar.Data" />
</appSettings>
The startup class is not getting executed.
I'd really like an understanding of the overall mechanism with Startup, who calls it, etc., but for now I just need to understand how EF 7 initializes its dependencies/services so that I can properly initialize my unit tests.
UPDATE
Here's what I've got in my unit test so far (I thought I had it working at one point):
ServiceCollection serviceCollection = new ServiceCollection();
IServiceProvider serviceProvider = serviceCollection.BuildServiceProvider();
DbContextActivator.ServiceProvider = serviceProvider;

serviceCollection.AddEntityFramework()
    .AddSqlServer()
    .AddDbContext<RadarDbContext>(
        options => options.UseSqlServer("Server=.;Database=SonOfRadar;Trusted_Connection=True;MultipleActiveResultSets=True"));
but now I'm getting "No service for type 'Microsoft.Data.Entity.Internal.IDbSetInitializer' has been registered" when my DbContext is instantiated. So obviously not all of the EF services are getting loaded.
If I comment out:
DbContextActivator.ServiceProvider = serviceProvider;
it errors earlier with: "No database providers are configured. Configure a database provider by overriding OnConfiguring in your DbContext class or in the AddDbContext method when setting up services."
Setting DbContextActivator.ServiceProvider is the only hook I can find in EF7 for setting your own provider. I'd be just as happy getting an instance of EF7's internal service collection and working with that. I think I'm going to scour the EF7 unit test code again and see if I'm missing a critical piece.
The Startup class is created by the Microsoft.AspNet.Hosting package when you run your web application (see StartupLoader.cs).
You can also look at the WebApplication.Run method (WebApplication.Run); it's the entry point to ASP.NET 5 web applications.
DI is initialized in the WebHostBuilder class (WebHostBuilder.cs) and, inside DNX, in the Bootstrapper class (Bootstrapper.cs).
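As for the unit-test snippet in the question: one thing that stands out is that BuildServiceProvider is called before AddEntityFramework, so the EF services never make it into the provider that DbContextActivator receives. A minimal sketch of the intended order (against RC-era EF7 APIs, so names may differ in your build):

var services = new ServiceCollection();

// Register EF and the context BEFORE building the provider
services.AddEntityFramework()
    .AddSqlServer()
    .AddDbContext<RadarDbContext>(options =>
        options.UseSqlServer("Server=.;Database=SonOfRadar;Trusted_Connection=True"));

// The provider built now actually contains the EF services
IServiceProvider provider = services.BuildServiceProvider();

// Resolve the context from the container inside the test
using (var context = provider.GetRequiredService<RadarDbContext>())
{
    // ... arrange/act/assert against context ...
}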

How can we support modular and testable patterns with ASP.NET MVC 4 and MEF 2?

We're trying to use MEF 2 with ASP.NET MVC 4 to support an extensible application. There are really two parts to this question (hope that's okay, SO gods):
How do we use Microsoft.Composition and the MVC container code (MEF/MVC demo source) to replace Ninject as our DI for ICoreService, ICoreRepository, IUnitOfWork, and IDbContext?
It looks like we can't use both Ninject and the MVC container at the same time (I'm sure many are saying "duh"), so we'd like to go with MEF if possible. I tried removing Ninject and setting [Export] attributes on each of the relevant implementations, spanning two assemblies in addition to the web project, but Save() failed to persist with no errors. I interpreted that as a singleton issue, but could not figure out how to sort it out (incl. [Shared]).
How do we load multiple assemblies dynamically at runtime?
I understand how to use CompositionContainer.AddAssemblies() to load specific DLLs, but for our application to be properly extensible, we require something more akin to how I (vaguely) understand catalogs in "full" MEF, which have been stripped out of the Microsoft.Composition package (I think?); that would allow us to load all IPluggable (or whatever) assemblies, which will include their own UI, service, and repository layers and tie in to the Core service/repo too.
EDIT 1
A little more reading solved the first problem, which was indeed a singleton issue. Attaching [Shared(Boundaries.HttpRequest)] to the CoreDbContext solved the persistence problem. When I tried simply [Shared], it expanded the 'singletonization' to the application level (cross-request) and threw an exception saying that the edited object was already in the EF cache.
EDIT 2
I used the iterative assembly-loading "meat" from Nick Blumhardt's answer below to update my Global.asax.cs code. The standard MEF 2 container from his code did not work in mine, probably because I'm using the MEF 2(?) MVC container. Summary: the code listed below now works as desired.
CoreDbContext.cs (Data.csproj)
[Export(typeof(IDbContext))]
[Shared(Boundaries.HttpRequest)]
public class CoreDbContext : IDbContext { ... }
CoreRepository.cs (Data.csproj)
[Export(typeof(IUnitOfWork))]
[Export(typeof(ICoreRepository))]
public class CoreRepository : ICoreRepository, IUnitOfWork
{
    [ImportingConstructor]
    public CoreRepository(IInsightDbContext context)
    {
        _context = context;
    }
    ...
}
CoreService.cs (Services.csproj)
[Export(typeof(ICoreService))]
public class CoreService : ICoreService
{
    [ImportingConstructor]
    public CoreService(ICoreRepository repository, IUnitOfWork unitOfWork)
    {
        _repository = repository;
        _unitOfWork = unitOfWork;
    }
    ...
}
UserController.cs (Web.csproj)
public class UsersController : Controller
{
    [ImportingConstructor]
    public UsersController(ICoreService service)
    {
        _service = service;
    }
    ...
}
Global.asax.cs (Web.csproj)
public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        CompositionProvider.AddAssemblies(
            typeof(ICoreRepository).Assembly,
            typeof(ICoreService).Assembly);

        // EDIT 2 --
        // updated code to answer my 2nd question based on Nick Blumhardt's answer
        foreach (var file in System.IO.Directory.GetFiles(Server.MapPath("Plugins"), "*.dll"))
        {
            try
            {
                var name = System.Reflection.AssemblyName.GetAssemblyName(file);
                var assembly = System.Reflection.Assembly.Load(name);
                CompositionProvider.AddAssembly(assembly);
            }
            catch
            {
                // You'll need to craft exception handling to
                // your specific scenario.
            }
        }
    }
}
If I understand you correctly, you're looking for code that will load all assemblies from a directory into the container; here's a skeleton for doing that:
var config = new ContainerConfiguration();

foreach (var file in Directory.GetFiles(@".\Plugins", "*.dll"))
{
    try
    {
        var name = AssemblyName.GetAssemblyName(file);
        var assembly = Assembly.Load(name);
        config.WithAssembly(assembly);
    }
    catch
    {
        // You'll need to craft exception handling to
        // your specific scenario.
    }
}

var container = config.CreateContainer();
// ...
Hammett discusses this scenario and shows a more complete version in F# here: http://hammett.castleproject.org/index.php/2011/12/a-decent-directorycatalog-implementation/
Note, this won't detect assemblies added to the directory after the application launches - Microsoft.Composition isn't intended for that kind of use, so if the set of plug-ins changes your best bet is to detect that with a directory watcher and prompt the user to restart the app. HTH!
MEF is not intended to be used as a DI framework. This means you should separate your "plugins" composition (whatever they are) from your infrastructure dependencies, implementing the former via MEF and the latter via whatever DI framework you prefer.
I think there are a few misunderstandings about what MEF can and can't do.
Originally MEF was conceived purely as an extensibility architecture, but as the framework evolved up to its first release, it became fully supportable as a DI container as well. MEF will handle dependency injection for you, and does so through its ExportProvider architecture. It is also entirely possible to use other DI frameworks with MEF. So in reality there are a number of ways things could be achieved:
Build a NinjectExportProvider that you can plug into MEF, so when MEF is searching for available exports, it will be able to interrogate your Ninject container.
Use an implementation of the Common Services Locator pattern to bridge between MEF and Ninject or vice versa.
Because you are using MEF for the extensibility, you'll probably want the former, as it exposes your Ninject components to MEF, which in turn exposes them to your plugins.
The other thing to consider, which is a bit disappointing, is that in reality there isn't much room for automagically plugging in features à la WordPress on ASP.NET. ASP.NET is a compiled and managed environment, and because of that you either resort to late binding by loading assemblies manually at runtime, or you restart the application to pick up new plugins, which somewhat defeats the purpose of being able to plug new extensions into the application.
My advice: plan your architecture to pick up any extensibility points at startup, and assume that any core changes will require a deployment and an application restart.
In terms of the direct questions asked:
The CompositionProvider accepts an instance of ContainerConfiguration, which is used internally to create the CompositionContainer used by the provider. So you could use this as the point at which you customise how your container is instantiated. ContainerConfiguration supports a WithProvider method:
var configuration = new ContainerConfiguration()
    .WithProvider(new NinjectExportDescriptorProvider(kernel));
CompositionProvider.SetConfiguration(configuration);
Where NinjectExportDescriptorProvider might be:
public class NinjectExportDescriptorProvider : ExportDescriptorProvider
{
    private readonly IKernel _kernel;

    public NinjectExportDescriptorProvider(IKernel kernel)
    {
        if (kernel == null) throw new ArgumentNullException("kernel");
        _kernel = kernel;
    }

    public override IEnumerable<ExportDescriptorPromise> GetExportDescriptors(
        CompositionContract contract, DependencyAccessor dependencyAccessor)
    {
        var type = contract.ContractType;
        if (!_kernel.GetBindings(type).Any())
            return NoExportDescriptors;

        return new[] {
            new ExportDescriptorPromise(
                contract,
                "Ninject Kernel",
                true, // Hmmm... need to consider this: true creates it as a shared part, false as a new instance each time
                NoDependencies,
                _ => ExportDescriptor.Create((c, o) => _kernel.Get(type), NoMetadata)) };
    }
}
Note: I have not tested this, this is all theory, and is based on the example AppSettingsExportDescriptorProvider at: http://mef.codeplex.com/wikipage?title=ProgrammingModelExtensions
It's different from using the standard ExportProvider, because the CompositionProvider is built around lightweight composition. But essentially you're wrapping access to your Ninject kernel and making it available to your CompositionContainer.
As with adding a specific new provider (see above), you can use the ContainerConfiguration to read the available assemblies, probably something like:
var configuration = new ContainerConfiguration()
    .WithAssemblies(AppDomain.CurrentDomain.GetAssemblies());
Again, I haven't tested all of this, but I hope it at least points you in the right direction.
