I have read lots of information about page caching and partial page caching in an MVC application. However, I would like to know how you would cache data.
In my scenario I will be using LINQ to Entities (entity framework). On the first call to GetNames (or whatever the method is) I want to grab the data from the database. I want to save the results in cache and on the second call to use the cached version if it exists.
Can anyone show an example of how this would work, where it should be implemented (the model?), and whether it would work.
I have seen this done in traditional ASP.NET apps, typically for very static data.
Here's a nice and simple cache helper class/service I use:
using System.Runtime.Caching;
public class InMemoryCache: ICacheService
{
public T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class
{
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(10));
}
return item;
}
}
interface ICacheService
{
T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class;
}
Usage:
cacheProvider.GetOrSet("cache key", (delegate method if cache is empty));
The cache provider will check whether there's anything stored under "cache key" in the cache, and if there isn't, it will call the delegate method to fetch the data and store it in the cache.
Example:
var products = cacheService.GetOrSet("catalog.products", () => productRepository.GetAll());
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
string[] names = Cache["names"] as string[];
if(names == null) //not in cache
{
names = DB.GetNames();
Cache["names"] = names;
}
return names;
}
A bit simplified but I guess that would work. This is not MVC specific and I have always used this method for caching data.
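To the "where should this be implemented" part of the question: I keep a method like this in the model/repository layer and just call it from the controller. A rough sketch (NamesRepository and the controller are hypothetical names, and Cache in the snippet above would typically be HttpRuntime.Cache when used outside a controller or page):
public class NamesController : Controller
{
    public ActionResult Index()
    {
        // First request hits the database; subsequent requests within the
        // cache lifetime get the cached string array.
        string[] names = new NamesRepository().GetNames();
        return View(names);
    }
}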
I'm referring to TT's post and suggest the following approach:
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
var noms = Cache["names"];
if(noms == null)
{
noms = DB.GetNames();
Cache["names"] = noms;
}
return ((string[])noms);
}
You should not return a value re-read from the cache, since you'll never know whether at that specific moment it is still in the cache. Even if you inserted it in the statement before, it might already be gone or might never have been added to the cache - you just don't know.
So you add the data read from the database and return it directly, not re-reading from the cache.
For .NET Framework 4.5+
Add a reference: System.Runtime.Caching
Add the using statement:
using System.Runtime.Caching;
public string[] GetNames()
{
var noms = System.Runtime.Caching.MemoryCache.Default["names"];
if(noms == null)
{
noms = DB.GetNames();
System.Runtime.Caching.MemoryCache.Default["names"] = noms;
}
return ((string[])noms);
}
In the .NET Framework 3.5 and earlier versions, ASP.NET provided an in-memory cache implementation in the System.Web.Caching namespace. In previous versions of the .NET Framework, caching was available only in the System.Web namespace and therefore required a dependency on ASP.NET classes. In the .NET Framework 4, the System.Runtime.Caching namespace contains APIs that are designed for both Web and non-Web applications.
More info:
https://msdn.microsoft.com/en-us/library/dd997357(v=vs.110).aspx
https://learn.microsoft.com/en-us/dotnet/framework/performance/caching-in-net-framework-applications
Steve Smith did two great blog posts which demonstrate how to use his CachedRepository pattern in ASP.NET MVC. It uses the repository pattern effectively and allows you to get caching without having to change your existing code.
http://ardalis.com/Introducing-the-CachedRepository-Pattern
http://ardalis.com/building-a-cachedrepository-via-strategy-pattern
In these two posts he shows you how to set up this pattern and also explains why it is useful. By using this pattern you get caching without your existing code seeing any of the caching logic. Essentially you use the cached repository as if it were any other repository.
I have used it in this way and it works for me.
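To give a rough idea of the shape of the pattern, here is a minimal sketch of a caching decorator around a repository; the IProductRepository and Product types are placeholders of my own, not code from the posts:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IProductRepository
{
    IEnumerable<Product> GetAll();
}

// The decorator adds caching; callers only ever see IProductRepository.
public class CachedProductRepository : IProductRepository
{
    private readonly IProductRepository _inner;

    public CachedProductRepository(IProductRepository inner)
    {
        _inner = inner;
    }

    public IEnumerable<Product> GetAll()
    {
        const string cacheKey = "products.all";
        var products = MemoryCache.Default.Get(cacheKey) as List<Product>;
        if (products == null)
        {
            products = _inner.GetAll().ToList();
            MemoryCache.Default.Add(cacheKey, products, DateTimeOffset.Now.AddMinutes(10));
        }
        return products;
    }
}
The composition root then hands out the cached repository wherever an IProductRepository is requested, so the rest of the code never sees the caching.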
https://msdn.microsoft.com/en-us/library/system.web.caching.cache.add(v=vs.110).aspx
Parameter info for System.Web.Caching.Cache.Add:
public string GetInfo()
{
string name = string.Empty;
if(System.Web.HttpContext.Current.Cache["KeyName"] == null)
{
name = GetNameMethod();
System.Web.HttpContext.Current.Cache.Add("KeyName", name, null, DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration, CacheItemPriority.AboveNormal, null);
}
else
{
name = System.Web.HttpContext.Current.Cache["KeyName"] as string;
}
return name;
}
AppFabric Caching is a distributed, in-memory caching technology that stores data in key-value pairs using the physical memory of multiple servers. It provides performance and scalability improvements for .NET Framework applications. See the AppFabric caching concepts and architecture documentation for details.
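If you go the AppFabric route, basic usage looks roughly like the sketch below. This is from memory of the Microsoft.ApplicationServer.Caching client API and assumes the client assemblies are referenced and a cache cluster is configured:
using System;
using Microsoft.ApplicationServer.Caching;

public class DistributedNamesCache
{
    // DataCacheFactory is expensive to create; build it once and reuse it.
    private static readonly DataCacheFactory Factory = new DataCacheFactory();

    public string[] GetNames(Func<string[]> loadFromDatabase)
    {
        DataCache cache = Factory.GetDefaultCache();
        var names = cache.Get("names") as string[];
        if (names == null)
        {
            names = loadFromDatabase();
            cache.Put("names", names); // visible to every server in the cluster
        }
        return names;
    }
}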
Extending Hrvoje Hudo's answer...
Code:
using System;
using System.Runtime.Caching;
public class InMemoryCache : ICacheService
{
public TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class
{
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
public TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class
{
string cacheKey = string.Format(cacheKeyFormat, id);
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback(id);
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
}
interface ICacheService
{
TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class;
TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class;
}
Examples
Single item caching (when each item is cached based on its ID because caching the entire catalog for the item type would be too intensive).
Product product = cache.Get("product_{0}", productId, 10, productData.getProductById);
Caching all of something
IEnumerable<Categories> categories = cache.Get("categories", 20, categoryData.getCategories);
Why TId
The second helper is especially nice because most data keys are not composite. Additional methods could be added if you use composite keys often. This way you avoid doing all sorts of string concatenation or string.Format calls to get the key to pass to the cache helper. It also makes passing the data access method easier because you don't have to pass the ID into the wrapper method... the whole thing becomes very terse and consistent for the majority of use cases.
Here's an improvement to Hrvoje Hudo's answer. This implementation has a couple of key improvements:
Cache keys are created automatically based on the function to update data and the object passed in that specifies dependencies
Pass in time span for any cache duration
Uses a lock for thread safety
Note that this has a dependency on Newtonsoft.Json to serialize the dependsOn object, but that can be easily swapped out for any other serialization method.
ICache.cs
public interface ICache
{
T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class;
}
InMemoryCache.cs
using System;
using System.Reflection;
using System.Runtime.Caching;
using Newtonsoft.Json;
public class InMemoryCache : ICache
{
private static readonly object CacheLockObject = new object();
public T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class
{
string cacheKey = GetCacheKey(getItemCallback, dependsOn);
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
lock (CacheLockObject)
{
// Re-check inside the lock so only the first thread runs the callback and populates the cache.
item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.Add(duration));
}
}
}
return item;
}
private string GetCacheKey<T>(Func<T> itemCallback, object dependsOn) where T: class
{
var serializedDependants = JsonConvert.SerializeObject(dependsOn);
// Key on the callback's method (not the delegate type) so different callbacks get different keys.
var method = itemCallback.Method;
return method.DeclaringType.FullName + "." + method.Name + serializedDependants;
}
}
Usage:
var order = _cache.GetOrSet(
() => _session.Set<Order>().SingleOrDefault(o => o.Id == orderId)
, new { id = orderId }
, new TimeSpan(0, 10, 0)
);
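As noted above, the Newtonsoft.Json dependency is only used to build the cache key and can be swapped for any serializer. For example, a sketch of GetCacheKey using the BCL's JavaScriptSerializer (from System.Web.Script.Serialization in the System.Web.Extensions assembly) instead:
// Drop-in replacement for GetCacheKey above, without the Json.NET dependency.
private string GetCacheKey<T>(Func<T> itemCallback, object dependsOn) where T : class
{
    var serializedDependants = new System.Web.Script.Serialization.JavaScriptSerializer().Serialize(dependsOn);
    var method = itemCallback.Method;
    return method.DeclaringType.FullName + "." + method.Name + serializedDependants;
}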
public sealed class CacheManager
{
private static volatile CacheManager instance;
private static object syncRoot = new Object();
private ObjectCache cache = null;
private CacheItemPolicy defaultCacheItemPolicy = null;
private CacheEntryRemovedCallback callback = null;
private bool allowCache = true;
private CacheManager()
{
cache = MemoryCache.Default;
callback = new CacheEntryRemovedCallback(this.CachedItemRemovedCallback);
defaultCacheItemPolicy = new CacheItemPolicy();
defaultCacheItemPolicy.AbsoluteExpiration = DateTime.Now.AddHours(1.0);
defaultCacheItemPolicy.RemovedCallback = callback;
allowCache = StringUtils.Str2Bool(ConfigurationManager.AppSettings["AllowCache"]);
}
public static CacheManager Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
{
instance = new CacheManager();
}
}
}
return instance;
}
}
public IEnumerable GetCache(String Key)
{
if (Key == null || !allowCache)
{
return null;
}
try
{
String Key_ = Key;
if (cache.Contains(Key_))
{
return (IEnumerable)cache.Get(Key_);
}
else
{
return null;
}
}
catch (Exception)
{
return null;
}
}
public void ClearCache(string key)
{
// MemoryCache does not accept null values, so remove the entry directly instead of re-adding null.
cache.Remove(key);
}
public bool AddCache(String Key, IEnumerable data, CacheItemPolicy cacheItemPolicy = null)
{
if (!allowCache) return true;
try
{
if (Key == null)
{
return false;
}
if (cacheItemPolicy == null)
{
cacheItemPolicy = defaultCacheItemPolicy;
}
String Key_ = Key;
lock (Key_)
{
return cache.Add(Key_, data, cacheItemPolicy);
}
}
catch (Exception)
{
return false;
}
}
private void CachedItemRemovedCallback(CacheEntryRemovedArguments arguments)
{
String strLog = String.Concat("Reason: ", arguments.RemovedReason.ToString(), " | Key-Name: ", arguments.CacheItem.Key, " | Value-Object: ", arguments.CacheItem.Value.ToString());
LogManager.Instance.Info(strLog);
}
}
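Usage is along these lines (a sketch; productList stands in for whatever IEnumerable you loaded from the database, and the AllowCache app setting plus the StringUtils and LogManager helpers are assumed to exist in your project):
// Store a result set under a key, using the default one-hour policy.
CacheManager.Instance.AddCache("products", productList);

// Read it back later; returns null if the entry expired or was never added.
var cachedProducts = CacheManager.Instance.GetCache("products");
if (cachedProducts == null)
{
    // reload from the database and call AddCache again
}

// Drop the entry explicitly when the underlying data changes.
CacheManager.Instance.ClearCache("products");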
I use two classes. The first one is the core cache object:
public class Cacher<TValue>
where TValue : class
{
#region Properties
private Func<TValue> _init;
public string Key { get; private set; }
public TValue Value
{
get
{
var item = HttpRuntime.Cache.Get(Key) as TValue;
if (item == null)
{
item = _init();
HttpRuntime.Cache.Insert(Key, item);
}
return item;
}
}
#endregion
#region Constructor
public Cacher(string key, Func<TValue> init)
{
Key = key;
_init = init;
}
#endregion
#region Methods
public void Refresh()
{
HttpRuntime.Cache.Remove(Key);
}
#endregion
}
The second one is a list of cache objects:
public static class Caches
{
static Caches()
{
Languages = new Cacher<IEnumerable<Language>>("Languages", () =>
{
using (var context = new WordsContext())
{
return context.Languages.ToList();
}
});
}
public static Cacher<IEnumerable<Language>> Languages { get; private set; }
}
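Usage then looks like this:
// The first access hits the database through WordsContext; later accesses are
// served from HttpRuntime.Cache until the entry is evicted or refreshed.
var languages = Caches.Languages.Value;

// Force a reload on the next access.
Caches.Languages.Refresh();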
If you find the previous solutions too complicated, implementing a singleton to hold the data can also solve this persistence problem:
public class GPDataDictionary
{
private Dictionary<string, object> configDictionary = new Dictionary<string, object>();
/// <summary>
/// Configuration values dictionary
/// </summary>
public Dictionary<string, object> ConfigDictionary
{
get { return configDictionary; }
}
private static GPDataDictionary instance;
public static GPDataDictionary Instance
{
get
{
if (instance == null)
{
instance = new GPDataDictionary();
}
return instance;
}
}
// private constructor
private GPDataDictionary() { }
} // singleton
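Usage is then just (subjectlist being whatever you loaded from the database):
// Store once...
GPDataDictionary.Instance.ConfigDictionary["subjectlist"] = subjectlist;

// ...and read it back anywhere else in the application.
var cached = GPDataDictionary.Instance.ConfigDictionary["subjectlist"];
You can also put the data straight into the ASP.NET cache: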
HttpContext.Current.Cache.Insert("subjectlist", subjectlist);
You can also try the output caching built into ASP.NET MVC:
Add the following attribute to the controller method you'd like to cache:
[OutputCache(Duration=10)]
In this case the result of this action will be cached for 10 seconds.
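For example, on a hypothetical controller action:
public class HomeController : Controller
{
    // The rendered output of this action is cached for 10 seconds.
    [OutputCache(Duration = 10, VaryByParam = "none")]
    public ActionResult Index()
    {
        ViewBag.GeneratedAt = DateTime.Now; // stays the same for up to 10 seconds
        return View();
    }
}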
Related
How to use DecorateAllWith to decorate with a DynamicProxy all instances implements an interface?
For example:
public class ApplicationServiceInterceptor : IInterceptor
{
public void Intercept(IInvocation invocation)
{
// ...
invocation.Proceed();
// ...
}
}
public class ApplicationServiceConvention : IRegistrationConvention
{
public void Process(Type type, Registry registry)
{
if (type.CanBeCastTo<IApplicationService>() && type.IsInterface)
{
var proxyGenerator = new ProxyGenerator();
// ??? how to use proxyGenerator??
// ???
registry.For(type).DecorateAllWith(???); // How to use DecorateAllWith DynamicProxy ...??
}
}
}
I could decorate some interfaces to concrete types using (for example):
var proxyGenerator = new ProxyGenerator();
registry.For<IApplicationService>().Use<BaseAppService>().DecorateWith(service => proxyGenerator.CreateInterfaceProxyWithTargetInterface(....))
But I haven't been able to use DecorateAllWith to do this.
To call registry.For<>().Use<>().DecorateWith() I have to do this:
if (type.CanBeCastTo<IApplicationService>() && !type.IsAbstract)
{
var interfaceToProxy = type.GetInterface("I" + type.Name);
if (interfaceToProxy == null)
return null;
var proxyGenerator = new ProxyGenerator();
// Build expression to use registration by reflection
var expression = BuildExpressionTreeToCreateProxy(proxyGenerator, type, interfaceToProxy, new MyInterceptor());
// Register using reflection
var f = CallGenericMethod(registry, "For", interfaceToProxy);
var u = CallGenericMethod(f, "Use", type);
CallMethod(u, "DecorateWith", expression);
}
Only for crazy minds ...
I'm starting to get very tired of StructureMap: many changes and no documentation. I have been reading the source code, but ... it's too much effort for my objective ...
If someone can shed a bit of light on this, I will be grateful.
Thanks in advance.
In addition ... I'm posting here the real code of my helper that generates the expression tree and registers the plugin family:
public static class RegistrationHelper
{
public static void RegisterWithInterceptors(this Registry registry, Type interfaceToProxy, Type concreteType,
IInterceptor[] interceptors, ILifecycle lifecycle = null)
{
var proxyGenerator = new ProxyGenerator();
// Generate an expression tree to call DecorateWith of the StructureMap SmartInstance type
// registry.For<interfaceToProxy>().Use<concreteType>()
// .DecorateWith(ex => (IApplicationService)
// proxyGenerator.CreateInterfaceProxyWithTargetInterface(interfaceToProxy, ex, interceptors))
var expressionParameter = Expression.Parameter(interfaceToProxy, "ex");
var proxyGeneratorConstant = Expression.Constant(proxyGenerator);
var interfaceConstant = Expression.Constant(interfaceToProxy);
var interceptorConstant = Expression.Constant(interceptors);
var methodCallExpression = Expression.Call(proxyGeneratorConstant,
typeof (ProxyGenerator).GetMethods().First(
met => met.Name == "CreateInterfaceProxyWithTargetInterface"
&& !met.IsGenericMethod && met.GetParameters().Count() == 3),
interfaceConstant,
expressionParameter,
interceptorConstant);
var convert = Expression.Convert(methodCallExpression, interfaceToProxy);
var func = typeof(Func<,>).MakeGenericType(interfaceToProxy, interfaceToProxy);
var expr = Expression.Lambda(func, convert, expressionParameter);
// Register using reflection
registry.CallGenericMethod("For", interfaceToProxy, new[] {(object) lifecycle /*Lifecicle*/})
.CallGenericMethod("Use", concreteType)
.CallNoGenericMethod("DecorateWith", expr);
}
}
public static class CallMethodExtensions
{
/// <summary>
/// Call a method with a generic parameter by reflection (obj.methodName[genericType](parameters))
/// </summary>
/// <returns></returns>
public static object CallGenericMethod(this object obj, string methodName, Type genericType, params object[] parameters)
{
var metod = obj.GetType().GetMethods().First(m => m.Name == methodName && m.IsGenericMethod);
var genericMethod = metod.MakeGenericMethod(genericType);
return genericMethod.Invoke(obj, parameters);
}
/// <summary>
/// Call a method without a generic parameter by reflection (obj.methodName(parameters))
/// </summary>
/// <returns></returns>
public static object CallNoGenericMethod(this object obj, string methodName, params object[] parameters)
{
var method = obj.GetType().GetMethods().First(m => m.Name == methodName && !m.IsGenericMethod);
return method.Invoke(obj, parameters);
}
}
Almost two years later, I needed to return to this issue for a new project. This time I solved it using StructureMap 4.
You can use a custom interceptor policy to decorate an instance based on its type. You have to implement an interceptor and an interceptor policy, and configure them on a registry.
The Interceptor
public class MyExInterceptor : Castle.DynamicProxy.IInterceptor
{
public void Intercept(Castle.DynamicProxy.IInvocation invocation)
{
Console.WriteLine("-- Call to " + invocation.Method);
invocation.Proceed();
}
}
The interceptor policy
public class CustomInterception : IInterceptorPolicy
{
public string Description
{
get { return "good interception policy"; }
}
public IEnumerable<IInterceptor> DetermineInterceptors(Type pluginType, Instance instance)
{
if (pluginType == typeof(IAppService))
{
// DecoratorInterceptor is the simple case of wrapping one type with another
// concrete type that takes the first as a dependency
yield return new FuncInterceptor<IAppService>(i =>
(IAppService)
DynamicProxyHelper.CreateInterfaceProxyWithTargetInterface(typeof(IAppService), i));
}
}
}
Configuration
var container = new Container(_ =>
{
_.Policies.Interceptors(new CustomInterception());
_.For<IAppService>().Use<AppServiceImplementation>();
});
var service = container.GetInstance<IAppService>();
service.DoWork();
You can get a working example in this gist: https://gist.github.com/tolemac/3e31b44b7fc7d0b49c6547018f332d68. In the gist you can find three types of decoration; the third is like this answer.
Using it you can configure the decorators of your services easily.
The ASP.NET Web API team decided to use the JSON.NET library for model binding JSON data. However, "normal" MVC controllers still use the inferior JsonDataContractSerializer. This causes issues with parsing dates and is causing me a lot of headaches.
See this for reference:
http://www.devcurry.com/2013/04/json-dates-are-different-in-aspnet-mvc.html
The author chooses to solve the issue in the Knockout layer on the client. But I would prefer to solve this by using the same JSON.NET model binder in MVC controllers as in Web API controllers.
How do I substitute a different JSON model binder into ASP.NET MVC? Specifically, the JSON.NET library. Using the same model binder from Web API would be ideal if possible.
I have done this, and also heavily customized the serialization that Json.NET is doing, by:
Replace the default formatter in global.asax.cs, Application_Start:
GlobalConfiguration.Configuration.Formatters.Remove(GlobalConfiguration.Configuration.Formatters.JsonFormatter);
GlobalConfiguration.Configuration.Formatters.Add(new CustomJsonMediaTypeFormatter());
And my CustomJsonMediaTypeFormatter is:
public static class CustomJsonSettings
{
private static JsonSerializerSettings _settings;
public static JsonSerializerSettings Instance
{
get
{
if (_settings == null)
{
var settings = new JsonSerializerSettings();
// Must convert times coming from the client (always in UTC) to local - need both these parts:
settings.Converters.Add(new IsoDateTimeConverter { DateTimeStyles = System.Globalization.DateTimeStyles.AssumeUniversal }); // Critical part 1
settings.DateTimeZoneHandling = DateTimeZoneHandling.Local; // Critical part 2
// Skip circular references
settings.ReferenceLoopHandling = ReferenceLoopHandling.Ignore;
// Handle special cases in json (self-referencing loops, etc)
settings.ContractResolver = new CustomJsonResolver();
_settings = settings;
}
return _settings;
}
}
}
public class CustomJsonMediaTypeFormatter : MediaTypeFormatter
{
public JsonSerializerSettings _jsonSerializerSettings;
public CustomJsonMediaTypeFormatter()
{
_jsonSerializerSettings = CustomJsonSettings.Instance;
// Fill out the mediatype and encoding we support
SupportedMediaTypes.Add(new MediaTypeHeaderValue("application/json"));
SupportedEncodings.Add(new UTF8Encoding(false, true));
}
public override bool CanReadType(Type type)
{
return true;
}
public override bool CanWriteType(Type type)
{
return true;
}
public override Task<object> ReadFromStreamAsync(Type type, Stream stream, HttpContent content, IFormatterLogger formatterLogger)
{
// Create a serializer
JsonSerializer serializer = JsonSerializer.Create(_jsonSerializerSettings);
// Create task reading the content
return Task.Factory.StartNew(() =>
{
using (StreamReader streamReader = new StreamReader(stream, SupportedEncodings.First()))
{
using (JsonTextReader jsonTextReader = new JsonTextReader(streamReader))
{
return serializer.Deserialize(jsonTextReader, type);
}
}
});
}
public override Task WriteToStreamAsync(Type type, object value, Stream stream, HttpContent content, TransportContext transportContext)
{
// Create a serializer
JsonSerializer serializer = JsonSerializer.Create(_jsonSerializerSettings);
// Create task writing the serialized content
return Task.Factory.StartNew(() =>
{
using (StreamWriter streamWriter = new StreamWriter(stream, SupportedEncodings.First()))
{
using (JsonTextWriter jsonTextWriter = new JsonTextWriter(streamWriter))
{
serializer.Serialize(jsonTextWriter, value);
}
}
});
}
}
And finally, the CustomJsonResolver:
public class CustomJsonResolver : DefaultContractResolver
{
protected override IList<JsonProperty> CreateProperties(Type type, Newtonsoft.Json.MemberSerialization memberSerialization)
{
var list = base.CreateProperties(type, memberSerialization);
// Custom stuff for my app
if (type == typeof(Foo))
{
RemoveProperty(list, "Bar");
RemoveProperty(list, "Bar2");
}
return list;
}
private void RemoveProperty(IList<JsonProperty> list, string propertyName)
{
var rmc = list.FirstOrDefault(x => x.PropertyName == propertyName);
if (rmc != null)
{
list.Remove(rmc);
}
}
}
The JsonNetValueProviderFactory proposed here works better than the others I've tried (I had issues with arrays using Greg Ennis' one, for example). The same link also proposes a solution for returning Json from an action.
I am new to the dependency injection pattern and I am having issues getting a new instance of a class from container.Resolve in TinyIoC; it just keeps returning the same instance rather than a new one. Now for the code:
public abstract class HObjectBase : Object
{
private string _name = String.Empty;
public string Name
{
get
{
return this._name;
}
set
{
if (this._name == string.Empty && value.Length > 0 && value != String.Empty)
this._name = value;
else if (value.Length < 1 && value == String.Empty)
throw new FieldAccessException("Objects names cannot be blank");
else
throw new FieldAccessException("Once the internal name of an object has been set it cannot be changed");
}
}
private Guid _id = new Guid();
public Guid Id
{
get
{
return this._id;
}
set
{
if (this._id == new Guid())
this._id = value;
else
throw new FieldAccessException("Once the internal id of an object has been set it cannot be changed");
}
}
private HObjectBase _parent = null;
public HObjectBase Parent
{
get
{
return this._parent;
}
set
{
if (this._parent == null)
this._parent = value;
else
throw new FieldAccessException("Once the parent of an object has been set it cannot be changed");
}
}
}
public abstract class HZoneBase : HObjectBase
{
public new HObjectBase Parent
{
get
{
return base.Parent;
}
set
{
if (value == null || value.GetType() == typeof(HZoneBase))
{
base.Parent = value;
}
else
{
throw new FieldAccessException("Zones may only have other zones as parents");
}
}
}
private IHMetaDataStore _store;
public HZoneBase(IHMetaDataStore store)
{
this._store = store;
}
public void Save()
{
this._store.SaveZone(this);
}
}
And the derived class is a dummy at the moment but here it is
public class HZone : HZoneBase
{
public HZone(IHMetaDataStore store)
: base(store)
{
}
}
Now, since this is meant to be an external library, I have a facade class for accessing everything:
public class Hadrian
{
private TinyIoCContainer _container;
public Hadrian(IHMetaDataStore store)
{
this._container = new TinyIoCContainer();
this._container.Register(store);
this._container.AutoRegister();
}
public HZoneBase NewZone()
{
return _container.Resolve<HZoneBase>();
}
public HZoneBase GetZone(Guid id)
{
var metadataStore = this._container.Resolve<IHMetaDataStore>();
return metadataStore.GetZone(id);
}
public List<HZoneBase> ListRootZones()
{
var metadataStore = this._container.Resolve<IHMetaDataStore>();
return metadataStore.ListRootZones();
}
}
However, the test is failing because the NewZone() method on the Hadrian class keeps returning the same instance.
Test Code
[Fact]
public void ListZones()
{
Hadrian instance = new Hadrian(new MemoryMetaDataStore());
Guid[] guids = { Guid.NewGuid(), Guid.NewGuid(), Guid.NewGuid() };
int cnt = 0;
foreach (Guid guid in guids)
{
HZone zone = (HZone)instance.NewZone();
zone.Id = guids[cnt];
zone.Name = "Testing" + cnt.ToString();
zone.Parent = null;
zone.Save();
cnt++;
}
cnt = 0;
foreach (HZone zone in instance.ListRootZones())
{
Assert.Equal(zone.Id, guids[cnt]);
Assert.Equal(zone.Name, "Testing" + cnt.ToString());
Assert.Equal(zone.Parent, null);
}
}
I know its probably something simple I'm missing with the pattern but I'm not sure, any help would be appreciated.
First, please always simplify the code to what is absolutely necessary to demonstrate the problem, but provide enough that it will actually run; I had to guess what MemoryMetaDataStore does and implement it myself to run the code.
Also, please say where and how things fail, to point others straight to the issue. I spent a few minutes figuring out that the exception I was getting was your actual problem and that you weren't even getting to the assertions.
That said, container.Resolve<HZoneBase>() will always return the same instance because that's how autoregistration in TinyIoC works - once an abstraction has been resolved, the same instance is always returned for subsequent calls.
To change this, add the following line to the Hadrian constructor:
this._container.Register<HZoneBase, HZone>().AsMultiInstance();
This will tell the container to create a new instance for each resolution request for HZoneBase.
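In context, the constructor from the question becomes (only the last line is new):
public Hadrian(IHMetaDataStore store)
{
    this._container = new TinyIoCContainer();
    this._container.Register(store);
    this._container.AutoRegister();

    // Resolve HZoneBase as a fresh HZone on every call instead of reusing one instance.
    this._container.Register<HZoneBase, HZone>().AsMultiInstance();
}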
Also, Bassetassen's answer about the Assert part is correct.
In general, if you want to learn DI, you should read Mark Seemann's excellent book "Dependency Injection in .NET" - not quite an easy read as the whole topic is inherently complex, but it's more than worth it and will let you get into it a few years faster than by learning it on your own.
In your assert stage you are not incrementing cnt. You are also using the actual value as the expected one in the assert. This is confusing, because the failure message will then report the actual value as though it were the expected one.
The assert part should be:
cnt = 0;
foreach (HZone zone in instance.ListRootZones())
{
Assert.Equal(guids[cnt], zone.Id);
Assert.Equal("Testing" + cnt.ToString(), zone.Name);
Assert.Equal(null, zone.Parent);
cnt++;
}
I need to allow my content pipeline extension to use a pattern similar to a factory. I start with a dictionary type:
public delegate T Mapper<T>(MapFactory<T> mf, XElement d);
public class MapFactory<T>
{
Dictionary<string, Mapper<T>> map = new Dictionary<string, Mapper<T>>();
public void Add(string s, Mapper<T> m)
{
map.Add(s, m);
}
public T Get(XElement xe)
{
if (xe == null) throw new ArgumentNullException(
"Invalid document");
var key = xe.Name.ToString();
if (!map.ContainsKey(key)) throw new ArgumentException(
key + " is not a valid key.");
return map[key](this, xe);
}
public IEnumerable<T> GetAll(XElement xe)
{
if (xe == null) throw new ArgumentNullException(
"Invalid document");
foreach (var e in xe.Elements())
{
var val = e.Name.ToString();
if (map.ContainsKey(val))
yield return map[val](this, e);
}
}
}
Here is one type of object I want to store:
public partial class TestContent
{
// Test type
public string title;
// Once test if true
public bool once;
// Parameters
public Dictionary<string, object> args;
public TestContent()
{
title = string.Empty;
args = new Dictionary<string, object>();
}
public TestContent(XElement xe)
{
title = xe.Name.ToString();
args = new Dictionary<string, object>();
xe.ParseAttribute("once", once);
}
}
XElement.ParseAttribute is an extension method that works as one might expect. It returns a boolean that is true if successful.
The issue is that I have many different types of tests, each of which populates the object in a way unique to the specific test. The element name is the key to MapFactory's dictionary. This type of test, while atypical, illustrates my problem.
public class LogicTest : TestBase
{
string opkey;
List<TestBase> items;
public override bool Test(BehaviorArgs args)
{
if (items == null) return false;
if (items.Count == 0) return false;
bool result = items[0].Test(args);
for (int i = 1; i < items.Count; i++)
{
bool other = items[i].Test(args);
switch (opkey)
{
case "And":
result &= other;
if (!result) return false;
break;
case "Or":
result |= other;
if (result) return true;
break;
case "Xor":
result ^= other;
break;
case "Nand":
result = !(result & other);
break;
case "Nor":
result = !(result | other);
break;
default:
result = false;
break;
}
}
return result;
}
public static TestContent Build(MapFactory<TestContent> mf, XElement xe)
{
var result = new TestContent(xe);
string key = "Or";
xe.GetAttribute("op", key);
result.args.Add("key", key);
var names = mf.GetAll(xe).ToList();
if (names.Count() < 2) throw new ArgumentException(
"LogicTest requires at least two entries.");
result.args.Add("items", names);
return result;
}
}
My actual code is more involved as the factory has two dictionaries, one that turns an XElement into a content type to write and another used by the reader to create the actual game objects.
I need to build these factories in code because they map strings to delegates. I have a service that contains several of these factories. The mission is to make these factory classes available to a content processor. Neither the processor itself nor the context it uses as a parameter have any known hooks to attach an IServiceProvider or equivalent.
Any ideas?
I needed to create a data structure essentially on demand, without access to the underlying classes since they came from a third party, in this case XNA Game Studio. There is only one way I know of to do this... statically.
public class TestMap : Dictionary<string, string>
{
private static readonly TestMap map = new TestMap();
private TestMap()
{
Add("Logic", "LogicProcessor");
Add("Sequence", "SequenceProcessor");
Add("Key", "KeyProcessor");
Add("KeyVector", "KeyVectorProcessor");
Add("Mouse", "MouseProcessor");
Add("Pad", "PadProcessor");
Add("PadVector", "PadVectorProcessor");
}
public static TestMap Map
{
get { return map; }
}
public IEnumerable<TestContent> Collect(XElement xe, ContentProcessorContext cpc)
{
foreach(var e in xe.Elements().Where(e => ContainsKey(e.Name.ToString())))
{
yield return cpc.Convert<XElement, TestContent>(
e, this[e.Name.ToString()]);
}
}
}
I took this a step further and created content processors for each type of TestBase:
/// <summary>
/// Turns an imported XElement into a TestContent used for a LogicTest
/// </summary>
[ContentProcessor(DisplayName = "LogicProcessor")]
public class LogicProcessor : ContentProcessor<XElement, TestContent>
{
public override TestContent Process(XElement input, ContentProcessorContext context)
{
var result = new TestContent(input);
string key = "Or";
input.GetAttribute("op", key);
result.args.Add("key", key);
var items = TestMap.Map.Collect(input, context);
if (items.Count() < 2) throw new ArgumentNullException(
"LogicProcessor requires at least two items.");
result.args.Add("items", items);
return result;
}
}
Any attempt to reference or access the class, such as calling TestMap.Map.Collect, will initialize the underlying static instance if needed. I basically moved the code from LogicTest.Build into the processor. I also carry out any needed validation in the processor.
When I get to reading these classes I will have the ContentService to help.
I'm rolling my own ActivatableCollection<T> for db4o but cribbing heavily from the builtin ActivatableList<T> implementation. I'm running into the problem where transparent persistence doesn't seem to be working correctly. In the test code below:
[Fact]
void CanStoreActivatableCollection()
{
var planets = new ActivatableCollection<Planet>();
var pagingMemoryStorage = new PagingMemoryStorage();
var config = Db4oEmbedded.NewConfiguration();
config.Common.Add(new TransparentActivationSupport());
config.Common.Add(new TransparentPersistenceSupport());
config.File.Storage = pagingMemoryStorage;
var objectContainer = Db4oEmbedded.OpenFile(config, "Memory.yap");
planets.Add(new Planet("Mercury"));
objectContainer.Store(planets);
planets.Add(new Planet("Venus"));
planets.Add(new Planet("Earth"));
objectContainer.Commit();
objectContainer.Close();
config = Db4oEmbedded.NewConfiguration();
config.Common.Add(new TransparentActivationSupport());
config.Common.Add(new TransparentPersistenceSupport());
config.File.Storage = pagingMemoryStorage;
objectContainer = Db4oEmbedded.OpenFile(config, "Memory.yap");
planets = objectContainer.Query<ActivatableCollection<Planet>>().FirstOrDefault();
Assert.NotNull(planets);
Assert.Equal(3, planets.Count);
objectContainer.Close();
}
The planet "Mercury" is stored, but not "Venus" and "Earth". If I change from ActivatableCollection to ActivatableList, then all 3 planets are stored.
What am I missing? My ActivatableCollection is just a minimal version of ActivatableList, as best as I can tell.
Below is my implementation of ActivatableCollection:
public class ActivatableCollection<T>
: ICollection<T>
, IActivatable
, INotifyCollectionChanged
{
List<T> _list;
List<T> List
{
get
{
if (_list == null)
_list = new List<T>();
return _list;
}
}
public ActivatableCollection()
{
}
public int Count
{
get
{
ActivateForRead();
return List.Count;
}
}
public bool IsReadOnly
{
get
{
ActivateForRead();
return ((IList) List).IsReadOnly;
}
}
public void Add(T t)
{
ActivateForWrite();
List.Add(t);
OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Add, t));
}
public void Clear()
{
ActivateForWrite();
List.Clear();
OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset));
}
public bool Contains(T t)
{
ActivateForRead();
return List.Contains(t);
}
public void CopyTo(T[] array, int index)
{
ActivateForRead();
List.CopyTo(array, index);
}
public IEnumerator<T> GetEnumerator()
{
ActivateForRead();
return List.GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
public bool Remove(T t)
{
ActivateForWrite();
bool removed = List.Remove(t);
if (removed)
OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Remove, t));
return removed;
}
[Transient]
private IActivator _activator;
public virtual void Bind(IActivator activator)
{
if (_activator == activator)
return;
if (activator != null && _activator != null)
throw new InvalidOperationException();
_activator = activator;
}
public virtual void Activate(ActivationPurpose purpose)
{
if (_activator == null)
return;
_activator.Activate(purpose);
}
protected virtual void ActivateForRead()
{
Activate(ActivationPurpose.Read);
}
protected virtual void ActivateForWrite()
{
Activate(ActivationPurpose.Write);
}
[Transient]
public event NotifyCollectionChangedEventHandler CollectionChanged;
protected virtual void OnCollectionChanged(NotifyCollectionChangedEventArgs e)
{
if (CollectionChanged != null)
CollectionChanged(this, e);
}
}
I've also tried copying the code from GenericTypeHandlerPredicate and registering my ActivatableCollection to use the GenericCollectionTypeHandler. That results in a crash in GenericTypeFor() throwing an InvalidOperationException() when "Mercury" is being stored.
I just want to mention my answers from the db4o forums here as well, for people with a similar problem:
First part of the issue:
From db4o's point of view nothing has changed in the 'ActivatableCollection' object and therefore no changes are stored. This is what is happening:
When you add the items, the ActivatableCollection is marked as changed.
When you commit, the changes are stored. However, the ActivatableCollection still holds a reference to the same List object. db4o only stores the changes in the ActivatableCollection object itself, which is just the reference to the List. Since that reference is unchanged, no actual change is stored.
The List inside the ActivatableCollection is never updated in the database, because it was never marked as 'changed'.
So transparent persistence doesn't see the changes in the list. You can fix your issue simply by using an ActivatableList in your ActivatableCollection implementation: change the List<T> field to the IList<T> interface and instantiate an ActivatableList<T> instead of a List<T>.
The second part of the issue: why doesn't it work even when registering the GenericCollectionTypeHandler for this type? Here we hit an implementation detail. The GenericCollectionTypeHandler has an internal list of supported types, which doesn't include the hand-rolled ActivatableCollection. GenericCollectionTypeHandler is not really part of the public API and is intended for internal use only.
Workaround / Fix
Just use an ActivatableList<T> instead of a List<T> as the backing store; then everything works fine.
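Concretely, in the collection from the question the change is confined to the backing field (a sketch):
// Back the collection with db4o's ActivatableList<T> so changes made through
// the List property are tracked by transparent persistence.
private IList<T> _list;

private IList<T> List
{
    get
    {
        if (_list == null)
            _list = new ActivatableList<T>();
        return _list;
    }
}
With an IList<T> field, the IsReadOnly property can simply return List.IsReadOnly instead of casting to the non-generic IList.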