I have a scenario in which I start a new thread from an action, and the thread performs some background work. Sometimes, if the thread takes longer to complete the work, the session seems to expire. Is there any way to keep the session alive from the new thread?
You should avoid accessing any HttpContext resources such as Session in your background threads. You could pass the information this thread needs as a parameter.
For example:
public ActionResult Index()
{
// Get the value you will need from the session
SomeModel model = (SomeModel)Session["myModel"];
// start a new Thread
Thread thread = new Thread(DoWork);
thread.Start(model);
return View();
}
private void DoWork(object state)
{
SomeModel model = (SomeModel)state;
// do the work here without ever accessing the HttpContext
}
And if you need information from multiple sources such as session, models, cookies, and so on, simply build a new model that aggregates them all:
public class MyModel
{
public SomeModelFromSession SessionData { get; set; }
public SomeModelFromCookie CookieData { get; set; }
...
}
and then pass this new model to your background thread when starting it.
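For example, the controller could build the aggregate model and hand it to the thread, in the same shape as the earlier snippet. ReadCookieData here is a hypothetical helper, and the property types are the illustrative ones from the model above:
public ActionResult Index()
{
    // Gather everything the background work needs while we still have the HttpContext
    MyModel model = new MyModel
    {
        SessionData = (SomeModelFromSession)Session["myModel"],
        CookieData = ReadCookieData() // hypothetical helper that reads the cookie
    };

    Thread thread = new Thread(DoWork);
    thread.Start(model);
    return View();
}

private void DoWork(object state)
{
    MyModel model = (MyModel)state;
    // Use model.SessionData / model.CookieData here; never touch HttpContext
}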
When you copy the file to your local file system, serialize the session data (or whatever data is necessary) to a JSON file with the same file name, or write that information to a database.
My suggestions are:
This should be handled by a separate Windows service or an executable running in the background.
If you want the web application to handle this, you can create a thread when the web application starts that checks for new files to upload to S3. The drawback is that you cannot know when IIS recycles the web application, and in some scenarios you can have more than one instance of the same web application (depending on the application pool configuration), so you have to handle application restarts and multiple instances gracefully.
Or maybe you can use a queue system (MSMQ, or a database-backed queue).
You can even use Redis or a database.
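A rough sketch of the "serialize to a JSON file with the same file name" idea above (SessionSnapshot and its property names are made up for illustration; JavaScriptSerializer ships in System.Web.Extensions, but any JSON serializer will do):
using System.IO;
using System.Web.Script.Serialization;

public class SessionSnapshot
{
    public string UserId { get; set; }
    public string TargetBucket { get; set; }
}

public void CopyFileWithMetadata(string localPath, SessionSnapshot snapshot)
{
    // Write the session data next to the copied file, using the same file
    // name with a .json extension, so the background uploader can pick up
    // both without ever touching the session.
    string metadataPath = Path.ChangeExtension(localPath, ".json");
    File.WriteAllText(metadataPath, new JavaScriptSerializer().Serialize(snapshot));
}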
I have developed an ASP.NET MVC application using Entity Framework Code First. In my app I have a class which maps to a table with the following properties:
public class Comments
{
    public int Id { get; set; }
    public string Comment { get; set; }
    public DateTime LastEdit { get; set; }
}
I want my app to be able to delete (remove) comments that are older than 40 days automatically.
How can I achieve that?
This has nothing to do with the class you posted or with ASP.NET, or MVC, or even Entity Framework really. This is just about scheduling a task to run each day, which will identify data and delete it.
Essentially you have two primary options...
1) Create a Windows Service. This would include a Timer which is set to execute every 24 hours (or any other interval you see fit). The action invoked by that Timer would connect to the database, identify the records to be deleted, and delete them.
2) Create a Console Application. This wouldn't internally run any kind of schedule, but would just perform a one-time action of connecting to the database, identifying the records to be deleted, and deleting them. This application would be scheduled to run periodically (again, every day sounds reasonable) using the host system's task scheduler.
It would make sense to use the same Entity Framework code that the web application uses, so you would want to make sure that code is in its own Class Library project and then both the Web Application and the Windows Service would reference that project.
Alternatively, if you need to keep this local to the web application itself, then the application would need to do this in response to a request of some kind. That would hurt performance of the application, but it's still possible. Any time a user requests a given page you can first perform those deletes and then return the requested view. Again, this is not ideal because it means doing this many times a day and interrupting the user experience (even for just a moment). It's best to offload background data maintenance to an offline application, such as a Windows Service or a Console Application.
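For option 1, a minimal sketch of the Windows Service might look like the following; BlogContext, the Comments set, and the service name are placeholders, and the EF code would come from the shared class library mentioned above:
using System;
using System.Linq;
using System.ServiceProcess;
using System.Timers;

public class CommentCleanupService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        // Fire roughly once every 24 hours; the handler does the clean-up.
        _timer = new Timer(TimeSpan.FromHours(24).TotalMilliseconds);
        _timer.Elapsed += (sender, e) => DeleteOldComments();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
    }

    private void DeleteOldComments()
    {
        // Same EF code the web application uses, pulled in from the shared
        // class library (BlogContext and Comments are placeholder names).
        using (var db = new BlogContext())
        {
            var cutoff = DateTime.Now.AddDays(-40);
            foreach (var comment in db.Comments.Where(c => c.LastEdit <= cutoff).ToList())
            {
                db.Comments.Remove(comment);
            }
            db.SaveChanges();
        }
    }
}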
1. You can create a trigger in the database for this.
2. As David says, using a Windows Service or a Console Application, create an application that connects to the database and does something like this:
public void DeleteOldComments()
{
    // The question asks for comments older than 40 days
    var cutoff = DateTime.Now.AddDays(-40);
    var oldComments = Db.CommentsTable.Where(e => e.LastEdit <= cutoff).ToList();
    foreach (var comment in oldComments)
    {
        Db.CommentsTable.Remove(comment);
    }
    Db.SaveChanges();
}
3. Long and bad way: create a settings table and save a LastCommentDeleteDate there. In your site layout, use JavaScript to call an action that fires DeleteOldComments() once a day (using a cookie). On any request, check LastCommentDeleteDate and, if it is due, call the delete function.
For now I have this context:
namespace Dafoor_MVC.Models
{
public class DafoorDBContext : DbContext
{
public DbSet<Department> departments { get; set; }
public DbSet<Course> courses { get; set; }
public DbSet<Reply> replies { get; set; }
}
}
This context will grow large because I have about 40 models that I want to add.
1. Is it a good idea to have the 40 models in one context?
2. I want this context to be shared among all users, because I don't want to hit the database with queries every time if the record is already in a context. But this will affect the server's memory, so how can I implement something like "the last object used gets disposed, or an object that hasn't been accessed for some amount of time gets disposed from the context"? I don't want to dispose the whole context.
3. If point 2 doesn't work, can I put an instance of the context in a user session so the context will be user specific rather than application specific?
Is it a good idea to have the 40 models in one context?
There is nothing wrong with that if they all logically belong together.
I want this context to be shared among all users
No, you don't. You want to instantiate a context for each individual HTTP request, and dispose of it before the handling of the HTTP request is done. Do not cache your DbContext.
Can I put an instance of the context in a user session so the context will be user specific rather than application specific?
You should not cache your context. However, you can store objects retrieved using the context in Session. You will not be able to update the objects / object graph without first re-attaching the objects to a new context.
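For completeness, a minimal sketch of that re-attach step, using the Reply set from the context in the question (the Session key is made up; EntityState lives in System.Data.Entity on EF 6, System.Data on earlier versions):
// reply was loaded by an earlier request's context and stashed in Session
Reply reply = (Reply)Session["editedReply"];

using (var db = new DafoorDBContext())
{
    // Attach the detached entity to the new context and mark it modified
    // so SaveChanges issues an UPDATE for it.
    db.replies.Attach(reply);
    db.Entry(reply).State = EntityState.Modified;
    db.SaveChanges();
}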
UPDATE
Here's why the DbContext should never be stored beyond the current HTTP request:
One DbContext per web request... why?
1) There is no reason to worry about having 40+ DbSets in a single context class. They're simply collections and are only populated with objects you're currently using.
2) Instance members of DbContext and DbSet are not guaranteed to be thread-safe. I don't recommend the singleton approach.
3) You could, but be sure to handle database concurrency exceptions properly (see the sketch below).
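For point 3, a hedged sketch of what handling such an exception might look like with DbContext; this resolves the conflict by letting the database values win, which may or may not be the policy you want:
using System.Data.Entity.Infrastructure;
using System.Linq;

private void SaveWithConcurrencyHandling(DafoorDBContext db)
{
    try
    {
        db.SaveChanges();
    }
    catch (DbUpdateConcurrencyException ex)
    {
        // Another request updated the same row first; reload the entity from
        // the database (the database values win) and try saving again.
        ex.Entries.Single().Reload();
        db.SaveChanges();
    }
}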
I have a helper class which reads a big XML document and generates a list of C# objects.
I work with these objects quite a lot, so I thought the best approach would be to save them in memory and access them from there.
I made a simple repository which gets an object from memory and, if it doesn't exist, adds it.
The Repository looks like this:
public class XmlDocumentRepository
{
private readonly ICacheStorage _cacheStorage;
public XmlDocumentRepository(ICacheStorage cacheStorage)
{
_cacheStorage = cacheStorage;
}
private readonly object _locker = new object();
private void DeserializeXmlDocument()
{
lock (_locker)
{
// Deserialize the XML document, generate the C# objects, and save them in the cache
IEnumerable<Page> pages = new XmlDeserializerHelper().DeserializeXml();
foreach(var page in pages)
{
_cacheStorage.Add(page.Id, page); // assuming Page exposes an Id matching the lookup key
}
}
}
public Page GetPage(Guid page_Id)
{
Page page = _cacheStorage.Get<Page>(page_Id);
if (page != null)
return page;
lock (_locker)
{
page = _cacheStorage.Get<Page>(page_Id);
if (page != null)
return page;
DeserializeXmlDocument();
page = _cacheStorage.Get<Page>(page_Id);
return page;
}
}
}
The XmlDocumentRepository is used inside a web application (ASP.NET MVC, more exactly).
Is the implementation of the repository good? Am I using the lock statements properly?
In my comments on the question I misunderstood whether the cache was shared. I think you will need to do one of the following:
Make XmlDocumentRepository a singleton which is used across all requests; because the lock object is a private instance field, each request currently gets a new instance of the repository with its own lock.
Make the lock object a static field so that it is shared across all XmlDocumentRepository instances, as sketched below.
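The second option is a one-line change in the repository, sketched here:
// Shared by every XmlDocumentRepository instance, so concurrent requests
// contend on the same lock even though each request builds its own repository.
private static readonly object _locker = new object();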
As a primary rule, you want to protect every access to data stores that are used by multiple threads. I see several potential problems with your implementation:
1: ICacheStorage is provided from the outside, which means that this collection could be modified elsewhere, which may or may not be protected by locks. Maybe you should require that the collection itself uses locking internally, or other thread-safety mechanisms?
2: You have inconsistent lock protection of data access. In GetPage you access _cacheStorage before applying the lock, while in DeserializeXmlDocument you access it inside a lock. This means you may get a result where one thread is adding to the cache while another is reading from it.
3: Do you require thread safety for the cache, for xml reading, or both?
If you only need to protect the cache, move the reading of the XML outside the lock. If protecting both, you should put the entire GetPage function inside the lock.
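If you decide to protect both, GetPage could collapse to something like this sketch; it gives up the lock-free fast path in exchange for simpler reasoning:
public Page GetPage(Guid page_Id)
{
    lock (_locker)
    {
        // Cache lookup and XML deserialization both happen under the same
        // lock, so no thread can observe a half-built cache.
        Page page = _cacheStorage.Get<Page>(page_Id);
        if (page == null)
        {
            DeserializeXmlDocument();
            page = _cacheStorage.Get<Page>(page_Id);
        }
        return page;
    }
}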
I have a "blog" website developed using ASP.NET MVC 1. Recent version of MVC includes AsyncController feature. This actually requires some additional task in development. But how can I reuse my existing code without modifying my business layer.
Some part of the code looks like:
BlogPost post = new BlogPost();
post.GetPost(58345);
BlogComment comments = new BlogComment();
comments.GetComments(58345);
In the current setup, I need to wait until the request completes the two operations. Using AsyncController, I can do the two operations simultaneously, but the classes BlogPost and BlogComment need to be changed to support asynchronous operations, such as adding event handlers to know whether an operation has completed, and so on.
How can I do asynchronous operations without modifying the existing business layer?
You could do this:
public class BlogController : AsyncController
{
private readonly IBlogRepository _repository;
public BlogController(IBlogRepository repository)
{
_repository = repository;
}
public void ShowAsync(int id)
{
AsyncManager.OutstandingOperations.Increment(2);
new Thread(() =>
{
AsyncManager.Parameters["post"] = _repository.GetPost(id);
AsyncManager.OutstandingOperations.Decrement();
}).Start();
new Thread(() =>
{
AsyncManager.Parameters["comments"] = _repository.GetComments(id);
AsyncManager.OutstandingOperations.Decrement();
}).Start();
}
public ActionResult ShowCompleted(Post post, IEnumerable<Comment> comments)
{
return View(new BlogViewModel
{
Post = post,
Comments = comments,
});
}
}
You should measure the performance of your application and decide whether introducing an async controller brings any value to the performance.
First, why do you think you need to do Async Controllers? Are you experiencing some performance problem that you think Async will help you with? Why complicate your application with Async handling if you do not really need it?
Async is really designed to handle much more massive scaling, or for when you need to do non-CPU-bound operations in your controllers that might take a long time to execute.
Second, I think you are a bit confused about how Async controllers operate. You don't need to modify your business layer in most cases; you simply need to create an async "shim" to wrap your business layer. Async does not mean "multi-threaded". It will still use one thread per request, and you will still call your business logic single-threaded (unless you write code to do things multi-threaded).
All Async controllers do is allow for better utilization of the thread pool. When you have threads that are not CPU bound, they can be returned to the thread pool while waiting for your request to be re-activated, thus allowing the thread pool to be better utilized, rather than tying up a thread doing nothing but waiting.
If you need to call multiple operations, you use the AsyncManager.OutstandingOperations property to control how many operations must complete before the request is finished.
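To make the "shim" idea concrete, here is a sketch that wraps the existing synchronous BlogPost/BlogComment calls without touching them. It assumes, as in the question's snippet, that GetPost/GetComments populate the objects in place; note that because the work is queued onto thread-pool threads, this mainly runs the two calls in parallel rather than freeing threads, so measure whether it actually helps:
using System.Threading;
using System.Web.Mvc;

public class BlogController : AsyncController
{
    public void ShowAsync(int id)
    {
        AsyncManager.OutstandingOperations.Increment(2);

        // The shim: queue the unchanged, synchronous business-layer calls
        // onto worker threads and report back through AsyncManager.
        ThreadPool.QueueUserWorkItem(_ =>
        {
            var post = new BlogPost();
            post.GetPost(id); // assumed to populate the object, as in the question
            AsyncManager.Parameters["post"] = post;
            AsyncManager.OutstandingOperations.Decrement();
        });

        ThreadPool.QueueUserWorkItem(_ =>
        {
            var comments = new BlogComment();
            comments.GetComments(id);
            AsyncManager.Parameters["comments"] = comments;
            AsyncManager.OutstandingOperations.Decrement();
        });
    }

    public ActionResult ShowCompleted(BlogPost post, BlogComment comments)
    {
        ViewData["Post"] = post;
        ViewData["Comments"] = comments;
        return View();
    }
}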
Right now I'm having an issue with a Singleton that I just wrote for use in ASP.NET MVC -- My Singleton looks like this:
public sealed class RequestGenerator : IRequestGenerator
{
// Singleton pattern
private RequestGenerator()
{
requestList = new Stack<Request>();
appSettings = new WebAppSettings();
}
private static volatile RequestGenerator instance = new RequestGenerator();
private static Stack<Request> requestList = new Stack<Request>();
// abstraction layer for accessing web.config
private static IAppSettings appSettings = new WebAppSettings();
// used for "lock"-ing to prevent race conditions
private static object syncRoot = new object();
// public accessor for singleton
public static IRequestGenerator Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
{
instance = new RequestGenerator();
}
}
}
return instance;
}
}
private const string REQUESTID = "RequestID";
// Find functions
private Request FindRequest(string component, string requestId)
private List<Request> FindAllRequests(string component, string requestId)
#region Public Methods required by Interface
// Gets and increments last Request ID from Web.Config, creates new Request, and returns RequestID
public string GetID(string component, string userId)
// Changes state of Request to "submitted"
public void SetID(string component, string requestId)
// Changes state of Request to "success" or "failure" and records result for later output
public void CloseID(string component, string requestId, bool success, string result)
// Verifies that Component has generated a Request of this ID
public bool VerifyID(string component, string requestId)
// Verifies that Component has generated a Request of this ID and is owned by specified UserId
public bool VerifyID(string component, string userId, string requestId)
// Returns State of Request ID (Open, Submitted, etc.)
public Status GetState(string component, string requestId)
// Returns Result String of Success or Failure.
public string GetResult(string component, string requestId)
#endregion
}
And my controller code looks like this:
public ViewResult SomeAction()
{
IRequestGenerator reqGen = RequestGenerator.Instance;
string requestId = reqGen.GetID(someComponentName, someUserId);
return View(requestId);
}
Everything works okay the first time I hit the controller. "reqGen" is assigned the instance of the Singleton. A new instance of Request is added to the internal list of the Singleton. And then we return a View(). The next time I hit this controller's SomeAction(), I'm expecting the Singleton to contain the list with the Request instance I had just added, but instead the list is empty.
What's happened? Has Garbage Collection gobbled up my object? Is there something special I need to consider when implementing the Singleton pattern in ASP.NET MVC?
Thanks!
EDIT: Ahh, the lightbulb just went on. So each new page request takes place in a completely new process! Got it. (my background is in desktop application development, so this is a different paradigm for me...)
EDIT2: Sure, here's some more clarification. My application needed a request number system where something being requested needed a unique ID, but I had no DB available. But it had to be available to every user to log the state of each request. I also realized that it could double as a way to regulate the session, say, if a user double-clicked the request button. A singleton seemed like the way to go, but realizing that each request is in its own process basically eliminates the singleton. And I guess that also eliminates the static class, right?
EDIT3: ok, I've added the actual code that I'm working with (minus the implementation of each Method, for simplicity sake...) I hope this is clearer.
EDIT4: I'm awarding the green check mark to Chris as I'm beginning to realize that an application-level singleton is just like having a global (and globals are evil, right?) -- All kidding aside, the best option really is to have a DB, and SQLite seems like the best fit for now, although I can definitely see myself moving to an Oracle instance in the future. Unfortunately, the best option then would be to use an ORM, but that's another learning curve to climb. Bugger.
EDIT5: Last edit, I swear. :-)
So I tried using HttpRuntime.Cache, but was surprised to find that my cache was getting flushed/invalidated constantly and couldn't figure out what was going on. Well, I was getting tripped up by a side-effect of something else I was doing: Writing to "Web.config"
The Answer --> Unbeknownst to me, when "web.config" is altered in any way, the application is RESTARTED! Yup, everything gets thrown away: my singleton, my cache, everything. Gah. No wonder nothing was working right. It looks like writing back to web.config is generally bad practice, which I shall now eschew.
Thanks again to everyone who helped me out with this quandary.
The singleton is specific to the processing instance. A new instance is being generated for each page request. Page requests are generally considered stateless so data from one doesn't just stick around for another.
In order to get this to work at the application level, the instance variable will have to be declared there. See this question for a hint on how to create an application level variable. Note that this would make it available across all requests.. which isn't always what you want.
Of course, if you are trying to implement some type of session state then you might just use session or use some type of caching procedure.
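If you do go the application-level route, a minimal sketch looks like this (Application state is shared by every request, so writes are guarded with Lock/UnLock; the key name is just illustrative):
// In Global.asax.cs
protected void Application_Start()
{
    Application.Lock();
    Application["RequestGenerator"] = RequestGenerator.Instance;
    Application.UnLock();
}

// Later, inside a controller action:
// var reqGen = (IRequestGenerator)HttpContext.Application["RequestGenerator"];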
UPDATE
Based on your edits: a static class should not maintain data. Its purpose is simply to group some common methods together, but it shouldn't store data between method calls. A singleton is an altogether different thing in that it is a class for which you only ever want one object to be created.
Neither of those seem to be what you want.
Now, having an application level singleton would be available to the entire application, but that crosses requests and would have to be coded accordingly.
It almost sounds like you are trying to build an in memory data store. You could go down the path of utilizing one of the various caching mechanisms like .NET Page.Cache, MemCache, or Enterprise Library's Caching Application Block.
However, all of those have the problem of getting cleared whenever the worker process hosting the application gets recycled, which can happen at the worst times and will happen based on seemingly random things like memory usage, a timer expiring, a certain number of page recompiles, etc.
Instead, I'd highly recommend using some type of persisted storage, whether that be just XML files that you read/write from or embedding something like SQLite into the application. SQLite is a very lightweight database that doesn't require installation on the server; you just need the assemblies.
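A hedged sketch of the SQLite route using the System.Data.SQLite assemblies (the connection string, table, and columns are made up for illustration):
using System.Data.SQLite;

public void SaveRequestState(string requestId, string state)
{
    using (var connection = new SQLiteConnection(@"Data Source=|DataDirectory|\requests.db"))
    {
        connection.Open();
        using (var command = new SQLiteCommand(
            "INSERT INTO Requests (Id, State) VALUES (@id, @state)", connection))
        {
            // Parameterized to avoid building SQL by string concatenation
            command.Parameters.AddWithValue("@id", requestId);
            command.Parameters.AddWithValue("@state", state);
            command.ExecuteNonQuery();
        }
    }
}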
You can use Dependency Injection to control the life of the class. Here's the line you could add in your web.config if you were using Castle Windsor.
<component id="MySingleton" service="IMySingleton, MyInterfaceAssembly"
type="MySingleton, MyImplementationAssembly" lifestyle="Singleton" />
Of course, the topic of wiring up your application to use DI is beyond my answer, but either you're using it and this answer helps you, or you can take a peek at the concept and fall in love with it. :)
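If you would rather register in code than in XML, the rough equivalent with Windsor's fluent API looks like this (the exact lifestyle method name varies a little between Windsor versions):
using Castle.MicroKernel.Registration;
using Castle.Windsor;

public static IWindsorContainer BuildContainer()
{
    var container = new WindsorContainer();
    container.Register(
        Component.For<IMySingleton>()
                 .ImplementedBy<MySingleton>()
                 .LifestyleSingleton()); // one instance for the lifetime of the container
    return container;
}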