I've been reading about sessionless controllers lately and it seems an interesting idea, since it improves performance and lets AJAX calls be asynchronous, as they usually should be.
However, I can't figure out a nice way to store data that would previously have been stored in a session. I have a lot of single-fetch data that I retrieve once and carry through several pages. My first thought was to use MemoryCache, but after reading this post I began to doubt it, since IIS can let go of my data at any time.
Because of this, I'm a little confused about how I should store data in a session-like way. I've read a couple of things about NoSQL and MongoDB, but wouldn't that be the same as fetching the data every time I need it?
Can you give me some clarification, and suggest technologies I can use as a temporary data store?
Have you considered using HttpContext.Cache? Since you want something session-like, there is no reason you couldn't create a cache key based upon the SessionID of the current request:
// cache key
var cacheKey = string.Format("{0}-{1}", "SomeKey", Session.SessionID);
// save to cache
HttpContext.Cache.Insert(cacheKey, <yourobject>, null, Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
From there it would simply be a matter of passing along the SessionID and retrieving the object at a later time:
HttpContext.Cache[cacheKey]
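Since IIS can still evict cache entries under memory pressure, you may want to treat the cache as best-effort and reload on a miss. A minimal sketch of that pattern; the helper class and the GetOrLoad signature are illustrative, not part of ASP.NET:

using System;
using System.Web;
using System.Web.Caching;

public static class SessionScopedCache
{
    public static T GetOrLoad<T>(HttpContextBase context, string key, Func<T> load)
        where T : class
    {
        var cacheKey = string.Format("{0}-{1}", key, context.Session.SessionID);
        var cached = context.Cache[cacheKey] as T;
        if (cached == null)
        {
            // IIS may have evicted the entry, so fall back to the original source
            cached = load();
            context.Cache.Insert(cacheKey, cached, null,
                Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
        }
        return cached;
    }
}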
In our application we are using the Grails framework with SQL Server as the database. We have multiple sites, and each site can have multiple users (a few users). If those users access the same method via AJAX it can cause issues, so we made that method synchronized. To minimize database interaction, we store data in a map on a per-site basis, since all the users from one site will get the same data; if the data is more than 10 seconds old, we fetch it from the database and update the map object. We are now getting a lot of database connection close issues on the very first line of the synchronized method, where we load the site object from the database. What is the issue here and how can we resolve it?
def synchronized getData(params) {
    Site site = Site.get(params.siteId)
    // Check whether the site's data is missing from the map or has
    // expired (more than 10 seconds old); if so, fetch it from the
    // database and update the map object.
    // Then build a new list object from the data in the map object.
    return list
}
It's difficult to figure out the exact problem without more information. Several things stand out, though...
I'm not especially familiar with using the synchronized keyword in front of a service method; I would recommend trying the @Synchronized annotation with a static object key:
import groovy.transform.Synchronized

private static final myLock = new Object()

@Synchronized("myLock")
void getData() {
    // do stuff
}
or synchronizing explicitly within the method:
void getData() {
    synchronized (myLock) {
        // do stuff
    }
}
I don't know if that's related to your connection closing issues, but it's worth a try.
But also notably, Grails and Hibernate provide caching of database reads, so if you're loading data that has already been loaded into the Hibernate cache, you don't need to cache it in a Map locally... Grails is already doing that for you. Site site = Site.get(params.siteId) will NOT make a database call if it was called recently and the instance is already cached by the framework.
I would strongly suggest running some performance checks comparing that plain call against caching in a Map object, especially since you're expiring entries after ~10 seconds anyway.
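If you want Site.get() to be served from a cache across requests as well, GORM can enable the Hibernate second-level cache per domain class. A minimal sketch, assuming a cache provider such as Ehcache is configured for the application:

class Site {
    // Enable the Hibernate second-level cache for Site instances, so
    // Site.get(id) can often be served without a database round trip.
    static mapping = {
        cache true
    }
}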
I have an iOS app that takes data from the server as JSON and then deserializes it into objects of different types. The types can be complicated: they can contain subtypes and can inherit, so there are no limitations. Another thing that makes everything even more complicated is that some types are stored as AnyObject? and only at run time are they deserialized into real types according to specific rules. Something like this:
class A {
var typeName: String?
var b: AnyObject?
}
Then, when it's deserialized, it can be done with something like this:
if let someClass = NSClassFromString(typeName) as? SomeGenericType.Type {
    b = someClass.init()
}
Querying also needs to work across all the data. Currently I'm trying to store everything locally, load it into memory, and query it from code. I'm using UserDefaults, but it has some limitations; I also needed to provide custom coding to make it work, and each time I add a new field it turns out I missed something in the coding and nothing works. So it's a pain.
Ideally I would just run some magic command and all the objects would be sent to local storage, no matter how complicated they are, and the same to extract them from this storage. Also, users change the data, so I can't just store the original JSON. And I don't want to convert objects back to JSON, as that's a pain too.
Any suggestions?
If you want to use SQLite, you can store the whole object in one row! I mean you can create a table with two columns: one is the id and the second is your data object (its type should be BLOB). Convert your whole object into Data, store it in the SQLite table, then retrieve it as Data and convert it back into an object when you want to use it. This way your object remains in the same format, as you asked.
Firebase, while meant for online syncing and storage, can also cache everything locally in case you are offline, and it can perform queries against the local cache. It uses JSON.
CouchDB also has a mobile version for iOS.
Both of those are overkill if your dataset is small; you can just store it as a text file and read the JSON back in. See the performance characteristics here. The graph is for a 7 MB file, so if yours is significantly smaller than that, your load time may be minimal.
NSKeyedArchiver.archivedData(withRootObject:) is great for storing custom objects as Data objects. The only thing you need to do to be able to use this is to make your custom objects conform to NSCoding. A great example can be found here:
Save custom objects into NSUserDefaults
Once you have the Data version of the object, it can easily be stored in UserDefaults, as a property in CoreData, or even in the app's keychain entries. Depending on your use case, sensitivity of data, and how much data you intend to store, you might want to use any number of storage methods. NSKeyedArchiver.archivedData(withRootObject:) allows you to pretty much use any of them.
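A minimal sketch of what that conformance looks like; the Person class and its properties are illustrative:

import Foundation

class Person: NSObject, NSCoding {
    var name: String
    var age: Int

    init(name: String, age: Int) {
        self.name = name
        self.age = age
    }

    required init?(coder aDecoder: NSCoder) {
        guard let name = aDecoder.decodeObject(forKey: "name") as? String else { return nil }
        self.name = name
        self.age = aDecoder.decodeInteger(forKey: "age")
    }

    func encode(with aCoder: NSCoder) {
        aCoder.encode(name, forKey: "name")
        aCoder.encode(age, forKey: "age")
    }
}

// Archive to Data, store it (UserDefaults here), and restore it later.
let person = Person(name: "Ann", age: 30)
let data = NSKeyedArchiver.archivedData(withRootObject: person)
UserDefaults.standard.set(data, forKey: "person")

if let saved = UserDefaults.standard.data(forKey: "person"),
   let restored = NSKeyedUnarchiver.unarchiveObject(with: saved) as? Person {
    print(restored.name)
}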
I need to cache JSON data from an API in Swift.
So I researched a lot and came across this post.
I tried to implement option 1 in my app, but the custom manager always returned nil, and I don't know why.
After that I found AwesomeCache, which says it can do awesome API caching.
But I don't know how to implement it.
I referred to this issue, but I still can't figure it out.
This is how my current implementation looks without caching:
Alamofire.request(.GET, "http://api.androidhive.info/volley/person_array.json")
    .responseJSON { (_, _, data, _) in
        let json = JSON(data!)
        let catCount = json.count
        for index in 0..<catCount {
            let name = json[index]["name"].string
            println(name)
        }
    }
Please suggest the best way to cache JSON from an API.
Thanks in advance!
UPDATE
These are my requirements:
Fetch the JSON from the API and parse it. This can be done with the help of Alamofire and SwiftyJSON.
I will populate the parsed data in a table view. This works when the user is online.
But I want to show the data in the table when the user is offline too.
So I need to save the parsed data or the JSON data in a cache, and I need to refresh or expire the cache within a few days or a week.
I'd prefer not to store the JSON on disk, because it will be updated.
Please suggest the best way to achieve this...
You have many tools already at your disposal.
NSURLCache
All your requests are already stored in the NSURLCache in the NSURLSessionConfiguration on the NSURLSession stored inside the sharedInstance of the Alamofire Manager. Those stored requests already follow all the caching policy rules provided by the servers you are hitting. You can control the caching behavior by setting the requestCachePolicy on your own custom NSURLSessionConfiguration. I'd also suggest you read through this awesome NSHipster article that walks you through the ins and outs of NSURLCache and how to control it.
Creating custom Manager objects is covered in the current Alamofire docs.
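A rough sketch of such a custom Manager, using the Swift 1.x / Alamofire 1.x era APIs to match the question (the cache capacities and disk path are arbitrary examples):

import Alamofire

let configuration = NSURLSessionConfiguration.defaultSessionConfiguration()
configuration.requestCachePolicy = .ReturnCacheDataElseLoad
configuration.URLCache = NSURLCache(
    memoryCapacity: 4 * 1024 * 1024,  // 4 MB in memory
    diskCapacity: 32 * 1024 * 1024,   // 32 MB on disk
    diskPath: "alamofire_cache")

let manager = Alamofire.Manager(configuration: configuration)
// Use manager.request(...) instead of Alamofire.request(...) from here on.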
Downloading JSON to Disk
You can also download the JSON directly to disk using Alamofire.download instead of using Alamofire.request. This will download the payload to a fileURL that you provide in the destination closure. This would give you full control over the caching of the file after that point. You would need to create your own caching policy around these files afterwards if you wanted to follow the caching header rules provided by the server.
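A rough sketch using Alamofire's suggested-destination helper (same era APIs), which resolves a file URL in the directory you name:

let destination = Alamofire.Request.suggestedDownloadDestination(
    directory: .CachesDirectory, domain: .UserDomainMask)

Alamofire.download(.GET, "http://api.androidhive.info/volley/person_array.json", destination)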
Populating Table View
Once you have your data downloaded to disk, you need to load it into an NSData blob and parse it into JSON to populate your table view. This should be pretty straight forward. You need the destination NSURL that you specified to Alamofire when you started your download. Then load the file data into an NSData blob. Finally, use NSJSONSerialization to convert the NSData object into a JSON AnyObject which can be parsed into model objects to populate your table view.
Obviously you don't "have" to parse the JSON into model objects, but this helps protect your table view from malformed JSON data.
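A minimal sketch of reading the file back in and parsing it with SwiftyJSON, mirroring the question's code; it assumes fileURL is the NSURL your download destination resolved to:

if let data = NSData(contentsOfURL: fileURL) {
    let json = JSON(data: data)
    for index in 0..<json.count {
        println(json[index]["name"].string)
    }
}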
Storing JSON for Offline Usage
If you stick with this approach, you'll need to track your cache expiration dates in something like CoreData or SQLite. You can do this by either caching the paths to the JSON files on disk or storing the model objects directly in CoreData or SQLite. This could get fairly complicated, and I would not recommend this approach unless you absolutely don't want to cache your model objects.
Offline Usage
Generally, if you need to cache data for offline usage, you want to store your model objects in something like CoreData. You would use the Alamofire request method coupled with a responseJSON serializer to parse the data into JSON. Then you would convert the JSON into model objects. From there, you'd save your model objects in CoreData, then finally populate your table view with the model objects.
The nice thing about this approach is that you have all your model objects cached in case your table view is accessed when the device is offline. Coupling this design with queries to your NSURLCache to see if your request is cached lets you avoid unnecessary server calls and parsing logic when you already have your model objects generated.
Given the updates to your original question, I would recommend this approach.
You can use this open-source cache library. It caches data on disk and in memory, and it can cache many Swift types, as well as custom classes that inherit from NSObject and conform to the NSCoding protocol.
https://github.com/huynguyencong/DataCache
How it's implemented:
First, it uses NSCache for the memory cache; NSCache is used like a dictionary.
Second, to save the cache to disk, it uses NSFileManager methods.
I have an MVC2 site using Windows authentication.
When the user requests a page I pull some user information from the database. The class I retrieve is a Person class.
How can I get this from the database when the user enters the site, and then pick up the same object without touching the DB on subsequent page requests?
I must admit, I am pretty lost when it comes to session handling in ASP.net MVC.
You can store that kind of information in HttpContextBase.Session.
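For example, a minimal sketch of the load-once-then-session pattern inside a controller action; the repository call is hypothetical:

public ActionResult Index()
{
    var person = Session["CurrentPerson"] as Person;
    if (person == null)
    {
        // Hypothetical data-access call; only runs on the first request
        person = _personRepository.GetByUserName(User.Identity.Name);
        Session["CurrentPerson"] = person;
    }
    return View(person);
}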
One option is to retrieve the Person object from your database on the first hit and store it in System.Web.HttpContext.Current.Cache, this will allow extremely fast access and your Person data will be temporarily stored in RAM on the web server.
But be careful: if you are storing a significantly large amount of user data this way, you could eat up a lot of memory. Nevertheless, this will be perfectly fine if you only need to cache a few thousand objects or so. Clearly, it depends on how many users you expect to be using your app.
You could add to the cache like this:
private void CachePersonData(Person data, string storageKey)
{
    if (HttpContext.Current.Cache[storageKey] == null)
    {
        HttpContext.Current.Cache.Add(storageKey,
                                      data,
                                      null,
                                      Cache.NoAbsoluteExpiration,
                                      TimeSpan.FromDays(1),
                                      CacheItemPriority.High,
                                      null);
    }
}
... and retrieve like this:
// Grab data from the cache
Person p = (Person)HttpContext.Current.Cache[storageKey];
Don't forget that the object returned from the cache could be null, so you should check for this and load from the database as necessary (then cache).
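A minimal sketch of that check-then-load pattern, reusing CachePersonData from above; LoadPersonFromDatabase is a hypothetical data-access call:

Person p = HttpContext.Current.Cache[storageKey] as Person;
if (p == null)
{
    // Hypothetical call: reload from the database when the cache misses
    p = LoadPersonFromDatabase(storageKey);
    CachePersonData(p, storageKey);
}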
First of all, if you are in a load-balanced environment, I wouldn't recommend any solution that doesn't store the data in a database, because any other approach will eventually fail.
If you are not in a load-balanced environment, you can use TempData to store your object and then retrieve it in the subsequent request.
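A minimal sketch of that hand-off; keep in mind TempData survives only until the next request reads it, and the action names plus the LoadPersonFromDatabase call are illustrative:

public ActionResult Upload()
{
    TempData["CurrentPerson"] = LoadPersonFromDatabase(); // hypothetical call
    return RedirectToAction("Details");
}

public ActionResult Details()
{
    // Reading the entry removes it from TempData
    var person = TempData["CurrentPerson"] as Person;
    return View(person);
}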
HttpContext.Current.Session[key];
I am still mostly unfamiliar with Inversion of Control (although I am learning about it now) so if that is the solution to my question, just let me know and I'll get back to learning about it.
I have a pair of controllers which need to share a Session variable. Naturally nothing too special has to happen, because of how Session works in the first place, but this got me wondering what the cleanest way to share related objects between two separate controllers is. In my specific scenario I have an UploadController and a ProductController which work in conjunction with one another to upload image files. As files are uploaded by the UploadController, data about the upload is stored in the Session. After this happens I need to access that Session data in the ProductController. If I create a get/set property for the Session variable containing my upload information in both controllers, I'll be able to access that data, but at the same time I'll be violating all sorts of DRY, not to mention creating an, at best, confusing design where an object is shared and modified by two completely disconnected objects.
What do you suggest?
Exact Context:
A file upload View posts a file to UploadController.ImageWithpreview(), which then reads in the posted file and copies it to a temporary directory. After saving the file, another class produces a thumbnail of the uploaded image. The paths to both the original file and the generated thumbnail are then returned with a JsonResult to a JavaScript callback, which updates some dynamic content in a form on the page that can be "Saved" or "Cancelled". Whether the uploaded image is saved or skipped, I need to either move or delete both it and the generated thumbnail from the temporary directory. To facilitate this, UploadController keeps track of all of the uploaded files and their thumbnails in a Session-maintained Queue object.
Back in the View: after the form is populated with a generated thumbnail of the image that was uploaded, the form posts back to the ProductsController, where the selected file is identified (currently I store the filename in a hidden field, which I realize is a horrible vulnerability) and then copied out of the temp directory to a permanent location. Ideally, I would like to simply access the Queue I have stored in the Session so that the form does not need to contain the image location as it does now. This is how I have envisioned my solution, but I'll eagerly listen to any comments or criticisms.
A couple of solutions come to mind. You could use a "SessionState" class that maps into the request and gets/sets the info as such (I'm doing this from memory so this is unlikely to compile and is meant to convey the point):
internal class SessionState
{
    public string ImageName
    {
        get { return (string)HttpContext.Current.Session["ImageName"]; }
        set { HttpContext.Current.Session["ImageName"] = value; }
    }
}
And then from the controller, do something like:
var sessionState = new SessionState();
sessionState.ImageName = "xyz";
/* Or */
var imageName = sessionState.ImageName;
Alternatively, you could create a controller extension method:
public static class SessionControllerExtensions
{
    public static string GetImageName(this IController controller)
    {
        return (string)HttpContext.Current.Session["ImageName"];
    }

    public static void SetImageName(this IController controller, string imageName)
    {
        HttpContext.Current.Session["ImageName"] = imageName;
    }
}
Then from the controller:
this.SetImageName("xyz");
/* or */
var imageName = this.GetImageName();
This is certainly DRY. That said, I don't particularly like either of these solutions, as I prefer to store as little data in session as possible, if any. But if your intent is to hold onto all of this information without having to load/discern it from some other source, this is the quickest (dirtiest) way I can think of to do it. I'm quite certain there's a much more elegant solution, but I don't have all of the information about what it is you're trying to do and what the problem domain is.
Keep in mind that when storing information in the session, you will have to dehydrate/rehydrate the objects via serialization and you may not be getting the performance you think you are from doing it this way.
Hope this helps.
EDIT: In response to additional information
Not sure where you're looking to deploy this, but processing images in "real time" is a sure-fire way to be hit with a DoS attack. My suggestion is as follows -- this assumes the site is public facing and anyone can upload an image:
1) Allow the user to upload an image. This image goes into the processing queue for background processing by the application or some service. Additionally, the name of the image goes into the user's personal processing queue -- likely a table in the database. Information about background processing in a web app can be found at Schedule a job in hosted web server.
2) Process these images and, while processing, display a "processing" graphic. You can have an AJAX request on the product page that checks for images being processed and tries to reload them every X seconds.
3) While an image is being "processed", the user can opt out of processing, assuming they're the one who uploaded the image. This option is available either on the product page(s) that display the image or on a separate "user queue" view that allows them to remove the image from consideration.
So, you end up with some more domain objects and those objects are managed by the queue. I'm a strong advocate of convention over configuration so the final destination of the product image(s) should be predefined. Something like:
images/products/{id}.jpg or, if a collection, images/products/{id}/{sequence}.jpg.
You then don't need to know the destination in the form. It's the same for all images.
The queue then needs to know where the temp image was uploaded and what the product id was. The queue worker pops items from the queue, processes them, and stores them accordingly.
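A rough sketch of that queue item and worker; all names are illustrative, and the destination path follows the convention above:

using System.Collections.Generic;
using System.IO;

public class ImageWorkItem
{
    public int ProductId { get; set; }
    public string TempImagePath { get; set; }
}

public class ImageQueueWorker
{
    private readonly Queue<ImageWorkItem> queue = new Queue<ImageWorkItem>();

    public void Enqueue(ImageWorkItem item)
    {
        queue.Enqueue(item);
    }

    public void ProcessPending()
    {
        while (queue.Count > 0)
        {
            var item = queue.Dequeue();
            // Convention over configuration: the destination is derived
            // from the product id, so the form never needs to carry it.
            var destination = string.Format("images/products/{0}.jpg", item.ProductId);
            File.Move(item.TempImagePath, destination);
        }
    }
}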
I know this sounds a little more "structured" than what you originally intended, but I think it's a little cleaner.
Is there complete equivalence between the UploadController and ProductController?
As files are uploaded by the UploadController, data about the upload is stored in the Session. After this happens I need to access that Session data in the ProductController.
As I read that, the UploadController needs read and write access to the upload data, while the ProductController needs only read access.
If that's true, then you can make it clear by using an immutable wrapper around the upload information and have the UploadController put that into the session.
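A minimal sketch of such an immutable wrapper (the names are illustrative): the UploadController constructs it and puts it in the Session, and the ProductController can read it but not modify it.

public sealed class UploadInfo
{
    public UploadInfo(string imagePath, string thumbnailPath)
    {
        ImagePath = imagePath;
        ThumbnailPath = thumbnailPath;
    }

    // Settable only from the constructor, so readers cannot mutate the state
    public string ImagePath { get; private set; }
    public string ThumbnailPath { get; private set; }
}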
The Session itself is, by definition, a public shared noticeboard; it decouples explicit relationships at the cost of allowing anyone to get and put. You could allow the ProductController to know about the UploadController and hence remove the need for passing the upload information via the session. My instinct is that the upload info is of public interest, so using the Session is reasonable.
I don't see any DRY violation here, we are explicitly trying to separate responsibilities.