Best way to cache JSON from an API in Swift? - ios

I need to cache JSON data from an API in Swift.
So I researched a lot and got to this post.
I tried to implement Option 1 in my app, but the custom manager always returned nil. I don't know why.
After that I found AwesomeCache. It says that it can do awesome API caching.
But I don't know how to implement it.
I referred to this issue. Still I can't figure it out.
This is how my current implementation looks without caching:
Alamofire.request(.GET, "http://api.androidhive.info/volley/person_array.json")
    .responseJSON { (_, _, data, _) in
        let json = JSON(data!)
        let catCount = json.count
        for index in 0..<catCount {
            let name = json[index]["name"].string
            println(name)
        }
    }
Please suggest the best way to cache JSON from an API.
Thanks in advance!
UPDATE
These are my requirements:
Fetch the JSON from the API and parse it. This can be done with the help of Alamofire and SwiftyJSON.
I will populate the parsed data in the table view. This works when the user is online.
But I want to show the data in the table when the user is offline too.
So I need to save the parsed data or the JSON data in my cache, and I need to refresh or expire the cache within a week or a few days.
I'd prefer not to store the raw JSON on disk because it will keep getting updated.
Please suggest the best way to achieve this.

You have many tools already at your disposal.
NSURLCache
All your requests are already stored in the NSURLCache in the NSURLSessionConfiguration on the NSURLSession stored inside the sharedInstance of the Alamofire Manager. Those stored requests already follow all the caching policy rules provided by the servers you are hitting. You can control the caching behavior by setting the requestCachePolicy on your own custom NSURLSessionConfiguration. I'd also suggest you read through this awesome NSHipster article that walks you through the ins and outs of NSURLCache and how to control it.
Creating custom Manager objects is covered in the current Alamofire docs.
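As a rough illustration (Swift 2-era syntax, Alamofire 1.x/2.x Manager API assumed), a custom configuration whose cache policy and NSURLCache you control might look like this:
import Alamofire

let configuration = NSURLSessionConfiguration.defaultSessionConfiguration()
configuration.requestCachePolicy = .UseProtocolCachePolicy   // honour the server's cache headers
configuration.URLCache = NSURLCache(
    memoryCapacity: 4 * 1024 * 1024,    // 4 MB in memory
    diskCapacity: 20 * 1024 * 1024,     // 20 MB on disk
    diskPath: "api_cache")

// Keep a strong reference to the manager (e.g. as a property). A custom Manager
// created as a local variable is deallocated before the request completes, which
// is a common reason custom managers appear to "return nil".
let manager = Alamofire.Manager(configuration: configuration)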
Downloading JSON to Disk
You can also download the JSON directly to disk using Alamofire.download instead of using Alamofire.request. This will download the payload to a fileURL that you provide in the destination closure. This would give you full control over the caching of the file after that point. You would need to create your own caching policy around these files afterwards if you wanted to follow the caching header rules provided by the server.
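A rough sketch of the download itself (Alamofire 1.x-3.x-style download API assumed; the destination closure decides where the file is written):
import Alamofire

let destination = Alamofire.Request.suggestedDownloadDestination(
    directory: .CachesDirectory,
    domain: .UserDomainMask)

Alamofire.download(.GET, "http://api.androidhive.info/volley/person_array.json",
    destination: destination)
    .response { _, _, _, error in
        if let error = error {
            print("Download failed: \(error)")
        } else {
            print("JSON written to the Caches directory")
        }
    }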
Populating Table View
Once you have your data downloaded to disk, you need to load it into an NSData blob and parse it into JSON to populate your table view. This should be pretty straightforward. You need the destination NSURL that you specified to Alamofire when you started your download. Then load the file data into an NSData blob. Finally, use NSJSONSerialization to convert the NSData object into a JSON AnyObject, which can be parsed into model objects to populate your table view.
Obviously you don't "have" to parse the JSON into model objects, but this helps protect your table view from malformed JSON data.
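A minimal sketch of that read-back step, using SwiftyJSON (as in your question) in place of raw NSJSONSerialization, and assuming fileURL is the destination NSURL you handed to Alamofire:
import Foundation
import SwiftyJSON

if let data = NSData(contentsOfURL: fileURL) {
    let json = JSON(data: data)                   // parse the NSData blob
    for index in 0..<json.count {
        print(json[index]["name"].stringValue)    // or map into model objects here
    }
    // tableView.reloadData()
}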
Storing JSON for Offline Usage
If you stick with this approach, you'll need to track your cache expiration dates in something like CoreData or SQLite. You can do this by either caching the paths to the JSON files on disk, or by storing the model objects directly in CoreData or SQLite. This could get fairly complicated, and I would not recommend this approach unless you absolutely don't want to cache your model objects.
Offline Usage
Generally, if you need to cache data for offline usage, you want to store your model objects in something like CoreData. You would use the Alamofire request method coupled with a responseJSON serializer to parse the data into JSON. Then you would convert the JSON into model objects. From there, you'd save your model objects in CoreData, then finally populate your table view with the model objects.
The nice thing about this approach is that you have all your model objects cached in case your table view is accessed when the device is offline. Coupling this design with queries to your NSURLCache to see if your request is cached lets you avoid unnecessary server calls and parsing logic when you already have your model objects generated.
Given the updates to your original question, I would recommend this approach.
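As a rough sketch of that flow, picking up where your responseJSON handler leaves off ("Person" is a hypothetical NSManagedObject subclass with a single "name" attribute, and context is your NSManagedObjectContext):
import CoreData
import SwiftyJSON

class Person: NSManagedObject {
    @NSManaged var name: String
}

func savePeople(json: JSON, context: NSManagedObjectContext) {
    for index in 0..<json.count {
        let person = NSEntityDescription.insertNewObjectForEntityForName(
            "Person", inManagedObjectContext: context) as! Person
        person.name = json[index]["name"].stringValue
    }
    do {
        try context.save()      // the models are now available offline
    } catch {
        print("Core Data save failed: \(error)")
    }
}
On the next launch, a plain NSFetchRequest (or an NSFetchedResultsController) against "Person" populates the table even with no network connection.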

You can use this open-source cache. It caches data on disk and in memory. It can cache many Swift types, as well as custom classes that inherit from NSObject and conform to the NSCoding protocol.
https://github.com/huynguyencong/DataCache
How it works:
First, it uses NSCache for the memory cache. NSCache is used like a dictionary.
Second, to save the cache to disk, it uses NSFileManager methods.
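This is not the DataCache API itself, just a sketch of the two layers it combines (the names and payload below are placeholders):
import Foundation

let jsonData = "{\"name\": \"Alice\"}".dataUsingEncoding(NSUTF8StringEncoding)!   // stand-in payload

// Memory layer: NSCache behaves like a dictionary.
let memoryCache = NSCache()
memoryCache.setObject(jsonData, forKey: "person_array")

// Disk layer: write the same data to a file (DataCache uses NSFileManager for this).
let diskPath = NSTemporaryDirectory() + "person_array.json"
jsonData.writeToFile(diskPath, atomically: true)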

Related

Save a large JSON to Realm using Swift 3

I have a JSON with more or less 75 keys.
I need to receive this JSON and store it offline using Realm.
I do not want to iterate through the keys, since I've heard that there are ways to save a large JSON using a few lines. How can I do this?
EDIT:
My JSON (I saved it on a separate server because it's too big to paste here):
http://myjson.com/i7e6l
There is no easy one-liner to parse the JSON and store it in Realm, since each JSON response is unique and no framework can have explicit knowledge about the structure of your JSON response without you giving it some information about your JSON.
You will need to write some code either to parse the response or to map your JSON response's fields onto the properties of your Realm object. If you choose the latter, you can use AlamofireObjectMapper to do the JSON parsing automatically, but even then you have to write code for the mapping.
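As an illustration only (Swift 3, RealmSwift and ObjectMapper assumed; "Person", "id" and "name" stand in for your own 75 keys), the mapping code looks roughly like this:
import RealmSwift
import ObjectMapper

class Person: Object, Mappable {
    dynamic var id = 0
    dynamic var name = ""

    required convenience init?(map: Map) { self.init() }

    func mapping(map: Map) {
        id   <- map["id"]
        name <- map["name"]
    }

    override static func primaryKey() -> String? { return "id" }
}

// Once mapped, persisting the object is a couple of lines:
let person = Mapper<Person>().map(JSONString: "{\"id\": 1, \"name\": \"Alice\"}")!
let realm = try! Realm()
try! realm.write {
    realm.add(person, update: true)   // update: true relies on the primaryKey
}
AlamofireObjectMapper layers this onto the network response so the mapping runs directly on the server's reply, but you still write the mapping(map:) function yourself.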

What are ways to store complex dynamic objects locally (iOS, swift)?

I have an iOS app that takes data from the server as JSON and then deserializes it into objects of different types. The types can be complicated, can contain subtypes, and can inherit, so there are no limitations. Another thing that makes everything even more complicated is that some of the types are stored as AnyObject? and only at run time are they deserialized into real types according to specific rules. Something like this:
class A {
    var typeName: String?
    var b: AnyObject?
}
Then when it's deserialized, it's done something like this:
if let someClass = NSClassFromString(typeName) as? SomeGenericType.Type {
    b = someClass.init()
}
Also, querying should be possible on all the data. Currently I'm trying to store all of it locally, then load it into memory and query it from code. I'm using NSUserDefaults, but it has some limitations, and I also needed to provide custom coding to make it work. Each time I add a new field it turns out I missed something in the coding and nothing works. So it's a pain.
Ideally I would just run some magic command and all the objects would go to local storage no matter how complicated they are, and the same to extract them from this storage. Also, users change the data, so I can't just store the original JSON. And I don't want to convert objects back to JSON, as that's a pain too.
Any suggestions?
If you want to use SQLite, you can store the whole object in one row! I mean you can create a table with two columns: one is id and the second is your data object (its data type should be blob). Then convert your whole object into data, store it in the SQLite table, retrieve it as data, and convert it back to an object when you want to use it. This way your object remains in the same format you asked for.
Firebase, while meant for online syncing and storage, can also cache everything locally in case you are offline, and it can perform queries against the local cache. It uses JSON.
CouchDB also has a mobile version for iOS.
Both of those are overkill if your dataset is small; you can just store it as a text file and read the JSON back in. See the performance characteristics here. The graph is for a 7 MB file, so if you are significantly below that, your load time may be minimal.
NSKeyedArchiver.archivedData(withRootObject:) is great for storing custom objects as Data objects. The only thing you need to do to be able to use this is to make your custom objects conform to NSCoding. A great example can be found here:
Save custom objects into NSUserDefaults
Once you have the Data version of the object, it can easily be stored in UserDefaults, as a property in CoreData, or even in the app's keychain entries. Depending on your use case, sensitivity of data, and how much data you intend to store, you might want to use any number of storage methods. NSKeyedArchiver.archivedData(withRootObject:) allows you to pretty much use any of them.
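A minimal sketch (Swift 3; "Person" is a placeholder class) of the NSCoding plus NSKeyedArchiver round trip:
import Foundation

class Person: NSObject, NSCoding {
    var name: String

    init(name: String) { self.name = name }

    required init?(coder aDecoder: NSCoder) {
        guard let name = aDecoder.decodeObject(forKey: "name") as? String else { return nil }
        self.name = name
    }

    func encode(with aCoder: NSCoder) {
        aCoder.encode(name, forKey: "name")
    }
}

// Archive to Data and stash it wherever suits you, e.g. UserDefaults:
let person = Person(name: "Alice")
let data = NSKeyedArchiver.archivedData(withRootObject: person)
UserDefaults.standard.set(data, forKey: "person")

// And back again:
if let saved = UserDefaults.standard.data(forKey: "person"),
   let restored = NSKeyedUnarchiver.unarchiveObject(with: saved) as? Person {
    print(restored.name)
}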

UITableView handling JSON and Core Data

What would be the best practice, and best for user experience, to achieve the following?
1:) Retrieve data from JSON
2:) Store in Core Data
3:) Display in UITableViewController
Do I store the JSON first, then populate the table using the stored data? Or do I store it in Core Data (as a background process) and populate the table using the JSON the first time?
I want the user to be presented with a UITableView with minimum load time.
Thanks
This is what I would do:
Create your Core Data database and model.
Create a data access layer that will contain the read and write methods for each of your objects
In the read functions you can query Core Data; if there is data, return it. Then, in the background, call the web server and update your Core Data store with the new JSON (a sketch follows below).
If there is no data, request it from the web server, populate your Core Data tables using the JSON, and then return the data from Core Data so it is always consistent.
You can also keep a last-update date on your data so you are only requesting the new data from the web server that isn't already in your local Core Data DB. This will reduce the amount of data coming down to your iOS device.
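A rough sketch of that read-through flow (Swift 2-era Core Data API; PersonStore, Person and refreshFromServer are placeholder names introduced here):
import CoreData

class Person: NSManagedObject {
    @NSManaged var name: String
}

class PersonStore {
    let moc: NSManagedObjectContext
    init(moc: NSManagedObjectContext) { self.moc = moc }

    func readPeople(completion: ([Person]) -> Void) {
        var cached = [Person]()
        do {
            let request = NSFetchRequest(entityName: "Person")
            cached = try moc.executeFetchRequest(request) as! [Person]
        } catch {
            print("Core Data fetch failed: \(error)")
        }
        completion(cached)        // hand back whatever is cached (possibly empty) immediately

        refreshFromServer()       // then update Core Data in the background
    }

    private func refreshFromServer() {
        // Call the web server, insert/update Core Data with the new JSON,
        // then let the UI refresh (e.g. via NSFetchedResultsController).
    }
}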
If you want minimum load time, then I'd serve from the JSON first and save to Core Data afterwards. That way the user can see content straight away without first having to wait for all the data to be saved (and parsed).
The course of action in this matter heavily depends on:
A. The amount of JSON data you are downloading
B. How effective your backend is at only sending necessary JSON results (rather than sending everything in bulk)
C. How you are attaching Core Data to your UITableViewController.
I recently built a fairly big project that does exactly this: fetching a large chunk of JSON, parsing it, and inserting it into Core Data. The only time there is any delay is during the initial load. This is how I accomplished it:
Download JSON. Cast as [[String: AnyObject]]: if let result = rawJSON as? [[String: AnyObject]] {}
Check to see if IDs for objects already exist in Core Data. If an object already exists, check whether it needs an update. If it doesn't exist, create it. Also check whether IDs have been removed from the JSON; if so, delete those objects from Core Data as well.
Use NSFetchedResultsController to manage data from Core Data and populate the UITableView. I use NSFetchedResultsController rather than managedObjectContext.executeFetchRequest() because NSFetchedResultsController has delegate methods that are called every time the managedObjectContext is updated.
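A rough wiring sketch for that last step (Swift 2-era API; "Item", the sort key, and managedObjectContext are placeholders for your own setup):
import UIKit
import CoreData

class ItemsViewController: UITableViewController, NSFetchedResultsControllerDelegate {
    var managedObjectContext: NSManagedObjectContext!

    lazy var fetchedResultsController: NSFetchedResultsController = {
        let request = NSFetchRequest(entityName: "Item")
        request.sortDescriptors = [NSSortDescriptor(key: "name", ascending: true)]
        let controller = NSFetchedResultsController(fetchRequest: request,
            managedObjectContext: self.managedObjectContext,
            sectionNameKeyPath: nil, cacheName: nil)
        controller.delegate = self
        return controller
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            try fetchedResultsController.performFetch()
        } catch {
            print("Fetch failed: \(error)")
        }
    }

    // Fired whenever the managedObjectContext changes, e.g. after the
    // background JSON import inserts or updates objects.
    func controllerDidChangeContent(controller: NSFetchedResultsController) {
        tableView.reloadData()
    }
}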

How to sync data from web service with Core Data?

I'm trying to sync my data from a web service in a simple way. I download my data using AFNetworking, and using a unique identifier on each object, I want to either insert, delete or update that data.
The problem is that with Core Data you have to actually insert objects into the NSManagedObjectContext to instantiate NSManagedObjects. Like this:
MyModel *model = (MyModel *)[NSEntityDescription insertNewObjectForEntityForName:@"MyModel" inManagedObjectContext:moc];
model.value = [jsonDict objectForKey:@"value"];
So when I get the data from the web service, I insert it right away into Core Data. There's no real syncing going on: I just delete everything beforehand and then insert what's returned from my web service.
I guess there's a better way of doing this, but I don't know how. Any help?
You are running into the classic insert/update/delete paradigm.
The answer is: it depends. If you get a chunk of JSON data, then you can use KVC to extract the unique IDs from that chunk and do a fetch against your context to find out what exists already. From there it is a simple loop over the chunk of data, inserting and updating as appropriate.
If you do not get the data in a nice chunk like that, then you will probably need to do a fetch for each record to determine if it is an insert or an update. That is far more expensive and should be avoided. Batch fetching beforehand is recommended.
Deleting is just about as expensive as fetching/updating, since you need to fetch the objects to delete them anyway, so you might as well handle updating properly instead.
Update
Yes, there is an efficient way of building the dictionary out of the Core Data objects. Once you get your array of existing objects back from Core Data, you can turn it into a dictionary with:
NSArray *array = ...; // Results from Core Data fetch
NSDictionary *objectMap = [NSDictionary dictionaryWithObjects:array forKeys:[array valueForKey:@"identifier"]];
This assumes that you have an attribute called identifier in your Core Data entity. Change the name as appropriate.
With that one line of code you now have all of your existing objects in an NSDictionary that you can look up against as you walk the JSON.
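The same walk, as a Swift sketch (Swift 2-era Core Data calls; "MyModel", "identifier" and "value" are placeholder names), assuming jsonArray is the [[String: AnyObject]] chunk from the response:
import CoreData

class MyModel: NSManagedObject {
    @NSManaged var identifier: String
    @NSManaged var value: String?
}

func syncChunk(jsonArray: [[String: AnyObject]], moc: NSManagedObjectContext) throws {
    // 1. Pull the unique IDs out of the JSON via KVC.
    let identifiers = (jsonArray as NSArray).valueForKey("identifier") as! [String]

    // 2. One batch fetch for everything that already exists.
    let request = NSFetchRequest(entityName: "MyModel")
    request.predicate = NSPredicate(format: "identifier IN %@", identifiers)
    let existing = try moc.executeFetchRequest(request) as! [MyModel]

    var objectMap = [String: MyModel]()
    for object in existing { objectMap[object.identifier] = object }

    // 3. Single pass over the JSON: update if found, insert otherwise.
    for dict in jsonArray {
        let identifier = dict["identifier"] as! String
        let model = objectMap[identifier] ??
            NSEntityDescription.insertNewObjectForEntityForName(
                "MyModel", inManagedObjectContext: moc) as! MyModel
        model.identifier = identifier
        model.value = dict["value"] as? String
    }
    try moc.save()
}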
The easiest thing to do is to restore the JSON to an entity that maps properly to it. Once you've mapped it, determine whether an object matching the entity's ID already exists; if so, fetch that entity and merge the changes. If not, create a new entity in Core Data and restore the JSON to it.
I'm building an app where I do client-side syncing with Evernote. They keep a syncUpdate number on each of their objects and at the server level. So when I start my sync, I check whether my client's syncUpdate count is less than the server's. If so, I know I am out of sync. If my updateCount is at 400 and the server is at 410, I ask the server for all objects between updateCount 400 and 410. Then I check whether I already have each object or not and perform my update/create accordingly.
Every time an object is modified on the server, that object's updateCount is incremented along with the server's.
The server also keeps a timestamp of the last update, which I can also check against.

RestKit 0.20 and ManagedObjectContexts

I am mapping data using RestKit 0.20 into Core Data and displaying it in a UITableView. I am writing the data, an 'Activity' object, to the main queue's ManagedObjectContext and it all works fine. Now I need a second table with future activities and also a third table with past activities. I need a ManagedObjectContext for each table, as the sorting is done on the server side. How can I handle this and keep the data persistent? Is 'newChildManagedObjectContextWithConcurrencyType' what I need to use?
Keep a single store. Use a predicate to filter for the items you want.
If you can download all of the data (and you're happy to do that even though some of it may not be used by the user), and you can tag the items with what they are used for, then that is an option.
From a RestKit point of view, you can use metadata to tag the items during the mapping process so that you know how they should be used (and then filter on that). This requires that you add a new key to the item, but if one item could appear in all responses this will be problematic because the values would get overwritten.
To use metadata, simply add a new mapping like:
#"#metadata.URL": #"requestURL"
Where @metadata.URL is the URL used to make the request and requestURL is the property on your entity that you can use for filtering. The predicate would then check whether the stored URL contains the type you're after ("all_day", "start_time", "end_time").
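As a small illustration of the filtering idea (Swift; "Activity" and the "future" substring are placeholders for whatever your request URLs actually contain):
import CoreData

let fetchRequest = NSFetchRequest(entityName: "Activity")
fetchRequest.predicate = NSPredicate(format: "requestURL CONTAINS[c] %@", "future")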
