iOS 7 - UITableViewController big data source - strategy advice? - ios

Xcode: 5.0.2
iOS: 7.0.x
I have a secondary view that is conditionally shown when a user logs in to my app. This secondary view shows a list of items that a user must choose one of as a "default" value for the lifetime of their authenticated session. This secondary view is going to be seen only once by the large majority of my users.
The list of items is returned as JSON from a web service and can contain anywhere from 1 to 1000 items. If only one item is returned, the secondary view won't even show.
The JSON will be structured with two elements for each item: an id and an itemName. I've estimated a few hundred KB download for the worst-case scenario, and it's a one-time download. Perhaps a searchable API rather than a data dump would be better practice?
Once the results are returned, they will be processed into two NSArrays: an NSArray of NSDictionary objects for me to retrieve the id once an item is selected, and an NSArray of NSString containing the itemNames, used for populating the UITableView and performing the keyword search against.
For retrieving the id reference:
[ { "id": 0, "itemName": "one" }, { "id": 1, "itemName": "two" } ]
For populating the UITableView data source
[ "one", "two" ]
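To illustrate, this is roughly how I plan to split the response into those two structures (just a sketch in Swift using JSONSerialization; field names as in the sample above):

import Foundation

// Hypothetical response bytes; the real data would come from the web service.
let json = #"[ { "id": 0, "itemName": "one" }, { "id": 1, "itemName": "two" } ]"#
    .data(using: .utf8)!

// Parse the response into the two structures described above.
let items = (try? JSONSerialization.jsonObject(with: json)) as? [[String: Any]] ?? []

// 1. Kept around so the id can be resolved once a row is selected.
let idLookup = items

// 2. Flat list of names for the table view data source and the keyword search.
let itemNames = items.compactMap { $0["itemName"] as? String }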
Now I need this data in my UITableView. As this is a one-time operation (changeable later, but users typically will not be changing this regularly), I was planning on loading the entire array into the UITableView.
Typically, what is the maximum size that you should put into a table view? Will this cause me serious memory issues? How will the keyword search fare when searching against hundreds to a thousand items?
I'm also looking at perhaps updating the UI to follow that of the Contacts app very closely (UILocalizedIndexedCollation?), so again this will have an impact.
Thanks,

Given a JSON payload, the corresponding Foundation representation will consume roughly five times more RAM than its length in bytes.
So, for example, if your JSON is 300 KB (UTF-8), NSJSONSerialization will create a Foundation hierarchy that may consume about 1.5 MB of RAM.
This amount fits easily into RAM.
If, and only if, you don't want to persist the JSON representation or a corresponding model, and don't want to utilize Core Data for other reasons (e.g. editing, undo, rollbacks, etc.), I would strongly argue against using Core Data: since your data is still "small" even in the worst case, there's no need to adopt Core Data on the assumption that it would save you memory.
Core Data isn't cheap either, memory-wise. In your case, using Core Data would actually cause your app to consume much more RAM, because of SQLite's buffers and caches, and Core Data's own buffers, caches, and internal structures. In practice, and given your scenario with 1000 objects, a Core Data/SQLite approach would consume roughly 4 MB. This is about three times more than your JSON representation. On the other hand, Core Data may not consume much more RAM when the number of elements increases tremendously and when there is also memory pressure.

I would also strongly recommend the Core Data approach with an NSFetchedResultsController. This is a solution crafted for your problem. While you're downloading the data from the web, the fetched results controller will display an empty table view, and when the download is complete (or "enough" has arrived) you can display the data without blocking the user interface.
What's more, if you want to display a huge data set at once, the controller manages to fetch more data by itself as the user scrolls down.
Of course, UILocalizedIndexedCollation is also supported here.
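A minimal sketch of the setup in Swift (the Item entity, its itemName attribute, and the injected context are assumptions for illustration; delegate handling is reduced to a simple reload):

import UIKit
import CoreData

class ItemListViewController: UITableViewController, NSFetchedResultsControllerDelegate {
    // Assumed to be injected by whoever owns the Core Data stack.
    var context: NSManagedObjectContext!

    lazy var fetchedResultsController: NSFetchedResultsController<NSFetchRequestResult> = {
        let request = NSFetchRequest<NSFetchRequestResult>(entityName: "Item")
        request.sortDescriptors = [NSSortDescriptor(key: "itemName", ascending: true)]
        request.fetchBatchSize = 50                      // rows are faulted in as the user scrolls
        let frc = NSFetchedResultsController(fetchRequest: request,
                                             managedObjectContext: context,
                                             sectionNameKeyPath: nil,
                                             cacheName: nil)
        frc.delegate = self                              // refresh the table as the import saves objects
        return frc
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.register(UITableViewCell.self, forCellReuseIdentifier: "Cell")
        try? fetchedResultsController.performFetch()
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return fetchedResultsController.sections?[section].numberOfObjects ?? 0
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
        let item = fetchedResultsController.object(at: indexPath) as? NSManagedObject
        cell.textLabel?.text = item?.value(forKey: "itemName") as? String
        return cell
    }

    // Called when the background download/import saves new objects.
    func controllerDidChangeContent(_ controller: NSFetchedResultsController<NSFetchRequestResult>) {
        tableView.reloadData()
    }
}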

Related

Swift - core data capacity

I have to design an app that will sustain an increasing amount of data as time goes by.
For example, let's say I have a data model like this:
class Obj {
    // Two simple string attributes saved for every record.
    let data1: String
    let data2: String

    init(data1: String, data2: String) {
        self.data1 = data1
        self.data2 = data2
    }
}
And every day I have to save a new Obj.
Under these conditions, is using Core Data a smart way to keep the data?
Or is it better to set up an SQL database and save the data there?
My concern is that the app won't be able to handle so much data after being used for a while, because the total size of the data eventually gets too big...
Sorry, I do not have deep knowledge of Core Data or databases.
I'd appreciate any knowledge you can share here.
Thank you
In most cases people use Core Data with an SQLite backing store. It's not like working with SQLite directly, but it's relevant because you get similar benefits for memory use.
You won't use excessive memory unless you load lots of your model objects into memory at once. For an example like yours, that would mean an extremely large number of objects. More typically, you use Core Data to load only a subset of the total: every object where data1 has a particular string value, or the most recent 20 objects, for example. "Too big" isn't usually a problem. If you are loading an enormous number of objects into memory, you'll have problems whether or not you use Core Data.
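For instance, a minimal sketch of fetching just a subset (assuming an entity named Obj that mirrors the class above; the predicate and sort key are placeholders):

import CoreData

// Fetch only the objects you actually need, not the whole store.
func recentObjects(in context: NSManagedObjectContext) throws -> [NSManagedObject] {
    let request = NSFetchRequest<NSManagedObject>(entityName: "Obj")
    request.predicate = NSPredicate(format: "data1 == %@", "some value")     // placeholder filter
    request.sortDescriptors = [NSSortDescriptor(key: "data2", ascending: false)]
    request.fetchLimit = 20          // only the first 20 matches are loaded into memory
    return try context.fetch(request)
}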

How to use NSCache appropriately in iOS application?

I am building an application and want to use NSCache to store data for caching.
There will be approximately 5 APIs for which I need to cache data. Should I use NSCache? I did some research on NSCache, but I still have some doubts about it.
I went through the link below.
https://developer.apple.com/library/content/documentation/Performance/Conceptual/ManagingMemory/Articles/CachingandPurgeableMemory.html
I found some interesting things there.
NSCache provides two other useful "limit" features: limiting the number of cached elements and limiting the total cost of all elements in the cache. To limit the number of elements that the cache is allowed to have, call the method setCountLimit:. For example, if you try to add 11 items to a cache whose countLimit is set to 10, the cache could automatically discard one of the elements.
If we set the limit to 10 and we try to add an 11th item, which particular item will it discard: the 1st one or the 10th one? Or could it be a random item?
If I want to store 5 different NSCache objects for all the APIs, how can I do that?
Should I make 5 different NSCache objects, or should I store 5 different dictionaries in a single NSCache object?
Please let me know the best way to deal with it.
NSCache basically works like an NSMutableDictionary; the main difference is that, even though it is mutable, it is thread safe (mutable Foundation objects usually are not). So you get and set objects using keys.
Yes, the documentation is not clear, but I remember (and it makes sense) that it says you should always check whether an object is in the cache and, if not, reload it from your primary source or otherwise handle that case. So it is not very important which one is removed; just be prepared for an object to no longer be there at some point and handle that situation.
I usually create different caches for different contexts, but most of the time one would suffice.
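As a rough sketch of the pattern (the ResponseCache type and the endpoint-string keys are just assumptions for illustration):

import Foundation

// One shared cache keyed by endpoint; a count limit keeps it bounded.
final class ResponseCache {
    static let shared = ResponseCache()
    private let cache = NSCache<NSString, NSData>()

    private init() {
        cache.countLimit = 50          // NSCache may evict entries once this is exceeded
    }

    func store(_ data: Data, for endpoint: String) {
        cache.setObject(data as NSData, forKey: endpoint as NSString)
    }

    // Always be prepared for a miss: the cache is free to evict at any time.
    func data(for endpoint: String) -> Data? {
        return cache.object(forKey: endpoint as NSString) as Data?
    }
}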
I have a few pieces of advice:
Your question is tagged Swift, so pay attention that NSCache (in Swift 2 at least; I don't know about 3) works only with objects, not structs or enumerations (value types, unless they can be bridged).
Remember that the HTTP protocol has its own caching system for communication; do not reinvent the wheel.

read/write an NSCountedSet to plist

I am developing a game app where I am using an NSCountedSet as my character inventory, and the inventory changes dynamically from view to view.
In other words:
the user can buy items in view 1 and add them to the inventory, then the user switches to view 2 and uses some items, which should be removed from the inventory, and so on.
My questions are:
1. How can I read and write an NSCountedSet to a plist efficiently?
2. Is the best approach to write the data to disk as view 1 closes and then re-read it as view 2 opens? Or is there a way I can read the data once when the app launches, make all the changes, and then save the data back when the app is terminating?
The data consists of strings and numbers only and is small in amount.
The following are snippets from my code:
- (void)initInventory
{
    // initialize the inventory with some string objects
    [Inventory addObject:@"x"];
    [Inventory addObject:@"y"];
    [Inventory addObject:@"z"];
}

- (void)addItemToInvetory:(NSString *)ItemName
{
    // add the object passed to the method to the inventory
    [Inventory addObject:ItemName];
}

- (void)removeItemFromInventory:(NSString *)ItemName
{
    // remove the object passed to the method from the inventory
    [Inventory removeObject:ItemName];
}
1. How can I read and write an NSCountedSet to a plist efficiently? ...The data consists of strings and numbers only and is small in amount.
You can just record it as an array of alternating strings and numbers, where each number represents the count of the preceding string. For a set this small, you should not need to worry about the performance of the operation.
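A minimal sketch of that idea in Swift (the alternating-array layout and the file handling are just one possibility):

import Foundation

// Flatten the counted set into an alternating [name, count, name, count, ...] array.
func plistArray(from inventory: NSCountedSet) -> [Any] {
    var array: [Any] = []
    for case let item as String in inventory {
        array.append(item)
        array.append(inventory.count(for: item))
    }
    return array
}

// Rebuild the counted set from the saved array.
func countedSet(from array: [Any]) -> NSCountedSet {
    let set = NSCountedSet()
    var index = 0
    while index + 1 < array.count {
        if let name = array[index] as? String, let count = array[index + 1] as? Int {
            for _ in 0..<count { set.add(name) }
        }
        index += 2
    }
    return set
}

// Write it out with property list serialization (error handling kept minimal).
func save(_ inventory: NSCountedSet, to url: URL) throws {
    let data = try PropertyListSerialization.data(fromPropertyList: plistArray(from: inventory),
                                                  format: .xml,
                                                  options: 0)
    try data.write(to: url)
}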
2. Is the best approach to write the data to disk as view 1 closes and then re-read it as view 2 opens? Or is there a way I can read the data once when the app launches, make all the changes, and then save the data back when the app is terminating?
You can pass it (the model) from one view controller to the next and just share the same model instance in many cases. Whether it makes sense to dispose of it or not depends on whether you need a reference and on how often that information is needed. So best practice depends on memory and on your ability to ensure the data is correct. For example, you may opt to share the instance in order to avoid unnecessary I/O and to keep the data synchronized, but you should avoid holding thousands of objects if you don't need them anytime soon.
If your data were not small, you should consider something like Core Data instead (3 values is extra-tiny).
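A small sketch of the sharing approach (the view controller names and the segue are hypothetical; the point is that both screens reference the same instance):

import UIKit

class ShopViewController: UIViewController {
    // The single shared model instance.
    var inventory = NSCountedSet()

    // Hand the same instance to the next screen instead of re-reading it from disk.
    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if let next = segue.destination as? BattleViewController {
            next.inventory = inventory
        }
    }
}

class BattleViewController: UIViewController {
    var inventory = NSCountedSet()
}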

Disappointing iOS search times with CoreData

I have a Core Data database running on an iPad with iOS 5.1.1. My database has around 50,000 companies in it. I have created an index on the company name attribute. I am searching on this attribute and on occasion get several thousand records returned by a fetch request.
When several thousand records are returned, it can take a couple of seconds for the fetch to come back. This makes type-ahead searching pretty clunky.
I anticipate having much larger databases in the future. What are my options for implementing a really fast search function?
Thanks.
I recommend watching the Core Data performance videos from the last few WWDCs. They often discuss strategies for improving this kind of bottleneck. Some suggestions from the videos (sketched in code after this list):
De-normalise the name field into a separate "case and diacritic insensitive" searchString field and search on that field instead, using <, <= or BEGINSWITH. Avoid MATCHES and wildcards.
Limit the number of results returned by the NSFetchRequest using fetchLimit and fetchBatchSize.
If your company object is large, you can extract some of the key data items into a separate, smaller "header" object that is used just for the search interface, then add a relationship back to the main object for when the user makes a selection.
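A rough sketch combining those suggestions in Swift (the Company entity and its pre-computed searchString attribute are assumed to follow the de-normalisation described above):

import CoreData

// Type-ahead search against a lowercased, diacritic-stripped searchString attribute.
func searchCompanies(matching query: String,
                     in context: NSManagedObjectContext) throws -> [NSManagedObject] {
    let normalized = query.folding(options: [.caseInsensitive, .diacriticInsensitive],
                                   locale: .current)
    let request = NSFetchRequest<NSManagedObject>(entityName: "Company")
    request.predicate = NSPredicate(format: "searchString BEGINSWITH %@", normalized)
    request.sortDescriptors = [NSSortDescriptor(key: "searchString", ascending: true)]
    request.fetchLimit = 100        // cap what a single keystroke can pull in
    request.fetchBatchSize = 20     // fault rows in as the table scrolls
    return try context.fetch(request)
}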
Some pointers to a couple of videos (there are more from other years also):
WWDC 2012: Session 214 - Core Data Best Practices : 45:00
WWDC 2010: Session 137 - Optimizing Core Data Performance on iPhone OS: 34:00
While Core Data is the correct tool in many cases, it's not a silver bullet by any means.
Check out this post, which discusses a few optimization strategies and also cases where an SQL database is a better choice than Core Data:
http://inessential.com/2010/02/26/on_switching_away_from_core_data
You may honestly be better off using an SQL database instead of Core Data in this case, because when you try to access attribute values on Core Data entities it typically fires a fault and pulls the object into active memory... this can definitely have performance and speed costs. Using a true database - Core Data is not a true database, see http://cocoawithlove.com/2010/02/differences-between-core-data-and.html - you can query the data without creating objects from it.

Reading from SQLite in Xcode iOS SDK

I have a program with a table in SQLite with about 100 rows and 6 columns of text (each no more than a hundred characters). Each time a button is clicked, the program will display the contents of a row of the table in the view.
My question is: should I copy the contents of the whole table into an array and then read from the array each time the user clicks the button, or should I access the table in the database each time the user clicks the button? Which one is more efficient?
It all depends, but retrieving from the database as you need data (rather than storing the whole thing in an array) would generally be the most efficient use of memory, which is a pretty precious resource. My most extravagant use of memory would be to store an array of the table's unique identifiers (i.e. a list of primary keys, returned in the correct order); that way I'm not scanning through the database every time, and since my unique identifiers are always numeric, it doesn't use up too much memory. So, for something of this size, I generally:
open the database;
load an array of solely the table's unique identifiers (returned in the right order for my table view using the SQL ORDER BY clause);
as table cells are built, I'll go back to the database and get the few fields I need for that one row corresponding to the unique identifier that I've kept track of for that one table row;
when I go to the details view, I'll again get that one row from the database; and
when I'm all done, I'll close the database
This yields good performance while not imposing too much of a drain on memory. If the table were different (much larger or much smaller) I might suggest different approaches, but this seems reasonable to me given your description of your data.
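A condensed sketch of that flow in Swift using the SQLite3 module (the items table and its id/name columns are placeholders; opening and closing the database are left out):

import SQLite3

// 1. Load only the ordered primary keys up front.
func loadRowIDs(_ db: OpaquePointer?) -> [Int64] {
    var ids: [Int64] = []
    var stmt: OpaquePointer?
    if sqlite3_prepare_v2(db, "SELECT id FROM items ORDER BY name;", -1, &stmt, nil) == SQLITE_OK {
        while sqlite3_step(stmt) == SQLITE_ROW {
            ids.append(sqlite3_column_int64(stmt, 0))
        }
    }
    sqlite3_finalize(stmt)
    return ids
}

// 2. Fetch a single row on demand, e.g. when a cell is configured or tapped.
func loadName(_ db: OpaquePointer?, id: Int64) -> String? {
    var stmt: OpaquePointer?
    var name: String?
    if sqlite3_prepare_v2(db, "SELECT name FROM items WHERE id = ?;", -1, &stmt, nil) == SQLITE_OK {
        sqlite3_bind_int64(stmt, 1, id)
        if sqlite3_step(stmt) == SQLITE_ROW, let text = sqlite3_column_text(stmt, 0) {
            name = String(cString: text)
        }
    }
    sqlite3_finalize(stmt)
    return name
}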
For me, it was much easier to re-read the database on each view load and drop the contents when done.
The overhead of keeping the contents in memory was just too much to offset the quick read of a small dataset.
100 rows and 6 columns is not a lot of data. iOS is capable of handling much larger data sets than that, and very efficiently. So don't worry about creating a new array; reading from the database should work just fine.
