Caching with Core Data - iOS

I inherited some old code that has an in-memory cache with no eviction policy; the cache is populated once with tens of thousands of objects of various kinds. The app's memory footprint crosses 500 MB at times, and the app is often terminated due to memory pressure. My question is: do I really need a cache on top of a Core Data setup? Since Core Data does the work of loading managed objects into memory, and eventually evicting them, am I not better off getting rid of this cache, which has several arrays and dictionaries holding a lot of objects?

You should generally avoid putting a cache on top of a Core Data setup. Once objects have been fetched into a managed object context, they are already held in memory (well, mostly). So the answer to your question is most likely yes, though without knowing the reasons for the caching it could also be the opposite. I'd get rid of the cache first and do some performance measurements; once all scenarios are confirmed to work fine, the answer was indeed yes.
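
If it helps to see that concretely, here is a minimal sketch of leaning on Core Data's own faulting instead of a hand-rolled cache. The `Item` entity name and the `context` variable are placeholders for your own model and managed object context:

```objc
#import <CoreData/CoreData.h>

// Fetch in batches so Core Data materializes rows only as they are touched,
// instead of holding tens of thousands of fully realized objects at once.
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Item"];
request.fetchBatchSize = 100;          // rows are pulled in 100 at a time
request.returnsObjectsAsFaults = YES;  // objects stay cheap faults until accessed

NSError *error = nil;
NSArray *results = [context executeFetchRequest:request error:&error];

// ... work with the objects you actually need ...

// When a working set is no longer needed, turn the objects back into faults
// so their row data can be evicted, rather than lingering like a manual cache.
for (NSManagedObject *object in results) {
    [context refreshObject:object mergeChanges:NO];
}
```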

Related

Core Data Excessive VM: SQLite page cache

I will keep this question general for now and avoid cluttering this with code.
I have an iOS application that uses Core Data (SQLite) for its data store. The model is fairly complex, with a large hierarchy of objects. When I fetch and import these large data sets, I notice that the application shuts down after a while due to a memory warning.
The Allocations profiler shows me excessive "transient" VM: SQLite page objects. The size of this keeps growing and growing but NEVER goes down. I have tried to ensure that all of my NSManagedObjectContext saves occur inside performBlock calls.
It would seem to me as if there are object contexts that are not getting deallocated and/or reset.
I have tried disabling the undoManager on NSManagedObjectContext, setting the stalenessInterval to a very low value (1.0), and calling reset on my MOCs after they finish saving imported data.
What does this mean when the transient VM SQLite page cache continues to go up so high?
What needs to be done in order to make the page cache go down?
What is an acceptable size for this cache to get to in a large Core Data application?
Well, it turns out the transient VM: SQLite page cache column shown in Instruments is cumulative for the session, not a "current" value. Of course it never goes down, then!
It also turns out that some other optimizations around ensuring managed object contexts get cleared out fixed our Core Data memory issue.
Great article here on the subject: Core Data issues with memory allocation
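
For anyone hitting the same thing, here is a rough sketch of the kind of import loop that keeps contexts cleared out. The entity name `Item` and the `coordinator` and `batches` variables are illustrative stand-ins:

```objc
#import <CoreData/CoreData.h>

// Import on a private-queue context, saving and resetting after each batch so
// the object graph (and the row cache behind it) can be released as you go.
NSManagedObjectContext *importContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
importContext.persistentStoreCoordinator = coordinator; // assumed to exist
importContext.undoManager = nil;                        // no undo stack during bulk import

[importContext performBlock:^{
    for (NSArray *batch in batches) { // 'batches' stands in for your chunked input
        @autoreleasepool {
            for (NSDictionary *record in batch) {
                NSManagedObject *object =
                    [NSEntityDescription insertNewObjectForEntityForName:@"Item"
                                                  inManagedObjectContext:importContext];
                [object setValuesForKeysWithDictionary:record];
            }
            NSError *error = nil;
            [importContext save:&error];
            [importContext reset]; // drop every registered object before the next batch
        }
    }
}];
```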

Core Data refuses to clear external data references from memory

I am loading large amounts of data into Core Data on a background thread with a background NSManagedObjectContext. I frequently reset this background context after it's saved in order to clear the object graph from memory. The context is also disposed of once the operation is complete.
The problem is that no matter what I do, Core Data refuses to release large chunks of data that are stored as external references. I've verified this in the Allocations instrument. Once the app restarts, the memory footprint stays extremely low, as these external references are only unfaulted when accessed by the user. I need to be able to remove these BLOBs from memory after the initial download and import, since they take up too much space collectively. On average they are just HTML, so most are less than 1 MB.
I have tried refreshObject:mergeChanges: with the flag set to NO on pretty much everything. I've even tried resetting my main NSManagedObjectContext too. I have plenty of autorelease pools, there are no memory leaks, and Zombies isn't enabled. How can I reduce my Core Data memory footprint when external references are initially created?
I've reviewed all of Apple's documentation and can't find anything about the life cycle of external BLOBS. I've also searched the many similar questions on this site with no solution: Core Data Import - Not releasing memory
Everything works fine after the app first reboots, but I need this first run to be stable too. Anyone else been able to successfully fault NSData BLOBS with Core Data?
I'm assuming the "clear from memory" means "cause the objects to be deallocated" and not "return the address space to the system". The former is under your control. The latter is not.
If you can see the allocations in the Allocations instrument, have you turned on tracking of reference count events and balanced the retains and releases? There should be an indicative extra retain (or more).
If you can provide a simple example project, it would be easier to figure out what is going on.
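
In case a sketch helps, this is roughly what the re-faulting described in the question looks like; the `importedPages` and `context` names are illustrative:

```objc
// After the import has saved, turn each imported object back into a fault so
// the NSData backing its external-storage attribute is no longer referenced
// by the context. Any remaining strong reference elsewhere will keep the
// bytes alive regardless, which is what balancing the retain/release events
// in Instruments should reveal.
@autoreleasepool {
    for (NSManagedObject *page in importedPages) {
        [context refreshObject:page mergeChanges:NO];
    }
}
[context reset]; // or discard the background context entirely once done
```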

How do NSArray and Core Data with a non-persistent store work from a memory perspective?

This is a rather simple question, but I haven't been able to pinpoint a clear answer in my searching.
If I have an NSArray, and add fifty 1MB UIImages to it, where does that 50MB get deducted from? Will the app be using 50MB more memory? Will it simply store it on the disk?
The same goes for Core Data where instead of using a persistent store I store it in memory. Would the size of the Core Data store take up exactly that much memory/RAM or would it live on the disk and be wiped when the app finishes executing?
I'm concerned whether or not I should be storing several dozen megabytes in UIImages in an NSArray, or if I should be using NSCache (I'd rather not as I'd prefer to never lose any of the images).
If I have an NSArray, and add fifty 1MB UIImages to it, where does that 50MB get deducted from? Will the app be using 50MB more memory?
Yes.
Will it simply store it on the disk?
No. Arrays are stored in memory.
The same goes for Core Data where instead of using a persistent store I store it in memory. Would the size of the Core Data store take up exactly that much memory/RAM or would it live on the disk and be wiped when the app finishes executing?
Yes, if you tell Core Data to store everything in memory, that's exactly what it'll do.
The line between "memory" and "disk" can get a little fuzzy if you consider that virtual memory systems can swap pages of real memory out to disk and read them back when they're needed. That's not an issue for iOS, however, as iOS doesn't provide a VM backing store and writeable memory is never swapped out.
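
As a concrete illustration (the managed object `model` variable is assumed to exist), an in-memory store is just a different store type on the coordinator:

```objc
#import <CoreData/CoreData.h>

// An in-memory store keeps the entire database in RAM: nothing is written to
// disk, and the contents vanish when the process exits.
NSPersistentStoreCoordinator *coordinator =
    [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
NSError *error = nil;
[coordinator addPersistentStoreWithType:NSInMemoryStoreType
                          configuration:nil
                                    URL:nil
                                options:nil
                                  error:&error];
```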
I'm concerned whether or not I should be storing several dozen megabytes in UIImages in an NSArray, or if I should be using NSCache
Those aren't your only options, of course. You could store the images in files and read them in as needed. You should be thoughtful about the way your app uses both memory and disk space, of course, but you also need to consider network use, battery use, and performance. Storing data on disk is often preferable to downloading it again because downloading takes time, may impact the user's data plan, and uses a lot more energy than just reading data from secondary storage.
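
A minimal sketch of that file-based approach, assuming UIKit and an already-decoded `image`; the file name is illustrative. Note that the Caches directory can be purged by the system, so Documents or Application Support would be safer if the images must never be lost:

```objc
#import <UIKit/UIKit.h>

// Write each image to disk once, then load it back only when it is needed,
// instead of pinning fifty decoded UIImages in an NSArray.
NSString *dir = NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                    NSUserDomainMask, YES).firstObject;
NSString *path = [dir stringByAppendingPathComponent:@"image-42.png"];

// After download / creation:
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];

// Later, on demand:
UIImage *restored = [UIImage imageWithContentsOfFile:path];
```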

Windows Task Manager shows process memory keeps growing even though there are no memory leaks

My application keeps consuming more and more memory as seen in the Windows Task Manager, and eventually crashes due to OutOfMemory. However, when I check for leaks using MemoryValidator (from www.softwareverify.com), no leaks are detected. Why is this happening?
Just because there is a growing amount of memory usage doesn't mean it is necessarily 'leaking'. You could simply be accumulating a large number of live objects and/or very large ones (containing lots and lots of data).
If you can provide more information about what language(s) you are using and what the application is doing I can perhaps help out with some more specific information!
UPDATE AS PER COMMENTS
Well, you'll just want to make sure garbage collection is happening correctly. Perhaps the libgc library could help with that.
http://developers.sun.com/solaris/articles/libgc.html
The only other thing I could think of as being the cause of this is that you are maintaining references to the objects somewhere unintentionally so they are just piling up.

SQLite: On-Disk vs. In-Memory Database

We are trying to integrate SQLite into our application and populate it as a cache. We plan to use it as an in-memory database; this is our first time using it. Our application is C++ based.
Our application interacts with the master database to fetch data and performs numerous operations. These operations generally involve one table, which is quite large.
We replicated this table in SQLite, and these are our observations:
Number of Fields: 60
Number of Records: 100,000
As data population starts, the application's memory shoots up drastically, from 120 MB to ~1.4 GB. At this point our application is idle and not doing any major operations; normally, memory utilization climbs once the operations start. With SQLite as an in-memory DB and memory usage this high, we don't think we will be able to support this many records.
When I create the DB on disk instead, the DB size comes to ~40 MB, but the application's memory usage still remains very high.
Q. Is there a reason for this high usage? All buffers have been cleared and, as said before, the DB is no longer in memory.
Any help would be deeply appreciated.
If you are doing a lot of insert/update operations, the DB size may grow; you can use the VACUUM command to free up space by reducing the size of the SQLite database.
SQLite uses memory for things other than the data itself. It holds not only the data, but also the connections, prepared statements, query cache, query results, etc. You can read more on SQLite Memory Allocation and tweak it. Make sure you are properly destroying your objects too (sqlite3_finalize(), etc.).
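
To make that concrete, a small sketch of the knobs involved; the limits and table name are illustrative, not recommendations:

```objc
#include <sqlite3.h>

// Cap SQLite's appetite before heavy use.
sqlite3_soft_heap_limit64(64 * 1024 * 1024);    // ask SQLite to stay near 64 MB

sqlite3 *db = NULL;
sqlite3_open(":memory:", &db);                  // or a file path for an on-disk DB
sqlite3_exec(db, "PRAGMA cache_size = -16384;", // negative value = size in KiB (~16 MB)
             NULL, NULL, NULL);

// Finalize every prepared statement when you are done with it, or its memory
// is never returned to the allocator.
sqlite3_stmt *stmt = NULL;
sqlite3_prepare_v2(db, "SELECT COUNT(*) FROM big_table;", -1, &stmt, NULL);
if (sqlite3_step(stmt) == SQLITE_ROW) {
    sqlite3_int64 count = sqlite3_column_int64(stmt, 0);
    (void)count; // use the result
}
sqlite3_finalize(stmt);
sqlite3_close(db);
```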
