Core Data and one-to-many relationships - iOS

I have two entities:
Profiles<-->>Events
Now, I want the user to be able to add a profile and then be able to add events to that profile.
I'm having a hard time getting my mind out of the relational database world and into Core Data, but as I understand it, whenever I add an Event, I'll have to set the relationship to the Profile, which makes sense to me. But when I add the Profile initially, do I have to tell the Event entity anything, or does Core Data resolve that when I add an Event?

You really need to read the Core Data guide from start to finish. It answers all these questions and will save you a ton of headaches. I'm a big fan of Core Data, but it is a massive framework that cannot be learned by puttering around and just trying things on your own (that is what I tried first as well, and I got very frustrated and wasted more time undoing what I thought I had learned). Most importantly, don't think of Core Data as an ORM or database mapper - it's really an object graph manager that also handles persisting that object graph for you (as well as undo management, object 'schema' evolution, and more).
The short answer to your question is that no, you don't have to tell CD everything about your objects right away. You can create a Profile, set a few attributes on it, save it, come back days later and then start adding Events that are related.
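For example, a minimal sketch (it assumes entities named Profile and Event, a to-many "events" relationship on Profile with an inverse "profile" on Event, a "name" attribute purely for illustration, and that context is your NSManagedObjectContext):
#import <CoreData/CoreData.h>

// Create and save a Profile on its own - no Events needed yet.
NSManagedObject *profile = [NSEntityDescription insertNewObjectForEntityForName:@"Profile"
                                                          inManagedObjectContext:context];
[profile setValue:@"My profile" forKey:@"name"];
[context save:NULL];

// ...days later: add an Event and point it at the Profile. Setting one side of the
// relationship is enough; Core Data maintains the inverse for you.
NSManagedObject *event = [NSEntityDescription insertNewObjectForEntityForName:@"Event"
                                                        inManagedObjectContext:context];
[event setValue:profile forKey:@"profile"];
[context save:NULL];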

Yes Ryan, Core Data is different from a relational DB. In your example you can add the event separately, and after that you can add the relation to the profile with a separate call:
// Core Data generates add<RelationshipName>Object: accessors for to-many relationships,
// so the exact selector depends on the relationship name in your model.
[_profile1 addEventObject:_event1];
[_profile1 addEventObject:_event2];
[_profile1 addEventObject:_event3];

Related

Core Data Memory Efficient Migration

I am currently building a CoreData migration for an app which has 200k / 500k average rows of data per entity. There are currently 15 entities within the CoreData Model.
This is the 7th migration I have built for this app. All of the previous ones have been simple (add one or two columns) migrations, which have not been any trouble and have not needed any mapping models.
This Migration
The migration we are working on is fairly sizeable in comparison to previous migrations and adds a new entity between two existing entities. This requires a custom NSEntityMigrationPolicy which we have built to map the new entity relationships. We also have a *.xcmappingmodel, which defines the mapping between model 6 and the new model 7.
We have implemented our own subclass of NSMigrationManager (as per http://www.objc.io/issue-4/core-data-migration.html + http://www.amazon.com/Core-Data-Management-Pragmatic-Programmers/dp/1937785084/ref=dp_ob_image_bk).
The Problem
Apple uses the migrateStoreFromURL method of NSMigrationManager to migrate the model; however, this seems to be built for low/medium dataset sizes, which do not overload the memory.
We are finding that the app crashes due to memory overload (at 500-600 MB on iPad Air / iPad 2) as a result of the following Apple method not releasing memory frequently enough during the data transfer.
[manager migrateStoreFromURL:sourceStoreURL type:type options:nil withMappingModel:mappingModel toDestinationURL:destinationStoreURL destinationType:type destinationOptions:nil error:error];
Apple's Suggested Solution
Apple suggests that we should divide the *.xcmappingmodel up into a series of mapping models per individual entity - https://developer.apple.com/library/ios/documentation/cocoa/conceptual/CoreDataVersioning/Articles/vmCustomizing.html#//apple_ref/doc/uid/TP40004399-CH8-SW2. This would work neatly with the progressivelyMigrateURL methods defined in the above NSMigrationManager subclasses. However, we are not able to use this approach, as a single entity on its own will still lead to a memory overload due to its size.
My guess would be that we would need to write our own migrateStoreFromURL method, but would like to keep this as close to as Apple would have intended as possible. Has anyone done this before and/or have any ideas for how we could achieve this?
The short answer is that heavy migrations are not good for iOS and should be avoided at literally any cost. They were never designed to work on a memory constrained device.
Having said that, a few question for you before we discuss a resolution:
Is the data recoverable? Can you download it again or is this user data?
Can you resolve the relationships between the entities without having the old relationship in place? Can it be reconstructed?
I have a few solutions but they are data dependent, hence the questions back to you.
Update 1
The data is not recoverable and cannot be re-downloaded. The data is formed from user activity within the application over a time period (reaching up to 1 year in the past). The relationships are also not reconstructable, unless we store them before we lose access to the old relationships.
Ok, what you are describing is the worst case and therefore the hardest case. Fortunately it isn't unsolvable.
First, heavy migration is not going to work. We must write code to solve this issue.
First Option
My preferred solution is to do a lightweight migration that only adds the new relationship between the (now) three entities; it does not remove the old relationship. This lightweight migration will occur in SQLite and will be very quick.
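A sketch of that lightweight step, assuming a standard stack (model and storeURL are whatever your setup already uses):
NSDictionary *options = @{ NSMigratePersistentStoresAutomaticallyOption : @YES,
                           NSInferMappingModelAutomaticallyOption       : @YES };
NSError *error = nil;
NSPersistentStoreCoordinator *psc =
    [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
// Because the change is additive, Core Data can infer the mapping and perform the
// migration inside SQLite without pulling rows into memory.
[psc addPersistentStoreWithType:NSSQLiteStoreType configuration:nil
                            URL:storeURL options:options error:&error];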
Once that migration has been completed, we iterate over the objects and set up the new relationship based on the old relationship. This can be done as a background process, or it can be done piecemeal as the objects are used, etc. That is a business decision.
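A rough sketch of that conversion pass, processed in batches so memory stays flat; "Parent", "Link", "oldChild", and the relationship keys are placeholders for whatever the real model uses:
// Fetch only object IDs so the full object graph never sits in memory at once.
NSFetchRequest *idRequest = [NSFetchRequest fetchRequestWithEntityName:@"Parent"];
idRequest.resultType = NSManagedObjectIDResultType;
NSArray *parentIDs = [context executeFetchRequest:idRequest error:NULL];

NSUInteger batchSize = 250;
for (NSUInteger offset = 0; offset < parentIDs.count; offset += batchSize) {
    @autoreleasepool {
        NSRange range = NSMakeRange(offset, MIN(batchSize, parentIDs.count - offset));
        for (NSManagedObjectID *parentID in [parentIDs subarrayWithRange:range]) {
            NSManagedObject *parent = [context objectWithID:parentID];
            NSManagedObject *child = [parent valueForKey:@"oldChild"];

            // Build the new intermediate object from the old direct relationship.
            NSManagedObject *link = [NSEntityDescription insertNewObjectForEntityForName:@"Link"
                                                                   inManagedObjectContext:context];
            [link setValue:parent forKey:@"parent"];
            [link setValue:child forKey:@"child"];
        }
        // Save each batch and reset the context so processed objects are released.
        [context save:NULL];
        [context reset];
    }
}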
Once that conversion has been completed you can then do another migration, if needed, to remove the old relationship. This step is not necessary, but it does help to keep the model clean.
Second Option
Another option which has value is to export and re-import the data. This has the added value of setting up code to back up the user's data in a format that is readable on other platforms. It is fairly simple to export the data out to JSON and then set up an import routine that pulls the data into the new model along with the new relationship.
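A very small export sketch for one entity, assuming its attribute values are already JSON-friendly (NSDate and NSData values would need converting first); the entity name and exportURL are illustrative:
NSFetchRequest *all = [NSFetchRequest fetchRequestWithEntityName:@"Event"];
NSArray *objects = [context executeFetchRequest:all error:NULL];
NSMutableArray *rows = [NSMutableArray array];
for (NSManagedObject *object in objects) {
    // One flat dictionary of attribute values per object (relationships handled separately).
    [rows addObject:[object dictionaryWithValuesForKeys:object.entity.attributesByName.allKeys]];
}
NSData *json = [NSJSONSerialization dataWithJSONObject:rows options:0 error:NULL];
[json writeToURL:exportURL atomically:YES];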
The second option has the advantage of being cleaner but requires more code as well as a "pause" in the user's activities. The first option can be done without the user even being aware there is a migration taking place.
If I understand this correctly, you have one entity that is so big that migrating it on its own still causes the memory overload. In that case, how about splitting the migration of this one entity into several steps, and migrating only some of its properties in each iteration?
That way you won't need to write your own code, but you can still benefit from the "standard" code.

iOS Core Data Wants All Relationships to be bi-directional

I am new to iOS programming but have done SQL stuff for years. I am trying to use Core Data to build my model. Following the tutorials I have created a schema for my application that involves a number of one-to-many relationships that are not bi-directional.
For example I have a Games entity and a Player entity. A Game includes a collection of Players. Because a Player can be involved in more than one game, an inverse relationship does not make any sense and is not needed.
Yet when I compile my application, I get Consistency Error messages in two forms. One says:
Game.players does not have an inverse; this is an advanced setting.
Really? This is an "advanced" capability enough to earn a warning message? Should I just ignore this message or am I actually doing something wrong here that Core Data is not designed to do?
The other is of the form Misconfigured Property and logs the text:
Something.something should have an inverse.
So why would it think that?
I can't find any pattern to why it picks one error message over the other. Any tips for an iOS newb would be appreciated.
This is under Xcode 5.0.2.
Core Data is not a database. This is an important fact to grasp otherwise you will be fighting the framework for a long time.
Core Data is your data model that happens to persist to a database as one of its options. That is not its main function, it is a secondary function.
Core Data requires/recommends that you use inverse relationships so that it can keep referential integrity in check without costly maintenance. For example, if you have a one-way relationship between A and B (expressed A --> B) and you delete a B, Core Data may need to walk the entire A table looking for references to B so that it can clean them up. This is expensive. If you have a proper bi-directional relationship (A <-> B) then Core Data knows exactly which A objects it needs to touch to keep the referential integrity.
This is just one example.
Bi-directionality is not required but it is recommended highly enough that it really should be considered a requirement. 99.999% of the time you want that bi-directional relationship even if you never use it. Core Data will use it.
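A tiny illustration with the Game/Player example from the question, assuming Game.players with an inverse Player.games (both with nullify delete rules) and context being your NSManagedObjectContext:
NSManagedObject *game = [NSEntityDescription insertNewObjectForEntityForName:@"Game"
                                                       inManagedObjectContext:context];
NSManagedObject *player = [NSEntityDescription insertNewObjectForEntityForName:@"Player"
                                                         inManagedObjectContext:context];

// Adding the player to the game also updates the player's games set - Core Data
// keeps both sides of the inverse in sync for you.
[[game mutableSetValueForKey:@"players"] addObject:player];
NSLog(@"%lu", (unsigned long)[[player valueForKey:@"games"] count]);   // 1

// Deleting the player removes it from every game it belongs to, without Core Data
// having to scan the whole Game table looking for dangling references.
[context deleteObject:player];
[context save:NULL];
NSLog(@"%lu", (unsigned long)[[game valueForKey:@"players"] count]);   // 0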
Why not just add the inverse relationship? It can be to-many as well, and you may well end up using it - often fetch requests or object graph navigation works faster or better coming from a different end of a relationship.
Core Data prefers you to define relationships in both directions (hence the warnings) and it costs you nothing to do so, so you may as well. Don't fight the frameworks - Core Data isn't an SQLite "manager", it is an object graph and persistence tool that can use SQLite as a back end.

What shouldn't I store in Core Data?

So this is more of an application design question. But I think it can be 'answered' and not just discussed. :)
I'm using RestKit for an application we're building. It obviously makes it super easy to put stuff into either straight objects or core data objects.
In the specific instance I'm dealing with, we have comments, much like comments on a facebook post.
Now, the nicest thing about storing these comments in Core Data is that with an NSFetchedResultsController I can sort them super easily and deal with updating/inserting automatically into the right spots in the timeline. But there are a couple of sticking points as well.
For instance, with infinite loading, I now have to manage loading the comments in between the newest comments and the old stored comments. (Maybe the first time I grabbed 25, but there have been 100 new comments since then. So I retrieve the latest 25 first, then have to have an auto-load cell in between the new comments and the old ones until I run into those, then have to paginate any after that.)
Aside from that, then you are also storing potentially thousands of comments in core data. Maybe it's not a big deal for quite a long time, but eventually you might want to start cleaning up old comments with a GCD task.
So what are the leading thoughts on what to store in Core Data, and what to keep as transient objects? (Maybe storing those in a cache like NSCache or the new Tumblr cache https://github.com/tumblr/TMCache.)
Edit
Ok maybe I should clarify a little here. I get the purpose of Core Data... for persisting across app restarts and having an object graph with relationships. I make plenty of use of it. I guess what I'm wondering about here is the grey areas where I would like things to persist for the sake of not always having to wait for a network call, and offline availability.
But much like stories and comments on facebook, there are always going to be a constant stream of new ones coming in, and you don't necessarily care about 300 comments on an old post. Someone could come back to view comments on their 'post' quite a few times, or someone may just be browsing 'posts' and comments casually, and never coming back to them.
So I'm just trying to consider the strategy for something like this, where you have potentially lots of entities (comments) coming down from a service. Sometimes people will want to view them several times (their own 'post') and sometimes they are just browsing through. When trying to see how others do this, it seems some apps stuff it all into Core Data, while some (like Facebook) seem to store the 25-50 most recent in the DB, and any beyond that are transient (they probably clear out older stories and comments regularly too).
Core Data is not designed to be used as "dumb data storage", but rather object persistence. So, anything that you want to persist between uses of your app should go into Core Data.
If you are using Core Data properly, it will take care of all of your caching for you as well.
EDIT:
For anything that is going to change too often for your taste, or that you just don't want to store permanently, NSCache may be a better option. If you don't think your user will look at it again tomorrow, leave those bits out of your persistence. (IMHO)
Create a second repository. Either select a time period that is known as 'recent' or provide a preference for it. Periodically look at the primary repository, find objects now older than the recent range, and move those objects to the second repository.
Then provide users the means to search in just recent or all.
If all they want are recent values the searches should be faster, and nothing is lost.
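A sketch of that periodic sweep, assuming a "Comment" entity with a "createdAt" date attribute and two stores (recentStore and archiveStore) added to the same coordinator; note that relationships cannot cross stores, so this works best for self-contained objects like comments:
NSDate *cutoff = [NSDate dateWithTimeIntervalSinceNow:-30 * 24 * 60 * 60];   // 'recent' = last 30 days
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Comment"];
request.predicate = [NSPredicate predicateWithFormat:@"createdAt < %@", cutoff];
request.affectedStores = @[recentStore];

NSArray *stale = [context executeFetchRequest:request error:NULL];
for (NSManagedObject *old in stale) {
    // Objects can't be moved between stores directly, so re-create each one in the
    // archive store and delete the original.
    NSManagedObject *copy = [NSEntityDescription insertNewObjectForEntityForName:@"Comment"
                                                           inManagedObjectContext:context];
    [copy setValuesForKeysWithDictionary:
        [old dictionaryWithValuesForKeys:old.entity.attributesByName.allKeys]];
    [context assignObject:copy toPersistentStore:archiveStore];
    [context deleteObject:old];
}
[context save:NULL];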

Using Core Data during development with many planned changes

I read this blog where the author writes that everyone should use Core Data as soon as they want to store more than just trivial data.
So I added an xcdatamodeld to my project. I'm going to fill the database in the app with a form. And I know that I will change the data model a lot in future development. But the data entered in the form has to be saved. This means I need many migrations. Do you think it is a good idea to use Core Data at this stage of development? I don't like the idea of having tons of old xcdatamodel files while developing.
By the way, I'm using Magical Record if this helps anyone.
Definitely use core data. It's great.
I don't find that it is worth the effort doing migrations while you're developing the app. Half the time you're not saving the data anyway, or you want to start from a clean slate, or you have data setup code you want to test on the new model.
I'd advise altering your core data stack setup code to simply delete and recreate the persistent store if there is an error. Save the migrations for when you're updating a live version of the app.
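A minimal sketch of that "delete and recreate on error" setup, written against a plain Core Data stack (the question mentions MagicalRecord, which sets up the same underlying coordinator; model and storeURL are whatever your stack already uses):
#import <CoreData/CoreData.h>

NSPersistentStoreCoordinator *psc =
    [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
NSError *error = nil;
if (![psc addPersistentStoreWithType:NSSQLiteStoreType configuration:nil
                                 URL:storeURL options:nil error:&error]) {
    // During development, don't bother migrating - throw the old store away and start clean.
    [[NSFileManager defaultManager] removeItemAtURL:storeURL error:NULL];
    [psc addPersistentStoreWithType:NSSQLiteStoreType configuration:nil
                                URL:storeURL options:nil error:&error];
}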

Keeping Core Data Objects in multiple stores

I'm developing an iOS application using Core Data. I want to have the persistent store located in a shared location, such as a network drive, so that multiple users can work on the data (at different times i.e. concurrency is not part of the question).
But I also want to offer the ability to work on the data "offline", i.e. by keeping a local persistent store on the iPad. So far, I read that I could do this to some degree by using the persistent store coordinator's migration function, but this seems to imply the old store is then invalidated. Furthermore, I don't necessarily want to move the complete store "offline", but just a part of it: going with the simple "company department" example that Apple offers, I want users to be able to check out one department, along with all the employees associated with that department (and all the attributes associated with each employee). Then, the users can work on the department data locally on their iPad and, some time later, synchronize those changes back to the server's persistent store.
So, what I need is to copy a core data object from one store to another, along with all objects referenced through relationships. And this copy process needs to also ensure that if an object already exists in the target persistent store, that it's overwritten rather than a new object added to the store (I am already giving each object a UID for another reason, so I might be able to re-use the UID).
From all I've seen so far, it looks like there is no simple way to synchronize or copy Core Data persistent stores, is that a fair assessment?
So would I really need to write a piece of code that does the following:
retrieve object "A" through a MOC
retrieve all objects, across all entities, that have a relationship to object "A"
instantiate a new MOC for the target persistent store
for each object retrieved, check the target store if the object exists
if the object exists, overwrite it with the attributes from the object retrieved in steps 1 & 2
if the object doesn't exist, create it and set all attributes as per object retrieved in steps 1 & 2
While it's not the most complicated thing in the world to do, I would've still thought that this requirement for "online / offline editing" is common enough for some standard functionality to be available for synchronizing parts of persistent stores?
Your points of view are greatly appreciated,
thanks,
da_h-man
I was just half-kidding with the comment above. You really are describing a pretty hard problem - it's very difficult to nail this sort of synchronization, and there's seldom, in any development environment, going to be a turn-key solution that will "just work". I think your pseudo-code description above is a pretty accurate description of what you'll need to do. Although some of the work of traversing the relationships and checking for existing objects can be generalized, you're talking about some potentially complicated exception-handling situations - for example, if you're updating an object and only 1 out of 5 related objects is somehow out of date, do you throw away the update or apply part of it? You say "concurrency" is not part of the question, but if multiple users can "check out" objects at the same time, unless you plan to have a locking mechanism on those, you will start having conflicts when trying to make updates.
Something to check into are the new features in Core Data for leveraging iCloud - I doubt that's going to help with your problem, but it's generally related.
Since you want to be out on the network with your data, another thing to consider is whether Core Data is the right fit to your problem in general. Since Core Data is very much a technology designed to support the UI and MVC pattern in general, if your data needs are not especially bound to the UI, you might consider another type of DB solution.
If you are in fact leveraging Core Data in significant ways beyond just modeling, in terms of driving your UI, and you want to stick with it, I think you are correct in your analysis: you're going to have to roll your own solution. I think it will be a non-trivial thing to build and test.
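For what it's worth, the per-object "check, then overwrite or insert" step from your outline might look roughly like this (assuming every entity carries a "uid" string attribute; sourceObject comes from the source store's context, targetContext is backed by the target store, and relationship copying is left out):
NSFetchRequest *lookup = [NSFetchRequest fetchRequestWithEntityName:sourceObject.entity.name];
lookup.predicate = [NSPredicate predicateWithFormat:@"uid == %@", [sourceObject valueForKey:@"uid"]];
lookup.fetchLimit = 1;

NSManagedObject *target = [[targetContext executeFetchRequest:lookup error:NULL] firstObject];
if (target == nil) {
    // No match in the target store: create a fresh object.
    target = [NSEntityDescription insertNewObjectForEntityForName:sourceObject.entity.name
                                            inManagedObjectContext:targetContext];
}
// Overwrite every attribute with the values from the source object.
NSArray *attributeKeys = sourceObject.entity.attributesByName.allKeys;
[target setValuesForKeysWithDictionary:[sourceObject dictionaryWithValuesForKeys:attributeKeys]];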
An option to consider is CouchDB and an iOS implementation called TouchDB. It would mean adopting more of a document-oriented (JSON) approach to your problem, which may in fact be suitable, based on what you've described.
From what I've seen so far, I reckon the best approach is RestKit. It offers a Core Data wrapper that uses JSON to move data between remote and local stores. I haven't fully tried it yet, but from what the documentation reads, it sounds quite powerful and ideally suited for my needs.
You definitely should check these things:
Parse.com - cloud based data store
PFIncrementalStore https://github.com/sbonami/PFIncrementalStore - subclass of NSIncrementalStore which allows your Persistent Store Coordinator to store data both locally and remotely (on Parse Cloud) at the same time
All this stuff is well-documented. Also, Parse.com is going to release an iOS local datastore SDK http://blog.parse.com/2014/04/30/take-your-app-offline-with-parse-local-datastore/ which is going to help keep your data synced.
