I want to be able to "deep clone" 10 instances of an ActiveRecord model and all its associations into memory, work on them, update the in-memory objects and then, when I've finished, pick one to write back over the original in the database.
How do I deep clone (i.e. .clone but also cloning all associations right down to the bottom of the association tree)? I've assumed so far that I'm going to have to write my own method in the Model.
How can I ensure that none of the cloned instances write back to the database until I'm ready to do so?
If possible I'd like to:
retain all current IDs as one of my main associations is a has_many :through matching the IDs of one model to another
still be able to treat each of the clones as if it were in the database (i.e. .find_by_id etc. will work)
Moon on a stick perhaps? ;)
Not 100% sure of what you are trying to do ...
Models will only be stored in the database if you call the save method. Calling save on an existing model will update the database with any data that has changed. Associations may also be saved, but it really depends on the type of association, and in most cases you will probably need to call save on those models as well.
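For example (a minimal sketch, assuming a Person model with a name column):

person = Person.find(1)
person.name = "New name"   # only the in-memory object changes
Person.find(1).name        # => still the old value; nothing has been written yet
person.save                # issues the UPDATE and persists the change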
Doh! Sometimes it takes asking the stupid question before you see the obvious answer.
My issue was that I was having to make changes to the associated objects and they weren't showing up when I used those in-memory objects later, so I thought I had to save. However, you are right. All that was actually happening was that the variables referencing them had gone out of scope and I was therefore accessing the in-database ones instead.
I'll go back through my code and check that this is the case.
Having said that, it doesn't answer my question on the "deep cloning" though ...
I've solved our deep cloning issues using DefV's deep cloning plugin : http://github.com/DefV/deep_cloning
It's done everything I've required so far, though as you've found you need to be very watchful of your relationships. My tests have luckily shown this up as an issue and I'm working through it at the moment. I found this post as I was trying to solve it :)
Check out the plugin though, it's been quite handy.
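For reference, a rough sketch of how we use it, assuming the :include option described in the plugin's README (the model and association names here are just examples):

original = Order.find(42)
copies = Array.new(10) { original.clone(:include => { :line_items => :product }) }

# work on the in-memory copies; nothing touches the database until you call save
chosen = copies.first
chosen.save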
Related
I am currently on the verge of finishing my progressive Core Data migration and I am bumping into a few problems:
1) How is one expected to make use of the method createRelationshipsForDestinationInstance:entityMapping:manager:error: to successfully migrate the relationships? Did Apple expect developers to somehow know what they are querying for and re-establish the relationships themselves, or is there a more elegant way?
2) I've noticed that the method mentioned in question 1 doesn't always get called. When I migrate from version 1 to version 2, it doesn't get called, but when I try to migrate from version 1 to version 3, it does. Why is that? I thought this method got called at the end of every endInstanceCreationForEntityMapping:manager:error:?
3) I have an immediate concern with migrating many-to-many relationships, as they are stored in a separate table. How does the migration manager treat this? Surely it is not another instance?
As a reference, this is the project I am experimenting all of my migration stuff on: https://github.com/sdwornik/ECD-Migration
Please feel free to fork and play around with it! :)
1) You get the destination entity passed in. From there you can find the source entity. You can also use -sourceInstancesForEntityMappingNamed:destinationInstances:.
2) Does the v1 to v2 migration work? If it doesn't, then perhaps heavy migration has broken recently.
3) It will treat all relationships the same; you need to recreate them as part of the migration. Perhaps provide further details on what you believe the issue is.
Update
I tried using that function, but was unable to fault the relationship associated with that entity. Is this a context issue?
Are you implementing the method that creates the object in the new context? If not, why not? If so, consider storing it in the -userInfo of the migration manager so you can access it later.
It works to a certain degree. It works with attribute migration, but the relationships don't get migrated.
Are the relationships being migrated? I suspect your mapping model is doing that work for you.
So I guess I'd have to reassign each relationship every time I do custom migration?
Every relationship that your mapping model can't handle; yes.
I am using the Wicked gem to build an instance of a model in steps (step 1, step 2, etc). On the third step, however, I need to make an API call to collect some data and store it in another model instance (it would have a :belongs_to relationship with the first model). What I am wondering is how I interact with this API and store the information while I am still in the creation process of the first model. Is this a good design pattern? Or should I be dealing with API information in a different way?
My thoughts are that I could redirect to the form for making the API call and redirect back to the fourth step after dealing with the API.
Does Rails have a specific design it uses for dealing with 3rd party APIs?
No, this is not a good design pattern, but sometimes there is no way around it. The important things are that:
everything is covered by a single database transaction, and that, as I understand from your question, is the case. Your objects are connected by a "belongs_to" relationship, so they can be saved in one go (when the "parent" object is saved, the "children" will be saved at once). There is also no second, unconnected object involved, so there is no need to create a separate transaction just for this action
you cover everything with enough error handling. This is your own responsibility: make sure that when the 3rd party call goes bananas, you're ready to catch the error and, worst case, roll back the entire transaction yourself
So, to summarize: no, it's not good practice, but Rails gives you the tools to "keep it clean"
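As a rough sketch of what that could look like inside the Wicked step action (ThirdPartyApi, Parent, the children association and the params keys are all placeholder names, and this assumes Wicked's render_wizard / next_wizard_path helpers):

def update
  @parent = Parent.find(params[:id])
  begin
    ActiveRecord::Base.transaction do
      data = ThirdPartyApi.fetch(params[:lookup])   # the 3rd party call
      @parent.children.build(:payload => data)      # child belongs_to :parent
      @parent.save!                                  # parent and children saved together
    end
    redirect_to next_wizard_path
  rescue => e                                        # API failure or validation error
    flash[:error] = e.message
    render_wizard @parent                            # re-render the current step
  end
end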
Although your question was rather verbose, I would recommend looking at the before_create ActiveRecord callback in your model:
#app/models/parent.rb
class Parent < ActiveRecord::Base
before_create :build_child
end
This builds the child object before you create the parent, meaning that when you save the parent, you'll have the child created at the same time. This allows you to create the child object while interacting with the parent. To ensure the child's data is populated correctly, you'll need to use an instance method with the callback.
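Something like the following sketch, where the callback populates the child from the API response (SomeApiClient, lookup_value and payload are illustrative names, and build_child is the builder generated by has_one :child):

#app/models/parent.rb
class Parent < ActiveRecord::Base
  has_one :child
  before_create :build_child_with_api_data

  private

  def build_child_with_api_data
    data = SomeApiClient.fetch(lookup_value)   # hypothetical external call
    build_child(:payload => data)              # child is saved along with the parent
  end
end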
I just realised that standard ActiveRecord actually hits the database when you do
person = Person.new(:name => "test")
I suspect ActiveRecord does this to check what fields are available for the model.
However, our legacy database is only occasionally available. That means that our test suite cannot run all the time. Are there any tricks to making this work without a database?
If it's not possible, we thought of some alternatives:
have a local copy of the database and work on that one
use another ORM that solves this problem (DataMapper)
Any suggestions are welcome.
Working with a local copy is only possible if the data touched isn’t changed from anywhere else! Otherwise you will have a database disaster soon.
Your suspicion is right too. ActiveRecord checks the database for fields, so it needs a connection. I don’t think there is a way to solve this if you want to continue using ActiveRecord. I think you need to solve the underlying problem and run a database that is always available!
Try attr_accessor :your_method_name
You may also use read_attribute/write_attribute.
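A minimal sketch of that idea, using a plain Ruby object so nothing ever needs a connection (the class and field names are just examples):

class OfflinePerson
  attr_accessor :name, :email   # declare the "columns" by hand

  def initialize(attributes = {})
    attributes.each { |key, value| send("#{key}=", value) }
  end
end

person = OfflinePerson.new(:name => "test")   # no database involved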
I have an object which I am using as a pattern. This object has a number of associations. One of those associations is an attach; my object has many attaches. I can clone all the DB data, but how should I do it with the files that are attached to my object?
I can imagine some solutions, but all of them are a little hacky and don't look native.
For example, I can add a virtual attribute to temporarily store the IDs of the attaches while I am cloning the object.
What solution do you have for managing attachments? If it's something like Paperclip, it has a callback that handles removing/cloning the real files at the filesystem level.
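If it is Paperclip, a rough sketch of copying the files along with the records might look like this (the attaches association and the document attachment name are assumptions; adjust to your schema):

copy = original.clone
original.attaches.each do |attach|
  new_attach = copy.attaches.build
  new_attach.document = File.open(attach.document.path)   # copies the real file for the clone
end
copy.save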
I have a model that I am attempting to apply optimistic locking to. If the user has a stale record, I need the system to display what they entered and ask them to try again. I have attempted to use the changes method, which works. My issue is that the model has many different levels of related models that are all submitted within the same form. Is it possible to traverse all of the related models, gathering all of the changes, or do I need to do this manually?
Any help would be appreciated!
Thanks,
Ryan Lundie
Hey, check out this plugin: dirty_associations. I think that's what you are looking for.
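If the plugin doesn't cover your case, a rough manual sketch of walking the associations and gathering each record's changes might look like this (deliberately naive; adapt it to your models and nesting):

def collect_changes(record, seen = [])
  return {} if seen.include?(record)
  seen << record
  result = { record => record.changes }
  record.class.reflect_on_all_associations.each do |assoc|
    Array(record.send(assoc.name)).compact.each do |related|
      result.merge!(collect_changes(related, seen))
    end
  end
  result
end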