Modify updated_at attribute through db trigger in rails - ruby-on-rails

Is there anything wrong with updating the updated_at attribute of a record through a DB trigger, in terms of fragment caching (i.e. could it cause partials not to be re-cached, or old cache keys to linger in memory)?
Additional info: I'm using a trigger because I'm using the upsert gem, which does not modify the updated_at attribute unless explicitly told to (which I do not want to do); also, because of the same gem, I cannot use an ActiveRecord after_save or before_save callback on the model.
Please let me know if there is any other information I should provide to add some clarity to my question.

There's nothing wrong with it, but if you need this behavior you can simply call record.touch in a method instead, so your code stays cleaner and the app more maintainable.
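As a sketch of why touch is enough for fragment caching (FakeRecord below is a hypothetical stand-in, not real ActiveRecord; the real touch issues an UPDATE setting updated_at to the current time), bumping updated_at alone is what changes a timestamp-based cache key:

```ruby
# Hypothetical stand-in for an ActiveRecord model, illustrating that
# touching updated_at alone is enough to change a timestamp-based cache key.
class FakeRecord
  attr_accessor :updated_at

  def initialize(updated_at)
    @updated_at = updated_at
  end

  # Real ActiveRecord's touch issues an UPDATE setting updated_at to now;
  # here we simply assign the given time.
  def touch(now = Time.now.utc)
    @updated_at = now
  end

  def cache_key
    "records/1-#{updated_at.strftime('%Y%m%d%H%M%S')}"
  end
end

record = FakeRecord.new(Time.utc(2020, 1, 1))
stale_key = record.cache_key   # "records/1-20200101000000"
record.touch(Time.utc(2020, 1, 2))
fresh_key = record.cache_key   # "records/1-20200102000000"
```

With a real model the call is just record.touch; cached fragments keyed on the record then regenerate on the next render.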

Another way to achieve this is to use the on_duplicate option of Rails' built-in upsert_all method (no gems required). Check the documentation; pseudo-code example:
YourModel.upsert_all(
  array_of_data,
  unique_by: %i[field_1 field_2],
  on_duplicate: Arel.sql('updated_at = current_timestamp')
)
If you have other fields to update, don't forget to add them to the Arel.sql string:
on_duplicate: Arel.sql('updated_at = current_timestamp, field_to_update = excluded.field_to_update')

Related

Unpermitted parameters issue Ruby on Rails 5

I'm currently trying to understand how permitted parameters work in Rails.
Usually, in my_model.rb I have:
has_many :some_other_model
...
def my_model_params
  params.require(:my_model).permit(:column1, some_other_model_attributes: %i[other_column1])
etc...
and in the update function
my_object.update_attributes(my_model_params)
with a well-formatted JSON that has a my_model root and some_other_model_attributes as a child array with values.
My problem is I receive a json like this one
However, the different objects inside (such as codification and general_information) do contain attributes of the mission (general_information contains reference, which is a column in the mission table), but there isn't any column named codification, nor any relation to a codification_attributes.
So, when I add:
general_information: %i[reference] to the permitted params, it says unknown attribute 'general_information' for Mission.
If I don't, no errors are raised, but in the log I can see unpermitted_parameter: general_information, and my object is not updated.
Finally, if I reject it, there is no more unpermitted_parameter: general_information in the log, but my object is not updated either.
I tried to set config.action_controller.action_on_unpermitted_parameters to false in my development config, it did nothing and it's probably a bad idea for production environment anyway.
The use of .permit! (even if it works) is currently not an option. And even though I think the JSON should be re-formatted, it would be better to find another solution.
Thanks for the help.
unpermitted_parameter: ... in the logs is not a problem you need to fix; it's just informational.
How it works: you permit only the parameters you need (you can think of them as a hash). Unpermitted parameters will not reach the model even if they are present in params. That means when you call
my_object.update_attributes(my_model_params)
it works like
my_object.update_attributes(column1: value_for_column1_from_params)
Keys in params should be named exactly like the columns in the model; otherwise you need to prepare the params somehow before create/update.
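One way to prepare such params before create/update (a hypothetical sketch; flatten_mission_params and the section names are assumptions taken from the question's payload) is to lift the nested sections into the root hash so their keys match the model's columns before permitting:

```ruby
# Merge nested JSON sections into the top-level hash so keys such as
# "reference" line up with the mission's columns before strong parameters run.
def flatten_mission_params(raw)
  %w[general_information codification].reduce(raw.dup) do |flat, section|
    nested = flat.delete(section) || {}
    flat.merge(nested)
  end
end

raw = { "name" => "Alpha", "general_information" => { "reference" => "M-42" } }
flat = flatten_mission_params(raw)
# => { "name" => "Alpha", "reference" => "M-42" }
```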

Rails cache_key returning different values before and after model.reload

I'm on Rails 4.2.8.
Let's say I have a model Contact, being contact = Contact.new.
Calling contact.cache_key returns a timestamped key, like so:
"contacts/2615608-20180109154442000000000"
Where 2615608 is the ID and 20180109154442000000000 the timestamp.
I'm seeing a weird behavior in my controller.
After a contact.update(contact_params), calling contact.cache_key gives me a different timestamp than calling contact.reload.cache_key.
The weird thing is I'm 100% sure no other update is happening to the model's updated_at (I verified by checking the console's SQL update statements; there is only one call that changes updated_at).
It's so weird that if I do something like this in the controller:
contact.update(contact_params)
Rails.logger.info "Updated_at before reload is #{@contact.updated_at} and its cache_key is #{@contact.cache_key}"
Rails.logger.info "Updated_at after reload is #{@contact.reload.updated_at} and its cache_key is #{@contact.reload.cache_key}"
The output reveals IDENTICAL updated_at values, but different cache_key values:
Updated_at before reload is 2018-01-09 14:01:58 -0200 and its cache_key is contacts/2615608-20180109160158143423000
Updated_at after reload is 2018-01-09 14:01:58 -0200 and its cache_key is contacts/2615608-20180109160158000000000
As you can see, same updated_at, but different timestamps.
This is driving me nuts. I hate having to .reload models unnecessarily. Why is this happening? I tried looking at cache_key's source code but couldn't find an answer there, since it apparently depends only on the updated_at value (which is the same).
Found out the issue. It was a Rails bug, fixed in Rails 5.0.
The source code for .cache_key for Rails 4.2 can be found here. As you can see, it used cache_timestamp_format = :nsec, which was too precise.
The reason, as far as I could understand, is that BEFORE the reload the model's updated_at is still in memory, so it has enough resolution to produce a very precise nanosecond key. But after model.reload, updated_at comes from the database, which doesn't have that high a resolution, so the cache_key timestamp ends up a different number, with a lot of zeroes at the end (in my example, 20180109160158000000000).
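The truncation is easy to reproduce in plain Ruby (timestamps taken from the question; "%Y%m%d%H%M%S%9N" is the nanosecond strftime pattern used for the :nsec cache keys):

```ruby
require "time"

# In memory, updated_at keeps its sub-second part...
in_memory = Time.parse("2018-01-09 16:01:58.143423 UTC")
in_memory_key = in_memory.strftime("%Y%m%d%H%M%S%9N")
# => "20180109160158143423000"

# ...but a seconds-precision :datetime column drops the fraction on the way
# through the database, so the reloaded value formats differently:
reloaded = Time.parse("2018-01-09 16:01:58 UTC")
reloaded_key = reloaded.strftime("%Y%m%d%H%M%S%9N")
# => "20180109160158000000000"
```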
This issue details the problem with the overly precise (nanosecond) timestamp, and this pull request was merged to fix it, changing the precision from :nsec to :usec.
On the issue discussion (same link above), @tarzan suggests fixing it by creating an ActiveSupport::Concern with the following code:
included do
  self.cache_timestamp_format = :usec
end
and defining :usec as a format in an initializer:
Time::DATE_FORMATS[:usec] = "%Y%m%d%H%M%S%6N"
Then you manually include the concern in the models where you want to use .cache_key (by the way, I'm using this approach as recommended by the official Rails guides for low-level caching, via Rails.cache.fetch(self.cache_key); see http://guides.rubyonrails.org/caching_with_rails.html, topic 1.6 Low-Level Caching).
The problem is, in my tests at least (MySQL on macOS High Sierra), with contact being a scaffolded model, the precision of a :datetime column (like updated_at) is only down to the second. That's why his fix doesn't solve it for us: changing from nanosecond to microsecond precision still results in two different cache_keys, just with fewer zeroes appended. As an example:
[4] pry(#<ContactsController>)> ::Time::DATE_FORMATS[:usec] = "%Y%m%d%H%M%S%6N"
=> "%Y%m%d%H%M%S%6N"
[5] pry(#<ContactsController>)> @contact.updated_at.utc.to_s(:usec)
=> "20180109234014062142"
[6] pry(#<ContactsController>)> @contact.reload.updated_at.utc.to_s(:usec)
Contact Load (2.2ms) SELECT `contacts`.* FROM `contacts` WHERE `contacts`.`id` = 2615608 LIMIT 1 /*application:Temporadalivre,controller:contacts,action:toggle_status*/
=> "20180109234014000000"
So, for now at least, I'm sticking with the .reload, unfortunately.
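Another possible fix, assuming MySQL 5.6.4+ (which supports fractional-second datetime columns; Rails 4.2 added support for declaring them), is to store microseconds in the column itself so the in-memory and reloaded timestamps agree. A sketch of such a migration (the migration name and columns are assumptions, adjust to your schema):

```ruby
# Hypothetical migration: widen the timestamp columns to microsecond
# precision so values round-trip through MySQL without being truncated.
class AddPrecisionToContactTimestamps < ActiveRecord::Migration
  def change
    change_column :contacts, :created_at, :datetime, precision: 6
    change_column :contacts, :updated_at, :datetime, precision: 6
  end
end
```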
Last but not least, I only noticed this bug because we had a page of 20 of these contacts that could be manipulated via AJAX. Each AJAX call would .update the model and return it .as_cached_json to the view. Since .as_cached_json was called, we expected everything to be cached already when the page was reloaded, but it wasn't; only by checking the cache keys did we discover this bug, which recovered a really nice performance boost we were losing.

Rails 5 with mongoid .changed is always empty on update

In my controller, I have the *_params method to permit attributes. No matter what, even when things have definitely changed, .changed is always blank. Is there a new Rails 5 way to detect changed attributes?
# case_params is the basic rails controller permission method
if @case.update(case_params)
  puts @case.changed.count # this is always 0
...
Can anyone see what I'm doing wrong? I need to know what's changed so I can selectively do some other work in a thread but only for changed attributes.
Thanks for any help.
You can use @case.previous_changes, which gives all the changes to the document in the form of a hash. The hash also contains the updated_at attribute.
You can get the count of changed attributes, excluding updated_at, with @case.previous_changes.keys.count - 1 (we subtract 1 to remove the updated_at change).
If you want to detect changes in an association, you can try using children_changed?.
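The counting logic can be sketched in plain Ruby (the hash literal below mirrors the { "attribute" => [old, new] } shape that previous_changes returns; dropping updated_at by name is a bit safer than blindly subtracting 1, since updated_at may be absent on a no-op save):

```ruby
# Given a previous_changes-style hash, list the changed attributes while
# ignoring the automatic updated_at bump.
def changed_attribute_names(previous_changes)
  previous_changes.keys - ["updated_at"]
end

changes = {
  "status"     => ["open", "closed"],
  "updated_at" => [Time.utc(2017, 1, 1), Time.utc(2017, 1, 2)],
}
changed_attribute_names(changes)        # => ["status"]
changed_attribute_names(changes).count  # => 1
```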

first_or_create: determining which is called

I call first_or_create like so:
collection = Collection.first_or_create(:title => title)
Is there a way to determine if the result is an existing entry or a freshly created one? So far the best solution I've come up with is to use first_or_initialize:
collection = Collection.first_or_initialize(:title => title)
if collection.id.nil?
  <process>
  collection.save
end
But this feels a bit hacky. Is there a way to get this information directly from first_or_create?
first_or_create takes a block that will only be executed if a new record is created; you can set a flag inside that block to indicate a new record was built, for example:
MyObject.where(attr: "value").first_or_create do |obj|
  @my_object_created = true
end
As far as I know, you can't tell directly. Two options are to check the created_at time (unreliable), or to use first_or_initialize instead, then check whether new_record? is true, and if so, do your other operations and then call save!. This may be the best approach for you anyway, since you may well not want to finalize the save until the other relations are saved, and you probably want to do all of that in the same database transaction.
Using first_or_create you can't know for sure whether it's a newly created object or one from the database. A possible (hacky) solution is to compare the created_at value with the current time; this works if you don't create objects often.
By the way, why do you need to know whether the object is newly created?

Does Rails have a CLEAN update attribute method?

One thing that keeps making me bang my head against the wall in Rails is its unclean way of saving single attributes back to the model. Or at least, my understanding of it.
From what I know, the closest method is update_attribute (which is now deprecated?). However, it has a major drawback: it performs an update on all model fields.
If I'm mistaken, please tell me the best way to do this cleanly. If I'm correct, I seriously don't understand why there is no clean method that does this for single attributes.
I just tested this out:
Code:
Order.update(1, :description => 'fff')
SQL executed:
UPDATE `orders` SET `updated_at` = '2011-04-25 05:23:29', `description` = 'fff' WHERE `orders`.`id` = 1
So yes, Rails' update does almost what you need (except that it also updates updated_at).
Tip: in IRB, you can execute ActiveRecord::Base.logger = Logger.new(STDOUT) to see Rails' log output (including SQL statements).
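If the updated_at bump is the only objection, newer Rails versions also have update_column (and update_columns), which write just the named columns and skip validations, callbacks, and the timestamp update. The stand-in class below is hypothetical (only the method names are the real ActiveRecord ones) and just sketches the difference without a database:

```ruby
# Hypothetical in-memory stand-in contrasting update (bumps updated_at)
# with update_column (writes only the named column).
class FakeOrder
  attr_reader :attributes

  def initialize
    @attributes = { "description" => "old", "updated_at" => "2011-04-25" }
  end

  # Like Rails' update: writes the attributes and bumps updated_at.
  def update(attrs)
    attrs.each { |k, v| @attributes[k.to_s] = v }
    @attributes["updated_at"] = "2011-04-26"
  end

  # Like Rails' update_column: writes only the named column, nothing else.
  def update_column(name, value)
    @attributes[name.to_s] = value
  end
end

touched = FakeOrder.new
touched.update(description: "fff")
touched.attributes   # => { "description" => "fff", "updated_at" => "2011-04-26" }

untouched = FakeOrder.new
untouched.update_column(:description, "fff")
untouched.attributes # => { "description" => "fff", "updated_at" => "2011-04-25" }
```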
