Rails - Exclude an attribute from being saved

I have a column named updated_at in Postgres, and I'm trying to have the database set its value by default. But Rails still includes updated_at = NULL in the query, and Postgres only applies the default when updated_at is left out of the query entirely.
How do I get Rails to exclude a column?

You can disable this behaviour by setting the ActiveRecord::Base class variable record_timestamps to false.
In config/environment.rb, inside the Rails::Initializer.run block:
config.active_record.record_timestamps = false
(If that doesn't work, try ActiveRecord::Base.record_timestamps = false at the end of the file instead.)
If you want to set it only for a given model:
class Foo < ActiveRecord::Base
  self.record_timestamps = false
end
Credit to Jean-François at http://www.ruby-forum.com/topic/72569
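If you also want the database to supply the default itself, a migration along these lines can set it up (a minimal sketch; the foos table name is only an example):
class SetUpdatedAtDefault < ActiveRecord::Migration
  def self.up
    # Let Postgres fill updated_at whenever the column is omitted from the INSERT
    execute "ALTER TABLE foos ALTER COLUMN updated_at SET DEFAULT now()"
  end

  def self.down
    execute "ALTER TABLE foos ALTER COLUMN updated_at DROP DEFAULT"
  end
end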

I've been running into a similar issue in Rails 2.2.2. As of this version there is an attr_readonly method in ActiveRecord, but create doesn't respect it; only update does. I don't know if this has been changed in the latest version. I overrode the create method to force it to respect this setting.
def create
  if self.id.nil? && connection.prefetch_primary_key?(self.class.table_name)
    self.id = connection.next_sequence_value(self.class.sequence_name)
  end

  quoted_attributes = attributes_with_quotes(true, false)

  statement = if quoted_attributes.empty?
    connection.empty_insert_statement(self.class.table_name)
  else
    "INSERT INTO #{self.class.quoted_table_name} " +
    "(#{quoted_attributes.keys.join(', ')}) " +
    "VALUES(#{quoted_attributes.values.join(', ')})"
  end

  self.id = connection.insert(statement, "#{self.class.name} Create",
    self.class.primary_key, self.id, self.class.sequence_name)

  @new_record = false

  id
end
The change is just to pass false as the second parameter to attributes_with_quotes, and use quoted_attributes.keys for the column names when building the SQL. This has worked for me. The downside is that by overriding this you will lose before_create and after_create callbacks, and I haven't had time to dig into it enough to figure out why. If anyone cares to expand/improve on this solution or offer a better solution, I'm all ears.
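For reference, attr_readonly itself is declared like this (a minimal sketch; the model and column names are only examples):
class Foo < ActiveRecord::Base
  # Columns listed here are skipped when building UPDATE statements;
  # on Rails 2.2.x they were still written on INSERT, hence the override above.
  attr_readonly :updated_at
end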

Related

Getting "original" object during a before_add callback in ActiveRecord (Rails 7)

I'm in the process of updating a project to use Ruby 3 and Rails 7. I'm running into a problem with some code that was working before, but isn't now. Here are (I think) the relevant parts of the code.
class Dataset < ActiveRecord::Base
  has_and_belongs_to_many :tags, :autosave => true,
    :before_add => ->(owner, change) { owner.send(:on_flag_changes, :before_add, change) }

  before_save :summarize_changes

  def on_flag_changes(method, tag)
    before = tags.map(&:id)
    after = before + [tag.id]
    record_change('tags', before, after)
  end

  def record_change(field, before_val, after_val)
    reset_changes
    before_val = @change_hash[field][0] if @change_hash[field]
    if before_val.class.method_defined? :sort
      before_val = before_val.sort unless before_val.blank?
      after_val = after_val.sort unless after_val.blank?
    end
    @change_hash[field] = [before_val, after_val]
  end

  def reset_changes
    if @change_hash.nil?
      @change_notes = {}
      @change_hash = {
        tags: [tags.map(&:id), :undefined]
      }
    end
  end

  def has_changes_to_save?
    super || !change_hash.reject { |_, v| v[1] == :undefined }.blank?
  end

  def changes_to_save
    super.merge(change_hash.reject { |_, v| v[0] == v[1] || v[1] == :undefined })
  end

  def summarize_changes
    critical_fields = %w[ tags ]
    @change_notes = changes_to_save.keep_if { |key, _value| critical_fields.include? key } if has_changes_to_save?
    self.critical_change = true unless @change_notes.blank?
  end
end
There are more fields on this class, and some attr_accessors, but the reason I'm doing it this way is that the tags list can change without necessarily triggering a change in the default changes_to_save list. This lets us track whether the tags have changed and set the critical_change flag (also part of Dataset) if they do.
In previous Rails versions, this worked fine. But since the upgrade, it's failing. What I'm finding is that the owner passed into the :before_add callback is NOT the same object as the one being passed into the before_save callback. This means that summarize_changes doesn't see the changes to @change_hash, so it never sets the critical_change flag like it should.
I'm not sure what changed between Rails 6 and 7 to cause this, but I'm trying to find a way to get this to work properly; i.e., if something says dataset.tags = [tag1, tag2] when tag1 was previously the only association, then dataset.save should result in the critical_change flag being set.
I hope that makes sense. I'm hoping this is an easy fix, but so far my looking through the Rails 7 documentation has not given me the information I need. (It may go without saying that @change_notes and @change_hash are NOT persisted in the database; they are there just to track changes prior to saving, to know if the critical_change flag should be set.)
Thanks!
Turns out in my case there was some weird caching going on. I'd forgotten to mention an after_initialize callback that was calling the reset method; for some reason, at the time it makes this call, it wasn't working with the same object as the one that actually got loaded, and some association caching was going on with tags (the tags association was loaded with the "initialized" record and cached with the "final" record, which was confusing some of the code).
Removing the tags bit from the reset method, and having it initialize the tag state the first time it tries to modify tags, solved the problem. I'm not particularly fond of the solution, but it works, and that's what I needed for now.
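A rough sketch of what that workaround looks like, based on the code above (the question doesn't show the actual fix, so treat this as an approximation):
def reset_changes
  if @change_hash.nil?
    @change_notes = {}
    @change_hash = {} # no longer touches the tags association here
  end
end

def on_flag_changes(method, tag)
  reset_changes
  # capture the tag state lazily, the first time tags are modified
  @change_hash['tags'] ||= [tags.map(&:id), :undefined]
  before = @change_hash['tags'][0]
  after = before + [tag.id]
  record_change('tags', before, after)
end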

Update fails first time, succeeds second time

We've got this object, @current_employer, that's acting a bit weird. Update fails the first time, succeeds the second.
(byebug) @current_employer.update(settings_params)
false
(byebug) @current_employer.update(settings_params)
true
Here's where we initialise it:
@current_employer = Employer.find(decoded_auth_token[:employer_id])
It's just a standard "find".
Current workaround:
if @current_employer.update(settings_params) || @current_employer.update(settings_params)
...
Anyone seen this before?
Update
Tracked it down to this line in a "before_save" call
# self.is_test = false if is_test.nil?
Seems like is_test is a reserved keyword?
Solved
The full callback, with the fix commented inline:
def set_default_values
  self.has_accepted_terms = false if has_accepted_terms.nil?
  self.live = true if live.nil?
  self.account_name.downcase!
  self.display_name ||= account_name
  self.display_name ||= ""
  self.contact_first_name ||= ""
  self.contact_last_name ||= ""
  self.phone_number ||= ""
  self.is_test_account = false if is_test_account.nil?
  true # FIX: the preceding line was returning false, which gave the before_save callback a false return value and prevented the save.
end
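Worth noting: this halt-on-false behaviour applies to Rails 4 and earlier. From Rails 5 on, returning false from a callback no longer aborts the save; you halt the chain explicitly with throw(:abort), roughly like this (the check_account method and its condition are made up for illustration):
class Employer < ActiveRecord::Base
  before_save :check_account

  def check_account
    # Rails 5+: halting requires an explicit throw; a trailing false is ignored
    throw(:abort) if account_name.blank?
  end
end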
Model
If it's failing in one instance and succeeding almost immediately afterwards, the typical issue is that you're passing incorrect / conflicting attributes to the model.
I would speculate that the settings_params you're sending have a value which is preventing the save from occurring. You alluded to this with your update:
# self.is_test = false if is_test.nil?
The way to fix this is to cut out any of the potentially erroneous attributes from your params hash:
def settings_params
  params.require(:employer).permit(:totally, :safe, :attributes)
end
Your model should update consistently - regardless of what conditions are present. If it's failing, it means there'll be another problem within the model save flow.
--
Without seeing extra information, I'm unable to say what they may be.
Update
A better way to set default values is as follows:
How can I set default values in ActiveRecord?
You may wish to use the attribute-defaults gem:
class Foo < ActiveRecord::Base
  attr_default :age, 18
  attr_default :last_seen do
    Time.now
  end
end
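A gem-free alternative is to set defaults in an after_initialize callback; a minimal sketch using the attribute names from the question:
class Employer < ActiveRecord::Base
  after_initialize :set_defaults, if: :new_record?

  def set_defaults
    self.live = true if live.nil?
    self.is_test_account = false if is_test_account.nil?
  end
end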

Rails fixtures use empty_clob() with CLOB fields but nothing else

I'm back with Rails fixtures after seeing they were much improved since the last time I used them.
# models.yml
one:
  id: 1
  clob_field: "My Text"
When the models fixture is loaded into the DB, I can see that the CLOB text ("My Text") is replaced with an empty_clob() call in the INSERT statement.
As I understand it, the Oracle enhanced adapter should then issue an UPDATE statement that sets clob_field appropriately, but this never gets executed (and the value remains blank).
Any idea why that is?
I traced the fixture loading and found out that this was due to a combination of specifying a schema name along with the table name (self.table_name = "SCHEMA_OWNER.TABLE_NAME") and using upper-case table names.
I've worked around the issue by overriding the insert_fixture method (in the oracle-enhanced adapter) to properly manipulate table_name.
Now write_lobs is being called correctly.
UPDATE
Here's the change, as requested by @jeff-k:
# config/initializers/oracle_enhanced_adapter.rb
...
ActiveSupport.on_load(:active_record) do
  ActiveRecord::ConnectionAdapters::OracleEnhancedAdapter.class_eval do
    # Overriding this method to account for including the schema_name in the table name,
    # which is implemented to work around another limitation of having the schema_owner different
    # than the connected user
    #
    # Inserts the given fixture into the table. Overridden to properly handle lobs.
    def insert_fixture(fixture, table_name) #:nodoc:
      super
      if table_name =~ /\./i
        table_name = table_name.downcase.split('.')[1]
      end
      if ActiveRecord::Base.pluralize_table_names
        klass = table_name.to_s.singularize.camelize
      else
        klass = table_name.to_s.camelize
      end
      klass = klass.constantize rescue nil
      if klass.respond_to?(:ancestors) && klass.ancestors.include?(ActiveRecord::Base)
        write_lobs(table_name, klass, fixture, klass.lob_columns)
      end
    end
  end
end

Rails Cache Key generated as ActiveRecord::Relation

I am attempting to generate a fragment cache (using a Dalli/Memcached store) however the key is being generated with "#" as part of the key, so Rails doesn't seem to be recognizing that there is a cache value and is hitting the database.
My cache key in the view looks like this:
cache([#jobs, "index"]) do
The controller has:
@jobs = @current_tenant.active_jobs
With the actual Active Record query like this:
def active_jobs
  self.jobs.where("published = ? and expiration_date >= ?", true, Date.today).order("(featured and created_at > now() - interval '" + self.pinned_time_limit.to_s + " days') desc nulls last, created_at desc")
end
Looking at the rails server, I see the cache read, but the SQL Query still runs:
Cache read: views/#<ActiveRecord::Relation:0x007fbabef9cd58>/1-index
Read fragment views/#<ActiveRecord::Relation:0x007fbabef9cd58>/1-index (1.0ms)
(0.6ms) SELECT COUNT(*) FROM "jobs" WHERE "jobs"."tenant_id" = 1 AND (published = 't' and expiration_date >= '2013-03-03')
Job Load (1.2ms) SELECT "jobs".* FROM "jobs" WHERE "jobs"."tenant_id" = 1 AND (published = 't' and expiration_date >= '2013-03-03') ORDER BY (featured and created_at > now() - interval '7 days') desc nulls last, created_at desc
Any ideas as to what I might be doing wrong? I'm sure it has to do with the key generation and ActiveRecord::Relation, but I'm not sure how.
Background:
The problem is that the string representation of the relation is different each time your code is run: the object id portion of
views/#<ActiveRecord::Relation:0x007fbabef9cd58>/...
changes, so you get a different cache key each time.
Besides that it is not possible to get rid of database queries completely. (Your own answer is the best one can do)
Solution:
To generate a valid key, instead of this
cache([#jobs, "index"])
do this:
cache([@jobs.to_a, "index"])
This queries the database and builds an array of the models, from which the cache_key is retrieved.
PS: I could swear using relations worked in previous versions of Rails...
We've been doing exactly what you're mentioning in production for about a year. I extracted it into a gem a few months ago:
https://github.com/cmer/scope_cache_key
Basically, it allows you to use a scope as part of your cache key. There are significant performance benefits to doing so, since you can cache a page containing multiple records in a single cache element rather than looping over each element in the scope and retrieving caches individually. I feel that combining this with the standard "Russian Doll Caching" principles is optimal.
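A hypothetical usage sketch, assuming the gem exposes a cache_key on scopes/relations as its README describes (check the gem itself for the actual API):
<% cache(@jobs) do %>
  <%# one fragment for the whole scope; its key changes when the records in it change %>
  <%= render @jobs %>
<% end %>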
I have had similar problems; I have not been able to successfully pass relations to the cache function, and your @jobs variable is a relation.
I coded up a solution for cache keys that deals with this issue along with some others I was having. It basically involves generating a cache key by iterating through the relation.
A full write up is on my site here.
http://mark.stratmann.me/content_items/rails-caching-strategy-using-key-based-approach
In summary, I added a get_cache_key function to ActiveRecord::Base:
module CacheKeys
  extend ActiveSupport::Concern

  # Instance Methods
  def get_cache_key(prefix=nil)
    cache_key = []
    cache_key << prefix if prefix
    cache_key << self
    self.class.get_cache_key_children.each do |child|
      if child.macro == :has_many
        self.send(child.name).all.each do |child_record|
          cache_key << child_record.get_cache_key
        end
      end
      if child.macro == :belongs_to
        cache_key << self.send(child.name).get_cache_key
      end
    end
    return cache_key.flatten
  end

  # Class Methods
  module ClassMethods
    def cache_key_children(*args)
      @v_cache_key_children = []
      # validate the children
      args.each do |child|
        # is it an association?
        association = reflect_on_association(child)
        if association == nil
          raise "#{child} is not an association!"
        end
        @v_cache_key_children << association
      end
    end

    def get_cache_key_children
      return @v_cache_key_children ||= []
    end
  end
end

# include the extension
ActiveRecord::Base.send(:include, CacheKeys)
I can now create cache fragments by doing
cache(@model.get_cache_key(['textlabel'])) do
I've done something like Hopsoft, but it uses the method in the Rails Guide as a template. I've used the MD5 digest to distinguish between relations (so User.active.cache_key can be differentiated from User.deactivated.cache_key), and used the count and max updated_at to auto-expire the cache on updates to the relation.
require "digest/md5"
module RelationCacheKey
def cache_key
model_identifier = name.underscore.pluralize
relation_identifier = Digest::MD5.hexdigest(to_sql.downcase)
max_updated_at = maximum(:updated_at).try(:utc).try(:to_s, :number)
"#{model_identifier}/#{relation_identifier}-#{count}-#{max_updated_at}"
end
end
ActiveRecord::Relation.send :include, RelationCacheKey
While I marked @mark-stratmann's response as correct, I actually resolved this by simplifying the implementation. I added touch: true to my model relationship declaration:
belongs_to :tenant, touch: true
and then set the cache key based on the tenant (with a required query param as well):
<% cache([@current_tenant, params[:query], "#{@current_tenant.id}-index"]) do %>
That way if a new Job is added, it touches the Tenant cache as well. Not sure if this is the best route, but it works and seems pretty simple.
I'm using this code:
class ActiveRecord::Base
  def self.cache_key
    pluck("concat_ws('/', '#{table_name}', group_concat(#{table_name}.id), date_format(max(#{table_name}.updated_at), '%Y%m%d%H%i%s'))").first
  end

  def self.updated_at
    maximum(:updated_at)
  end
end
Maybe this can help you out:
https://github.com/casiodk/class_cacher
It generates a cache_key from the model itself, but maybe you can use some of the principles in the codebase.
As a starting point you could try something like this:
def self.cache_key
  ["#{model_name.cache_key}-all",
   "#{count}-#{updated_at.utc.to_s(cache_timestamp_format) rescue 'empty'}"
  ] * '/'
end

def self.updated_at
  maximum :updated_at
end
I have a normalized database where multiple models relate to the same other model; think of clients, locations, etc. all having addresses by means of a street_id.
With this solution you can generate cache_keys based on scope, e.g.
cache [@client, @client.locations] do
  # ...
end

cache [@client, @client.locations.active, 'active'] do
  # ...
end
and I could simply modify self.updated_at from above to also include associated objects (because has_many does not support touch, so if I updated the street, it wouldn't otherwise be seen by the cache):
belongs_to :street

def cache_key
  [street.cache_key, super] * '/'
end

# ...

def self.updated_at
  [maximum(:updated_at),
   joins(:street).maximum('streets.updated_at')
  ].max
end
As long as you don't "undelete" records and use touch in belongs_to, you should be alright with the assumption that a cache key made of count and max updated_at is sufficient.
I'm using a simple patch on ActiveRecord::Relation to generate cache keys for relations.
require "digest/md5"
module RelationCacheKey
def cache_key
Digest::MD5.hexdigest to_sql.downcase
end
end
ActiveRecord::Relation.send :include, RelationCacheKey
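With this patch in place, a relation can be passed to cache() directly, for example:
<% cache([@jobs, "index"]) do %>
  <%= render @jobs %>
<% end %>
The key is stable across requests because it is derived from the SQL, but note that it does not change when the underlying rows change, so some other expiry mechanism (such as the touch: true approach above) is still needed.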

Rails - How to use find or create with a default value

I currently have the following:
conversation_participation = @user.conversation_participations.find_or_create_by_conversation_id(conversation.id)
This correctly creates a record; the problem is that the default value of conversation.read is false.
In this particular method I want the default value to be true when creating the record. Right now the only way I got this to work was with the following:
conversation_participation = @user.conversation_participations.find_or_create_by_conversation_id(conversation.id)
conversation_participation.read = true
conversation_participation.save
The problem is this hits the DB twice. How can I use find_or_create and set :read => true by default?
Thanks
You can try find_or_initialize_by...
Then set your read attribute as normal
conversation_participation = @user.conversation_participations.find_or_initialize_by_conversation_id(conversation.id)
conversation_participation.read = true
conversation_participation.save
Or
conversation_participation = @user.conversation_participations.find_or_initialize_by_conversation_id_and_read(conversation.id, true)
conversation_participation.save
With after_initialize (Oh.. you deleted your comment, but here is something for that just in case)
class Conversation < ActiveRecord::Base
  def after_initialize
    self.read = true if read.nil? # default to true only when not already set
  end
end
Then you can do find_or_create|initialize_by... or whichever way you wish to proceed.
More on callbacks if you are interested.
From here:
Use the find_or_initialize_by_ finder if you want to return a new record without saving it first.
So something like this:
conversation_participation = @user.conversation_participations.find_or_initialize_by_conversation_id(conversation.id)
conversation_participation.read = true
conversation_participation.save
This should just do an INSERT instead of an INSERT followed by an UPDATE.
Try this:
conversation_participation = @user.conversation_participations.find_or_create_by_conversation_id_and_read(conversation.id, true)
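On newer Rails versions (4+) the dynamic finders above are gone; the hash-based equivalent, find_or_create_by, takes a block that only runs when the record is being created, which covers this use case in one call. A minimal sketch reusing the names from the question:
conversation_participation =
  @user.conversation_participations.find_or_create_by(conversation_id: conversation.id) do |cp|
    cp.read = true # only applied when a new record is created
  end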
