I have a jsonb-type column called 'payload' for my Tweet model in my Rails 6.1.1 app. I use a store to access various fields on this attribute directly:
class Tweet < ApplicationRecord
store :payload, accessors: [:lang, :text, :entities], coder: JSON
end
(Note that the part coder: JSON is necessary to change the serialization from the YAML default to JSON – otherwise you end up with YAML in your jsonb column.)
When I create a new tweet, I see that ActiveRecord erroneously escapes the JSON string during the insertion:
Tweet.create payload: {foo: 'bar'}
TRANSACTION (0.7ms) BEGIN
Tweet Create (2.0ms) INSERT INTO "tweets" ("payload", ...) VALUES ($1, ...) RETURNING "id" [["payload", "\"{\\\"foo\\\":\\\"bar\\\"}\""], ...]
I'm referring to the part "\"{\\\"foo\\\":\\\"bar\\\"}\"". It looks like it has been double escaped. This results in a sort of 'stringception': a string within a string is stored in the payload column, and Postgres can't recognize that value as JSON, so it is unable to perform any searches on fields using the arrow -> syntax. (Seemingly miraculously, however, Rails is able to deserialize the field properly on read operations.)
Another SO user has illustrated the issue here: https://dbfiddle.uk/gcwTQOUm
When I do not use store, I cannot reproduce the issue:
Tweet.create payload: {foo: 'bar'}
TRANSACTION (0.2ms) BEGIN
Tweet Create (1.6ms) INSERT INTO "tweets" ("payload", ...) VALUES ($1, ...) RETURNING "id" [["payload", "{\"foo\":\"bar\"}"], ...]
"{\"foo\":\"bar\"}" is the desired string.
This has led me to believe that I'm using store wrong.
I checked the docs:
NOTE: If you are using structured database data types (e.g. PostgreSQL hstore/json, or MySQL 5.7+ json) there is no need for the serialization provided by .store. Simply use .store_accessor instead to generate the accessor methods. Be aware that these columns use a string keyed hash and do not allow access using a symbol.
In other words, because I was already using a jsonb-type column, Rails already knew to serialize the hash – applying another serialization resulted in a double escape.
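You can reproduce the double encoding in plain Ruby, outside ActiveRecord, by serializing the hash twice, which is effectively what the redundant coder was doing (a quick illustration, not part of the original question):
require 'json'

once  = { foo: 'bar' }.to_json   # => "{\"foo\":\"bar\"}" (valid JSON)
twice = once.to_json             # => "\"{\\\"foo\\\":\\\"bar\\\"}\"" (a JSON string containing JSON, matching the log above)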
So, instead of doing:
store :payload, accessors: [:lang, :text, :entities], coder: JSON
I now do:
store_accessor :payload, :lang, :text, :entities
And it works.
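With the extra coder gone, the column holds real JSON and the arrow operators work again. A quick sanity check (a sketch assuming a payload that contains a lang key):
Tweet.create payload: { lang: 'en', text: 'hello' }
Tweet.where("payload->>'lang' = ?", 'en')   # now matches, since Postgres can parse the jsonb value
Tweet.first.lang                            # => "en", via the reader generated by store_accessor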
Related
I am using:
Ruby 2.2.6
Rails 4.1.16
PostgreSQL 11.10
json gem 1.8.6
Currently I am getting the error "TypeError: no implicit conversion of Hash into String" when I try to save a model. This only happens for the param that uses the json datatype.
The database schema for the table:
create_table "jasons", force: :cascade do |t|
t.string "name"
t.json "description"
end
I have already added accessors for the json column in the Jason model (Jason.rb) like this:
class Jason < ApplicationRecord
:description, accessors: [:key1, :key2, :watapak]
end
and also whitelisted the params in the controller.
def jason_params
params.require(:jason).permit(:name, :description, :key1, :key2, :watapak)
end
and I can call new on the model successfully with
jason = Jason.new(name: "Jason John", key1: "abcd", key2: "efgh")
=> #<Jason id: nil, name: "Jason John", description: {"key1"=>"abcd", "ke...
Then when I call jason.save, it returns the error "TypeError: no implicit conversion of Hash into String".
I tested the code in another environment, using the latest Ruby 3.0 and Rails 6.1, and everything works without any error.
One thing I noticed is that the SQL command is different in the two environments.
Rails 6.1
Jason Create (0.5ms) INSERT INTO "jasons" ("name", "description") VALUES (?, ?) [["name", "Jason John"], ["description", "\"--- !ruby/hash:ActiveSupport::HashWithIndifferentAccess\\nkey1: abcd\\nkey2: efgh\\n\""]]
Rails 4.1.16
SQL (6.6ms) INSERT INTO "device_models" ("created_at", "name", "description", "updated_at") VALUES ($1, $2, $3, $4) RETURNING "id" [["created_at", "2021-08-09 07:13:04.157826"], ["name", "Jason John"], ["description", "{\"key1\":\"abcd\",\"key2\":\"efgh\"}"], ["updated_at", "2021-08-09 07:13:04.157826"]]
The full error differs a bit from the one in the question above, but the root of the problem is the same.
Do tell me if more information is needed. Also, sorry if my question is gibberish; this is my first time posting a question and I don't really know how to structure it to make it easier to understand.
Please help T.T
I suspect that you may have intended to write store :description, accessors: [:key1, :key2, :watapak], which is a hack to store serialized data in VARCHAR/TEXT columns and does not work with native JSON types. Both serialize and store are practically obsolete.
NOTE: If you are using structured database data types (e.g. PostgreSQL hstore/json, or MySQL 5.7+ json) there is no need for the serialization provided by .store. Simply use .store_accessor instead to generate the accessor methods. Be aware that these columns use a string keyed hash and do not allow access using a symbol.
class Jason < ApplicationRecord
store_accessor :description, :key1, :key2, :watapak
end
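With that in place the accessors read and write plain hash entries on the json column, keyed by strings as the docs note (a rough sketch, not taken from the original question):
jason = Jason.new(name: "Jason John", key1: "abcd", key2: "efgh")
jason.description   # => {"key1"=>"abcd", "key2"=>"efgh"} (string keys)
jason.key1          # => "abcd"
jason.save          # the native json column handles serialization, no TypeError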
It seems like Mongoid is now the superior ORM for Mongo based on performance and development activity. Unfortunately, we're on MongoMapper and need to migrate.
Are there any concerns or stumbling blocks we should be aware of? We have found a few outdated articles on Google and tried posting on the Mongoid Google Groups (though we were prohibited), but would love thoughts from SO members who have done this in the past.
We're on Rails 3.2.12.
Thanks!
Both of them are great MongoDB libraries for Ruby, but if you want to switch, here are some notes:
Migrating MongoMapper ORM to Mongoid ORM - Notes
Configure the database connection.
Replace the configuration YAML file (including replica configuration).
Configure Mongoid-specific options, e.g. raise_not_found_error: false if you don't want an error every time a query returns nothing.
Change all model definitions: include MongoMapper::Document becomes include Mongoid::Document.
Change the format of all field definitions (see the sketch after these notes).
In Mongoid you have to add the timestamps explicitly: include Mongoid::Timestamps.
Change validations, e.g. :in => ARRAY becomes validates :name, presence: true, inclusion: { in: ARRAY }.
Change indexes.
Change the order_by format. e.g. MM: Model.all(:order => 'name'). Mongoid: Model.order_by('name ASC').
Error is a keyword in Mongoid, so if you have a model named Error, you should rename it.
The pagination format is different and uses another gem.
The primary key index entry in MM is id. In Mongoid it's _id. If you have other code relying on .id in the object JSON, you can override the as_json method in your model to create the JSON structure you want.
In MM, Model.fields(:id, :name) limits the fields returned from the database to those supplied to the method. In Mongoid it's Model.only(:id, :name).
Some query changes:
Selecting objects by array: MM accepts both Model.where(:attr.in => []) and Model.where(:attr => []). Mongoid accepts only Model.where(:attr.in => []).
MM's map is equivalent to Mongoid's pluck: Model.map(&:name) becomes Model.pluck(:name).
Mongoid doesn't support find with nil. E.g. with value = nil, Model.find(value) will throw the error "Calling Document.find with nil is invalid", so in Mongoid you should do Model.find(value || "").
In MM: Model.find_or_initialize_by_name("BOB"). In Mongoid: Model.find_or_initialize_by(name: "BOB").
MM supports both Model.where({:name => 'BOB'}).first and Model.first({:name => 'BOB'}). Mongoid supports only the first form.
In MM, to update multiple objects: Model.set({conditions}, attr_to_update). In Mongoid: Model.where(conditions).update_all(attr_to_update).
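To make the model-definition changes above concrete, here is a rough before/after sketch for a hypothetical Product model (field names and validations invented for illustration):
# MongoMapper
class Product
  include MongoMapper::Document

  key :name,   String
  key :status, String
  timestamps!

  validates_inclusion_of :status, :in => %w[draft published]
end

# Mongoid
class Product
  include Mongoid::Document
  include Mongoid::Timestamps

  field :name,   type: String
  field :status, type: String

  validates :status, inclusion: { in: %w[draft published] }
end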
I was following this railscast and finished the tutorial. Everything was working fine. Then I decided to use hstore instead of a serialized hash, and after setting up hstore, I ran into an error:
PG::Error: ERROR: Syntax error near '!' at position 4 : INSERT INTO "products" ("product_type_id", "created_at", "properties", "updated_at") VALUES ($1, $2, $3, $4) RETURNING "id"
I googled, and found a similar SO question, but I'm using Rails 4, which supposedly doesn't need to use that gem anymore.
Here's my code:
The relevant portion of my form.html.haml looks like this
= f.fields_for :properties, OpenStruct.new(@product.properties) do |builder|
  - @product.product_type.fields.each do |field|
    = render "products/fields/#{field.field_type}", field: field, f: builder
My Product model looks like this:
class Product < ActiveRecord::Base
belongs_to :product_type
serialize :properties
end
I can post more code if it will help. Thanks!
The Rails 4 PostgreSQL driver for ActiveRecord is supposed to have native support for PostgreSQL's hstore type, so you shouldn't need to use serialize at all. Try ditching the serialize call.
BTW, a ! will appear in a YAML string when you attempt to serialize some objects to YAML:
"--- !ruby/object:SomeClassName ..."
and that ! could cause some problems if PostgreSQL was expecting to see an hstore string.
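Under that assumption, the model keeps only the association and the hstore hash is read and written directly (a sketch, not the original code):
class Product < ActiveRecord::Base
  belongs_to :product_type
  # serialize removed: the Rails 4 pg adapter maps the hstore column to a Hash natively
end

product = Product.new(properties: { "color" => "red", "size" => "M" })
product.save!
product.properties["color"]   # => "red" (hstore keys and values come back as strings)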
I need to store a serialized hash in MySQL. Since the size of the hash is going to be very small, I decided to use a varchar column for the serialized data instead of a text column. I am using MySQL with Rails 3.
Model:
class User < ActiveRecord::Base
serialize :monday
end
When I do the following,
u = User.new
u.monday = {:from => "10:00", :to => "04:00"}
u.save
I get the following error: "TypeError: class or module required". Shouldn't a varchar column work for serialized data?
You need a :text database datatype to use the serialize option.
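If the column is currently a varchar, a migration along these lines switches it over (a sketch assuming the users table and the monday column from the model above):
class ChangeMondayToText < ActiveRecord::Migration
  def up
    change_column :users, :monday, :text
  end

  def down
    change_column :users, :monday, :string
  end
end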
I would like to know if it is possible to get the column types (as known by ActiveRecord, e.g. in the migration script and the database) programmatically (I know the data exists in there somewhere).
For example, I can deal with all the attribute names:
ar.attribute_names.each { |name| puts name }
.attributes just returns a mapping of the names to their current values (e.g. no type info if the field isn't set).
Some places I have seen it with the type information:
in script/console, type the name of an AR entity:
>> Driver
=> Driver(id: integer, name: string, created_at: datetime, updated_at: datetime)
So clearly it knows the types. There is also .column_for_attribute, which takes an attribute name and returns a column object with the type buried in the underlying database column object, but that doesn't appear to be a clean way to get it.
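For reference, that digging looks roughly like this (output abbreviated):
Driver.column_for_attribute(:name)        # => a column object for the underlying database column
Driver.column_for_attribute(:name).type   # => :string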
I would also be interested in whether there is a way that is friendly to the new "ActiveModel" that is coming in Rails 3 and is decoupled from database specifics (but perhaps type info will not be part of it; I can't seem to find out whether it is).
Thanks.
In Rails 3, for your model "Driver", you want Driver.columns_hash.
Driver.columns_hash["name"].type #returns :string
If you want to iterate through them, you'd do something like this:
Driver.columns_hash.each {|k,v| puts "#{k} => #{v.type}"}
which will output the following:
id => integer
name => string
created_at => datetime
updated_at => datetime
In Rails 5, you can do this independently of the database. That's important if you use the new Attributes API to define (additional) attributes.
Getting all attributes from a model class:
pry> User.attribute_names
=> ["id",
"firstname",
"lastname",
"created_at",
"updated_at",
"email",...
Getting the type:
pry> User.type_for_attribute('email')
=> #<ActiveRecord::ConnectionAdapters::AbstractMysqlAdapter::MysqlString:0x007ffbab107698
#limit=255,
#precision=nil,
#scale=nil>
That's sometimes more information than needed. There's a convenience function that maps all these types down to a core set (:integer, :string etc.)
> User.type_for_attribute('email').type
=> :string
You can also get all that data in one call with attribute_types which returns a 'name': type hash.
You can access the types of the columns by doing this:
#script/console
Driver.columns.each {|c| puts c.type}
If you want to get a list of all column types in a particular Model, you could do:
Driver.columns.map(&:type) #gets them all
Driver.columns.map(&:type).uniq #gets the unique ones
In Rails 5 this will give you a list of all field names along with their data types:
Model_Name.attribute_names.each do |k|
  puts "#{k} = #{Model_Name.type_for_attribute(k).type}"
end
Rails 5+ (works with virtual attributes as well):
Model.attribute_types['some_attribute'].type
This snippet will give you all the attributes of a model with the associated database data types in a hash. Just replace Post with your Active Record Model.
Post.attribute_names.map {|n| [n.to_sym,Post.type_for_attribute(n).type]}.to_h
Will return a hash like this.
=> {:id=>:integer, :title=>:string, :body=>:text, :created_at=>:datetime, :updated_at=>:datetime, :topic_id=>:integer, :user_id=>:integer}
Assuming Foobar is your Active Record model, you can also do:
attributes = Foobar.attribute_names.each_with_object({}) do |attribute_name, hash|
hash[attribute_name.to_sym] = Foobar.type_for_attribute(attribute_name).type
end
Works on Rails 4 too
In Rails 4, you would use Model.column_types.
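For example, with a hypothetical Post model:
Post.column_types.keys            # => ["id", "title", "body", ...]
Post.column_types["title"].type   # => :string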