I have started building an application in Ruby and have come across the serializer gem, but I am confused: coming from a Python background, serialization means converting an object into bytes, whereas here we seem to simply declare attributes.
What exactly happens in Ruby with serialization?
class UserSerializer < ActiveModel::Serializer
  attributes :name
end
In Ruby, serialization can generally mean converting any kind of object into a form that can be transmitted or stored. This broader definition is closer to the one used in computer science.
For example, the built-in Marshal module converts objects into a byte stream, which allows them to be stored outside the currently running script and reconstituted later. This is a form of serialization that should be instantly recognizable to a Pythonista (it is roughly the equivalent of pickle).
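For instance, a minimal round trip with Marshal (the hash contents are just an illustration):
# Serialize a Ruby object to a byte string and restore it again
user = { name: "John Doe", roles: [:admin] }
bytes = Marshal.dump(user)     # => a binary String
restored = Marshal.load(bytes) # => {:name=>"John Doe", :roles=>[:admin]}
restored == user               # => true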
In Rails specifically the term serialization is commonly used for two things:
Converting model data into transmission formats such as JSON (or XML back in ye olde times) to be sent across the wire to clients or other servers. Rails has a simple form of serialization built in through the as_json method (see the sketch after this list). This is where gems like ActiveModel::Serializers, jbuilder and the plethora of JSONAPI.org gems come into play.
ActiveRecord::AttributeMethods::Serialization - which is an old hack to store hashes and other types of objects in VARCHAR or TEXT type columns in the database. Rails marshals/unmarshals them as JSON or YAML on the application side which kind of sucks if you want to be able to query the data in a sensible way. While this is largely supplanted by native ARRAY and JSON/JSONB types it does still have a few legit uses such as storing encrypted data.
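As a quick illustration of the first point, as_json turns a model into a plain Ruby hash which render json: then converts to a JSON string (the User model and its attributes are hypothetical here):
user = User.new(name: "John Doe", email: "john@example.com")
user.as_json
# => { "id" => nil, "name" => "John Doe", "email" => "john@example.com" }
# Restrict the output to specific attributes:
user.as_json(only: [:name])
# => { "name" => "John Doe" }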
ActiveModel::Serializers (not to be confused with Rails' built-in ActiveModel::Serialization) is basically the swiss army bulldozer gem of object serialization for Rails. It consists of a pretty simple structure of serializer classes which are blueprints for turning a model into JSON. They are used in your controllers to provide JSON responses.
For example, the serializer above could produce something like:
{
"name": "John Doe"
}
But the actual format depends on the configuration of ActiveModel::Serializers - it can produce tons of different output formats like "flat objects" or JSONAPI.org:
{
  "type": "user",
  "id": "1",
  "attributes": {
    "name": "John Doe"
  }
}
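For context, the serializer is picked up by convention when you render a model from a controller; a minimal sketch (the controller and route are my own assumptions, not from the original):
class UsersController < ApplicationController
  def show
    user = User.find(params[:id])
    # ActiveModel::Serializers looks up UserSerializer by convention
    # and uses it to build the JSON response.
    render json: user
  end
end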
Related
If we have a small table which contains relatively static data, is it possible to have Active Record load this in on startup of the app and never have to hit the database for this data?
Note that, ideally, I would like this data to be join-able from other models which have relationships to it.
An example might be a list of countries with their telephone number prefix - this list is unlikely to change, and if it did it would be changed by an admin. Other tables might have relationships with this (e.g. given a User who has a reference to the country, we might want to look up the country's telephone prefix).
I saw a similar question here, but it's 6 years old and refers to Rails 2, while I am using Rails 5 and maybe something has been introduced since then.
Preferred solutions would be:
Built-in Rails / ActiveRecord functionality to load a table once on startup and, if other records that have relationships with the cached table are subsequently loaded, link them to the cached objects automatically (i.e. manually caching MyModel.all somewhere is not sufficient, as relationships would still be loaded by querying the database).
Maintained library which does the above.
If neither are available, I suppose an alternative method would be to define the static dataset as an in-memory enum/hash or similar, and persist the hash key on records which have a relationship to this data, and define methods on those Models to lookup using the object in the hash using the key persisted in the database. This seems quite manual though...
[EDIT]
One other thing to consider with potential solutions - the manual solution (3) would also require custom controllers and routes for such data to be accessible over an API. Ideally it would be nice to have a solution where such data could be offered up via a RESTful API (read-only, just GET) if desired, using standard Rails mechanisms like scaffolding, without too much manual intervention.
I think you may be discounting the "easy" / "manual" approach too quickly.
Writing the data to a Ruby hash / array isn't that bad an idea.
And if you want to use a CRUD scaffold, why not just use the standard Rails model / controller generator? Is it really so bad to store some static data in the database?
A third option would be to store your data in a file in some serialized format, read it when your app loads, and construct ActiveRecord objects from it. Let me show an example:
data.yml
---
- a: "1"
b: "1"
- a: "2"
b: "2"
This is a YAML file containing an array of hashes; you can construct such a file with:
require 'yaml'

data = [
  { "a" => "1", "b" => "1" },
  { "a" => "2", "b" => "2" }
]

File.open("data.yml", "w") do |f|
  f.write(YAML.dump(data))
end
Then to load the data, you might create a file in config/initializers/ (everything in that directory is run by Rails on boot):
config/initializers/static_data.rb
require 'yaml'
# define a constant that can be used by the rest of the app
StaticData = YAML.load(File.read("data.yml")).map do |object|
  MyObjectClass.new(object)
end
To avoid having to write database migrations for MyObjectClass (when it's not actually being stored in the db) you can use attr_accessor definitions for your attributes:
class MyObjectClass < ActiveRecord::Base
  # say these are your two columns
  attr_accessor :a, :b
end
Just make sure not to run things like save, delete, or update on this model (unless you monkeypatch these methods).
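One possible guard (an assumption on my part, not part of the original suggestion) is to mark the model read-only, so ActiveRecord raises if anything tries to persist it:
class MyObjectClass < ActiveRecord::Base
  attr_accessor :a, :b

  # Any attempt to save or update raises ActiveRecord::ReadOnlyRecord.
  def readonly?
    true
  end
end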
If you want to have REST / CRUD endpoints, you'd need to write them from scratch because the way to change data is different now.
You'd basically need to do any update as a three-step process (sketched after this list):
load the data from YAML into a Ruby object list
change the Ruby object list
serialize everything to YAML and save it.
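A rough sketch of what such an update could look like (using the file and attribute names from the example above):
require 'yaml'

# 1. Load the data from YAML into a Ruby object list
data = YAML.load(File.read("data.yml"))

# 2. Change the Ruby object list
data << { "a" => "3", "b" => "3" }

# 3. Serialize everything back to YAML and save it
File.write("data.yml", YAML.dump(data))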
So you can see you're not really doing incremental updates here. You could use JSON instead of YAML and you'd have the same problem. With Ruby's built-in storage system PStore you would be able to update objects on an individual basis, but using SQL for a production web app is a much better idea and will honestly make things simpler.
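For completeness, a minimal PStore sketch (the file name and keys are made up for illustration):
require 'pstore'

store = PStore.new("static_data.pstore")

# Writes happen inside a transaction and can target individual keys.
store.transaction do
  store[:countries] = [{ "name" => "France", "prefix" => "+33" }]
end

# Pass true for a read-only transaction.
store.transaction(true) do
  store[:countries]
end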
Moving beyond these "serialized data" options, there are key-value storage servers that keep data in memory, such as Memcached and Redis.
But to go back to my earlier point, unless you have a good reason not to use SQL you're only making things more difficult.
It sounds like FrozenRecord would be a good match for what you are looking for.
Active Record-like interface for read only access to static data files of reasonable size.
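If I remember the gem's conventions correctly (please double-check against its README, this is only a sketch), a minimal setup looks roughly like this:
# config/initializers/frozen_record.rb
FrozenRecord::Base.base_path = Rails.root.join("config/data")

# app/models/country.rb
class Country < FrozenRecord::Base
end

# config/data/countries.yml
# - id: 1
#   name: "France"
#   phone_prefix: "+33"

Country.find(1).phone_prefix # => "+33"
The records stay in memory, no database table is involved, and you still get an Active Record-like query interface.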
I have an SPA and a few models. I want to create forms for a REST API, but I don't want to duplicate the validation data in JS and Ruby.
All I want is to get the validation data (required, read-only, type, default, choices, name, label, help_text) from Rails and render the form from it.
The problem is I don't see any solution for serializing the model into JSON this way. With Python's Django REST Framework, I can make an OPTIONS request and it will give me full information about model fields, parsers, methods, etc. Is there any similar solution for Rails?
There is validation reflection available in Rails 3 and Rails 4: MyModel.validators (railscast). This will give you an array containing all validators with options, e.g.:
[
[0] #<ActiveRecord::Validations::PresenceValidator:0x007fe542431b40 @attributes=[:name], @options={}>,
[1] #<UrlValidator:0x007fe542431050 @attributes=[:url], @options={:allow_blank=>true}>
]
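Building on that, a rough sketch of how you might expose the reflection data as JSON for the frontend (the controller, route and output shape are my own assumptions):
class ModelMetaController < ApplicationController
  def show
    # Map each attribute to the kinds and options of its validators,
    # e.g. { "name" => [{ "kind" => "presence", "options" => {} }] }
    meta = MyModel.validators.each_with_object({}) do |validator, acc|
      next unless validator.respond_to?(:attributes)
      validator.attributes.each do |attribute|
        (acc[attribute.to_s] ||= []) << {
          "kind" => validator.kind.to_s,
          "options" => validator.options
        }
      end
    end
    render json: meta
  end
end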
If you want to get validation errors, one option is to pass @object.errors.messages through the JSON response.
Additionally you may use AMS (ActiveModel::Serializers) to serialize model data into JSON (railscast).
Hope it helps.
I have a Ruby on Rails form that takes the input and saves it as model attributes. One of the attributes, though, holds JSON data. I would like to be able to take the data entered into multiple fields and save it as a single JSON object in my model's attribute. Is there a way I could do this?
If it helps, I could also make a Hash and convert it to JSON. Basically I just want to combine multiple input fields into one, then hand it off from there.
Thanks!
There are multiple things to consider here.
The first problem is to get your data out of the HTML form. If you use the standard Rails way of naming your form inputs, it's quite simple.
<input name="my_fields[value1]">
<input name="my_fields[value2]">
<input name="my_fields[sub1][value1]">
<input name="my_fields[sub1][value2]">
If you name them like that they can be accessed "en bloc" using the params hash via params[:my_fields], which gives you another hash containing your data.
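In the controller you would typically whitelist them with strong parameters before persisting anything; a short sketch using the field names from the inputs above:
# Returns a params hash like
# { "value1" => "...", "value2" => "...", "sub1" => { "value1" => "...", "value2" => "..." } }
def my_fields_params
  params.require(:my_fields).permit(:value1, :value2, sub1: [:value1, :value2])
end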
Then you have to choose which way to save this data in your model. There are several options:
1. Use a string attribute
Just use a string or text column and assign a JSON string:
@my_model.my_data = params[:my_fields].to_json
Pro: A very simple solution.
Contra: SQL queries virtually impossible. Processing with Rails requires manual parsing of the data string.
2. Use a serialized hash
Use a string or text column and declare it as serializable on your model
serialize :my_data, Hash
Then you can use this column as if it were a simple hash and Rails will handle the reading and writing operations for you.
@my_model.my_data = params[:my_fields]
Pro: Still a simple solution. No messing with JSON strings. Processing with Rails much easier.
Contra: SQL queries virtually impossible. A call to to_json is necessary if you need a real JSON string.
3. Use specialized JSON database types
In case you need to be able to query the database using SQL the solutions above won't work. You have to use specialized types for that.
Many DBMSs provide structured data types in the form of XML or even JSON columns (PostgreSQL, for example).
Pro: Database queries are possible.
Contra: Custom parsing and serialization necessary, migrations ditto. This solution might be over-engineered.
Update: Since Rails 5, JSON column types are supported. If you are using PostgreSQL or MySQL just use t.json (or t.jsonb on PostgreSQL) in your migration and use that attribute like a regular hash.
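A short sketch of that approach (the table, column and model names are hypothetical; assumes PostgreSQL):
# Migration adding a jsonb column
class AddMyDataToMyModels < ActiveRecord::Migration[5.0]
  def change
    add_column :my_models, :my_data, :jsonb, default: {}
  end
end

# Rails maps the column to a Hash automatically:
record = MyModel.create!(my_data: { "value1" => "foo", "value2" => "bar" })
record.my_data["value1"] # => "foo"

# And the column can be queried in SQL (PostgreSQL jsonb operators):
MyModel.where("my_data ->> 'value1' = ?", "foo")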
You can save all the fields as text in JSON format and parse the field when you need it.
Ex:
a = JSON.parse('{"k1":"val1"}')
a['k1'] => "val1"
You are probably looking for a before_save callback that takes all your model attributes and creates a JSON string using the .to_json method.
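A minimal sketch of such a callback (the model, field and column names are assumptions, not from the question):
class MyModel < ApplicationRecord
  # Virtual attributes backed by the individual form fields.
  attr_accessor :value1, :value2

  before_save :combine_fields_into_json

  private

  # Collapses the separate form fields into the single JSON column.
  def combine_fields_into_json
    self.my_data = { value1: value1, value2: value2 }.to_json
  end
end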
You should probably look into the new JSONB format of Postgres. I think this gets you all the PROS, and none of the CONS:
http://robertbeene.com/rails-4-2-and-postgresql-9-4/
I want to send very vague and dynamic JSON from a client to the server.
For the backend I'm using Rails + Mongoid.
What I know from Mongoid is that I have to create a model class corresponding to my collection structure so that I can call it from my controller to store data in it. This way reminds me of a traditional RDBMS (I still can't figure out why people are happy with it!!!!)
I don't want to do that. I want to send JSON (whose structure I don't know in advance) to my server and have Mongoid store the JSON as-is. In other words, I don't have any predefined structure for storing it and I don't want to have one.
Is there a way to do that in Rails + Mongoid?
Generally Mongoid expects you to specify the fields of your model because there is no underlying schema to infer those fields from in the way that ActiveRecord does. But if you just want to store an arbitrary JSON object you can parse it into a Ruby Hash and store it using Mongoid's hash datatype.
field :untyped_data, type: Hash
There are a few caveats about key names, see http://mongoid.org/en/mongoid/docs/documents.html#fields
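To make that concrete, a minimal sketch (the Payload model name and the controller line are my own illustration, not from the answer):
class Payload
  include Mongoid::Document

  # Stores whatever hash is assigned, with no predefined structure.
  field :untyped_data, type: Hash
end

# In a controller action, store the request body as-is:
Payload.create!(untyped_data: JSON.parse(request.body.read))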