Overwriting data to the model - ruby-on-rails

I am reading a table from Oracle and writing the data to a model with a cron job every midnight, so that the data is cached locally and reads are faster. But I am using the create method to write the data into the model, which creates new entries, so I get duplicate data in my model every time. Is there any method on the model to update or overwrite the data?
Model.all.each do |model|
  p = Model.create(model.attributes)
  p.save
end

Why do you want to save the models? It won't help caching. You should instead profile your application and implement caching techniques like memcached.
But anyway, here's code to just read your table:
Model.all.each { |model| Model.new(model.attributes) }
Update according to your clarification: copying from one DB to another without duplication could be done like so (given that Model1 accesses the source DB and Model2 accesses the destination DB):
Model1.all.each do |model1|
  model2 = Model2.find_or_initialize_by_id(model1.id)
  model2.attributes = model1.attributes
  model2.save!
end
This won't handle deletions, however. You may still want to look into memcached and maybe database sharding. It may also be worth thinking about master/slave replication at the database level, where your "slow" database is the master (all database writes go to this) and the slave is the "fast" database (all reads go to this). Web applications are usually read bound, so it will help in your case. Rails has support for this; see NewRelic's Scaling Rails Screencast Series about scaling your databases.
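If caching really is the goal, a minimal sketch using Rails' low-level cache might look like this (the Country model, cache key, and expiry are hypothetical; it assumes a cache store such as memcached is configured):
# cache the hot query instead of copying rows into a second table
countries = Rails.cache.fetch("countries/all", expires_in: 12.hours) do
  Country.all.to_a
end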

Use create_or_update instead of create.

Related

Rails 5 Active Record - is it possible to keep a table in memory?

If we have a small table which contains relatively static data, is it possible to have Active Record load this in on startup of the app and never have to hit the database for this data?
Note that, ideally, I would like this data to be join-able from other models which have relationships to it.
An example might be a list of countries with their telephone number prefixes - this list is unlikely to change, and if it did it would be changed by an admin. Other tables might have relationships with this (e.g. given a User who has a reference to a country, we might want to look up that country's telephone prefix).
I saw a similar question here, but it's 6 years old and refers to Rails 2, while I am using Rails 5 and maybe something has been introduced since then.
Preferred solutions would be:
Built-in Rails / ActiveRecord functionality to load a table once on startup, so that when other records with relationships to the cached table are subsequently loaded, they link to the cached objects automatically (i.e. manually caching MyModel.all somewhere is not sufficient, as relationships would still be loaded by querying the database).
Maintained library which does the above.
If neither is available, I suppose an alternative would be to define the static dataset as an in-memory enum/hash or similar, persist the hash key on records which have a relationship to this data, and define methods on those models that look up the object in the hash using the persisted key. This seems quite manual though...
[EDIT]
One other thing to consider with potential solutions - the manual solution (3) would also require custom controllers and routes for such data to be accessible over an API. Ideally it would be nice to have a solution where such data could be offered up via a RESTful API (read only - just GET) if desired using standard rails mechanisms like Scaffolding without too much manual intervention.
I think you may be discounting the "easy" / "manual" approach too quickly.
Writing the data to a ruby hash / array isn't that bad an idea.
And if you want to use a CRUD scaffold, why not just use the standard Rails model / controller generator? Is it really so bad to store some static data in the database?
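If you go that route, a single generator run gives you the table, model, controller, routes, and views in one shot (the Country model and its columns here are just an illustration):
rails generate scaffold Country name:string phone_prefix:string
rails db:migrate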
A third option would be to store your data to a file in some serialized format and then when your app loads read this and construct ActiveRecord objects. Let me show an example:
data.yml
---
- a: "1"
  b: "1"
- a: "2"
  b: "2"
This is a YAML file containing an array of hashes; you can construct such a file with:
require 'yaml'

File.open("data.yml", "w") do |f|
  data = [
    { "a" => "1", "b" => "1" },
    { "a" => "2", "b" => "2" }
  ]
  f.write(YAML.dump(data))
end
Then to load the data, you might create a file in config/initializers/ (everything in this directory is loaded by Rails at boot):
config/initializers/static_data.rb
require 'yaml'

# define a constant that can be used by the rest of the app
StaticData = YAML.load(File.read("data.yml")).map do |object|
  MyObjectClass.new(object)
end
To avoid having to write database migrations for MyObjectClass (when it's not actually being stored in the db) you can use attr_accessor definitions for your attributes:
class MyObjectClass < ActiveRecord::Base
  # say these are your two columns
  attr_accessor :a, :b
end
Just make sure not to call methods like save, delete, or update on this model (unless you monkeypatch them).
If you want to have REST / CRUD endpoints, you'd need to write them from scratch because the way to change data is different now.
You'd basically need to do any update in a 3 step process:
load the data from YAML into a Ruby object list
change the Ruby object list
serialize everything to YAML and save it.
So you can see you're not really doing incremental updates here. You could use JSON instead of YAML and you'd have the same problem. With Ruby's built-in storage system PStore you would be able to update objects individually, but using SQL for a production web app is a much better idea and will honestly keep things simpler.
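For reference, a minimal PStore sketch (the file name and keys are hypothetical):
require 'pstore'

store = PStore.new("static_data.pstore")

# write or update a single record in its own transaction
store.transaction do
  store["country_1"] = { "name" => "France", "prefix" => "+33" }
end

# read it back in a read-only transaction
store.transaction(true) do
  puts store["country_1"].inspect
end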
Moving beyond these "serialized data" options, there are key-value storage servers that keep data in memory, such as Memcached and Redis.
But to go back to my earlier point, unless you have a good reason not to use SQL you're only making things more difficult.
It sounds like FrozenRecord would be a good match for what you are looking for.
Active Record-like interface for read only access to static data files of reasonable size.
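A rough sketch of how that might look, based on FrozenRecord's README (file locations and the model name are assumptions):
# Gemfile
gem "frozen_record"

# app/models/country.rb
class Country < FrozenRecord::Base
  # records are read from a static YAML file (config/data/countries.yml) instead of the database
  self.base_path = Rails.root.join("config/data")
end

# config/data/countries.yml would contain entries like:
# - id: 1
#   name: France
#   phone_prefix: "+33"

Country.find(1)
Country.where(name: "France")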

Rails - Large database queries

Let's say I need to implement a search algorithm for a product catalog database. This would include multiple joins across the products table, manufacturers table, inventory table, etc. etc.
In .NET / MSSQL, I would isolate such logic in a DB stored procedure, then write a wrapper method in my data access layer of my .NET app to simply call this stored procedure.
How does something like this work in RoR? From my basic understanding, RoR uses its ORM by default. Does this mean I have to move my search logic into the application layer and write it using its ORM? The SQL stored proc is pretty intense... For performance, it needs to be in the stored procedure.
How does this work in RoR?
Edit: From the first two responses, I gather that ActiveRecord is the way to do things in Ruby. Does this mean that applications that require large complex queries with lots of joins, filtering and even dynamic SQL can (should) be re-written using ActiveRecord classes?
Thanks!
While it is possible to run raw SQL statements in Rails, using the execute method on a connection object, by doing so you will forfeit all the benefits of ActiveRecord. If you still want to go down this path, you can use it like so:
ActiveRecord::Base.connection.execute("call stored_procedure_name")
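If you need the rows back rather than just firing the statement, two hedged variations (the procedure and model names are placeholders; whether a CALL hands rows back this way depends on your database adapter):
# raw rows as an ActiveRecord::Result (to_a gives an array of hashes)
result = ActiveRecord::Base.connection.exec_query("call stored_procedure_name")
result.to_a.each { |row| puts row.inspect }

# or hydrate model instances from arbitrary SQL with bound parameters
products = Product.find_by_sql(["SELECT * FROM products WHERE price > ?", 100])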
Another option to explore might be to create a "query object" to encapsulate your query logic. Inside, you could still use ActiveRecord query methods. ActiveRecord has become fairly proficient in optimizing your SQL queries, and there is still some manual tweaking you could do.
Below is a simple scaffold for such an object:
# app/queries/search_products.rb
class SearchProducts
  def initialize(params)
    @params = params
  end

  def call
    Product.where(...) # Plus additional search logic based on @params
  end
end
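Calling it from a controller might then look like this (search_params here is a hypothetical strong-parameters helper):
@products = SearchProducts.new(search_params).call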
The third option would be to go with something like elasticsearch-rails or sunspot. This will require some additional setup time and added complexity, but might pay off down the line, if your search requirements change.
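As a rough illustration of the elasticsearch-rails route, based on the gem's README (treat the details as assumptions and check the current docs):
class Product < ActiveRecord::Base
  include Elasticsearch::Model
  include Elasticsearch::Model::Callbacks # keeps the index in sync on save/destroy
end

# full-text search handled by Elasticsearch instead of SQL joins
Product.search("red widget").records.to_a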
Stored procedures are one way to make an app faster in some cases, but they carry a real cost in developer time and debugging effort. Rails is built around the ActiveRecord ORM, so pushing logic into stored procedures means much of what ActiveRecord offers goes unused.
There are some write-ups on this for Rails:
stored-procedures-in-ruby-on-rails and using stored procedure

rails and multi dbs, using establish_connection on the fly to route to the correct database?

I want to structure my database in such a way that certain tables (that don't have any relationships with other tables, so no joins required) have to be put on separate MySQL databases.
I know each model has an establish_connection method.
What I want to do:
I will fetch 10 rows from a particular model, based on a clientID.
The clientID will determine which database this model will be fetched from.
I want to have this database routing logic baked into the model logic somehow.
Is this possible?
You can point individual models to different databases using establish_connection in your models. See here for examples.
If you want a single model to access multiple databases based on an attribute, you probably need to use database sharding, for example, with DataFabric or ShardTheLove.
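A minimal sketch of the per-model approach (the clients_db entry and model names are hypothetical):
# config/database.yml
# clients_db:
#   adapter: mysql2
#   database: clients
#   host: clients-db-host
#   username: app
#   password: secret

class ClientRecord < ActiveRecord::Base
  self.abstract_class = true
  establish_connection :clients_db
end

# inherits the alternate connection instead of the default one
class ClientStat < ClientRecord
end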
Don't know if this can help you:
https://github.com/brianmario/mysql2
This allows you to open connections and run queries like this:
client = Mysql2::Client.new(:host => "localhost", :username => "root")
results = client.query("SELECT * FROM users WHERE group='githubbers'")
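By default each row comes back as a plain Ruby hash keyed by column name, so you can iterate the result directly:
results.each do |row|
  puts row.inspect # e.g. {"id" => 1, "login" => "octocat"}, depending on your columns
end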
Looks like what you need is 'Sharding' support
Unfortunately, ActiveRecord doesn't have this built in. There are a bunch of gems that hack in sharding support.
The gem 'octopus' seems promising. See https://github.com/tchandy/octopus
Note: I haven't used octopus myself
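From its README, usage looks roughly like this (shard names are hypothetical; double-check against the current octopus documentation):
# config/shards.yml maps shard names to connection settings

# run a block of statements against one shard
Octopus.using(:shard_one) do
  User.create(name: "example")
end

# or scope a single query to a shard
User.using(:shard_one).where(active: true)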
If you are willing to go one step further, I recommend Sequel ( http://sequel.rubyforge.org/ ) which is a stable, feature-rich ORM that I have used before.
The documentation is excellent. What you need is http://sequel.rubyforge.org/rdoc/files/doc/sharding_rdoc.html
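The gist of Sequel's sharding support, per that doc (the connection URL and shard names are made up here):
require 'sequel'

# one logical DB with a default server plus named shards
DB = Sequel.connect('mysql2://default-host/app',
  servers: {
    shard_a: { host: 'shard-a-host' },
    shard_b: { host: 'shard-b-host' }
  })

# route a query to a specific shard
DB[:users].server(:shard_a).where(client_id: 42).all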
It might take some time to get used to Sequel if you are accustomed to AR 2.x. There is a nice migration guide; see http://sequel.rubyforge.org/rdoc/files/doc/active_record_rdoc.html
Hoping this helps

Rails: Accessing a database not meant for Rails?

I have a standard rails application, that uses a mysql database through Active Record, with data loaded through a separate parsing process from a rather large XML file.
This was all well and good, but now I need to load data from an Oracle database, rather than the XML file.
I have no control how the database looks, and only really need a fraction of the data it contains (maybe one or two columns out of a few tables). As such, what I really want to do is make a call to the database, get data back, and put the data in the appropriate locations in my existing, Rails friendly mysql database.
How would I go about doing this? I've heard you can (on a model-by-model basis) specify different databases for Rails models to use, but that sounds like they use them in their entirety (that is, the database is Rails friendly). Can I make direct Oracle calls? Is there a process that makes this easier? Can ActiveRecord itself handle this?
A toy example:
If I need to know color, price, and location for an Object, then normally I would parse a huge XML file to get this information. Now, with Oracle, color, price, and location are all in different tables, indexed by some ID (there isn't actually an "Object" table). I want to pull all this information together into my Rails model.
Edit: Sounds like what I'd heard about was ActiveRecord's "establish_connection" method...and it does indeed seem to assume one model is mapped to one table in the target database, which isn't true in my case.
Edit Edit: Ah, looks like I might be wrong there. "establish_connection" might handle my situation just fine (just gotta get ORACLE working in the first place, and I'll know for sure... If anyone can help, the question is here)
You can create a connection to Oracle directly and then have ActiveRecord execute a raw SQL statement to query your tables (plural). Off the top of my head, something like this:
class OracleModel < ActiveRecord::Base
  establish_connection(:oracle_development)

  def self.get_objects
    self.find_by_sql("SELECT...")
  end
end
With this model you can do OracleModel.get_objects which will return a set of records whereby the columns specified in the SELECT SQL statement are attributes of each OracleModel. Obviously you can probably come up with a more meaningful model name than I have!
Create an entry named :oracle_development in your config/database.yml file with your Oracle database connection details.
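Such an entry might look roughly like this (the adapter name and credentials are assumptions; it depends on which Oracle adapter gem you install, e.g. activerecord-oracle_enhanced-adapter):
# config/database.yml
oracle_development:
  adapter: oracle_enhanced
  host: oracle-host.example.com
  database: XE
  username: readonly_user
  password: secret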
This may not be exactly what you are looking for, but it seems to cover your situation pretty well: http://pullmonkey.com/2008/4/21/ruby-on-rails-multiple-database-connections/
It looks like you can make an arbitrarily-named database configuration in the database.yml file, and then have certain models connect to it like so:
class SomeModel < ActiveRecord::Base
  establish_connection :arbitrary_database

  # other stuff for your model
end
So, the solution would be to make ActiveRecord models for just the tables you want data from in this other database. Then, if you really want to get into some SQL, use ActiveRecord::Base.connection.execute(sql). If you need the results as actual ActiveRecord objects, do SomeModel.find_by_sql(sql).
Hope this helps!
I don't have enough points to edit your question, but it sounds like what you really need is to have another "connection pool" available for the second DB -- I don't think Oracle itself will be a problem.
Then, you need to use these alternate connections to "simply" execute a custom query within the appropriate controller method.
If you only need to pull data from your Oracle database, and if you have any ability to add objects to a schema that can see the data you require...
I would simplify things by creating a view on the Oracle table that projects the data you require in a nice friendly shape for ActiveRecord.
This would mean maintaining code to two layers of the application, but I think the gain in clarity on the client-side would outweigh the cost.
You could also use the CREATE OR REPLACE VIEW Object AS SELECT tab1.*, tab2.* FROM tab1, tab2 syntax so the view returns every column in each table.
If you need to Insert or Update changes to your Rails model, then you need to read up on the restrictions for doing Updates through a view.
(Also, you may need to search on getting Oracle to work with Rails as you will potentially need to install the Oracle client software and additional Ruby modules).
Are you talking about a one-time data conversion or some permanent data exchange between your application and the Oracle database? Either way, I think you shouldn't involve Rails in this. You could just run a SQL query against the Oracle database, extract the data, and then insert it into the MySQL database.
