Rails Overall Architecture Best Practices - ruby-on-rails

I'm very interested to see what Rails veterans on Stack Overflow have to say about this.
Basically, as a Rails newbie with no "formal" background in programming, one of my biggest challenges has to do with proper architecture. How do I know what the right, or at least better, ways to build an app are?
I would love to see what resources people recommend.
I know the basic ones - but a lot of these don't give me the high level architecture overview I want:
http://www.amazon.com/Agile-Development-Rails-Third-Edition/dp/1934356166/ref=cm_lmf_tit_7
http://www.amazon.com/Ruby-Programming-Language-David-Flanagan/dp/0596516177/ref=cm_lmf_tit_3
Railscasts
Here's a quick example:
Say I'm building a Rails app that has merchants and shoppers - each set of users has its own authentication, different permissions, etc. Would it be proper to build a single app for this or multiple that communicate through APIs? Is there any added benefit to this multi-app abstraction?
Thanks!

This is not an easy question. The complete answer would largely depend on your project.
If you are a beginner I'd recommend you keep it all in one app; you have enough "separation" by using models wisely. It's hard to imagine a scenario where the complexity introduced by inter-app communication is beneficial.
In your example you should ask yourself whether it's better to use one single parent model for the Merchants and Shoppers or two separate models.
In the former case you can consider STI:
user (main class, defined as class User < ActiveRecord::Base)
merchant (defined as class Merchant < User)
shopper (similarly class Shopper < User)
Google for STI for further details.
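To make STI concrete: it hinges on a single users table with a string type column. A minimal sketch, assuming a Rails 5-style migration and illustrative column names:

# db/migrate/xxx_create_users.rb (hypothetical)
class CreateUsers < ActiveRecord::Migration[5.2]
  def change
    create_table :users do |t|
      t.string :type    # STI discriminator; stores "Merchant" or "Shopper"
      t.string :email
      t.timestamps
    end
  end
end

# Merchant.create(email: "m@example.com")  # row saved with type = "Merchant"
# User.all                                 # returns Merchant and Shopper instances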
Then in your controllers/views you can check permissions quickly, for example:
if user.is_a?(Merchant)
  do_something
else
  do_something_else
end
Similarly the two classes might authenticate with different algorithms. You might also include a "standard" authentication in the base User class and specialize it in the subclasses, if required.
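For instance, a hedged sketch of that idea, assuming the bcrypt gem and a hypothetical password_digest column plus a subscription_active? check on merchants:

require 'bcrypt'

class User < ActiveRecord::Base
  # "Standard" authentication shared by every user type
  def authenticate(password)
    BCrypt::Password.new(password_digest) == password
  end
end

class Merchant < User
  # Merchants layer an extra (illustrative) rule on top of the shared logic
  def authenticate(password)
    super && subscription_active?
  end
end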
Cheers,

Related

What do folks use app/services/ for in Rails applications?

Every now and then I come across this in the Ruby on Rails ecosystem:
class LocalizeUrlService
class Services::UpdateUserRegistrationForOrder
class ProductionOrderEmailService
UserCart::PromotionsService.new(
Shipping::BulkTrackingService.new(bulk_update, current_spree_user)
You can also see an example here
However, I've never seen this in official material such as the "Ruby on Rails Guides", which leads me to believe this is a concept coming from another language/paradigm outside of Rails/OOP.
Where is this paradigm/trend coming from? Is there a tutorial/book
that these folks got influenced by? Are these folks holdouts from the SOA trend of a few years ago?
Is it a good idea to put code in app/services/blah_service.rb?
If yes, what logic/code can be considered "Service" material.
Is there any kind of code that would/wouldn't belong in a service?
What gem/plugin creates the app/services folder? The vanilla Rails app doesn't ship with it at first.
Sidenote: Personally I have an issue with instantiating a service. I feel classes and instantiation are misused by amateur programmers. A class and instantiation are "for a thing", while a service is something that "does",
so mixins/defs/include should be the way to go, I feel.
Service objects are for things that don't fit well in the normal MVC paradigm. They're typically for business logic that would otherwise make your models or controllers too fat. Typically they have no state (that's held in a model) and do things like speak to APIs or other business logic. Service objects let you keep your models thin and focused, and each service object is also thin and focused on doing one thing.
Rails Service Objects: A Comprehensive Guide has examples of using service objects to manage talking to Twitter, or encapsulating complex database transactions which might cross multiple models.
Service Objects in Ruby on Rails…and you shows creating a service object to manage the new user registration process.
The EngineYard blog posted Using Services to Keep Your Rails Controllers Clean and DRY with an example of a service object which does credit card processing.
If you're looking for the origins, Service objects in Rails will help you design clean and maintainable code. Here's how. is from 2014 when they were coming on the scene.
Services have the benefit of concentrating the core logic of the application in separate objects, instead of scattering it around controllers and models.
The common characteristic among all services is their lifecycle:
accept input
perform work
return result
If this sounds an awful lot like what a function does, you're right! They even go so far as to recommend using call as the public method name on the service, just like a Proc. You can think of service objects as a way to name and organize what would otherwise be a big subroutine.
Anatomy of a Rails Service Object addresses the difference between a service object and a concern. It covers the advantages a service object has over modules. It goes into some detail about what makes a good service object including...
Do not store state
Use instance methods, not class methods
There should be very few public methods
Method parameters should be value objects, either to be operated on or needed as input
Methods should return rich result objects and not booleans
Dependent service objects should be accessible via private methods, and created either in the constructor or lazily
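A minimal sketch of a service object that follows those guidelines; the GeocodeAddressService name, its Result object, and the geocoding scenario are invented here for illustration:

# app/services/geocode_address_service.rb (hypothetical example)
class GeocodeAddressService
  # Rich result object instead of a bare boolean
  class Result
    attr_reader :latitude, :longitude, :error

    def initialize(latitude: nil, longitude: nil, error: nil)
      @latitude  = latitude
      @longitude = longitude
      @error     = error
    end

    def success?
      error.nil?
    end
  end

  # One public instance method, named call, no state stored between calls
  def call(address)
    latitude, longitude = lookup(address)
    Result.new(latitude: latitude, longitude: longitude)
  rescue StandardError => e
    Result.new(error: e.message)
  end

  private

  # Dependent work (an external API, or another service object created lazily)
  # stays behind private methods; stubbed out for this sketch.
  def lookup(address)
    [0.0, 0.0]
  end
end

Usage would then be GeocodeAddressService.new.call("1 Main St"), returning a Result the caller can inspect.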
For example, if you have an application which subscribes users to lists, that might be three models: User, List, and Subscription.
class List < ApplicationRecord
  has_many :subscriptions
  has_many :users, through: :subscriptions
end

class User < ApplicationRecord
  has_many :subscriptions
  has_many :lists, through: :subscriptions
end

class Subscription < ApplicationRecord
  belongs_to :user
  belongs_to :list
end
The process of adding and removing users to and from lists is easy enough with the basic create and destroy methods and associations and maybe a few callbacks.
Now your boss wants an elaborate subscription process that does extensive logging, tracks statistics, sends notifications to Slack and Twitter, sends emails, does extensive validations... now what was a simple create and destroy becomes a complex workflow contacting APIs and updating multiple models.
You could write those all as concerns or modules, include all that stuff into these three formerly simple models, and write big Subscription.register and Subscription.remove class methods. Now your subscriptions can Tweet and post to Slack and verify email addresses and perform background checks? Weird. Your models are now bloated with code unrelated to their core functionality.
Instead you can write SubscriptionRegistration and SubscriptionRemove service objects. These can include the ability to Tweet and store statistics and perform background checks and so on (or more likely put that into more service objects). They each have one public method: SubscriptionRegistration.perform(user, list) and SubscriptionRemove.perform(subscription). User, List, and Subscription don't need to know anything about it. Your models stay slim and do one thing. And each of your service objects do one thing.
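A rough sketch of what such a SubscriptionRegistration might look like, with the side effects stubbed out; the internals are assumptions, only the .perform(user, list) signature comes from the description above:

class SubscriptionRegistration
  # Single public entry point: SubscriptionRegistration.perform(user, list)
  def self.perform(user, list)
    new(user, list).perform
  end

  def initialize(user, list)
    @user = user
    @list = list
  end

  def perform
    subscription = Subscription.create!(user: @user, list: @list)
    notify(subscription)
    subscription
  end

  private

  # The Slack/Twitter/email/statistics work lives here (or in further
  # service objects), instead of bloating the three models.
  def notify(subscription)
    # e.g. SlackNotifier.new.call(subscription) -- placeholder, not a real class
  end
end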
As to your specific questions...
Where is this paradigm/trend coming from?
As near as I can tell, it's a consequence of the "fat model / skinny controller" trend; that's how I came to it. While that's a good idea, often your models get TOO fat. Even with modules and concerns, it gets to be too much to cram into a single class. That other business logic which would normally bloat a model or controller goes into service objects.
What gem/plugin creates the app/services folder?
You do. Everything in app/ is autoloaded in Rails 5.
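In other words, creating the directory by hand is all it takes; for example (file and class names made up here):

# app/services/cart_total_service.rb -- no gem or configuration needed
class CartTotalService
  def call(prices)
    prices.sum
  end
end

# CartTotalService.new.call([5, 10]) #=> 15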

Rails Engines scalability issue

I need to find a way to scale my Rails monolith application with the help of Rails engines.
Goal: I have a database connection timeout issue and the monolith has more than 200 models. What we want to do is divide our models into a tree-like structure of engines, so that we can use a separate database for each engine.
Use case: let's say we have engine A as the base engine, which is included in engines B and C. Both B and C live on the same level of the tree.
So I have models segregated into different engines.
Engine A: has all data related to users.
class User
end
Engine B: has all data related to products
class Product
end
Engine C: has all data related to reports.
class Report
end
Now the main issue comes when defining the associations. Earlier we had associations and several other methods which access those associations. For example:
class User
  has_many :products

  def get_title_product
    products.pluck(:title)
  end
end
Now I cannot define this in engine A, as the products table doesn't live there.
Option:
What I know is that I have to reopen the User model inside engine B and define all the association and get_title_product logic related to this domain in engine B itself.
I can't even include engine B in engine A because it would result in a circular dependency.
I don't want to follow the above approach because it will get messy, my application is significantly large, and I don't think it follows Rails best practices.
Thanks in advance.
Your post contains many questions. You're asking about database sharding, architecture with Rails engines, and performance/scalability with the timeout issue.
Performance / Timeout
First, your timeout issue is not related to the number of models as you suggest. The number of models has no impact at all on performance. To find where the performance problem or bottleneck is, you should use a monitoring tool. This one is my favorite: https://www.rorvswild.com (DISCLAIMER: I'm the author ^_^). There are other competitors; use the one you like most.
For the timeouts, maybe you should check your database config. We don't have enough information here to go deeper.
Database sharding
That is not trivial at all, since you cannot JOIN or reference foreign keys when the tables are not in the same database. That is why you have to carefully choose where to shard your database; ideally it's where you have the fewest joins. It's a lot of work and it has an important impact on your code base. A Stack Overflow post is not enough to cover sharding. Fortunately there are a lot of articles and gems to help you with that.
Just be sure you understand that you can split the load across many databases, but it comes at an extra price on your code base.
For the relationships across databases you cannot use the Rails built-in has_many and so on. You have to define the relationships yourself, or use a gem which will help with that. To give you an idea:
class User < ApplicationRecord
  # Hand-rolled "association": query the other database's model directly
  def products
    Product.where(user_id: id)
  end
end
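The reverse direction would look much the same. Note that because these are plain methods rather than Active Record associations, helpers like includes, joins, or eager loading across the two databases are no longer available (sketch, reusing the question's Product/User naming):

class Product < ApplicationRecord
  # Hand-rolled counterpart of belongs_to :user, reaching into the other database
  def user
    User.find_by(id: user_id)
  end
end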
Rails engines
They are great for building reusable pieces of functionality across applications (http://guides.rubyonrails.org/engines.html). It looks like reuse is not your goal, so I'm afraid you're going in the wrong direction.
If you don't want to reopen the class you can use a module:
module HasProducts
  def self.included(model)
    model.has_many(:products)
  end

  def get_title_product
    products.pluck(:title)
  end
end
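Then, inside engine B, the module can be mixed into the user class from engine A. A sketch, assuming an initializer in engine B and the plain User class name used in the question:

# engines/engine_b/config/initializers/user_extensions.rb (hypothetical path)
Rails.application.config.to_prepare do
  User.include(HasProducts)
end

# user.products          #=> products fetched through engine B
# user.get_title_product #=> array of product titles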

Is maintaining multiple "sites" using the same "web app" worth the added architecture and code complexity?

This may not at first seem like a code-related question, but in the end it truly is a very high-level architecture question, with implications for database design and code architecture. So please give it a good moment of thought before judging this as off-topic.
I imagine my situation is not unique in the industry and I would like to learn from others' experience.
I have run a topic based video education website for 8 years. Recently I've been inspired to make one or two other websites that are essentially the same thing, but with a different topic. Everything I'll need for these new sites I already have, including searching, indexing, external content hosting, backend jobs, mailers, etc.
I'm faced with a decision: do I fork the current website for each additional website, make the 5% of alterations required, and set up all of the other services, etc.? Or do I try to basically cram a number of "websites" into the same app, keyed off of the domain name, each with a different face and content (and registration, shopping cart, menus, etc.)?
An example of this issue is Stack Overflow itself. They have many sites "branded" slightly differently. Do they maintain separate apps for each, or do they all run off of one app?
In the first case upgrades and code development will get out of sync and become a nightmare, and in the second, this will add a significant degree of complexity to, well, mostly everything.
Both seem pretty bad. Which is least bad?
ps, it's a ruby on rails app, in case by some magic, some gem exists for this kind of thing that I don't know about.
If the functionality is the same then you only need one app. However, if you realize at some point that the apps' functionality is diverging, then you could add another app to handle the diverging code, keep the common app to handle all the common functionality, and convert to a service-oriented architecture.
Forking and maintaining two sets of identical code is a nightmare that should be avoided.
You're looking at multi tenancy -- the ability for multiple "tenants" to use a single application.
Modern terminology might label this "cloud" software, although the stateless nature of HTTP keeps it from fully living up to that name.
CMS
In terms of your app, you'd benefit from reading up on CMS systems.
Specifically, the likes of Tumblr etc. work with the exact pattern you want -- a single application with...
admin area
front-end
customization (background image, styling etc)
users
--
For Rails, you'd be able to create a single instance of an application, with the following structure:
#config/routes.rb
scope constraints: AccountCheck do
  resources :posts, path: "" #-> url.com/
end

#lib/account_check.rb
module AccountCheck
  def initialize(router)
    @router = router
  end

  def self.matches?(request)
    request.subdomain.present? && Account.exists?(name: request.subdomain)
  end
end
--
#app/models/account.rb
class Account < ActiveRecord::Base
  has_and_belongs_to_many :users
  has_many :posts
end

#app/models/user.rb
class User < ActiveRecord::Base
  has_and_belongs_to_many :accounts
  has_many :posts
end

#app/models/post.rb
class Post < ActiveRecord::Base
  belongs_to :account
  belongs_to :user
end
The above will give you the ability to create a subdomain-centered system, which will display the posts for the current account:
#app/controllers/posts_controller.rb
class PostsController < ApplicationController
  def index
    @account = Account.find_by name: request.subdomain
    @posts = @account.posts
  end
end
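One small addition worth considering (not part of the original answer): expose the current account from ApplicationController so every controller scopes through it:

# app/controllers/application_controller.rb
class ApplicationController < ActionController::Base
  helper_method :current_account

  private

  # Resolve the tenant from the subdomain once per request
  def current_account
    @current_account ||= Account.find_by(name: request.subdomain)
  end
end

# PostsController#index then becomes: @posts = current_account.posts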
Gem
This is a very simple example, but hopefully it shows you how you'd achieve the multi-tenancy aspect.
If you wanted to take this even further, you'd want to look at data scoping around the Account model. At the database level this is really only practical with PostgreSQL schemas, for which there's a gem called apartment to manage them.
This is the best I have so far. There's a good book about this pattern by Ryan Bigg: https://leanpub.com/multi-tenancy-rails
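For reference, a rough sketch of how apartment is typically wired up; treat the configuration keys as assumptions and check the gem's README for the current API:

# config/initializers/apartment.rb
require 'apartment/elevators/subdomain'

Apartment.configure do |config|
  config.excluded_models = ["Account"]                  # stays in the shared/public schema
  config.tenant_names    = -> { Account.pluck(:name) }  # one PostgreSQL schema per account
end

# Switch schemas automatically based on the request subdomain
Rails.application.config.middleware.use Apartment::Elevators::Subdomain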
If you have total control over the "creative direction" for all the sites, maintaining all of them under a single site would be easier.
If you plan to retain a main site, with less frequent updates to the other sites, keeping them separate will have the advantage of reduced coupling - you can safely perform customization for a single site/app without impacting others, and avoid (further) raising the overall code complexity needed to accommodate competing requirements for multiple apps under the same site.

Sharing AR models between Rails applications

I have a problem that I have been trying to solve for a while now. I have 4 different Rails applications using the same database, meaning they need to use the same models and have the same migrations. I initially solved the problem by creating a Rails engine packaged into a gem, which then carries all the models and migrations with it. Now I realize that there are pieces of functionality that only one application needs, but the others do not - like for example the admin application needs to have methods for providing sortable tables for all models - the other applications do not need this functionality at all.
So my idea was to find a way where I can provide the "base" models from the gem, while augmenting these base models in my specific applications to add additional functionality when needed. What I tried first was inheritance:
class User < Base::User
end
This does not work though, because now you have 2 User models in your load path (User and Base::User), and when querying associations it always picks the "closest" class for the associated record class - meaning when you have a Model::Account which belongs_to :user, it will pick Model::User as the association class, not User. I tried reversing the AR type compute method but this only resulted in more problems.
I can technically provide all of my models from the base engine (gem), but the issue is: how do I extend these models in my application? .class_eval feels really dirty, inheritance does not work, and providing base functionality as mixins means the "base" models do not look and feel like models at all. My goal is to cause as little friction as possible for the other developers; I want them to be able to define their models in the gem like they do normally, and then also have an easy way to extend that functionality in other applications.
Has anyone solved this problem before or have any suggestions? Or how do you guys solve this problem in your larger applications? Any help would be appreciated.
This is mentioned in the Rails engines guide, which describes this kind of class modification with the decorator pattern.
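Roughly, the decorator approach keeps the gem's Base::User untouched and reopens it only in the application that needs the extra behaviour. A sketch along the lines of the engines guide; the paths, the admin-only method, and the pre-Zeitwerk require_dependency loading are illustrative:

# app/decorators/models/base/user_decorator.rb (in the admin application)
Base::User.class_eval do
  # Admin-only behaviour layered on top of the shared model
  def sortable_columns
    %w[name email created_at]
  end
end

# config/application.rb (inside the Application class): reload decorators with the app
config.to_prepare do
  Dir.glob(Rails.root.join("app/decorators/**/*_decorator.rb")).each do |decorator|
    require_dependency(decorator)
  end
end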

ActiveRecord relations between models on different engines

I'm building a Rails app that is probably going to have a large number of models. For now let's say that I need to have Users, and each User will have a personal Blog.
I've been developing the User-related part in a separate engine (let's call it AuthEngine) that encapsulates the User model, authentication, etc. At the time this seemed like the best approach.
In parallel, my colleague was developing another engine, to deal with the blog features (let's call it BlogEngine).
Now he needs to access user data which is stored in the other engine. He achieved this by defining:
BlogEngine.user_class = "AuthEngine::User"
This way, he can easily ask for user data, even though it isn't stored in the same engine. The problem comes when we want to define relationships between models.
He can easily say that each blog post belongs to a user
belongs_to :user, class_name: BlogEngine.user_class
But as far as I know he can't specify that each User has many posts, since the User model is in the other engine.
The consequence of this is that he can't do things like @user.posts, but instead has to do Post.find_all_by_user(@user).
Is there a more elegant way to deal with this?
I also considered the possibility that each engine could simply generate the models inside the app, removing the encapsulation, but since the number of models will grow quickly, I think this will make the app more of a mess and less maintainable.
I think you should reopen the user class inside blog_engine in order to define the has_many :posts relation; that should be the appropriate way.
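A sketch of that approach, hooking into to_prepare so the patch survives code reloading; the BlogEngine::Post class name is an assumption, the rest follows the question's naming:

# blog_engine/lib/blog_engine/engine.rb
module BlogEngine
  class Engine < ::Rails::Engine
    config.to_prepare do
      # Reopen the user class configured by the host and add the inverse association
      BlogEngine.user_class.constantize.class_eval do
        has_many :posts, class_name: "BlogEngine::Post"
      end
    end
  end
end

# @user.posts then works alongside the belongs_to already defined on the post side.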
What about having a common models gem/engine with only the relationships as a dependency for your engines? This way you would have access to all relevant relationships in every engine.
