I'm thinking about my options here:

1. Implement it with after_create/after_update hooks on the ActiveRecord models.
2. Use ActiveSupport::Notifications to decouple the activity feed objects from the models.
3. Use observers.

I wasn't able to find much information about the second approach, and I imagine the third one is much like the first. Am I using ActiveSupport::Notifications wrong here? If so, why?
Found this post about using ActiveSupport::Notifications to implement email notifications. A similar approach might work for activity streams:
http://alma-connect.github.io/techblog/2014/03/rails-pub-sub.html
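For illustration, here's a minimal sketch of the pub/sub idea applied to an activity feed (the Activity model and the event name are hypothetical, not from the post):

    # The model only announces that something happened.
    class Comment < ActiveRecord::Base
      after_create :publish_created_event

      private

      def publish_created_event
        ActiveSupport::Notifications.instrument("comment.created", comment_id: id)
      end
    end

    # A subscriber, e.g. in config/initializers/activity_subscribers.rb,
    # builds the feed entry, keeping that logic out of the model.
    ActiveSupport::Notifications.subscribe("comment.created") do |_name, _start, _finish, _id, payload|
      Activity.create!(subject_type: "Comment",
                       subject_id:   payload[:comment_id],
                       verb:         "created")
    end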
Just looking into vote_fu for a public page where people can vote on content. I am interested in implementing this, but there will be no model to act_as_voter, only a model to act_as_votable. Is this doable? Anyone have experience with this?
Many thanks as always.
Ben
I'd not recommend doing that. The gem/plugin is built around the idea of a voter model. You might be able to tweak your implementation by creating your own migrations and overriding the included methods so you can still use vote_fu, but IMO that would hardly be worth it.
For an anonymous voting process in its simplest form, you'd be a lot faster rolling your own custom version.
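As a rough idea of what "rolling your own" could look like (model and column names are only illustrative):

    # A vote belongs to a votable record and an anonymous visitor token.
    class Vote < ActiveRecord::Base
      belongs_to :votable, polymorphic: true
      validates :voter_token, uniqueness: { scope: [:votable_type, :votable_id] }
    end

    class Article < ActiveRecord::Base
      has_many :votes, as: :votable
    end

    # In a controller action, derive the token from the request, e.g.:
    # token = Digest::SHA1.hexdigest(request.remote_ip)
    # Vote.create(votable: @article, voter_token: token)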
I'm new to rails, so be nice.
I'm building a "rolodex" type application, and this question is about the best way to handle creating an entity along with several related entities at the same time.
For (a contrived) example:
My application will have a Person model, which has_one Contact_Info model. On the create.html.erb page for Person, it makes sense for the user of my application to create the person and the contact_info at the same time.
It doesn't seem right to include details for creating a contact directly in the create view/controller for person. What's the rails way to handle this?
Using nested attributes is the most common way to do this.
The actual documentation is here.
You want to use "Nested Forms". There is a great example of them in this blog post.
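A minimal sketch of what that looks like, assuming Person has_one ContactInfo:

    class Person < ActiveRecord::Base
      has_one :contact_info
      # Lets params like person[contact_info_attributes][phone] build the child record.
      accepts_nested_attributes_for :contact_info
    end

    # In the new action, build the child so fields_for has something to render:
    # @person = Person.new
    # @person.build_contact_info

    # In the form:
    # <%= form_for @person do |f| %>
    #   <%= f.text_field :name %>
    #   <%= f.fields_for :contact_info do |c| %>
    #     <%= c.text_field :phone %>
    #   <% end %>
    #   <%= f.submit %>
    # <% end %>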
I'm also a noob, but I had a similar issue with an app. I was using a tutor at the time, and he basically said it was a good example of Rails being opinionated. It sounds like you want to run the create action for two different models at the same time, which may be possible but probably very hard. I'd suggest considering whether your data model could be modified, or finding a way to make an acceptable user flow while collecting the data in separate forms.
Update: while writing this, the technical answer came in. Keep in mind, it's perfectly okay to take the easy route if doing so helps you get the app out the door, especially while you're still new.
I notice it's used primarily for sending emails. Let's say I want to send an email after every comment is created.
Is using Observers really necessary when you could just place the Mailer.deliver_email(user) in your comments_controller.rb's create action instead?
For proper programming practices, yes. The observers decouple the code and make sure it stays maintainable.
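For example (CommentMailer and its method are illustrative names, not from the question):

    # app/models/comment_observer.rb
    class CommentObserver < ActiveRecord::Observer
      def after_create(comment)
        # The delivery lives here instead of in the controller's create action.
        CommentMailer.new_comment_email(comment).deliver
      end
    end

    # config/application.rb
    # config.active_record.observers = :comment_observer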
In Rails, the closest I've seen to Django Signals are Observers. The problem with them is that they're restricted to triggering callbacks on hardcoded events related to a model's lifecycle.
Django signals can be created anywhere, triggered anywhere and handled anywhere. The model lifecycle callbacks are just regular signals that happen to come built-in and that are triggered by the ORM.
Does anyone know of a similarly general solution for Rails? It could be some generic Ruby library, not tied to Rails, which would be even better.
Edit: Observer is the closest thing, but it's not what I'm looking for. It's a one-to-many solution. Anyone can listen, but only the originating object can post. I'd like something where you declare a signal, and anyone can trigger it as well as handle it. Also, I don't like the fact that the Ruby Observer dictates that the handler have an #update method. I'd like to be able to pass any method reference with the appropriate signature.
I could use the Ruby Observer to implement my own such broker, but I'm trying to learn if someone already did it.
I think a closer equivalent than Rails' Observer is the standard Ruby Observable module. It lets you add a list of observers to an object and the object can then send notifications to the observers when it changes.
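A minimal sketch of the standard library module:

    require 'observer'

    class CommentPublisher
      include Observable

      def publish(comment)
        changed                    # mark the object as changed...
        notify_observers(comment)  # ...then call each registered observer
      end
    end

    class EmailNotifier
      def update(comment)
        puts "would send an email about #{comment}"
      end
    end

    publisher = CommentPublisher.new
    publisher.add_observer(EmailNotifier.new)
    publisher.publish("a new comment")

Note that add_observer takes an optional second argument naming the method to call, so the handler isn't strictly forced to implement #update.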
What about the 'wisper' gem? https://github.com/krisleech/wisper
Wisper is a Ruby library for decoupling and managing the dependencies of your Ruby objects using Pub/Sub. It is commonly used as an alternative to ActiveRecord callbacks and Observers to reduce coupling between data and domain layers.
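A rough sketch of how it's typically used (class names are illustrative; see the README for the current API):

    class CreateComment
      include Wisper::Publisher

      def call(attributes)
        comment = Comment.create(attributes)
        broadcast(:comment_created, comment)
      end
    end

    class ActivityListener
      def comment_created(comment)
        # react to the event, e.g. build a feed entry
      end
    end

    service = CreateComment.new
    service.subscribe(ActivityListener.new)
    service.call(body: "hello")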
Perhaps acts_as_state_machine will help. Most of this functionality has recently been baked into Rails edge.
I just implemented a gem with that. https://github.com/pkoch/django_signal/
Ruby gem 'watchable' is the most appropriate choice
https://github.com/jbarnette/watchable
It has a syntax that is very similar to Django's (and other frameworks', like Qt and many others).
Sending an email is usually triggered by an action on a model, but the email itself is a view-layer operation. I'm looking for how you think about what question(s) to ask yourself to determine where to put the action mailer method call.
I've seen/used them:
In a model method - bad coupling of related but separate concerns?
In a callback in the model (such as after_save) - best separation as far as I can tell with my current level of knowledge.
In the controller action - just feels wrong, but are there situations were this would be the smartest way to structure the code?
If I want to know how to program I need to think like a programmer, so learning how you go about thinking through particular programming solutions is worth months of coding on my own in isolation. Thank you!
Late answer, but I want to reason through the subject:
Usually, in a web app, you want to send emails either as a direct reaction to a client action, or as a background task in the case of newsletter/notification-style mail.
The model is basically a data-storage mapper. Its logic should encapsulate data handling and communication with the data store, so inserting logic which does not relate to that is tricky, and in most cases wrong.
Take this example: a user registers an account and should receive a confirmation email. One could say the confirmation email is a direct effect of the creation of a new account. Now, instead of doing it through the web app, try to create a user in the console. Triggering a callback in that case sounds wrong, right? So, the callback option is scratched.
Should we still write the method in the model? Well, if it's a direct effect of a user action/input, then it should stay in that workflow. I would write it in the controller, directly after the user was successfully created. Replicating this logic in the model only to be called from the controller anyway adds unnecessary modularity and makes an Active Record model depend on Action Mailer. Consider sharing the model over many apps, some of which don't want Action Mailer at all.
For the stated reasons, I'm of the opinion that mailer calls should go where they make sense, and usually the model is not that place. Feel free to give me examples where it does make sense.
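In code, the controller placement described above might look like this (UserMailer and confirmation_email are illustrative names):

    class UsersController < ApplicationController
      def create
        @user = User.new(user_params)
        if @user.save
          # A direct effect of the user's action, so the call lives here,
          # not in an after_create callback on the model.
          UserMailer.confirmation_email(@user).deliver
          redirect_to @user
        else
          render :new
        end
      end
    end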
Well, depends.
I've used all of those options and your point about 'why should I put this where?' is good.
If it's something I want to happen every time a model is updated in a certain way, then I put it in the model. Maybe even in a callback in the model.
Sometimes you're just firing off a report; there's no updating of anything. In that case, I've normally got a resource with an index action that sends the report.
If the mailer isn't really related to the model that's being changed, I could see putting it in a callback. I don't do that very often. I'd be more likely to still encapsulate it in the model. I've done it, just not very often.
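For the report case, that resource might be as simple as (names are illustrative):

    class ReportsController < ApplicationController
      def index
        # Nothing is updated; the action only triggers the mail-out.
        ReportMailer.weekly_summary(current_user).deliver
        redirect_to root_path, notice: "Report sent"
      end
    end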
I'm aware it's been a while but best practices never die, right? :)
Email is by definition asynchronous communication (except for confirmation emails, though even for those it's good practice to allow a delay before confirmation is required).
Hence, in my opinion, the most logical way to send it is:
in a background job (using Sidekiq or delayed_job); see the sketch after this list
in a callback method: "hey, this action is successfully done, maybe we can tell the world now?"
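A sketch of the background option, assuming delayed_job's delay helper is available (with ActiveJob you'd use deliver_later instead):

    class OrdersController < ApplicationController
      def create
        @order = Order.create!(order_params)
        # Queue the email instead of sending it inside the request cycle.
        OrderMailer.delay.confirmation(@order)
        redirect_to @order
      end
    end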
The problem in Rails is that it does not have as many callbacks (as in JS, for instance): I personally find it dirty to have code like:

    after_save :callback

    def callback
      if test_that_is_true_once_in_the_objects_life
        Mailer.send_email()
      end
    end
So, if you really want to think like a programmer, the idea would be to set up some custom callback system in your app.
E.g.:

    def run_with_callback(action, callback_name)
      if send(action)
        delay.send(callback_name)
      end
    end
Or even creating an event system in your app would be a decent solution.
But in the end those solutions are pretty expensive in time, so people end up writing it inline after the action:
    def activate
      [...]
      user.save
      Mailer.send_mail
      respond_to
      [...]
    end
which is the closest thing to a callback in synchronous programming, and results in Mailer calls everywhere (in models and in controllers).
There are several reasons why controllers are a good place for the mailers:
Emails that have nothing to do with a model.
If your emails depend on several models that don't know about each other.
Extracting models to an API should not mean reimplementing mailers.
Mailer content determined by request variables that you don't want to pass to the model.
If your business model requires a lot of different emails, model callbacks can stack up.
If the email does not depend on the result of model computations.
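As an illustration of the fourth point, an email built from request data fits naturally in the controller (names are hypothetical):

    class InvitationsController < ApplicationController
      def create
        @invitation = Invitation.create!(invitation_params)
        # The referrer and locale come from the request, not the model,
        # so passing them down to a model callback would leak controller concerns.
        InvitationMailer.invite(@invitation,
                                referrer: request.referer,
                                locale:   I18n.locale).deliver
        head :created
      end
    end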