Bind only to certain actions in WebsocketBinding - django-channels

I am developing an application with Django Channels in which changes to a model are automatically published over a WebSocket channel using WebSocketBinding. In this particular case only the create and delete messages are relevant, so I thought it would be better to never send the update messages at all, rather than simply ignoring them on the frontend. I have tried overriding the various handlers in the base Binding class of channels (post_save_receiver, pre_save_receiver and the like), as well as extending that class and WebsocketBinding, to no avail. My question is: is there a simple way of achieving this? I could not find any pointers in the documentation. If not, what would be the simplest way to get it done?
Thanks in advance!

Related

How to implement transclusion with React (Draft.js) and Rails

I built a Draft.js editor in my Rails application. I use the convertToRaw() and convertFromRaw() methods to persist its state on the server and restore it later.
Now I want to add a custom Todo block component as explained here, except that it would save a Todo object to the database, reference it, and render it… The goal is to be able to "extract" these objects from the editor and use them elsewhere in the app.
Since a picture is worth a thousand words, here's one I found in an article on Transclusion that shows pretty well what I have in mind.
But to be honest, I'm not at all sure what the best way to implement such a feature would be, especially since I'd like to apply it to other Draft.js blocks in the future…
Thank you for the help!
David

How to use EventMachine (superfeeder-ruby gem) within a Rails controller?

Thank you for taking a look at this.
I am new to Rails, unfortunately. I currently have to implement an endpoint that Superfeedr can push updates to, but that endpoint has to be in a Rails controller.
Initially it seemed to me that this should be a background job, and the rest of the web tends to agree, but I am being pressured to implement it as a Rails controller, which confuses me. I am not certain how to include EventMachine in a request/response cycle.
I know the web is full of examples, but none really answer my question with how to route this. I have no idea.
I have a Rails controller called Superfeeds. I want Superfeedr to push updates to something like myrailsapp/superfeeds/
Inside feeds I want to inspect the pushed content, and then write the results of that to another controller that actually has a model and will persist it.
Basically, the controller called Feeds just needs to receive the information and pass it along. This confuses me, however, because it seems to require running a long-lived process inside a Rails controller, and I am not even sure that can work.
Does anyone know of a way that this has been done with rails, but not using EventMachine as a background job? In the end I really just need to know that this is possible.
-L
Inside feeds I want to inspect the pushed content, and then write the results of that to another controller that actually has a model and will persist it.
Why not do all the work in the one controller? If you're trying to separate out different concerns, you could even use two models - for instance one to do the inspecting/parsing and one to handle the persisting. But rarely would you need or want to pass data from controller to controller.
I am not certain how to include EventMachine in a request/response cycle.
Never used Superfeedr myself, but glanced at the docs quickly - are you using the XMPP or PubSubHubbub client? I assume the latter? If so, you want to do the persistence (and any other time-consuming process) async (outside the request/resp cycle), right?
If you are using an EventMachine-based webserver such as Thin, basically every request cycle is run within an EM reactor. So you can make use of EM's facilities for offloading tasks such as the deferred thread pool. For an example of this in action, check out Enigmamachine, in particular here. (I believe that in addition your db client library needs to be asynchronous.)
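Roughly, that could look like the sketch below (controller and model names are made up for illustration; it assumes an EM-based server such as Thin, so the reactor is running during the request):

class SuperfeedsController < ApplicationController
  def create
    payload = request.raw_post

    work     = proc { Feed.parse_and_persist(payload) }              # hypothetical model method that does the real work
    callback = proc { |result| Rails.logger.info(result.inspect) }   # runs when the deferred work finishes

    if EM.reactor_running?
      EM.defer(work, callback)   # hand the work to EventMachine's thread pool, outside the request cycle
    else
      work.call                  # fall back to inline processing under a non-EM server
    end

    head :ok                     # acknowledge the push immediately
  end
end

The response goes out as soon as head :ok is rendered; the parsing and persisting happen on EM's deferred thread pool.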

Building consumable uri/urls for a model (rails/datamapper/SOA)

Perhaps you can help me think this through in greater detail.
I need to build or make available a URI for a model instance that can be referenced or used by another application, which may or may not be a Rails application.
e.g.
I create a standard Post with content; I want to build a URL for that post that another application can consume or reference by looking at the model in the database (or in some less sticky fashion). DataMapper has a URI field; I want to build a canonical URI, store it there, and have another application be able to access, announce, manipulate it, etc.
Basically, I have several applications, possibly in different places, that need to access the same model and do differing things with it. I need a way to make that happen cleanly without putting them all in one monster application.
I've looked at PubSubHubbub, RSS, etc. but haven't found any concrete examples of what I'm trying to do. Do I need to create a common API for the applications, etc.?
DataMapper is very flexible about using existing databases.
Many people come to DataMapper because it can create and tear down the database structures without migrations. However, you do not have to work with it in that way.
I have had good success with using a large set of models owned by a central 'housekeeping' app and then declaring a small subset of the same models in separate 'interface' apps.
Some trial and error is required to figure out what works but it can certainly be done. I'd suggest putting your models in modules and including them across apps if possible.
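As a rough sketch of that idea (the module, class, and property names here are invented, and it assumes DataMapper 1.2 with dm-core loaded):

module SharedModels
  module Post
    def self.included(base)
      base.send(:include, DataMapper::Resource)
      base.class_eval do
        property :id,            DataMapper::Property::Serial
        property :title,         String
        property :canonical_uri, String   # or a dedicated URI type from dm-types, if you prefer
      end
    end
  end
end

# The housekeeping app and each interface app then only declare:
class Post
  include SharedModels::Post
end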
A final point: it sounds like you want URIs/URLs to be the primary interface. If that is the case, I strongly suggest you look at Sinatra. It is entirely oriented around URLs (and I find Rails routes very obtuse).
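For instance, a tiny Sinatra front end over the same data might look like this (the route, fields, and Post model are purely illustrative):

require 'sinatra'
require 'json'

get '/posts/:id' do
  post = Post.get(params[:id]) or halt 404   # Post.get is DataMapper's finder
  content_type :json
  { id: post.id, title: post.title, uri: url("/posts/#{post.id}") }.to_json
end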

Equivalent of Django Signals for Rails?

In Rails, the closest I've seen to Django Signals are Observers. The problem with them is that they're restricted to triggering callbacks on hardcoded events related to a model's lifecycle.
Django signals can be created anywhere, triggered anywhere and handled anywhere. The model lifecycle callbacks are just regular signals that happen to come built-in and that are triggered by the ORM.
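For reference, a typical Observer is tied to exactly those lifecycle events, something like this (a rough Rails 3-style sketch):

class PostObserver < ActiveRecord::Observer
  def after_create(post)                 # one of the fixed lifecycle callbacks
    Rails.logger.info "Post #{post.id} was created"
  end
end

# config/application.rb
#   config.active_record.observers = :post_observer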
Does anyone know of a similarly general solution for Rails? It could be some generic Ruby library, not tied to Rails, which would be even better.
Edit: Observer is the closest thing, but it's not what I'm looking for. It's a one-to-many solution. Anyone can listen, but only the originating object can post. I'd like something where you declare a signal, and anyone can trigger it as well as handle it. Also, I don't like the fact that the Ruby Observer dictates that the handler have an #update method. I'd like to be able to pass any method reference with the appropriate signature.
I could use the Ruby Observer to implement my own such broker, but I'm trying to learn if someone already did it.
I think a closer equivalent than Rails' Observer is the standard Ruby Observable module. It lets you add a list of observers to an object and the object can then send notifications to the observers when it changes.
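A minimal sketch of how it is used (the class names are just for illustration):

require 'observer'

class Ticker
  include Observable

  def price=(value)
    changed                    # mark the object as changed
    notify_observers(value)    # call #update (by default) on every registered observer
  end
end

class PriceLogger
  def update(value)
    puts "price is now #{value}"
  end
end

ticker = Ticker.new
ticker.add_observer(PriceLogger.new)
ticker.price = 42              # prints "price is now 42"

In more recent Ruby versions, add_observer also accepts a second argument naming the method to call, so the handler does not have to be #update.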
What about the 'wisper' gem? https://github.com/krisleech/wisper
Wisper is a Ruby library for decoupling and managing the dependencies of your Ruby objects using Pub/Sub. It is commonly used as an alternative to ActiveRecord callbacks and Observers to reduce coupling between data and domain layers.
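In wisper's style that looks roughly like this (the publisher and listener names here are invented):

require 'wisper'

class OrderCreator
  include Wisper::Publisher

  def call(attrs)
    # ... create the order ...
    broadcast(:order_created, attrs)   # publish an event; any subscriber can react
  end
end

class MailerListener
  def order_created(attrs)             # handler method is named after the event
    puts "sending confirmation for #{attrs.inspect}"
  end
end

creator = OrderCreator.new
creator.subscribe(MailerListener.new)
creator.call(id: 1)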
Perhaps acts_as_state_machine will help. Most of this functionality has recently been baked into Rails edge.
I just implemented a gem for that. https://github.com/pkoch/django_signal/
The Ruby gem 'watchable' is the most appropriate choice: https://github.com/jbarnette/watchable
It has a syntax very similar to Django's (and to other frameworks, like Qt).

Non-persistent data in a Rails app

I'm working on an "analytics" page for a Rails app. The analytics page does not persist any data of its own (it's very primitive at this point) but does use metrics that I'm grabbing from the DB (via the aggregate expressions built into ActiveRecord). Aside from gathering and presenting the metrics, the only other requirement I have is to allow the user to provide a date range to filter the data. Up to this point I have been using instance variables and the like to store the metrics information... As the number of metrics grows, along with the need to manage the start and end filter dates, I'm beginning to think I should put this data into its own model. If I do move all of my "data" into a model, should I just use a plain object with attr_accessors, or is there a more appropriate base class I could use for non-persistent data? I'm familiar enough with MVC architecture to know that my controller is getting too bloated, but not familiar enough with Rails to determine how I should organize my data/logic in this case.
Any insight would be greatly appreciated!
It sounds like you could use a non-ActiveRecord model. There's a good Railscast about that:
http://railscasts.com/episodes/121-non-active-record-model
Hope that helps,
You're on the right track here. Many applications have classes inside app/models that do not inherit from ActiveRecord::Base. Any time you find yourself managing lots of arbitrary variables inside controller actions, it's a good place to consider abstracting that data into a non-persistent model.
This is an area that's not well documented at present, probably because the ActiveRecord stuff is sexier?
I went through the same process, finding my controller actions were becoming uncomfortably large and full of logic as I strove to construct my derived data from ActiveRecord-based models, which in turn started to acquire additional responsibilities that they didn't really want or need.
Once I got the same advice you're getting now, the whole thing simplified magnificently.
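For concreteness, such a non-persistent model can be a plain Ruby class in app/models, along these lines (the metric methods and the Order model are placeholders):

class AnalyticsReport
  attr_accessor :start_date, :end_date

  def initialize(start_date, end_date)
    @start_date = start_date
    @end_date   = end_date
  end

  def orders
    Order.where(created_at: start_date..end_date)   # any ActiveRecord aggregate source
  end

  def total_revenue
    orders.sum(:amount)
  end

  def average_order_value
    orders.average(:amount)
  end
end

# In the controller action:
#   @report = AnalyticsReport.new(params[:start_date], params[:end_date])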
