How to get a Facebook-like notification system in Rails - ruby-on-rails

Hey, how do I make a notification system like Facebook or Diaspora in Rails?
I have tried making an activity feed, but that was not what I wanted; I want exactly the same feature these websites have.
I have a simple app where there are two types of users: buyers and sellers.
I want to notify the seller whenever a buyer comments on their products.

What you are looking for here is a server-push implementation: when some notification/action happens on the server, it should be pushed out to your Rails app's clients. The difference with #manju's answer is that it proposes a solution where your client's browser calls the server periodically for new notifications.
There are two main ways to do this.
1 - Use a third-party SaaS solution (the easy way, but it costs money ;))
Firebase allows you to send push notifications to clients.
Pusher is another provider that offers the same kind of functionality.
Read their documentation; each of them normally has a gem you can easily integrate into your Rails app.
2 - Implement your own push server
You can implement your own push server with Rails and integrate it into your app.
Faye is one option.
But more exciting is that Rails 5 will have Action Cable, which tries to solve the same issue: action cable gem.
There are also articles showing Action Cable with Rails 4 apps (you don't have to wait until Rails 5 comes out), but I haven't used it personally yet.
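To make the Action Cable option concrete, here is a minimal, hedged sketch for the buyer/seller case from the question. The channel, the current_user connection identifier, and the Comment/Product associations are assumptions, not code from the question:

# app/channels/notifications_channel.rb (sketch; names are assumptions)
class NotificationsChannel < ApplicationCable::Channel
  def subscribed
    # each seller listens on their own private stream
    stream_for current_user
  end
end

# app/models/comment.rb (sketch)
class Comment < ApplicationRecord
  belongs_to :product
  after_create_commit :notify_seller

  private

  def notify_seller
    # push the new comment to the product's seller in real time
    NotificationsChannel.broadcast_to(product.seller,
                                      message: "New comment on #{product.name}")
  end
end

The front-end would then subscribe to the channel with the Action Cable JavaScript client and append the message to a notifications area.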
HTH

Facebook does it using comet techniques.
Here are some of the helpful links
Link1
Link2
Link3
Here is the theory of how Facebook does it:
Facebook works by polling the server for any changes.
The Facebook page makes an AJAX request to the server, and that AJAX request has a long timeout.
On the server side, the API constantly polls the DB server to see if anything has changed, by repeatedly checking the activity log table in the database. If a change is found it returns the result; until then it keeps polling the DB.
Once the AJAX request completes, the client recursively starts it again.
Here is a code snippet - Client side
function doPoll() {
    $.get("events.php", {}, function(result) {
        $.each(result.events, function(event) { //iterate over the events
            //do something with your event
        });
        doPoll();
        //this effectively causes the poll to run again as
        //soon as the response comes back
    }, 'json');
}

$(document).ready(function() {
    $.ajaxSetup({
        timeout: 1000 * 60 //set a global AJAX timeout of a minute
    });
    doPoll(); // do the first poll
});
Here is a code snippet on the server side (PHP):
// pseudocode: has_event_happened() and get_events() are placeholder helpers
while (!has_event_happened()) {   // e.g. check the activity log table
    sleep(5);
}
echo json_encode(get_events());   // return the new events once something changed
You can find this explained in much more detail here.
You can adapt this approach to your needs.
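If you wanted to adapt the same long-polling idea to a Rails app, a rough sketch of the controller side might look like the following; the Notification model, the current_user helper, and the 25-second budget are assumptions, not anything from the question:

# app/controllers/notifications_controller.rb (sketch; model and helpers are assumptions)
class NotificationsController < ApplicationController
  def poll
    since    = params[:since].presence && Time.zone.parse(params[:since])
    deadline = 25.seconds.from_now            # answer before typical AJAX/proxy timeouts
    scope    = current_user.notifications
    scope    = scope.where("created_at > ?", since) if since

    # keep checking the DB, like the PHP loop above
    sleep 2 until scope.exists? || Time.current > deadline

    render json: scope.order(:created_at)
  end
end

Keep in mind that each long-poll request holds a Rails thread/worker for its whole duration, which is one reason the push options in the first answer tend to scale better.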

Related

Web Browser Perform Syncing Action While Online In Background

I am looking to synchronize an IndexedDB in the background for offline access. I want to do this while the application is online and have it run in the background, where the user doesn't even know it is running.
I looked at backgroundSync with service workers but that appears to be for offline usage.
What I am really looking for is something like a cron task in the browser, so I can synchronize data from a remote server to a local in-browser database.
Here's a different approach, which fetches the JSON results from the backend's API, stores them in localStorage and passes the results array to a custom function to render it.
If localStorage is not available in the browser, it fetches the results every time the function is called, as it also does when the "force" parameter is set to true.
It also uses a cookie for storing the timestamp of when the data was last retrieved. The given code is set for a duration of 15 minutes (900,000 milliseconds).
It also assumes that in the JSON result of the API there's a member called .data containing the array of data to be cached/updated.
It requires jQuery for $.ajax (and the jquery-cookie plugin for $.cookie), but I'm sure it can easily be refactored to use fetch, axios, or any other approach:
function getTrans(force = false) {
    var TRANS = undefined;
    if (force || window.localStorage === undefined ||
        (window.localStorage !== undefined && localStorage.getItem("TRANS") === null) ||
        $.cookie('L_getTrans') === undefined ||
        ($.cookie('L_getTrans') !== undefined && Date.now() - $.cookie('L_getTrans') > 900000)) {
        $.ajax({
            url: 'api/', type: 'post',
            data: {'OP': 'getTrans'},
            success: function (result) {
                TRANS = result.data ?? [];
                if (window.localStorage !== undefined) {
                    localStorage.setItem('TRANS', JSON.stringify(TRANS));
                    $.cookie('L_getTrans', Date.now());
                }
                renderTransactionList(TRANS);
            },
            error: function (error) { console.error(error); }
        });
    } else {
        TRANS = JSON.parse(localStorage.getItem('TRANS'));
        renderTransactionList(TRANS);
    }
}
Hope it helps some of you, or even amuses.
For your purpose you probably need a web worker instead of a service worker.
While a service worker acts as a proxy for connections, a web worker can be more generic.
https://www.quora.com/Whats-the-difference-between-service-workers-and-web-workers-in-JavaScript
It has some limitations interacting with browser objects, but HTTP connections and IndexedDB are allowed.
Pay particular attention to the browser's cache during development: even Ctrl+F5 does not reload web worker code.
Force reload/prevent cache of Web Workers
I believe what you're going for is a Progressive Web Application (PWA).
To build on Claudio's answer, background fetches are best performed with a web worker. Web workers are typically stateless, and you would need to adapt your project to note what data was loaded last. However, using the History API and (lazy) loading other page contents via JavaScript means that the user wouldn't have to exit the page.
A service worker would be able to monitor when your application is online or offline, and can call methods to pause or continue downloads to the indexed db.
As a side note, it is advisable to load only what is needed by your users, as excessive background loading may offend some users.
Further Reading.
Mozilla's PWA Documentation
An example of Ajax loading and the History API from Mozilla
I looked at backgroundSync with service workers but that appears to be for offline usage.
No, it is not just for offline usage! The answer about PWAs and service workers is also right!
A solution in your case:
You can use navigator.onLine to check the internet connection, like this:
window.addEventListener('offline', function() {
    alert('You have lost internet access!');
});
If the user loses their internet connection, they'll see an alert. Next we'll add an event listener to watch for the user coming back online.
window.addEventListener('online', function() {
    if (!navigator.serviceWorker && !window.SyncManager) {
        fetchData().then(function(response) {
            if (response.length > 0) {
                return sendData();
            }
        });
    }
});
A good pattern for detection:
if (registration.sync) {
    registration.sync.register('example-sync')
        .catch(function(err) {
            return err;
        });
} else {
    if (navigator.onLine) {
        requestSync();
    } else {
        alert("You are offline! When your internet returns, we'll finish up your request.");
    }
}
Note: Maybe you need to limit your application activity while offline.
I'm trying to send (push to the DB) data with the Fetch API below, but things like HTML5 Apex, sockets, or the Cache API can also be used here.
function requestSync() {
    navigator.serviceWorker.ready.then(swRegistration => swRegistration.sync.register('todo_updated'));
}
Maybe you could try with a socket (PHP) and a setTimeout (JS).
Let me explain:
When you enter your page, it uses IndexedDB, and at the same moment starts a setTimeout, for example every 30 seconds, and also tries to communicate with the socket. If this action is successful, the web page synchronizes with IndexedDB.
I don't know if you understand me.
My English is very bad. I'm sorry.
Service Worker Push Notifications
Given your reference to cron, it sounds like you want to sync a remote client at times when they are not necessarily visiting your site. Service workers are the correct answer for running an operation in (roughly) one context regardless of how many tabs the client might have open, and more specifically service workers with push notifications are necessary if the client might have no tabs open to the origin site at the time the sync should occur.
Resources to setup web push notifications:
There are many guides for setting up push notifications, e.g.:
https://developers.google.com/web/fundamentals/codelabs/push-notifications/
which links to a test push service:
https://web-push-codelab.glitch.me
so you can test sending a push before you have configured your own server to send pushes.
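Once you move past the test service and want your own server (for example a Rails app) to send the pushes, a rough, hedged sketch using the webpush gem could look like the following. The stored subscription, the sync point, and the VAPID keys are assumptions:

# Gemfile: gem "webpush"   (sketch; subscription storage and payload are assumptions)
require "webpush"
require "json"

# `subscription` is the PushSubscription JSON the browser sent when the user
# subscribed; persisting and loading it is up to your app (hard-coded file here)
subscription = JSON.parse(File.read("subscription.json"))   # hypothetical storage
latest_seq   = 42                                            # hypothetical sync point

Webpush.payload_send(
  message:  { type: "couchDBChange", seq: latest_seq }.to_json,
  endpoint: subscription["endpoint"],
  p256dh:   subscription["keys"]["p256dh"],
  auth:     subscription["keys"]["auth"],
  vapid: {
    subject:     "mailto:admin@example.com",
    public_key:  ENV["VAPID_PUBLIC_KEY"],
    private_key: ENV["VAPID_PRIVATE_KEY"]
  }
)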
Once you have your test service worker for a basic push notification, you can modify the service worker's push handler to call back to your site and do the necessary DB sync.
Example that triggers a sync into an IndexedDB via a push
Here is a PouchDB example from a few years back that was used to show that PouchDB could use its plugins for HTTP and IndexedDB from within the push handler:
self.addEventListener('push', (event) => {
    if (event.data) {
        let pushData = event.data.json();
        if (pushData.type === 'couchDBChange') {
            logDebug(`Got couchDBChange pushed, attempting to sync to: ${pushData.seq}`);
            event.waitUntil(
                remoteSync(pushData.seq).then((resp) => {
                    logDebug(`Response from sync ${JSON.stringify(resp, null, 2)}`);
                })
            );
        } else {
            logDebug('Unknown push event has data and here it is: ', pushData);
        }
    }
});
Here:
PouchDB inside a service worker receives only a reference to a sync point in the push itself.
It now has a background browser context with the origin of the server that registered the service worker.
Therefore, it can use its HTTP sync wrapper to contact the DB provided over HTTP by the origin server.
This is used to sync its contents, which are stored in IndexedDB via its wrapper for IndexedDB.
So:
Waking on a push and contacting the server over HTTP to sync to a reference can be done by a service worker to update an IndexedDB implementation, as long as the client agreed to receive pushes and has a browser with internet connectivity.
A client that does not agree to pushes but has service workers can also centralize/background sync operations in the service worker while tabs are visiting the site, either with messages or with chosen URLs that represent cached sync results. (For a single-tab SPA visitor, the backgrounding performance benefits are similar to a web worker's DB performance.) But pushes, messages and fetches from the origin are all brought together in one context, which can then coordinate synchronization and prevent repeated work.
Safari appears to have fixed its IndexedDB-in-service-workers bug, but some browsers may still be a bit unreliable or have unfortunate edge cases. (For example, hitting DB quotas should be tested carefully, as they can cause problems when a quota is hit in an unexpected context.)
Still, this is the only reliable way to get multiple browsers to call back and perform a sync to your server without building custom extensions etc. for each vendor separately.

Pusher in background job is outpacing my app - missing JS notifications

I'm moving imports of Excel files (sometimes large, sometimes small) into the background using Sidekiq, and alerting the user when the import is complete using Pusher.
First, the 'synchronous' flow (a Rails app here) kicks off the Excel import, and then it redirects to a dashboard page which will receive notifications from Pusher when the import is done.
The problem is that the Sidekiq/Pusher flow sometimes goes faster than the redirect to the dashboard page. My JavaScript subscriber won't be initialized in time to receive the message being published from within the background process, so the client gets nothing.
Does Pusher offer a way to delay publishing a message until there is a subscriber? Or does Pusher offer a way to 'stockpile' messages until the subscriber springs to life to consume them? Or maybe there is a simpler solution here I have not thought of?
Fyi, I don't want the background job to sleep for a few seconds to make sure the client is ready, and I don't want to use Pusher to trigger a refresh (i.e. save something in a DB, then refresh to display it).
I am happy to provide code samples if desired.
EDIT:
I'm certainly open to using something else besides Pusher if something else can solve my problem.
Invoke the worker 1 or 2 seconds from the current time, so that it runs and shows the message after the user has been redirected to the landing page.
TestProcess.perform_at(2.seconds.from_now, parameters)
It will work as expected. Hope it helps :)
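For completeness, here is a rough sketch of how that delay could fit together with the Pusher trigger inside the worker. ImportWorker, the channel name, and the import call are assumptions, not the asker's code:

# app/workers/import_worker.rb (sketch; class, channel and import call are assumptions)
class ImportWorker
  include Sidekiq::Worker

  def perform(user_id, file_path)
    ExcelImport.new(file_path).run   # hypothetical import step
    # tell the dashboard page the import finished
    Pusher.trigger("imports-#{user_id}", "import_finished",
                   message: "Your import is done")
  end
end

# enqueue with a small delay so the dashboard's JS subscriber is ready first
ImportWorker.perform_in(2.seconds, current_user.id, upload.path)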

How to dynamically and efficiently pull information from database (notifications) in Rails

I am working on a Rails application, and below is the scenario requiring a solution.
I'm running some time-consuming processes in the background using Sidekiq and saving the related information in the database. Now, when each of these processes completes, we would like to show a notification in a separate area saying that the process has been completed.
So the notifications area really needs to pull things from the back-end (this notification area will be available on every page) and show them dynamically. I thought AJAX must be an option, but I don't know how to trigger it for a particular area only. Or is there any other option by which the client can fetch dynamic content from the server efficiently without creating much traffic?
I know this is a broad topic, but any relevant info would be greatly appreciated. Thanks :)
You're looking at a perpetual connection (using either SSEs or WebSockets), something Rails has started to address with ActionController::Live.
Live
You're looking for "live" connectivity:
"Live" functionality works by keeping a connection open
between your app and the server. Rails is an HTTP request-based
framework, meaning it only sends responses to requests. The way to
send live data is to keep the response open (using a perpetual connection), which allows you to send updated data to your page on its
own timescale
The way to do this is to use a front-end method to keep the connection "live", and a back-end stack to serve the updates. The front-end will need either SSE's or a websocket, which you'll connect with use of JS
The SEE's and websockets basically give you access to the server out of the scope of "normal" requests (they use text/event-stream content / mime type)
Recommendation
We use a service called Pusher.
This basically gives you a third-party WebSocket service to which you can push updates. Once the service receives the updates, it sends them out to any clients connected to the relevant channels; you can split the channels it broadcasts to using the pub/sub pattern.
I'd recommend using this service directly (they have a Rails gem, and I'm not affiliated with them); it also provides a super simple API.
Other than that, you should look at the ActionController::Live functionality of Rails.
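To make the ActionController::Live option concrete, here is a minimal, hedged SSE sketch; the Notification model, the unread scope, and the polling loop are assumptions, and the client would consume the stream with an EventSource:

# app/controllers/live_notifications_controller.rb (sketch; model and scope are assumptions)
class LiveNotificationsController < ApplicationController
  include ActionController::Live

  def index
    response.headers["Content-Type"] = "text/event-stream"
    sse = ActionController::Live::SSE.new(response.stream, event: "notification")

    # naive example: check for newly completed jobs every few seconds
    10.times do
      current_user.notifications.unread.find_each do |notification|
        sse.write(notification.as_json)
        notification.update(read: true)
      end
      sleep 3
    end
  ensure
    sse.close if sse   # release the stream; the browser's EventSource will reconnect
  end
end

Each open stream ties up a server thread, which is part of why the hosted Pusher option above can be simpler to operate.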
The answer suggested in the comment by #h0lyalg0rithm is one option to go with.
However, more primitive options are:
Use setInterval in JavaScript to perform a task every x seconds, i.e. polling.
Use jQuery or native AJAX to poll a controller/action via a route and have the controller return data as JSON (a sketch of such a controller follows this list).
Use document.getElementById or jQuery to update data on the page.
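Here is a hedged sketch of that polling endpoint. Unlike the long-polling controller sketched earlier, this one returns immediately and relies on the client calling it on a setInterval; the route, model, and read flag are assumptions:

# config/routes.rb (sketch)
get "notifications/poll", to: "notifications#poll"

# app/controllers/notifications_controller.rb (sketch; model and scope are assumptions)
class NotificationsController < ApplicationController
  def poll
    notifications = current_user.notifications.where(read: false).order(created_at: :desc)
    render json: notifications.as_json(only: [:id, :message, :created_at])
  end
end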

Is it possible to use 'push' services at the back-end?

I'm using the pusher gem to manipulate my front-end from an external API. It works fine, no problem with that.
But what I wonder is whether there is a possibility of using push notifications at the back-end of my application. I spent a serious amount of time investigating this but couldn't find anything useful.
Let me summarize:
I have an application and another API application which interact tightly with each other. Sometimes I want to use my API to send a notification to my main application, and I want to be able to manipulate data at the back-end of my main application based on the data received from the API side. These are things like 'an action was completed/started/succeeded', etc.
I understand that 'pusher' receives push notifications via JavaScript at the front-end, but I believe there must be a way to use those notifications at the back-end as well.
If there is another way (maybe Faye? WebSockets?) to do that, I'd love to learn what it is. Any clue would be appreciated.
Is it something doable?
Thank you
Pusher is a backend system too (to "push" updates to channels)
Endpoints
I think you may be interested in endpoints
From what I can gather, it seems you're looking to trigger the transfer of data to an endpoint once an action occurs in your API? For example:
User signs up on "API" app
API app sends "notification" to main app
Main app increases user count by 1
The way I can see this working is by either using AJAX or sending a curl request to your main app's endpoint (set in routes), triggering the action:
# main_app/config/routes.rb
post "endpoint", to: "application#endpoint"

# main_app/controllers/application_controller.rb
def endpoint
  @count = Option.increment!(:user_count)
end
This will allow you to manipulate your data in the backend of your "main" app
API
The tricky, non-conventional part comes when you want to send the data from your API app to your Main app (this is where you got the "pusher" idea from)
I would personally look at sending a standard HTTP request to the Main app endpoint, probably with Curl (if from the backend):
Curl on Ruby on Rails
Rails curl syntax
You may want to install curb (CUrl RuBy) here: https://github.com/taf2/curb
I could write some code if you wanted?
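In the meantime, here is a rough sketch of what the API-side call could look like with Ruby's standard Net::HTTP instead of curb; the URL, payload, and token header are assumptions:

# somewhere in the API app, after the action completes
# (sketch; URL, payload and token header are assumptions)
require "net/http"
require "json"

uri = URI("https://main-app.example.com/endpoint")
response = Net::HTTP.post(
  uri,
  { event: "user_signed_up", user_id: 42 }.to_json,
  "Content-Type" => "application/json",
  "X-Api-Token"  => ENV.fetch("MAIN_APP_TOKEN", "shared-secret")   # hypothetical auth
)
puts response.code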
I asked the same question of Pusher's support team and got the exact answer I was looking for:
You can install a client library on your server (http://pusher.com/docs/client_libraries) if there is one for your server. You can then subscribe to a client channel this way.
In my case, I use the Ruby gem, which can be found at https://github.com/pusher/pusher-ruby-client .
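A hedged sketch of what that back-end subscription might look like, based on the pusher-client gem's documented usage; the channel and event names are assumptions, and the gem's README has the exact options:

# sketch; channel and event names are assumptions
require "pusher-client"

socket = PusherClient::Socket.new(ENV.fetch("PUSHER_KEY", "app-key"),
                                  secret: ENV.fetch("PUSHER_SECRET", "app-secret"))
socket.subscribe("api-events")

socket["api-events"].bind("action_completed") do |data|
  # react on the back-end, e.g. update records in the main app
  puts "Received: #{data}"
end

socket.connect   # blocks; pass `true` to connect asynchronously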

Pushing updates to the view in (close to) real-time

Say my user is viewing messages/index, and someone else sends him a message. I'd like the user's view to update in real-time to reflect the new message, kind of like how Twitter lets me know there are more messages on my timeline. Are there any examples of this being done in Rails?
You can either use AJAX to poll the server for updates on a regular basis (pull model), or use the Juggernaut plugin or similar to enable the server to send updates to the client (push model). Note that the latter requires Flash to be installed on the client.
DUI.stream from Digg might be the solution you are looking for. It keeps an open XHR stream that you can keep adding objects to, in order to send them to the user. It has a Ruby client example.
