I have a page with a button. When the button is clicked, it fetches data through socket.io. However, each time I reload the page, socket.io sends back one extra copy of the previous data set. So the data looks right the first time the page loads (for example: abcd). After one reload I get the data twice (abcdabcd), after another reload three times (abcdabcdabcd), and so on.
How do I avoid duplicate data being sent back to the client when the page reloads? Here is my code.
Server Side:
app.get('/test', function(req, res){
  // some code...
  io.sockets.on("connection", function(socket){
    var socketFn = function(data){
      socket.emit("trends", {
        trends: JSON.parse(redisData)
      });
    };
    socket.on("action", socketFn);
    socket.on("disconnect", function(){
      socket.removeListener("action", socketFn); // this doesn't work
    });
  });
  res.render('test');
});
Client Side:
var socketOpts = {
  "sync disconnect on unload": true
};
var socket = io.connect("", socketOpts);
socket.on("trends", function(data){
  // data received from the node server, so do something with it
});
function action(){
  socket.emit("action", {phrase: "some dummy data"});
  return false;
}
// already checked client side doesn't fire multiple click events
$("button#click").off("click").on("click", action);
That is because you register a new connection listener every time a client hits the page.
app.get('/test', function(req, res){
  // when a client opens the page
  io.sockets.on("connection", function(socket){
    // start listening to new connections
    ...
It is considered a bad idea to register connection event listeners from within your routes; it should be done only once, globally. Here, every time a client accesses the page, another listener is attached, so the events fire as many times as the page has been accessed. Move the connection handler out of the route, as in the sketch below.
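A minimal sketch of the fix (assuming Express and the usual socket.io setup; the Redis lookup stays elided, as in the question):

// Register the connection handler once, at application startup,
// not inside a route.
io.sockets.on("connection", function(socket){
  socket.on("action", function(data){
    // ...fetch redisData here (the "// some code..." part), then:
    socket.emit("trends", { trends: JSON.parse(redisData) });
  });
});

// The route only renders the page; no socket wiring happens here.
app.get('/test', function(req, res){
  res.render('test');
});

With the handler registered once, each socket gets a single "action" listener, and the removeListener workaround on disconnect becomes unnecessary.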
Related
I am new to Service Workers, and have had a look through the various bits of documentation (Google, Mozilla, serviceworke.rs, Github, StackOverflow questions). The most helpful is the ServiceWorkers cookbook.
Most of the documentation seems to point to caching entire pages so that the app works completely offline, or redirecting the user to an offline page until the browser can redirect to the internet.
What I want to do, however, is store my form data locally so my web app can upload it to the server when the user's connection is restored. Which "recipe" should I use? I think it is Request Deferrer. Do I need anything else to ensure that Request Deferrer will work (apart from the service worker detector script in my web page)? Any hints and tips much appreciated.
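For context, the "service worker detector script" mentioned above is typically just a registration snippet along these lines (a sketch; the /sw.js path is a placeholder for wherever your worker file lives):

// Register the service worker if the browser supports it.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js').then(function (registration) {
    console.log('Service worker registered with scope:', registration.scope);
  }).catch(function (error) {
    console.error('Service worker registration failed:', error);
  });
}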
Console errors
The Request Deferrer recipe and code don't seem to work on their own, as they don't include file caching. I have added some caching for the service worker library files, but I still get this error when I submit the form while offline:
Console: {"lineNumber":0,"message":
"The FetchEvent for [the form URL] resulted in a network error response:
the promise was rejected.","message_level":2,"sourceIdentifier":1,"sourceURL":""}
My Service Worker
/* eslint-env es6 */
/* eslint no-unused-vars: 0 */
/* global importScripts, ServiceWorkerWare, localforage */
importScripts('/js/lib/ServiceWorkerWare.js');
importScripts('/js/lib/localforage.js');

// Determine the root for the routes. I.e., if the Service Worker URL is
// http://example.com/path/to/sw.js, then the root is http://example.com/path/to/
var root = (function() {
  var tokens = (self.location + '').split('/');
  tokens[tokens.length - 1] = '';
  return tokens.join('/');
})();
// By using Mozilla's ServiceWorkerWare we can quickly set up some routes for a
// virtual server. It is convenient to review the virtual server recipe before
// reading this.
var worker = new ServiceWorkerWare();

// So here is the idea: we check whether we are online or not. If we are not
// online, we enqueue the request and provide a fake response.
// Otherwise, we flush the queue and let the new request reach the network.
// This function factory does exactly that.
function tryOrFallback(fakeResponse) {
  // Return a handler that…
  return function(req, res) {
    // If offline, enqueue and answer with the fake response.
    if (!navigator.onLine) {
      console.log('No network availability, enqueuing');
      return enqueue(req).then(function() {
        // As the fake response will be reused but Response objects are
        // one use only, we need to clone it each time we use it.
        return fakeResponse.clone();
      });
    }
    // If online, flush the queue and answer from network.
    console.log('Network available! Flushing queue.');
    return flushQueue().then(function() {
      return fetch(req);
    });
  };
}
// A fake response with a joke for when there is no connection. A real
// implementation could have cached the last collection of updates and kept a
// local model. For simplicity, that is not implemented here.
worker.get(root + 'api/updates?*', tryOrFallback(new Response(
  JSON.stringify([{
    text: 'You are offline.',
    author: 'Oxford Brookes University',
    id: 1,
    isSticky: true
  }]),
  { headers: { 'Content-Type': 'application/json' } }
)));

// For deletion, let's simulate that all went OK. Notice we are omitting the
// body of the response: trying to add a body to a response with a 204
// (No Content) status throws an error, so the body must be null.
worker.delete(root + 'api/updates/:id?*', tryOrFallback(new Response(null, {
  status: 204
})));

// Creation is another story. We cannot reach the server, so we cannot get the
// id for the new updates. No problem: just say we accept the creation and we
// will process it later, as soon as we recover connectivity.
worker.post(root + 'api/updates?*', tryOrFallback(new Response(null, {
  status: 202
})));

// Start the service worker.
worker.init();
// By using Mozilla's localforage db wrapper, we can count on a fast setup for a
// versatile key-value database. We use it to store the queue of deferred requests.

// Enqueue consists of adding a request to the list. Due to the limitations of
// IndexedDB, Request and Response objects cannot be saved, so we need an
// alternative representation. This is why we call serialize().
function enqueue(request) {
  return serialize(request).then(function(serialized) {
    // Note: the inner promise must be returned; otherwise enqueue() resolves
    // before the queue has actually been persisted.
    return localforage.getItem('queue').then(function(queue) {
      /* eslint no-param-reassign: 0 */
      queue = queue || [];
      queue.push(serialized);
      return localforage.setItem('queue', queue).then(function() {
        console.log(serialized.method, serialized.url, 'enqueued!');
      });
    });
  });
}
// Flush is a little more complicated. It consists of getting the elements of
// the queue in order and sending each one, keeping track of the requests not
// yet sent. Before sending a request we need to recreate it from the
// alternative representation stored in IndexedDB.
function flushQueue() {
  // Get the queue.
  return localforage.getItem('queue').then(function(queue) {
    /* eslint no-param-reassign: 0 */
    queue = queue || [];
    // If empty, nothing to do!
    if (!queue.length) {
      return Promise.resolve();
    }
    // Else, send the requests in order…
    console.log('Sending ', queue.length, ' requests...');
    return sendInOrder(queue).then(function() {
      // This needs error handling: it assumes every request in the queue
      // succeeds once the network is reached. It should really empty the
      // queue step by step, popping a request only after it completes
      // successfully.
      return localforage.setItem('queue', []);
    });
  });
}
// Send the requests inside the queue in order, waiting for the current one
// before sending the next.
function sendInOrder(requests) {
  // The reduce() chains one promise per serialized request, not allowing
  // progress to the next one until the current one completes.
  var sending = requests.reduce(function(prevPromise, serialized) {
    console.log('Sending', serialized.method, serialized.url);
    return prevPromise.then(function() {
      return deserialize(serialized).then(function(request) {
        return fetch(request);
      });
    });
  }, Promise.resolve());
  return sending;
}
// Serialization is a little convoluted because headers is not a simple object.
function serialize(request) {
  var headers = {};
  // for(... of ...) is ES6 notation, but the current browsers that support
  // service workers support this notation as well, and it is the only way of
  // retrieving all the headers.
  for (var entry of request.headers.entries()) {
    headers[entry[0]] = entry[1];
  }
  var serialized = {
    url: request.url,
    headers: headers,
    method: request.method,
    mode: request.mode,
    credentials: request.credentials,
    cache: request.cache,
    redirect: request.redirect,
    referrer: request.referrer
  };
  // Only if the method is not GET or HEAD is the request allowed to have a body.
  if (request.method !== 'GET' && request.method !== 'HEAD') {
    return request.clone().text().then(function(body) {
      serialized.body = body;
      return Promise.resolve(serialized);
    });
  }
  return Promise.resolve(serialized);
}
// Compared with serialize, deserialize is pretty simple.
function deserialize(data) {
  return Promise.resolve(new Request(data.url, data));
}
var CACHE = 'cache-only';

// On install, cache some resources.
self.addEventListener('install', function(evt) {
  console.log('The service worker is being installed.');
  // Ask the service worker to keep installing until the returned promise
  // resolves.
  evt.waitUntil(precache());
});

// On fetch, use a cache-only strategy.
self.addEventListener('fetch', function(evt) {
  console.log('The service worker is serving the asset.');
  evt.respondWith(fromCache(evt.request));
});

// Open a cache and use `addAll()` with an array of assets to add all of them
// to the cache. Return a promise that resolves when all the assets are added.
function precache() {
  return caches.open(CACHE).then(function (cache) {
    return cache.addAll([
      '/js/lib/ServiceWorkerWare.js',
      '/js/lib/localforage.js',
      '/js/settings.js'
    ]);
  });
}

// Open the cache where the assets were stored and search for the requested
// resource. Notice that in case of no match, the promise from `cache.match()`
// still resolves, but with `undefined` as its value.
function fromCache(request) {
  return caches.open(CACHE).then(function (cache) {
    return cache.match(request).then(function (matching) {
      return matching || Promise.reject('no-match');
    });
  });
}
Here is the error message I am getting in Chrome when I go offline:
(A similar error occurred in Firefox - it falls over at line 409 of ServiceWorkerWare.js)
ServiceWorkerWare.prototype.executeMiddleware = function (middleware, request) {
  var response = this.runMiddleware(middleware, 0, request, null);
  response.catch(function (error) { console.error(error); });
  return response;
};
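As an aside, the comment inside flushQueue above already points at a fix for one of its weaknesses: the queue is cleared wholesale, even if a replayed request fails mid-flush. A minimal sketch of step-by-step dequeuing, reusing the deserialize() helper and the same localforage queue format, might look like this:

// Sketch only: pop each request off the persisted queue as it completes,
// so that a failure mid-flush leaves the unsent requests in IndexedDB.
function flushQueueStepByStep() {
  return localforage.getItem('queue').then(function(queue) {
    queue = queue || [];
    if (!queue.length) {
      return Promise.resolve();
    }
    // Send the head of the queue, and only then persist its removal.
    return deserialize(queue[0])
      .then(function(request) { return fetch(request); })
      .then(function() {
        return localforage.setItem('queue', queue.slice(1));
      })
      .then(flushQueueStepByStep); // recurse until the queue is empty
  });
}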
This is a little more advanced than beginner level, but you will need to detect when you are offline or in a "lie-fi" state (a connection that looks alive but isn't usable). Instead of POSTing data to an API or endpoint, you need to queue that data to be synced when you are back online.
This is what the Background Sync API should help with. However, it is not supported across the board just yet. And then there is Safari...
So maybe a good strategy is to persist your data in IndexedDB, and when you can connect (Background Sync fires an event for this) you then POST the data. It gets a little more complex for browsers that don't support service workers (Safari) or don't yet have Background Sync (that should level out fairly soon). A sketch of the sync half follows below.
As always, design your code as a progressive enhancement, which can be tricky, but worth it in the end.
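For illustration, a minimal sketch of that strategy might look like the following. The getQueuedItems/clearQueuedItems helpers and the /api/submit endpoint are placeholders standing in for your own IndexedDB layer and API, not anything from the question:

// In the page: request a one-off background sync when a form is saved.
navigator.serviceWorker.ready.then(function(registration) {
  return registration.sync.register('submit-forms');
});

// In the service worker: replay the queued submissions when the 'sync'
// event fires, i.e. when connectivity returns.
self.addEventListener('sync', function(event) {
  if (event.tag === 'submit-forms') {
    event.waitUntil(
      getQueuedItems().then(function(items) {      // hypothetical IndexedDB read
        return Promise.all(items.map(function(item) {
          return fetch('/api/submit', {            // placeholder endpoint
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(item)
          });
        }));
      }).then(clearQueuedItems)                    // hypothetical IndexedDB clear
    );
  }
});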
Service Workers tend to cache the static HTML, CSS, JavaScript, and image files.
I need to use PouchDB and sync it with CouchDB.
Why CouchDB?
- CouchDB is a NoSQL database consisting of a number of documents created with JSON.
- It has versioning (each document has a _rev property with the last modified date).
- It can be synchronised with PouchDB, a JavaScript database that runs in the browser and stores data locally using IndexedDB. This allows us to create offline applications.
- The two databases are both “master” copies of the data.
PouchDB is a local JavaScript implementation of CouchDB.
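As a minimal sketch of that setup (assuming PouchDB is loaded on the page, and a CouchDB instance at the placeholder URL http://localhost:5984/myforms), PouchDB's sync API keeps the two “master” copies converging:

// A local PouchDB database, backed by IndexedDB in the browser.
var db = new PouchDB('forms');

// Saving a form submission locally works even while offline.
db.post({ username: 'sample', submittedAt: new Date().toISOString() });

// Continuous two-way replication with the remote CouchDB copy.
// retry: true keeps retrying as connectivity drops and returns.
db.sync('http://localhost:5984/myforms', { live: true, retry: true })
  .on('error', function (err) { console.error('sync error', err); });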
I still need a better answer than my partial notes towards a solution!
Yes, this type of service worker is the correct one to use for saving form data offline.
I have now edited it and understood it better. It caches the form data, and loads it on the page for the user to see what they have entered.
It is worth noting that the paths to the library files will need editing to reflect your local directory structure, e.g. in my setup:
importScripts('/js/lib/ServiceWorkerWare.js');
importScripts('/js/lib/localforage.js');
The script is still failing when offline, however, as it isn't caching the library files. (Update to follow when I figure out caching)
Just discovered an extra debugging tool for service workers (apart from the console): chrome://serviceworker-internals/. In this, you can start or stop service workers, view console messages, and see the resources used by the service worker.
It may be something obvious, but I don't understand why I don't receive my event on the server side.
Server :
io.sockets.on('connection', function (socket, pseudo) {
  socket.on('clickOnGraph', function(){
    console.log('Reception of the first sending');
    socket.broadcast.emit('clickOnGraph');
    console.log('Broadcasting to everyone');
  });
});
Client :
$scope.clickOnGraph = function(){
  console.log('click detected, first sending to server');
  socket.emit('clickOnGraph');
};
socket.on('clickOnGraph', function(){
  console.log('Reception of the broadcast');
  console.log('Event clickOnGraph : OK');
});
When I send an event from the server to the client it works, but not the opposite...
And $scope.clickOnGraph is definitely being called.
Thank you for your help, this is driving me crazy.
I have a basic Node.js & Socket.io chat application running on Heroku that I want to integrate into my main rails website. I understand the way to do this is to have two separate Heroku apps - one for rails, one for Node.js.
It doesn't appear to be as simple as moving the client html from the node app to the rails app (giving it the other app's url in 'io.connect();').
The chat app server seems to automatically serve the client index.html from its own application, and not to allow an external source to connect to it. Removing the code that does this (marked below) does not make it work.
I'm painfully new to Node.js & Socket.io and am hoping that this might be a relatively simple fix for a pro.
I believe the functionality I'm after here works in Liam Kaufman's excellent rails/node.js/socket.io example - his node.js server code is here: https://github.com/liamks/Chatty-Node-Server/blob/master/chat-server.js
I've tried mocking my app's code up to be like his, but haven't yet been able to make it work. For example, he appears to use an 'http' server whereas mine uses an 'express' server; I wondered if this might be relevant.
Any help would be greatly appreciated.
UPDATE: Ok, so a bizarre turn of events, thanks to redhotvengeance's reply below I've got this working - server is up on heroku and my client html and javascript connects to it. Great - code below. The problem is, however, that the client html file only connects when it's outside of the Rails app!! i.e. on my desktop!! The moment I put it in the rails application's public/ folder or in a view on my localhost, I get nothing! This makes no sense. I checked it wasn't because of any other random erroneous javascript in my asset pipeline conflicting by just creating a new rails app and dropping the html file in the public/ folder - again nothing - just a dead html page that doesn't connect. Does anyone have any idea what might be going on here? Does Rails have some security feature in place that stops connections to external servers or something??
UPDATE 2: I'm told this has something to do with the 'same origin policy', and I'm in trouble. Is there any way around it? Seems Liam didn't have this problem.
Client:
<script src="http://calm-sands-3826.herokuapp.com/socket.io/socket.io.js" type="text/javascript"></script>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.6.4/jquery.min.js"></script>
<script>
var socket = io.connect('http://calm-sands-3826.herokuapp.com');
// on connection to server, ask for user's name with an anonymous callback
socket.on('connect', function(){
// call the server-side function 'adduser' and send one parameter (value of prompt)
socket.emit('adduser', prompt("What's your name?"));
});
// listener, whenever the server emits 'updatechat', this updates the chat body
socket.on('updatelog', function (username, data) {
$('#log').append('<b>'+username + ':</b> ' + data + '<br>');
});
// listener, whenever the server emits 'updateusers', this updates the username list
socket.on('updateusers', function(data) {
$('#users').empty();
$.each(data, function(key, value) {
$('#users').append('<div>' + key + '</div>');
});
});
</script>
<div style="float:left;width:100px;border-right:1px solid black;height:300px;padding:10px;overflow:scroll-y;">
<b>USERS</b>
<div id="users"></div>
</div>
<div style="float:left;width:300px;height:250px;overflow:scroll-y;padding:10px;">
<div id="log"></div>
</div>
Server:
var port = process.env.PORT || 5001;
var io = require('socket.io').listen(parseInt(port));

io.configure(function(){
  io.set("transports", ["xhr-polling"]);
  io.set("polling duration", 10);
  io.set("close timeout", 10);
  io.set("log level", 1);
});

// usernames which are currently connected to the chat
var usernames = {};

io.sockets.on('connection', function (socket) {
  // when the client emits 'adduser', this listens and executes
  socket.on('adduser', function(username){
    // we store the username in the socket session for this client
    socket.username = username;
    // add the client's username to the global list
    usernames[username] = username;
    // echo to client they've connected
    socket.emit('updatelog', 'SERVER', 'you have connected');
    // echo globally (all clients) that a person has connected
    socket.broadcast.emit('updatelog', 'SERVER', username + ' has connected');
    // update the list of users in chat, client-side
    io.sockets.emit('updateusers', usernames);
  });

  // when the user disconnects.. perform this
  socket.on('disconnect', function(){
    // remove the username from global usernames list
    delete usernames[socket.username];
    // update list of users in chat, client-side
    io.sockets.emit('updateusers', usernames);
    // echo globally that this client has left
    socket.broadcast.emit('updatelog', 'SERVER', socket.username + ' has disconnected');
  });
});
If what you're trying to do is connect pages in your Rails app to your separate Node.js app running socket.io, then skip setting up Express entirely. You're not looking to actually serve pages from your Node app, just connect users to the socket.io server.
Let's say your Node.js app on Heroku is called: my-awesome-socket-app.
my-awesome-socket-app:
var io = require('socket.io').listen(parseInt(process.env.PORT));

io.configure(function () {
  io.set("transports", ["xhr-polling"]);
  io.set("polling duration", 10);
});

io.sockets.on('connection', function (socket) {
  socket.on('disconnect', function () {
    io.sockets.emit('user disconnected');
  });
});
Then, in the Rails pages you want to connect to the socket.io server:
<script src="http://my-awesome-socket-app.herokuapp.com/socket.io/socket.io.js"></script>
<script>
  var socket = io.connect('http://my-awesome-socket-app.herokuapp.com');
  socket.on('connect', function (data) {
    console.log('connected!');
  });
</script>
I am trying to connect many socket.io clients for different URLs in Node.js, like so:
app.get('/:id', function(req, res){
  io.of('/' + req.params.id).on('connection', function(socket){
    socket.emit('hello');
  });
});
This works, however there is a problem:
When a browser refreshes the page http://localhost:3000/xyz, for example, the socket.emit event gets fired two times.
If someone accesses the page http://localhost:3000/xyz 10 times, then the event fires 10 times.
This is not good, because every time the user visits that page the socket events will be fired n+1 times.
What should be done so that I can register sockets to different URLs and at the same time not have this anomaly?
Another thing:
If I do this:
var sock;
io.of('/' + xyz).on('connection', function(socket){
  sock = socket;
});
app.get('/:id', function(req, res){
  sock.emit('hello');
});
If I use the above code, the socket doesn't get saved successfully to the sock variable in time. What that means is I have to use a setInterval of about 1000 ms so that the sock = socket line gets fired.
Please help me.
Because with this, on each request to http://localhost:3000/id you register a new handler. You should be doing that once, not on every request.
app.get('/:id', function(req, res){
  io.of('/' + req.params.id).on('connection', function(socket){
    socket.emit('hello');
  });
});
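One way to avoid the re-registration is to track which namespaces already have a handler and attach it at most once. A minimal sketch, assuming the same express app and io objects (the res.send response is just a placeholder):

// Attach each namespace's connection handler at most once.
var registered = {};

app.get('/:id', function(req, res){
  var id = req.params.id;
  if (!registered[id]) {
    registered[id] = true;
    io.of('/' + id).on('connection', function(socket){
      socket.emit('hello');
    });
  }
  res.send('ok'); // placeholder response; render whatever the page needs
});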
I use the approach below to achieve this goal:
client side:
var socket = io.connect('http://localhost:8183/?clientId=' + clientId, {"force new connection": true});
server side:
var io = require('socket.io').listen(server);
io.sockets.on('connection', function(socket) {
  console.log("url: " + socket.handshake.url);
  var clientId = socket.handshake.query.clientId;
  console.log("connected clientId: " + clientId);
});
Reference: https://github.com/LearnBoost/socket.io/wiki/Authorizing#global-authorization
I'm working on a web application and I went through the necessary steps to enable HTML5 App Cache for my initial login page. My goal is to cache all the images, CSS and JS to improve performance while browsing online; I'm not planning on offline browsing.
My initial page consists of a login form with only one input tag for entering the username and a submit button to process the information as a POST request. The submitted information is validated on the server and, if there's a problem, the initial page is shown again (which is the scenario I'm currently testing).
I'm using the browser's developer tools for debugging, and everything works fine for the initial request (a GET request made by typing the URL in the browser); the resources listed in the manifest file are properly cached. But when the same page is shown again as the result of a POST request, I notice that all the elements (images, CSS, JS) that were previously cached are being fetched from the server again.
Does this mean that HTML5 App Cache only works for GET requests?
Per http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html#the-application-cache-selection-algorithm it appears to me that only GET is allowed.
In modern browsers (which support offline HTML), GET requests can usually be made long enough to supply the necessary data to get back the data you need, and POST requests are not supposed to be used for idempotent (non-changing) requests anyway. So the application should probably be architected to use GET requests for the kind of data that is useful offline, and to inform the user that they will need to log in to get the content sent to them for full offline use (you could use offline events to inform them that they haven't yet gone through the necessary process).
I'm having exactly the same problem, and I wrote a wrapper for POST ajax calls. The idea is that when you try to POST, it will first make a GET request to a simple ping.php, and only if that is successful will it then make the POST.
Here is how it looks in a Backbone view:
var BaseView = Backbone.View.extend({
  ajax: function(options){
    var that = this,
        originalPost = null;
    // defaults
    options.type = options.type || 'POST';
    options.dataType = options.dataType || 'json';
    if(!options.forcePost && options.type.toUpperCase() === 'POST'){
      originalPost = {
        url: options.url,
        data: options.data
      };
      options.type = 'GET';
      options.url = 'ping.php';
      options.data = null;
    }
    // wrap success
    var success = options.success;
    options.success = function(resp){
      if(resp && resp._noNetwork){
        if(options.offline){
          options.offline();
        }else{
          alert('No network connection');
        }
        return;
      }
      if(originalPost){
        options.url = originalPost.url;
        options.data = originalPost.data;
        options.type = 'POST';
        options.success = success;
        options.forcePost = true;
        that.ajax(options);
      }else{
        if(success){
          success(resp);
        }
      }
    };
    $.ajax(options);
  }
});
var MyView = BaseView.extend({
  myMethod: function(){
    this.ajax({
      url: 'register.php',
      type: 'POST',
      data: {
        'username': 'sample',
        'email': 'sample@sample.com'
      },
      success: function(){
        alert('You registered :)');
      },
      offline: function(){
        alert('Sorry, you can not register while offline :(');
      }
    });
  }
});
Have something like this in your manifest:
NETWORK:
*
FALLBACK:
ping.php no-network.json
register.php no-network.json
The file ping.php is as simple as:
<?php die('{}') ?>
And no-network.json looks like this:
{"_noNetwork":true}
And there you go: before any POST it will first try to GET ping.php, and it calls offline() if you are offline.
Hope this helps ;)