Scalable MultiUserChat Server with Play! Framework 2.0 and Akka

With Play! 2.0 and Akka, which is supposed to provide a platform for building highly scalable apps, is it feasible to create a MultiUserChat server? Right now, I'm thinking along the lines of one Akka actor per user, storing a static array of actors in memory on the server, based on the sample provided in the Play! framework package. Each incoming request from a client will act on the respective user's actor object.
User information and Chat Room information will be written to Redis so that in case the server needs to restart, it can recover and rebuild the set of actors.
But I don't believe this will scale.
Any thoughts?

You can try my example: play-akka-cluster-websocket-chat. It is a sample that integrates a Play Framework 2 app (Java) with Akka Cluster. It makes it possible to add new Play nodes to scale the system: when a new node is added, all nodes in the cluster share chat messages, no matter which node receives a message.
I create a new actor for every chat room on each node and share the chat messages across the cluster.
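The room-per-actor idea looks roughly like the sketch below. Note that the linked sample uses JVM Akka (Java), while this sketch uses Akka.NET, the C# port, whose actor API is analogous; the type names (ChatRoomActor, ChatMessage, Join) are illustrative and not taken from that project.

// A minimal sketch of one actor per chat room, using Akka.NET (the C# port of Akka).
// The linked sample uses JVM Akka; the actor model and message flow are analogous.
using System.Collections.Generic;
using Akka.Actor;

public class ChatMessage
{
    public string User { get; set; }
    public string Text { get; set; }
}

public class Join
{
    public IActorRef Client { get; set; }
}

public class ChatRoomActor : ReceiveActor
{
    // The member list is actor-internal state, so no locking is needed
    // even though messages arrive concurrently from many connections.
    private readonly HashSet<IActorRef> _members = new HashSet<IActorRef>();

    public ChatRoomActor()
    {
        Receive<Join>(join => _members.Add(join.Client));
        Receive<ChatMessage>(msg =>
        {
            // Fan the message out to every member of this room.
            foreach (var member in _members)
                member.Tell(msg);
        });
    }
}

A node would create one such actor per room, e.g. system.ActorOf(Props.Create<ChatRoomActor>(), "room1"), and the clustering layer then takes care of forwarding messages between the per-node room actors.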

Related

Routing a client's connection to a specific instance of a SignalR backend within a Kubernetes cluster

While trying to create a web application for shared drawing, I got stuck on a problem regarding Kubernetes and scaling. The application uses an ASP.NET Core backend with SignalR for sharing the drawing data across its users. For scaling out the application, I am using a deployment for each microservice of the system. For the SignalR part, though, additional configuration is required.
After some research I found out about the possibility of syncing all instances of the SignalR backend, either through Azure's SignalR Service or through a Redis backplane. The latter I have gotten to work in my local minikube environment. I am not really happy with this solution for the following reasons:
My main concern is that like this I have created a hard bottleneck in the system. Unlike in a chat application, where data is sent only once in a while, messages are sent for every few points drawn in the shared drawing experience by any client. Simply put, a lot of traffic can occur, and all of it has to pass through the single Redis backplane.
Additionally, it seems unnecessary to me to make all instances of the SignalR backend talk to each other. In this application, shared drawing only occurs in small groups of up to, let's say, 10 clients. Groups of this size can easily be hosted on a single instance.
So without syncing all instances of the SignalR backend, I would have to route a client's connection, based on the SignalR group name, to the right instance of the SignalR backend when the client is trying to join a group.
I have found out about StatefulSets, which allow me to have a persistent address for each backend pod in the cluster. I could then somehow associate the SignalR group IDs with the pod addresses they are running on, in, let's say, another lookup microservice. The problem with this is that the client needs to be able to access the right pod from outside the cluster, where that cluster-internal address does not really help.
Also, I am wondering if there isn't an altogether better approach to the problem, since I am very new to the world of Kubernetes. I would be very grateful for your thoughts on this issue and any hint towards a (better) solution.
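For reference, the Redis backplane setup mentioned above is usually only a few lines in ASP.NET Core SignalR (via the Microsoft.AspNetCore.SignalR.StackExchangeRedis package); the hub name and Redis address below are illustrative placeholders, not taken from the question.

// Sketch of the Redis backplane wiring for ASP.NET Core SignalR.
// "redis-service:6379" stands in for the cluster-internal Redis address.
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.DependencyInjection;

public class DrawHub : Hub { }  // hypothetical hub carrying the drawing data

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Every backend instance publishes through Redis, so clients connected
        // to different pods still receive each other's messages.
        services.AddSignalR()
                .AddStackExchangeRedis("redis-service:6379");
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseRouting();
        app.UseEndpoints(endpoints => endpoints.MapHub<DrawHub>("/draw"));
    }
}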

Using SignalR with Azure Table Storage - What architecture?

I have a smart grid system where multiple hardware devices send raw sensor data to an Azure Queue. Each device sends a single data packet once every minute. Multiple Worker Roles process the data packets on the queue and push the data to Table Storage. I have a Web Role which hosts the application for users to view their device data and a host of other alerts and messages relating to their smart energy system. At the moment the web application just uses ajax polling at one-minute intervals to get the latest data updates and any other messages and alerts. Instead of using ajax 'pulling', I'd like to use SignalR and 'push' the updates from the cloud when they become available. I'm not sure what the overall architecture might look like.
So far I have added a SignalR Hub to my Web Role, just to see if I could do that. And it works fine. However, how do I trigger updates from this Hub when there are changes in Table Storage? Should I host the Hub with the Worker Roles that process the raw data, and then make a cross-domain SignalR connection from the web app (client)? Can I even associate an endpoint with a Worker Role? If I have many Worker Roles wouldn't I only be able to connect to one of them, and therefore miss data updates from other Worker Roles?
Perhaps I should create a separate Web Role to host the SignalR hub, but then how do I communicate the changes from the Worker Roles that process the raw data to the hub? Maybe I need to include another Azure Queue that takes messages from the Worker Roles regarding data updates, alerts, and any other messaging, and have that queue processed by the SignalR server. However, would this approach be scalable? If I have multiple instances of the SignalR server processing the message queue(s), would they share the same endpoint and be aware of all the client connections across the instances? Or maybe the Worker Roles themselves connect as clients to the SignalR server and the messages are forwarded from there to the clients.
Is SignalR even the right approach to take if data is being generated at a predictable rate of once every minute for each device? Maybe for updates of this regular data ajax 'pulling' is the best approach, and I should just be using SignalR for the infrequent alerts and messages, although, again, how do I communicate these events from the Worker Roles to the SignalR server?
What overall architecture would suit my needs here?
EDIT 06-09-2014 Half the problem solved
I came across http://www.asp.net/signalr/overview/signalr-20/performance-and-scaling/scaleout-with-windows-azure-service-bus which seems to be exactly what I am after. This deals with the problem of multiple Hub server (Web Role) instances. Now I just need a SignalR client library that can run on the Worker Roles so that they can notify the Hub that new data is available, and the Hub class can then be enhanced to route the new data to the appropriate connected web clients.
EDIT 06-10-2014 A workable solution found
I have added an answer to my question of "What architecture". I thought a quick summary of my setup might be useful. I have many remote devices associated with different users posting real-time data to Azure Queues. The data posted to these queues is parsed and saved to Table Storage by a number of Worker Roles. Web Roles provide the MVC5 web application for the users (clients) to log on and review their data. I wanted a mechanism by which, when new data was posted, any connected clients would receive a real-time notification (and the data tables and charts in the client apps could be updated accordingly). SignalR with Service Bus scaleout proved to be the answer.
The first part of the solution I needed was to deploy a SignalR hub that the clients could connect to in order to receive any notifications sent. I couldn't use the basic SignalR solution, as the MVC5 web app is hosted on a Web Role that will likely have more than one instance; the problem was how to keep all these instances synced so that, whatever instance a client was connected to, they'd still receive the notifications. SignalR scaleout with Azure Service Bus proved to be the answer to that part of the problem. Details of how to set this up can be found at http://www.asp.net/signalr/overview/signalr-20/performance-and-scaling/scaleout-with-windows-azure-service-bus - it was VERY easy to set up.
The second part of the problem was how to generate the notifications originating from the Worker Roles (my queue data processors). First I needed to be able to host OWIN in my Worker Roles - the instructions provided at http://www.asp.net/aspnet/overview/owin-and-katana/host-owin-in-an-azure-worker-role were more than sufficient. Once this was done, I created an empty Hub instance with the same name as the one deployed on my Web Roles (it was empty because I didn't expect to have any clients connected to it directly), and changed the Startup class to:
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        String connectionString = "[Service Bus Connection String]";
        GlobalHost.DependencyResolver.UseServiceBus(connectionString, "[App Name]");
        app.MapSignalR();
    }
}
With this in place, if I want to send a notification out to the clients from the Worker Roles, I do something like:
// Resolve the hub context and broadcast to all connected clients.
var context = GlobalHost.ConnectionManager.GetHubContext<MyHub>();
context.Clients.All.clientMethod("[Message]");
What really happens is that a copy of the message gets pushed to the backplane (Service Bus) and is picked up by the Web Roles and pushed out to the connected clients. In reality I will check who is online (in the Web Role Hub instance I override the OnConnected method to save the user's connection id in their profile which is stored in Table Storage), and only create notifications that are associated with those users to reduce SignalR traffic.
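For completeness, the OnConnected override mentioned above looks roughly like this in SignalR 2.x; ConnectionStore is a hypothetical stand-in for the Table Storage write, since that code isn't shown here.

using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

// Stand-in for persisting the (user -> connection id) mapping; in reality
// this would write an entity to Azure Table Storage.
public static class ConnectionStore
{
    private static readonly ConcurrentDictionary<string, string> Map =
        new ConcurrentDictionary<string, string>();

    public static void SaveConnectionId(string user, string connectionId)
    {
        Map[user] = connectionId;
    }
}

public class MyHub : Hub
{
    public override Task OnConnected()
    {
        // The client is authenticated on the Web Role, so Context.User is set.
        var userName = Context.User.Identity.Name;
        ConnectionStore.SaveConnectionId(userName, Context.ConnectionId);
        return base.OnConnected();
    }
}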

Is combining Parse.com API with Pubnub a viable option for large scale Real-Time Messaging and obtaining the combined toolset?

Essentially, combining Parse with Pubnub, Pusher, or similar, instead of building a custom backend from scratch.
I'll be working on a real-time messaging system with Facebook login and file storage/sharing. In theory I could use a combination of Parse and something like Pubnub to cover the backend requirements, where:
Parse takes care of:
Login
File Storage
Push notifications (closed app)
And Pubnub takes care of:
Real-time delivery of messages...
Requirements:
I need a system that can extend to millions of users if needed and can be deployed quickly; in general, a solution that fits the following criteria and specs.
Criteria:
Quick deployment by one or 2 developers.
Can expand to millions of users.
High reliability
Specs:
FB Login
Realtime Msg delivery
Push for closed app delivery
Shared file & image storage
Any feedback on whether this would work well as a first-stage deployment, and on any pitfalls, would be greatly appreciated.
I'm a little biased, but check out StackMob (www.stackmob.com). With the StackMob Marketplace you get direct access to PubNub with no need to create a second account, and there are a lot of other great services in the marketplace to add functionality, such as SendGrid.
All the features you are looking for are available out of the box, even separate development and production accounts, something you don't get with Parse. With a simple click of a button you can move schemas and custom code from development to production.
We can certainly support the number of users you are talking about. We have 7 games from Atari on the platform and other big enterprises like Land O'Lakes and Adidas Japan. We also have a great track record when it comes to reliability.
Sounds good, but two systems (Parse and PubNub) contradict your criterion of quick deployment by one or two developers.
That is a reason to find one system which satisfies all your requirements.
You could look at the QuickBlox backend - your own cloud backend.
It has 7 modules (sets of APIs) for different tasks. You may be interested in:
Users module - it has Facebook/Twitter login
Messages module - this is push notifications; it supports iOS, Android, BlackBerry, and Windows Phone push notifications
Content/CMS module - it allows you to store/share/stream any type of file, of any size (up to 5 TB!)
Chat module - real-time message delivery. QuickBlox Chat is a quick and reliable chat solution which combines the benefits of a scalable cloud-hosted XMPP chat server, seamless single sign-on authorization via the Users module, incoming IM/chat alerts via push notifications, and file attachments via Content.
I recommend looking at it; it also has lots of great features, such as custom API creation via the Custom Objects module.
Also, there is an Enterprise solution: QuickBlox is a white box, so you can deploy it to your own server and resell it to other clients if you want.
The short answer:
no.
The details:
Any way you hash it, it's too expensive to set up a chat with any of these systems, since their BaaS model is based on charging per number of calls.
I had to work out a lot of the logic myself using parse.com, and now that I'm implementing an XMPP solution, the quantity of work to get something running is the same.
My alternative solution:
Use an open-source XMPP server like ejabberd on something like AWS, and then use one of the client APIs to connect to it.
Contact me if you need more info on my experiences:
#andrescanella

Need my apps to talk to each other

In a Delphi 2007 application that I am developing, some prospective clients just found it interesting to be able to share data and information with each other.
They all have the same application.
All have independent databases.
But all have the same installed application, and there are some data types that they might want to share (replicate) between their databases.
How can I enable them to share data with other users of the same application, but not with everybody on the whole internet?
I would like this to be as automatic as possible; I have already considered approaches that involve manually sending emails.
I know DataSnap is an option; is there any other?
UPDATE:
The idea is to enable companies that have the same application to share data.
They should be able to select which partner to send to and what to send.
I have been investigating DataSnap, but would like to know if there is another way to do this.
Another standard way to connect distributed applications and share data and information is through message-oriented middleware (MOM). There are many open-source middleware products (message brokers) available which can be used from Delphi client libraries, even in multithreaded Delphi server applications. (Disclaimer: I am the author of message broker client libraries for Delphi and Free Pascal.)
There are many essential differences between web services and message brokers, such as the peer-to-peer versus publish/subscribe communication models. Message brokers also play a key role in enterprise application integration patterns.
One standard way to connect applications to other applications is to create a web service and a client that consumes that web service, called a web client. Technologies like SOAP and REST refer to such web services and web clients.
Your question is vague, perhaps due to English not being your first language, but you should probably edit it and be more specific.
If all your applications are going to talk directly to each other, that is called "peer-to-peer networking", and there are huge problems with enabling that kind of communication directly over the internet. It is much easier if you build a server that all these applications will connect to.
As an example, consider the IRC chat service: consider writing a web service that acts as the chat server, and consider all your clients to be "chat clients". Sharing data could follow the same idea as creating "rooms" or "channels" on a chat server.
I get the idea that you want something like a peer-to-peer data replication service. I think the closest you're going to get is something like RSS feeds (used by blog syndication services): you subscribe via a simple web service and pull down new content periodically. Since that data has to be published to a central server, a pure peer-to-peer approach is out of the question. If you don't have your own web server running on a hosting service or in the cloud, and you need a truly peer-to-peer solution, I am not aware of any way to do that, at least not without an incredible custom engineering effort.
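To make the pull-based idea concrete, here is a minimal sketch in C# (the question is about Delphi, but the shape is the same in any language); the feed URL and its since parameter are illustrative, not a real service.

using System;
using System.Net.Http;
using System.Threading.Tasks;

public class FeedPoller
{
    private static readonly HttpClient Http = new HttpClient();
    private DateTime _lastSeen = DateTime.MinValue;

    // Called on a timer: ask the central service for entries newer than
    // the last one we saw, then merge them into the local database.
    public async Task PollOnceAsync()
    {
        var url = "https://example.com/feed?since=" +
                  Uri.EscapeDataString(_lastSeen.ToString("o"));
        string json = await Http.GetStringAsync(url);
        // ... parse json and replicate the new items locally here ...
        _lastSeen = DateTime.UtcNow;
    }
}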

Communicating between Node.js and an ASP.NET MVC Application

I have an existing complex website built using ASP.NET MVC, including a database backend, data layer, as well as the Web UI layer. Rebuilding this website in another language is not a feasible option.
There are some UI elements on some views (client side) which would benefit from live interactivity, involving both push and pull, so rather than implement some kind of custom long-polling or websocket server in ASP.NET, I am looking to leverage Node.js for Windows and Socket.io.
My problem is that I need two way communication between both applications. Each user should only be able to receive data once they are authorised on the ASP.NET website, so I first need communication for this. Secondly, once certain events occur on the ASP.NET website I want to immediately push this data to the Node server, to be broadcast to specific users or groups of users. Thirdly, I would like any data sent to the node.js server to be pushed to the ASP.NET website for processing, as this is where all our business logic lies. The sole reason for adding Node.js is to have the possibility to push data directly to the client, I do not want to build any business logic into it (or as little as possible).
I would like to know the fastest method of two-way push communication between Node.js and ASP.NET. The only good option I'm aware of so far is to create a special listener on a specific port on the Node.js server and connect to that, but I was wondering if there's a more elegant or more efficient method? I also know that you could use a database in between, but surely this would need to be polled and would be less efficient? Both servers will be running on the same machine under a Visual Studio project.
Many thanks for any help you can provide.
I'm not an ASP.NET expert, but I think there are multiple ways you can achieve this:
1) As you said, you could make Node listen on a specific port for data and then react based on the data received (TCP).
2) You could make POST requests to Node.js (HTTP) and also send an auth key in the process to be extra secure. As in 1), Node would react to the data you send.
3) Use something like Redis for pub/sub: send messages from ASP.NET (pub) and receive them on the Node.js side (sub). This is even better if you want to scale your app across multiple machines, etc.; a sketch follows below.
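For illustration, the ASP.NET side of option 3 might look like the sketch below, using the StackExchange.Redis client; the channel name "events" is illustrative, and the Node.js process would subscribe to the same channel (e.g. with the node redis client's subscribe()).

using StackExchange.Redis;

public static class EventPublisher
{
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect("localhost:6379");

    // Fire-and-forget publish: every subscriber on the channel, including
    // the Node.js process, receives the payload immediately.
    public static void Publish(string payload)
    {
        Redis.GetSubscriber().Publish("events", payload);
    }
}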
The only good option I'm aware of so far is to create a special listener on a specific port on the node.js server and connect to that, but I was wondering if there's a more elegant or more efficient method?
You can look at the Redis pub/sub model, where the ASP.NET MVC application and Node.js communicate through separate channels in order to achieve full-duplex communication. Or you can also try CouchDB change notifications.
I also know that you could use a database in between but surely this would need to be polled and would be less efficient?
The former techniques do not require you to poll for changes; instead, they notify you when a change happens or a channel message arrives.
