I have a smart grid system where multiple hardware devices send raw sensor data to an Azure Queue. Each device sends a single data packet once every minute. Multiple Worker Roles process the data packets on the queue and push the data to Table Storage. I have a Web Role which holds the application where users view their device data and a host of other alerts and messages relating to their smart energy system. At the moment the web application just uses ajax polling at one-minute intervals to get the latest data updates and any other messages and alerts. Instead of this ajax 'pulling', I'd like to use SignalR and 'push' the updates from the cloud when they become available. I'm not sure what the overall architecture might look like.
So far I have added a SignalR Hub to my Web Role, just to see if I could do that. And it works fine. However, how do I trigger updates from this Hub when there are changes in Table Storage? Should I host the Hub with the Worker Roles that process the raw data, and then make a cross-domain SignalR connection from the web app (client)? Can I even associate an endpoint with a Worker Role? If I have many Worker Roles wouldn't I only be able to connect to one of them, and therefore miss data updates from other Worker Roles?
Perhaps I should create a separate Web Role to host the SignalR hub, but then how do I communicate the changes from the Worker Roles that process the raw data to the hub? Maybe I need to include another Azure Queue that takes messages from the Worker Roles regarding data updates, alerts, and any other messaging, and that queue is processed by the SignalR server. However, would this approach be scalable? If I have multiple instances of the SignalR server processing the message queue(s), would they share the same endpoint and be aware of all the client connections across the instances? Or maybe the Worker Roles themselves connect as clients to the SignalR server and the messages are forwarded from there to the clients.
Is SignalR even the right approach to take if data is being generated at a predictable rate of once every minute for each device? Maybe for updates of this regular data ajax 'pulling' is the best approach, and I should just be using SignalR for the infrequent alerts and messages - although, again, how do I communicate these events from the Worker Roles to the SignalR server?
What overall architecture would suit my needs here?
EDIT 06-09-2014 Half the problem solved
I came across http://www.asp.net/signalr/overview/signalr-20/performance-and-scaling/scaleout-with-windows-azure-service-bus which seems to be exactly what I am after. This deals with the problem of multiple Hub server (Web Role) instances. Now I just need a SignalR client library that can run on the Worker Roles so that they can notify the Hub that new data is available, and the Hub class can then be enhanced to route the new data to the appropriate connected web clients.
EDIT 06-10-2014 A workable solution found
I have added an answer to my question of "What architecture?". I thought a quick summary of my setup might be useful. I have many remote devices associated with different users posting real-time data to Azure Queues. The data posted to these queues is parsed and saved to Table Storage by a number of Worker Roles. Web Roles provide the MVC5 web application for the users (clients) to log on and review their data. I wanted a mechanism by which, when new data was posted, any connected clients would receive a real-time notification (and data tables and charts in the client apps could be updated accordingly). SignalR with Service Bus scaleout proved to be the answer.
The first part of the solution I needed was to deploy a SignalR hub that the clients could connect to in order to receive any notifications sent. I couldn't use the basic SignalR solution as the MVC5 web app is hosted on a Web Role that will likely have more than one instance - the problem was how to keep all these instances in sync so that, whatever instance a client was connected to, they'd still receive the notifications. SignalR scaleout with Azure Service Bus proved to be the answer to that part of the problem. Details of how to set this up can be found at: http://www.asp.net/signalr/overview/signalr-20/performance-and-scaling/scaleout-with-windows-azure-service-bus - it was VERY easy to set up.
The second part of the problem was how to generate the notifications originating from the Worker Roles (my queue data processors). First I needed to be able to host OWIN in my Worker Roles - the instructions provided at http://www.asp.net/aspnet/overview/owin-and-katana/host-owin-in-an-azure-worker-role were more than sufficient. Once this was done I created an empty Hub instance with the same name as the one deployed on my Web Roles (it was empty because I didn't expect to have any clients connected to it directly), and changed the Startup class to:
using Microsoft.AspNet.SignalR;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        string connectionString = "[Service Bus Connection String]";

        // Register the Service Bus backplane before mapping SignalR.
        GlobalHost.DependencyResolver.UseServiceBus(connectionString, "[App Name]");
        app.MapSignalR();
    }
}
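For reference, the hub class on the Worker Role side can be completely empty; it only has to carry the same name as the hub the Web Roles expose (MyHub here) so that messages published through it end up on the same backplane channel:

using Microsoft.AspNet.SignalR;

// Empty hub: no clients connect to the Worker Role directly. It only exists
// so the Worker Role can publish through the same hub context as the Web Roles.
public class MyHub : Hub
{
}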
With this in place, if I want to send a notification out to the clients from the Worker Roles, I do something like:
var context = GlobalHost.ConnectionManager.GetHubContext<MyHub>();
context.Clients.All.clientMethod("[Message]");
What really happens is that a copy of the message gets pushed to the backplane (Service Bus), is picked up by the Web Roles, and is pushed out to the connected clients. In reality I will check who is online (in the Web Role Hub instance I override the OnConnected method to save the user's connection id in their profile, which is stored in Table Storage), and only create notifications that are associated with those users, to reduce SignalR traffic.
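On the Web Role side, a rough sketch of that targeting logic might look like the following (the SaveConnectionId helper is an assumption standing in for the real Table Storage code, and clientMethod is the same client callback used above):

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class MyHub : Hub
{
    public override Task OnConnected()
    {
        // Remember which connection belongs to which user so the Worker Roles
        // can target notifications at online users only.
        SaveConnectionId(Context.User.Identity.Name, Context.ConnectionId);
        return base.OnConnected();
    }

    // Hypothetical helper: persists (user name, connection id) to the user's
    // profile entity in Table Storage.
    private static void SaveConnectionId(string userName, string connectionId)
    {
        // Insert or update the profile entity here.
    }
}

A Worker Role can then notify just that user's connection rather than broadcasting:

var context = GlobalHost.ConnectionManager.GetHubContext<MyHub>();
context.Clients.Client(connectionId).clientMethod("[Message]");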
Related
I currently have a live chat website set up that hits a subdomain responsible for handling SignalR interactions:
www.domain1.com > chat.domain1.com
I want to introduce a second, largely identical site using the same structure:
www.domain2.com > chat.domain2.com
Both sites will use the same database, which stores all persistent SignalR related things like connections, chat rooms, chat messages, etc.
Is it possible for both SignalR chat subdomains to communicate with clients connected to the other subdomain? While the shared database means that persistent resources are shared, I need to make it so that when I publish an event on chat.domain1.com, clients connected to both chat.domain1.com and chat.domain2.com receive it.
It appears that it is common to handle this by having both sites share the same chat subdomain and using CORS to handle the cross-domain interactions, like so:
www.domain1.com > chat.domain1.com
www.domain2.com > chat.domain1.com
I can't do this as the SignalR chat endpoints authenticate using cookies set on the main www domain. Those cookies can't be shared cross-domain and even if they could, it's a requirement that a user has the ability to be logged in simultaneously to different accounts on domain1.com and domain2.com on the same machine.
So, are there any approaches I can use to share connections between these two hubs? Both chat subdomains are hosted on the same server.
Typically a backplane is used when your app is being scaled out across multiple servers; however, it also appeared to work in this situation. I used SQL Server for the backplane, but there are also packages to get this working with Redis and Azure Service Bus.
Introduction to Scaleout in SignalR
SignalR Scaleout with SQL Server
First install the package for SQL Server:
Install-Package Microsoft.AspNet.SignalR.SqlServer
Then configure the backplane:
using Microsoft.AspNet.SignalR;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Any connection or hub wire-up and configuration should go here.
        string sqlConnectionString = "Connection string to your SQL DB";
        GlobalHost.DependencyResolver.UseSqlServer(sqlConnectionString);
        app.MapSignalR();
    }
}
This allows SignalR to use SQL Server to persist any messages it needs to distribute. The first time it starts up, it creates a few tables in the database and you're good to go.
Because both apps share the same database, this works perfectly.
We're planning on adding SignalR to several different web applications. The applications target different aspects of an order. When something happens to an order, all users working with that order across all web applications should be notified.
Changes to an order are available as a message on a service bus.
We could implement the following logic in all web applications (a rough sketch follows the list):
Subscribe to a topic (one subscription per webapp)
OnMessage -> Send orderId to hub
Hub would notify clients working on the orderId
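Sketched in code, and assuming the order id travels in the message body, that the hub is called OrderHub, and that clients join a group per order (all of those names are placeholders, not from the question):

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;
using Microsoft.ServiceBus.Messaging;

// Clients call JoinOrder for the order they are currently working on.
public class OrderHub : Hub
{
    public Task JoinOrder(string orderId)
    {
        return Groups.Add(Context.ConnectionId, orderId);
    }
}

public class OrderChangeListener
{
    public void Start(string serviceBusConnectionString)
    {
        // One subscription per web app on the shared topic
        // ("order-changes" and "webapp-a" are placeholder names).
        var client = SubscriptionClient.CreateFromConnectionString(
            serviceBusConnectionString, "order-changes", "webapp-a");

        client.OnMessage(message =>
        {
            var orderId = message.GetBody<string>();

            // Notify everyone currently working on this order.
            var hub = GlobalHost.ConnectionManager.GetHubContext<OrderHub>();
            hub.Clients.Group(orderId).orderChanged(orderId);
        });
    }
}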
The question is: could we implement all this common functionality in a separate application, with all web apps referencing the same SignalR scripts?
All applications live on the same domain, and it would benefit us a lot not to have to implement SignalR in every app.
Good idea, or am I missing something important here?
Edit: Put another way: I have WebAppA, WebAppB and WebAppC, all without SignalR. I'm asking if it's possible to create a WebAppD that talks to clients in WebApps A, B and C.
The second solution is very good. It will move the SignalR load (especially memory) from your main web apps to WebAppD (the SignalR web app), and your main web apps will not be dependent on SignalR.
Drawbacks: You don't have any authentication on WebAppD, because clients are authenticated on the other web apps. You also have to let WebAppD know about the orderId, which is why you have to send the message to the server (WebAppD) from the clients (JavaScript).
Because cross-domain settings are enabled, anyone can send messages to the server; they don't even need to be connected to WebAppA, WebAppB or WebAppC. Even if you solve this problem (virtual paths etc.), someone who is connected but not authenticated on WebAppA, WebAppB or WebAppC can still send messages, because WebAppD just receives the message and doesn't know whether the client is authenticated, so it will serve the message to all the others. In short: someone can send fake messages to other clients.
So you should share your authentication (or some other validation logic) between your web apps and the SignalR web app.
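One possible way to do that (an assumption on my part, not something spelled out above) is to have WebAppA/B/C issue a signed token, pass it in the SignalR query string, and have WebAppD validate it before accepting the connection:

using Microsoft.AspNet.SignalR;
using Microsoft.AspNet.SignalR.Hubs;

// Applied to the hub(s) in WebAppD, e.g. [TokenAuthorize] on the hub class.
public class TokenAuthorizeAttribute : AuthorizeAttribute
{
    public override bool AuthorizeHubConnection(HubDescriptor hubDescriptor, IRequest request)
    {
        // The token is expected in the connection query string,
        // e.g. $.connection.hub.qs = { token: "..." } on the JavaScript side.
        var token = request.QueryString["token"];
        return IsValidToken(token);
    }

    // Hypothetical helper: verify the token's signature and expiry using a
    // secret shared between WebAppA/B/C and WebAppD.
    private static bool IsValidToken(string token)
    {
        return !string.IsNullOrEmpty(token);
    }
}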
Other than this I couldn't see any drawback.
I am trying to find the best solution to build the application at work.
1) My client has a web service. I need an automated process to pull data from the client's web service and store it in a database. The data will be displayed in an ASP.NET MVC web application. Once new data comes in, I want to notify the current user (something like a badge).
Question 1: I have two ideas for the process to pull data from the client's web service:
A) Create a WCF service and host it as a Windows Service
B) Create a console application and set up a scheduled task to run it
Question 2: I am thinking of using SignalR to notify users of the new data, but in the backend how do I get SignalR to check for data changes?
Thanks very much guys
Regards
Question 1: I have two ideas for the process to pull data from the client's web service: A) Create a WCF service and host it as a Windows Service, or B) Create a console application and set up a scheduled task to run it.
Did you mean a WCF client? To pull data from the client's web service you need a WCF client, not a service. The rest looks OK to me.
Question 2: I am thinking of using SignalR to notify users of the new data, but in the backend how do I get SignalR to check for data changes?
SignalR is just a means to send notifications to clients. The way you check for updates depends on the nature of the notification, so you'd better elaborate on this part to get better advice. A common way to do it is to have a notifications log table/queue which you can check for updates, process, and send to the clients. Don't forget to remove or mark processed notifications in the table.
Alternatively, you can host SignalR in a Windows service with a WCF client and a timer to pull data from the web services. You can notify your web users directly from the Windows service. Please see this article: Tutorial: SignalR Self-Host.
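A rough sketch of the notifications-table-plus-timer idea (the NotificationHub, the Notification type, and the two data-access helpers are all assumptions for illustration):

using System;
using System.Collections.Generic;
using System.Threading;
using Microsoft.AspNet.SignalR;

public class NotificationHub : Hub { }

public class Notification
{
    public string Message { get; set; }
}

public class NotificationPump
{
    private Timer _timer;

    public void Start()
    {
        // Check the notifications table every 30 seconds (the interval is arbitrary).
        _timer = new Timer(_ => PushPendingNotifications(), null,
                           TimeSpan.Zero, TimeSpan.FromSeconds(30));
    }

    private void PushPendingNotifications()
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<NotificationHub>();

        foreach (var notification in GetUnprocessedNotifications())
        {
            // "notify" is a placeholder for the client-side callback.
            hub.Clients.All.notify(notification.Message);
            MarkProcessed(notification);
        }
    }

    // Hypothetical data-access helpers over the notifications log table.
    private static IEnumerable<Notification> GetUnprocessedNotifications()
    {
        return new List<Notification>();
    }

    private static void MarkProcessed(Notification notification) { }
}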
Hope it helps!
I have a scenario where a request is sent to a service via my client, and the response arrives in a message queue in Azure. How can I poll the queue at the client end and update the view when the response comes? Say I have to update a label when data is received in the queue.
Azure has two types of queues: Azure Queues and Service Bus Queues. Although in theory you can access them from the client side (I assume JavaScript), since CORS support was introduced some time ago (not sure about CORS support for Service Bus Queues), this might not be the best option.
Problems you might face:
Lots of clients trying to process messages (locking and releasing); Azure Queues do not support sessions, so you would have to either create a queue per client or use a Service Bus Queue with sessions (as I said earlier, not sure about CORS there)
What should happen when your client is not online anymore? Does the message stay in the queue? Until when? Expiration?
Different approach
You can do the message processing on the server and only notify the user about the change using SignalR. This gives you much better flexibility (one message can trigger notifications for many users, etc.).
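As a minimal sketch of that server-side approach, assuming the classic WindowsAzure.Storage SDK and placeholder names for the queue, hub, and client callback:

using Microsoft.AspNet.SignalR;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class ResponseHub : Hub { }

public class ResponseProcessor
{
    public void ProcessNext(string storageConnectionString)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var queue = account.CreateCloudQueueClient().GetQueueReference("responses");

        var message = queue.GetMessage();
        if (message == null) return;

        // Push the result to the browser instead of having the browser poll the
        // queue; "updateLabel" is the client-side callback that updates the label.
        var hub = GlobalHost.ConnectionManager.GetHubContext<ResponseHub>();
        hub.Clients.All.updateLabel(message.AsString);

        queue.DeleteMessage(message);
    }
}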
SignalR Scaleout with Azure Service Bus
Using SignalR with Azure Table Storage - What architecture?
I'm developing an application that has various types of Notifications. Examples of notifications:
Message Created
Listing Submitted
Listing Approved
I'd like to tie all of these up to SignalR so that any connected clients get updates in real-time.
As far as architecture goes - right now the application is entirely within a single solution hosted on an Azure Website. The triggers for each of these notification types live within this application.
When a trigger is hit, I'd like to tell SignalR, "Hey, send this message to the following clients" along with a list of userIds. I'm assuming that it's possible to identify connected clients based on userId... and I'm assuming that the process of sending messages to clients should be executed outside of the web application, so as not to slow down the MVC app or risk losing data in a broken async call. First question - are these assumptions correct?
Assuming so, this means that I'll need something like a dedicated web/worker role to be sending messages to clients. I could pass messages from my web application directly to this process, but what happens if the process dies? The resiliency concerns lead me to believe that the proper way to pass messages would be via a queue of some sort. Second question - is this a valid train of thought?
Assuming so, this means I could use a good ol' Azure SQL database as a queue, but it seems like there are specialized (and maybe cheaper) services to handle message queueing, such as this:
http://www.windowsazure.com/en-us/develop/net/how-to-guides/queue-service/
Third question: Should this be used as a queueing mechanism for SignalR? I'm interested in using Redis for caching in the future... would Redis be better or worse than the queue service?
Final Question:
I've attempted to illustrate my proposed architecture in a diagram.
What I'm most unclear on here is how the MVC app will know when to queue, or how the SignalR processes will know when to broadcast. Should the MVC app queue blindly, without caring about connected clients? This seems to introduce a lot of wasted space on the queue, and wasted cycles in the worker roles, since a very small percentage of clients will ever be connected.
The only other approach I can think of is to somehow give the MVC app visibility into the SignalR processes to see if the client is connected... and if they are, then Enqueue. This makes me uncomfortable though because it means I have to hit that red line on the diagram for every trigger that gets hit, which - even if done async - gets me worrying about performance and reliability.
What is the recommended architecture for scalable, performant SignalR message broadcasting? Performance is top priority, followed closely by cost.
Bonus question:
What if some messages are of higher priority than others? Should two queues be used, one of which always gets checked before the other?
If you want to target some users, you'll have to come up with a mechanism. Off the top of my head I can give an example: if a user hits a page, you can create a group for that page and push to all users in that group/on that page.
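A short sketch of that group-per-page idea (hub, method, and callback names are placeholders):

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class PageHub : Hub
{
    // Called by the client when a page is opened.
    public Task JoinPage(string pageName)
    {
        return Groups.Add(Context.ConnectionId, pageName);
    }

    // Push an update to everyone currently on that page.
    public void NotifyPage(string pageName, string message)
    {
        Clients.Group(pageName).pageUpdated(message);
    }
}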
It's not clear to me why you need the queues. Usually users subscribe to some events when hitting a page or through some action like joining a chat room, and the server pushes data using those events/functions when appropriate.
For scalability, you can run SignalR on multiple servers, in which case you should use SQL Server, Service Bus, or Redis as a backplane.
Firstly, you need to create a SignalR server to which all the users can connect. This SignalR server can be created in either the web role or a worker role. If you have a huge user base, it's better to create the SignalR server in a separate role.
Then, wherever the trigger is hit and you want to send messages to users, you create a SignalR client (.NET or JavaScript) and connect to the SignalR server. You can then send the message to the SignalR server, which in turn will broadcast it to all the other connected users. After that you can disconnect from the SignalR server. This way you don't have to use queues to communicate with the SignalR role.
Also, to send messages to specific users, you can store the connection IDs along with their user IDs in a table (Azure Table Storage should do) when they connect to the SignalR server. Then, using the connection ID, you can send messages to a specific user.
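A rough sketch of that flow from the trigger side, using the .NET client package (Microsoft.AspNet.SignalR.Client); the server URL, hub name and SendToUser method are placeholders, and SendToUser on the server side would look up the stored connection ID for the user:

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR.Client;

public class SignalRNotifier
{
    public async Task NotifyUserAsync(string userId, string message)
    {
        var connection = new HubConnection("http://your-signalr-role/");
        var hubProxy = connection.CreateHubProxy("NotificationHub");

        await connection.Start();

        // The server-side hub method looks up the user's stored connection id
        // in Table Storage and forwards the message to that client only.
        await hubProxy.Invoke("SendToUser", userId, message);

        connection.Stop();
    }
}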