Send reminder email with a windows service - asp.net-mvc

We have an ASP.NET MVC web app that we have installed for several clients/domains (more than 100). The web app works with an MS SQL database running on a Windows server. All the domains are hosted on the same server, and each client runs against a separate database on the same MS SQL Server instance.
Each client can create events for their users; on some days a single client could have more than 200 events.
We need to create a Windows service, running on the server that hosts all the clients, which sends reminder emails for each client's events every few hours.
As explained above, each client has its own database, and there is a common database that holds all the clients' information and their database names.
What the service has to do is check, for each client, the events they have and send the corresponding emails.
We don't know whether it is better to have one service that goes through all the clients, or a service per client. If we had just one service, we would have to run it, for example, every 6 hours to send those emails.
If we had a service per client, we would need to create each service from the ASP.NET web app; would that be possible?
What is the best approach for this?
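For the single-service option, a minimal sketch could look like the following. The table and column names (Clients, DatabaseName, Events, UserEmail), the connection strings and the SMTP host are assumptions standing in for your real schema and configuration; the query and email logic is only illustrative.

// Minimal sketch of the single-service approach: a timer in a Windows service reads
// the client list from the common database, then checks each client's own database
// for upcoming events and emails the reminders. All names below are assumptions.
using System;
using System.Data.SqlClient;
using System.Net.Mail;
using System.Timers;

public class ReminderService
{
    private readonly Timer _timer = new Timer(TimeSpan.FromHours(6).TotalMilliseconds);

    public void Start()
    {
        _timer.Elapsed += (s, e) => ProcessAllClients();
        _timer.Start();
    }

    private void ProcessAllClients()
    {
        // Common database with one row per client, including its database name.
        using (var common = new SqlConnection("Server=.;Database=CommonDb;Integrated Security=true"))
        {
            common.Open();
            var cmd = new SqlCommand("SELECT DatabaseName FROM Clients", common);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    SendRemindersForClient((string)reader["DatabaseName"]);
                }
            }
        }
    }

    private void SendRemindersForClient(string databaseName)
    {
        // Per-client events table; adjust the query to your real schema.
        using (var db = new SqlConnection("Server=.;Database=" + databaseName + ";Integrated Security=true"))
        {
            db.Open();
            var cmd = new SqlCommand(
                "SELECT UserEmail, EventName, EventDate FROM Events WHERE EventDate > GETDATE()", db);
            using (var reader = cmd.ExecuteReader())
            using (var smtp = new SmtpClient("smtp.example.com"))
            {
                while (reader.Read())
                {
                    smtp.Send("noreply@example.com", (string)reader["UserEmail"],
                        "Reminder: " + reader["EventName"],
                        "Your event is scheduled for " + reader["EventDate"]);
                }
            }
        }
    }
}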

You can use SQL Server Integration Services (SSIS) to do this job. All you need to do is write an SSIS package and schedule it.
On each execution of the package you can check the data and send the emails accordingly.
I hope this idea helps.

Related

Signalr connecting to multiple Web applications

We're planning on adding SignalR to several different web applications. The applications target different aspects of an order. When something happens to an order, all users working with that order, across all web applications, should be notified.
Changes to an order are available as a message on a service bus.
We could implement the following logic in all web applications:
Subscribe to a topic (one subscription per web app)
OnMessage -> send the orderId to the hub
The hub would notify clients working on that orderId
The question is: could we implement all this common functionality in a separate application, with all web apps referencing the same SignalR scripts?
All the applications live on the same domain, and it would give us a lot of benefit not to have to implement SignalR in every app.
Good idea, or am I missing something important here?
Edit: Put another way: I have WebAppA, WebAppB and WebAppC, all without SignalR. I'm asking whether it's possible to create a WebAppD that talks to the clients of WebApps A, B and C.
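To make the idea concrete, the shared piece in WebAppD could look roughly like the sketch below. OrderHub, the group naming and the OnOrderChanged entry point are assumptions; the service-bus subscription wiring is left to whatever client your bus already provides.

// Sketch of a central hub hosted in WebAppD (all names are assumptions).
// Browser clients in WebAppA/B/C join a group named after the order they are viewing;
// a single service-bus subscription in WebAppD forwards order changes to that group.
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class OrderHub : Hub
{
    // Called from the SignalR JavaScript client when a user opens an order.
    public Task WatchOrder(string orderId)
    {
        return Groups.Add(Context.ConnectionId, orderId);
    }
}

public class OrderChangeListener
{
    // Hook this up to the OnMessage callback of WebAppD's topic subscription.
    public void OnOrderChanged(string orderId)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<OrderHub>();
        hub.Clients.Group(orderId).orderChanged(orderId);
    }
}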
The second solution is very good. It will move the SignalR load (especially memory) from your main web apps to WebAppD (the SignalR web app), and your main web apps will not depend on SignalR.
Drawbacks: you don't have any authentication on WebAppD, because clients are authenticated on the other web apps. You also have to let WebAppD know about the orderId; that's why you have to send the message to the server (WebAppD) from the clients (JavaScript).
Because cross-domain access is enabled, anyone can send messages to the server; they don't even need to be connected to WebAppA, WebAppB or WebAppC. Even if you solve that (virtual paths etc.), someone who is connected but not authenticated on WebAppA, WebAppB or WebAppC can still send messages, because WebAppD just receives the message, doesn't know whether the client is authenticated, and so serves the message to all the others. In short: someone can send fake messages to other clients.
So you should share your authentication (or some other check) between your web apps and the SignalR web app.
Other than this I couldn't see any drawback.

How to automatically pull data from client's webservice

I am trying to find the best solution to build the application at work.
1) My client has a web service. I need an automated process that pulls data from the client's web service and stores it in a database. The data will be displayed in an ASP.NET MVC web application. When new data comes in, I want to notify the current user (something like a badge).
Question 1: I have two ideas for the process that pulls data from the client's web service:
A) Create a WCF service and host it as a Windows service
B) Create a console application and set up a scheduled task to run it
Question 2: I am thinking of using SignalR to notify users of the new data, but in the back end how do I get SignalR to check for data changes?
Thanks very much guys
Regards
Question 1: I have two ideas for the process that pulls data from the client's web service: A) Create a WCF service and host it as a Windows service B) Create a console application and set up a scheduled task to run it
Did you mean a WCF client? To pull data from the client's web service you need a WCF client, not a service. The rest looks OK to me.
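For the pulling side, the WCF client can be as small as the sketch below; IClientDataService, GetLatestData and the endpoint address are assumptions standing in for whatever contract the client's web service actually exposes.

// Minimal WCF client sketch (the contract and address are assumptions).
using System.ServiceModel;

[ServiceContract]
public interface IClientDataService
{
    [OperationContract]
    string GetLatestData();
}

public static class DataPuller
{
    public static string Pull()
    {
        var factory = new ChannelFactory<IClientDataService>(
            new BasicHttpBinding(),
            new EndpointAddress("http://example.com/ClientDataService.svc"));
        IClientDataService proxy = factory.CreateChannel();
        try
        {
            return proxy.GetLatestData();
        }
        finally
        {
            ((IClientChannel)proxy).Close();
            factory.Close();
        }
    }
}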
Question 2: I am thinking of using SignalR to notify users of the new data, but in the back end how do I get SignalR to check for data changes?
SignalR is just a means of sending notifications to clients. How you check for updates depends on the nature of the notification, so you should elaborate on that part to get better advice. A common way to do it is to have a notifications log table/queue which you check for updates; you process the new entries and send them to the clients. Don't forget to remove, or mark as processed, the notifications in the table.
Alternatively, you can host SignalR in a Windows service together with a WCF client and a timer that pulls data from the web service. You can then notify your web users directly from the Windows service. Please see this article: Tutorial: SignalR Self-Host.
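Roughly, that self-hosted variant could look like the sketch below. The hub name, the URL, the one-minute interval and the DataPuller call are assumptions, and the SignalR self-host (and CORS) packages from the linked tutorial are assumed to be installed.

// Sketch of SignalR self-hosted in a Windows service (names and URL are assumptions).
using System.Timers;
using Microsoft.AspNet.SignalR;
using Microsoft.Owin.Hosting;
using Owin;

public class NotificationHub : Hub { }

public class SelfHostStartup
{
    public void Configuration(IAppBuilder app)
    {
        app.MapSignalR();   // browsers connect to this endpoint instead of the MVC app
    }
}

public class PullAndNotifyService
{
    private readonly Timer _timer = new Timer(60000);   // pull once a minute

    public void Start()
    {
        WebApp.Start<SelfHostStartup>("http://localhost:8080/");
        _timer.Elapsed += (s, e) =>
        {
            string data = DataPuller.Pull();             // the WCF client sketched above
            // ...store the data in the database, then push a notification:
            var hub = GlobalHost.ConnectionManager.GetHubContext<NotificationHub>();
            hub.Clients.All.newData(data);
        };
        _timer.Start();
    }
}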
Hope it helps!

Using SignalR with Azure Table Storage - What architecture?

I have a smart grid system where multiple hardware devices are sending raw sensor data to an Azure Queue. Each device sends a single data packet once every minute. Multiple Worker Roles process the data packets on the queue and push the data to Table Storage. I have a Web Role which holds the application for users to view their device data and a host of other alerts and messages relating to their smart energy system. At the moment the web application just uses ajax polling at one-minute intervals to get the latest data updates and any other messages and alerts. Instead of using ajax 'pulling', I'd like to use SignalR and 'push' the updates from the cloud when they become available. I'm not sure what the overall architecture should look like.
So far I have added a SignalR Hub to my Web Role, just to see if I could do that. And it works fine. However, how do I trigger updates from this Hub when there are changes in Table Storage? Should I host the Hub with the Worker Roles that process the raw data, and then make a cross-domain SignalR connection from the web app (client)? Can I even associate an endpoint with a Worker Role? If I have many Worker Roles wouldn't I only be able to connect to one of them, and therefore miss data updates from other Worker Roles?
Perhaps I should create a separate Web Role to host the SignalR hub, but then how do I communicate the changes from the Worker Roles that process the raw data to the hub? Maybe I need to include another Azure Queue that takes messages from the Worker Roles regarding data updates, alerts, and any other messaging, and that queue is processed by the SignalR server. However, would this approach be scalable? If I have multiple instances of the SignalR server processing the message queue(s), would they share the same end point and be aware of all the client connections across the instances? Or maybe the Worker Roles themselves connect as clients to the SignalR server and the messages forwarded from there to the clients.
Is SignalR even the right approach to take if data is being generated at a predictable rate of once every minute for each device? Maybe for these regular data updates ajax 'pulling' is the best approach, and I should just be using SignalR for the infrequent alerts and messages, although, again, how do I communicate these events from the Worker Roles to the SignalR server?
What overall architecture would suit my needs here?
EDIT 06-09-2014 Half the problem solved
I came across http://www.asp.net/signalr/overview/signalr-20/performance-and-scaling/scaleout-with-windows-azure-service-bus which seems to be exactly what I am after. This deals with the problem of multiple Hub server (Web Role) instances. Now I just need a SignalR client library that can run on the Worker Roles so that they can notify the Hub that new data is available, and the Hub class can then be enhanced to route the new data to the appropriate connected web clients.
EDIT 06-10-2014 A workable solution found
I have added an answer to my question of "What architecture". I thought a quick summary of my setup might be useful. I have many remote devices associated with different users posting real-time data to Azure Queues. The data posted to these queues are parsed and saved to Table Storage, by a number of Worker Roles. Web Roles provide the MVC5 web application for the users (clients) to log on and review their data. I wanted a mechanism by which when new data was posted, any connected clients would receive a real-time notification (and data tables and charts in the client apps could be updated accordingly). SignalR with Service Bus scaleout proved to be the answer.
The first part of the solution I needed was to deploy a SignalR hub that the clients could connect to, to receive any notifications sent. I couldn't use the basic SignalR solution as the MVC5 web app is hosted on a Web Role that will likely have more than one instance - the problem was how to keep all these instances synced so that whatever instance a client was connected to they'd still receive the notifications. SignalR scaleout with Azure Service Bus proved to be the answer to that part of the problem. Details of how to set this up can be found at: http://www.asp.net/signalr/overview/signalr-20/performance-and-scaling/scaleout-with-windows-azure-service-bus - it was VERY easy to set up.
The second part of the problem was how to generate the notifications originating from the Worker Roles (my queue data processors). First I needed to be able to host OWIN in my Worker Roles - the instructions provided at http://www.asp.net/aspnet/overview/owin-and-katana/host-owin-in-an-azure-worker-role were more than sufficient. Once this was done I created an empty Hub instance with the same name as the one deployed on my Web Roles (it was empty because I didn't expect to have any clients connected to it directly), and changed the Startup class to:
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        String connectionString = "[Service Bus Connection String]";
        GlobalHost.DependencyResolver.UseServiceBus(connectionString, "[App Name]");
        app.MapSignalR();
    }
}
With this in place, if I want to send a notification out to the clients from the Worker Roles, I do something like:
var context = GlobalHost.ConnectionManager.GetHubContext<MyHub>();
context.Clients.All.clientMethod("[Message]");
What really happens is that a copy of the message gets pushed to the backplane (Service Bus), picked up by the Web Roles and pushed out to the connected clients. In reality I will check who is online (in the Web Role Hub instance I override the OnConnected method to save the user's connection id in their profile, which is stored in Table Storage), and only create notifications that are associated with those users, to reduce SignalR traffic.
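For reference, that OnConnected override could look roughly like the sketch below; ProfileStore and SaveConnectionId are assumed stand-ins for the Table Storage profile code, which isn't shown in the post.

// Sketch of the Web Role hub tracking who is online (ProfileStore is an assumed stub
// for the real code that writes the connection id to the user's profile in Table Storage).
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class MyHub : Hub
{
    public override Task OnConnected()
    {
        string userName = Context.User.Identity.Name;   // the logged-on web user
        ProfileStore.SaveConnectionId(userName, Context.ConnectionId);
        return base.OnConnected();
    }
}

public static class ProfileStore
{
    // Stub: in the real app this writes to the user's profile row in Table Storage.
    public static void SaveConnectionId(string userName, string connectionId) { }
}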

Communicate between azure websites

I'm currently hosting 2 websites (clientportal and admin) in Windows Azure Websites.
I'm going to introduce SignalR now, to get rid of JavaScript polling. What I need is that when an admin sends a broadcast message on the admin site, all the active clients should receive it. At the moment the client polls the web server all the time; when an ajax poll arrives, the web server checks for new messages in the database.
What would be the best way to notify the client web server from the admin web server? Is Web API OK for this? Or is there a simpler way?
From the point of view of Azure Websites and SignalR, it is transparent who is the Client and who is the Admin. This is an implementation detail, and thus, there are many ways to do it.
SignalR is split in two parts: The javascript library, and the .NET library that you use to create your "Hub". One way to accomplish what you are trying to do is to implement the Hub functionality in your Admin back-end using the .NET side of SignalR and ASP.NET Web API; and use the javascript side of SignalR to subscribe and listen for notifications on the Client side.
Another way to do it is to create a third component, just for the SignalR Hub, independent of both the Client and the Admin websites. In this scenario both the Admin and the Client would subscribe to the Hub, but only the Admin would push, and the Client would listen.
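As a rough illustration of the first option, the admin site could host the hub and push broadcasts from an action like the one below; BroadcastHub, the controller and the showMessage client method are assumed names, and the client portal's pages would connect to this hub with the SignalR JavaScript client.

// Sketch of the first option: the hub lives in the Admin site (names are assumptions).
using System.Web.Mvc;
using Microsoft.AspNet.SignalR;

public class BroadcastHub : Hub { }

public class AdminMessagesController : Controller
{
    [HttpPost]
    public ActionResult Broadcast(string message)
    {
        // Instead of writing to the database and waiting for the next poll,
        // push the message to every connected client immediately.
        var hub = GlobalHost.ConnectionManager.GetHubContext<BroadcastHub>();
        hub.Clients.All.showMessage(message);
        return new HttpStatusCodeResult(204);
    }
}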
You can find many working examples on the internet. This is one of them: http://msdn.microsoft.com/en-us/magazine/hh965663.aspx
I hope this answers your questions.

Creating Scoped Job Requests For Quickbooks Web Connector QBWC?

I'm developing a web application that communicates with many different Web Connectors, sometimes simultaneously.
The problem I'm running into is that I have a single, global job queue on the server that all Web Connectors are polling from.
Is there any way to create an XML job request that specifies which Web Connector should run a particular job? I'm wondering if the OwnerID tag could be used to match a job to a specific local .qwc configuration? Or possibly FileID? Beyond these two variables, I can't see any other way to influence whether the Web Connector decides to run a specific job or not.
I'm trying to avoid having each individual Web Connector run every single job on the queue, whether it was intended for them or not.
Thanks!!
The Web Connector itself doesn't have any logic like this - it's up to your SOAP server implementation to only feed the correct requests to the Web Connector.
This is what the username parameter in the .QWC files/Web Connector is for.
If you have a single username, everything gets sent to just a single Web Connector.
If you have multiple usernames, then you specify which username each request is queued up under, and only the Web Connector whose .QWC file contains that username will run the items that were queued up for that username.
When you create your .QWC files, use the corresponding usernames in the .QWC files.
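A sketch of what that username scoping might look like on the SOAP-server side is below. The job-store types, the simplified method signatures and the helper names are assumptions (the real Web Connector callbacks take more parameters than shown); the idea is just that authenticate maps the ticket to a username and sendRequestXML only dequeues that username's jobs.

// Sketch of scoping the job queue by Web Connector username (types and signatures
// are simplified assumptions, not the exact QBWC SOAP contract).
using System;
using System.Collections.Concurrent;

public class ScopedJobServer
{
    // ticket -> username for the current Web Connector sessions
    private static readonly ConcurrentDictionary<string, string> Sessions =
        new ConcurrentDictionary<string, string>();

    // username -> queued qbXML requests, filled by the application when work is created
    private static readonly ConcurrentDictionary<string, ConcurrentQueue<string>> JobsByUser =
        new ConcurrentDictionary<string, ConcurrentQueue<string>>();

    public string[] Authenticate(string userName, string password)
    {
        // ...validate the password here...
        string ticket = Guid.NewGuid().ToString();
        Sessions[ticket] = userName;
        ConcurrentQueue<string> jobs;
        bool hasWork = JobsByUser.TryGetValue(userName, out jobs) && !jobs.IsEmpty;
        // "" = process against the currently open company file, "none" = nothing to do.
        return new[] { ticket, hasWork ? "" : "none" };
    }

    public string SendRequestXML(string ticket)
    {
        string userName;
        if (!Sessions.TryGetValue(ticket, out userName)) return "";

        ConcurrentQueue<string> jobs;
        string qbXml;
        if (JobsByUser.TryGetValue(userName, out jobs) && jobs.TryDequeue(out qbXml))
            return qbXml;   // next job queued for *this* username only

        return "";          // nothing queued for this Web Connector
    }
}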
