We run a number of ASP.NET MVC sites - one site per client instance with about 50 on a server. Each site has its own configuration/database/etc.
Each site might be running on a slightly different version of our application depending on where they are in our maintenance schedule.
At the moment we use Quartz.NET for background processing, which runs in the app domain of the website. This mostly works well, but it obviously suffers from problems such as not running when the app domain shuts down, e.g. after prolonged inactivity.
What are our options for creating a more robust solution?
Windows Services are being discussed, but I don't know how we could achieve the same multi-site, multi-version isolation that we get within IIS.
I really want each IIS site to have its own background processing which always runs like a service but is isolated in the same way an IIS site is.
You could install a windows service for each client. That seems like a maintenance headache though.
You could also write a single service that understands all versions of your app, then processes each client one after the other.
You could also create SQL Server jobs to do this, and set up one job for each customer.
I assume you have a separate database for each client, as you mentioned.
Create a custom config file with a DB connection string for each client.
e.g.
<?xml version="1.0" encoding="utf-8" ?>
<MyClients>
  <MyClient name="Client1">
    <Connection key="Client1" value="data source=<database>;initial catalog=client1DB;persist security info=True;user id=xxxx;password=xxxx;MultipleActiveResultSets=True;App=EntityFramework" />
  </MyClient>
  <MyClient name="Client2" .....></MyClient>
</MyClients>
Write a Windows Service which loads the above config file (client + connection string pairs) into memory and iterates through each client's database config. I assume all clients have a similar job-queuing infrastructure, so you can execute the same logic against each database.
You can use a named Mutex per client so that two instances of the service never process the same client at the same time.
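A minimal sketch of that loop, assuming the config format above (ClientJobRunner and ProcessJobs are placeholder names for your own job logic):

using System;
using System.Linq;
using System.Threading;
using System.Xml.Linq;

public class ClientJobRunner
{
    public void RunAll(string configPath)
    {
        // Read each client's connection string from the custom config file.
        var clients = XDocument.Load(configPath)
            .Descendants("Connection")
            .Select(c => new
            {
                Key = (string)c.Attribute("key"),
                ConnectionString = (string)c.Attribute("value")
            })
            .ToList();

        foreach (var client in clients)
        {
            // The Global\ prefix makes the mutex visible machine-wide, so a
            // second service instance skips clients already being processed.
            using (var mutex = new Mutex(false, @"Global\JobRunner_" + client.Key))
            {
                if (!mutex.WaitOne(TimeSpan.Zero))
                    continue; // another instance owns this client right now

                try { ProcessJobs(client.ConnectionString); }
                finally { mutex.ReleaseMutex(); }
            }
        }
    }

    private void ProcessJobs(string connectionString)
    {
        // Open the client database and execute its queued jobs here.
    }
}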
In my job we are building Web Apps that rely on a common Enterprise class. This class has a method that sends a request to our server every time the app_start or app_end event triggers, so we can monitor the status remotely. But we now require the web app to report its status at least once a day, a bit like telemetry. I don't know how to accomplish this. So far I have found some options, but each has limitations:
Use Hangfire. I don't like this since it requires setting up a database (or adding more tables) and installing a new NuGet package on each project, but it could be my last option.
Use a Windows Service that reads the databases. This could be less work, but it can't access the Web App's web.config.
Use a JavaScript task that sends an AJAX request. This requires an open web browser, which is a big risk.
I'm looking for a server-side approach that can trigger an event or function at 1 AM.
I would go with Hangfire.
It is dead easy to set up and very reliable.
You don't need to set up a database; you might want to check the memory storage provider:
https://github.com/perrich/Hangfire.MemoryStorage
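For example, a minimal sketch of the daily 1 AM report using Hangfire with that memory storage package (ReportStatus is a placeholder for your Enterprise class's reporting call; note that memory storage loses state on restart, which is acceptable here because the schedule is re-registered at startup):

using Hangfire;
using Hangfire.MemoryStorage;

public static class TelemetryScheduler
{
    private static BackgroundJobServer _server;

    // Call this from Application_Start in Global.asax.
    public static void Start()
    {
        GlobalConfiguration.Configuration.UseMemoryStorage();
        _server = new BackgroundJobServer();

        // Cron.Daily(1) fires at 01:00 server time.
        RecurringJob.AddOrUpdate("daily-status", () => ReportStatus(), Cron.Daily(1));
    }

    public static void ReportStatus()
    {
        // Send the status/telemetry request to your monitoring server here.
    }
}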
Also check:
What is the equivalent to CRON jobs in ASP.NET? - C#
You can use FluentScheduler instead of Hangfire (it is more lightweight); see the sketch below.
Instead of a Javascript task that sends an AJAX request you can use a WebJob or an Azure Function.
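A rough FluentScheduler version of the same daily 1 AM job (assuming the Registry/JobManager API of recent FluentScheduler releases; StatusReporter.Send is a placeholder):

using FluentScheduler;

public class TelemetryRegistry : Registry
{
    public TelemetryRegistry()
    {
        // Run once a day at 01:00.
        Schedule(() => StatusReporter.Send()).ToRunEvery(1).Days().At(1, 0);
    }
}

public static class StatusReporter
{
    public static void Send()
    {
        // Send the status report here.
    }
}

// In Application_Start:
// JobManager.Initialize(new TelemetryRegistry());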
IIS has to start the application again after I rebuild the website; it takes a very long time and cuts into productivity.
I am wondering if I can somehow replace the MVC website with a small console application that listens on a port and returns a string that the browser then interprets as valid HTML. I am not sure if this has been done before.
Something very lightweight.
So that it does not take 30-50 seconds on each rebuild, to see my site in action.
I want to build my app, then immediately do a request and test it and not wait almost a minute.
Is there something that does that?
There is a way to build self-hosted Web API applications.
They can run as a console exe, or be set up to run as a service.
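Here is a minimal sketch using the OWIN self-host package (Microsoft.AspNet.WebApi.OwinSelfHost); the controller, route and port are just placeholders:

using System;
using System.Web.Http;
using Microsoft.Owin.Hosting;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var config = new HttpConfiguration();
        config.Routes.MapHttpRoute(
            name: "Default",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional });
        app.UseWebApi(config);
    }
}

public class HelloController : ApiController
{
    // GET http://localhost:9000/api/hello
    public string Get()
    {
        return "Hello from the self-hosted API";
    }
}

class Program
{
    static void Main()
    {
        // No IIS involved, so a rebuild + restart is nearly instant.
        using (WebApp.Start<Startup>("http://localhost:9000/"))
        {
            Console.WriteLine("Listening on http://localhost:9000/ - press Enter to quit");
            Console.ReadLine();
        }
    }
}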
The ASP.NET engine is designed to compile itself (views, etc.), but a whole minute? That might be down to a slow PC, for instance.
You could use the Browser control in a Windows Forms app, but it is not recommended; there is no way of avoiding the IIS restart, since your files have changed.
I will develop and host an e-commerce website based on ASP.NET MVC4 (with several SQL Server jobs).
I am thinking of using Azure in order to stay in Microsoft's world and avoid dedicated server management.
The shared Web Site package with 1 site / 5 GB SQL Server database / 200 GB bandwidth is very attractive at the 12-month price.
But I don't know if this configuration is enough, especially the bandwidth.
What do you think? Have you used Azure for this type of application?
Regards,
Guillaume.
If you want to develop an e-commerce application you will have to secure customers' sensitive data (credit cards, address details, etc.) via secure connections; in many countries this is a legal requirement. For that reason you will have to have SSL support.
Azure Web Sites do not support SSL for custom domains. However, they do support SSL for the *.azurewebsites.net DNS name. So if your e-commerce application's DNS name will be, say, my-ecom-app.azurewebsites.net, then it's fine. Otherwise, I would not recommend the Azure Web Sites solution yet (I am sure SSL support for custom domains on Azure Web Sites will be implemented eventually).
Azure Cloud Services, on the other hand, have full support for SSL with custom domains.
One of the really good places to check Azure features and the development roadmap is ScottGu's Blog.
Azure Web Sites do not support SSL, and I really don't know of any successful e-commerce site that does not run SSL for at least part of the website. If you really want to host your e-commerce application on Azure today, your only real choice is to run Virtual Machines for your web front-end servers, and either use them for your DB as well or use SQL Azure.
We developed a platform called Virto Commerce that does just that: an MVC4 website hosted on Azure. There was also a need for SQL jobs (indexing, payment processing, cart cleanups and so on), for which we used a WorkerRole (instead of a WebRole). A WorkerRole and a WebRole can actually be combined as part of a single deployment; however, it is better to use a separate instance for worker roles. In our case the WorkerRole acted as a scheduler for multiple jobs defined in the database.
The challenge with WorkerRoles, however, is to make sure they scale well when new instances are added, so the workload needs to be distributed between multiple instances. This is done through the use of queues and blob locks: each job is split into two parts, one that schedules and partitions the work, and a second that picks up the next partition and completes it.
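To illustrate the blob-lock half of that (this is not Virto Commerce's actual code), here is a rough sketch using the classic Azure Storage SDK, where a worker only runs a job if it can acquire a lease on the job's blob; the container and blob names are made up:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class JobLock
{
    public static void RunExclusively(string connectionString, Action job)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("jobs");
        container.CreateIfNotExists();

        var blob = container.GetBlockBlobReference("cart-cleanup.lock");
        if (!blob.Exists())
            blob.UploadText(string.Empty);

        string lease = null;
        try
        {
            // 60-second lease; other instances fail to acquire it and skip this job.
            lease = blob.AcquireLease(TimeSpan.FromSeconds(60), null);
            job();
        }
        catch (StorageException)
        {
            // Another worker instance holds the lock; pick up a different partition.
        }
        finally
        {
            if (lease != null)
                blob.ReleaseLease(AccessCondition.GenerateLeaseCondition(lease));
        }
    }
}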
Hope this helps!
PS: Virto Commerce is now available as an open-source project on CodePlex: http://virtocommerce.codeplex.com
We currently have a .NET 4 application that consists of a Windows Service running in the background and local or remote clients (normally only 1-3).
The clients have a WPF GUI and need some data from the Windows Service. Therefore, we use WCF with NamedPipe binding for a local client and NetTcp binding for remote clients. This works, but we often have problems with endpoints that are not reachable (channel faulted or not found, etc.). We already try to rebuild faulted connections, but it all seems pretty fragile...
Now enter Web API: it looks like an HTTP-based stack might be more robust (no channels, no endpoints, can be self-hosted in a Windows Service as well). There seem to be no problems with broken channels, because each request is handled individually. So if something fails, you just repeat the request. (And we have experience with ASP.NET MVC from other apps, so this is not new to us.)
Now we are thinking what might be our best bet. Is it better to "harden" our existing WCF service (one service interface with about 15 operations) or to move the interface to Web Api and run it as HTTP requests (with JSON data)? Performance is not our main issue here...
Any ideas?
Hartmut
I recommend you stick with WCF (SOAP) services for your WPF application rather than moving to the Web API. There are a number of reasons for this. First, consider what the new Web API is trying to address: providing a framework for RESTful/HTTP/hypermedia services. This is a good fit for applications that make heavy use of HTTP, such as web, mobile and JavaScript applications, where you want to maximise the "reach" or interoperability of your services (irrespective of platform). This is not to say that you can't use it for WPF clients, but in your case, where all traffic is local to your domain, it makes more sense to stick with your current implementation.
The binding choices you have made for your services / clients sound OK to me. I would focus on why your channels are faulting and address those issues. You may also want to consider hosting your services in IIS and using WAS to expose your non-HTTP endpoints. I have had much success with this in the past, and for the most part it has been pretty stable. It also takes away a few of the headaches of managing your own host. If you are concerned about the TCP binding faults, just create a new HTTP or wsHttp endpoint and use that instead. This gives you exactly the same transport the Web API uses, without having to change your programming model.
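For example, a minimal sketch of exposing the same contract over an additional wsHttp endpoint next to the existing TCP one (the contract, service and addresses are placeholders for your own):

using System;
using System.ServiceModel;

[ServiceContract]
public interface IDataService
{
    [OperationContract]
    string GetStatus();
}

public class DataService : IDataService
{
    public string GetStatus() { return "OK"; }
}

class Program
{
    static void Main()
    {
        var host = new ServiceHost(typeof(DataService));

        // Existing TCP endpoint for remote clients.
        host.AddServiceEndpoint(typeof(IDataService),
            new NetTcpBinding(), "net.tcp://localhost:8523/data");

        // Additional wsHttp endpoint: same contract and programming model,
        // HTTP transport, so clients can switch via binding configuration.
        host.AddServiceEndpoint(typeof(IDataService),
            new WSHttpBinding(), "http://localhost:8080/data");

        host.Open();
        Console.WriteLine("Service running - press Enter to stop");
        Console.ReadLine();
        host.Close();
    }
}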
I have an ASP.NET MVC web application which is hosted by an external provider, on IIS 7.
I wish to run a process every 15 minutes or so, which checks a backlog of emails that need to be sent, and actually sends them.
It seems that the normal way to do this is with Microsoft Message Queue, but since this is a hosted environment which I can't directly control, I won't be able to install or configure MSMQ.
So far I've decided to do it by appending rows to a table in my SQL Server database (same hosting).
So how should I implement the bit where I check the backlog and send the emails?
Should it be some kind of separate thread in my main web application, which restarts itself every 15 minutes?
Another option I considered was just opening an HTTP-POST interface which, when called with an appropriate admin password, runs an iteration of the email sender.
I could then create a small console app on my local PC which calls the interface every 15 minutes.
The first option is simpler, but the second might be more robust.
Any ideas?
I would recommend taking a look at Quartz.NET. Also, an important thing you should be aware of is that the web server can unload the ASP.NET application from memory if it is not used, meaning that any threads it has spawned will simply die. That's one of the reasons why such tasks shouldn't be performed in ASP.NET applications, but rather offloaded to Windows Services.
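For illustration, a rough sketch of a 15-minute recurring job with the Quartz.NET 2.x API, which ideally you would host in a Windows Service rather than the web app for the reason above (EmailJob is a placeholder for your sender):

using Quartz;
using Quartz.Impl;

public class EmailJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Read the backlog table and send the pending emails here.
    }
}

public static class SchedulerSetup
{
    public static void Start()
    {
        var scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        var trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(15).RepeatForever())
            .Build();

        scheduler.ScheduleJob(JobBuilder.Create<EmailJob>().Build(), trigger);
    }
}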
Jeff Atwood did a post on how he originally achieved the badge system on Stack Overflow using an expiring cache to reset the process periodically.
https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
I have done something similar to this in the past, sending emails out every day. The service was non-essential, and it didn't matter if the emails missed a day or two, as they would go out eventually anyway, but the system worked quite well. It's all ASP.NET, so it works fine in the hosting environments I use, without needing access to services on the server or a local trigger from your desktop.
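For reference, a minimal sketch of that expiring-cache trick (SendPendingEmails is a placeholder for your backlog processing; note it only ticks while the application stays loaded, per the caveat above):

using System;
using System.Web;
using System.Web.Caching;

public static class EmailPump
{
    private const string CacheKey = "email-pump";

    // Call once from Application_Start in Global.asax.
    public static void Register()
    {
        HttpRuntime.Cache.Insert(CacheKey, DateTime.UtcNow, null,
            DateTime.UtcNow.AddMinutes(15), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        try { SendPendingEmails(); }
        finally { Register(); } // re-arm the cache entry for the next 15-minute tick
    }

    private static void SendPendingEmails()
    {
        // Read the backlog table and send the queued emails here.
    }
}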