I created an ASP.NET MVC project in which I want some code to run all the time. I published the project to a web hosting service; the first time, I start the application by sending an HTTP request to my domain, and I expect it to be kept alive and never shut down. But this does not happen.
I have also seen solutions saying that pinging my own domain periodically from code keeps the application from shutting down, but that only extends the application's lifetime to about 24 hours, not indefinitely.
Here is my code:
public class Main
{
    public static void main()
    {
        while (true)
        {
            try
            {
                // some code
            }
            catch (Exception exp)
            {
                // log the exception message (no exception has occurred so far)
            }
        }
    }
}
Global.asax (with this code the application shuts down after a short time; here I use a dummy controller):
public class MvcApplication : System.Web.HttpApplication
{
    static Thread keepAliveThread = new Thread(KeepAlive);

    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();
        RouteConfig.RegisterRoutes(RouteTable.Routes);
        keepAliveThread.Start();
        Main.main();
    }

    protected void Application_End()
    {
        keepAliveThread.Abort();
    }

    static void KeepAlive()
    {
        while (true)
        {
            // Ping the site and dispose the response so connections are not leaked.
            WebRequest req = WebRequest.Create("http://mydomain/Home/Index");
            using (req.GetResponse()) { }
            try
            {
                Thread.Sleep(60000); // one minute
            }
            catch (ThreadAbortException)
            {
                break;
            }
        }
    }
}
Global.asax (with this code the application stays running for about 24 hours; here I don't use any controller):
public class MvcApplication : System.Web.HttpApplication
{
    // Kept as a field so the timer is not garbage collected.
    static Timer timer;

    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();
        RouteConfig.RegisterRoutes(RouteTable.Routes);
        Main.main();
        timer = new Timer(new TimerCallback(refreshSession));
        timer.Change(0, 5 * 60 * 1000); // 5 min
    }

    static void refreshSession(object state)
    {
        Unirest.get("http://mydomain/");
    }
}
Is there a better solution for my purpose? If so, please give me some sample code.
Normally that's called scheduled/background jobs; "scheduled jobs" is a good term for googling. Most hosting providers offer some way to set them up from the control panel. On Azure, for example, you could use WebJobs.
Note that there are well-known libraries for background jobs, Hangfire for example. Don't reinvent the wheel unless you really have to.
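To give an idea of what that looks like, here is a minimal sketch with Hangfire. The EmailSender type, its SendPendingEmails method and the job id are placeholders for whatever the while (true) loop was meant to do, and it assumes Hangfire's storage and a BackgroundJobServer have already been configured at startup:
using Hangfire;

// Placeholder for the work that currently lives in the infinite loop.
public class EmailSender
{
    public void SendPendingEmails()
    {
        // ... periodic work goes here ...
    }
}

public static class JobRegistration
{
    // Call once at startup, e.g. from Application_Start.
    public static void RegisterJobs()
    {
        // Hangfire stores the schedule in its persistent storage and creates
        // EmailSender itself, so as long as a Hangfire server is running the
        // job fires every minute regardless of app pool recycles.
        RecurringJob.AddOrUpdate<EmailSender>(
            "send-pending-emails",
            sender => sender.SendPendingEmails(),
            Cron.Minutely());
    }
}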
I am working on an ASP.NET MVC 5.0 web application and I am experiencing some issues with the shared hosting server. They say the worker process is limited to 150 MB and my application is going beyond that, so first I wanted to work with garbage collection to reduce the memory load.
In my DAL I have a class that implements IDisposable, where I have my own helper methods and a finalizer at the end.
public class DbAccessSupport : IDisposable
{
    // Fields implied by the code below: the disposal flag and the command
    // (assumed to be a SqlCommand or similar) created elsewhere in this class.
    private bool disposed;
    private SqlCommand cmd;

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (!disposed)
        {
            if (disposing)
            {
                // Manual release of managed resources.
                cmd.Dispose();
            }
            // Release unmanaged resources.
            disposed = true;
        }
    }

    ~DbAccessSupport()
    {
        Dispose(false);
    }
}
public class MasterWithADO
{
    public int DALPostChatMsg(string xml)
    {
        using (DbAccessSupport DbAccessSupportForSP = new DbAccessSupport(true))
        {
            DbAccessSupportForSP.CommandText = "sp_PostSingleChatMsg";
            DbAccessSupportForSP.AddParameter("@inputXml", xml);
            return DbAccessSupportForSP.ExecuteNonQuery();
        }
    }
}
Is this the best way to implement collection and compaction in a 3-tier architecture? If so, I would also have to implement it in the BAL and application layer, which is going to add quite a lot of processing overhead.
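As a side note, the managed resources involved here (connections and commands) are normally released deterministically with using blocks, which also makes the finalizer unnecessary. Below is a rough sketch under that assumption; the connection string is a placeholder and the stored procedure name is taken from the code above:
using System.Data;
using System.Data.SqlClient;

public class ChatRepository
{
    // Placeholder; read the real value from configuration.
    private readonly string _connectionString = "MyConnectionString";

    public int PostChatMessage(string xml)
    {
        // Both objects are disposed as soon as the block exits, so no
        // finalizer (and no extra GC pressure from finalization) is needed.
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand("sp_PostSingleChatMsg", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@inputXml", xml);
            connection.Open();
            return command.ExecuteNonQuery();
        }
    }
}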
Currently I have a job that checks for and sends out emails every minute. I'm using Hangfire as the job scheduler, but it requires the site to be kept alive in order to function properly. To work around this I'm using another job, which runs every 5 minutes as follows, to keep the site alive:
public static bool Ping()
{
    try
    {
        var request = (HttpWebRequest)WebRequest.Create("http://xyz.domain.com");
        request.Timeout = 3000;
        request.AllowAutoRedirect = false; // find out if this site is up; don't follow a redirect
        request.Method = "HEAD";
        using (request.GetResponse())
        {
            return true;
        }
    }
    catch
    {
        return false;
    }
}
Does anyone know of a better or more efficient way to keep the site alive, aside from using a Windows service or the Task Scheduler?
Last week, for the same purpose, I used Azure Scheduler. I think it is a very nice tool; you can:
schedule a job,
define an action,
access the history of your scheduled tasks,
etc.
So if you have an MSDN subscription, I think it is worth considering.
As you've noticed, app pool recycling or application inactivity will cause recurring tasks and delayed jobs to stop being enqueued, and already-enqueued jobs will not be processed.
If you're hosting the application on-premises, you can use the 'Auto Start' feature that comes with Windows Server 2008 R2 (or later) running IIS 7.5 (or above).
Full setup instructions are in the Hangfire documentation: http://docs.hangfire.io/en/latest/deployment-to-production/making-aspnet-app-always-running.html
I'll summarise below.
1)
Create a class that implements IProcessHostPreloadClient
public class ApplicationPreload : System.Web.Hosting.IProcessHostPreloadClient
{
    public void Preload(string[] parameters)
    {
        HangfireBootstrapper.Instance.Start();
    }
}
2)
Update your global.asax.cs
public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        //note - we haven't yet created HangfireBootstrapper
        HangfireBootstrapper.Instance.Start();
    }

    protected void Application_End(object sender, EventArgs e)
    {
        HangfireBootstrapper.Instance.Stop();
    }
}
3)
Create the HangfireBootstrapper class mentioned above.
public class HangfireBootstrapper : IRegisteredObject
{
    public static readonly HangfireBootstrapper Instance = new HangfireBootstrapper();

    private readonly object _lockObject = new object();
    private bool _started;
    private BackgroundJobServer _backgroundJobServer;

    private HangfireBootstrapper()
    {
    }

    public void Start()
    {
        lock (_lockObject)
        {
            if (_started) return;
            _started = true;

            HostingEnvironment.RegisterObject(this);

            GlobalConfiguration.Configuration
                .UseSqlServerStorage("connection string");
                // Specify other options here

            _backgroundJobServer = new BackgroundJobServer();
        }
    }

    public void Stop()
    {
        lock (_lockObject)
        {
            if (_backgroundJobServer != null)
            {
                _backgroundJobServer.Dispose();
            }

            HostingEnvironment.UnregisterObject(this);
        }
    }

    void IRegisteredObject.Stop(bool immediate)
    {
        Stop();
    }
}
4)
Enable service auto-start
After creating above classes, you should edit the global
applicationHost.config file
(%WINDIR%\System32\inetsrv\config\applicationHost.config). First, you
need to change the start mode of your application pool to
AlwaysRunning, and then enable Service AutoStart Providers.
<applicationPools>
<add name="MyAppWorkerProcess" managedRuntimeVersion="v4.0" startMode="AlwaysRunning" />
</applicationPools>
<!-- ... -->
<sites>
<site name="MySite" id="1">
<application path="/" serviceAutoStartEnabled="true"
serviceAutoStartProvider="ApplicationPreload" />
</site>
</sites>
<!-- Just AFTER closing the `sites` element AND AFTER `webLimits` tag -->
<serviceAutoStartProviders>
<add name="ApplicationPreload" type="WebApplication1.ApplicationPreload, WebApplication1" />
</serviceAutoStartProviders>
Note that for the last entry, WebApplication1.ApplicationPreload is
the full name of a class in your application that implements
IProcessHostPreloadClient and WebApplication1 is the name of your
application’s library. You can read more about this here.
There is no need to set IdleTimeout to zero – when the application pool's start mode is set to AlwaysRunning, the idle timeout no longer applies.
This is sample code for session start and end:
public void Session_OnStart()
{
    Application.Lock();
    Application["UsersOnline"] = (int)Application["UsersOnline"] + 1;
    Application.UnLock();
}

public void Session_OnEnd()
{
    Application.Lock();
    Application["UsersOnline"] = (int)Application["UsersOnline"] - 1;
    Application.UnLock();
}
From MVC 6 onwards there will be no Global.asax file, so how do I handle session start and end with OWIN middleware? If possible, please discuss with a code example. Thanks.
It will be very hard to replicate what you want to do in .NET Core 1.0. The SessionStart and SessionEnd events do not exist (and there is no plan to introduce them), and you can only try to "replicate" them with some tricks.
As pointed out in this article,
Since one of the driving forces behind ASP.NET Core 1.0 is "cloud-readiness", the focus on session management design has been to make it work in a distributed scenario. Session_End only ever fired when sessions used inproc mode (local server memory) and the team have stated that they won't add features that only work locally.
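If you really need something like Session_OnStart, one common trick is a small piece of middleware that checks for a marker key in the session and treats its absence as the start of a new session. This is only a sketch; the marker key and class name are made up, and the namespaces follow the RC1 packages used below, so they may differ in later releases:
using System.Threading.Tasks;
using Microsoft.AspNet.Builder;
using Microsoft.AspNet.Http;

public class SessionStartMiddleware
{
    private readonly RequestDelegate _next;

    public SessionStartMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task Invoke(HttpContext context)
    {
        // If the marker is missing, this is the first request of the session.
        if (context.Session.GetString("__SessionStarted") == null)
        {
            context.Session.SetString("__SessionStarted", "1");
            // ... whatever Session_OnStart used to do goes here ...
        }
        await _next(context);
    }
}
Register it in Configure after app.UseSession(), for example with app.UseMiddleware<SessionStartMiddleware>(). There is still no reliable counterpart for Session_OnEnd in a distributed setup.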
BTW, short answer on session usage:
Setup
Add the following entries to the dependencies node of your project.json file
"Microsoft.AspNet.Session": "1.0.0-rc1-final"
Modify your ConfigureServices method to activate the session:
public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddMvc();

    services.AddSession(options => {
        options.IdleTimeout = TimeSpan.FromMinutes(30);
        options.CookieName = ".MyApplication";
    });
}
Then you can activate it like this:
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    app.UseSession();
    //removed for brevity
}
Usage
HttpContext.Session.SetString("Name", "Foo");
HttpContext.Session.SetInt32("Age", 35);
or
var name = HttpContext.Session.GetString("Name");
var age = HttpContext.Session.GetInt32("Age");
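Since the built-in accessors only cover strings, integers and byte arrays, a common complement is a pair of extension methods that serialize complex objects to JSON. This is illustrative rather than part of the framework; it uses Json.NET and the same RC1 namespaces as above:
using Microsoft.AspNet.Http;
using Newtonsoft.Json;

public static class SessionObjectExtensions
{
    public static void SetObject<T>(this HttpContext context, string key, T value)
    {
        // Store the object as a JSON string under the given key.
        context.Session.SetString(key, JsonConvert.SerializeObject(value));
    }

    public static T GetObject<T>(this HttpContext context, string key)
    {
        var json = context.Session.GetString(key);
        return json == null ? default(T) : JsonConvert.DeserializeObject<T>(json);
    }
}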
I am working on a UI application that uses the multiple entry point approach.
I am referring to the link and trying to make a demo.
Here is the code:
public class DemoApp extends UiApplication implements RealtimeClockListener
{
    private static DemoApp dmMain;
    private static final long dm_APP_ID = 0x6ef4b845de59ecf9L;

    private static DemoApp getDemoApp()
    {
        if (dmMain == null)
        {
            RuntimeStore dmAppStore = RuntimeStore.getRuntimeStore();
            dmMain = (DemoApp) dmAppStore.get(dm_APP_ID);
        }
        return dmMain;
    }

    private static void setDemoApp(DemoApp demoAppMain)
    {
        RuntimeStore dmAppStore = RuntimeStore.getRuntimeStore();
        dmAppStore.remove(dm_APP_ID);
        dmAppStore.put(dm_APP_ID, demoAppMain);
    }

    public static void main(String[] args)
    {
        Log.d(" Application argument " + args);
        if (args.length > 0 && args[0].equals("Demo_Alternate"))
        {
            Log.d("Running Demo_Alternate #### Running Demo_Alternate #### Running Demo_Alternate");
            dmMain = new DemoApp();
            dmMain.enterEventDispatcher();
            setDemoApp(dmMain);
        }
        else
        {
            Log.d("Running Demo #### Running Demo #### Running Demo #### Running Demo");
            getDemoApp().initializeMain();
        }
    }

    public DemoApp()
    {
        this.addRealtimeClockListener(this);
    }

    private void initializeMain()
    {
        UiApplication.getUiApplication().invokeLater(new Runnable()
        {
            public void run()
            {
                try
                {
                    pushScreen(new DemoMainScreen());
                }
                catch (Exception e)
                {
                    Log.e(e.toString());
                }
            }
        });
    }

    public void clockUpdated()
    {
        showMessage("DemoAppClock Updated");
        Log.d("DemoAppClock Updated #### DemoAppClock Updated #### DemoAppClock Updated");
    }

    private void showMessage(String message)
    {
        synchronized (Application.getEventLock())
        {
            Dialog dlg = new Dialog(Dialog.D_OK, message, Dialog.OK, null, Manager.FIELD_HCENTER);
            Ui.getUiEngine().pushGlobalScreen(dlg, 1, UiEngine.GLOBAL_QUEUE);
        }
    }
}
I have created an alternate entry point named Demo_Alternate that runs at startup.
If the application has separate entry points, that means a separate process (see the link).
Now my questions are:
While running the code, I am getting "Uncaught exception: no application instance".
I just want to have one application instance; I don't want separate processes.
Can we use an (Application) singleton approach for alternate entry points?
I've only looked briefly at this code, but I see an obvious problem here:
dmMain.enterEventDispatcher();
setDemoApp(dmMain);
enterEventDispatcher() never returns, so you never put your Application instance in the RuntimeStore; setDemoApp(dmMain) would have to be called before enterEventDispatcher().
I suggest you review the following KB article; you might find its approach to accessing a RuntimeStore-maintained object easier to use. Or not.
Singleton using RuntimeStore
Update
If this solution does not work, please update your original post with the corrected code.
I certainly agree with Peter that calling setDemoApp(dmMain) after enterEventDispatcher() means it never gets called.
That said, I think you have a more basic misunderstanding here.
Using alternate entry points will create multiple processes. See here for more.
But, you say that you don't want separate processes. Can you tell us why not?
Separate BlackBerry processes that are designed to work together can still share data, using the RuntimeStore, for example.
Maybe you could tell us more about what your "Demo" and "Demo Alternate" are supposed to do.
While very familiar with WebForms and LINQ, I am a novice in the ASP.NET MVC and NHibernate world. I have been working through a project using Bob Cravens' examples. My application is mostly reads and non-sequential writes, so typically I would not use transactions, but to implement the Unit of Work pattern all my research, including Ayende's blog, says I should.
The problem I have is this -
Ninject creates a Session and Opens a Transaction.
Ninject injects repositories into services, and services into controllers.
I make some changes to the properties and children of an object and save on the aggregate root. This calls Transaction.Commit (works fine, as expected)
In another method later in the controller, I try to save a separate object
The second call fails because the transaction is no longer active.
I'm thinking of adding a bool "CommitNeeded" to the UnitOfWork which would be set by my Save() method and conditionally trigger a Commit() on UnitOfWork.Dispose(). Is this a good idea?
Should I remove the transaction infrastructure? Should I change my Commit()'s to Flush()'s?
Any advice that would help fix my anti-pattern would be appreciated.
In response to the comments - I guess I don't mind whether they happen together or separately. There are two things going on: the first changes a "Customer" object and then saves it; the second makes a logging entry, which then calls the same "Save" method.
var Customer = ServiceLayer.GetCustomer(...);
Transmogrify(Customer, Customer.Children, Widgets, ...);
ServiceLayer.Save(Customer)
ServiceLayer.RecordEvent(Customer, "Customer was frobbed")
where RecordEvent looks like
public void RecordEvent(Customer customer, int eventTypeId, string description)
{
    ...
    Save(customer);
}
The RecordEvent method has its own "save" because it is called from other controllers that make no data changes. I believe the Save call doesn't belong in either of those places. The question is, where? The Dispose() method of the service layer, or a filter as other users suggested?
Using ASP.NET MVC, I use an action filter to bind the transaction scope to the controller action execution lifecycle. This works great most of the time, but you have to be cautious not to keep transactions open too long.
public class UnitOfWorkActionFilter : ActionFilterAttribute
{
    public IUnitOfWork UnitOfWork { get; set; }

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        UnitOfWork.Start();
        base.OnActionExecuting(filterContext);
    }

    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        if (filterContext.Exception == null)
        {
            UnitOfWork.Commit();
        }
        else
        {
            UnitOfWork.Rollback();
        }
        UnitOfWork.Dispose();
        base.OnActionExecuted(filterContext);
    }
}
In my case I'm using property injection via a custom ControllerActionInvoker to get the IUnitOfWork dependency into the ActionFilterAttribute.
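For reference, a rough sketch of such an invoker is below. The class name is illustrative, the IKernel comes from your Ninject composition root, and by default Ninject only fills properties decorated with [Inject]:
using System.Web.Mvc;
using Ninject;

public class InjectingActionInvoker : ControllerActionInvoker
{
    private readonly IKernel _kernel;

    public InjectingActionInvoker(IKernel kernel)
    {
        _kernel = kernel;
    }

    protected override FilterInfo GetFilters(ControllerContext controllerContext,
                                             ActionDescriptor actionDescriptor)
    {
        var filters = base.GetFilters(controllerContext, actionDescriptor);

        // Property-inject each action filter, e.g. the IUnitOfWork property
        // on UnitOfWorkActionFilter above (decorated with [Inject]).
        foreach (var filter in filters.ActionFilters)
        {
            _kernel.Inject(filter);
        }
        return filters;
    }
}
Controllers then need to use this invoker, for example by overriding CreateActionInvoker in a base controller class.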
For this I'm using an HTTP module: I start the transaction at the beginning of the HTTP request and terminate it at the end of the request:
public class UnitOfWorkModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += context_BeginRequest;
        context.EndRequest += context_EndRequest;
    }

    public void Dispose()
    {
        // Required by IHttpModule; nothing to clean up here.
    }

    private void context_BeginRequest(object sender, EventArgs e)
    {
        IUnitOfWork instance = UnitOfWorkFactory.GetDefault();
        instance.Begin();
    }

    private void context_EndRequest(object sender, EventArgs e)
    {
        IUnitOfWork instance = UnitOfWorkFactory.GetDefault();
        try
        {
            instance.Commit();
        }
        catch
        {
            instance.RollBack();
        }
        finally
        {
            instance.Dispose();
        }
    }
}
My unit of work factory is just a Func<IUnitOfWork> initialized while registering types in the IoC container:
public class UnitOfWorkFactory
{
    public static Func<IUnitOfWork> GetDefault;
}
Initialization (in my case, StructureMap):
UnitOfWorkFactory.GetDefault = () => container.GetInstance<IUnitOfWork>();
And then you register the UnitOfWorkModule in web.config:
<httpModules>
<add name="UnitOfWorkModule" type="UI.UnitOfWorkModule, UI" />
</httpModules>
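For completeness, here is a minimal sketch of an NHibernate-backed unit of work that this module could drive. The IUnitOfWork members are taken from the module's usage above; the class name and the way the ISessionFactory is supplied are assumptions:
using System;
using NHibernate;

// Interface inferred from how the module above uses it.
public interface IUnitOfWork : IDisposable
{
    void Begin();
    void Commit();
    void RollBack();
}

public class NHibernateUnitOfWork : IUnitOfWork
{
    private readonly ISessionFactory _sessionFactory;
    private ISession _session;
    private ITransaction _transaction;

    public NHibernateUnitOfWork(ISessionFactory sessionFactory)
    {
        _sessionFactory = sessionFactory;
    }

    public void Begin()
    {
        // One session and one transaction per HTTP request.
        _session = _sessionFactory.OpenSession();
        _transaction = _session.BeginTransaction();
    }

    public void Commit()
    {
        _transaction.Commit();
    }

    public void RollBack()
    {
        if (_transaction != null)
            _transaction.Rollback();
    }

    public void Dispose()
    {
        if (_session != null)
            _session.Dispose();
    }
}
Note that GetDefault() is called twice per request (BeginRequest and EndRequest), so the IoC registration needs a per-request lifetime for both calls to resolve the same instance.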