The first time the site is loaded, it takes a very long time; after that, pages load quickly. How can this be fixed?
The IIS server and the MS SQL server are on the same subnet, but on different virtual machines.
Each server has 8 GB of RAM and 4 processor cores.
There are no users; these are test machines.
The browser is Chrome.
This is a very common issue with ASP.NET applications hosted on IIS. You can check the following things to improve first-load performance:
Make sure that the application pool is configured with start mode "AlwaysRunning"
Configure the IIS Application Initialization (warm-up) feature for your website
Precompile your views
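As a rough sketch, the first two items map to IIS configuration. Assuming IIS 8 or later with the Application Initialization module installed, and with "MyAppPool" and "MySite" as placeholder names, the relevant applicationHost.config fragment looks approximately like this:

```xml
<!-- applicationHost.config fragment (IIS 8+, Application Initialization installed).
     "MyAppPool" and "MySite" are placeholder names. -->
<applicationPools>
  <!-- AlwaysRunning starts the worker process together with the pool,
       instead of waiting for the first request -->
  <add name="MyAppPool" startMode="AlwaysRunning" />
</applicationPools>
<sites>
  <site name="MySite">
    <!-- preloadEnabled sends a warm-up request when the pool starts,
         so the application is initialized before real traffic arrives -->
    <application path="/" applicationPool="MyAppPool" preloadEnabled="true" />
  </site>
</sites>
```

Views can additionally be precompiled at publish time (for example via the MvcBuildViews MSBuild property, or the "Precompile during publishing" option in Visual Studio), so Razor compilation does not happen on the first hit.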
We have a single-page web application running on 6 servers behind a load balancer. Recently we have been experiencing occasional server crashes, and examining the stack traces suggests it may be related to winWebSocketHandler+<>c__DisplayClass5_0+<b__0>d;0 [AWAIT#0].
When I looked at one server's current requests, I noticed a lot of these:
Is this the normal way that IIS represents WebSocket connections (State = EndRequest), or could they be hanging requests that are needlessly using up resources? If that's the case, how can we close those connections sooner?
We have an ASP.NET MVC website (nopCommerce 3.6, "Site 1") using Entity Framework and an in-memory cache. The site's memory consumption is very high, reaching ~14 GB.
The application runs on a Windows Server 2008 machine with 16 GB of RAM.
I need help in understanding the following scenarios/behavior:
The application does not seem to release memory when no other application on the server is in use; it keeps using the maximum amount of memory.
However, if I launch another ASP.NET application (Site 2) and push it to consume some memory, Site 1 releases its memory (it drops drastically) and the two applications start sharing the RAM.
Is this acceptable behavior, and what happens when my RAM reaches 100% and only one website is running?
When will Site 1 release its memory automatically? Will the GC kick in only when RAM reaches 100%?
Do I have a memory leak? If so, how does Site 1 release memory when Site 2 needs it?
Note: we cache the EF objects (using an in-memory ObjectCache) with proxies and change tracking enabled.
And yes, dotMemory is my next option.
I just moved to .NET programming and built a website based on the ASP.NET MVC 5 framework.
I come from PHP programming, and I have to admit that MVC has some real advantages.
However, when it comes to deploying the website on the internet, I'm a bit lost.
I decided to go with Azure, since deploying the Microsoft framework on a Linux server seemed like too much trouble (and it does not seem optimized).
However, I don't understand the pricing policy of this cloud system at all:
http://azure.microsoft.com/en-us/pricing/details/websites/
What is a compute instance?
And what is this hourly rate?
Does it mean that if no one accesses your website for an hour, you won't be charged for that hour?
Is the memory they mention RAM?
If so, it seems very small compared to a normal server.
I'm looking for something reasonably fast.
Moreover, I developed my website with PostgreSQL, and I have the impression that I have to order a separate virtual machine to host my database.
I'm sorry if my questions are a bit vague, but this is all very different from a simple Apache server.
A compute instance on Azure is something with CPU capacity reserved for you. It may even sit completely unused, just waiting for your commands.
Examples of compute instances are:
Virtual Machine
Web site
You can run a free website on Azure. You cannot use your own domain (at least, that is not supported on the free tier), and free sites are stopped when idle. This means the first request is slow, while subsequent requests are fast. If you get too many requests, a free site will no longer be enough, but a startup's traffic will fit.
If you are outside the free range, Azure bills per hour (or even per minute) that your site (or virtual machine) is active.
The RAM seems small, but with no UI running, you need far less RAM.
The advantage of Azure is that you can run on a small, cheap machine and scale up very quickly, even for just a few hours.
I have a Delphi (hence 32-bit) CGI app running with IIS 7 on a 64-bit Windows 2008 server that has 24 GB of RAM. The web service runs fine for a few days at a time (sometimes a few weeks) and then suddenly starts reporting "Not enough storage is available to process this command."
Now, I've seen this before in regular Windows apps, and it normally means the machine ran out of memory. In this instance, however, the server shows that only 10% of physical RAM is in use. On top of that, Task Manager shows only one instance of the CGI executable, with 14 MB allocated. And once the error starts, it keeps occurring regardless of actual server load. There is no way this thing is really running out of memory.
So I figured there is probably a maximum memory setting in IIS 7 somewhere, but I couldn't find anything of the sort. Restarting the web server makes the problem go away until the next time, but that is probably not the best strategy.
Any ideas?
It might be an IRPStackSize issue, as discussed here. And the particular cause mentioned in that article is apparently not the only one.
Also, the CGI executable never seems to unload under IIS 7, even though it does under IIS 6. This appears to be a problem with the CGI support in IIS 7.
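For reference, the IRPStackSize value mentioned above lives in the LanmanServer parameters key. A sketch of raising it via a .reg file, assuming the documented default of 15 and using 32 purely as an example value:

```
Windows Registry Editor Version 5.00

; IRPStackSize is a DWORD; the default is 15, and 0x20 (32) here is
; only an example value. Restart the Server service (or reboot the
; machine) for the change to take effect.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters]
"IRPStackSize"=dword:00000020
```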
I'm trying to build a web link to a busy social networking website using IntraWeb.
IntraWeb creates a temporary folder for each session to store temporary files; these folders are automatically deleted when the session expires.
If the site is hosted on Win32, the limit is 65,536 folders, which means only about 65K concurrent sessions are possible.
Is there a way to turn off the temp-file creation, or to allow more concurrent sessions, in IntraWeb?
IntraWeb is simply not designed to handle that many sessions. IntraWeb is designed for Web applications, not for Web sites. Even though a plain IntraWeb session takes only a few kilobytes, IntraWeb's session-handling model is more of a "fat" model. It is perfectly suited to building complex, stateful applications that handle a few hundred concurrent sessions.
For Web sites with thousands of users per day, where many users open just one page and leave again, you could certainly use WebBroker, but that basically means building everything up from scratch.
If you are a Delphi developer, I would recommend looking into Delphi Prism plus ASP.NET. There are tons of ASP.NET controls that simplify building your Web site rapidly; ASP.NET controls from DevExpress.com, Telerik.com and others work perfectly well with Delphi Prism.
I am pretty sure you'll run out of system resources before you get close to 65,000 users on one box. To handle that load you'll need a load-balanced cluster, and then the 65K limit won't be an issue. I would not focus on this limitation.