Elmah XML Logging on Load Balanced Environment - asp.net-mvc

We're implementing Elmah for an internal application. For development and testing we use a single server instance, but in production the app is served from a load-balanced environment.
Everything works like a charm with Elmah, except that the logs are written independently on each server. What I mean by this is that if an error happens on Server1, the XML file is stored physically on that server, and the same goes for Server2, since I'm storing those files in App_Data.
When I access the axd location to see the error list, I only see the errors from whichever server happened to serve my request.
Is there any way to consolidate the XML files other than putting them on a shared folder? A shared folder would force us to grant the account that runs the application on each server access to that folder, and the folder would live on only one of the servers instead of both.
I cannot use In-Memory or Database logging since FileLog is the only one allowed.
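(For reference, the file log in question is ELMAH's stock XmlFileErrorLog, registered in web.config along these lines:)

<elmah>
  <!-- Each server writes its error XML files to its own local App_Data -->
  <errorLog type="Elmah.XmlFileErrorLog, Elmah" logPath="~/App_Data" />
</elmah>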

You might consider using ElmahR for this case, since you are not able to implement in-memory or database logging. ElmahR provides a central location that the two load-balanced servers can send errors to (in addition to logging them locally) via an HTTP POST. You can then access the ElmahR site to view an aggregated list. ElmahR also stores the error messages it receives in a SQL Server CE database, so it can persist them.
Keep in mind that if the ElmahR dashboard app's design does not meet your initial needs/desires, it can be modified as needed, given that it is an open-source project.
Hope this might be a viable option.
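If bringing in ElmahR isn't an option, the same idea can be approximated by hand: keep the stock XmlFileErrorLog for the local files and additionally forward each error to a central collector over HTTP. A rough sketch, not a drop-in implementation (the collector URL and the CentralizedXmlErrorLog class are hypothetical; XmlFileErrorLog, ErrorXml and the ErrorLog base class are real ELMAH types):

using System.Collections;
using System.Net;
using Elmah;

// Hypothetical: log locally via ELMAH's XmlFileErrorLog, then forward the
// serialized error to a central collector endpoint for aggregation.
public class CentralizedXmlErrorLog : ErrorLog
{
    private readonly XmlFileErrorLog _localLog;
    private const string CollectorUrl = "http://errors.internal/collect"; // placeholder

    public CentralizedXmlErrorLog(IDictionary config)
    {
        _localLog = new XmlFileErrorLog(config); // picks up logPath from config
    }

    public override string Log(Error error)
    {
        string id = _localLog.Log(error); // keep the local XML file as before

        try
        {
            using (var client = new WebClient())
            {
                client.Headers[HttpRequestHeader.ContentType] = "text/xml";
                // ErrorXml.EncodeString produces ELMAH's XML form of the error
                client.UploadString(CollectorUrl, ErrorXml.EncodeString(error));
            }
        }
        catch (WebException)
        {
            // Never let a forwarding failure break local logging.
        }

        return id;
    }

    // Reads still come from the local store; the aggregated view lives on
    // the collector side.
    public override ErrorLogEntry GetError(string id)
    {
        return _localLog.GetError(id);
    }

    public override int GetErrors(int pageIndex, int pageSize, IList errorEntryList)
    {
        return _localLog.GetErrors(pageIndex, pageSize, errorEntryList);
    }
}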

Related

Can I transfer files from a local server to my Nextcloud server without using the internet and allow users to access them? (same computer)

I used Docker to set up a Nextcloud server for myself and my family.
Can I transfer files from a local server to my Nextcloud server without using the internet and still allow users to access them?
I ask because I have discovered two strange things:
1. Placing files directly under a specific user's file path on the server does not allow the user to successfully access those files.
2. As long as I don't delete the files added by the user, even if I directly change the content of those files on the server, the user still reads the original content, accurately and correctly.
Or is the user file path that I have in mind incorrect?
I think it's /var/www/html/data/"USERID"/files
I would like to know how to solve this, and at the same time I also want to know what causes the two problems above.
Thank you so much.
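Most likely both symptoms come from the same cause: Nextcloud keeps an index of every file in its database (the file cache) and trusts that index rather than the raw filesystem. The path you have in mind is correct for the official Docker image, but files copied straight into /var/www/html/data/USERID/files stay invisible until Nextcloud rescans them, and editing a file's content in place behind Nextcloud's back leaves the cached metadata (size/ETag) unchanged, so clients can keep showing the old content. The usual fix is to trigger a rescan with the occ tool, for example (assuming your container is named nextcloud; adjust to your setup):

docker exec -u www-data nextcloud php occ files:scan --all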

HTTP Server in iOS to list files Documents directory

I am trying to create an HTTP server inside my iOS application, to develop something like the Xender application. So far I have succeeded in setting up an HTTP server inside my application and hosting an HTML file there, which can be loaded on another device/system using the IP and port.
But I want to link that HTML to my application's database to populate data on the HTML page, and then make it dynamic so that it can be opened from another device or system.
Ultimately, I need to query the application's SQLite database from the HTML file. Is there any way to do such a thing?
Can I connect SQLite to an HTML front end? In web apps these things can be done with a server-side scripting language like PHP connecting to a database like MySQL. But here my case is HTML and SQLite.
EDIT
I found Is it possible to access an SQLite database from JavaScript?, but that is all about client-side local storage, whereas in my case the SQLite database is on the server side.
You would have to create template HTML files that expose a set of placeholder variables. Then, when a file is requested from your server, you load it into memory.
Next you do some regex magic to extract the query parameters, do your SQL work, replace the corresponding variables in your HTML string, and finally serve the result to the client.
You would need to define your own small, logic-free "templating" language that can tell your application what data is requested and where to output any returned data.
I think this is quite a lot of work, and you should probably try to find a better solution that others have already built.
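For what it's worth, the placeholder-replacement step is only a few lines in any language. Here is a minimal sketch in C# (the HtmlTemplate class and the {{name}} token syntax are made up for illustration; the same steps translate directly to Swift or Objective-C):

using System.Collections.Generic;
using System.Text.RegularExpressions;

// Hypothetical sketch: replace {{placeholder}} tokens in an HTML template
// with values that were previously fetched by an SQL query.
public static class HtmlTemplate
{
    public static string Render(string templateHtml, IDictionary<string, string> values)
    {
        // Swap each {{name}} token for its looked-up value (empty string if unknown).
        return Regex.Replace(templateHtml, @"\{\{(\w+)\}\}", match =>
        {
            string key = match.Groups[1].Value;
            string value;
            return values.TryGetValue(key, out value) ? value : string.Empty;
        });
    }
}

// Usage: run the SQLite query first, then render the page:
// var html = HtmlTemplate.Render("<h1>Hello {{name}}</h1>",
//     new Dictionary<string, string> { { "name", "Alice" } });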
EDIT
You could use Node.js and this interpreter, but it's not maintained anymore. There might be similar projects, though.
EDIT II
I've found neu.Node, which sounds quite promising. There hasn't been any activity in four months, but the project seems well organized and documented.

IIS instances sharing data

I don't know if this is common, but I wanted to check. I am building a site on an IIS 7 server and coming across a weird problem: whenever I have two clients accessing the site, it seems they are sharing info. Here is an example: when one client does a search for a particular item, the other client goes to the search page and sees the results of the first client's search. I am using a global class to store this information in my code-behind.
So here is my question: my understanding of servers was that if two clients accessed the server, they were running on different instances of the site, meaning that even if I have a global class in my code, it would be as if two machines were running it. Am I wrong in this understanding?
Also, are there settings in IIS that I need to change for this to work?
Yes, that understanding is wrong: a global (static) class has a single set of values shared by every request in the application, which is why your two clients see each other's data. Per-user state belongs in Session instead. In ASP.NET, Session variables are per-user values kept in server memory and keyed by a token in the user's cookie. You can store HTML form info in the Session so another page on your site can read it.
The syntax in your MVC controller action to create a Session would be:
Session["MyFormData"] = someObject;
http://msdn.microsoft.com/en-us/library/ms178581.aspx
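A slightly fuller sketch (the controller and helper names are made up) with one action writing to Session and another reading it back, in place of the static field that every user currently shares:

using System.Collections.Generic;
using System.Web.Mvc;

public class SearchController : Controller
{
    // POST: /Search/Run -- stores this user's results in their own session.
    [HttpPost]
    public ActionResult Run(string query)
    {
        List<string> results = DoSearch(query); // hypothetical search helper
        Session["SearchResults"] = results;     // per-user, not shared across clients
        return RedirectToAction("Results");
    }

    // GET: /Search/Results -- reads the same user's session back.
    public ActionResult Results()
    {
        var results = Session["SearchResults"] as List<string>;
        return View(results ?? new List<string>());
    }

    private static List<string> DoSearch(string query)
    {
        // Stand-in for the real search. Note it's static *fields* holding
        // per-user data that cause the bleed-over, not static methods.
        return new List<string> { "result for " + query };
    }
}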

Make an ASP.NET MVC application Web Farm Ready

What is the most efficient way to make an ASP.NET MVC application web-farm ready?
Most important is sharing the current user's information (context); less important are cached objects such as look-up items (states, street types, counties, etc.).
I have heard of/read about memcached, but I haven't seen simple, applicable documentation on how to implement and test it.
Request context
Any request that hits a web farm gets served by whichever IIS server is available. The context gets created there, and the whole request gets served by the same server, so context shouldn't be a problem. A request is a stateless execution pipeline, so it doesn't need to share data with other servers in any way, shape or form; it will be served from beginning to end by the same machine.
User information is read from a cookie and processed by the server that serves the request. Whether that's enough depends on whether you cache the complete user object somewhere.
Session
If you use the TempData dictionary, you should be aware that it's stored in the Session dictionary. In a server farm that means you should use something other than InProc sessions, because those are not shared between the IIS servers across the farm. Configure an out-of-process session-state provider instead, one that uses a database or the ASP.NET State Server.
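For example, switching to the out-of-process State Server is a small web.config change (the host and port below are placeholders; the aspnet_state service has to be running on that machine):

<system.web>
  <!-- InProc (the default) keeps sessions in one server's memory;
       StateServer moves them to a shared aspnet_state service. -->
  <sessionState mode="StateServer"
                stateConnectionString="tcpip=sessionhost:42424"
                timeout="20" />
</system.web>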
Cache
When it comes to cache, it's a different story. To make things as efficient as possible, the cache would ideally be shared as well; by default it's not. But a cache miss simply means the data gets read from its source and stored in the cache, so if a particular server in the farm doesn't have some cached object, it creates it. Over time, all of the servers end up caching the commonly used shared data.
Or... you could use a library like memcached (as you mentioned) and take advantage of a shared cache. There are several examples on the net of how to use it.
But these solutions bring additional overhead if nothing else (network round-trips, processing in a separate process, data fetching, etc.). The default in-process cache is the fastest, so decide on a shared cache only if you explicitly need one. Don't share cache unless it's really necessary.
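The per-server, read-through pattern described above is just a few lines with the default ASP.NET cache (LoadStatesFromDb is a hypothetical stand-in for the real look-up query):

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class LookupCache
{
    // Read-through: each farm server fills its own local cache on first miss.
    public static List<string> GetStates()
    {
        var cached = HttpRuntime.Cache["States"] as List<string>;
        if (cached != null)
            return cached;

        List<string> states = LoadStatesFromDb(); // hypothetical DB call
        HttpRuntime.Cache.Insert("States", states, null,
            DateTime.UtcNow.AddHours(1),   // absolute expiration
            Cache.NoSlidingExpiration);
        return states;
    }

    private static List<string> LoadStatesFromDb()
    {
        // Stand-in for the real look-up query.
        return new List<string> { "AL", "AK", "AZ" };
    }
}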

mod_xsendfile alternatives for a shared hosting service without it

I'm trying to log download statistics for .pdfs and .zips (5-25 MB) in a Rails app that I'm currently developing, and I just hit a brick wall: I found out our shared hosting provider doesn't support mod_xsendfile. The sources I've read state that without it, multiple downloads could potentially cause a DoS issue, something I'm definitely trying to avoid. I'm wondering if there are any alternatives to this method of serving files through Rails?
Well, how sensitive are the files you're storing?
If you hosted these files somewhere under your app's /public directory, you could just do a meta-tag or JavaScript redirect to the public-facing URL of those files after your users hit some sort of controller action that updates your download statistics.
In this case, your users would probably see one of those "Your download should commence in a few moments" pages before the browser starts the file download.
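For example, the interstitial page could be as simple as this (the /downloads path and the delay are placeholders):

<!DOCTYPE html>
<html>
  <head>
    <!-- After 3 seconds, send the browser to the statically served file -->
    <meta http-equiv="refresh" content="3;url=/downloads/report.pdf">
  </head>
  <body>
    <p>Your download should commence in a few moments...</p>
  </body>
</html>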
Under this scenario, your Rails application won't be streaming the file out; your web server will, which gives you the same effect as xsendfile. On the other hand, this won't work very well if you need to control access to those downloadable files.
