We have an application deployed on a customer's server to which we don't have any access.
The only thing we can do is ask someone to perform trivial operations (using a profiler is not a trivial operation) and change our code, for example to add a logger.
So I want to know whether there is any good logger specialized in performance. I know about log4net, but it just logs some information. It would be good to get reports with charts and to represent code measurements in a hierarchical view. I also want a logger that separates requests from different users. Of course I can write it myself, but maybe some good free tools already exist?
If you have no access, the only solution I know is: http://miniprofiler.com/
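For context, a minimal sketch of what instrumenting an MVC action with MiniProfiler tends to look like (the controller and step names are made up, and the namespace and per-request setup vary by version, so treat this as illustrative):

```csharp
using System.Web.Mvc;
using StackExchange.Profiling;   // named MvcMiniProfiler in early releases

public class EmployeesController : Controller
{
    public ActionResult Index()
    {
        // Profiling is typically started/stopped per request in Global.asax
        // and the results widget is rendered into the layout; see the docs.
        var profiler = MiniProfiler.Current;

        using (profiler.Step("Load employees"))
        {
            // database / service calls you want timed
        }
        using (profiler.Step("Build view model"))
        {
            // nested Step calls show up as a hierarchical timing tree in the UI
        }
        return View();
    }
}
```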
Other ideas:
ANTS Profiler
New Relic
Custom Attributes with PostSharp
Custom ASP.NET MVC filters (a rough sketch of this follows below)
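For that last item, here is a rough sketch of what such a filter could look like on top of log4net (the attribute name and log format are just examples, not a library API):

```csharp
using System.Diagnostics;
using System.Web.Mvc;
using log4net;

// Hypothetical example of the "custom ASP.NET MVC filter" idea: times every
// action it is applied to and logs the duration together with the user name,
// so requests from different users can be separated in the log.
public class PerformanceLogAttribute : ActionFilterAttribute
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(PerformanceLogAttribute));
    private const string StopwatchKey = "__perfStopwatch";

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        filterContext.HttpContext.Items[StopwatchKey] = Stopwatch.StartNew();
    }

    public override void OnResultExecuted(ResultExecutedContext filterContext)
    {
        var stopwatch = filterContext.HttpContext.Items[StopwatchKey] as Stopwatch;
        if (stopwatch == null) return;
        stopwatch.Stop();

        var user = filterContext.HttpContext.User;
        string userName = (user != null && user.Identity.IsAuthenticated)
            ? user.Identity.Name
            : "anonymous";

        Log.InfoFormat("{0} {1} took {2} ms (user: {3})",
            filterContext.HttpContext.Request.HttpMethod,
            filterContext.HttpContext.Request.RawUrl,
            stopwatch.ElapsedMilliseconds,
            userName);
    }
}
```

Register it globally or per controller and every action gets a timing entry that a log viewer can group by user; it's not a reporting tool, but it gets the raw numbers into the log with one code change.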
Instead of just logging you may want to think about using an application analytics tool. There are a variety of them available, many with a free usage tier. Depending on the tool you may need to do just a little extra coding to get the measurement data you want or it may require more significant custom coding.
First, one that I have personal experience with is Loupe from Gibraltar Software. It started out as a logging tool and has grown into a logging-and-analytics product, so I think it will give you more of what you want with minimal code changes.
The other option I am familiar with is PreEmptive Analytics. This originated as an application analytics tool, so you need to do a bit more coding to report on raw performance data.
There are also other options in the .NET space including New Relic, Loggr, Equatec, and Trackerbird that should all work with either a greater or lesser amount of custom coding for your situation.
Possible Duplicate:
What Web Application Framework for Delphi is recommended?
We have a Delphi 2007 desktop application which we have hosted using Citrix. Now we want to get rid of Citrix and somehow web-enable it.
I have done a bit of research and found that it is possible using uniGUI.
http://www.unigui.com
Conclusion: it can be done, but it would require a rewrite, and only a subset of components is supported. Serious questions remain about how the monolithic application structure will hold up in a web environment.
There are two more options, Morfik and Atozed, and they also require a rewrite.
I want to know if there is any other option that requires much less rewrite work, and how fragile it is.
How fragile it is depends on the quality of your code. If you have a well-structured application, with business logic and data access fully separated from the GUI, it will be pretty safe, although you will still have to rewrite almost all of your GUI.
If there's logic in your forms, and the code that talks to the GUI components is intertwined with the code that checks your input and stores the data, then you have a big problem.
In that case, this is a great opportunity to refactor large portions of your app and do it better this time. ;)
Since there is no "silver bullet" here, it doesn't matter much which product you use. You have the same challenges with any of them. I would recommend spending a few days on a Proof-of-Concept (PoC) re-write of 2-3 typical screens. Implement the POC for each "finalist" product, and see how it works out. Keep track of how long it took for each one, things that were easier/harder, and how the end result appears to the end-user (performance, good/funny-looking, robustness, "feel").
As for the actual re-write, I would recommend the following:
Re-factor existing application to remove business logic from the UI.
Full Regression testing, and push that into production.
NOW proceed with conversion to one of the web tools.
Oops - I left out a step. Step 0: FREEZE all features/fixes. If fixes are needed to current production, they'll need to be done in a separate branch, and then rolled-up into this project later.
Note that this type of work lends itself nicely to outsourcing, as the work is straightforward and the requirements are simple. Especially if it can be delivered one form at a time, so progress, timelines, and $$$ can be measured in small chunks.
Another preliminary step is to develop a "cook book" for stripping the business logic from the existing GUI layer. It should identify naming conventions, common libraries (for code that should have been shared all along but wasn't), and should describe the conversion methodology.
AFAIK, there's no tool that will convert your desktop application to a web application without requiring a rewrite of most of the GUI parts.
As Golez said, you will have to refactor your application and separate your business logic from the GUI; then you can use a tool like IntraWeb to develop the GUI for the web, reusing the existing business logic with it.
Another option is converting your application to an n-tier architecture, wrapping your business logic as web services or any other open technology, and building the web part in any web language such as ASP.NET or PHP.
Depending on how 'web-enabled' you want the app to be: I use Cybele Software's (https://www.cybelesoft.com/) Thinfinity UI to extend apps to the web, including database apps.
It only requires the installation of their Thinfinity Server and one line of code added to the project source, and you are in business.
The apps all run on your PC.
Well perhaps I simplified it a little, but worth a look.
HTH.
Regards,
Ian
This question contains a lot of background information, to make sure you fully understand why we are looking at these technologies.
The question is basically this:
For a large, spreadsheet-type module that we need to develop for the web module of our application, are there any pitfalls we should know about if we decide to use Silverlight for it?
Issues we already know, and don't need any discussion/reminders about:
We're aware of the problems around using a plugin-type solution, which may or may not be installed on the user's machine (and in some cases probably can't be installed). These risks need to be mitigated, but we're aware of them. Please don't get hung up on this.
We're a .NET company, so while Ruby on Rails and lots of other platforms and architectures might be good for this solution, they are not in the scope of the decision here. We have lots of code already written in .NET that we need to take advantage of; otherwise the project will never be finished, regardless of platform.
Background
We have a web module for our application with employee-related information and some input forms. Our Windows desktop application is mostly a department-leader type of application for managing employees, while the web module contains mostly employee-centric functions: report-type web pages that list information from the system, and input forms.
The module we need to add now is more of a heavy spreadsheet-type application. You change something in one place, and something changes somewhere else, such as sums or what is enabled/disabled.
We know we can manage all of that with AJAX, but another issue here is that the application will potentially load a lot of database data in order to put it in front of the user, and with an AJAXy solution we're afraid that the request/response model will have to reload quite a lot of information on every request, even to answer seemingly easy questions.
A way to mitigate that would basically be to load information into a Session object or similar, but that's a big no-no, so we'd rather not do that. This is a multi-user module, and some of the data is rather static, but some of it is also going to have to be refreshed from time to time, so if 10 users load a lot of data into the session, that's going to be a pretty big memory hit.
We will be using ASP.NET (MVC) for this if we choose to go this route, that is, developing the module in pure HTML and similar technologies.
Then we looked at Silverlight, where we would load all the information down into the Silverlight application on the client. It would hold the current state and would only need to touch the database to refresh some of the information some of the time, instead of on every little request, which is how we think the request/response model with ASP.NET (MVC) would have to work.
But since we have only done minor things with Silverlight, we're not that experienced with it, and we're afraid that some assumptions we might have, stated or unconscious, will turn out to be wrong or flawed, making this project impossible or very hard to manage at some point.
For instance, is there a limit to how much memory the Silverlight application is allowed to use? (I know, if I have to ask I probably can't afford it.) If there were a limit of, say, 10 MB, that would be nice to know about before we're midway through and start to load the really heavy data.
To make it simpler to give examples, let's just assume we're building a spreadsheet that has so much data that, for the simple "changed a number here, what else changed?" case, too much data would have to be loaded from the database for a proper request/response model to be used. If we move the entire thing to Silverlight, what will make that project hard or impossible?
Knowing about such things would at least give us the ability to consider if the price is acceptable.
In short, why should we not use Silverlight for this and instead go for ASP.NET (MVC)?
And again, "use Ruby on Rails instead", is not really an answer here. The options are ASP.NET (MVC) which we have experience with, or Silverlight which we don't but can gain.
Of course, if Ruby on Rails, given that we'd have to start pretty much from scratch infrastructure-wise, learn a new programming language and framework, and download and learn a new IDE/tool, would still allow us to cut the development time in half, then please give us some information about how that might work; but I daresay that won't really happen here.
You should know that Silverlight (version 3.0) does not support any printing whatsoever, which to me sounds like a whopper of a showstopper for you (sorry, I couldn't resist). The good news is that full printing support has been added in version 4, but that is still in beta. Rumours say it should be out before the summer if everything works out according to plan, so if that fits with your roadmap I would use SL4 right from the start.
There are no memory limitations in Silverlight, but for the local storage (IsolatedStorage) mechanism there is a default limit of 1 MB. You can easily get around that by asking the user's permission to increase the local storage space when he/she starts up the application. More on that here: Silverlight Tip of the Day #20 – How to Increase your Isolated Storage Quota.
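For completeness, a minimal sketch of that quota request (the handler name and the 50 MB figure are arbitrary examples; the call has to come from a user-initiated event such as a button click):

```csharp
using System.IO.IsolatedStorage;
using System.Windows;
using System.Windows.Controls;

public partial class MainPage : UserControl
{
    // Wired to a Button's Click event in XAML; quota requests must be
    // triggered by a user action or Silverlight refuses to show the dialog.
    private void IncreaseQuota_Click(object sender, RoutedEventArgs e)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            long desired = 50L * 1024 * 1024;   // 50 MB, an arbitrary example value
            if (store.Quota < desired)
            {
                // Shows the built-in consent dialog; returns true if the user accepts.
                bool accepted = store.IncreaseQuotaTo(desired);
            }
        }
    }
}
```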
(Edit)
Aside from the missing printing functionality that will be fixed in SL4 I cannot see any problems with your scenario. I would easily take the Silverlight route if I were you, especially since you already have extensive knowledge of .NET/C#.
For a rich interface as you've described, I would definitely go with Silverlight or Flash rather than an HTML/JavaScript/AJAX solution.
These technologies make for much better and more consistent interfaces across platforms; you can buy in various components to speed things up, support things like copy-and-paste, and code in a more structured way.
Another element is skills: if you have the skills to achieve it in a particular technology, then go with that.
To answer your question the best way I can: you should not use Silverlight if you decide to use Flash.
HTH
What would be a proper way to simulate a large number of requests to test if my web application can handle it?
You could try using Microsoft's WCAT tools. Look here: http://support.microsoft.com/kb/231282
They're free, too. That's always nice.
Depending on your budget, you may be interested in some load testing software designed for this. A Google search brings up all sorts of alternatives. This is probably the best way to do it.
This one has a free trial version and isn't too pricey, but I would recommend shopping around first.
I've used JMeter in the past, and I find it very useful for stress/load testing a website, even ones written in ASP.NET (with or without MVC).
In general, with any tool, you would want to write a script of what an average user of your site would do. You may even end up creating multiple such scripts. Tools like JMeter even allow a random element to be added to a script. With these scripts created, a load-testing tool can then simulate as many users as you desire hitting your site.
I would recommend allowing JMeter to slowly ramp up the number of concurrent users and watching the response-time graph. The point where the response time starts climbing sharply is the point where you've hit the maximum number of users (given your scripts) that your site can handle.
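If you just want a quick, hand-rolled smoke test before setting up JMeter, a small console sketch like the one below can fire a batch of concurrent requests and report timings (the URL and user count are placeholders, and this is no substitute for a real load-testing tool):

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class QuickLoadTest
{
    static async Task Main()
    {
        const string url = "http://localhost/yourapp/";  // placeholder URL
        const int concurrentUsers = 50;                   // example ramp step

        using (var client = new HttpClient())
        {
            // Fire all requests at once and collect per-request timings.
            var timings = await Task.WhenAll(
                Enumerable.Range(0, concurrentUsers).Select(async _ =>
                {
                    var sw = Stopwatch.StartNew();
                    using (var response = await client.GetAsync(url))
                    {
                        // Read the body so the timing covers the full response.
                        await response.Content.ReadAsStringAsync();
                    }
                    return sw.ElapsedMilliseconds;
                }));

            Console.WriteLine("Average: {0} ms, Max: {1} ms",
                timings.Average(), timings.Max());
        }
    }
}
```

Repeat with increasing user counts and you get a crude version of the ramp-up curve that JMeter will draw for you properly.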
ab and httperf are two more Unix-y options, if you don't mind delving in that direction.
There's a nice screencast for using httperf by peepcode.
Use the load-testing tools from Visual Studio Team System; get the 2010 version if you can.
The tools are great to use and provide wonderful instrumentation. There is also a programming model to go with the tools, allowing you to build some very complex testing scenarios.
Post the URL on stackoverflow.
Make it sound like a challenge, so lots of people come check it out: "Can you find the hidden performance problem in this app?"
Any suggestions for an accurate Web Log analysis tool to generate reports on the IIS logs? We used WebTrends, but I don't feel it was accurate.
To analyze weblogs, I don't think you can go wrong with Analog: http://www.analog.cx/
If you are analyzing your own logs, which are often huge files, you will want the fastest analyzer you can find. Analog is fast.
You'll want one that's been around a while and is still supported. Analog just celebrated its 10th birthday.
Analog claims to be the most popular logfile analyser in the world.
It supports multiple languages.
Did I say it's free and open source?
As far as accuracy goes, no tool gives perfect results. JavaScript often fails to catch hits. Trying to track individual people's paths through a website (i.e. for analytics purposes) is fraught with problems. And even trying to differentiate hits versus visits and screening out the bots is more of a black art than a science.
What is best is simply to have a tool that gives decent basic statistics that tell you what you need to know.
I've looked at other tools, such as Deep Log Analyzer: http://www.deep-software.com/, which attempts to do analytics from your web logs. But speed was a problem. They claim their new version 3.5 (April 2008), which I didn't try, has improved performance. The big advantage of a program like this is the advanced reporting you can do, including custom SQL requests. You have to purchase their professional version ($200) to do most of the analytics and custom queries. If Analog is too simple for you, then try the free version of Deep Log Analyzer.
And you can also try Microsoft's own Log Parser, as was the recommended answer in: https://stackoverflow.com/questions/157677/a-good-iis-log-viewer-for-large-log-files.
But you will need some extra skills to use it.
What are you wanting to analyze from your logs? There are a bunch of tools out there - free or paid for - that will go through the logs and spit out a great variety of figures. Some have real meaning, others are best used with a grain of salt.
What none will show you is "How many people are actually reading my wonderful web pages". Those that attempt to show "distinct site visitors" or any detailed metrics are at best a rough approximation to an indication of a vague trend...
But for what it's worth, we use Analog.
SHORT ANSWER:
You are correct to question the results; log analysis is not adequate to report actual traffic.
LONGER ANSWER:
WebTrends is a great tool for what it delivers. But as a previous administrator of a WebTrends installation, I found that web logs are notoriously bad at capturing metrics of interest.
For instance, if there is any caching in your web delivery stack (or on the consumer's side -- I'm shaking my fist at YOU, AOL!), then your web logs instantly stop reflecting your site's actual activity. This is because log analysis assumes that all user consumption translates into an HTTP request back to the web server, and is thus recorded in the IIS logs. With a cache in the path, that assumption breaks.
In the future if you want more reliable results, you ultimately need to ensure that there exists a way to bust any caching strategy. The obvious answer is dynamic content. But if you do not want to rewrite all of your content in such a fashion, just ensure your web traffic analysis uses a dynamic call.
WebTrends actually offers a solution to this problem, called the SDC server. This is exactly what Google Analytics offers as well -- it's a JavaScript call back to the analysis server.
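As an illustration of the "dynamic call" idea (not WebTrends' or Google's actual implementation), a tiny ASP.NET handler like the sketch below can serve a never-cached tracking pixel and record every hit; the class name and log path are placeholders:

```csharp
using System;
using System.IO;
using System.Web;

// Hypothetical tracking-pixel handler, registered as e.g. /track.ashx:
// every page embeds <img src="/track.ashx" />, so even fully cached pages
// still generate one uncacheable request per view.
public class TrackingPixelHandler : IHttpHandler
{
    // A 1x1 transparent GIF.
    private static readonly byte[] Pixel = Convert.FromBase64String(
        "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7");

    public void ProcessRequest(HttpContext context)
    {
        // Log whatever you want to measure (placeholder path); for
        // illustration only -- a production logger needs locking/buffering.
        File.AppendAllText(context.Server.MapPath("~/App_Data/hits.log"),
            DateTime.UtcNow.ToString("o") + "\t" +
            context.Request.UrlReferrer + Environment.NewLine);

        // Make sure neither browsers nor proxies cache this response.
        context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        context.Response.Cache.SetNoStore();
        context.Response.ContentType = "image/gif";
        context.Response.BinaryWrite(Pixel);
    }

    public bool IsReusable { get { return true; } }
}
```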
...I could go for days on this. If you want more specific information, comment back. ;)
EDIT: With WebTrends specifically, it is quite important to configure session tracking beyond their default IP/user-agent configuration. If your web server assigns a session cookie, you will find this increases your reliability, especially for differentiating between users who may sit behind the same NAT.
I have had really good luck with SmarterStats, from SmarterTools.
There is a free logging package from MSFT for viewing this information using SQL Reporting Services. Google it.
Doing it with the logs is only a good idea if it's internal - I'd use Google Analytics for anything on the internet.
I have been using Summary, which is paid-for software, for years, and love it. The cost of updates is getting to me, and paying for an update just to get user-agent string updates out of the deal is getting bothersome. Not that there are no other fixes; I just tend not to need them.
Anyone care to share if they have used Summary compared to Analog?
Look at the XpoLog log analysis platform for web application servers and web server logs. It is a log management and analysis platform that integrates with web server logs, creates reports, provides search and a log viewer, and also monitors for problems.
I need to deploy a Delphi app in an environment that needs a centralized data and file storage system (for document imaging) but has multiple branch offices with relatively poor interconnectivity. I believe a 3-tier database application is the best way to go, so I can provide a rich desktop experience with relatively lightweight data transfer needs. So far I have looked briefly at Delphi DataSnap, kbmMW and RemObjects SDK. It seems that kbmMW and RemObjects SDK use the least bandwidth. Does anyone have experience deploying any of these technologies in challenging environments with a significant number of users (I need to support 700+)? Thanks!
It depends on whether you are tied to remote datasets. If you aren't dataset-bound, then SOAP would likely be a good choice. Or, what I've done is write my own protocol that is similar to SOAP in nature. This was done before SOAP was a standard, and I'm glad I did - it gives you the ability to control more of the flow of data. It's a given that if you have poor connectivity, you will be spending time supporting it. It's very nice if it's your own code you are supporting versus having to wait on a vendor. (Although kbmMW and RemObjects are known to be pretty good vendors.)
Personal note: 700 users in a document imaging application over poor connectivity sounds like a mess. Spend the money on upgrading connectivity as it'll be cheaper in the long run.
Both kbmMW and RO SDK offer a binary format, which is more compact than the SOAP format, especially when you are working with documents.
RO SDK seems to offer more GUI tools to help you build your services.
Also give the RealThinClient SDK a look; it's a lightweight remoting framework.
But whatever framework you go with, your design will make it fast or slow. I have applications working on slow 128 kb lines, and they work perfectly without any user complaints, but I don't do large file transfers.
One thing to remember... it's not the number of users, but the number of them using the resources at the same time, that will be the issue. Attempt to develop your application "server stateless" if at all possible; this will allow greater flexibility in the long term if you find you have to add more servers to the pool to support your customer base. The hardest thing about n-tier is scaling beyond the first server... plan on that from the start. Each request should not know anything about a prior request... or at the very least the request should have a way of passing the context so the server can look it up in a session table or something (see the sketch below).
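The framework doesn't change the idea; as a rough, language-agnostic illustration (sketched here in C# with made-up names, but the same shape works in Delphi with any of the frameworks mentioned), "passing the context" can be as simple as a token that every call carries and the server resolves against a session table:

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical session record kept on the server side.
class SessionInfo
{
    public string UserName;
    public DateTime LastSeenUtc;
}

// A stateless service: nothing is remembered between calls except what
// can be looked up from the token the client sends with every request.
class DocumentService
{
    // Stands in for a session table (could equally be a database table
    // shared by all servers in the pool).
    static readonly ConcurrentDictionary<Guid, SessionInfo> Sessions =
        new ConcurrentDictionary<Guid, SessionInfo>();

    public Guid Login(string userName)
    {
        var token = Guid.NewGuid();
        Sessions[token] = new SessionInfo { UserName = userName, LastSeenUtc = DateTime.UtcNow };
        return token; // the client stores this and sends it back with each call
    }

    public byte[] GetDocument(Guid sessionToken, int documentId)
    {
        SessionInfo session;
        if (!Sessions.TryGetValue(sessionToken, out session))
            throw new UnauthorizedAccessException("Unknown or expired session.");

        session.LastSeenUtc = DateTime.UtcNow;
        // ...load and return the document; any server in the pool can handle
        // this call, because no per-user state lives outside the session table.
        return new byte[0]; // placeholder
    }
}
```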
Personally, I would recommend RemObjects. I have used it with good results.
I don't know if it's the very best / most efficient (glad you asked this question!), but I've had good results w/RemObjects SDK + DataAbstract. The latter made much of the plumbing details less involved, which was helpful. Still implementing, but so far so good.
If you really wanna go "low-bandwidth" use BSD Sockets API - that'll give you full control over what's being sent and there you can send as little information as you want. Of course then you'll have to implement all the tiers yourself, but hey - that's still an option :D