Rails performance analyzers

What are the preferred plugins for monitoring and analyzing the performance of a Rails app? I'm looking for both database/query analyzers and the rest of the stack if possible, though not necessarily all in one plugin. Which ones do you recommend?
(Bonus points for free ones :)
For example, this one looks spiffy.

I have used RAWK to get performance reports emailed to me at regular intervals. It works by analyzing the Rails production logs and has always done well for me.
I recently learned about Scout from their article about Rails monitoring, but I haven't had a chance to try it out yet. It looks promising!

+1 for New Relic.
Also consider FiveRuns. I haven't played with it, but it appears to have a loopback mode for development, versus New Relic's production mode.

Derek here from Scout - we've added deep Rails instrumentation as a plugin. This means you'll get a breakdown of where the request time is spent (DB, rendering, other) as well as optimization suggestions for slow MySQL queries.
You can play with the instrumentation on our free plan.
Screenshot: http://img.skitch.com/20090513-fe4mnaqn5e7i3nsde5qdwrr6k5.png

This article gives info on query analyzers in Rails:
http://ronnyml.wordpress.com/2008/07/03/query-analyzer-for-rails/
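That article covers plugins; in modern Rails you can also hook query timings yourself with no plugin at all. A minimal sketch using ActiveSupport::Notifications (the 100 ms threshold is an arbitrary choice):

    # config/initializers/slow_query_log.rb
    # Logs any SQL statement that takes longer than 100 ms.
    ActiveSupport::Notifications.subscribe('sql.active_record') do |_name, start, finish, _id, payload|
      ms = (finish - start) * 1000
      Rails.logger.warn("SLOW QUERY (#{ms.round(1)} ms): #{payload[:sql]}") if ms > 100
    end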

https://github.com/igorkasyanchuk/rails_performance
I've built my own option for monitoring performance; it uses Redis to store request information and doesn't send data to third-party services.
Nothing special, but it's a simple tool for getting the most important reports (rendering time, DB queries, most popular controller#actions).
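For the record, setup is just the gem plus an initializer, roughly like this (a sketch based on the README - check it for the current options; the Redis URL and retention period here are placeholders):

    # Gemfile
    gem 'rails_performance'

    # config/initializers/rails_performance.rb
    RailsPerformance.setup do |config|
      config.redis    = Redis.new(url: ENV['REDIS_URL'])  # where request stats are stored
      config.duration = 4.hours                           # how long to keep the data
    end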

Related

How much can the free version of Heroku handle?

For a mostly static website without a lot of stuff going on in the back end, how many page views can the free version of Heroku handle (Ruby on Rails)? Can it handle 100 a month, 1,000, 10,000, etc.? When should I consider upgrading, and are there better services? Thanks in advance!
Unfortunately, the only real answer to this question is "It depends". It depends on how many database calls you are making, whether or not those database calls are optimized, whether you have any form of caching in place, etc.
However, I am currently in a similar situation as you and just launched an early stage startup on Heroku. I highly recommend installing New Relic (don't worry, there's a free tier) to keep an eye on site usage/statistics.
Another (free!) addon I installed was Papertrail, which allows me to keep a close eye on the Heroku logs, and trigger alerts if there are too many errors (which could mean it's time to scale-up).
On the other side, you can test your Heroku server by sending load to it using the Apache Bench (ab) tool. See this answer for a good explanation.
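For example, a quick sanity check with Apache Bench might look like this (hypothetical hostname; -n is the total number of requests, -c is the concurrency):

    ab -n 1000 -c 10 http://your-app.herokuapp.com/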
Lastly, I would take a look at this question, which provides many good references on Heroku performance.
Good luck!
Heroku is free for up to 5 apps. Each app has access to 750 dyno-hours per month, which is enough to keep a single dyno running continuously (roughly 720-744 hours, depending on the month). That should be more than enough for a mostly static site.
I get about 4k-5k visitors per month on a Ruby app, but with a lot of caching.

Can anyone recommend best practices for profiling Ruby on Rails software under a Passenger/nginx combination in a live server environment?

I am attempting to determine what could possibly be causing 20+ second response times from a Rails 3 application running on EC2 and using ElastiCache. I have reason to believe the problem is in fact cache related, but I have no numbers to prove it. I'd like to get those numbers. For the sake of completeness, we're running the applications atop Ubuntu 12.04.
Searching Google, I found nothing directly relevant to my situation, and no StackOverflow topics I could find were even remotely applicable. If anyone can point me to some documentation on the matter, I'd be quite appreciative. Thank you!
I've found the best tool for this to be New Relic.
http://newrelic.com/
I don't work for them and get no benefit from you trying them.
They have a free level that you can start with. If you move up to the non-free version, you can literally trace all your requests through the different models and into the database, seeing how long the app spent in each section. It's a great tool for profiling.
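For what it's worth, wiring the New Relic agent into a Rails app is just a gem plus a config file, roughly like this (the license key and app name are placeholders; the YAML is normally generated from your account page):

    # Gemfile
    gem 'newrelic_rpm'

    # config/newrelic.yml
    common: &default_settings
      license_key: 'YOUR_LICENSE_KEY'
      app_name: 'My Application'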
Do you, by any chance, have access to standard web logs, including URLs and response times?
I faced a similar situation, searched the web, found nothing relevant, and eventually decided to roll my own, which I shared in this SO post:
Profiling a multi-tiered, distributed, web application (server side)
While it is far from perfect and may be too high level for some use cases, it gave me a pretty quick and broad insight into where the application I was trying to profile was spending most of its time, and what the slowest parts were (a rough sketch follows below). HTH.
The best parts of it are that:
It is 100% platform and programming language independent.
It is a 100% free software solution.
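To give a flavor of the approach, here is a minimal Ruby sketch of the same idea, assuming an access-log format where the request path is the 7th whitespace-separated field and the response time in milliseconds is the last field (adjust the field positions to your log format):

    # aggregate_times.rb - average response time per URL path, slowest first
    totals = Hash.new { |h, k| h[k] = [0.0, 0] }

    ARGF.each_line do |line|
      fields = line.split
      next if fields.size < 7
      path = fields[6].split('?').first          # drop query strings so paths group together
      ms   = Float(fields.last) rescue next      # skip lines without a numeric time
      totals[path][0] += ms
      totals[path][1] += 1
    end

    totals.sort_by { |_, (sum, n)| -(sum / n) }.first(20).each do |path, (sum, n)|
      printf("%8.1f ms avg  %6d hits  %s\n", sum / n, n, path)
    end

Run it as `ruby aggregate_times.rb access.log`.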

How can I test if my web application could handle heavy traffic?

What would be a proper way to simulate a large number of requests to test if my web application can handle it?
You could try using Microsoft's WCAT tools. Look here: http://support.microsoft.com/kb/231282
They're free, too. That's always nice.
Depending on your budget, you may be interested in some load testing software designed for this. A Google search brings up all sorts of alternatives. This is probably the best way to do it.
This one has a free trial version and isn't too pricey, but I would recommend shopping around first.
I've used JMeter in the past, and I find it to be very useful for stress/load testing a website, even ones written in ASP.NET (with or without MVC).
In general you would want to (with any tool) write a script of what an average user of your site would do. You may even end up creating multiple such scripts. Tools like JMeter even allow you to add a random element to a script. With these scripts created, a load testing tool can then simulate as many users as you desire hitting your site.
I would recommend allowing JMeter to slowly ramp up the number of concurrent users and watching the response time graph. The point where the response time starts climbing sharply is the point where you've hit the maximum number of users (given your scripts) that your site can handle.
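Once you have a test plan saved, JMeter can also run it headless, which is kinder to the load-generating machine; a typical non-GUI invocation looks like this (file names are placeholders):

    jmeter -n -t average_user.jmx -l results.jtl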
ab and httperf are two more unixy options, if you don't mind delving in that direction.
There's a nice screencast for using httperf by Peepcode.
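For reference, a basic httperf run looks something like this (hypothetical host; it opens 1000 connections at a rate of 50 per second):

    httperf --server www.example.com --port 80 --uri / --num-conns 1000 --rate 50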
Use the load testing tools from Visual Studio Team System (the 2010 version, if you can get it).
The tools are great to use and provide wonderful instrumentation. There is also a programming model to go with the tools, allowing you to make some very complex testing scenarios possible.
Post the URL on stackoverflow.
Make it sound like a challenge, so lots of people come check it out: "Can you find the hidden performance problem in this app?"

Web Server Log Analysis Tool

Any suggestions for an accurate Web Log analysis tool to generate reports on the IIS logs? We used WebTrends, but I don't feel it was accurate.
To analyze web logs, I don't think you can go wrong with Analog: http://www.analog.cx/
If you are analyzing your own logs, which are often huge files, you will want the fastest analyzer you can find. Analog is fast.
You'll want one that's been around awhile and is still supported. Analog just celebrated its 10th birthday.
Analog claims to be the most popular logfile analyser in the world.
It supports multiple languages.
Did I mention it's free and open source?
As far as accuracy goes, no tool gives perfect results. JavaScript-based tracking often fails to catch hits. Trying to track individual people's paths through a website (i.e., for analytics purposes) is fraught with problems. And even trying to differentiate hits versus visits and screening out the bots is more of a black art than a science.
What is best is simply to have a tool that gives decent basic statistics that tell you what you need to know.
I've looked at other tools, such as Deep Log Analyzer: http://www.deep-software.com/, which attempts to do analytics from your web logs. But speed was a problem. They claim their new version 3.5 (April 2008), which I didn't try, has improved performance. The big advantage of a program like this is the advanced reporting you can do, including custom SQL queries. You have to purchase their professional version ($200) to do most of the analytics and custom queries. If Analog is too simple for you, then try the free version of Deep Log Analyzer.
And you can also try Microsoft's own Log Parser, as was the recommended answer in: https://stackoverflow.com/questions/157677/a-good-iis-log-viewer-for-large-log-files.
But you will need some extra skills to use it.
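As a taste of what Log Parser can do, a top-ten-URLs report straight from the IIS logs looks something like this (field names follow the standard W3C/IIS log format; the log file pattern is a placeholder):

    LogParser.exe "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C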
What do you want to analyze from your logs? There are a bunch of tools out there - free or paid - that will go through the logs and spit out a great variety of figures. Some have real meaning; others are best taken with a grain of salt.
What none of them will show you is "How many people are actually reading my wonderful web pages?". Those that attempt to show "distinct site visitors" or any detailed metrics are, at best, a rough approximation of a vague trend...
But for what it's worth, we use Analog.
SHORT ANSWER:
You are correct to question the results; log analysis is not adequate to report actual traffic.
LONGER ANSWER:
WebTrends is a great tool for what it delivers. But as a former administrator of a WebTrends installation, I found that web logs are notoriously bad at capturing the metrics of interest.
For instance, if there is any caching in your web delivery stack (or on the consumer's side -- I'm shaking my fist at YOU, AOL!), then your web logs instantly fail to reflect your site's actual activity. This is because log analysis assumes that all user consumption will translate to an HTTP request back to the web server -- and thus be recorded in the IIS logs. With a cache in between, that simply isn't the case.
In the future if you want more reliable results, you ultimately need to ensure that there exists a way to bust any caching strategy. The obvious answer is dynamic content. But if you do not want to rewrite all of your content in such a fashion, just ensure your web traffic analysis uses a dynamic call.
WebTrends actually offers a solution to this problem, called SDC server. This is exactly what Google Analytics offers as well-- it's a javascript call back to the analysis server.
...I could go for days on this. If you want more specific information, comment back. ;)
EDIT: With WebTrends specifically, it is quite important to configure session tracking beyond their default IP/user-agent configuration. If your web server assigns a session cookie, you will find this increases your reliability, especially for differentiating between users who may sit behind the same NAT.
I have had really good luck with SmarterStats, from SmarterTools.
There is a logging package for free from MSFT for viewing this information using SQL Reporting Services. Google it.
Doing it with the logs is only a good idea if it's internal - I'd use Google Analytics for anything on the public internet.
I have been using Summary, which is paid software, for years, and love it. The cost of updates is getting to me, and paying for an update just to get user-agent string updates is getting bothersome. Not that there are no other fixes; I just tend not to need them.
Anyone care to share if they have used Summary compared to analog?
Look at the XpoLog log analysis platform for web application server and web server logs. It's a log management and analysis platform that integrates with web server logs, creates reports, provides search and a log viewer, and also monitors for problems.

Can you Distribute a Ruby on Rails Application without Source?

I'm wondering if it's possible to distribute a RoR app for production use without the source code? I've seen this post on SO, but my situation is a little different. This would be an app administered by people with some clue, so I'm cool with still requiring an Apache/Mongrel/MySQL setup on the customer end. All I really want is for the source to be protected. Encoding seems a popular way to go for distributing PHP apps (e.g., HelpSpot).
I've found these potential solutions:
Zenobfuscate - not all types of Ruby code are supported, however, so that counts it out.
Ruby Encoder - may be the best option, as their PHP encoder looks alright (I haven't tried it, however), but it's not available yet. I've used ionCube for PHP before and it worked well, but it doesn't seem that ionCube is interested yet.
Slingshot - it was mentioned in the other SO post, but it solves a different problem from mine and the source is still visible.
RubyScript2Exe - from the documentation, it's not production ready, so that counts it out.
I've heard that potentially using JRuby and distributing bytecode might be a way to achieve this, but I've never used JRuby so I'm not sure what's involved.
Can anyone offer any ideas and/or known examples? Ideally I'd love to have some kind of automated build scenario as well.
Your best option right now is to use JRuby. A little bit of background: my company (BitRock) works with many proprietary and commercial open source vendors. We help them package their server software, which is typically based on PHP, Java or Ruby, together with a web server or application server (Apache, Tomcat), the language runtime and a database (typically Postgres, MySQL) into a self-contained, easy-to-use installer.
We have a large number of PHP-based customers (including HelpSpot, which you mention) but also several Rails-based ones. In the case of the RoR customers, the norm is to use JRuby together with Tomcat or GlassFish, although in some cases we also bundle a native Ruby interpreter to run specific scripts that rely on libraries not yet ported to JRuby (usually not core to the application). JRuby has matured quickly, and in many cases it actually runs their code faster than regular Ruby.
You will also need to consider that although porting your code to JRuby is fairly straightforward, you will need to invest some time in it. You may want to check JRuby Stack, which is a free installer of everything you need to get started. Good luck!
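To make the JRuby route a bit more concrete, here is a sketch of packaging a Rails app as a self-contained .war with the Warbler gem (this assumes your app already runs under JRuby; the "compiled" feature precompiles .rb files to .class files so plain sources aren't shipped):

    # Gemfile (under JRuby)
    gem 'warbler'

    # config/warble.rb (generated by running `warble config`)
    Warbler::Config.new do |config|
      config.features = %w(compiled)   # ship compiled .class files instead of .rb sources
    end

Running `warble war` then produces a .war you can deploy to Tomcat or GlassFish.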
If you release the source, obfuscated or otherwise, your app will be pirated. See, for example, Mint. It depends on what you're building, but you may find that you're better off releasing the app as a hybrid of sorts: A hosted app with a well-defined API, and a component that runs on the customer's server. As long as the true value of your product lives on the server side, you don't need to obfuscate your code, and you can just release the source code unmodified. Additionally, this may also give you the opportunity to reach clients running, say, PHP rather than Ruby. See, for example, Google Analytics, HopToad, Scout, etc, etc.
You can, but it wouldn't do anything to prevent somebody from reverse-engineering or modifying it. I remember an article about similar attempts to obfuscate Perl and how they could be effectively bypassed with a debugger and five minutes of effort.
If you can't wait for the delivery of RubyEncoder, then I think ZenObfuscate is the most promising. Though it may require some modifications to your source code, they do say this on their site:
ZenObfuscate costs $2500 for a site license or is individually negotiable for other licensing schemes. Yes, that is expensive. That was on purpose. But don't let that thwart you too much. If your product is really cool and we want to see it succeed, we'll make it work. "Really cool" is not freecell.
Of course, for $2500 (or more), you'd hope to get a few tweaks to the compiler that'd make your codebase fully supported. It might be worth engaging them in the conversation.
You can also take a look at Mingle from ThoughtWorks Studios as an example of using JRuby for this.
It's a Ruby on Rails app that they run using JRuby. They've customized JRuby to load encrypted .rb files.
Take a look at JumpBox.
I've had conversations with them on the topic, and they seem to have a solution that will work soon for Rails apps.
I'm wondering if you could just "compile" the Ruby code into an executable using something like RubyScript2Exe.
To be honest, I haven't used it, but it seems like it could be what you want, even if it just packages up the scripts with the interpreter into a single executable.
