How to get memory usage in a Rails app? - ruby-on-rails

For example, I have an update action in my Products controller.
I want to measure how much memory is consumed when the update action is invoked.
Thanks.

New Relic's RPM will do what you need -
http://www.bestechvideos.com/2009/03/21/railslab-scaling-rails-episode-4-new-relic-rpm
Also take a look at some of the answers here: ruby/ruby on rails memory leak detection

Ruby Memory Validator monitors memory allocation for Ruby applications.
If you want to modify your Ruby runtime to add a memory tracking API to it, take a look at this.
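Short of installing a profiler, a quick way to get a per-action memory delta is to sample the process's resident set size around the code you care about. Here is a minimal sketch, not tied to any gem: the `rss_kb` and `measure_memory` names are made up, and shelling out to `ps` assumes Linux or macOS (elsewhere you could read /proc/self/status or use a gem such as get_process_mem).

```ruby
# Resident set size of the current process, in kilobytes.
# `ps -o rss=` works on Linux and macOS.
def rss_kb
  `ps -o rss= -p #{Process.pid}`.to_i
end

# Run a block and report how much the process RSS grew while it ran.
def measure_memory(label)
  before = rss_kb
  result = yield
  puts "#{label}: RSS grew by #{rss_kb - before} KB"
  result
end

# In a Rails controller this could be wired up as an around_action, e.g.:
#   around_action { |_controller, action| measure_memory("update") { action.call } }
measure_memory("allocate strings") { Array.new(100_000) { "x" * 50 } }
```

Note that RSS only goes up when Ruby asks the OS for more memory, so small actions may show a delta of 0 KB even though they allocate objects.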

Related

Memory Leaks in my Ruby on Rails Application

My application calls a bunch of APIs that return lots of data, which is manipulated inside my controller to produce various insights (passed on to my view).
The problem is that I have a memory leak in my application, and I currently need to restart the application after a few requests.
I have also been caching all of my API calls to improve the performance of my application. Most of the data returned by the APIs is stored as hashes, and this data is then manipulated (effectively duplicated, using group_by).
I'm using Ruby 1.9 and Rails 3.2. How can I remove this memory leak from my application?
You should confirm that you indeed have a memory leak and not memory bloat.
You can read about the Ruby GC here.
GC.stat[:heap_live_slot] represents the objects that were not cleared by the last GC. If this number steadily increases request after request, you can be sure you have a memory leak.
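That "steadily increasing" check can be sketched in a few lines. One caveat: the GC.stat key name varies by Ruby version (:heap_live_num on 1.9, :heap_live_slot on 2.1, :heap_live_slots on 2.2+), so this sketch picks whichever the running interpreter exposes; the leak itself is simulated with a retained array.

```ruby
# Pick the live-object key this Ruby version uses.
LIVE_KEY = [:heap_live_slots, :heap_live_slot, :heap_live_num].find { |k| GC.stat.key?(k) }

# Count objects that survive a full GC.
def live_objects
  GC.start # collect garbage first, so only survivors are counted
  GC.stat[LIVE_KEY]
end

retained = [] # simulates state that leaks across requests
samples = 5.times.map do
  1_000.times { retained << "leaked string" } # one "request"
  live_objects
end

# A series that climbs on every iteration points to a leak, not bloat.
puts samples.inspect
```

If the series rises for a while and then plateaus, that is bloat (a high but stable working set), not a leak.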
You can also check this list of Ruby gems that are known to leak memory:
https://github.com/ASoftCo/leaky-gems
You can use the bundler-leak gem to find memory leaks in your gem dependencies:
https://github.com/rubymem/bundler-leak

Ruby on Rails memory management

I'm implementing a Ruby on Rails server (Ruby 2.2.1 and Rails 4.1.10) and I'm facing memory issues: the server process (Puma) can take 500 MB or more, mainly when I'm uploading a large file to the server. I'm using CarrierWave.
My question is about Ruby's memory management and garbage collection. Since my server is dedicated to an embedded system, I really need to cap or control the memory my process takes from the system.
Is there a way to see which objects (with their sizes) are still alive when they shouldn't be?
Is it true that Ruby's memory system does not return freed memory to the OS if the heap is fragmented?
Please help me figure out what's going on when my idle memory usage is larger than 150 MB.
Stéph
After reading a lot of posts about this problem, it seems the real cause is the Ruby GC, which since version 2.1.x consumes a lot of server memory and causes a lot of swapping.
To fix those performance issues, I set RUBY_GC_HEAP_OLDOBJECT_LIMIT_FACTOR to a value around 1.25. You can play with that setting to find the best value for your environment.
FYI, my app runs Ruby 2.1.5 on Heroku Cedar-14 and it worked like a charm.
Hope it helps :)
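One practical detail: RUBY_GC_HEAP_OLDOBJECT_LIMIT_FACTOR has to be set in the environment before the Ruby process boots (Procfile, shell, systemd unit, etc.); it cannot be changed from inside a running app. From inside, you can at least confirm what took effect and watch old-generation growth. A small sketch (GC.stat key names as of Ruby 2.2+):

```ruby
# Show the configured factor (or note that the default is in effect)
# and a few old-generation GC statistics.
puts "factor:      #{ENV.fetch('RUBY_GC_HEAP_OLDOBJECT_LIMIT_FACTOR', '(default)')}"

stats = GC.stat
puts "old objects: #{stats[:old_objects]} (limit #{stats[:old_objects_limit]})"
puts "major GCs:   #{stats[:major_gc_count]}, minor GCs: #{stats[:minor_gc_count]}"
```

If old_objects keeps climbing toward its limit between requests, major GCs will fire more often, which is exactly the behavior this tuning knob trades off.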
I guess my problem is not related to how the GC is triggered, because even if I ask for a manual garbage collection with GC.start, I don't get my memory back. Maybe I have a memory leak somewhere, but I would like to find a way to track it.
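Short of a full profiler, a rough first step for tracking what stays alive is to break the live heap down after a forced GC. ObjectSpace.count_objects is built into Ruby and groups live slots by internal type (T_STRING, T_ARRAY, ...); for per-object sizes you can `require 'objspace'` and use ObjectSpace.memsize_of, and Ruby 2.1+ can record allocation sites with ObjectSpace.trace_object_allocations_start. A minimal sketch:

```ruby
# After a full GC, report live heap slots grouped by internal object type,
# largest first. :TOTAL and :FREE are bookkeeping entries, so skip them.
def heap_breakdown
  GC.start
  ObjectSpace.count_objects
             .select { |key, _| key.to_s.start_with?("T_") }
             .sort_by { |_, count| -count }
end

heap_breakdown.first(5).each do |type, count|
  puts format("%-12s %d", type, count)
end
```

Running this before and after the suspect code path (a large upload, say) shows which object type is accumulating, which narrows down where to look next.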
In fact, in your case the carrierwave gem itself seems to be the source of your issue.
This gem appears to load the entire file into memory rather than processing it in chunks, so when you upload large files you can easily hit your server's memory limit.
This post seems to confirm what I'm saying:
https://github.com/carrierwaveuploader/carrierwave/issues/413
What I can suggest in your case is to consider direct upload to S3, saving your server from processing the upload by doing it in the background directly on Amazon's servers.
I'm not very familiar with CarrierWave, but this gem should do the trick:
https://github.com/dwilkie/carrierwave_direct
I don't know if it's a valid solution for you, but given your issue it seems to be your best alternative.
Hope it helps :) !

Searching for a memory leak in a JRuby/Rails/Tomcat application with YourKit

I had the misfortune of being given the task of searching for an unconfirmed memory leak. This is my first time using YourKit, so while I know what I should be looking for, I have no idea where to look or how.
My understanding is that over time memory consumption goes up because certain objects are not being released. Pretty hard to do that in Rails, but I guess somebody figured out how.
Here's what the memory telemetry looks like:
Ignoring the fact that the periods between GCs increase over time, it looks like Old Gen memory is going up... maybe.
Now we probably need to know what objects are getting piled on there and what spawns them.
Steps I've taken so far:
Triggered GC
Started 'Object Allocation Recording' (each 100th... I have a feeling it might be useful for something)
Waited for a while
Triggered another GC
Did a memory dump
After opening the memory snapshot in YourKit, I have no idea what I should be looking for.
There's a Call Tree under Allocations. Expanding the tree gives me a hint of some of the Rails code being run, but I have no idea whether what I'm looking at is actually what I need.
Can any Java-profiling, YourKit-wielding persons point me in the right direction?
Edit: Example of what I can see in the Merged paths view:
1) Do not use object allocation recording. It is almost useless for finding memory leaks.
2) I recommend periodically advancing the objects' generation (it is very fast and adds no overhead). Take a look at http://www.yourkit.com/docs/java/help/generations.jsp
You will be able to split objects by "age" and understand why the heap grows.
Since you are using Tomcat, it might be helpful to take a look at "Web applications": http://www.yourkit.com/docs/java/help/web_applications.jsp If your application has problems with class reloading/redeploying, they will be visible there.
Best regards,
Vladimir Kondratyev
YourKit, LLC

How to find out why rails app uses a lot of memory?

I am using Heroku, and today I started seeing a lot of Error R14 (Memory quota exceeded) errors. What is the best way (or tool) to track this down and find out what is using up all the memory?
Install New Relic. Here you go: https://newrelic.com/.
Also, you probably want to check that you are not loading a lot of records via Active Record. For example, Comment.all is probably not a good idea if you have, say, 10,000 comments. Instead, use something like Comment.find_in_batches, which is much more memory-friendly.
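The difference can be sketched in plain Ruby. Real usage would be `Comment.find_each` or `Comment.find_in_batches`; in the sketch below an array stands in for the table, and the slicing stands in for the LIMIT/OFFSET queries ActiveRecord issues, so only one bounded batch of records is live at a time instead of the whole result set.

```ruby
# Plain-Ruby illustration of batched iteration: pull a bounded window of
# records at a time instead of materializing the whole collection.
def find_in_batches(source, batch_size: 1000)
  offset = 0
  loop do
    batch = source[offset, batch_size] # stands in for a LIMIT/OFFSET query
    break if batch.nil? || batch.empty?
    yield batch
    offset += batch_size
  end
end

all_ids = (1..10_000).to_a # pretend: 10,000 comment ids
peak = 0
find_in_batches(all_ids, batch_size: 1000) do |batch|
  peak = [peak, batch.size].max # at most 1,000 records live per iteration
end
puts peak # prints 1000
```

With `Comment.all.each`, the equivalent of `all_ids` would be 10,000 full ActiveRecord objects held in memory at once, which is exactly what pushes a dyno over the R14 quota.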

Rails garbage collection is not removing old objects

Some odd behavior I encountered while optimizing a Rails view:
After tweaking the number of garbage collection calls per request, I didn't see any real performance improvement. Investigating the problem, I found that garbage collection didn't actually remove that many dead objects!
I have a very heavy view with LOADS of objects.
Using scrap, I found that after a fresh server start and one page load the object count was about 670,000; after reloading the page 3 times it had risen to 19,000,000!
RAILS_GC_MALLOC_LIMIT is set to 16,000,000 and I can see GC has been called 1,400 times.
Why does memory keep increasing on a refresh of the page? And is there a way to make sure the old objects are removed by GC?
PS: Running on REE 1.8.7-2011.03 with Rails 3.2.1
I highly recommend using New Relic for optimization; I got a much bigger performance boost there than with a little GC tweaking.
You don't need to GC objects you never create :)