I have a simple query that for some reason hangs, causes Heroku to hit its memory limit, and crashes the server. I have never seen this behavior before, so I am looking for suggestions on what might be causing it:
#city = params[:city] ? City.find(params[:city]) : City.first
SELECT "cities".* FROM "cities" WHERE "cities"."id" = ? LIMIT 1 [["id", "1"]]
It's simple, but for some reason it causes this behavior in all environments (dev, staging, and prod).
We connect to a SQLite database in development and Amazon RDS MySQL in staging and production (sqlite3 gem, mysql2 gem, Ruby 2.0.0, Rails 4.0.0).
Related
We are using the PaperTrail gem, so in Postgres we have a table called xxx.versions.
From a local Rails console we try to query the database through the PaperTrail model:
PaperTrail::Version.first
PG::InsufficientPrivilege: ERROR: permission denied for relation versions
: SELECT "versions".* FROM "versions" ORDER BY "versions"."id" ASC LIMIT 1
but if I access the table like this:
ActiveRecord::Base.connection.execute("SELECT * FROM xxx.versions limit 1")
a record set is returned.
This only happens when we connect to the database from our local machines.
If I run PaperTrail::Version.first from the Rails console on one of the QA servers, it connects just fine.
Other troubleshooting details:
I have used the same credentials as the QA server and received the same results.
On my local machine, I can query the DB from DataGrip/pgAdmin just fine outside the Rails project.
There are several possibilities:
You use a different PostgreSQL user or database in the two attempts.
There is a different relation named versions in another schema that comes before xxx on the search_path.
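To check which of these applies, you can compare what the Rails connection sees against what psql/DataGrip sees, using standard PostgreSQL commands (run them through ActiveRecord::Base.connection.execute on the Rails side):

```sql
-- Which user and database is this connection actually using?
SELECT current_user, current_database();

-- Which schemas are searched, and in what order?
SHOW search_path;

-- List every relation named "versions", in any schema,
-- to spot a shadowing table earlier on the search_path:
SELECT n.nspname AS schema, c.relname AS relation
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE c.relname = 'versions';
```

If a versions table in another schema shadows xxx.versions, the unqualified query Rails generates will hit the wrong one, which would also explain the permission error.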
We are experiencing a strange problem with a Rails application on Heroku. Just after migrating from Rails 3.2.17 to Rails 4.0.3, our PostgreSQL server shows an unbounded increase in memory usage, and then it returns the following error on every request:
ERROR: out of memory
DETAIL: Failed on request of size xxx
Just after releasing the application with Rails 4, PostgreSQL memory starts to increase.
As you can see in the screenshot below, it increases from 500 MB to more than 3.5 GB in 3 hours.
Simultaneously, commits per second doubled. It went from 120 commits per second:
to 280 commits per second:
It is worth noting that when we restart the application, memory goes back down to a normal value of 600 MB before climbing to more than 3 GB a few hours later (at which point every SQL request shows the 'out of memory' error). It is as if killing the ActiveRecord connections released memory on the PostgreSQL server.
We may well have a memory leak somewhere.
However :
It was working very well with Rails 3.2. Maybe this problem is a conjunction of the changes we made to adapt our code to Rails 4 and Rails 4 itself.
The increase in commits per second just after the Rails 4 upgrade seems very odd.
Our stack is :
Heroku, x2 dynos
Postgresql, Ika plan on heroku
Unicorn, 3 workers per instance
Rails 4.0.3
Redis Cache.
Noteworthy Gems : Delayed jobs (4.0.0), Active Admin (on master branch), Comfortable Mexican Sofa (1.11.2)
Nothing seems really fancy in our code.
Our postgresql config is :
work_mem : 100MB
shared_buffers : 1464MB
max_connections : 500
maintenance_work_mem : 64MB
Has anyone ever experienced such behaviour when switching to Rails 4? I am also looking for ideas on how to reproduce it.
All help is very welcome.
Thanks in advance.
I don't know whether it is better to answer my own question or to update it... so I chose to answer. Please let me know if updating is preferred.
We finally found the problem. Since version 3.1, Rails has used prepared statements for simple queries like User.find(id). Version 4.0 added prepared statements for queries on associations (has_many, belongs_to, has_one).
For example, the following code:
class User < ActiveRecord::Base
  has_many :addresses
end
user.addresses
generates the query:
SELECT "addresses".* FROM "addresses" WHERE "addresses"."user_id" = $1 [["user_id", 1]]
The problem is that Rails only adds prepared-statement bind variables for foreign keys (here user_id). If you use a custom SQL condition like
user.addresses.where("moved_at < ?", Time.now - 3.month)
it will not bind moved_at as a variable in the prepared statement. So a new prepared statement is generated every time the query runs with a different value. Rails manages prepared statements in a pool with a maximum size of 1000.
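To make the growth pattern concrete, here is a toy model in plain Ruby of a statement cache keyed by SQL text (an illustration of the behaviour described above, not Rails internals): a bound parameter reuses one cache entry, while an inlined value creates a new entry per distinct value.

```ruby
# Toy model of a prepared-statement cache keyed by SQL text.
cache = {}

# Parameterized condition: the SQL text never changes, so the cache
# holds a single entry no matter how many times the query runs.
3.times do
  sql = %(SELECT * FROM addresses WHERE user_id = $1)
  cache[sql] = true
end

# Value inlined into the SQL (as happens for non-foreign-key
# conditions): every distinct value produces a distinct statement.
3.times do |i|
  sql = %(SELECT * FROM addresses WHERE moved_at < '2014-0#{i + 1}-01')
  cache[sql] = true
end

puts cache.size # => 4
```

With a time-based condition like moved_at < Time.now, the inlined timestamp differs on every request, so the cache fills toward its 1000-entry cap on each connection.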
However, PostgreSQL prepared statements are not shared across connections, so within an hour or two each connection holds 1000 prepared statements. Some of them are very large. This leads to very high memory consumption on the PostgreSQL server.
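If you hit this, one mitigation is to disable prepared statements for the PostgreSQL adapter in config/database.yml (the prepared_statements flag is supported in recent Rails versions; the trade-off is that every query is parsed and planned again on each execution). A minimal sketch:

```yaml
# config/database.yml
production:
  adapter: postgresql
  prepared_statements: false
```

This caps the per-connection statement memory at the cost of some query-planning overhead, which is usually a good trade when conditions embed ever-changing values.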
I'm running a Rails app (v3.1.10) on the Heroku Cedar stack with the Papertrail add-on, and it's going crazy because of the size of the logs.
My app is really verbose and the logs are getting huge (really huge):
Sometimes because I serialize a lot of data in one field, which makes a huge SQL request. In my model I have many:
serialize :a_game_data, Hash
serialize :another_game_data, Hash
serialize :a_big_set_of_game_data, Hash
[...]
Thanks to my AS3 Flash app working with big sets of JSON...
Sometimes because there are lots of partials to render:
Rendered shared/_flash_message.html.erb (0.1ms)
Rendered shared/_header_cart_info.html.erb (2.7ms)
Rendered layouts/_header.html.erb (19.4ms)
[...]
It's not the big issue here, but I've added this case too because jamiew handles it; see below...
Sometimes because there are lots of SQL queries on the same page:
User Load (2.2ms) SELECT "users".* FROM "users" WHERE "users"."id" = 1 LIMIT 1
Course Load (5.3ms) SELECT "courses".* FROM "courses" WHERE (id = '1' OR pass_token = NULL)
Session Load (1.3ms) SELECT "sessions".* FROM "sessions" WHERE "sessions"."id" = 1 LIMIT 1
Training Load (1.3ms) SELECT "trainings".* FROM "trainings" WHERE "trainings"."id" = 1 LIMIT 1
[...]
It's a big (too) complex App we've got here... yeah...
Sometimes because there are lots of params:
Parameters: {"_myapp_session"=>"BkkiJTBhYWI1MUVlaVdtbE9Eb1Y2I5BjsAVEkiEF9jc3JmX3Rva2VlYVWZyM2I0dEZaR1YwNXFjZhZTQ1uBjsARkkiUkiD3Nlc3Npb25faWQGOgZFRhcmRlbi51c2yN1poVm8vdWo3YTlrdUZzVTA9BjsARkkiH3dAh7CMTQ0Yzc4ZDJmYzg5ZjZjOGQ5NVyLmFkbWluX3VzZXIua2V5BjsAVFsISSIOQWRtaW5Vc2VyBjsARlsGaQZJIiIkMmEkMTAkcmgvQ2Rwc0lrYzFEbGJFRG9jMnZvdQY7AFRJIhl3YXJkZW4udXNlci51c2VyLmtleQY7AFRbCEkiCVVzZXIGOwBGWwZpBkkiIiQyYSQxMCRBUFBST2w0aWYxQmhHUVd0b0V5TjFPBjsAVA==--e4b53a73f6b622cfe7550b2ee12678712e2973c7", "authenticity_token"=>"EeiWmlODoYXUfr3b4tFZGV05qr7ZhVo/uj7a9kuFsU0=", "utf8"=>"✓", "locale"=>"fr", "id"=>"1", "a"=>1, "a"=>1, "a"=>1, "a"=>1, "a"=>1, "a"=>1, [...] Hey! You've reach the end of the line but it's not the end of the parameters...}
The AS3 Flash app sends big JSON data to the controller...
I didn't mention the (in)famous asset-pipeline logging problem because I'm now using the quiet_assets gem to handle it:
https://github.com/evrone/quiet_assets
So... what did I try?
1: Dennis Reimann's middleware solution:
http://dennisreimann.de/blog/silencing-the-rails-log-on-a-per-action-basis/
2: Spagalocco's gem (inspired by solution #1):
https://github.com/spagalloco/silencer
3: jamiew's monkeypatches (inspired by solution #1 + a bonus):
https://gist.github.com/1558325
Nothing is really working as expected but it's getting close.
I would rather use a method in my ApplicationController like this:
def custom_logging(opts={}, show_logs=true)
disable_logging unless show_logs
remove_sql_requests_from_logs if opts[:remove_sql_requests]
remove_rendered_from_logs if opts[:remove_rendered]
remove_params_from_logs if opts[:remove_params]
[...]
end
...and call it in any controller method: custom_logging({:remove_sql_requests=>1, :remove_rendered=>1})
You got the idea.
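The per-action idea above can be sketched in plain Ruby, without any Rails APIs. Here with_log_level is a hypothetical helper that bumps the log level for the duration of a block, which is the core of what a custom_logging method would do:

```ruby
require "logger"
require "stringio"

buffer = StringIO.new
logger = Logger.new(buffer)

# Hypothetical helper: raise the log level for the duration of a
# block, restoring the previous level afterwards.
def with_log_level(logger, level)
  previous = logger.level
  logger.level = level
  yield
ensure
  logger.level = previous
end

logger.info("before")            # logged (default level is DEBUG)
with_log_level(logger, Logger::WARN) do
  logger.info("noisy detail")    # suppressed
  logger.warn("still important") # logged
end
logger.info("after")             # logged

puts buffer.string.lines.count # => 3
```

In a real app the same pattern wraps the controller action (e.g. via an around filter), so noisy actions log only warnings and errors while the rest of the app keeps its normal verbosity.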
So, is there any good resource online to handle this?
Many thanks for your advices...
I'm the author of the silencer gem mentioned above. Are you looking to filter logging in general, or for a particular action? The silencer gem handles the latter problem. While you can certainly use it in different ways, it's mostly intended for particular actions.
It sounds like what you're looking for is less verbose logging. I would recommend you take a look at lograge. I use it in production in most of my Rails apps and have found it quite useful.
If you need something more specialized, you may want to look at implementing your own LogSubscriber which is essentially the lograge solution.
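For reference, enabling lograge is a one-line config fragment (option name from the lograge README; MyApp is a placeholder for your application module):

```ruby
# config/environments/production.rb
MyApp::Application.configure do
  # Replace the multi-line request logs (partials, SQL timings, etc.)
  # with one summary line per request.
  config.lograge.enabled = true
end
```

That single summary line per request removes the "Rendered ..." and per-query noise shown above without silencing warnings or errors.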
Set your log level in the Heroku environment.
View your current log level:
heroku config
You most likely have "info", which is just a lot of noise.
Change it to warn or error
heroku config:add LOG_LEVEL=WARN
Also, when viewing the logs, only specify the "app" server
heroku logs --source app
Personally, I append --tail to see the logs live.
heroku logs --source app --tail
I am writing a Rails application that uses Solr for full-text search.
In development mode I used the sunspot_solr gem, which is really handy. I used the SQLite3 database in development and everything went smoothly.
Now it is time to move to the production server: I installed the solr-tomcat package and moved to my production database, which is MySQL. I moved the Solr conf files from my application folder to /usr/share/solr/conf.
Suddenly, I cannot reindex, and Solr returned this:
rake RAILS_ENV=production sunspot:solr:reindex
[# ] [ 50/7312] [ 0.68%] [00:00] [00:41] [ 175.82/s]rake aborted!
Mysql::Error: Unknown column 'barangs.' in 'where clause': SELECT `barangs`.* FROM `barangs` WHERE (`barangs`.`` >= 0) ORDER BY `barangs`.`` ASC LIMIT 50
Intrigued, I tried to reindex with the development database; all is well and it reindexes fine. This behavior left me baffled.
Any help will be appreciated.
I found the problem. It was due to me dropping and restoring tables recklessly, which lost some relationship definitions.
Hence the "missing" column on table 'barangs': the empty column name in the generated SQL is the column Solr can't find.
For others facing this problem, I recommend inspecting the relationships between models first.
I'm writing a multi-tenancy gem for Rails.
My tests currently establish a connection for a particular adapter, run some tests, then repeat for each subsequent database adapter.
My problem however is that when I call:
ActiveRecord::Base.establish_connection
with a different adapter, the SQL generated from it is still in the form of the old adapter. For instance, I run the MySQL tests, then try to run the PostgreSQL tests. I get this error:
Failure/Error: subject.create(database1)
ActiveRecord::StatementInvalid:
PGError: ERROR: syntax error at or near "."
LINE 1: SELECT `users`.* FROM `users` WHERE `users`.`name` = 'Some ...
^
: SELECT `users`.* FROM `users` WHERE `users`.`name` = 'Some User 0' LIMIT 1
And it's obvious here that it's using MySQL's backtick quoting, which isn't valid in PostgreSQL.
So... does anyone know how to properly establish a connection with a different adapter? I've tried:
ActiveRecord::Base.connection.reconnect!
ActiveRecord::Base.clear_all_connections!
Neither of these fixed my tests. Any help is greatly appreciated.
See if this helps:
ActiveRecord::Base.send(:subclasses).each do |model|
  model.connection.clear_query_cache
end