for i, name in ipairs(redis.call('KEYS', 'cache:user_transaction_logs:*:8866666')) do redis.call('DEL', name) end
How can I optimise this Redis query?
We are using Redis as a cache store in Rails. Whenever a user makes a successful transaction, the receiver's and initiator's transaction histories are expired from Redis.
The query cannot be optimized - it should be replaced in its entirety, because the use of KEYS is discouraged for anything other than debugging on non-production environments.
A preferable approach, instead of trying to fetch the relevant key names ad hoc, is to manage them in a data structure (e.g. a Set or a List) and read from it when you perform the deletions.
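For illustration, a minimal sketch of that approach with the redis-rb gem (the key, set, and method names here are made up, not taken from your app):

require 'redis'

redis = Redis.new

# When writing a cache entry, also record its key in a per-user set:
def write_cached_log(redis, user_id, log_key, payload)
  redis.set(log_key, payload)
  redis.sadd("cache_keys:user:#{user_id}", log_key)
end

# On a successful transaction, delete exactly the keys we recorded:
def expire_user_cache(redis, user_id)
  set_key = "cache_keys:user:#{user_id}"
  keys = redis.smembers(set_key)
  redis.del(*keys) unless keys.empty?
  redis.del(set_key)
end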
You need to change the approach for how you are storing cache entries for your users.
Your keys should look something like cache:user_transaction_logs:{user_id}.
Then you will be able to just delete the entry by its key (user_id).
If you need several cache entries per user_id, use Redis hashes (https://redis.io/commands#hash); then you will still be able to delete all of a user's entries with a single DEL command, or a specific entry with HDEL.
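A small sketch of that with the redis-rb gem (the field names are illustrative):

require 'redis'

redis = Redis.new
key = "cache:user_transaction_logs:8866666" # one hash per user

# Store several cache entries as fields of a single hash:
redis.hset(key, "page:1", "...serialized logs...")
redis.hset(key, "page:2", "...serialized logs...")

# Delete a single entry:
redis.hdel(key, "page:1")

# Or expire everything for the user with one DEL:
redis.del(key)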
It's also a good idea to use Redis database numbers (0 is the default; 1-15 are available) and to put separate functionalities on separate database numbers. Then, if you need to wipe the cache of a whole functionality, that can be done with a single FLUSHDB command.
Related
If we have a small table which contains relatively static data, is it possible to have Active Record load this in on startup of the app and never have to hit the database for this data?
Note that, ideally, I would like this data to be join-able from other Models which have relationships to it.
An example might be a list of countries with their telephone number prefix - this list is unlikely to change, and if it did it would be changed by an admin. Other tables might have relationships with this (eg. given a User who has a reference to the country, we might want to lookup the country telephone prefix).
I saw a similar question here, but it's 6 years old and refers to Rails 2, while I am using Rails 5 and maybe something has been introduced since then.
Preferred solutions would be:
1. Built-in Rails / ActiveRecord functionality to load a table once on startup and, if other records are subsequently loaded in which have relationships with the cached table, link to the cached objects automatically (i.e. manually caching MyModel.all somewhere is not sufficient, as relationships would still be loaded by querying the database).
2. A maintained library which does the above.
3. If neither is available, I suppose an alternative method would be to define the static dataset as an in-memory enum/hash or similar, persist the hash key on records which have a relationship to this data, and define methods on those Models to look up the object in the hash using the key persisted in the database. This seems quite manual though...
[EDIT]
One other thing to consider with potential solutions - the manual solution (3) would also require custom controllers and routes for such data to be accessible over an API. Ideally it would be nice to have a solution where such data could be offered up via a RESTful API (read only - just GET) if desired using standard rails mechanisms like Scaffolding without too much manual intervention.
I think you may be discounting the "easy" / "manual" approach too quickly.
Writing the data to a ruby hash / array isn't that bad an idea.
And if you want to use a CRUD scaffold, why not just use the standard Rails model / controller generator? Is it really so bad to store some static data in the database?
A third option would be to store your data in a file in some serialized format and then, when your app loads, read it in and construct ActiveRecord objects. Let me show an example:
data.yml
---
- a: "1"
  b: "1"
- a: "2"
  b: "2"
This is a YAML file containing an array of hashes; you can construct such a file with:
require 'yaml'

File.open("path.yml", "w") do |f|
  data = [
    { "a" => "1", "b" => "1" },
    { "a" => "2", "b" => "2" }
  ]
  f.write(YAML.dump(data))
end
Then to load the data, you might create a file in config/initializers/ (everything in there is loaded by Rails on boot):
config/initializers/static_data.rb
require 'yaml'

# define a constant that can be used by the rest of the app
StaticData = YAML.load(File.read("data.yml")).map do |object|
  MyObjectClass.new(object)
end
To avoid having to write database migrations for MyObjectClass (when it's not actually being stored in the db) you can use attr_accessor definitions for your attributes:
class MyObjectClass < ActiveRecord::Base
  # say these are your two columns
  attr_accessor :a, :b
end
Just make sure not to run things like save, delete, or update on this model (unless you monkeypatch those methods).
If you want to have REST / CRUD endpoints, you'd need to write them from scratch because the way to change data is different now.
You'd basically need to do any update in a three-step process:
load the data from YAML into a Ruby object list
change the Ruby object list
serialize everything to YAML and save it.
So you can see you're not really doing incremental updates here. You could use JSON instead of YAML and you'd have the same problem. With Ruby's built-in storage system PStore you would be able to update objects on an individual basis, but using SQL for a production web app is a much better idea and will honestly keep things simpler.
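For reference, a minimal PStore sketch (PStore ships with Ruby's standard library; the file and key names here are made up):

require 'pstore'

store = PStore.new("static_data.pstore")

# Each transaction is atomic; individual keys can be updated in place:
store.transaction do
  store[:countries] ||= []
  store[:countries] << { "name" => "France", "prefix" => "+33" }
end

# Read-only transaction:
store.transaction(true) do
  puts store[:countries].inspect
end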
Moving beyond these "serialized data" options, there are key-value storage servers that keep data in memory, such as Memcached and Redis.
But to go back to my earlier point, unless you have a good reason not to use SQL you're only making things more difficult.
It sounds like FrozenRecord would be a good match for what you are looking for.
Active Record-like interface for read only access to static data files of reasonable size.
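Roughly, based on the gem's README (the paths and model name are illustrative):

# config/initializers/frozen_record.rb
FrozenRecord::Base.base_path = Rails.root.join("config", "data")

# app/models/country.rb -- backed by config/data/countries.yml
class Country < FrozenRecord::Base
end

# The query interface resembles Active Record:
Country.find(1)
Country.where(name: "France").first

Note that since the data lives outside the database, regular Active Record models can't SQL-join to it; a manual lookup method on the owning model (e.g. one that calls Country.find with the stored id) is still needed.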
I want to save settings for my users, and some of them would be one out of a predefined list! Using https://github.com/ledermann/rails-settings ATM.
The setting for e.g. weight_unit would be one out of [:kg, :lb].
I don't really want to hardcode that stuff into controller or view code.
It's kind of a common functionality, so I was wondering: Did anyone come up with some way of abstracting that business into class constants or the database in a DRY fashion?
Usually, when I have to store unimportant information that I don't need to query individually, I store it in a serialized column.
In your case you could create a new column in your users table (for example call it "settings").
After that you add to the user model
serialize :settings, Hash
from this moment you can put whatever you like into settings, for example
user.settings = {:weight_unit => :kg, :other_setting1 => 'foo', :other_setting2 => 'bar'}
and after saving with user.save you will get the serialized data in the settings column.
Rails also de-serializes it, so after fetching a user's record, calling user.settings will give you all saved settings for the user.
To get more information on serialize() refer to docs: http://api.rubyonrails.org/classes/ActiveRecord/AttributeMethods/Serialization/ClassMethods.html#method-i-serialize
UPDATE1
To ensure that settings are in the predefined list you can use validations on your user model (a sketch follows after UPDATE2).
UPDATE2
Usually, if there are some pre-defined values, it's a good habit to store them in a constant inside the related model; this way you have access to them from the model (inside and outside). Acceptable values do not change per instance, so it makes sense to share them between all instances. An example is more valuable than any words. Define in your User model:
ALLOWED_SETTINGS = { :weight_unit => [:kg, :lb],
                     :eyes_color  => [:green, :blue, :brown, :black],
                     :hair_length => [:short, :long] }
you can use it BOTH
outside the model itself, doing
User::ALLOWED_SETTINGS
inside your model (in validations, instance methods or wherever you want) using:
ALLOWED_SETTINGS
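For instance, a hedged sketch of the validation mentioned in UPDATE1, built on this constant (the validation method name is mine):

class User < ActiveRecord::Base
  serialize :settings, Hash

  validate :settings_must_be_allowed

  private

  # Reject any settings key or value outside ALLOWED_SETTINGS
  def settings_must_be_allowed
    (settings || {}).each do |key, value|
      unless ALLOWED_SETTINGS.fetch(key, []).include?(value)
        errors.add(:settings, "#{key}=#{value} is not allowed")
      end
    end
  end
end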
Based on your question, it sounds like these are configuration options that a particular user will choose from, and that they are quite static rather than dynamic (i.e. the options don't change over time). For example, I doubt you'll be adding weight_units other than :kg and :lb, but it's possible I'm misreading your question.
If I am reading this correctly, I would recommend (and have used) a yml file in the config/ directory for values such as this. The yml file is accessible app-wide and all your "settings" could live in one file. These could then be loaded into your models as constants, and serialized as @SDp suggests. However, I tend to err on the side of caution, especially when thinking that perhaps these "common values" may need to be queried some day, so I would prefer to have each of these as a column on a table rather than a single serialized value. The overhead isn't that much more, and you would gain a lot of additional built-in benefits from Rails by having them be individual columns.
That said, I have personally used hstore with Postgres with great success, doing just what you are describing. However, the reason I chose to use an hstore over individual columns was because I was storing multiple different demographics, in which all of the demographics could change over time (e.g. some keys could be added, and more importantly, some keys could be removed.) It sounds like in your case it's highly unlikely you'll be removing keys as these are basic traits, but again, I could be wrong.
TL;DR - I feel that unless you have a compelling reason (such as regularly adding and/or removing keys/settings), these should be individual columns on a database table. If you strongly feel these should be stored in the database serialized, and you're using Postgres, check out hstore.
If you are using PostgreSQL, I think you can look at hstore with Rails 4 plus this gem: https://github.com/devmynd/hstore_accessor
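A rough sketch of that gem's usage, going by its README (the column and field names are illustrative; :settings is assumed to be an hstore column):

class User < ActiveRecord::Base
  # maps typed accessors onto the hstore column
  hstore_accessor :settings,
                  weight_unit: :string,
                  newsletter:  :boolean
end

user = User.new(weight_unit: "kg")
user.weight_unit # => "kg"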
I am currently developing a membership website that includes a view counter. Past experience says that having view counters in SQL is costly. In fact I have kept away from view counters, but today it's not an option.
The project uses
Rails 4.0.2
Redis Objects gem
For demonstration I am hoping to use Heroku with the Redis To Go plugin
Currently the counter is based on PG (ActiveRecord)
Redis Objects is used with AR to count (but how do I save back to the AR profiles table?)
Need / Thinking of achieving
Counts happen in Redis and are periodically stored to the PG table, possibly using a Rake task + scheduled job
Efficiency
Problem
I can't figure out how to query the Redis DB to find all Profile objects in it.
If I could get this list of objects I could write a rake task to iterate through each item and save/update the model value to the database.
After some searching, KEYS profile:* seems to be the only way to get all the profile values saved in Redis. Then from that list I have to manually fetch the objects and update the values.
Question
Is using KEYS to find the keys and then the objects sensible (in terms of efficiency) in a scheduled task run, say, once a day?
Is there a way to fetch all Profile objects directly from the Redis DB, like we can do with Profile.all in ActiveRecord?
Any suggestions for implementing a counter are highly appreciated (even if not Redis based)
profile.rb
class Profile < ActiveRecord::Base
  # To handle the profile counter
  include Redis::Objects
  counter :tviews

  # Associations
  belongs_to :user

  # Other attributes
end
profiles_controller.rb
class ProfilesController < ApplicationController
  def public
    @profile.tviews.increment
    # @profile.save # Save for AR-based counting
  end
end
In the SLIM
p.text-center#profile-counter
  | Views -
  = @profile.tviews.value + @profile.views
1) No, it's not. See the Redis docs for KEYS, which say: Time complexity: O(N) with N being the number of keys in the database, under the assumption that the key names in the database and the given pattern have limited length.
Not only would Redis have to iterate over all the keys, it would also have to store them in memory. And due to the single-threaded nature of Redis, it would be unresponsive for the duration of the KEYS command.
2) Not a Ruby user here, but I would assume not.
3) You have several options
Store the ids for your keys in a Redis set and clean it up with your periodic task (using Redis transactions if needed). This way you'll always know exactly which keys are in Redis.
Use the SCAN command, available since Redis 2.8. It returns only a limited number of keys per call and iterates over the keyspace using an internal cursor. A sketch of both options follows below.
(redis.io seems to be down right now, so links might not be working)
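A hedged Ruby sketch of both options with the redis-rb gem (the key names are illustrative):

require 'redis'

redis = Redis.new

# Option 1: record which profiles changed in a set...
profile_id = 42 # example id
redis.sadd("dirty_profiles", profile_id)

# ...and drain that set from the scheduled task:
redis.smembers("dirty_profiles").each do |id|
  views = redis.get("profile:#{id}:tviews").to_i
  # Profile.find(id) could be updated with `views` here
end
redis.del("dirty_profiles")

# Option 2: SCAN-based iteration (Redis >= 2.8); redis-rb wraps it:
redis.scan_each(match: "profile:*") do |redis_key|
  # process each key without blocking the server the way KEYS does
end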
I am building a rails app and the data should be reset every "season" but still kept. In other words, the only data retrieved from any table should be for the current season but if you want to access previous seasons, you can.
We basically need to have multiple instances of the entire database, one for each season.
The client's idea was to export the database at the end of the season and save it, then start fresh. The problem with this is that we can't look at all of the data at once.
The only idea I have is to add a season_id column to every model. But in this scenario, every query would need to have where(season_id: CURRENT_SEASON). Should I just make this a default scope for every model?
Is there a good way to do this?
If you want all the data in a single database, then you'll have to filter it, so you're on the right track. This is totally fine, as data is filtered all the time anyway; it's not a big deal. What you're describing also sounds very similar to marking data as archived (anything not in the current season is essentially archived), something that is very commonly done, usually by setting a boolean flag on every record to hide it.
You'll probably want a scope or default_scope. The main downside of a default_scope is that you must use .unscoped in all places where you want to access data outside of the current season, whereas not using a default scope means you must specify the scope on every call. Default scopes can also seem to get applied in funny places from time to time; in my experience I prefer to always be explicit about the scopes I'm using (i.e. I never use default_scope), but this is a personal preference.
In terms of how to design the database you can either add the boolean flag for every record that tells whether or not that data is in the current season, or as you noted you can include a season_id that will be checked against the current season ID and filter it that way. Either way, a scope of some sort would be a good way to do it.
If using a simple boolean, then either at the end of the current season or the start of the new season, you would have to go and mark any current season records as no longer current. This may require a rake task or something similar to make this convenient, but adds a small amount of maintenance.
If using a season_id plus a constant in the code to indicate which season is current (perhaps via a config file) it would be easier to mark things as the current season since no DB updates will be required from season to season.
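A minimal sketch of the season_id approach (the model and Season.current_id are hypothetical):

class Result < ActiveRecord::Base
  belongs_to :season

  # explicit scope: call Result.current wherever filtering is wanted
  scope :current, -> { where(season_id: Season.current_id) }

  # or implicit filtering instead:
  # default_scope { where(season_id: Season.current_id) }
  # (then use Result.unscoped to reach past seasons)
end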
[Disclaimer: I'm not familiar with Ruby so I'll just comment from the database perspective.]
The problem with this is that we can't look at all of the data at once.
If you need to keep the old versions accessible, then you should keep them in the same database.
Designing "versioned" (or "temporal" or "historized") data model is something of a black art - let me know how your model looks like now and I might have some suggestions how to "version" it. Things can get especially complicated when handling connections between versioned objects.
In the meantime, take a look at this post, for an example of one such model (unrelated to your domain, but hopefully providing some ideas).
Alternatively, you could try using a DBMS-specific mechanism such as Oracle's flashback query, but this is obviously not available to everybody and may not be suitable for keeping the permanent history...
I have a Postgres database (9) that I am writing a trigger for. I want the trigger to set the modification time, and user id for a record. In Firebird you have a CONNECTIONID that you can use in a trigger, so you could add a value to a table when you connect to the database (this is a desktop application, so connections are persistent for the lifetime of the app), something like this:
UserId | ConnectionId
---------------------
544 | 3775
and then look up in the trigger that connectionid 3775 belongs to userid 544 and use 544 as the user that modified the record.
Is there anything similar I can use in Postgres?
You could use the process id. It can be retrieved with:
pg_backend_pid()
With this pid you can also use the table pg_stat_activity to get more information about the current backend, although you should already know everything, since you are using this backend.
Or better: just create a sequence, and retrieve one value from it for each connection:
CREATE SEQUENCE connectionids;
And then:
SELECT nextval('connectionids');
in each connection, to retrieve a unique id per connection.
One way is to use the custom_variable_classes configuration option. It appears to be designed to allow the configuration of add-on modules, but can also be used to store arbitrary values in the current database session.
Something along the lines of the following needs to be added to postgresql.conf:
custom_variable_classes = 'local'
When you first connect to the database you can store whatever information you require in the custom class, like so:
SET local.userid = 'foobar';
And later in on you can retrieve this value with the current_setting() function:
SELECT current_setting('local.userid');
Adding an entry to a log table might look something like this:
INSERT INTO audit_log VALUES (now(), current_setting('local.userid'), ...)
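For completeness, a small Ruby sketch of the connection-time setup with the pg gem (the connection details are illustrative). SET itself cannot take bind parameters, so the equivalent set_config() function is used instead:

require 'pg'

conn = PG.connect(dbname: "mydb")

# Once per connection, right after connecting; equivalent to
# SET local.userid = '544' but parameterizable:
conn.exec_params("SELECT set_config('local.userid', $1, false)", ["544"])

# Any later statement (including a trigger) can read it back:
res = conn.exec("SELECT current_setting('local.userid')")
puts res.getvalue(0, 0) # => "544"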
While it may work for your desktop use case, note that process ID numbers do rollover (32768 is a common upper limit), so using them as a unique key to identify a user can run into problems. If you ever end up with leftover data from a previous session in the table that's tracking user->process mapping, that can collide with newer connections assigned the same process id once it's rolled over. It may be sufficient for your app to just make sure you aggressively clean out old mapping entries, perhaps at startup time given how you've described its operation.
To avoid this problem in general, you need to make a connection key that includes an additional bit of information, such as when the session started:
SELECT procpid,backend_start FROM pg_stat_activity WHERE procpid=pg_backend_pid();
That has to iterate over all of the connections active at the time to compute, so it does add a bit of overhead. It's possible to execute that a bit more efficiently starting in PostgreSQL 8.4:
SELECT procpid,backend_start FROM pg_stat_get_activity(pg_backend_pid());
But that only really matters if you have a large number of connections active at once.
Use current_user if you need the database user (though I'm not sure that's what you want, from reading your question).