redis save option per database - possible?

I see the save option in the conf applies to all the databases. Is it possible to assign different save values per database (say 1, 2, etc.)?
Thanks.

It is not possible. You would have to run another Redis instance, which can be configured with different save values.
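As a sketch of the two-instance approach: each instance gets its own config file with its own save rules. The ports, file names, and thresholds below are illustrative; save takes a seconds value and a minimum number of changed keys.

```conf
# redis-a.conf - instance on port 6379, aggressive snapshotting
port 6379
dbfilename dump-a.rdb
save 60 1000    # snapshot if at least 1000 keys changed within 60 seconds

# redis-b.conf - instance on port 6380, relaxed snapshotting
port 6380
dbfilename dump-b.rdb
save 900 1      # snapshot if at least 1 key changed within 900 seconds
```

Each instance is then started with its own file, e.g. redis-server redis-a.conf, and your application selects the instance (rather than the database number) that matches the durability it needs.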

Related

Mongoid: How do I limit a collection to one entry?

Problem:
Is there a way to limit a table to one entry only?
I would like to do this from the model so that only one entry can be created and modified. Currently I only allow the Object.first entry to be modified. I am also open to hearing a better practice. Thanks in advance.
Background:
I am new to Mongo, and the only information I found is for creating a new collection.
With MongoDB you can limit a collection's size with capped collections.
Set limits on mongo db collection
"Mongoid does not provide a mechanism for creating capped collections on the fly - you will need to create these yourself one time up front either with Moped or via the Mongo console."
https://mongoid.github.io/old/en/mongoid/docs/persistence.html
session.command(create: "name", capped: true, size: 10000000, max: 1000)
Depending on what you're trying to achieve, capped collections might not be suited for your use case. In the Capped Collection Documentation, it says:
Capped collections work in a way similar to circular buffers: once a collection fills its allocated space, it makes room for new documents by overwriting the oldest documents in the collection.
If you use a capped collection and then insert a new document, it would just overwrite the existing document, rather than throwing an error. Of course, you could just insert a new document with the updated information instead of overwriting the existing one, but I'm not sure if that's what you intend to do. (If that is helpful, you can create a capped collection through the Mongo Shell when you're setting up your MongoDB instance.)
Overall, it sounds like enforcing this rule in your application logic is the way to go. I would also spend some time thinking about whether you really need this information to be in the database -- would a Ruby singleton class or some environment variables better suit your needs?
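To illustrate the singleton suggestion: plain Ruby can hold exactly one settings object in memory instead of a one-row collection. The class and attribute names below are illustrative, not from your app.

```ruby
require "singleton"

# A process-wide settings object. Including Singleton makes `new`
# private, so callers can never create a second instance.
class AppSettings
  include Singleton

  attr_accessor :site_name, :maintenance_mode

  def initialize
    @site_name = "My Site"
    @maintenance_mode = false
  end
end

settings = AppSettings.instance
settings.maintenance_mode = true

# Every caller sees the same, single object.
AppSettings.instance.equal?(settings) # => true
```

The trade-off is that this state lives per process and is not persisted, so it suits configuration-like data more than anything users edit and expect to survive restarts.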

Most effective, secure way to delete Postgres column data

I have a column in my Postgres table that I want to remove for expired rows. What's the best way to do this securely? It's my understanding that simply writing 0's for those columns is ineffective because Postgres creates a new row upon Updates and marks the old row as dead.
Is the best way to set the column to null and manually vacuum to clean up the old records?
I will first say that it is bad practice to alter data like this - you are changing history. Also, the below is only ONE way to do this (a quick and dirty way, and not recommended):
1 Backup your database first.
2 Open PgAdmin, select the database, open the Query Editor and run a query.
3 It would be something like this
UPDATE <table_name> SET <column_name>=<new value (eg null)>
WHERE <record is dead>
The WHERE part is for you to figure out, based on how you identify which rows are dead (e.g. is_removed=true or is_deleted=true are common for identifying soft-deleted records).
Obviously you would have to run this script regularly. The better way would be to update your application to do this job instead.

What data type can I use for very large text fields that is database agnostic?

In my Rails app, I want to store the geographical bounds of places in a database column. E.g., the boundary of New York is represented as a polygon: an array of arrays.
I have declared my model to serialize the polygons, but I am unsure whether I should even store them like this. The size of these serialized polygons easily exceeds 100,000 characters, while MySQL can only store about 65,535 characters in a standard TEXT column.
Now I know MySQL also has a LONGTEXT field. But I really want my app to be database-agnostic. How does Rails handle this by itself? Will it switch automatically to LONGTEXT fields? What about when I start using PostgreSQL?
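For a sense of scale, here is a quick sketch (with fabricated coordinates) of how fast a serialized polygon outgrows a 65,535-character TEXT column:

```ruby
require "json"

# A fake boundary: 10,000 lat/lng pairs, roughly the level of detail a
# real city outline might have. The coordinate values are made up.
polygon = Array.new(10_000) { |i| [40.0 + i * 0.0001, -74.0 - i * 0.0001] }

serialized = JSON.generate(polygon)
serialized.length > 65_535 # => true, too big for a MySQL TEXT column
```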
At this point I suggest you ask yourself - does this data really need to be stored in a database, and in this format?
I propose 2 possible solutions:
Store your polygons in the filesystem, and reference them from the database. Such large data items are of little use in a database - it's practically pointless to query against them as text. The filesystem is good at storing files - use it.
If you do need these polygons in the database, store them as normalised data. Have a table called polygon and another called point; deserialize the polygons and store them in a way that reflects how databases are intended to be used.
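A rough sketch of that normalisation in plain Ruby, with Structs standing in for the two tables (names and columns are illustrative; in a real schema each point row would carry a polygon_id foreign key and a position so the ring order is preserved):

```ruby
# Stand-ins for a `polygon` table and a `point` table.
Point   = Struct.new(:polygon_id, :position, :lat, :lng)
Polygon = Struct.new(:id, :name)

# Turn one serialized array-of-arrays into one polygon row plus one
# point row per vertex, keeping the vertex order in `position`.
def normalize(id, name, coords)
  polygon = Polygon.new(id, name)
  points  = coords.each_with_index.map do |(lat, lng), position|
    Point.new(polygon.id, position, lat, lng)
  end
  [polygon, points]
end

boundary = [[40.7, -74.0], [40.8, -74.0], [40.8, -73.9], [40.7, -74.0]]
polygon, points = normalize(1, "New York", boundary)
points.size # => 4, one row per vertex
```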
Hope this is of help.
PostgreSQL has an extension called PostGIS that my company uses to handle geometric locations and calculations; it may be very helpful in this situation. I believe PostgreSQL also has two data types that allow arrays and hashes. Arrays are declared, for example, as text[], where text could be replaced with another data type; hashes can be defined using the hstore module.
This question answers part of my question: Rails sets a default byte limit of 65535, and you can change it manually.
All in all, whether you will run into trouble after that depends on the database you're using. For MySQL, Rails will automatically switch to the appropriate *TEXT field. MySQL can store up to 1GB of text.
But like benzado and thomasfedb say, it is probably better to store the information in a file so that the database doesn't allocate a lot of memory that might not even be used.
Even though you can store this kind of stuff in the database, you should consider storing it externally, and just put a URL or some other identifier in the database.
If it's in the database, you may end up loading 64K of data into memory when you aren't going to use it, just because you access something in that table. And it's easier to scale a collection of read-only files (using something like Amazon S3) than a database table.

Dump and restore selected models/object graph in Rails

Given a MySQL database and a set of corresponding active record models similar to:
Test -< Categories -< Questions
I need a way to quickly dump the contents of Test #1 to a file, and then restore on a separate machine. When Test #1 is reinstantiated in the database, all of the relational data should be intact (all foreign keys are maintained, the Categories, Questions for the test are all restored). What's the best way to do this?
Try the Jailer subsetting tool. It's for dumping subsets of relational data, keeping referential integrity.
I'd try using yaml: http://www.yaml.org/
It's an easy way to save and load hierarchical data (in a human-readable format), and there are a number of implementations for Ruby. They extend your classes, adding methods to save and load objects to and from YAML files.
I typically use it when I need to save and reload a "deep copy" of a large multi-level hash of objects.
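A minimal sketch of that deep-copy trick with Ruby's standard yaml library (the data is illustrative): round-tripping through YAML gives you a copy that shares nothing with the original.

```ruby
require "yaml"

# A nested structure like a dumped Test -> Categories -> Questions graph.
original = { "test" => { "categories" => [{ "questions" => ["Q1", "Q2"] }] } }

dumped = YAML.dump(original)   # serialize to a human-readable string
copy   = YAML.load(dumped)     # rebuild a fully independent object graph

# Mutating the copy leaves the original untouched.
copy["test"]["categories"][0]["questions"] << "Q3"
original["test"]["categories"][0]["questions"].length # => 2, still
```

It is not the fastest way to deep-copy, but the dumped string doubles as the save file you can move to the other machine and load there.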
There are options out there: replicate is outdated and known to have issues with Rails 4 and Ruby 2; activerecord-import looks good, but doesn't have a dump counterpart.

Deleting all ItemNames in a single query in SimpleDB

Hi,
I want to delete all ItemNames in a single query in SimpleDB.
Is this possible in SimpleDB? If it is, please give the query for deleting all items.
Thanks,
Senthil
SimpleDB doesn't have any way to delete multiple records with a single query, and there is no equivalent to 'TRUNCATE TABLE'.
Your options are either to delete records one at a time or to delete the entire domain.
Use the DeleteDomain operation to delete an entire domain. You can re-create the domain using CreateDomain afterward.
SimpleDB supports batch delete now: http://docs.amazonwebservices.com/AmazonSimpleDB/latest/DeveloperGuide/SDB_API_BatchDeleteAttributes.html
But you can only do 25 at a time.
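Working around that 25-item cap is just a matter of chunking the item names before issuing the calls. A sketch in plain Ruby - the send_batch call is a placeholder for whatever SimpleDB client call your SDK provides, not a real API:

```ruby
# Item names to delete; 60 here, so three BatchDeleteAttributes calls.
item_names = (1..60).map { |i| "item#{i}" }

# BatchDeleteAttributes accepts at most 25 items per request.
batches = item_names.each_slice(25).to_a
batches.map(&:size) # => [25, 25, 10]

batches.each do |batch|
  # send_batch(domain, batch)  # placeholder: one BatchDeleteAttributes call
end
```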
If you want to delete the entire domain, do as Scrappydog suggests and delete the domain. That is much faster than deleting items one by one.
