How can I migrate a table to a newly created DB from within an app? Currently I create a new customer and generate a database for that customer, then manually run makemigrations and migrate to create the tables in the new database. I would like to do this automatically from the app.
One solution is to create the customer with its parameters and then create the database with a raw query:

    cursor.execute(f'''CREATE DATABASE {request.data['db_name']} CHARACTER SET utf8''')

and then, with a function I wrote, create the tables from a dump file I prepared. But goodbye migrations! What other solution can I adopt?
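For the record, this is the kind of thing I would like to end up with, if it is even possible: keep the normal migration machinery, but drive it from code. A rough sketch, assuming Django with MySQL (the alias handling is a guess, and db_name would need validating before being put into raw SQL):

    # Rough sketch: create the database, register it under a runtime
    # alias, then apply the already-generated migrations to it.
    from django.conf import settings
    from django.core.management import call_command
    from django.db import connection

    def provision_customer_db(db_name):
        # NOTE: validate db_name first -- it is interpolated into raw SQL.
        with connection.cursor() as cursor:
            cursor.execute(f"CREATE DATABASE {db_name} CHARACTER SET utf8")

        # Reuse the default connection settings, pointing at the new database.
        # (Depending on the Django version, you may need to register the alias
        # on django.db.connections.databases instead.)
        settings.DATABASES[db_name] = {
            **settings.DATABASES['default'],
            'NAME': db_name,
        }

        # Run the existing migrations instead of loading a SQL dump.
        call_command('migrate', database=db_name, interactive=False)

Note the alias only exists for the current process; a multi-tenant setup would usually re-register tenant aliases at startup and route queries with a database router.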
I want to move one database from one server to another.
I followed this guide: https://docs.influxdata.com/influxdb/v0.12/administration/config/
But when I restored the metadata, I wiped out all my usernames and passwords.
Do I need to restore the metadata at all, and is there a way to restore it without wiping out the existing databases?
Metadata should not be imported when importing a single database into an existing server.
Our app has a Core Data store based on a single Core Data model. It holds read-only data as well as read-write data.
The read-only data is pre-loaded and bundled with the app, so that on a fresh install this database is copied to the application sandbox; from then on the database is updated via a web service (only the changed data is transferred, so less data goes over the wire).
Now we have a situation where we need to add more attributes to the read-only entities.
Lightweight migration will handle the schema upgrade easily, but the problem is the new data: since we are adding new attributes to all the read-only entities, every data record changes, and a web-service sync might take a long time to download and apply the updates. To avoid this we are bundling the updated data with the app (which solves the issue for a fresh install). But for users who are upgrading the app, is there a standard mechanism to copy the read-only entities from the bundled DB into the existing database in the sandbox, so that they get the updated read-only data while their read-write data remains intact?
UPDATE
Here is the scenario,
I have X.sqlite bundled with the project (which has the new schema). If X.sqlite is not present in the Documents directory, I copy it there, and from then on everything works OK. In the app-update scenario, however, X.sqlite will already be present in the Documents directory, so it won't be copied, and the migration assistant will migrate the schema. We then have an X.sqlite in the Documents directory with the new schema but the old data (without the new attributes). What I want to know is whether there is a way to merge the data from the bundled X.sqlite into the one in the Documents directory, and whether there is an established process for this kind of merge.
To be more precise
Below are the entities
*Store - ReadOnly
*Products - ReadOnly
*ProductGroups - ReadOnly
*ShopList - User based
All are in the same model and in the same store.
Now Store / Products / ProductGroups have extra attributes.
The lightweight migrator will migrate the schema of X.sqlite so that the DB has the new attribute columns.
What concerns me is the next step.
Let's take Store as an example. Store has two new attributes, latitude and longitude. The question now is how to copy the data. The steps I have in mind (a rough sketch follows the list):
Copy the bundled DB to the Documents directory with a different name?
Create a new persistence coordinator?
Read the bundled data and get the objects?
Then iterate through the existing DB?
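Something like the following is what I have in mind for the middle steps (a rough Swift sketch; whether this is the right approach is exactly my question). The entity and file names are the ones from this question:

    // Open the bundled store read-only with its own coordinator and
    // fetch its objects, without touching the main Core Data stack.
    import CoreData

    func fetchBundledObjects(model: NSManagedObjectModel) throws -> [NSManagedObject] {
        guard let bundledURL = Bundle.main.url(forResource: "X", withExtension: "sqlite") else {
            return []
        }
        // A separate coordinator, so the main stack stays untouched.
        let coordinator = NSPersistentStoreCoordinator(managedObjectModel: model)
        try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                           configurationName: nil,
                                           at: bundledURL,
                                           options: [NSReadOnlyPersistentStoreOption: true])

        let context = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
        context.persistentStoreCoordinator = coordinator

        // Read the bundled objects (Store as the example entity).
        let request = NSFetchRequest<NSManagedObject>(entityName: "Store")
        var objects: [NSManagedObject] = []
        var fetchError: Error?
        context.performAndWait {
            do { objects = try context.fetch(request) }
            catch { fetchError = error }
        }
        if let error = fetchError { throw error }
        return objects
    }

Opening the bundled copy read-only like this might even make the renamed copy in the first step unnecessary, though a store saved in WAL mode may still need to be copied somewhere writable first.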
If I have understood your question: you want to update the read-only data during an app update, while leaving the read-write data the user has changed intact.
There are several ways to achieve this:
1. Have two separate databases: one holds the read-write data, the other the read-only data. They can relate to each other using fetched properties. During an update, replace or update the read-only database while leaving the read-write one intact.
2. Update the database using a background thread. The update code will have its own NSManagedObjectContext but share the same persistent store. Sync the main NSManagedObjectContext from the background thread using some protocol.
The second option, updating from a background thread, might work well if you choose to update from your web service.
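A minimal sketch of option 2 using NSPersistentContainer (iOS 10+); the model name, entity, and keys are placeholders:

    // Background-context import sharing the same persistent store.
    import CoreData

    let container = NSPersistentContainer(name: "Model")  // placeholder model name
    container.loadPersistentStores { _, error in
        if let error = error { fatalError("Store failed to load: \(error)") }
    }
    // Saves made on background contexts get merged into the main context.
    container.viewContext.automaticallyMergesChangesFromParent = true

    func applyUpdate(records: [[String: Any]]) {
        container.performBackgroundTask { context in
            for record in records {
                let object = NSEntityDescription.insertNewObject(forEntityName: "Store",
                                                                 into: context)
                object.setValue(record["latitude"], forKey: "latitude")
                object.setValue(record["longitude"], forKey: "longitude")
            }
            do { try context.save() } catch { print("Import failed: \(error)") }
        }
    }

On older SDKs, without NSPersistentContainer, the equivalent is a manually created background context on the same coordinator, merging its saves into the main context via the NSManagedObjectContextDidSave notification.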
If I haven't understood your issue, please clarify.
OK, so finally, after a lot of research, I achieved my goal. Below are the trials and solutions I went through.
Sol 1
Keep the read-only and read-write data in separate databases, so that on a master-data update I can safely delete the read-only DB while safeguarding the user's data. Considering the timeline and constraints I had, this wasn't possible for me, but I'm posting it here since it might help others.
Sol 2
Rather than trying to merge the new data from the bundled DB into the existing DB, I thought of merging the user data from the existing DB into the new DB. Below are the steps I took.
--> Created a new data context.
--> Created a new persistent store coordinator.
--> Renamed the bundled DB with a _v2 suffix and copied it to the Documents directory, so now there are two DBs in the doc dir.
(I took some tips from "Importing large data sets".)
--> Now, using the managed-object clone category, I copied all the user data from the existing DB to the new _v2 DB. Found the category here: NSManagedObject+Clone
--> This worked fine; I now had my _v2 database with the new read-only data plus the user data from the old database.
--> Next I needed to hand control back to the default data context.
--> I tried to change the persistent store coordinator of the old context to the new one, but the system didn't allow me to do that.
--> I then tried to change the persistent store of the old context to the new store, but I got an error saying that the database already exists (migratePersistentStore:toURL:options:withType:error:).
--> I ran out of ideas here.
Sol 3
I then discussed my problems with some colleagues, and they suggested providing the new data in a different format, and that clicked. As I already mentioned, my app has logic to download new data as JSON and merge it into Core Data, so why not ship a JSON file with the new data along with the app?
I collected the new response from the web service, created a JSON file (not big, just 1.5 MB), and added it to the app bundle. For users who update the app, instead of the Core Data merging I read the JSON data locally and do the initial merge into the Core Data DB, so the database ends up with the new read-only data while the user data stays intact. After the initial merge, everything is handled by the online sync.
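The initial merge boils down to something like this sketch (the file name, entity, and keys are placeholders; run it once per update, e.g. guarded by a flag in NSUserDefaults):

    // One-time merge: read the bundled JSON and upsert it into Core Data
    // before the normal online sync takes over.
    import CoreData

    func performInitialJSONMerge(context: NSManagedObjectContext) throws {
        guard let url = Bundle.main.url(forResource: "master_update", withExtension: "json") else {
            return
        }
        let data = try Data(contentsOf: url)
        let records = try JSONSerialization.jsonObject(with: data) as? [[String: Any]] ?? []

        for record in records {
            guard let storeID = record["storeID"] as? String else { continue }
            // Upsert by identifier so existing read-only rows are updated in place.
            let request = NSFetchRequest<NSManagedObject>(entityName: "Store")
            request.predicate = NSPredicate(format: "storeID == %@", storeID)
            let store = try context.fetch(request).first
                ?? NSEntityDescription.insertNewObject(forEntityName: "Store", into: context)
            store.setValue(record["latitude"], forKey: "latitude")
            store.setValue(record["longitude"], forKey: "longitude")
        }
        try context.save()
    }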
I'm fairly new to Ruby on Rails and I'm having some trouble designing the db.
So right now I have a table with about 100 records, populated from seeds.rb. Now I want to use this data to make an API call per record to get more information, and then update each row with the new info from the API call. Is this possible in any way?
For example if I have this in seeds.rb,
Example.create(fruit: 'orange')
and I want to call this API which gives me the colour of this fruit,
color = api.param(fruit)
and I want to update the record,
fruit:'orange', color:'orange'
like so.
Can this be done as part of the seeding procedure? My vision is to run this migration every month or so to keep the data from going stale.
Thanks in advance for the help!
You seem to be confused with your terminology.
A migration is a change to the database schema, not the data itself.
Database seeding is the initial population of a database with data and should normally be done only once, at the beginning of development (or when an application in development is handed over to another developer).
What you need is not a migration, but a scheduled job that runs a rake task that calls your external API and updates your local database.
Create a rake task (this is an old but still relevant tutorial) that calls your external API, gets the data, and updates the database. Then schedule this task to run at the interval you require; you can use the whenever gem for that.
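For example, a task along these lines (FruitApi is a placeholder for whatever client wraps your external API):

    # lib/tasks/fruits.rake -- sketch of a data-refresh task
    namespace :fruits do
      desc "Refresh fruit colours from the external API"
      task refresh: :environment do
        Example.find_each do |example|
          color = FruitApi.color_for(example.fruit) # hypothetical API client
          example.update(color: color) if color.present?
        end
      end
    end

And in config/schedule.rb for the whenever gem:

    every 1.month do
      rake "fruits:refresh"
    end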
I want to add tables/columns to a database at runtime.
Currently I'm using Core Data.
I know there's a way to do this in Xcode (add a new data model version), but I definitely can't go that route, because I receive the database schema from a web service.
Is there any good way to run DDL commands at runtime when using Core Data, or is that only possible by using SQLite directly (or a wrapper/OR mapper)?
If a wrapper/OR mapper is the better option, please give me some suggestions about which one to use in this case.
Workflow should be:
start app
check if database is up to date
if a new version of the schema is available from the web service, run the DDL commands
continue with app workflow
PS: Please no answers describing alternatives that modify the schema in Xcode!
Can you modify the Core Data model at run time? Yes...but, it probably won't work the way you want it to work.
Core Data's API makes it possible to construct or modify every detail of a data model at run time. Xcode's model editor is a convenience, but you could skip it and do everything in code if you wanted. For example, NSEntityDescription's properties attribute (which covers both attributes and relationships) is writeable. You could create a new NSAttributeDescription and update the entity's properties to contain it. Bang, you just added a new attribute to the entity. Similarly, NSManagedObjectModel's entities property is writeable, so you could create a new NSEntityDescription and add it to the model. That gives you a new entity, created at run time.
But, and it's a big one: you can only do this before you load the data store. Once you load your persistent store, altering the model will throw an exception. When Core Data loads a persistent store, it compares the model file to the model used in the store file. They must match, and you can't do anything to change this fact after loading the store. Once you load the store, the model is fixed.
What's more, even if you modify your model before loading the persistent store, you can only load persistent stores that match the current version of the model-- unless, that is, you also write code to migrate the persistent store to the new model. How hard that is depends on the nature of the changes. At a minimum then, you would need to make any changes before loading previously saved data, and then also arrange to do model migration to update the persistent store to use the new model.
With Core Data the model (schema) and data are stored separately and matched up when the store is loaded. That's not how SQLite works internally but it's the approach that Core Data enforces.
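To make this concrete, here is a minimal sketch of building a model in code and only then loading the store (the entity and attribute names are placeholders):

    import CoreData

    // Build the model entirely in code.
    let model = NSManagedObjectModel()

    let entity = NSEntityDescription()
    entity.name = "Item"

    let nameAttribute = NSAttributeDescription()
    nameAttribute.name = "name"
    nameAttribute.attributeType = .stringAttributeType
    nameAttribute.isOptional = true

    entity.properties = [nameAttribute]  // properties is writeable, as noted above
    model.entities = [entity]            // so is the model's entities array

    // Only now load the store; after this point the model is fixed.
    let container = NSPersistentContainer(name: "Runtime", managedObjectModel: model)
    container.loadPersistentStores { _, error in
        if let error = error { print("Store failed to load: \(error)") }
    }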
We have an old PHP app in MySQL that we're currently rewriting in Rails and PostgreSQL.
I'm looking for a way to migrate users one-by-one when they first sign in to the new system, so that we migrate only active users.
Is there a hook in Devise that I can use to catch a failed login against the new database, check whether the user exists in the old MySQL database, and migrate the user if found?
BTW, the Rails app can access both databases, and the migration code, which accepts the old username as input, is already in place.
Or should I simply make a separate _migration_assistant_ page that accepts old logins and initiates the migration?
PS. Any user can have thousands of database records to migrate, so it can take a bit of time, but we already make use of PostgreSQL's COPY FROM to speed it up (a few seconds per account).
EDIT:
There are currently about 5000 users and we expect roughly 1000 of them to return to the new site.
You could migrate the required data from just your users table, including the original primary key in an 'old_id' column.
Create a custom encryptor so that users can log in with their existing passwords.
When users first log in, redirect them to a page that triggers the per-user migration using the saved 'old_id'.
After your migration period you could look at the Devise trackable columns to determine which users can be cleaned up.
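For the custom encryptor, the devise-encryptable gem provides the hook point. A sketch, assuming the legacy PHP app stored salted MD5 hashes (adjust the digest to whatever it actually used):

    # Requires the devise-encryptable gem and `:encryptable` in the model's
    # devise call; place this in an initializer or app/lib.
    require "digest/md5"

    module Devise
      module Encryptable
        module Encryptors
          class LegacyPhp < Base
            def self.digest(password, stretches, salt, pepper)
              Digest::MD5.hexdigest("#{salt}#{password}")
            end
          end
        end
      end
    end

    # config/initializers/devise.rb
    # config.encryptor = :legacy_php

A Warden callback can then kick off the per-user data migration right after a successful sign-in (LegacyMigrationJob is a placeholder; guard it so it only runs once per user, e.g. by clearing old_id afterwards):

    Warden::Manager.after_authentication do |user, _auth, _opts|
      LegacyMigrationJob.perform_later(user.old_id) if user.old_id.present?
    end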