Rails - seed data dynamically with the result of API calls - ruby-on-rails

I'm fairly new to Ruby on Rails and I'm having some trouble designing the db.
So right now I have a table with about 100 records, populated from seeds.rb. Now I want to make an API call per record to get more information, and update each row with the new info returned by the call. Is this possible in any way?
For example if I have this in seeds.rb,
Example.create(fruit: 'orange')
and I want to call this API which gives me the colour of this fruit,
color = api.param(fruit)
and I want to update the record,
fruit:'orange', color:'orange'
like so.
Can this be done as part of the seeding procedure? My vision is to run this migration every month or so to prevent outdated data.
Thanks in advance for the help!

You seem to be confused about your terminology.
A migration is a change to the database schema, not to the data itself.
Database seeding is the initial population of a database with data and should only be done once, at the beginning of development (or when an application in development is handed over to another developer).
What you need is not a migration, but a scheduled job that runs a rake task that calls your external API and updates your local database.
Create a rake task (this is an old but still relevant tutorial) that calls your external API, gets the data, and updates the database. Then schedule this task to run at whatever interval you require; the whenever gem can handle the scheduling.
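A minimal sketch of what that rake task could look like, assuming a hypothetical FruitColorApi client standing in for whatever API you are actually calling:

# lib/tasks/fruit_data.rake
namespace :fruit_data do
  desc "Refresh the color of every Example record from the external API"
  task refresh: :environment do
    Example.find_each do |example|
      # FruitColorApi.color_for is a placeholder for your real API call
      color = FruitColorApi.color_for(example.fruit)
      example.update(color: color) if color.present?
    end
  end
end

And the whenever schedule in config/schedule.rb to run it monthly:

every 1.month do
  rake "fruit_data:refresh"
end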

Related

Demo environment and data

I have a Rails application running at www.temponia.com in which people can enter timesheets, get reports, and so on.
Now I would like to create a demo environment where users can experiment without the need of having to register.
I already found the faker and forgery gems to generate demo data. But my question is: when a user starts the demo environment, should I generate the data and write it to the database? I don't want all users to share the same demo environment, since one user can completely destroy the experience for other testers...
When I write it to the database and delete it after a couple of days, am I not running the risk that some tables will reach really high identity values really quickly? I would generate, for example, several thousand timesheet entries to make it look realistic...
Are there any other ways to solve this?
In my opinion, you should let the users do what they want to do, to show the completeness of the application, and let everyone share the same environment.
To repopulate your application, you can run a task every X minutes (depending on how busy the site is) that automatically inserts data whenever the record count drops below a minimum threshold; a sketch follows below. See for example the whenever gem to add commands to the crontab.
https://github.com/javan/whenever
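A rough sketch of such a repopulation task, assuming a Timesheet model with these attributes and the faker gem (the threshold, column names and Faker calls are placeholders to adjust to your schema):

# lib/tasks/demo_data.rake
namespace :demo do
  desc "Top up demo data when it falls below a minimum threshold"
  task repopulate: :environment do
    minimum = 1_000                      # assumed threshold
    missing = minimum - Timesheet.count
    next if missing <= 0
    missing.times do
      Timesheet.create!(
        description: Faker::Company.bs,
        hours:       rand(1..8),
        worked_on:   Faker::Date.backward(days: 30)
      )
    end
  end
end

Scheduled through whenever in config/schedule.rb, for example:

every 15.minutes do
  rake "demo:repopulate"
end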

Updating iOS Local Database from server

In iOS, I've had experience working with local-only SQL, and server-only SQL accessed over PHP.
The app that I'm planning to write will have both a local database and a remote database, which is probably pretty common. I'm planning to basically have the iOS app update itself from certain tables in the server's database.
My question is: Is there a simple or common way to compare the list of columns in a given table, and copy any that are changed or missing from the server to the local database?
For example, if I had a table full of data and then added a new column on the server, is there a standard way to have the local iOS database reflect that new column?
The idea that I came up with was to start both databases as the same blank database, and then for every change add a new SQL script on the server to update the local DB; if the iOS device detects a new database revision, it runs the update scripts and anything missing gets added. I was just hoping there would be a better way, as this could get messy.
If you use SQLite on both sides, which would be a zero-risk choice for future development, and if you develop a migration system of your own (check Entity Framework Migrations or https://github.com/mocra/fmdb-migration-manager for ideas), you can simply compare the latest migration versions and transfer the missing scripts accordingly. This would be the wisest choice, in my humble opinion.
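To illustrate the version-comparison idea on the server side (purely a sketch: the SchemaScript model, its version and sql columns, and the route are assumptions, and your backend may not even be Rails):

# app/controllers/api/schema_scripts_controller.rb
class Api::SchemaScriptsController < ApplicationController
  # GET /api/schema_scripts?since_version=12
  # Returns every migration script the device has not applied yet.
  def index
    since   = params[:since_version].to_i
    scripts = SchemaScript.where("version > ?", since).order(:version)
    render json: scripts.as_json(only: [:version, :sql])
  end
end

The device would then run each returned script in order inside a transaction and store the highest version it applied.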
You should use a web service to keep your iOS app database in sync with the server database. That is quite an easy and efficient way to do it, and JSON and XML libraries give you a powerful way to parse your data. Let me know if you have further queries!
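If you go the web service route, a minimal sketch of a sync endpoint that only returns rows changed since the device's last sync (the Item model and the updated_since parameter are assumptions):

# app/controllers/api/items_controller.rb
class Api::ItemsController < ApplicationController
  # GET /api/items?updated_since=2015-01-01T00:00:00Z
  def index
    since = params[:updated_since].present? ? Time.zone.parse(params[:updated_since]) : Time.at(0)
    items = Item.where("updated_at > ?", since)
    render json: items   # the app parses the JSON and upserts rows locally
  end
end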

Locking database while executing rake script

I have created a rake script to send e-mails to some users.
The rake script first needs to delete some old database records, and then proceed with the e-mails.
The trouble is that during the time that the script is running, some users may view/delete the data themselves. If the data is deleted by the script, then the views should be refreshed, in order to accommodate the new data.
The first obvious solution that I can think of is to never display the old data in the views, and so avoid the possibility that a user deletes a record the script has already deleted.
But I still think that I have a race condition possibility here, and I would like to know how could I lock the database while executing the script.
I am using Mysql as my database system.
I would approach this by setting up a rake task that calls a method on a model to delete the database records. I would then wrap the code that deletes the old mails in a transaction. That keeps the affected rows locked while they are being deleted and allows you to handle any exceptions thrown when anyone else tries to delete the same data.
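A hedged sketch of that approach, with OldMail standing in for whatever model holds the records and 30 days standing in for your actual cutoff:

# app/models/old_mail.rb
class OldMail < ActiveRecord::Base
  def self.purge_stale!
    transaction do
      # SELECT ... FOR UPDATE row locks (InnoDB) make concurrent deletes
      # from the web UI wait until this transaction commits
      where("created_at < ?", 30.days.ago).lock(true).each(&:destroy)
    end
  end
end

# lib/tasks/notifications.rake
namespace :notifications do
  desc "Purge stale mails, then send the e-mails"
  task send_emails: :environment do
    OldMail.purge_stale!
    # ... send the e-mails here ...
  end
end

On the web side, rescue ActiveRecord::RecordNotFound (or re-check that the record still exists) when a user deletes, so a row already removed by the script is handled gracefully.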

writing a scheduler and backend in rails

I am trying to write a very simple Rails backend backed by a MySQL database. It's very simple: I have two tables, a users table and a tweets table, with a time for each tweet. Basically, besides just an API, I would like to write a scheduler for posting the tweets at the specific time stored in the database. This is what confuses me: how do I write a loop in the background that constantly checks the current time to see whether there's any tweet that needs to be posted?
Any recommendation/guidelines in achieving this would be helpful.
You need some kind of background job management.
Periodically, you check whether there is anything to do and, if so, do it.
Please check this relevant SO question.
Here are some other pointers:
http://4loc.wordpress.com/2010/03/10/background-jobs-in-ruby-on-rails/
https://github.com/tobi/delayed_job
https://github.com/javan/whenever/
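A concrete sketch combining a rake task with whenever (the Tweet model with body, scheduled_at and posted columns is an assumption about your schema, and TwitterClient.post stands in for whatever posting library you use):

# lib/tasks/tweets.rake
namespace :tweets do
  desc "Post every tweet whose scheduled time has passed"
  task post_due: :environment do
    due = Tweet.where(posted: false).where("scheduled_at <= ?", Time.current)
    due.find_each do |tweet|
      TwitterClient.post(tweet.body)   # hypothetical posting client
      tweet.update(posted: true)
    end
  end
end

# config/schedule.rb (whenever gem) -- no hand-written loop needed,
# cron wakes the task up every minute instead
every 1.minute do
  rake "tweets:post_due"
end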

Monitor database table for external changes from within Rails application

I'm integrating some non-rails-model tables in my Rails application. Everything works out very nicely, the way I set up the model is:
class Change < ActiveRecord::Base
  establish_connection(ActiveRecord::Base.configurations["otherdb_#{RAILS_ENV}"])
  set_table_name "change"
end
This way I can use the Change model for all existing records with find etc.
Now I'd like to run some sort of notification when a record is added to the table. Since records never get created via Change.new and Change.save, using ActiveRecord::Observer is not an option.
Is there any way I can get some of my Rails code to be executed, whenever a new record is added? I looked at delayed_job but can't quite get my head around, how to set that up. I imagine it evolves around a cron-job, that selects all rows that where created since the job last ran and then calls the respective Rails code for each row.
Update: Currently looking at Javan's Whenever; it looks like it can solve the 'run Rails code from cron' part.
Yeah, you'll either want some sort of background task processor (Delayed::Job is one of the popular ones, or you can fake your own with the Daemon library or similar) or to set up a cron job that runs on some sort of schedule. If you want to check frequently (every minute, say) I'd recommend the Delayed::Job route; if the interval is longer (every hour or so) a cron job will do it just fine.
Going the DJ route, you'd need to create a job that would check for new records, process them if there are any, then requeue the job, as each job is marked "completed" when it's finished.
-jon
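A rough sketch of such a self-requeueing job, using the classic Delayed::Job.enqueue(payload, priority, run_at) signature from the old delayed_job linked above (NewChangeJob and the per-record handler are placeholders, and the Rails 2-era finder syntax matches the question's code):

# lib/new_change_job.rb
class NewChangeJob < Struct.new(:last_seen_id)
  def perform
    changes = Change.find(:all, :conditions => ["id > ?", last_seen_id], :order => "id")
    changes.each { |change| ChangeObserver.process(change) }   # your own handler
    next_id = changes.empty? ? last_seen_id : changes.last.id
    # re-enqueue ourselves so the check keeps running
    Delayed::Job.enqueue(NewChangeJob.new(next_id), 0, 1.minute.from_now)
  end
end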
This is what I finally did: use Whenever, because it integrates nicely with Capistrano and showed me how to run Rails code from within cron. My missing piece was basically
script/runner -e production 'ChangeObserver.recentchanges'
which is now run every 5 minutes. The recentchanges method reads the last looked-at ID from a tmp-file, pulls all new Change records with a higher ID than that, and runs the normal observer code for each record (and saves the highest looked-at ID back to the tmp-file, of course).
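For reference, a minimal sketch of what that recentchanges method could look like (the file path, tmp-file name and per-record handler are assumptions):

# app/models/change_observer.rb
class ChangeObserver
  ID_FILE = "tmp/last_change_id"

  def self.recentchanges
    last_id = File.exist?(ID_FILE) ? File.read(ID_FILE).to_i : 0
    changes = Change.find(:all, :conditions => ["id > ?", last_id], :order => "id")
    changes.each { |change| process(change) }    # the normal observer code
    File.open(ID_FILE, "w") { |f| f.write(changes.last.id) } unless changes.empty?
  end

  def self.process(change)
    # notify, enqueue mail, etc.
  end
end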
As usual with monitoring state changes, there are two approaches: polling and notification. You seem to have chosen to go the polling way for now (having a cron job look at the state of the database on a regular basis and execute some code if it changed).
You can do the same thing using one of the Rails schedulers; there are a few out there (Google will find them readily, they have various feature sets, and I'll let you choose the one that suits your needs if you go that way).
You could also try going the notification way, depending on your database. Some databases support triggers together with external process execution or specific notification protocols.
In this case you are notified by the database itself that the table changed. There are many such options for various DBMSs in Getting events from a database.
