I want to write a few unit tests that do not make any changes to a database.
I have a Rails 2.3.11 application. This app has a SQLite database as its primary database. In many ways, this is a run-of-the-mill Rails app.
What makes this app unique is that it also establishes a connection to a SQL Server database. I have some models which are abstract classes and they use the SQL Server database. I have before_save and before_destroy callbacks to prevent any changes being made to the SQL Server database. Also, the user credentials to connect to SQL Server are supposed to be read-only.
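For reference, the external models look roughly like this (class and configuration names here are made up for illustration):

    # Abstract base class bound to the read-only SQL Server connection
    class ExternalBase < ActiveRecord::Base
      self.abstract_class = true
      establish_connection :sqlserver_readonly   # hypothetical entry in config/database.yml

      # Returning false from a before_* callback cancels the operation in Rails 2.x
      before_save    { |record| false }
      before_destroy { |record| false }
    end

    class LegacyCustomer < ExternalBase
      set_table_name "customers"   # Rails 2.3 syntax for mapping to an existing table
    end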
I would like to write unit tests that make assertions on the data that is already present in the SQL Server database. But I don't want to set up or tear down the SQL Server database.
I am afraid to just run the tests and see what happens. I would like a setting in the unit tests that prevents Rails from trying to set up or tear down the SQL Server database. Is this possible? How do I do it?
Thank you!
The test setup/teardown only affects the application's primary database (SQLite, by the sound of it), not additional, external database connections.
Also, you should keep your test environment completely separate from your production environment. So, if you're using a test SQL Server DB as well (and you should be, with test data in it, not the production one), then you should be fine even if the worst happens.
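One way to keep them separate: assuming your abstract classes call establish_connection with a named configuration, you can point that name at a different server per environment in config/database.yml (host and database names below are placeholders):

    sqlserver_readonly_development:
      adapter: sqlserver
      host: sqlserver-prod.example.com
      database: legacy
      username: readonly_user
      password: secret

    sqlserver_readonly_test:
      adapter: sqlserver
      host: sqlserver-test.example.com
      database: legacy_test
      username: readonly_user
      password: secret

Then in the abstract class use establish_connection "sqlserver_readonly_#{RAILS_ENV}" so the test run never touches the production SQL Server.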
I'd like to know if it's possible to have a single Rails app serving many different clients, where each client has their own PostgreSQL DB on Heroku, so that the same project gets updated for all of these clients when I push to Heroku.
Do you know if this can be done?
And how can I keep the database.yml file out of those updates, since every client has their own DB?
Thanks!
You can, but you probably shouldn't!
You can attach any number of Heroku Postgres instances to a Heroku app. You'll see that each instance you create adds a connection string to the list of environment variables - listed under the App's settings tab.
You can map the key string to a customer via some unique identifier. You would then need an interceptor to bind the connection to the relevant database - or choose the relevant connection from a pre-bound list - and add it to the request context for each request.
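A very rough sketch of what that interceptor might look like (this assumes a reasonably recent Rails, that each customer can be identified from the request, and that each customer's connection string is exposed as an environment variable - all names below are invented):

    class ApplicationController < ActionController::Base
      around_action :use_customer_database

      private

      def use_customer_database
        # e.g. CUSTOMER_ACME_DATABASE_URL, keyed off the subdomain
        url = ENV.fetch("CUSTOMER_#{request.subdomain.upcase}_DATABASE_URL")
        ActiveRecord::Base.establish_connection(url)
        yield
      ensure
        # rebind the default connection afterwards
        ActiveRecord::Base.establish_connection(Rails.env.to_sym)
      end
    end

Note that this re-binds the global connection for every request and isn't thread-safe, which is part of why it's messy.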
It's a messy approach and not recommended. What would be slightly less messy is creating a separate schema per customer instead. This way you bind to a single database instance and keep your database.yml config, but each customer has their own dedicated schema. However, these are more architectural concerns than Heroku capabilities. From a Heroku perspective, both the multi-database and the multi-schema approach are possible.
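For the schema-per-customer variant on PostgreSQL, the switch is just the search_path on a single connection (current_customer is again an invented helper):

    class ApplicationController < ActionController::Base
      around_action :use_customer_schema

      private

      def use_customer_schema
        ActiveRecord::Base.connection.schema_search_path = "tenant_#{current_customer.id}, public"
        yield
      ensure
        ActiveRecord::Base.connection.schema_search_path = "public"
      end
    end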
It should be noted that neither approach gives you any more isolation at the application level than standard roles and permissions with adequate auth mechanisms... All schemas and/or databases will be visible to the same application regardless of the separation at the database level. So, really, there's little tangible benefit to it.
How should we use a database connection with our unit testing project? We have a few thousand unit tests written with the NUnit framework for our MVC application. We have a lot of business logic code that connects to a SQL database from the MVC application, so we have written unit tests that exercise the same business logic and connect to the database.
For example, we have written insert, update, and retrieve operations in unit tests that connect to the database.
We have two different sources: one for the local/staging application and another for the production application, so there are different databases for local and production. Right now our test code connects to the local database. Is it better to use a separate database just for running the unit tests?
Is changing the connection string in the unit test project's web config file enough, and is that the right way to do it, or is there a more proper way to switch databases based on different solution configurations?
Please help us address this scenario.
Regards,
Karthik.
I have been using Heroku Postgres as the database for my Rails 4 app deployed to Heroku.
I connect to the DB locally using pgAdmin3, and haven't had any issues.
Now, I want to switch my database to an Amazon Redshift instance which has been spun up for me. All I have is a username, password, and the database host name. Where do I store this information within my Rails 4 app so that my app will use this DB instead of the current Postgres DB?
I provided a similar answer here, but I would recommend using this adapter to connect:
https://github.com/fiksu/activerecord-redshift-adapter
This certainly works well for any ActiveRecord query you need to run; I'm using raw insert statements to update Redshift tables rather than ActiveRecord create. I'm working on a full Redshift adapter, hopefully to be released in the next few weeks.
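Roughly, the wiring looks like this (the cluster host, credentials, and table below are placeholders, and the adapter/key names should be checked against the gem's README):

    # Gemfile
    gem 'activerecord-redshift-adapter'

    # config/database.yml - a separate entry for the Redshift connection
    redshift:
      adapter: redshift
      host: your-cluster.abc123.us-east-1.redshift.amazonaws.com
      port: 5439
      database: analytics
      username: <%= ENV['REDSHIFT_USER'] %>
      password: <%= ENV['REDSHIFT_PASSWORD'] %>

    # app/models/redshift_base.rb - abstract base class bound to that connection
    class RedshiftBase < ActiveRecord::Base
      self.abstract_class = true
      establish_connection :redshift
    end

    # raw insert, as mentioned above, instead of ActiveRecord create
    RedshiftBase.connection.execute(
      "INSERT INTO events (name, occurred_at) VALUES ('signup', GETDATE())"
    )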
Here's the answer I've given in the past with code examples about halfway down:
How can I build a front end for querying a Redshift database (hopefully with Rails)
Heroku will need to support Redshift as a database option for you, otherwise you'll need to spin up your own stack.
It might be worthwhile checking out the AWS EBS service to do this.
I'm currently developing my Rails app on my local machine. I have no DB installed on my local machine, and I'm sending my code to a remote testing server which actually runs the app in development mode.
Until yesterday all commands like rails g model foo or rails g controller foo on my local machine worked with no errors.
But now all rails generate commands have started to fail because there is no database connection. I think the direct cause is some changes I made to my app configs, but I'm not sure which changes.
I guess the problem is that rails generate commands always invoke active_record,
which always verifies the DB connection.
Now, my question is:
Is there any way to temporarily stop Rails from verifying the database connection, for local development (where no DB connection is available)?
I tried removing config/database.yml but it didn't help.
Your local development environment needs to have the same sort of facilities as the application requires. If you have database-backed models then you need a database, preferably the same one used when deploying the application, so your tests are useful.
It really shouldn't be a big deal to set up a database for local development. Depending on your platform, there are usually many easy-to-use installers available.
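For example, the simplest thing that usually gets rails generate working again is a local SQLite database, assuming SQLite is acceptable for development (add gem 'sqlite3' to the Gemfile):

    # config/database.yml
    development:
      adapter: sqlite3
      database: db/development.sqlite3
      pool: 5
      timeout: 5000

followed by rake db:create db:migrate.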
Uploading your code changes to a remote server for execution is a really dysfunctional development model. If you have no alternative, it might be best to create the models on the remote system and pull them down to edit.
I need to access data in an MS SQL database from a rails app.
The MS SQL database is maintained by our contractors, we just need to access data from it.
Is there a way in rails that I can access an outside db (not the main rails db)?
I can write my own SQL queries, I just need to open a connection to that outside db.
I'm on Rails 3.2.1
Thanks
Check out Connection Ninja; it's pretty straightforward and easy to use.
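If you'd rather not pull in a gem, plain ActiveRecord can do this as well. A sketch, assuming the activerecord-sqlserver-adapter and tiny_tds gems are installed (the connection details and table name are placeholders):

    class ContractorDb < ActiveRecord::Base
      self.abstract_class = true
      establish_connection(
        :adapter  => 'sqlserver',
        :host     => 'contractor-db.example.com',
        :database => 'reporting',
        :username => 'readonly_user',
        :password => 'secret'
      )
    end

    # hand-written SQL against the external database, as mentioned in the question
    rows = ContractorDb.connection.select_all("SELECT TOP 10 * FROM orders")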