Automating the use of external config files in Grails

I am doing what appears to be the best practice for using an external config file in Grails:
grails.config.locations = ["classpath:${appName}-config.groovy",
                           "file:./${appName}-config.groovy"]
if (System.properties["${appName}.config.location"]) {
    grails.config.locations << "file:" + System.properties["${appName}.config.location"]
}
I put the config file in the project root during testing and it works. In production I manually put the config file in our Tomcat server's lib folder (on the classpath) and that worked too. But I don't want to have to copy or recreate the external config file by hand every time. After building a war and deploying an app, is it possible for my config file to be moved to the correct location automatically, so that I don't have to move it manually for every deployment? Thanks.

Note: I would edit the title of the question to reflect your actual problem, which is about automation.
It depends on how manual your existing process is...
Let's imagine the following:
You have a continuous integration engine such as Jenkins/Hudson running
You scp/rsync the external configuration file to the server upon successful builds (a sketch of such a step follows below)
You deploy the application to the server upon successful builds (assuming a Gant deploy script under projectName/scripts targeting Tomcat, JBoss, WebSphere, etc.)
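For illustration, a minimal Gant script along those lines could look like the following. This is only a sketch: the host name and Tomcat paths are hypothetical, and grailsAppName/grailsAppVersion are the variables the Grails build scripts normally provide. The point is simply that the external config file gets pushed alongside the war on every successful build, so nothing has to be copied by hand.
// scripts/DeployToTomcat.groovy -- a sketch only; host and paths are hypothetical
includeTargets << grailsScript("_GrailsWar")

target(deployToTomcat: "Copies the external config file and the war to the server") {
    depends(war)
    def host = "tomcat@appserver.example.com"   // hypothetical server
    def tomcatLib = "/usr/share/tomcat/lib"     // a directory on Tomcat's classpath
    def webapps = "/var/lib/tomcat/webapps"
    // push the external config next to Tomcat's classpath, then the freshly built war
    "scp ${grailsAppName}-config.groovy ${host}:${tomcatLib}/".execute().waitFor()
    "scp target/${grailsAppName}-${grailsAppVersion}.war ${host}:${webapps}/".execute().waitFor()
}

setDefaultTarget(deployToTomcat)
A CI job (Jenkins/Hudson) can then run this script after each successful build instead of anyone moving files around manually.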

In our project we need to provide different configuration settings per environment (local, development, testing, production). While we want to avoid the hassle of updating config files on the different servers themselves, we do need to allow quick overriding of config values in a specific environment.
To accommodate these requirements we have the following setup of 'cascading' configuration files:
common.properties bundled in the .war file and is always loaded
xxx.properties files are bundled in the .war file, and depending on the value of an environment variable (appEnv) one or none of these configuration files is loaded (e.g. if the environment variable is yyy, yyy.properties is loaded)
configuration files on the local file system are found using the appEnv and appLibRoot environment variables. If a configuration file corresponding to appEnv.properties is found, it is loaded last.
Loading the config files on startup of the Grails application is done by simply providing a list of configuration file locations in Config.groovy. The AppEnv class creates the list of configuration files using the appEnv and appLibRoot environment variables, checking which files actually exist on the classpath and on the file system.
Config.groovy
grails.config.locations = AppEnv.instance.configLocations
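The answer does not show the AppEnv class itself, but based on the description above, a minimal sketch might look like this (the property-file names, the appEnv/appLibRoot variables and the @Singleton approach are inferred from the description; the real implementation may differ):
// A sketch of an AppEnv singleton that builds the cascading list of config locations
@Singleton
class AppEnv {
    List<String> getConfigLocations() {
        def env = System.getenv('appEnv')           // e.g. "development", "testing", "production"
        def libRoot = System.getenv('appLibRoot')   // root folder for external config files
        def locations = ['classpath:common.properties']   // always loaded

        // environment-specific file bundled in the war, only if it is actually on the classpath
        if (env && getClass().classLoader.getResource("${env}.properties")) {
            locations << "classpath:${env}.properties".toString()
        }
        // external override on the file system, loaded last so its values win
        if (env && libRoot && new File(libRoot, "${env}.properties").exists()) {
            locations << "file:${libRoot}/${env}.properties".toString()
        }
        return locations
    }
}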

Related

Folder contents are automatically deleted every time I deploy my Rails app

I have a directory under the /public folder where, using CarrierWave, I store all my PDF files. The problem is that every time I deploy a new version of my Rails app this directory gets cleaned up and all the files go missing. This directory is set in my uploader.
I also have a directory named "private", which I created manually so that sensitive files are not served globally on the web. Those files are also gone after each new deployment.
How can I prevent files from deleting on deploy process?
Thanks.
I assume you are using some automation for deployment, because if you were updating the code on your server instance manually you could preserve previously uploaded files; but updating code manually is not good practice.
With automated deployment we generally follow this kind of flow.
Every deploy creates a new release version and sets it as the current version.
That simply means a new directory is created and your Rails project is placed in it. The files you stored inside the previous release's project directory are still there in that previous version; they are not gone from the Linux instance (but only if your setup preserves the last few versions, so you can restore them in case a new deploy blows up).
Clear so far?
Now suppose you are not keeping any previous versions: your files are gone forever.
So it's not a good idea to store any files inside the project repository.
Best practice is to use an object-storage bucket, such as an AWS S3 bucket or a Google Cloud Storage bucket, to store all uploaded files. If a bucket is not in the budget, you can choose a directory on the Linux server instance outside the project directory, but you then have to set up the upload paths and directory structure yourself so it can be used like a bucket.
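If you go the bucket route, CarrierWave can store uploads in S3 through the fog-aws gem. A minimal sketch, assuming fog-aws is installed; the bucket name, region and environment variable names are placeholders for your own:
# config/initializers/carrierwave.rb -- sketch; bucket name and env var names are placeholders
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws'
  config.fog_credentials = {
    provider:              'AWS',
    aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
    region:                'eu-west-1'
  }
  config.fog_directory = 'my-app-uploads'   # hypothetical bucket name
end
# and in the uploader, switch from file storage to fog:
# storage :fog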
The problem I am facing is caused by Capistrano. Every time I run the cap production deploy command, the Capistrano deployment tool syncs every file with the git repo. The files added by end users are of course not stored in my git repo, so Capistrano was overwriting the folder on the server with the empty public folder from my repo. Adding the path to the :linked_dirs variable in deploy.rb solved my problem.
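For reference, that deploy.rb change might look like the following with Capistrano 3; the directory names are assumptions, so use whatever paths your uploader actually writes to:
# config/deploy.rb -- keep user uploads and the manually created private folder
# out of the release directories; they live in shared/ and are symlinked into each release
append :linked_dirs, "public/uploads", "private"
# on Capistrano versions before 3.5 the equivalent is:
# set :linked_dirs, fetch(:linked_dirs, []) + %w[public/uploads private]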
Another approach could be to use a directory somewhere other than your project root path (such as /home/files) to store all your files. By doing this you separate your files from the project and also avoid Capistrano's overwriting problem.
Hope this information will be useful for someone, or for future me :)
When you deploy with Capistrano, a new release (folder) is created from the repository.
Any files not in the repository are not carried over.
If you want to persist files in public, you need to create a directory on your server first and then have Capistrano create a symlink inside public to that folder.
Then have your CarrierWave uploads saved to that location.
During each deployment, cap will symlink to that directory and your files will be there.
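The manual symlink approach described here looked roughly like this in older Capistrano 2 setups (with Capistrano 3, the :linked_dirs setting shown earlier does the same job); the uploads path is an assumption:
# config/deploy.rb -- Capistrano 2-style sketch
namespace :deploy do
  task :symlink_uploads do
    # shared/uploads lives outside the release folders, so it survives deploys
    run "mkdir -p #{shared_path}/uploads"
    run "ln -nfs #{shared_path}/uploads #{release_path}/public/uploads"
  end
end
after "deploy:finalize_update", "deploy:symlink_uploads"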

Why is the `database.yml` file in the config folder, rather than in the db folder, in Ruby on Rails?

I am new to Rails and trying to understand its app directory structure. While doing so I came across database.yml in the config folder. But we have a separate db folder, so why is this .yml file in the config folder?
Thanks in advance.
Because you use the database.yml file to configure your database. For example, if you wish to change your main database from SQLite (the default) to MySQL, you need to change your database configuration, which is found in your database.yml file.
config
As the name suggests this contains all the application’s configuration files. The database connection and application behavior can be altered by the files inside this directory.
config/database.yml
This file holds all the database configuration the application needs. Here, different configurations can be set for different environments.
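For illustration, a typical database.yml with per-environment sections looks something like this (the MySQL adapter and the database names are example values only):
# config/database.yml -- example values
default: &default
  adapter: mysql2
  encoding: utf8
  pool: 5
  username: root
  password: <%= ENV["DB_PASSWORD"] %>

development:
  <<: *default
  database: myapp_development

test:
  <<: *default
  database: myapp_test

production:
  <<: *default
  database: myapp_production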
So, all configuration-related tasks are handled under the config directory.
Because database.yml contains the config for your DB setup, and Rails convention requires that all configs live in the config folder :) This way you don't need to search the whole project for the config of a new gem you just installed - all configs are always in the same folder.

How to configure Rails app for deployment to Tomcat

I have a Rails app that I package as a war file for deploying to Tomcat using Warbler. And it works, but the problem is I don't know how to configure the runtime properties like secret_key_base. I use the standard setup of using secrets.yml, with production variables coming from environment variables. But I don't know how to set the variables while still keeping them out of source control.
Ideally I'd still like to be able to deploy the war file automatically, by just dropping it into the webapps/ directory, but I suppose I could edit the server config file? Or is there a better way of handling this?
Either do it the same way as you would with a plain Rails server ... let it read from ENV (of course you will need to make sure Tomcat has the environment variable set).
Alternatively, you can set it in web.xml when you're packaging and then do a $servlet_context.getAttribute('foo') in secrets.yml ... or read it from a file location that only the server's tomcat user can access, etc.
The sky is the limit here - you basically need to decide what fits your deployments best.
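A sketch of what the two variants could look like in secrets.yml; the variable and attribute names are placeholders, and $servlet_context is what JRuby-Rack/Warbler exposes to ERB:
# config/secrets.yml -- sketch: pick one of the two lookups for production
production:
  # 1) read from an environment variable set for the Tomcat process
  secret_key_base: <%= ENV["SECRET_KEY_BASE"] %>
  # 2) or read a servlet context attribute defined for the webapp (Warbler / JRuby-Rack)
  # secret_key_base: <%= $servlet_context.getAttribute('secret_key_base') %>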

How to use a single .war for several grails environments in AWS Elastic Beanstalk?

I run several environments of my Grails application in Elastic Beanstalk. It would be a big timesaver not to have to build and upload different .war files just for the different environments (I have all the environmental differences passed in as system properties in the 'container' configuration area, so there is no external config file). According to this article http://mrhaki.blogspot.ca/2011/02/grails-goodness-one-war-to-rule-them.html, it is possible to use a single .war and set the environment dynamically by passing the grails.env property, but it doesn't seem possible to do so, as Beanstalk limits you to a predefined set of named system properties (JDBC_CONNECTION_STRING, PARAM1, PARAM2, etc.).
What would be my best approach here?
It turns out you can pass arbitrary parameters, including environment variables, to the container via the 'JVM Command Line Options' field in the 'container' area of the configuration.
-Dgrails.env=DesiredEnvironmentName
Works like a charm, I'm now using a single .war for all environments.
Set PARAM1 to the name of your config file,
then in Config.groovy
grails.config.locations = [ System.getProperty("PARAM1") ]
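If you take that route, it may be worth guarding against PARAM1 being unset so the app still starts locally; a sketch, where the property value would be something like file:/path/to/myapp-config.groovy:
// Config.groovy -- only add the external location when PARAM1 is actually set
def externalConfig = System.getProperty("PARAM1")
grails.config.locations = externalConfig ? [externalConfig] : []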
Alternatively, you could just store a different configuration for each of your environments in the database via something like the dynamic config plugin - http://grails.org/plugin/dynamic-config

Capistrano: Version control for files in shared/

When Capistrano deploys a Rails app, it creates a shared/ directory to store files that should be shared across releases and not re-exported every time. In my application I have several things in the shared/ directory that rarely change (so they belong there rather than in the application tree), but I'd still like them to be version controlled for the times when they do change.
What is the best way to approach version controlling those files but keeping them separate from the repository Capistrano is exporting from?
The /shared directory is really for un-versioned data. For example, you might store bundled gems there so that you don't have to re-install all your gems every release. You can also store your logs there so they don't get overwritten every time you deploy. You can store pid files there so you don't lose the process ids of critical processes during a deploy. You might even store user-generated or partially processed data there so that it is not removed during a release. If a file is meant to be versioned and has a chance of changing, though, I would recommend keeping it with the rest of your files and out of the shared directory.
That said, you can always also write deploy scripts to pre-populate data in your shared directory, like database configuration files. These scripts will get run on each deploy and can be entirely customized. For example, your database config script might only write the config file if it doesn't already exist.
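A sketch of such a pre-populate script as a Capistrano 2-style task; the template path config/database.yml.example is an assumption:
# config/deploy.rb -- write shared/config/database.yml only if it does not already exist
namespace :deploy do
  task :seed_database_config do
    run "mkdir -p #{shared_path}/config"
    exists = capture("[ -f #{shared_path}/config/database.yml ] && echo yes || echo no").strip
    # keep any hand-edited config already on the server untouched
    upload("config/database.yml.example", "#{shared_path}/config/database.yml") if exists == "no"
  end
end
after "deploy:setup", "deploy:seed_database_config"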
Another common use of the shared directory is for configuration files. Versioning and source control for configuration files is a very good idea, but should be managed in a system configuration management tool. In my environment, I manage code releases with Capistrano and system configuration with Puppet. That way, there is still source control over configuration files, but they are kept distinct from the code deploy process. In turn, the code deploy process is kept independent of system configuration.
