I have a Refinery site up on Heroku, and in order to change the site name I have been going to my local copy, changing config.site_name, adding the new config file to my git repository, and doing git push heroku master. I don't anticipate having to change the site name many more times (I have a client who is still deciding on a final site name), but I was wondering if there is a faster way to do this. I tried to figure out whether this is a config option I could just change from the Heroku terminal, but to no avail.
I would suggest adding a new model, site_setting, to hold additional settings. You can add a single column, e.g. site_name, so that an admin can dynamically add or update the name of the site through a simple UI.
The remaining task is to render that value in site_bar.html.erb.
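A minimal sketch of what that could look like, assuming a SiteSetting model with a single site_name column (the class, table, and default value here are only illustrative, not something Refinery provides out of the box):

# db/migrate/xxxxxxxx_create_site_settings.rb -- illustrative migration
class CreateSiteSettings < ActiveRecord::Migration
  def change
    create_table :site_settings do |t|
      t.string :site_name
      t.timestamps
    end
  end
end

# app/models/site_setting.rb
class SiteSetting < ActiveRecord::Base
  # return the stored name, or a fallback if nothing has been saved yet
  def self.site_name
    first.try(:site_name) || "My Site"
  end
end

In the overridden site_bar partial you would then render <%= SiteSetting.site_name %> instead of the value from config.site_name, and a simple admin form (even a plain scaffold) lets the client change the name without a deploy.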
I hope this works for you.
The problem:
I'm using the Bitbucket Server (Stash) API in a script for my project, specifically the {path} API method:
/rest/api/1.0/projects/{projectKey}/repos/{repositorySlug}/browse/{path:.*}
The idea was to save versions of config files in a repository (version01 through versionXX for every config). But those configs have the same structure with different names and parameters, so when I push a new config with a commit message like 'version01' without specifying any sourceCommitId, Bitbucket automatically adds a parent commit from the last file with the same structure (if one exists). As a result, this new file's history contains several 'version01' commits, which is not what I intended.
What I've tried:
If I do specify sourceCommitId as the initial or the last commit on the branch, I get an error message since the file doesn't exist on this commit.
I've tried experimenting with an empty sourceBranch parameter, but a parent commit still appears.
The only idea I came up with is to create a new branch for every config, but this seems like overkill to me.
All my attempts to find a method for editing a file's commit history via the API also failed.
At the moment, as a workaround, I create every config file with its name as the only line of its content and then change it to the structure I need. This works so far, but it doesn't look like a good solution to me and requires two API requests instead of one.
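For reference, here is roughly what that two-request workaround looks like in Ruby with Net::HTTP. The host, project/repo keys, credentials, and the multipart field names (content, message, branch, sourceCommitId) are assumptions based on the endpoint above; check them against your Bitbucket Server version:

require 'net/http'
require 'uri'
require 'json'

# assumed values -- replace with your own instance, project and repo
FILE_URL = URI('https://stash.example.com/rest/api/1.0/projects/PROJ/repos/configs/browse/my_config.yml')

def put_file(fields)
  req = Net::HTTP::Put.new(FILE_URL)
  req.basic_auth('user', 'password')
  req.set_form(fields, 'multipart/form-data')   # the browse endpoint takes multipart form data
  res = Net::HTTP.start(FILE_URL.host, FILE_URL.port, use_ssl: FILE_URL.scheme == 'https') do |http|
    http.request(req)
  end
  JSON.parse(res.body)
end

# 1) create the file with its name as the only content, so Bitbucket has
#    nothing structurally similar to match it against
commit = put_file('content' => 'my_config.yml',
                  'message' => 'version01',
                  'branch'  => 'master')

# 2) overwrite it with the real config, parenting the edit on the commit
#    created in step 1 (the response is assumed to contain the commit id)
put_file('content'        => File.read('my_config.yml'),
         'message'        => 'version01',
         'branch'         => 'master',
         'sourceCommitId' => commit['id'])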
Is there a better way to prevent Bitbucket from treating those new files as copies of old ones?
I'm using Docusaurus / Travis CI to build my docs and I'm trying to have them hosted on a custom domain, myproject.com. The way I have it set up right now, Travis CI initiates a build every time I make a commit, BUT the custom domain gets wiped out every time this happens. I can duplicate the issue by manually initiating builds myself.
It happens regardless of what the repo name is. I’ve tried myproject.github.io as the repo name and myproject.com (to match the custom domain) as the repo name. In both cases the custom domain gets wiped. When the repo name is myproject.github.io the site gets published to https://myproject.github.io/ and when the repo name is myproject.com the site gets published to https://myproject.github.io/myproject.com/ .
Am I just going to have to remember to always re-add the custom domain every time I make a commit to my build branch? 🙄
I had this issue. If you're pushing something to the gh-pages branch, it could be that you're not preserving the CNAME file.
When you add a custom domain, GitHub automatically adds a CNAME file to your gh-pages branch with your domain inside it.
I figured this out. Farook's post pointed me in the right direction.
So when the repo is named myproject.github.io, the master branch is used for deployments; myproject.github.io is a mirror of what's in the master branch.
The problem is that the *.md files you modify to work with Docusaurus aren't the end product, so you have to work out of a branch other than master. Within that branch, a lot of the root contents are built from the *.md files, but the rest is copied from the website/static directory, so I just put the CNAME file there and that worked!
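In other words, the fix is to commit a CNAME file containing just the bare domain into the static assets folder, so every build copies it into the published root and GitHub Pages keeps the custom domain. For this example that would be a one-line file at website/static/CNAME:

myproject.com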
I am wondering about the best way to include an old web site in a newer Rails app.
The legacy web site:
Has 21,000 small text files with minimal markup that are linked together.
Totals ~ 220MB
Has a root page located within a directory and is linked to many sub-directories
I'd like to include the old site in my rails app folder, but I am concerned that it will mean a much longer development cycle each time I deploy. I am using capistrano and my first thought is to place the folder with Old Site in the shared directory on the production server and symbolically link to it accordingly. This approach strikes me as undesirable as my resources for New App will be split in more than one location. The benefit might be a much quicker debug/deploy cycle.
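For concreteness, the symlink idea would look something like this with Capistrano 2 (the shared/old_site and public/old_site paths are just placeholders):

# config/deploy.rb -- assumes the legacy files live in shared/old_site on the server
namespace :deploy do
  desc "Symlink the legacy site into the current release"
  task :link_old_site, :roles => :app do
    run "ln -nfs #{shared_path}/old_site #{release_path}/public/old_site"
  end
end
after "deploy:update_code", "deploy:link_old_site"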
Right now, I have no plans to modify the Old Site files. At some point, that could change.
I have been impressed with how quickly my otherwise lightweight project will deploy. Right now I am making frequent changes and repeating the code/deploy cycle often. I'd like to avoid slowing that down unnecessarily.
Is there a best practice for this sort of thing?
I don't think there is a "best practice" per se, but one option would be to use Git submodules. Add your Old Site as a submodule in the right folder of your New Site: you get a single git repository to develop against, and Capistrano will not fetch the submodule's files while deploying.
With submodules you will have two git repositories, but one will be linked from within the other ("old site" inside "new site"). I think that is logical: you have an old-site repo and a new-site repo. They are two separate sites after all, just linked.
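For example, from the root of the new app (the repository URL and target folder are placeholders):

git submodule add git@yourserver:old_site.git public/old_site
git commit -m "Add legacy site as a submodule"

# after a fresh clone, pull the submodule contents in
git submodule update --init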
I have a Rails application for which I now plan to deploy many instances to different domains. Originally I only intended for it to be on one domain.
I realize that for each domain, I have to replace all the hard-coded values in various places. These include:
asset host path (assets reside on the same domain)
the whenever gem's :application setting (since two domains can share the same server, and this avoids crontab update clashes)
some tasks that use curl against the app's own address to trigger events
Carrierwave, which needs a hardcoded value when computing an image's full URL without the request object
Question
Is there a strategy for setting this up, so that:
the setting is not committed into source control (like database.yml.example)
code outside Rails can access it (the whenever gem does not load the Rails environment)
the domain is accessed in a consistent way
One approach you can take is to have a YAML file with per-deployment properties. You could even check a development version in and have your deploy scripts overwrite it with the correct version.
Typically I'd put that configuration file in shared/config (assuming a Capistrano-style layout) and then symlink it into the current release during the deploy.
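A rough sketch, assuming a file called config/app_config.yml and a tiny plain-Ruby loader (all names here are invented for illustration):

# config/app_config.yml -- one copy per deployment, kept out of source control
# (commit an app_config.yml.example instead, like database.yml.example)
domain: "www.example.com"
asset_host: "http://assets.example.com"
whenever_application: "myapp_example_com"

# lib/app_config.rb -- plain Ruby, so code that never loads Rails
# (for example whenever's schedule.rb) can still read the settings
require 'yaml'

APP_CONFIG = YAML.load_file(
  File.expand_path('../../config/app_config.yml', __FILE__)
)

Rails code (config.action_controller.asset_host, the Carrierwave host, the curl tasks) and schedule.rb can then all read APP_CONFIG instead of hard-coded strings; Capistrano keeps the real file in shared/config and symlinks it in on each deploy, and only the .example file is committed.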
I am trying to migrate the setup here at the office from SVN to Git and am setting up Redmine as the host for our projects and issue management (Currently we use a version of Gforge + SVN). I should preface by saying that I'm an embedded C software developer by day and have basically zero experience with Rails or web apps, but I like trying new things so I volunteered to set up the project management tools which will take us into the future.
I have Redmine setup and am using Gitolite as the Git repo manager. Additionally, I am using the ericpaulbishop/redmine_git_hosting plugin to facilitate automatic public ssh key pushing to Gitolite and automatic repo creation when we register a new project. Everything seems to work except the repo view within the project does not keep track of the changesets. (The "History" is just empty, although when you view the files, it does show the latest version correctly)
I copied the post-receive hook from the plugin's contrib directory into the .gitolite common hooks directory, but again, I know little about Ruby and how these gitolite hooks work, so I don't know how to debug this. I notice there are log messages and things in the hook, but I have no idea where those are printed, etc...
I even tried the HowTo on the Redmine wiki, "HowTo setup automatic refresh of repositories in Redmine on commit":
#!/bin/sh
curl "http://<redmine url>/sys/fetch_changesets?key=<your service key>"
Any ideas on where I start debugging? I've been able to resolve every problem up to this point, but I'm a little stuck now. The plugin doesn't make it obvious how this is supposed to work, and to be honest, I'm not even sure if this is a problem with Redmine not reading the repo correctly (or at all), or gitolite not communicating as Redmine expects, etc...
I guess I could answer this...
I checked the issues on the GitHub page and found this one:
https://github.com/ericpaulbishop/redmine_git_hosting/issues/89
Which was pretty much exactly my problem. This does appear to be a small bug in the plugin, but you can work around it by changing Max Cache Time to "1 minute or until next commit". This immediately fixed my problem. I simply left it like that, but one of the posters claimed that you could change it back to "until next commit" and it keeps working from then on...