We decided to try using RefineryCMS for our current project and have run into some issues.
We began the project in a local development environment, and the deadline is fast approaching. At first, we tried to move the project into production on a DreamHost server. There were issues with the images (Refinery uses Dragonfly): our database dump wasn't bringing any of the images (or thumbnails) over.
We have exhausted our resources and still can't find a solution to our problem. I've done research, asked in the Google group, emailed people, and asked in the chat - no one has answered yet.
So, does anyone here know the best way to move a project from a local development environment into production?
I am sure I can't be the only person who has run into this issue...
Thanks in advance.
In case someone arrives here with this same situation (as I did), my two cents:
I've successfully migrated between servers by dumping the database and copying public/system, so this is perfectly feasible and straightforward.
At first the images were not showing in either the backend or the frontend, but after a couple of hours of being stuck I realized the problem was that I hadn't installed ImageMagick on the new server (huge facepalm).
A quick sudo apt-get install imagemagick solved the issue.
Of course, YMMV, but I hope that helps.
I'm not sure what version of RefineryCMS you are using, or how you configured it, but unless you chose to use Amazon S3 for uploads then your Dragonfly images and resources are being stored on the file system. So in addition to that database dump, you'll probably want to look in the public/system/ folder and copy everything in the images and resources folders up to the server too.
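For example, something like this run from the project root copies the uploads up alongside the database dump (the username, host, and remote path here are placeholders for your own server):

rsync -avz public/system/ deploy@yourserver.com:/path/to/app/public/system/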
I would appreciate any feedback regarding what may be causing my issue, described below.
I have an application that allows users to upload images. Everything works fine in development and used to work fine in production.
Recently, my newer images have all broken. I can upload new images, but when I check back a few hours later, the images are broken again. This started happening about a week ago, and images that I've had up in production from before then are still ok.
I am using Rails with Bootstrap and SimpleForm, and Paperclip for the images. I am using Postgres in both development and production, and am deploying to Heroku.
The only hint I have is in the "blank_profile_pic.png" image that I use as a default when users don't have a profile picture uploaded.
User.all.each do |u|
  if u.profile_pic.file?
    image_tag(u.profile_pic)
  else
    image_tag("blank_profile_pic.png")
  end
end
For users that don't have a profile_pic uploaded, a broken image appears if their profile was created in the last week, but the expected "blank_profile_pic.png" remains for people that created their account before the issues started surfacing a week ago. How can the same block of code return different results between recent and older users?
I really don't know where to start with this, so I would appreciate any feedback on possible causes and whether there are any other files I should show here.
Thanks very much for your help!
Heroku has a read-only (ephemeral) filesystem, so you most definitely have to upload your images to S3 or some other cloud provider.
Your images might have been written to /tmp on Heroku and then cleared at some point, hence the broken images.
Here are the docs: https://devcenter.heroku.com/articles/read-only-filesystem
And configuring Paperclip with S3: https://github.com/thoughtbot/paperclip#storage
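For reference, a minimal sketch of the S3 configuration (the bucket and ENV variable names are placeholders, and the exact keys vary a little between Paperclip versions; you'll also need the aws-sdk gem in your Gemfile):

# config/environments/production.rb
config.paperclip_defaults = {
  storage: :s3,
  s3_credentials: {
    bucket:            ENV['S3_BUCKET_NAME'],
    access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
  }
}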
I am not sure whether you have the imagemagick gem installed, but we had the same issue with it breaking our Paperclip functionality in production, but not in development (weird, I know). Even after removing imagemagick from our Gemfile and Gemfile.lock locally (running bundle install and all that) and then deploying back to production on Heroku, the error persisted in production! (weird, I know)
What ended up doing the trick was running:
$ heroku repo:purge_cache -a myAppName
(Taken from: https://github.com/heroku/heroku-repo#purge_cache)
When you deploy your app, Heroku caches things like your assets and installed gems in order to speed up deployment. Although this is a great feature, it can sometimes have side effects, and in this case it seems something about the imagemagick gem got "stuck" in production's cache, which is why purging solved the issue for us (after purging, your app rebuilds itself from scratch on the next deployment).
I've spent the last few months developing a (my first) Rails application all by myself, just me and my Linux box, all in my development RAILS_ENV, no SCM ("for shame!") or anything. It has become quite the beast now and I am getting ready to release it onto the world. My question is: how am I ever going to make this work?
I installed gems, plugins, and servers (MySQL, node.js, nginx, Sphinx, Juggernaut), photo compression apps that I call out to, video compression tools (FFmpeg), etc. I also obviously have a database and a lot of seed data. I can't even remember all the things I did to my system to make it all work, but it does.
So now, when I deploy this on some stranger's server, how do I make sure all those things get installed and configured correctly? How is, say, FFmpeg ever going to get installed on that server when I deploy my application? How will the seed data get uploaded, and how will the servers get started with the right parameters, etc.?
I have read (a little bit) about Capistrano, which seems to be the deployment tool of choice in the Rails community, but I am not sure whether it will cover all my needs. For example, how do I figure out all the gems and plugins I used (do I even need to know?)? Is there any way I can test the deployment on my own Linux box, the same one I am developing on, i.e. pretend that I am hosting my own production server/RAILS_ENV and "deploy" it there?
Any help will be much appreciated.
Cheers.
There are a lot of standards to follow that make life easier...
As far as figuring out which gems you need, you could use RVM and create a fresh gemset that you keep adding gems to until your app works. This is kind of like starting from scratch, so you can be sure you know precisely what configuration you need to run. (And it should make it easy to stand up a new, identical environment each time.)
The RVM route will allow you to test in a specific environment, which should help.
You can list the required gems in your environment.rb file (or, if you're using Bundler, in your Gemfile) so that the server demands them on start-up.
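For example (the gem and version are just placeholders), the Rails 2.x style uses config.gem in config/environment.rb, while Bundler uses a Gemfile entry:

# config/environment.rb (Rails 2.x)
config.gem 'paperclip', :version => '~> 2.3'

# Gemfile (Bundler / Rails 3)
gem 'paperclip', '~> 2.3'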
Good luck, Cowboy.
OK, so I'm looking for a good image uploading gem that is Rails 3 compatible and has no dependencies. I was using attachment_fu, but its Rails 3 compatibility seems to be in question. And I really wanted to use Paperclip, but it has an ImageMagick dependency. I'm having a hard time finding other alternatives...
Stupid question #1: Shouldn't Rails have some "official" image uploading scheme that's baked into the framework? Every web app needs it at some point, and hunting around every time for some questionable third-party way of doing it gets old after a while.
Stupid question #2: Why can't Paperclip have a no-dependency mode that doesn't make thumbnails or resize, and just stores images as they are uploaded?
As someone who has Rails with Paperclip running on Linux, Mac, and Windows, I can tell you: installing ImageMagick is not a problem.
In fact, my old Linux hosting already had it, and IIRC so did my Mac laptop. I expected some problems installing it on Windows, but all I had to do was download the installer and tell Rails the correct path. No problem whatsoever.
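If the binaries aren't on your PATH (the Windows case above), pointing Paperclip at them is a one-liner; the path below is only an example:

# config/initializers/paperclip.rb
Paperclip.options[:command_path] = 'C:/Program Files/ImageMagick-6.9.0-Q16'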
So, in your place, I would really give it a try.
edit
There are also a number of file upload plugins for Rails, but I haven't used them and can't really give advice there. Google will give you examples.
I am going to be away from the internet for a few weeks and would still like to get a project done. What steps should I take to make sure I have access to the things I need (Ruby and Rails) while I'm disconnected?
when offline, the following are hard to get:
gems
docs
rails expert blogs
stackoverflow ;-)
so,
gem install as much as you can
download all the railscasts
keep one or two Rails books around
and find a place with internet wifi
and most importantly:
unplug yourself two days before the real offline stretch - that's called staging ;-)
If you use version control, make sure you can work offline. DVCSs handle this well; I've heard SVN can work offline if you have a local SVN server.
Running the Rails app on localhost will allow you to access it with your browser locally.
Apart from this, it would be nice to have documentation offline too. Download everything you can think of: Rails, Ruby, shell, libs, etc. Or use books.
Make sure you have local copies of any documentation you need (railsapi.com lets you download the Rails docs)
Make sure you have all the gems/plugins you need
This may not affect you, but it's bitten me before.
If you are using a JavaScript library such as jQuery and are linking to Google's Hosted Libraries rather than a local copy, you may find jQuery stops working when you are offline.
Download and link to a local copy before you go.
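A minimal sketch of the swap, assuming you've saved the file as public/javascripts/jquery.js (the filename is just an example):

<%# app/views/layouts/application.html.erb %>
<%# instead of the CDN script tag pointing at ajax.googleapis.com: %>
<%= javascript_include_tag "jquery" %>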
Get your app (in its current state) up and running on your laptop. Then shut off wireless and make sure it still goes. Don't just guess at what gems and things you'll need - make sure you see it actually run. Don't forget things like database engines and queuing servers. Then start guessing about other gems and items you might need.
Before you leave, make sure that
gem server
starts up a local web server (http://localhost:8808 by default) and lets you browse the docs for all your installed gems.
Download every Ruby gem. All of them!
You never know when you'll need to extract EXIF data, or something.
OK, I just bought the new 27-inch iMac and am trying to get everything set up. I am new to Rails and have been developing on my MacBook Pro, and I seem to be having some trouble sharing my applications. I use Dropbox, which lets me easily access the files (and therefore my Rails applications) from the new iMac, but after installing Rails, when I try to start the server for my app I get:
-bash: script/server: Permission denied.
I am assuming this has to do with the app being protected, but I'm not sure what to do here.
It's not protected; you probably lost the execute permission while your files were being synced through Dropbox.
Just do:
chmod +x script/server
You might consider something else for the transfer, like rsync instead of Dropbox.
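For example (the hostname and paths are placeholders, and this assumes Remote Login/SSH is enabled on the iMac):

rsync -avz --exclude log --exclude tmp ~/code/myapp/ you@imac.local:~/code/myapp/

The -a flag preserves permissions, so the execute bit on script/server survives the copy.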
Rather than sharing the entire application directory structure, I've found that the better solution is to share a git repo via Dropbox and then clone it on each machine (I also have an iMac and an MBP that I work on).
The problem with storing the app itself on Dropbox is that the log files and possibly the SQLite database can chew up a lot of room. Not to mention that it's always good to use some sort of SCM (git being the most favored in the Rails community, but others should work fine too).
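If you do want Dropbox in the loop, a minimal sketch of the bare-repo approach (the paths are just examples):

# on the first machine (run git init / add / commit in the app first if it isn't a repo yet)
git init --bare ~/Dropbox/repos/myapp.git
cd ~/code/myapp
git remote add dropbox ~/Dropbox/repos/myapp.git
git push dropbox master

# on the second machine
git clone ~/Dropbox/repos/myapp.git ~/code/myapp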
I went through the steps to do this on another answer to another question.