I'm using AssetSync to sync to an S3 bucket. It seems to work fine. However, I'd like to be able to add versioning to the S3 bucket. So instead of just the bucket name, I want to put the deployed assets into a subdirectory like
my-bucket/v1/
I tried adding the 'v1' folder to the FOG_DIRECTORY env var, but that doesn't seem to be the thing to do. Is there an easy way to specify a subdir of a bucket?
Found the answer myself. Thanks for nothin'.
AssetSync.config.assets.prefix
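Concretely, AssetSync follows the standard Rails asset prefix, so a minimal sketch (assuming a Rails app; set this in config/application.rb or an environment file, and the exact prefix value is just an example):

```ruby
# config/environments/production.rb (sketch; adjust to your setup)
# Compiled assets are written under public/v1/assets, and AssetSync
# uploads them to the matching my-bucket/v1/assets/... keys.
config.assets.prefix = '/v1/assets'
```

Bumping the prefix (v1, v2, ...) on each release gives you versioned asset paths inside the same bucket.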
I have several sound files that are located in public/assets/sounds.
Locally everything works fine, but when I deploy via Capistrano to my ec2 instance, none of those assets make it to the server. I added 'public/assets/sounds' to :linked_dirs in deploy.rb. A directory shows up at 'public/assets/sounds' but none of the mp3s are there. Do I need to manually add all files via :linked_files?
I have it working by just loading the files into the shared/public/assets/sounds directory via FTP, but that doesn't seem like the best use of Capistrano. I'm also new to Capistrano and could be totally wrong :p
The public/assets directory is reserved for the Rails asset pipeline. You should not place any files there. Here's what I would do:
Remove public/assets/sounds from :linked_dirs.
Choose a different place for the mp3 files, like public/sounds.
Do not add this directory to :linked_dirs.
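To illustrate the steps above, a deploy.rb sketch (assuming Capistrano 3; the directory names other than public/sounds are the usual Capistrano defaults, not something from this thread):

```ruby
# config/deploy.rb (sketch)
# public/assets is managed by the Rails asset pipeline, so leave it alone.
# Keep the mp3s in public/sounds, committed to the repo, so every deploy
# ships them with the code. Note that public/assets/sounds is deliberately
# NOT listed here.
append :linked_dirs, 'log', 'tmp/pids', 'tmp/cache', 'public/system'
```

Because the files live in the repository rather than in a linked shared directory, there is nothing to upload by FTP and no :linked_files entries to maintain.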
Our server ran into a file-limit issue with CarrierWave: over 36,000 files. We are now going to move to S3.
Is there a way to migrate the files over to S3? When we launched the code in production, none of the images showed up and there was a duh moment: the app is trying to fetch the files from S3 while they are still stored locally on the server.
How do we migrate the files over?
You can upload the files through the S3 console's file manager, or with a browser plugin such as S3Fox for Firefox. You'll just need to make sure the paths and the S3 bucket are set up so that CarrierWave knows how to resolve each image through the right set of subfolders, etc.
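If you'd rather script the migration than click through the console, here is a hedged sketch using the fog gem (the bucket name, local upload path, and env var names are placeholders, and it assumes CarrierWave's default store_dir layout under public/uploads):

```ruby
require 'fog'

# Sketch: copy CarrierWave's local uploads directory up to S3,
# preserving the relative paths so CarrierWave's store_dir still matches.
storage = Fog::Storage.new(
  provider:              'AWS',
  aws_access_key_id:     ENV['S3_KEY'],      # placeholder env vars
  aws_secret_access_key: ENV['S3_SECRET']
)
bucket = storage.directories.get('my-bucket') # placeholder bucket name

Dir.glob('public/uploads/**/*').select { |p| File.file?(p) }.each do |path|
  key = path.sub('public/', '')              # e.g. uploads/image/1/photo.jpg
  bucket.files.create(key: key, body: File.open(path), public: true)
end
```

Run it once from the server that still holds the local files, then switch CarrierWave's storage setting to fog.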
I'm trying to get refinerycms to upload files to s3, using the fog gem.
I'd like to pull my S3 credentials from a file that is not in my git repo (e.g. s3.yml)
I found some old references to doing this using the aws-s3 gem, but not fog.
Thanks in advance for any help!
I keep my config in a config file rather than a yml file.
In config/s3_config.rb:
ENV['S3_KEY'] = 'MYS3KEY'
ENV['S3_SECRET'] = 'MYSECRETKEY'
ENV['S3_BUCKET'] = 'this-is-my-bucket'
When you run your Rails app (in development, say), the config file is loaded automatically, so those credentials are available through ENV (e.g. ENV['S3_KEY']).
This works differently when you deploy your application: on Heroku, for instance, you would set those values as config vars.
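For the Heroku case, the equivalent of the config file above is a one-time CLI command (the key names mirror the config/s3_config.rb example; the values are obviously placeholders):

```shell
# Sketch: set the same values as Heroku config vars instead of a file,
# so ENV['S3_KEY'] etc. resolve the same way in both environments.
heroku config:set S3_KEY=MYS3KEY S3_SECRET=MYSECRETKEY S3_BUCKET=this-is-my-bucket
```

Either way, application code only ever reads ENV, so nothing sensitive needs to live in the git repo.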
I'm trying to upload a file using paperclip in a production environment in Heroku and the log files show:
Errno::EACCES (Permission denied - /app/public/system/photos/1/small/081811-2012-honda-cbr1000rr-leaked-003.jpg):
Will I have to use s3 or similar to handle file uploads, or can I configure path permissions to store the files on Heroku?
Yes, Heroku does not allow you to add files dynamically on its server.
If you need upload functionality in an app on Heroku, you need to configure S3 or another similar service.
Refer to this for details:
http://devcenter.heroku.com/articles/read-only-filesystem
Yes, you must use S3 or another persistent store like Rackspace Cloud Files, Dropbox, etc.
Whilst you can write to tmp on all the stacks, the Cedar stack does let you write to the filesystem, but it's not shared across dynos and doesn't survive dyno stops/restarts.
See http://devcenter.heroku.com/articles/dyno-isolation#ephemeral_filesystem
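For completeness, here is a hedged Paperclip-to-S3 sketch (assuming the paperclip and aws-sdk gems; the model, attachment, and env var names are placeholders, not from the question):

```ruby
# app/models/photo.rb (sketch)
class Photo < ActiveRecord::Base
  has_attached_file :image,
    storage: :s3,
    s3_credentials: {
      access_key_id:     ENV['S3_KEY'],     # placeholder env vars
      secret_access_key: ENV['S3_SECRET']
    },
    bucket: ENV['S3_BUCKET'],
    path: '/:class/:id/:style/:filename'
end
```

With storage set to :s3, Paperclip never touches /app/public/system, so the EACCES error goes away.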
Yeah, it is true that Heroku does not allow you to upload files directly onto their servers. However, you can shove your files into your database. I used a gem created by Pat Shaughnessy:
http://patshaughnessy.net/2009/2/19/database-storage-for-paperclip
It worked well.
Can you install git on Amazon and push assets (js, css, img) easily? Something like Heroku but with assets and S3 would be awesome.
Some people seem to use JungleDisk to sync a local git directory to S3, but that's too bulky. I tried installing jgit on a Mac but to no avail, and that thing looks ancient. Is there anything else, or does this type of thing just not work?
I've tried most of the gems out there for heroku asset deployment/optimization, but they all either require you to host them on heroku, or to run a rake task, which is not ideal.
You can use s3cmd sync LOCAL s3://BUCKET/PREFIX to accomplish this. It is best if each asset has a version number in its name. Otherwise you can have weird issues such as the browser caching version 1 of the CSS while using version 2 of the JavaScript, causing undefined behavior.
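On the versioning point, a minimal Ruby sketch of content-fingerprinted names (the helper name is mine; the Rails asset pipeline does the equivalent with MD5 digests):

```ruby
require 'digest'

# Build a cache-safe, versioned file name from the file's contents,
# so that syncing to S3 creates a new key whenever the content changes.
def fingerprinted_name(path)
  digest = Digest::MD5.file(path).hexdigest
  ext    = File.extname(path)
  "#{File.basename(path, ext)}-#{digest}#{ext}"
end
```

Uploading assets under names like app-5f3c….css means a given HTML page always references a matched CSS/JS pair, avoiding the mixed-version caching problem described above.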
How about setting up an EC2 instance, mounting the S3 bucket on there (e.g. with a tool like s3fs), and setting up a bare git repository on that mount to push to?
The tools recommended by the other answers are out of date.
This one is up to date: https://github.com/schickling/git-s3
See Jammit S3; it's a great solution (zip + cloud):
https://github.com/railsjedi/jammit-s3
The problem with Jammit S3 is that you cannot manage versions of your files properly. It might take end users a while to get the most up-to-date JS/CSS files.
You might find this useful, in case you use PHP for your application (otherwise this code can be adapted to your needs): https://github.com/SupersonicAds/git-hook-php-s3-files-revisions
Jammit S3 didn't have the control I was looking for, so I wrote my own CLI script:
https://github.com/bradt/git-deploy-s3