paperclip overwrites / resets S3 permissions for non-bucket-owners - ruby-on-rails

I have opened this as an issue on Github (http://github.com/thoughtbot/paperclip/issues/issue/225) but on the chance that I'm just doing this wrong, I thought I'd also ask about it here. If someone can tell me where I'm going wrong, I can close the issue and save the Paperclip guys some trouble.
Issue:
When using S3 for storage, and you want your bucket to remain accessible to other users to whom you have granted access, Paperclip appears to overwrite the permissions on the bucket, removing those users' access.
Process for duplication:
Create a bucket in S3 and set up a Rails app with Paperclip to use this bucket for storage
Add a user (for example, aws@zencoder.com, the user for the video encoding service Zencoder) to the bucket, and grant this user List and Read/Write permissions.
Upload a file.
Refresh the permissions. The user you added will be gone, and a user "Everyone" with read permissions will have been added.
The end result is that you cannot, so far as I can tell, retain desired permissions on your bucket when using Paperclip and S3.
Can anyone help?

Try explicitly setting :s3_permissions => :public_read
Seems to work for me.
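For reference, a minimal sketch of where that option goes in a Paperclip model; the model name, attachment name, credentials path, and bucket are placeholders, and depending on your Paperclip version the value may need to be the string 'public-read' rather than the symbol:
# app/models/upload.rb (model and attachment names are placeholders)
class Upload < ActiveRecord::Base
  has_attached_file :asset,
    :storage        => :s3,
    :s3_credentials => "#{Rails.root}/config/s3.yml",
    :bucket         => "your-bucket-name",
    :s3_permissions => :public_read  # the ACL Paperclip applies to each uploaded object
end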

Related

Does an S3 bucket need to be Public to serve user viewable images to an app?

At the moment my Rails 6 React app has user-uploaded images (avatars, profile wallpapers, etc.) stored in S3, inside a public bucket for local development (not facilitated by Active Storage because it was not playing nice with vips for image processing). The reason it's set to public was ease of setup; now that all of the functionality is complete, I would like to add sensible bucket policies for staging (and soon production). I don't currently have CloudFront set up but I do intend to add it in the near term; for right now I'm using the bucket asset URL to serve assets. I have created two separate buckets: one for images that will be displayed in the app, and one for content that is never to be publicly displayed and will be used for internal purposes.
The question I have is: for the content in the bucket reserved for viewable content, do I have to make it public (disable the setting in the AWS console that blocks public access), then create a policy that allows GET requests from anywhere (*), and then restrict POST, PUT, and DELETE requests to the ARN of the EC2 instance that's hosting the Rails application? The AWS documentation has confused me: it gives me the impression that you never want to enable public access to a bucket, and that policies alone are how you surface bucket content, but when I take that approach I keep getting access denied in the UI.
EDIT:
I'm aware that signed URLs can be used, but my current understanding is that there is a nontrivial speed hit to the UX of the application if you have to generate a signed URL for every image (this app is image heavy). There are also SEO concerns, given that all the image URLs would effectively be temporary.
Objects in Amazon S3 are private by default. You can grant access to an object in several ways:
A Bucket Policy that can grant access to everyone ('Public'), or to specific IP addresses or users
An IAM Policy on an IAM User or IAM Group that grants access to that user or group -- however, they would need to access it via an AWS SDK so the call can be authenticated (e.g. when an application makes a request to S3, it makes an authenticated API call)
An Access Control List (ACL) on the object, which can make the object public without requiring the bucket to be public
By using an Amazon S3 pre-signed URL, which is a time-limited URL that provides temporary access to a private object
Given your use-case, an S3 pre-signed URL would be the best choice since the content is kept private but the application can generate a link that provides temporary access to the object. This can also be done with CloudFront.
Generating the pre-signed URL only takes a few lines of code and does not involve an API call to AWS. It is simply creating a hash of the request using your Secret Key, and then appending that hash as a 'signature'. Therefore, there is effectively no speed impact of generating pre-signed URLs for all of your images.
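As a rough sketch of how little code this is with the aws-sdk-s3 gem (the region, bucket name, and key below are placeholders):
require 'aws-sdk-s3'

s3  = Aws::S3::Resource.new(region: 'us-east-1')
obj = s3.bucket('my-app-images').object('avatars/123.jpg')

# The signature is computed locally from your credentials; no request to AWS is made.
url = obj.presigned_url(:get, expires_in: 300) # URL is valid for 5 minutes
The result is the normal object URL with extra query parameters (signature, expiry) appended.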
I don't see how SEO would be impacted by using pre-signed URLs. Only actual web pages (HTML) are tracked in SEO -- images are not relevant. Also, the URLs point to the normal image, but have some parameters at the end of the URL so they could be tracked the same as a non-signed URL.
No, it does not have to be public. If you don't want to use CloudFront, the other option is to use S3 pre-signed URLs.

Rails 6 - Allow/Disallow file download as per user role for files on AWS S3

I have a Rails 6 app where registered users (owners) can upload files (images/videos) to S3, and the owner can then grant access to other users (invitations) to view their uploaded content.
Is there a way I can restrict file access so that only the owner can download their uploaded files (images/videos), with restrictions in place for non-owner/invited users? Videos/images should not be downloadable by simply right-clicking and saving them.
Note: the uploaded files also include large videos (both mp4 and HLS streaming), so other invited users can view them but should not be able to download them unless they are the owners/uploaders; the files are served from AWS CloudFront for videos and from S3 for images.
Associations are set up like this:
User has one role
User has many images/videos, each residing in their own folder on S3 (`bucket/user_id/image_slug/` or `bucket/user_id/video_slug/`)
User has many invitations (must be view-only access to the owner's files)
I'm not sure what the right approach is; it could be:
Update the ACL for the file if it's accessed by a non-owner, making it read-only?
Make all uploaded files public and disable public access for non-owners, but this will also block any direct access to the file.
Let me know what the best-suited approach is.
What you are trying to achieve needs groundwork on multiple levels:
Based on S3 security best practices, you should keep permissions on the S3 side to the minimum the app needs to provide the expected behavior.
S3 allows you to grant access to user-specific folders.
You should look into the access-granted gem to cover server-side restrictions. You should also look into client-side restrictions; a common technique is to disable the right mouse click.
Related:
How to download files without showing S3 URls
Top 7 security features for video streaming platforms
How Netflix protects its content
A possible option would be to generate signed URLs for your S3 objects and keep the "authorization" logic in your Rails app. This should be the default in ActiveStorage; if you're using CarrierWave, you need to set fog_public = false.
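For the CarrierWave route, a minimal initializer sketch assuming the fog-aws storage backend; the bucket, region, and env var names are placeholders:
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws'
  config.fog_credentials = {
    provider:              'AWS',
    aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
    region:                'us-east-1'
  }
  config.fog_directory = 'my-private-bucket'
  config.fog_public    = false                  # keep objects private...
  config.fog_authenticated_url_expiration = 60  # ...so uploader URLs are signed and expire after 60s
end
With fog_public set to false, calling the uploader's url method returns an expiring signed URL instead of a plain public one.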
Option 1:
In the view where you are displaying the button to download the file:
- if user.can_access_document?(document)
= link_to 'View', document.attachment_url, target: :blank
- else
Request an invite from the owner
In your User model:
class User < ApplicationRecord
  has_many :invitations
  ...

  def can_access_document?(document)
    # Check if there is an invitation entry for this user to access said file
    self.invitations.where(document_id: document.id).any?
  end
end
Option 2
You could also have the button hit a particular endpoint/route in one of your controllers, like Files#download, then do the check there and redirect to the S3 file.
class FilesController < ApplicationController
  def download
    document = Document.find(params[:id])
    if current_user.can_access_document?(document)
      # Redirect to the (signed) S3 URL for the attachment
      redirect_to document.attachment_url
    else
      raise "You don't have access to this document"
    end
  end
end
In your view just direct the link to your new controller
= link_to 'View', download_files_url(document), target: :blank
As the URLs for the files are signed, the user needs to click the button on your website and cannot just reuse the same URL over and over. These signed URLs can also have a dynamic expiry time; for example, if a link is set to expire after 60 seconds and the user clicks it after those 60 seconds have elapsed, S3 will display an error saying the link has expired, something like this:
<Error>
<Code>AccessDenied</Code>
<Message>Request has expired</Message>
<Expires>2018-06-28T07:13:14Z</Expires>
<ServerTime>2018-08-06T20:03:02Z</ServerTime>
<RequestId>87E1D2CFAAA7F9A6</RequestId>
<HostId>
A9BEluTV2hk3ltdFkixvQFa/yUBfUSgDjptwphKze+jXR6tYbpHCx8Z7y6WTfxu3rS4cGk5/WTQ=
</HostId>
</Error>

Making a request to an ActiveStorage resource fails occasionally when used alongside Apartment gem

Use-case
We're using the Apartment gem alongside ActiveStorage.
In Amazon S3, we have created a bucket for every tenant in our application.
Since ActiveStorage can hold only one bucket globally, we switch the ActiveStorage bucket on every request.
The problem
When a request from one tenant is being served and a new request from another tenant comes in, the bucket held by ActiveStorage gets switched to that of the new tenant.
Now, if the first tenant's request tries to access a resource through ActiveStorage, it fails because the request is made with the wrong bucket name.
Current workaround
A fallback mechanism that checks whether the generated ActiveStorage URL is valid. If it's invalid, change the bucket name held by ActiveStorage to that of the current tenant and make a new request.
Is there a better way of solving this problem?

Using s3 in a healthcare application, private links

We develop a Rails-based healthcare application. What is the best way to configure our S3 implementation so that only the authenticated user has access to the image?
From the documentation, you should use one of Amazon's "canned" ACLs.
Amazon accepts the following canned ACLs:
:private
:public_read
:public_read_write
:authenticated_read
:bucket_owner_read
:bucket_owner_full_control
You can specify the ACL at bucket creation or update it on an existing bucket.
# at create time, defaults to :private when not specified
bucket = s3.buckets.create('name', :acl => :public_read)
# replacing an existing bucket ACL
bucket.acl = :private
Wanted to post an updated answer to this question, as the S3 API has changed (slightly) since 2015. Here's a link to the updated ACL section of the S3 docs. Further, the above answer reflects the use of the Ruby SDK, which not everyone uses.
Canned ACLs are predefined grants supported by S3 that have specific grantees and permissions in place. Canned ACLs can be sent via the SDK, as demonstrated in the above answer, or in an HTTP request by using the x-amz-acl request header for new resources, or with the request header or body for existing resources.
The canned ACLs are as follows. Unless otherwise specified, the bucket owner has FULL_CONTROL in addition to the other permissions listed:
private: No other user is granted access (default)
public-read: AllUsers group gets READ access
public-read-write: AllUsers group gets READ and WRITE access (not recommended)
aws-exec-read: Amazon EC2 gets READ access to GET an Amazon Machine Image (bundle)
authenticated-read: AuthenticatedUsers group gets READ access
bucket-owner-read: Bucket owner gets READ access. Ignored when creating a bucket
bucket-owner-full-control: Both object and bucket owner get FULL_CONTROL over object. Ignored when creating a bucket
log-delivery-write: LogDelivery group gets WRITE and READ_ACP permissions
Also noted in the docs: you can specify only one canned ACL in your request.
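For completeness, a sketch of sending a canned ACL with the current Ruby SDK (aws-sdk-s3 v3); the region, bucket, key, and file names are placeholders:
require 'aws-sdk-s3'

s3 = Aws::S3::Client.new(region: 'us-east-1')

# Upload a new object with a canned ACL (sent as the x-amz-acl request header)
s3.put_object(
  bucket: 'my-app-uploads',
  key:    'reports/2023-01.pdf',
  body:   File.open('report.pdf'),
  acl:    'bucket-owner-full-control'
)

# Replace the ACL on an existing object
s3.put_object_acl(bucket: 'my-app-uploads', key: 'reports/2023-01.pdf', acl: 'private')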

rails devise with amazon s3 how to limit usage per user

I want to set up a video & image viewing feature on my site. My idea is to store all videos and images on Amazon S3. I know I should use Devise to set up user sign-up, but I still have a few concerns about security and usage charges:
1. Is Devise safe?
2. How can I guarantee that only signed-in users can access the videos/images on my Amazon S3, and only via my site?
3. This is the most difficult problem: can we keep track of a user's usage? Say I don't want each user on my site accessing more than 100 MB/day of content from S3; is there any way to achieve this?
Thank you in advance!
Devise is most certainly a framework that will allow you to use best practices to authorize and authenticate users (e.g. by doing things like using very strong encryption methods when storing passwords). But "safe" is a little subjective -- think of Devise as a very good toolbox that will allow you to easily do things that will make your site safe.
Guaranteeing that users will only access data via your site means that you cannot use the default S3 permissions that make content in S3 buckets readable by all; S3 itself is fairly basic in terms of permissions. Instead, consider a gem like CarrierWave, which makes it easy to move files around, including streaming a file from S3 through your server to the user, thus giving you hooks to authenticate by user. This is also a hook for measuring the number of bits transferred.
If I recall, CarrierWave (or maybe Fog?) gives you a way to query S3 buckets similarly to how you would a filesystem, so you can check for size.
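As a rough illustration of that streaming-through-the-app idea, a hedged sketch follows; everything here is hypothetical (the Asset model with a CarrierWave uploader, the bytes_served_today counter column on User, and the 100 MB limit):
# Hypothetical controller that streams a private S3 file through the app,
# so access can be authenticated with Devise and the bytes counted per user.
class AssetsController < ApplicationController
  before_action :authenticate_user!   # Devise

  DAILY_LIMIT = 100.megabytes

  def show
    asset = current_user.assets.find(params[:id])   # hypothetical Asset model with a CarrierWave uploader

    if current_user.bytes_served_today + asset.file.size > DAILY_LIMIT
      return head :too_many_requests
    end

    data = asset.file.read                          # CarrierWave reads the private object from S3
    current_user.increment!(:bytes_served_today, data.bytesize)

    send_data data, type: asset.content_type, disposition: 'inline'
  end
end
The bytes_served_today counter would need a scheduled daily reset; logging per-request rows and summing them per day is another option.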
