I am working with the Amazon S3 SDK for storing files on a cloud server, using CodePlex's ThreeSharp (http://threesharp.codeplex.com). I have successfully uploaded a file to the server; now I have to download it, and for that I have to use its URL, e.g. https://s3.amazonaws.com/MyBucket/Filename
I can download the uploaded file, but it comes out blank: if I upload a text file, it contains nothing after downloading, and the same happens with images and other files. I have read in the Amazon S3 documentation that I will have to make the object publicly readable (http://docs.amazonwebservices.com/AmazonS3/latest/gsg/OpeningAnObject.html), but I have no idea how to achieve this.
How can I accomplish the download functionality?
ThreeSharp is a desktop-based project, and I am working on a web-based application.
During file upload, set the proper ACL, e.g.:
// Mark the uploaded object as publicly readable.
AmazonS3 client = GetS3Client(); // your existing helper that builds the client
SetACLRequest request = new SetACLRequest();
request.BucketName = "my-new-bucket";
request.Key = "hello.txt";
request.CannedACL = S3CannedACL.PublicRead;
client.SetACL(request);
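Once the object is public-read, its URL can be fetched like any other HTTP resource. A minimal sketch in .NET (the URL is the one from the question; the local target path is a placeholder):

using System.Net;

// Plain HTTP download of a public-read S3 object; paths are placeholders.
using (var web = new WebClient())
{
    web.DownloadFile("https://s3.amazonaws.com/MyBucket/Filename", @"C:\temp\Filename");
}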
Amazon S3 provides a rich set of mechanisms for you to manage access to your buckets and objects.
Check this for details: Amazon S3 Bucket Public Access Considerations
Also, you can download an explorer for Amazon S3 (e.g. CloudBerry Explorer for Amazon S3), and then you can assign the appropriate rights to your buckets.
CloudBerry Explorer for Amazon S3: Data Access Feature:
Bucket Policy Editor
Create and edit conditional rules for managing access to the buckets and objects.
ACL Editor
Manage access permissions to any of your objects by setting up an 'Access Control List'. The ACL will also apply to all 'child objects' inside S3 buckets.
Also, you can do the same using the Amazon S3 admin console. For example, have you tried the following:
Right-clicking the object and clicking "Make public"?
Selecting the object and, in the Permissions section, checking "Open/Download"?
Edit: have you taken a look here:
How to set the permission on files at the time of Upload through Amazon s3 API
and here:
How to set a bucket's ACL on S3?
It might guide you in the right direction.
Is there a way I can upload files while keeping the bucket private and access control uniform?
I am trying to use CarrierWave with Fog for this purpose and followed the carrierwave gem instructions, but I receive the error "Cannot insert legacy ACL for an object when uniform bucket-level access is enabled.", which led me to make my bucket public and set access control to "fine grained". There were also some solutions suggesting signed URLs, but in that case I have to make a URL for every individual object.
All I want is to simply upload .pdf or .docx files to Google Cloud Storage without making the bucket public.
I have CSV files in a directory of an S3 bucket. I would like to use all of the files as a single table in Dremio; I think this is possible as long as each file has the same header/columns as the others.
Do I need to first add an Amazon S3 data source using the UI, or can I somehow add one as a source using the Catalog API? (I'd prefer the latter.) The REST API documentation doesn't provide a clear example of how to do this (or I just didn't get it), and I have been unable to find the "New Amazon S3 Source" configuration screen shown in the documentation, perhaps because I've not logged in as an administrator.
For example, let's say I have a dataset split over two CSV files in an S3 bucket named examplebucket within a directory named datadir:
s3://examplebucket/datadir/part_0.csv
s3://examplebucket/datadir/part_1.csv
Do I somehow set the S3 bucket/path s3://examplebucket/datadir as a data source and then promote each of the files contained therein (part_0.csv and part_1.csv) as a Dataset? Is that sufficient to allow all the files to be used as a single table?
It turns out that this is only possible for admin users; normal users can't add a source. To do what I proposed above, put the files into an S3 bucket that has already been configured as a Dremio source by an admin user, then promote the files or the folder to a dataset using the Dremio Catalog API, as sketched below.
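For illustration, a hypothetical sketch of the promotion call in C#. The endpoint shapes follow my reading of the Dremio Catalog API docs; the host, auth token, source name ("mysource"), and catalog id are all placeholders, so treat this as a starting point rather than a verified recipe:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PromoteDatadir
{
    static async Task Main()
    {
        using var http = new HttpClient { BaseAddress = new Uri("http://localhost:9047") };
        // Placeholder auth token obtained from a prior login call.
        http.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", "_dremio{AUTH-TOKEN}");

        // 1) Look up the folder first (GET /api/v3/catalog/by-path/mysource/datadir)
        //    and read the "id" field from the response JSON; a placeholder id is used here.
        string id = "dremio:/mysource/datadir";

        // 2) Promote the folder; Dremio then reads every CSV inside it as one table.
        string body = @"{
            ""entityType"": ""dataset"",
            ""id"": ""dremio:/mysource/datadir"",
            ""path"": [""mysource"", ""datadir""],
            ""type"": ""PHYSICAL_DATASET"",
            ""format"": { ""type"": ""Text"", ""fieldDelimiter"": "","", ""extractHeader"": true }
        }";

        HttpResponseMessage resp = await http.PostAsync(
            "/api/v3/catalog/" + Uri.EscapeDataString(id),
            new StringContent(body, Encoding.UTF8, "application/json"));
        Console.WriteLine(resp.StatusCode);
    }
}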
After looking over the Git repository and reading the web interface instructions, I also had a look at this nice tutorial. I saw that it is possible to create a folder with some data (in this case an image) like this:
...
_uploadRequest = [AWSS3TransferManagerUploadRequest new];
....
_uploadRequest.key = @"foldername/image.png";
...
Now my question is: how can I password-protect the folder (or "object", as Amazon calls them)?
I am asking because I saw in the web interface that it says:
When you add a file to Amazon S3, you have the option of including
metadata with the file and setting permissions to control access to
the file.
Password protection for an S3 bucket is not available. If you want to secure the S3 bucket, you either have to use signed URLs to access the S3 resources or keep the S3 bucket private.
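A minimal sketch of generating such a signed URL with the AWS SDK for .NET; the bucket name is a placeholder and the key is just the one from the question:

using System;
using Amazon.S3;
using Amazon.S3.Model;

class PresignExample
{
    static void Main()
    {
        var client = new AmazonS3Client(); // credentials/region come from the environment

        var request = new GetPreSignedUrlRequest
        {
            BucketName = "my-bucket",                 // placeholder bucket
            Key = "foldername/image.png",
            Expires = DateTime.UtcNow.AddMinutes(15)  // the link stops working after this
        };

        // Anyone holding this URL can GET the object until it expires;
        // the bucket itself stays private.
        string url = client.GetPreSignedURL(request);
        Console.WriteLine(url);
    }
}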
For my Rails application, I download a bunch of files from a remote URL to my application. I would like to upload them directly to Amazon S3, without needing a form to do the upload, since I will temporarily cache the downloaded file on the EC2 instance.
I would also like to retain the links to the files I uploaded so I can download them later.
I am essentially reposting the files I downloaded.
I looked around, but most of the solutions seem to involve a user uploading through a form to S3.
Is there a direct upload solution?
You can upload directly to S3 using the AWS SDK for Ruby. The easiest way is:
require 'aws-sdk'

s3  = Aws::S3::Resource.new(region: 'us-west-2')  # credentials come from the environment
obj = s3.bucket('bucket-name').object('key')      # destination bucket/key
obj.upload_file('/path/to/source/file')           # streams the local file to S3
obj.public_url  # keep this URL for later downloads (works if the object is readable)
Or you can find a couple of other options here.
You can simply use EvaporateJS to achieve this. You can also send an AJAX request to record the file name in your database after each upload. Though the JavaScript exposes a few details, your bucket is not vulnerable, since S3 lets you lock it down with a bucket policy.
Just change <AllowedOrigin>*</AllowedOrigin> to <AllowedOrigin>specificwebsite.com</AllowedOrigin> in production, as in the CORS rule sketched below.
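For context, that element lives in the bucket's S3 CORS configuration. A minimal sketch; the origin and the allowed methods here are assumptions, so adjust them to whatever your uploader actually sends:

<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>https://specificwebsite.com</AllowedOrigin>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>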
I'm using the s3-swf-upload-plugin in a Rails project to upload directly to S3. Pretty nifty, but I can't seem to figure out how to make the uploaded files public. S3 doesn't seem to have the concept of public "buckets". Any ideas?
S3 supports four main canned access policies for both buckets and objects.
Take a look at the Canned Access Policies section in the S3 Documentation.
Specifically:
private
public-read
public-read-write
authenticated-read
So in your case, you'll need to set the access policy on your bucket and uploaded files to public-read.
I use S3Fox for Firefox (http://www.s3fox.net/).
You can browse your S3 buckets, then right-click -> Edit ACL and set things to public.
You can also get the URL for the bucket in a similar fashion.
It is very simple to use.