I was trying to implement a multipart upload using the S3UploadPartRequest class. However, I was wondering how to deal with the following situation:
During the multipart upload, my credentials change for some reason, such as an ExpiredToken error. Here is my question: should I start the upload over, or are the parts I already uploaded still valid?
It turns out that the parts I already uploaded are still valid; I just need to upload the rest of them.
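As a minimal sketch of that resume logic (the helper name is illustrative, not from any SDK): after refreshing credentials, ask S3 which parts it already has via ListParts, then upload only the missing ones under the same upload ID.

```python
def missing_parts(total_parts, uploaded_part_numbers):
    """Given the total number of parts and the part numbers S3 already
    reports via ListParts, return the part numbers still to upload.
    S3 part numbers start at 1."""
    uploaded = set(uploaded_part_numbers)
    return [n for n in range(1, total_parts + 1) if n not in uploaded]

# Example: a 10-part upload interrupted after parts 1-6 and 8 completed.
remaining = missing_parts(10, [1, 2, 3, 4, 5, 6, 8])  # → [7, 9, 10]
```

After the remaining parts finish, CompleteMultipartUpload assembles all parts, old and new, into one object.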
Related
I'd like to implement uploading a profile picture for users. I'm using a VueJS frontend with a Rails API. What I'm trying to do is upload the image using only the frontend. I'd like the file to get uploaded without any API calls. I could then store the location of the file in the picture attribute in the backend and retrieve it. Is that possible? I'm also using the Element library.
<el-upload :http-request="addAttachment">
<el-button size="small" type="primary">Click Upload</el-button>
</el-upload>
What you are looking for is called direct uploads, or browser-based uploads.
There should be support from the storage service you are using; for example, it is possible with both S3 and GCS.
Upload without any API calls? I'm not sure. I once had to make a small API call to get a signature key and use it with POST params to upload the file to the storage service (GCS).
Once the upload response is returned, you might then want to write the file path to the database.
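To sketch what that small signing call does on the server (legacy signature-v2 style POST policy signing, which both S3 and GCS support for browser uploads; the bucket name and secret key below are placeholders): the server base64-encodes a policy document and signs it, and the browser then sends both values as form fields alongside the file.

```python
import base64
import hashlib
import hmac
import json

def sign_post_policy(policy_dict, secret_key):
    """Base64-encode a POST policy document and sign it with
    HMAC-SHA1 (legacy AWS signature v2 style). The browser includes
    both returned values as form fields in its POST to the bucket."""
    policy_b64 = base64.b64encode(json.dumps(policy_dict).encode("utf-8"))
    signature = base64.b64encode(
        hmac.new(secret_key.encode("utf-8"), policy_b64,
                 hashlib.sha1).digest())
    return policy_b64.decode(), signature.decode()

# Placeholder policy: allow uploads under "uploads/" until the expiry time.
policy = {
    "expiration": "2024-01-01T00:00:00Z",
    "conditions": [
        {"bucket": "my-bucket"},               # placeholder bucket name
        ["starts-with", "$key", "uploads/"],
    ],
}
policy_b64, signature = sign_post_policy(policy, "FAKE_SECRET_KEY")
```

The frontend never sees the secret key; it only receives the opaque policy and signature, which is why one small API call is still needed.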
I'm trying to make an app where I take pictures from users, add them to a canvas, draw on them, then convert them to a base64 string and upload them.
For this purpose I'm considering using a CDN, but I can't find information on what I can upload to one or how client-side uploading works. I'd like to be able to send the image as base64, along with the name to give the file, so that when it arrives at the CDN origin, the base64 image is decoded and saved under the specified name (which I will add to the database on the server). Is this possible? Can I have some kind of save.php file on the origin where I write my logic to save the file, and to which I'll send XHR requests? Or how does this whole thing work? I know this question may sound trivial, but I've been looking for hours and still haven't found anything that explains in detail how client-side uploading works for CDNs.
CDNs usually do not provide such an upload service for the client side, so you cannot do it this way. Typically you upload to your own origin server (or directly to a storage service such as S3), and the CDN serves the file from that origin.
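The save-under-a-name logic you describe would therefore live on your origin server, not on the CDN. A minimal sketch of that endpoint's core (function name and directory are illustrative): decode the data URL the client sends and write it under the requested name.

```python
import base64
import os
import tempfile

def save_data_url(data_url, name, upload_dir):
    """Decode a 'data:image/png;base64,...' string sent by the client
    and save it under the given name on the origin server. Returns the
    saved path, which you would then record in the database."""
    _header, _, b64_data = data_url.partition(",")
    raw = base64.b64decode(b64_data)
    path = os.path.join(upload_dir, name)
    with open(path, "wb") as f:
        f.write(raw)
    return path

# Usage: decode a tiny fake payload into a temp directory.
tmp = tempfile.mkdtemp()
payload = "data:image/png;base64," + base64.b64encode(b"\x89PNG fake").decode()
saved = save_data_url(payload, "drawing.png", tmp)
```

A real endpoint would also validate the name (to prevent path traversal) and the content type before writing.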
Based on this sample, http://aws.amazon.com/articles/0006282245644577, it is clear how to do a multipart upload using the AWS iOS SDK. However, it seems that my parts are not stitched together correctly when I try to resume an interrupted upload.
I use the code below to resume an upload. Is this the correct way to set the upload id of a multipart upload?
S3InitiateMultipartUploadRequest *initiateRequest = [[S3InitiateMultipartUploadRequest alloc] initWithKey:key inBucket:bucketName];
S3InitiateMultipartUploadResponse *initiateResponse = [self.amazonS3Client initiateMultipartUpload:initiateRequest];
self.multipartUpload = [initiateResponse multipartUpload];
// Set upload id to resume upload
self.multipartUpload.uploadId = uploadId;
I'd appreciate any help or pointers.
Your code should be robust enough to track which parts were uploaded. The part uploads of a multipart upload can be done in many ways: in parallel, in a multithreaded manner, or one after the other in sequence.
Whichever approach you take, you can use the ListParts API to determine how many parts were successfully uploaded. Since you already have the upload ID, your design must support continuing from the next part upload:
GET /ObjectName?uploadId=UploadId HTTP/1.1
Host: BucketName.s3.amazonaws.com
Date: Date
Authorization: Signature
Another useful resource to help optimize multipart uploads: http://aws.typepad.com/aws/2010/11/amazon-s3-multipart-upload.html
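To make the ListParts step concrete, here is a stdlib-only sketch (Python rather than Objective-C, for brevity) of parsing a ListParts response to find where to resume; the XML below is a trimmed example of the documented response shape, not real output.

```python
import xml.etree.ElementTree as ET

NS = "{http://s3.amazonaws.com/doc/2006-03-01/}"

def next_part_number(list_parts_xml):
    """Return 1 + the highest PartNumber in a ListParts response,
    i.e. the part to resume from (assumes parts were uploaded in
    sequence, as in the AWS iOS SDK sample)."""
    root = ET.fromstring(list_parts_xml)
    numbers = [int(p.findtext(NS + "PartNumber"))
               for p in root.iter(NS + "Part")]
    return max(numbers, default=0) + 1

# Trimmed example response: parts 1 and 2 were already uploaded.
sample = """<ListPartsResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Part><PartNumber>1</PartNumber><Size>5242880</Size></Part>
  <Part><PartNumber>2</PartNumber><Size>5242880</Size></Part>
</ListPartsResult>"""
resume_at = next_part_number(sample)  # → 3
```

Note that re-initiating the upload (as in the question's code) creates a brand-new upload ID; to resume, you must reuse the stored upload ID and simply continue uploading parts against it.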
So here's the thing. I believe my case is pretty particular at this point, and some help from the experts is highly advisable.
I have an API built on Rails (3.2.6), and what I want to be able to do is receive a video file (mostly .mp4, .avi) and upload it to S3 through a process queue (using Resque).
Now I'm kind of lost on how to do this. To my understanding, I would receive a byte[] (array of bytes) for the video through the request, and then send that as a param to my Resque job in order to upload it (Resque job params can only be strings, not objects)?
Has anyone had any experience with this sort of procedure? We're pretty much trying to mimic the create_video method at http://docs.brightcove.com/en/media/, where a Video object can be created either by sending the file directly in the request or by sending a link to the file.
Any suggestions?
Try using the CarrierWave gem. Allow clients to HTTP POST the file data to your API, then save that file data on the backend and upload it to S3 using CarrierWave.
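Since queue job params must be plain strings (Resque serializes args to JSON), the usual pattern is: persist the raw bytes to a file first, then enqueue only the path. A sketch of that hand-off (Python standing in for the Rails controller, with a plain list standing in for the queue backend; all names are illustrative):

```python
import json
import os
import tempfile

def enqueue_video_upload(video_bytes, queue):
    """Persist the raw request body to a temp file and enqueue only the
    path as a string, since job params cannot be binary objects. The
    worker later reads the file from this path and uploads it to S3."""
    fd, path = tempfile.mkstemp(suffix=".mp4")
    with os.fdopen(fd, "wb") as f:
        f.write(video_bytes)
    queue.append(json.dumps({"job": "S3VideoUpload", "path": path}))
    return path

# Usage: a list stands in for the queue backend.
queue = []
path = enqueue_video_upload(b"fake video bytes", queue)
```

The worker would delete the temp file after the S3 upload succeeds, so the API process never blocks on the slow transfer.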
Amazon has multipart upload functionality where you can send a file in chunks and have it assembled on S3. This allows for some nice resume-like functionality when uploading to S3. From another question I got this nice link: Rails 3 & Plupload.
My question is does anyone have any examples where they used the plupload chunking feature with the Amazon multipart feature? Ideally with carrierwave & fog.
I can see it doing the following:
1. Generate a unique ID for the upload with plupload (can we hook an event when plupload starts?).
2. Attach an AJAX request, carrying the ID, to the chunk-completed event.
3. Have an AJAX controller method on the server which uploads the chunk to S3 using the ID.
4. When all chunks are complete, fire a controller action to reassemble.
There is supposedly some PHP code which does some combining, but not with S3, and I can't stand to read PHP.
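The server-side bookkeeping for steps 2-4 above can be sketched like this (Python rather than Rails, all names illustrative): map each plupload chunk to an S3 part number and track completion per upload ID, triggering reassembly once every chunk has arrived.

```python
def chunk_to_part(chunk_index):
    """plupload chunk indexes start at 0; S3 part numbers start at 1."""
    return chunk_index + 1

class ChunkTracker:
    """Track which chunks of a plupload transfer have been relayed to
    S3 as parts, keyed client-side by the unique upload ID. When
    complete() turns True, the controller would call S3's
    CompleteMultipartUpload to reassemble the object."""

    def __init__(self, total_chunks):
        self.total_chunks = total_chunks
        self.done = set()

    def mark_done(self, chunk_index):
        self.done.add(chunk_index)

    def complete(self):
        return len(self.done) == self.total_chunks

# Usage: three chunks arrive (possibly out of order), then reassemble.
tracker = ChunkTracker(total_chunks=3)
for i in (2, 0, 1):
    tracker.mark_done(i)
```

Note one constraint when pairing the two features: every S3 part except the last must be at least 5 MB, so the plupload chunk size needs to be set accordingly.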
This is very similar; you should find it interesting.
Enjoy, and feel free to fork, send a pull request, and so on.
You can find simple core Java code without the AWS library; this will help you implement it in any technology:
http://dextercoder.blogspot.in/2012/02/multipart-upload-to-amazon-s3-in-three.html