I'm trying to wrap my head around CloudFront. We notice some video sites don't allow us to download the video, i.e. there is no physical link to the file. Or at least, I am not able to locate one in the Flash player's source code using Firebug.
On some sites, a typical block of code could look like the following:
<object width="496" height="24" type="application/x-shockwave-flash" id="media_player" name="media_player" data="/flash/jwplayer/player.swf" ....>
<param name="flashvars" value="file=http://some_bucket_name.s3.amazonaws.com/uploads/users/1/foo.mp3&title=Test&author=Foobar&plugins=&autostart=true&controlbar=bottom&repeat=none&screencolor=000000">
</object>
Above, you can see from the HTML source that the file can be 'cleverly' downloaded through the physical link: http://some_bucket_name.s3.amazonaws.com/uploads/users/1/foo.mp3.
I understand what a CDN is. A good explanation can be found here.
If we use CloudFront, will this prevent end users from 'cleverly' downloading media files directly from our app, since the files will be streamed?
As Wukerplank suggested: "You can make it difficult, but you can't make it impossible."
Actually, all CloudFront does is cache your content so that requests don't have to hit S3/EC2 directly. It's not designed for adding security, but for caching and speed. Here's something that should make it harder to download the content behind the link: How do I prevent hotlinking on Amazon S3 without using signed URLs? (checking for a correct referrer).
Using RTMPE adds another layer of protection. Most download apps have difficulty with it but, as Wukerplank says, nothing is bulletproof.
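The linked question is about avoiding signed URLs, but if you can use them, expiring signed URLs are the usual compromise: a link scraped from the page source stops working after a while. A rough sketch with the AWS::S3 gem (hypothetical credentials; bucket and key taken from the question):

require 'aws/s3'

# Hypothetical credentials - substitute your own.
AWS::S3::Base.establish_connection!(
  :access_key_id     => 'YOUR_ACCESS_KEY',
  :secret_access_key => 'YOUR_SECRET_KEY'
)

# The URL stops working after 600 seconds, so a scraped link soon goes stale.
url = AWS::S3::S3Object.url_for('uploads/users/1/foo.mp3',
                                'some_bucket_name',
                                :expires_in => 600)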
I have a requirement that the admin panel of Umbraco needs to be able to upload large video files, such as 2 GB or 4 GB. I heard that .NET has an upload limit of 2 GB or 4 GB (correct me if I'm wrong). So I already changed maxAllowedContentLength as below:
<!-- Max file size limitation -->
<security>
  <requestFiltering>
    <requestLimits maxAllowedContentLength="4257286400" />
  </requestFiltering>
</security>
and because there are also limits on request duration and upload size, I adjusted the execution time as well:
<httpRuntime maxRequestLength="4257286400" executionTimeout="9999999" requestValidationMode="2.0" enableVersionHeader="false" targetFramework="4.5" />
Now the problem is that when I try to upload a 1 GB file, the page just keeps loading and nothing happens. When I turn on debug logging, I can see that the request is being canceled because of the large file. I don't know how to solve this, and I really need users to be able to upload large files.
Uploading the file to a different site like YouTube or Vimeo is not an option; I really need to upload it to the site directly. Is there a way to achieve this in Umbraco?
Note: I have already tried some of the available packages, but none worked:
Import Media - Already installed this, but I don't know how it works. There are no new features in the admin panel, so I don't know what to look for here.
Would really appreciate some help. I just want to be able to upload large files in Umbraco.
Just to summarize:
How to upload large files (2 GB and up) in the Umbraco back office, since my current settings are not allowing me to do so.
How to use the Import Media package, if anyone has had a chance to use it.
Looks like the issue is caused by the two attributes using different units:
1) "maxAllowedContentLength" specifies the maximum length of content in a request, in bytes (uint): https://learn.microsoft.com/en-us/iis/configuration/system.webserver/security/requestfiltering/requestlimits/
2) "maxRequestLength" sets the maximum request size in kilobytes(Int32):
https://learn.microsoft.com/en-us/dotnet/api/system.web.configuration.httpruntimesection.maxrequestlength?view=netframework-4.8
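In other words, the 4257286400 that is valid for maxAllowedContentLength (bytes, roughly 3.96 GB) can't just be copied into maxRequestLength: there it would mean kilobytes, and it also overflows Int32 (maximum 2147483647), so the setting can't take effect as written. A sketch of matching values for a roughly 4 GB ceiling (4257286400 bytes is about 4157507 KB; adjust to your needs):

<!-- maxAllowedContentLength is in bytes; maxRequestLength is in kilobytes -->
<requestLimits maxAllowedContentLength="4257286400" />
<httpRuntime maxRequestLength="4157507" executionTimeout="9999999" requestValidationMode="2.0" enableVersionHeader="false" targetFramework="4.5" />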
What kind of error message are you getting about uploads "being canceled due to large files"?
AFAIK Umbraco doesn't impose these kinds of limits by itself; it's all managed by IIS via the settings you've already found.
However, I do seem to recall that machine.config may also set some of these properties and thus trump your local website settings. So that's something to check if you have access to that file (it'll be somewhere like C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Config). Otherwise you might want to contact your hosting administrator/support and ask if there's a global limit set.
I want to prevent users from uploading a shell (exploit) to my host. I remember FCKeditor had a few bugs that allowed a hacker to upload files to the server. Is there a similar issue with CKEditor?
How can I trust users' files and make sure they aren't fakes? For example, a hacker can edit the inside of a PDF: the file has a PDF extension and content type but contains malicious code.
Is using htmlencode/htmldecode enough to prevent XSS attacks?
CKEditor doesn't include any file upload support; you have to add that part yourself.
Again, CKEditor doesn't have that part. They sell CKFinder to fill that role, and it has some checks to verify that an uploaded file is safe, but you must be very careful about which users you allow to upload files to your server.
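One common mitigation for the "fake PDF" case from the question is to check the file's magic bytes on the server instead of trusting the extension or the client-sent content type. A minimal sketch (Ruby, hypothetical helper; real PDFs start with the bytes %PDF-). Note this only verifies the type: a well-formed PDF can still carry malicious content, which is why you must still be careful about who may upload.

# Hypothetical server-side check: inspect the file contents, not the extension.
def looks_like_pdf?(path)
  File.open(path, 'rb') { |f| f.read(5) } == '%PDF-'
end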
No. If you're using a WYSIWYG editor you are not going to htmlencode the provided data, and other basic tricks aren't enough either. You need a full sanitizer like HTMLPurifier.
I'm writing a Rails application that serves files stored on a remote server to the end user.
In my case the files are stored on S3, but the user requests the file via the Rails application (hiding the actual URL). If the file were on my server's local file system, I could use the Apache header X-Sendfile to free up the Ruby process for other requests while Apache took over the task of sending the file to the client. But in my case - where the file is not on the local file system, but on S3 - it seems I'm forced to download it temporarily inside Rails before sending it to the client.
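(For contrast, the local-file case I mean looks roughly like this - a hypothetical controller action, with Apache's mod_xsendfile enabled:)

# Rails only sets a header and returns immediately;
# Apache (mod_xsendfile) picks up the actual file transfer.
def download
  response.headers['X-Sendfile'] = '/var/files/report.pdf' # hypothetical local path
  head :ok
end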
Isn't there a way for Apache to serve a "remote" file to the client that is not actually on the server itself? I don't mind if Apache has to download the file for this to work, as long as I don't have to tie up the Ruby process while it's going on.
Any suggestions?
Thomas, I have similar requirements/issues and I think I can answer your problem. First (and I'm not 100% sure you care about this part), hiding the S3 URL is quite easy, as Amazon allows you to point a CNAME at your bucket and use a custom URL instead of the Amazon one. To do that, you need to point your DNS at the correct Amazon URL. When I set mine up it was similar to this: files.domain.com points to files.domain.com.s3.amazonaws.com. Then you need to create the bucket with the name of your custom URL (files.domain.com in this example).
How to call that URL will differ depending on which gem you use, but a word of warning: the attachment_fu plugin I was using was incorrectly sending me to files.domain.com/files.domain.com/name_of_file.... I couldn't find the setting to fix it, so a simple .sub method on the S3 portion of the plugin fixed it.
On to your other questions: to execute some Rails code (like recording the hit in the DB) before the download, you can simply do this:
def download
  file = File.find(...) # look up the file record however your app stores it
  # code to record 'hit' to database
  redirect_to S3Object.url_for(file.filename,
                               bucket,
                               :expires_in => 3.hours)
end
That code will still cause the file to be served by S3, but it also gives you the chance to run some Ruby first. (Of course the above code won't work as-is; you will need to point it at the correct file and bucket, and my Amazon keys are saved in a config file. The above also uses the syntax of the AWS::S3 gem - http://amazon.rubyforge.org/.)
Second, the Content-Disposition: attachment issue is a bit trickier. Hopefully your situation is a bit simpler than mine and the following solution will work. Assuming the object 'file' (in this example) is the correct S3 object, you can set the disposition to attachment with:
file.content_disposition = "attachment"
file.save
The above code can be executed after the file already exists on the S3 server (unlike some other headers and permissions), which is nice, and it can also be run when you upload the file (the syntax depends on your plugin). I'm still trying to find a way to tell S3 to send the file as an attachment only when requested (not every time), and if you find one, please let me know your solution. I need to sometimes download a file and other times embed it (an image, for example) in HTML. I'm not using the above-mentioned redirect, but fortunately it seems that if you embed a file that carries the Content-Disposition: attachment header (in an HTML image tag, for example), the browser still displays the image normally (though I haven't thoroughly tested that across enough browsers to send it out into the wild).
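One lead I haven't tried yet: S3 signed GET requests can carry a response-content-disposition query parameter that overrides the stored disposition for that one request only (the parameter has to be part of what gets signed, so you can't just append it by hand). With the newer aws-sdk-s3 gem, a hypothetical sketch (bucket and key made up):

require 'aws-sdk-s3'

# Hypothetical bucket and key; the override applies to this signed URL only.
obj = Aws::S3::Resource.new(:region => 'us-east-1')
                       .bucket('my-bucket')
                       .object('uploads/users/1/foo.mp3')
url = obj.presigned_url(:get,
                        :expires_in => 3600,
                        :response_content_disposition => 'attachment')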
Hope that helps! Good luck.
I signed up for Amazon's S3 service.
I am having a problem with my gallery script. It wants the URL relative to the server where the files are located.
So instead of http://gallery.s3.amazonaws.com/10/images
They want: /home/www/gallery.s3.amazonaws.com/10/images
The problem is I don't know what to use when the files are on Amazon S3.
Anybody have a solution?
Thanks.
I think you will need to specify what gallery software you are using (Gallery 2, Gallery 3, Coppermine, etc.). Or are you writing your own script? If so, what language/platform?
Often, something expecting a physical path does not accept an HTTP URL unless HTTP support has been purposefully added.
I'm working on a Rails app that accepts file uploads and lets users modify these files later. For example, they can change the contents of a text file or perform basic manipulations on images, such as resizing, cropping, rotating, etc.
At the moment the files are stored on the same server where Apache is running with Passenger to serve all application requests.
I need to move user files to a dedicated server to distribute the load on my setup. At the moment our users upload around 10 GB of files a week, which is not a huge amount, but eventually it adds up.
And so I'm going through different options for implementing the communication between the application server(s) and a file server. I'd like to start with a simple and fool-proof solution; if it scales well later across multiple file servers, I'd be more than happy.
Here are some of the options I've been investigating:
Amazon S3. I find it a bit difficult to implement for my application. It adds the complexity of "uploading" the uploaded file again (possibly multiple times later); bear in mind that users can modify files and images with my app. Other than that, it would be a nice "set it and forget it" solution.
Some sort of simple RPC server that lives on the file server and transparently manages files when seen from the application server side. I haven't been able to find any standard, well-tested tools here yet, so this is a bit more theoretical in my mind. However, BERT and Ernie, built and used at GitHub, seem interesting, but maybe too complex just to start out with.
MogileFS also seems interesting. Haven't seen it in use (but that's my problem :).
So I'm looking for different (and possibly standards-based) approaches to how file servers for web applications are implemented, and how they have worked in the wild.
Use S3. It is inexpensive and a-la-carte, and if people start downloading their files, your server won't get stressed, because your download pages can point directly to the S3 URL of the uploaded file.
"Pedro" has a nice sample application that works with S3 at github.com.
Clone the application ( git clone git://github.com/pedro/paperclip-on-heroku.git )
Make sure that you have the right_aws gem installed.
Put your Amazon S3 credentials (API & secret) into config/s3.yml
Install the Firefox S3 plugin (http://www.s3fox.net/)
Go into Firefox S3 plugin and put in your api & secret.
Use the S3 plugin to create a bucket with a unique name, perhaps 'your-paperclip-demo'.
Edit app/models/user.rb and put your bucket name on the second-to-last line (:bucket => 'your-paperclip-demo').
Fire up your server locally and upload some files to your local app. You'll see from the S3 plugin that the file was uploaded to Amazon S3 in your new bucket.
I'm usually terribly incompetent or unlucky at getting these kinds of things working, but with Pedro's little S3 upload application I was successful. Good luck.
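For reference, the app/models/user.rb change mentioned above might look like this with classic Paperclip (the attachment name :avatar is just a guess at the demo's; the bucket is the one you created):

class User < ActiveRecord::Base
  # Paperclip storing on S3; credentials come from config/s3.yml as in the demo.
  has_attached_file :avatar,
                    :storage => :s3,
                    :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
                    :bucket => 'your-paperclip-demo'
end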
You could also try compiling a version of Dropbox (they provide the source) and ln -s that to your public/system directory so Paperclip saves to it. This way you can access the files remotely from any desktop as well... I haven't done this yet so I can't attest to how easy/hard/valuable it is, but it's on my TeuxDeux list... :)
I think S3 is your best bet. With a plugin like Paperclip it's very easy to add to a Rails application, and not having to worry about scaling it will save you headaches.