Preventing upload vulnerabilities in CKEditor?

I want to prevent users from uploading a shell (exploit) to my host. I remember FCKeditor had a few bugs that allowed a hacker to upload files to the server. Is there a similar issue with CKEditor?
How can I trust user files and make sure they aren't fakes? For example, a hacker can edit the inside of a PDF file: it has the PDF extension and type but contains malicious code.
Is using htmlencode/htmldecode enough to prevent XSS attacks?

CKEditor doesn't include any file upload support; you have to add that part yourself.
Again, CKEditor doesn't have that part. They sell CKFinder to fill that role, and it has some checks to verify that an uploaded file is safe, but you must be very careful about which users you allow to upload files to your server.
No. If you're using a WYSIWYG editor you are not going to htmlencode the provided data, and other basic tricks aren't enough either. You need a full sanitizer like HTMLPurifier.
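HTMLPurifier is a PHP library; if your backend is Ruby/Rails, a whitelist-based sanitizer such as the sanitize gem plays the same role. A minimal sketch (the allowed tag and attribute lists are illustrative assumptions; tune them to what your editor actually produces):

    require "sanitize"   # https://github.com/rgrove/sanitize

    def clean_editor_html(dirty_html)
      Sanitize.fragment(dirty_html, {
        elements:   %w[p br strong em ul ol li a img],
        attributes: { "a" => %w[href], "img" => %w[src alt] },
        protocols:  { "a"   => { "href" => %w[http https mailto] },
                      "img" => { "src"  => %w[http https] } }
      })
    end

    clean_editor_html('<p onclick="alert(1)">hi<script>steal()</script></p>')
    # => "<p>hi</p>"  (event handlers and script contents are stripped)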


Remove unused files from ActiveStorage+DirectUpload

Consider the following example:
I have a form that includes a multiple-file input;
The input file uses ActiveStorage and DirectUpload to upload files automatically as soon as they are included;
After adding some files they are uploaded automatically;
I never click the submit button so those files are never used nor accessible anywhere;
Does Rails support some built-in mechanism for removing these files or is something we have to implement ourselves?
It seems rather trivial to perform a DoS by continuously uploading files until something breaks.
Update 1
I forgot to mention that the example I'm following uses a 3rd-party library (Dropzone in this case) and follows the example from the official documentation.
According to the documentation after a file upload we inject a hidden input field with the id of the uploaded blob.
I think Chiperific's answer is good: since DirectUpload is executed on submit, there is little time for the requests to fail.
I mention requests because, as far as I understand it, the process is like this:
The user selects a file from their computer and fills in the rest of the form.
DirectUpload uploads the file to the storage service.
The backend receives the body, attaches the blob, and either creates or updates a model.
So, what happens if the file upload is successful but model validation is not? You would end up with a file in storage without its corresponding model, or with a dirty one.
More information here: https://github.com/rails/rails/issues/31985
The answer then is no, Rails does not have a mechanism for removing these files automatically. I guess you could check whether the model creation/update was successful and remove the file manually if not.
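For what it's worth, here is a minimal sketch of that manual cleanup, assuming Rails 6+ where ActiveStorage::Blob has the unattached scope; the job name and the two-day grace period are assumptions:

    class PurgeUnattachedBlobsJob < ApplicationJob
      queue_as :default

      def perform
        # Leave a grace period so uploads belonging to forms still being
        # filled in are not deleted mid-flight.
        ActiveStorage::Blob
          .unattached
          .where("active_storage_blobs.created_at <= ?", 2.days.ago)
          .find_each(&:purge_later)
      end
    end

Run it on a schedule (cron, a recurring job, etc.) rather than per request.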
I think your premise is incorrect.
The input file uses ActiveStorage and DirectUpload to upload files automatically as soon as they are included;
According to the docs:
Active Storage, with its included JavaScript library, supports uploading directly from the client to the cloud.
and
That's it! Uploads begin upon form submission.
So the point of direct upload seems to be to bypass some Rails Active Storage handling and go straight to the storage service. BUT, it still doesn't happen until the form is submitted.
The example on the non-edge docs shows the user clicking "Submit" before the files are actually uploaded.
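To illustrate, a minimal sketch of the documented flow, assuming a Gallery model with has_many_attached :photos; with the bundled Active Storage JavaScript included, the direct upload to the service only starts when this form is submitted:

    <%# A minimal sketch; the Gallery model and :photos attachment are assumptions. %>
    <%# The direct upload request fires on form submission, not on file selection. %>
    <%= form_with model: @gallery do |form| %>
      <%= form.file_field :photos, multiple: true, direct_upload: true %>
      <%= form.submit "Upload" %>
    <% end %>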

Encrypting or protecting files stored in iOS app's NSDocumentsDirectory

I have a custom requirement in one of my products: I need to protect or encrypt files that are stored inside the NSDocumentsDirectory folder. Even if these documents are mailed to another person (the app has the ability to mail documents), he or she should not be able to open them without using my app (I will be using the "Open In" functionality for email attachments). So basically only the application can access these documents, and without the app the documents should be mere junk. Is there any way to do this, or has anyone done something like this before?
I also saw this but could not get a complete idea.
If you want a quick and easy method for data that doesn't need serious security, just zip the files with a fixed password.
ZipArchive is a good library for this.
For a more serious approach, check iOS - Protecting files with a custom encryption key?
The other post you mentioned works on the concept of password-protecting the files. I had encountered the same issue with our custom-defined files: our team encoded the contents of the file at random locations and saved it.
Only our application could decode it correctly, as we had the key :)
It was a Windows application, but the same approach would work here as well.

(Rails) Uploading Directories

I need to upload multiple files to my website.
But I don't just need a form for uploading multiple files; I need to upload whole directories.
How is this possible for the minimalist?
Yours, Joern.
According to my somewhat limited knowledge this is not possible: only individual files can be transferred, not directories.
Here are some workarounds, based on discussion on Velocity Reviews and another discussion:
upload a zip, which you unzip on the server side (see the sketch after this list)
upload directories over ftp (web page can be a front end to this)
upload files one by one
I would go for either zip or FTP. Note: someone might have produced a gem that enables uploading directories (I know of no such thing, but I would be happy to find out if there is one).
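As referenced in the zip option above, a minimal sketch of the server-side unzip step using the rubyzip gem; the uploaded param and destination directory are assumptions:

    require "zip"        # rubyzip gem
    require "fileutils"

    def extract_upload(uploaded_file, destination)
      Zip::File.open(uploaded_file.path) do |archive|
        archive.each do |entry|
          target = File.join(destination.to_s, entry.name)
          # Guard against "zip slip": skip entries that escape the destination.
          next unless File.expand_path(target).start_with?(File.expand_path(destination.to_s) + File::SEPARATOR)
          FileUtils.mkdir_p(File.dirname(target))
          entry.extract(target) { true } # block allows overwriting existing files
        end
      end
    end

    # In a controller action, something like:
    #   extract_upload(params[:archive], Rails.root.join("uploads", current_user.id.to_s))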
Adding another option to the list provided by Sorrow:
upload via REST/JSON
OK, this is a partial solution, but it does give you the opportunity to write a script that reads your directory and POSTS to your website.
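A minimal sketch of such a script using Ruby's standard Net::HTTP; the endpoint URL and the "file" field name are assumptions:

    require "net/http"
    require "uri"

    uri = URI("https://example.com/uploads")   # assumed endpoint

    Dir.glob("my_directory/**/*").select { |p| File.file?(p) }.each do |path|
      File.open(path, "rb") do |io|
        request = Net::HTTP::Post.new(uri)
        # "file" is an assumed form field name; adjust to match the server side.
        request.set_form([["file", io, { filename: File.basename(path) }]],
                         "multipart/form-data")
        response = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
          http.request(request)
        end
        puts "#{path}: #{response.code}"
      end
    end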

Where is the best place to save images from user uploads

I have a website that shows galleries. Users can upload their own content from the web (by entering a URL) or by uploading a picture from their computer.
I am storing the URL in the database, which works fine for the first use case, but I need to figure out where to store the actual images if a user does an upload from their computer.
Is there any recommendation here or best practice on where I should store these?
Should I save them in the appdata or content folders? Should they not be stored with the website at all because it's user content?
You should NOT store the user uploads anywhere they can be directly accessed by a known URL within your site structure. This is a security risk, as users could upload .htm and .js files. Even a file with the correct extension can contain malicious code that can be executed in the context of your site by an authenticated user, allowing server-side or client-side attacks.
See for example http://www.acunetix.com/websitesecurity/upload-forms-threat.htm and What security issues appear when users can upload their own files? which mention some of the issues you need to be aware of before you allow users to upload files and then present them for download within your site.
Don't put the files within your normal web site directory structure
Don't use the original file name the user gave you. You can add a content disposition header with the original file name so they can download it again as the same file name but the path and file name on the server shouldn't be something the user can influence.
Don't trust image files - resize them and offer only the resized version for subsequent download (see the sketch after this list)
Don't trust mime types or file extensions, open the file and manipulate it to make sure it's what it claims to be.
Limit the upload size and time.
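As referenced above, a minimal sketch of the resize/re-encode step using the mini_magick gem; the 1600px cap and forced JPEG output are assumptions:

    require "mini_magick"

    def reencode_upload(source_path, destination_path)
      image = MiniMagick::Image.open(source_path)
      image.validate!              # raises MiniMagick::Invalid if it's not an image
      image.strip                  # drop EXIF and other embedded metadata
      image.resize "1600x1600>"    # shrink only if larger than the cap
      image.format "jpg"           # force a known output format
      image.write destination_path
    end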
Depending on the resources you have to implement something like this, it is extremely beneficial to store all this stuff in Amazon S3.
Once you get the upload you simply push it over to Amazon and pop the URL in your database, as you're doing with the other images. As mentioned above, it would probably be wise to open the image and resize it before sending it over. This both checks that it is actually an image and makes sure you don't accidentally serve a full camera-resolution image to an end user.
Doing this now will make it much, much easier if you ever have to migrate/failover your site and don't want to sync gigabytes of image assets.
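A minimal sketch of that push-to-S3 step using the aws-sdk-s3 gem; the bucket name, region, key scheme and content type are assumptions:

    require "aws-sdk-s3"
    require "securerandom"

    def store_in_s3(local_path)
      s3  = Aws::S3::Resource.new(region: "us-east-1")             # assumed region
      key = "uploads/#{SecureRandom.uuid}#{File.extname(local_path)}"
      obj = s3.bucket("my-gallery-bucket").object(key)             # assumed bucket
      obj.upload_file(local_path, content_type: "image/jpeg")
      obj.public_url   # store this alongside the externally hosted image URLs
    end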
One way is to store the image in a database table with a varbinary field.
Another way would be to store the image in the App_Data folder, and create a subfolder for each user (~/App_Data/[userid]/myImage.png).
For both approaches you'd need to create a separate action method that makes it possible to access the images.
When accepting image uploads you need to verify the content of the file before storing it; checking the file extension is not trustworthy.
Use the magic number (file signature) to verify the file content, which is an easy way to do this.
See the Stack Overflow post and the list of magic numbers.
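A minimal sketch of such a check in Ruby; the signature list is deliberately short and illustrative, not exhaustive:

    IMAGE_SIGNATURES = {
      "\xFF\xD8\xFF".b      => "image/jpeg",
      "\x89PNG\r\n\x1A\n".b => "image/png",
      "GIF87a".b            => "image/gif",
      "GIF89a".b            => "image/gif"
    }.freeze

    def sniff_image_type(path)
      header = File.binread(path, 16).to_s      # the first few bytes are enough here
      IMAGE_SIGNATURES.each do |signature, mime|
        return mime if header.start_with?(signature)
      end
      nil   # unknown signature -> reject the upload
    end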
One way of saving the file is to convert it to binary and store it in the database; another is to use the App_Data folder.
Which storage option to use depends on your requirements. See this post as well.
Set an upload limit by adding the maxRequestLength attribute to Web.config like this, where the size is specified in KB (51200 KB is 50 MB):
<httpRuntime maxRequestLength="51200" executionTimeout="3600" />
You can keep your trusted data outside the htdocs/www folder (alongside it) so that users cannot access it directly by URL. If you are using Apache, you can also add .htaccess authentication on your trusted data (keep the .htpasswd file outside the htdocs/www folder as well).

Secure File Upload in Ruby On Rails

I built a photo gallery which uses Paperclip and validates the content-type using validates_attachment_content_type.
The application runs on a shared host with Passenger.
Is it possible to bypass the validation and run malicious scripts from the public/pictures directory? If so, is there anything I can do to prevent evil scripts from being uploaded or run?
Is it possible to bypass the validation and run malicious scripts from the public/pictures directory?
Yes. You can have a perfectly valid renderable image file that also contains HTML with script injection. Thanks for the bogus content-sniffing, IE, you have ruined everything.
See http://webblaze.cs.berkeley.edu/2009/content-sniffing/ for a summary.
If so, is there anything I can do to prevent evil scripts from being uploaded or run?
Not really. In theory you can check the first 256 bytes for HTML tags, but then you have to know the exact details of what browsers content-sniff for, and keeping that comprehensive and up-to-date is a non-starter.
If you are processing the images and re-saving them yourself that can protect you. Otherwise, do one or both of:
only serve user-uploaded files from a different hostname, so they don't have access to the cookie/auth details that would allow an injected script to XSS into your site. (but look out for non-XSS attacks like general JavaScript/plugin exploits)
serve user-uploaded files through a server-side script that includes the 'Content-Disposition: attachment' header, so browsers don't attempt to view the page inline (but look out for old versions of Flash ignoring it for Flash files; a sketch follows below). This approach also means you don't have to store files on your server filesystem under the filename the user submits, which saves you some heavy and difficult-to-get-right filename validation work.
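As a sketch of that second option, a Rails controller action that serves the file with a Content-Disposition: attachment header; the model, storage path and column names are assumptions:

    class UploadsController < ApplicationController
      def show
        upload = Upload.find(params[:id])   # assumed metadata model for the file
        path   = Rails.root.join("storage", "uploads", upload.stored_name)

        send_file path,
                  filename:    upload.original_name,       # name shown to the user
                  type:        "application/octet-stream",
                  disposition: "attachment"                # never render inline
      end
    end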
