I am trying to modify an uploaded file once it has been uploaded (e.g. run some ImageMagick operations on it).
My current approach is to download the file into a tmp directory, modify it, and then re-upload it.
Is there a nice way to do this?
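If this is a PHP app (an assumption on my part), a minimal sketch of that download/modify/re-upload cycle using the imagick extension might look like the following; the URL and the putFile() helper are placeholders for however your files are actually stored:

<?php
// Hypothetical locations: adjust to wherever the uploaded file actually lives
$remoteUrl = 'https://example.com/uploads/photo.jpg';
$tmpPath   = tempnam(sys_get_temp_dir(), 'img_');

// 1. Download the uploaded file into a temp file
file_put_contents($tmpPath, file_get_contents($remoteUrl));

// 2. Run an ImageMagick operation on it (requires the imagick extension)
$image = new Imagick($tmpPath);
$image->thumbnailImage(200, 0); // 200px wide, height scaled to keep the aspect ratio
$image->writeImage($tmpPath);

// 3. Re-upload with whatever client your storage uses (putFile() is a placeholder)
// putFile($tmpPath, 'uploads/photo_thumb.jpg');
unlink($tmpPath);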
I'm downloading an Excel file from an Azure Storage Blob and therefore want to use stream_get_contents to get the file. But PhpSpreadsheet seems to only want to read the file off the filesystem.
For now, I'm saving it to a temp folder and reading it back, but that is less than ideal.
Is there a way to get PhpSpreadsheet to load via something other than a local file?
This is not supported. PhpSpreadsheet will always read from disk.
On a side note, since 1.13.0, PhpSpreadsheet is able to write in memory. See https://github.com/PHPOffice/PhpSpreadsheet/pull/1292
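A minimal sketch of the save-to-a-temp-folder approach you are already using, assuming $stream is the resource your Azure blob client hands back:

<?php
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\IOFactory;

// $stream is assumed to be the resource returned by the Azure blob client
$tmpFile = tempnam(sys_get_temp_dir(), 'xlsx_');
file_put_contents($tmpFile, stream_get_contents($stream));

// PhpSpreadsheet loads from a path on disk
$spreadsheet = IOFactory::load($tmpFile);
$sheet = $spreadsheet->getActiveSheet();

unlink($tmpFile); // the whole workbook is in memory at this point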
I'm currently using the File Watchers feature to generate a minified version of the file each time I modify a CSS file, and I would like to be able to upload BOTH files at the same time.
Is there any way to link those files so I don't need to upload both of them manually? I don't want them to be uploaded automatically though, only when I explicitly trigger it (via the Upload to... action).
To be more specific, I need PhpStorm to upload main.min.css when I manually upload main.css.
Is there any way a new image uploaded from a PHP form into directory a can be copied to directory b after it has been uploaded? In this case it's not possible to alter the upload path itself or to copy the file during upload, so I'm looking for some kind of automatic replication of new directory contents into another directory after the file has been uploaded.
Is there an automated service/script that can move the content of one directory on our server to another directory? We upload files to www.mysite.com/upload/thumb, for example, but need them to be moved automatically to www.mysite.com/cs/upload/thumb. Is this possible without running a move_uploaded_file PHP script? (I would prefer it to be done by the server, because we use the same page for many different landing page functions.)
Are you looking for copy() (http://php.net/manual/en/function.copy.php)?
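For the upload-then-replicate case, a minimal sketch (assuming the form field is named "image" and both directories already exist; the paths are placeholders):

<?php
$uploadDirA = __DIR__ . '/upload/thumb/';
$uploadDirB = __DIR__ . '/cs/upload/thumb/';

$name  = basename($_FILES['image']['name']);
$pathA = $uploadDirA . $name;

// Move the uploaded file into directory A first...
if (move_uploaded_file($_FILES['image']['tmp_name'], $pathA)) {
    // ...then replicate it into directory B
    copy($pathA, $uploadDirB . $name);
}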
I have a model called Image. Images have files attached using Dragonfly that are stored in S3.
I have a requirement that I need to zip up all images.
I'm using:
require 'zip'

# Create the zip on disk and add a local file to it
Zip::ZipFile.open(tmp_zip, Zip::ZipFile::CREATE) do |zipfile|
  zipfile.add("image.jpg", image_path)
end
The problem I'm running into is that this only works if image_path is local. When the file has to be fetched from S3, image_path is a remote path, such as http://example.s3.amazonaws.com/foo/image.jpg, and I don't think there is a RubyZip method that handles that.
I'm debating on writing something that creates a temp file from the remote path, adds that temp file to the zip, then deletes the temp file.
But before I do that, does anyone know if RubyZip or some other zip library handles zipping up remote files? Or is there a better/easier method?
Thanks!
I have faced the same issue and found a solution, so I am sharing it in case it helps someone.
You can add any remote file to the zip directly, without saving it to a temp file, reading it back from the temp file, and finally deleting the temp file.
Create the zip and add the remote files to it:

require 'zip'
require 'open-uri' # gives URI objects a #read method

Zip::OutputStream.open(tmp_zip) do |zos|
  zos.put_next_entry("image.jpg")
  zos.print(URI.parse(image_url).read)
end
If you want to add any local files to the above temp_zip, you can open it again:

# Open the zip again and add any local files you want
zipfile = Zip::File.open(tmp_zip)
zipfile.add("report.pdf", my_pdf_path)
zipfile.close
One option would be to mount S3 locally. There are various ways to do this using FTP-like programs, and there are dedicated tools as well; it depends on the OS you're running.
I don't see a way for RubyZip to stream a remote URL straight into a zip.
I have a text file upload field; I plan to save the file somewhere and then store its location in a database. However, I want to make sure the uploaded file is a .txt file, and not, say, an image file. I imagine this happens in the validation step. How does one validate such a thing? Also, how do you get the filename of the uploaded file? I could always just check whether it ends in '.txt', but for future reference it would be helpful to know how to validate without relying on the filename alone.
Trying to validate the contents of a file based on the filename extension is opening the door for major hackerdom. It's trivial to change the extension and upload the file.
If you are on a Mac/Linux/Unix-based system, the OS "file" command is the standard because it looks inside the file for key bytes that flag file types. http://en.wikipedia.org/wiki/File_(Unix) I'm not sure what's available for Windows, but this might help: Determine file type in Ruby
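A small sketch of that approach in Ruby, assuming a Unix-like system with the file command on the PATH (the path you pass in is whatever temp path your framework gives you for the upload):

require 'open3'

# Ask the OS "file" command for the MIME type; it inspects the file's
# contents rather than trusting the extension.
def plain_text_file?(path)
  mime, status = Open3.capture2('file', '--brief', '--mime-type', path)
  status.success? && mime.strip == 'text/plain'
end

plain_text_file?(uploaded_file_path) # => true only for genuine text files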
One way of doing it, the simple way really, would be to pass the file through an image loader, preferably one that handles multiple common formats, and see if it throws an error.
The other way is to manually check the file header for common image format headers. For example, .bmp files start with BM. Other formats have their own specific markings you can use.
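A rough Ruby sketch of that header check (only a handful of common signatures are listed, so treat it as illustrative rather than exhaustive):

# Magic-byte prefixes for a few common image formats
IMAGE_SIGNATURES = {
  "BM"             => :bmp,
  "\x89PNG".b      => :png,
  "\xFF\xD8\xFF".b => :jpeg,
  "GIF87a"         => :gif,
  "GIF89a"         => :gif
}.freeze

def image_format(path)
  header = File.binread(path, 8) || "" # first 8 bytes are enough for these signatures
  IMAGE_SIGNATURES.each do |signature, format|
    return format if header.start_with?(signature)
  end
  nil # not a recognized image
end

A nil result does not prove the file is plain text, but combined with a content check like the file command above it makes a reasonable guard against renamed image files.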