Generating Thumbnails on Client - iOS

My team and I are building an iOS application. We allow technicians in the field to upload images for certain issues they are resolving on technical equipment. It needs to be possible to zoom in on these images once they are uploaded to S3, so quality has to stay relatively high.
Recently we decided to add thumbnails, because browsing in the iOS app will be much faster than downloading a 1.5-2.5 MB image each time.
My co-worker decided the best way to handle this is to generate a 200-500 KB thumbnail on the device, then upload both the full image and the thumbnail to S3.
I voiced my concern that some of our technicians may be in parts of the world where internet is slow and data usage is limited, so doing all this additional work on the device and uploading the extra data makes no sense to me. However, the team considers this a good solution and will move forward. I've shown them easy examples of how to generate thumbnails automatically on the server with S3 and Lambda, which would let us either upload higher-fidelity images with the spare bandwidth or simply speed up the app by uploading much less. Sometimes a user may upload as many as 100 images, which means an additional 20-50 MB.
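For reference, the client-side generation my co-worker has in mind looks roughly like this. It's only a sketch built on ImageIO's downsampling support, and the 500-pixel limit and JPEG quality are illustrative rather than our actual settings:

```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

// Generate a small JPEG thumbnail from a full-resolution image file on disk.
// ImageIO decodes a downsampled version directly, so memory stays low even
// for large camera photos. Returns nil if reading or encoding fails.
func makeThumbnail(from imageURL: URL, maxPixelSize: Int = 500) -> Data? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(imageURL as CFURL, sourceOptions) else {
        return nil
    }

    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,   // respect EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let thumbnail = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else {
        return nil
    }

    // Encode as JPEG; 0.7 is an arbitrary quality meant to land in the
    // few-hundred-KB range mentioned above.
    let data = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        data as CFMutableData, UTType.jpeg.identifier as CFString, 1, nil
    ) else {
        return nil
    }
    CGImageDestinationAddImage(destination, thumbnail, [
        kCGImageDestinationLossyCompressionQuality: 0.7
    ] as CFDictionary)
    return CGImageDestinationFinalize(destination) ? (data as Data) : nil
}
```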
Anyway, I wanted to hear how you all think this is best handled, mainly as a sanity check for myself.

I don't completely comprehend the intricacies of your project, but from experience, I have one word for you - Cloudinary. As opposed to S3, which is a general-purpose cloud storage solution, Cloudinary is designed specifically for handling images.
We run an online classifieds app with 200,000 hits a day that handles tens of thousands of photos daily, and Cloudinary provides an extremely solid solution for all our needs. We have uploads by users from their mobile and desktop devices, bookmarking of those images, CDN-based serving, and thumbnail generation.
Did I mention they have thumbnail generation built in? They have lots of other features as well, including:
Resize and Crop
Optimized JPEG
Custom Crop
Face Thumbnail
Rotated Circular Thumbnail
Zoom Effects
Zoom Image Overlay
Watermark Image
Optimized WebP
Overlay, Border, Shadow
Text Overlay, Border, Shadow, etc.
The admin console is pretty kickass too, with all of the above features available for you to configure over the cloud. And it fits well with pretty much any application (we use it in our internal Ruby, Go, NodeJS services, our Web Application and our iOS and Android apps as well).
I'm not paid to sell Cloudinary to you, but I can vouch that if it were image services I needed, I would go with Cloudinary over S3 any day. Major players like eBay and TED use it for their image requirements.
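To give a flavor of how the transformations work: thumbnails are requested simply by building a delivery URL, which Cloudinary resolves on the fly. Here is a small Swift sketch of the idea (the cloud name and public ID are placeholders, and Cloudinary's official SDKs can build these URLs for you; extras like face-detection thumbnails just add parameters such as g_face):

```swift
import Foundation

// Build a Cloudinary delivery URL that generates a thumbnail on the fly.
// The cloud name and public ID below are placeholders, not real accounts.
func cloudinaryThumbnailURL(cloudName: String, publicID: String,
                            width: Int, height: Int) -> URL? {
    // c_fill scales and crops to exactly the requested size; q_auto/f_auto
    // let Cloudinary pick quality and format (e.g. WebP) per client.
    let transformation = "c_fill,w_\(width),h_\(height),q_auto,f_auto"
    return URL(string: "https://res.cloudinary.com/\(cloudName)/image/upload/\(transformation)/\(publicID).jpg")
}

// e.g. a 200x200 thumbnail of an already-uploaded photo:
let thumbURL = cloudinaryThumbnailURL(cloudName: "my-cloud",
                                      publicID: "equipment/pump_42",
                                      width: 200, height: 200)
```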

Related

Large amounts of images in Firebase

I am currently developing an iOS app for a photographer. The main attraction of the app is going to be a gallery of their work, so about 300 MB of images.
My plan until now was to include lower-resolution versions of the images in the app itself and have the full-resolution images pulled from a Firebase database to replace the low-res versions.
Now I am doubting myself, though, since pulling 300 MB worth of images from Firebase for every user seems like it would get rather costly. Would it be a better bet for me to just accept the bigger download and include all of the images in the app? Considering that would also save the users time and give them a smoother experience, it seems like the better option, but I am not sure if there are issues with this that I am not aware of.
How do people usually deal with this sort of thing?
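To make the plan concrete, what I have in mind is roughly the following: show the bundled low-res version immediately, then swap in the full-resolution file once it has downloaded. This sketch assumes the full-size files sit in Firebase Storage; the path layout, asset names, and size limit are just placeholders:

```swift
import UIKit
import FirebaseStorage

// Show the bundled low-res image right away, then replace it with the
// full-resolution version fetched on demand from Firebase Storage.
// "gallery/<name>.jpg" and the "_lowres" asset suffix are assumed conventions.
func loadPhoto(named name: String, into imageView: UIImageView) {
    // 1. Bundled low-resolution placeholder from the asset catalog.
    imageView.image = UIImage(named: "\(name)_lowres")

    // 2. Fetch the full-resolution file only when this photo is actually viewed.
    let ref = Storage.storage().reference(withPath: "gallery/\(name).jpg")
    ref.getData(maxSize: 20 * 1024 * 1024) { data, error in
        guard error == nil, let data = data, let fullImage = UIImage(data: data) else {
            return // keep showing the low-res version if the download fails
        }
        DispatchQueue.main.async {
            imageView.image = fullImage // update the UI on the main queue
        }
    }
}
```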

What are some options for handling image uploading/compression in ASP?

Please bear with me, as I'm not trying to frustrate anyone with inane questions; I did search Google for this but couldn't really find anything recent or helpful.
I am a novice programmer working on a classic ASP web application. I just enabled users to upload and download images, but I'm quickly regretting it, as it's eating up all of the router bandwidth. I am finding my solution inadequate, so I want to start over.
My desire is threefold with this functionality:
Compression. I understand that this is impossible to do BEFORE uploading without some kind of Java/Silverlight/Flash portion of the application to handle uploads, correct? What is the common way most places go about this? Just allow regular file uploads and compress once they are on the server?
Resizing. I want all images resized to a reasonable size before they are stored, instead of just telling users who try to upload huge camera images that they can't. I figure I should just let them upload and have the images resized for them. Does this functionality already exist?
Changing file type. I want to allow users to upload any image file type but convert them all to .jpg on the server after the upload.
With these three requirements, how hard is it to implement something like this in pure code and libraries? Would it be better to just use a third-party component, such as ASPJpeg or ASPUpload? Have you encountered something similar, and what was your solution?
Thanks.
Take a look at ASPJpeg and ASPUpload from Persits. We use these components to upload a full-size image (it can be a PNG even though the library is called "ASPJpeg"), resize it to the several different sizes we need on our site, then store the resized images on the server in a variety of folders. The ASPUpload component is a little tricky, but if you follow their sample code you'll be fine.
I never found a good component for decompressing uploaded zip files and had to write my own, which I've since abandoned. In the end with upload speeds increasing and storage getting so cheap, it started to matter less and less that the files were compressed before being uploaded.
EDIT: Just noticed you mentioned these components in your question. Consider this an endorsement of your idea to use them. :-)

Handling very large image files in web browsers

First post on SO; hopefully I am doing it right :-)
I have a situation where users need to upload and view very high-resolution files (they need to pan, tilt, zoom, and annotate the images). A single file sometimes exceeds 1 GB, so loading the complete file on the client side is not an option.
We are thinking about letting the users upload files to the server (like everyone does), then doing some processing on the server side to create multiple, relatively small, lower-resolution versions at varying sizes. We then give users thumbnails with a canvas-size option on the webpage so they can pick one and start their work.
Let's assume a user opens a low-grade image at a 1280 x 1028 canvas size. The image will be broken into tiles before display, and when the user clicks on a tile it will be like zooming in to that specific tile. The client will send a request to the server asking for a higher-resolution image for that tile. The server will send the image, which will be broken into tiles again for the user to click and get yet another higher-resolution image from the server, and so on. Having multiple images at varying resolutions will help us break images into tiles and serve the user's needs (keep zooming in or out using tiles).
Has anyone dealt with humongous image files? Is there a preferred technical design you can suggest? How to handle areas that are split across tiles is bothering me a lot, so I'm not sure how the above approach should be modified to address that issue.
We need to plan for 100 to 200 users connected to the website simultaneously, and ours is a .NET environment, if it matters.
Thanks!
The question is a little vague. I assume you are looking for hints, so here are a few:
I see uploading the images as a problem in the first place. Where I come from, upload speeds are way slower than download speeds. (But there is little you can do if you need your users to upload gigabytes...) Perhaps offer a more stable upload mechanism than the web; FTP if you must.
Converting into smaller pieces should be no big problem. Use one of the available tools, perhaps ImageMagick. I see there is a .NET wrapper out: https://magick.codeplex.com/
More important than the conversion itself: don't do it on the fly every time (you would need a really big machine), but only once, when the image is uploaded. If you need to scale, you can offload this to another box on the network.
For the viewer - this is the interesting part. There are some ready-to-use ones. Google has one; it's called 'Maps' :). But there is a free alternative: OpenLayers from the OpenStreetMap project: http://wiki.openstreetmap.org/wiki/OpenLayers All you have to do is name your generated files in the right way and do a little configuration.
Even if you must, for some reason, create the tiles on the fly, or can't use something like OpenLayers, I would try to stick to its naming scheme. Having something that works to start with is never a bad idea.
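To illustrate that naming scheme: the OpenStreetMap-style layout that OpenLayers can consume is just zoom level / column / row. A tiny sketch of the idea (written in Swift purely for illustration - the same mapping is trivial in .NET, and the "tiles" base directory is a placeholder):

```swift
// Path of one tile in the OpenStreetMap-style "z/x/y" layout.
// At zoom level z the image is covered by a 2^z x 2^z grid of tiles,
// so zoom 0 is a single overview tile and each level adds 4x the detail.
func tilePath(zoom: Int, column: Int, row: Int) -> String {
    return "tiles/\(zoom)/\(column)/\(row).png"
}

let overview = tilePath(zoom: 0, column: 0, row: 0) // "tiles/0/0/0.png"
let detail   = tilePath(zoom: 3, column: 5, row: 2) // "tiles/3/5/2.png"
```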

Need assistance choosing an image management gem

I am interested in building a Rails-based system for handling the display and organization of large numbers of photos - sort of like Flickr, but smaller. Each photo will have metadata associated with it. Photos will be shown in a selectable list and a grid view. It would also be nice to be able to load images as they are needed, as this would probably speed things up.
At the moment I have a test version working, with images loading from the assets/images directory, but it is beginning to run slowly when displaying several hundred images (200-600). This is due to the way my view is set up: I am using a straight loop to display the images in both list and grid layouts.
I also manually resized the thumbnails and a medium-sized image from a full-sized source image. I am investigating other resizing methods; any advice is appreciated here as well.
As I am new to handling the images this way, could someone point me in a direction based on experience designing and implementing something like Flickr?
I am investigating the following tools:
Paperclip
http://railscasts.com/episodes/134-paperclip
Requirements: ImageMagick
attachment_fu
http://clarkware.com/blog/2007/02/24/file-upload-fu#FileUploadFu
Requirement: one of the following: ImageScience, RMagick, MiniMagick, ImageMagick?
CarrierWave
http://cloudinary.com/blog/ruby_on_rails_image_uploads_with_carrierwave_and_cloudinary
http://cloudinary.com/blog/advanced_image_transformations_in_the_cloud_with_carrierwave_cloudinary
I'd go with CarrierWave any day. It is very flexible and has a lot of useful strategies. It generates its own Uploader class and has all sorts of nifty, self-explanatory features such as automatic generation of thumbnails (at the sizes you specify), blacklisting, image formatting, size constraints, etc., which you can put to use.
This Railscast by Ryan Bates - http://railscasts.com/episodes/253-carrierwave-file-uploads is very useful, if you haven't seen it already.
Paperclip and CarrierWave are totally appropriate tools for the job, and the one you choose is going to be a matter of personal preference. They both have tons of users and active, ongoing development. The difference is whether you'd prefer to define your file upload rules in a separate class (CarrierWave), or if you'd rather define them inline in your model (Paperclip).
I prefer CarrierWave, but based on usage it's clear plenty of people feel otherwise.
Note that neither gem is going to do anything for your slow view with 200-600 images. These gems are just for handling image uploads, and don't help you with anything beyond that.
Note also that Rails is really pretty bad at handling file uploads and file downloads, and you should avoid this where possible by letting other services (a CDN, your web server, S3, etc.) handle these for you. The central gotcha is that if you handle a file transfer with Rails, your entire web application process is busy for the duration of the transfer. (For related discussion on this topic, see: Best Ruby on Rails Architecture for Image Heavy App.)

iOS app additional content download

I have an iOS app that has a lot of images and sounds in it, so the build size is growing rapidly and can no longer fit within the 50 MB limit for over-the-air (3G) download. I would like to upload those images and sounds to an online server and download them from the application on demand. Can anyone recommend some online storage (for example, Amazon S3) and give suggestions on best practices for this?
Thank you!
I upload my files to an "unlimited site" with Network Solutions. However, if you don't want to pay for anything like this, just upload the photos to Photobucket or something similar, right-click on them and choose "Get image address", then use that URL to download the images into your app. (Check to make sure this doesn't violate their terms of use, though; I don't know whether it does or not!)
As for best practices, I probably use the absolute worst method, which is displaying the image in an off-screen UIWebView, taking a picture of it using Core Graphics, and then saving that picture, haha! I'll leave you with this link for a probably better answer to the second half of your question, that being "best practices": ios store URL images in offline mode (not connected to internet)
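For what it's worth, a more conventional sketch of the on-demand loading itself, if the files do end up on something like S3: download with URLSession and cache the file locally so it is only fetched once (the bucket URL and key below are placeholders):

```swift
import UIKit

// Download an image from a remote URL (e.g. an S3 object URL) and cache it
// in the app's Caches directory so it isn't re-downloaded next time.
func fetchImage(from remoteURL: URL, completion: @escaping (UIImage?) -> Void) {
    let cachesDir = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
    let localURL = cachesDir.appendingPathComponent(remoteURL.lastPathComponent)

    // Serve from the local cache if we already have this file.
    if let cached = UIImage(contentsOfFile: localURL.path) {
        completion(cached)
        return
    }

    // Otherwise download it once and keep a copy for next time.
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard error == nil, let tempURL = tempURL else {
            DispatchQueue.main.async { completion(nil) }
            return
        }
        try? FileManager.default.moveItem(at: tempURL, to: localURL)
        let image = UIImage(contentsOfFile: localURL.path)
        DispatchQueue.main.async { completion(image) }
    }.resume()
}

// Usage - the bucket and key are placeholders:
if let url = URL(string: "https://my-bucket.s3.amazonaws.com/assets/intro_art.png") {
    fetchImage(from: url) { image in
        // assign to an image view here, or handle nil on failure
        _ = image
    }
}
```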
