I'm working on an Instagram-like app for iOS, and I'm wondering how to optimize the file size of each picture so that users consume as little data as possible when fetching pictures from my backend. Downloading the files as-is will eat into users' data plans, since high-resolution pictures run around 1.5 MB each. Is there a way to shrink a picture's file size while preserving its quality as much as possible?
You can store the compressed image as binary data: base64-encoded inside JSON, as a Core Data binary attribute, or, more elegantly in Swift, as binary data on a Realm object.
Do you really want to handle image processing on your own? There are already plenty of libraries for this, and they are effective and powerful. For example, AFNetworking: it can not only scale images to a UIImageView's current size for the device's resolution, but it also gives you image-caching flexibility. It is available as a CocoaPod and on GitHub. Just try it; you'll love it.
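A minimal sketch of loading a remote image with AFNetworking's UIImageView+AFNetworking category (the URL and placeholder asset name are hypothetical):
#import <AFNetworking/UIImageView+AFNetworking.h>
// Download, cache and display a remote image; AFNetworking handles both steps.
NSURL *imageURL = [NSURL URLWithString:@"https://example.com/photo.jpg"]; // hypothetical URL
UIImage *placeholder = [UIImage imageNamed:@"placeholder"];               // hypothetical asset
[self.imageView setImageWithURL:imageURL placeholderImage:placeholder];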
You can compress a UIImage by converting it into NSData
UIImage *rainyImage = [UIImage imageNamed:@"rainy.jpg"];
NSData *imgData = UIImageJPEGRepresentation(rainyImage, 0.1); // second parameter is the compression quality (0.0 = smallest file, 1.0 = best quality)
Now use this NSData object to save the image, or to convert it back into a UIImage:
UIImage *image = [UIImage imageWithData:imgData];
Hope this resolves your issue. Note that JPEG compression is lossy, so you will lose some image quality.
I am getting memory warnings when loading large images. I also have to cache the images, so that if the user scrolls back up the image is still present. I am using the SDWebImage library.
cell!.productImageView?.sd_setImageWithURL(url)
For your scenario, I have two suggestions:
1. The generally recommended approach is to upload small images to the server so that we avoid memory problems. A thumbnail, or an image with small dimensions, is appropriate when showing images in a UITableView. We can then show the larger image when the user taps a thumbnail and navigates to a detail view controller.
2. Alternatively, you can download the images, resize them, and cache them with NSCache rather than using SDWebImage. In your case imageWithContentsOfFile can't be used up front, because you are downloading the images from a URL; after the download, however, you can save them and use imageWithContentsOfFile, or resize the images and keep them in your own NSCache, as in the sketch below.
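A minimal sketch of the second suggestion, assuming a hypothetical helper that resizes after download and caches by URL string:
#import <UIKit/UIKit.h>
// Hypothetical cache: download, resize, then store images keyed by URL.
static NSCache<NSString *, UIImage *> *imageCache;
static UIImage *ResizedImage(UIImage *image, CGSize targetSize) {
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resized;
}
static void LoadThumbnail(NSURL *url, CGSize thumbSize, void (^completion)(UIImage *)) {
    if (!imageCache) imageCache = [NSCache new];
    UIImage *cached = [imageCache objectForKey:url.absoluteString];
    if (cached) { completion(cached); return; }
    [[[NSURLSession sharedSession] dataTaskWithURL:url
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        UIImage *full = data ? [UIImage imageWithData:data] : nil;
        UIImage *thumb = full ? ResizedImage(full, thumbSize) : nil;
        if (thumb) [imageCache setObject:thumb forKey:url.absoluteString];
        dispatch_async(dispatch_get_main_queue(), ^{ completion(thumb); });
    }] resume];
}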
What are you using to display the image in the image view? [UIImage imageNamed:] maintains its own cache. Try the code below instead:
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"imageName" ofType:@"png"]];
This will help you manage memory consumption.
Try this code.
NSString *thumbnailImage = objectname.thumbnailUrl;
[cell.productImageView sd_setImageWithURL:[NSURL URLWithString:thumbnailImage]];
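SDWebImage can also show a placeholder while the thumbnail downloads; a sketch (the placeholder asset name is hypothetical):
[cell.productImageView sd_setImageWithURL:[NSURL URLWithString:thumbnailImage]
                         placeholderImage:[UIImage imageNamed:@"placeholder"]];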
All the code I can find revolves around loading images directly into visual controls. However, I have my own cache system (I'm converting a project from another language), so I want the following, as efficiently as possible:
Load jpg/png images, probably into a bitmap / CGImage (either from the file system or from images downloaded online)
Possibly save the image back as a compressed/resized png/jpg file
Supply an image reference for a visual control
I am new to Swift and the iOS platform, but as far as I can tell, CGImage is as close as it gets. However, there does not appear to be a way to load an image from the file system when using CGImage... But I have found people discussing ways to do this for e.g. UIImage, so I am now doubting my initial impression that CGImage was the best match for my needs.
It is easy to get confused between UIImage, CGImage and CIImage. The differences are as follows:
UIImage: A UIImage object is a high-level way to display image data. You can create images from files, from Quartz image objects, or from raw image data you receive. UIImage objects are immutable, so you must specify an image's properties at initialization time; this also means they are safe to use from any thread.
Typically you take an NSData object containing a PNG or JPEG representation of an image and convert it to a UIImage.
CGImage: A CGImage can only represent bitmaps. Operations in Core Graphics, such as blend modes and masking, require CGImageRefs. If you need to access and change the actual bitmap data, you can use CGImage. It can also be converted to an NSBitmapImageRep.
CIImage: A CIImage is an immutable object that represents an image, but it is not an image itself: it only has the image data associated with it, plus all the information necessary to produce an image. Think of it as an image recipe.
You typically use CIImage objects in conjunction with other Core Image classes such as CIFilter, CIContext, CIColor, and CIVector. You can create CIImage objects from data supplied by a variety of sources, such as Quartz 2D images and Core Video image buffers.
CIImage is required to use the various GPU-optimized Core Image filters. CIImages can also be converted to NSBitmapImageReps, and the processing can run on either the CPU or the GPU.
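For example, a minimal Core Image sketch applying a sepia filter to a UIImage (the asset name is hypothetical):
#import <CoreImage/CoreImage.h>
// Wrap a UIImage in a CIImage, describe a filter, then render the result.
CIImage *input = [CIImage imageWithCGImage:[UIImage imageNamed:@"photo"].CGImage];
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:input forKey:kCIInputImageKey];
[sepia setValue:@0.8 forKey:kCIInputIntensityKey];
CIContext *context = [CIContext contextWithOptions:nil]; // CPU- or GPU-backed
CGImageRef rendered = [context createCGImage:sepia.outputImage fromRect:sepia.outputImage.extent];
UIImage *output = [UIImage imageWithCGImage:rendered];
CGImageRelease(rendered);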
In conclusion, UIImage is what you are looking for. Reasons are:
You can get image from device memory and assign it to UIImage
You can get image from URL and assign it to UIImage
You can write UIImage in your desired format to device memory
You can resize image assigned to UIImage
Once you have assigned an image to a UIImage instance, you can use that instance in controls directly, e.g. setting the background of a button, or setting it as the image of a UIImageView.
These are all basic questions which have already been answered on Stackoverflow, so full code samples would make this unnecessarily large; a brief sketch is below.
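A compact sketch touching each point (the URL and asset names are hypothetical, and the synchronous download is only for brevity):
// 1. From device memory (the app bundle):
UIImage *local = [UIImage imageNamed:@"photo"];
// 2. From a URL (synchronous for brevity; prefer NSURLSession in real code):
NSData *remoteData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"https://example.com/photo.jpg"]];
UIImage *remote = [UIImage imageWithData:remoteData];
// 3. Write to device storage in a chosen format:
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"photo.png"];
[UIImagePNGRepresentation(local) writeToFile:path atomically:YES];
// 4. Use it in a control directly:
UIImageView *imageView = [[UIImageView alloc] initWithImage:remote];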
Credit for summarizing differences: Randall Leung
You can load your image easily into a UIImage object...
NSData *data = [NSData dataWith...];
UIImage *image = [UIImage imageWithData:data];
If you then want to show it in a view, you can use a UIImageView:
UIImageView *imageView = [[UIImageView alloc] init]; // or whatever
...
imageView.image = image;
See more in the UIImage documentation.
Per the documentation at: https://developer.apple.com/documentation/uikit/uiimage
let uiImage = uiImageView.image    // UIImage? — the image property is optional
let cgImage = uiImage?.cgImage     // CGImage? — nil if the UIImage is backed by a CIImage
I need to reduce the size of a UIImage captured from the Camera/Gallery down to a maximum of 200 KB.
The UIImage would then be saved to an SQLite database in the app.
I tried using UIImageJPEGRepresentation(image, compression) in a while loop, but couldn't crack it.
Thanks for the help!
You need to scale the image down first; straight out of the camera it will likely be too big. Something like 1000 pixels on the longest side might be enough, but you can play with it and see what works for you. Then apply JPEG compression to the scaled image.
Also, because of the way JPEG works, it's pointless to run the algorithm over and over again. Just pick a good compression rate and run it only once.
On a tangent: in many cases where you need to save a large data blob in a database, you may find it more memory-efficient to save the data to a separate file in the file system and store only the path in the database.
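A minimal sketch of scale-then-compress-once, saving the file and returning the path to store in the database (the 1000 px cap and 0.7 quality are assumptions to tune):
// Scale so the longest side is at most 1000 px, JPEG-compress once, write to disk.
static NSString *SaveScaledJPEG(UIImage *image, NSString *fileName) {
    CGFloat longSide = MAX(image.size.width, image.size.height);
    CGFloat ratio = MIN(1.0, 1000.0 / longSide);           // assumed max long side
    CGSize target = CGSizeMake(image.size.width * ratio, image.size.height * ratio);
    UIGraphicsBeginImageContextWithOptions(target, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, target.width, target.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *jpeg = UIImageJPEGRepresentation(scaled, 0.7); // assumed quality; run once
    NSString *dir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSString *path = [dir stringByAppendingPathComponent:fileName];
    [jpeg writeToFile:path atomically:YES];
    return path;                                           // store this path in the DB
}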
We know an image can be compressed with the method UIImageJPEGRepresentation(), as in the following code:
NSData *imgData = UIImageJPEGRepresentation(imageResized, 0.5);
NSLog(#"imgData.length :%d",imgData.length);
imageResized = [UIImage imageWithData:imgData];
NSData *imgData2 = UIImageJPEGRepresentation(imageResized, 1);
NSLog(#"imgData2.length :%d",imgData2.length);
The log is:
2013-02-25 00:33:14.756 MyApp[1119:440b] imgData.length :371155
2013-02-25 00:33:20.988 MyApp[1119:440b] imgData2.length :1308415
What I'm confused about is why the lengths of imgData and imgData2 are different. In my app, the image should be uploaded to the server. Should I upload the NSData to the server to save storage? Is it possible for an Android phone to download the NSData and convert it to an image? Any help will be appreciated!
You start with a UIImage of some size, say 1024x768. Decoded, this takes 1024 x 768 x 4 bytes (about 3 MB) in memory. Then you compress it with a quality factor of 0.5 and get 371,155 bytes.
You then create a new UIImage from the compressed data. This is still a 1024x768 (or whatever) UIImage, so it again takes the same amount of memory (1024 x 768 x 4 bytes) as the original image. You then convert it to a new JPEG with less compression, giving you 1,308,415 bytes.
Even though you created a less-compressed version of the compressed image, the byte count comes from re-encoding the full-sized UIImage. The second image, though bigger on disk, still has the same reduced quality as the compressed one; the quality lost at 0.5 is not recovered by re-encoding at 1.0.
Since your data represents a JPEG, anything that downloads it will be able to treat it as a JPEG, including an Android phone.
The number of bytes is bigger for the second image because you passed a much higher compression quality value to UIImageJPEGRepresentation. Higher quality takes more bytes.
The file once uploaded to a server will be a standard JPEG file, viewable by any device, including Android.
I need to send a UIImage over the Internet, but I've been having some issues so far.
I currently take the UIImage, reduce it to a 16th of its original pixel count, convert it to NSData, and then create an NSString using NSData+base64. I then transmit this as part of a JSON request to a server.
Most of the time, this works perfectly.
However, the file sizes are large. On a cellular connection, an image can take ~10 seconds to load.
How do I transmit a UIImage over the network so that it's small and retains quality? Even at a 16th of the previous number of pixels, a photograph can be 1.5 MB; I'd like to get it to around 300 KB.
Compressing it with zip or gzip only seems to yield negligible (<0.1 MB) savings.
Thanks,
Robert
I'd suggest storing it as a JPEG or a PNG with some compression. Use a method like
UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality)
and then post the NSData object you get from it. You should be able to find a good size/quality tradeoff by playing with the compressionQuality parameter.
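If you still need to embed the result in your JSON request, you can base64-encode the JPEG data; a sketch (the quality value and payload key are assumptions):
NSData *jpegData = UIImageJPEGRepresentation(image, 0.5);       // assumed quality
NSString *base64 = [jpegData base64EncodedStringWithOptions:0];
NSDictionary *payload = @{ @"image": base64 };                  // hypothetical JSON key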
You didn't say how you are converting your images to data, but you mention "photograph". If your images are photographs, you should use UIImageJPEGRepresentation, not UIImagePNGRepresentation. You will have to play with the compressionQuality parameter to find a compression level that balances data size and image quality.
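A quick way to explore that tradeoff is to log the encoded size at a few quality levels (a sketch; the asset name is hypothetical):
// Compare JPEG sizes across quality levels to pick a good balance.
UIImage *photo = [UIImage imageNamed:@"photo"];
for (NSNumber *q in @[@1.0, @0.7, @0.4, @0.1]) {
    NSData *data = UIImageJPEGRepresentation(photo, q.floatValue);
    NSLog(@"quality %.1f -> %lu bytes", q.floatValue, (unsigned long)data.length);
}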