This is a very basic requirement, but my research did not turn up any result. In Swift 3.0, we have saved the captured image using the API below:
UIImageWriteToSavedPhotosAlbum
The UIImage object holds the image. The basic way to send an image to a server is to convert the UIImage to NSData. The following two APIs are available for this:
UIImageJPEGRepresentation // NOTE: we must specify the compression parameter.
UIImagePNGRepresentation  // We don't need to specify a compression parameter.
By default, the saved image is in JPEG format, and we want to send the SAME image to the server WITHOUT any further compression.
So the size of the JPEG image in the album and the size of the image on the server must be exactly the same.
How do I achieve this?
Thank you for your help.
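For reference, a minimal sketch of the two conversion calls in Swift 3 syntax (image here is assumed to be the captured UIImage). Note that even a compressionQuality of 1.0 re-encodes the JPEG, while PNG encoding is lossless:

import UIKit

// Log the encoded sizes of both representations.
func logEncodedSizes(of image: UIImage) {
    let jpegData = UIImageJPEGRepresentation(image, 1.0) // compression parameter is required
    let pngData = UIImagePNGRepresentation(image)        // no compression parameter
    print("JPEG bytes: \(jpegData?.count ?? 0), PNG bytes: \(pngData?.count ?? 0)")
}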
In my application, I'm using YPImagePicker to select images from the library and the camera. I want to know the image size in MB after selecting pictures or capturing a photo. The task is to convert the images to data and send them to the backend via a REST API. As of now we are limiting the selection to 5 images, so I want to check the size of every image; if one is more than 1 MB, it needs to be compressed down to 1 MB.
let imgData = image.jpegData(compressionQuality: 1)!
let imageSize = imgData.count
print("actual size of image in MB: \(Double(imageSize) / 1024.0 / 1024.0)")
I have used the sample above to check the size of the image, but I'm not seeing the correct file size. For example, I capture a photo through the app and it gets saved to the album; when I check the image size it shows 3.4 MB in the photo details, but in code I don't get the same size. What is the best way to achieve this?
Apple doesn't use JPEG for storing images in your iOS photo library. Recent iOS versions store photos in the HEIF/HEIC format, which has its own lossy compression.
The two file formats will yield different file sizes.
When you load an image from the user’s image library and convert it to JPEG data, it gets re-compressed using JPEG compression with the image quality value you specify. A compressionQuality value of 1.0 will create an image file with the best image quality but the largest file size. That is why your results are bigger than the files from the user’s image library. Try a lower compressionQuality value.
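If what you actually need is the byte count of the file as it sits in the library (rather than of a re-encoded JPEG), one option is to request the original data through the Photos framework. A hedged sketch, assuming you can obtain a PHAsset for the picked photo (requestImageDataAndOrientation is iOS 13+):

import Photos

// Fetch the original photo bytes so the count matches the file in the library,
// instead of re-encoding with jpegData(compressionQuality:).
func printOriginalSize(of asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true // allow fetching from iCloud if needed
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data = data else { return }
        print(String(format: "library file size: %.2f MB", Double(data.count) / 1024.0 / 1024.0))
    }
}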
I'm working on an Instagram-like app on iOS, and I'm wondering how to optimize the file size of each picture so that users consume as little data as possible when fetching all those pictures from my backend. Downloading the files as-is will drain data, since high-resolution pictures are around 1.5 MB each. Is there a way to shrink the size of a picture while maintaining its quality as much as possible?
You can store the image as binary data: base64-encoded inside JSON, as plain binary data, or, more elegantly, as binary data in a Swift Realm database.
Do you really want to implement image handling on your own? There are libraries already available for this that are more effective and powerful, such as the AFNetworking API.
It can not only compress images to match the current size of a UIImageView for the device's resolution, but also gives you image-caching flexibility.
Here is the link to the pod file and GitHub.
Just try it; you will love it.
You can compress a UIImage by converting it into NSData:
UIImage *rainyImage = [UIImage imageNamed:@"rainy.jpg"];
NSData *imgData = UIImageJPEGRepresentation(rainyImage, 0.1); // second parameter is the compression quality
Now use this NSData object to save the image, or convert it back into a UIImage:
UIImage *image = [UIImage imageWithData:imgData];
Hope this will resolve your issue.
Keep in mind that JPEG compression is lossy: the image quality degrades as you compress.
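Along the same lines, resizing before encoding usually saves far more bytes than lowering compressionQuality alone. A rough Swift sketch (the maxDimension default and function name are our own):

import UIKit

// Downscale first, then JPEG-encode.
func downscaledJPEGData(from image: UIImage, maxDimension: CGFloat = 1080) -> Data? {
    let longest = max(image.size.width, image.size.height)
    let scale = min(1, maxDimension / longest)
    let newSize = CGSize(width: image.size.width * scale,
                         height: image.size.height * scale)
    let renderer = UIGraphicsImageRenderer(size: newSize)
    let resized = renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
    return resized.jpegData(compressionQuality: 0.7)
}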
I am working on an iOS application that does some image processing.
The result of the processing is a grey-scale image.
When the process is finished, I want to save the original RGB image together with the result in the same image file to the camera roll, so I thought of using the alpha channel for that.
Also, I want to attach some parameters obtained during the processing as image metadata.
So here comes my problem: I could not find an iOS-compatible image format that allows saving an alpha channel together with metadata.
On the one hand, JPEG images accept metadata, but not alpha channel.
On the other hand, PNG images accept alpha channel, but not metadata.
Any ideas?
Thanks in advance.
"On the other hand, PNG images accept alpha channel, but not metadata."
Actually, PNG does support metadata. You can attach it when writing the file, for example through ImageIO's CGImageDestination with a PNG properties dictionary.
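For example, here is a minimal sketch of writing a PNG with a textual metadata field through ImageIO (the function name and the description field are illustrative):

import UIKit
import ImageIO

// Write the image as a PNG and attach a textual metadata field.
func writePNG(_ image: UIImage, description: String, to url: URL) -> Bool {
    guard let cgImage = image.cgImage,
          let destination = CGImageDestinationCreateWithURL(url as CFURL, "public.png" as CFString, 1, nil)
    else { return false }
    let properties: [CFString: Any] = [
        kCGImagePropertyPNGDictionary: [kCGImagePropertyPNGDescription: description]
    ]
    CGImageDestinationAddImage(destination, cgImage, properties as CFDictionary)
    return CGImageDestinationFinalize(destination)
}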
We know an image can be compressed with the method UIImageJPEGRepresentation(), as in the following code:
NSData *imgData = UIImageJPEGRepresentation(imageResized, 0.5);
NSLog(#"imgData.length :%d",imgData.length);
imageResized = [UIImage imageWithData:imgData];
NSData *imgData2 = UIImageJPEGRepresentation(imageResized, 1);
NSLog(#"imgData2.length :%d",imgData2.length);
The log is:
2013-02-25 00:33:14.756 MyApp[1119:440b] imgData.length :371155
2013-02-25 00:33:20.988 MyApp[1119:440b] imgData2.length :1308415
What I'm confused about is why the lengths of imgData and imgData2 are different. In my app, the image should be uploaded to the server. Should I upload the NSData to the server to save storage? Is it possible for an Android phone to download the NSData and convert it to an image? Any help will be appreciated!
You start with a UIImage of some size (say 1024x768). This takes 1024x768x4 bytes in memory. Then you compress it with a quality of 0.5 and get 371,155 bytes.
You then create a new UIImage with the compressed data. This is still a 1024x768 (or whatever) UIImage, so it takes the same amount of memory (1024x768x4) as the original image. You then convert it to a new JPG with less compression, giving you 1,308,415 bytes.
Even though you create the second JPG from the compressed image, its byte count comes from converting the full-sized UIImage. The second, less-compressed image, though bigger, will still have the same lower quality as the compressed image.
Since your data represents a JPG, anything that downloads the data will be able to treat the data as a JPG, including an Android phone.
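To make the size arithmetic concrete, here is the same round trip sketched in Swift (assuming a 4-bytes-per-pixel bitmap):

import UIKit

// Compare the in-memory bitmap size with the two JPEG sizes.
func inspect(_ image: UIImage) {
    let pixelWidth = Int(image.size.width * image.scale)
    let pixelHeight = Int(image.size.height * image.scale)
    print("in memory: ~\(pixelWidth * pixelHeight * 4) bytes")

    guard let half = image.jpegData(compressionQuality: 0.5),
          let reloaded = UIImage(data: half),
          let full = reloaded.jpegData(compressionQuality: 1.0) else { return }
    // full is larger than half, but cannot recover the quality lost at 0.5.
    print("0.5 quality: \(half.count) bytes, re-encoded at 1.0: \(full.count) bytes")
}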
The number of bytes is bigger for the second image because you passed a much higher compression quality value to UIImageJPEGRepresentation. Higher quality takes more bytes.
The file once uploaded to a server will be a standard JPEG file, viewable by any device, including Android.
I need to send a UIImage over the Internet, but I've been having some issues so far.
I currently take the UIImage, shrink it to a 16th of its original dimensions, convert it to NSData, and then create an NSString using NSData+base64. I then transmit this as part of a JSON request to a server.
Most of the time, this works perfectly.
However, the file sizes are large. On a cellular connection, an image can take ~10 seconds to load.
How do I transmit a UIImage over the network so that it's small and retains quality? Even when I reduce it to a 16th of the previous number of pixels, a photograph can be 1.5 MB. I'd like to get it to around 300 KB.
Compressing it with zip or gzip only seems to yield negligible (<0.1 MB) savings.
Thanks,
Robert
I'd suggest storing it as a JPEG or a PNG with some compression. Use a method like
UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality)
then post the NSData object you get from that. You should be able to find a good size/quality tradeoff by playing with the compressionQuality variable.
You didn't say how you are converting your images to data. You mention “photograph”. If your images are photographs, you should use UIImageJPEGRepresentation, not UIImagePNGRepresentation. You will have to play with the compressionQuality parameter to find a compression level that balances data size and image quality.
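As a concrete illustration of that tradeoff, here is a hedged sketch that steps the JPEG quality down until the data fits a target size (the 300 KB target and the function name are our own, using the modern jpegData(compressionQuality:) API):

import UIKit

// Re-encode as JPEG, trying progressively lower qualities until the data fits.
func jpegData(for image: UIImage, targetBytes: Int = 300 * 1024) -> Data? {
    for quality: CGFloat in [0.8, 0.6, 0.4, 0.2, 0.1] {
        if let data = image.jpegData(compressionQuality: quality),
           data.count <= targetBytes {
            return data
        }
    }
    // Nothing fit; return the most aggressive attempt.
    return image.jpegData(compressionQuality: 0.1)
}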