Data to UIImage to UIImageJPEGRepresentation equality failure - ios

Why does this transformation fail to result in the same image data?
let path = Bundle(for: type(of: self)).url(forResource: "Image", withExtension: "jpg")
inputData = try! Data(contentsOf: path!)
let testImage = UIImage(data: inputData)
let testImageData = UIImageJPEGRepresentation(testImage!, 1.0)
expect(testImageData).to(equal(inputData))
From what I understand, UIImageJPEGRepresentation and UIImagePNGRepresentation can strip the image of metadata. Is that the reason?

There is no particular reason why two JPEG files showing the same image would be identical. JPEG files have lots of header info, different compression algorithms, etc. And even if both files have a compression level of 1 (do they?) they are both lossy, so something will differ every time you expand and recompress. Your expectations here are just wrong. But then it also sounds like you’re trying to test something that does not need testing in the first place.

I was facing the same issue, and was able to solve it by using UIImagePNGRepresentation to convert each UIImage to Data, and then comparing to see if both Data values were equal.
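If the goal is only to verify that the loaded data matches the file on disk, the round trip through UIImage can be skipped entirely and the raw bytes compared. A minimal Foundation-only sketch (the temporary file and its contents are stand-ins for illustration):

```swift
import Foundation

// Write some known bytes to a temporary file, then read them back twice.
let fileURL = FileManager.default.temporaryDirectory.appendingPathComponent("Image.jpg")
let originalBytes = Data([0xFF, 0xD8, 0xFF, 0xE0]) // JPEG magic bytes, as a stand-in
try! originalBytes.write(to: fileURL)

let inputData = try! Data(contentsOf: fileURL)
let reloaded = try! Data(contentsOf: fileURL)

// Raw bytes compare equal; a decode/re-encode cycle through UIImage would not.
assert(inputData == reloaded)
```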

Related

Save Image (SwiftUI) to Realm database

I am using SwiftUI and Realm.
I want to save an Image to the database. The image is a photo taken from the device's camera and is not stored in the gallery.
It's my understanding that I need to convert the image to NSData, but I can only find syntax that works with UIImage.
I tried
let uiImage: UIImage = image.asUIImage()
but get this error:
Value of type 'Image?' has no member 'asUIImage'
What am I doing wrong here, how can I have an Image to Realm local database?
EDIT: Thanks for the suggested "possible duplicate", but no: the duplicate (which I have already seen prior to making my post) is for UIImage (UIKit). I am using Image (SwiftUI).
The struct Image is only a building block for the UI in SwiftUI; it is not an object that represents the literal image, but rather something that displays one.
The common approach is to look at how you create the Image - where you get the actual image from - and to use that source, the image itself, to save it.
Just as a side note, I wanted to mention that storing data blobs in a Realm database can be extremely slow; the more common and faster approach is to write files to disk and store only the names of those files in the database.
Elaborating on this approach, you can:
create a folder to store your images in Library path in user domain mask
You can read about the iOS sandbox file system and where you can store things in Apple's File System Programming Guide.
For our purposes, this method will suffice:
let imagesFolderUrl = try! FileManager.default
    .url(for: .applicationSupportDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
    .appendingPathComponent("images_database")
You should check if this directory exists and create it if it doesn't - there's plenty of information about this online.
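A minimal sketch of that existence check, using only FileManager (the folder URL here uses the temporary directory so the snippet runs anywhere; in the app you would use the Application Support URL from above):

```swift
import Foundation

// Hypothetical images folder, standing in for imagesFolderUrl above.
let imagesFolderUrl = FileManager.default.temporaryDirectory
    .appendingPathComponent("images_database")

// Create the directory only if it is not already there.
if !FileManager.default.fileExists(atPath: imagesFolderUrl.path) {
    try! FileManager.default.createDirectory(at: imagesFolderUrl,
                                             withIntermediateDirectories: true)
}
```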
Then, when you have image Data, you give it a name, create a URL that points to where it will be stored, and write it:
let imageData: Data // the image data you obtained earlier
let imageName = "some new name for this particular image - maybe image id or something"
let imageUrl = imagesFolderUrl.appendingPathComponent(imageName)
try imageData.write(to: imageUrl) // Slow operation that you should perform in the background, not on the UI thread
Then, you store the name of the image in the Realm database
Then, when you pull a record from the Realm database and see the name of the image, you can construct the URL again and read the Data back:
let record: RealmRecord // fetched from Realm
let imageName = record.imageName
let url = imagesFolderUrl.appendingPathComponent(imageName)
let data = try Data(contentsOf: url)
That's overly simplifying it.
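Putting the two snippets together, a self-contained sketch of the write-then-read round trip (the bytes, name, and folder are made up; in the real app the imageName would be persisted to Realm between the two steps):

```swift
import Foundation

let folder = FileManager.default.temporaryDirectory.appendingPathComponent("images_database")
try! FileManager.default.createDirectory(at: folder, withIntermediateDirectories: true)

// Save: image bytes + chosen name -> file on disk.
let imageData = Data([0x89, 0x50, 0x4E, 0x47]) // stand-in image bytes
let imageName = "profile-42.png"                // e.g. derived from a record id
try! imageData.write(to: folder.appendingPathComponent(imageName))

// Load later: the stored name -> bytes, by rebuilding the URL the same way.
let loaded = try! Data(contentsOf: folder.appendingPathComponent(imageName))
assert(loaded == imageData)
```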

What is the fastest way to convert an imageURL from Firebase into a UIImage?

In my iOS app I need to take an imageURL string and convert it into a UIImage.
I wrote the below function to handle this:
func getImage(urlString: String) -> UIImage {
    let url = URL(string: urlString)!
    do {
        let data = try Data(contentsOf: url)
        let image = UIImage(data: data)!
        return image
    } catch {
        print(error, " This was the error in p2")
    }
    return UIImage(named: "media")!
}
The issue is that this takes too long. I believe it's a solid second or longer for this to complete.
I need that time to be significantly shorter.
Question: Is there a faster way to get the UIImage based on an imageURL from Firebase? (maybe a cocoa-pod? or better way to write the code?)
Additional questions:
Would this be any faster if the image in Firebase were of lower quality?
Would it be a viable solution to lower the quality of the image right before being passed into this function?
A lot of the prominent iOS apps (and web and mobile apps in general) that do a lot of image downloading take advantage of progressive JPEG. This way your user will see at least something while the image loads, and over time the image will get progressively better. As a lot of commenters have mentioned, you're not in control of Firebase the way you would be with your own backend server delivering the pictures, where you could do performance optimizations. Therefore one of the best things you can do is implement progressive JPEG in your app.
The first link is a library that will allow you to use progressive JPEG in your iOS app. The second link is a detailed approach used at Facebook for faster loading of images.
https://www.airpair.com/ios/posts/loading-images-ios-faster-with-progressive-jpegs
https://code.fb.com/ios/faster-photos-in-facebook-for-ios/
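Independent of progressive JPEG, the synchronous Data(contentsOf:) call in the question blocks whatever thread it runs on. Moving the load to a background queue with a completion handler at least keeps the UI responsive; a minimal sketch (the callback-based shape is an assumption about how the caller wants the result, not the only option):

```swift
import Foundation
import Dispatch

// Load data from a URL off the calling thread and deliver the result in a callback.
// `nil` is passed to the callback if the load fails.
func loadData(from url: URL, completion: @escaping (Data?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let data = try? Data(contentsOf: url)
        completion(data)
    }
}
```

The caller would then hop back to the main queue before assigning the decoded image to a view.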

iOS Swift base64encoding different than PAW app Base64 Encoding

Hoping to get some help with this. I am trying to post an image to a server, which requires Base64 encoding of a PNG file. When I use the PAW app and encode the image, everything shows up on the server beautifully. When I attempt the same in iOS with Swift 4, the string produced is similar but has differences, and thus yields an incorrect image. Any idea how to match in iOS the string that is correctly created in the PAW app? I have included code below along with screenshots of the strings (small samples) created.
Thanks!
let image : UIImage = UIImage(named:"STG.png")!
let imageData = UIImagePNGRepresentation(image)
var base64String = imageData?.base64EncodedString(options: [])
You are not comparing the same data at all. Loading the png into a UIImage and then converting the UIImage into a new png representation does not result in the same set of bytes at all.
You need to directly load the png file into a Data instance without doing any conversion.
let imageURL = Bundle.main.url(forResource: "STG", withExtension: "png")!
let imageData = try! Data(contentsOf: imageURL)
var base64String = imageData.base64EncodedString(options: [])
You might also need to try different options in the call to base64EncodedString.
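Those options do change the output: the line-breaking variants insert line-ending characters into the encoded string, so two encodings of identical bytes can still differ textually. A small Foundation-only illustration:

```swift
import Foundation

let bytes = Data(repeating: 0xAB, count: 100)

let plain = bytes.base64EncodedString()                                   // no line breaks
let wrapped = bytes.base64EncodedString(options: [.lineLength64Characters]) // wrapped at 64 chars

// Same underlying bytes, different strings...
assert(plain != wrapped)
// ...but both decode back to the original data.
assert(Data(base64Encoded: wrapped, options: [.ignoreUnknownCharacters])! == bytes)
```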
Ok, went down several rabbit holes and found some folks with similar issues. I eventually solved it by switching over to Alamofire instead of using the native URLSession. I also found out the server I was posting to allowed multipart/form-data, which I ended up using for my request. I was unable to get that to work with the native URLSession either.

Set a UIImageView's Image based on Binary Data

I have an image I'm trying to load from a web service. I've tested the data I got from the web service by adding data:image/gif;base64, before it and entering it as a URL in Chrome and the image loaded perfectly.
In my iOS app, I tried profilePic.image = UIImage.init(data: picData.data(using: .utf8)!) where picData is a string with the contents I tested above, however nothing loaded.
I get the feeling what I did wrong is somewhere in picData.data(using: .utf8)!) but I'm not sure. Any suggestions?
In case it helps, here's the binary data I'm working with: https://pastebin.com/xiWHaPB6
UTF-8 is an encoding for Unicode strings, not arbitrary 8-bit data!
If the MIME type you've shown is accurate, the data is a Base64-encoded string. The first thing you want to do is convert that string to its unencoded, binary form. Then try creating your UIImage from that:
let unencodedData = Data(base64Encoded: picData)!
let image = UIImage(data: unencodedData)
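Since the MIME type is image/gif, a quick sanity check after decoding is to look for the GIF signature at the start of the bytes; if it isn't there, the string wasn't plain Base64 (for example, it may still carry a data:image/gif;base64, prefix that has to be stripped first). A sketch with made-up data standing in for the service response:

```swift
import Foundation

// Stand-in for the Base64 string received from the web service.
let picData = Data("GIF89a...".utf8).base64EncodedString()

let unencodedData = Data(base64Encoded: picData)!

// A real GIF payload starts with "GIF87a" or "GIF89a".
let signature = String(decoding: unencodedData.prefix(6), as: UTF8.self)
assert(signature == "GIF89a")
```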

Storing images to CoreData - Swift

In my code I managed to save a textLabel with CoreData but I can't seem to properly save the image. I've read some tutorials and I know I have to convert it to NSData. But how do I do it?
Thanks in advance!
You shouldn't save large data inside Core Data; an Apple engineer told me this little trick at the last WWDC:
You can use the property "Allows external storage":
By doing this, as far as I know, your image will be stored somewhere in the file system, and inside Core Data only a link to your picture in the file system will be saved. Every time you ask for the picture, Core Data will automatically retrieve the image from the file system.
To save the image as NSData you can do:
let image = UIImage(named: "YourImage")!
let imageData = UIImageJPEGRepresentation(image, 1.0)
managedObject?.setValue(imageData, forKey: "YourKey")
Remember that the 1.0 in UIImageJPEGRepresentation means that you're using the best quality, so the image will be large and heavy:
The quality of the resulting JPEG image, expressed as a value from 0.0
to 1.0. The value 0.0 represents the maximum compression (or lowest
quality) while the value 1.0 represents the least compression (or best
quality).
Core Data isn't meant to save big binary files like images. Use Document Directory in file system instead.
Here is sample code to achieve that.
let documentsDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first!
// self.fileName is whatever filename you need to append to the base directory here.
let path = (documentsDirectory as NSString).appendingPathComponent(self.fileName)
do {
    try data.write(to: URL(fileURLWithPath: path), options: .atomic)
} catch {
    // handle error
}
It is recommended to save the filename to Core Data along with other metadata associated with that image, and to retrieve the image from the file system every time you need it.
Edit: also note that from iOS 8 onwards, persisting the full file URL is invalid, since the sandboxed app ID is dynamically generated. You will need to obtain documentsDirectory dynamically as needed.
Here you go for JPEG; for PNG you just use UIImagePNGRepresentation:
let image = UIImage(named: "YourImage")!
let imageData = UIImageJPEGRepresentation(image, 1.0)
managedObject?.setValue(imageData, forKey: "YourKey")
Generally large data objects are not stored in a database or Core Data. Instead save the images in the Document directory (or a sub-directory) and save the file name in Core Data.
See the answer by @Valentin on how to create a data representation of an image.
Save it with func write(to url: URL, options: Data.WritingOptions) throws (the modern replacement for writeToFile(_:atomically:)).
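A sketch of the save-to-Documents, store-only-the-filename pattern in current Swift (the filename is hypothetical, and the temporary directory stands in for .documentDirectory so the snippet can run anywhere):

```swift
import Foundation

let fileName = "avatar-123.jpg" // this string is what goes into Core Data
let baseDirectory = FileManager.default.temporaryDirectory

// Save: build the full URL from the base directory + chosen filename.
let data = Data([0xFF, 0xD8, 0xFF, 0xE0]) // stand-in image bytes
try! data.write(to: baseDirectory.appendingPathComponent(fileName), options: .atomic)

// Load later: reconstruct the URL the same way (never persist the full path).
let loaded = try! Data(contentsOf: baseDirectory.appendingPathComponent(fileName))
assert(loaded == data)
```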