iOS Swift Base64 encoding different from PAW app Base64 encoding

Hoping to get some help with this. I am trying to post an image to a server, which requires Base64 encoding of a PNG file. When I use the PAW app to encode the image, everything shows up on the server beautifully. When I attempt the same with iOS Swift 4, the string produced is similar but has differences, and thus yields an incorrect image. Any idea how to match the string that the PAW app correctly creates? I have included my code below along with small samples of the strings that were produced.
Thanks!
let image: UIImage = UIImage(named: "STG.png")!
let imageData = UIImagePNGRepresentation(image)  // re-encodes the decoded UIImage as a new PNG
var base64String = imageData?.base64EncodedString(options: [])

You are not comparing the same data at all. Loading the PNG into a UIImage and then converting the UIImage into a new PNG representation does not produce the same set of bytes.
You need to load the PNG file directly into a Data instance without doing any conversion.
let imageURL = Bundle.main.url(forResource: "STG", withExtension: "png")!
let imageData = try! Data(contentsOf: imageURL)  // the raw file bytes, no re-encoding
let base64String = imageData.base64EncodedString(options: [])
You might also need to try different options in the call to base64EncodedString.
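For instance, some servers expect line-wrapped Base64; the encoding options can produce MIME-style wrapping (a sketch, assuming the same `imageData` as above; check what your server actually requires):

```swift
// Wrap output at 76 characters with CRLF line endings, per MIME conventions
let wrapped = imageData.base64EncodedString(
    options: [.lineLength76Characters, .endLineWithCarriageReturn, .endLineWithLineFeed]
)
```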

OK, I went down several rabbit holes and found some folks with issues similar to this. I eventually solved it by switching over to Alamofire instead of using the native URLSession. I also found out the server I was posting to would accept multipart/form-data as well, which I ended up using for my request. I was unable to get that to work using the native URLSession either.
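For anyone following the same route, the multipart upload can be sketched with Alamofire 5 roughly like this (the endpoint URL and the "file" part name are placeholders; substitute whatever your server expects):

```swift
import Alamofire
import Foundation

// Sketch of a multipart/form-data upload of a bundled PNG with Alamofire 5.
func uploadPNG(named name: String) {
    guard let url = Bundle.main.url(forResource: name, withExtension: "png"),
          let data = try? Data(contentsOf: url) else { return }

    AF.upload(multipartFormData: { form in
        // Send the raw file bytes; no Base64 step is needed with multipart
        form.append(data, withName: "file", fileName: "\(name).png", mimeType: "image/png")
    }, to: "https://example.com/upload")
    .response { response in
        debugPrint(response)
    }
}
```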

Related

iOS, Swift, Image Metadata, XMP, DJI Drones

I'm writing an iOS Swift app to fetch metadata from DJI drone images. I'm trying to access the Xmp.drone-dji.X metadata. The iOS/Swift CGImageSource and CGImageMetadata libraries/classes retrieve almost all of the metadata from the image, but not the Xmp.drone-dji tags. When I get a list of tags, those tags/values are not listed. I know the tags/data are in the images because I've examined the images using exif, exiv2, etc.
Any suggestions?
Here is the code I'm using so far:
result.itemProvider.loadDataRepresentation(forTypeIdentifier: UTType.image.identifier) { data, err in
    if let data = data {
        let src = CGImageSourceCreateWithData(data as CFData, nil)!
        let md = CGImageSourceCopyPropertiesAtIndex(src, 0, nil) as! NSDictionary
        let md2 = CGImageSourceCopyMetadataAtIndex(src, 0, nil)
    }
}
Thanks,
Bobby
So, after a lot of searching and trial and error, I have found an answer.
I was not able to get any of the CGImage Swift libraries to extract this info for me.
Adobe has a C++ library that parses XMP/XML data out of images, and it purports to support iOS. I didn't want the hassle of building C++ for iOS, importing that into Xcode, and then dealing with the fact that thrown errors do not propagate well from C++/Objective-C to Swift.
So, at a high level, I did the following:
- Get the bytes of the raw image as CFData or Data, then convert them to a String.
- Use String.range() to find the beginning of the XML/XMP data in the image, searching for the substring <?xpacket begin.
- Use String.range() to find the end of the XML/XMP data, using the substring <?xpacket end.*?>.
- Extract the XML document out of the image-data String.
- Use the Swift XMLParser class to parse the XML, copying attributes and elements as necessary. I simply added what I wanted to the existing Exif NSDictionary returned by the CGImage classes.
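The marker-search steps above can be sketched roughly like this (a sketch, not the exact OpenAthena code; the .isoLatin1 conversion is an assumption that makes every byte representable so the markers can be found):

```swift
import Foundation

// Extract the raw XMP packet from image bytes by searching for the
// <?xpacket begin ... <?xpacket end ...?> markers described above.
func extractXMP(from imageData: Data) -> String? {
    // ISO Latin-1 maps every byte to a character, so this conversion cannot fail
    guard let text = String(data: imageData, encoding: .isoLatin1) else { return nil }
    guard let start = text.range(of: "<?xpacket begin") else { return nil }
    guard let end = text.range(of: "<?xpacket end", range: start.upperBound..<text.endIndex) else { return nil }
    // Include everything through the closing ?> of the end marker
    guard let close = text.range(of: "?>", range: end.upperBound..<text.endIndex) else { return nil }
    return String(text[start.lowerBound..<close.upperBound])
}
```

The resulting string can then be handed to XMLParser as in the description above.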
Happy to answer questions about this approach. My code will eventually be uploaded to GitHub under the OpenAthenaIOS project.
Bobby

What is the fastest way to convert an imageURL from Firebase into a UIImage?

In my iOS app I need to take an imageURL string and convert it into a UIImage.
I wrote the below function to handle this:
func getImage(urlString: String) -> UIImage {
    let url = URL(string: urlString)!
    do {
        // Data(contentsOf:) is synchronous and blocks the calling thread until the download finishes
        let data = try Data(contentsOf: url)
        let image = UIImage(data: data)!
        return image
    } catch {
        print(error, " This was the error in p2")
    }
    return UIImage(named: "media")!
}
The issue is that this takes too long. I believe it's a solid second or longer for this to complete.
I need that time to be significantly shorter.
Question: Is there a faster way to get the UIImage based on an imageURL from Firebase? (maybe a cocoa-pod? or better way to write the code?)
Additional questions:
Would this be any faster if the image in Firebase were of lower quality?
Would it be a viable solution to lower the quality of the image right before being passed into this function?
A lot of the prominent iOS apps (and web and mobile apps in general) that download many images take advantage of progressive JPEG. This way your user will see at least something while the image loads, and over time the image will get progressively better. As a lot of commenters have mentioned, you're not in control of Firebase the way you would be with your own backend server delivering the pictures, where you could make performance optimizations. Therefore one of the best things you can do is implement progressive JPEG in your app.
The first link is a library that will allow you to use progressive JPEG in your iOS app. The second link is a detailed account of the approach used at Facebook for faster loading of images.
https://www.airpair.com/ios/posts/loading-images-ios-faster-with-progressive-jpegs
https://code.fb.com/ios/faster-photos-in-facebook-for-ios/
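Separately from progressive JPEG, note that the Data(contentsOf:) call in the question blocks the calling thread, which accounts for much of the perceived delay. A minimal non-blocking sketch using plain URLSession (keeping the "media" placeholder image from the question):

```swift
import UIKit

// Download an image off the calling thread and deliver it via a completion handler.
func getImage(urlString: String, completion: @escaping (UIImage) -> Void) {
    guard let url = URL(string: urlString) else {
        completion(UIImage(named: "media")!)
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let image = data.flatMap(UIImage.init(data:)) ?? UIImage(named: "media")!
        DispatchQueue.main.async {
            completion(image)  // hand the result back on the main thread for UI use
        }
    }.resume()
}
```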

Data to UIImage to UIImageJPEGRepresentation equality failure

Why does this transformation fail to produce the same image data?
let path = Bundle(for: type(of: self)).url(forResource: "Image", withExtension: "jpg")
inputData = try! Data(contentsOf: path!)
let testImage = UIImage(data: inputData)
let testImageData = UIImageJPEGRepresentation(testImage!, 1.0)
expect(testImageData).to(equal(inputData))
From what I understand UIImageJPEGRepresentation and UIImagePNGRepresentation can strip the image of meta data. Is that the reason?
There is no particular reason why two JPEG files showing the same image would be identical. JPEG files contain lots of header info, varying compression algorithms, etc. And even if both files were compressed at quality 1.0 (were they?), JPEG is lossy, so something will differ every time you expand and recompress. Your expectations here are just wrong. But then it also sounds like you're trying to test something that does not need testing in the first place.
I was facing the same issue, and was able to solve it by using UIImagePNGRepresentation to convert both UIImages to Data, then comparing whether the two Data values were equal.
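That comparison approach can be sketched like this (a sketch; it compares the re-encoded pixel data of the two images rather than the original file bytes, so it sidesteps the header and metadata differences described above):

```swift
import UIKit

// Compare two images by their PNG-encoded pixel data instead of raw file bytes.
// PNG encoding is lossless, so identical pixel content should encode identically.
func pixelDataMatches(_ a: UIImage, _ b: UIImage) -> Bool {
    guard let dataA = UIImagePNGRepresentation(a),
          let dataB = UIImagePNGRepresentation(b) else { return false }
    return dataA == dataB
}
```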

Decompress Base64 Encoded Image Bytes in Swift

I am developing an iOS Application in Swift for use by customers of the company for which I work. In addition to the iOS application, I am also making the backend API with ASP.Net Core. The method I have chosen to use for sending images is a compressed base64 encoded string representation of the image in a JSON array. This is to save on bandwidth and to make it easier to send multiple images.
On the API side, I am using the .NET Class System.IO.Compression.DeflateStream to compress the image bytes in memory before encoding them to a base64 string and sending them to the iOS application.
On the iOS side, I am a little confused as to what the process would be for decoding the string and decompressing the data to create a UIImage object for display in a UIImageView.
This all works without compression, but I wanted to try this to save on bandwidth. However, it's very possible that this is not the optimal solution, and I am open to change. Below is the snippet of code I am using to convert the base64 string to a Data object, then creating a UIImage object from that.
static func imageComplete(data json: JSON) -> [UIImage] {
    var images: [UIImage] = []
    for imageString in json.arrayValue {
        // note: while the server applies Deflate compression, UIImage(data:) will
        // fail here until the bytes are decompressed first
        if let compressedImageData = Data(base64Encoded: imageString.stringValue),
           let image = UIImage(data: compressedImageData) {
            images.append(image)
        }
    }
    return images
}
TL;DR
In Swift, I want to decompress image bytes encoded as a Base64 string returned from an ASP.Net Core WebAPI.
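One option on the iOS side (a sketch, not tested against your API) is the NSData decompression API available on iOS 13 and later; Apple's .zlib algorithm is raw DEFLATE (RFC 1951) without the zlib header, which is the same format .NET's DeflateStream emits:

```swift
import Foundation
import UIKit

// Decode Base64, then inflate the raw-DEFLATE bytes before handing them to UIImage.
func imageFromCompressedBase64(_ string: String) -> UIImage? {
    guard let compressed = Data(base64Encoded: string) else { return nil }
    guard let decompressed = try? (compressed as NSData).decompressed(using: .zlib) else { return nil }
    return UIImage(data: decompressed as Data)
}
```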

Set a UIImageView's Image based on Binary Data

I have an image I'm trying to load from a web service. I've tested the data I got from the web service by adding data:image/gif;base64, before it and entering it as a URL in Chrome and the image loaded perfectly.
In my iOS app, I tried profilePic.image = UIImage.init(data: picData.data(using: .utf8)!) where picData is a string with the contents I tested above, however nothing loaded.
I get the feeling what I did wrong is somewhere in picData.data(using: .utf8)!) but I'm not sure. Any suggestions?
In case it helps, here's the binary data I'm working with: https://pastebin.com/xiWHaPB6
UTF-8 is an encoding for Unicode strings, not arbitrary 8-bit data!
If the MIME type you've shown is accurate, the data is a Base64-encoded string. The first thing you want to do is convert that string to its unencoded, binary form. Then try creating your UIImage from that:
if let unencodedData = Data(base64Encoded: picData) {
    profilePic.image = UIImage(data: unencodedData)
}
