Decompress Base64 Encoded Image Bytes in Swift

I am developing an iOS Application in Swift for use by customers of the company for which I work. In addition to the iOS application, I am also making the backend API with ASP.Net Core. The method I have chosen to use for sending images is a compressed base64 encoded string representation of the image in a JSON array. This is to save on bandwidth and to make it easier to send multiple images.
On the API side, I am using the .NET Class System.IO.Compression.DeflateStream to compress the image bytes in memory before encoding them to a base64 string and sending them to the iOS application.
On the iOS side, I am a little confused as to what the process would be for decoding the string and decompressing the data to create a UIImage object for display in a UIImageView.
This all works without compression, but I wanted to try this to save on bandwidth. However, it's very possible that this is not the optimal solution, and I am open to change. Below is the snippet of code I am using to convert the Base64 string to a Data object and then create a UIImage object from it.
static func imageComplete(data json: JSON) -> [UIImage] {
    var images: [UIImage] = []
    for imageString in json.arrayValue {
        if let compressedImageData = Data(base64Encoded: imageString.stringValue),
           let image = UIImage(data: compressedImageData) {
            images.append(image)
        }
    }
    return images
}
TL;DR
In Swift, I want to decompress image bytes encoded as a Base64 string returned from an ASP.Net Core WebAPI.
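One way to close the loop: .NET's DeflateStream emits raw DEFLATE by default, and Apple's NSData.CompressionAlgorithm.zlib is raw DEFLATE as well, so on iOS 13+ the built-in decompressed(using:) should pair with it directly. Below is a minimal sketch of the snippet above with the decompression step added (SwiftyJSON's JSON type assumed, as in the original); treat it as a starting point, not a verified fix.

import UIKit

// Sketch only: assumes iOS 13+ and raw DEFLATE from the server
// (DeflateStream's default). Apple's .zlib algorithm is raw DEFLATE too.
static func imageComplete(data json: JSON) -> [UIImage] {
    var images: [UIImage] = []
    for imageString in json.arrayValue {
        guard let compressed = Data(base64Encoded: imageString.stringValue),
              let decompressed = try? (compressed as NSData).decompressed(using: .zlib),
              let image = UIImage(data: decompressed as Data) else {
            continue
        }
        images.append(image)
    }
    return images
}

If you need to support iOS versions before 13, the Compression framework's lower-level compression_decode_buffer API (available since iOS 9) is the usual fallback.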

Related

iOS, Swift, Image Metadata, XMP, DJI Drones

I'm writing an iOS Swift app to fetch metadata from DJI drone images. I'm trying to access the Xmp.drone-dji.X metadata. The iOS/Swift CGImageSource and CGImageMetadata libraries/classes get almost all of the metadata out of the image but not the Xmp.drone-dji. When I get a list of tags, those tag/values are not listed. I know the tags/data are in the images because I've examined the images using exif, exiv2, etc.
Any suggestions?
Here is the code I'm using so far:
result.itemProvider.loadDataRepresentation(forTypeIdentifier: UTType.image.identifier) { data, err in
    if let data = data {
        let src = CGImageSourceCreateWithData(data as CFData, nil)!
        let md = CGImageSourceCopyPropertiesAtIndex(src, 0, nil) as! NSDictionary
        let md2 = CGImageSourceCopyMetadataAtIndex(src, 0, nil)
        // md and md2 hold everything CGImageSource returns here,
        // but the Xmp.drone-dji.* tags are not among them.
    }
}
Thanks,
Bobby
So, after a lot of searching and trial and error, I have found an answer.
I was not able to get any of the CGImage Swift libraries to extract this info for me.
Adobe has a C++ library that parses XMP/XML data out of images, and it purports to support iOS. I didn't want the hassle of building C++ on iOS, importing that into Xcode, and then dealing with the fact that thrown errors do not propagate well from C++/Objective-C to Swift.
So, at a high level, I did the following (a sketch of the extraction steps follows below):
1. Get the bytes of the raw image as CFData or Data, then cast them to a String.
2. Use String.range() to find the beginning of the XML/XMP data in the image, searching for the substring <?xpacket begin.
3. Use String.range() to find the end of the XML/XMP data, using the substring <?xpacket end.*?>.
4. Extract the XML document out of the image data String.
5. Use the Swift XMLParser class to parse the XML, copying attributes and elements as necessary. I simply added what I wanted to the already existing Exif NSDictionary returned by the CGImage classes.
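Here is a rough sketch of steps 1 through 4, assuming the image bytes can be decoded as ISO Latin-1 (a lossless byte-to-character mapping, so the ASCII markers survive); the helper name is illustrative, not from the original code:

import Foundation

// Hypothetical helper for steps 1-4: pull the XMP packet out of the raw
// image bytes by locating the xpacket markers.
func extractXMPPacket(from imageData: Data) -> String? {
    // ISO Latin-1 maps every byte value to a character, so decoding does
    // not fail on binary data and the ASCII markers are preserved.
    guard let raw = String(data: imageData, encoding: .isoLatin1),
          let start = raw.range(of: "<?xpacket begin"),
          let end = raw.range(of: "<?xpacket end.*?>", options: .regularExpression)
    else { return nil }
    return String(raw[start.lowerBound..<end.upperBound])
}

The returned XML string can then be handed to XMLParser for step 5.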
Happy to answer questions on this approach. My code will eventually be uploaded to GitHub under the OpenAthenaIOS project.
Bobby

iOS Swift base64encoding different than PAW app Base64 Encoding

Hoping to get some help with this. I am trying to post an image to a server, which requires Base64 encoding of a PNG file. When I use the PAW app and encode the image, everything shows up on the server beautifully. When I attempt the same with iOS Swift 4, the string produced is similar but has differences, and thus the image is incorrect. Any idea how to match, in iOS, the string that the PAW app creates correctly? I have included code below, along with screenshots of small samples of the strings created.
Thanks!
let image : UIImage = UIImage(named:"STG.png")!
let imageData = UIImagePNGRepresentation(image)
var base64String = imageData?.base64EncodedString(options: [])
You are not comparing the same data at all. Loading the PNG into a UIImage and then converting the UIImage into a new PNG representation does not produce the same set of bytes.
You need to load the PNG file directly into a Data instance, without any conversion:
let imageURL = Bundle.main.url(forResource: "STG", withExtension: "png")!
let imageData = try! Data(contentsOf: imageURL)
var base64String = imageData.base64EncodedString(options: [])
You might also need to try different options in the call to base64EncodedString.
OK, I went down several rabbit holes and found some folks with issues similar to this. I eventually solved it by switching to Alamofire instead of the native URLSession. I also found out the server I was posting to allows multipart/form-data, which I ended up using for my request; I was unable to get that to work with the native URLSession either.

Set a UIImageView's Image based on Binary Data

I have an image I'm trying to load from a web service. I've tested the data I got from the web service by adding data:image/gif;base64, before it and entering it as a URL in Chrome and the image loaded perfectly.
In my iOS app, I tried profilePic.image = UIImage.init(data: picData.data(using: .utf8)!), where picData is a string with the contents I tested above, but nothing loaded.
I get the feeling what I did wrong is somewhere in picData.data(using: .utf8)!, but I'm not sure. Any suggestions?
In case it helps, here's the binary data I'm working with: https://pastebin.com/xiWHaPB6
UTF-8 is an encoding for Unicode strings, not arbitrary 8-bit data!
If the MIME type you've shown is accurate, the data is a Base64-encoded string. The first thing you want to do is convert that string to its unencoded, binary form, then try creating your UIImage from that:
if let unencodedData = Data(base64Encoded: picData),
   let image = UIImage(data: unencodedData) {
    profilePic.image = image
}

Curious, is it possible to get [UIImage] from NSMutableData

On my iOS side, I receive a stream of images via NSInputStream. The NSMutableData will contain the images as [JPG IMAGE][JPG IMAGE][PNG IMAGE].
Is there an elegant way of getting the images out of the NSMutableData?
EDIT: Thanks for all the replies. I do not know the start/end of the images. I convert the buffer to NSData and append it to NSMutableData. Do I need to include a delimiter between the images?
Server (.NET): sends images continuously, so on the client side (iOS) I will never encounter NSStreamEvent.EndEncountered. Closing the NetworkStream throws "ObjectDisposedException was unhandled: cannot access a disposed object" (fixing it now).
Client: does not know when the server has finished sending; all it has is an NSMutableData (fixing it now). I convert the buffer to NSData and append it to NSMutableData.
Basically, what I want to achieve is:
1. I press connect on my client (iOS app),
2. a connection is established with the server (.NET),
3. the server creates and sends images to the client,
4. the client displays the images,
5. connect and disconnect are initiated by the client.
Update: I got it working via:
1. At the server side, after each image is sent, I append a delimiter.
2. At the client side, I append the data to NSMutableData as usual and check the last few bytes for the delimiter. If it matches, I can safely chunk the NSMutableData by the delimiter.
Via that, I am able to receive and display multiple images! (A sketch of the chunking step follows below.)
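A hedged sketch of that chunking step; the delimiter bytes here are made up, and the real sentinel is whatever the server appends after each image:

import UIKit

// Hypothetical sentinel; must match whatever the server appends.
let delimiter = Data("<<IMG_END>>".utf8)

// Split the accumulated buffer at each delimiter and decode each chunk.
func images(in buffer: Data) -> [UIImage] {
    var images: [UIImage] = []
    var chunkStart = buffer.startIndex
    var searchRange = buffer.startIndex..<buffer.endIndex
    while let hit = buffer.range(of: delimiter, in: searchRange) {
        let chunk = buffer.subdata(in: chunkStart..<hit.lowerBound)
        if let image = UIImage(data: chunk) {
            images.append(image)
        }
        chunkStart = hit.upperBound
        searchRange = hit.upperBound..<buffer.endIndex
    }
    return images
}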
You can use
[UIImage imageWithData:[yourMutableData copy]]
The copy is not strictly necessary; it just turns your mutable data into an immutable copy.
You can use
NSData *data = yourData;
UIImage *image = [UIImage imageWithData:data];
It is possible, but it will be quite tricky in your case, because you indicated that the NSMutableData will contain more than one type of image (JPEG/PNG).
If we assume that the data arrived in good shape, the best way to explore and separate it is by the file signature:
For PNG it's the first 8 bytes
For JPEG it's the first 4 bytes
Incorporate this to separate the huge blob of data into an array of data chunks. You'll have to go byte by byte, looking for subsequences, and check whether a subsequence happens to be the start of a JPEG or PNG file. Then simply initialize your UIImage objects like this:
var dataArray: [NSMutableData] = [imageData1, imageData2, imageData3]
var imageArray: [UIImage] = []
for imgData in dataArray {
    if let image = UIImage(data: imgData as Data) {
        imageArray.append(image)
    }
}
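If no delimiter is available, a signature scan along the lines described in this answer might look like the sketch below. It is a heuristic, not a robust parser: byte runs matching FF D8 FF can occur inside image payloads, so a production implementation would walk each format's internal structure instead.

import Foundation

// PNG files begin with the 8-byte signature 89 50 4E 47 0D 0A 1A 0A;
// JPEG files begin with FF D8 FF.
let pngSignature = Data([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])
let jpegSignature = Data([0xFF, 0xD8, 0xFF])

// Split a blob into candidate image chunks at each signature occurrence.
func splitIntoImageChunks(_ buffer: Data) -> [Data] {
    var starts: [Data.Index] = []
    for i in buffer.indices {
        let rest = buffer[i...]
        if rest.starts(with: pngSignature) || rest.starts(with: jpegSignature) {
            starts.append(i)
        }
    }
    var chunks: [Data] = []
    for (n, start) in starts.enumerated() {
        let end = (n + 1 < starts.count) ? starts[n + 1] : buffer.endIndex
        chunks.append(Data(buffer[start..<end]))
    }
    return chunks
}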

How to create Base64 Data in Swift

I'm trying to create Base64 data for a protobuf encoding library.
I found this code:
/* Create a Base-64, UTF-8 encoded NSData from the receiver's contents using the given options.
*/
@availability(iOS, introduced=7.0)
func base64EncodedDataWithOptions(options: NSDataBase64EncodingOptions) -> NSData
In the source code of NSData. As I understand it, this method is supposed to return Base64-encoded NSData.
But I can't understand how to convert my NSData (which I'm receiving from the API) to this Base64 NSData.
You said your data is NSData. Then just call base64EncodedDataWithOptions and assign the result to a new variable/constant:
let newData = yourData.base64EncodedDataWithOptions(NSDataBase64EncodingOptions.allZeros)
Check NSDataBase64EncodingOptions for encoding options and change allZeros as appropriate.
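For reference, that snippet uses Swift 1.x-era names; the modern equivalents (Swift 3 and later) are:

// Modern API names (Swift 3+): allZeros is gone; pass [] for no options.
let newData = yourData.base64EncodedData(options: [])
// Or, if the consumer wants a String:
let base64String = yourData.base64EncodedString(options: [])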
