I'm writing a Swift chat app using JSQMessagesViewController together with PubNub. I have no problem receiving text messages in real time and displaying them correctly, but I'm stuck on receiving image messages. I can send images without any problems, but when the receiver gets the image it arrives as an NSCFString. The output of print(message.data.message) in PubNub's didReceiveMessage callback is:
<UIImage: 0x155d52020>, {256, 342}
and the output of print(message.data) is:
{ message = "<UIImage: 0x155d52020>, {256, 342}"; subscribedChannel = aUpVlGKxjR; timetoken = 14497691787509050; }
Does anyone know how to convert this data back into a UIImage?
You need to encode the UIImage as a Base64 string, send that string in the PubNub message, and then decode the Base64 string back into a UIImage on the receiving side.
Encode:
let imageData = UIImagePNGRepresentation(image)!
let imageString = imageData.base64EncodedStringWithOptions([])
Decode:
let imageData = NSData(base64EncodedString: imageString, options: NSDataBase64DecodingOptions(rawValue: 0))
let image = UIImage(data: imageData!)
Reference: Convert between UIImage and Base64 string
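For reference, here is a minimal sketch of the same round trip in current Swift, using Data and UIImage.pngData(); the function names are just illustrative:
import UIKit

// Encode: UIImage -> PNG Data -> Base64 string (this string is what you publish).
func base64String(from image: UIImage) -> String? {
    return image.pngData()?.base64EncodedString()
}

// Decode: Base64 string -> Data -> UIImage (run this on the received payload).
func image(fromBase64 string: String) -> UIImage? {
    guard let data = Data(base64Encoded: string) else { return nil }
    return UIImage(data: data)
}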
I have built an app which fetches contacts from the phone book and saves their name and photo. To save the photo I've used the following code:
if let imageData = contact.thumbnailImageData {
    imageStr = String(describing: UIImage(data: imageData)!)
} else {
    imageStr = "null"
}
and when I print imageStr using print("IMGSTR: \(imageStr)") I get the following output:
IMGSTR: <UIImage:0x283942880 anonymous {1080, 1080} renderingMode=automatic>
Now I'm stuck on how to set this string on the UIImageView. I tried
imageview.image = UIImage(named: imageStr)
but it shows nothing.
Could someone please tell me how to set the string <UIImage:0x283942880 anonymous {1080, 1080} renderingMode=automatic> on a UIImageView?
There is no need to convert it to a String. UserDefaults supports Data objects: store the thumbnail as Data, and when setting it on a UIImageView use let image = UIImage(data: imageData).
If you want to convert an instance of Data to a String, use the String(decoding:as:) initializer, e.g. let str = String(decoding: data, as: UTF8.self).
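A minimal sketch of that Data-based approach, assuming a hypothetical "contactPhoto" key and the contact and imageview variables from the question above:
import UIKit

let photoKey = "contactPhoto"  // hypothetical key name

// Save: persist the raw thumbnail bytes, not a description string.
if let imageData = contact.thumbnailImageData {
    UserDefaults.standard.set(imageData, forKey: photoKey)
}

// Load: rebuild the UIImage from the stored Data and display it.
if let savedData = UserDefaults.standard.data(forKey: photoKey),
   let image = UIImage(data: savedData) {
    imageview.image = image
}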
I'm trying to upload images from the photo library to a server with an Alamofire POST request, and the uploading part is working. However, converting the UIImage to Base64 before uploading strips all the EXIF information from the image.
Before converting the UIImage to Base64, all the EXIF information is there and accessible, but after the conversion it is gone. I have tried uploading a Base64 version of the same image converted on a website, and that one kept the EXIF information, which suggests the problem is in the Swift side of the conversion.
Here is what the converting part of the code looks like:
func imageTobase64(image: UIImage) -> String {
    var base64String = ""
    let cim = CIImage(image: image)
    if cim != nil {
        let imageData = image.jpegData(compressionQuality: 1.0)
        base64String = imageData!.base64EncodedString(options: NSData.Base64EncodingOptions.lineLength64Characters)
    }
    return base64String
}
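A likely explanation: a UIImage only holds decoded pixel data, so image.jpegData(compressionQuality:) re-encodes the pixels into a fresh JPEG without the original EXIF block. Below is a hedged sketch of one workaround, assuming you can reach the original asset bytes through PhotoKit (the PHAsset value is a placeholder): Base64-encode those bytes directly instead of going through UIImage.
import Photos

// Base64-encode the original, unmodified bytes of a photo-library asset,
// so the EXIF data that was in the file survives the upload.
func base64PreservingMetadata(for asset: PHAsset,
                              completion: @escaping (String?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true  // allow fetching originals from iCloud

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        // `data` is the file's original bytes; no re-encode happens here.
        completion(data?.base64EncodedString())
    }
}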
This question already has answers here:
Swift2 retrieving images from Firebase
(2 answers)
Closed 7 years ago.
In the process of trying out Firebase as a possible replacement for Parse.com (which is unfortunately shutting down), I have saved a PNG image online using the Swift code below.
// Load the PNG from the documents directory.
let fn = self.toolBox.getDocumentsDirectory().stringByAppendingPathComponent("M0-1.png")
let im = UIImage(contentsOfFile: fn)

// Encode it as a Base64 string.
let dat = UIImagePNGRepresentation(im!)
let b64 = dat?.base64EncodedStringWithOptions(.Encoding64CharacterLineLength)

// Write the string under "zs/img/string" in Firebase.
let qs = ["string": b64!]
let r = diltRootRef.childByAppendingPath("zs")
let us = ["img": qs]
r.setValue(us)
The saving part seems to work, but how am I supposed to get back the image I saved? Everything I have tried so far has failed.
I would recommend retrieving the image using observeSingleEventOfType(_:withBlock:), because it's a one-time read.
Once you have the string value, you can use the NSData Base64 initializer and then create a UIImage from that data.
imageRef.observeSingleEventOfType(.Value) { (imageSnap: FDataSnapshot!) in
    let base64String = imageSnap.value as! String
    let decodedData = NSData(base64EncodedString: base64String, options: NSDataBase64DecodingOptions(rawValue: 0))
    let image = UIImage(data: decodedData!)
}
Check out this example repo on using base64 images in a UITableView.
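For anyone on a newer SDK, here is a hedged sketch of the same read with the current FirebaseDatabase API and Data, assuming imageRef still points at the Base64 string written above and imageView is a placeholder outlet:
import FirebaseDatabase
import UIKit

// One-time read of the Base64 string, then decode it into a UIImage.
imageRef.observeSingleEvent(of: .value) { snapshot in
    guard let base64String = snapshot.value as? String,
          let decodedData = Data(base64Encoded: base64String),
          let image = UIImage(data: decodedData) else { return }

    DispatchQueue.main.async {
        imageView.image = image
    }
}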
I am trying to send a photo from the app I am working on. The app takes a photo, and when you tap Send it should send the photo you just took by email.
But I don't know how to convert the photo, which comes from an AVCaptureStillImageOutput, to a UIImage and store it in an NSData in order to use it as an attachment with addAttachmentData.
I tried to do this:
let data: NSData = UIImagePNGRepresentation(image: stillImageOutput)
I have this function
func doSomethingWithImage(image: UIImage) {
    // do something here
    mc.addAttachmentData(UIImageJPEGRepresentation(UIImage(named: imageAsUIIMage)!, CGFloat(1.0))!, mimeType: "image/jpeg", fileName: "backupOne.jpeg")
}
But it shows me an error: "Use of unresolved identifier 'imageAsUIIMage'".
I want to get the UIImage in order to send it through an e-mail.
You can capture an image, save it to a file, and convert it to a UIImage using stillImageOutput.captureStillImageAsynchronouslyFromConnection():
// setup code
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
captureSession.startRunning()

let connection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)
stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) { (sampleBuffer, error) in
    // NSData of JPEG data. Save to file and add as email attachment.
    let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
    dispatch_async(dispatch_get_main_queue(), {
        self.doSomethingWithJPEGImageData(jpegData!)
    })
}

// later on
func doSomethingWithJPEGImageData(jpegData: NSData) {
    mc.addAttachmentData(jpegData, mimeType: "image/jpeg", fileName: "backupOne.jpeg")
}
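Note that AVCaptureStillImageOutput has since been deprecated in favor of AVCapturePhotoOutput. Here is a hedged sketch of the equivalent capture on newer iOS versions; the session setup (adding the output to a running AVCaptureSession) is omitted, and PhotoMailer, mc, and takePhoto() are illustrative names, not part of the original answer:
import AVFoundation
import MessageUI

final class PhotoMailer: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()
    var mc: MFMailComposeViewController?   // the mail composer from the question

    // Trigger a capture; the result arrives in the delegate callback below.
    func takePhoto() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // fileDataRepresentation() returns the encoded bytes of the captured photo.
        guard let jpegData = photo.fileDataRepresentation() else { return }
        mc?.addAttachmentData(jpegData, mimeType: "image/jpeg", fileName: "backupOne.jpeg")
    }
}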
This question already has answers here:
Convert between UIImage and Base64 string
(24 answers)
Closed 6 years ago.
A web service echoes a Base64 encoded image as a string. How can one decode and display this encoded image in a Swift project?
Specifically, I would like to take an image, which is already provided by the web service as a string in Base64 format, and understand how to display it in a UIImageView.
The articles I have found thus far describe deprecated techniques or are written in Objective-C, which I am not familiar with. How do you take in a Base64-encoded string and convert it to a UIImage?
Turn your base64 encoded string into an NSData instance by doing something like this:
let encodedImageData = ... get string from your web service ...
let imageData = NSData(base64EncodedString: encodedImageData, options: NSDataBase64DecodingOptions(rawValue: 0))!
Then turn the imageData into a UIImage:
let image = UIImage(data: imageData)
You can then set the image on a UIImageView for example:
imageView.image = image
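In current Swift, the same flow is a short sketch with Data(base64Encoded:), using the same encodedImageData and imageView as above:
import UIKit

if let imageData = Data(base64Encoded: encodedImageData),
   let image = UIImage(data: imageData) {
    imageView.image = image
}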
To decode a Base64-encoded string into an image, you can use the following code in Swift:
let decodedData = NSData(base64EncodedString: base64String, options: NSDataBase64DecodingOptions(rawValue: 0))
let decodedimage = UIImage(data: decodedData!)
print(decodedimage)
yourImageView.image = decodedimage
Even better, check whether decodedimage is nil before assigning it to the image view.
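For example, a minimal sketch of that check in current Swift, with the same base64String and yourImageView:
if let decodedData = Data(base64Encoded: base64String),
   let decodedImage = UIImage(data: decodedData) {
    yourImageView.image = decodedImage
} else {
    print("Could not decode the Base64 string into an image")
}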