Is HEIC/HEIF Supported By UIImage - ios

I was under the impression that UIImage supports the HEIC/HEIF files introduced in iOS 11. In my testing, that does not appear to be the case, though. If I do let image = UIImage(named: "test"), which points to test.heic, then image is nil. If I use an image literal, it crashes the app. I'm wondering if this just isn't implemented yet. Thanks.

While Zhao's answer works, it is fairly slow. The code below is about 10-20 times faster. For some reason it still doesn't work in the simulator, though, so keep that in mind.
func convert(url: URL) -> UIImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    guard let cgImage = CGImageSourceCreateImageAtIndex(source, 0, nil) else { return nil }
    return UIImage(cgImage: cgImage)
}
This is more or less outlined on page 141 of the slides from a WWDC session, but it wasn't entirely clear to me before: https://devstreaming-cdn.apple.com/videos/wwdc/2017/511tj33587vdhds/511/511_working_with_heif_and_hevc.pdf
Unfortunately, I still haven't been able to figure out a way to use images in the xcassets folder, so you'll either have to include the files outside of assets or pull them from the web. If anyone knows a way around this, please post.
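A minimal usage sketch for the function above, assuming a file named test.heic has been added directly to the app target as a loose file (not in an asset catalog); the file name here is just an example:

```swift
import UIKit

// Hypothetical caller: "test.heic" added to the bundle outside xcassets
if let url = Bundle.main.url(forResource: "test", withExtension: "heic"),
   let image = convert(url: url) {
    // Use the decoded UIImage, e.g. assign it to an image view
    print("Decoded image of size \(image.size)")
}
```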

In Xcode 10.1 (10B61), UIImage(named: "YourHeifImage") works just like other assets.
Interestingly, though, when you want to try this out and you AirDrop a HEIF pic from your device's Photos (e.g. an iPhone) to your Mac, it will get the extension .HEIC (all caps). When you then add that image to your Xcode xcassets, Xcode complains about an incorrect extension:
….xcassets: warning: Ambiguous Content: The image set "IMG_1791" references a file "IMG_1791.HEIC", but that file does not have a valid extension.
If you first change the extension to the lower-case .heic and then add it to xcassets, all is well.
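If you would rather script the rename than do it by hand, a small Foundation sketch does it (the helper name and filename are just examples):

```swift
import Foundation

// Rename e.g. IMG_1791.HEIC to IMG_1791.heic so Xcode accepts the extension
func lowercaseHEICExtension(of url: URL) throws -> URL {
    let target = url.deletingPathExtension().appendingPathExtension("heic")
    try FileManager.default.moveItem(at: url, to: target)
    return target
}
```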

You can load a HEIF file via CIImage, then convert it to a UIImage:
CIImage *ciImage = [CIImage imageWithContentsOfURL:url];
imageView.image = [UIImage imageWithCIImage:ciImage];
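For Swift callers, the same approach would look roughly like this (a sketch; url and imageView stand for whatever you have in context):

```swift
import UIKit
import CoreImage

// Load the HEIF file via Core Image, then wrap it for UIKit
if let ciImage = CIImage(contentsOf: url) {
    imageView.image = UIImage(ciImage: ciImage)
}
```

Note that a CIImage-backed UIImage is rendered lazily, which may be part of why this route is slower than going through CGImageSource as in the answer above.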


UIImage returns an empty image path from the gallery only in iOS 10.3.2

I have a problem in react-native. I'm creating a component in Swift in order to do image processing. In my code, a picker selects an image from the gallery and returns its link to me, and I then open this image in my native code. My code works on two iPhones (an iPhone 6S on 12.3.1 and an iPhone 7 on 12.3.1), but the link, once passed to UIImage, returns nil on an iPhone 7 on 10.3.2.
let result = val!["filename"] == nil ? "" : val!["filename"]
print(result)
self.bg = UIImage(named: result as! String)
if bg == nil {
    print("ERROR : Nil image")
}
So I get the following messages:
/var/mobile/Media/DCIM/100APPLE/IMG_0628.JPG
ERROR : Nil image
Normally the image should load fine, but it doesn't on this version of iOS. Is there something I missed? Is there another way to open the image on older versions?
You are using the wrong method to load the image. The method you used is intended for images that come from the app bundle or asset catalog. Since you have a file path, you need to use the following method instead:
self.bg = UIImage(contentsOfFile: result as! String)
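A slightly defensive variant of the same fix, avoiding the force unwraps (a sketch; val and bg are the question's own variables):

```swift
// Only attempt the load when we actually received a usable path
if let path = val?["filename"] as? String,
   FileManager.default.fileExists(atPath: path) {
    self.bg = UIImage(contentsOfFile: path)
} else {
    print("ERROR : Nil image")
}
```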

Swift - Create a GIF from Images and turn it into NSData

This might be an amateur question, but although I have searched Stack Overflow extensively, I haven't been able to find an answer to my specific problem.
I was successful in creating a GIF file from an array of images by following a GitHub example:
func createGIF(with images: [NSImage], name: NSURL, loopCount: Int = 0, frameDelay: Double) {
    let destinationURL = name
    let destinationGIF = CGImageDestinationCreateWithURL(destinationURL, kUTTypeGIF, images.count, nil)!
    // This dictionary controls the delay between frames
    // If you don't specify this, CGImage will apply a default delay
    let properties = [
        (kCGImagePropertyGIFDictionary as String): [(kCGImagePropertyGIFDelayTime as String): frameDelay]
    ]
    for img in images {
        // Convert an NSImage to CGImage, fitting within the specified rect
        let cgImage = img.CGImageForProposedRect(nil, context: nil, hints: nil)!
        // Add the frame to the GIF image
        CGImageDestinationAddImage(destinationGIF, cgImage, properties)
    }
    // Write the GIF file to disk
    CGImageDestinationFinalize(destinationGIF)
}
Now, I would like to turn the actual GIF into NSData so I can upload it to Firebase and retrieve it on another device.
To achieve my goal, I have two options: either figure out how to read back the GIF that the code above writes to disk, or use the images in the function's parameters to create a new GIF but keep it in NSData format.
Does anybody have any ideas on how to do this?
Since nobody has stepped in for over six months, I will just post the answer from Sachin Vas' comment here:
You can get the data using NSData(contentsOf: URL)
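For the second option — producing the data without touching the disk — ImageIO also offers CGImageDestinationCreateWithData, which writes into a mutable buffer instead of a file. A sketch along the lines of the function above (the function name is an example, and the frames are assumed to be already converted to CGImage):

```swift
import Foundation
import ImageIO
import CoreServices

// Build the GIF in memory so the result can be uploaded directly
func gifData(from frames: [CGImage], frameDelay: Double) -> NSData? {
    let data = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        data, kUTTypeGIF, frames.count, nil) else { return nil }
    let frameProperties = [
        kCGImagePropertyGIFDictionary as String:
            [kCGImagePropertyGIFDelayTime as String: frameDelay]
    ] as CFDictionary
    for frame in frames {
        CGImageDestinationAddImage(destination, frame, frameProperties)
    }
    // Finalize flushes the encoded GIF into `data`
    return CGImageDestinationFinalize(destination) ? data : nil
}
```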

UIImage init with scale in swift says "Extra Argument" Erroneously

OK. This is whacky. I've checked the various answers above, and this is the only one that seems to really hit the mark.
However, there doesn't seem to be an answer.
I am using Swift to create a new UIImage, exactly like this (Objective-C version):
UIImage *myImage = [[UIImage alloc] initWithCGImage: aCGImageRefAllocatedPreviously scale:aCGFloatScaleCalculatedPreviously orientation:UIImageOrientationUp];
That works fine. However, when I try the same exact call in Swift:
let myImage = UIImage(CGImage: aCGImageAllocatedPreviously, scale: aCGFloatScaleCalculatedPreviously, orientation: UIImageOrientation.Up)
I get a compiler error, telling me that "scale" is an extra parameter.
This is the kind of error that you get when the signature is wrong.
However, it isn't wrong (as far as I can tell).
What I am doing, is creating a tutorial by exactly replicating an Objective-C function in Swift (I know, I know, Swift is different, so I shouldn't exactly replicate, but, as Bill Murray would say, "Baby Steps").
I'm wondering how the heck I'm screwing up.
If I call it with just the image (no scale or orientation), it works fine, but I need that scale.
I tried this in the playground and was able to get this code to compile:
import UIKit
import AVFoundation
let myImage = UIImage(CGImage: nil,
                      scale: 0,
                      orientation: UIImageOrientation.Up)
This syntax also works:
let newImage = UIImage.init(CGImage: nil,
                            scale: 0,
                            orientation: UIImageOrientation.Up)
OK. I figgered it out. I was a knucklehead.
I need to use takeRetainedValue() on the CGImage, like so:
let myImage = UIImage(CGImage: aCGImageAllocatedPreviously.takeRetainedValue(), scale: aCGFloatScaleCalculatedPreviously, orientation: UIImageOrientation.Up)
That satisfies the signature.

Save image with the correct orientation - Swift & Core Image

I'm using Core Image in Swift to edit photos, and I have a problem when I save the photo: I'm not saving it with the correct orientation.
When I get the picture from the Photo Library, I save its orientation in a variable as a UIImageOrientation, but I don't know how to set it back before saving the edited photo to the Photo Library. Any ideas how?
Saving the orientation:
var orientation: UIImageOrientation = .Up
orientation = gotImage.imageOrientation
Saving the edited photo to the Photo Library:
@IBAction func savePhoto(sender: UIBarButtonItem) {
    let originalImageSize = CIImage(image: gotImage)
    filter.setValue(originalImageSize, forKey: kCIInputImageKey)
    // 1
    let imageToSave = filter.outputImage
    // 2
    let softwareContext = CIContext(options: [kCIContextUseSoftwareRenderer: true])
    // 3
    let cgimg = softwareContext.createCGImage(imageToSave, fromRect: imageToSave.extent())
    // 4
    let library = ALAssetsLibrary()
    library.writeImageToSavedPhotosAlbum(cgimg,
        metadata: imageToSave.properties(),
        completionBlock: nil)
}
Instead of using the metadata version of writeImageToSavedPhotosAlbum, you can use:
library.writeImageToSavedPhotosAlbum(
    cgimg,
    orientation: orientation,
    completionBlock: nil)
then you can pass in the orientation directly.
To satisfy Swift, you may need to typecast it first:
var orientation: ALAssetOrientation = ALAssetOrientation(rawValue:
    gotImage.imageOrientation.rawValue)!
As per my somewhat inconclusive answer here.
(As you have confirmed, this solution does indeed work in Swift - I derived it, untested, from working Objective-C code)
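Putting the two pieces together (a sketch — gotImage, cgimg, and library are the question's own variables):

```swift
// Convert the UIImage orientation to the asset-library equivalent,
// then pass it through when saving
if let assetOrientation = ALAssetOrientation(
        rawValue: gotImage.imageOrientation.rawValue) {
    library.writeImageToSavedPhotosAlbum(cgimg,
        orientation: assetOrientation,
        completionBlock: nil)
}
```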
If you are interested in manipulating other information from image metadata, here are a few related answers I provided to other questions...
Updating UIImage orientation metaData?
Force UIImagePickerController to take photo in portrait orientation/dimensions iOS
How to get author of image in cocoa
Getting a URL from (to) a "picked" image, iOS
And a small test project on github that digs out image metadata from various sources (it's objective-C, but the principles are the same).
You are calling writeImageToSavedPhotosAlbum:metadata:completionBlock:... The docs on that method say:
You must specify the orientation key in the metadata dictionary to preserve the orientation of the image.

Take screenshot from HLS video stream with iOS device

I'm developing an application which plays an HLS video and renders it in a UIView.
At a given moment I want to save a picture of the currently displayed video frame. To do this, I begin an image graphics context, draw the UIView hierarchy into the context, and save it into a UIImage with the UIGraphicsGetImageFromCurrentImageContext method.
This works really well in the iOS simulator; the rendered image is perfect. But on a device the rendered image is totally white.
Does anyone know why it doesn't work on a device?
Or, is there a working way to take a screenshot of an HLS video on a device?
Thanks for any help.
I was able to find a way to save a screenshot of an HLS live stream by adding an AVPlayerItemVideoOutput object to the AVPlayerItem.
In initialisation:
self.output = AVPlayerItemVideoOutput(pixelBufferAttributes:
    Dictionary<String, AnyObject>())
playerItem.addOutput(output!)
To save screenshot:
guard let time = self.player?.currentTime() else { return }
guard let pixelBuffer = self.output?.copyPixelBufferForItemTime(time,
    itemTimeForDisplay: nil) else { return }
let ciImage = CIImage(CVPixelBuffer: pixelBuffer)
let temporaryContext = CIContext(options: nil)
let rect = CGRectMake(0, 0,
    CGFloat(CVPixelBufferGetWidth(pixelBuffer)),
    CGFloat(CVPixelBufferGetHeight(pixelBuffer)))
let videoImage = temporaryContext.createCGImage(ciImage, fromRect: rect)
let image = UIImage(CGImage: videoImage)
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
This doesn't seem to work in the simulator, but works fine on a device. The code is in Swift 2 but should be straightforward to convert to Objective-C or Swift 1.x.
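For reference, the same technique under the Swift 3+ API renames might look roughly like this (a sketch, untested; the function name is an example):

```swift
import AVFoundation
import CoreImage
import UIKit

// Pull the current frame from an AVPlayerItemVideoOutput and
// render it to a UIImage via Core Image
func snapshot(player: AVPlayer, output: AVPlayerItemVideoOutput) -> UIImage? {
    let time = player.currentTime()
    guard let pixelBuffer = output.copyPixelBuffer(forItemTime: time,
                                                   itemTimeForDisplay: nil) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    let rect = CGRect(x: 0, y: 0,
                      width: CVPixelBufferGetWidth(pixelBuffer),
                      height: CVPixelBufferGetHeight(pixelBuffer))
    guard let cgImage = context.createCGImage(ciImage, from: rect) else { return nil }
    return UIImage(cgImage: cgImage)
}
```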
People have tried, and failed (like me), apparently because of the nature of HLS. See: http://blog.denivip.ru/index.php/2012/12/screen-capture-in-ios/?lang=en
