CMSampleBuffer extension to convert to UIImage: Too expensive on memory - iOS

I'm trying to convert a CMSampleBuffer to a UIImage with Swift 3.0. A popular solution is to write an extension for the CMSampleBuffer class and add a getter that converts the buffer to an image. This is what it looks like:
import Foundation
import AVFoundation
import UIKit

extension CMSampleBuffer {
    @available(iOS 9.0, *)
    var uiImage: UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        return UIImage(ciImage: ciImage)
    }
}
It works fine, but it takes up a lot of memory (around 40% of the app's total). Is there a more memory-efficient solution?
EDIT:
I have changed my code and it looks like this:
var uiImage: UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0)) }

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)

    var image: UIImage?
    autoreleasepool {
        guard let context = CGContext(data: baseAddress,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: bytesPerRow,
                                      space: colorSpace,
                                      bitmapInfo: bitmapInfo.rawValue),
              let cgImage = context.makeImage() else { return }
        image = UIImage(cgImage: cgImage)
    }
    return image
}
The memory growth seems to be related to the CGContext. Is there any other way to free/release/deallocate it besides wrapping it in an autoreleasepool?
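One pattern that tends to be more memory-friendly (a sketch under my own assumptions, not the accepted answer) is to render eagerly through a single, reused CIContext instead of creating a new bitmap context per frame. UIImage(ciImage:) defers rendering and keeps the underlying pixel buffer alive, while CIContext is expensive to create and meant to be shared. The names sharedCIContext and renderedUIImage are mine:

import AVFoundation
import CoreImage
import UIKit

// One context reused for every frame (hypothetical helper; creating a CIContext per frame is costly).
private let sharedCIContext = CIContext()

extension CMSampleBuffer {
    var renderedUIImage: UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        // Render eagerly so the returned UIImage no longer references the pixel buffer.
        guard let cgImage = sharedCIContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}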

Related

Scaling a UIImage to a large size crashes the app in Swift

I use vImage to scale a UIImage up to a large size (1920 × 1080). Memory usage grows very high and the app crashes due to a memory issue.
I need to use vImage because it preserves better resolution after scaling.
Thanks!
Here is my code:
func resizeAllImages() {
    for i in 0..<arrNumImages.count {
        let image = arrNumImages[i]
        let imageXib = image.resizeImageUsingVImage(size: CGSize(width: 1920, height: 1080)) ?? UIImage()
    }
}
// Resize extension of UIImage
func resizeImageUsingVImage(size: CGSize) -> UIImage? {
    let cgImage = self.cgImage!
    var format = vImage_CGImageFormat(bitsPerComponent: 8,
                                      bitsPerPixel: 32,
                                      colorSpace: nil,
                                      bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue),
                                      version: 0,
                                      decode: nil,
                                      renderingIntent: .defaultIntent)
    var sourceBuffer = vImage_Buffer()
    defer {
        free(sourceBuffer.data)
    }
    var error = vImageBuffer_InitWithCGImage(&sourceBuffer, &format, nil, cgImage, numericCast(kvImageNoFlags))
    guard error == kvImageNoError else { return nil }

    // Create a destination buffer
    let scale = self.scale
    let destWidth = Int(size.width)
    let destHeight = Int(size.height)
    let bytesPerPixel = cgImage.bitsPerPixel / 8
    let destBytesPerRow = destWidth * bytesPerPixel
    let destData = UnsafeMutablePointer<UInt8>.allocate(capacity: destHeight * destBytesPerRow)
    defer {
        destData.deallocate()
    }
    var destBuffer = vImage_Buffer(data: destData,
                                   height: vImagePixelCount(destHeight),
                                   width: vImagePixelCount(destWidth),
                                   rowBytes: destBytesPerRow)

    // Scale the image
    error = vImageScale_ARGB8888(&sourceBuffer, &destBuffer, nil, numericCast(kvImageHighQualityResampling))
    guard error == kvImageNoError else { return nil }

    // Create a CGImage from the vImage_Buffer
    var destCGImage = vImageCreateCGImageFromBuffer(&destBuffer, &format, nil, nil, numericCast(kvImageNoFlags), &error)?.takeRetainedValue()
    guard error == kvImageNoError else { return nil }

    // Create a UIImage
    let resizedImage = destCGImage.flatMap { UIImage(cgImage: $0, scale: 0.0, orientation: self.imageOrientation) }
    destCGImage = nil
    return resizedImage
}
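One common mitigation (my suggestion, not from the question) is to drain temporaries on every iteration: wrapping the per-image work in autoreleasepool releases the intermediate CGImage/UIImage objects from one resize before the next begins, instead of letting them pile up across the whole loop:

func resizeAllImages() {
    for i in 0..<arrNumImages.count {
        autoreleasepool {
            let image = arrNumImages[i]
            let resized = image.resizeImageUsingVImage(size: CGSize(width: 1920, height: 1080)) ?? UIImage()
            // Consume `resized` here (e.g. write it to disk) rather than keeping
            // every 1920x1080 result in memory at once.
            _ = resized
        }
    }
}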

CAMetalLayer drawable texture is weird on some devices

I am using the code below to get and append a pixel buffer from a Metal layer. On some (non-specific) devices the output looks like the image below, and the drawable texture's pixelFormat is .invalid.
static func make(with currentDrawable: CAMetalDrawable, usingBuffer pool: CVPixelBufferPool) -> (CVPixelBuffer?, UIImage) {
    let destinationTexture = currentDrawable.texture
    var pixelBuffer: CVPixelBuffer?
    _ = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
    if let pixelBuffer = pixelBuffer {
        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let region = MTLRegionMake2D(0, 0,
                                     Int(currentDrawable.layer.drawableSize.width),
                                     Int(currentDrawable.layer.drawableSize.height))
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let tempBuffer = CVPixelBufferGetBaseAddress(pixelBuffer)
        destinationTexture.getBytes(tempBuffer!, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
        let image = imageFromCVPixelBuffer(buffer: pixelBuffer)
        CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
        return (pixelBuffer, image)
    }
    return (nil, UIImage())
}

static func imageFromCVPixelBuffer(buffer: CVPixelBuffer) -> UIImage {
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let cgImage = context.createCGImage(ciImage,
                                        from: CGRect(x: 0, y: 0,
                                                     width: CVPixelBufferGetWidth(buffer),
                                                     height: CVPixelBufferGetHeight(buffer)))
    return UIImage(cgImage: cgImage!)
}
Does anybody have any idea why this happens and how to prevent it?
More feedback from people experiencing this can be found here: https://github.com/svtek/SceneKitVideoRecorder/issues/3
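Not an answer from that thread, but one precondition worth checking: reading a drawable's texture back with getBytes requires a CPU-readable texture, and CAMetalLayer vends framebuffer-only textures by default. A minimal configuration sketch (assumed setup; adapt to your own layer):

import Metal
import QuartzCore

let metalLayer = CAMetalLayer()
// framebufferOnly defaults to true; framebuffer-only textures cannot be
// read back with getBytes(_:bytesPerRow:from:mipmapLevel:).
metalLayer.framebufferOnly = false
metalLayer.pixelFormat = .bgra8Unorm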

How to create a grayscale image from an NSImage in Swift?

I created two applications: one for Mac and one for iPhone. The iPhone sends the video frames it captures to the Mac using the MultipeerConnectivity framework. I managed to find code for converting a UIImage to grayscale:
func convertToGrayScale(image: UIImage) -> UIImage {
    let imageRect: CGRect = CGRectMake(0, 0, image.size.width, image.size.height)
    let colorSpace = CGColorSpaceCreateDeviceGray()
    let width = image.size.width
    let height = image.size.height
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.None.rawValue)
    let context = CGBitmapContextCreate(nil, Int(width), Int(height), 8, 0, colorSpace, bitmapInfo.rawValue)
    CGContextDrawImage(context, imageRect, image.CGImage)
    let imageRef = CGBitmapContextCreateImage(context)
    let newImage = UIImage(CGImage: imageRef!)
    return newImage
}
The code below sends the video frame to the Mac:
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    CVPixelBufferLockBaseAddress(imageBuffer!, kCVPixelBufferLock_ReadOnly)
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!)
    let width = CVPixelBufferGetWidth(imageBuffer!)
    let height = CVPixelBufferGetHeight(imageBuffer!)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue)
    let context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, bitmapInfo.rawValue)
    let quarzImage = CGBitmapContextCreateImage(context)
    // Unlock with the same flags used to lock, and only after the buffer contents have been consumed.
    CVPixelBufferUnlockBaseAddress(imageBuffer!, kCVPixelBufferLock_ReadOnly)
    let image = UIImage(CGImage: quarzImage!)
    let grayImage = convertToGrayScale(image)
    let data: NSData = UIImagePNGRepresentation(grayImage)!
    delegate?.recievedOutput(data)
}
The delegate method just sends the data using session.sendData().
Now to the Mac side. When the Mac receives the NSData, I create an NSImage from the data and write a .png image file using this code:
func session(session: MCSession, didReceiveData data: NSData, fromPeer peerID: MCPeerID) {
    let image: NSImage = NSImage(data: data)!.imageRotatedByDegreess(270)
    let cgRef = image.CGImageForProposedRect(nil, context: nil, hints: nil)
    let representation = NSBitmapImageRep(CGImage: cgRef!)
    let pngData = representation.representationUsingType(NSBitmapImageFileType.NSPNGFileType, properties: [NSImageCompressionFactor: 1.0])
    pngData?.writeToFile("/Users/JunhongXu/Desktop/image/\(result.description).png", atomically: true)
    result[4]++
    self.delegate?.presentRecievedImage(image)
}
The image looks like the picture below, but when I check the image file's properties, it is in RGB format. How can I change the color space of my NSImage to grayscale instead of RGB?
[image of the received frame]
I found a simple solution to my problem. Since the image is already grayscale by the time it is transmitted to my Mac, I can use the code below to convert the image representation's color space to grayscale and save it as a .png file:
let newRep = representation.bitmapImageRepByConvertingToColorSpace(NSColorSpace.genericGrayColorSpace(), renderingIntent: NSColorRenderingIntent.Default)
let pngData = newRep!.representationUsingType(NSBitmapImageFileType.NSPNGFileType, properties: [NSImageCompressionFactor: 1.0])
pngData?.writeToFile("/Users/JunhongXu/Desktop/image/\(result.description).png", atomically: true)
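For reuse, the same conversion could be factored into a small helper (the function name is mine; same Swift 2-era API as above):

func grayscalePNGData(from representation: NSBitmapImageRep) -> NSData? {
    // Convert the bitmap rep to the generic gray color space, then encode as PNG.
    guard let grayRep = representation.bitmapImageRepByConvertingToColorSpace(
        NSColorSpace.genericGrayColorSpace(),
        renderingIntent: NSColorRenderingIntent.Default) else { return nil }
    return grayRep.representationUsingType(NSBitmapImageFileType.NSPNGFileType,
                                           properties: [NSImageCompressionFactor: 1.0])
}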

AVCaptureStillImageOutput.pngStillImageNSDataRepresentation?

I am working with AVCaptureStillImageOutput for the first time, and at some point I save a JPEG image.
Instead of a JPEG, I would like to save a PNG image. What do I need to do for that?
I have these three lines of code in the app:
let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey:AVVideoCodecJPEG]
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
Is there a simple way to modify those lines to get what I want?
After browsing the net, it seems the answer is no (unless I just haven't been lucky), but I still believe there must be a good solution.
There is sample code in the AVFoundation Programming Guide that shows how to convert a CMSampleBuffer to a UIImage (under Converting CMSampleBuffer to a UIImage Object). From there, you can use UIImagePNGRepresentation(image) to encode it as PNG data.
Here is a Swift translation of that code:
extension UIImage
{
    // Translated from <https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/06_MediaRepresentations.html#//apple_ref/doc/uid/TP40010188-CH2-SW4>
    convenience init?(fromSampleBuffer sampleBuffer: CMSampleBuffer)
    {
        guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        if CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly) != kCVReturnSuccess { return nil }
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly) }

        let context = CGBitmapContextCreate(
            CVPixelBufferGetBaseAddress(imageBuffer),
            CVPixelBufferGetWidth(imageBuffer),
            CVPixelBufferGetHeight(imageBuffer),
            8,
            CVPixelBufferGetBytesPerRow(imageBuffer),
            CGColorSpaceCreateDeviceRGB(),
            CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
        guard let quartzImage = CGBitmapContextCreateImage(context) else { return nil }
        self.init(CGImage: quartzImage)
    }
}
Here is a Swift 4 version of the above code.
extension UIImage
{
    convenience init?(fromSampleBuffer sampleBuffer: CMSampleBuffer)
    {
        guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        if CVPixelBufferLockBaseAddress(imageBuffer, .readOnly) != kCVReturnSuccess { return nil }
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }

        guard let context = CGContext(
                  data: CVPixelBufferGetBaseAddress(imageBuffer),
                  width: CVPixelBufferGetWidth(imageBuffer),
                  height: CVPixelBufferGetHeight(imageBuffer),
                  bitsPerComponent: 8,
                  bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
                  space: CGColorSpaceCreateDeviceRGB(),
                  bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue),
              let quartzImage = context.makeImage() else { return nil }
        self.init(cgImage: quartzImage)
    }
}
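A hypothetical call site tying this back to the PNG question (the delegate method is the standard AVCaptureVideoDataOutputSampleBufferDelegate callback; what you do with the data is up to you):

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let image = UIImage(fromSampleBuffer: sampleBuffer),
          let pngData = UIImagePNGRepresentation(image) else { return }
    // Write or transmit pngData here.
}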

UIImage from CMSampleBuffer has a blue tint

I am displaying a video feed of CMSampleBuffers converted to UIImages inside a UIImageView. In the photo below, the background layer is an AVCaptureVideoPreviewLayer and the center is the buffer feed. My goal is to remove the blue tint.
Here is the CMSampleBuffer-to-UIImage code:
extension CMSampleBuffer {
    func imageRepresentation() -> UIImage? {
        let imageBuffer: CVImageBufferRef = CMSampleBufferGetImageBuffer(self)!
        CVPixelBufferLockBaseAddress(imageBuffer, 0)
        let address = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        let width = CVPixelBufferGetWidth(imageBuffer)
        let height = CVPixelBufferGetHeight(imageBuffer)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let context = CGBitmapContextCreate(address, width, height, 8, bytesPerRow, colorSpace, CGImageAlphaInfo.NoneSkipFirst.rawValue)
        let imageRef = CGBitmapContextCreateImage(context)
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0)
        let resultImage: UIImage = UIImage(CGImage: imageRef!)
        return resultImage
    }
}
AVCaptureVideoDataOutput setup:
class MovieRecorder: NSObject {
    // vars
    private let captureVideoDataOutput = AVCaptureVideoDataOutput()

    // capture session boilerplate setup...
    captureVideoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA)]
    captureVideoDataOutput.alwaysDiscardsLateVideoFrames = true
    captureVideoDataOutput.setSampleBufferDelegate(self, queue: captureDataOutputQueue)
}
The problem was with the bitmapInfo: the buffer is kCVPixelFormatType_32BGRA, so its bytes are laid out B-G-R-A. Reading them without .ByteOrder32Little swaps the red and blue channels, which produces the blue tint. This bitmap info fixed it:
let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.NoneSkipFirst.rawValue | CGBitmapInfo.ByteOrder32Little.rawValue)
let context = CGBitmapContextCreate(address, width, height, 8, bytesPerRow, colorSpace, bitmapInfo.rawValue)
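For readers on Swift 3 or later, the same fix in modern syntax (a direct translation, assuming the same variables as in the extension above):

let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
let context = CGContext(data: address, width: width, height: height,
                        bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                        space: colorSpace, bitmapInfo: bitmapInfo.rawValue)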
