How to put a CALayer on the CMSampleBuffer of a video data output? - ios

Here's my code:
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard CMSampleBufferDataIsReady(sampleBuffer) else { return }
    // This is the delegate method for
    // AVCaptureVideoDataOutputSampleBufferDelegate and
    // AVCaptureAudioDataOutputSampleBufferDelegate
}
I'm using the Vision framework to draw a layer over the eyes, which works for the live preview. However, I also want to record the video with that same overlay.
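No accepted answer is recorded here, but one possible approach is sketched below: lock the frame's pixel buffer, wrap its base address in a CGContext, and render the overlay layer into it before appending the frame to an AVAssetWriter. Everything in this sketch is my assumption rather than code from this thread: it presumes the data output is configured for kCVPixelFormatType_32BGRA, and overlayLayer is a hypothetical stand-in for the layer drawn over the eyes.

import AVFoundation
import UIKit

// Render a CALayer into the frame itself so the overlay is baked into
// the recorded video (assumes 32BGRA pixel buffers).
func draw(_ overlayLayer: CALayer, into sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let context = CGContext(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        width: CVPixelBufferGetWidth(pixelBuffer),
        height: CVPixelBufferGetHeight(pixelBuffer),
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
            | CGBitmapInfo.byteOrder32Little.rawValue
    ) else { return }

    // Flip to match the top-left origin CALayer expects.
    context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(pixelBuffer)))
    context.scaleBy(x: 1, y: -1)
    overlayLayer.render(in: context)
}

The mutated buffer can then be appended through an AVAssetWriterInputPixelBufferAdaptor. For compositing after capture instead, AVVideoCompositionCoreAnimationTool can overlay a layer tree during export.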

Related

Swift - captureOutput is not being executed

I am currently trying to implement a camera live feed in my app. I've got it set up, but somehow it's not working as expected. As far as I understand, captureOutput should be executed every time a frame is recognized, and the print message should appear in the console, but somehow it doesn't.
Does anybody see a possible mistake in the code?
I don't know whether it's connected to my problem, but at the start of the app the console shows the following:
[BoringSSL] nw_protocol_boringssl_get_output_frames(1301) [C1.1:2][0x106b24530] get output frames failed, state 8196
import UIKit
import AVKit
import Vision

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        let captureSession = AVCaptureSession()
        guard let captureDevice = AVCaptureDevice.default(for: .video) else { return }
        guard let input = try? AVCaptureDeviceInput(device: captureDevice) else { return }
        captureSession.addInput(input)
        captureSession.startRunning()
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        view.layer.addSublayer(previewLayer)
        previewLayer.frame = view.frame
        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
        captureSession.addOutput(dataOutput)
        // let request = VNCoreMLRequest
        // VNImageRequestHandler(cgImage: <#T##CGImage#>, options: [:]).perform(request)
    }

    func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("Es hat funktioniert")
    }
}
You need to implement captureOutput(_:didOutput:from:), not captureOutput(_:didDrop:from:). The didDrop variant is only called when a frame is discarded, which is why your print statement never runs:
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    print("Es hat funktioniert")
}

Using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

I have an AVCaptureVideoDataOutput session running, set up as below, which works great and records the buffer to a file.
I want to also record the audio, but there doesn't seem to be any in the buffer, even though I've added the microphone as an input to the captureSession.
I suspect I need to also use AVCaptureAudioDataOutput.
func setupCaptureSession() {
    captureSession.beginConfiguration()
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "sample buffer delegate", attributes: []))
    videoOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA as UInt32)]
    videoOutput.alwaysDiscardsLateVideoFrames = true
    captureSession.addOutput(videoOutput)
    captureSession.addInput(deviceInputFromDevice(backCameraDevice))
    captureSession.addInput(deviceInputFromDevice(micDevice))
    captureSession.commitConfiguration()
    captureSession.startRunning()
}
And then here's how I get the video buffer and send it off to be written to a file:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    writeVideoFromData(sampleBuffer) // a function that writes the buffer to disk
}
Check func captureOutput(AVCaptureOutput, didOutput: CMSampleBuffer, from: AVCaptureConnection)
of AVCaptureAudioDataOutputSampleBufferDelegate if you want to grab audio sample data from an AVCaptureAudioDataOutput:
https://developer.apple.com/documentation/avfoundation/avcaptureaudiodataoutputsamplebufferdelegate
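A minimal sketch of that setup (my assumptions, not the asker's actual code: inputs are already configured, and the actual writing is left as comments). One object adopts both delegate protocols, and in the modern API both outputs deliver through the same captureOutput(_:didOutput:from:) method, so you branch on which output produced the buffer:

import AVFoundation

final class RecorderDelegate: NSObject,
        AVCaptureVideoDataOutputSampleBufferDelegate,
        AVCaptureAudioDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    private let sampleQueue = DispatchQueue(label: "sample buffer delegate")

    func addOutputs() {
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

        let audioOutput = AVCaptureAudioDataOutput()
        audioOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(audioOutput) { session.addOutput(audioOutput) }
    }

    // Shared by both output types: one implementation satisfies both protocols.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        if output is AVCaptureAudioDataOutput {
            // audio sample: append to an audio AVAssetWriterInput
        } else {
            // video frame: write to disk as before
        }
    }
}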

didOutputSampleBuffer Delegate Never Gets Called (iOS / Swift 3)

I am trying to keep track of the sample buffer rate of a video recording.
I have a view controller conforming to AVCaptureFileOutputRecordingDelegate and AVCaptureVideoDataOutputSampleBufferDelegate, and I set up the buffer output like so:
sessionQueue.async { [weak self] in
    if let `self` = self {
        let movieFileOutput = AVCaptureMovieFileOutput()
        let bufferQueue = DispatchQueue(label: "bufferRate", qos: .userInteractive, attributes: .concurrent)
        let theOutput = AVCaptureVideoDataOutput()
        theOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA)]
        theOutput.alwaysDiscardsLateVideoFrames = true
        theOutput.setSampleBufferDelegate(self, queue: bufferQueue)
        if self.session.canAddOutput(theOutput) {
            self.session.addOutput(theOutput)
            print("ADDED BUFFER OUTPUT")
        }
        if self.session.canAddOutput(movieFileOutput) {
            self.session.beginConfiguration()
            self.session.addOutput(movieFileOutput)
            self.session.sessionPreset = AVCaptureSessionPresetHigh
            if let connection = movieFileOutput.connection(withMediaType: AVMediaTypeVideo) {
                if connection.isVideoStabilizationSupported {
                    connection.preferredVideoStabilizationMode = .auto
                }
            }
            self.session.commitConfiguration()
            self.movieFileOutput = movieFileOutput
            DispatchQueue.main.async { [weak self] in
                if let `self` = self {
                    self.recordButton.isEnabled = true
                }
            }
        }
    }
}
Additionally, I have implemented the function to read the buffer:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    print("captured \(sampleBuffer)")
}
The problem is that when running the camera, it records correctly as it is supposed to (code not shown, as it is working normally), but the captureOutput sample buffer never gets called. What am I doing wrong? I assume it has to do with the way I am setting it up?
Make sure the delegate signature matches the Swift 3 syntax:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
I was using the Swift 2.3 syntax and the compiler did not warn me of the issue. Try typing out didOutput sampleBuffer and check whether Xcode autocompletes the proper signature for that delegate method.
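One more thing worth checking if the corrected signature still never fires (a known platform limitation, not part of this answer): iOS capture sessions have historically refused to deliver frames to an AVCaptureVideoDataOutput while an AVCaptureMovieFileOutput is attached to the same session, which is exactly the combination in the question.

With the corrected signature in place, tracking the buffer rate the asker is after could look something like this sketch inside the same view controller (modern Swift spelling; the lastTimestamp property is my addition, not part of the original code):

private var lastTimestamp = CMTime.invalid

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Derive an instantaneous frame rate from presentation timestamps.
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if lastTimestamp.isValid {
        let delta = CMTimeGetSeconds(CMTimeSubtract(timestamp, lastTimestamp))
        if delta > 0 { print("~\(1.0 / delta) fps") }
    }
    lastTimestamp = timestamp
}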

Swift: captureOutput not being called

So I am working on my first Swift app, which was going fine until I got stuck here. Please take a look at the code below.
//
//  CameraFrames.swift
//  Explore
//
//  Created by Kushagra Agarwal on 13/08/15.
//  Copyright © 2015 Kushagra Agarwal. All rights reserved.
//

import Foundation
import AVFoundation

protocol CameraFramesDelegate {
    func processCameraFrames(sampleBuffer : CMSampleBufferRef)
}

class CameraFrames : NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    var previewLayer : AVCaptureVideoPreviewLayer?
    var delegate : CameraFramesDelegate?
    let captureSession = AVCaptureSession();
    var captureDevice : AVCaptureDevice?

    override init() {
        super.init()
        captureSession.beginConfiguration()

        // Capture the session with High settings preset
        captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo);

        // Because the old error handling method is deprecated
        do {
            try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice));
        } catch {
            print("Error in getting input from camera");
        }

        self.previewLayer = AVCaptureVideoPreviewLayer(session: captureSession);

        let output = AVCaptureVideoDataOutput();
        var outputQueue : dispatch_queue_t?
        outputQueue = dispatch_queue_create("outputQueue", DISPATCH_QUEUE_SERIAL);
        output.setSampleBufferDelegate(self, queue: outputQueue)
        output.alwaysDiscardsLateVideoFrames = true;
        output.videoSettings = nil;
        captureSession.addOutput(output);

        captureSession.commitConfiguration()
        captureSession.startRunning();
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        print("frame dropped")
    }

    func captureOutput(captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection) {
        print("frame received")
    }
}
The preview shows up fine on the phone, but captureOutput is never called for some reason. I looked at some old SO threads, but none helped me resolve this issue. Any idea what the reason behind it could be?
Edit: I forgot to mention that I have a view controller in which I am previewing the AVCaptureVideoPreviewLayer, which works well. I have a feeling the issue is somewhere in setting up the outputQueue, but I can't figure out where.
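No answer is recorded for this question here, but one frequent cause of exactly this symptom (a guess on my part, not a confirmed fix): the CameraFrames instance is created as a local variable and deallocated when the creating scope exits. The preview layer keeps the session itself alive, so the preview still works, but the data output holds only a weak reference to its sample buffer delegate, so the callbacks stop. A minimal sketch of holding a strong reference, with a hypothetical host view controller:

import UIKit

class HostViewController: UIViewController {
    // Stored property, not a local, so the CameraFrames object
    // (and with it the sample buffer delegate) stays alive.
    private var cameraFrames: CameraFrames?

    override func viewDidLoad() {
        super.viewDidLoad()
        let frames = CameraFrames()
        cameraFrames = frames
        if let preview = frames.previewLayer {
            preview.frame = view.bounds
            view.layer.addSublayer(preview)
        }
    }
}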

How to convert CMSampleBuffer to CMAttachmentBearer in Swift

I'm new to Swift. I want to call the function CMCopyDictionaryOfAttachments in the (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection delegate method.
My code:
// MARK: Delegates
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // got an image
    let pixelBuffer : CVPixelBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)
    let attachments : CFDictionaryRef = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, CMAttachmentMode( kCMAttachmentMode_ShouldPropagate)) as CFDictionaryRef!
}
Xcode gives this error: 'CMSampleBuffer' is not identical to 'CMAttachmentBearer'.
So how can I use sampleBuffer as the target? This code works when written in Objective-C.
I guess the major problem in your code is that you pass the CMSampleBuffer instead of the CVPixelBufferRef.
The next problem then is that CMCopyDictionaryOfAttachments returns an unmanaged instance, which needs to be converted using takeRetainedValue().
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // got an image
    let pixelBuffer : CVPixelBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)
    let attachments : [NSObject : AnyObject] = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, pixelBuffer, CMAttachmentMode( kCMAttachmentMode_ShouldPropagate)).takeRetainedValue()
}
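In later Swift versions the same call is tidier: the function gained argument labels, returns an optional CFDictionary, and is memory-managed automatically, so takeRetainedValue() is no longer needed. A sketch against the current SDK:

import AVFoundation

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    // Pass the pixel buffer (a CMAttachmentBearer) as the target.
    if let attachments = CMCopyDictionaryOfAttachments(allocator: kCFAllocatorDefault,
                                                       target: pixelBuffer,
                                                       attachmentMode: kCMAttachmentMode_ShouldPropagate) {
        print(attachments)
    }
}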
