I'm capturing video, audio, and photos in one view controller, ideally with one capture session.
The issue I'm currently having is with recording video. The output displays in my preview fine. I've adopted AVCaptureFileOutputRecordingDelegate and implemented the following method.
func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!)
var outputUrl = NSURL(fileURLWithPath: NSTemporaryDirectory() + "test.mp4")
movieOutput?.startRecordingToOutputFileURL(outputUrl, recordingDelegate: self)
I'm getting this error when I run the above code though:
'NSInvalidArgumentException', reason: '*** -[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] - no active/enabled connections.'
My configuration:
func configureCaptureSession() {
    capturedPhoto.contentMode = .ScaleAspectFill
    capturedPhoto.clipsToBounds = true
    capturedPhoto.hidden = true

    captureSession = AVCaptureSession()
    captureSession!.beginConfiguration()
    captureSession!.sessionPreset = AVCaptureSessionPresetPhoto

    var error: NSError?
    AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryRecord, error: &error)
    AVAudioSession.sharedInstance().setActive(true, error: &error)

    var audioDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
    var audioInput = AVCaptureDeviceInput(device: audioDevice, error: &error)
    if error == nil && captureSession!.canAddInput(audioInput) {
        captureSession!.addInput(audioInput)
    }

    photoCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    photoDeviceInput = AVCaptureDeviceInput(device: photoCaptureDevice, error: &error)
    if error == nil && captureSession!.canAddInput(photoDeviceInput) {
        captureSession!.addInput(photoDeviceInput)

        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if captureSession!.canAddOutput(stillImageOutput) {
            captureSession!.addOutput(stillImageOutput)
        }

        movieOutput = AVCaptureMovieFileOutput()
        if captureSession!.canAddOutput(movieOutput) {
            captureSession!.addOutput(movieOutput)
        }

        photoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        photoPreviewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
        photoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.Portrait
        cameraView.layer.addSublayer(photoPreviewLayer)

        contentView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: "focusPhoto:"))
    }

    captureSession!.commitConfiguration()
    captureSession!.startRunning()
}
I found two problems in your code.
Duplicated file
As per Apple documentation of startRecordingToOutputFileURL:recordingDelegate::
Starts recording to a given URL.
The method sets the file URL to which the receiver is currently writing output media. If a file at the given URL already exists when capturing starts, recording to the new file will fail.
In iOS, this frame accurate file switching is not supported. You must call stopRecording before calling this method again to avoid any errors.
When recording is stopped either by calling stopRecording, by changing files using this method, or because of an error, the remaining data that needs to be included to the file will be written in the background. Therefore, you must specify a delegate that will be notified when all data has been written to the file using the captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: method.
In your case, if test.mp4 already exists when you start a new recording, it will fail. It's better to give each recorded file a unique name, for example one based on the current timestamp.
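A minimal sketch of that idea, reusing the NSTemporaryDirectory approach from your snippet (the exact name format is just an illustration):

// Build a unique output URL for each recording so no file already exists at the destination.
let timestamp = Int(NSDate().timeIntervalSince1970)
let outputUrl = NSURL(fileURLWithPath: NSTemporaryDirectory() + "recording-\(timestamp).mp4")
movieOutput?.startRecordingToOutputFileURL(outputUrl, recordingDelegate: self)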
Session preset
In your code, you set sessionPreset to AVCaptureSessionPresetPhoto:
captureSession!.sessionPreset = AVCaptureSessionPresetPhoto
But in my experience it is not suitable for movie output and leads to exactly the error you are seeing. Change it to AVCaptureSessionPresetHigh and try again. It's also recommended to call canSetSessionPreset: and apply a preset only if it returns YES.
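For example, against your existing captureSession (a sketch in the same Swift syntax as your code):

// Only apply the preset if the current device/configuration supports it.
if captureSession!.canSetSessionPreset(AVCaptureSessionPresetHigh) {
    captureSession!.sessionPreset = AVCaptureSessionPresetHigh
}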
Sample code
Apple offers nice sample code on the usage of AVFoundation; you may want to check it out.
Related
I want to create something like a hearing aid app: once I hit the "startRecording" UIButton, it continuously records what I'm saying and simultaneously plays it back to me, at the same instant, in my earphones. It's basically to help people with hearing disabilities hear the sounds of the surrounding environment better and louder through earphones.
I am trying to implement it using AVFoundation, with an AVAudioRecorder and an AVAudioPlayer working together on the same file path "filename", in a while loop.
I get the error for line: audioPlayer.delegate = self
Thread 1: Fatal error: Unexpectedly found nil while unwrapping an Optional value.
@IBOutlet weak var startRecording: UIButton!
var recordingSession: AVAudioSession!
var audioRecorder: AVAudioRecorder!
var audioPlayer: AVAudioPlayer!
var fileNameString: String = "test.m4a"

@IBAction func buttonPressed(_ sender: Any) {
    print("button pressed")
    let filename = getDirectory().appendingPathComponent("\(fileNameString)")

    if audioRecorder == nil { // DAF needs to be started
        let settings = [AVFormatIDKey: Int(kAudioFormatAppleLossless),
                        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue,
                        AVEncoderBitRateKey: 320000,
                        AVNumberOfChannelsKey: 1,
                        AVSampleRateKey: 12000.0] as [String : Any]
        do {
            audioRecorder = try AVAudioRecorder(url: filename, settings: settings)
            audioRecorder.delegate = self
            //audioRecorder.record()

            do {
                audioPlayer = try AVAudioPlayer(contentsOf: filename, fileTypeHint: nil)
            } catch let error {
                print("\(error)")
            }

            audioPlayer.delegate = self
            audioPlayer.prepareToPlay()

            while true {
                audioRecorder.record()
                sleep(1)
                audioPlayer.play()
            }
            //startRecording.setTitle("Stop ", for: .normal)
        } catch {
            print("failed")
        }
    } else { // DAF started, needs to stop
        audioRecorder.stop()
        audioRecorder = nil
        startRecording.setTitle("Start", for: .normal)
        playRecording()
    }
}
Recording to a file with AVAudioRecorder and reading that file back with AVAudioPlayer will result in too much latency for real-time audio, because those APIs write and read files in fairly large blocks or buffers of samples.
A better iOS API for your purpose is the Audio Unit API with the RemoteIO Audio Unit, which can achieve very low latencies from microphone to speaker (or headset). It is a C callback API, however, as Apple currently does not recommend using Swift inside a real-time audio context.
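If you would rather stay in Swift, one commonly used higher-level alternative (not the RemoteIO C API itself, but built on top of the same I/O unit, so latency will be somewhat higher than a hand-written render callback) is AVAudioEngine, which can patch the microphone input straight to the output. A rough sketch, where the requested buffer duration is just an illustrative value:

import AVFoundation

let engine = AVAudioEngine()

func startLiveMonitoring() throws {
    // Configure the session for simultaneous input and output with small I/O buffers.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.allowBluetooth])
    try session.setPreferredIOBufferDuration(0.005)
    try session.setActive(true)

    // Route the microphone through the main mixer to the hardware output (your earphones).
    let input = engine.inputNode
    engine.connect(input, to: engine.mainMixerNode, format: input.outputFormat(forBus: 0))
    try engine.start()
}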
When I call the function .stopFetchingAudio() from EZAudio, my app crashes.
var microphone: EZMicrophone!

func didMove(to view: SKView) {
    /*
     * setup all dependencys for the fft analysis
     */

    // setup audio session
    session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try session.setActive(true)
    } catch {
        print("Audio Session setup Fails")
    }

    // create a mic instance
    microphone = EZMicrophone(delegate: self)
}

func stopMic() {
    microphone.stopFetchingAudio()
}
I get this error:
xyz-abv[435:35687] fatal error: unexpectedly found nil while unwrapping an Optional value
But I don't know which optional it means.
I think it should be:
func stopMic() {
    if let _ = microphone {
        microphone.stopFetchingAudio()
    }
}
Explanation: you move from one view (where microphone is used) to another view without it having been initialized. When you then call the stop method from the second view controller, it crashes because microphone is nil.
I'm updating my app to Swift 2 with Xcode 7. This is the code in a ViewController's viewDidLoad.
override func viewDidLoad() {
    super.viewDidLoad()

    // Get an instance of the AVCaptureDevice class to initialize a device object and provide the video
    // as the media type parameter.
    let captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

    // Get an instance of the AVCaptureDeviceInput class using the previous device object.
    var error: NSError?
    let input: AnyObject! = AVCaptureDeviceInput.deviceInputWithDevice(captureDevice, error: &error)

    if (error != nil) {
        // If any error occurs, simply log the description of it and don't continue any more.
        print("\(error?.localizedDescription)")
        return
    }

    // Initialize the captureSession object.
    captureSession = AVCaptureSession()
    // Set the input device on the capture session.
    captureSession?.addInput(input as! AVCaptureInput)

    // Initialize a AVCaptureMetadataOutput object and set it as the output device to the capture session.
    let captureMetadataOutput = AVCaptureMetadataOutput()
    captureSession?.addOutput(captureMetadataOutput)

    // Set delegate and use the default dispatch queue to execute the call back
    captureMetadataOutput.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
    captureMetadataOutput.metadataObjectTypes = supportedBarCodes

    // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
    videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    videoPreviewLayer?.frame = view.layer.bounds
    view.layer.addSublayer(videoPreviewLayer!)

    // Start video capture.
    captureSession?.startRunning()

    // Move the message label to the top view
    view.bringSubviewToFront(messageLabel)

    // Initialize QR Code Frame to highlight the QR code
    qrCodeFrameView = UIView()
    qrCodeFrameView?.layer.borderColor = UIColor.greenColor().CGColor
    qrCodeFrameView?.layer.borderWidth = 2
    view.addSubview(qrCodeFrameView!)
    view.bringSubviewToFront(qrCodeFrameView!)
}
on line
let input: AnyObject! = AVCaptureDeviceInput.deviceInputWithDevice(captureDevice, error: &error)
I get the error "Extra argument 'error' in call". I already tried using do {} and catch {}, but it didn't work; I always get that error.
How can I fix that? Thanks.
Swift 2 introduced new error handling. To solve the problem you are having, you need to catch the error instead of passing an NSError pointer to the AVCaptureDeviceInput initializer:
override func viewDidLoad() {
    super.viewDidLoad()

    do {
        let captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        let input = try AVCaptureDeviceInput(device: captureDevice)
        // Do the rest of your work...
    } catch let error as NSError {
        // Handle any errors
        print(error)
    }
}
For a more in-depth explanation have a look at this article:
Error Handling in Swift 2.0
It doesn't look like that type method exists anymore on AVCaptureDeviceInput, see -> https://developer.apple.com/library/prerelease/ios/documentation/AVFoundation/Reference/AVCaptureDeviceInput_Class/index.html#//apple_ref/swift/cl/c:objc(cs)AVCaptureDeviceInput
(it looks like you probably want to use init(device:))
...as a handy tip: anytime you're browsing the developer library via the web, if you're not sure if you're seeing the latest 'prerelease' version of the documentation check the URL -> add '/prerelease' between 'library' and '/ios' if necessary :)
I have built an app that records an audio clip, and saves it to a file called xxx.m4a.
Once the recording is made, I can find this xxx.m4a file on my system and play it back - and it plays back fine. The problem isn't recording this audio track.
My problem is playing this audio track back from within the app.
// to demonstrate how I am building the recordedFileURL
let currentFileName = "xxx.m4a"
let dirPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
let docsDir: AnyObject = dirPaths[0]
let recordedFilePath = docsDir.stringByAppendingPathComponent(currentFileName)
var recordedFileURL = NSURL(fileURLWithPath: recordedFilePath)

// quick check for my own sanity
var checkIfExists = NSFileManager.defaultManager()
if checkIfExists.fileExistsAtPath(recordedFilePath) {
    println("audio file exists!")
} else {
    println("audio file does not exist")
}

// play the recorded audio
var error: NSError?
let player = AVAudioPlayer(contentsOfURL: recordedFileURL!, error: &error)
if player == nil {
    println("error creating avaudioplayer")
    if let e = error {
        println(e.localizedDescription)
    }
}

println("about to play")
player.delegate = self
player.prepareToPlay()
player.volume = 1.0
player.play()
println("told to play")

// -------
// further on in this class I have the following delegate methods:
func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
    println("finished playing (successfully? \(flag))")
}

func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
    println("audioPlayerDecodeErrorDidOccur: \(error.localizedDescription)")
}
My console output when I run this code is like this:
audio file exists!
about to play
told to play
I don't get anything logged to the console from either audioPlayerDidFinishPlaying or audioPlayerDecodeErrorDidOccur.
Can someone explain to me why my audio clip isn't playing?
Many thanks.
Your playing code is fine; the only problem is the scope of your AVAudioPlayer object. You need to keep a strong reference to the player for as long as it's playing; otherwise ARC will destroy it, thinking you don't need it any more.
Normally you'd make the AVAudioPlayer object a property of whatever class your code is in, rather than making it a local variable in a method.
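A minimal sketch of that arrangement, in the same Swift 1.x style as your code (the class and method names here are only placeholders):

class PlaybackViewController: UIViewController, AVAudioPlayerDelegate {
    // Property keeps a strong reference so ARC doesn't deallocate the player mid-playback.
    var player: AVAudioPlayer?

    func playRecording(recordedFileURL: NSURL) {
        var error: NSError?
        player = AVAudioPlayer(contentsOfURL: recordedFileURL, error: &error)
        player?.delegate = self
        player?.prepareToPlay()
        player?.volume = 1.0
        player?.play()
    }
}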
Where you want playback to go through the speakers, add the following code before you initialize your AVAudioPlayer:
let recordingSession = AVAudioSession.sharedInstance()
do {
    try recordingSession.setCategory(AVAudioSessionCategoryPlayback)
} catch {
}
and when you are just recording with AVAudioRecorder, use this before initializing it:
do {
    recordingSession = AVAudioSession.sharedInstance()
    try recordingSession.setCategory(AVAudioSessionCategoryRecord)
} catch {}
I've managed to write some code that opens the camera and previews the video. I now want to capture frames from the output to send to a server, ideally encoded as H.264.
Here's what I've got:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var previewLayer: AVCaptureVideoPreviewLayer?

    // If we find a device we'll store it here for later use
    var captureDevice: AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if (device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        println("Capture device found")
                        beginSession()
                    }
                }
            }
        }
    }

    func beginSession() {
        var err: NSError? = nil
        captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))

        if err != nil {
            println("error: \(err?.localizedDescription)")
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.view.layer.addSublayer(previewLayer)
        previewLayer?.frame = self.view.layer.frame
        captureSession.startRunning()
    }
}
This opens the camera successfully and I can preview the footage.
I've found this Objective-C code that looks like it gets the output, but I don't know how to convert it to Swift. It uses AVCaptureVideoDataOutput, AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write frames out to an H.264-encoded movie file.
Can use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?
Can someone help convert it or give me pointers as to how to get the frames out of my current code?
Apple has a sample project, AVCam, in Objective-C that works with these things.
Here's another question on SO about using AVCamera in Swift.
I personally used https://github.com/alex-chan/AVCamSwift, and it's fine. I only had to convert it to the latest Swift syntax in Xcode and it worked fine.
Another suggestion is to use the Objective-C code that you found and import it into your Swift code through a bridging header.
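If you only need the raw frames from your existing session before worrying about the H.264 encoding, here is a rough sketch of the AVCaptureVideoDataOutput route mentioned in the question, in the same old-Swift syntax as your code. It assumes your ViewController also adopts AVCaptureVideoDataOutputSampleBufferDelegate, and the queue label is arbitrary:

// In beginSession(), alongside the preview layer: deliver raw frames to a delegate callback.
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.alwaysDiscardsLateVideoFrames = true
videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("videoFrames", DISPATCH_QUEUE_SERIAL))
if captureSession.canAddOutput(videoOutput) {
    captureSession.addOutput(videoOutput)
}

// Each CMSampleBuffer here is one captured frame; this is where the Objective-C example
// you found hands frames to an AVAssetWriterInput to get them encoded as H.264.
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // process or forward the sampleBuffer here
}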