exception com.apple.coreaudio.avfaudio reason: error -50 - ios

I get this message when I try to play an audio file with a modified pitch, and googling the error turned up nothing. If I set breakpoints, the app stops inside the AVAudioFile read call (see the log below). I printed all the objects to check whether anything was nil, but I didn't find anything. The most mysterious part is that it only happens on my iPhone 6 Plus; on the other phones I tested, it doesn't break. So I went back to the project I originally used as a reference for these sound effects, which is this:
https://github.com/atikur/Pitch-Perfect
And if you run it, it works, until you change:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error)
To:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)
And then, boom (only on a real device attached to Xcode; it works in the simulator):
2015-03-21 11:56:13.311 Pitch Perfect[1237:607678] 11:56:13.311 ERROR: [0x10320c000] AVAudioFile.mm:496: -[AVAudioFile readIntoBuffer:frameCount:error:]: error -50
2015-03-21 11:56:13.313 Pitch Perfect[1237:607678] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -50'
*** First throw call stack:
(0x18687a530 0x1978040e4 0x18687a3f0 0x1851ea6c0 0x185232d38 0x1852130f8 0x185212ccc 0x100584fd4 0x100584f94 0x10058fdb8 0x1005882c4 0x1005925d4 0x100592208 0x198037dc8 0x198037d24 0x198034ef8)
libc++abi.dylib: terminating with uncaught exception of type NSException
And the really, really weird thing is this screenshot: for some reason, after printing audioEngine, audioEngine.outputNode becomes nil?

I had the same error. I had created a "sound.swift" class that my view controller would instantiate. I decided to simplify everything and focus on making the sound work, so I put the following code in the view controller, and it works:
// Fetch the recorded file from the Documents directory
var pitchPlayer = AVAudioPlayerNode()
var timePitch = AVAudioUnitTimePitch()
let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as! String
let pathArray = [dirPath, "son.wav"]
let filePath = NSURL.fileURLWithPathComponents(pathArray)!
let audioFile = AVAudioFile(forReading: filePath, error: nil)
let audioEngine = AVAudioEngine()
audioEngine.attachNode(pitchPlayer)
audioEngine.attachNode(timePitch)
// Create a session
var session = AVAudioSession.sharedInstance()
session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)
// Route output audio to the speaker
session.overrideOutputAudioPort(AVAudioSessionPortOverride.Speaker, error: nil)
// Wire the graph: player -> time pitch -> output
audioEngine.connect(pitchPlayer, to: timePitch, format: audioFile.processingFormat)
audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
pitchPlayer.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
var audioError: NSError?
audioEngine.startAndReturnError(&audioError)
pitchPlayer.play()
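For what it's worth, OSStatus -50 is Core Audio's kAudio_ParamError ("invalid parameter"), which in an AVAudioEngine graph usually points at a format mismatch or a misconfigured session. Below is a minimal sketch of the same setup in current Swift, with the errors surfaced instead of passed as nil; the file name "son.wav" and the Documents location are just carried over from the snippet above.

import AVFoundation

func playPitchedFile() {
    // Assumes "son.wav" sits in the app's Documents directory, as above.
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let fileURL = documents.appendingPathComponent("son.wav")

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitch = AVAudioUnitTimePitch()
    pitch.pitch = 1000 // in cents; 100 cents = one semitone

    do {
        // Surfacing the session errors shows which call rejects its parameters.
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.overrideOutputAudioPort(.speaker)

        let file = try AVAudioFile(forReading: fileURL)
        engine.attach(player)
        engine.attach(pitch)
        // Connecting with the file's processingFormat avoids the format
        // mismatches that typically produce error -50.
        engine.connect(player, to: pitch, format: file.processingFormat)
        engine.connect(pitch, to: engine.outputNode, format: file.processingFormat)

        player.scheduleFile(file, at: nil, completionHandler: nil)
        try engine.start()
        player.play()
    } catch {
        print("Audio setup failed: \(error)")
    }
}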

Related

iOS WebRTC Local Media NSInvalidArgumentException [RTCI420Frame nativeHandle]

I am trying to create a local media stream in my iOS WebRTC app. See the code below:
let localStream = pcFactory.mediaStream(withLabel: "ARDAMS")!
let audio = pcFactory.audioTrack(withID: "ARDAMSa0")
localStream.addAudioTrack(audio!)

var device: AVCaptureDevice?
for captureDevice in AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) {
    if let captureDevice = captureDevice as? AVCaptureDevice {
        if captureDevice.position == AVCaptureDevicePosition.front {
            device = captureDevice
        }
    }
}

if let device = device {
    let capture = RTCVideoCapturer(deviceName: device.localizedName)
    let videoSource = pcFactory.videoSource(with: capture, constraints: nil)
    localVideoTrack = pcFactory.videoTrack(withID: "ARDAMSv0", source: videoSource)
    localStream.addVideoTrack(localVideoTrack)
}

self.peerConnection?.add(localStream)
localVideoTrack?.add(localVideoView)
Everything works, but after I add the localVideoView to the localVideoTrack I get an error:
-[RTCI420Frame nativeHandle]: unrecognized selector sent to instance 0x170010620
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[RTCI420Frame nativeHandle]: unrecognized selector sent to instance 0x170010620'
All of the code is running on the main thread, and the app has the appropriate permissions and plist keys. When I walk through the code line by line in the debugger, everything seems to run correctly. This code was taken from the Obj-C AppRTC demo; it has just been converted to Swift. I can't seem to find the difference between my Swift project that crashes and the working AppRTC project. Any idea what I am doing wrong? I am testing on a 64-bit device. Thanks!

AVAudioEngine throws exception when connecting AVAudioPlayerNode to output

All I want to do (for now) is play a sound file using AVAudioEngine and AVAudioPlayerNode. Here's my code:
//Init engine and player
let audioEngine = AVAudioEngine()
let audioPlayerNode = AVAudioPlayerNode()
//Hook them up
audioEngine.attach(audioPlayerNode)
audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil) // < error
While executing the last line, an exception is thrown. Nothing is printed to the console and execution continues, but when I set an exception breakpoint I get the following in LLDB:
(lldb) po [$arg1 reason]
error: Execution was interrupted, reason: EXC_BAD_ACCESS (code=1, address=0xffffd593).
The process has been returned to the state before expression evaluation.
What can't be accessed here? I haven't even loaded a file yet... Thanks for any hints.
Environment
Xcode 8.2.1
iPhone 5 running on iOS 10.3.2 (14F89)
Edit
Here is some more contextual information. The code above is part of an iOS game built with SpriteKit and GameplayKit. It is located in a subclass of GKStateMachine, within a method called playSound. This method is invoked in the course of a touch event originating from my subclassed SKScene, called GameScene. Upon touchesBegan, the call is delegated to all entities with a TouchComponent that have a method with the same signature (touchesBegan). These components fire a touchDown event to their delegate, which is in turn my subclass of GKStateMachine, called GameStateMachine. If the touch event is correct with respect to the game rules, the score property of my GameStateMachine is incremented. Within the score setter, the final method playSound is called if the score increases. Here's a sequence diagram of what I just described:
Here's a working example of playing a local file resource in a playground:
import AVFoundation
import PlaygroundSupport
// prevent program from exiting immediately
PlaygroundPage.current.needsIndefiniteExecution = true
let fileURL = Bundle.main.url(forResource: "song", withExtension: "mp3")!
let file = try! AVAudioFile(forReading: fileURL)
let audioEngine = AVAudioEngine()
let audioPlayerNode = AVAudioPlayerNode()
audioEngine.attach(audioPlayerNode)
audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil)
audioPlayerNode.scheduleFile(file, at: nil, completionHandler: nil)
// need to start the engine before we play
try! audioEngine.start()
audioPlayerNode.play()
This assumes that song.mp3 exists in the playground resources directory.
Apparently, the exception thrown is normal. It occurs under any condition and in any environment I have tested, but does not interfere with normal functionality.
The reason why no sound was played is that the AVAudioEngine and AVAudioPlayerNode objects were released as soon as the function returned, because they had no strong pointers keeping them alive. I have fixed the issue by keeping those two objects as properties.
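In code, the fix looks roughly like this. A minimal sketch, assuming the player lives in some long-lived owner; the SoundPlayer name is made up, and the GKStateMachine subclass from the question would work the same way:

import AVFoundation

final class SoundPlayer {
    // Stored properties keep strong references, so the engine and node
    // survive after playSound(at:) returns.
    private let audioEngine = AVAudioEngine()
    private let audioPlayerNode = AVAudioPlayerNode()

    init() {
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil)
    }

    func playSound(at url: URL) throws {
        let file = try AVAudioFile(forReading: url)
        audioPlayerNode.scheduleFile(file, at: nil, completionHandler: nil)
        if !audioEngine.isRunning {
            try audioEngine.start()
        }
        audioPlayerNode.play()
    }
}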

Video compression using AVAssetWriter

I've created a function to compress a video file. It uses AVAssetWriter and adds inputs and outputs for the video and audio tracks. When it starts writing, I get an error as soon as the AVAssetReader for the audio track starts reading (audioReader.startReading()). Here is the error: *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetReader startReading] cannot be called again after reading has already started'.
The code: https://gist.github.com/jaumevn/9ba329aaf49c81c57a276fd135f53f20
Can anyone see what's the problem here? Thanks!
On line 77 of your code, you're starting a second AVAssetReader on the same file.
You don't need two readers; instead, add the audio track output as another output on the existing AVAssetReader.
Something like this:
let videoReaderSettings: [String : Int] = [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)]
let videoReaderOutput = AVAssetReaderTrackOutput(track: videoAssetTrack, outputSettings: videoReaderSettings)

let audioReaderSettings: [String : Any] = [AVFormatIDKey: Int(kAudioFormatLinearPCM)]
let audioReaderOutput = AVAssetReaderTrackOutput(track: audioAssetTrack, outputSettings: audioReaderSettings)

// One reader for the whole asset, with both track outputs attached
let videoReader = try! AVAssetReader(asset: videoAsset)
videoReader.addOutput(videoReaderOutput)
videoReader.addOutput(audioReaderOutput)

videoWriter.startWriting()
videoReader.startReading()
Look into using AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate to capture and process the buffers from the reader.
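If you stay with the reader/writer pipeline instead, the usual pattern is to drain each AVAssetReaderTrackOutput into its matching AVAssetWriterInput using requestMediaDataWhenReady. A rough sketch, with an illustrative drain helper; the outputs and inputs are assumed to be configured as in the gist:

import AVFoundation

// Sketch: pump sample buffers from one reader output into a writer input.
// Assumes startWriting()/startReading() have already been called.
func drain(_ output: AVAssetReaderTrackOutput,
           into input: AVAssetWriterInput,
           on queue: DispatchQueue,
           completion: @escaping () -> Void) {
    input.requestMediaDataWhenReady(on: queue) {
        // Append while the writer can accept more data.
        while input.isReadyForMoreMediaData {
            if let buffer = output.copyNextSampleBuffer() {
                input.append(buffer)
            } else {
                // The reader is exhausted: close this input and stop.
                input.markAsFinished()
                completion()
                break
            }
        }
    }
}

Once both the video and audio inputs have finished, call videoWriter.finishWriting(completionHandler:) to close the output file.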

Video upload from iPhone device fails but works on Simulator perfectly ERROR:'Cannot read file'

I am trying to upload a video from iPhone device as:
var uploadTask = self.session?.uploadTaskWithRequest(request, fromFile:NSURL(string: assetFilePath.path)!)
This code works on the simulator and returns a session task object that I can resume, but it does not work on an iPhone device.
It fails with:
2015-05-19 18:36:44.718 myApp[327:24703] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Cannot read file at file:///var/mobile/Media/DCIM/100APPLE/IMG_0144.mp4'
I tried to check whether the video file is readable, but the check returns false on the iPhone:
fileManager.fileExistsAtPath(asset.path) // returns false
Has anybody encountered this before, or am I doing something wrong here?
The code I am using to get the file path is:
let options = PHFetchOptions()
options.sortDescriptors = [
    NSSortDescriptor(key: "creationDate", ascending: true)
]
currentVideofetch = PHAsset.fetchAssetsWithMediaType(.Video, options: options)

let asset = self.currentVideofetch.objectAtIndex(indexPath.row) as? PHAsset
var assetLength: NSNumber!
var assetFilePath: NSString!

if let checkdAsset = asset {
    PHImageManager.defaultManager().requestImageDataForAsset(checkdAsset, options: nil) {
        imageData, dataUTI, orientation, info in
        assetLength = imageData.length as NSNumber
        let assetFilePathUrl = info["PHImageFileURLKey"] as? NSURL
        assetFilePath = assetFilePathUrl!.absoluteString!
        println("Assets FilePath \(assetFilePath)") // returns file:///var/mobile/Media/DCIM/100APPLE/IMG_0144.mp4
    }
}
After messing around with this a lot: it is a classic iOS permissions issue, and unfortunately I never found a straight answer for it. We had to copy the file into our app's local directory; after that, everything works like a charm.
For large files, I move the copying logic into a background task.
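A sketch of that workaround, with illustrative names: PHImageManager's requestAVAsset(forVideo:options:resultHandler:) hands back an AVURLAsset whose file can be copied into the app's own container and uploaded from there.

import Photos
import AVFoundation

// Sketch: copy a PHAsset's video file into the app's tmp directory,
// then upload from the local copy instead of the DCIM path.
func copyVideoLocally(_ asset: PHAsset, completion: @escaping (URL?) -> Void) {
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true // allow an iCloud download if needed

    PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
        guard let urlAsset = avAsset as? AVURLAsset else {
            completion(nil)
            return
        }
        let localURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(urlAsset.url.lastPathComponent)
        do {
            try? FileManager.default.removeItem(at: localURL) // clear any stale copy
            try FileManager.default.copyItem(at: urlAsset.url, to: localURL)
            completion(localURL)
        } catch {
            completion(nil)
        }
    }
}

The resulting local URL can then be handed to the upload task in place of the DCIM path from the question.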

Saving AVAudioRecorder to NSUserDefaults

I'm trying to start an audio recording on the Apple Watch and allow it to be stopped on the iPhone.
To share the information from the watch to the phone, I am trying the following:
var recorder: AVAudioRecorder!
recorder = AVAudioRecorder(URL: soundFileURL, settings: recordSettings, error: &error)

if let watchDefaults = NSUserDefaults(suiteName: "group.spywatchkit") {
    let encodedRecorder = NSKeyedArchiver.archivedDataWithRootObject(recorder) as NSData
    watchDefaults.setObject(encodedRecorder, forKey: "test")
}
However, this results in the following error:
'NSInvalidArgumentException', reason: '-[AVAudioSession encodeWithCoder:]: unrecognized selector sent to instance
This appears to be failing because AVAudioRecorder doesn't conform to the NSCoding protocol. Is there another way to save this object? Can I recreate it later?
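One possible workaround, sketched rather than tested: don't archive the recorder at all. Assuming both targets share the "group.spywatchkit" app group (which a WatchKit 1 extension and its host app do, since both run on the phone), record into the shared container and pass only the file name through the shared defaults; the other process can then create a fresh AVAudioRecorder or AVAudioPlayer against the same URL.

import AVFoundation

let groupID = "group.spywatchkit" // the suite name from the question

func startSharedRecording() throws -> AVAudioRecorder {
    guard let container = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: groupID) else {
        throw NSError(domain: "AppGroupUnavailable", code: 1)
    }
    let fileURL = container.appendingPathComponent("recording.caf")
    // Illustrative settings; any valid recorder settings work here.
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatAppleIMA4),
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1
    ]
    let recorder = try AVAudioRecorder(url: fileURL, settings: settings)
    recorder.record()
    // Persist only the file name; the recorder itself never needs encoding.
    UserDefaults(suiteName: groupID)?.set(fileURL.lastPathComponent, forKey: "recordingName")
    return recorder
}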
