iOS WebRTC Local Media NSInvalidArgumentException [RTCI420Frame nativeHandle]

I am trying to create a local media stream in my iOS WebRTC app. See the code below:
let localStream = pcFactory.mediaStream(withLabel: "ARDAMS")!
let audio = pcFactory.audioTrack(withID: "ARDAMSa0")
localStream.addAudioTrack(audio!)

var device: AVCaptureDevice?
for captureDevice in AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) {
    if let captureDevice = captureDevice as? AVCaptureDevice {
        if captureDevice.position == AVCaptureDevicePosition.front {
            device = captureDevice
        }
    }
}

if let device = device {
    let capture = RTCVideoCapturer(deviceName: device.localizedName)
    let videoSource = pcFactory.videoSource(with: capture, constraints: nil)
    localVideoTrack = pcFactory.videoTrack(withID: "ARDAMSv0", source: videoSource)
    localStream.addVideoTrack(localVideoTrack)
}

self.peerConnection?.add(localStream)
localVideoTrack?.add(localVideoView)
Everything works until I add the localVideoView to the localVideoTrack, at which point I get an error:
-[RTCI420Frame nativeHandle]: unrecognized selector sent to instance 0x170010620
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[RTCI420Frame nativeHandle]: unrecognized selector sent to instance 0x170010620'
All of the code is running on the main thread, and the app has the appropriate permissions and plist keys. When I step through the code line by line in the debugger, everything seems to run correctly. This code was taken from the Obj-C AppRTC demo and has just been converted to Swift. I can't find the difference between my Swift project that crashes and the working AppRTC project. Any idea what I am doing wrong? I am testing on a 64-bit device. Thanks!
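One thing worth double-checking (an assumption on my part, since the view setup isn't shown): in the Obj-C AppRTC demo, the object handed to the track's add(_:) is an RTCEAGLVideoView, which conforms to RTCVideoRenderer; handing the track anything that does not fully implement the renderer protocol can surface as unrecognized-selector crashes once frames start arriving. A minimal sketch of that renderer wiring, reusing the names from the snippet above:

```swift
// Sketch, assuming the same localVideoTrack as above: render the local
// track into an RTCEAGLVideoView (conforms to RTCVideoRenderer),
// not a plain UIView.
let localVideoView = RTCEAGLVideoView(frame: self.view.bounds)
self.view.addSubview(localVideoView)
localVideoTrack?.add(localVideoView) // renderer receives the RTCI420Frame callbacks
```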

Related

AVCaptureStillImageOutput connection withType returns nil connection

I'm using AVCaptureStillImageOutput to capture a photo on iOS.
To capture the photo I call captureStillImageAsynchronously, which requires a connection, so I use:
let connection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo)
However, on certain devices (namely an iPad 2 on iOS 9.3.5) I see that the connection returned is nil.
I also tried iterating through all connections using:
stillImageOutput.connections
This shows there are no connections available.
Has anyone else encountered this issue? Is there a better way to obtain the connection? I realize I'm using a deprecated class, but the replacement API is not available on iOS 9 and we still need to support that platform. BTW, the Camera app itself appears to work just fine on this device.
I also just noticed that canAddInput on AVCaptureSession is returning false.
The input is obtained as follows:
guard let captureDevices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice],
    let captureDevice = captureDevices.first(where: { $0.position == .back }),
    let captureDeviceInput = try? AVCaptureDeviceInput(device: captureDevice) else {
        return
}
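For reference, a sketch of the order of operations that usually has to hold before connection(withMediaType:) returns non-nil (same deprecated, iOS 9-compatible API as above; captureDeviceInput is reused from the question's guard):

```swift
let session = AVCaptureSession()
session.beginConfiguration()

// canAddInput returns false if the session preset is unsupported by the
// device or an input is already attached, so check before adding.
if session.canAddInput(captureDeviceInput) {
    session.addInput(captureDeviceInput)
}

let stillImageOutput = AVCaptureStillImageOutput()
if session.canAddOutput(stillImageOutput) {
    session.addOutput(stillImageOutput)
}

session.commitConfiguration()
session.startRunning()

// The connection only exists once both the input and the output have been
// added, so query it after the configuration is committed.
let connection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo)
```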

NSRangeException with PHFetchResult

I am developing an iOS app that fetches videos from the photo gallery, and it has always worked. I just tested the app on a different device (not the first one I have tested on), and it crashes while I use the retrieved data; I don't understand why.
Here is my code:
self.videosAssets = PHAsset.fetchAssetsWithMediaType(.Video, options: nil)
if self.videosAssets != nil {
    for i in 0..<self.videosAssets!.count {
        if let video = self.videosAssets!.objectAtIndex(i) as? PHAsset {
            self.videos.append(Video(asset: video))
        }
    }
}
It fetches 221 videos, but it crashes when i == 59.
Here is the error I get:
Terminating app due to uncaught exception 'NSRangeException', reason: '*** -[__NSArray0 objectAtIndex:]: index 0 beyond bounds for empty NSArray'
self.videosAssets = PHAsset.fetchAssetsWithMediaType(.Video, options: nil)
if let videoAssets = self.videosAssets {
    videoAssets.forEach { video in
        if let video = video as? PHAsset {
            self.videos.append(Video(asset: video))
        }
    }
}
After testing the Video constructor, it turned out to be the problem. For some reason, when I call let resources = PHAssetResource.assetResourcesForAsset(asset), it returns an empty array, and this is where the app was crashing.
Sorry for this useless post; maybe it will help someone...
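Building on that finding, a defensive variant (a sketch, not tested on the affected device) would skip assets whose resource list comes back empty — e.g. iCloud-only items — before handing them to the Video initializer:

```swift
// Video and self.videos are the question's own types; PHAssetResource
// requires iOS 9+.
let fetched = PHAsset.fetchAssetsWithMediaType(.Video, options: nil)
fetched.enumerateObjectsUsingBlock { object, _, _ in
    guard let asset = object as? PHAsset else { return }
    // Skip assets with no local resources instead of crashing in Video's init.
    if !PHAssetResource.assetResourcesForAsset(asset).isEmpty {
        self.videos.append(Video(asset: asset))
    }
}
```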

Video upload from iPhone device fails but works on Simulator perfectly ERROR:'Cannot read file'

I am trying to upload a video from iPhone device as:
var uploadTask = self.session?.uploadTaskWithRequest(request, fromFile:NSURL(string: assetFilePath.path)!)
This code works on the simulator and gives me a session task object which I can resume, but it does not work on an iPhone device.
It fails with:
2015-05-19 18:36:44.718 myApp[327:24703] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Cannot read file at file:///var/mobile/Media/DCIM/100APPLE/IMG_0144.mp4'
I tried to check whether the video file has read access, but the check returns false on the iPhone:
fileManager.fileExistsAtPath(asset.path) // returns false
Anybody has encountered this before, or am I doing something wrong here?
The code I am using to get the file path is:
let options = PHFetchOptions()
options.sortDescriptors = [
    NSSortDescriptor(key: "creationDate", ascending: true)
]
currentVideofetch = PHAsset.fetchAssetsWithMediaType(.Video, options: options)

let asset = self.currentVideofetch.objectAtIndex(indexPath.row) as? PHAsset
var assetLength: NSNumber!
var assetFilePath: NSString!

if let checkdAsset = asset {
    PHImageManager.defaultManager().requestImageDataForAsset(checkdAsset, options: nil) {
        imageData, dataUTI, orientation, info in
        assetLength = imageData.length as NSNumber
        let assetFilePathUrl = info["PHImageFileURLKey"] as? NSURL
        assetFilePath = assetFilePathUrl!.absoluteString!
        println("Assets FilePath \(assetFilePath)") // returns file:///var/mobile/Media/DCIM/100APPLE/IMG_0144.mp4
    }
}
After messing around with this a lot: it is a classic permissions issue in iOS, and unfortunately I didn't find any straight answer to it. We had to copy the file into our app's own local directory; after that, everything works like a charm.
For large files, I moved the copying logic into a background task.
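A sketch of that copy-into-the-sandbox workaround (same pre-Swift-3 API style as the question; checkdAsset, request, and self.session are the question's names, and the temporary filename is mine):

```swift
// Fetch the AVAsset backing the PHAsset, copy its file into the app's
// tmp directory, then upload the copy instead of the DCIM original.
PHImageManager.defaultManager().requestAVAssetForVideo(checkdAsset, options: nil) { avAsset, _, _ in
    guard let urlAsset = avAsset as? AVURLAsset else { return }
    let tmpPath = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("upload.mp4")
    let tmpURL = NSURL.fileURLWithPath(tmpPath)
    let fileManager = NSFileManager.defaultManager()
    _ = try? fileManager.removeItemAtURL(tmpURL) // clear any stale copy
    do {
        // Copy out of the photo library into a location the app can read.
        try fileManager.copyItemAtURL(urlAsset.URL, toURL: tmpURL)
        let task = self.session?.uploadTaskWithRequest(request, fromFile: tmpURL)
        task?.resume()
    } catch {
        print("Copy failed: \(error)")
    }
}
```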

exception com.apple.coreaudio.avfaudio reason: error -50

I get the exception below when I try to play an audio file with a different pitch, and I googled the error with no success. Setting breakpoints didn't reveal anything either; I printed all of the objects to see if anything was nil, but I didn't find anything. The most mysterious thing is that it only happens on my iPhone 6 Plus; on the other phones I tested it doesn't break. I then went back to the project I originally took the sound-effect code from:
https://github.com/atikur/Pitch-Perfect
And if you run it, it works, until you change:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error)
To:
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)
And then boom (only on a real device attached to Xcode; it works in the simulator):
2015-03-21 11:56:13.311 Pitch Perfect[1237:607678] 11:56:13.311 ERROR: [0x10320c000] AVAudioFile.mm:496: -[AVAudioFile readIntoBuffer:frameCount:error:]: error -50
2015-03-21 11:56:13.313 Pitch Perfect[1237:607678] * Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -50'
* First throw call stack:
(0x18687a530 0x1978040e4 0x18687a3f0 0x1851ea6c0 0x185232d38 0x1852130f8 0x185212ccc 0x100584fd4 0x100584f94 0x10058fdb8 0x1005882c4 0x1005925d4 0x100592208 0x198037dc8 0x198037d24 0x198034ef8)
libc++abi.dylib: terminating with uncaught exception of type NSException
And the really, really weird thing: for some reason, after printing audioEngine, audioEngine.outputNode becomes nil.
I had the same error. I had created a "sound.swift" class that my view controller would instantiate, then decided to simplify everything and focus on making the sound work. So I put the following code in the view controller, and it works:
// fetch recorded file
var pitchPlayer = AVAudioPlayerNode()
var timePitch = AVAudioUnitTimePitch()
let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as! String
var pathArray = [dirPath, String("son.wav")]
filePath = NSURL.fileURLWithPathComponents(pathArray)
audioFile = AVAudioFile(forReading: filePath.filePathURL, error: nil)

audioEngine = AVAudioEngine()
audioEngine.attachNode(pitchPlayer)
audioEngine.attachNode(timePitch)

// create a session
var session = AVAudioSession.sharedInstance()
session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)

// route output audio to the speaker
session.overrideOutputAudioPort(AVAudioSessionPortOverride.Speaker, error: nil)

audioEngine.connect(pitchPlayer, to: timePitch, format: audioFile.processingFormat)
audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
pitchPlayer.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
audioEngine.startAndReturnError(&audioError)
pitchPlayer.play()
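A side note on the snippet above: it passes nil or an unchecked &error everywhere, which hides where the -50 actually originates. A variant using the Swift 2-era throwing versions of the same calls (pitchPlayer, timePitch, audioEngine, and filePath as declared above) surfaces the failing step:

```swift
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.overrideOutputAudioPort(.Speaker)
    try session.setActive(true)

    // Throws instead of silently returning an unusable file.
    let audioFile = try AVAudioFile(forReading: filePath)

    audioEngine.connect(pitchPlayer, to: timePitch, format: audioFile.processingFormat)
    audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
    pitchPlayer.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
    try audioEngine.start()
    pitchPlayer.play()
} catch {
    // Whichever call throws first is the one producing the OSStatus error.
    print("Audio setup failed: \(error)")
}
```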

Saving AVAudioRecorder to NSUserDefaults

I'm trying to start an audio recording on the Apple Watch and allow it to be stopped on the iPhone.
To share the information from the watch to the phone, I am trying the following:
var recorder: AVAudioRecorder!
recorder = AVAudioRecorder(URL: soundFileURL, settings: recordSettings, error: &error)

if let watchDefaults = NSUserDefaults(suiteName: "group.spywatchkit") {
    let encodedRecorder = NSKeyedArchiver.archivedDataWithRootObject(recorder) as NSData
    watchDefaults.setObject(encodedRecorder, forKey: "test")
}
However, this results in the following error:
'NSInvalidArgumentException', reason: '-[AVAudioSession encodeWithCoder:]: unrecognized selector sent to instance
This appears to fail because AVAudioRecorder doesn't conform to the NSCoding protocol. Is there another way to save this object? Can I recreate the object later?
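Right: AVAudioRecorder conforms to neither NSCoding nor NSSecureCoding, so archiving it cannot work. What you can share are the ingredients needed to rebuild it, namely the file URL and the settings dictionary. A sketch in the same API style as the question (the defaults key names are mine):

```swift
// On the watch side: persist only what is needed to recreate the recorder.
if let watchDefaults = NSUserDefaults(suiteName: "group.spywatchkit") {
    watchDefaults.setObject(soundFileURL.absoluteString, forKey: "recordingURL")
    watchDefaults.setObject(recordSettings, forKey: "recordingSettings")
}

// On the phone side: rebuild a recorder pointing at the same shared file.
if let watchDefaults = NSUserDefaults(suiteName: "group.spywatchkit"),
    urlString = watchDefaults.stringForKey("recordingURL"),
    settings = watchDefaults.dictionaryForKey("recordingSettings"),
    url = NSURL(string: urlString) {
    var error: NSError?
    let recorder = AVAudioRecorder(URL: url, settings: settings, error: &error)
}
```

Note that this creates a new AVAudioRecorder in the phone process; the recorder started by the watch extension lives in the process that created it and can only be stopped there, so the URL and settings are the only state worth sharing.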
