AVAudioEngine: no sound - iOS

I'm trying out AVAudioEngine: I watched the WWDC sessions about it and did everything by the book to try to play a simple file.
Despite everything matching the samples I found (WWDC and a few other places), and despite everything seeming fine (no error, the engine reports it is running), I get no sound output.
Here's the code:
NSError *error = nil;
[AVAudioSession.sharedInstance setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:&error];
[AVAudioSession.sharedInstance setActive:YES error:&error];

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];

NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"piano" withExtension:@"wav"];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:fileURL error:&error];

AVAudioMixerNode *mainMixer = [engine mainMixerNode];
AVAudioOutputNode *outputNode = engine.outputNode;
[engine connect:player to:mainMixer format:file.processingFormat];
[engine connect:engine.mainMixerNode to:outputNode format:nil];
[engine prepare];

if (!engine.isRunning) {
    if (![engine startAndReturnError:&error]) {
        //TODO
    }
}

[player scheduleFile:file atTime:nil completionHandler:nil];
[player play];

NSLog(@"Engine description: %@", [engine description]);
Here's the output of the log call:
Engine description:
________ GraphDescription ________
AVAudioEngineGraph 0x10100d820: initialized = 1, running = 1, number of nodes = 3
******** output chain ********
node 0x28388d200 {'auou' 'rioc' 'appl'}, 'I'
inputs = 1
(bus0, en1) <- (bus0) 0x2838a0880, {'aumx' 'mcmx' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
node 0x2838a0880 {'aumx' 'mcmx' 'appl'}, 'I'
inputs = 1
(bus0, en1) <- (bus0) 0x282a8df00, {'augn' 'sspl' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
outputs = 1
(bus0, en1) -> (bus0) 0x28388d200, {'auou' 'rioc' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
node 0x282a8df00 {'augn' 'sspl' 'appl'}, 'I'
outputs = 1
(bus0, en1) -> (bus0) 0x2838a0880, {'aumx' 'mcmx' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
______________________________________
I have no idea how to diagnose this and can't find any mistake or reason for which this would not work.
Any idea?

Keeping the question up in case anyone else hits this problem: since I was just testing, all my variables were locals in a test method, so as soon as the method returned, the engine was released by ARC.
Moving the AVAudioEngine variable outside the method (keeping a strong reference to it) fixed it.
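For anyone who wants to see the shape of that fix, here's a minimal hedged sketch (the class and method names are hypothetical, not from the original code):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical names: PlayerViewController / setUpAndPlay are not from the post.
@interface PlayerViewController : UIViewController
// Strong references at instance scope, so ARC keeps the engine alive
// after the setup method returns.
@property (nonatomic, strong) AVAudioEngine *engine;
@property (nonatomic, strong) AVAudioPlayerNode *player;
@end

@implementation PlayerViewController

- (void)setUpAndPlay {
    self.engine = [[AVAudioEngine alloc] init];
    self.player = [[AVAudioPlayerNode alloc] init];
    [self.engine attachNode:self.player];
    // ... connect the nodes, start the engine, and schedule the file exactly as above ...
    // Because self.engine and self.player outlive this method, playback is audible.
}

@end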

Related

How to get the right AVAudioFormat for connecting nodes

I get a crash with this code:
// ViewController.swift
import UIKit
import AVFoundation

class ViewController: UIViewController {
    var engine: AVAudioEngine!
    var EQNode: AVAudioUnitEQ!

    override func viewDidLoad() {
        engine.reset()
        let Format = engine.inputNode.outputFormat(forBus: 0)
        print("channelcount:", engine.inputNode.inputFormat(forBus: 0).channelCount)

        //-----> CRASH: App stopped here
        engine.connect(engine.inputNode, to: EQNode, format: Format)
        engine.connect(EQNode, to: engine.mainMixerNode, format: Format)

        var error: NSError?
        engine.prepare()
        print("done prepare")
        do {
            try engine.start()
        } catch {
            print(error)
        }
        print("done start")
    }
}
And if I change Format to nil, my app doesn't work, but it doesn't crash either.
All of this works perfectly fine in the Xcode simulator with no errors.
But when I test on a real iOS device (an iPad 2019), it crashes.
Details about my app: it adjusts the live microphone input with an equalizer and plays back the equalized sound in real time.
ERROR:
SelfhearEQ[3532:760180] [aurioc] AURemoteIO.cpp:1086:Initialize: failed: -10851
(enable 1, outf< 2 ch, 0 Hz, Float32, non-inter> inf< 1 ch, 44100 Hz, Float32>)
channelcount: 0
2019-10-22 18:01:29.891748+0700 SelfhearEQ[3532:760180] [aurioc] AURemoteIO.cpp:1086:Initialize: failed: -10851
(enable 1, outf< 2 ch, 0 Hz, Float32, non-inter> inf< 1 ch, 44100 Hz, Float32>)
2019-10-22 18:01:29.892326+0700 SelfhearEQ[3532:760180] [avae]
AVAEInternal.h:76 required condition is false: [AVAudioEngineGraph.mm:2127:_Connect: (IsFormatSampleRateAndChannelCountValid(format))]
2019-10-22 18:01:29.896270+0700 SelfhearEQ[3532:760180] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'
I found the answer for this: it's not the format that causes this error.
Check out my other question, where it is fixed:
avaudioengine-connect-crash-on-hardware-not-simulator
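For what it's worth, a common reason for inputNode reporting 0 channels on hardware (but not in the simulator) is that the session isn't set up for recording or the microphone permission hasn't been granted. This is an assumption, not the linked answer, but a minimal sketch of making the input available before connecting it would be:

#import <AVFoundation/AVFoundation.h>

// Assumption only (the linked question contains the actual fix): on hardware,
// inputNode can report 0 channels until the session allows recording and the
// microphone permission has been granted.
static void prepareRecordingSession(void) {
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session setActive:YES error:&error];
    [session requestRecordPermission:^(BOOL granted) {
        // Only touch engine.inputNode and connect the graph once `granted` is YES.
    }];
}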

Invalid image metadata when trying to display a Live Photo with PHLivePhotoView (Objective-C)

I am trying to load a jpg image together with a mov file in Objective-C on an iOS device to display a Live Photo, and I wrote the following code snippet to do that in the viewDidLoad function:
- (void)viewDidLoad {
    [super viewDidLoad];

    PHLivePhotoView *photoView = [[PHLivePhotoView alloc] initWithFrame:self.view.bounds];

    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];

    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                     placeholderImage:[UIImage imageNamed:@"livePhoto.jpg"]
                                           targetSize:self.view.bounds.size
                                          contentMode:PHImageContentModeAspectFit
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];
}
I have dragged the files livePhoto.jpg and livePhoto.mov into the Xcode project.
But when I build it, Xcode logs this error:
2017-11-28 17:46:08.568455+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos[3669:1276806] Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos[3669:1276806] Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos[3669:1276778] we are in handler
Any idea about that? Thanks.
And another thing to ask:
Why is the resultHandler called twice, according to what is printed?
TL;DR
Here's the code to store Live Photos and upload them to a server:
1. Capturing Live Photo
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
     resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
      bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
                error:(NSError *)error {
    if (error) {
        [self raiseError:error];
        return;
    }

    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                                                     previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];

    [self.expectedAsset addInput:image.properties]; // 1. This is the metadata (which will be lost in step 2.)
    [self.expectedAsset addInput:[UIImage imageWithCIImage:image]]; // 2. Creating the image, but UIImage is not designed to carry the required metadata
}

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL
             duration:(CMTime)duration
     photoDisplayTime:(CMTime)photoDisplayTime
     resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
                error:(nullable NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:outputFileURL]; // 3. Store the URL to the actual video file
    }
}
expectedAsset is just an object holding all required information; you can use an NSDictionary instead. And since the snippet above uses the delegate method that is deprecated as of iOS 11, here's the variant for iOS 11 and later:
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:[photo metadata]];
        [self.expectedAsset addInput:[UIImage imageWithData:[photo fileDataRepresentation]]];
    }
}

#pragma clang diagnostic pop
2. Generate NSData
- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);

    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);

    NSMutableDictionary *maker = [NSMutableDictionary new];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary]
              forKey:(NSString *)kCGImagePropertyMakerAppleDictionary]; // imageMetadata is the dictionary from step 1 above

    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);

    return dest_data;
}

- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    callback(@{@"image": self.imageData, @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]}); // livePhotoURL is the URL from step 3 above
}
Long Answer
This is caused by wrong metadata in the video/image file.
When creating a Live Photo, PHLivePhoto searches for the key 17 in the kCGImagePropertyMakerAppleDictionary (which is the asset identifier) and matches it against the com.apple.quicktime.content.identifier of the mov file. The mov file also needs an entry for the time at which the still image was captured (com.apple.quicktime.still-image-time).
Make sure your files haven't been edited (or exported) somewhere. Even the UIImageJPEGRepresentation function will remove this data from the image.
Here's a code snippet I'm using to convert the UIImage to NSData:
- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);

    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);

    NSMutableDictionary *maker = [NSMutableDictionary new];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary]
              forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];

    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);

    return dest_data;
}
The handler gets called twice: first to tell you about the corrupt data, and the second time about the cancellation of the process (these are two different keys in the info dictionary).
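A hedged sketch of telling those invocations apart via the info dictionary (the helper is illustrative, not from the original answer; the keys are the standard PHLivePhoto info keys):

#import <Photos/Photos.h>
#import <PhotosUI/PhotosUI.h>

// Hedged sketch: inspect the info dictionary passed to the result handler to
// distinguish error, cancellation, and degraded (interim) results.
static void handleLivePhotoResult(PHLivePhoto *livePhoto, NSDictionary *info, PHLivePhotoView *photoView) {
    NSError *error = info[PHLivePhotoInfoErrorKey];                 // set when the resources are invalid
    BOOL cancelled = [info[PHLivePhotoInfoCancelledKey] boolValue];
    BOOL degraded  = [info[PHLivePhotoInfoIsDegradedKey] boolValue]; // lower-quality interim result
    if (error || cancelled) {
        NSLog(@"Live Photo request failed: %@ (cancelled: %d)", error, cancelled);
        return;
    }
    if (!degraded) {
        photoView.livePhoto = livePhoto; // final, full-quality result
    }
}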
EDIT:
Here's your mov data:
$ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2018-01-27T11:07:38.000000Z
com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
Metadata:
creation_time : 2018-01-27T11:07:38.000000Z
handler_name : Core Media Data Handler
encoder : 'avc1'
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
Metadata:
creation_time : 2018-01-27T11:07:38.000000Z
handler_name : Core Media Data Handler
The com.apple.quicktime.still-image-time key is missing here.
Here's how the metadata should look:
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2017-12-15T12:41:00.000000Z
com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
com.apple.quicktime.make: Apple
com.apple.quicktime.model: iPhone X
com.apple.quicktime.software: 11.1.2
com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
Metadata:
rotate : 90
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
encoder : H.264
Side data:
displaymatrix: rotation of -90.00 degrees
Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
Metadata:
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
Metadata:
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
Metadata:
creation_time : 2017-12-15T12:41:00.000000Z
handler_name : Core Media Data Handler
And just FYI, here's your JPEG data:
$ magick identify -format %[EXIF:*] cf70b7de66bd89654967aeef1d557816.jpg
exif:ColorSpace=1
exif:ExifImageLength=960
exif:ExifImageWidth=540
exif:ExifOffset=26
exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0
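Since the mov shown above is missing the com.apple.quicktime.still-image-time key, the fix is on the video side. The original answer doesn't show code for that, but as an assumption-laden sketch, one common approach is to remux the movie with AVAssetWriter and attach a timed metadata track for that key (the identifiers and data types below are the ones typically used for this; treat them as assumptions, not the answerer's code):

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Hedged sketch: create a metadata adaptor for a "still-image-time" track.
// The returned adaptor's assetWriterInput must be added to the AVAssetWriter
// (alongside the copied video/audio inputs) before writing starts.
static AVAssetWriterInputMetadataAdaptor *MakeStillImageTimeAdaptor(void) {
    NSDictionary *spec = @{
        (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
            @"mdta/com.apple.quicktime.still-image-time",
        (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
            (NSString *)kCMMetadataBaseDataType_SInt8,
    };
    CMFormatDescriptionRef desc = NULL;
    CMMetadataFormatDescriptionCreateWithMetadataSpecifications(kCFAllocatorDefault,
                                                                kCMMetadataFormatType_Boxed,
                                                                (__bridge CFArrayRef)@[ spec ],
                                                                &desc);
    AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata
                                                                   outputSettings:nil
                                                                 sourceFormatHint:desc];
    return [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:input];
}

// After [writer startSessionAtSourceTime:kCMTimeZero], append one item at the
// time of the still frame (stillTime is a hypothetical CMTime placeholder):
//
//   AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
//   item.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
//   item.key      = @"com.apple.quicktime.still-image-time";
//   item.value    = @(0);
//   item.dataType = (NSString *)kCMMetadataBaseDataType_SInt8;
//   [adaptor appendTimedMetadataGroup:
//       [[AVTimedMetadataGroup alloc] initWithItems:@[ item ]
//                                         timeRange:CMTimeRangeMake(stillTime, CMTimeMake(1, 100))]];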

AVAudioSession error activating: Error Domain=NSOSStatusErrorDomain Code=561017449: Audio device error when integrating the CallKit API in Objective-C

I am developing a VoIP application using Pjsip in Objective-C.
I want to integrate CallKit, but I get an error in configureAudioSession. I copied AudioController.h and AudioController.mm from Apple's SpeakerBox sample into my project.
And I added this code:
AudioController *audioController;

- (void)configureAudioSession {
    if (!audioController) {
        audioController = [[AudioController alloc] init];
    }
}

- (void)handleIncomingCallFrom:(NSString *)dest {
    CXCallUpdate *callUpdate = [[CXCallUpdate alloc] init];
    [callUpdate setLocalizedCallerName:dest];
    [callUpdate setHasVideo:NO];

    CXHandle *calleeHandle = [[CXHandle alloc] initWithType:CXHandleTypeGeneric value:dest];
    [callUpdate setRemoteHandle:calleeHandle];

    [provider reportNewIncomingCallWithUUID:[NSUUID UUID] update:callUpdate completion:^(NSError *error) {
        [self configureAudioSession];
    }];
}
The phone rings and I can handle the call, but it crashes whenever I answer. I receive this error:
AVAudioSession error activating: Error Domain=NSOSStatusErrorDomain Code=561017449 "(null)"
2017-03-09 18:17:48.830893 MyVoIPProject[1620:971182] [aurioc] 892: failed: '!pri' (enable 3, outf< 1 ch, 16000 Hz, Int16> inf< 1 ch, 16000 Hz, Int16>)
2017-03-09 18:17:48.841301 MyVoIPProject[1620:971182] [aurioc] 892: failed: '!pri' (enable 3, outf< 1 ch, 44100 Hz, Int16> inf< 1 ch, 44100 Hz, Int16>)
2017-03-09 18:17:48.850282 MyVoIPProject[1620:971182] [aurioc] 892: failed: '!pri' (enable 3, outf< 1 ch, 48000 Hz, Int16> inf< 1 ch, 48000 Hz, Int16>)
...
Can you tell me how I can integrate CallKit?
This bug is caused by forgetting to add the microphone usage description (NSMicrophoneUsageDescription) to your Info.plist.
Reference: SpeakerBox from Apple
iOS - AudioUnitInitialize returns error code 561017449
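A minimal sketch of the corresponding setup (the prompt wording is just an example): add the NSMicrophoneUsageDescription key to Info.plist, and optionally confirm the permission in code before the CallKit-activated audio session is used:

#import <AVFoundation/AVFoundation.h>

// Info.plist must contain the NSMicrophoneUsageDescription key, e.g.
// "This app uses the microphone for VoIP calls." (example wording only).
// Sketch: verify the permission before the call's audio session goes active;
// if the description is missing, session activation fails as in the log above.
static void checkMicrophonePermission(void) {
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        if (!granted) {
            NSLog(@"Microphone access denied; call audio cannot be activated.");
        }
    }];
}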

AudioKit crashes on device, but not simulator

In the code below, AudioKit.start() crashes on my iPhone SE with iOS 10.1.1. It works fine in the Simulator.
private func play(note: Int) {
    let pluckedString = AKPluckedString()
    AudioKit.output = pluckedString
    AudioKit.start() // <-- Crash here!
    let frequency = note.midiNoteToFrequency()
    pluckedString.trigger(frequency: frequency)
}
The console error log is
2016-12-04 10:51:45.765130 MyApp[1833:720319] [aurioc] 889: failed: -10851 (enable 2, outf< 2 ch, 0 Hz, Float32, non-inter> inf< 2 ch, 0 Hz, Float32, non-inter>)
2016-12-04 10:51:45.766519 MyApp[1833:720319] [aurioc] 889: failed: -10851 (enable 2, outf< 2 ch, 44100 Hz, Float32, non-inter> inf< 2 ch, 0 Hz, Float32, non-inter>)
2016-12-04 10:51:45.767008 MyApp[1833:720319] [aurioc] 889: failed: -10851 (enable 2, outf< 2 ch, 44100 Hz, Float32, non-inter> inf< 2 ch, 0 Hz, Float32, non-inter>)
2016-12-04 10:51:45.767982 MyApp[1833:720319] [central] 54: ERROR: [0x1b42d7c40] >avae> AVAudioEngineGraph.mm:2515: PerformCommand: error -10851
What have I missed? I can't find any documentation about extra setup needed for devices compared to the simulator. The AudioKit version is 3.5 and the Xcode version is 8.1.
I found the issue. I had a recording category set for the audio session. Making sure that the audio session category isn't AVAudioSessionCategoryRecord during playback fixed the crash.
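A minimal sketch of that fix, written in Objective-C to match the rest of this thread (assuming playback-only output is what the app needs):

#import <AVFoundation/AVFoundation.h>

// Sketch: use a playback (or play-and-record) category instead of
// AVAudioSessionCategoryRecord, which provides no output route and makes the
// output unit initialization fail with -10851 on a real device.
static void configureSessionForPlayback(void) {
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    [session setActive:YES error:&error];
}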

read()/write() calls on iOS seem to be limited to 2250 bytes

I am having a strange problem trying to read and write 9k bytes with open(), read() and write(). When I attempt to write 9k to a file and read it back, the data only goes up to 2250 bytes. After that everything is zeros.
Here is my code (except for the filename, which isn't relevant; I'm just putting it in NSDocumentDirectory):
int fp = open([appFile cStringUsingEncoding:NSASCIIStringEncoding], O_RDWR | O_CREAT, 0644);
[_detailViewController log:@"first open() returns %i (err: %i)", fp, errno];

int data2[10000];
int data3[10000];
for (int i = 0; i < 10000; i++) data2[i] = 1;

[_detailViewController log:@"resetting seek to 0"];
int seekPos = lseek(fp, 0, SEEK_SET);

int result = write(fp, data2, 9000);
[_detailViewController log:@"wrote 9k, result is %i", result];

[_detailViewController log:@"resetting seek to 0"];
seekPos = lseek(fp, 0, SEEK_SET);

result = read(fp, data3, 9000);
[_detailViewController log:@"read 9k, result is %i", result];

[_detailViewController log:@"values of data2[2248-2252] = 0x%x 0x%x 0x%x 0x%x 0x%x", data2[2248], data2[2249], data2[2250], data2[2251], data2[2252]];
[_detailViewController log:@"values of data3[2248-2252] = 0x%x 0x%x 0x%x 0x%x 0x%x", data3[2248], data3[2249], data3[2250], data3[2251], data3[2252]];

close(fp);
And here is the strange output:
2013-02-13 14:08:38.290 FileTester[2800:907] first open() returns 6 (err: 3)
2013-02-13 14:08:38.295 FileTester[2800:907] resetting seek to 0
2013-02-13 14:08:38.301 FileTester[2800:907] wrote 9k, result is 9000
2013-02-13 14:08:38.306 FileTester[2800:907] resetting seek to 0
2013-02-13 14:08:38.311 FileTester[2800:907] read 9k, result is 9000
2013-02-13 14:08:38.319 FileTester[2800:907] values of data2[2248-2252] = 0x1 0x1 0x1 0x1 0x1
2013-02-13 14:08:38.327 FileTester[2800:907] values of data3[2248-2252] = 0x1 0x1 0x0 0x0 0x0
As you can see on the last line, the data suddenly goes to zero.
Any ideas what I might be doing wrong? The thing that really gets me is that both the read() and write() return 9000.
As mentioned by ughoavgfhw (thanks!), the problem was that I was mixing up bytes and ints. 9000 bytes is the same thing as 2250 ints, since each int is 4 bytes.
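In other words, the count argument of read()/write() is in bytes, so a sketch of the corrected calls scales the element count by sizeof(int):

// Sketch of the fix: transfer all 10000 ints (40000 bytes) instead of passing
// a raw element count as if it were a byte count.
size_t byteCount = 10000 * sizeof(int);          // 40000 bytes
ssize_t written  = write(fp, data2, byteCount);  // returns the number of bytes written
ssize_t readBack = read(fp, data3, byteCount);   // returns the number of bytes read
// With the original write(fp, data2, 9000), only 9000 bytes = 2250 ints were
// stored, which is exactly where data3 starts reading back zeros.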
