averagePowerForChannel works with the iOS Simulator but not on iPad - ios

In my small iOS 8.0 application I'm measuring the microphone's input level, and if it's greater than -40 dB I change the image in a UIImageView and play an mp3 file. It works great in the iOS Simulator but doesn't work on an iPad running iOS 8.0 (the microphone is accessible to the app). What's the issue?
Thanks.
// ViewController.m
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()
@end

@implementation ViewController

NSTimer *meterTimer;
AVAudioRecorder *recorder;
AVAudioPlayer *player;
NSString *audioFilePath; NSURL *pathAsURL; // for audio player

- (void)viewDidLoad
{
    [super viewDidLoad];
    meterTimer = [NSTimer scheduledTimerWithTimeInterval:0.3 // sets timer interrupt
                  target:self selector:@selector(timerArrived) userInfo:nil repeats:YES];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt:AVAudioQualityMax], AVEncoderAudioQualityKey, nil];
    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL URLWithString:[NSTemporaryDirectory()
                stringByAppendingPathComponent:@"tmp.caf"]] settings:settings error:&error];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;
    [recorder record];
}

- (void)timerArrived // called by timer
{
    int intPower; float fValue;
    [recorder updateMeters]; // check input level
    fValue = [recorder averagePowerForChannel:0];
    intPower = roundf(fValue);
    if (intPower > -40)
    {
        _microphImage.image = [UIImage imageNamed:@"Microphone2.png"];
        audioFilePath = [[NSBundle mainBundle] pathForResource:@"aga" ofType:@"mp3"];
        pathAsURL = [[NSURL alloc] initFileURLWithPath:audioFilePath];
        NSError *error;
        player = [[AVAudioPlayer alloc] initWithContentsOfURL:pathAsURL error:&error];
        [player play];
    }
    else
    {
        _microphImage.image = [UIImage imageNamed:@"Microphone.png"];
    }
}

- (void)didReceiveMemoryWarning { [super didReceiveMemoryWarning]; }

@end
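On a device, the most likely culprit is the audio session: the Simulator will happily record with the default session category, but real hardware needs a category that allows input, set before the recorder is created. A second problem is that URLWithString: is given a bare file path, so the resulting URL has no file:// scheme. A minimal sketch of both fixes (the session setup is an assumption about what's missing, not code from the question):

// Configure the session for recording before creating the recorder;
// the Simulator tolerates skipping this, device hardware usually does not.
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
[session setActive:YES error:nil];

// Build a proper file URL; URLWithString: on a plain path yields a
// scheme-less URL that AVAudioRecorder cannot write to reliably.
NSURL *fileURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"tmp.caf"]];
recorder = [[AVAudioRecorder alloc] initWithURL:fileURL settings:settings error:&error];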

Related

How to call the function to display the data in Objective-C

I am creating an app for kids. I am new to this field.
The code below is for speech:
-(void)textToSpeechAction:(NSMutableArray *)imageStoreArray :(int)counter :(UIImageView *)imageChangeImageView :(UIImageView *)spekerOrMic :(BOOL)isMicPresent
{
    spekerOrMic.image = [UIImage imageNamed:@"speaker.png"];
    NSArray *items = [[imageStoreArray objectAtIndex:counter] componentsSeparatedByString:@"."];
    NSString *speechString;
    if (_isWritePresent)
    {
        NSArray *viewToRemove = [spekerOrMic subviews];
        for (UIImageView *v in viewToRemove) {
            [v removeFromSuperview];
        }
        spekerOrMic.image = [UIImage imageNamed:@""];
        spekerOrMic.backgroundColor = [UIColor colorWithRed:41/255.0 green:52/255.0 blue:44/255.0 alpha:1.0];
        NSString *tempString = [items objectAtIndex:0];
        NSArray *tempArray = [tempString componentsSeparatedByString:@" "];
        speechString = [tempArray objectAtIndex:1];
    }
    else
    {
        speechString = [items objectAtIndex:0];
    }
    AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:speechString];
    [utterance setRate:0.2f];
    utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"];
    [synthesizer speakUtterance:utterance];
    imageChangeImageView.image = [UIImage imageNamed:[imageStoreArray objectAtIndex:counter]];
    if (isMicPresent)
    {
        [NSTimer scheduledTimerWithTimeInterval:3.0 target:self selector:@selector(micAction:) userInfo:spekerOrMic repeats:NO];
    }
}

-(void)micAction:(NSTimer *)timer
{
    NSLog(@"mic action");
    UIImageView *micOrSpeaker = timer.userInfo;
    micOrSpeaker.image = [UIImage imageNamed:@"mic.png"];
    // Set the audio file
    NSArray *pathComponents = [NSArray arrayWithObjects:
                               [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                               @"MyAudioMemo.m4a",
                               nil];
    NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
    // Setup audio session
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    // Define the recorder setting
    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    // Initiate and prepare the recorder
    recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:NULL];
    recorder.delegate = self;
    recorder.meteringEnabled = YES;
    [recorder prepareToRecord];
    [recorder record];
    [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(recordStopAction:) userInfo:micOrSpeaker repeats:NO];
}

-(void)recordStopAction:(NSTimer *)timer
{
    NSLog(@"stop");
    [recorder stop];
    UIImageView *micOrSpeaker = timer.userInfo;
    micOrSpeaker.image = [UIImage imageNamed:@""];
    _isRecordComplete = YES;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setActive:NO error:nil];
}

-(void)recordPlayAction
{
    if (!recorder.recording) {
        _player = [[AVAudioPlayer alloc] initWithContentsOfURL:recorder.url error:nil];
        [_player setDelegate:self];
        [_player play];
    }
}
alphabet phonics code:
NSMutableArray *arrForA = [[NSMutableArray alloc] initWithObjects:@"apple.png", @"ant.png", nil];
NSMutableArray *arrForB = [[NSMutableArray alloc] initWithObjects:@"bee.png", @"bear.png", nil];
dictAlpha = [[NSMutableDictionary alloc] initWithObjectsAndKeys:arrForA, @"a.png", arrForB, @"b.png", nil];
NSLog(@"%@", dictAlpha); // 1
commonFunctionObject = [[SpeechCommonFunctions alloc] init];
commonFunctionObject.isRecordComplete = NO;
counter = 0;
isMicPresent = YES;
_confirmationPopupView.hidden = true;
[NSTimer scheduledTimerWithTimeInterval:2.0 target:self selector:@selector(repeatActionFire) userInfo:nil repeats:NO];
}
-(void)repeatActionFire
{
    keys = [dictAlpha allKeys];
    if (counter >= keys.count)
    {
        NSLog(@"finished");
        [_alphabetsShowImageView removeFromSuperview];
        [_speakerOrMicImageView removeFromSuperview];
        [_images removeFromSuperview];
        UIImageView *congratzView = [[UIImageView alloc] initWithFrame:self.view.frame];
        congratzView.image = [UIImage imageNamed:@"congratulation.png"];
        [self.view addSubview:congratzView];
    }
    else {
        [commonFunctionObject textToSpeechAction:keys :counter :_alphabetsShowImageView :_speakerOrMicImageView :isMicPresent];
        [NSTimer scheduledTimerWithTimeInterval:10.0 target:self selector:@selector(ActionToCkeckRecordCompletion) userInfo:nil repeats:NO];
    }
}

-(void)pik {
    arrVal = [dictAlpha objectForKey:keys[i]];
    if (j < arrVal.count) {
        [commonFunctionObject textToSpeechAction:arrVal :j :_images :_speakerOrMicImageView :isMicPresent];
        [NSTimer scheduledTimerWithTimeInterval:10.0 target:self selector:@selector(ActionToCkeckRecordCompletion1) userInfo:nil repeats:NO];
    }
    else
    {
        // [arrVal removeAllObjects];
        [_images removeFromSuperview];
        counter += 1;
        [self repeatActionFire];
    }
}

-(void)ActionToCkeckRecordCompletion1
{
    if (commonFunctionObject.isRecordComplete)
    {
        _confirmationPopupView.hidden = false;
    }
    [self pik];
}

-(void)ActionToCkeckRecordCompletion
{
    if (commonFunctionObject.isRecordComplete)
    {
        _confirmationPopupView.hidden = false;
    }
    [self pik];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

/*
#pragma mark - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    // Get the new view controller using [segue destinationViewController].
    // Pass the selected object to the new view controller.
}
*/

- (IBAction)playButtonAction:(id)sender
{
    [commonFunctionObject recordPlayAction];
}

- (IBAction)nextButtonAction:(id)sender
{
    j += 1;
    [self pik];
    _confirmationPopupView.hidden = true;
    commonFunctionObject.isRecordComplete = NO;
    if (commonFunctionObject.player.playing) { [commonFunctionObject.player stop]; }
    [self repeatActionFire];
}

- (IBAction)retryButtonAction:(id)sender
{
    _confirmationPopupView.hidden = true;
    commonFunctionObject.isRecordComplete = NO;
    if (commonFunctionObject.player.playing) { [commonFunctionObject.player stop]; }
    [self repeatActionFire];
}
I need to modify the alphabet phonics code.
With the current code my output is:
first the a.png image is displayed, then the apple image, then the ant image, then the b.png image, but the bat image is not displayed. How do I fix this?
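One likely cause, judging from the code above: pik advances counter to the next letter but never resets the per-letter index j (and nextButtonAction keeps incrementing it), so the next letter's image array is indexed past its start. A minimal sketch of the fix in pik's else branch (assuming j is the index into the current letter's array):

else
{
    [_images removeFromSuperview];
    counter += 1;
    j = 0; // reset the per-letter image index; otherwise the next
           // letter's array starts at the old value of j
    [self repeatActionFire];
}

Note also that pik reads keys[i] while the rest of the code indexes letters with counter; if i is not kept in sync with counter, the wrong letter's array is used.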

How to stop AVPlayer on iOS and remove the observer added with addPeriodicTimeObserverForInterval

I'm having a little difficulty stopping AVPlayer.
I have a method that records and plays music simultaneously. I'm using AVPlayer to play the music because I want to use the addPeriodicTimeObserverForInterval: method. I have it set up as follows:
- (IBAction)recordVoice:(id)sender {
    if (!recorder.isRecording) {
        // set up the file name to record to
        NSString *recordingLocation = [self createFileName];
        recordingName = recordingLocation;
        NSArray *pathComponents = [NSArray arrayWithObjects:[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                                   recordingLocation, nil];
        NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
        recordingURL = outputFileURL;
        // Setup audio session
        session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                       error:nil];
        // Define the recording settings to record as m4a
        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
        // initiate and prepare the recorder
        recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:NULL];
        recorder.delegate = self;
        recorder.meteringEnabled = YES;
        [recorder prepareToRecord];
        [session setActive:YES error:nil];
        [recorder record];
        // find which song to play and initiate an AVPlayer to play it
        NSString *playerLocation = self.TitleLabel.text;
        NSString *path = [[NSBundle mainBundle] pathForResource:playerLocation ofType:@"m4a"];
        player = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:path]];
        lastTime = nil;
        // check where the player is at and update the song lines accordingly
        [player addPeriodicTimeObserverForInterval:CMTimeMake(3, 10) queue:NULL usingBlock:^(CMTime time) {
            NSTimeInterval seconds = CMTimeGetSeconds(time);
            for (NSDictionary *item in robotR33) {
                NSNumber *time = item[@"time"];
                if (seconds > [time doubleValue] && [time doubleValue] >= [lastTime doubleValue]) {
                    lastTime = @(seconds);
                    NSString *str = item[@"line"];
                    [self nextLine:str];
                }
            }
        }];
        [player play];
        [_recordButton setImage:[UIImage imageNamed:@"micRecording.gif"] forState:UIControlStateNormal];
    }
    else {
        [recorder stop];
        player = nil;
        [session setActive:NO error:nil];
    }
}
If the recorder is not recording, I set up both a new AVAudioRecorder and an AVPlayer. On the AVPlayer I add a periodic time observer which updates the UI based on the position of the player.
If the recorder is recording, I stop the recorder and set the player to nil. This stops the audio from playing, but I notice that the periodic time observer is still running because the UI continues to update. Should I destroy the AVPlayer altogether, and if so, how should I do that? Many thanks in advance.
Also, as an aside, I have a warning inside the addPeriodicTimeObserverForInterval: block. I am looping over an array called robotR33. Xcode tells me that "Capturing 'self' strongly in this block is likely to lead to a retain cycle". Could this be part of my problem?
When you are finished playing, the observer needs to be removed from the player.
Adding [player removeTimeObserver:self.timeObserver] works.
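A sketch of the full pattern, assuming a timeObserver property to hold the token returned by addPeriodicTimeObserverForInterval:queue:usingBlock: (the property name and the helper method are assumptions, not code from the question):

// Keep the token so the observer can be removed later.
__weak typeof(self) weakSelf = self;
self.timeObserver = [player addPeriodicTimeObserverForInterval:CMTimeMake(3, 10)
                                                         queue:NULL
                                                    usingBlock:^(CMTime time) {
    // Capturing weakSelf instead of self inside the block also
    // silences the retain-cycle warning mentioned above.
    [weakSelf updateLinesForTime:time]; // hypothetical helper
}];

// ... later, when stopping playback:
[player removeTimeObserver:self.timeObserver];
self.timeObserver = nil;
player = nil;

Removing the observer before discarding the player also helps with the retain-cycle warning, since the block (and whatever it captured) is released along with it.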

Assigning AVAudioRecorderDelegate to a non-UIViewController

I'm creating an iOS application that starts recording audio when a prompt occurs rather than when a button is pressed. Because of this, I've put the AVAudioRecorder in a scheduler object that is called from its own class. However, when I assign the scheduler as the delegate for the recorder, I get the warning Assigning to 'id<AVAudioRecorderDelegate>' from incompatible type 'LWPScheduler *__strong'. Here is the implementation for the scheduler:
@implementation LWPScheduler
@synthesize tempo;
@synthesize userName;

+(LWPScheduler *)masterScheduler {
    static LWPScheduler *masterScheduler = nil;
    if (masterScheduler == nil)
    {
        masterScheduler = [[self alloc] init];
    }
    return masterScheduler;
}

- (id)init
{
    self = [super init];
    if (self) {
        self.tempo = 120;
        self.userName = @"Tim Burland";
        // recording init
        // Set the audio file
        NSArray *pathComponents = [NSArray arrayWithObjects:
                                   [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                                   @"MyAudioMemo.m4a",
                                   nil];
        NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
        // audio session
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
        // recorder settings
        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
        // Initiate and prepare the recorder
        _recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:NULL];
        _recorder.delegate = self; // Incompatible Type Warning Here
        _recorder.meteringEnabled = YES;
        [_recorder prepareToRecord];
    }
    return self;
}
@end
My question is whether I have to migrate the audio handling to the controller for the view the recorder would be contained in. Thanks for your help!
In your interface (or a private class extension), tell the compiler that you conform to the protocol the delegate property expects. For example:
@interface LWPScheduler : NSObject <AVAudioRecorderDelegate>
// ...
@end
The protocol defines required and/or optional methods you may have to implement (Xcode will warn you about the required ones). After the interface declares that the class conforms to the protocol, _recorder.delegate = self; will just work.
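All of AVAudioRecorderDelegate's methods are optional, but implementing one is a quick way to confirm the delegate wiring works; a minimal sketch:

// Optional AVAudioRecorderDelegate method; fires when recording stops.
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
                           successfully:(BOOL)flag
{
    NSLog(@"Recording finished, success: %d", flag);
}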

iOS issues with AVFoundation music player, working 50% of the time

I have some issues with the AVFoundation music player.
1) It does not start instantly; I guess buffering is slow on the simulator?
2) It only starts with sound 50% of the time; the other 50% it does not start. Very unreliable.
Here is my class
Music_Player.h
@interface Music_Player : NSObject <AVAudioPlayerDelegate>
@property (nonatomic, retain) AVAudioPlayer *audioPlayer;
@property (nonatomic, retain) NSString *trackPlaying;
@property (nonatomic) BOOL isPlaying;
@property (nonatomic, retain) NSTimer *timer;
@property (nonatomic, retain) UISlider *slider;
-(void)initTrack:(NSString *)track;
-(void)startPlayer;
-(void)pausePlayer;
-(void)stopPlayer;
-(void)sliderUp;
-(void)sliderDown;
@end
Music_Player.m
#import "Music Player.h"
#implementation Music_Player
#synthesize audioPlayer;
#synthesize trackPlaying;
#synthesize timer;
#synthesize isPlaying;
#synthesize slider;
-(void)initTrack:(NSString *)track
{
/* Init slider */
self.isPlaying = FALSE;
self.trackPlaying = track;
NSBundle *mainBundle = [NSBundle mainBundle];
NSString *filePath = [mainBundle pathForResource:self.trackPlaying ofType:#"mp3"];
NSData *fileData = [NSData dataWithContentsOfFile:filePath];
NSError *error = nil;
self.audioPlayer = [[AVAudioPlayer alloc] initWithData:fileData error:&error];
[self.audioPlayer prepareToPlay];
/* Set slider max value */
self.slider.minimumValue = 0;
self.slider.maximumValue = self.audioPlayer.duration - 5;
}
-(void)startPlayer
{
if (self.isPlaying == TRUE)
{
NSLog(#"Pause clicked");
[self.audioPlayer pause];
self.isPlaying = FALSE;
} else {
NSLog(#"Play clicked");
[self.audioPlayer play];
self.isPlaying = TRUE;
if (![self.timer isValid]) {
self.timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:#selector(updateTime) userInfo:nil repeats:YES];
}
}
}
- (void)updateTime {
if (self.isPlaying == TRUE) {
NSTimeInterval currentTime = self.audioPlayer.currentTime;
NSLog(#"%f", currentTime);
// update UI with currentTime;
slider.value = round(currentTime);
}
}
-(void)pausePlayer
{
if (self.isPlaying == TRUE)
{
[self.audioPlayer pause];
self.isPlaying = FALSE;
}
}
-(void)stopPlayer
{
if (self.isPlaying == TRUE)
{
NSLog(#"Stop clicked");
[self.audioPlayer stop];
self.audioPlayer.currentTime = 0;
self.slider.value = round(self.audioPlayer.currentTime);
self.isPlaying = FALSE;
}
}
-(void)sliderUp
{
if (self.isPlaying == FALSE)
{
self.audioPlayer.currentTime = round(slider.value);
[self.audioPlayer play];
self.isPlaying = TRUE;
}
}
-(void)sliderDown
{
if (self.isPlaying == TRUE)
{
self.isPlaying = FALSE;
[self.audioPlayer stop];
}
}
/* AUDIO PLAYER DELEGATES */
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
NSLog(#"Did finish with, %c", flag);
}
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error
{
NSLog(#"Error %#",error);
}
#end
I just set up the slider property and init it with a track from my viewController:
/* Init Music */
self.music_player = [[Music_Player alloc] init];
self.music_player.slider = self.slider;
[self.music_player initTrack:@"track1"];
And then I just pass the button clicks and slider value changes on to the Music_Player class. What could be the issue? I will be testing it on an iPhone tomorrow, so could it just be a simulator issue?
Two things:
After initWithData:error: returns, is the error set?
Why not use [AVAudioPlayer initWithContentsOfURL:error:]? E.g.:
- (void)initTrack:(NSString *)track
{
    self.isPlaying = NO;
    self.trackPlaying = track;
    NSString *filePath = [[NSBundle mainBundle] pathForResource:self.trackPlaying ofType:@"mp3"];
    NSURL *fileURL = [NSURL fileURLWithPath:filePath];
    NSError *error = nil;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];
}
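For the first point, a quick check along these lines (the logging is illustrative, not from the original):

NSError *error = nil;
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];
if (self.audioPlayer == nil) {
    // If init fails the player is nil and error describes why,
    // e.g. a bad path or an unsupported format.
    NSLog(@"AVAudioPlayer init failed: %@", error);
}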

iOS 5 - AVAudioPlayer not working anymore

I have a bit of code which was working fine with iOS 4.3. I had a look on the Internet and found others having the same problem, but no answer that worked for me. I think that I can record something but I cannot play it. Here is my code:
DetailViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>
#import <AudioToolbox/AudioServices.h>
@interface DetailViewController : UIViewController <UISplitViewControllerDelegate, AVAudioRecorderDelegate> {
    id detailItem;
    UILabel *detailDescriptionLabel;
    IBOutlet UIButton *btnStart;
    IBOutlet UIButton *btnPlay;
    // Variables setup for access in the class:
    NSURL *recordedTmpFile;
    AVAudioRecorder *recorder;
    BOOL toggle;
}
// Needed properties
@property (nonatomic, retain) IBOutlet UIButton *btnStart;
@property (nonatomic, retain) IBOutlet UIButton *btnPlay;
@property (strong, nonatomic) id detailItem;
@property (strong, nonatomic) IBOutlet UILabel *detailDescriptionLabel;
-(IBAction) start_button_pressed;
-(IBAction) play_button_pressed;
@end
DetailViewController.m
- (void)viewDidLoad {
    [super viewDidLoad];
    toggle = YES;
    btnPlay.hidden = YES;
    NSError *error;
    // Create the Audio Session
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Set up the type of session
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    // Activate the session.
    [audioSession setActive:YES error:&error];
    [self configureView];
}

-(IBAction) start_button_pressed {
    if (toggle) {
        toggle = NO;
        [btnStart setTitle:@"Press to stop recording" forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;
        NSError *error;
        NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] init];
        [recordSettings setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
        [recordSettings setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSettings setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
        // Create a temporary file to save the recording.
        recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
        NSLog(@"The temporary file used is: %@", recordedTmpFile);
        recorder = [[AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSettings error:&error];
        [recorder setDelegate:self];
        [recorder prepareToRecord];
        [recorder record];
    }
    else {
        toggle = YES;
        [btnStart setTitle:@"Start recording" forState:UIControlStateNormal];
        btnPlay.hidden = !toggle;
        btnPlay.enabled = toggle;
        NSLog(@"Recording stopped and saved in file: %@", recordedTmpFile);
        [recorder stop];
    }
}

-(IBAction) play_button_pressed {
    NSError *error;
    AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
    if (!error)
    {
        [avPlayer prepareToPlay];
        [avPlayer play];
        NSLog(@"File is playing");
    }
}

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
                       successfully:(BOOL)flag {
    NSLog(@"audioPlayerDidFinishPlaying:successfully:");
}

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag
{
    NSLog(@"audioRecorderDidFinishRecording:successfully:");
}
Here is the output of my program running:
2011-11-25 11:58:02.005 Bluetooth1[897:707] The temporary file used is: file://localhost/private/var/mobile/Applications/D81023F8-C53D-4AC4-B1F7-14D66EB4844A/tmp/343915082005.caf
2011-11-25 11:58:05.956 Bluetooth1[897:707] Recording stopped and saved in file: file://localhost/private/var/mobile/Applications/D81023F8-C53D-4AC4-B1F7-14D66EB4844A/tmp/343915082005.caf
2011-11-25 11:58:05.998 Bluetooth1[897:707] audioRecorderDidFinishRecording:successfully:
2011-11-25 11:58:11.785 Bluetooth1[897:707] File is playing
For some reason, the function audioPlayerDidFinishPlaying: is never called. However, it seems that something has been recorded. Right now I do not know which part is not working, but I guess this has something to do with AVAudioPlayer.
[EDIT] It's getting weirder and weirder. I wanted to make sure that something was recorded, so I looked at getting the duration of the recording. Here is the new play function:
-(IBAction) play_button_pressed {
    NSError *error;
    AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
    if (!error)
    {
        AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:recordedTmpFile options:nil];
        CMTime audioDuration = audioAsset.duration;
        float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
        [avPlayer prepareToPlay];
        [avPlayer play];
        NSString *something = [NSString stringWithFormat:@"%f", audioDurationSeconds];
        NSLog(@"File is playing: %@", something);
    }
    else
    {
        NSLog(@"Error playing.");
    }
}
Now the length of the recording is reported and it makes sense (if I record for 10 s it shows something around 10 s). However, when I first put these lines of code in, I forgot to do the float-to-NSString conversion, so it crashed... and the app played the sound... After different tests I can conclude that my app can record and play a sound, but it has to crash to play the recorded sound. I have no idea what the problem can be. I found that AVAudioPlayer is asynchronous; does that have something to do with it? I'm completely lost...
Replace the URL path with the following code:
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(
    NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filepath = [documentsDirectory stringByAppendingPathComponent:@"urfile.xxx"];
NSURL *url = [NSURL fileURLWithPath:filepath];
Try the solution here:
Recording and playback
OK, it is not really cool to answer your own questions, especially when the answer is not clean (but it is working). In order to play what I have recorded I have used the following block of code:
AVURLAsset* audioAsset = [AVURLAsset URLAssetWithURL:recordedTmpFile options:nil];
CMTime audioDuration = audioAsset.duration;
float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
[avPlayer prepareToPlay];
[avPlayer play];
// Block for audioDurationSeconds seconds
[NSThread sleepForTimeInterval:audioDurationSeconds];
I am calculating the length of the recorded file and I am waiting for that amount of time... it is dirty, but it does the trick. Plus, if it is launched on another thread it will not block the application.
If anyone has something better I would gladly take it!
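A likely explanation for the crash/sleep behaviour, offered as a diagnosis rather than a confirmed fix: avPlayer is a local variable, so under ARC it is released as soon as play_button_pressed returns, which silently stops playback; the sleep (or the crash) merely kept the method alive long enough for the sound to play. Keeping a strong reference avoids the busy wait entirely. A minimal sketch (assumes an avPlayer instance variable and AVAudioPlayerDelegate conformance added to the class):

// In the interface or class extension:
// AVAudioPlayer *avPlayer; // instance variable keeps the player alive

-(IBAction) play_button_pressed {
    NSError *error;
    avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
    if (avPlayer) {
        avPlayer.delegate = self; // without this, audioPlayerDidFinishPlaying: never fires
        [avPlayer prepareToPlay];
        [avPlayer play];
    }
}

This would also explain why audioPlayerDidFinishPlaying:successfully: was never called in the original code: the delegate was never set on the player, and the player itself was deallocated before it could finish.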
