I'm building AudioKit into an experimental psychology app testing interoception (internal body sense). It's very simple for now; I just have an AudioManager.m class that declares:
@implementation AudioManager {
    AKAudioPlayer* player;
    bool canPlay;
}
+(AudioManager *)sharedManager {
    static dispatch_once_t pred = 0;
    static AudioManager *instance = nil;
    dispatch_once(&pred, ^{
        instance = [[AudioManager alloc] init];
        [instance setup];
    });
    return instance;
}
little player methods like:
-(NSError*)playHTTIntro {
    return [self playWavInResources:@"Htt1"];
}
are then invoked from particular view controllers as:
-(void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[AudioManager sharedManager] playHTTGuess];
}
The weird problem I'm having is that as files get called and played, amplitude and distortion accumulate; by the time I've played 10 or 15 files, the voice audio files are almost impossible to understand.
The playWavInResources above is implemented as:
-(NSError*)playWavInResources:(NSString*)wavName {
    if(player)
        [player stop];
    NSError* error;
    AKAudioFile* audioFile = [[AKAudioFile alloc] initForReading:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:wavName ofType:@"wav"]] error:&error];
    if(!error) {
        if(player)
            [player replaceWithFile:audioFile error:&error];
        else
            player = [[AKAudioPlayer alloc] initWithFile:audioFile looping:false lazyBuffering:false error:&error completionHandler:nil];
        AudioKit.output = player;
        // Start audio engine if not already
        [AudioKit startAndReturnError:&error];
        if(!error)
            [player start];
    }
    return error;
}
Has anyone seen this problem before?
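One variation I'm considering, in case it helps the discussion: configure the engine exactly once in setup and only swap files afterwards, so AudioKit.output is never reassigned and the engine is never restarted. A sketch using the same calls as above ("Silence.wav" is a hypothetical placeholder file, used only to create the player):
-(void)setup {
    NSError* error;
    AKAudioFile* placeholder = [[AKAudioFile alloc] initForReading:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Silence" ofType:@"wav"]] error:&error];
    player = [[AKAudioPlayer alloc] initWithFile:placeholder looping:false lazyBuffering:false error:&error completionHandler:nil];
    AudioKit.output = player;               // assign the output node once
    [AudioKit startAndReturnError:&error];  // start the engine once
    canPlay = (error == nil);
}
-(NSError*)playWavInResources:(NSString*)wavName {
    if(!canPlay)
        return nil;
    [player stop];
    NSError* error;
    AKAudioFile* audioFile = [[AKAudioFile alloc] initForReading:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:wavName ofType:@"wav"]] error:&error];
    if(!error)
        [player replaceWithFile:audioFile error:&error];
    if(!error)
        [player start];
    return error;
}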
Related
I'm developing an iOS application for a news website. The website has a live audio news channel. I have tried using AVAudioPlayer and did the following:
.h file:
@interface ViewController : UIViewController <AVAudioPlayerDelegate>
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;
.m file (viewDidLoad):
- (void)viewDidLoad {
    [super viewDidLoad];
    NSURL *url = [NSURL fileURLWithPath:@"http://198.178.123.23:8662/stream/1/;listen.mp3"];
    NSError *error;
    _audioPlayer = [[AVAudioPlayer alloc]
                    initWithContentsOfURL:url
                    error:&error];
    if (error)
    {
        NSLog(@"Error in audioPlayer: %@",
              [error localizedDescription]);
    } else {
        _audioPlayer.delegate = self;
        [_audioPlayer prepareToPlay];
    }
}
But the application is giving me the following error:
2015-05-25 15:03:12.353 AudioPlayer[2043:421434] Error in audioPlayer: The operation couldn’t be completed. (OSStatus error 2003334207.)
I don't know what's wrong with this code.
Can anyone tell me where the error in the code is?
Many thanks.
I think there is a better way: play the audio URL in the default MPMoviePlayer, and the rest will be managed by the default player.
EDIT:
NSURL *audioPath = [NSURL URLWithString:@"Your Audio URL"];
MPMoviePlayerViewController *mpViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:audioPath];
// Remove the movie player view controller from the "playback did finish" notification observers
[[NSNotificationCenter defaultCenter] removeObserver:mpViewController
                                                name:MPMoviePlayerPlaybackDidFinishNotification
                                              object:mpViewController.moviePlayer];
// Register this class as an observer instead
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(movieFinishedCallback:)
                                             name:MPMoviePlayerPlaybackDidFinishNotification
                                           object:mpViewController.moviePlayer];
// Set the modal transition style of your choice
mpViewController.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
// Present the movie player view controller
[self presentViewController:mpViewController animated:YES completion:nil];
// Start playback
[mpViewController.moviePlayer prepareToPlay];
[mpViewController.moviePlayer play];
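For completeness, a sketch of the movieFinishedCallback: referenced in the @selector above (the dismissal behavior is an assumption; adjust as needed):
- (void)movieFinishedCallback:(NSNotification *)notification {
    MPMoviePlayerController *moviePlayer = [notification object];
    // Stop observing this player, then dismiss the presented controller
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:moviePlayer];
    [self dismissViewControllerAnimated:YES completion:nil];
}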
You're getting an error because AVAudioPlayer doesn't support HTTP (meaning you can't stream media from the internet with it), so what you can do is use AVPlayer instead. It's pretty similar; here's an example:
- (void)viewDidLoad {
    [super viewDidLoad];
    NSString *urlAddress = @"http://198.178.123.23:8662/stream/1/;listen.mp3";
    NSURL *urlStream = [NSURL URLWithString:urlAddress];
    // Note: the audioPlayer property must be declared as AVPlayer for this, not AVAudioPlayer
    self.audioPlayer = [AVPlayer playerWithURL:urlStream];
    [self.audioPlayer play];
}
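One caveat: AVPlayer doesn't hand back an NSError synchronously the way AVAudioPlayer does; failures surface later on the player item. A sketch of how you might watch for them (the context constant is made up for the example):
static void *PlayerItemStatusContext = &PlayerItemStatusContext;
// after creating the player in viewDidLoad:
[self.audioPlayer.currentItem addObserver:self
                               forKeyPath:@"status"
                                  options:NSKeyValueObservingOptionNew
                                  context:PlayerItemStatusContext];
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (context == PlayerItemStatusContext) {
        if (self.audioPlayer.currentItem.status == AVPlayerItemStatusFailed) {
            NSLog(@"Error in audioPlayer: %@", [self.audioPlayer.currentItem.error localizedDescription]);
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}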
Hope my answer helped you :)
I am using this approach to save the buffer data of the AVPlayer for video files, found as the answer to this question: Saving buffer data of AVPlayer.
iPhone and iPad - iOS 8.1.3
I made the necessary changes to play video, and it is working very nicely except when I try to play a very long video (11-12 minutes long and about 85 MB in size): the video will stall roughly 4 minutes after the connection finishes loading. I get a playbackBufferEmpty event and a player item stalled notification.
This is the gist of the code:
viewController.m
@property (nonatomic, strong) NSMutableData *videoData;
@property (nonatomic, strong) NSURLConnection *connection;
@property (nonatomic, strong) AVURLAsset *vidAsset;
@property (nonatomic, strong) AVPlayerItem *playerItem;
@property (nonatomic, strong) AVPlayerLayer *avlayer;
@property (nonatomic, strong) NSHTTPURLResponse *response;
@property (nonatomic, strong) NSMutableArray *pendingRequests;
/**
Startup a Video
*/
- (void)startVideo
{
    self.vidAsset = [AVURLAsset URLAssetWithURL:[self videoURLWithCustomScheme:@"streaming"] options:nil];
    [self.vidAsset.resourceLoader setDelegate:self queue:dispatch_get_main_queue()];
    self.pendingRequests = [NSMutableArray array];
    // Init Player Item
    self.playerItem = [AVPlayerItem playerItemWithAsset:self.vidAsset];
    [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:NULL];
    self.player = [[AVPlayer alloc] initWithPlayerItem:self.playerItem];
    // Init a video Layer
    self.avlayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.avlayer setFrame:self.view.frame];
    [self.view.layer addSublayer:self.avlayer];
}
- (NSURL *)getRemoteVideoURL
{
    NSString *urlString = @"http://path/to/your/long.mp4";
    return [NSURL URLWithString:urlString];
}
- (NSURL *)videoURLWithCustomScheme:(NSString *)scheme
{
    NSURLComponents *components = [[NSURLComponents alloc] initWithURL:[self getRemoteVideoURL] resolvingAgainstBaseURL:NO];
    components.scheme = scheme;
    return [components URL];
}
/**
NSURLConnection Delegate Methods
*/
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    NSLog(@"didReceiveResponse");
    self.videoData = [NSMutableData data];
    self.response = (NSHTTPURLResponse *)response;
    [self processPendingRequests];
}
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    NSLog(@"Received Data - appending to video & processing request");
    [self.videoData appendData:data];
    [self processPendingRequests];
}
- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    NSLog(@"connectionDidFinishLoading::WriteToFile");
    [self processPendingRequests];
    [self.videoData writeToFile:[self getVideoCachePath:self.vidSelected] atomically:YES];
}
/**
AVURLAsset resource loader methods
*/
- (void)processPendingRequests
{
    NSMutableArray *requestsCompleted = [NSMutableArray array];
    for (AVAssetResourceLoadingRequest *loadingRequest in self.pendingRequests)
    {
        [self fillInContentInformation:loadingRequest.contentInformationRequest];
        BOOL didRespondCompletely = [self respondWithDataForRequest:loadingRequest.dataRequest];
        if (didRespondCompletely)
        {
            [requestsCompleted addObject:loadingRequest];
            [loadingRequest finishLoading];
        }
    }
    [self.pendingRequests removeObjectsInArray:requestsCompleted];
}
- (void)fillInContentInformation:(AVAssetResourceLoadingContentInformationRequest *)contentInformationRequest
{
    if (contentInformationRequest == nil || self.response == nil)
    {
        return;
    }
    // UTTypeCreatePreferredIdentifierForTag requires <MobileCoreServices/MobileCoreServices.h>
    NSString *mimeType = [self.response MIMEType];
    CFStringRef contentType = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, (__bridge CFStringRef)(mimeType), NULL);
    contentInformationRequest.byteRangeAccessSupported = YES;
    contentInformationRequest.contentType = CFBridgingRelease(contentType);
    contentInformationRequest.contentLength = [self.response expectedContentLength];
}
- (BOOL)respondWithDataForRequest:(AVAssetResourceLoadingDataRequest *)dataRequest
{
    long long startOffset = dataRequest.requestedOffset;
    if (dataRequest.currentOffset != 0)
    {
        startOffset = dataRequest.currentOffset;
    }
    // Don't have any data at all for this request
    if (self.videoData.length < startOffset)
    {
        NSLog(@"NO DATA FOR REQUEST");
        return NO;
    }
    // This is the total data we have from startOffset to whatever has been downloaded so far
    NSUInteger unreadBytes = self.videoData.length - (NSUInteger)startOffset;
    // Respond with whatever is available if we can't satisfy the request fully yet
    NSUInteger numberOfBytesToRespondWith = MIN((NSUInteger)dataRequest.requestedLength, unreadBytes);
    [dataRequest respondWithData:[self.videoData subdataWithRange:NSMakeRange((NSUInteger)startOffset, numberOfBytesToRespondWith)]];
    long long endOffset = startOffset + dataRequest.requestedLength;
    BOOL didRespondFully = self.videoData.length >= endOffset;
    return didRespondFully;
}
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    if (self.connection == nil)
    {
        NSURL *interceptedURL = [loadingRequest.request URL];
        NSURLComponents *actualURLComponents = [[NSURLComponents alloc] initWithURL:interceptedURL resolvingAgainstBaseURL:NO];
        actualURLComponents.scheme = @"http";
        NSURLRequest *request = [NSURLRequest requestWithURL:[actualURLComponents URL]];
        self.connection = [[NSURLConnection alloc] initWithRequest:request delegate:self startImmediately:NO];
        [self.connection setDelegateQueue:[NSOperationQueue mainQueue]];
        [self.connection start];
    }
    [self.pendingRequests addObject:loadingRequest];
    return YES;
}
- (void)resourceLoader:(AVAssetResourceLoader *)resourceLoader didCancelLoadingRequest:(AVAssetResourceLoadingRequest *)loadingRequest
{
    NSLog(@"didCancelLoadingRequest");
    [self.pendingRequests removeObject:loadingRequest];
}
/**
KVO
*/
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == StatusObservationContext)
    {
        AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        if (status == AVPlayerStatusReadyToPlay) {
            [self initHud];
            [self play:NO];
        } else if (status == AVPlayerStatusFailed)
        {
            NSLog(@"ERROR::AVPlayerStatusFailed");
        } else if (status == AVPlayerStatusUnknown)
        {
            NSLog(@"ERROR::AVPlayerStatusUnknown");
        }
    } else if (context == CurrentItemObservationContext) {
    } else if (context == RateObservationContext) {
    } else if (context == BufferObservationContext) {
    } else if (context == playbackLikelyToKeepUp) {
        if (self.player.currentItem.playbackLikelyToKeepUp) {
            // resume playback here if it was stalled
        }
    } else if (context == playbackBufferEmpty) {
        if (self.player.currentItem.playbackBufferEmpty)
        {
            NSLog(@"Video Asset is playable: %d", self.vidAsset.isPlayable);
            NSLog(@"Player Item Status: %ld", (long)self.player.currentItem.status);
            NSLog(@"Connection Request: %@", self.connection.currentRequest);
            NSLog(@"Video Data: %lu", (unsigned long)self.videoData.length);
        }
    } else if (context == playbackBufferFull) {
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
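(The observation contexts compared above are assumed to be the usual static pointer declarations, not shown in the gist, e.g.:
static void *StatusObservationContext = &StatusObservationContext;
static void *playbackBufferEmpty = &playbackBufferEmpty;
static void *playbackLikelyToKeepUp = &playbackLikelyToKeepUp;
// ...and so on for the remaining contexts
)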
The problem seems to be that some time after the connection finishes loading, the player item buffer goes empty. My thought at the moment is that something is being deallocated when the connection finishes loading, and that this corrupts the playerItem buffer.
However, at the time the buffer goes empty, the playerItem status is good, the video asset is playable, and the video data is intact.
If I throttle the Wi-Fi through Charles and slow down the connection, the video will play as long as the connection does not finish loading within a few minutes of the end of the video.
If I set the connection to nil on the finished-loading event, the resource loader will fire up a new connection when shouldWaitForLoadingOfRequestedResource fires again. In this case the loading starts all over again and the video continues playing.
I should mention that this long video plays fine if I play it as a normal HTTP URL asset, and it also plays fine after being saved to the device and loaded from there.
When the resource loader delegate fires up the NSURLConnection, the connection takes over saving the NSData for the pending requests and processing them. When the connection finishes loading, the resource loader regains responsibility for handling the loading requests. The code was adding the loading requests to the pending requests array, but the issue was that they were not being processed. I changed the method to the following and it works:
// AVAssetResourceLoader
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    if (isLoadingComplete == YES)
    {
        // NSLog(@"LOADING WAS COMPLETE");
        [self.pendingRequests addObject:loadingRequest];
        [self processPendingRequests];
        return YES;
    }
    if (self.connection == nil)
    {
        NSURL *interceptedURL = [loadingRequest.request URL];
        NSURLComponents *actualURLComponents = [[NSURLComponents alloc] initWithURL:interceptedURL resolvingAgainstBaseURL:NO];
        actualURLComponents.scheme = @"http";
        self.request = [NSURLRequest requestWithURL:[actualURLComponents URL]];
        self.connection = [[NSURLConnection alloc] initWithRequest:self.request delegate:self startImmediately:NO];
        [self.connection setDelegateQueue:[NSOperationQueue mainQueue]];
        isLoadingComplete = NO;
        [self.connection start];
    }
    [self.pendingRequests addObject:loadingRequest];
    return YES;
}
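For completeness, isLoadingComplete is the flag flipped when the download ends; the matching change to connectionDidFinishLoading: would look like this (assumed from the description above, since it isn't shown):
- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    isLoadingComplete = YES; // hand loading duty back to the resource loader
    [self processPendingRequests];
    [self.videoData writeToFile:[self getVideoCachePath:self.vidSelected] atomically:YES];
}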
I'm trying to play some music for my game in the background. The music should never stop unless the user turns it off in settings; it plays in the background for every view and doesn't pause.
For this reason I've made a singleton class for my background music. But when I press "Stop the music", the app hits an exception breakpoint (I'm not seeing an error message, so I don't know what's wrong).
The music still stops, but something is wrong and I don't know what. Is it right to make this a singleton class, or do I need to solve it another way?
Here is a screenshot of when the exception happens:
Here is the code for my singleton class:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

@interface Music : NSObject
@property AVAudioPlayer *player;
- (void)stop;
- (void)play;
+ (Music *)sharedInstance;
@end

#import "Music.h"

@implementation Music
+ (Music *)sharedInstance {
    static Music *sharedInstance;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[Music alloc] init];
    });
    return sharedInstance;
}

- (instancetype)init {
    self = [super init];
    if (self) {
        NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                             pathForResource:@"water_2"
                                             ofType:@"wav"]];
        self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
        self.player.numberOfLoops = -1;
    }
    return self;
}

- (void)stop {
    [self.player stop];
}

- (void)play {
    [self.player play];
}

@end
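For reference, the music is driven from elsewhere in the app along these lines (a sketch; the actual call sites aren't shown above):
// Hypothetical call sites, e.g. at launch and in the settings screen
[[Music sharedInstance] play];   // start the looping background track
[[Music sharedInstance] stop];   // the "Stop the music" setting calls this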
I would switch up your code and use IBActions instead of the plain stop and play methods, so a button can drive the singleton directly:
- (IBAction)stop {
    [[Music sharedInstance] stop];
}
- (IBAction)play {
    [[Music sharedInstance] play];
}
I am using a singleton for AVAudioPlayer; it works great.
I wish to add a check before initializing the player, and to stop the player if it is already playing something.
I added this code:
SoundPlayer *player = [SoundPlayer sharedInstance];
if ([player isPlaying]) {
    [player stop];
}
But it gives me an EXC_BAD_ACCESS on the if ([player isPlaying]) line.
2 Questions:
If this is a singleton and I am getting back the same player, then why doesn't it stop by itself?
Why am I getting the error?
HERE IS THE FULL CODE
#import "SoundPlayer.h"
#implementation SoundPlayer
#synthesize helpMode;
static SoundPlayer *sharedInstance = nil;
+ (SoundPlayer *)sharedInstance {
if (sharedInstance == nil) {
sharedInstance = [[super allocWithZone:NULL] init];
}
return sharedInstance;
}
+(BOOL)playSound:(NSURL*)url{
NSError *error;
SoundPlayer *player = [SoundPlayer sharedInstance];
if([player isPlaying]){
[player stop];
}
player = [[SoundPlayer sharedInstance] initWithContentsOfURL:url error:&error];
[player play];
if (player == nil){
NSLog(#"error %# \n for file %#",[error description],[url path]);
return NO;
}else {
if (![player helpMode]) {
[player play];
}else{
NSLog(#"help mode");
}
}
return YES;
}
-(void)stopSound{
if ([self isPlaying]) {
[self stop];
}
}
#end
I'm not sure if this is what's causing the error, but I figured I would post my response to the discussion here to make my point clearer, since I'll have more room and code formatting.
No, you don't have to override all the methods; I was just asking to make sure I understood everything right.
The other piece of what I'm saying is that, just like in stopSound, you should be using self, not
SoundPlayer *player = [SoundPlayer sharedInstance];
So change your code to this, run it, and see if it's fixed; post a comment below to let me know the outcome.
- (BOOL)playSound:(NSURL*)url {
    NSError *error;
    // SoundPlayer *player = [SoundPlayer sharedInstance];
    if ([self isPlaying]) {
        [self stop];
    }
    // not sure if this should be self or super (I think super, try both)
    // if you overrode the init then definitely self
    // [self initWithContentsOfURL:url error:&error];
    [super initWithContentsOfURL:url error:&error];
    [self play];
    if (self == nil) {
        NSLog(@"error %@ \n for file %@", [error description], [url path]);
        return NO;
    } else {
        if (![self helpMode]) {
            [self play];
        } else {
            NSLog(@"help mode");
        }
    }
    return YES;
}
Really, all you're doing is creating a pointer to self when you create a player object, since it is a singleton. With that said, I'm not sure why that would make it crash, other than the fact that you "should" be using self instead.
+ (SoundPlayer *)sharedInstance {
    if (sharedInstance == nil) {
        sharedInstance = [[super allocWithZone:NULL] init];
    }
    return sharedInstance;
}
See here that AVAudioPlayer only has two init methods, and neither is plain init, hence you are not completely initializing your superclass. You need to override the initWithContentsOfURL:error: method and initialize this object and its super with the URL parameter. Just make sure you put initWithContentsOfURL: in both the .h and the .m. Also, I don't know if it matters, but try plain alloc rather than allocWithZone:.
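Something along these lines (a sketch of the override described above, assuming SoundPlayer subclasses AVAudioPlayer):
- (instancetype)initWithContentsOfURL:(NSURL *)url error:(NSError **)outError {
    // Initialize the AVAudioPlayer superclass with the URL
    self = [super initWithContentsOfURL:url error:outError];
    if (self) {
        // SoundPlayer-specific setup, e.g. a default for helpMode
        self.helpMode = NO;
    }
    return self;
}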
I am using AVAudioPlayer and setting its delegate, but the delegate is not getting called:
+ (void)playflip
{
    NSString *path;
    path = [[NSBundle mainBundle] pathForResource:@"flip" ofType:@"mp3"];
    AVAudioPlayer *flip;
    flip = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:nil];
    flip.delegate = self;
    [flip play];
}
The class where I am implementing this is the sound class:
@interface SoundClass : NSObject <AVAudioPlayerDelegate>
I am implementing this delegate method:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    NSLog(@"delegate called");
    [player release];
    player = nil;
}
It looks like maybe your flip object is going out of scope, because the rest of your code looks fine. Here's what I do:
// controller.h
@interface SoundClass : NSObject <AVAudioPlayerDelegate> {}
// @property(nonatomic,retain) NSMutableDictionary *sounds;
// I have lots of sounds, pre-loaded in a dictionary so that I can reference them by name.
// With one player, you can just use:
@property(nonatomic,retain) AVAudioPlayer *player;
Then allocate and load the sound in your .m
player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:nil];
[player prepareToPlay];
player.delegate = self;
[player play];
Now you should get your DidFinishPlaying notification.
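That's most likely because the retained player property keeps a strong reference to the player, so it stays alive until audioPlayerDidFinishPlaying:successfully: fires, whereas the local flip variable can be cleaned up before the delegate is ever called.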