I am currently building an app that streams a music file. The problem is, our client wants the streamed bytes to also be saved to local storage while the audio is streaming; in other words, the streamed audio file should end up on the device's storage as well. For example, if an m4a file is streamed, then when the user stops streaming it, the streamed music file should already be saved in the device's local storage for future use.
Is this possible? If it is, what library should I use?
Thanks.
Yes, it is possible. Use the AVFoundation framework for this and play your audio.
First, add AVFoundation.framework in the Build Phases section of your target, then import it like this: #import <AVFoundation/AVFoundation.h>
Then declare an AVAudioPlayer instance in your view controller's .h file like this:
AVAudioPlayer *myAudioPlayer;
In your view controller's .m file, put this code:
NSURL *fileURL = ...; // your URL
myAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
myAudioPlayer.numberOfLoops = -1; //infinite loop
[myAudioPlayer play];
This way you are able to play audio on your iOS device. Hope it helps you with playing audio.
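Note that AVAudioPlayer by itself does not persist anything to disk. One way to also keep the streamed bytes, as the client asks, is to download the file with NSURLSession, write the data into the Documents directory, and hand the same data to the player. A minimal sketch, assuming myAudioPlayer is a strong property; the "track.m4a" file name is a placeholder, and this downloads the whole file before playback rather than streaming progressively:
- (void)downloadAndPlay:(NSURL *)remoteURL {
    NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithURL:remoteURL
        completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            if (error != nil || data == nil) {
                return; // handle the download failure here
            }
            // Save the downloaded bytes into Documents for future use.
            NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
            NSString *localPath = [documents stringByAppendingPathComponent:@"track.m4a"];
            [data writeToFile:localPath atomically:YES];
            dispatch_async(dispatch_get_main_queue(), ^{
                // Play from the same bytes that were just saved.
                self.myAudioPlayer = [[AVAudioPlayer alloc] initWithData:data error:nil];
                [self.myAudioPlayer play];
            });
        }];
    [task resume];
}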
Related
I'm trying to play an audio file while it's downloading, is that possible in iOS?
I know I can get the location of the file "after" it's downloaded through:
- (void)URLSession:(NSURLSession *)session downloadTask:(NSURLSessionDownloadTask *)downloadTask didFinishDownloadingToURL:(NSURL *)location
But I need to read/play it while it's still downloading.
Update
My original problem is:
I need to play a streamable audio file in either the right or the left headphone. There's already a setPan method on AVAudioPlayer, but AVAudioPlayer does not work well with streaming data. On the other hand, AVPlayer works perfectly with streaming data, but it does not allow you to setPan.
My idea was to start downloading the audio file, pass it to AVAudioPlayer, and call setPan, but I can't access the file while it's being downloaded. And I'm still not sure whether AVAudioPlayer will pick up new data appended to the file while the file is playing.
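For reference, the setPan part is straightforward once the file is fully on disk, since pan is a plain property on AVAudioPlayer ranging from -1.0 (full left) to 1.0 (full right). A sketch, assuming fileURL points at the downloaded file:
NSError *error = nil;
AVAudioPlayer *panPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];
panPlayer.pan = -1.0; // -1.0 = left only, 0.0 = center, 1.0 = right only
[panPlayer play];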
Directly stream audio using
AVPlayer *anAudioStreamer = [[AVPlayer alloc] initWithURL:[NSURL URLWithString:@"http://my.url.com/my.mp3"]];
[anAudioStreamer play];
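One common pitfall here: if anAudioStreamer is only a local variable, ARC can release it as soon as the method returns, and playback stops almost immediately. Keeping the player in a strong property avoids that, e.g.:
// In the interface, so the player outlives the method that starts it:
@property (nonatomic, strong) AVPlayer *anAudioStreamer;

// In the implementation:
self.anAudioStreamer = [[AVPlayer alloc] initWithURL:[NSURL URLWithString:@"http://my.url.com/my.mp3"]];
[self.anAudioStreamer play];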
I am programming an app to allow for locally stored mp3 file playback as well as streaming from a server.
I use
AVAudioPlayer *player;
AVPlayer *player2;
depending on the type of playback. How do I query the device's current volume level so that I can set my volume slider to that initial value?
First, import:
#import <AVFoundation/AVAudioSession.h>
Then get the system-wide output volume set by the user using:
float volume = [[AVAudioSession sharedInstance] outputVolume];
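For example, to seed a slider with the current hardware volume (a sketch; volumeSlider is an assumed outlet, and the session should be active before the value is meaningful):
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setActive:YES error:&error]; // activate the session before reading outputVolume
self.volumeSlider.value = session.outputVolume; // outputVolume ranges from 0.0 to 1.0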
AVPlayer does not play a .aif file recorded with AVAudioRecorder. My current process is:
AVAudioSession.sharedInstance's category is set to AVAudioSessionCategoryPlayAndRecord
AVAudioRecorder is instantiated with an NSURL in the app's documents directory and settings with a format of kAudioFormatAppleIMA4, a bit rate of 32000, 1 channel, and a sample rate of 16000.0 (sketched after this list).
Audio is recorded, then stopped. The file is saved and I can find it in the app's documents directory and verify the audio is properly recorded.
An instance of AVPlayer is instantiated with the file's NSURL. An observer is added to this object to register changes to its status. When the AVPlayer is ReadyToPlay, I call the play function.
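For reference, the recorder setup described above looks roughly like this (shown in Objective-C for illustration; recordingURL is the NSURL into the documents directory):
NSDictionary *settings = @{ AVFormatIDKey : @(kAudioFormatAppleIMA4),
                            AVEncoderBitRateKey : @32000,
                            AVNumberOfChannelsKey : @1,
                            AVSampleRateKey : @16000.0 };
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordingURL settings:settings error:&error];
[recorder record];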
However, despite control flow reaching the point of AVPlayerStatus.ReadyToPlay and calling play, no sound is produced. I'm checking errors and verifying the existence and attributes of the file at the NSURL. I've also tried following the process of this SO post by instantiating an AVAsset, AVPlayerItem, and AVPlayer. No luck.
Any thoughts on why AVPlayer isn't playing audio for this newly recorded local file?
edit: The app is built for iOS 9 with Xcode 7 beta 5. The app is able to play audio files streamed with AVPlayer.
The issue was not caused by AVAudioRecorder or AVPlayer. My recording URL is an NSURL I'll call fileURL. I passed this NSURL to a different view controller, but AVPlayer wouldn't play when instantiated with fileURL.
I got a hint about what was going wrong when
fileURL.checkResourceIsReachableAndReturnError(&error)
failed with the error "The file does not exist." Instead, I instantiated a new NSURL with the path of the newly recorded file using:
let url = NSURL(fileURLWithPath: path)
With this url, I was able to instantiate and play an instance of AVPlayer.
I want to get the playback time of an m4a audio file without implementing a player.
I found an AVAsset implementation, but it didn't work for me.
playbackTime can only be used when using MPMediaPlayback.
From AVAsset you can only get the duration of the asset, not the playback time.
Below is a reference link:
How to get the duration of an audio file in iOS?
You can access the duration property of the audio file but you cannot get the playback time.
Playback time will only be available when you are using a player. :-)
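To be clear, the duration (as opposed to the current playback time) is available without any player via AVURLAsset. A sketch, assuming audioURL is a file URL pointing at the m4a:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:audioURL options:nil];
CMTime assetDuration = asset.duration;             // accessing this synchronously may block briefly
Float64 seconds = CMTimeGetSeconds(assetDuration); // duration in seconds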
I'm trying to use the standardized "iPod" audio player to play some MP3 tracks in an iPhone app I'm building. The tracks are downloaded from the internet and stored in the app's "Documents" directory. I thought of using a MPMusicPlayerController to do this, but I don't seem to be able to get it to work. Also, I've seen the AVAudioPlayer, but that just plays the audio without an interface. Any suggestions?
The MPMusicPlayerController is for playing items out of the iPod library (songs sync'd via iTunes) so you won't be able to use it for this.
You can get the NSData for your audio using...
NSData *data = [NSData dataWithContentsOfFile:resourcePath options:0 error:&err];
Then use an AVAudioPlayer created from that data and call play.
AVAudioPlayer* player = [[AVAudioPlayer alloc] initWithData:data error:&err];
[player play];
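For completeness, resourcePath would typically point into the app's Documents directory where the tracks were downloaded, e.g. (the file name is a placeholder):
NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *resourcePath = [documents stringByAppendingPathComponent:@"track.mp3"];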