I want to get the playback time of an m4a audio file without implementing a player.
I found an AVAsset-based approach, but it didn't work for me.
playbackTime can only be used when using MPMediaPlayback.
From AVAsset you can only get the duration of the asset, not the playback time.
Below is a reference link:
How to get the duration of an audio file in iOS?
You can access the duration property of the audio file, but you cannot get the playback time.
Playback time is only available when you are using a player. :-)
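For example, reading the duration from an AVURLAsset looks roughly like this (a sketch; the file path is a placeholder):
#import <AVFoundation/AVFoundation.h>

// Load the asset for a local m4a file (the path is a placeholder).
NSURL *fileURL = [NSURL fileURLWithPath:@"/path/to/audio.m4a"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];

// duration is a CMTime; convert it to seconds.
Float64 seconds = CMTimeGetSeconds(asset.duration);
NSLog(@"Duration: %.2f seconds", seconds);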
I'm trying to play audio with a specific PTS (= DTS, decoding time stamp).
So I have tried to use AudioQueueEnqueueBufferWithParameters() with the inStartTime parameter to delay the start of playing each buffer, but that is not working.
I know that on Android, the MediaCodec class's queueInputBuffer method can play audio data with a PTS (see the description: MediaCodec.queueInputBuffer).
I want to find an iOS API like MediaCodec's queueInputBuffer method.
If there is no such API in iOS, how can I play each audio buffer with a specific PTS?
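For reference, this is roughly what I tried (a simplified sketch; audioQueue, buffer, pts, and the 44100 Hz sample rate stand in for my real values):
#import <AudioToolbox/AudioToolbox.h>

// Ask the queue to start this buffer at a given PTS, expressed in sample time.
AudioTimeStamp startTime = {0};
startTime.mFlags = kAudioTimeStampSampleTimeValid;
startTime.mSampleTime = pts * 44100.0; // PTS in seconds * sample rate

OSStatus status = AudioQueueEnqueueBufferWithParameters(
    audioQueue,  // AudioQueueRef
    buffer,      // AudioQueueBufferRef already filled with audio data
    0, NULL,     // no packet descriptions (constant-bitrate PCM)
    0, 0,        // no frames trimmed at start or end
    0, NULL,     // no parameter events
    &startTime,  // requested start time for this buffer
    NULL);       // actual start time out-parameter (unused here)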
AVPlayer does not play a .aif file recorded with AVAudioRecorder. My current process is:
AVAudioSession.sharedInstance's category is set to AVAudioSessionCategoryPlayAndRecord
AVAudioRecorder is instantiated with an NSURL in the app's documents directory and settings of kAudioFormatAppleIMA4 format, a bit rate of 32000, 1 channel, and a sample rate of 16000.0 (a sketch of these settings follows this list).
Audio is recorded, then stopped. The file is saved and I can find it in the app's documents directory and verify the audio is properly recorded.
An instance of AVPlayer is instantiated with the file's NSURL. An observer is added to this object to register changes to its status. When the AVPlayer is ReadyToPlay, I call the play function.
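The recorder settings described above would look roughly like this (a sketch in Objective-C; recordingURL is a placeholder for the documents-directory NSURL):
NSDictionary *settings = @{
    AVFormatIDKey         : @(kAudioFormatAppleIMA4), // Apple IMA4 ADPCM
    AVEncoderBitRateKey   : @32000,                   // 32000 bit rate
    AVNumberOfChannelsKey : @1,                       // mono
    AVSampleRateKey       : @16000.0                  // 16 kHz
};
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordingURL
                                                        settings:settings
                                                           error:&error];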
However, despite control flow reaching the point of AVPlayerStatus.ReadyToPlay and calling play, no sound is produced. I'm checking errors and verifying the existence and attributes of the file at the NSURL. I've also tried following the process of this SO post by instantiating an AVAsset, AVPlayerItem, and AVPlayer. No luck.
Any thoughts on why AVPlayer isn't playing audio for this newly recorded local file?
edit: The app is built for iOS 9 with Xcode 7 beta 5. The app is able to play audio files streamed with AVPlayer.
The issue was not caused by AVAudioRecorder or AVPlayer. My recording URL is an NSURL I'll call fileURL. I passed this NSURL to a different view controller, but AVPlayer wouldn't play when instantiated with fileURL.
I got a hint about what was going wrong when
fileURL.checkResourceIsReachableAndReturnError(&error)
failed with the error "The file does not exist." Instead, I instantiated a new NSURL with the path of the newly recorded file using:
let url = NSURL(fileURLWithPath: path)
With this url, I was able to instantiate and play an instance of AVPlayer. (My guess is that the original fileURL was not constructed as a file URL, so the file's path was no longer resolvable when it was passed along.)
I am currently making an app that can stream a music file. The problem is that our client wants the streamed bytes to also be saved to local storage while streaming, meaning the streamed audio file is also saved on the device. For example, if I stream an m4a file, then when the user stops streaming, the streamed music file is saved in the device's local storage for future use.
Is this possible? If it is, what library should I use?
Thanks.
Yes, it is possible. Use the AVFoundation framework for this and play your audio.
First add the AVFoundation framework in the Build Phases section, then import it like this: #import <AVFoundation/AVFoundation.h>
Then declare an AVAudioPlayer instance in your view controller's .h file like this:
AVAudioPlayer *myAudioPlayer;
In your view controller's .m file, put this code:
NSURL *fileURL = // your url.
myAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
myAudioPlayer.numberOfLoops = -1; //infinite loop
[myAudioPlayer play];
This way you can play audio on your iOS device. Hope it helps you with playing audio.
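To also keep a local copy, as the question asks, one rough approach is to download the file first and then play the saved copy (a sketch, not true progressive caching; remoteURL and the file name are placeholders):
NSURLSession *session = [NSURLSession sharedSession];
[[session downloadTaskWithURL:remoteURL
            completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
    if (error != nil) { return; }
    // Move the downloaded file into the documents directory for future use.
    NSURL *docs = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                          inDomains:NSUserDomainMask] firstObject];
    NSURL *savedURL = [docs URLByAppendingPathComponent:@"track.m4a"];
    [[NSFileManager defaultManager] moveItemAtURL:location toURL:savedURL error:nil];
    // Play the saved local copy on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        myAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:savedURL error:nil];
        [myAudioPlayer play];
    });
}] resume];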
I have been recording video successfully in my app using AVAssetWriter for a long time, but today I started to see a strange warning when I stop recording.
Scenario:
I record the video and can record it again multiple times [NO WARNINGS]
I play the video in MPMoviePlayerController [NO WARNINGS]
I record the video after playing it back, and once I click stop recording I get the warning
Warning:
MP AVAudioSessionDelegateMediaPlayerOnly end interruption. Interruptor <RecorderServer> category <(null)> resumable <0>, _state = 0
Does anyone know what the issue might be, or has anyone had a similar issue?
It feels like I have solved my problem. It was not a big issue, just a minor mistake on my part: when I played the video in MPMoviePlayerController and it finished (detected via the notification), I was not releasing the player object. I thought it would be enough to unregister from the notification, but it helped when I set self.player = nil;
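In code, the fix amounted to something like this in the playback-finished handler (a sketch; the notification name is MPMoviePlayerController's standard one):
// Unregister from the playback-finished notification...
[[NSNotificationCenter defaultCenter] removeObserver:self
                                                name:MPMoviePlayerPlaybackDidFinishNotification
                                              object:self.player];
// ...and, crucially, release the player object as well.
self.player = nil;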
It seems that your audio session category is set to kAudioSessionCategory_MediaPlayback when you play, which is OK. Change it to a category suitable for recording.
See the different available categories here: http://developer.apple.com/library/ios/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/AudioSessionCategories/AudioSessionCategories.html#//apple_ref/doc/uid/TP40007875-CH4-SW1
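For example, switching to a record-capable category before recording could look like this (a sketch using the AVAudioSession API; real code should check the errors):
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
// PlayAndRecord supports recording as well as playback.
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session setActive:YES error:&error];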
I'm a new iOS developer working on a video player app for a video sharing site, where a recording sometimes consists of two video streams (one showing the presenter, the other showing the recording of his screen). I'm trying to play this second video with AVFoundation, creating an AVPlayer. With some videos it works very well, but with others it runs out of memory. After a lot of investigating, I figured out that it tries to buffer the whole video into memory.
I've spent hours googling it, but couldn't find anything.
I created a small project just to demonstrate this:
github project. It sets up two AVPlayers for two different video streams and updates the UI to show the loadedTimeRanges of the players' AVPlayerItems. For the first video it only buffers ~60 seconds, which is nice, but for the second video it keeps buffering.
self.player1 = [AVPlayer playerWithURL:url1];
self.player2 = [AVPlayer playerWithURL:url2];
and the two text labels:
self.data1.text = [NSString stringWithFormat:@"Player 1 loadedTimeRanges: %@",
                   self.player1.currentItem.loadedTimeRanges];
self.data2.text = [NSString stringWithFormat:@"Player 2 loadedTimeRanges: %@",
                   self.player2.currentItem.loadedTimeRanges];
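(The labels are refreshed as buffering progresses; one way to do that, sketched here and possibly differing from the demo project's exact wiring, is KVO on loadedTimeRanges, with observeValueForKeyPath:ofObject:change:context: updating the labels:)
[self.player1.currentItem addObserver:self
                           forKeyPath:@"loadedTimeRanges"
                              options:NSKeyValueObservingOptionNew
                              context:nil];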
Maybe this could be important: the over-buffering video does not have an audio track, just video.
UPDATE: I reproduced the problem using MPMoviePlayerController instead of AVPlayer and checking the playableDuration property. With the first movie it stops around 60 seconds; with the second movie it keeps going and then runs out of memory.
UPDATE 2: I got the actual video files, put them up on Dropbox, and tried to stream those: then I don't have the problem! It buffers the whole movie, but it does not run out of memory. It only runs out of memory if I stream them from the original site (our video sharing site). The URLs are in the github project.
I'm really looking forward to any hints what could cause this.
Thank you!
This problem is indeed caused by the lack of an audio track in video streams sent from Wowza Media Server. (I have inferred from your stream URLs that you're using Wowza Media Server to stream your videos.)
To verify this issue, I created a 5-minute video file with no audio track.
mplayer -nolirc -vo null -ao null -frames 0 -identify test_60.mp4
...
Opening video decoder: [ffmpeg] FFmpeg's libavcodec codec family
Selected video codec: [ffh264] vfm: ffmpeg (FFmpeg H.264)
==========================================================================
ID_VIDEO_CODEC=ffh264
Audio: no sound
Starting playback...
...
Then I added an MP3 track to that video file using MP4Box.
MP4Box -new -add test_60.mp4 -add test_music.mp3 test_60_music.mp4
And verified that there was indeed an audio track.
mplayer -nolirc -vo null -ao null -frames 0 -identify /tmp/test_60_music.mp4
...
AUDIO: 44100 Hz, 2 ch, floatle, 320.0 kbit/11.34% (ratio: 40000->352800)
ID_AUDIO_BITRATE=320000
ID_AUDIO_RATE=44100
ID_AUDIO_NCH=2
Selected audio codec: [ffmp3float] afm: ffmpeg (FFmpeg MPEG layer-3 audio)
==========================================================================
AO: [null] 44100Hz 2ch floatle (4 bytes per sample)
ID_AUDIO_CODEC=ffmp3float
Starting playback...
...
Then I put both test_60.mp4 and test_60_music.mp4 in the Wowza content directory and tested them. I actually wrote a small test app similar to yours to examine loadedTimeRanges, but just loading the videos via Safari on the device should be sufficient to see the difference.
I opened wowza_server:1935/vod/mp4:test_60.mp4/playlist.m3u8 and pressed pause as soon as it started playing. The buffer indicator kept increasing until the full 5 minute video was loaded.
Then, I opened wowza_server:1935/vod/mp4:test_60_music.mp4/playlist.m3u8 and did the same, but only the first 1/5th (roughly 1 minute) was loaded.
So it seems like a problem with the Wowza server's packetization. Note that this problem does not happen for me on Adobe (Flash) Media Server 5.0: only 60 seconds is buffered regardless of whether the video contains an audio track.
Hope that's helpful. I've asked for input from the Wowza folks on the Wowza forums.