I'm trying to figure out how to correctly schedule an audio file in the near future. My actual goal is to play multiple tracks in sync.
So how do I configure 'aTime' correctly so that playback starts, for instance, about 0.3 seconds from now?
I think I may need the hostTime as well, but I don't know how to use it correctly.
func createStartTime() -> AVAudioTime? {
    var time: AVAudioTime?
    if let lastPlayer = self.trackPlayerDictionary[lastPlayerKey] {
        if let sampleRate = lastPlayer.file?.processingFormat.sampleRate {
            let sampleTime = AVAudioFramePosition(shortStartDelay * sampleRate)
            time = AVAudioTime(sampleTime: sampleTime, atRate: sampleRate)
        }
    }
    return time
}
Here is the function I use to start playback:
func playAtTime(aTime: AVAudioTime?) {
    self.startingFrame = AVAudioFramePosition(self.currentTime * self.file!.processingFormat.sampleRate)
    let frameCount = AVAudioFrameCount(self.file!.length - self.startingFrame!)
    self.player.scheduleSegment(self.file!, startingFrame: self.startingFrame!, frameCount: frameCount, atTime: aTime, completionHandler: { () -> Void in
        NSLog("done playing") // actually done scheduling
    })
    self.player.play()
}
I figured it out!
For the hostTime parameter I passed in mach_absolute_time(), which is the computer/iPad's 'now' time. AVAudioTime(hostTime:sampleTime:atRate:) adds the sampleTime to the hostTime and gives back a time in the near future that can be used to schedule multiple audio segments with the same start time:
func createStartTime() -> AVAudioTime? {
    var time: AVAudioTime?
    if let lastPlayer = self.trackPlayerDictionary[lastPlayerKey] {
        if let sampleRate = lastPlayer.file?.processingFormat.sampleRate {
            let sampleTime = AVAudioFramePosition(shortStartDelay * sampleRate)
            time = AVAudioTime(hostTime: mach_absolute_time(), sampleTime: sampleTime, atRate: sampleRate)
        }
    }
    return time
}
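For what it's worth, here is a minimal Swift sketch of an alternative way to build the same kind of start time, using AVAudioTime.hostTime(forSeconds:) to convert the delay directly instead of going through a sample time. The 0.3-second delay and the playerA/playerB names are placeholders:

import AVFoundation

// Assumes playerA and playerB are AVAudioPlayerNodes that are attached,
// connected, and already scheduled on a running AVAudioEngine.
let delaySeconds = 0.3
let startHostTime = mach_absolute_time() + AVAudioTime.hostTime(forSeconds: delaySeconds)
let startTime = AVAudioTime(hostTime: startHostTime)

// Both players receive the identical start time, so they begin together.
playerA.play(at: startTime)
playerB.play(at: startTime)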
Well - it is ObjC - but you'll get the point...
No need for mach_absolute_time() - if your engine is running, you already have a lastRenderTime @property in AVAudioNode - your player's superclass...
AVAudioFormat *outputFormat = [playerA outputFormatForBus:0];
const float kStartDelayTime = 0.0; // seconds - in case you wanna delay the start
AVAudioFramePosition startSampleTime = playerA.lastRenderTime.sampleTime;
AVAudioTime *startTime = [AVAudioTime timeWithSampleTime:(startSampleTime + (kStartDelayTime * outputFormat.sampleRate)) atRate:outputFormat.sampleRate];
[playerA playAtTime: startTime];
[playerB playAtTime: startTime];
[playerC playAtTime: startTime];
[playerD playAtTime: startTime];
[player...
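Since the question itself is in Swift, here is a rough Swift translation of the same lastRenderTime idea, assuming playerA through playerD are AVAudioPlayerNodes on a running engine:

import AVFoundation

let kStartDelayTime = 0.0 // seconds - in case you want to delay the start
let outputFormat = playerA.outputFormat(forBus: 0)

// lastRenderTime is only non-nil while the engine is rendering.
if let lastRenderTime = playerA.lastRenderTime {
    let startSampleTime = lastRenderTime.sampleTime
        + AVAudioFramePosition(kStartDelayTime * outputFormat.sampleRate)
    let startTime = AVAudioTime(sampleTime: startSampleTime, atRate: outputFormat.sampleRate)

    // Hand the identical start time to every player.
    [playerA, playerB, playerC, playerD].forEach { $0.play(at: startTime) }
}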
By the way - you can achieve the same 100% sample-frame accurate result with the AVAudioPlayer class...
NSTimeInterval startDelayTime = 0.0; // seconds - in case you wanna delay the start
NSTimeInterval now = playerA.deviceCurrentTime;
NSTimeInterval startTime = now + startDelayTime;
[playerA playAtTime: startTime];
[playerB playAtTime: startTime];
[playerC playAtTime: startTime];
[playerD playAtTime: startTime];
[player...
With no startDelayTime, the first 100-200 ms of all players will get clipped off, because the start command actually takes some time to travel through the run loop, even though the players have already started (well, been scheduled) 100% in sync at now. But with a startDelayTime = 0.25 you are good to go. And never forget to prepareToPlay your players in advance, so that at start time no additional buffering or setup has to be done - just start the guys ;-)
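Here is a minimal Swift sketch of that AVAudioPlayer approach; the player instances and the 0.25-second delay are placeholders:

import AVFoundation

// Assumes playerA...playerD are AVAudioPlayer instances with their files loaded.
let players = [playerA, playerB, playerC, playerD]

// Prepare in advance so no buffering or setup happens at start time.
players.forEach { $0.prepareToPlay() }

let startDelayTime: TimeInterval = 0.25
let startTime = playerA.deviceCurrentTime + startDelayTime

// Schedule every player against the same device time.
players.forEach { $0.play(atTime: startTime) }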
For an even more in-depth explanation, have a look at my answer to AVAudioEngine multiple AVAudioInputNodes do not play in perfect sync.
How can I save time when audio was stopped in session and continue playback from the stop point in next session?
My code:
- (void)initPlayer:(NSString *)audioFile fileExtension:(NSString *)fileExtension
{
    NSURL *audioFileLocationURL = [[NSBundle mainBundle] URLForResource:audioFile withExtension:fileExtension];
    NSError *error;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileLocationURL error:&error];
    if ([audioFile isEqualToString:@"2"]) {
        _index = 1;
    }
    else if ([audioFile isEqualToString:@"3"]) {
        _index = 2;
    }
    [self song];
}
- (void)playAudio {
    [self.audioPlayer play];
}

- (void)pauseAudio {
    [self.audioPlayer pause];
}

- (BOOL)isPlaying {
    return [self.audioPlayer isPlaying];
}

- (NSString *)timeFormat:(float)value {
    float minutes = floor(lroundf(value) / 60);
    float seconds = lroundf(value) - (minutes * 60);
    int roundedSeconds = lroundf(seconds);
    int roundedMinutes = lroundf(minutes);
    NSString *time = [[NSString alloc] initWithFormat:@"%d:%02d", roundedMinutes, roundedSeconds];
    return time;
}

- (void)setCurrentAudioTime:(float)value {
    [self.audioPlayer setCurrentTime:value];
}

- (NSTimeInterval)getCurrentAudioTime {
    return [self.audioPlayer currentTime];
}

- (float)getAudioDuration {
    return [self.audioPlayer duration];
}
You can use AVPlayer's currentTime property. It returns the playback time of the current AVPlayerItem.
To restore the playback time in the next session, you can pass the stored time to AVPlayer's seekToTime:
[self.player seekToTime:storedPlaybackTime];
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/doc/uid/TP40009530-CH1-SW2
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/occ/instm/AVPlayer/seekToTime%3a
To persist the CMTime returned by currentTime, you can use the AVFoundation convenience methods provided by NSValue.
To wrap CMTime in an NSValue, use valueWithCMTime:
[NSValue valueWithCMTime:player.currentTime];
To get a CMTime struct back from the persisted value, use:
CMTime persistedTime = [storedValue CMTimeValue];
After you have wrapped the CMTime struct in an NSValue instance, you can use a keyed archiver and NSData to write the time to disk.
NSHipster has a good article about that topic: http://nshipster.com/nscoding/
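As a minimal Swift sketch of that round trip, assuming you simply stash the archived NSValue in UserDefaults under a placeholder "playbackTime" key:

import AVFoundation

let defaults = UserDefaults.standard

// Wrap the current CMTime in an NSValue and archive it to Data.
func storePlaybackTime(of player: AVPlayer) throws {
    let value = NSValue(time: player.currentTime())
    let data = try NSKeyedArchiver.archivedData(withRootObject: value, requiringSecureCoding: true)
    defaults.set(data, forKey: "playbackTime")
}

// Unarchive the NSValue and seek the player back to the stored time.
func restorePlaybackTime(of player: AVPlayer) throws {
    guard let data = defaults.data(forKey: "playbackTime"),
          let value = try NSKeyedUnarchiver.unarchivedObject(ofClass: NSValue.self, from: data)
    else { return }
    player.seek(to: value.timeValue)
}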
The easiest way would be to keep a local db with the song name and, when the song is stopped, add that data to the db. Then, when playback resumes later, check the local db first for an entry from a previous session; if there isn't one, continue from the start.
Also make sure that no entry is made if the song finishes.
Hope this idea helps you...
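As a rough sketch of that idea, UserDefaults can stand in for the local db, keyed by the audio file name; all the names here are placeholders:

import AVFoundation

// Save the stop point for this file when playback is paused or interrupted.
func saveStopPoint(of player: AVAudioPlayer, fileName: String) {
    UserDefaults.standard.set(player.currentTime, forKey: fileName)
}

// Remove the entry when the song finishes so the next session starts fresh.
func clearStopPoint(fileName: String) {
    UserDefaults.standard.removeObject(forKey: fileName)
}

// On the next session, resume from the stored time (0 if there is no entry).
func resume(player: AVAudioPlayer, fileName: String) {
    player.currentTime = UserDefaults.standard.double(forKey: fileName)
    player.play()
}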
I need to create a custom video plugin using Swift, but I don't know how to get the video's full duration and the current playing time. All that appears in my console is this output: C.CMTime. I'm not sure what is wrong with my code.
My code
let url = NSBundle.mainBundle().URLForResource("Video", withExtension:"mp4")
let asset = AVURLAsset(URL:url, options:nil)
let duration: CMTime = asset.duration
println(duration)
You can use CMTimeGetSeconds to convert a CMTime to seconds.
let durationTime = CMTimeGetSeconds(duration)
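A small sketch putting both values together, assuming you also have an AVPlayer playing the asset (modern Swift spelling; the file name is a placeholder):

import AVFoundation

let videoURL = Bundle.main.url(forResource: "Video", withExtension: "mp4")!

// Full duration of the asset, in seconds.
let asset = AVURLAsset(url: videoURL)
let durationSeconds = CMTimeGetSeconds(asset.duration)

// Current playback position of a player for the same asset, in seconds.
let player = AVPlayer(url: videoURL)
let currentSeconds = CMTimeGetSeconds(player.currentTime())

print("duration: \(durationSeconds) s, current time: \(currentSeconds) s")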
In Objective-C, you can use the same idea via an AVPlayerItem:
- (NSTimeInterval)playableDuration
{
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem *item = _moviePlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray *timeRangeArray = item.loadedTimeRanges;
        CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
        double startTime = CMTimeGetSeconds(aTimeRange.start);
        double loadedDuration = CMTimeGetSeconds(aTimeRange.duration);
        // FIXME: should we sum up all sections to have a total playable duration,
        // or do we just use the first section as the whole?
        NSLog(@"get time range, its start is %f seconds, its duration is %f seconds.", startTime, loadedDuration);
        return (NSTimeInterval)(startTime + loadedDuration);
    }
    else
    {
        return (CMTimeGetSeconds(kCMTimeInvalid));
    }
}
I am trying to fetch all the frames of a video and convert and store them as individual images.
I am using the code from the AV Foundation Programming Guide.
The code for getting multiple images is:
CMTime firstThird = CMTimeMakeWithSeconds(durationSeconds/3.0, 600);
CMTime secondThird = CMTimeMakeWithSeconds(durationSeconds*2.0/3.0, 600);
CMTime end = CMTimeMakeWithSeconds(durationSeconds, 600);
This is hard-coded, but I want to convert the whole video. I know I can use a for loop, but what do I do with durationSeconds? How can I go from the beginning to the end to get all the frames?
Here is my attempt:
for (float f = 0.0; f <= durationSeconds; f++) {
    [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(durationSeconds, 600)]];
}
Any time you're about to write hundreds of lines of nearly identical code is probably a time where you need to be using a loop of some sort:
for (int currentFrame = 0; currentFrame < durationSeconds; ++currentFrame) {
    CMTime currentTime = CMTimeMakeWithSeconds(currentFrame, 600);
    // the rest of the code you need to create the image or whatever
}
That snippet will grab one frame per second. If you wanted to grab 30 frames per second, it'd look more like this:
const CGFloat framesPerSecond = 30.0;
for (int currentFrame = 0; currentFrame < (durationSeconds * framesPerSecond); ++currentFrame) {
    CMTime currentTime = CMTimeMakeWithSeconds(currentFrame / framesPerSecond, 600);
    // again, the code you need to create the image from this time
}
Just set the value of framesPerSecond to however many frames per second you want to capture.
As a disclaimer, I'm not completely familiar with this stuff, so a <= might be appropriate in the conditional statements here.
ADDENDUM: The code I've posted is only going to grab the timestamp for which to grab an image. The rest of the code should look something like this:
AVAsset *myAsset = // your asset here
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:myAsset];
NSError *error;
CMTime actualTime;
CGImageRef currentImage = [imageGenerator copyCGImageAtTime:currentTime
                                                 actualTime:&actualTime
                                                      error:&error];
if (currentImage) {
    [someMutableArray addObject:[[UIImage alloc] initWithCGImage:currentImage]];
    CGImageRelease(currentImage); // the copied ref is owned by us; UIImage keeps its own reference
}
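If you really want every frame, calling copyCGImageAtTime once per frame gets slow; here is a hedged Swift sketch of the batch API, generateCGImagesAsynchronously(forTimes:), where the file name, frame rate, and array are all placeholders:

import AVFoundation
import UIKit

let videoURL = URL(fileURLWithPath: "video.mp4") // placeholder
let asset = AVURLAsset(url: videoURL)
let durationSeconds = CMTimeGetSeconds(asset.duration)
let framesPerSecond = 30.0

// Build one requested time per frame.
let frameCount = Int(durationSeconds * framesPerSecond)
let times: [NSValue] = (0..<frameCount).map {
    NSValue(time: CMTimeMakeWithSeconds(Double($0) / framesPerSecond, preferredTimescale: 600))
}

let generator = AVAssetImageGenerator(asset: asset)
var images: [UIImage] = []

// The handler is called once per requested time; collect the frames that succeeded.
// (It runs on a background queue, so guard access to `images` in real code.)
generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
    if result == .succeeded, let cgImage = cgImage {
        images.append(UIImage(cgImage: cgImage))
    }
}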
I am new to iPhone development. I am working on an audio player, and I have to show the current time and the remaining time of the song. In a video player this comes by default, but in an audio player it doesn't, so I wrote some logic to get the current time of the song. The code below is for that:
int minutes = (int)audioPlayer.currentTime / 60;
int seconds = (int)audioPlayer.currentTime % 60;
startDurationLabel.text = [NSString stringWithFormat:@"%d:%02d", minutes, seconds];
Here audioPlayer is an instance of AVAudioPlayer and startDurationLabel is the label that displays the current time of the song.
But I am struggling to adapt this logic to show the remaining time of the song.
If anybody knows how, please help me...
Try this:
NSTimeInterval remainingTime = audioPlayer.duration - audioPlayer.currentTime;
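A minimal Swift sketch of the same computation, formatted as m:ss; audioPlayer and the label are placeholders:

import AVFoundation

// Remaining time in seconds for an AVAudioPlayer.
let remaining = audioPlayer.duration - audioPlayer.currentTime

// Format as m:ss for display.
let minutes = Int(remaining) / 60
let seconds = Int(remaining) % 60
remainingDurationLabel.text = String(format: "%d:%02d", minutes, seconds)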
Try this -
NSString *strTimeLeft = [self getTimeFromTimeInterval:CMTimeGetSeconds(_player.currentItem.duration) - CMTimeGetSeconds(_player.currentTime)];
Add this method to your class:
- (NSString *)getTimeFromTimeInterval:(NSTimeInterval)timeInterval
{
    NSInteger interval = (NSInteger)timeInterval;
    NSInteger seconds = interval % 60;
    NSInteger minutes = (interval / 60) % 60;
    //NSInteger hours = (interval / 3600) % 60;
    NSString *strTime = [NSString stringWithFormat:@"%02ld:%02ld", (long)minutes, (long)seconds];
    return strTime;
}
The MPMoviePlayerController has a property called playableDuration.
playableDuration The amount of currently playable content (read-only).
@property (nonatomic, readonly) NSTimeInterval playableDuration
For progressively downloaded network content, this property reflects
the amount of content that can be played now.
Is there something similar for AVPlayer?
I can't find anything in the Apple Docs or Google (not even here at Stackoverflow.com)
Thanks in advance.
playableDuration can be roughly implemented with the following procedure:
- (NSTimeInterval)playableDuration
{
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem *item = _moviePlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray *timeRangeArray = item.loadedTimeRanges;
        CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
        double startTime = CMTimeGetSeconds(aTimeRange.start);
        double loadedDuration = CMTimeGetSeconds(aTimeRange.duration);
        // FIXME: should we sum up all sections to have a total playable duration,
        // or do we just use the first section as the whole?
        NSLog(@"get time range, its start is %f seconds, its duration is %f seconds.", startTime, loadedDuration);
        return (NSTimeInterval)(startTime + loadedDuration);
    }
    else
    {
        return (CMTimeGetSeconds(kCMTimeInvalid));
    }
}
_moviePlayer is your AVPlayer instance; by checking the AVPlayerItem's loadedTimeRanges, you can compute an estimated playableDuration.
For videos that have only one section you can use this procedure as-is, but for multi-section video you may want to check all the time ranges in the loadedTimeRanges array to get the correct answer, as sketched below.
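Here is a hedged Swift sketch of that multi-range variant, summing every loaded range rather than only the first one (the player argument is a placeholder):

import AVFoundation

// Sum the durations of all loaded time ranges to estimate playableDuration.
func playableDuration(of player: AVPlayer) -> TimeInterval {
    guard let item = player.currentItem, item.status == .readyToPlay else {
        return CMTimeGetSeconds(CMTime.invalid) // NaN, mirroring kCMTimeInvalid above
    }
    return item.loadedTimeRanges
        .map { $0.timeRangeValue }
        .reduce(0) { $0 + CMTimeGetSeconds($1.duration) }
}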
All you need is
self.player.currentItem.asset.duration
Simple as that.
Building on John's Answer…
This is the apparent default behavior of Apple players: "Show the Max Time of the playable range that encloses the current time"
- (NSTimeInterval)currentItemPlayableDuration {
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem *item = self.audioPlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray *timeRangeArray = item.loadedTimeRanges;
        CMTime currentTime = self.audioPlayer.currentTime;
        __block CMTimeRange aTimeRange;
        [timeRangeArray enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
            aTimeRange = [obj CMTimeRangeValue]; // use the enumerated range, not always index 0
            if (CMTimeRangeContainsTime(aTimeRange, currentTime))
                *stop = YES;
        }];
        CMTime maxTime = CMTimeRangeGetEnd(aTimeRange);
        return CMTimeGetSeconds(maxTime);
    }
    else
    {
        return (CMTimeGetSeconds(kCMTimeInvalid));
    }
}
You will have to detect when the AVPlayer is ready to play your media file (a KVO sketch for that follows the code below).
Let me know if you don't know how to do this.
However, once the media file is loaded, you can use this method:
#import <AVFoundation/AVFoundation.h>
/**
* Get the duration for the currently set AVPlayer's item.
*/
- (CMTime)playerItemDuration {
    AVPlayerItem *playerItem = [mPlayer currentItem];
    if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
        return [[playerItem asset] duration];
    }
    return (kCMTimeInvalid);
}
When you use this method it's important to understand (because you're streaming content) that the length value may be invalid, so you must check this before using it for processing:
CMTime playerDuration = [self playerItemDuration];
if (CMTIME_IS_INVALID(playerDuration)) {
    return;
}
double duration = CMTimeGetSeconds(playerDuration);
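As noted above, the duration is only valid once the item is ready to play; here is a minimal Swift sketch of detecting that with key-value observation (the URL and the print are placeholders):

import AVFoundation

let videoURL = URL(fileURLWithPath: "video.mp4") // placeholder
let player = AVPlayer(url: videoURL)

// Keep a strong reference to the observation for as long as you need it.
let statusObservation = player.currentItem?.observe(\.status, options: [.new]) { item, _ in
    if item.status == .readyToPlay {
        // The duration is valid from this point on.
        let duration = CMTimeGetSeconds(item.asset.duration)
        print("ready to play, duration: \(duration) s")
    }
}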
Swift version of the playable-duration approach above:
var playableDuration: TimeInterval? {
    guard let currentItem = currentItem else { return nil }
    guard currentItem.status == .readyToPlay else { return nil }

    let timeRangeArray = currentItem.loadedTimeRanges
    let currentTime = self.currentTime()

    for value in timeRangeArray {
        let timeRange = value.timeRangeValue
        if CMTimeRangeContainsTime(timeRange, currentTime) {
            return CMTimeGetSeconds(CMTimeRangeGetEnd(timeRange))
        }
    }

    guard let timeRange = timeRangeArray.first?.timeRangeValue else { return 0 }
    let startTime = CMTimeGetSeconds(timeRange.start)
    let loadedDuration = CMTimeGetSeconds(timeRange.duration)
    return startTime + loadedDuration
}