AVPlayer rate property for HTTP Live Streaming - iOS

I am trying to develop a player using AVFoundation and enable fast forward and rewind. I am setting the player's rate property to 0, 0.25, 0.5, 1, 1.5, and 2.0.
Rates of 0 and 1 work as expected, pausing and playing the video.
Rates of 0.25 and 0.5 also work and play the video in slow motion.
My problem is that setting the rate to 1.5 or 2.0 has no effect; playback just continues at normal speed. It works for .mp4 videos, though. Is this not supported for HLS? I am using the sample HLS streams provided by Apple.
http://devimages.apple.com/iphone/samples/bipbopgear4.html
How do we enable rewind and fast forward? Should I somehow use seekToTime:?
Any help will be appreciated!

It looks like FF/RW is supported if the stream has an I-frame playlist, but that is only available from iOS 5.0+.
Confirmed on the Apple Dev Forums:
Rate is supported only for I-frame playlists for HLS content. For a normal playlist, only rate = 0 or 1 is supported (which is essentially play/pause).
For non-HLS content, rate can be set to values < 0, = 0, or > 0 to support FF/RW/slow forward, etc.

The rate property only controls playback speed: 0 for stopped, and up to 1 for the normal rate of the current item. Any value over 1 is treated as 1. If you want to "fast forward" to a specific point, you will need to use the method you mentioned, seekToTime:. There is no way (AFAIK) to play a movie faster than its normal rate using public APIs. Hope that helps.
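One possible fallback, as a rough sketch only: check the current item's canPlayFastForward flag and, when a real 2x rate is not available, step ahead with seekToTime: instead. The player property on the controller and the 10-second step size are illustrative assumptions, not part of the question's code.
#import <AVFoundation/AVFoundation.h>
- (void)fastForward
{
    AVPlayerItem *item = self.player.currentItem;
    if (item.canPlayFastForward) {
        // An I-frame playlist (or a non-HLS asset) is available, so a real 2x rate works.
        self.player.rate = 2.0;
    } else {
        // No I-frame playlist: simulate fast forward by jumping ahead instead.
        CMTime target = CMTimeAdd(self.player.currentTime, CMTimeMakeWithSeconds(10.0, 600));
        [self.player seekToTime:target toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
    }
}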

Related

Bitrate is not getting limited for H.264 HW accelerated encode on iOS using the VideoToolbox API

Bitrate is not getting limited for H.264 HW-accelerated encoding on iOS using the VideoToolbox API with the property kVTCompressionPropertyKey_AverageBitRate.
The bitrate at times shoots up to 4 Mbps (for both 1280x780 and 640x360) for the H.264 HW-accelerated encode, even though the encoder's bitrate is configured correctly.
This high bitrate is outside the acceptable limits.
There is a single property for setting the bitrate, i.e. kVTCompressionPropertyKey_AverageBitRate, available in VideoToolbox. The documentation says "This is not a hard limit; the bit rate may peak above this".
I have tried the following two things:
1. Set the bitrate and data rate to hardcoded values as part of the encoderSpec attribute of VTCompressionSessionCreate during init, and removed any reconfiguration of the bitrate after init.
2. Set the bitrate and data rate at run time using VTSessionSetProperty.
Neither seems to work.
Is there any way to restrict the bitrate to a certain limit? Any help is greatly appreciated.
If you are dealing with scenes that contain a lot of motion, 4 Mbps may well be the right value. In a non-real-time situation, I think you should try configuring the profile to High with Level 5, setting H264EntropyMode to CABAC, and extending the value of the MaxKeyFrameInterval key.
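A sketch of how those properties might be set on an existing compression session. The session parameter, the 500 kbps target, the 1-second rate-limit window, and the 240-frame keyframe interval are illustrative assumptions; kVTCompressionPropertyKey_DataRateLimits is the closest thing VideoToolbox offers to a hard cap, though it is still not an absolute guarantee.
#import <VideoToolbox/VideoToolbox.h>
static void ConfigureBitrateCaps(VTCompressionSessionRef session)
{
    // Soft average target (peaks above this are still allowed).
    VTSessionSetProperty(session, kVTCompressionPropertyKey_AverageBitRate,
                         (__bridge CFTypeRef)@(500 * 1000)); // bits per second
    // Data-rate limit: at most this many bytes within each 1-second window.
    NSArray *limits = @[ @(500 * 1000 / 8), @1.0 ]; // [bytes, seconds]
    VTSessionSetProperty(session, kVTCompressionPropertyKey_DataRateLimits,
                         (__bridge CFArrayRef)limits);
    // Suggestions from the answer above: High profile at Level 5 (closest available
    // constant is 5.1), CABAC entropy coding, and a longer keyframe interval.
    VTSessionSetProperty(session, kVTCompressionPropertyKey_ProfileLevel,
                         kVTProfileLevel_H264_High_5_1);
    VTSessionSetProperty(session, kVTCompressionPropertyKey_H264EntropyMode,
                         kVTH264EntropyMode_CABAC);
    VTSessionSetProperty(session, kVTCompressionPropertyKey_MaxKeyFrameInterval,
                         (__bridge CFTypeRef)@240); // frames
}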

Mute the audio at a particular interval of time while casting video in iOS using Chromecast

I am working on a Chromecast-based application using the Google Cast API. Initially I join the Chromecast session from YouTube and play the video; later I join this session from my application.
There is a requirement in my application to mute the audio at a particular interval of time.
I need to mute the audio from 00:01:34:03 (hh:mm:ss:ms) to 00:01:34:15 (hh:mm:ss:ms).
I convert the time to seconds as follows.
Time to seconds conversion: (00*60*60) + (01*60) + 34 + (03/1000) = 94.003 -> mute start time
I call the mute method after an interval of: mute start time - current streaming position.
I am using the approximateStreamPosition value (in the GCKMediaControlChannel header file) to find the stream position of the casting video. It returns a double value, say 94.70801001358.
Here 94 is the duration in seconds; what does the value after the decimal point (.70801001358) indicate? Is it milliseconds? If so, can I round it to three digits?
Since I need to mute the audio with millisecond precision, will rounding off the value cause the mute to be delayed or to happen early?
The 0.70801001358 is in seconds; I am not sure what you mean by asking if it is in milliseconds. In milliseconds, that number would be 708.01001358.
You won't be able to get millisecond accuracy in controlling mute (or any other control command, for that matter); just setting up a command plus the transfer time from your iOS device to the Chromecast will together throw your calculation off by a good number of milliseconds.
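For what it's worth, a rough scheduling sketch (Cast SDK v2, assuming a connected GCKMediaControlChannel stored in a mediaControlChannel property). The 94.003 s mute start time is taken from the question, and the actual mute will still land some tens of milliseconds late for the latency reasons described above.
NSTimeInterval muteStart = 94.003; // 00:01:34:03 converted to seconds, as in the question
NSTimeInterval delay = muteStart - self.mediaControlChannel.approximateStreamPosition;
if (delay > 0) {
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delay * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        // Mutes the receiver's stream; unmute later the same way with NO.
        [self.mediaControlChannel setStreamMuted:YES];
    });
}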

How to change the percentage according to the recorded audio volume

As shown in the figure below (source: Disney Spotlight app, https://itunes.apple.com/in/app/disney-spotlight-karaoke/id455072135?mt=8),
I want to change the percentage value depending on the recorded sound while recording audio.
If we keep the device in a very quiet place, the percentage should not change.
If we keep the device in a loud place (song, voice, machine sound, or any other sound), the percentage needs to increase according to the recorded volume.
How can we do this in iOS using Objective-C?
Supposing you're using an AVAudioRecorder to record your sounds, you can call:
[audioRecorder setMeteringEnabled:YES];
before starting to record in order to enable the audio-level metering. Then at any time you can call:
[audioRecorder updateMeters];
float audioPower = [audioRecorder averagePowerForChannel:0]; // You can change 0 according to your input channels
audioPower will give you a float value between -160 and 0 dB, which you can then adapt to your needs.
If you want to constantly poll the audio level, you can call the code above from an NSTimer action or use an NSOperationQueue.
Check the AVAudioRecorder class reference for more details on these methods.
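As a concrete (hypothetical) example of the polling approach, the dB reading can be mapped linearly onto 0-100%. The -60 dB silence floor, the 0.1 s timer interval, and the audioRecorder/percentageLabel properties are all assumptions for illustration, not something the Disney app is known to use.
// Start polling, e.g. right after [audioRecorder record]:
[NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(updateLevel) userInfo:nil repeats:YES];
- (void)updateLevel
{
    [self.audioRecorder updateMeters];
    float dB = [self.audioRecorder averagePowerForChannel:0]; // -160 ... 0 dB
    // Treat anything below -60 dB as silence and map -60...0 dB onto 0...100%.
    float clamped = MAX(dB, -60.0f);
    float percentage = (clamped + 60.0f) / 60.0f * 100.0f;
    self.percentageLabel.text = [NSString stringWithFormat:@"%.0f%%", percentage];
}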

MPMusicPlayerController not responding to currentPlaybackRate near 1

I'm trying to use the currentPlaybackRate property on MPMusicPlayerController to adjust the tempo of a music track as it plays. The property works as expected when the rate is less than 0.90 or greater than 1.13, but for the range just above and below 1, there seems to be no change in tempo. Here's what I'm trying:
UIAppDelegate.musicPlayer = [MPMusicPlayerController iPodMusicPlayer];
// ... load music player with track from library
[UIAppDelegate.musicPlayer play];
- (void)speedUp
{
    UIAppDelegate.musicPlayer.currentPlaybackRate = UIAppDelegate.musicPlayer.currentPlaybackRate + 0.03125;
}
- (void)speedDown
{
    UIAppDelegate.musicPlayer.currentPlaybackRate = UIAppDelegate.musicPlayer.currentPlaybackRate - 0.03125;
}
I can monitor the value of currentPlaybackRate and see that it is being set correctly, but there seems to be no difference in playback tempo until the 0.9 or 1.13 threshold has been crossed. Does anyone have any guidance or experience on the matter?
I'm no expert, but I suspect this phenomenon may merely be an artefact of the algorithm used to change the playback speed without raising or lowering the pitch. That's a tricky business, and here it must be done in real time without much distortion, so probably an integral multiple of the tempo is needed. You might want to read the Wikipedia article on time stretching: http://en.wikipedia.org/wiki/Audio_timescale-pitch_modification
Actually, I've found the problem: the statement myMusicPlayer.currentPlaybackRate = 1.2 must be placed after the call to play(). If you set the rate before calling play(), it has no effect.
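Applied to the Objective-C code from the question, that ordering would look like this (1.2 is just an example rate):
[UIAppDelegate.musicPlayer play];
// Set the rate only after play has been called, per the answer above.
UIAppDelegate.musicPlayer.currentPlaybackRate = 1.2;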

portaudio video/audio sync

I use ffmpeg to decode a video/audio stream and PortAudio to play the audio. I have run into a sync problem with PortAudio. I have a function like the one below:
double AudioPlayer::getPlaySec() const
{
    double const latency = Pa_GetStreamInfo( mPaStream )->outputLatency;
    double const bytesPerSec = mSampleRate * Pa_GetSampleSize( mSampleFormat ) * mChannel;
    double const playtime = mConsumedBytes / bytesPerSec;
    return playtime - latency;
}
mConsumedBytes is the byte count written to the audio device in the PortAudio callback function. I thought I could derive the playing time from this byte count. In practice, when I run another process (such as opening Firefox) that makes the CPU busy, the audio becomes intermittent, but the callback does not stop, so mConsumedBytes grows larger than expected and getPlaySec returns a time greater than the actual playing time.
I have no idea why this happens. Any suggestion is welcome. Thanks!
Latency in PortAudio is defined a bit vaguely: something like the average time between when you put data into the buffer and when you can expect it to play. That's not something you want to use for this purpose.
Instead, to find the current playback time of the device, you can actually poll the device using the Pa_GetStreamTime function.
You may want to see this document for more detailed info.
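A minimal sketch of that approach (plain C). streamStartTime is an assumed variable, recorded with Pa_GetStreamTime right after Pa_StartStream; because the stream time advances with the device's own clock, it is not fooled by callbacks that ran ahead of the hardware during CPU starvation.
#include <portaudio.h>
/* stream is the PaStream* you opened; streamStartTime is assumed to hold
 * Pa_GetStreamTime(stream) captured right after Pa_StartStream(stream). */
double getPlaySec(PaStream *stream, PaTime streamStartTime)
{
    /* Pa_GetStreamTime returns the stream's current time in seconds, on the same
     * clock as the timestamps delivered to the stream callback. */
    return Pa_GetStreamTime(stream) - streamStartTime;
}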
I know this is old, but still: PortAudio v19+ can report the stream's actual sample rate. You should use that for audio sync, since the actual playback sample rate can differ between hardware; PortAudio might try to compensate (depending on the implementation). If you have drift problems, try using that.
