Audio slow motion like the default Slo-Mo Camera functionality using 240 FPS - iOS

I want to implement slow-motion video like the default Slo-Mo functionality in the Camera app. I used the following code and it works fine for the video,
but the audio track of that video is not handled properly.
double videoScaleFactor = 8.0;
[compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale)];
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale)];
This works properly for video slow motion, but audio slow motion does not work.
Please help.

I found the solution for audio slow motion:
double videoScaleFactor = 8.0;
[compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale)];
This scales the audio correctly, but the result does not play back properly in AVPlayer.
For that you have to set the following property on your AVPlayerItem:
// composition: the AVMutableComposition whose tracks were scaled above
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed;
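Pulling the pieces together, here is a minimal sketch of the whole flow, assuming a source AVAsset named asset with one video and one audio track (the function name and the 8x factor are illustrative, not code from the question):

#import <AVFoundation/AVFoundation.h>

// Sketch only: builds a slowed-down composition and a player item for it.
static AVPlayerItem *SlowMotionItemForAsset(AVAsset *asset, double videoScaleFactor)
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime videoDuration = asset.duration;
    CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, videoDuration);
    NSError *error = nil;

    // Copy the source tracks into the composition.
    [videoTrack insertTimeRange:fullRange
                        ofTrack:[asset tracksWithMediaType:AVMediaTypeVideo][0]
                         atTime:kCMTimeZero
                          error:&error];
    [audioTrack insertTimeRange:fullRange
                        ofTrack:[asset tracksWithMediaType:AVMediaTypeAudio][0]
                         atTime:kCMTimeZero
                          error:&error];

    // Stretch both tracks to videoScaleFactor times their original duration.
    CMTime stretched = CMTimeMake((CMTimeValue)(videoDuration.value * videoScaleFactor),
                                  videoDuration.timescale);
    [videoTrack scaleTimeRange:fullRange toDuration:stretched];
    [audioTrack scaleTimeRange:fullRange toDuration:stretched];

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    // Without this, AVPlayer tends to drop or mangle the stretched audio.
    playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed;
    return playerItem;
}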

Related

Inserting slowMo AVComposition into AVMutableComposition

PHPhotoLibrary gives an AVComposition rather than an AVURLAsset for a video recorded in SlowMo. I want to insert slowMo videos into another AVMutableComposition, so this means I need to insert this AVComposition into the AVMutableComposition that is my editing timeline. The hack I used before was to load the tracks and segments and find the mediaURL of the asset.
AVCompositionTrack *track = [avAsset tracks][0];
AVCompositionTrackSegment *segment = track.segments[0];
mediaURL = [segment sourceURL];
Once I had mediaURL, I was able to create a new AVAsset that could be inserted into AVMutableComposition. But I wonder if there is a cleaner approach that allows the slowMo video composition to be directly inserted into the timeline AVMutableComposition?
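A cleaner route that may be worth trying (a sketch only, untested against slow-mo assets; timeline is a hypothetical editing AVMutableComposition): insertTimeRange:ofAsset:atTime:error: accepts any AVAsset, and an AVComposition is an AVAsset, so the slow-mo composition can in principle be inserted without digging out the sourceURL.

// Sketch: insert the slow-mo AVComposition straight into the editing timeline.
// "timeline" is a hypothetical AVMutableComposition; avAsset is the
// AVComposition returned by PHPhotoLibrary.
NSError *error = nil;
BOOL ok = [timeline insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                            ofAsset:avAsset
                             atTime:timeline.duration
                              error:&error];
if (!ok) {
    NSLog(@"Failed to insert slow-mo composition: %@", error);
}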

How to speed up audio playback in MPMoviePlayerController

I am using MPMoviePlayerController in my app to play video and audio. I want to give the user an option to play the audio/video slower or faster than normal speed, i.e. 0.5x (slower than normal), 1x (normal speed), 2x (double speed).
I want to know whether there is any way to speed up or slow down MPMoviePlayerController streaming so the user can listen to or view the audio/video at a slower or faster speed.
I found the solution to this problem myself.
When you create an MPMoviePlayerController instance, it has a property called currentPlaybackRate. It is set to 1.0 by default, which means normal playback. If you set it to 1.5 or 2.0, the currently playing audio is played at that speed.
See the following code.
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] init];
moviePlayer.movieSourceType = MPMovieSourceTypeStreaming;
moviePlayer.contentURL = [NSURL URLWithString:@"http://someduumyUrl.com"];
moviePlayer.controlStyle = MPMovieControlStyleDefault;
[moviePlayer prepareToPlay];
moviePlayer.currentPlaybackRate = 2.0; // will play the audio at double speed
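If you are already on AVPlayer (as in the other questions here), the equivalent knob is the rate property, optionally paired with audioTimePitchAlgorithm. A small sketch, with a placeholder URL:

// Placeholder URL; must point to a playable resource (.mp4, .m3u8, ...).
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:
                      [NSURL URLWithString:@"http://someduumyUrl.com/video.mp4"]];
// TimeDomain keeps speech intelligible at moderate rate changes.
item.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmTimeDomain;
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
player.rate = 2.0; // 2.0 = double speed, 0.5 = half speed; setting rate starts playback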

insertEmptyTimeRange to AVMutableCompositionTrack not working

I'm stitching videos together in an AVMutableCompositionTrack, using this:
AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
I'm also adding a CALayer with text and images to the composition, using an animationLayer.
At the beginning, I add 5 seconds of nothing to insert a title using insertEmptyTimeRange.
Up to here, everything's working fine.
Now I want to add some «nothing» to the end of the video, using insertEmptyTimeRange again - but that fails miserably.
CMTime creditsDuration = CMTimeMakeWithSeconds(5, 600);
CMTimeRange creditsRange = CMTimeRangeMake([[compositionVideoTrack asset] duration], creditsDuration);
[compositionVideoTrack insertEmptyTimeRange:creditsRange];
[compositionAudioTrack insertEmptyTimeRange:creditsRange];
NSLog(#"credit-range %f from %f", CMTimeGetSeconds(creditsRange.duration), CMTimeGetSeconds(creditsRange.start));
NSLog(#"Total duration %f", CMTimeGetSeconds([[compositionVideoTrack asset] duration]));
The insert-points are correct (first NSLog), but the total duration won't get extended...
Any ideas what I could be doing wrong?
Turns out it seems to be impossible to add an empty time range at the end of an AVMutableComposition.
This answer saved my life: AVMutableComposition of a Solid Color with No AVAsset
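For reference, here is one workaround sketch, assuming you ship a short blank clip with the app (blank.mp4 is a hypothetical bundled resource, at least 5 seconds long; the linked answer builds the solid color with a video composition instead): append real footage at the end so the track genuinely gets longer, then draw the credits CALayer over that stretch.

// Hypothetical bundled resource; any short black/blank clip works.
NSURL *blankURL = [[NSBundle mainBundle] URLForResource:@"blank" withExtension:@"mp4"];
AVAsset *blankAsset = [AVAsset assetWithURL:blankURL];

CMTime creditsStart = [[compositionVideoTrack asset] duration];
CMTime creditsDuration = CMTimeMakeWithSeconds(5, 600);
NSError *error = nil;

// Appending actual (blank) footage extends the composition, unlike
// insertEmptyTimeRange at the tail.
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, creditsDuration)
                               ofTrack:[blankAsset tracksWithMediaType:AVMediaTypeVideo][0]
                                atTime:creditsStart
                                 error:&error];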

iOS AVPlayer won't play

I have the following code set up to play a video in my iOS app, but it just doesn't play. The app compiles and runs, but all I get is a red frame where I positioned it. When debugging, I found that the program doesn't even step into the last line, [player play]. Also, the video runs fine in a UIWebView.
NSString *streamingString = [NSString stringWithFormat:@"http://youtu.be/...."];
AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:streamingString]];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
[layer setPlayer:player];
[layer setFrame:CGRectMake(50, 50, 400, 300)];
[layer setBackgroundColor:[UIColor redColor].CGColor];
[layer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[self.view.layer addSublayer:layer];
[player play];
I am a new programmer and this is my first post to Stackoverflow, so please excuse me if I have not given enough information or am missing something obvious! Thank you so much!
assetWithURL: is often, but not always, used with a URL that points to a file already on the device, often within the bundle. If you are using an external URL, such as one on the internet, the URL must resolve to a video in a format that AVPlayer can understand. A URL to a YouTube web page will not work. In general, you can't play a YouTube video in AVPlayer.
Check out this document from Apple and this video on YouTube.
EDIT:
Some developers have used this library to play a YouTube video in their app. Another option is to use a UIWebView.
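For comparison, a sketch of the same setup pointed at a URL AVPlayer can actually decode (the URL below is a placeholder for a direct .mp4 or HLS .m3u8 resource):

// Placeholder URL; must resolve to a direct media file or HLS playlist,
// not a YouTube web page.
NSURL *url = [NSURL URLWithString:@"https://example.com/videos/sample.mp4"];
AVPlayer *player = [AVPlayer playerWithURL:url];

AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = CGRectMake(50, 50, 400, 300);
layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:layer];

[player play];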

iOS AVAsset.duration is zero for HTTP live streaming but works for progressive

I have an iOS app that plays video from an HTTP Live Streaming playlist ("playlist.m3u8"), and I have a custom player built with AVPlayer. To handle normal user interactions such as scrubbing, I need to get the duration of the video. For some reason, on iOS 4.3 with Xcode 4.0, the following code gives me a CMTime that converts to NaN seconds. I can see why: CMTimeValue = 0 and CMTimeScale = 0, which yields the NaN, and CMTimeFlags = 17, which is even stranger.
Here's the code I use, which isn't complex at all:
AVPlayerItem *pItem = mPlayer.currentItem;
AVAsset* asset = pItem.asset;
CMTime d = asset.duration;
double duration = CMTimeGetSeconds(asset.duration);
I should also mention that I do monitor the status of the loading playlist to make sure it's ready before I start playing/scrubbing:
[mPlayer addObserver:self forKeyPath:@"currentItem.status" options:0 context:VideoPlaybackViewDelegateStatusContext];
Thanks for any help anyone can provide on this issue.
https://developer.apple.com/library/ios/releasenotes/AudioVideo/RN-AVFoundation-Old/#//apple_ref/doc/uid/TP40011199-CH1-SW4
The docs above mention that duration should now be obtained from the AVPlayerItem instance, rather than its corresponding AVAsset. To get the duration from the current player item via key-value observing, I use the following method (originally pulled from NGMoviePlayer which was written for iOS 4.0):
- (void)loadPlayerWithItem:(AVPlayerItem *)playerItem {
    self.player = [AVPlayer playerWithPlayerItem:playerItem];
    ...
    // changed this from the previous key path, currentItem.asset.duration
    [self.player addObserver:self
                  forKeyPath:@"currentItem.duration"
                     options:0
                     context:nil];
    ...
}
I implemented the above change in my player and the duration is working now! This change in AVFoundation was the root cause of the issue. CMTimeFlags = 17 indicates kCMTimeFlags_Valid | kCMTimeFlags_Indefinite (1 + 16), and the docs specify:
In particular, the duration reported by the URL asset for streaming-based media is typically kCMTimeIndefinite, while the duration of a corresponding AVPlayerItem may be different and may change while it plays.
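A sketch of the matching KVO callback, assuming the observer registered above on currentItem.duration (property and key path names as in the snippet):

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"currentItem.duration"]) {
        CMTime duration = self.player.currentItem.duration;
        // For live/HLS content the duration can stay indefinite for a while,
        // so guard before converting to seconds.
        if (!CMTIME_IS_VALID(duration) || CMTIME_IS_INDEFINITE(duration)) {
            return;
        }
        double seconds = CMTimeGetSeconds(duration);
        NSLog(@"Stream duration: %f", seconds);
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}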
