AVMutableComposition plays blank with AVPlayer but exports fine with AVAssetExportSession - iOS

I create an AVMutableComposition containing videos imported from the library. I pass an AVMutableVideoComposition containing an array of AVMutableVideoCompositionInstruction objects in order to transform (scale, rotate) and fade (opacity) between video tracks.
With most videos no problem occurs: I can play the composition using AVPlayer or export it using AVAssetExportSession without any issue. But for some videos, when the composition is passed to an AVPlayer I get a black screen, as if the timings of the different AVMutableVideoCompositionInstruction objects were set incorrectly. Yet when I export the exact same composition, containing the exact same instructions and timings, through an AVAssetExportSession, the movie exports perfectly. And if I play the imported video on its own as a plain AVAsset, AVPlayer plays it fine.
I tried raising the time scale of my movie to 60000 instead of 600, hoping the issue was the duration of the imported video (floating-point precision being lost), but that is not the case.
It happens under iOS 8; it may also happen under iOS 7, I don't know.
PS: I didn't post code because there are far too many lines.
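
For illustration (the post omits the actual code), here is a condensed sketch of the kind of setup described; all identifiers are illustrative, not the project's code:

    #import <AVFoundation/AVFoundation.h>

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    // ... insertTimeRange:ofTrack:atTime:error: for each imported clip ...

    AVMutableVideoCompositionInstruction *instruction =
        [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    // Instructions must tile the whole composition with no gaps or overlaps;
    // a gap tends to produce exactly this black-screen symptom in AVPlayer.
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);

    AVMutableVideoCompositionLayerInstruction *layer =
        [AVMutableVideoCompositionLayerInstruction
            videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [layer setTransform:CGAffineTransformMakeScale(0.5, 0.5) atTime:kCMTimeZero]; // illustrative scale
    [layer setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0
                                timeRange:instruction.timeRange]; // illustrative fade
    instruction.layerInstructions = @[ layer ];

    AVMutableVideoComposition *videoComposition =
        [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = @[ instruction ];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderSize = CGSizeMake(1280, 720);

    // For playback the video composition must be attached to the player item:
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
    item.videoComposition = videoComposition;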

Related

How can I change AVPlayer video quality while playing

I used this library for playing video from a URL, and I am able to play the video.
Now I want to change the quality of the currently playing video (low, medium, high).
The library uses AVPlayer; how can I change the quality with AVPlayer?
I have heard about preferredPeakBitRate but I have no idea how to use it.
Please help me with how I can do that.
You cannot set video quality directly on the AVPlayer; however, you can do this by accessing the videoComposition property of the AVPlayerItem that is then supplied to the AVPlayer (via, for example, the replaceCurrentItemWithPlayerItem: method or at AVPlayer initialization). So:
Create or get your AVPlayerItem's AVVideoComposition instance, and set its frameDuration, renderSize and renderScale properties. Take a look at the docs for more info.
Set it as the videoComposition property of your "movie" AVPlayerItem instance (again, see the docs for details).
Play that in your player.
If you want to do this on the fly while the movie is playing, I guess you would also have to adjust the player item's time.
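About the preferredPeakBitRate the question mentions: it is a real property on AVPlayerItem (iOS 8+), and for an HTTP Live Streaming source it is usually the more direct answer, since it caps the bit rate of the variants the player will select. A minimal sketch; the URL and the bit-rate values are illustrative:

    #import <AVFoundation/AVFoundation.h>

    NSURL *url = [NSURL URLWithString:@"https://example.com/stream.m3u8"];
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];

    // Limit the network bit rate (bits per second) the player may use.
    // For HLS this restricts playback to variants at or below this rate;
    // 0 means no limit. It can also be changed while playing.
    item.preferredPeakBitRate = 500000.0;     // "low"
    // item.preferredPeakBitRate = 1500000.0; // "medium"
    // item.preferredPeakBitRate = 0.0;       // "high" / unrestricted

    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    [player play];

Note that this only helps when the source actually offers multiple quality levels (e.g. an HLS playlist); a single fixed-quality file has nothing to switch to.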

iOS: How to apply an audio effect to a recorded video

I am developing an application that requires applying audio effects to a recorded video.
I am recording the video using the GPUImage library, and that part works fine. Now I need to apply audio effects like Chipmunk, Gorilla, Large Room, etc.
I looked into Apple's documentation, and it says that AVAudioEngine can't apply an AVAudioUnitTimePitch to an input node (such as the microphone).
To work around this, I use the following mechanism:
1. Record the video and audio at the same time.
2. Play the video back. While it plays, start an AVAudioEngine on the audio file and apply an AVAudioUnitTimePitch to it (see the sketch after this list):
   [playerNode play]; // Start playing audio file with video preview
3. Merge the video and the newly effected audio file.
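
For reference, a minimal sketch of step 2, assuming the recorded audio has already been extracted to a file; the file path and pitch value are illustrative:

    #import <AVFoundation/AVFoundation.h>

    NSError *error = nil;
    NSURL *audioURL = [NSURL fileURLWithPath:@"recorded_audio.caf"];
    AVAudioFile *audioFile =
        [[AVAudioFile alloc] initForReading:audioURL error:&error];

    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    AVAudioPlayerNode *playerNode = [[AVAudioPlayerNode alloc] init];
    AVAudioUnitTimePitch *timePitch = [[AVAudioUnitTimePitch alloc] init];
    timePitch.pitch = 1000.0; // in cents; a large positive shift for a "Chipmunk" feel

    [engine attachNode:playerNode];
    [engine attachNode:timePitch];
    // playerNode -> timePitch -> main mixer
    [engine connect:playerNode to:timePitch format:audioFile.processingFormat];
    [engine connect:timePitch to:engine.mainMixerNode
             format:audioFile.processingFormat];

    [playerNode scheduleFile:audioFile atTime:nil completionHandler:nil];
    [engine startAndReturnError:&error];
    [playerNode play]; // start the effected audio alongside the video preview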
Problem:
The user has to sit through a full preview of the video for the audio effect to be merged. This is not a good solution.
If I set the volume of the playerNode to 0 (zero), then it records a muted video.
Please suggest a better way to do this. Thanks in advance.

AVPlayer does not play back AVComposition with more than 2 clips

I am attempting to stitch together video assets using AVComposition, based on the code here:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
On OS X it works perfectly; however, on iOS, playback via AVPlayer only works with 1 or 2 input clips. If I attempt to add a third, nothing is played back on the AVPlayerLayer. Weirdly, if I observe the AVPlayer playback time using addPeriodicTimeObserverForInterval:queue:usingBlock:, the video appears to be playing for the correct duration, but nothing shows on the layer.
Does anyone have any insight into why this would be?
It turns out I was creating CMTime objects with differing timescale values, which was causing rounding errors and creating gaps in my tracks. If a track had a gap, it would simply fail to play. Ensuring that all my CMTime objects used the same timescale made everything work perfectly.
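
For illustration, a sketch of that fix: normalize every CMTime to one common timescale before inserting, so the insertion cursor never drifts and leaves a gap. Here assets stands in for an NSArray of the AVAsset clips to stitch:

    #import <AVFoundation/AVFoundation.h>

    static const int32_t kCommonTimescale = 600;

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = CMTimeMake(0, kCommonTimescale);
    for (AVAsset *asset in assets) { // assets: NSArray<AVAsset *>, assumed to exist
        AVAssetTrack *sourceTrack =
            [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        // Convert the clip duration to the common timescale before inserting.
        CMTime duration = CMTimeConvertScale(asset.duration, kCommonTimescale,
                                             kCMTimeRoundingMethodQuickTime);
        NSError *error = nil;
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration)
                            ofTrack:sourceTrack
                             atTime:cursor
                              error:&error];
        cursor = CMTimeAdd(cursor, duration);
    }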

Losing audio after processing in AVComposition

I am using AVMutableComposition to combine two pieces of media (one video, one graphic overlay). The video was captured using UIImagePickerController.
The video records fine, and the audio is there when previewing the recording.
I then process it with the composition and an export session. The video saves fine (with the overlay), but there is no audio.
This is on iOS 7.
I'm not doing anything specific with audio in the composition; I just assumed it would "come along" with the video file. Is that accurate, or do I need to create a dedicated audio track in the composition?
After much research, I found the solution in another Stack Overflow question (and answer):
iOS AVFoundation Export Session is missing audio.
Many thanks to that user.
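
The gist of that fix, for reference: audio does not "come along" automatically, so the composition needs its own audio track. A minimal sketch, with videoAsset standing in for the recorded asset:

    #import <AVFoundation/AVFoundation.h>

    AVMutableComposition *composition = [AVMutableComposition composition];
    NSError *error = nil;
    CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:fullRange
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                         atTime:kCMTimeZero
                          error:&error];

    // The step that is easy to miss: a dedicated audio track.
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioTrack insertTimeRange:fullRange
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                         atTime:kCMTimeZero
                          error:&error];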

Merge audio with video perfectly

I've been trying to merge audio with video using AVMutableComposition and AVAssetExportSession. Everything works perfectly except that the audio and video sources don't have the same duration.
So the exported movie is a bit out of sync. Is there any way to resize or redefine the rate of the video so that its duration becomes exactly equal to the audio's duration? For example, if the audio lasts 10 seconds and the video lasts 9 seconds, I'd like to play the video back at 9/10 speed so they both end at the same time.
Solved.
Use something like this:
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) toDuration:audioAsset.duration];
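
In context, a sketch of the whole merge, assuming videoAsset and audioAsset are already-loaded AVAssets:

    #import <AVFoundation/AVFoundation.h>

    AVMutableComposition *composition = [AVMutableComposition composition];
    NSError *error = nil;

    AVMutableCompositionTrack *compositionVideoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                                    atTime:kCMTimeZero
                                     error:&error];

    AVMutableCompositionTrack *compositionAudioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                   ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                                    atTime:kCMTimeZero
                                     error:&error];

    // Retime the video so both tracks end together; e.g. a 9 s video
    // stretched to a 10 s audio duration plays at 9/10 speed.
    [compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               toDuration:audioAsset.duration];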
