iOS video recording using UIImages arriving at random times

I'm developing an iOS app that receives UIImages at random times over an internet connection and progressively constructs a video file from them as the images come in. I have it partially working, but the fact that the images don't arrive at a constant rate is messing up the video.
How do I recalculate CMTime when each new UIImage arrives, so that it accounts for the varying arrival rate of the UIImages, which can come anywhere from milliseconds to seconds apart?
Here is what I'm doing so far; some code is omitted, but this is the basic idea:
...
adaptor = [AVAssetWriterInputPixelBufferAdaptor
           assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoStream
           sourcePixelBufferAttributes:attributes];
CMTime frameTime = CMTimeMake(1, 10); // assumed initial frame rate
...
-(void)addImageToMovie:(UIImage *)img {
    append_ok = FALSE;
    buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:img.size];
    while (!append_ok) {
        if (adaptor.assetWriterInput.readyForMoreMediaData) {
            frameTime.value += 1;
            append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            [NSThread sleepForTimeInterval:0.01];
        } else {
            [NSThread sleepForTimeInterval:0.01];
        }
    }
    if (buffer) {
        CVBufferRelease(buffer);
    }
}

It depends on how many frame intervals have actually passed. Instead of always adding 1 to frameTime.value, add the number of ticks that corresponds to the real gap since the previous image (for example, 10 if a full second went by at a timescale of 10).
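One way to do that, sketched below under the assumption that you keep an NSDate ivar (here called startDate, a hypothetical name) set when the writer session starts at kCMTimeZero, is to stamp each frame with the wall-clock time at which its image arrived rather than with a fixed increment:
// Hypothetical sketch, not the original poster's code: time-stamp frames by arrival time.
// Assumes self.startDate is set when startSessionAtSourceTime:kCMTimeZero is called.
-(void)addImageToMovie:(UIImage *)img {
    buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:img.size];

    // Seconds elapsed since the session started, however irregularly images arrive.
    NSTimeInterval elapsed = [[NSDate date] timeIntervalSinceDate:self.startDate];

    // 600 is the timescale Apple usually recommends for video.
    CMTime presentTime = CMTimeMakeWithSeconds(elapsed, 600);

    while (!adaptor.assetWriterInput.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.01];
    }
    [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];

    if (buffer) {
        CVBufferRelease(buffer);
    }
}
With this approach the presentation timestamps reflect the actual spacing of the incoming images, so the resulting video plays back at the rate the frames arrived.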

Related

Recording video at 25 frames per second in iOS

I'm building a custom camera to film at Full HD or plain HD quality. The issue is that after I set the camera to 25 frames per second with the following code:
- (void)setFrameRate:(AVCaptureDevice *)camera {
    NSError *error;
    if (![camera lockForConfiguration:&error]) {
        NSLog(@"Could not lock device %@ for configuration: %@", camera, error);
        return;
    }
    AVCaptureDeviceFormat *format = camera.activeFormat;
    double epsilon = 0.00000001;
    int desiredFrameRate = 25;
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        if (range.minFrameRate <= (desiredFrameRate + epsilon) &&
            range.maxFrameRate >= (desiredFrameRate - epsilon)) {
            [camera setActiveVideoMaxFrameDuration:CMTimeMake(10, desiredFrameRate * 10)];
            [camera setActiveVideoMinFrameDuration:CMTimeMake(10, desiredFrameRate * 10)];
            break;
        }
    }
    [camera unlockForConfiguration];
}
It changes the video fps, but not to exactly the 25 frames per second I set in the method. It fluctuates between 23.93 and 25.50 frames per second.
Does anyone know why?
After several attempts and some debugging, I found out that the frame rate not being exactly 25 fps has to do with the recording method, not with the device setup.
I was using an AVAssetWriter object to record the video, as in the example shown at the following link (https://reformatcode.com/code/ios/ios-8-film-from-both-back-and-front-camera).
But there was no way to get exactly 25 fps with it.
Switching the recording object to AVCaptureMovieFileOutput made setup and recording much easier, and the result is much more precise: between 25 and 25.01 fps.
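For reference, a minimal sketch of that switch, assuming an already-configured AVCaptureSession in a variable named session (the names here are assumptions, not code from the answer):
// Hypothetical sketch: record with AVCaptureMovieFileOutput instead of AVAssetWriter.
// Assumes `session` is a running AVCaptureSession whose video device already has
// its min/max frame duration set to 1/25 s as in the question.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}

NSURL *outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]];

// self must conform to AVCaptureFileOutputRecordingDelegate.
[movieOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

// ...and later, to finish the file:
[movieOutput stopRecording];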

How to get a frame from a video on iOS

In my app I want to take frames from a video in order to filter them. I try to grab a frame from the video at a given time offset. This is my code:
- (UIImage *)getVideoFrameForTime:(NSDate *)time {
    CGImageRef thumbnailImageRef = NULL;
    NSError *igError = nil;
    NSTimeInterval timeinterval = [time timeIntervalSinceDate:self.videoFilterStart];
    CMTime atTime = CMTimeMakeWithSeconds(timeinterval, 1000);
    thumbnailImageRef = [self.assetImageGenerator copyCGImageAtTime:atTime
                                                         actualTime:NULL
                                                              error:&igError];
    if (!thumbnailImageRef) {
        NSLog(@"thumbnailImageGenerationError %@", igError);
    }
    UIImage *image = thumbnailImageRef ? [[UIImage alloc] initWithCGImage:thumbnailImageRef] : nil;
    if (thumbnailImageRef) {
        CGImageRelease(thumbnailImageRef); // copyCGImageAtTime returns a +1 reference
    }
    return image;
}
Unfortunately, I only get frames located at integer seconds: 1, 2, 3, ... even when the time interval is non-integer (1.5, etc.).
How can I get frames at a non-integer interval?
Thanks to @shallowThought I found the answer in this question: Grab frames from video using Swift.
You just need to add these two lines:
assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero;
assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero;
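Put together, a minimal sketch of the generator setup with those tolerances (the videoURL variable is an assumption, not part of the original answer):
// Hypothetical sketch: configure AVAssetImageGenerator for frame-accurate extraction.
AVAsset *asset = [AVAsset assetWithURL:videoURL]; // videoURL is assumed
AVAssetImageGenerator *assetImgGenerate = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
assetImgGenerate.appliesPreferredTrackTransform = YES;

// Without these, the generator may return the nearest keyframe,
// which is why only whole-second frames came back.
assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero;
assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero;

NSError *error = nil;
CMTime atTime = CMTimeMakeWithSeconds(1.5, 600); // non-integer offsets now work
CGImageRef frame = [assetImgGenerate copyCGImageAtTime:atTime actualTime:NULL error:&error];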
For more frame details you can use the corresponding project on GitHub: iFrameExtractor.git
If I remember correctly, NSDate's accuracy only goes up to the second, which would explain why frames are only taken at integer seconds. You'll have to use a different type of input to get frames at non-integer seconds.

How to create several mp4 files with AVAssetWriter at the same time

I'm trying to save four video streams as .mp4 files on the iPhone with AVAssetWriter. With three streams everything works fine, but the fourth mp4 file is always empty.
Here is a piece of my code:
-(void)writeImagesToMovie:(CVPixelBufferRef)buffer :(int)cameraID
{
    AVAssetWriterInput *writerInput;
    AVAssetWriterInputPixelBufferAdaptor *adaptor;
    int *frameNumber;
    switch (cameraID) {
        case 1:
            writerInput = writerInput1;
            adaptor = adaptor1;
            frameNumber = &frameNumber1;
            break;
        case 2:
            writerInput = writerInput2;
            adaptor = adaptor2;
            frameNumber = &frameNumber2;
            break;
        case 3:
            writerInput = writerInput3;
            adaptor = adaptor3;
            frameNumber = &frameNumber3;
            break;
        default:
            writerInput = writerInput4;
            adaptor = adaptor4;
            frameNumber = &frameNumber4;
            break;
    }
    if (writerInput.readyForMoreMediaData) {
        CMTime frameTime = CMTimeMake(1, 30); //150, 600
        // CMTime = Value and Timescale.
        // Timescale = the number of tics per second you want.
        // Value is the number of tics.
        // Here each frame we add will be 1/30th of a second.
        // Apple recommends 600 tics per second for video because it is a
        // multiple of the standard video rates 24, 30, 60 fps etc.
        CMTime lastTime = CMTimeMake((int64_t)*frameNumber, 30);
        CMTime presentTime = CMTimeAdd(lastTime, frameTime);
        if (*frameNumber == 0) { presentTime = CMTimeMake(0, 30); } //600
        // This ensures the first frame starts at 0.
        // Give the image to the AVAssetWriter to add to your video.
        [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
        *frameNumber = *frameNumber + 1;
    }
}
I call this method in a loop, and in each run I pass in an image that should be written to one of the four mp4 files.
The last AVAssetWriterInput that was called does get an image, but its file remains empty.
If I change the order of the calls, it is always the last AVAssetWriterInput called that leaves an empty file.
Any ideas?

Why does AVSampleBufferDisplayLayer stop showing CMSampleBuffers taken from AVCaptureVideoDataOutput's delegate?

I want to display some CMSampleBuffers with an AVSampleBufferDisplayLayer, but it freezes after showing the first sample.
I get the sample buffers from the AVCaptureVideoDataOutput sample buffer delegate:
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CFRetain(sampleBuffer);
    [self imageToBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
}
and put them into a vector:
-(void)imageToBuffer:(CMSampleBufferRef)source {
    // buffers is defined as: std::vector<CMSampleBufferRef> buffers;
    CMSampleBufferRef newRef;
    CMSampleBufferCreateCopy(kCFAllocatorDefault, source, &newRef);
    buffers.push_back(newRef);
}
Then I try to show them via AVSampleBufferDisplayLayer (in another view controller):
AVSampleBufferDisplayLayer *displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
displayLayer.bounds = self.view.bounds;
displayLayer.position = CGPointMake(CGRectGetMidX(self.displayOnMe.bounds), CGRectGetMidY(self.displayOnMe.bounds));
displayLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
displayLayer.backgroundColor = [[UIColor greenColor] CGColor];
[self.view.layer addSublayer:displayLayer];
self.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;

dispatch_queue_t queue = dispatch_queue_create("My queue", DISPATCH_QUEUE_SERIAL);

[displayLayer setNeedsDisplay];
[displayLayer requestMediaDataWhenReadyOnQueue:queue
                                    usingBlock:^{
    while ([displayLayer isReadyForMoreMediaData]) {
        if (samplesKey < buffers.size()) {
            CMSampleBufferRef buf = buffers[samplesKey];
            [displayLayer enqueueSampleBuffer:buf];
            samplesKey++;
        } else {
            [displayLayer stopRequestingMediaData];
            break;
        }
    }
}];
But it shows the first sample, then freezes and does nothing.
My video data output settings are as follows:
// set up our output
self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t queue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
[_videoDataOutput setSampleBufferDelegate:self queue:queue];
[_videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
    nil]];
I came across this problem in the same context, trying to take the output from AVCaptureVideoDataOutput and display it in an AVSampleBufferDisplayLayer.
If your frames come out in display order, the fix is very easy: just set the display-immediately attachment on the CMSampleBufferRef.
Get the sample buffer returned by the delegate and then:
CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
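For the capture-delegate setup in the question, a minimal sketch of where that attachment could be applied (assuming the display layer is reachable from the delegate as self.displayLayer, which is not part of the original code):
// Hypothetical sketch: mark frames as display-immediately in the capture delegate
// and enqueue them straight into the layer (self.displayLayer is assumed).
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
    CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

    if (self.displayLayer.isReadyForMoreMediaData) {
        [self.displayLayer enqueueSampleBuffer:sampleBuffer];
    }
}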
If your frames come out in encoder order (not display order), then the timestamps on the CMSampleBuffers need to be zero-biased and restamped so that the first frame's timestamp is equal to time 0:
double pts = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
// ptsStart is the first frame's presentationTimeStamp, so playback starts from time 0.
CMTime presentationTimeStamp = CMTimeMake((pts - ptsStart) * 1000000, 1000000);
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);
Update:
I ran into a situation where some video still wasn't playing smoothly when I used the zero-bias method, so I investigated further. The correct approach seems to be to use the PTS of the first frame you intend to play.
My answer is here, but I will post it here too:
Set rate at which AVSampleBufferDisplayLayer renders sample buffers
The timebase needs to be set to the presentation timestamp (PTS) of the first frame you intend to decode. I was indexing the PTS of the first frame to 0 by subtracting the initial PTS from all subsequent ones and setting the timebase to 0; for whatever reason, that didn't work with certain video.
You want something like this (called before a call to decode):
CMTimebaseRef controlTimebase;
CMTimebaseCreateWithMasterClock( CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase );
displayLayer.controlTimebase = controlTimebase;
// Set the timebase to the initial pts here
CMTimebaseSetTime(displayLayer.controlTimebase, CMTimeMake(ptsInitial, 1));
CMTimebaseSetRate(displayLayer.controlTimebase, 1.0);
Set the PTS for the CMSampleBuffer...
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);
And maybe make sure display immediately isn't set....
CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanFalse);
This is covered very briefly in WWDC 2014 Session 513.

What is the default buffer size of AVPlayer?

What are the default maximum and minimum buffer sizes of the iOS AVPlayer? How can I increase them for mp3 streaming? Thank you.
If by buffer size you mean the playback buffer, then I believe this size is undocumented.
This behaviour is provided by AVFoundation; quoting from Advances in AVFoundation Playback:
"And we see that playback starts but then we quickly run into a stall because we didn't have enough buffer to play to the end. In that case, we'll go into the waiting state and re-buffer until we have enough to play through."
They only say "enough", not an exact time or size. It makes sense that this is a dynamic amount that depends on current network conditions, but who knows.
Back to the topic: if you would like to take control of buffering, you can observe AVPlayerItem's loadedTimeRanges property.
The code snippet below uses a 5-second buffer threshold:
if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
    NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey];
    if (timeRanges && [timeRanges count]) {
        CMTimeRange timerange = [[timeRanges objectAtIndex:0] CMTimeRangeValue];
        if (self.audioPlayer.rate == 0 && !pauseReasonForced) {
            CMTime bufferedTime = CMTimeAdd(timerange.start, timerange.duration);
            CMTime milestone = CMTimeAdd(self.audioPlayer.currentTime, CMTimeMakeWithSeconds(5.0f, timerange.duration.timescale));
            if (CMTIME_COMPARE_INLINE(bufferedTime, >, milestone) && self.audioPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay && !interruptedWhilePlaying && !routeChangedWhilePlaying) {
                if (![self isPlaying]) {
                    [self.audioPlayer play];
                }
            }
        }
    }
}
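For completeness, a minimal sketch of how that observer might be registered (self.audioPlayer is carried over from the snippet above; the options and context values are assumptions):
// Hypothetical sketch: register for loadedTimeRanges changes on the current item.
[self.audioPlayer.currentItem addObserver:self
                               forKeyPath:@"loadedTimeRanges"
                                  options:NSKeyValueObservingOptionNew
                                  context:nil];

// Remember to remove the observer before the item goes away:
// [self.audioPlayer.currentItem removeObserver:self forKeyPath:@"loadedTimeRanges"];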
