How to set videoMaximumDuration for AVCaptureSession in iOS - ios

I am using AVCaptureSession to record video, but I am not able to set a maximum video length. With UIImagePickerController there is a videoMaximumDuration property for setting the maximum video duration, but how can I set a maximum duration with AVCaptureSession? Please help me. Thanks in advance.

You can set the maximum duration via the maxRecordedDuration property of your AVCaptureMovieFileOutput.
Here's an example.
self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

Float64 maximumVideoLength = 60;  // whatever value you wish as the maximum, in seconds
int32_t preferredTimeScale = 30;  // frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(maximumVideoLength, preferredTimeScale);
self.movieFileOutput.maxRecordedDuration = maxDuration;
self.movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; // stop recording if free disk space drops below 1 MB

if ([self.captureSession canAddOutput:self.movieFileOutput]) {
    [self.captureSession addOutput:self.movieFileOutput];
}
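Note that when the limit is reached, recording stops automatically and your AVCaptureFileOutputRecordingDelegate is called with an AVErrorMaximumDurationReached error; the file recorded up to that point is still usable. A minimal sketch of handling that callback:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    BOOL success = YES;
    if (error) {
        // Hitting maxRecordedDuration reports an error, but the file may
        // still have finished writing successfully; this key tells you.
        success = [error.userInfo[AVErrorRecordingSuccessfullyFinishedKey] boolValue];
    }
    if (success) {
        // handle the recorded movie at outputFileURL
    }
}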
I hope this answers your question.

Related

Set OpenCV's iOS CvVideoCamera default fps above 30

Whenever I try to set CvVideoCamera's default fps above 30, it stays at 30 fps. It lets me set it lower, but nothing above 30 fps. I'm using an iPhone 7, so I know the hardware is capable of shooting 1920x1080 video at 60 fps. I have looked into using AVCaptureSession directly, but OpenCV's CvVideoCamera allows easy access to and processing of individual frames, so I would like to stick with it if at all possible.
self.videoCamera = [[CvVideoCamera alloc] initWithParentView:self.videoPreviewView];
self.videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
self.videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset1920x1080;
self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationLandscapeLeft;
self.videoCamera.defaultFPS = 60; //This still sets it to 30 FPS
self.videoCamera.grayscaleMode = NO;
self.videoCamera.delegate = self;
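One thing worth trying (a sketch only, not verified against CvVideoCamera's internals): defaultFPS is applied to the capture device's frame durations, but going above 30 fps generally requires first switching the device's activeFormat to one whose frame-rate ranges include 60 fps. Something like this, run after the camera has started:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
for (AVCaptureDeviceFormat *format in device.formats) {
    // look for a 1920x1080 format that supports 60 fps
    CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
    if (dims.width != 1920 || dims.height != 1080) continue;
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        if (range.maxFrameRate >= 60) {
            NSError *error = nil;
            if ([device lockForConfiguration:&error]) {
                device.activeFormat = format;
                device.activeVideoMinFrameDuration = CMTimeMake(1, 60);
                device.activeVideoMaxFrameDuration = CMTimeMake(1, 60);
                [device unlockForConfiguration];
            }
            break;
        }
    }
}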

How to get frame from video on iOS

In my app I want to take frames from a video in order to filter them. I try to take a frame from the video by time offset. This is my code:
- (UIImage *)getVideoFrameForTime:(NSDate *)time {
    CGImageRef thumbnailImageRef = NULL;
    NSError *igError = nil;
    NSTimeInterval timeinterval = [time timeIntervalSinceDate:self.videoFilterStart];
    CMTime atTime = CMTimeMakeWithSeconds(timeinterval, 1000);
    thumbnailImageRef = [self.assetImageGenerator copyCGImageAtTime:atTime
                                                         actualTime:NULL
                                                              error:&igError];
    if (!thumbnailImageRef) {
        NSLog(@"thumbnailImageGenerationError %@", igError);
    }
    UIImage *image = thumbnailImageRef ? [[UIImage alloc] initWithCGImage:thumbnailImageRef] : nil;
    if (thumbnailImageRef) {
        CGImageRelease(thumbnailImageRef); // copyCGImageAtTime returns a +1 reference
    }
    return image;
}
Unfortunately, I only see frames located at integer seconds: 1, 2, 3... even when the time interval is non-integer (1.5, etc.).
How can I get frames at a non-integer interval?
Thanks to @shallowThought I found the answer in this question: Grab frames from video using Swift.
You just need to add these two lines:
assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero;
assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero;
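Putting it all together, a minimal sketch (videoURL and the 1.5 s offset are illustrative):
AVAsset *asset = [AVAsset assetWithURL:videoURL];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceBefore = kCMTimeZero; // no snapping to nearby keyframes
generator.requestedTimeToleranceAfter = kCMTimeZero;

NSError *error = nil;
CMTime time = CMTimeMakeWithSeconds(1.5, 1000); // 1.5 s at a 1000-units-per-second timescale
CGImageRef cgImage = [generator copyCGImageAtTime:time actualTime:NULL error:&error];
if (cgImage) {
    UIImage *frame = [[UIImage alloc] initWithCGImage:cgImage];
    CGImageRelease(cgImage); // copyCGImageAtTime returns a +1 reference
    // use frame...
}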
Use this project to get more frame details; the corresponding project on GitHub: iFrameExtractor.git
If I remember correctly, NSDate's accuracy only goes up to the second, which would explain why frames are only taken at integer seconds. You'll have to use a different type of input to get frames at non-integer seconds.

How to get the available video dimensions/quality from a video url iOS?

I am creating a custom video player using AVPlayer in iOS (Objective-C). I have a settings button which, when tapped, will display the available video dimensions and audio formats.
Below is the design (mockup image omitted).
So, I want to know:
1). How do I get the available dimensions from a video URL (not a local video)?
2). Even if I am able to get the dimensions, can I switch between the available dimensions while playing in AVPlayer?
Can anyone give me a hint?
If it is not an HLS (streaming) video, you can get the resolution information with the following code.
Sample code:
// player is playing
if (_player.rate != 0 && _player.error == nil) {
    AVAssetTrack *track = [[_player.currentItem.asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (track != nil) {
        CGSize naturalSize = [track naturalSize];
        // apply the track's transform; the result can be negative, so take the absolute value
        naturalSize = CGSizeApplyAffineTransform(naturalSize, track.preferredTransform);
        NSInteger width = (NSInteger)fabs(naturalSize.width);
        NSInteger height = (NSInteger)fabs(naturalSize.height);
        NSLog(@"Resolution : %ld x %ld", (long)width, (long)height);
    }
}
However, for HLS video, the code above does not work, so I solved this in a different way: while the video plays, I grab the current frame and calculate the resolution from that.
Here is the sample code:
// player is playing
if (_player.rate != 0 && _player.error == nil) {
    CMTime currentTime = _player.currentItem.currentTime;
    CVPixelBufferRef buffer = [_videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
    if (buffer != NULL) {
        NSInteger width = CVPixelBufferGetWidth(buffer);
        NSInteger height = CVPixelBufferGetHeight(buffer);
        NSLog(@"Resolution : %ld x %ld", (long)width, (long)height);
        CVPixelBufferRelease(buffer); // copyPixelBufferForItemTime returns a +1 reference
    }
}
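For completeness, a sketch of the _videoOutput setup assumed above (the BGRA pixel format is an illustrative choice); the output must be attached to the player item before pixel buffers can be copied:
NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
[_player.currentItem addOutput:_videoOutput];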
As you mentioned that it is not a local video, you can call a web service that returns the available video dimensions for that particular video. After that, change the URL to the other available video and seek to the current position.
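A sketch of that switching step (lowResURL stands in for whatever URL your service returns):
CMTime current = _player.currentTime;
AVPlayerItem *newItem = [[AVPlayerItem alloc] initWithURL:lowResURL]; // hypothetical rendition URL
[_player replaceCurrentItemWithPlayerItem:newItem];
[_player seekToTime:current toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero completionHandler:nil];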

UIVideoEditorController change size

I am capturing video using AVCaptureSession and then editing the video with UIVideoEditorController:
UIVideoEditorController *videoEditor = [[UIVideoEditorController alloc] init];
videoEditor.videoPath = recordedFile.path;
videoEditor.videoQuality = UIImagePickerControllerQualityTypeIFrame960x540;
videoEditor.delegate = self;
However, the output video size is always returned as 640x360.
Does anyone have an idea why?

AVSpeechSynthesizer postUtteranceDelay timing problems

The code below takes a list of directions and reads them out using AVSpeechSynthesizer. Once complete, the user will be able to select a variable amount of time, and the app will read out the instructions to fit that time span.
The problem is that when I press the play button, the delay between directions is significantly longer than it should be. Instead of the two minutes I've hardcoded, it takes over three. I've logged the value of all my postUtteranceDelays and they add up properly. It's also not due to processing time, because when I set postUtteranceDelay to 0 there is no pause between directions. I'm not sure what is going on.
- (IBAction)play:(UIButton *)sender {
    [sender setTitle:@"Showering" forState:UIControlStateNormal];
    Shower *shower = [[SpecificShower alloc] init];
    NSUInteger totalRatio = [shower calculateTotalRatio:shower];
    NSNumber *offset = @18.0;   // estimated time to speak the instructions combined
    NSNumber *seconds = @120.0; // hard-coded, but just for testing
    int totalSeconds = seconds.intValue - offset.intValue;
    self.synthesizer = [[AVSpeechSynthesizer alloc] init];

    for (NSDictionary *direction in shower.directions) {
        AVSpeechUtterance *aDirection = [[AVSpeechUtterance alloc] initWithString:direction[@"text"]];
        NSNumber *directionLength = direction[@"length"];
        aDirection.rate = .3;
        aDirection.preUtteranceDelay = 0;
        // totalRatio is calculated by adding all the lengths together,
        // then the individual direction length is divided by totalRatio
        // and that fraction is multiplied by the total number of seconds
        // to come up with the postUtteranceDelay for each direction
        aDirection.postUtteranceDelay = totalSeconds * [directionLength floatValue] / totalRatio;
        NSLog(@"%f", aDirection.postUtteranceDelay);
        [self.synthesizer speakUtterance:aDirection];
    }
}
You're not alone. This seems to be a bug, as noted with a workaround over here.
There are also radars filed here and here.
Let's hope this gets fixed soon.
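In the meantime, one possible workaround (my sketch, not the one from the linked answer) is to skip postUtteranceDelay entirely and schedule each next utterance from the synthesizer's delegate; pendingUtterances and pendingDelays are hypothetical mutable-array properties you would populate in play: instead of speaking immediately:
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer
 didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
    if (self.pendingUtterances.count == 0) return;
    AVSpeechUtterance *next = self.pendingUtterances.firstObject;
    [self.pendingUtterances removeObjectAtIndex:0];
    NSTimeInterval delay = [self.pendingDelays.firstObject doubleValue];
    [self.pendingDelays removeObjectAtIndex:0];
    // drive the pause with GCD instead of postUtteranceDelay
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delay * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [synthesizer speakUtterance:next];
    });
}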
