AVCaptureSession: get the same settings as the built-in iPhone camera - iOS

I've been stuck on this for quite some time. There appears to be a zoom factor applied when I record with the built-in camera app on my iPhone, but I cannot get the same result when I use an AVCaptureSession inside my app. Here is an example of what I'm talking about: the top was recorded with the iPhone camera app, and the bottom was recorded inside my app using AVCaptureSession. It's as if there is a fisheye effect. My code for setting up the camera is below.
- (void)configureCameraSession{
_captureSession = [AVCaptureSession new];
[_captureSession beginConfiguration];
_device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if(!_device){
[self noValidCamera];
return;
}
AVCaptureDeviceFormat *currentFormat;
int frameRate = 60;
for (AVCaptureDeviceFormat *format in _device.formats)
{
NSArray *ranges = format.videoSupportedFrameRateRanges;
AVFrameRateRange *frameRates = ranges[0];
//CMVideoFormatDescriptionGet
int resolutionWidth = CMVideoFormatDescriptionGetDimensions(format.formatDescription).width;
if(frameRates.maxFrameRate > 59 && resolutionWidth >= 1920){
CameraFormat *cformat = [[CameraFormat alloc] init];
cformat.format = format;
cformat.fps = 60;
if(frameRates.maxFrameRate > 119){
cformat.fps = 120;
}
if(frameRates.maxFrameRate > 239){
cformat.fps = 240;
}
NSString *resolution;
if(resolutionWidth > 1920){
resolution = @"4K ";
}
else{
resolution = [[NSNumber numberWithInt:CMVideoFormatDescriptionGetDimensions(format.formatDescription).height] stringValue];
resolution = [resolution stringByAppendingString:@"p "];
}
NSString *fps = [[NSNumber numberWithInt:cformat.fps] stringValue];
fps = [fps stringByAppendingString:@" FPS"];
cformat.label = [resolution stringByAppendingString:fps];
BOOL isUniqueFormat = YES;
for(int i = 0; i < [_formatList count]; i++){
if([_formatList[i].label isEqualToString:cformat.label]){
isUniqueFormat = NO;
break;
}
}
if(isUniqueFormat){
[_formatList addObject:cformat];
frameRate = cformat.fps;
currentFormat = cformat.format;
}
}
}
if(!currentFormat){
[self noValidCamera];
return;
}
_currentCameraFormat = _analysisViewController.currentCameraFormat;
if(_currentCameraFormat.fps == 0){
_currentCameraFormatIndex = 0;
_currentCameraFormat = _formatList[_currentCameraFormatIndex];
_analysisViewController.currentCameraFormat = _currentCameraFormat;
currentFormat = _currentCameraFormat.format;
frameRate = _currentCameraFormat.fps;
}
else{
currentFormat = _currentCameraFormat.format;
frameRate = _currentCameraFormat.fps;
}
NSString *resolution;
if(CMVideoFormatDescriptionGetDimensions(currentFormat.formatDescription).width > 1920){
resolution = @"4K ";
}
else{
resolution = [[NSNumber numberWithInt:CMVideoFormatDescriptionGetDimensions(currentFormat.formatDescription).height] stringValue];
resolution = [resolution stringByAppendingString:@"p "];
}
NSString *fps = [[NSNumber numberWithInt:frameRate] stringValue];
fps = [fps stringByAppendingString:@" FPS ▼"];
[self.videoLabelButton setTitle:[resolution stringByAppendingString:fps] forState:UIControlStateNormal];
[_device lockForConfiguration:nil];
_device.activeFormat = currentFormat;
_device.activeVideoMinFrameDuration = CMTimeMake(1, frameRate);
_device.activeVideoMaxFrameDuration = CMTimeMake(1, frameRate);
[_device unlockForConfiguration];
//Input
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:nil];
[_captureSession addInput:input];
//Output
_movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if([_captureSession canAddOutput:_movieFileOutput]){
[_captureSession addOutput:_movieFileOutput];
}
[self setMovieOutputOrientation];
//Preview Layer
_previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
_previewView = [[UIView alloc] initWithFrame:self.imageView.bounds];
[self addFocusToView:_previewView];
[_previewView addSubview:self.imageView];
_previewLayer.frame = _previewView.bounds;
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[_previewView.layer addSublayer:_previewLayer];
_previewLayer.connection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
AVCaptureVideoStabilizationMode stabilizationMode = AVCaptureVideoStabilizationModeCinematic;
if ([_device.activeFormat isVideoStabilizationModeSupported:stabilizationMode]) {
[_previewLayer.connection setPreferredVideoStabilizationMode:stabilizationMode];
}
[self.view addSubview:_previewView];
[self.view addSubview:self.topInfoBar];
[self.view addSubview:self.recordButton];
[self.view addSubview:self.doneButton];
[_captureSession commitConfiguration];
[_captureSession startRunning];
}

Okay, so I finally figured it out. The issue is that video stabilization is supported on some formats, and the built-in camera app turns it on whenever it can, whereas I was not turning it on when I could. The fix is to check whether stabilization is available on the movie output's video connection and, if it is, set the preferred mode to Auto.
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in [_movieFileOutput connections]){
for ( AVCaptureInputPort *port in [connection inputPorts]){
if ([[port mediaType] isEqual:AVMediaTypeVideo]){
videoConnection = connection;
}
}
}
if([videoConnection isVideoOrientationSupported]){
[videoConnection setVideoOrientation:[self videoOrientationFromCurrentDeviceOrientation]];
}
//Check if we can use video stabilization and if so then set it to auto
if (videoConnection.supportsVideoStabilization) {
videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
}
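If a slight field-of-view difference remains even with stabilization enabled, the device's videoZoomFactor can also be nudged inside a lock/unlock pair. A minimal sketch follows; the 1.2 factor is purely illustrative, not a measured match for the built-in app:
NSError *zoomError = nil;
if ([_device lockForConfiguration:&zoomError]) {
    // Clamp to what the active format allows; 1.2 is only an example value.
    CGFloat desiredZoom = MIN(1.2, _device.activeFormat.videoMaxZoomFactor);
    _device.videoZoomFactor = desiredZoom;
    [_device unlockForConfiguration];
}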

Related

iPhone camera ISO value

I set the shutter speed, ISO and other camera parameters and capture an image using the code below. If I set the ISO to something like 119, it gets reported as 125 in the Exif info (it is always rounded to a standard value). How can I tell what the real ISO was? The maxISO gets reported as 734, but if I set it to 734 it shows up as 800 in the Exif information. I had the same problem with the exposure time: if I set it to 800 ms it shows up as 1 s in the Exif info, but the value calculated from the ShutterSpeedValue is correct.
- (void)setCameraSettings:(long)expTime1000thSec iso:(int)isoValue
{
if ( currentCaptureDevice ) {
[captureSession beginConfiguration];
NSError *error = nil;
if ([currentCaptureDevice lockForConfiguration:&error]) {
if ([currentCaptureDevice isExposureModeSupported:AVCaptureExposureModeLocked]) {
CMTime minTime, maxTime, exposureTime;
if ( isoValue < minISO ) {
isoValue = minISO;
} else if ( isoValue > maxISO ) {
isoValue = maxISO;
}
exposureTime = CMTimeMake(expTime1000thSec, EXP_TIME_UNIT); // in 1/EXP_TIME_UNIT of a second
minTime = currentCaptureDevice.activeFormat.minExposureDuration;
maxTime = currentCaptureDevice.activeFormat.maxExposureDuration;
if ( CMTimeCompare(exposureTime, minTime) < 0 ) {
exposureTime = minTime;
} else if ( CMTimeCompare(exposureTime, maxTime) > 0 ) {
exposureTime = maxTime;
}
NSLog(#"setting exp time to %lld/%d s (want %ld) iso=%d", exposureTime.value, exposureTime.timescale, expTime1000thSec, isoValue);
[currentCaptureDevice setExposureModeCustomWithDuration:exposureTime ISO:isoValue completionHandler:nil];
}
if (currentCaptureDevice.lowLightBoostSupported) {
currentCaptureDevice.automaticallyEnablesLowLightBoostWhenAvailable = NO;
NSLog(#"setting automaticallyEnablesLowLightBoostWhenAvailable = NO");
}
// lock the gains
if ([currentCaptureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
currentCaptureDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
NSLog(#"setting AVCaptureWhiteBalanceModeLocked");
}
// set the gains
AVCaptureWhiteBalanceGains gains;
gains.redGain = 1.0;
gains.greenGain = 1.0;
gains.blueGain = 1.0;
AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains];
[currentCaptureDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
NSLog(#"setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains g.red=%.2lf g.green=%.2lf g.blue=%.2lf",
normalizedGains.redGain, normalizedGains.greenGain, normalizedGains.blueGain);
[currentCaptureDevice unlockForConfiguration];
}
[captureSession commitConfiguration];
}
}
- (void)captureStillImage
{
NSLog(#"about to request a capture from: %#", [self stillImageOutput]);
if ( videoConnection ) {
waitingForCapture = true;
NSLog(#"requesting a capture from: %#", [self stillImageOutput]);
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
if ( imageSampleBuffer ) {
CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments) {
NSLog(#"attachements: %#", exifAttachments);
} else {
NSLog(#"no attachments");
}
NSLog(#"name: %#", [currentCaptureDevice localizedName]);
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[self setStillImage:image];
NSDictionary *dict = (__bridge NSDictionary*)exifAttachments;
NSString *value = [dict objectForKey:@"PixelXDimension"];
[self setImageWidth:[NSNumber numberWithInt:[value intValue]]];
value = [dict objectForKey:@"PixelYDimension"];
[self setImageHeight:[NSNumber numberWithInt:[value intValue]]];
value = [dict objectForKey:@"BrightnessValue"];
[self setImageBrightnessValue:[NSNumber numberWithFloat:[value floatValue]]];
value = [dict objectForKey:@"ShutterSpeedValue"];
double val = [value doubleValue];
val = 1.0 / pow(2.0, val);
[self setImageExposureTime:[NSNumber numberWithDouble:val]];
value = [dict objectForKey:@"ApertureValue"];
[self setImageApertureValue:[NSNumber numberWithFloat:[value floatValue]]];
NSArray *values = [dict objectForKey:@"ISOSpeedRatings"];
[self setImageISOSpeedRatings:[NSNumber numberWithInt:[ [values objectAtIndex:0] intValue]]];
NSLog(@"r/g/b gains = %.2lf/%.2lf/%.2lf",
currentCaptureDevice.deviceWhiteBalanceGains.redGain,
currentCaptureDevice.deviceWhiteBalanceGains.greenGain,
currentCaptureDevice.deviceWhiteBalanceGains.blueGain);
[[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
} else {
NSLog(#"imageSampleBuffer = NULL");
}
waitingForCapture = false;
}];
} else {
NSLog(#"can't capture from: videoConnection");
}
}
I answered your other question; it's the same thing here: you are not setting the exposure mode to AVCaptureExposureModeCustom:
[device lockForConfiguration:nil];
if([device isExposureModeSupported:AVCaptureExposureModeCustom]){
[device setExposureMode:AVCaptureExposureModeCustom];
[device setExposureModeCustomWithDuration:exposureTime ISO:exposureISO completionHandler:^(CMTime syncTime) {}];
[device setExposureTargetBias:exposureBIAS completionHandler:^(CMTime syncTime) {}];
}
Regards.
I added:
[device setExposureMode:AVCaptureExposureModeCustom];
and changed my completionHandler from nil to:
completionHandler:^(CMTime syncTime) {}
and I still see the same results. It seems that ISO can only be set to the "standard" values and not to any value between minISO and maxISO.
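A note on reading back what was actually applied: rather than relying on the rounded Exif values, the device itself can be queried once the custom exposure has been committed. A minimal sketch, using the same currentCaptureDevice; the completion handler fires once the settings take effect:
[currentCaptureDevice setExposureModeCustomWithDuration:exposureTime
                                                    ISO:isoValue
                                      completionHandler:^(CMTime syncTime) {
    // These properties reflect the values the hardware is actually using.
    float appliedISO = currentCaptureDevice.ISO;
    CMTime appliedDuration = currentCaptureDevice.exposureDuration;
    NSLog(@"applied ISO=%.1f exposure=%lld/%d s",
          appliedISO, appliedDuration.value, appliedDuration.timescale);
}];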

How to Record Video Using Front and Back Camera and still keep recording?

I'm using AVFoundation. I want to record video using both the front and back cameras. I can record on one side, but when I switch the camera from back to front, the preview freezes. Is it possible to record video continuously while switching between both cameras?
Sample Code:
- (void) startup
{
if (_session == nil)
{
NSLog(#"Starting up server");
self.isCapturing = NO;
self.isPaused = NO;
_currentFile = 0;
_discont = NO;
// create capture device with video input
_session = [[AVCaptureSession alloc] init];
AVCaptureDevice *backCamera = [self frontCamera];
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
[_session addInput:input];
// audio input from default mic
AVCaptureDevice* mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput* micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
[_session addInput:micinput];
// create an output for YUV output with self as delegate
_captureQueue = dispatch_queue_create("com.softcraftsystems.comss", DISPATCH_QUEUE_SERIAL);
AVCaptureVideoDataOutput* videoout = [[AVCaptureVideoDataOutput alloc] init];
[videoout setSampleBufferDelegate:self queue:_captureQueue];
NSDictionary* setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
nil];
videoout.videoSettings = setcapSettings;
[_session addOutput:videoout];
_videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
// find the actual dimensions used so we can set up the encoder to the same.
NSDictionary* actual = videoout.videoSettings;
_cy = [[actual objectForKey:@"Height"] integerValue];
_cx = [[actual objectForKey:@"Width"] integerValue];
AVCaptureAudioDataOutput* audioout = [[AVCaptureAudioDataOutput alloc] init];
[audioout setSampleBufferDelegate:self queue:_captureQueue];
[_session addOutput:audioout];
_audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];
// for audio, we want the channels and sample rate, but we can't get those from audioout.audiosettings on ios, so
// we need to wait for the first sample
// start capture and a preview layer
[_session startRunning];
_preview = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
}
}
- (AVCaptureDevice *)frontCamera
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
if ([device position] == AVCaptureDevicePositionFront) {
return device;
}
}
return nil;
}
- (AVCaptureDevice *)backCamera
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
if ([device position] == AVCaptureDevicePositionBack) {
return device;
}
}
return nil;
}
- (void) startupFront
{
_session = nil;
[_session stopRunning];
if (_session == nil)
{
NSLog(#"Starting up server");
self.isCapturing = NO;
self.isPaused = NO;
_currentFile = 0;
_discont = NO;
// create capture device with video input
_session = [[AVCaptureSession alloc] init];
AVCaptureDevice *backCamera = [self backCamera];
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
[_session addInput:input];
// audio input from default mic
AVCaptureDevice* mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput* micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
[_session addInput:micinput];
// create an output for YUV output with self as delegate
_captureQueue = dispatch_queue_create("com.softcraftsystems.comss", DISPATCH_QUEUE_SERIAL);
AVCaptureVideoDataOutput* videoout = [[AVCaptureVideoDataOutput alloc] init];
[videoout setSampleBufferDelegate:self queue:_captureQueue];
NSDictionary* setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
nil];
videoout.videoSettings = setcapSettings;
[_session addOutput:videoout];
_videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
// find the actual dimensions used so we can set up the encoder to the same.
NSDictionary* actual = videoout.videoSettings;
_cy = [[actual objectForKey:@"Height"] integerValue];
_cx = [[actual objectForKey:@"Width"] integerValue];
AVCaptureAudioDataOutput* audioout = [[AVCaptureAudioDataOutput alloc] init];
[audioout setSampleBufferDelegate:self queue:_captureQueue];
[_session addOutput:audioout];
_audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];
// for audio, we want the channels and sample rate, but we can't get those from audioout.audiosettings on ios, so
// we need to wait for the first sample
// start capture and a preview layer
[_session startRunning];
_preview = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
}
}
- (void) startCapture
{
@synchronized(self)
{
if (!self.isCapturing)
{
NSLog(#"starting capture");
// create the encoder once we have the audio params
_encoder = nil;
self.isPaused = NO;
_discont = NO;
_timeOffset = CMTimeMake(0, 0);
self.isCapturing = YES;
}
}
}
- (void) stopCapture
{
@synchronized(self)
{
if (self.isCapturing)
{
NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
NSURL* url = [NSURL fileURLWithPath:path];
_currentFile++;
// serialize with audio and video capture
self.isCapturing = NO;
dispatch_async(_captureQueue, ^{
[_encoder finishWithCompletionHandler:^{
self.isCapturing = NO;
_encoder = nil;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error){
NSLog(#"save completed");
[[NSFileManager defaultManager] removeItemAtPath:path error:nil];
}];
}];
});
}
}
}
- (void) pauseCapture
{
@synchronized(self)
{
if (self.isCapturing)
{
NSLog(#"Pausing capture");
self.isPaused = YES;
_discont = YES;
}
}
}
- (void) resumeCapture
{
@synchronized(self)
{
if (self.isPaused)
{
NSLog(#"Resuming capture");
self.isPaused = NO;
}
}
}
- (CMSampleBufferRef) adjustTime:(CMSampleBufferRef) sample by:(CMTime) offset
{
CMItemCount count;
CMSampleBufferGetSampleTimingInfoArray(sample, 0, nil, &count);
CMSampleTimingInfo* pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);
for (CMItemCount i = 0; i < count; i++)
{
pInfo[i].decodeTimeStamp = CMTimeSubtract(pInfo[i].decodeTimeStamp, offset);
pInfo[i].presentationTimeStamp = CMTimeSubtract(pInfo[i].presentationTimeStamp, offset);
}
CMSampleBufferRef sout;
CMSampleBufferCreateCopyWithNewTiming(nil, sample, count, pInfo, &sout);
free(pInfo);
return sout;
}
- (void) setAudioFormat:(CMFormatDescriptionRef) fmt
{
const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
_samplerate = asbd->mSampleRate;
_channels = asbd->mChannelsPerFrame;
}
- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
BOOL bVideo = YES;
@synchronized(self)
{
if (!self.isCapturing || self.isPaused)
{
return;
}
if (connection != _videoConnection)
{
bVideo = NO;
}
if ((_encoder == nil) && !bVideo)
{
CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
[self setAudioFormat:fmt];
NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
_encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
}
if (_discont)
{
if (bVideo)
{
return;
}
_discont = NO;
// calc adjustment
CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTime last = bVideo ? _lastVideo : _lastAudio;
if (last.flags & kCMTimeFlags_Valid)
{
if (_timeOffset.flags & kCMTimeFlags_Valid)
{
pts = CMTimeSubtract(pts, _timeOffset);
}
CMTime offset = CMTimeSubtract(pts, last);
NSLog(#"Setting offset from %s", bVideo?"video": "audio");
NSLog(#"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale));
// this stops us having to set a scale for _timeOffset before we see the first video time
if (_timeOffset.value == 0)
{
_timeOffset = offset;
}
else
{
_timeOffset = CMTimeAdd(_timeOffset, offset);
}
}
_lastVideo.flags = 0;
_lastAudio.flags = 0;
}
// retain so that we can release either this or modified one
CFRetain(sampleBuffer);
if (_timeOffset.value > 0)
{
CFRelease(sampleBuffer);
sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
}
// record most recent time so we know the length of the pause
CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
if (dur.value > 0)
{
pts = CMTimeAdd(pts, dur);
}
if (bVideo)
{
_lastVideo = pts;
}
else
{
_lastAudio = pts;
}
}
// pass frame to encoder
[_encoder encodeFrame:sampleBuffer isVideo:bVideo];
CFRelease(sampleBuffer);
}
- (void) shutdown
{
NSLog(#"shutting down server");
if (_session)
{
[_session stopRunning];
_session = nil;
}
[_encoder finishWithCompletionHandler:^{
NSLog(#"Capture completed");
}];
}
In my opinion it is not possible to continue recording while switching cameras. First, the front and back cameras have different resolutions and quality, and a single video file can only have one resolution and quality throughout. Second, every time you switch cameras the capture device has to be allocated and initialized again.
Unfortunately I don't believe it's possible, but if you find a solution, please do tell me.
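For what it's worth, the frozen preview usually comes from tearing the whole session down and rebuilding it (as startupFront does above). Swapping just the video input on the existing session avoids the freeze, though it does not by itself produce one continuous movie file. A minimal sketch, assuming the _session and the two camera getters from the question; switchToCamera: is a hypothetical helper name:
- (void)switchToCamera:(AVCaptureDevice *)newCamera
{
    [_session beginConfiguration];
    // Remove only the current video input; keep the mic input and the outputs alive.
    for (AVCaptureDeviceInput *input in _session.inputs) {
        if ([input.device hasMediaType:AVMediaTypeVideo]) {
            [_session removeInput:input];
        }
    }
    NSError *error = nil;
    AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:&error];
    if (newInput && [_session canAddInput:newInput]) {
        [_session addInput:newInput];
    }
    [_session commitConfiguration];
}
Calling, say, [self switchToCamera:[self frontCamera]] while the session runs keeps the preview live; the recorded output still has to cope with the resolution change described above.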

AVCapture capturing and getting framebuffer at 60 fps in iOS 7

I'm developing an app that requires capturing the frame buffer at as high a frame rate as possible. I've already figured out how to force the iPhone to capture at 60 fps, but the
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
method is being called only 15 times a second, which means the iPhone downgrades the capture output to 15 fps.
Has anybody faced this problem? Is there any way to increase the capture frame rate?
Update: my code:
camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if([camera isTorchModeSupported:AVCaptureTorchModeOn]) {
[camera lockForConfiguration:nil];
camera.torchMode=AVCaptureTorchModeOn;
[camera unlockForConfiguration];
}
[self configureCameraForHighestFrameRate:camera];
// Create a AVCaptureInput with the camera device
NSError *error=nil;
AVCaptureInput* cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
if (cameraInput == nil) {
NSLog(#"Error to create camera capture:%#",error);
}
// Set the output
AVCaptureVideoDataOutput* videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// create a queue to run the capture on
dispatch_queue_t captureQueue=dispatch_queue_create("captureQueue", NULL);
// setup our delegate
[videoOutput setSampleBufferDelegate:self queue:captureQueue];
// configure the pixel format
videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
nil];
// Add the input and output
[captureSession addInput:cameraInput];
[captureSession addOutput:videoOutput];
I took the configureCameraForHighestFrameRate method from here: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html
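For readers without the link handy, the gist of that documentation example is: walk the device's formats, remember the one whose frame-rate range has the highest maxFrameRate, then lock the device and apply it. A paraphrased sketch, not the verbatim Apple listing:
- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestRange = nil;
    for (AVCaptureDeviceFormat *format in device.formats) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate > bestRange.maxFrameRate) { // messaging nil yields 0 on the first pass
                bestFormat = format;
                bestRange = range;
            }
        }
    }
    if (bestFormat && [device lockForConfiguration:NULL]) {
        device.activeFormat = bestFormat;
        device.activeVideoMinFrameDuration = bestRange.minFrameDuration;
        device.activeVideoMaxFrameDuration = bestRange.minFrameDuration;
        [device unlockForConfiguration];
    }
}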
I am getting samples at 60 fps on the iPhone 5 and 120 fps on the iPhone 5s, both when doing real time motion detection in captureOutput and when saving the frames to a video using AVAssetWriter.
You have to set the AVCaptureDevice's active format to one that supports 60 fps:
AVsession = [[AVCaptureSession alloc] init];
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *capInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (capInput) [AVsession addInput:capInput];
for(AVCaptureDeviceFormat *vFormat in [videoDevice formats] )
{
CMFormatDescriptionRef description= vFormat.formatDescription;
float maxrate=((AVFrameRateRange*)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
if(maxrate>59 && CMFormatDescriptionGetMediaSubType(description)==kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
{
if ( YES == [videoDevice lockForConfiguration:NULL] )
{
videoDevice.activeFormat = vFormat;
[videoDevice setActiveVideoMinFrameDuration:CMTimeMake(10,600)];
[videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(10,600)];
[videoDevice unlockForConfiguration];
NSLog(#"formats %# %# %#",vFormat.mediaType,vFormat.formatDescription,vFormat.videoSupportedFrameRateRanges);
}
}
}
prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: AVsession];
prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer: prevLayer];
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoOut setSampleBufferDelegate:self queue:videoQueue];
videoOut.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
videoOut.alwaysDiscardsLateVideoFrames=YES;
if (videoOut)
{
[AVsession addOutput:videoOut];
videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
}
Two other comments if you want to write to a file using AVAssetWriter. Don't use the pixelAdaptor; just add the samples with
[videoWriterInput appendSampleBuffer:sampleBuffer]
Secondly, when setting up the asset writer, use
[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings
sourceFormatHint:formatDescription];
The sourceFormatHint makes a difference in writing speed.
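For context, the format description used as the hint can simply be taken from the first video sample buffer the delegate receives. A minimal sketch, assuming your own videoSettings dictionary and an AVAssetWriter named writer:
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
AVAssetWriterInput *videoWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings
                                     sourceFormatHint:formatDescription];
videoWriterInput.expectsMediaDataInRealTime = YES; // required for live capture sources
if ([writer canAddInput:videoWriterInput]) {
    [writer addInput:videoWriterInput];
}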
I have developed the same function for Swift 2.0. I'm posting the code here for anyone who might need it:
// Set your desired frame rate
func setupCamera(maxFpsDesired: Double = 120) {
var captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
do{ let input = try AVCaptureDeviceInput(device: backCamera)
captureSession.addInput(input) }
catch { print("Error: can't access camera")
return
}
do {
var finalFormat = AVCaptureDeviceFormat()
var maxFps: Double = 0
for vFormat in backCamera!.formats {
var ranges = vFormat.videoSupportedFrameRateRanges as! [AVFrameRateRange]
let frameRates = ranges[0]
/*
"frameRates.maxFrameRate >= maxFps" select the video format
desired with the highest resolution available, because
the camera formats are ordered; else
"frameRates.maxFrameRate > maxFps" select the first
format available with the desired fps
*/
if frameRates.maxFrameRate >= maxFps && frameRates.maxFrameRate <= maxFpsDesired {
maxFps = frameRates.maxFrameRate
finalFormat = vFormat as! AVCaptureDeviceFormat
}
}
if maxFps != 0 {
let timeValue = Int64(1200.0 / maxFps)
let timeScale: Int32 = 1200
try backCamera!.lockForConfiguration()
backCamera!.activeFormat = finalFormat
backCamera!.activeVideoMinFrameDuration = CMTimeMake(timeValue, timeScale)
backCamera!.activeVideoMaxFrameDuration = CMTimeMake(timeValue, timeScale)
backCamera!.focusMode = AVCaptureFocusMode.AutoFocus
backCamera!.unlockForConfiguration()
}
}
catch {
print("Something was wrong")
}
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.alwaysDiscardsLateVideoFrames = true
videoOutput.videoSettings = NSDictionary(object: Int(kCVPixelFormatType_32BGRA),
forKey: kCVPixelBufferPixelFormatTypeKey as String) as [NSObject : AnyObject]
videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))
if captureSession.canAddOutput(videoOutput){
captureSession.addOutput(videoOutput) }
let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
view.layer.addSublayer(previewLayer)
previewLayer.transform = CATransform3DMakeRotation(-1.5708, 0, 0, 1);
previewLayer.frame = self.view.bounds
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
self.view.layer.addSublayer(previewLayer)
captureSession.startRunning()
}
I had the same problem. It was fixed by using this function after [AVCaptureSession addInput:cameraDeviceInput]. Somehow I could not change the frame rate on my iPad Pro before the capture session was started, so I change the video format only after the device input has been added to the capture session.
- (void)switchFormatWithDesiredFPS:(CGFloat)desiredFPS
{
BOOL isRunning = _captureSession.isRunning;
if (isRunning) [_captureSession stopRunning];
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceFormat *selectedFormat = nil;
int32_t maxWidth = 0;
AVFrameRateRange *frameRateRange = nil;
for (AVCaptureDeviceFormat *format in [videoDevice formats]) {
for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
CMFormatDescriptionRef desc = format.formatDescription;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(desc);
int32_t width = dimensions.width;
if (range.minFrameRate <= desiredFPS && desiredFPS <= range.maxFrameRate && width >= maxWidth) {
selectedFormat = format;
frameRateRange = range;
maxWidth = width;
}
}
}
if (selectedFormat) {
if ([videoDevice lockForConfiguration:nil]) {
NSLog(#"selected format:%#", selectedFormat);
videoDevice.activeFormat = selectedFormat;
videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
[videoDevice unlockForConfiguration];
}
}
if (isRunning) [_captureSession startRunning];
}
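So the call order that worked there is roughly the following; cameraDeviceInput and _captureSession are the answerer's names rather than identifiers from the question's code:
[_captureSession addInput:cameraDeviceInput];
// Per the note above, the format/frame-rate change only sticks once the input is attached.
[self switchFormatWithDesiredFPS:60.0];
[_captureSession startRunning];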

Capture image with bounds?

I am able to capture images from the iOS rear-facing camera. Everything is working flawlessly, except that I want the picture to be taken within the bounds of my UIView.
My code is below:
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view.
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.frame = vImagePreview.bounds;
[vImagePreview.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [self backFacingCameraIfAvailable];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
[session startRunning];
[session addOutput:stillImageOutput];
}
-(AVCaptureDevice *)backFacingCameraIfAvailable{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionBack){
captureDevice = device;
break;
}
}
// couldn't find one on the back, so just get the default video device.
if (!captureDevice){
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
And below is the code to capture the image:
- (IBAction)captureTask {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections){
for (AVCaptureInputPort *port in [connection inputPorts]){
if ([[port mediaType] isEqual:AVMediaTypeVideo]){
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments) {
// Do something with the attachments.
NSLog(#"attachements: %#", exifAttachments);
} else {
NSLog(#"no attachments");
}
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
stillImage = image;
}];
}
The issue I'm facing is that it takes the picture and saves it to stillImage; however, from what I can tell the image covers the whole iPhone screen rather than the bounds of the UIView *vImagePreview I created. Is there a way to clip the captured image to those bounds?
[EDIT]
After reading the docs, I realized the image is at the proper resolution for the preset set here: session.sessionPreset = AVCaptureSessionPresetMedium;. Is there a way to make the image square, like how Instagram makes their images? According to the docs, none of the session presets are square :(
I tried with the below:
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResize;
However, it only resizes the preview to fit the current view; it doesn't give me a square image.
I understand your frustration; presets should be customizable or have more options! What I do with my images is crop them about the center, for which I wrote the following code:
- (UIImage *)crop:(UIImage *)image from:(CGSize)src to:(CGSize)dst
{
CGPoint cropCenter = CGPointMake((src.width/2), (src.height/2));
CGPoint cropStart = CGPointMake((cropCenter.x - (dst.width/2)), (cropCenter.y - (dst.height/2)));
CGRect cropRect = CGRectMake(cropStart.x, cropStart.y, dst.width, dst.height);
CGImageRef cropRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage* cropImage = [UIImage imageWithCGImage:cropRef];
CGImageRelease(cropRef);
return cropImage;
}
Where src represents the original dimensions and dst represents the cropped dimensions; and image is of course the image you want cropped.
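For the square, Instagram-style result asked about above, one way to use this helper is to pass the image's own size as src and an equal-sided dst. A short usage example with the stillImage captured earlier (sizes here are in points; multiply by image.scale if you need exact pixels):
CGSize full = stillImage.size;
CGFloat side = MIN(full.width, full.height);
UIImage *squareImage = [self crop:stillImage from:full to:CGSizeMake(side, side)];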
If the device has a Retina display, the crop works as shown below:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] == YES && [[UIScreen mainScreen] scale] == 2.00)
{
CGPoint cropCenter = CGPointMake((src.width), (src.height));
CGPoint cropStart = CGPointMake((cropCenter.x - (dst.width)), (cropCenter.y - (dst.height)));
CGRect cropRect = CGRectMake(cropStart.x, cropStart.y, dst.width*2, dst.height*2);
CGImageRef cropRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage* cropImage = [UIImage imageWithCGImage:cropRef];
CGImageRelease(cropRef);
return cropImage;
}
else
{
CGPoint cropCenter = CGPointMake((src.width/2), (src.height/2));
CGPoint cropStart = CGPointMake((cropCenter.x - (dst.width/2)), (cropCenter.y - (dst.height/2)));
CGRect cropRect = CGRectMake(cropStart.x, cropStart.y, dst.width, dst.height);
CGImageRef cropRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage* cropImage = [UIImage imageWithCGImage:cropRef];
CGImageRelease(cropRef);
return cropImage;
}

captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error not being called

I am writing a video capture app for iOS 4+. It works fine on devices with iOS 5+, but on iOS 4 the delegate method didFinishRecordingToOutputFileAtURL: is not called after the recording has stopped. I have checked Apple's reference, which says: "This method is always called for each recording request, even if no data is successfully written to the file."
https://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureFileOutputRecordingDelegate_Protocol/Reference/Reference.html
Any suggestions ?
Here is the complete code:
//
// HomeViewController.m
// MyAgingBooth
//
// Created by Mahmud on 29/10/11.
// Copyright 2011 __MyCompanyName__. All rights reserved.
//
#import "HomeViewController.h"
#import "Globals.h"
#import <MobileCoreServices/MobileCoreServices.h>
#import <MediaPlayer/MPMoviePlayerController.h>
#import "SharedData.h"
#import "ResultViewController.h"
@implementation HomeViewController
@synthesize BtnFromCamera, PreviewLayer;
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
if (self) {
// Custom initialization
isPlaying=NO;
playerScore=0;
playerTurn=0;
}
return self;
}
- (void)dealloc
{
//[levelTimer release];
[super dealloc];
}
- (void)didReceiveMemoryWarning
{
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
#pragma mark - View lifecycle
- (void)viewDidLoad
{
[super viewDidLoad];
self.navigationController.navigationBar.barStyle = UIBarStyleBlackTranslucent;
playerName.text=[NSString stringWithFormat: @"Player %d", (playerTurn+1)];
//add a right bar button item proceed to next.
UIBarButtonItem *proceedButton = [[UIBarButtonItem alloc] initWithTitle:@"Proceed" style:UIBarButtonItemStylePlain target:self action:@selector(proceedToNext)];
//[proceedButton setImage:[UIImage imageNamed:@"info.png"]];
self.navigationItem.rightBarButtonItem=proceedButton;
[proceedButton release];
[BtnFromCamera setTitle:@"Start" forState:UIControlStateNormal];
//[self.BtnFromCamera setImage:camera forState:UIControlStateNormal];
//[self.BtnFromCamera setImage:camera forState:UIControlStateSelected];
NSArray *words=[NSArray arrayWithObjects:@"SAY: Girls",@"SAY: Shut up", @"SAY: Tiger",@"SAY: Absurd",@"SAY: Tonight", @"SAY: Amstardam", nil];
[word setText:[words objectAtIndex:arc4random()%6]];
[self initCaptureSession];
}
-(void) proceedToNext
{
self.title=#"Back";
ResultViewController *resultViewController= [[ResultViewController alloc] initWithNibName:#"ResultViewController" bundle:nil];
[self.navigationController pushViewController:resultViewController animated:YES];
[resultViewController release];
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
//Action handlers for the buttons
// take snap with camera
-(void) initCaptureSession
{
NSLog(#"Setting up capture session");
CaptureSession = [[AVCaptureSession alloc] init];
//----- ADD INPUTS -----
NSLog(#"Adding video input");
//ADD VIDEO INPUT
AVCaptureDevice *VideoDevice = [self frontFacingCameraIfAvailable ];
//[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (VideoDevice)
{
NSError *error;
VideoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error];
if (!error)
{
if ([CaptureSession canAddInput:VideoInputDevice])
[CaptureSession addInput:VideoInputDevice];
else
NSLog(#"Couldn't add video input");
}
else
{
NSLog(#"Couldn't create video input");
}
}
else
{
NSLog(#"Couldn't create video capture device");
}
//ADD AUDIO INPUT
NSLog(#"Adding audio input");
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput)
{
[CaptureSession addInput:audioInput];
}
//----- ADD OUTPUTS -----
//ADD VIDEO PREVIEW LAYER
NSLog(#"Adding video preview layer");
[self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:CaptureSession] autorelease]];
PreviewLayer.orientation = AVCaptureVideoOrientationPortrait; //<<SET ORIENTATION. You can deliberatly set this wrong to flip the image and may actually need to set it wrong to get the right image
[[self PreviewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
//ADD MOVIE FILE OUTPUT
NSLog(#"Adding movie file output");
MovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
Float64 TotalSeconds = 60; //Total seconds
int32_t preferredTimeScale = 30; //Frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale); //<<SET MAX DURATION
MovieFileOutput.maxRecordedDuration = maxDuration;
MovieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME
if ([CaptureSession canAddOutput:MovieFileOutput])
[CaptureSession addOutput:MovieFileOutput];
AudioOutput = [[AVCaptureAudioDataOutput alloc] init];
if([CaptureSession canAddOutput:AudioOutput])
{
[CaptureSession addOutput:AudioOutput];
NSLog(#"AudioOutput addedd");
}
//SET THE CONNECTION PROPERTIES (output properties)
[self CameraSetOutputProperties]; //(We call a method as it also has to be done after changing camera)
//----- SET THE IMAGE QUALITY / RESOLUTION -----
//Options:
// AVCaptureSessionPresetHigh - Highest recording quality (varies per device)
// AVCaptureSessionPresetMedium - Suitable for WiFi sharing (actual values may change)
// AVCaptureSessionPresetLow - Suitable for 3G sharing (actual values may change)
// AVCaptureSessionPreset640x480 - 640x480 VGA (check its supported before setting it)
// AVCaptureSessionPreset1280x720 - 1280x720 720p HD (check its supported before setting it)
// AVCaptureSessionPresetPhoto - Full photo resolution (not supported for video output)
NSLog(#"Setting image quality");
[CaptureSession setSessionPreset:AVCaptureSessionPresetMedium];
if ([CaptureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) //Check size based configs are supported before setting them
[CaptureSession setSessionPreset:AVCaptureSessionPreset640x480];
//----- DISPLAY THE PREVIEW LAYER -----
//Display it full screen under out view controller existing controls
NSLog(#"Display the preview layer");
CGRect layerRect = CGRectMake(10,44,300,290); //[[[self view] layer] bounds];
[PreviewLayer setBounds:layerRect];
[PreviewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),
CGRectGetMidY(layerRect))];
//[[[self view] layer] addSublayer:[[self CaptureManager] previewLayer]];
//We use this instead so it goes on a layer behind our UI controls (avoids us having to manually bring each control to the front):
UIView *CameraView = [[[UIView alloc] init] autorelease];
[[self view] addSubview:CameraView];
//[self.view sendSubviewToBack:CameraView];
[[CameraView layer] addSublayer:PreviewLayer];
//----- START THE CAPTURE SESSION RUNNING -----
[CaptureSession startRunning];
}
//********** CAMERA SET OUTPUT PROPERTIES **********
- (void) CameraSetOutputProperties
{
AVCaptureConnection *CaptureConnection=nil;
//SET THE CONNECTION PROPERTIES (output properties)
NSComparisonResult order = [[UIDevice currentDevice].systemVersion compare: @"5.0.0" options: NSNumericSearch];
if (order == NSOrderedSame || order == NSOrderedDescending) {
// OS version >= 5.0.0
CaptureConnection = [MovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
if (CaptureConnection.supportsVideoMinFrameDuration)
CaptureConnection.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (CaptureConnection.supportsVideoMaxFrameDuration)
CaptureConnection.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (CaptureConnection.supportsVideoMinFrameDuration)
{
CMTimeShow(CaptureConnection.videoMinFrameDuration);
CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
} else {
// OS version < 5.0.0
CaptureConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[MovieFileOutput connections]];
}
//Set landscape (if required)
if ([CaptureConnection isVideoOrientationSupported])
{
AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationLandscapeRight; //<<<<<SET VIDEO ORIENTATION IF LANDSCAPE
[CaptureConnection setVideoOrientation:orientation];
}
//Set frame rate (if requried)
//CMTimeShow(CaptureConnection.videoMinFrameDuration);
//CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
- (IBAction) StartVideo
{
if (!isPlaying) {
self.navigationItem.rightBarButtonItem.enabled=NO;
[BtnFromCamera setTitle:#"Stop" forState:UIControlStateNormal];
playerName.text=[NSString stringWithFormat:#"Player %d", playerTurn+1];
playerScore=0;
count=0;
isPlaying=YES;
//Create temporary URL to record to
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
NSError *error;
if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
{
//Error - handle if requried
NSLog(#"file remove error");
}
}
[outputPath release];
//Start recording
[MovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[outputURL release];
//NSString *DestFilename = @"output.mov";
//Set the file save to URL
/* NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
[dateFormatter release];
NSURL* saveLocationURL = [[NSURL alloc] initFileURLWithPath:destinationPath];
[MovieFileOutput startRecordingToOutputFileURL:saveLocationURL recordingDelegate:self];
[saveLocationURL release]; */
levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.05 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
}
else
{
isPlaying=NO;
NSLog(#"STOP RECORDING");
[MovieFileOutput stopRecording];
[levelTimer invalidate];
[BtnFromCamera setTitle:#"Start" forState:UIControlStateNormal];
self.navigationItem.rightBarButtonItem.enabled=YES;
}
}
//********** DID FINISH RECORDING TO OUTPUT FILE AT URL **********/
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
NSLog(#"didFinishRecordingToOutputFileAtURL - enter");
BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
// A problem occurred: Find out if the recording was successful.
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if (value)
{
RecordedSuccessfully = [value boolValue];
}
}
if (RecordedSuccessfully)
{
//----- RECORDED SUCESSFULLY -----
NSLog(#"didFinishRecordingToOutputFileAtURL - success");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error)
{
if (error)
{
NSLog(#"File save error");
}
else
{
playerScore=(playerScore/count);
NSDictionary *dict = [[NSDictionary alloc] initWithObjectsAndKeys:playerName.text, @"PlayerName", [NSNumber numberWithFloat:playerScore], @"Score", assetURL, @"VideoURL",nil];
SharedData *d=[SharedData sharedManager];
[d.PlayerStats addObject:dict];
[dict release];
playerName.text=[NSString stringWithFormat:@"Score %f", playerScore];
playerTurn++;
}
}];
}
else {
NSString *assetURL=[self copyFileToDocuments:outputFileURL];
if(assetURL!=nil)
{
playerScore=(playerScore/count);
NSDictionary *dict = [[NSDictionary alloc] initWithObjectsAndKeys:playerName.text, @"PlayerName", [NSNumber numberWithFloat:playerScore], @"Score",assetURL , @"VideoURL",nil];
SharedData *d=[SharedData sharedManager];
[d.PlayerStats addObject:dict];
[dict release];
playerName.text=[NSString stringWithFormat:@"Score %f", playerScore];
playerTurn++;
}
}
[library release];
}
}
- (NSString*) copyFileToDocuments:(NSURL *)fileURL
{
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
[dateFormatter release];
NSError *error;
if (![[NSFileManager defaultManager] copyItemAtURL:fileURL toURL:[NSURL fileURLWithPath:destinationPath] error:&error]) {
NSLog(#"File save error %#", [error localizedDescription]);
return nil;
}
return destinationPath;
}
- (void)levelTimerCallback:(NSTimer *)timer {
AVCaptureConnection *audioConnection = [self connectionWithMediaType:AVMediaTypeAudio fromConnections:[MovieFileOutput connections]];
//return [audioConnection isActive];
for (AVCaptureAudioChannel *channel in audioConnection.audioChannels) {
float avg = channel.averagePowerLevel;
// float peak = channel.peakHoldLevel;
float vol=powf(10, avg)*1000;
NSLog(#"Power: %f",vol);
if (isPlaying && vol > 0) {
playerScore=playerScore+vol;
count=count+1;
}
}
}
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
for ( AVCaptureConnection *connection in connections ) {
for ( AVCaptureInputPort *port in [connection inputPorts] ) {
if ( [[port mediaType] isEqual:mediaType] ) {
return connection;
}
}
}
return nil;
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
// look at all the video devices and get the first one that's on the front
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices)
{
if (device.position == AVCaptureDevicePositionFront)
{
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if ( ! captureDevice)
{
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
@end
Fixed the issue. I accidentally added an extra audio output; removing the following fragment worked for me:
AudioOutput = [[AVCaptureAudioDataOutput alloc] init];
if([CaptureSession canAddOutput:AudioOutput])
{
[CaptureSession addOutput:AudioOutput];
NSLog(#"AudioOutput addedd");
}
