It keeps on giving me the error:
Error Domain=AVFoundationErrorDomain Code=-11814 "Cannot Record"
I am not sure what the problem is. I am trying to record sound right when the counter reaches 1, after a picture is taken.
static int counter;
// counter will always be zero, I think, unless it is assigned.
if (counter == 0) {
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                //[AVCaptureSession snapStillImage];
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
            }
            NSLog(@"i");
        }];
    });
    if (!_audioRecorder.recording)
    {
        // start recording as part of still image
        _playButton.enabled = NO;
        _stopButton.enabled = YES;
        [_audioRecorder record];
        for (int i = 0; i < 1000; i++)
        {
            // do nothing, just counting (note: this loop finishes almost instantly)
        }
        // stop the recording
    }
}
else if (counter == 1)
{
    [self recordForDuration:5];
}
}
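Aside on the busy loop above: 1000 empty iterations finish in microseconds, so it cannot hold the recording open. A minimal sketch of a real delayed stop, assuming the goal is to stop _audioRecorder after about five seconds:
// Hedged sketch: stop the recorder after ~5 s without blocking the thread.
[_audioRecorder record];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    if (_audioRecorder.recording) {
        [_audioRecorder stop];
    }
});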
This error occurs because you are running in the Simulator; you need to use a real device.
Regards
Make sure there is only one instance of AVCaptureSession running.
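One hedged way to enforce that is to hang a single session off a shared object; the CameraManager class below is hypothetical, not from the question:
// Hypothetical singleton so the whole app shares one AVCaptureSession.
@interface CameraManager : NSObject
@property (nonatomic, strong, readonly) AVCaptureSession *session;
+ (instancetype)sharedManager;
@end

@implementation CameraManager
+ (instancetype)sharedManager {
    static CameraManager *shared = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        shared = [[CameraManager alloc] init];
    });
    return shared;
}
- (instancetype)init {
    if ((self = [super init])) {
        _session = [[AVCaptureSession alloc] init];
    }
    return self;
}
@end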
Restricting camera access on your device under "Settings > General > Restrictions" will also give you this error.
I ran into the same error when I was trying AVFoundation on a Mac, using a Mac Catalyst app. That is documented and intended behaviour, apparently:
https://forums.developer.apple.com/thread/124652#389519
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture
This issue behaves as intended. We do document that we are not listing
devices in Catalyst mode. See the "Important" callout: iPad apps running in macOS
cannot use the AVFoundation Capture classes. These apps should instead
use UIImagePickerController for photo and video capture.
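For reference, a minimal UIImagePickerController capture sketch of the kind that note recommends (the delegate wiring is assumed; the picker requires a device with a camera):
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.delegate = self; // self must adopt UIImagePickerControllerDelegate and UINavigationControllerDelegate
    [self presentViewController:picker animated:YES completion:nil];
}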
Related
I want to use both of the objective c methods listed below in my application. The first method uploads a UIImagePicker photograph to a local server.
// I would still like to use this method structure but with the `AVCam` classes.
-(void)uploadPhoto {
    // upload the image and the title to the web service
    [[API sharedInstance] commandWithParams:[NSMutableDictionary dictionaryWithObjectsAndKeys:@"upload", @"command", UIImageJPEGRepresentation(photo.image, 70), @"file", fldTitle.text, @"title", nil] onCompletion:^(NSDictionary *json) {
        // completion
        if (![json objectForKey:@"error"]) {
            // success
            [[[UIAlertView alloc] initWithTitle:@"Success!" message:@"Your photo is uploaded" delegate:nil cancelButtonTitle:@"Yay!" otherButtonTitles:nil] show];
        } else {
            // error; check for an expired session and, if so, authorize the user
            NSString *errorMsg = [json objectForKey:@"error"];
            [UIAlertView error:errorMsg];
            if ([@"Authorization required" compare:errorMsg] == NSOrderedSame) {
                [self performSegueWithIdentifier:@"ShowLogin" sender:nil];
            }
        }
    }];
}
I want to add a second method: it snaps a picture using AVCam. It is declared as an IBAction, but I call it when the view loads using [self snapStillImage].
EDIT
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [ViewController5 setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
                photo = [[UIImage alloc] initWithData:imageData];
            }
        }];
    });
}
Can someone please set photo via AVCam? At the very least humor me and start a dialogue about AVFoundation and its appropriate classes for tackling an issue like this.
Additional info: the AVCam method is simply an excerpt from this https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
@Aksh1t I want to set a UIImage named image with the original contents of the AVFoundation snap, not UIImagePicker. Here is the method that sets the outlet using UIImagePicker.
#pragma mark - Image picker delegate methods
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Resize the image from the camera
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height) interpolationQuality:kCGInterpolationHigh];
    // Crop the image to a square (yikes, fancy!)
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2, (scaledImage.size.height - photo.frame.size.height) / 2, photo.frame.size.width, photo.frame.size.height)];
    // Show the photo on the screen
    photo.image = croppedImage;
    [picker dismissModalViewControllerAnimated:NO];
}
After that I simply want to upload it using the first method I posted. Sorry for being unclear. I basically want to do this in my new app (I was unclear about which app):
1. Take a photo using AVCam
2. Set that photo to a UIImageView IBOutlet named photo
3. Upload photo (the original AVCam photo) to the server
The basic framework is above, and I will answer any questions.
The following line of code in your snapStillImage method captures the photo into the imageData variable.
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
Next, you are creating one UIImage object from this data like this
UIImage *image = [[UIImage alloc] initWithData:imageData];
Instead of the above code, declare a global variable UIImage *photo; and initialize it with the imageData when snapStillImage takes the photo, like this:
photo = [[UIImage alloc] initWithData:imageData];
Since photo is a global variable, you will then be able to use that in your uploadPhoto method and send it to your server.
Hope this helps, and if you have any question, leave it in the comments.
Edit:
Since you already have an IBOutlet UIImageView *photo; in your file, you don't even need a global variable to store the UIImage. You can just replace the following line in your snapStillImage method:
UIImage *image = [[UIImage alloc] initWithData:imageData];
with this line
photo.image = [[UIImage alloc] initWithData:imageData];
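One caveat, hedged: in the AVCam sample the completionHandler runs on sessionQueue, not the main thread, so the outlet update (and the upload kickoff) should hop to the main queue. A sketch combining both steps, assuming videoConnection is the still-image connection used in snapStillImage:
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                     completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        dispatch_async(dispatch_get_main_queue(), ^{
            photo.image = [[UIImage alloc] initWithData:imageData]; // update the outlet on the main thread
            [self uploadPhoto]; // then reuse the existing upload method
        });
    }
}];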
I'm using Apple's AVCam source code to create a custom camera, and it's working like a charm. The problem is that once I capture a video or image with it and then check it in the photo library, its orientation has changed to landscape (even though I captured it in portrait orientation). I searched a lot for this but couldn't find a solution. Any help?
As a note, my app only supports portrait, and capturing should happen only in portrait.
Update:
AVCaptureConnection *captureConnection = ...
if ([captureConnection isVideoOrientationSupported])
{
    AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationPortrait;
    [captureConnection setVideoOrientation:orientation];
}
This doesn't work.
When capturing an image you should set the orientation too. When you save the image to disk, use the
writeImageToSavedPhotosAlbum:orientation:completionBlock:
method and pass the correct "orientation" parameter there as well.
Usage: https://developer.apple.com/library/ios/documentation/AssetsLibrary/Reference/ALAssetsLibrary_Class/index.html#//apple_ref/occ/instm/ALAssetsLibrary/writeImageToSavedPhotosAlbum:orientation:completionBlock:
Example in Objective-C:
// Flash set to Auto for Still Capture
[CameraViewController setFlashMode:AVCaptureFlashModeAuto
                         forDevice:[[self videoDeviceInput] device]];
// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo]
                                                     completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer) {
        self.imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:self.imageData];
        [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:image.CGImage
                                                         orientation:(ALAssetOrientation)[image imageOrientation]
                                                     completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error == nil) {
                NSLog(@"PHOTO SAVED - assetURL: %@", assetURL);
            } else {
                NSLog(@"ERROR : %@", error);
            }
        }];
    }
}];
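For the capture side specifically, a minimal sketch of forcing the still-image connection to portrait before capturing (the stillImageOutput accessor is assumed from the AVCam sample):
AVCaptureConnection *stillConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
if ([stillConnection isVideoOrientationSupported]) {
    // Force portrait to match the app's portrait-only UI.
    [stillConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}
// ...then capture from stillConnection as in the example above.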
I am using the AVFoundation framework to capture video, and I am setting up a previewLayer on my view that shows an output size of 640 x 480. I would like to have the video cropped to this frame (so the user gets exactly what they see in the UI as an output), but I am having trouble finding the point at which such a change should take place.
I would like to crop the video before / during capture rather than after the user takes the video as the user will be shooting multiple videos in a "wizard" and I am trying to limit time between steps.
Is this even possible?
Here is the setup code for the capture session:
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
newCaptureSession.sessionPreset = AVCaptureSessionPresetiFrame1280x720;

// Add inputs and output to the capture session
if ([newCaptureSession canAddInput:newVideoInput]) {
    [newCaptureSession addInput:newVideoInput];
}
if ([newCaptureSession canAddInput:newAudioInput]) {
    [newCaptureSession addInput:newAudioInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}

[self setStillImageOutput:newStillImageOutput];
[self setVideoInput:newVideoInput];
[self setAudioInput:newAudioInput];
[self setSession:newCaptureSession];

// Set up the movie file output
NSURL *outputFileURL = [self tempFileURL];
AVCamRecorder *newRecorder = [[AVCamRecorder alloc] initWithSession:[self session] outputFileURL:outputFileURL];
[newRecorder setDelegate:self];

// Send an error to the delegate if video recording is unavailable
if (![newRecorder recordsVideo] && [newRecorder recordsAudio]) {
    NSString *localizedDescription = NSLocalizedString(@"Video recording unavailable", @"Video recording unavailable description");
    NSString *localizedFailureReason = NSLocalizedString(@"Movies recorded on this device will only contain audio. They will be accessible through iTunes file sharing.", @"Video recording unavailable failure reason");
    NSDictionary *errorDict = [NSDictionary dictionaryWithObjectsAndKeys:
                               localizedDescription, NSLocalizedDescriptionKey,
                               localizedFailureReason, NSLocalizedFailureReasonErrorKey,
                               nil];
    NSError *noVideoError = [NSError errorWithDomain:@"AVCam" code:0 userInfo:errorDict];
    if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
        [[self delegate] captureManager:self didFailWithError:noVideoError];
    }
}
[self setRecorder:newRecorder];
The video record start method:
-(void)startRecordingWithOrientation:(AVCaptureVideoOrientation)videoOrientation
{
    AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
    if ([videoConnection isVideoOrientationSupported])
        [videoConnection setVideoOrientation:videoOrientation];
    [[self movieFileOutput] startRecordingToOutputFileURL:[self outputFileURL] recordingDelegate:self];
}
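One hedged option for the 640 x 480 case: rather than cropping, pick a session preset whose output already matches the preview size, so what the user sees is what gets recorded. A minimal sketch against the setup code above (assuming 640 x 480 output is acceptable):
// Sketch: swap the preset so recorded frames match the 640x480 preview,
// avoiding a separate crop step entirely.
if ([newCaptureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    newCaptureSession.sessionPreset = AVCaptureSessionPreset640x480;
}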
In my application I am displaying an AVCaptureVideoPreviewLayer and then capturing a still image when the user taps a button, using the captureStillImageAsynchronouslyFromConnection: method of AVCaptureStillImageOutput. This has worked well for me up until the iPhone 5, on which it never completes.
My setup code is:
...
self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[self.imageOutput setOutputSettings:outputSettings];
self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
[self.captureSession addInput:self.rearFacingDeviceInput];
[self.captureSession addOutput:self.imageOutput];
[self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
self.previewLayer.frame = CGRectMake(0, 0, 320, 427);
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[self.captureSession startRunning];
[outputSettings release];
My capture method is:
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in self.imageOutput.connections) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}

// code to abort if the capture does not return 'soon'
...

[self.imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    // use image here
}];
captureStillImageAsynchronouslyFromConnection never completes for me on an iPhone 5.
I have tested:
It isn't iOS 6, as this code works on both an iPhone 4S and an iPod touch (4th generation) that have been updated
The captureSession is running
videoConnection is not nil
imageOutput is not nil
Also:
I'm using this method and not UIImagePickerController because I need to put the preview as a subview.
Calling stopRunning on the capture session takes several seconds on the iPhone 5 as well
Well, this code works fine. Tested on both iPhone 4 and 5 (base SDK 7.1, under ARC).
A few things you have to consider.
1) Be sure you set rearFacingDeviceInput properly:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[self setRearFacingDeviceInput:[AVCaptureDeviceInput deviceInputWithDevice:device error:nil]];
2) As Vincent mentioned, there may be an error; try to log both the error and imageSampleBuffer (a minimal logging sketch follows the list below).
3) The session's -startRunning and -stopRunning operations take a long time to complete (seconds, even 5-6 s), and they don't return until all the work is done. To avoid blocking the UI, you shouldn't call these methods on the main thread; one way is to use GCD:
dispatch_queue_t serialQueue = dispatch_queue_create("queue", NULL);
dispatch_async(serialQueue, ^{
    [self.captureSession startRunning];
});
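As promised for point 2, a hedged logging sketch inside the capture completion handler (videoConnection is the connection found in the question's loop):
[self.imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    // Log both values; a NULL buffer with a non-nil error narrows the failure down.
    NSLog(@"imageSampleBuffer is %s, error: %@", imageSampleBuffer ? "non-NULL" : "NULL", error);
}];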
If captureStillImageAsynchronouslyFromConnection still doesn't complete (to verify, add a breakpoint in the block and log everything), you should check your device's camera. I believe your code works on all iPhone 5 devices. Hope this helps, good luck.
I have come up with an implementation of AVFoundation and ImageIO to take care of photo taking in my application. I have an issue with it, however: the images I take are always dark, even if the flash goes off. Here's the code I use:
[[self currentCaptureOutput] captureStillImageAsynchronouslyFromConnection:[[self currentCaptureOutput].connections lastObject]
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    [[[blockSelf currentPreviewLayer] session] stopRunning];
    if (!error) {
        NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, NULL);
        if (source) {
            UIImage *image = [blockSelf imageWithSource:source];
            [blockSelf updateWithCapturedImage:image];
            CFRelease(source);
        }
    }
}];
Is there anything there that could cause the image taken to not include the flash?
I found I sometimes got dark images if the AVCaptureSession was set up immediately before this call. Perhaps it takes a while for the auto-exposure & white balance settings to adjust themselves.
The solution was to set up the AVCaptureSession, then wait until the AVCaptureDevice's adjustingExposure and adjustingWhiteBalance properties are both NO (observe these with KVO) before calling -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection: completionHandler:].
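A minimal sketch of that KVO approach, assuming self.device holds the active AVCaptureDevice and captureStillImage is a hypothetical wrapper around the capture call:
// After setting up the AVCaptureSession, watch the device settle before capturing.
- (void)waitForExposureThenCapture {
    [self.device addObserver:self forKeyPath:@"adjustingExposure"
                     options:NSKeyValueObservingOptionNew context:NULL];
    [self.device addObserver:self forKeyPath:@"adjustingWhiteBalance"
                     options:NSKeyValueObservingOptionNew context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (!self.device.adjustingExposure && !self.device.adjustingWhiteBalance) {
        // Both settled: safe to capture now.
        [self.device removeObserver:self forKeyPath:@"adjustingExposure"];
        [self.device removeObserver:self forKeyPath:@"adjustingWhiteBalance"];
        [self captureStillImage]; // hypothetical wrapper around captureStillImageAsynchronouslyFromConnection:
    }
}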