I want to use AVCaptureSession to capture images within my iOS App.
Based on the code from Apple's AVCam sample (https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html), I tried to show a preview image view every time an image is captured.
// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer)
    {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        [self processImage:[UIImage imageWithData:imageData]];
    }
}];
After capturing, processImage: is called:
- (void)processImage:(UIImage *)image
{
    [[self postButton] setEnabled:YES];
    [[self postButton] setHidden:NO];
    [[self cancelButton] setEnabled:YES];
    [[self cancelButton] setHidden:NO];
    _preview = [[UIImageView alloc] init];
    [_preview setImage:image];
    _preview.hidden = NO;
}
But the image view is still unchanged/empty when it is displayed.
Can someone help me along?
This code works for me:
AVCaptureConnection *connection = [_currentOutput connectionWithMediaType:AVMediaTypeVideo];
[self _setOrientationForConnection:connection];
[_captureOutputPhoto captureStillImageAsynchronouslyFromConnection:connection completionHandler:
    ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (!imageDataSampleBuffer) {
            DLog(@"failed to obtain image data sample buffer");
            // return delegate error
            return;
        }

        if (error) {
            if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
                [_delegate vision:self capturedPhoto:nil error:error];
            }
            return;
        }

        NSMutableDictionary *photoDict = [[NSMutableDictionary alloc] init];
        NSDictionary *metadata = nil;

        // add photo metadata (ie EXIF: Aperture, Brightness, Exposure, FocalLength, etc)
        metadata = (__bridge NSDictionary *)CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);
        if (metadata) {
            [photoDict setObject:metadata forKey:PBJVisionPhotoMetadataKey];
            CFRelease((__bridge CFTypeRef)(metadata));
        } else {
            DLog(@"failed to generate metadata for photo");
        }

        // add JPEG and image data
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        if (jpegData) {
            // add JPEG
            [photoDict setObject:jpegData forKey:PBJVisionPhotoJPEGKey];

            // add image
            UIImage *image = [self _uiimageFromJPEGData:jpegData];
            if (image) {
                [photoDict setObject:image forKey:PBJVisionPhotoImageKey];
            } else {
                DLog(@"failed to create image from JPEG");
                // TODO: return delegate on error
            }

            // add thumbnail
            UIImage *thumbnail = [self _thumbnailJPEGData:jpegData];
            if (thumbnail) {
                [photoDict setObject:thumbnail forKey:PBJVisionPhotoThumbnailKey];
            } else {
                DLog(@"failed to create a thumbnail");
                // TODO: return delegate on error
            }
        } else {
            DLog(@"failed to create jpeg still image data");
            // TODO: return delegate on error
        }

        if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
            [_delegate vision:self capturedPhoto:photoDict error:error];
        }

        // run a post-shot focus
        [self performSelector:@selector(_focus) withObject:nil afterDelay:0.5f];
}];
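The _uiimageFromJPEGData: and _thumbnailJPEGData: helpers aren't shown above. A minimal sketch of what they might look like follows (an assumption; PBJVision's real versions also handle orientation and scale):

- (UIImage *)_uiimageFromJPEGData:(NSData *)jpegData
{
    // Simplest possible decode; assumes orientation is handled elsewhere.
    return [UIImage imageWithData:jpegData];
}

- (UIImage *)_thumbnailJPEGData:(NSData *)jpegData
{
    UIImage *image = [UIImage imageWithData:jpegData];
    if (!image) return nil;

    // Assumed thumbnail width; height scaled to preserve the aspect ratio.
    CGFloat width = 160.0f;
    CGSize size = CGSizeMake(width, width * image.size.height / image.size.width);

    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return thumbnail;
}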
I bet the trouble here is that the completionHandler for captureStillImageAsynchronouslyFromConnection: isn't being called on the main thread. Try redispatching your call to the main thread:
dispatch_async(dispatch_get_main_queue(), ^{
    [self processImage:[UIImage imageWithData:imageData]];
});
I also notice that you're never adding _preview as a subview of anything.
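Putting both fixes together, processImage: might look like this (a sketch; it assumes self.view is the intended container for the preview):

- (void)processImage:(UIImage *)image
{
    // Assumes we're on the main thread (see the dispatch_async above).
    [[self postButton] setEnabled:YES];
    [[self postButton] setHidden:NO];
    [[self cancelButton] setEnabled:YES];
    [[self cancelButton] setHidden:NO];

    if (!_preview.superview) {
        _preview = [[UIImageView alloc] initWithFrame:self.view.bounds];
        _preview.contentMode = UIViewContentModeScaleAspectFit;
        [self.view addSubview:_preview]; // the missing step
    }
    [_preview setImage:image];
    _preview.hidden = NO;
}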
My app has a button that should take a screenshot of whatever is on the screen by calling takeSnapShot (see code below).
I have two views, one of which is a picker view that I use for the camera, so on the screen I see the image coming from the camera and, next to it, another view with images.
The thing is that I capture the screen, but it doesn't capture the image coming from the camera.
Also, I think I render the view's layer, but the debugger keeps saying:
"Snapshotting a view that has not been rendered results in an empty
snapshot. Ensure your view has been rendered at least once before
snapshotting or snapshot after screen updates."
Any ideas? Thanks!
This is the code:
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
UIImage *capturaPantalla;
-(void)takeSnapShot:(id)sender{
UIGraphicsBeginImageContext(picker.view.window.bounds.size);
[picker.view.layer renderInContext:UIGraphicsGetCurrentContext()];
capturaPantalla = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(capturaPantalla,nil,nil,nil);
}
I implemented a similar function before and used AVCaptureSession to do it.
// 1. Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];
// 2. Setup the preview view
[[self previewView] setSession:session];
// 3. Check for device authorization
[self checkDeviceAuthorizationStatus];
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
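Note that [[self previewView] setSession:session] assumes a view backed by an AVCaptureVideoPreviewLayer, roughly like the preview view in Apple's AVCam sample (a sketch; the class name is assumed):

@interface PreviewView : UIView
@property (nonatomic) AVCaptureSession *session;
@end

@implementation PreviewView

// Back this view with an AVCaptureVideoPreviewLayer instead of a plain CALayer.
+ (Class)layerClass
{
    return [AVCaptureVideoPreviewLayer class];
}

- (AVCaptureSession *)session
{
    return [(AVCaptureVideoPreviewLayer *)[self layer] session];
}

- (void)setSession:(AVCaptureSession *)session
{
    [(AVCaptureVideoPreviewLayer *)[self layer] setSession:session];
}

@end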
============================================
My countdown snapshot function:
- (void)takePhoto
{
    [timer1 invalidate];

    dispatch_async([self sessionQueue], ^{
        // 4. Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // 5. Flash set to Auto for Still Capture
        [iSKITACamViewController setFlashMode:flashMode forDevice:[[self videoDeviceInput] device]];

        AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
        CGFloat maxScale = stillImageConnection.videoMaxScaleAndCropFactor;
        if (effectiveScale > 1.0f && effectiveScale < maxScale)
        {
            stillImageConnection.videoScaleAndCropFactor = effectiveScale;
        }

        [self playSound];

        // 6. Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                if (iPhone5 && [self.modeLabel.text isEqualToString:@"PhotoMode"]) {
                    UIImage *image1 = [image crop:CGRectMake(0, 0, image.size.width, image.size.height)];
                    [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image1 CGImage] orientation:(ALAssetOrientation)[image1 imageOrientation] completionBlock:nil];
                } else {
                    [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
                }
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.priveImageView.image = image; // UIKit work on the main thread
                });
            }
        }];
    });
}
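timer1 is assumed to be the countdown NSTimer; its setup is not shown above, but it might be scheduled like this (purely an assumption about the missing code):

// Hypothetical countdown setup: fire takePhoto after a 3-second countdown.
timer1 = [NSTimer scheduledTimerWithTimeInterval:3.0
                                          target:self
                                        selector:@selector(takePhoto)
                                        userInfo:nil
                                         repeats:NO];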
I've been writing a camera app for iOS 8 that uses AVFoundation to set up and handle recording and saving (not UIImagePickerController). I'm trying to use the maxRecordedFileSize property of the AVCaptureMovieFileOutput class to allow the user to fill up all available space on the phone (minus a 250 MB buffer left for the system).
- (unsigned long long)availableFreespaceInMb
{
    unsigned long long freeSpace = 0;
    NSError *error = nil;
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSDictionary *dictionary = [[NSFileManager defaultManager] attributesOfFileSystemForPath:[paths lastObject] error:&error];

    if (dictionary) {
        NSNumber *fileSystemFreeSizeInBytes = [dictionary objectForKey:NSFileSystemFreeSize];
        freeSpace = [fileSystemFreeSizeInBytes unsignedLongLongValue];
    } else {
        NSLog(@"Error getting free space");
        // Handle error
    }

    // convert to MB
    freeSpace = (freeSpace / 1024ll) / 1024ll;
    freeSpace -= _recordSpaceBufferInMb; // 250 MB

    NSLog(@"Remaining space in MB: %llu", freeSpace);
    NSLog(@" Diff Since Last: %llu", (_prevRemSpaceMb - freeSpace));
    _prevRemSpaceMb = freeSpace;
    return freeSpace;
}
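As an aside, AVCaptureFileOutput also exposes a minFreeDiskSpaceLimit property, which expresses the same reserve directly in bytes; a sketch using the same 250 MB buffer from above:

// Let AVFoundation stop recording when free space drops to the buffer,
// instead of computing maxRecordedFileSize by hand.
_movieFileOutput.minFreeDiskSpaceLimit = _recordSpaceBufferInMb * 1024 * 1024;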
AVErrorMaximumFileSizeReached is thrown when the available space (minus the buffer) is reduced to zero, and no save error is thrown, but the video does not appear in the camera roll and is not saved. When I set the maxRecordedDuration property instead, AVErrorMaximumDurationReached is thrown and the video DOES save. I calculate the max time from the max size, but I always have plenty of space left due to frame compression.
- (void)toggleMovieRecording
{
    double factor = 1.0;
    if (_currentFramerate == _slowFPS) {
        factor = _slowMotionFactor;
    }
    double availableRecordTimeInSeconds = [self remainingRecordTimeInSeconds] / factor;
    unsigned long long availableSpaceInMb = [self availableFreespaceInMb];
    unsigned long long remainingSpace = availableSpaceInMb * 1024 * 1024;

    if (![[self movieFileOutput] isRecording]) {
        if (availableSpaceInMb < 50) {
            NSLog(@"TMR:Not enough space, can't record");
            [AVViewController currentVideoOrientation];
            [_previewView memoryAlert];
            return;
        }
    }

    if (![self enableRecording]) {
        return;
    }

    [[self recordButton] setEnabled:NO];

    dispatch_async([self sessionQueue], ^{
        if (![[self movieFileOutput] isRecording])
        {
            if ([[UIDevice currentDevice] isMultitaskingSupported])
            {
                [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
            }

            // Update the orientation on the movie file output video connection before starting recording.
            [[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[AVViewController currentVideoOrientation]]; // alternative: [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]

            // Start recording to a temporary file.
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]];

            // Is there already a file like this?
            NSFileManager *fileManager = [NSFileManager defaultManager];
            if ([fileManager fileExistsAtPath:outputFilePath]) {
                NSLog(@"filexists");
                NSError *err;
                if ([fileManager removeItemAtPath:outputFilePath error:&err] == NO) {
                    NSLog(@"Error, file exists at path");
                }
            }

            [_previewView startRecording];

            // Set the movie file output to stop recording a bit before the phone is full
            [_movieFileOutput setMaxRecordedFileSize:remainingSpace]; // Less than the total remaining space
            // [_movieFileOutput setMaxRecordedDuration:CMTimeMake(availableRecordTimeInSeconds, 1.0)];
            [_movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
        }
        else
        {
            [_previewView stopRecording];
            [[self movieFileOutput] stopRecording];
        }
    });
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"AVViewController: didFinishRecordingToOutputFile");
    if (error) {
        NSLog(@"%@", error);
        NSLog(@"Caught Error");
        if ([error code] == AVErrorDiskFull) {
            NSLog(@"Caught disk full error");
        } else if ([error code] == AVErrorMaximumFileSizeReached) {
            NSLog(@"Caught max file size error");
        } else if ([error code] == AVErrorMaximumDurationReached) {
            NSLog(@"Caught max duration error");
        } else {
            NSLog(@"Caught other error");
        }
        [self remainingRecordTimeInSeconds];
        dispatch_async(dispatch_get_main_queue(), ^{
            [_previewView stopRecording];
            [_previewView memoryAlert];
        });
    }

    // Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO — which happens sometime after this method returns.
    UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
    [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"%@", error);
            NSLog(@"Error during write");
        } else {
            NSLog(@"Writing to photos album");
        }
        [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];

        if (backgroundRecordingID != UIBackgroundTaskInvalid)
            [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
    }];

    if (error) {
        [_session stopRunning];
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 1.0 * NSEC_PER_SEC), _sessionQueue, ^{
            [_session startRunning];
        });
    }
}
"Writing to photos album" appears when both errors are thrown. I'm completely stumped by this. Any iOS insights?
The code sample you provided is hard to test since there are properties and methods missing. Although I am not able to compile your code, there are definitely some red flags that could be causing the issue. The issues below were found inside of:
captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:
Issue 1: the method handles the passed-in error but then continues executing. Instead it should be:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error) {
        // handle error, then bail
        return;
    }
    // continue on
}
Issue 2: The ALAssetsLibrary object that you are instantiating isn't stored to a property, so once captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: finishes the object will be released (potentially never firing your completion block). Instead it should be:
// hold onto the assets library beyond this scope
self.assetsLibrary = [[ALAssetsLibrary alloc] init];

// get a weak reference to self for later removal of the assets library
__weak typeof(self) weakSelf = self;
[self.assetsLibrary writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
    // handle error
    // handle cleanup
    // clean up the new property
    weakSelf.assetsLibrary = nil;
}];
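The backing property could be declared in a class extension, something like this (class name taken from your logs; the property name is assumed):

@interface AVViewController ()
// Keeps the assets library alive until its completion block fires.
@property (nonatomic, strong) ALAssetsLibrary *assetsLibrary;
@end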
If fixing these issues doesn't fix the problem, please provide the missing code to make your sample compile.
I have a problem with displaying an image in my UIImageView.
This is how I get images in ShareViewController:
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];

    for (NSItemProvider *itemProvider in ((NSExtensionItem *)self.extensionContext.inputItems[0]).attachments)
    {
        if ([itemProvider hasItemConformingToTypeIdentifier:@"public.image"])
        {
            ++counter;
            [itemProvider loadItemForTypeIdentifier:@"public.image" options:nil completionHandler:
                ^(id<NSSecureCoding> item, NSError *error)
                {
                    UIImage *sharedImage = nil;
                    if ([(NSObject *)item isKindOfClass:[NSURL class]])
                    {
                        sharedImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:(NSURL *)item]];
                    }
                    if ([(NSObject *)item isKindOfClass:[UIImage class]])
                    {
                        sharedImage = (UIImage *)item;
                    }
                    if ([(NSObject *)item isKindOfClass:[NSData class]])
                    {
                        sharedImage = [UIImage imageWithData:(NSData *)item];
                    }
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"kNotificationDidLoadItem" object:sharedImage];
                }];
        }
    }
}
This is how I receive the notification in DetailsViewController:
- (void)didLoadSharedImage:(NSNotification *)notification
{
    UIImage *sharedImage = [notification object];
    [self.sharedImages addObject:sharedImage];
    if (!self.theImageView.image) {
        self.theImageView.image = sharedImage;
    }
}
So there is no visible image after this method runs. I debugged it and saw that the image is there, but strangely I only see it when I push a new view controller from DetailsViewController and then pop back. Only then does the UIImageView seem to refresh itself.
Did you try executing didLoadSharedImage on the main (UI) thread? Like this:
dispatch_async(dispatch_get_main_queue(), ^{
    if (!self.theImageView.image) {
        self.theImageView.image = sharedImage;
    }
});
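Folded into the notification handler, a minimal sketch:

- (void)didLoadSharedImage:(NSNotification *)notification
{
    UIImage *sharedImage = [notification object];
    [self.sharedImages addObject:sharedImage];

    // The UIImageView must be touched on the main thread; the notification is
    // posted from loadItemForTypeIdentifier's background completion handler.
    dispatch_async(dispatch_get_main_queue(), ^{
        if (!self.theImageView.image) {
            self.theImageView.image = sharedImage;
        }
    });
}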
Good luck
What is the AVFoundation equivalent of this UIImagePicker-enabled method? My app uses AVFoundation, and I want to nix UIImagePicker entirely. This method comes after a UIImagePicker class is set up in the program. It sets a hypothetical image a user takes (physically using the iPhone camera interface) to an IBOutlet named photo, declared earlier in the app. Excerpt from http://www.raywenderlich.com/13541/how-to-create-an-app-like-instagram-with-a-web-service-backend-part-22 (download at the bottom):
#pragma mark - Image picker delegate methods
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // Resize the image from the camera
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height) interpolationQuality:kCGInterpolationHigh];
    // Crop the image to a square (yikes, fancy!)
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width)/2, (scaledImage.size.height - photo.frame.size.height)/2, photo.frame.size.width, photo.frame.size.height)];
    // Show the photo on the screen
    photo.image = croppedImage;
    [picker dismissModalViewControllerAnimated:NO];
}
// Successful AVCam snap I want to use. Uses AVFoundation.
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [PhotoScreen setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
            }
        }];
    });
}
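To mirror the picker delegate's behavior, the completion handler above could feed the captured image through the same resize/crop path (a sketch; it assumes the same UIImage categories and photo outlet from the tutorial, and dispatches the UIKit work back to the main thread):

// Inside the completion handler, after imageData is obtained:
UIImage *image = [[UIImage alloc] initWithData:imageData];
UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill
                                                   bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height)
                                     interpolationQuality:kCGInterpolationHigh];
UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2,
                                                             (scaledImage.size.height - photo.frame.size.height) / 2,
                                                             photo.frame.size.width,
                                                             photo.frame.size.height)];
dispatch_async(dispatch_get_main_queue(), ^{
    photo.image = croppedImage; // show the photo on screen, as the picker delegate did
});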
Note: the AVCam method is part of an AVCaptureSession, which Apple describes as the class that coordinates all data flow for media capture using AVFoundation. The full official AVCam sample from Apple is here: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
I happened to stumble upon this excerpt located in AVCamViewController. I noticed it looks a lot like the method in question. Can someone help verify its suitability?
#pragma mark File Output Delegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error)
        NSLog(@"%@", error);

    [self setLockInterfaceRotation:NO];

    // Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO — which happens sometime after this method returns.
    UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
    [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error)
            NSLog(@"%@", error);

        [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];

        if (backgroundRecordingID != UIBackgroundTaskInvalid)
            [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
    }];
}
Hi, I am saving an image in an NSObject subclass called CaptureManager. In the .h I declare the UIImage property:
@property (nonatomic, strong) UIImage *captureImage;
and then in the .m I set the image:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            self.captureImage = [[UIImage alloc] initWithData:imageData];
            [[NSNotificationCenter defaultCenter] postNotificationName:@"photoTaken" object:nil userInfo:nil];
        }
        if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
            [[self delegate] captureManagerStillImageCaptured:self];
        }
    }];
}
Now in my ViewController I am trying to display that image like so:
CaptureManager *captureManager = [[CaptureManager alloc] init];
UIImage *captureBackgroundImage = captureManager.captureImage;
UIImageView *captureImageView = [[UIImageView alloc]initWithFrame:CGRectMake(0, 0, 320, 480)];
[captureImageView setImage:captureBackgroundImage];
[self.view addSubview:captureImageView];
[self.view bringSubviewToFront:captureImageView];
Did I do everything correctly? Is the problem where I set captureImage? Could the completion handler in captureStillImage be messing this up? Nothing is getting displayed on the screen. Any help would be really appreciated!
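For what it's worth, two things stand out: captureStillImage completes asynchronously, so captureManager.captureImage is still nil at the moment it is read, and the view controller allocates a brand-new CaptureManager rather than using the instance that actually took the photo. A sketch of one way around both, using the photoTaken notification the manager already posts (self.captureManager and self.captureImageView as properties are assumptions):

// In the view controller: keep a reference to the manager and wait for its
// notification instead of reading captureImage immediately after init.
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.captureManager = [[CaptureManager alloc] init];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(photoTaken:)
                                                 name:@"photoTaken"
                                               object:nil];
}

- (void)photoTaken:(NSNotification *)notification
{
    // Read the image from the manager that captured it, on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.captureImageView.image = self.captureManager.captureImage;
    });
}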