What is the AVFoundation equivalent of this UIImagePicker delegate method? My app uses AVFoundation and I want to nix UIImagePicker entirely. This method comes after a UIImagePickerController is set up in the program. It sets a hypothetical image a user takes (physically, using the iPhone camera interface) to an IBOutlet named photo, declared earlier in the app. Excerpt from http://www.raywenderlich.com/13541/how-to-create-an-app-like-instagram-with-a-web-service-backend-part-22 (download at the bottom):
#pragma mark - Image picker delegate methods
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Resize the image from the camera
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height) interpolationQuality:kCGInterpolationHigh];
    // Crop the image to a square (yikes, fancy!)
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width)/2, (scaledImage.size.height - photo.frame.size.height)/2, photo.frame.size.width, photo.frame.size.height)];
    // Show the photo on the screen
    photo.image = croppedImage;
    // dismissModalViewControllerAnimated: is deprecated; use the completion-based API
    [picker dismissViewControllerAnimated:NO completion:nil];
}
// Successful AVCam snap I want to use. Uses AVFoundation.
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [PhotoScreen setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
            }
        }];
    });
}
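To bridge the two: the resize/crop/display logic from the picker delegate can move into the completion handler of snapStillImage. A minimal sketch, assuming the same photo outlet and the same UIImage resize/crop categories from the tutorial are available:

```objc
// Sketch: goes inside the captureStillImageAsynchronously completion handler,
// right after `image` is created. `photo` and the resize/crop categories are
// assumptions carried over from the UIImagePicker code above.
dispatch_async(dispatch_get_main_queue(), ^{
    // UIKit must be touched on the main queue, not the session queue.
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill
                                                       bounds:photo.frame.size
                                         interpolationQuality:kCGInterpolationHigh];
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2,
                                                                 (scaledImage.size.height - photo.frame.size.height) / 2,
                                                                 photo.frame.size.width,
                                                                 photo.frame.size.height)];
    photo.image = croppedImage;
});
```

This replaces what the picker delegate did for you: instead of receiving the image via UIImagePickerControllerOriginalImage, you decode it from the JPEG sample buffer and then apply the exact same scaling and cropping.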
Note: the AVCam method is part of an AVCaptureSession, which Apple describes as the class that coordinates all data flow for media capture using AVFoundation. The full official AVCam sample from Apple is here: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
I happened to stumble upon this excerpt in AVCamViewController. I noticed it looks a lot like the method in question. Can someone help verify whether it fits this use case?
#pragma mark File Output Delegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error)
        NSLog(@"%@", error);

    [self setLockInterfaceRotation:NO];

    // Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO — which happens sometime after this method returns.
    UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
    [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error)
            NSLog(@"%@", error);

        [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];

        if (backgroundRecordingID != UIBackgroundTaskInvalid)
            [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
    }];
}
Related
My app has a button that should take a screenshot of whatever is on the screen by calling "takeSnapShot" (see code below).
I have two views and one of them is a picker view that I use for the camera, so in the screen I see the image coming from the camera and next to it another view with images.
The thing is that I capture the screen but it doesn't capture the image coming from the camera.
Also, I think I render the view.layer, but the debugger keeps saying:
"Snapshotting a view that has not been rendered results in an empty
snapshot. Ensure your view has been rendered at least once before
snapshotting or snapshot after screen updates."
Any ideas? Thanks!
This is the code:
#import "ViewController.h"

@interface ViewController ()
@end

@implementation ViewController

UIImage *capturaPantalla;

-(void)takeSnapShot:(id)sender{
    UIGraphicsBeginImageContext(picker.view.window.bounds.size);
    [picker.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    capturaPantalla = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(capturaPantalla,nil,nil,nil);
}
I implemented a similar function before, using AVCaptureSession.
// 1. Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];
// 2. Setup the preview view
[[self previewView] setSession:session];
// 3. Check for device authorization
[self checkDeviceAuthorizationStatus];
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
============================================
My countdown snapshot function:
- (void) takePhoto
{
[timer1 invalidate];
dispatch_async([self sessionQueue], ^{
// 4. Update the orientation on the still image output video connection before capturing.
[[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
// 5. Flash set to Auto for Still Capture
[iSKITACamViewController setFlashMode:flashMode forDevice:[[self videoDeviceInput] device]];
AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
CGFloat maxScale = stillImageConnection.videoMaxScaleAndCropFactor;
if (effectiveScale > 1.0f && effectiveScale < maxScale)
{
stillImageConnection.videoScaleAndCropFactor = effectiveScale;
}
[self playSound];
// 6.Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
if(iPhone5 && [self.modeLabel.text isEqualToString:@"PhotoMode"]) {
UIImage *image1 = [image crop:CGRectMake(0, 0, image.size.width, image.size.height)];
[[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image1 CGImage] orientation:(ALAssetOrientation)[image1 imageOrientation] completionBlock:nil];
} else {
[[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
}
self.priveImageView.image = image;
}
}];
});
}
I am using AVCam, made by Apple, for my custom camera view. Honestly, it is not so simple to understand what's going on in the AVCamViewController class when you see it for the first time.
Right now I am interested in how they set the frame of the captured image. I tried to find some frame setters or anything like that, but I have not found any.
I searched on Google and found an answer here: AVCam not in fullscreen
But when I implemented that solution, I realised it only made the live camera preview layer the same size as my view; when the app saves the image in the method - (IBAction)snapStillImage:(id)sender, the images in the gallery still have two stripes, on the left and on the right.
My question is: how can I remove these stripes, or in which line of the source code does Apple set this up?
Also, as an additional sub-question: how can I configure capture for photos only? The app requests microphone permission and I don't need it; I just need to take a photo, and that's it.
This code from apple sources will save image to the photo library.
- (IBAction)snapStillImage:(id)sender
{
dispatch_async([self sessionQueue], ^{
// Update the orientation on the still image output video connection before capturing.
[[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
// Flash set to Auto for Still Capture
[AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];
// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
UIImage *proccessingImage = [SAPAdjustImageHelper adjustImage:image];
NSNumber *email_id = [SAPCoreDataEmailHelper currentEmailId];
[SAPFileManagerHelper addImage:proccessingImage toFolderWithEmailId:email_id];
}
}];
});
}
Are you setting the session preset?
You can use your session with the session preset AVCaptureSessionPresetPhoto.
For the other sub-question: you need to add only the AVCaptureStillImageOutput output.
How to set the session Preset?
[session setSessionPreset:AVCaptureSessionPresetPhoto];
How to configure the session to use only a still image output to take photos?
- (void)viewDidLoad
{
[super viewDidLoad];
// Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];
// Setup the preview view
[[self previewView] setSession:session];
// Setup the session Preset
[session setSessionPreset:AVCaptureSessionPresetPhoto];
// Check for device authorization
[self checkDeviceAuthorizationStatus];
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
dispatch_async(sessionQueue, ^{
//[self setBackgroundRecordingID:UIBackgroundTaskInvalid];
NSError *error = nil;
AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error)
{
NSLog(@"%@", error);
}
if ([session canAddInput:videoDeviceInput])
{
[session addInput:videoDeviceInput];
[self setVideoDeviceInput:videoDeviceInput];
dispatch_async(dispatch_get_main_queue(), ^{
// Why are we dispatching this to the main queue?
// Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
// Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
});
}
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:stillImageOutput])
{
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
[session addOutput:stillImageOutput];
[self setStillImageOutput:stillImageOutput];
}
});
}
The black bars are needed to keep the aspect ratio. If you want to avoid them, you should change the videoGravity of the AVCaptureVideoPreviewLayer, or the contentMode of the UIImageView displaying the captured image.
This is expected behavior, so I don't quite understand the concern.
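A sketch of both options (previewView backed by an AVCaptureVideoPreviewLayer, as in AVCam; imageView is a hypothetical UIImageView showing the capture):

```objc
// Option 1: let the live preview fill its bounds, cropping the overflow
AVCaptureVideoPreviewLayer *previewLayer =
    (AVCaptureVideoPreviewLayer *)[[self previewView] layer];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

// Option 2: let the image view fill its bounds with the captured image
imageView.contentMode = UIViewContentModeScaleAspectFill;
imageView.clipsToBounds = YES; // crop whatever overflows the bounds
```

Note that the file saved to the gallery still has the sensor's native aspect ratio; changing how it is displayed does not change the saved image, so removing the stripes from the saved file would require cropping the UIImage before writing it.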
I want to use AVCaptureSession to capture images within my iOS App.
Based on the code from Apple's AVCam sample (https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html), I tried to show a preview ImageView every time an image is captured.
// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
[self processImage:[UIImage imageWithData:imageData]];
}
}];
After capturing, processImage is called:
- (void) processImage:(UIImage *)image
{
[[self postButton] setEnabled:YES];
[[self postButton] setHidden:NO];
[[self cancelButton] setEnabled:YES];
[[self cancelButton] setHidden:NO];
_preview = [[UIImageView alloc] init];
[_preview setImage:image];
_preview.hidden = NO;
}
But the ImageView is still unchanged/empty when it is displayed.
Can someone help me along?
This code works for me
AVCaptureConnection *connection = [_currentOutput connectionWithMediaType:AVMediaTypeVideo];
[self _setOrientationForConnection:connection];
[_captureOutputPhoto captureStillImageAsynchronouslyFromConnection:connection completionHandler:
^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (!imageDataSampleBuffer) {
DLog(@"failed to obtain image data sample buffer");
// return delegate error
return;
}
if (error) {
if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
[_delegate vision:self capturedPhoto:nil error:error];
}
return;
}
NSMutableDictionary *photoDict = [[NSMutableDictionary alloc] init];
NSDictionary *metadata = nil;
// add photo metadata (ie EXIF: Aperture, Brightness, Exposure, FocalLength, etc)
metadata = (__bridge NSDictionary *)CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);
if (metadata) {
[photoDict setObject:metadata forKey:PBJVisionPhotoMetadataKey];
CFRelease((__bridge CFTypeRef)(metadata));
} else {
DLog(@"failed to generate metadata for photo");
}
// add JPEG and image data
NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
if (jpegData) {
// add JPEG
[photoDict setObject:jpegData forKey:PBJVisionPhotoJPEGKey];
// add image
UIImage *image = [self _uiimageFromJPEGData:jpegData];
if (image) {
[photoDict setObject:image forKey:PBJVisionPhotoImageKey];
} else {
DLog(@"failed to create image from JPEG");
// TODO: return delegate on error
}
// add thumbnail
UIImage *thumbnail = [self _thumbnailJPEGData:jpegData];
if (thumbnail) {
[photoDict setObject:thumbnail forKey:PBJVisionPhotoThumbnailKey];
} else {
DLog(@"failed to create a thumbnail");
// TODO: return delegate on error
}
} else {
DLog(@"failed to create jpeg still image data");
// TODO: return delegate on error
}
if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
[_delegate vision:self capturedPhoto:photoDict error:error];
}
// run a post shot focus
[self performSelector:@selector(_focus) withObject:nil afterDelay:0.5f];
}];
I bet the trouble here is that the completionHandler for captureStillImageAsynchronouslyFromConnection isn't being called on the main thread. Try redispatching your call to the main thread:
dispatch_async(dispatch_get_main_queue(), ^{
[self processImage:[UIImage imageWithData:imageData]];
});
I also notice that you're never adding _preview as a subview of anything.
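A minimal sketch of both fixes combined (the frame and the container view here are illustrative assumptions):

```objc
// Inside the capture completion handler, after imageData is created.
dispatch_async(dispatch_get_main_queue(), ^{
    UIImage *image = [UIImage imageWithData:imageData];
    _preview = [[UIImageView alloc] initWithFrame:self.view.bounds]; // give it a frame
    [_preview setImage:image];
    [self.view addSubview:_preview]; // without this, the image view never appears on screen
});
```

An image view allocated with plain init also has a zero-sized frame, so even once added as a subview it would show nothing until it gets a real frame or constraints.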
I've built my own custom camera class using AVCaptureSession and a lot of the code from Apple's AVCam demo app. Basically, the function I use to capture my image is verbatim from Apple's app; however, when I snap a picture, I get a "Received memory warning" in my console. It doesn't happen every time, but almost always on the first picture. This is the code that is my problem:
- (void)capImage { //method to capture image from AVCaptureSession video feed
dispatch_async([self sessionQueue], ^{
// Update the orientation on the still image output video connection before capturing.
UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
[[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:(AVCaptureVideoOrientation)orientation];
// Flash set to Auto for Still Capture
//[AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];
// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
[self processImage:[UIImage imageWithData:imageData]];
}
}];
});
}
Any ideas why this would be happening?
I'm using jpegStillImageNSDataRepresentation to capture a still image from the front-facing camera on an iOS device (iOS 5.0).
I'm using an AVCaptureVideoPreviewLayer to display the output of the front-facing camera prior to taking the image. By default, the display is mirrored, and I do not adjust this.
The still image that is captured is not mirrored (i.e. it is flipped about the vertical axis compared with what is displayed by the preview layer). I want it to be mirrored. Any ideas how to accomplish that?
Here is the relevant code:
- (void) captureStillImage {
AVCaptureConnection *stillImageConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
UIDeviceOrientation curDeviceOrientation = [[UIDevice currentDevice] orientation];
AVCaptureVideoOrientation avcaptureOrientation = [self avOrientationForDeviceOrientation:curDeviceOrientation];
[stillImageConnection setVideoOrientation:avcaptureOrientation];
[stillImageOutput setOutputSettings:[NSDictionary dictionaryWithObject:AVVideoCodecJPEG
forKey:AVVideoCodecKey]];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (!error) {
NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [UIImage imageWithData:jpegData];
[self showImage:image];
}
}];
}
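One way to approach this (a sketch, not verified against this exact setup): ask the still image connection itself to mirror its output before capturing, where mirroring is supported:

```objc
// Sketch: configure the connection before calling
// captureStillImageAsynchronouslyFromConnection:. Uses the same
// stillImageOutput as the code above.
AVCaptureConnection *stillImageConnection =
    [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if ([stillImageConnection isVideoMirroringSupported]) {
    [stillImageConnection setVideoMirrored:YES]; // match the mirrored preview
}
```

Alternatively, the captured UIImage could be flipped after the fact by redrawing it into a graphics context with a horizontal flip transform.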