Hi, I am saving an image in an NSObject subclass called CaptureManager. In the .h I declare the UIImage property:
@property (nonatomic, strong) UIImage *captureImage;
and then in the .m I set the image:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            captureImage = [[UIImage alloc] initWithData:imageData];
            [[NSNotificationCenter defaultCenter] postNotificationName:@"photoTaken" object:nil userInfo:nil];
        }
        if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
            [[self delegate] captureManagerStillImageCaptured:self];
        }
    }];
}
Now in my ViewController I am trying to display that image like so:
CaptureManager *captureManager = [[CaptureManager alloc] init];
UIImage *captureBackgroundImage = captureManager.captureImage;
UIImageView *captureImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
[captureImageView setImage:captureBackgroundImage];
[self.view addSubview:captureImageView];
[self.view bringSubviewToFront:captureImageView];
Did I do everything correctly? Is the problem where I set captureImage? Could the completion handler in the captureStillImage method be messing this up? Nothing is getting displayed on the screen. Any help would be really appreciated!
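For reference, a minimal sketch of one way to make this work: the capture completes asynchronously, so the view controller has to wait for the @"photoTaken" notification (or the delegate callback) before reading captureImage, and it must read it from the same CaptureManager instance that performed the capture, not a freshly alloc/init'd one. This assumes captureManager is kept in a property; the photoTaken: handler name is just for illustration:

// e.g. in viewDidLoad: keep the manager around and observe the notification
self.captureManager = [[CaptureManager alloc] init];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(photoTaken:)
                                             name:@"photoTaken"
                                           object:nil];

- (void)photoTaken:(NSNotification *)note
{
    // The notification is posted from the capture completion handler,
    // which may not be on the main thread, so hop back for UIKit work.
    dispatch_async(dispatch_get_main_queue(), ^{
        UIImageView *captureImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
        captureImageView.image = self.captureManager.captureImage;
        [self.view addSubview:captureImageView];
    });
}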
My app has a button that should take a screenshot of whatever is on the screen by calling "takeSnapShot" (see code below).
I have two views, and one of them is a picker view that I use for the camera, so on the screen I see the image coming from the camera and, next to it, another view with images.
The thing is that I capture the screen but it doesn't capture the image coming from the camera.
Also, I think I render the view.layer, but the debugger keeps saying:
"Snapshotting a view that has not been rendered results in an empty
snapshot. Ensure your view has been rendered at least once before
snapshotting or snapshot after screen updates."
Any ideas? Thanks!
This is the code:
#import "ViewController.h"
@interface ViewController ()

@end

@implementation ViewController
UIImage *capturaPantalla;
- (void)takeSnapShot:(id)sender
{
    UIGraphicsBeginImageContext(picker.view.window.bounds.size);
    [picker.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    capturaPantalla = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(capturaPantalla, nil, nil, nil);
}
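As an aside, the quoted warning usually points at renderInContext: drawing a layer that has never been composited on screen. A sketch of the snapshot API that does wait for rendering, assuming iOS 7+ (note that it still will not capture an AVFoundation preview layer, which is composited outside UIKit):

- (void)takeSnapShot:(id)sender
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.0f);
    // Unlike renderInContext:, this draws the view as it appears on screen,
    // after pending screen updates have been applied.
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(snapshot, nil, nil, nil);
}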
I implemented a similar function before, using AVCaptureSession.
// 1. Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];
// 2. Setup the preview view
[[self previewView] setSession:session];
// 3. Check for device authorization
[self checkDeviceAuthorizationStatus];
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
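The snippet above references self.stillImageOutput and self.videoDeviceInput without showing their creation; a minimal sketch of the missing setup, assuming the same property names used by Apple's AVCam sample (setVideoDeviceInput:, setStillImageOutput:):

// 3a. Add the camera as an input.
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if ([session canAddInput:videoDeviceInput]) {
    [session addInput:videoDeviceInput];
    [self setVideoDeviceInput:videoDeviceInput];
}

// 3b. Add a still image output so captureStillImageAsynchronouslyFromConnection: has something to talk to.
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:stillImageOutput]) {
    [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
    [session addOutput:stillImageOutput];
    [self setStillImageOutput:stillImageOutput];
}

[session startRunning];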
============================================
My countdown snapshot function:
- (void)takePhoto
{
    [timer1 invalidate];
    dispatch_async([self sessionQueue], ^{
        // 4. Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // 5. Flash set to Auto for Still Capture
        [iSKITACamViewController setFlashMode:flashMode forDevice:[[self videoDeviceInput] device]];

        AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
        CGFloat maxScale = stillImageConnection.videoMaxScaleAndCropFactor;
        if (effectiveScale > 1.0f && effectiveScale < maxScale)
        {
            stillImageConnection.videoScaleAndCropFactor = effectiveScale;
        }

        [self playSound];

        // 6. Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                if (iPhone5 && [self.modeLabel.text isEqualToString:@"PhotoMode"]) {
                    UIImage *image1 = [image crop:CGRectMake(0, 0, image.size.width, image.size.height)];
                    [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image1 CGImage] orientation:(ALAssetOrientation)[image1 imageOrientation] completionBlock:nil];
                } else {
                    [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
                }
                // UIKit work belongs on the main queue; the completion handler runs on an arbitrary one.
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.priveImageView.image = image;
                });
            }
        }];
    });
}
So I'm pulling down about 50 images from my API using NSURLConnection. It's working great, except that it locks up the UI while it runs. I'm assuming that's because I'm updating the UI in real time from the NSURLConnection delegate. So I'm thinking what I need to do is put placeholder loading images in the UIImageViews, then update them once the delegate has acquired all the data. But how do I do that? Can someone give me some code examples?
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    // The request is complete and data has been received
    // You can parse the stuff in your instance variable now
    NSData *imageData = _dataDictionary[[connection description]];
    if (imageData != nil)
    {
        NSLog(@"%@%@", [connection description], imageData);
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(self.x, 0, self.screenWidth, self.screenHight)];
        // Process the image
        // resize the resulting image for this device
        UIImage *resizedImage = [self imageScaleCropToSize:[UIImage imageWithData:imageData]];
        self.x = (self.x + imageView.frame.size.width);
        if (self.x > self.view.frame.size.width) {
            self.scrollView.contentSize = CGSizeMake(self.x, self.scrollView.frame.size.height);
        }
        [imageView setImage:resizedImage];
        // add the image
        [self.scrollView addSubview:imageView];
    }
}
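For reference, a minimal sketch of the placeholder idea described in the question, written as a drop-in for the body of the if (imageData != nil) branch above. It assumes a hypothetical "placeholder" image in the bundle and the same imageScaleCropToSize: helper; the point is to put the image view up immediately and push the decode/resize work off the main thread:

// Put the image view up immediately with a placeholder...
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(self.x, 0, self.screenWidth, self.screenHight)];
imageView.image = [UIImage imageNamed:@"placeholder"]; // hypothetical bundle asset
[self.scrollView addSubview:imageView];
self.x = (self.x + imageView.frame.size.width);
if (self.x > self.view.frame.size.width) {
    self.scrollView.contentSize = CGSizeMake(self.x, self.scrollView.frame.size.height);
}

// ...then decode and resize off the main thread and swap the real image in.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *resizedImage = [self imageScaleCropToSize:[UIImage imageWithData:imageData]];
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = resizedImage;
    });
});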
You can use the SDWebImage library to achieve this.
Suppose imageArray holds all the image URL paths.
You can use SDWebImageManager to download all the images and show them in an image view. You can also report download progress from the progress block.
- (void)showImages:(NSArray *)imageArray
{
    SDWebImageManager *manager = [SDWebImageManager sharedManager];
    for (NSString *imagePath in imageArray)
    {
        [manager downloadImageWithURL:[NSURL URLWithString:imagePath]
                              options:SDWebImageLowPriority
                             progress:^(NSInteger receivedSize, NSInteger expectedSize) {}
                            completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL)
        {
            if (!error) {
                self.imgView_Image.image = image;
            }
            else
            {
                UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"No Internet Connection" message:@"Please check your connection and try again" delegate:nil cancelButtonTitle:@"Cancel" otherButtonTitles:nil];
                [alert show];
            }
        }];
    }
}
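If all you need is the placeholder behaviour from the question, the UIImageView category that ships with the same library is arguably a better fit; a sketch, assuming SDWebImage's UIImageView+WebCache.h and a hypothetical placeholder asset:

#import "UIImageView+WebCache.h"

// One call per image view: shows the placeholder immediately and
// swaps in the downloaded image (cached on subsequent loads).
[imageView sd_setImageWithURL:[NSURL URLWithString:imagePath]
             placeholderImage:[UIImage imageNamed:@"placeholder"]];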
First, create a protocol in the .h of the class that makes the NSURLConnection request to download the image (the class where connectionDidFinishLoading is implemented):
@protocol YourClassNameDelegate <NSObject>
- (void)didFinishLoadingImage:(UIImage *)downloadImage;
@end
and create a property for that delegate in the same class:
@interface YourViewController : UIViewController
// assign (or weak under ARC) rather than retain, to avoid a retain cycle
@property (nonatomic, assign) id<YourClassNameDelegate> delegate;
@end
then synthesize it in the .m: @synthesize delegate;
After that, call didFinishLoadingImage: from within connectionDidFinishLoading:
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    // The request is complete and data has been received
    // You can parse the stuff in your instance variable now
    NSData *imageData = _dataDictionary[[connection description]];
    if (imageData != nil)
    {
        NSLog(@"%@%@", [connection description], imageData);
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(self.x, 0, self.screenWidth, self.screenHight)];
        // Process the image
        // resize the resulting image for this device
        UIImage *resizedImage = [self imageScaleCropToSize:[UIImage imageWithData:imageData]];
        self.x = (self.x + imageView.frame.size.width);
        if (self.x > self.view.frame.size.width) {
            self.scrollView.contentSize = CGSizeMake(self.x, self.scrollView.frame.size.height);
        }
        [self.delegate didFinishLoadingImage:resizedImage];
        [imageView setImage:resizedImage];
        // add the image
        [self.scrollView addSubview:imageView];
    }
}
and finally, from wherever you push to YourViewController, set the delegate to self, like:
YourViewController *controller = [[YourViewController alloc] init];
controller.delegate = self;
//.....
In the .m of the class where you want to set the downloaded image, implement this method:
#pragma mark - YourClassName delegate method
- (void)didFinishLoadingImage:(UIImage *)downloadImage
{
    //yourImageView.image = downloadImage;
}
I'm working on an app that allows the user to record video in real time while some HUD info is displayed in a preview layer.
I've got a lot of this worked out, but one problem I have is that the video itself is rotated 90 degrees when the iPad is in landscape mode. When the home button is on the left, the video is rotated 90 degrees CCW, and 90 degrees CW when the home button is on the right. The video should be upright regardless of which landscape mode I'm using. This SO post seems to tally pretty closely with what I'm dealing with, but its solution relies on deprecated APIs.
In my main controller:
[self setCaptureManager:[[[CaptureSessionManager alloc] init] autorelease]];
[[self captureManager] addVideoInput];
[[self captureManager] addVideoPreviewLayer];
CGRect layerRect = [[[self view] layer] bounds];
[[[self captureManager] previewLayer] setBounds:layerRect];
[[[self captureManager] previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
CGRectGetMidY(layerRect))];
[[[self view] layer] addSublayer:[[self captureManager] previewLayer]];
CaptureSessionManager.h
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>

@interface CaptureSessionManager : NSObject {
}

@property (retain) AVCaptureVideoPreviewLayer *previewLayer;
@property (retain) AVCaptureSession *captureSession;

- (void)addVideoPreviewLayer;
- (void)addVideoInput;

@end
CaptureSessionManager.m
#import "CaptureSessionManager.h"
#implementation CaptureSessionManager
#synthesize captureSession;
#synthesize previewLayer;
#pragma mark Capture Session Configuration
- (id)init {
if ((self = [super init])) {
[self setCaptureSession:[[AVCaptureSession alloc] init]];
if ([captureSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
captureSession.sessionPreset = AVCaptureSessionPresetHigh;
}
}
return self;
}
- (void)addVideoPreviewLayer {
[self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}
- (void)addVideoInput {
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (videoDevice) {
NSError *error;
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (!error) {
if ([[self captureSession] canAddInput:videoIn])
[[self captureSession] addInput:videoIn];
else
NSLog(#"Couldn't add video input");
}
else
NSLog(#"Couldn't create video input");
}
else
NSLog(#"Couldn't create video capture device");
}
- (void)dealloc {
[[self captureSession] stopRunning];
[previewLayer release], previewLayer = nil;
[captureSession release], captureSession = nil;
[super dealloc];
}
#end
If the answers in the question you mentioned are not working, you can achieve it another way: use CATransform3DMakeRotation to rotate your preview layer according to the orientation of the device.
For example, for UIDeviceOrientationLandscapeLeft you can rotate your preview layer like this:
preview.transform = CATransform3DMakeRotation(-M_PI/2, 0, 0, 1);
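and, by the same reasoning, the sign flips for the other landscape case (a sketch; orientation here is assumed to hold [[UIDevice currentDevice] orientation]):

if (orientation == UIDeviceOrientationLandscapeRight) {
    // mirror of the LandscapeLeft case above
    preview.transform = CATransform3DMakeRotation(M_PI / 2, 0, 0, 1);
}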
Found that the videoOrientation attribute of the preview layer connection needed to be set.
Here's the change to my addVideoPreviewLayer method...
- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
}
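Since the original problem was that the video should be upright in both landscape modes, here is a sketch that picks the orientation at call time instead of hard-coding LandscapeRight (assuming it runs after the session is configured, e.g. from a rotation callback):

AVCaptureConnection *connection = [previewLayer connection];
if ([connection isVideoOrientationSupported]) {
    UIInterfaceOrientation orientation = [[UIApplication sharedApplication] statusBarOrientation];
    // UIInterfaceOrientation and AVCaptureVideoOrientation use matching
    // values for the two landscape cases, so a direct map works here.
    if (orientation == UIInterfaceOrientationLandscapeLeft)
        [connection setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    else if (orientation == UIInterfaceOrientationLandscapeRight)
        [connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
}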
What is the AVFoundation equivalent of this UIImagePicker-based method? My app uses AVFoundation and I want to nix UIImagePicker entirely. This method comes after a UIImagePicker class is set up in the program. It sets a hypothetical image a user takes (physically, using the iPhone camera interface) to an IBOutlet named photo, declared earlier in the app. Excerpt from http://www.raywenderlich.com/13541/how-to-create-an-app-like-instagram-with-a-web-service-backend-part-22 (download at bottom):
#pragma mark - Image picker delegate methods

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Resize the image from the camera
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height) interpolationQuality:kCGInterpolationHigh];
    // Crop the image to a square (yikes, fancy!)
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2, (scaledImage.size.height - photo.frame.size.height) / 2, photo.frame.size.width, photo.frame.size.height)];
    // Show the photo on the screen
    photo.image = croppedImage;
    [picker dismissModalViewControllerAnimated:NO];
}
// Successful AVCam snap I want to use. Uses AVFoundation.
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [PhotoScreen setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
            }
        }];
    });
}
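A hedged sketch of how the two could be combined: the body of the UIImagePicker delegate method moved into the if (imageDataSampleBuffer) branch of the AVCam completion handler above, assuming the same resizedImageWithContentMode: and croppedImage: category methods from the tutorial and the same photo outlet:

UIImage *image = [[UIImage alloc] initWithData:imageData];
dispatch_async(dispatch_get_main_queue(), ^{
    // Same resize/crop as the picker delegate, but fed from AVFoundation.
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill
                                                       bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height)
                                         interpolationQuality:kCGInterpolationHigh];
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2,
                                                                 (scaledImage.size.height - photo.frame.size.height) / 2,
                                                                 photo.frame.size.width,
                                                                 photo.frame.size.height)];
    photo.image = croppedImage;
});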
Note: the AVCam method is part of an AVCaptureSession, which Apple describes as the class that coordinates all data flow for media capture using AVFoundation. The full official AVCam sample from Apple is here: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
I happened to stumble upon this excerpt located in AVCamViewController. I noticed it looks a lot like the method in question. Can someone help verify whether it is suitable?
#pragma mark File Output Delegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error)
        NSLog(@"%@", error);

    [self setLockInterfaceRotation:NO];

    // Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO — which happens sometime after this method returns.
    UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
    [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error)
            NSLog(@"%@", error);

        [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];

        if (backgroundRecordingID != UIBackgroundTaskInvalid)
            [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
    }];
}
I want to use AVCaptureSession to capture images within my iOS app.
Based on the code from Apple's AVCam sample (https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html), I tried to show a preview image view every time an image is captured.
// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer)
    {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        [self processImage:[UIImage imageWithData:imageData]];
    }
}];
After capturing, processImage: is called:
- (void)processImage:(UIImage *)image
{
    [[self postButton] setEnabled:YES];
    [[self postButton] setHidden:NO];
    [[self cancelButton] setEnabled:YES];
    [[self cancelButton] setHidden:NO];

    _preview = [[UIImageView alloc] init];
    [_preview setImage:image];
    _preview.hidden = NO;
}
But the image view is still empty when it is displayed.
Can someone help me along?
This code works for me
AVCaptureConnection *connection = [_currentOutput connectionWithMediaType:AVMediaTypeVideo];
[self _setOrientationForConnection:connection];

[_captureOutputPhoto captureStillImageAsynchronouslyFromConnection:connection completionHandler:
^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (!imageDataSampleBuffer) {
        DLog(@"failed to obtain image data sample buffer");
        // return delegate error
        return;
    }

    if (error) {
        if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
            [_delegate vision:self capturedPhoto:nil error:error];
        }
        return;
    }

    NSMutableDictionary *photoDict = [[NSMutableDictionary alloc] init];
    NSDictionary *metadata = nil;

    // add photo metadata (ie EXIF: Aperture, Brightness, Exposure, FocalLength, etc)
    metadata = (__bridge NSDictionary *)CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);
    if (metadata) {
        [photoDict setObject:metadata forKey:PBJVisionPhotoMetadataKey];
        CFRelease((__bridge CFTypeRef)(metadata));
    } else {
        DLog(@"failed to generate metadata for photo");
    }

    // add JPEG and image data
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    if (jpegData) {
        // add JPEG
        [photoDict setObject:jpegData forKey:PBJVisionPhotoJPEGKey];

        // add image
        UIImage *image = [self _uiimageFromJPEGData:jpegData];
        if (image) {
            [photoDict setObject:image forKey:PBJVisionPhotoImageKey];
        } else {
            DLog(@"failed to create image from JPEG");
            // TODO: return delegate on error
        }

        // add thumbnail
        UIImage *thumbnail = [self _thumbnailJPEGData:jpegData];
        if (thumbnail) {
            [photoDict setObject:thumbnail forKey:PBJVisionPhotoThumbnailKey];
        } else {
            DLog(@"failed to create a thumbnail");
            // TODO: return delegate on error
        }
    } else {
        DLog(@"failed to create jpeg still image data");
        // TODO: return delegate on error
    }

    if ([_delegate respondsToSelector:@selector(vision:capturedPhoto:error:)]) {
        [_delegate vision:self capturedPhoto:photoDict error:error];
    }

    // run a post shot focus
    [self performSelector:@selector(_focus) withObject:nil afterDelay:0.5f];
}];
I bet that the trouble here is that the completionHandler for captureStillImageAsynchronouslyFromConnection isn't being called on the main thread. Try redispatching your call onto the main thread:
dispatch_async(dispatch_get_main_queue(), ^{
    [self processImage:[UIImage imageWithData:imageData]];
});
I also notice that you're never adding _preview as a subview of anything.
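Putting both observations together, a sketch of a processImage: that should actually appear on screen (the full-bounds frame is an assumption; adjust to taste):

- (void)processImage:(UIImage *)image
{
    [[self postButton] setEnabled:YES];
    [[self postButton] setHidden:NO];
    [[self cancelButton] setEnabled:YES];
    [[self cancelButton] setHidden:NO];

    if (!_preview.superview) {
        // The view must be in the hierarchy to be visible.
        _preview = [[UIImageView alloc] initWithFrame:self.view.bounds]; // assumed frame
        [self.view addSubview:_preview];
    }
    _preview.image = image;
    _preview.hidden = NO;
}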