I have put together an implementation using AVFoundation and ImageIO to handle photo capture in my application. I have an issue with it, however: the images I take are always dark, even when the flash goes off. Here's the code I use:
[[self currentCaptureOutput] captureStillImageAsynchronouslyFromConnection:[[self currentCaptureOutput].connections lastObject]
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    // Stop the session now that the still image has been delivered.
    [[[blockSelf currentPreviewLayer] session] stopRunning];
    if (!error) {
        NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, NULL);
        if (source) {
            UIImage *image = [blockSelf imageWithSource:source];
            [blockSelf updateWithCapturedImage:image];
            CFRelease(source);
        }
    }
}];
Is there anything there that could cause the image taken to not include the flash?
I found I sometimes got dark images if the AVCaptureSession was set up immediately before this call. Perhaps it takes a while for the auto-exposure & white balance settings to adjust themselves.
The solution was to set up the AVCaptureSession, then wait until the AVCaptureDevice's adjustingExposure and adjustingWhiteBalance properties are both NO (observe these with KVO) before calling -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection: completionHandler:].
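A minimal sketch of that approach (the observer plumbing and the captureNow method are hypothetical glue code; only the AVCaptureDevice properties come from the framework):

static void *AdjustingContext = &AdjustingContext;

// Call this after the session starts running; capture happens once the device settles.
- (void)captureWhenDeviceSettles:(AVCaptureDevice *)device {
    [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:AdjustingContext];
    [device addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:AdjustingContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (context == AdjustingContext) {
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        if (!device.adjustingExposure && !device.adjustingWhiteBalance) {
            // Both have settled; remove the observers and take the shot.
            [device removeObserver:self forKeyPath:@"adjustingExposure" context:AdjustingContext];
            [device removeObserver:self forKeyPath:@"adjustingWhiteBalance" context:AdjustingContext];
            [self captureNow]; // hypothetical: calls captureStillImageAsynchronouslyFromConnection:completionHandler:
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}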
I am creating an app that makes use of iMessage and MMS. The problem is, after I take a picture and then tap the send button, the image randomly becomes a question mark. This capture code was the first thing we used:

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        [self sendSMSmessage:myMessage image:imageData];
    }
}];

Next, in the sendSMSmessage: method, the following code is called:
MFMessageComposeViewController *myText = [[MFMessageComposeViewController alloc] init];
[myText setMessageComposeDelegate:self];
[myText setBody:myMessage];
[myText addAttachmentData:image typeIdentifier:@"image/jpeg" filename:@"image.jpeg"];
Then I present the MFMessageComposeViewController:

[self presentViewController:myText animated:YES completion:nil];

Then I tap Send. It sends successfully from the app, and I can see the picture in the MFMessageComposeViewController. But when I look in iMessage, some of the pictures I tried to send are fine, and some are not; some show a question mark. Did the images get corrupted in the process, or what? I tried to compress the image using the following:
CGFloat compressionQuality = 0.3;
NSData *imageData = UIImageJPEGRepresentation([UIImage imageWithData:[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer]], compressionQuality);
but it still behaves the same. In 10 tries, I get something like 4 failed images. Could it be a problem with the iPhone or with my app itself? Thanks!
I don't know if other people experience this, but here is what I did to fix it.
First, I declared imageData as a strong reference:

__strong NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

Then I changed the typeIdentifier passed to the myText composer to @"public.jpeg" instead of @"image/jpeg"; addAttachmentData:typeIdentifier:filename: expects a uniform type identifier (UTI), not a MIME type:

[myText addAttachmentData:image typeIdentifier:@"public.jpeg" filename:@"image.jpeg"];
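If you prefer not to hard-code the UTI string, a small variant (assuming you import MobileCoreServices) is to use the system constant, which resolves to the same "public.jpeg" identifier:

#import <MobileCoreServices/MobileCoreServices.h>

// kUTTypeJPEG is the system-defined UTI for JPEG data ("public.jpeg").
[myText addAttachmentData:image typeIdentifier:(NSString *)kUTTypeJPEG filename:@"image.jpeg"];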
I am working on an app that takes photos of the user every day. Everything works fine and the images get saved to the device; I can currently access them from the standard Photos application.
As the next step in developing the application, I would like to build a gallery into my app. I am thinking of iCarousel, but I am not sure yet.
Now I would like to know: what is the best way to save the images the user takes of himself? The user should be able to access the pictures both through the standard Photos application on the device and in the gallery in my app. I am targeting iOS 8.1.
Currently I am saving the photos like this:
- (void)takePhoto {
    NSLog(@"CameraController: takePhoto()");
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) { // this code gets executed if a photo is taken
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *combined = [UIImage imageWithData:imageData];
            //....
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                UIImageWriteToSavedPhotosAlbum(combined, nil, nil, nil);
                NSLog(@"CameraController: Image saved");
            });
        }
    }];
}
You should read about (and use) PhotoKit.
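For example, a minimal sketch of the PhotoKit route (the method name here is mine, and error handling is kept deliberately thin) could look like this; assets saved this way show up in the Photos app and can also be fetched back via PHAsset for an in-app gallery:

#import <Photos/Photos.h>

// Save a UIImage into the user's photo library via PhotoKit (iOS 8+).
- (void)savePhotoToLibrary:(UIImage *)image {
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    } completionHandler:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"Could not save image: %@", error);
        }
    }];
}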
I am using iCarousel now. I use NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) to generate a path for the pictures I want to save and display in iCarousel.
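For reference, a minimal sketch of that Documents-directory approach (the file name here is arbitrary) might be:

// Build a path in the app's Documents directory and write the JPEG there.
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"photo.jpg"];
[UIImageJPEGRepresentation(image, 0.9) writeToFile:path atomically:YES];

// Later, load it back for display in the carousel.
UIImage *saved = [UIImage imageWithContentsOfFile:path];

Note that files in the Documents directory are only visible inside the app; to also have the pictures appear in the standard Photos application, they still need to be written to the photo library, for example via PhotoKit as suggested above.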
I'm using Apple's AVCam sample code to create a custom camera. It works like a charm, but once I capture a video or image with it and then check it in the photo library, its orientation has changed to landscape (even though I captured it in portrait orientation). I have searched a lot for this, but couldn't find a solution. Any help?
As a note, my app only supports portrait, and capturing should only happen in portrait.
Update:
AVCaptureConnection *captureConnection = ...
if ([captureConnection isVideoOrientationSupported])
{
    AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationPortrait;
    [captureConnection setVideoOrientation:orientation];
}
This doesn't work.
For capturing an image you should set the orientation too. When you save the image to disk, you should use the
writeImageToSavedPhotosAlbum:orientation:completionBlock:
method and pass the correct "orientation" parameter there as well.
Usage: https://developer.apple.com/library/ios/documentation/AssetsLibrary/Reference/ALAssetsLibrary_Class/index.html#//apple_ref/occ/instm/ALAssetsLibrary/writeImageToSavedPhotosAlbum:orientation:completionBlock:
Example in Objective-C:

// Flash set to Auto for Still Capture
[CameraViewController setFlashMode:AVCaptureFlashModeAuto
                         forDevice:[[self videoDeviceInput] device]];

// Capture a still image.
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo]
                                                     completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer) {
        self.imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:self.imageData];
        [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:image.CGImage
                                                         orientation:(ALAssetOrientation)[image imageOrientation]
                                                     completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error == nil) {
                NSLog(@"PHOTO SAVED - assetURL: %@", assetURL);
            } else {
                NSLog(@"ERROR : %@", error);
            }
        }];
    }
}];
I'm trying to run a method, saveImageToLibrary, when

[self.avSnapper captureStillImageAsynchronouslyFromConnection:captureConnection
                                            completionHandler:handler];

is done. How would I go about it?
I think this is what you want:
[self.avSnapper captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments) {
        // Do something with the attachments if you want to.
        NSLog(@"attachments: %@", exifAttachments);
    } else {
        NSLog(@"no attachments");
    }
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    // The handler runs on an unspecified thread, so hop to the main queue for UI work.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.vImage.image = image;
    });
}];
If you need to run some code when the function completes, that is what the completionHandler parameter is for.
According to the documentation: "A block to invoke after the image has been captured.".
Edit: You can read about blocks programming here. As block notation can be a bit confusing, there is a simple trick that may help you: when you create a function signature in Xcode using autocompletion, you get blue placeholders for the variables you need to pass. When you hit Enter on a block placeholder, Xcode generates an empty block with matching syntax for you.
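Tying that back to the question: a minimal sketch (assuming saveImageToLibrary: is your own method taking a UIImage) is to call it from inside the completion handler:

[self.avSnapper captureStillImageAsynchronouslyFromConnection:captureConnection
                                            completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (!error && imageSampleBuffer) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        // This runs only after the still image has been captured.
        [self saveImageToLibrary:image];
    }
}];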
It keeps giving me the error:
Error Domain=AVFoundationErrorDomain Code=-11814 "Cannot Record"
I am not sure what the problem is. I am trying to record the sound right when the counter reaches 1, after a picture is taken.
static int counter;
// counter will always be zero, I think, unless it is assigned.
if (counter == 0) {
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
        // Flash set to Auto for Still Capture
        [AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];
        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer) {
                //[AVCaptureSession snapStillImage];
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
            }
            NSLog(@"i");
        }];
    });
    if (!_audioRecorder.recording) {
        // start recording as part of still image
        _playButton.enabled = NO;
        _stopButton.enabled = YES;
        [_audioRecorder record];
        for (int i = 0; i < 1000; i++) {
            // do nothing, just counting
        }
        // stop the recording
    }
} else if (counter == 1) {
    [self recordForDuration:5];
}
}
This error occurs because you are running in the Simulator; you need to use a real device.
Regards
Make sure there is only one instance of AVCaptureSession running.
Having camera access restricted on your device under "Settings > General > Restrictions" will also give you this error.
I ran into the same error when I was trying AVFoundation on a Mac, in a Mac Catalyst app. That is documented and intended behaviour, apparently:
https://forums.developer.apple.com/thread/124652#389519
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture
This issue behaves as intended. We do document that we are not listing devices in Catalyst mode. See the "Important" note: iPad apps running in macOS cannot use the AVFoundation Capture classes. These apps should instead use UIImagePickerController for photo and video capture.
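For completeness, a minimal sketch of the UIImagePickerController route (assuming the presenting view controller adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate) might look like:

// Present the system camera UI instead of an AVFoundation capture session.
- (void)showCamera {
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.delegate = self;
        [self presentViewController:picker animated:YES completion:nil];
    }
}

// Delegate callback delivering the captured photo.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
    NSLog(@"Captured image: %@", image); // replace with your own handling
}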