I'm trying to run a void saveImageToLibrary method when
[self.avSnapper captureStillImageAsynchronouslyFromConnection:captureConnection
completionHandler:handler];
is done. How would I go about it?
I think this is what you want:
[self.avSnapper captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments) {
        // Do something with the attachments if you want to.
        NSLog(@"attachments: %@", exifAttachments);
    } else {
        NSLog(@"no attachments");
    }
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    self.vImage.image = image;
}];
If you need to run some code when the capture finishes, that is what the completionHandler parameter is for.
According to the documentation, it is "A block to invoke after the image has been captured."
Edit: You can read about block programming here. Since block notation can be a bit confusing, there is a simple trick that may help you: when you build the method call in Xcode using autocompletion, you get blue placeholders for the arguments you need to pass. When you hit Enter on the block placeholder, Xcode generates an empty block with the matching syntax for you.
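For example, here is a minimal sketch of calling your method from inside that block. It assumes saveImageToLibrary: is your own method taking the captured UIImage (the name and signature are placeholders, not something from AVFoundation):
// Sketch only: saveImageToLibrary: is assumed to be your own method.
[self.avSnapper captureStillImageAsynchronouslyFromConnection:captureConnection
                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (error || imageSampleBuffer == NULL) {
        return;
    }
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    // The block runs once the capture has finished, so this is where your method belongs.
    [self saveImageToLibrary:image];
}];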
Related
I am working on an app which takes a photo of the user every day. Everything works fine and the images get saved to the device; I can currently access them from the standard Photos application.
In the next step of developing the application, I'd like to build a gallery in my app. I am thinking of iCarousel, but I am not sure yet.
Now I'd like to know the best way to save the images the user takes of himself. The user should be able to access the pictures both through the standard Photos application on the device and in the gallery in my app. I am targeting iOS 8.1.
Currently I am saving the photos like this:
- (void)takePhoto {
    NSLog(@"CameraController: takePhoto()");
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) { // this code gets executed if a photo is taken
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *combined = [UIImage imageWithData:imageData];
            //....
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                UIImageWriteToSavedPhotosAlbum(combined, nil, nil, nil);
                NSLog(@"CameraController: Image saved");
            });
        }
    }];
}
You should read about (and use) PhotoKit.
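For example, a minimal sketch of saving a just-captured UIImage through PhotoKit (iOS 8+) so it also appears in the standard Photos app; here 'combined' is the image from the completion handler above:
#import <Photos/Photos.h>
// Sketch only: 'combined' is the UIImage produced in the capture completion handler.
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromImage:combined];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Saving to the photo library failed: %@", error);
    }
}];
For the in-app gallery you can then fetch the same assets back with PHAsset and PHImageManager instead of keeping a second copy on disk.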
I am using iCarousel now. I use NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); to generate a path for the pictures I'd like to save and display in iCarousel.
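As a rough sketch of that approach (the file name here is a placeholder, and 'combined' is the captured image from the code above):
// Sketch only: build a path in the Documents directory and write the JPEG data there.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *path = [documentsDir stringByAppendingPathComponent:@"daily-photo.jpg"]; // placeholder file name
NSData *jpegData = UIImageJPEGRepresentation(combined, 0.9);
[jpegData writeToFile:path atomically:YES];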
I have successfully retrieved an image (let's say a user profile pic) from Parse, and I'm displaying it in the iOS app. But if there is no image in the file column, I want to display a default image that is stored locally (not in Parse), and that part has not been successful.
My simple question is: how do I check whether there is an image or not? Below is the simple code:
PFFile *userImage = [object objectForKey:@"profilePic"];
__block UIImage *userProfilePic = [[UIImage alloc] init];
[userImage getDataInBackgroundWithBlock:^(NSData *data, NSError *error) {
    if (!error) {
        // This block will execute only if there is an image.
        if (data) {
            userProfilePic = [UIImage imageWithData:data];
            userPic.image = userProfilePic;
        }
        // This block is not executing at all. It should execute when the user hasn't uploaded a profile picture.
        if (!data) {
            UIImage *userProfileDefaultPic = [UIImage imageNamed:@"defaultprofile.png"];
            userPic.image = userProfileDefaultPic;
        }
    }
}];
Hope you can help me. Thanks in advance.
Pradeep
At last, found the solution!
The (!data) block will not execute; instead we need to check whether the PFFile (and the UIImage that will hold it) actually has an image or not, and display the default pic in the else branch if not, like below:
PFFile *userImage = [object objectForKey:@"profilePic"];
__block UIImage *userProfilePic = [[UIImage alloc] init];
// This is the important check: if the file has no image, display the default pic in the else branch.
if (userImage && ![userProfilePic isEqual:[NSNull null]]) {
    [userImage getDataInBackgroundWithBlock:^(NSData *data, NSError *error) {
        if (!error) {
            userProfilePic = [UIImage imageWithData:data];
            userPic.image = userProfilePic;
        }
    }];
} else {
    userProfilePic = [UIImage imageNamed:@"defaultprofilepic.png"];
    userPic.image = userProfilePic;
}
Your code already does that. You just need to set your image on the main thread, not on a background thread:
if (!data) {
    dispatch_async(dispatch_get_main_queue(), ^{
        UIImage *userProfileDefaultPic = [UIImage imageNamed:@"defaultprofile.png"];
        userPic.image = userProfileDefaultPic;
    });
}
I'm working with a UICollectionView and I need to load images into the collection view cells.
Here is the code
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0ul);
dispatch_async(queue, ^{
    NSURL *imageURL = [NSURL URLWithString:post.cover_img_url];
    NSData *data = [NSData dataWithContentsOfURL:imageURL];
    UIImage *image = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        [cell.cover_img setImage:image];
    });
});
post.cover_img_url = the_url_for_the_image;
cell.cover_img = the_UIImageView_object;
When I set the image from a local resource rather than from the web, it works perfectly.
But when I load it from the web, the UIImageView stays blank; no image appears.
Anyone have any ideas?
Your code looks like it should work, but there are a couple of things you need to test. First, make sure that imageURL is non-nil and points to exactly where you think it's pointing. Copy and paste a logged URL into your desktop browser if you have to.
NSLog(@"%@", imageURL);
Second, in the event that fetching the image fails, you need to be able to detect it. If you continue to use the dataWithContentsOfURL: route, I recommend that you at least check the error:
NSError *error = nil;
NSData *data = [NSData dataWithContentsOfURL:imageURL options:NSDataReadingMappedIfSafe error:&error];
if (error) {
    // ...
}
I am working on an iOS app which dispatches quite a number of tasks to my serial queue. Each task downloads an image from my web server, saves it to disk, and later displays it in a UIImageView. However, [NSURLConnection sendAsynchronousRequest:...] keeps eating up more and more memory until iOS kills my process.
The downloader method looks like this:
// dispatch_queue_t is created once by: m_pRequestQueue = dispatch_queue_create( "mynamespace.app", DISPATCH_QUEUE_SERIAL);
- (void)downloadImageInBackgroundWithURL:(NSString *)szUrl {
    __block typeof(self) bSelf = self;
    __block typeof(m_pUrlRequestQueue) bpUrlRequestQueue = m_pRequestQueue;
    dispatch_async(m_pRequestQueue, ^{
        NSAutoreleasePool *pAutoreleasePool = [[NSAutoreleasePool alloc] init];
        NSURLRequest *pRequest = [NSURLRequest requestWithURL:[NSURL URLWithString:szUrl]
                                                  cachePolicy:NSURLRequestReloadIgnoringCacheData
                                              timeoutInterval:URL_REQUEST_TIMEOUT];
        [NSURLConnection sendAsynchronousRequest:pRequest queue:bpUrlRequestQueue completionHandler:^(NSURLResponse *pResponse, NSData *pData, NSError *pError) {
            NSAutoreleasePool *pPool = [[NSAutoreleasePool alloc] init];
            if (pError != nil) {
            } else {
                // convert image to png format
                UIImage *pImg = [UIImage imageWithData:pData];
                NSData *pDataPng = UIImagePNGRepresentation(pImg);
                bool bSaved = [[NSFileManager defaultManager] createFileAtPath:szCacheFile contents:pDataPng attributes:nil];
            }
            __block typeof(pDataPng) bpDataPng = pDataPng;
            __block typeof(pError) bpError = pError;
            dispatch_sync(dispatch_get_main_queue(), ^{
                NSAutoreleasePool *autoreleasepool = [[NSAutoreleasePool alloc] init];
                UIImage *pImage = [[UIImage alloc] initWithData:bpDataPng];
                // display the image
                [pImage release];
                // NSLog( @"image retain count: %d", [pImage retainCount] ); // 0, bad access
                [autoreleasepool drain];
            });
            [pPool drain];
        }]; // end sendAsynchronousRequest
        [pAutoreleasePool drain];
    }); // end dispatch_async
} // end downloadImageInBackgroundWithURL
I am quite sure it is something inside [NSURLConnection sendAsynchronousRequest:...], as the profiler shows that this function is the one eating up all the memory.
However, I am also not very sure about the dispatch_* and block parts. I've always written C and C++ code with pthreads before, but after reading Apple's documentation on migrating away from threads I decided to give GCD a try. Objective-C is troublesome for me, and I'm not sure how to release the NSData *pData and NSURLResponse *pResponse, as the app crashes whenever I do.
Please advise... I really need help learning (and appreciating) Objective-C.
ADDITIONAL EDIT:
Thanks to @robhayward: I put pImg and pDataPng outside as __block variables and used his RHCacheImageView way of downloading the data ([NSData initWithContentsOfURL:]).
Thanks as well to @JorisKluivers: the first UIImage can indeed be reused for display, since UIImageView handles both JPG and PNG formats; only my later processing requires PNG, and I now read it back from disk only when required.
I would firstly put it down to the image and data objects that you are creating:
UIImage *pImg = [UIImage imageWithData:pData];
NSData *pDataPng = UIImagePNGRepresentation(pImg);
These might be hanging around too long; perhaps put them outside the block, as they are probably being created and released on different threads:
__block UIImage *pImg = nil;
__block NSData *pDataPng = nil;
[NSURLConnection sendAsynchronousRequest..
(Also consider using ARC if you can)
I have some code on Github that does a similar job without this issue, feel free to check it out:
https://github.com/robinhayward/RHCache/blob/master/RHCache/RHCache/Helpers/UIImageView/RHCacheImageView.m
First of all, try simplifying your code. Things I did:
- Remove the outer dispatch_async. It is not needed; your sendAsynchronousRequest is already async. This also removes the need for another __block variable holding the queue.
- You create an image named pImg from the received pData, then convert it back to NSData in PNG format, and later create another image pImage from that again. Instead of converting over and over, just reuse the first image. You could even write the original pData to disk (unless you really want the PNG format on disk).
I didn't compile the code below myself, so it might contain a few mistakes. But it is a simpler version that might help solve the leak.
- (void)downloadImageInBackgroundWithURL:(NSString *)szUrl
{
    __block typeof(self) bSelf = self;
    NSAutoreleasePool *pAutoreleasePool = [[NSAutoreleasePool alloc] init];
    NSURLRequest *pRequest = [NSURLRequest requestWithURL:[NSURL URLWithString:szUrl]
                                              cachePolicy:NSURLRequestReloadIgnoringCacheData
                                          timeoutInterval:URL_REQUEST_TIMEOUT];
    [NSURLConnection sendAsynchronousRequest:pRequest queue:m_pRequestQueue completionHandler:^(NSURLResponse *pResponse, NSData *pData, NSError *pError) {
        NSAutoreleasePool *pPool = [[NSAutoreleasePool alloc] init];
        if (pError) {
            // TODO: handle error
            return;
        }
        // convert image to png format
        __block UIImage *pImg = [UIImage imageWithData:pData];
        // possibly just write pData to disk
        NSData *pDataPng = UIImagePNGRepresentation(pImg);
        bool bSaved = [[NSFileManager defaultManager] createFileAtPath:szCacheFile contents:pDataPng attributes:nil];
        dispatch_sync(dispatch_get_main_queue(), ^{
            // display the image in var pImg
        });
    }];
    [pAutoreleasePool drain];
}
I have come up with an implementation using AVFoundation and ImageIO to take care of photo taking in my application. I have an issue with it, however. The images I take are always dark, even if the flash goes off. Here's the code I use:
[[self currentCaptureOutput] captureStillImageAsynchronouslyFromConnection:[[self currentCaptureOutput].connections lastObject]
                                                          completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    [[[blockSelf currentPreviewLayer] session] stopRunning];
    if (!error) {
        NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, NULL);
        if (source) {
            UIImage *image = [blockSelf imageWithSource:source];
            [blockSelf updateWithCapturedImage:image];
            CFRelease(source);
        }
    }
}];
Is there anything there that could cause the image taken to not include the flash?
I found I sometimes got dark images if the AVCaptureSession was set up immediately before this call. Perhaps it takes a while for the auto-exposure & white balance settings to adjust themselves.
The solution was to set up the AVCaptureSession, then wait until the AVCaptureDevice's adjustingExposure and adjustingWhiteBalance properties are both NO (observe these with KVO) before calling -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection: completionHandler:].
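A minimal sketch of that KVO approach, assuming self.captureDevice is the AVCaptureDevice in use and -captureStillImage kicks off the capture shown in the question (both names are placeholders, not from the original post):
// After setting up the AVCaptureSession, start observing the adjustment flags.
- (void)startObservingExposureAndWhiteBalance {
    [self.captureDevice addObserver:self forKeyPath:@"adjustingExposure" options:0 context:NULL];
    [self.captureDevice addObserver:self forKeyPath:@"adjustingWhiteBalance" options:0 context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (!self.captureDevice.adjustingExposure && !self.captureDevice.adjustingWhiteBalance) {
        // Exposure and white balance have settled; it should now be safe to capture.
        [self.captureDevice removeObserver:self forKeyPath:@"adjustingExposure"];
        [self.captureDevice removeObserver:self forKeyPath:@"adjustingWhiteBalance"];
        [self captureStillImage]; // placeholder for the capture call shown in the question
    }
}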