I have a big performance issue when using UIImagePickerController and saving the image to disk. I can't figure out what I am doing wrong. Here is my code:
- (void)imagePickerController:(UIImagePickerController *)pick
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
    iPixAppDelegate *delegate = (iPixAppDelegate *)[[UIApplication sharedApplication] delegate];
    [delegate addPicture:imageData];
}
The addPicture method creates a new picture object that is initialized this way:
- (Picture *)initPicture:(NSData *)dat inFolder:(NSString *)pat {
    self.data = dat;
    NSDate *d = [NSDate date];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    // Note: MM is months and HH is 24-hour hours; mm/hh here would give minutes and 12-hour times.
    [formatter setDateFormat:@"yyyy-MM-dd HH-mm-ss"];
    self.name = [[formatter stringFromDate:d] stringByAppendingString:@".png"]; // The default name of a picture is the date it was taken
    [formatter release];
    self.path = [pat stringByAppendingPathComponent:self.name];
    if (![[NSFileManager defaultManager] fileExistsAtPath:self.path]) {
        [self.data writeToFile:self.path atomically:YES];
    }
    return self;
}
The UIImagePickerController is quite fast, but the program becomes very slow when I save the picture to disk.
Any idea what I am doing wrong?
I had a similar issue. The way I got round it was to handle the image from the picker on a separate thread. My problem was that the main thread handling my app/UI was crashing out when trying to close the picker and handle the image at the same time:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingImage:(UIImage *)image
                  editingInfo:(NSDictionary *)editingInfo
{
    [[picker parentViewController] dismissModalViewControllerAnimated:YES];
    NSLog(@"picker did finish");
    [NSThread detachNewThreadSelector:@selector(useImage:) toTarget:self withObject:image];
}
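For reference, here is a minimal sketch of what the useImage: worker might look like; the Documents path, the file name, and the didSavePicture: callback are assumptions, not part of the original answer. A thread detached this way needs its own autorelease pool (pre-ARC), and any UI update must hop back to the main thread:

- (void)useImage:(UIImage *)image
{
    // Detached threads must manage their own autorelease pool under manual reference counting.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Do the expensive encoding and disk write off the main thread.
    NSData *imageData = UIImagePNGRepresentation(image);
    NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [documents stringByAppendingPathComponent:@"picture.png"]; // hypothetical file name
    [imageData writeToFile:path atomically:YES];

    // UI work (or notifying a delegate) belongs back on the main thread.
    [self performSelectorOnMainThread:@selector(didSavePicture:) // hypothetical callback
                           withObject:path
                        waitUntilDone:NO];

    [pool release];
}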
Your problem might be due to you taking the original image.
The original image from the camera has a resolution of around 1200x1400, which takes a lot of memory and can make the device run out of memory if you try to build a picture out of it.
I would suggest resizing the image to something smaller (e.g. the native 320x480).
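A minimal sketch of that resizing step, assuming a plain scale with UIGraphicsBeginImageContextWithOptions; the target size is an assumption, and this simple version ignores aspect ratio:

// Scale a UIImage down to a target size before persisting it.
- (UIImage *)scaledImage:(UIImage *)image toSize:(CGSize)size
{
    UIGraphicsBeginImageContextWithOptions(size, NO, 1.0);
    [image drawInRect:CGRectMake(0.0, 0.0, size.width, size.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

// Usage, e.g.: UIImage *small = [self scaledImage:original toSize:CGSizeMake(320.0, 480.0)];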
I am using UIImagePickerController for capturing images. The camera capture works fine for the first 30 to 40 shots, but it crashes the app after around 40 captures. I do not get any memory warning or crash report in Xcode.
This issue looks like a memory leak, but I have monitored it in Instruments and memory usage does not go above 60 MB.
Picking an image from the gallery does not cause this issue.
The code I am using:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [MBProgressHUD showHUDAddedTo:self.view animated:YES];
    UIImage *chosenImage = info[UIImagePickerControllerOriginalImage];
    NSDate *now = [NSDate date];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    dateFormatter.dateFormat = @"MMM-dd-yyyy";
    [dateFormatter setTimeZone:[NSTimeZone systemTimeZone]];
    NSDateFormatter *timeFormatter = [[NSDateFormatter alloc] init];
    timeFormatter.dateFormat = @"HH:mm a";
    [timeFormatter setTimeZone:[NSTimeZone systemTimeZone]];
    NSString *strtime = [NSString stringWithFormat:@"%@\n%@ ", [timeFormatter stringFromDate:now], [dateFormatter stringFromDate:now]];
    lblTime.numberOfLines = 0;
    [lblTime setText:strtime];
    [lblTime setHidden:YES];
    imgTakenPhoto.image = chosenImage;
    [btnCrossOnDentView setHidden:YES];
    [btnDoneWithDent setHidden:YES];
    App_Delegate.isEdited = YES;
    [picker dismissViewControllerAnimated:YES completion:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [MBProgressHUD hideAllHUDsForView:self.view animated:YES];
            imagePickerController = nil;
        });
    }];
}
It's an iOS issue. I also submitted a report regarding this problem; the same code runs fine on iPad. Report a bug to Apple and submit your project.
The best workaround is to skip UIImagePickerController for the camera and build a custom camera view, as WhatsApp does: provide a capture button on it and capture as many pictures as you want.
Use AVCapture (a minimal sketch follows below).
This completely removes the overhead of presenting the camera controller.
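As a rough sketch of that custom setup, assuming the AVCaptureStillImageOutput API of that era; the method name and the view wiring are assumptions:

#import <AVFoundation/AVFoundation.h>

- (void)setupCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;

    // Wire the default camera into the session.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [session canAddInput:input]) {
        [session addInput:input];
    }

    // Still-image output for the capture button to trigger.
    AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }

    // Show the live preview in our own view instead of the picker UI.
    AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
    preview.frame = self.view.bounds;
    [self.view.layer addSublayer:preview];

    [session startRunning];
}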
I also faced this problem in an application I am currently working on. It is caused by memory pressure while the application is running. I freed unnecessary memory in the application's running state, and now my application works fine without any problem.
I have a simple text file generated in my application. I want to upload this text file to iCloud, so that if the user installs the app, inputs the data he desires, and then uninstalls it, the next time he installs the app I can fetch the text file uploaded the first time he used it.
I am facing a huge problem integrating iCloud into my app.
I have done a lot of research but didn't get any specific answers.
P.S. I am not using Core Data.
All I want is to upload the text file generated by the app to iCloud Drive.
Please guide me step by step on how I can achieve this. I have my developer account and a bit of knowledge about certificates and so on, but I would still appreciate it if anyone could guide me.
To be clear: I just want to upload the text file to iCloud and retrieve it again when the same app is installed again (even if the app is installed on other devices).
ViewController.m
#pragma mark - Image Pick
- (IBAction)pickImage:(id)sender {
    // Select an image
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *, id> *)info
{
    [self dismissViewControllerAnimated:YES completion:nil];
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    self.imageView.image = image;
    // Save image in iCloud
    AppDelegate *myAppDelegate = (AppDelegate *)[UIApplication sharedApplication].delegate;
    NSURL *cloudeImage = [myAppDelegate applicationCloudFolder:@"thePicture"];
    NSData *imageDate = UIImagePNGRepresentation(image);
    [imageDate writeToURL:cloudeImage atomically:YES];
}
- (void)populateUI
{
    AppDelegate *myAppDelegate = (AppDelegate *)[UIApplication sharedApplication].delegate;
    NSURL *cloudeImageURL = [myAppDelegate applicationCloudFolder:@"thePicture"];
    NSData *imageDate = [NSData dataWithContentsOfURL:cloudeImageURL];
    UIImage *image = [UIImage imageWithData:imageDate];
    if (image) {
        self.imageView.image = image;
    }
    else {
        // Download image from iCloud
        NSLog(@"Downloading Image...");
        [[NSFileManager defaultManager] startDownloadingUbiquitousItemAtURL:cloudeImageURL error:nil];
    }
}
- (NSMetadataQuery *)query {
    if (!_query) {
        _query = [[NSMetadataQuery alloc] init];
        NSArray *scopes = @[NSMetadataQueryUbiquitousDocumentsScope];
        _query.searchScopes = scopes;
        NSPredicate *predicate = [NSPredicate predicateWithFormat:@"%K like %@", NSMetadataItemFSNameKey, @"*"];
        _query.predicate = predicate;
        if (![_query startQuery]) {
            NSLog(@"Query didn't start... for whatever reason");
        }
    }
    return _query;
}
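One caveat about this query method: starting the query alone is not enough; you also have to observe its notifications to learn when results have arrived. A minimal sketch, assuming a hypothetical queryDidFinish: selector:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Register before touching self.query so the notification is not missed.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(queryDidFinish:)
                                                 name:NSMetadataQueryDidFinishGatheringNotification
                                               object:nil];
    [self query]; // lazily creates and starts the query
}

- (void)queryDidFinish:(NSNotification *)notification
{
    NSMetadataQuery *query = notification.object;
    [query disableUpdates]; // freeze results while we read them
    for (NSMetadataItem *item in query.results) {
        NSURL *itemURL = [item valueForAttribute:NSMetadataItemURLKey];
        NSLog(@"Found iCloud item: %@", itemURL);
    }
    [query enableUpdates];
}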
AppDelegate.m
- (NSURL *)applicationCloudFolder:(NSString *)fileName
{
    // Team ID and container ID
    NSString *teamID = @"V58ESG9PLE";
    NSString *bundelID = [NSBundle mainBundle].bundleIdentifier;
    NSString *containerID = [NSString stringWithFormat:@"%@.%@", teamID, bundelID];
    // URL to cloud folder
    NSURL *cloudeRootURL = [[NSFileManager defaultManager] URLForUbiquityContainerIdentifier:containerID];
    NSLog(@"cloudeRootURL %@", cloudeRootURL);
    // The standard iCloud documents folder is named "Documents"
    NSURL *cloudDocuments = [cloudeRootURL URLByAppendingPathComponent:@"Documents"];
    // Append our file name
    cloudDocuments = [cloudDocuments URLByAppendingPathComponent:fileName];
    return cloudDocuments;
}
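A caveat on this helper: Apple's documentation notes that URLForUbiquityContainerIdentifier: can block while iCloud availability is checked, so it should not be called on the main thread. A hedged sketch of wrapping it in a background dispatch; the completion-block shape is an assumption, and passing nil uses the first container in your entitlements:

- (void)cloudURLForFile:(NSString *)fileName completion:(void (^)(NSURL *fileURL))completion
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // This call may block while iCloud availability is checked.
        NSURL *root = [[NSFileManager defaultManager] URLForUbiquityContainerIdentifier:nil];
        NSURL *fileURL = [[root URLByAppendingPathComponent:@"Documents"]
                          URLByAppendingPathComponent:fileName];
        dispatch_async(dispatch_get_main_queue(), ^{
            completion(fileURL);
        });
    });
}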
What I am not getting now is: is my data being saved? Where is it being saved? And how can I retrieve it?
P.S. I am saving a picture.
I want to use both of the Objective-C methods listed below in my application. The first method uploads a UIImagePicker photograph to a local server.
// I would still like to use this method structure but with the `AVCam` classes.
- (void)uploadPhoto {
    // Upload the image and the title to the web service
    // (note: UIImageJPEGRepresentation expects a quality between 0.0 and 1.0)
    [[API sharedInstance] commandWithParams:[NSMutableDictionary dictionaryWithObjectsAndKeys:
                                             @"upload", @"command",
                                             UIImageJPEGRepresentation(photo.image, 0.7), @"file",
                                             fldTitle.text, @"title",
                                             nil]
                               onCompletion:^(NSDictionary *json) {
        // Completion
        if (![json objectForKey:@"error"]) {
            // Success
            [[[UIAlertView alloc] initWithTitle:@"Success!" message:@"Your photo is uploaded" delegate:nil cancelButtonTitle:@"Yay!" otherButtonTitles:nil] show];
        } else {
            // Error; check for an expired session and, if so, authorize the user
            NSString *errorMsg = [json objectForKey:@"error"];
            [UIAlertView error:errorMsg];
            if ([@"Authorization required" compare:errorMsg] == NSOrderedSame) {
                [self performSegueWithIdentifier:@"ShowLogin" sender:nil];
            }
        }
    }];
}
I want to add a second method: it performs an IBAction picture snap using AVCam, but I changed it to void so that it runs when the view loads, via [self snapStillImage].
EDIT
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];
        // Flash set to Auto for still capture
        [ViewController5 setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];
        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
                //
                photo = [[UIImage alloc] initWithData:imageData];
            }
        }];
    });
}
Can someone please set photo via AVCam? At the very least humor me and start a dialogue about AVFoundation and its appropriate classes for tackling an issue like this.
Additional info: The avcam method is simply an excerpt from this https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
@Aksh1t I want to set a UIImage named image with the original contents of the AVFoundation snap, not UIImagePicker. Here is the method that sets the outlet using UIImagePicker.
#pragma mark - Image picker delegate methods
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Resize the image from the camera
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height) interpolationQuality:kCGInterpolationHigh];
    // Crop the image to a square (yikes, fancy!)
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2, (scaledImage.size.height - photo.frame.size.height) / 2, photo.frame.size.width, photo.frame.size.height)];
    // Show the photo on the screen
    photo.image = croppedImage;
    [picker dismissModalViewControllerAnimated:NO];
}
After that I simply want to upload it using the first method I posted. Sorry for being unclear; I basically want to do this in my new app (I was unclear about which app):
1. Take a photo using AVCam
2. Set that photo to a UIImageView IBOutlet named photo
3. Upload photo (the original AVCam photo) to the server
The basic framework is above, and I will answer any questions.
The following line of code in your snapStillImage method takes a photo into the imageData variable.
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
Next, you create a UIImage object from this data like this:
UIImage *image = [[UIImage alloc] initWithData:imageData];
Instead of the above code, make a global variable UIImage *photo; and initialize it with the imageData when your snapStillImage method takes the photo, like this:
photo = [[UIImage alloc] initWithData:imageData];
Since photo is a global variable, you will then be able to use that in your uploadPhoto method and send it to your server.
Hope this helps, and if you have any question, leave it in the comments.
Edit:
Since you already have an IBOutlet UIImageView *photo; in your file, you don't even need a global variable to store the UIImage. You can just replace the following line in your snapStillImage method:
UIImage *image = [[UIImage alloc] initWithData:imageData];
with this line
photo.image = [[UIImage alloc] initWithData:imageData];
I have an app where I can shoot a video and then store it in Core Data with some other data. It is stored as Transformable with "Store in External Record File" checked. I get the video clip into an object called movie like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    movie = [info objectForKey:UIImagePickerControllerEditedImage];
Where I get stuck is getting a thumbnail from the movie. Using MPMoviePlayerController I need a URL, but the movie isn't stored anywhere yet. Plus, finding the URL in Core Data is a mystery as well.
The closest help I can find here is Getting a thumbnail from a video url or data in iPhone SDK, but I get caught out on the URL issue.
I am saving it to Core Data like this:
NSManagedObjectContext *context = [self managedObjectContext];
NSManagedObject *newMedia = [NSEntityDescription insertNewObjectForEntityForName:@"Media" inManagedObjectContext:context];
[newMedia setValue:@"Video Clip " forKey:@"title"];
[newMedia setValue:now forKey:@"date_time"];
[newMedia setValue:movie forKey:@"movie"];
[newMedia setValue:[self generateImage] forKey:@"frame"];
I would be grateful if someone out there could give me a pointer.
I used it like this in my app:
- (UIImage *)imageFromMovie:(NSURL *)movieURL atTime:(NSTimeInterval)time {
    // Set up the movie player
    MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    mp.shouldAutoplay = NO;
    mp.initialPlaybackTime = time;
    mp.currentPlaybackTime = time;
    // Get the thumbnail
    UIImage *thumbnail = [mp thumbnailImageAtTime:time timeOption:MPMovieTimeOptionNearestKeyFrame];
    // Clean up the movie player
    [mp stop];
    [mp release];
    return thumbnail;
}
Used like this:
imageView.image = [self imageFromMovie:fileURL atTime:10.0];
To get a thumbnail from the movie:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    if ([[info objectForKey:@"UIImagePickerControllerMediaType"] rangeOfString:@"movie"].location != NSNotFound)
    {
        MPMoviePlayerController *theMovie = [[MPMoviePlayerController alloc] initWithContentURL:[info objectForKey:@"UIImagePickerControllerMediaURL"]];
        theMovie.view.frame = self.view.bounds;
        theMovie.controlStyle = MPMovieControlStyleNone;
        theMovie.shouldAutoplay = NO;
        UIImage *imgTemp = [theMovie thumbnailImageAtTime:0 timeOption:MPMovieTimeOptionExact];
    }
}
You can get your movie URL like this, and then use that URL to get the thumbnail with MPMoviePlayerController.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *movieURL = [info valueForKey:UIImagePickerControllerMediaURL];
    // ...
}
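As an alternative to spinning up an MPMoviePlayerController just for a frame, AVFoundation's AVAssetImageGenerator can produce a thumbnail from that same URL. A minimal sketch; the one-second frame time is an arbitrary choice:

#import <AVFoundation/AVFoundation.h>

- (UIImage *)thumbnailForMovieAtURL:(NSURL *)movieURL
{
    AVAsset *asset = [AVAsset assetWithURL:movieURL];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect the video's orientation

    // Grab a frame one second into the clip.
    NSError *error = nil;
    CMTime time = CMTimeMakeWithSeconds(1.0, 600);
    CGImageRef cgImage = [generator copyCGImageAtTime:time actualTime:NULL error:&error];
    if (!cgImage) {
        return nil;
    }
    UIImage *thumbnail = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return thumbnail;
}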
I'm working on an iPhone app that uses ABBYY OCR.
Using the wrapper class for iPhone, there is a method:
[ocrManager recognizeImage:[choosenImage image] withCallback:self];
A UIImage is passed as a parameter and used to recognize characters, but every time I receive the exception "Required Data File Missed".
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissModalViewControllerAnimated:YES];
    [[UIApplication sharedApplication] setStatusBarHidden:NO];
    if (check == 1)
    {
        return;
    }
    check = 1;
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    [choosenImage setImage:image];
    [process setHidden:NO];
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"license" ofType:@""];
    NSData *license = [[NSData alloc] initWithContentsOfFile:filePath];
    CMocrManager *ocrManager = [CMocrManager createManager:license];
    NSSet *languages = [[NSSet alloc] initWithArray:[NSArray arrayWithObject:@"English"]];
    [ocrManager setLanguages:languages];
    [ocrManager setDefaultImageResolution:0];
    @try {
        [ocrManager recognizeImage:[choosenImage image] withCallback:self];
    }
    @catch (NSException *exception) {
        NSString *ex = exception.reason;
    }
    CMocrLayout *recognitionResult = [ocrManager copyRecognitionResults];
    NSArray *strings = [recognitionResult copyStrings];
}
The image can be seen in the UIImageView, but when I pass it to the recognizeImage method, it throws the exception.
I've been researching a lot but have been unable to find a solution. Any help is appreciated.
My name is Nikolay Khlebinsky; I work @ ABBYY.
"Required Data File Missed" error message is displayed when resource files are missing (keywords, patterns or dictionaries). Refer to «How to Work with the ABBYY Mobile OCR Engine Library on the iPhone» help article for project organization guide. You can also look for iPhone project sample in the engine distributive.
If you still experience any difficulties, please visit our technical support contacts page at http://www.abbyy.com/support/contacts/. Choose your country and your product, hit 'Search', and you'll get the contacts of the proper ABBYY representatives. Contacting them is the fastest way to solve technical issues.