How to upload this photo? - iOS

I want to use both of the Objective-C methods listed below in my application. The first method uploads a UIImagePicker photograph to a local server.
// I would still like to use this method structure but with the `AVCam` classes.
- (void)uploadPhoto {
    // Upload the image and the title to the web service
    // (JPEG quality is a 0.0-1.0 value, so 0.7 rather than 70).
    [[API sharedInstance] commandWithParams:[NSMutableDictionary dictionaryWithObjectsAndKeys:
                                             @"upload", @"command",
                                             UIImageJPEGRepresentation(photo.image, 0.7f), @"file",
                                             fldTitle.text, @"title", nil]
                               onCompletion:^(NSDictionary *json) {
        // Completion
        if (![json objectForKey:@"error"]) {
            // Success
            [[[UIAlertView alloc] initWithTitle:@"Success!"
                                        message:@"Your photo is uploaded"
                                       delegate:nil
                              cancelButtonTitle:@"Yay!"
                              otherButtonTitles:nil] show];
        } else {
            // Error; check for an expired session and, if so, authorize the user
            NSString *errorMsg = [json objectForKey:@"error"];
            [UIAlertView error:errorMsg];
            if ([@"Authorization required" compare:errorMsg] == NSOrderedSame) {
                [self performSegueWithIdentifier:@"ShowLogin" sender:nil];
            }
        }
    }];
}
I want to add a second method: it performs an IBAction picture snap using AVCam, but I changed it to void so that it is called when the view loads via [self snapStillImage].
EDIT
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [ViewController5 setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];

                photo = [[UIImage alloc] initWithData:imageData];
            }
        }];
    });
}
Can someone please show me how to set photo via AVCam? At the very least, humor me and start a dialogue about AVFoundation and the appropriate classes for tackling an issue like this.
Additional info: the AVCam method is simply an excerpt from Apple's AVCam sample code: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
@Aksh1t I want to set a UIImage named image with the original contents of the AVFoundation snap, not UIImagePicker. Here is the method that sets the outlet using UIImagePicker.
#pragma mark - Image picker delegate methods

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // Resize the image from the camera
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height) interpolationQuality:kCGInterpolationHigh];

    // Crop the image to a square (yikes, fancy!)
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2, (scaledImage.size.height - photo.frame.size.height) / 2, photo.frame.size.width, photo.frame.size.height)];

    // Show the photo on the screen
    photo.image = croppedImage;
    [picker dismissModalViewControllerAnimated:NO];
}
After that I simply want to upload it using the first method I posted. Sorry for being unclear. Basically, I want to do this in my new app (I was unclear about which app):
1. Take a photo using AVCam
2. Set that photo to a UIImageView IBOutlet named photo
3. Upload photo (the original AVCam photo) to the server
The basic framework is above, and I will answer any questions.

The following line of code in your snapStillImage method puts the captured photo into the imageData variable.
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
Next, you are creating a UIImage object from this data like this:
UIImage *image = [[UIImage alloc] initWithData:imageData];
Instead of the above code, make a global variable UIImage *photo; and initialize it with the imageData when your snapStillImage method takes the photo, like this:
photo = [[UIImage alloc] initWithData:imageData];
Since photo is a global variable, you will then be able to use that in your uploadPhoto method and send it to your server.
Hope this helps, and if you have any question, leave it in the comments.
Edit:
Since you already have an IBOutlet UIImageView *photo; in your file, you don't even need a global variable to store the UIImage. You can just replace the following line in your snapStillImage method:
UIImage *image = [[UIImage alloc] initWithData:imageData];
with this line
photo.image = [[UIImage alloc] initWithData:imageData];
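Putting the two pieces together, here is a minimal sketch of the completion handler (assuming the IBOutlet is named photo and reusing the uploadPhoto method from the question). Note the hop back to the main queue before touching UIKit, since the AVCam completion handler runs on the session queue:
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer)
    {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

        // UIKit work (the image view outlet, alerts, segues) must happen on the main queue.
        dispatch_async(dispatch_get_main_queue(), ^{
            photo.image = [[UIImage alloc] initWithData:imageData];
            [self uploadPhoto]; // the first method from the question, which reads photo.image
        });
    }
}];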

Related

Memory increase when using UIImagePickerController to take a photo and save it to the user's album

I'm using MWPhotoBrowser to display my photos taken by UIImagePickerController.
Here is my code for saving to user's album.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // NSDictionary *metadata = info[UIImagePickerControllerMediaMetadata];
    // [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    //     NSURL *url = info[UIImagePickerControllerReferenceURL];
    //     PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImageAtFileURL:url];
    //     PHAssetCollectionChangeRequest *request = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:self.assetCollection];
    //     [request addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    // } completionHandler:^(BOOL success, NSError *error) {
    //     if (success) {
    //         // self.previewImgView.image = image;
    //     }
    // }];

    ALAssetsLibrary *alassetsLibary = [[ALAssetsLibrary alloc] init];
    [alassetsLibary saveImageData:UIImageJPEGRepresentation(info[UIImagePickerControllerOriginalImage], 1.0f)
                          toAlbum:self.assetCollection.localizedTitle
                         metadata:info[UIImagePickerControllerMediaMetadata]
                       completion:^(NSURL *assetUrl, NSError *error) {
                       }
                          failure:^(NSError *error) {
                          }];

    self.previewImgView.image = [info[UIImagePickerControllerOriginalImage] getThumbnailWithSize:self.previewImgView.bounds];
    alassetsLibary = nil;
    info = nil;
}
I have used two ways to save the photo above, but neither of them avoids the problem, which is the same as yours.
In the second method I have tried changing the UIImageJPEGRepresentation quality parameter from 0.1 to 1.0, but it didn't help.
I have also tried setting alassetsLibary = nil; and info = nil;, but when I open the photo browser the memory still increases very fast.
If anybody has a solution, please tell me!
I am also wondering how Apple handles this issue in the native camera app.
Thanks.
I always use this to save pics to the album:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:((UIImage *)[info objectForKey:UIImagePickerControllerEditedImage]).CGImage
                                 metadata:[info objectForKey:UIImagePickerControllerMediaMetadata]
                          completionBlock:^(NSURL *assetURL, NSError *error) {
                              NSLog(@"assetURL %@", assetURL);
                              imageURL = [assetURL absoluteString];
                              // Save the URL for future use
                              [[NSUserDefaults standardUserDefaults] setObject:imageURL forKey:@"imageurl"];
                          }];
    [picker dismissViewControllerAnimated:YES completion:NULL];
}
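If the memory growth comes from keeping full-resolution UIImages around just to fill small preview views, one option worth trying (a sketch, not from the original answer) is to decode a downscaled thumbnail with ImageIO and display that instead of the original image:
#import <ImageIO/ImageIO.h>

// Sketch: decode a downscaled thumbnail from JPEG data so the full-size
// bitmap does not have to stay resident for a small preview.
- (UIImage *)thumbnailFromImageData:(NSData *)imageData maxPixelSize:(CGFloat)maxPixelSize
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    if (!source) {
        return nil;
    }
    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize          : @(maxPixelSize),
        (id)kCGImageSourceCreateThumbnailWithTransform   : @YES
    };
    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
    CFRelease(source);
    if (!thumbnail) {
        return nil;
    }
    UIImage *image = [UIImage imageWithCGImage:thumbnail];
    CGImageRelease(thumbnail);
    return image;
}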

Error Domain=AVFoundationErrorDomain Code=-11814 "Cannot Record"

It keeps on giving me the error:
Error Domain=AVFoundationErrorDomain Code=-11814 "Cannot Record"
I am not sure what the problem is. I am trying to record sound when the counter reaches 1, right after a picture is taken.
static int counter;
// counter will always be zero, I think, unless it is assigned.
if (counter == 0) {
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            { // [AVCaptureSession snapStillImage];
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
            }
            NSLog(@"i");
        }];
    });

    if (!_audioRecorder.recording)
    {
        // Start recording as part of the still image capture
        _playButton.enabled = NO;
        _stopButton.enabled = YES;
        [_audioRecorder record];
        for (int i = 0; i < 1000; i++)
        {
            // do nothing, just counting
        }
        // stop the recording
    }
}
else if (counter == 1)
{
    [self recordForDuration:5];
}
}
This error occurs because you are running in the iOS Simulator; you need to use a real device.
Regards
Make sure there is only one instance of AVCaptureSession running.
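For example, a sketch (the session property name is an assumption) that tears down any running session before configuring and starting a new one:
if ([self.session isRunning]) {
    [self.session stopRunning];
}
self.session = [[AVCaptureSession alloc] init];
// ... add the camera input and still image / movie outputs here ...
[self.session startRunning];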
Having your device restrict the camera access under "Settings > General > Restrictions" will also give you this error.
I ran into the same error when I was trying AVFoundation on a Mac, using a Mac Catalyst app. That is documented and intended behaviour, apparently:
https://forums.developer.apple.com/thread/124652#389519
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture
This issue behaves as intended. We do document that we are not listing devices in Catalyst mode. See "Important: iPad apps running in macOS cannot use the AVFoundation Capture classes. These apps should instead use UIImagePickerController for photo and video capture."
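Since the quoted guidance points Catalyst apps at UIImagePickerController, a minimal capture sketch along those lines (the delegate wiring is assumed) could look like:
// Sketch: present the system camera UI instead of the AVFoundation capture classes.
// Assumes self adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate.
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = @[(NSString *)kUTTypeImage, (NSString *)kUTTypeMovie]; // requires MobileCoreServices
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}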

didFinishPickingMediaWithInfo get Image filename

I am trying to get the image filename from the camera. I have seen many, many posts about retrieving the filename using UIImagePickerControllerReferenceURL or UIImagePickerControllerMediaURL, but those only get you a filename similar to: //asset/asset.JPG?id=D1D715A8-6956-49FF-AF07-CE60FE93AE32&ext=JPG. I believe this is because the image is not saved yet, so a name like img0001.jpg is not available.
Does anyone know how to get the filename? I am assuming I need to do this after UIImageWriteToSavedPhotosAlbum, but I cannot seem to figure it out.
I have seen this question asked many times, but none of the answers seem to really answer it; they always give the filename of an image chosen from the picker, not one taken with the camera.
Thanks for any help
UPDATE:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self.popoverController dismissPopoverAnimated:true];
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissViewControllerAnimated:YES completion:nil];

    if ([mediaType isEqualToString:(NSString *)kUTTypeImage])
    {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        imageView.image = image;

        if (newMedia)
        {
            UIImageWriteToSavedPhotosAlbum(image,
                                           self,
                                           @selector(image:finishedSavingWithError:contextInfo:),
                                           nil);

            NSURL *refURL = [info valueForKey:UIImagePickerControllerMediaURL];

            // Define the block to call when we get the asset based on the URL (below)
            ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *imageAsset)
            {
                ALAssetRepresentation *imageRep = [imageAsset defaultRepresentation];
                NSLog(@"[imageRep filename] : %@", [imageRep filename]);
            };

            // Get the asset library and fetch the asset based on the ref URL (pass in the block above)
            ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
            [assetslibrary assetForURL:refURL resultBlock:resultblock failureBlock:nil];
        }
    }
}
That is my code above, but when [imageRep filename] is logged it is null.
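One likely explanation (an assumption, not confirmed in the post): for still photos taken with the camera, the info dictionary usually contains no UIImagePickerControllerMediaURL, so refURL is nil and the result block never produces a filename. Here is a sketch that instead saves the image through ALAssetsLibrary and reads the filename of the newly created asset in the completion block:
// Sketch: write the camera image to the Saved Photos album, then look up the
// filename of the asset that was just created once the save has finished.
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                          orientation:(ALAssetOrientation)image.imageOrientation
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error || !assetURL) {
        NSLog(@"Save failed: %@", error);
        return;
    }
    [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        NSLog(@"Saved filename: %@", [[asset defaultRepresentation] filename]); // e.g. IMG_0001.JPG
    } failureBlock:^(NSError *lookupError) {
        NSLog(@"Asset lookup failed: %@", lookupError);
    }];
}];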

Giving a video file a specific name before saving to photo roll - iOS

I am working on an iOS app and I recently implemented an image picker in a popover that allows a user to record photos and videos. I would like to store these in the iPad's photo roll so that the user can sync them in iTunes later. I've been able to do this successfully, but I would like to give each photo and video a unique name that contains useful information about it so I can load it later. Specifically, I would like to store the photo using a property of the class, live_trial_id, as the filename. Below is the code that I am currently using to store my photos and videos to the photo roll. I understand that I could do this with metadata for the pictures, but for the videos I am lost. Thanks in advance for any help with this issue!
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    UIImage *originalImage, *editedImage, *imageToSave;

    // Handle a still image capture
    if ([mediaType isEqualToString:@"public.image"]) {
        editedImage = (UIImage *)[info objectForKey:UIImagePickerControllerEditedImage];
        originalImage = (UIImage *)[info objectForKey:UIImagePickerControllerOriginalImage];
        if (editedImage) {
            imageToSave = editedImage;
        } else {
            imageToSave = originalImage;
        }

        // Get the image metadata
        UIImagePickerControllerSourceType pickerType = picker.sourceType;
        if (pickerType == UIImagePickerControllerSourceTypeCamera)
        {
            NSDictionary *imageMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];

            // Get the assets library
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            ALAssetsLibraryWriteImageCompletionBlock imageWriteCompletionBlock =
                ^(NSURL *newURL, NSError *error) {
                    if (error) {
                        NSLog(@"Error writing image with metadata to Photo Library: %@", error);
                    } else {
                        NSLog(@"Wrote image with metadata to Photo Library");
                    }
                };

            // Save the new image (original or edited) to the Camera Roll
            [library writeImageToSavedPhotosAlbum:[imageToSave CGImage]
                                         metadata:imageMetadata
                                  completionBlock:imageWriteCompletionBlock];
        }
    }

    if ([mediaType isEqualToString:@"public.movie"]) {
        UIImagePickerControllerSourceType pickerType = picker.sourceType;
        if (pickerType == UIImagePickerControllerSourceTypeCamera)
        {
            NSURL *mediaURL = [info objectForKey:UIImagePickerControllerMediaURL];

            // Get the assets library
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            ALAssetsLibraryWriteVideoCompletionBlock videoWriteCompletionBlock =
                ^(NSURL *newURL, NSError *error) {
                    if (error) {
                        NSLog(@"Error writing video to Photo Library: %@", error);
                    } else {
                        NSLog(@"Wrote video to Photo Library");
                    }
                };

            // Save the new video to the Camera Roll
            [library writeVideoAtPathToSavedPhotosAlbum:mediaURL
                                        completionBlock:videoWriteCompletionBlock];
        }
    }
}
I would also like to avoid creating a custom library or custom metadata if at all possible. I really just want to change the filename on the way to the photo roll.
I ended up answering my own question. What I wanted to do was to save to the application's Documents directory instead of the photo roll. This allowed me to access the files that I had saved and also to sync them to the computer when the device was attached later.
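A minimal sketch of that approach (the live_trial_id property comes from the question; the .mov extension and error handling are assumptions), placed inside didFinishPickingMediaWithInfo: for the movie case:
// Sketch: copy the recorded movie from the picker's temporary URL into the
// app's Documents directory under a name derived from live_trial_id.
NSURL *mediaURL = [info objectForKey:UIImagePickerControllerMediaURL];
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *fileName = [NSString stringWithFormat:@"%@.mov", self.live_trial_id];
NSURL *destinationURL = [NSURL fileURLWithPath:[documentsDir stringByAppendingPathComponent:fileName]];

NSError *copyError = nil;
if (![[NSFileManager defaultManager] copyItemAtURL:mediaURL toURL:destinationURL error:&copyError]) {
    NSLog(@"Could not copy video to Documents: %@", copyError);
}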

Getting a URL from (to) a "picked" image, iOS

With the code below, I can extract metadata from an image (pre-added to my project) and render the info as text. This is exactly what I want to do. The SYMetadata object is created by pointing to an image via a URL, using initWithAbsolutePathURL. I want to do the same thing with a UIImage, or perhaps with the image that is being loaded into the UIImage. How do I get the URL to the image that the picker selects? Or how do I create an "asset" from this incoming image?
The documentation describes initWithAsset. I have not figured out how to use it yet, though, or whether it is the right way to go for my purpose. Any help is greatly appreciated.
NSURL *imageURL = [[NSBundle mainBundle] URLForResource:@"someImage" withExtension:@"jpg"];
SYMetadata *metadata = [[SYMetadata alloc] initWithAbsolutePathURL:imageURL];
[textView setText:[[metadata allMetadatas] description]];
Note: I tried adding an NSURL like this, imageURL = [info valueForKey:@"UIImagePickerControllerReferenceURL"];, in the didFinishPicking method, but the metadata is null after I add this URL to the above code.
If you are using the imagePickerController, the delegate method will give you what you need
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    if ([[info allKeys] containsObject:UIImagePickerControllerReferenceURL]) {
        // You will get this key if your image comes from a library
        [self setMetaDataFromAssetLibrary:info];
    } else if ([[info allKeys] containsObject:UIImagePickerControllerMediaMetadata]) {
        // If the image comes from the camera, you get the metadata in its own key
        self.rawMetaData = [self metaDataFromCamera:info];
    }
}
From Asset Library - bear in mind that it takes time to complete and has an asynchronous completion block, so you might want to add a completion flag to ensure you don't access the property before it has been updated.
- (void)setMetaDataFromAssetLibrary:(NSDictionary *)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 self.rawMetaData = asset.defaultRepresentation.metadata;
             }
            failureBlock:^(NSError *error) {
                NSLog(@"error %@", error);
            }];
}
From Camera:
- (NSDictionary *)metaDataFromCamera:(NSDictionary *)info
{
    NSMutableDictionary *imageMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    return imageMetadata;
}
Here is how to get metadata from a UIImage
- (NSDictionary *)metaDataFromImage:(UIImage *)image
{
    NSData *jpegData = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0)];
    return [self metaDataFromData:jpegData];
}
But take care: a UIImage may already be stripped of much of the original's metadata. You will be better off getting the metadata from the NSData that was used to create the UIImage:
- (NSDictionary *)metaDataFromData:(NSData *)data
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    if (!source) {
        return nil;
    }
    CFDictionaryRef imageMetaData = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    CFRelease(source);
    // Transfer ownership of the copied dictionary to ARC so it is not leaked.
    return (NSDictionary *)CFBridgingRelease(imageMetaData);
}
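For example (a sketch reusing the bundled image from the question), read the original file bytes and pass them to metaDataFromData: rather than re-encoding a UIImage:
NSURL *imageURL = [[NSBundle mainBundle] URLForResource:@"someImage" withExtension:@"jpg"];
NSData *originalData = [NSData dataWithContentsOfURL:imageURL];
NSDictionary *metadata = [self metaDataFromData:originalData];
NSLog(@"metadata: %@", metadata);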
If you have an ALAsset (in my sample, _detailItem), you can get the metadata this way:
NSDictionary *myMetadata = [[_detailItem defaultRepresentation] metadata];
