Cropping an image before sending it to the server on iOS

I'm trying to crop an image before sending to the server and I'm having issues.
I was trying to do this:
imageUploadReq.photo = [self encodeToBase64String:[UIImage imageWithData:UIImageJPEGRepresentation(fileData, 0.07f)]];
But Xcode is complaining that "Incompatible pointer types passing NSData * to parameter type UIImage". I tried to cast it, but it wouldn't work either.
Here is the code:
- (void)uploadPhoto {
    NSData *fileData;
    if (self.image != nil) {
        UIImage *newImage = [self resizeImage:self.image toWidth:320.0f andHeight:480.0f];
        fileData = UIImageJPEGRepresentation(newImage, 0.07f);
    }
    WUTModelImageUploadReq *imageUploadReq = [[WUTModelImageUploadReq alloc] init];
    // I'm trying to set the first parameter of UIImageJPEGRepresentation to fileData
    imageUploadReq.photo = [self encodeToBase64String:[UIImage imageWithData:UIImageJPEGRepresentation(self.viewControllerPost.imageForPost, 0.07f)]];
    imageUploadReq.extension = @"jpg";
    void (^wsSuccessHandler)(AFHTTPRequestOperation *operation, NSDictionary *responseObject) = ^(AFHTTPRequestOperation *operation, id responseObject) {
        NSLog(@"Pull Feed responseObject %@", responseObject);
        NSError *error;
        WUTModelPostImageResponse *wsResponse = [[WUTModelPostImageResponse alloc] initWithDictionary:(NSDictionary *)responseObject error:&error];
        if (error) {
            errorMessage = @"Failure to upload image.";
            [self postExecuteFail];
        } else {
            if (wsResponse.success) {
                WUTModelImage *imageTemp = [wsResponse.data firstObject];
                [postItem setObject:imageTemp.photo forKey:@"photo"];
                [self uploadPostFeed];
            } else {
                errorMessage = @"Failure to upload image.";
                [self postExecuteFail];
            }
        }
    };
    void (^wsErrorHandler)(AFHTTPRequestOperation *operation, NSError *error) = ^(AFHTTPRequestOperation *operation, NSError *error) {
        if ([error.localizedDescription rangeOfString:@"401"].location != NSNotFound)
            errorMessage = @"It seems that your login session has expired. Please log in again after logging out.";
        else
            errorMessage = @"Failure to upload image.";
        [self postExecuteFail];
    };
    AFHTTPRequestOperation *op = [WUTCommonWebServices WebServicePostCallWithAccessTokenForEndPoint:WS_UploadImage WithJson:imageUploadReq ForSuccess:wsSuccessHandler ForFailure:wsErrorHandler];
    [op start];
}

I think you need this:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
You just need to pass in the original image and your desired size, and it will return the resized image.
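Note that UIGraphicsBeginImageContext renders at a scale factor of 1.0. If the resized image should stay sharp on Retina screens, a variant under the same assumptions is to use UIGraphicsBeginImageContextWithOptions and pass 0.0 for the scale:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    // Passing 0.0 for the scale uses the main screen's scale, so the output matches the device's resolution
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}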
Call the method like:
UIImage *originalImage = [info valueForKey:UIImagePickerControllerOriginalImage]; // here originalImage is the image taken from the camera, or whatever image you want
UIImage *image = [GlobalFunction imageWithImage:originalImage scaledToSize:CGSizeMake(200, 200)]; // GlobalFunction is the class where I have defined this method; it returns a UIImage of size (200, 200)
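Tying this back to the compile error in the question: UIImageJPEGRepresentation expects a UIImage, so the resized UIImage should be passed to it, not the NSData it produced. A minimal sketch of the upload setup, assuming the Base64 string can be built straight from fileData (since it is already JPEG data) rather than going back through encodeToBase64String::
UIImage *newImage = [self resizeImage:self.image toWidth:320.0f andHeight:480.0f];
NSData *fileData = UIImageJPEGRepresentation(newImage, 0.07f);   // UIImage in, NSData out
imageUploadReq.photo = [fileData base64EncodedStringWithOptions:0];
imageUploadReq.extension = @"jpg";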

Related

iOS: calling a function is giving an error

I am using the MTBBarcodeScanner interface to implement a barcode scanner application.
I need to get a still image from the scanner in my code, so I am trying to call this function:
- (void)captureStillImage:(void (^)(UIImage *image, NSError *error))captureBlock {
    if ([self isCapturingStillImage]) {
        if (captureBlock) {
            NSError *error = [NSError errorWithDomain:kErrorDomain
                                                 code:kErrorCodeStillImageCaptureInProgress
                                             userInfo:@{NSLocalizedDescriptionKey : @"Still image capture is already in progress. Check with isCapturingStillImage"}];
            captureBlock(nil, error);
        }
        return;
    }
    AVCaptureConnection *stillConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (stillConnection == nil) {
        if (captureBlock) {
            NSError *error = [NSError errorWithDomain:kErrorDomain
                                                 code:kErrorCodeSessionIsClosed
                                             userInfo:@{NSLocalizedDescriptionKey : @"AVCaptureConnection is closed"}];
            captureBlock(nil, error);
        }
        return;
    }
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:stillConnection
                                                       completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (error) {
            captureBlock(nil, error);
            return;
        }
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        if (captureBlock) {
            captureBlock(image, nil);
        }
    }];
}
From my view controller I am calling this function like:
UIImage *img;
NSError *e;
[_scanner captureStillImage:img :e];
but it gives me the error:
No visible @interface for 'MTBBarcodeScanner' declares the selector 'captureStillImage::'
How can I call this function from my UIViewController subclass?
The syntax of your block is incorrect. It should be the following:
[_scanner captureStillImage:^(UIImage *image, NSError *error) {
}];
Also, this is a callback: you are not supposed to feed parameters into it; they are returned to you through it.
If you would like to have variables representing the callback's return values outside your callback, you need to declare __block variables.
__block UIImage *img;
__block NSError *e;
[_scanner captureStillImage:^(UIImage *image, NSError *error) {
    img = image;
    e = error;
}];
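Keep in mind that captureStillImage: is asynchronous, so img and e are still nil immediately after the call; they are only populated once the capture finishes. It is usually simpler to do the work inside the block itself. A minimal sketch, assuming a hypothetical previewImageView outlet on your view controller:
[_scanner captureStillImage:^(UIImage *image, NSError *error) {
    if (error) {
        NSLog(@"Still image capture failed: %@", error);
        return;
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewImageView.image = image; // previewImageView is a hypothetical UIImageView outlet
    });
}];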

GoogleDrive file download issue

I am able to download a file from the Google Drive API using the following code.
NSString *url = [NSString stringWithFormat:@"https://www.googleapis.com/drive/v3/files/%@?alt=media", identifier];
GTMSessionFetcher *fetcher = [self.service.fetcherService fetcherWithURLString:url];
[fetcher beginFetchWithCompletionHandler:^(NSData *data, NSError *error) {
    if (error == nil) {
        NSLog(@"Retrieved file content");
        // Do something with data
        UIImage *img = [UIImage imageWithData:data];
        NSLog(@"%@", img);
    } else {
        NSLog(@"An error occurred: %@", error);
        [self showErrorAlert:error];
    }
}];
It was working fine (I was getting the image from the data).
Now it is not able to create the image:
UIImage *img = [UIImage imageWithData:data];
NSLog(@"%@", img);
It is logging null.
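One way to narrow this down (a diagnostic sketch, not a confirmed fix) is to log how many bytes actually arrived and, when image decoding fails, dump the payload as text to see what the server actually sent back instead of the expected image bytes:
[fetcher beginFetchWithCompletionHandler:^(NSData *data, NSError *error) {
    if (error == nil) {
        NSLog(@"Retrieved %lu bytes", (unsigned long)data.length);
        UIImage *img = [UIImage imageWithData:data];
        if (img == nil) {
            // If this prints JSON or HTML, the response is an error payload, not image data
            NSString *body = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
            NSLog(@"Could not decode image, raw response: %@", body);
        }
    } else {
        NSLog(@"An error occurred: %@", error);
    }
}];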

Updating image and saving it in dictionary

I am getting an image as a URL from a dictionary on ViewController A, and I have passed that dictionary to ViewController B. I want it so that if the user has updated the image, the updated image is shown; otherwise the previous image is shown. I am using the following code for it. Kindly check and tell me why it is not working as desired and shows only the previous image in every case.
- (void)showUserImage:(NSURL *)imgUrl
{
    [ConnectionManager setSharedCacheForImages];
    NSURLRequest *request = [[NSURLRequest alloc] initWithURL:imgUrl];
    NSURLSession *session = [ConnectionManager prepareSessionForRequest];
    NSCachedURLResponse *cachedResponse = [[NSURLCache sharedURLCache] cachedResponseForRequest:request];
    if (cachedResponse.data) {
        UIImage *downloadedImage = [UIImage imageWithData:cachedResponse.data];
        dispatch_async(dispatch_get_main_queue(), ^{
            _profileImageView.image = downloadedImage;
        });
    } else {
        NSURLSessionDataTask *task = [session dataTaskWithRequest:request completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
            NSHTTPURLResponse *res = (NSHTTPURLResponse *)response;
            if (res.statusCode == 200) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    _profileImageView.image = [UIImage imageWithData:data];
                });
            }
        }];
        [task resume];
    }
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo {
    if (_profileImageView.image == [_detailsDictionary valueForKey:@"ProfilePictureUrl"]) {
        NSLog(@"The url of the image is %@", [_detailsDictionary valueForKey:@"ProfilePictureUrl"]);
    }
    else {
        _profileImageView.image = image;
        UIImage *updatedImage = _profileImageView.image;
        NSData *imageData = UIImageJPEGRepresentation(updatedImage, 100);
        NSString *strEncoded = [imageData base64EncodedStringWithOptions:0];
        [_detailsDictionary setObject:strEncoded forKey:@"ProfilePictureUrl"];
        [self dismissViewControllerAnimated:YES completion:nil];
    }
}
@Dirtydanee is absolutely correct: you are doing an incompatible comparison between a URL and a UIImage. Please correct this with the following code.
NSData *data1 = UIImagePNGRepresentation(previousImage);
NSData *data2 = UIImagePNGRepresentation(currentImage);
if ([data1 isEqualToData:data2]) {
    // Do something
} else {
    // Do something
}
Convert the images into NSData and compare the data.
If you want a bit-by-bit comparison, please look at the following link:
Generate hash from UIImage
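Applied to the picker delegate from the question, a minimal sketch of that data comparison (assuming the previous image is whatever _profileImageView currently shows) could look like:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo {
    NSData *currentData = UIImagePNGRepresentation(_profileImageView.image);
    NSData *pickedData = UIImagePNGRepresentation(image);
    if (![currentData isEqualToData:pickedData]) {
        _profileImageView.image = image;
        // Note: UIImageJPEGRepresentation expects a quality between 0.0 and 1.0, not 100
        NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
        NSString *strEncoded = [imageData base64EncodedStringWithOptions:0];
        [_detailsDictionary setObject:strEncoded forKey:@"ProfilePictureUrl"];
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}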
The problem seems to be in this line:
if(_profileImageView.image == [_detailsDictionary valueForKey:@"ProfilePictureUrl"]) {
You are trying to compare _profileImageView.image, which is a UIImage, with [_detailsDictionary valueForKey:@"ProfilePictureUrl"], which is an NSURL instance coming from the dictionary.
What you could do instead is check whether the picked image and the profile image are the same.
if(_profileImageView.image == image) {
// etc..
To clear previously cached images, just call:
[[NSURLCache sharedURLCache] removeAllCachedResponses];
Hope this helps!

Passing Data to Singleton iOS

I'm currently having some trouble with data getting lost when transferring from a ViewController to a subclass of PFFile. The data being passed is image data to upload to a user's profile. Here's the code for selecting the image:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Access the uncropped image from info dictionary
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    // Dismiss controller
    [picker dismissViewControllerAnimated:YES completion:nil];
    // Resize image
    _focusedImage.image = image;
    NSData *imageData = UIImageJPEGRepresentation(image, 0.05f);
    PFFile *imageFile = [PFFile fileWithName:@"Image.jpg" data:imageData];
    [[imageUpload uploadImage] setImagePFFile:imageFile];
}
Logging imageFile in this view prints out correctly. However, the data seems to get lost when I pass it through to my singleton class imageUpload. This is what uploadImage looks like:
+ (imageUpload *)uploadImage
{
    static imageUpload *_sharedImageUpload = nil;
    _sharedImageUpload = [[self alloc] init];
    _sharedImageUpload.imageData = [[NSData alloc] init];
    PFUser *user = [PFUser currentUser];
    _sharedImageUpload.imagePFFile = [[PFFile alloc] init];
    PFFile *imageFile = [PFFile fileWithName:@"Image.jpg" data:_sharedImageUpload.imageData];
    [imageFile saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
        if (!error)
        {
            [user setObject:imageFile forKey:@"image"];
            [user saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
                if (!error)
                {
                    NSLog(@"This should be the profile image upload");
                }
                else
                {
                    NSLog(@"Something went wrong: %@", error);
                }
            }];
        }
    }];
    return _sharedImageUpload;
}
When I get to this point, the system just uploads a blank file (zero bytes) to Parse. The naming is right and it's going to the right place in the database, but somewhere along the line the data in the file itself is being lost. I can't figure out why. Does anyone have any suggestions?
It looks like you're confusing objects and methods. What you want is a singleton object that has a method / function that uploads your image. I think this is what you're looking for:
//ImageUploader.h
#import <Foundation/Foundation.h>

@class PFFile;

@interface ImageUploader : NSObject
+ (instancetype)uploader;
- (void)uploadImageFile:(PFFile *)aFile;
@end

//ImageUploader.m
#import "ImageUploader.h"
#import <Parse/Parse.h>

@implementation ImageUploader

+ (instancetype)uploader {
    static ImageUploader *_uploader = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        _uploader = [[self alloc] init];
    });
    return _uploader;
}

- (void)uploadImageFile:(PFFile *)imageFile {
    PFUser *user = [PFUser currentUser];
    [imageFile saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
        if (!error)
        {
            [user setObject:imageFile forKey:@"image"];
            [user saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
                if (!error)
                {
                    NSLog(@"This should be the profile image upload");
                }
                else
                {
                    NSLog(@"Something went wrong: %@", error);
                }
            }];
        }
    }];
}

@end
You invoke it by calling [[ImageUploader uploader] uploadImageFile:someFile].
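For completeness, a sketch of the picker delegate from the question rewritten against this uploader (same PFFile construction as before, just handed to the singleton instead of the setImagePFFile: call):
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
    _focusedImage.image = image;
    NSData *imageData = UIImageJPEGRepresentation(image, 0.05f);
    PFFile *imageFile = [PFFile fileWithName:@"Image.jpg" data:imageData];
    // Save the file built from the real image data, rather than one built from an empty NSData
    [[ImageUploader uploader] uploadImageFile:imageFile];
}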

Downloading data asynchronously on iOS

I am getting user information (picture and username) from Facebook and then showing it to the user. The problem is that the picture arrives late, so I show an SVProgressHUD with "Loading...". I want to dismiss the SVProgressHUD after the picture has downloaded and been shown to the user. Do I need to download asynchronously or something like that?
Here is the relevant part of my code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    RoundedImageView *profileImageView = [[RoundedImageView alloc] initWithFrame:CGRectMake(27, 80, 70, 70)];
    _userNameLabel.hidden = YES;
    profileImageView.hidden = YES;
    [SVProgressHUD showWithStatus:@"Loading..."];
    //[NSTimer scheduledTimerWithTimeInterval:2.0f target:self selector:@selector(LoadingDismiss) userInfo:nil repeats:NO];
    [[FBRequest requestForMe] startWithCompletionHandler:^(FBRequestConnection *connection, NSDictionary<FBGraphUser> *FBuser, NSError *error) {
        if (error) {
            // Handle error
        }
        else {
            NSString *username = [FBuser name];
            NSLog(@"username = %@", username);
            NSString *userBirtday = [FBuser birthday];
            NSLog(@"birthday = %@", userBirtday);
            NSString *email = [FBuser objectForKey:@"email"];
            NSLog(@"email = %@", email);
            NSString *userID = [FBuser objectForKey:@"id"];
            NSLog(@"userID = %@", userID);
            //========================================================================== ResimAlma
            NSString *userImageURL = [NSString stringWithFormat:@"https://graph.facebook.com/%@/picture?type=normal", userID];
            NSURL *imageURL = [NSURL URLWithString:userImageURL];
            NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
            UIImage *image = [UIImage imageWithData:imageData];
            // Configuring the rounded image view by setting the appropriate image and offset.
            profileImageView.imageOffset = 2.5;
            profileImageView.image = image;
            profileImageView.backgroundImage = [UIImage imageNamed:@"dp_holder_large.png"];
            [self.view addSubview:profileImageView];
            if (image == nil) {
                profileImageView.imageOffset = 2.5;
                profileImageView.image = [UIImage imageNamed:@"noImage.png"];
                profileImageView.backgroundImage = [UIImage imageNamed:@"dp_holder_large.png"];
            } else {
                profileImageView.imageOffset = 2.5;
                profileImageView.image = image;
                profileImageView.backgroundImage = [UIImage imageNamed:@"dp_holder_large.png"];
            }
            _userNameLabel.text = username;
            _userNameLabel.hidden = NO;
            profileImageView.hidden = NO;
        }
    }];
    [SVProgressHUD dismiss];
}
Thank you for your interest and help. :)
Put [SVProgressHUD dismiss] at the end of your completion block. With your current code, the progress indicator dismisses immediately after you make the request to Facebook (since that call is non-blocking).
Put
[SVProgressHUD dismiss];
inside your completion block
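A sketch of how the completion handler could look with the dismissal moved inside it (and, as an added assumption, the synchronous dataWithContentsOfURL: moved off the main thread so the UI is not blocked while the picture downloads):
[[FBRequest requestForMe] startWithCompletionHandler:^(FBRequestConnection *connection, NSDictionary<FBGraphUser> *FBuser, NSError *error) {
    if (error) {
        [SVProgressHUD dismiss];
        return;
    }
    NSString *userID = [FBuser objectForKey:@"id"];
    NSString *userImageURL = [NSString stringWithFormat:@"https://graph.facebook.com/%@/picture?type=normal", userID];
    // Download the picture on a background queue, then update the UI and dismiss the HUD on the main queue
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:userImageURL]];
        dispatch_async(dispatch_get_main_queue(), ^{
            profileImageView.image = [UIImage imageWithData:imageData];
            profileImageView.hidden = NO;
            _userNameLabel.text = [FBuser name];
            _userNameLabel.hidden = NO;
            [SVProgressHUD dismiss];
        });
    });
}];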
