Send UIImage to server - iOS

I want to send a UIImage from my application to the server. I use ASIHTTPRequest. I'll send NSData, but how do I convert a UIImage to NSData?

If you need PNG data in your NSData you can use:
NSData *data = UIImagePNGRepresentation(img);
Where img is your UIImage. There is a similar function, UIImageJPEGRepresentation(), for JPEG data.

The UIKit functions UIImageJPEGRepresentation() and UIImagePNGRepresentation() should do what you want.
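For example, here is a minimal sketch of converting the image and attaching it to an ASIFormDataRequest; the endpoint URL, file name, and form field key are placeholders you would replace with your server's values:
#import "ASIFormDataRequest.h"

- (void)uploadImage:(UIImage *)img
{
    // Lossless PNG; use UIImageJPEGRepresentation(img, 0.8) for a smaller JPEG payload
    NSData *imageData = UIImagePNGRepresentation(img);

    NSURL *uploadURL = [NSURL URLWithString:@"http://example.com/upload"]; // placeholder endpoint
    ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:uploadURL];
    [request setData:imageData
        withFileName:@"photo.png"
      andContentType:@"image/png"
              forKey:@"image"]; // placeholder form field key
    [request startAsynchronous];
}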

Related

Image is not showing in image view in iOS

http://hauwengweb.azurewebsites.net/api/AccomodationImages/images/1
I'm trying to download an image into an image view. If you paste this URL into a browser it displays, but in the image view it doesn't show. If I try any other image, the same code works, but with this URL the image does not show.
The image in question seems to be a WebP image (served with the wrong MIME type of image/png), which is not a format natively supported by UIImage. However, you can use iOS-WebP to decode the image:
Add the framework, header and implementation to your project, then use:
#import "UIImage+WebP.h"
NSURL *url = [NSURL URLWithString:@"http://hauwengweb.azurewebsites.net/api/AccomodationImages/images/1"];
NSData *data = [NSData dataWithContentsOfURL:url];
imageView.image = [UIImage imageWithWebPData:data];
And please remember to do the download and decoding steps asynchronously so as not to block the main UI.
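For instance, a rough GCD-based sketch, reusing the url and imageView from the snippet above:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Network fetch and WebP decode off the main thread
    NSData *data = [NSData dataWithContentsOfURL:url];
    UIImage *decoded = [UIImage imageWithWebPData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit must be touched on the main thread
        imageView.image = decoded;
    });
});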
Try this?
NSURL *url = [NSURL URLWithString:@"http://hauwengweb.azurewebsites.net/api/AccomodationImages/images/1"];
NSData *data = [NSData dataWithContentsOfURL:url];
self.img.image = [UIImage imageWithData:data];
Edit
The URL you provided is not an absolute path, so the data being fetched cannot be converted into a UIImage. There is something wrong with the URL or its formatting.

How do I save an image in Core Data then retrieve it? Using Swift

I don't want to use NSUserDefaults, so how do I save an image in Core Data and then retrieve it?
My image is in this variable.
var image: UIImage = image1
Can you please give me some sample code to do this?
UIImage -> NSData
NSData *imageData = UIImagePNGRepresentation(image); // or UIImageJPEGRepresentation(image, 1.0)
NSData -> UIImage
UIImage *image = [UIImage imageWithData:data];
Convert UIImage into NSData and save as the entity's attribute with the type Binary Data
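As a rough Objective-C sketch, assuming a hypothetical Photo entity with a Binary Data attribute named imageData and an existing NSManagedObjectContext called context (the same calls exist in Swift):
#import <CoreData/CoreData.h>

// Saving: convert the UIImage to NSData and set it on the entity
NSData *imageData = UIImagePNGRepresentation(image);
NSManagedObject *photo = [NSEntityDescription insertNewObjectForEntityForName:@"Photo"
                                                        inManagedObjectContext:context];
[photo setValue:imageData forKey:@"imageData"];
NSError *error = nil;
[context save:&error];

// Retrieving: fetch the entity and turn the data back into a UIImage
NSFetchRequest *fetch = [NSFetchRequest fetchRequestWithEntityName:@"Photo"];
NSManagedObject *stored = [[context executeFetchRequest:fetch error:&error] firstObject];
UIImage *restored = [UIImage imageWithData:[stored valueForKey:@"imageData"]];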
This question has already been answered on Stack Overflow; refer to:
How to store an image in core data

Uploading image to WCF json service

I have a function in a WCF JSON service that takes two parameters to upload an image:
Public Function UploadDamageImage(ByVal UploadDamageImageRequest As UploadDamageImageRequest) As UploadDamageImageResponse
How can I send an image as a parameter to this function as bytes?
I'm using AFHTTPRequestOperationManager.
To send a UIImage as bytes to a function, you can first convert the UIImage to a NSData object and then get the byte array from that.
UIImage *image = // your image...
NSData *imgData = UIImagePNGRepresentation(image);
NSString *byteArr = [imgData base64EncodedStringWithOptions:0]; // base64Encoding is deprecated; this is the iOS 7+ replacement
For the second line of code, the Apple Docs for UIImage explains:
...you can get an NSData object containing either a PNG or JPEG representation of the image data using the UIImagePNGRepresentation and UIImageJPEGRepresentation functions.
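You could then pass that string to the service with AFHTTPRequestOperationManager; the URL and the dictionary keys below are guesses at your WCF contract, so adjust them to match UploadDamageImageRequest:
#import "AFHTTPRequestOperationManager.h"

AFHTTPRequestOperationManager *manager = [AFHTTPRequestOperationManager manager];
manager.requestSerializer = [AFJSONRequestSerializer serializer];

// The keys must match the WCF data contract -- placeholders here
NSDictionary *parameters = @{ @"UploadDamageImageRequest" : @{ @"ImageBytes" : byteArr } };

[manager POST:@"http://example.com/Service.svc/UploadDamageImage" // placeholder URL
   parameters:parameters
      success:^(AFHTTPRequestOperation *operation, id responseObject) {
          NSLog(@"Upload succeeded: %@", responseObject);
      }
      failure:^(AFHTTPRequestOperation *operation, NSError *error) {
          NSLog(@"Upload failed: %@", error);
      }];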

Get thumbnail from ALAssetsRepresentation as NSData

I am writing NSData to a file and saving it in the device's app documents folder. For that, is it possible to get the thumbnail from an ALAssetsRepresentation object as NSData? If so, are there any helpful links for that?
I couldn't find anything similar, other than getting a CGImageRef from ALAssetsRepresentation. I don't want the CGImageRef format, since I would then have to use UIImageJPEGRepresentation or UIImagePNGRepresentation to convert it to NSData.
Try this one
CGImageRef iref = [myasset thumbnail];
if (iref)
{
    UIImage *theThumbnail = [UIImage imageWithCGImage:iref];
    NSData *thumbnailData = UIImagePNGRepresentation(theThumbnail);
}
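Once you have the NSData, writing it into the documents folder could look roughly like this (the file name is arbitrary):
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                              NSUserDomainMask, YES) firstObject];
NSString *filePath = [documentsDir stringByAppendingPathComponent:@"thumbnail.png"];
BOOL written = [thumbnailData writeToFile:filePath atomically:YES];
NSLog(@"Thumbnail %@ written to %@", written ? @"was" : @"was not", filePath);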

Can't display BLOB image in iOS application

I'm trying to display a BLOB image (fetched from a web server using JSON) in my iOS app, but when I run my application I get an empty UIImageView. Here is my code:
NSData *dataURL = [NSData dataWithContentsOfURL:[NSURL URLWithString:encodedUrl]];
NSData *profileImage1 = [[NSData alloc] initWithBytes:[dataURL bytes] length:[dataURL length]];
UIImage *profileImage2 = [UIImage imageWithData:profileImage1];
[profilImage setImage:profileImage2];
How can I fix this problem?
[UIImage imageWithData:data] only parses known image file formats such as JPEG, PNG, etc. (full details in the documentation). Raw blobs aren't supported by UIImage, so you need to decode the data before you can hand it to UIImage. You can use GTMBase64 for encoding and decoding the data. Read the docs and other posts and you'll find out how to change your code.
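For example, if the server returns the blob as a base64 string inside the JSON (an assumption, and the profileImage key below is hypothetical), decoding it first could look like:
// Assumes the parsed JSON dictionary contains the image as a base64-encoded string
NSString *base64String = json[@"profileImage"]; // hypothetical key
NSData *imageData = [[NSData alloc] initWithBase64EncodedString:base64String
                                                         options:NSDataBase64DecodingIgnoreUnknownCharacters];
[profilImage setImage:[UIImage imageWithData:imageData]];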
Hope this helps.
