I am trying to use this method and then upload the resulting imageData to my service, but the result does not take the image's orientation into account, so I end up uploading images in the wrong orientation.
Any suggestions?
I solved the exact same problem using this function. To keep the original image size, just pass size == CGSizeMake(10000000000, 10000000000). Hope this helps :)
+ (UIImage *)createThumbFromImageIO:(UIImage *)image maxPixelSize:(CGSize)size
{
    NSData *imagedata = UIImageJPEGRepresentation(image, 1);
    CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imagedata, NULL);
    if (!imageSource) return nil;

    CGSize imageSize = CGSizeMake(size.width, size.height);

    // kCGImageSourceCreateThumbnailWithTransform bakes the EXIF orientation
    // into the generated pixels, which fixes the upload-orientation problem.
    CFDictionaryRef options = (__bridge CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:
        (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
        (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
        (id)[NSNumber numberWithFloat:imageSize.height], (id)kCGImageSourceThumbnailMaxPixelSize,
        nil];
    CGImageRef imgRef = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
    image = [UIImage imageWithCGImage:imgRef];
    CGImageRelease(imgRef);
    CFRelease(imageSource);
    return image;
}
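For completeness, a hypothetical call site (MyClass and pickedImage are placeholder names): passing an enormous max pixel size, as suggested above, keeps the full resolution while still applying the orientation transform.

// Bake the EXIF orientation into the pixels without shrinking the image,
// then encode for upload.
UIImage *upright = [MyClass createThumbFromImageIO:pickedImage
                                      maxPixelSize:CGSizeMake(10000000000, 10000000000)];
NSData *uploadData = UIImageJPEGRepresentation(upright, 0.8);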
I'm trying to convert a CVPixelBufferRef into a UIImage using the following snippet:
UIImage *image = nil;
CMSampleBufferRef sampleBuffer = (CMSampleBufferRef)CMBufferQueueDequeueAndRetain(_queue);
if (sampleBuffer)
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    NSUInteger width = CVPixelBufferGetWidth(pixelBuffer);
    NSUInteger height = CVPixelBufferGetHeight(pixelBuffer);
    CIImage *coreImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:nil];
    CGImageRef imageRef = [_context createCGImage:coreImage fromRect:CGRectMake(0, 0, width, height)];
    image = [UIImage imageWithCGImage:imageRef];
    CFRelease(sampleBuffer);
    CGImageRelease(imageRef);
}
My problem is that it works fine when I run the code on a device, but it fails to render in the simulator; the console outputs the following:
Render failed because a pixel format YCC420f is not supported
Any ideas?
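This thread doesn't settle it, but a common workaround (assuming the sample buffers come from an AVCaptureVideoDataOutput) is to request 32BGRA pixel buffers instead of the default bi-planar YCbCr ("420f") format, which the simulator's renderer rejects:

#import <AVFoundation/AVFoundation.h>

// Ask the capture output for BGRA buffers so Core Image can render them
// in the simulator as well as on the device.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};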
I am successfully displaying text, but how can I display an image in the PDF file?
I am new to PDF conversion.
Thanks in advance.
I do it this way; my source image is in an NSData object:
NSMutableData *pdfData = [NSMutableData data];
// A4 page at 300 dpi: 2480 x 3508 pixels.
UIGraphicsBeginPDFContextToData(pdfData, CGRectMake(0, 0, 2480, 3508), nil);
UIImage *aImg = [UIImage imageWithData:myImageData];
UIGraphicsBeginPDFPage();
// Re-encode as JPEG so the PDF embeds a compressed image instead of raw pixels.
NSData *jpegData = UIImageJPEGRepresentation(aImg, 0.5);
CGDataProviderRef dp = CGDataProviderCreateWithCFData((__bridge CFDataRef)jpegData);
CGImageRef cgImage = CGImageCreateWithJPEGDataProvider(dp, NULL, true, kCGRenderingIntentDefault);
[[UIImage imageWithCGImage:cgImage] drawInRect:CGRectMake(0, 0, aImg.size.width, aImg.size.height)];
CGImageRelease(cgImage);   // avoid leaking the CGImage
CGDataProviderRelease(dp); // and its data provider
UIGraphicsEndPDFContext();
return pdfData;
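To persist the result somewhere, a short sketch (the file name here is just an example):

// Write the generated PDF under the app's Documents directory.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *pdfPath = [docs stringByAppendingPathComponent:@"output.pdf"]; // hypothetical name
[pdfData writeToFile:pdfPath atomically:YES];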
When a user selects an image from their photo library, I resize it and upload it to my server, and at some other point in the app the user can view all their photos. (I'm simplifying the workflow.)
The UIImageView on the "detail" screen is 320 x 320. Based upon the method below, should I be using:
UIImage *image = [UIImage imageWithCGImage:img];
or
UIImage *image = [UIImage imageWithCGImage:img scale:[UIScreen mainScreen].scale orientation:img.imageOrientation];
Part B: when I download the image (NSData) again, should I use imageWithCGImage: or imageWithCGImage:scale:orientation:?
- (UIImage *)resizedImageForUpload:(UIImage *)originalImage {
    static CGSize __maxSize = {640, 640};

    NSMutableData *data = [[NSMutableData alloc] initWithData:UIImageJPEGRepresentation(originalImage, 1.0)];
    CFMutableDataRef dataRef = (__bridge CFMutableDataRef)data;
    CGImageSourceRef imgSrc = CGImageSourceCreateWithData(dataRef, NULL);

    // maxDimensionForConstraintSize: is a custom UIImage category helper (not shown here).
    CGFloat width = [originalImage maxDimensionForConstraintSize:__maxSize];
    NSNumber *maxWidth = [NSNumber numberWithFloat:width];
    NSDictionary *options = @{
        (__bridge NSString *)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (__bridge NSString *)kCGImageSourceCreateThumbnailWithTransform : @YES,
        (__bridge NSString *)kCGImageSourceThumbnailMaxPixelSize : maxWidth
    };
    CFDictionaryRef cfOptions = (__bridge CFDictionaryRef)options;
    CGImageRef img = CGImageSourceCreateThumbnailAtIndex(imgSrc, 0, cfOptions);

    // Write the thumbnail back out to the same mutable data, preserving the container format.
    CFStringRef type = CGImageSourceGetType(imgSrc);
    CGImageDestinationRef imgDest = CGImageDestinationCreateWithData(dataRef, type, 1, NULL);
    CGImageDestinationAddImage(imgDest, img, NULL);
    CGImageDestinationFinalize(imgDest);

    UIImage *image = [UIImage imageWithCGImage:img];
    CFRelease(imgSrc);
    CGImageRelease(img);
    CFRelease(imgDest);
    return image;
}
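For context, a hypothetical call site (pickedImage and the image view are placeholder names), resizing before upload and reusing the returned image for display:

UIImage *resized = [self resizedImageForUpload:pickedImage];
NSData *uploadData = UIImageJPEGRepresentation(resized, 0.8); // encode for the POST body
self.imageView.image = resized;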
It appears I've found the answer to my own question. The resizedImageForUpload: method shouldn't try to scale based upon the device. Since I'm defining a max size of 640 x 640 (retina size for my 320 x 320 UIImageView), no other manipulation is necessary. I've added some caching for the images, and I'm handling the scaling at that point:
UIImage *image = [UIImage imageWithData:imgData scale:[UIScreen mainScreen].scale];
and then returning it. The reason I thought I had messed something up was that I was trying to scale an already-scaled image. Lesson learned.
I'm currently uploading an image to a server using Imgur on iOS with the following code:
NSData* imageData = UIImagePNGRepresentation(image);
NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* fullPathToFile = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"SBTempImage.png"];
[imageData writeToFile:fullPathToFile atomically:NO];
[uploadRequest setFile:fullPathToFile forKey:@"image"];
The code works fine in the simulator when uploading a file from the simulator's photo library, because I'm on a fast Ethernet connection. However, the same code times out on the iPhone when selecting an image taken with the iPhone. So I tried saving a small image from the web and uploading that, which worked.
This leads me to believe the large images taken by the iPhone are timing out over the somewhat slow 3G network. Is there any way to compress/resize the image from the iPhone before sending it?
This snippet will resize the image:
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The variable newSize is a CGSize and can be defined like so:
CGSize newSize = CGSizeMake(100.0f, 100.0f);
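One caveat: UIGraphicsBeginImageContext always renders at a scale of 1.0. If the result will be shown on a Retina screen, the UIGraphicsBeginImageContextWithOptions variant (a standard UIKit API, not something specific to this answer) keeps it sharp:

// Pass 0.0 as the scale to use the device's main screen scale.
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();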
A self-contained solution:
- (UIImage *)compressForUpload:(UIImage *)original scale:(CGFloat)scale
{
    // Calculate new size given scale factor.
    CGSize originalSize = original.size;
    CGSize newSize = CGSizeMake(originalSize.width * scale, originalSize.height * scale);

    // Scale the original image to match the new size.
    UIGraphicsBeginImageContext(newSize);
    [original drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *compressedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return compressedImage;
}
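Called like so (the 0.5 scale factor is illustrative):

// Half the linear dimensions, i.e. roughly a quarter of the pixels.
UIImage *smaller = [self compressForUpload:originalImage scale:0.5f];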
Thanks to @Tuan Nguyen.
To complement @Tuan Nguyen's answer, this is maybe the fastest and most elegant way to do it.
Following John Muchow's post at iphonedevelopertips.com, adding a category to UIImage is a very handy way to scale images quickly.
Just calling
UIImage *_image = [[[UIImage alloc] initWithData:SOME_NSDATA] scaleToSize:CGSizeMake(640.0,480.0)];
returns a 640x480 representation of your NSData (which could be an online image) with no additional code.
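The scaleToSize: method comes from that linked category; a minimal sketch of what such a category can look like (the actual implementation in the post may differ):

@interface UIImage (Scaling)
- (UIImage *)scaleToSize:(CGSize)size;
@end

@implementation UIImage (Scaling)
- (UIImage *)scaleToSize:(CGSize)size
{
    // Draw the receiver into a bitmap context of the target size.
    UIGraphicsBeginImageContext(size);
    [self drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
@end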
Matt Gemmell's MGImageUtilities are very nice, resizing efficiently and with some effort-reducing methods.
In this code, 0.5 is the JPEG compression quality, not a 50% resize; also note that UIImageJPEGRepresentation returns NSData, not a UIImage:
UIImage *original = image;
NSData *compressedData = UIImageJPEGRepresentation(original, 0.5f);
Or use this simple one-liner: NSData *data = UIImageJPEGRepresentation(chosenImage, 0.2f);
Swift implementation of Zorayr's function (with a bit of a change to constrain height or width in actual units rather than by scale):
class func compressForUpload(original: UIImage, withHeightLimit heightLimit: CGFloat, andWidthLimit widthLimit: CGFloat) -> UIImage {
    let originalSize = original.size
    var newSize = originalSize

    // Shrink whichever side is longer down to its limit, preserving
    // aspect ratio. Note: square images pass through unchanged.
    if originalSize.width > widthLimit && originalSize.width > originalSize.height {
        newSize.width = widthLimit
        newSize.height = originalSize.height * (widthLimit / originalSize.width)
    } else if originalSize.height > heightLimit && originalSize.height > originalSize.width {
        newSize.height = heightLimit
        newSize.width = originalSize.width * (heightLimit / originalSize.height)
    }

    // Scale the original image to match the new size.
    UIGraphicsBeginImageContext(newSize)
    original.drawInRect(CGRectMake(0, 0, newSize.width, newSize.height))
    let compressedImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return compressedImage
}
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>
+ (UIImage *)resizeImage:(UIImage *)image toResolution:(int)resolution {
    NSData *imageData = UIImagePNGRepresentation(image);
    CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    CFDictionaryRef options = (__bridge CFDictionaryRef) @{
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES,
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize : @(resolution)
    };
    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(src, 0, options);
    CFRelease(src);
    UIImage *img = [[UIImage alloc] initWithCGImage:thumbnail];
    CGImageRelease(thumbnail); // balance the Create call to avoid a leak
    return img;
}
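Called like this (MyImageUtils is a placeholder for whatever class hosts the method):

// Constrain the longest side of the thumbnail to 640 pixels.
UIImage *thumb = [MyImageUtils resizeImage:photo toResolution:640];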
Swift 2.0 version of Jagandeep Singh's method, but the data needs to be converted back to an image, because UIImageJPEGRepresentation returns NSData?, which is not converted to a UIImage automatically.
let originalImage: UIImage = image
let compressedData = UIImageJPEGRepresentation(originalImage, 0.5)
let compressedImage = UIImage(data: compressedData!)
NSData *data = UIImageJPEGRepresentation(Img.image, 0.2f);
UIImage *image = [UIImage imageNamed:@"image.png"];
NSData *imgData1 = UIImageJPEGRepresentation(image, 1);
NSLog(@"Original --- Size of Image(bytes):%lu", (unsigned long)[imgData1 length]);
NSData *imgData2 = UIImageJPEGRepresentation(image, 0.5);
NSLog(@"After --- Size of Image(bytes):%lu", (unsigned long)[imgData2 length]);
image = [UIImage imageWithData:imgData2];
imgTest.image = image;
Try converting to JPEG with a lower compression quality; here I am using 0.5.
In my case:
Original --- Size of Image(bytes): 85KB & After --- Size of Image(bytes): 23KB
- (UIImage *)resizeImage:(UIImage *)originalImage resizeSize:(CGSize)size
{
    CGFloat actualHeight = originalImage.size.height;
    CGFloat actualWidth = originalImage.size.width;

    // Uncomment to return early when the image already fits:
    // if (actualWidth <= size.width && actualHeight <= size.height)
    // {
    //     return originalImage;
    // }

    // Fit the image inside the target size while preserving aspect ratio.
    float oldRatio = actualWidth / actualHeight;
    float newRatio = size.width / size.height;
    if (oldRatio < newRatio)
    {
        oldRatio = size.height / actualHeight;
        actualWidth = oldRatio * actualWidth;
        actualHeight = size.height;
    }
    else
    {
        oldRatio = size.width / actualWidth;
        actualHeight = oldRatio * actualHeight;
        actualWidth = size.width;
    }

    CGRect rect = CGRectMake(0.0, 0.0, actualWidth, actualHeight);
    UIGraphicsBeginImageContext(rect.size);
    [originalImage drawInRect:rect];
    originalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return originalImage;
}
This is how you call the method:
UIImage *compimage = [appdel resizeImage:imagemain resizeSize:CGSizeMake(40, 40)];
It returns an image that you can display anywhere.
You should be able to make a smaller image by doing something like
UIImage *small = [UIImage imageWithCGImage:original.CGImage scale:0.25 orientation:original.imageOrientation];
and then converting it to a PNG or whatever format you need. Be aware, though, that imageWithCGImage:scale:orientation: only changes the image's logical size; the backing pixels are untouched, so the encoded file won't actually get smaller unless you redraw the image at the reduced size.
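If the goal is a genuinely smaller file, a sketch of the redraw approach (the same pattern as the snippets earlier in this thread):

// Render at a quarter of the original pixel dimensions, then encode.
CGSize quarter = CGSizeMake(original.size.width / 4.0, original.size.height / 4.0);
UIGraphicsBeginImageContext(quarter);
[original drawInRect:CGRectMake(0, 0, quarter.width, quarter.height)];
UIImage *small = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *pngData = UIImagePNGRepresentation(small);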