Black border in bitmap image in iOS

I am using the following code to create a bitmap image.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(targetWidth, targetHeight), NO, 0.0);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, CGRectMake(0, 0, targetWidth, targetHeight));
float red, green, blue, alpha;
for (int Row = 1; Row <= targetHeight; Row++)
{
    if (Row <= originalHeight) {
        for (int Col = 0; Col < targetWidth; Col++)
        {
            if (Col < originalWidth) {
                UIColor *color = [originalImage colorAtPixel:CGPointMake(Col, Row) :originalImage];
                [color getRed:&red green:&green blue:&blue alpha:&alpha];
                if (red == 0.0 && green == 0.0 && blue == 0.0 && alpha == 0.0) {
                    CGContextSetRGBFillColor(context, 0, 0, 0, 0); // set transparent pixels
                }
                else {
                    CGContextSetRGBFillColor(context, red, green, blue, 1);
                }
                CGContextFillRect(context, CGRectMake(Col, Row, 1, 1));
            }
        }
    }
}
finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
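As an aside, calling colorAtPixel: and CGContextFillRect once per pixel redraws the source image for every pixel and is very slow. Here is a hedged, untested sketch of the same idea that draws the image once into a bitmap context and edits the bytes in place; it assumes an RGBA, premultiplied-alpha byte layout, and the variable names are taken from the question:
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, (size_t)targetWidth, (size_t)targetHeight, 8, 0,
                                         cs, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(cs);
// Draw the original once. Note: the Core Graphics origin is bottom-left, so the
// original lands in the bottom-left of the larger canvas; flip the CTM if you
// need it top-left as in the UIKit version.
CGContextDrawImage(ctx, CGRectMake(0, 0, originalWidth, originalHeight), originalImage.CGImage);
unsigned char *buf = CGBitmapContextGetData(ctx);
size_t bytesPerRow = CGBitmapContextGetBytesPerRow(ctx);
for (size_t y = 0; y < (size_t)targetHeight; y++) {
    for (size_t x = 0; x < (size_t)targetWidth; x++) {
        unsigned char *p = buf + y * bytesPerRow + x * 4;
        // Pure-black pixels become fully transparent; RGB are already 0,
        // so the premultiplied representation stays consistent.
        if (p[0] == 0 && p[1] == 0 && p[2] == 0) {
            p[3] = 0;
        }
    }
}
CGImageRef cgFinal = CGBitmapContextCreateImage(ctx);
finalImage = [UIImage imageWithCGImage:cgFinal];
CGImageRelease(cgFinal);
CGContextRelease(ctx);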
// below code is used from another link
- (UIColor *)colorAtPixel:(CGPoint)point :(UIImage *)image {
    // Cancel if point is outside image coordinates
    if (!CGRectContainsPoint(CGRectMake(0.0f, 0.0f, image.size.width, image.size.height), point)) {
        return nil;
    }
    NSInteger pointX = trunc(point.x);
    NSInteger pointY = trunc(point.y);
    CGImageRef cgImage = image.CGImage;
    NSUInteger width = image.size.width;
    NSUInteger height = image.size.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bytesPerPixel = 4;
    int bytesPerRow = bytesPerPixel * 1;
    NSUInteger bitsPerComponent = 8;
    unsigned char pixelData[4] = { 0, 0, 0, 0 };
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 1,
                                                 1,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    // Draw the pixel we are interested in onto the bitmap context
    CGContextTranslateCTM(context, -pointX, pointY - (CGFloat)height);
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), cgImage);
    CGContextRelease(context);
    // Convert color values [0..255] to floats [0.0..1.0]
    CGFloat red   = (CGFloat)pixelData[0] / 255.0f;
    CGFloat green = (CGFloat)pixelData[1] / 255.0f;
    CGFloat blue  = (CGFloat)pixelData[2] / 255.0f;
    CGFloat alpha = (CGFloat)pixelData[3] / 255.0f;
    return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}
I want the image to look transparent. I change the black background of the context to transparent using CGContextSetRGBFillColor(context, 0, 0, 0, 0); which works fine. But it still leaves a black border around the image, which I want to remove.
How can this be achieved? Any pointers?

One way would be to use a soft threshold instead of a hard limit: check whether the luminance of a given pixel is below some small but non-zero amount and, if so, set the alpha of the pixel to an interpolated value. For example:
const double epsilon = 0.1; // or some other small value
double luminance = (red * 0.2126) + (green * 0.7152) + (blue * 0.0722);
if (luminance < epsilon)
{
    CGContextSetRGBFillColor(context, red, green, blue, luminance / epsilon);
}
else
{
    CGContextSetRGBFillColor(context, red, green, blue, 1.0);
}
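This works because the border pixels are typically anti-aliased: they sit between the black background and the image content, so they are dark but not exactly black and slip through an exact-zero test. The soft threshold fades them out in proportion to how dark they are. The epsilon of 0.1 is just a starting point; tune it for your images.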

Related

inconsistencies in colors when drawing

I have a UIImageView and I draw to it using UIColor's orangeColor. Now, I have a function that is supposed to detect the color of the pixel that was tapped on.
R: 1.000000 G: 0.501961 B: 0.000000
That's the RGB value I receive when attempting to detect the pixel color for orangeColor.
It should be:
R: 1.000000 G: 0.5 B: 0.000000
Here's my function
- (UIColor *)colorAtPixel:(CGPoint)point {
    // Cancel if point is outside image coordinates
    if (!CGRectContainsPoint(CGRectMake(0.0f, 0.0f, _overlay_imageView.frame.size.width, _overlay_imageView.frame.size.height), point)) {
        return nil;
    }
    // Create a 1x1 pixel byte array and bitmap context to draw the pixel into.
    // Reference: http://stackoverflow.com/questions/1042830/retrieving-a-pixel-alpha-value-for-a-uiimage
    NSInteger pointX = trunc(point.x);
    NSInteger pointY = trunc(point.y);
    CGImageRef cgImage = _overlay_imageView.image.CGImage;
    NSUInteger width = CGImageGetWidth(cgImage);
    NSUInteger height = CGImageGetHeight(cgImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bytesPerPixel = 4;
    int bytesPerRow = bytesPerPixel * 1;
    NSUInteger bitsPerComponent = 8;
    unsigned char pixelData[4] = { 0, 0, 0, 0 };
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 1,
                                                 1,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    // Draw the pixel we are interested in onto the bitmap context
    CGContextTranslateCTM(context, -pointX, -pointY);
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), cgImage);
    CGContextRelease(context);
    // Convert color values [0..255] to floats [0.0..1.0]
    CGFloat red   = (CGFloat)pixelData[0] / 255.0f;
    CGFloat green = (CGFloat)pixelData[1] / 255.0f;
    CGFloat blue  = (CGFloat)pixelData[2] / 255.0f;
    CGFloat alpha = (CGFloat)pixelData[3] / 255.0f;
    return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}
Any ideas?
I should mention that my UIImageView has a clear background and sits on top of a black canvas. Could that be the issue?
There's nothing wrong with your function. This is a result of floating-point math. Half of 255 (the max value of an unsigned byte) is either 127/255.0 or 128/255.0 depending on how you round. Neither of those is 0.5; they are 0.498039215686275 and 0.501960784313725, respectively.
EDIT: I should add that the colors in the CGImage are stored as bytes, not floats. So when you create your orange with a float in UIColor, it gets quantized to R: 255, G: 128, B: 0, A: 255. When you read this back as a float you get R: 1.0, G: 0.501961, B: 0.0, A: 1.0.
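A minimal sketch of that round trip:
// 0.5 stored into an 8-bit channel and read back, as the bitmap context does.
unsigned char byte = (unsigned char)lround(0.5 * 255.0); // 127.5 rounds to 128
CGFloat back = (CGFloat)byte / 255.0;                    // 0.501961...
NSLog(@"stored byte: %u, read back: %f", byte, back);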

How do I check if CGContext contains point?

Basically, I have an image that gets drawn as the touch moves. When I toggle on a button called "eraser", I want to detect whether the context I have drawn into is NOT black at the touch position. Is there any way to do this?
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        hue = 0.0;
        [self initContext:frame.size];
        framsize = frame.size;
    }
    return self;
}
- (void)clear {
    cacheContext = nil;
    cacheBitmap = nil;
    [self initContext:framsize];
    [self setNeedsDisplay];
}
- (BOOL)initContext:(CGSize)size {
    float scaleFactor = [[UIScreen mainScreen] scale];
    int bitmapByteCount;
    int bitmapBytesPerRow;
    // Declare the number of bytes per row. Each pixel in the bitmap in this
    // example is represented by 4 bytes; 8 bits each of red, green, blue, and
    // alpha.
    bitmapBytesPerRow = (size.width * 4);
    bitmapByteCount = (bitmapBytesPerRow * size.height) * scaleFactor * scaleFactor;
    // Allocate memory for image data. This is the destination in memory
    // where any drawing to the bitmap context will be rendered.
    cacheBitmap = malloc(bitmapByteCount);
    if (cacheBitmap == NULL) {
        return NO;
    }
    CGBitmapInfo bitmapInfo = kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Big;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    cacheContext = CGBitmapContextCreate(cacheBitmap, size.width * scaleFactor, size.height * scaleFactor, 8, bitmapBytesPerRow * scaleFactor, colorSpace, bitmapInfo);
    CGContextScaleCTM(cacheContext, scaleFactor, scaleFactor);
    CGColorSpaceRelease(colorSpace);
    CGContextSetRGBFillColor(cacheContext, 1, 0, 0, 0.0);
    CGContextFillRect(cacheContext, (CGRect){CGPointZero, CGSizeMake(size.height * scaleFactor, size.width * scaleFactor)});
    return YES;
}
//-(float) alphaAtX:(int)x y:(int)y //get alpha component using the pointer 'pixelData'
//{
//    float scaleFactor = [[UIScreen mainScreen] scale];
//    return pixelData[(y * 320*scaleFactor + x) *4 + 3]; //+0 for red, +1 for green, +2 for blue, +3 for alpha
//}
- (UIColor *)colorOfPoint:(CGPoint)point
{
    unsigned char pixel[4] = {0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel,
                                                 1, 1, 8, 4, colorSpace, (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    UIColor *color = [UIColor colorWithRed:pixel[0] / 255.0
                                     green:pixel[1] / 255.0
                                      blue:pixel[2] / 255.0
                                     alpha:pixel[3] / 255.0];
    return color;
}
- (void)drawToCache:(CGPoint)currentpos andLastPos:(CGPoint)pos Color:(UIColor *)colors Thickness:(float)thickness {
    hue += 0.005;
    if (hue > 1.0) hue = 0.0;
    UIColor *color;
    if (!colors) {
        color = [UIColor colorWithHue:hue saturation:0.7 brightness:1.0 alpha:1.0];
    } else {
        color = colors;
    }
    CGContextSetStrokeColorWithColor(cacheContext, [color CGColor]);
    CGContextSetLineCap(cacheContext, kCGLineCapRound);
    CGContextSetLineWidth(cacheContext, 6 + thickness);
    CGPoint lastPoint = currentpos;
    CGPoint newPoint = lastPoint;
    CGContextMoveToPoint(cacheContext, lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(cacheContext, newPoint.x, newPoint.y);
    CGContextStrokePath(cacheContext);
    CGRect dirtyPoint1 = CGRectMake(lastPoint.x - 10, lastPoint.y - 10, 20, 20);
    CGRect dirtyPoint2 = CGRectMake(newPoint.x - 10, newPoint.y - 10, 20, 20);
    [self setNeedsDisplayInRect:CGRectUnion(dirtyPoint1, dirtyPoint2)];
}
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGImageRef cacheImage = CGBitmapContextCreateImage(cacheContext);
    CGContextDrawImage(context, self.bounds, cacheImage);
    CGImageRelease(cacheImage);
}
@end
AFAIK there is no easy way to get the color of a point in a CGContext, but you can use the approach described here: http://www.markj.net/iphone-uiimage-pixel-color/
I have not tested this code, but I hope it works or is easy to fix:
// ctxSize - the size of the context
- (UIColor *)getPixelColorForContext:(CGContextRef)cgctx size:(CGSize)ctxSize atLocation:(CGPoint)point
{
    UIColor *color = nil;
    // The off-screen bitmap context the image was drawn into. Format ARGB is 4 bytes for each pixel: Alpha, Red, Green, Blue
    if (cgctx == NULL) { return nil; /* error */ }
    size_t w = ctxSize.width;
    // Now we can get a pointer to the image data associated with the bitmap
    // context.
    unsigned char *data = CGBitmapContextGetData(cgctx);
    if (data != NULL) {
        // offset locates the pixel in the data from x,y.
        // 4 for 4 bytes of data per pixel, w is the width of one row of data.
        int offset = 4 * ((w * round(point.y)) + round(point.x));
        int alpha = data[offset];
        int red = data[offset + 1];
        int green = data[offset + 2];
        int blue = data[offset + 3];
        NSLog(@"offset: %i colors: RGB A %i %i %i %i", offset, red, green, blue, alpha);
        color = [UIColor colorWithRed:(red / 255.0f) green:(green / 255.0f) blue:(blue / 255.0f) alpha:(alpha / 255.0f)];
    }
    // Do NOT free the buffer here: it belongs to whoever created the bitmap
    // context (cacheBitmap above), and freeing it would crash the app later.
    return color;
}
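A hedged sketch of a call site for the eraser check, assuming the cacheContext and framsize ivars from the question (hasInkAtPoint: is an illustrative name, untested):
- (BOOL)hasInkAtPoint:(CGPoint)point {
    // The cache context is in device pixels, so scale the touch location up.
    float scale = [[UIScreen mainScreen] scale];
    CGPoint p = CGPointMake(point.x * scale, point.y * scale);
    UIColor *c = [self getPixelColorForContext:cacheContext
                                          size:CGSizeMake(framsize.width * scale, framsize.height * scale)
                                    atLocation:p];
    CGFloat r = 0, g = 0, b = 0, a = 0;
    [c getRed:&r green:&g blue:&b alpha:&a];
    // "Not black" here means some ink was drawn and it isn't pure black.
    return (a > 0 && (r > 0 || g > 0 || b > 0));
}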

How to replace colour from UIImage with other color [duplicate]

This question already exists:
replacing specific color in a uiimage
Closed 9 years ago.
I want to replace a color in a UIImage with another color: first I touch the image and detect the color of the touched pixel, and then I want to replace the color of the touched pixel with another color.
I have the following code, in which I am trying to change the touched pixel's color, but it returns a transparent image.
- (UIColor *)colorAtPixel:(CGPoint)point
{
    // Cancel if point is outside image coordinates
    if (!CGRectContainsPoint(CGRectMake(0.0f, 0.0f, self.size.width, self.size.height), point)) {
        return nil;
    }
    // Create a 1x1 pixel byte array and bitmap context to draw the pixel into.
    // Reference: http://stackoverflow.com/questions/1042830/retrieving-a-pixel-alpha-value-for-a-uiimage
    NSInteger pointX = trunc(point.x);
    NSInteger pointY = trunc(point.y);
    CGImageRef cgImage = self.CGImage;
    NSUInteger width = self.size.width;
    NSUInteger height = self.size.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bytesPerPixel = 4;
    int bytesPerRow = bytesPerPixel * 1;
    NSUInteger bitsPerComponent = 8;
    unsigned char pixelData[4] = { 0, 0, 0, 0 };
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 1,
                                                 1,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    // Draw the pixel we are interested in onto the bitmap context
    CGContextTranslateCTM(context, -pointX, pointY - (CGFloat)height);
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), cgImage);
    CGContextRelease(context);
    // Convert color values [0..255] to floats [0.0..1.0]
    red = (CGFloat)pixelData[0] / 255.0f;
    green = (CGFloat)pixelData[1] / 255.0f;
    blue = (CGFloat)pixelData[2] / 255.0f;
    alpha = (CGFloat)pixelData[3] / 255.0f;
    [self changeWhiteColorTransparent:imageview.img];
    return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}
- (void)changeWhiteColorTransparent:(UIImage *)image {
    CGImageRef rawImageRef = image.CGImage;
    // NOTE: the masking array is {min,max} pairs per component, in the 0-255
    // range for 8-bit images; {red, 0, green, 0, blue, 0} (0..1 floats with
    // max < min) is likely why the result comes out wrong.
    const float colorMasking[6] = { red, 0, green, 0, blue, 0 };
    UIGraphicsBeginImageContext(image.size);
    CGImageRef maskedImageRef = CGImageCreateWithMaskingColors(rawImageRef, colorMasking);
    {
        // if on iPhone: flip the context so the image is not drawn upside down
        CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0.0, image.size.height);
        CGContextScaleCTM(UIGraphicsGetCurrentContext(), 1.0, -1.0);
    }
    CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, image.size.width, image.size.height), maskedImageRef);
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    CGImageRelease(maskedImageRef);
    UIGraphicsEndImageContext();
    NSString *imagespath = [self createDirectoryInDocumentsFolderWithName:@"images"];
    NSFileManager *fileM = [NSFileManager defaultManager];
    NSArray *contents = [fileM contentsOfDirectoryAtPath:imagespath error:nil];
    NSString *savedImagePath = [imagespath stringByAppendingPathComponent:[NSString stringWithFormat:@"images%d.png", [contents count] + 1]];
    NSData *imageData = UIImagePNGRepresentation(result);
    [imageData writeToFile:savedImagePath atomically:NO];
}
Here I am trying to make the touched color transparent, but if I want to change it to yellow instead, what do I need to change in the code?
Call this method from touchesMoved:
- (void)getPixelColorAtLocation:(CGPoint)point
{
    unsigned char pixel[4] = {0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, kCGImageAlphaPremultipliedLast);
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];
    // NSLog(@"x- %f y- %f", point.x, point.y);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    NSLog(@"RGB Color code :%d %d %d", pixel[0], pixel[1], pixel[2]);
}
This gives you the color code of the touch point as an RGB combination; set that value, or your replacement color, on your image view from there. Try it; it should help.
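To replace the masked color with yellow rather than leaving it transparent, one option is to fill the context with yellow first and then draw the masked image on top, so the knocked-out pixels show the fill. A hedged sketch, untested, assuming the source image has no alpha channel (CGImageCreateWithMaskingColors returns NULL for images with alpha) and that the red/green/blue ivars hold the touched color as 0..1 floats:
- (UIImage *)replaceColorInImage:(UIImage *)image withColor:(UIColor *)fillColor {
    // Masking ranges are {min,max} pairs per component, in 0..255 for 8-bit images.
    // A small tolerance around the touched color catches anti-aliased neighbors.
    const CGFloat tolerance = 10;
    const CGFloat colorMasking[6] = {
        MAX(red * 255 - tolerance, 0),   MIN(red * 255 + tolerance, 255),
        MAX(green * 255 - tolerance, 0), MIN(green * 255 + tolerance, 255),
        MAX(blue * 255 - tolerance, 0),  MIN(blue * 255 + tolerance, 255)
    };
    CGImageRef masked = CGImageCreateWithMaskingColors(image.CGImage, colorMasking);
    if (masked == NULL) { return nil; } // e.g. the source image had an alpha channel
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Fill with the replacement color; the masked-out pixels will reveal it.
    [fillColor setFill];
    UIRectFill(CGRectMake(0, 0, image.size.width, image.size.height));
    // Flip the context so the CGImage is not drawn upside down.
    CGContextTranslateCTM(ctx, 0, image.size.height);
    CGContextScaleCTM(ctx, 1.0, -1.0);
    CGContextDrawImage(ctx, CGRectMake(0, 0, image.size.width, image.size.height), masked);
    CGImageRelease(masked);
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
Usage (illustrative): UIImage *out = [self replaceColorInImage:imageview.img withColor:[UIColor yellowColor]];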

iOS find subimage in a larger image

What is the best way to find the coordinates of subimages in a larger image? The subimage is very simple and always the same. For example, how do I find the coordinates of all the black squares in an example image (not included here)?
I think the best way here might be to write a function/UIImage category to check the color at a pixel in the image. Then (if you know for a fact the images are squares) you can check the color of each pixel, moving down diagonally, until one is a different color; then you have the location and size of your square. A sketch of such a scan follows the category code below.
One working implementation I found for checking the color of a pixel is in the open source component OBShapedButton.
It is a UIImage category.
Code:
- (UIColor *)colorAtPixel:(CGPoint)point {
    // Cancel if point is outside image coordinates
    if (!CGRectContainsPoint(CGRectMake(0.0f, 0.0f, self.size.width, self.size.height), point)) {
        return nil;
    }
    // Create a 1x1 pixel byte array and bitmap context to draw the pixel into.
    // Reference: http://stackoverflow.com/questions/1042830/retrieving-a-pixel-alpha-value-for-a-uiimage
    NSInteger pointX = trunc(point.x);
    NSInteger pointY = trunc(point.y);
    CGImageRef cgImage = self.CGImage;
    NSUInteger width = self.size.width;
    NSUInteger height = self.size.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bytesPerPixel = 4;
    int bytesPerRow = bytesPerPixel * 1;
    NSUInteger bitsPerComponent = 8;
    unsigned char pixelData[4] = { 0, 0, 0, 0 };
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 1,
                                                 1,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    // Draw the pixel we are interested in onto the bitmap context
    CGContextTranslateCTM(context, -pointX, pointY - (CGFloat)height);
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), cgImage);
    CGContextRelease(context);
    // Convert color values [0..255] to floats [0.0..1.0]
    CGFloat red   = (CGFloat)pixelData[0] / 255.0f;
    CGFloat green = (CGFloat)pixelData[1] / 255.0f;
    CGFloat blue  = (CGFloat)pixelData[2] / 255.0f;
    CGFloat alpha = (CGFloat)pixelData[3] / 255.0f;
    return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}
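And here is a hedged sketch of the diagonal scan described above, untested; findBlackSquaresInImage: and isBlackPixel:inImage: are illustrative names, and it assumes solid, non-overlapping black squares on a lighter background plus the colorAtPixel: UIImage category shown:
- (BOOL)isBlackPixel:(CGPoint)p inImage:(UIImage *)image {
    CGFloat r = 0, g = 0, b = 0, a = 0;
    [[image colorAtPixel:p] getRed:&r green:&g blue:&b alpha:&a];
    return (a > 0.9 && r < 0.1 && g < 0.1 && b < 0.1); // near-black, tolerance assumed
}
- (NSArray *)findBlackSquaresInImage:(UIImage *)image {
    NSMutableArray *found = [NSMutableArray array];
    int width = (int)image.size.width;
    int height = (int)image.size.height;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            if (![self isBlackPixel:CGPointMake(x, y) inImage:image]) continue;
            // Only react at a square's top-left corner: the pixels above and
            // to the left must not be black.
            if (x > 0 && [self isBlackPixel:CGPointMake(x - 1, y) inImage:image]) continue;
            if (y > 0 && [self isBlackPixel:CGPointMake(x, y - 1) inImage:image]) continue;
            // Walk the diagonal while it stays black to measure the side length.
            int side = 1;
            while (x + side < width && y + side < height &&
                   [self isBlackPixel:CGPointMake(x + side, y + side) inImage:image]) {
                side++;
            }
            [found addObject:[NSValue valueWithCGRect:CGRectMake(x, y, side, side)]];
        }
    }
    return found;
}
In practice you would read the pixel buffer once instead of calling colorAtPixel: per pixel, since each call redraws the whole image; this version is only meant to show the scanning logic.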
