Painting a gradient inside an image by tapping in Objective-C (iOS)

I have an animal image with a white background, where the animal's shape is drawn as a black outline. The image is fixed on an image view in my .xib.
Now I would like to paint on the image, but only within a particular closed region.
For example, if the user touches the hand, only the hand should be filled with the gradient; the rest of the image should remain unchanged.
- (UIImage *)imageFromRawData:(unsigned char *)rawData
{
    NSUInteger bitsPerComponent = 8;
    NSUInteger bytesPerPixel = 4;
    NSUInteger width = (NSUInteger)self.imageDoodle.image.size.width;
    NSUInteger height = (NSUInteger)self.imageDoodle.image.size.height;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    CGImageRef imageRef = [self.imageDoodle.image CGImage];
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(imageRef);
    // Use the same bitmap info as -rawDataFromImage: below, so the channel
    // order matches when the buffer is round-tripped.
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    imageRef = CGBitmapContextCreateImage(context);
    UIImage *rawImage = [UIImage imageWithCGImage:imageRef];
    CGContextRelease(context);
    CGImageRelease(imageRef);
    return rawImage;
}
- (unsigned char *)rawDataFromImage:(UIImage *)image
{
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    NSLog(@"w=%lu, h=%lu", (unsigned long)width, (unsigned long)height);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = malloc(height * width * 4); // caller must free()
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);
    return rawData;
}
Where would I need to change my code to support this?
I suspect this is possible with UIBezierPath, but I don't know how to apply it in this case.
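Since the post never shows the fill step itself, here is a minimal flood-fill sketch of one way it could work (my addition, not the poster's code; isFillablePixel and FillRegionWithGradient are hypothetical names). It assumes the RGBA buffer produced by -rawDataFromImage: above, with the seed point already converted to image-pixel coordinates, and fills the closed near-white region under the touch with a vertical two-color gradient:

#import <UIKit/UIKit.h>

// Treat near-white pixels as fillable; the black outline stops the fill.
static BOOL isFillablePixel(const unsigned char *data, NSUInteger idx) {
    return data[idx] > 200 && data[idx + 1] > 200 && data[idx + 2] > 200;
}

// Flood-fill the closed region containing `seed` with a vertical gradient
// from `top` to `bottom`. `data` is an RGBA8888 buffer of width * height px.
static void FillRegionWithGradient(unsigned char *data,
                                   NSUInteger width, NSUInteger height,
                                   CGPoint seed, UIColor *top, UIColor *bottom)
{
    NSUInteger x0 = (NSUInteger)seed.x, y0 = (NSUInteger)seed.y;
    if (x0 >= width || y0 >= height) return;

    CGFloat tr, tg, tb, ta, br, bg, bb, ba;
    [top getRed:&tr green:&tg blue:&tb alpha:&ta];       // colors must be
    [bottom getRed:&br green:&bg blue:&bb alpha:&ba];    // RGB-convertible

    NSMutableArray *stack = [NSMutableArray arrayWithObject:@(y0 * width + x0)];
    NSMutableSet *visited = [NSMutableSet set];

    while (stack.count) {
        NSNumber *boxed = stack.lastObject;
        [stack removeLastObject];
        if ([visited containsObject:boxed]) continue;
        [visited addObject:boxed];

        NSUInteger p = boxed.unsignedIntegerValue;
        NSUInteger x = p % width, y = p / width, idx = 4 * p;
        if (!isFillablePixel(data, idx)) continue;   // hit the outline

        // Interpolate the gradient by row and write the pixel fully opaque.
        CGFloat t = (CGFloat)y / (CGFloat)height;
        data[idx]     = (unsigned char)(255.0 * (tr + (br - tr) * t));
        data[idx + 1] = (unsigned char)(255.0 * (tg + (bg - tg) * t));
        data[idx + 2] = (unsigned char)(255.0 * (tb + (bb - tb) * t));
        data[idx + 3] = 255;

        // Push the 4-connected neighbours.
        if (x + 1 < width)  [stack addObject:@(p + 1)];
        if (x > 0)          [stack addObject:@(p - 1)];
        if (y + 1 < height) [stack addObject:@(p + width)];
        if (y > 0)          [stack addObject:@(p - width)];
    }
}

After the fill, rebuild the UIImage with -imageFromRawData: above and assign it back to the image view. (Boxing every pixel in an NSNumber is slow; a production version would use a plain C array as the work queue.) A UIBezierPath approach would need vector outlines for each region, so with a flat bitmap a flood fill bounded by the black outline is the more direct route.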

Related

How to add an image view on the transparent area of an image

I am developing a color photo frame app, but I am stuck on one part. I have lots of frames in my app, so I want to find the transparent area of a frame and place a UIImageView programmatically over that part. I have tried a number of approaches that read the image pixel by pixel, and more besides, but nothing works.
[Frame image]
Here is the code I use to find the transparent area:
CGImageRef imageRef = [image CGImage];
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
// Note: image.size is in points; for Retina images, CGImageGetWidth/Height
// give the actual pixel dimensions.
NSUInteger width = (NSUInteger)image.size.width;
NSUInteger height = (NSUInteger)image.size.height;
unsigned char *rawData = malloc(width * height * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);

// Copy only the fully transparent pixels into a second buffer, zeroing
// everything else. (index advances by 4, so the original's index % 4 == 0
// check was redundant and has been dropped.)
unsigned char *rawData2 = malloc(width * height * 4);
BOOL isBlank = YES; // stays YES only if no transparent pixel is found
for (NSUInteger index = 0; index < width * height * 4; index += 4)
{
    if (rawData[index + 3] == 0)   // alpha == 0 -> transparent pixel
    {
        rawData2[index]     = rawData[index];
        rawData2[index + 1] = rawData[index + 1];
        rawData2[index + 2] = rawData[index + 2];
        rawData2[index + 3] = rawData[index + 3];
        isBlank = NO;
    }
    else
    {
        rawData2[index]     = 0;
        rawData2[index + 1] = 0;
        rawData2[index + 2] = 0;
        rawData2[index + 3] = 0;
    }
}
How can I find the frame (CGRect) of the transparent area in the image view?
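One way to answer the CGRect part (my sketch, not from the post): while scanning the buffer above, track the minimum and maximum x/y of the transparent pixels and build the rect from them:

// Sketch: bounding box (in pixel coordinates) of all fully transparent pixels.
NSUInteger minX = width, minY = height, maxX = 0, maxY = 0;
BOOL found = NO;
for (NSUInteger y = 0; y < height; y++) {
    for (NSUInteger x = 0; x < width; x++) {
        NSUInteger idx = 4 * (y * width + x);
        if (rawData[idx + 3] == 0) {            // alpha == 0 -> transparent
            found = YES;
            if (x < minX) minX = x;
            if (y < minY) minY = y;
            if (x > maxX) maxX = x;
            if (y > maxY) maxY = y;
        }
    }
}
CGRect transparentRect = found
    ? CGRectMake(minX, minY, maxX - minX + 1, maxY - minY + 1)
    : CGRectZero;

The rect is in image pixels; convert it into the image view's coordinate space (accounting for contentMode and any scaling) before placing the UIImageView.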

Why is UIImage from imageWithCGImage breaking in arm64?

Below is some code that converts a UIImage to a CGImage, makes some changes to the CGImage, then converts it back to a UIImage.
This works if my architectures include armv6 and armv7 only. If I add arm64, the UIImage returned at the end is null.
There are some hard-coded numbers in the code, which makes me think they are the problem, but I am not sure how to determine these values programmatically.
Here's the code, minus some details in the middle:
CGImageRef imageRef = [anImage CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
// Raw data malloc'd
unsigned char *rawData = malloc(height * width * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                             bitsPerComponent, bytesPerRow, colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
...
// Change the alpha or color of pixels based on certain criteria
...
CGImageRef ref = CGBitmapContextCreateImage(context);
free(rawData); // note: the context (and possibly 'ref') may still reference this buffer
CGContextRelease(context);
image = [UIImage imageWithCGImage:ref];
CFRelease(ref);
Any thoughts on what's happening here?
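One thing that stands out (my observation, not a confirmed diagnosis): rawData is freed while the context still points at it, and CGBitmapContextCreateImage is documented to use copy-on-write, so the returned image may still reference the freed buffer; a different architecture can shuffle the heap enough to expose that. A hedged alternative is to let Core Graphics own the buffer by passing NULL:

// Sketch: let CGBitmapContextCreate allocate and manage the pixel buffer.
CGContextRef context = CGBitmapContextCreate(NULL, width, height,
                                             bitsPerComponent, bytesPerRow, colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
unsigned char *pixels = (unsigned char *)CGBitmapContextGetData(context);
// ... inspect/modify pixels based on your criteria ...
CGImageRef ref = CGBitmapContextCreateImage(context);
CGContextRelease(context);   // also frees the context-owned buffer
UIImage *result = [UIImage imageWithCGImage:ref];
CGImageRelease(ref);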

Freeing raw image data after creating a UIImage from it corrupts the image

I am taking a UIImage and breaking it down to raw pixel data, like so:
CGImageRef imageRef = self.image.CGImage;
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
_rawData = (UInt8 *)calloc(height * width * 4, sizeof(UInt8));
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(_rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);
I then edit a couple of pixels in the _rawData array with different colors, and re-create the UIImage from the edited pixel data like so (here I am just changing the second pixel of the image to red):
size_t width = CGImageGetWidth(_image.CGImage);
NSUInteger pixel = 1; // second pixel
NSUInteger position = pixel*4;
NSUInteger redIndex = position;
NSUInteger greenIndex = position+1;
NSUInteger blueIndex = position+2;
NSUInteger alphaIndex = position+3;
_rawData[redIndex] = 255;
_rawData[greenIndex] = 0;
_rawData[blueIndex] = 0;
_rawData[alphaIndex] = 255;
size_t height = CGImageGetHeight(_image.CGImage);
size_t bitsPerComponent = 8;
size_t bitsPerPixel = 32;
size_t bytesPerRow = 4*width;
size_t length = height*width*4;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big;
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, _rawData, length, NULL);
CGImageRef newImageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpace, bitmapInfo, provider, NULL, NO, kCGRenderingIntentDefault);
UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);
CGImageRelease(newImageRef);
My problem begins here: I now have a new UIImage with the second pixel changed to red, but I also have a memory leak, because the calloc'd _rawData still needs to be freed. Whenever I call
free(_rawData);
even after I've already created my "newImage", the image is corrupted when I show it on screen. I thought CGImageCreate() would create a new object in memory, so that I could free the old memory. Is that not true?
What in the world am I doing wrong?
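The corruption has a known cause: CGDataProviderCreateWithData does not copy the buffer, and the CGImage may read from it lazily (for example, when it is first drawn), so freeing _rawData yourself pulls the memory out from under the image. One fix is to hand ownership to the provider through a release callback; a sketch follows (releasePixelData is my name, not from the post):

// Called by Core Graphics when the provider no longer needs the bytes.
static void releasePixelData(void *info, const void *data, size_t size) {
    free((void *)data);
}

CGDataProviderRef provider =
    CGDataProviderCreateWithData(NULL, _rawData, length, releasePixelData);
// ... CGImageCreate(...) exactly as above ...
// Do NOT call free(_rawData) yourself; the callback fires once the provider
// and every image backed by it have been released.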

How to change the alpha of a ground overlay in the Google Maps iOS SDK?

I added a ground overlay to a map view, and I found this approach to changing the alpha of groundoverlay.icon:
How to set the opacity/alpha of a UIImage?
But it seems to have no effect in the app; I still cannot see the map or other ground overlays behind the image.
Is there a solution for this?
+ (UIImage *)setImage:(UIImage *)image withAlpha:(CGFloat)alpha
{
    // Create a pixel buffer in an easy-to-use format.
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    UInt8 *m_PixelBuf = malloc(sizeof(UInt8) * height * width * 4);
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(m_PixelBuf, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Alter the alpha. The buffer is *premultiplied* (AlphaPremultipliedLast),
    // so the color components must be scaled along with the alpha byte;
    // overwriting only the alpha (as the original loop did) leaves the
    // premultiplied RGB too bright and makes transparent pixels visible.
    NSUInteger length = height * width * 4;
    for (NSUInteger i = 0; i < length; i += 4)
    {
        m_PixelBuf[i]     = (UInt8)(m_PixelBuf[i]     * alpha);
        m_PixelBuf[i + 1] = (UInt8)(m_PixelBuf[i + 1] * alpha);
        m_PixelBuf[i + 2] = (UInt8)(m_PixelBuf[i + 2] * alpha);
        m_PixelBuf[i + 3] = (UInt8)(m_PixelBuf[i + 3] * alpha);
    }

    // Create a new image from the modified buffer.
    CGContextRef ctx = CGBitmapContextCreate(m_PixelBuf, width, height,
                                             bitsPerComponent, bytesPerRow, colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGImageRef newImgRef = CGBitmapContextCreateImage(ctx);
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(ctx);
    free(m_PixelBuf);
    UIImage *finalImage = [UIImage imageWithCGImage:newImgRef];
    CGImageRelease(newImgRef);
    return finalImage;
}
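Separately, depending on your SDK version, GMSGroundOverlay may expose an opacity property directly (check GMSGroundOverlay.h in your copy of the SDK); if it does, that avoids re-rendering the icon entirely:

overlay.opacity = 0.5f; // hypothetical usage; verify the property exists in your SDK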

iOS -- detect the color of a pixel?

For example, suppose I want to detect the color of the pixel with screen coordinates (100, 200). Is there a way to do this?
EDIT -- I'm not worried about retina display issues for now.
This may not be the most direct route, but you could:
Use UIGraphicsBeginImageContextWithOptions to grab the screen (see the Apple Q&A QA1703 - "Screen Capture in UIKit Applications").
Then use CGImageCreateWithImageInRect to grab the portion of the resultant image you require.
Finally analyse the resultant image. It gets complicated at this point, but thankfully there's an existing question that should show you the way: How to get the RGB values for a pixel on an image on the iphone
Alternatively, there's the following blog article that has accompanying code: What Color is My Pixel? Image based color picker on iPhone
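A sketch of those steps put together (my code, ignoring Retina scale as the question allows):

// Grab the screen (QA1703-style), then crop out the pixel of interest.
UIWindow *window = [UIApplication sharedApplication].keyWindow;
UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, 1.0);
[window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Crop to just the pixel at (100, 200):
CGImageRef pixelImage =
    CGImageCreateWithImageInRect(screenshot.CGImage, CGRectMake(100, 200, 1, 1));
// ... read the 1x1 image's RGBA bytes as shown in the answers below ...
CGImageRelease(pixelImage);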
Here is how to do it:
CGContextRef ctx;
CGImageRef imageRef = [image CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
unsigned char *rawData = malloc(height * width * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                             bitsPerComponent, bytesPerRow, colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);

// GET PIXEL FROM POINT
int index = 4 * ((width * round(yCoor)) + round(xCoor));
int R = rawData[index];
int G = rawData[index + 1];
int B = rawData[index + 2];
NSLog(@"%d %d %d", R, G, B);

// IF YOU WANT TO ALTER THE PIXELS
int byteIndex = 0;
for (int ii = 0; ii < width * height; ++ii)
{
    rawData[byteIndex]     = (char)newPixelValue;
    rawData[byteIndex + 1] = (char)newPixelValue;
    rawData[byteIndex + 2] = (char)newPixelValue;
    byteIndex += 4;
}
// Rebuild the image using the same row stride the buffer was written with;
// CGImageGetBytesPerRow(imageRef) (used in the original) may differ from
// 4 * width and would skew the output.
ctx = CGBitmapContextCreate(rawData,
                            width,
                            height,
                            bitsPerComponent,
                            bytesPerRow,
                            CGImageGetColorSpace(imageRef),
                            kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
imageRef = CGBitmapContextCreateImage(ctx);
UIImage *rawImage = [UIImage imageWithCGImage:imageRef];
CGContextRelease(ctx);
CGImageRelease(imageRef); // release the image created above (UIImage retains it)
image = rawImage;
free(rawData);
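One caveat on the index math above (my note, not the poster's): it assumes xCoor/yCoor are already in image-pixel coordinates and lie inside the image. A defensive lookup might be:

// Hypothetical bounds-checked pixel lookup.
NSUInteger px = (NSUInteger)llround(xCoor);
NSUInteger py = (NSUInteger)llround(yCoor);
if (px < width && py < height) {
    NSUInteger idx = bytesPerRow * py + bytesPerPixel * px;
    NSLog(@"R=%d G=%d B=%d", rawData[idx], rawData[idx + 1], rawData[idx + 2]);
}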
Try this one, where self.m_imgvwSource is the UIView/UIImageView you need:
- (UIColor *)GetCurrentPixelColorAtPoint:(CGPoint)point
{
    unsigned char pixel[4] = {0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Render just the 1x1 area under `point` into a tiny bitmap context.
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedLast);
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.m_imgvwSource.layer renderInContext:context];
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    NSLog(@"pixel: %d %d %d %d", pixel[0], pixel[1], pixel[2], pixel[3]);
    UIColor *color = [UIColor colorWithRed:pixel[0] / 255.0
                                     green:pixel[1] / 255.0
                                      blue:pixel[2] / 255.0
                                     alpha:pixel[3] / 255.0];
    return color;
}
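A possible call site (my example), converting the touch into the view's coordinate space:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.m_imgvwSource];
    UIColor *color = [self GetCurrentPixelColorAtPoint:point];
    NSLog(@"tapped color: %@", color);
}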
