Fill CGBitmapContext with color - iOS

I'm trying to fill a CGBitmapContext with a solid red color and then get a CGImageRef from it, but the resulting image is completely transparent.
Here's the code:
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL,
1,
1,
8,
0,
colorSpace,
kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGFloat components[] = {1.0,0.0,0.0,1.0};
CGContextSetFillColor(context, components);
CGContextFillRect(context, CGRectMake(0, 0, 1, 1));
CGImageRef imgRef = CGBitmapContextCreateImage(context);
Please don't recommend UIBezierPath and other UIKit things. I need to use Core Graphics.

The problem is that you forgot to set a fill color space. You must do that in order for CGContextSetFillColor to work correctly (because that tells it how many components to expect and how to interpret them). This works:
CGFloat components[] = {1,0,0,1};
CGContextSetFillColorSpace(context, colorSpace); // <-- this was missing
CGContextSetFillColor(context, components);
// and now fill ...

The problem was solved using CGContextSetFillColorWithColor(context, [UIColor redColor].CGColor);. By the way, it would be interesting to know what I was doing wrong. Also, Apple's documentation for CGContextSetFillColor says: "Note that the preferred API to use is now CGContextSetFillColorWithColor."

CoreGraphics Image render from OpenGL has black background

I am unable to render an image from an OpenGL context with a transparent background in Core Graphics; the rendered image comes out with a black background instead.
This is the draw code:
GLint default_frame_buffer = 0;
glGetIntegerv(GL_FRAMEBUFFER_BINDING, &default_frame_buffer);
if (default_frame_buffer == 0) {
target = createBMGLRenderTarget(width, height);
setFiltering(target, BMGL_BilinearFiltering);
glBindFramebuffer(GL_FRAMEBUFFER, target->framebuffer);
}
glViewport(0, 0, width, height);
if (background) {
CGFloat red, green, blue, alpha;
[background getRed:&red green:&green blue:&blue alpha:&alpha];
glClearColor(red, green, blue, alpha);
} else {
glClearColor(0.f, 0.f, 0.f, 0.f);
}
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawer.draw();
glFlush();
glFinish();
GLubyte *data = (GLubyte *)malloc(width * height * 4);
glPixelStorei(GL_PACK_ALIGNMENT, 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);
CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, width * height * 4, NULL);
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
CGImageRef imgRef = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast, ref, NULL, NO, kCGRenderingIntentDefault);
UIGraphicsBeginImageContextWithOptions(size, YES, 0.0);
CGContextRef cgContext = UIGraphicsGetCurrentContext();
CGContextSetBlendMode(cgContext, kCGBlendModeCopy);
CGContextDrawImage(cgContext, CGRectMake(0, 0, size.width, size.height), imgRef);
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
free(data);
CGDataProviderRelease(ref);
CGColorSpaceRelease(colorspace);
CGImageRelease(imgRef);
I have tried specifically setting opaque to false, as well as different blend modes, but it still adds a black background to the originally clear image. I am able to give the GLKView a transparent background, but rendering the image and then drawing its contents into a CGImage doesn't work.
Does anyone know why this is?
I believe your problem is in how you create the UIImage from raw RGBA data. To confirm this, check the data at a pixel you know to be transparent: data[pixelIndex*4 + 3] should be zero where you expect transparency. If it is not, the issue is on the OpenGL side.
In any case, the most probable reason your image is not transparent is that you are declaring the data premultiplied with kCGImageAlphaPremultipliedLast, while glReadPixels returns straight (non-premultiplied) alpha. Try kCGBitmapByteOrder32Big | kCGImageAlphaLast instead.

How to get CGImageRef from CGContextRef?

I am trying to draw two images into a context, but it does not work and throws a
CGBitmapContextCreateImage: invalid context 0x0
error. I use the following code:
//some image
CGImageRef image = ...
//some image as well, but masked. Works perfectly.
CGImageRef blurredAndMasked = CGImageCreateWithMask(blurred, mask);
//Both images are fine. Sure.
//Initializing the context and color space
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(nil, frameSize.width, frameSize.height, 8, 0, colorSpace, kCGBitmapAlphaInfoMask);
//Drawing images into the context
CGContextDrawImage(ctx, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
CGContextDrawImage(ctx, CGRectMake(0, 0, frameSize.width, frameSize.height), blurredAndMasked);
//getting the resulting image
CGImageRef ret = CGBitmapContextCreateImage(ctx);
//releasing the stuff
CGImageRelease(image);
CGImageRelease(blurred);
CGImageRelease(blurredAndMasked);
CGColorSpaceRelease(colorSpace);
CGContextRelease(ctx);
This seems fine, but the resulting images are all black or look very similar to this:
What should be changed in the code? Thanks in advance!
kCGBitmapAlphaInfoMask is not a valid value for the last (bitmapInfo) argument to CGBitmapContextCreate. It is a mask (hence the name) you can use with the & operator to get just the CGImageAlphaInfo out of a CGBitmapInfo value. You would never pass kCGBitmapAlphaInfoMask where a CGBitmapInfo or CGImageAlphaInfo is expected.
Assuming you don't need a specific byte order, I believe this is the highest performance pixel format with alpha channel on iOS:
CGContextRef ctx = CGBitmapContextCreate(nil,
frameSize.width, frameSize.height, 8, 0, colorSpace,
kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
And this should be the highest performance without alpha channel:
CGContextRef ctx = CGBitmapContextCreate(nil,
frameSize.width, frameSize.height, 8, 0, colorSpace,
kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);

CGColorGetComponents always returns 0 CGColor (black)

I want to get the color of the background of a UILabel. I'm using this method:
- (UIColor *)colorOfPoint:(CGPoint)point{
unsigned char pixel[4] = {0};
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
CGContextTranslateCTM(context, -point.x, -point.y);
[self.superview.layer renderInContext:context];
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
UIColor *color = [UIColor colorWithRed:pixel[0]/255.0
green:pixel[1]/255.0
blue:pixel[2]/255.0
alpha:pixel[3]/255.0];
return color;
}
However, when I call const CGFloat *componentColors = CGColorGetComponents([self colorOfPoint:self.amountLabel.frame.origin].CGColor); in a subview of a UITableViewCell, it always returns 0 (i.e. black) for the background. How is that possible?
Where are you running this code? Inside a subclass of UILabel? I can't see why you call
[self.superview.layer renderInContext:]
I suggest you change this to
[myLabel.layer renderInContext:]
where myLabel is a pointer to the label you want the colour from.
Incidentally, have you tried myLabel.backgroundColor? (The backgroundColor property is read/write, i.e. there is a getter as well as a setter.) The code you have for getting a colour from a specific point is fine, but it seems like overkill for a label; it is better suited to an image. Unless you've filled the label with a gradient or a pattern image, this is a pretty expensive way to get the background colour.

How to take screenshot programmatically along with current date printed on it?

I have taken a screenshot programmatically for a scroll view in Xcode. The problem I am facing is that I need the screenshot to include the current date. Please help me.
Before taking the screenshot, add a label with the current date, and remove it after the screenshot is taken.
Or you could use the EXIF metadata to record the date (it's probably already in there). I guess a date embedded in the image itself is harder to fake, though.
This is how you can add text to an image. Try it with the date as a string: once you've taken the screenshot, pass it to the method below and use the output image. Modify the method for your needs.
//Add text to UIImage
-(UIImage *)addText:(UIImage *)img text:(NSString *)text1{
int w = img.size.width;
int h = img.size.height;
//lon = h - lon;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, w, h, 8, 4 * w, colorSpace, kCGImageAlphaPremultipliedFirst);
CGContextDrawImage(context, CGRectMake(0, 0, w, h), img.CGImage);
CGContextSetRGBFillColor(context, 0.0, 0.0, 1.0, 1);
char* text = (char *)[text1 cStringUsingEncoding:NSASCIIStringEncoding];// "05/05/09";
CGContextSelectFont(context, "Arial", 18, kCGEncodingMacRoman);
CGContextSetTextDrawingMode(context, kCGTextFill);
CGContextSetRGBFillColor(context, 1.0, 1.0, 1.0, 1.0); // components are in the 0.0-1.0 range
//rotate text
CGContextSetTextMatrix(context, CGAffineTransformMakeRotation( -M_PI/4 ));
CGContextShowTextAtPoint(context, 4, 52, text, strlen(text));
CGImageRef imageMasked = CGBitmapContextCreateImage(context);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
UIImage *result = [UIImage imageWithCGImage:imageMasked];
CGImageRelease(imageMasked); // avoid leaking the CGImageRef
return result;
}
Source

Cannot paint image created using CGBitmapContextCreate

Under iOS I am trying to create an image by drawing into a context created via CGBitmapContextCreate. The following is the code that I am using (which is similar to some examples I found), but when I call it, nothing is painted. Note that 'context' in this example is obtained via a call to UIGraphicsGetCurrentContext, and I can successfully paint other things into this context. Is there something that I am missing?
Thanks
CGContextRef bmContext;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
bmContext = CGBitmapContextCreate(nil, 200, 200, 8, 0, colorSpace, kCGImageAlphaPremultipliedLast);
CGFloat Color[4] = { 1.0, 0.0, 0.0, 1.0 };
CGRect Rect = { { 0, 0 }, { 200, 200 } };
CGContextSetFillColor(bmContext, Color);
CGContextFillRect (bmContext, Rect);
CGImageRef image = CGBitmapContextCreateImage(bmContext);
CGContextDrawImage(context, CGRectMake(100, 100, 200, 200), image);
CGImageRelease(image);
CGContextRelease(bmContext);
CFRelease(colorSpace);
After much experimentation I have found the problem. It appears that I need to call:
CGContextSetFillColorSpace(bmContext, colorSpace);
Otherwise the fill apparently behaves as if the color space were grayscale and expects only two values when I set the fill color: an intensity and an alpha. In my example above the alpha was therefore always zero, resulting in nothing being drawn. Setting the fill color space to RGB fixes the problem.
Thanks to all who took the time to look at my problem.
