I'm exporting my OpenGL drawings to a UIImage using the following method:
-(UIImage*)saveOpenGLDrawnToUIImage:(NSInteger)aWidth height:(NSInteger)aHeight {
    NSInteger myDataLength = aWidth * aHeight * 4;
    GLubyte *buffer = (GLubyte *)malloc(myDataLength);
    glReadPixels(0, 0, aWidth, aHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    GLubyte *buffer2 = (GLubyte *)malloc(myDataLength);
    for (int y = 0; y < aHeight; y++)
    {
        for (int x = 0; x < aWidth * 4; x++)
        {
            buffer2[(aHeight - 1 - y) * aWidth * 4 + x] = buffer[y * 4 * aWidth + x];
        }
    }

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * aWidth;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
    CGImageRef imageRef = CGImageCreate(aWidth, aHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];
    return myImage;
}
But the background is always black instead of transparent.
I tried using CGImageRef imageRef = CGImageCreateWithPNGDataProvider(provider, NULL, NO, kCGRenderingIntentDefault); instead, but it always returns nil.
How can I get this to process with a transparent background?
This is a question about how to create an image from raw RGBA data. I think you are missing a descriptor in the bitmap info to say that you want an alpha channel.
Try Creating UIImage from raw RGBA data.
The first comment there explains that you should use kCGBitmapByteOrder32Big | kCGImageAlphaLast.
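One wrinkle worth knowing: glReadPixels returns straight (non-premultiplied) alpha, while Core Graphics generally prefers premultiplied formats such as kCGImageAlphaPremultipliedLast. If you go the premultiplied route, the buffer has to be premultiplied first. A minimal plain-C sketch of that step (the function name premultiply_rgba is mine, and tightly packed RGBA is assumed):

```c
#include <stdint.h>
#include <stddef.h>

/* Premultiply an RGBA byte buffer in place: c' = c * a / 255 for each
   color channel. Assumes tightly packed RGBA, as produced by
   glReadPixels(..., GL_RGBA, GL_UNSIGNED_BYTE, ...). */
void premultiply_rgba(uint8_t *pixels, size_t pixelCount) {
    for (size_t i = 0; i < pixelCount; i++) {
        uint8_t *px = pixels + i * 4;
        uint8_t a = px[3];
        px[0] = (uint8_t)((px[0] * a + 127) / 255);
        px[1] = (uint8_t)((px[1] * a + 127) / 255);
        px[2] = (uint8_t)((px[2] * a + 127) / 255);
    }
}
```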
I already got my problem solved by using different code; I just want to know what is wrong with the code I posted.
I wanted to change the colour of every pixel in a UIImage using its bitmap data. My code is as follows:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIImage *image = self.imageViewMain.image;
    CGImageRef imageRef = image.CGImage;
    NSData *data = (NSData *)CFBridgingRelease(CGDataProviderCopyData(CGImageGetDataProvider(imageRef)));
    char *pixels = (char *)[data bytes];

    // this is where we manipulate the individual pixels
    for (int i = 1; i < [data length]; i += 3)
    {
        int r = i;
        int g = i + 1;
        int b = i + 2;
        int a = i + 3;

        pixels[r] = 0;          // eg. remove red
        pixels[g] = pixels[g];
        pixels[b] = pixels[b];
        pixels[a] = pixels[a];
    }

    // create a new image from the modified pixel data
    size_t width = CGImageGetWidth(imageRef);
    size_t height = CGImageGetHeight(imageRef);
    size_t bitsPerComponent = CGImageGetBitsPerComponent(imageRef);
    size_t bitsPerPixel = CGImageGetBitsPerPixel(imageRef);
    size_t bytesPerRow = CGImageGetBytesPerRow(imageRef);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, pixels, [data length], NULL);
    CGImageRef newImageRef = CGImageCreate(width,
                                           height,
                                           bitsPerComponent,
                                           bitsPerPixel,
                                           bytesPerRow,
                                           colorspace,
                                           bitmapInfo,
                                           provider,
                                           NULL,
                                           false,
                                           kCGRenderingIntentDefault);

    // the modified image
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    // cleanup
    free(pixels);
    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorspace);
    CGDataProviderRelease(provider);
    CGImageRelease(newImageRef);
}
But when this code runs, I get an EXC_BAD_ACCESS error.
What is it that I'm missing or doing wrong?
Try allocating memory for the pixels array, as in the following code:

char *pixels = (char *)malloc(data.length);
memcpy(pixels, [data bytes], data.length);

When pixels is no longer needed, release that memory by calling free(pixels). The crash happens because [data bytes] returns a pointer into the NSData's internal, read-only buffer; writing to it, and later calling free() on it, is undefined behaviour.
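Beyond the allocation, note the loop stride: for 32-bit RGBA data you want to start at offset 0 and step by 4 bytes, otherwise the channel offsets drift across pixel boundaries and the last iteration runs past the end of the buffer. A plain-C sketch of both fixes together (remove_red is a hypothetical helper name; a tightly packed RGBA layout is assumed):

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Copy the pixel data into writable memory, then walk it 4 bytes per
   RGBA pixel, zeroing the red channel of each pixel. The caller owns
   the returned buffer and must free() it. */
uint8_t *remove_red(const uint8_t *src, size_t length) {
    uint8_t *pixels = (uint8_t *)malloc(length);
    memcpy(pixels, src, length);
    for (size_t i = 0; i + 3 < length; i += 4) {
        pixels[i] = 0; /* red; green/blue/alpha at i+1..i+3 stay untouched */
    }
    return pixels;
}
```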
I'm using the following code to read an image from an OpenGL ES scene:
- (UIImage *)drawableToCGImage
{
    CGRect myRect = self.bounds;
    NSInteger myDataLength = myRect.size.width * myRect.size.height * 4;
    glFinish();
    glPixelStorei(GL_PACK_ALIGNMENT, 4);

    int width = myRect.size.width;
    int height = myRect.size.height;

    GLubyte *buffer = (GLubyte *)malloc(myDataLength);
    GLubyte *buffer2 = (GLubyte *)malloc(myDataLength);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer2);
    for (int y1 = 0; y1 < height; y1++) {
        for (int x1 = 0; x1 < width * 4; x1++) {
            buffer[(height - 1 - y1) * width * 4 + x1] = buffer2[y1 * 4 * width + x1];
        }
    }
    free(buffer2);

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, NULL);
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * myRect.size.width;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
    CGImageRef imageRef = CGImageCreate(myRect.size.width, myRect.size.height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    UIImage *image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return image;
}
It works perfectly on iPad and older iPhone models, but I noticed that on iPhone 6 (both the device and the simulator) the output shows monochrome glitches. What could it be?
Also, here is my code for CAEAGLLayer properties:
eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                @YES, kEAGLDrawablePropertyRetainedBacking,
                                kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];
Could somebody shed some light on this crazy magic, please?
Thanks to @MaticOblak, I've figured out the problem.
The buffer was filled incorrectly, because the float values of the rect size were not rounded correctly (yes, only for the iPhone 6 dimensions). Integer values should be used instead.
UPD: my issue was fixed with the following code:
GLint viewport[4];
glGetIntegerv(GL_VIEWPORT, viewport);
int width = viewport[2];
int height = viewport[3];
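With integer width and height in hand, the byte-by-byte flip loop used above can also be written one row at a time with memcpy, which is both easier to get right and faster. A sketch assuming tightly packed RGBA rows (flip_rows is my name for it):

```c
#include <stdint.h>
#include <string.h>

/* Vertically flip a tightly packed RGBA image: row y of src becomes
   row (height - 1 - y) of dst, one memcpy per row. */
void flip_rows(const uint8_t *src, uint8_t *dst, int width, int height) {
    size_t rowBytes = (size_t)width * 4;
    for (int y = 0; y < height; y++) {
        memcpy(dst + (size_t)(height - 1 - y) * rowBytes,
               src + (size_t)y * rowBytes,
               rowBytes);
    }
}
```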
I'm working on a WebRTC app for iOS. My goal is to record video from the WebRTC objects. I have the RTCVideoRenderer delegate, which provides me this method:
-(void)renderFrame:(RTCI420Frame *)frame{
}
My question is: how can I convert the RTCI420Frame object into something useful, in order to display the image or save it to disk?
RTCI420Frames use the YUV420 format. You can easily convert them to RGB using OpenCV, and then convert that to a UIImage. Make sure you #import <RTCI420Frame.h>.
-(void)processFrame:(RTCI420Frame *)frame {
    cv::Mat mYUV((int)frame.height + (int)frame.chromaHeight, (int)frame.width, CV_8UC1, (void *)frame.yPlane);
    cv::Mat mRGB((int)frame.height, (int)frame.width, CV_8UC3);
    cvtColor(mYUV, mRGB, CV_YUV2RGB_I420);
    UIImage *image = [self UIImageFromCVMat:mRGB];
}
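The single mYUV matrix trick above relies on the Y, U and V planes being contiguous in memory, which holds for a standard tightly packed I420 layout: the combined buffer is width × (height + height/2) bytes. A sketch of that arithmetic (plane sizes only; real frames may add per-plane stride padding):

```c
#include <stddef.h>

/* Total bytes of a tightly packed I420 (YUV 4:2:0 planar) frame:
   a full-resolution Y plane plus two quarter-resolution chroma planes. */
size_t i420_buffer_size(int width, int height) {
    size_t ySize = (size_t)width * height;
    size_t chromaSize = (size_t)(width / 2) * (height / 2);
    return ySize + 2 * chromaSize; /* == width * height * 3 / 2 */
}
```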
-(UIImage *)UIImageFromCVMat:(cv::Mat)cvMat
{
    NSData *data = [NSData dataWithBytes:cvMat.data length:cvMat.elemSize() * cvMat.total()];

    CGColorSpaceRef colorSpace;
    if (cvMat.elemSize() == 1) {
        colorSpace = CGColorSpaceCreateDeviceGray();
    } else {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }

    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

    // Creating CGImage from cv::Mat
    CGImageRef imageRef = CGImageCreate(cvMat.cols,
                                        cvMat.rows,
                                        8,
                                        8 * cvMat.elemSize(),
                                        cvMat.step[0],
                                        colorSpace,
                                        kCGImageAlphaNone | kCGBitmapByteOrderDefault,
                                        provider,
                                        NULL,
                                        false,
                                        kCGRenderingIntentDefault);

    // Getting UIImage from CGImage
    UIImage *finalImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    return finalImage;
}
You may want to do this on a separate thread, especially if you are doing any video processing. Also, remember to use the .mm file extension so you can use C++.
If you don't want to use OpenCV, it is possible to do it manually. The following code kind of works, but the colors are messed up and it crashes after a few seconds.
int width = (int)frame.width;
int height = (int)frame.height;
uint8_t *data = (uint8_t *)malloc(width * height * 4);
const uint8_t* yPlane = frame.yPlane;
const uint8_t* uPlane = frame.uPlane;
const uint8_t* vPlane = frame.vPlane;
for (int i = 0; i < width * height; i++) {
    int rgbOffset = i * 4;
    uint8_t y = yPlane[i];
    uint8_t u = uPlane[i / 4];
    uint8_t v = vPlane[i / 4];
    uint8_t r = y + 1.402 * (v - 128);
    uint8_t g = y - 0.344 * (u - 128) - 0.714 * (v - 128);
    uint8_t b = y + 1.772 * (u - 128);
    data[rgbOffset] = r;
    data[rgbOffset + 1] = g;
    data[rgbOffset + 2] = b;
    data[rgbOffset + 3] = UINT8_MAX;
}
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef gtx = CGBitmapContextCreate(data, width, height, 8, width * 4, colorSpace, kCGImageAlphaPremultipliedLast);
CGImageRef cgImage = CGBitmapContextCreateImage(gtx);
UIImage *uiImage = [[UIImage alloc] initWithCGImage:cgImage];
free(data);
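The messed-up colors most likely come from the chroma indexing: in I420 the U and V planes are subsampled 2×2 spatially, so pixel (x, y) should read chroma sample (x/2, y/2), not flat index i/4; the arithmetic also needs clamping to 0–255 before narrowing to uint8_t, or out-of-range values wrap around. (The crash after a few seconds may simply be the unreleased gtx, cgImage and colorSpace accumulating every frame.) A corrected plain-C sketch of the conversion, assuming tightly packed planes with no stride padding (i420_to_rgba is my name for it):

```c
#include <stdint.h>

static uint8_t clamp8(int v) { return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v); }

/* Convert a tightly packed I420 frame to RGBA. Pixel (col, row) reads its
   chroma from sample (col/2, row/2) of the quarter-resolution U/V planes. */
void i420_to_rgba(const uint8_t *yPlane, const uint8_t *uPlane,
                  const uint8_t *vPlane, int width, int height, uint8_t *rgba) {
    int chromaWidth = width / 2;
    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            int i = row * width + col;
            int c = (row / 2) * chromaWidth + (col / 2);
            int y = yPlane[i];
            int u = uPlane[c] - 128;
            int v = vPlane[c] - 128;
            uint8_t *px = rgba + i * 4;
            px[0] = clamp8(y + (int)(1.402f * v));
            px[1] = clamp8(y - (int)(0.344f * u) - (int)(0.714f * v));
            px[2] = clamp8(y + (int)(1.772f * u));
            px[3] = 255;
        }
    }
}
```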
I cannot find a combination of parameters that works with float (I managed to save with unsigned byte):
float *rawImagePixels = (float*)malloc(width * height * sizeof(float));
glPixelStorei(GL_PACK_ALIGNMENT, 4);
glReadPixels(0, 0, width, height, GL_RED_EXT, GL_HALF_FLOAT_OES, rawImagePixels);
NSData *data = [NSData dataWithBytes:rawImagePixels length:width*height];
CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceGray();
CGImageRef iref = CGImageCreate(width, height, 16, 16, width, colorspace, kCGImageAlphaNone|kCGBitmapByteOrder16Big, provider, NULL, NO, kCGRenderingIntentDefault);
UIImage *myImage = [UIImage imageWithCGImage:iref];
UIImageWriteToSavedPhotosAlbum(myImage, self, nil, nil);
My texture is a 1-channel half float texture. How can I save it as a UIImage?
(Extracted from the question.)
I got semi-decent results with this code:
GLhalf *rawImagePixels = (GLhalf*)malloc(width * height * sizeof(GLhalf));
glPixelStorei(GL_PACK_ALIGNMENT, 2);
glReadPixels(0, 0, width, height, GL_RED_EXT, GL_HALF_FLOAT_OES, rawImagePixels);
NSData *data = [NSData dataWithBytes:rawImagePixels length:width * height * sizeof(GLhalf)];
CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceGray();
int bitsPerComponent = 16, bitsPerPixel = 16, bytesPerRow = width * 2;
CGImageRef iref = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorspace, kCGImageAlphaNone|kCGBitmapByteOrder16Little, provider, NULL, NO, kCGRenderingIntentDefault);
UIImage *myImage = [UIImage imageWithCGImage:iref];
UIImageWriteToSavedPhotosAlbum(myImage, self, nil, nil);
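An alternative to pushing 16-bit half-float data through CGImageCreate is to decode the halves on the CPU and build a plain 8-bit grayscale buffer, which Core Graphics handles without fuss. A sketch of the IEEE 754 binary16 decode in plain C (half_to_float and half_to_gray are my names; sample values are assumed to lie in 0–1 before scaling to a byte):

```c
#include <stdint.h>
#include <math.h>

/* Decode an IEEE 754 binary16 value to float. */
float half_to_float(uint16_t h) {
    int sign = (h >> 15) & 1;
    int exponent = (h >> 10) & 0x1F;
    int mantissa = h & 0x3FF;
    float value;
    if (exponent == 0)
        value = ldexpf((float)mantissa, -24);                     /* subnormal */
    else if (exponent == 31)
        value = mantissa ? NAN : INFINITY;                        /* NaN / inf */
    else
        value = ldexpf((float)(mantissa | 0x400), exponent - 25); /* normal */
    return sign ? -value : value;
}

/* Map a half-float sample in [0, 1] to an 8-bit gray value. */
uint8_t half_to_gray(uint16_t h) {
    float f = half_to_float(h);
    if (f < 0.0f) f = 0.0f;
    if (f > 1.0f) f = 1.0f;
    return (uint8_t)(f * 255.0f + 0.5f);
}
```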
I wish to convert a UIImage to 8 bits. I have attempted to do this, but I am not sure if I have done it right, because later, when I try to use the image processing library Leptonica, I get a message stating that the image is not 8 bits. Can anyone tell me whether I am doing this correctly, or show the code for how to do it?
Thanks!
CODE
CGImageRef myCGImage = image.CGImage;
CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(myCGImage));
const UInt8 *imageData = CFDataGetBytePtr(data);
The following code will work for images without an alpha channel:
CGImageRef c = [[UIImage imageNamed:@"100_3077"] CGImage];
size_t bitsPerPixel = CGImageGetBitsPerPixel(c);
size_t bitsPerComponent = CGImageGetBitsPerComponent(c);
size_t width = CGImageGetWidth(c);
size_t height = CGImageGetHeight(c);
CGImageAlphaInfo a = CGImageGetAlphaInfo(c);
NSAssert(bitsPerPixel == 32 && bitsPerComponent == 8 && a == kCGImageAlphaNoneSkipLast, @"unsupported image type supplied");
CGContextRef targetImage = CGBitmapContextCreate(NULL, width, height, 8, 1 * width, CGColorSpaceCreateDeviceGray(), kCGImageAlphaNone);
// Keep the NSData alive while we read its bytes (releasing it immediately
// would leave the pointer dangling)
NSData *sourceNSData = (__bridge_transfer NSData *)CGDataProviderCopyData(CGImageGetDataProvider(c));
const UInt8 *sourceData = (const UInt8 *)[sourceNSData bytes];
UInt8 *targetData = CGBitmapContextGetData(targetImage);
for (uint y = 0; y < height; y++)
{
    for (uint x = 0; x < width; x++)
    {
        uint offset = y * width + x;
        // each source pixel is 4 bytes (RGBX), so index in bytes
        const UInt8 *pixel = sourceData + offset * 4;
        UInt8 r = pixel[0];
        UInt8 g = pixel[1];
        UInt8 b = pixel[2];
        targetData[offset] = (r + g + b) / 3;
    }
}
CGImageRef newImageRef = CGBitmapContextCreateImage(targetImage);
UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
CGContextRelease(targetImage);
CGImageRelease(newImageRef);
With this code I converted an RGB image to a grayscale image. Hope this helps!
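For what it's worth, the plain (r + g + b) / 3 average works, but a BT.601-weighted luma usually looks closer to what the eye expects, since green contributes most to perceived brightness. A sketch in integer arithmetic (luma601 is my name, not part of any framework):

```c
#include <stdint.h>

/* BT.601 luma from 8-bit RGB, in integer arithmetic:
   Y = 0.299 R + 0.587 G + 0.114 B. */
uint8_t luma601(uint8_t r, uint8_t g, uint8_t b) {
    return (uint8_t)((299 * r + 587 * g + 114 * b) / 1000);
}
```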