I'm trying to convert an array of images into a video file. In the process I have to fill a pixel buffer from the selected images. Here is the code snippet:
CVPixelBufferLockBaseAddress(pixelBuffer, 0)
let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
let bitmapInfo:CGBitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue)
let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
print("\npixel buffer width: \(CVPixelBufferGetWidth(pixelBuffer))\n")
print("\nbytes per row: \(CVPixelBufferGetBytesPerRow(pixelBuffer))\n")
let context = CGBitmapContextCreate(
    pixelData,
    Int(image.size.width),
    Int(image.size.height),
    CGImageGetBitsPerComponent(image.CGImage),
    CVPixelBufferGetBytesPerRow(pixelBuffer),
    rgbColorSpace,
    bitmapInfo.rawValue
)
CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage)
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
After executing these lines I get the following messages in Xcode:
CGBitmapContextCreate: invalid data bytes/row: should be at least 13056 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
CGContextDrawImage: invalid context 0x0.
After debugging I get the following values:
CVPixelBufferGetWidth(pixelBuffer) // value 480
CVPixelBufferGetBytesPerRow(pixelBuffer) // value 1920
What should I do to get a valid data bytes/row? What are the 3 components mentioned in the console log? I saw similar questions on Stack Overflow but nothing helped in my case.
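For reference, Core Graphics computes the minimum bytes/row as the width you pass in times 4 bytes per pixel, so 13056 suggests the width you passed (image.size.width) is 3264, far wider than the 480-pixel buffer; the "3 components" are the three color components of the RGB color space (alpha is counted separately). A minimal sketch of one possible fix, assuming the pixel buffer was created at the intended output size, is to size the context from the buffer instead of the image:

CVPixelBufferLockBaseAddress(pixelBuffer, 0)
let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue)

// Use the buffer's own geometry so the width, height, and bytes/row stay consistent
let bufferWidth = CVPixelBufferGetWidth(pixelBuffer)
let bufferHeight = CVPixelBufferGetHeight(pixelBuffer)

let context = CGBitmapContextCreate(
    pixelData,
    bufferWidth,
    bufferHeight,
    8,
    CVPixelBufferGetBytesPerRow(pixelBuffer),
    rgbColorSpace,
    bitmapInfo.rawValue
)

// Drawing into the full buffer rect lets Core Graphics scale the image to fit
CGContextDrawImage(context, CGRectMake(0, 0, CGFloat(bufferWidth), CGFloat(bufferHeight)), image.CGImage)
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)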
Related
We are creating a UIImage from a buffer using the following code:
let context = CGContext(data: baseAddress, width: frameWidth, height: frameHeight, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)!
let cgImage0 = context.makeImage()
let uiImage0 = UIImage(cgImage: cgImage0!)
The problem is that AVCaptureVideoDataOutput() can only create a color image buffer in the kCVPixelFormatType_32BGRA format, which is the closest format to kCVPixelFormatType_32RGBA.
At some point we need a UIImage, so we had to convert the buffer to a UIImage, and because UIImage expects RGBA ordering the colors became distorted.
How can we get a UIImage from the buffer in kCVPixelFormatType_32RGBA format?
By using
var bitmapInfo = CGBitmapInfo.byteOrder32Little.rawValue
bitmapInfo |= CGImageAlphaInfo.premultipliedFirst.rawValue & CGBitmapInfo.alphaInfoMask.rawValue
instead of
var bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
But figuring this out from the Apple documentation is very hard.
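Put together, a minimal sketch of the whole conversion, assuming baseAddress, frameWidth, frameHeight, bytesPerRow, and colorSpace come from the locked kCVPixelFormatType_32BGRA buffer as in the snippet above:

// byteOrder32Little combined with premultipliedFirst tells Core Graphics the
// buffer holds BGRA bytes, so the resulting image comes out with correct colors
var bitmapInfo = CGBitmapInfo.byteOrder32Little.rawValue
bitmapInfo |= CGImageAlphaInfo.premultipliedFirst.rawValue & CGBitmapInfo.alphaInfoMask.rawValue

let context = CGContext(data: baseAddress,
                        width: frameWidth,
                        height: frameHeight,
                        bitsPerComponent: 8,
                        bytesPerRow: bytesPerRow,
                        space: colorSpace,
                        bitmapInfo: bitmapInfo)

// makeImage() copies the pixels, so the pixel buffer can be unlocked afterwards
if let cgImage = context?.makeImage() {
    let uiImage = UIImage(cgImage: cgImage)
    // use uiImage ...
}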
I am making an app in Swift (I have the latest Xcode update) that has to generate a video from some images.
I got the code from this answer: How do I export UIImage array as a movie?
I call the function like this:
let size = CGSize(width: 1280, height: 720)
let pathVideo = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
let percorsoVideo = pathVideo[0]
writeImagesAsMovie(arrayImmagini, videoPath: percorsoVideo+"/prova.mp4", videoSize: size, videoFPS: 1)
"arrayImmagini" is defined literally like this:
var arrayImmagini = [UIImage(imageLiteral: "Frames/turtle/turtle0.jpg"), UIImage(imageLiteral: "Frames/turtle/turtle1.jpg"), ...]
When I try to run the code I get a completely black video, and Xcode gives me these 2 errors once for each image in the array:
Sep 5 09:24:15 Prova[1554] <Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 7680 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
Sep 5 09:24:15 Prova[1554] <Error>: CGContextDrawImage: invalid context 0x0. If you want to see the backtrace, please set CG_CONTEXT_SHOW_BACKTRACE environmental variable.
Reading the documentation for CGBitmapContextCreate, I tried calling it differently:
func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBufferRef) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

    // Create CGBitmapContext
    let context = CGBitmapContextCreate(
        nil,
        Int(image.size.width),
        Int(image.size.height),
        8,
        0,
        rgbColorSpace,
        CGImageAlphaInfo.PremultipliedFirst.rawValue
    )

    // Draw image into context
    CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
}
Instead of:
func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBufferRef) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

    // Create CGBitmapContext
    let context = CGBitmapContextCreate(
        pixelData,
        Int(image.size.width),
        Int(image.size.height),
        8,
        CVPixelBufferGetBytesPerRow(pixelBuffer),
        rgbColorSpace,
        CGImageAlphaInfo.PremultipliedFirst.rawValue
    )

    // Draw image into context
    CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
}
This made Xcode stop giving me errors, but I still get a black video.
Please help me; I'm new to app development and even newer to AVFoundation, and I don't have a clue how to solve this by myself.
Thank you!
After many attempts to make this work, I found out what the problem was.
You can't use a video size smaller than the pictures' size.
Once I fixed this, everything worked:
let size = CGSize(width: 1920, height: 1280)
That's all!
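A simple way to avoid the size mismatch (just a sketch, assuming every frame in the array has the same dimensions) is to derive the video size from the first image instead of hard-coding it:

// Assumes arrayImmagini is non-empty and all frames share the same size
if let firstImage = arrayImmagini.first {
    let size = firstImage.size
    writeImagesAsMovie(arrayImmagini, videoPath: percorsoVideo + "/prova.mp4", videoSize: size, videoFPS: 1)
}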
I've been doing a lot of research into this and I've integrated several different solutions into my project, but none of them seem to work. My current solution has been borrowed from this thread.
When I run my code, however, two things happen:
The pixel array remains initialized but unpopulated (full of 0s).
I get two errors:
CGBitmapContextCreate: unsupported parameter combination: set CGBITMAP_CONTEXT_LOG_ERRORS environmental variable to see the details
and
CGContextDrawImage: invalid context 0x0. If you want to see the backtrace, please set CG_CONTEXT_SHOW_BACKTRACE environmental variable.
Any ideas? Here is the function I've currently built, which I'm calling for my Image class:
init?(fromImage image: UIImage!) {
    let imageRef = image!.CGImage
    self.width = CGImageGetWidth(imageRef)
    self.height = CGImageGetHeight(imageRef)

    let colorspace = CGColorSpaceCreateDeviceRGB()
    let bytesPerRow = (4 * width)
    let bitsPerComponent: UInt = 8
    let pixels = UnsafeMutablePointer<UInt8>(malloc(width*height*4))

    var context = CGBitmapContextCreate(pixels, width, height, Int(bitsPerComponent), bytesPerRow, colorspace, 0)

    CGContextDrawImage(context, CGRectMake(0, 0, CGFloat(width), CGFloat(height)), imageRef)
Any pointers would help a lot, as I'm new to understanding how all of this CGBitmap stuff works.
Thanks a ton!
You should not pass 0 as the bitmapInfo parameter to CGBitmapContextCreate. For RGBA you should pass CGImageAlphaInfo.PremultipliedLast.rawValue.
Supported combinations of bitsPerComponent, bytesPerRow, colorspace and bitmapInfo can be found here:
https://developer.apple.com/library/mac/documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/dq_context/dq_context.html#//apple_ref/doc/uid/TP30001066-CH203-BCIBHHBB
Note that 32 bits per pixel (bpp) is 4 bytes per pixel, and that is what you use to calculate bytesPerRow.
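Applied to the init above (keeping the question's pixels, width, height, colorspace, and imageRef names), the call might look like this:

let bytesPerRow = 4 * width                             // 32 bpp = 4 bytes per pixel
let bitmapInfo = CGImageAlphaInfo.PremultipliedLast.rawValue
let context = CGBitmapContextCreate(pixels, width, height, 8, bytesPerRow, colorspace, bitmapInfo)
CGContextDrawImage(context, CGRectMake(0, 0, CGFloat(width), CGFloat(height)), imageRef)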
You need to convert the image to NSData and then convert the NSData to a UInt8 array.
let data: NSData = UIImagePNGRepresentation(image)!
// or let data: NSData = UIImageJPEGRepresentation(image, 1.0)!
let count = data.length / sizeof(UInt8)
// create an array of Uint8
var array = [UInt8](count: count, repeatedValue: 0)
// copy bytes into array
data.getBytes(&array, length:count * sizeof(UInt8))
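As a hypothetical sanity check, the array should start with the 8-byte PNG signature when the image was encoded with UIImagePNGRepresentation:

print("byte count: \(count)")
// PNG data always begins with 0x89 0x50 0x4E 0x47 0x0D 0x0A 0x1A 0x0A
print("first bytes: \(Array(array.prefix(8)))")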
I am trying to store image data in a buffer in my app so I will be able to use it later; however, I get an EXC_BAD_ACCESS error on the CGContextDrawImage line.
Here is the code I am using:
resizedImage.Array() // resizedImage is an image resized to 280x140 pixels
func Array() {
    let dataBuffer = UnsafeMutablePointer<CUnsignedChar>.alloc(156800)
    memset(dataBuffer, 0, 156800)

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Big.rawValue | CGImageAlphaInfo.PremultipliedLast.rawValue)
    let context = CGBitmapContextCreate(dataBuffer, 280, 140, 8, 156800, colorSpace, bitmapInfo.rawValue)

    let imageRef = CGImageCreateWithImageInRect(CGImage, CGRectMake(0, 0, 280, 140))
    let rectMake = CGRectMake(0, 0, 280, 140)
    CGContextDrawImage(context, rectMake, imageRef)
    return
}
When I try to pass the buffer as null and the bytes/row as 0, as described in the Apple documentation for automatic memory allocation, the app doesn't crash, but it gives these errors:
Sep 17 17:18:33 VideoTester[4846] <Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 1120 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedLast.
Sep 17 17:18:33 VideoTester[4846] <Error>: CGContextDrawImage: invalid context 0x0. If you want to see the backtrace, please set CG_CONTEXT_SHOW_BACKTRACE environmental variable.
It seems like I am missing something but I can't really figure it out. Any advice is appreciated, thank you!
Got this working with the help of peacer212; everything seems to work as needed without any errors. Here is the working code:
var dataBuffer = [UInt8](count: 280*140 * 4, repeatedValue: 0)
let colorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Big.rawValue | CGImageAlphaInfo.PremultipliedLast.rawValue)
let context = CGBitmapContextCreate(&dataBuffer, 280, 140, 8, 1120, colorSpace, bitmapInfo.rawValue)
let imageRef = CGImageCreateWithImageInRect(CGImage, CGRectMake(0, 0, 280, 140))
let rectMake = CGRectMake(0, 0, 280, 140)
CGContextDrawImage(context, rectMake, imageRef)
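As a usage sketch (assuming the premultiplied RGBA byte layout configured above), the channels of the pixel at column x, row y can then be read straight from dataBuffer:

let x = 10                              // hypothetical sample coordinates
let y = 20
let offset = (y * 280 + x) * 4          // 4 bytes per pixel, 280 pixels per row
let red   = dataBuffer[offset]          // note: values are premultiplied by alpha
let green = dataBuffer[offset + 1]
let blue  = dataBuffer[offset + 2]
let alpha = dataBuffer[offset + 3]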
I have found an example of Objective-C code that gets the color of a pixel at a point here:
How to get the pixel color on touch?
The specific section of code I need help with is where the context is created using CGColorSpaceCreateDeviceRGB:
---This is the Objective-C code
unsigned char pixel[4] = {0};
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixel,
1, 1, 8, 4, colorSpace, (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
CGContextTranslateCTM(context, -point.x, -point.y);
My best attempt looks as follows (I'm not returning anything yet; I'm trying to get the context created properly first):
---This is my best attempt at a Swift conversion
func getPixelColorAtPoint() {
    let pixel = UnsafeMutablePointer<CUnsignedChar>.alloc(1)
    var colorSpace: CGColorSpaceRef = CGColorSpaceCreateDeviceRGB()
    let context = CGBitmapContextCreate(pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: nil, bitmapInfo: CGImageAlphaInfo.PremultipliedLast)
}
However, this gives me an error:
Cannot convert the expression's type '(UnsafeMutablePointer<CUnsignedChar>, width: IntegerLiteralConvertible, height: IntegerLiteralConvertible, bitsPerComponent: IntegerLiteralConvertible, bytesPerRow: IntegerLiteralConvertible, space: NilLiteralConvertible, bitmapInfo: CGImageAlphaInfo)' to type 'IntegerLiteralConvertible'
If you could advise how I need to tweak my code above to get the context function parameters entered correctly, I would appreciate it. Thank you!
There are two different problems:
CGBitmapContextCreate() is a function, not a method, and therefore does not use external parameter names by default.
CGImageAlphaInfo.PremultipliedLast cannot be passed as the bitmapInfo: parameter; compare Swift OpenGL unresolved identifier kCGImageAlphaPremultipliedLast.
So this should compile:
let pixel = UnsafeMutablePointer<CUnsignedChar>.alloc(4)
var colorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue)
let context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, bitmapInfo)
// ...
pixel.dealloc(4)
Note that you should allocate space for 4 bytes, not 1.
Alternatively:
var pixel : [UInt8] = [0, 0, 0, 0]
var colorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue)
let context = CGBitmapContextCreate(UnsafeMutablePointer(pixel), 1, 1, 8, 4, colorSpace, bitmapInfo)
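To finish the conversion (a sketch following the linked Objective-C answer, assuming point is the CGPoint to sample and self is the view being read), the remaining steps translate the context to the point and render the layer into it, after which pixel holds the RGBA bytes:

CGContextTranslateCTM(context, -point.x, -point.y)
self.layer.renderInContext(context!)    // draw the view into the 1x1 bitmap

// The context was created as premultiplied RGBA, so the bytes are R, G, B, A
let color = UIColor(red:   CGFloat(pixel[0]) / 255.0,
                    green: CGFloat(pixel[1]) / 255.0,
                    blue:  CGFloat(pixel[2]) / 255.0,
                    alpha: CGFloat(pixel[3]) / 255.0)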