How to create an ARGB8888 binary image on iOS in Swift

I am using the following code to create an ARGB8888 image. Is this the correct image format for ARGB8888, or should I use a different format?
This link is what I used to create the image format below:
guard let imageConversionToCGImage = img.cgImage,
      let imageFormat = vImage_CGImageFormat(
          bitsPerComponent: 8,
          bitsPerPixel: 32,
          colorSpace: CGColorSpaceCreateDeviceRGB(),
          bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue),
          renderingIntent: .defaultIntent),
      // Source buffer
      let sourceBuffer = try? vImage_Buffer(cgImage: imageConversionToCGImage, format: imageFormat, flags: .noFlags),
      // ARGB image
      let argb8888CGImage = try? sourceBuffer.createCGImage(format: imageFormat) else { return }
I am then getting the image data through the image's data provider (DataProvider.CFData). Is this the right format, or am I doing it wrong? What should I do to get the ARGB data?

When you pass a populated vImage_CGImageFormat to that initializer, vImage will attempt to convert the source image to the specified format. For example, you could pass a grayscale image format such as:
let imageFormat = vImage_CGImageFormat(
    bitsPerComponent: 8,
    bitsPerPixel: 8,
    colorSpace: CGColorSpaceCreateDeviceGray(),
    bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue),
    renderingIntent: .defaultIntent)
And the returned buffer will contain a grayscale representation of the source image regardless of the source image's format.
If you want to populate a buffer based on the format of the source image, pass an empty vImage_CGImageFormat and use the vImageBuffer_InitWithCGImage function:
var imageFormat = vImage_CGImageFormat()
var sourceBuffer = vImage_Buffer()

vImageBuffer_InitWithCGImage(&sourceBuffer,
                             &imageFormat,
                             nil,
                             cgImage,
                             vImage_Flags(kvImageNoFlags))
On return, imageFormat contains the color space and bit depth of the source image, and sourceBuffer contains the image itself.
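If the goal is to read the raw ARGB8888 bytes out of that buffer, a minimal sketch could look like the following (assuming the source image really is 8 bits per component with 4 channels; note that rowBytes may be larger than width * 4 because of row padding, and the memory that vImageBuffer_InitWithCGImage allocates must be freed when you are done):
// Sketch: copy the pixels out row by row, skipping any per-row padding.
let bytesPerPixel = 4
let width = Int(sourceBuffer.width)
let height = Int(sourceBuffer.height)

var pixelData = Data(capacity: width * height * bytesPerPixel)
for row in 0..<height {
    let rowStart = sourceBuffer.data.advanced(by: row * sourceBuffer.rowBytes)
    pixelData.append(Data(bytes: rowStart, count: width * bytesPerPixel))
}

// vImageBuffer_InitWithCGImage malloc'd the buffer's storage, so release it here.
free(sourceBuffer.data)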
Apple have just released vImage.PixelBuffer, which provides a Swift-friendly API for vImage. Take a look at init(cgImage:cgImageFormat:pixelFormat:).
Finally, init(data:width:height:byteCountPerRow:pixelFormat:) shows an example of creating a vImage pixel buffer from a Core Graphics image's underlying data.

How to convert a numerical data array into RAW image data in Swift?

I have a data array of Int16 or Int32 numerical values that are the raw image data from an 11 MP camera chip with an RGGB pixel layout (CFA). The data are exported by the camera driver as FITS data, which is basically a vector, or long string of bytes, of 16-bit-per-pixel data in my case.
I would like to convert these data into a raw image format in Swift in order to use the powerful debayering and demosaicing features and algorithms in iOS/Swift. I do not intend to demosaic myself, since iOS already has a great library for this (see the WWDC 2016 keynote on Raw Processing with Core Image).
I need to make iOS “believe” my data are actual raw image data.
I tried using CVPixelBufferCreateWithBytes in Swift and then creating a CIImage from the pixel buffer, but to no avail: the CIImage.cgImage is not an RGB color image.
Is there a simple way to create a raw or DNG image in Swift from raw numerical data?
Here is what I tried with the CVPixelBuffer approach, but I do not get any color image out of this:
imgRawData is an [Int32] or [Float32] array with width * height elements.
var pixelBuffer: CVPixelBuffer?
let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
             kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue]

CVPixelBufferCreateWithBytes(kCFAllocatorDefault, width, height, kCVPixelFormatType_14Bayer_RGGB, &imgRawData, 2 * width, nil, nil, attrs as CFDictionary, &pixelBuffer)

let dummyImg = UIImage(systemName: "star.fill")?.cgImage

let ciiraw = CIImage(cvPixelBuffer: pixelBuffer!)

let cif = CIFilter.lanczosScaleTransform()
cif.scale = 0.25
cif.inputImage = ciiraw
let cii = cif.outputImage

let context: CIContext = CIContext.init(options: nil)
guard let cgi = context.createCGImage(cii!, from: cii!.extent) else { return dummyImg! }
Quick Look in Xcode shows me only black-and-white or grayscale images. So does the SwiftUI view of the CGImage...
You can use CGContext and pass your raw pixel data into its initializer, see init:
init?(data: UnsafeMutableRawPointer?, width: Int, height: Int, bitsPerComponent: Int, bytesPerRow: Int, space: CGColorSpace, bitmapInfo: UInt32)
For the space parameter, which takes a CGColorSpace, you would use CGColorSpaceCreateDeviceRGB().
You can then create your image with code similar to this:
let imageRef = context?.makeImage()
let imageRep = NSBitmapImageRep(cgImage: imageRef!)
Play around with it for a bit, I think you will find what you are looking for.
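To make that concrete, here is a minimal sketch of the route this answer suggests, under the assumption that the raw data has already been converted to 8-bit RGBA (the Bayer/RGGB demosaicing itself is not addressed here); the function name and parameters are illustrative, not from the question:
import CoreGraphics

// Sketch: wrap an 8-bit RGBA pixel buffer in a CGContext and ask it for a CGImage.
func makeImage(fromRGBA pixels: inout [UInt8], width: Int, height: Int) -> CGImage? {
    let bytesPerRow = width * 4
    let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue

    return pixels.withUnsafeMutableBytes { buffer -> CGImage? in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: bytesPerRow,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: bitmapInfo) else { return nil }
        return context.makeImage()
    }
}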

getting uncompressed CIImage data

I'm trying to get a CIImage's uncompressed data.
For now, the only way I have found to get the data is compressed, using a CIContext as follows:
let ciContext = CIContext()
let ciImage = CIImage(color: .red).cropped(to: .init(x: 0, y: 0, width: 192, height: 192))

guard let ciImageData = ciContext.jpegRepresentation(of: ciImage, colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!, options: [:]) else {
    fatalError()
}
print(ciImageData.count) // Prints 1331
Is it possible to get (as efficiently as possible) the uncompressed CIImage data?
As you can see, ciContext.jpegRepresentation is compressing the image data as JPEG and gives you a Data object that can be written as-is as a JPEG file to disk (including image metadata).
You need to use a different CIContext API for rendering directly into (uncompressed) bitmap data:
let rowBytes = 4 * Int(ciImage.extent.width) // 4 channels (RGBA) of 8-bit data
let dataSize = rowBytes * Int(ciImage.extent.height)
var data = Data(count: dataSize)

data.withUnsafeMutableBytes { buffer in
    ciContext.render(ciImage,
                     toBitmap: buffer.baseAddress!,
                     rowBytes: rowBytes,
                     bounds: ciImage.extent,
                     format: .RGBA8,
                     colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
}
Alternatively, you can create a CVPixelBuffer with the correct size and format and render into that with CIContext.render(_ image: CIImage, to buffer: CVPixelBuffer). I think Core ML has direct support for CVPixelBuffer inputs, so this might be the better option.
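For the CVPixelBuffer route mentioned above, a minimal sketch might look like this (assuming a 32-bit BGRA buffer is acceptable for the downstream consumer):
import CoreImage
import CoreVideo

// Sketch: create a BGRA pixel buffer matching the image extent and render into it.
let width = Int(ciImage.extent.width)
let height = Int(ciImage.extent.height)

var pixelBuffer: CVPixelBuffer?
let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                 width,
                                 height,
                                 kCVPixelFormatType_32BGRA,
                                 nil,
                                 &pixelBuffer)

if status == kCVReturnSuccess, let pixelBuffer = pixelBuffer {
    ciContext.render(ciImage, to: pixelBuffer)
    // pixelBuffer now contains the uncompressed bitmap (e.g. for Core ML input).
}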

Generate Laplacian image by Apple-Metal MPSImageLaplacian

I am trying to generate a Laplacian image out of an RGB CGImage by using the Metal MPSImageLaplacian.
The current code used:
if let croppedImage = self.cropImage2(image: UIImage(ciImage: image), rect: rect)?.cgImage {
    let commandBuffer = self.commandQueue.makeCommandBuffer()!
    let laplacian = MPSImageLaplacian(device: self.device)
    let textureLoader = MTKTextureLoader(device: self.device)
    let options: [MTKTextureLoader.Option : Any]? = nil
    let srcTex = try! textureLoader.newTexture(cgImage: croppedImage, options: options)
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: srcTex.pixelFormat, width: srcTex.width, height: srcTex.height, mipmapped: false)
    let lapTex = self.device.makeTexture(descriptor: desc)
    laplacian.encode(commandBuffer: commandBuffer, sourceTexture: srcTex, destinationTexture: lapTex!)
    let output = CIImage(mtlTexture: lapTex!, options: [:])?.cgImage
    print("output: \(output?.width)")
    print("")
}
I suspect the problem is in makeTexture:
let lapTex = self.device.makeTexture(descriptor: desc)
The width and height of lapTex in the debugger are invalid, although desc and srcTex contain valid data, including width and height.
It looks like the order of initialisation is wrong, but I couldn't find what.
Does anyone have an idea what is wrong?
Thanks
There are a few things wrong here.
First, as mentioned in my comment, the command buffer isn't being committed, so the kernel work is never being performed.
Second, you need to wait for the work to complete before attempting to read back the results. (On macOS you'd additionally need to use a blit command encoder to ensure that the contents of the texture are copied back to CPU-accessible memory.)
Third, it's important to create the destination texture with the appropriate usage flags. The default of .shaderRead is insufficient in this case, since the MPS kernel writes to the texture. Therefore, you should explicitly set the usage property on the texture descriptor (to either [.shaderRead, .shaderWrite] or .shaderWrite, depending on how you go on to use the texture).
Fourth, it may be the case that the pixel format of your source texture isn't a writable format, so unless you're absolutely certain it is, consider setting the destination pixel format to a known-writable format (like .rgba8unorm) instead of assuming the destination should match the source. This also helps later when creating CGImages.
Finally, there is no guarantee that the cgImage property of a CIImage is non-nil when it wasn't created from a CGImage. Calling the property doesn't (necessarily) create a new backing CGImage. So, you need to explicitly create a CGImage somehow.
One way of doing this would be to create a Metal device-backed CIContext and use its createCGImage(_:from:) method. Although this might work, it seems redundant if the intent is simply to create a CGImage from a MTLTexture (for display purposes, let's say).
Instead, consider using the getBytes(_:bytesPerRow:from:mipmapLevel:) method to get the bytes from the texture and load them into a CG bitmap context. It's then trivial to create a CGImage from the context.
Here's a function that computes the Laplacian of an image and returns the resulting image:
func laplacian(_ image: CGImage) -> CGImage? {
    let commandBuffer = self.commandQueue.makeCommandBuffer()!

    let laplacian = MPSImageLaplacian(device: self.device)

    let textureLoader = MTKTextureLoader(device: self.device)
    let options: [MTKTextureLoader.Option : Any]? = nil
    let srcTex = try! textureLoader.newTexture(cgImage: image, options: options)

    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: srcTex.pixelFormat,
                                                        width: srcTex.width,
                                                        height: srcTex.height,
                                                        mipmapped: false)
    desc.pixelFormat = .rgba8Unorm
    desc.usage = [.shaderRead, .shaderWrite]
    let lapTex = self.device.makeTexture(descriptor: desc)!

    laplacian.encode(commandBuffer: commandBuffer, sourceTexture: srcTex, destinationTexture: lapTex)

    #if os(macOS)
    let blitCommandEncoder = commandBuffer.makeBlitCommandEncoder()!
    blitCommandEncoder.synchronize(resource: lapTex)
    blitCommandEncoder.endEncoding()
    #endif

    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()

    // Note: You may want to use a different color space depending
    // on what you're doing with the image
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    // Note: We skip the last component (A) since the Laplacian of the alpha
    // channel of an opaque image is 0 everywhere, and that interacts oddly
    // when we treat the result as an RGBA image.
    let bitmapInfo = CGImageAlphaInfo.noneSkipLast.rawValue
    let bytesPerRow = lapTex.width * 4
    let bitmapContext = CGContext(data: nil,
                                  width: lapTex.width,
                                  height: lapTex.height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: bytesPerRow,
                                  space: colorSpace,
                                  bitmapInfo: bitmapInfo)!
    lapTex.getBytes(bitmapContext.data!,
                    bytesPerRow: bytesPerRow,
                    from: MTLRegionMake2D(0, 0, lapTex.width, lapTex.height),
                    mipmapLevel: 0)

    return bitmapContext.makeImage()
}
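A short usage sketch, assuming self already has device and commandQueue set up as in the question, and that imageView is a hypothetical UIImageView used only for illustration:
if let croppedImage = self.cropImage2(image: UIImage(ciImage: image), rect: rect)?.cgImage,
   let laplacianImage = laplacian(croppedImage) {
    imageView.image = UIImage(cgImage: laplacianImage)
}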

MTKTextureLoader saturates image

I am trying to use an MTKTextureLoader to load a CGImage as a texture. Here is the original image.
However, after I convert that CGImage into an MTLTexture and that texture back to a CGImage, it looks horrible, like this:
Here is sorta what is going on in code.
The image is loaded in as a CGImage (I have checked and that image does appear to have the full visual quality)
I have a function view() that allows me to view an NSImage by using it in a CALayer like so:
func view() {
    .....
    imageView!.layer = CALayer()
    imageView!.layer!.contentsGravity = kCAGravityResizeAspectFill
    imageView!.layer!.contents = img
    imageView!.wantsLayer = true
So I did the following
let cg = CoolImage()
let ns = NSImage(cgImage: cg, size: Size(width: cg.width, height: cg.height))
view(image: ns)
And I checked: sure enough, it had the full visual fidelity.
So then I loaded the CGImage into an MTLTexture like so:
let textureLoader = MTKTextureLoader(device: metalState.sharedDevice!)
let options = [
    MTKTextureLoader.Option.textureUsage: NSNumber(value: MTLTextureUsage.shaderRead.rawValue | MTLTextureUsage.shaderWrite.rawValue | MTLTextureUsage.renderTarget.rawValue),
    MTKTextureLoader.Option.SRGB: false
]
return ensure(try textureLoader.newTexture(cgImage: cg, options: options))
I then converted the MTLTexture back to an NSImage like so:
let texture = self
let width = texture.width
let height = texture.height
let bytesPerRow = width * 4

let data = UnsafeMutableRawPointer.allocate(byteCount: bytesPerRow * height, alignment: 4)
defer {
    data.deallocate()
}

let region = MTLRegionMake2D(0, 0, width, height)
texture.getBytes(data, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

var buffer = vImage_Buffer(data: data, height: UInt(height), width: UInt(width), rowBytes: bytesPerRow)
var map: [UInt8] = [0, 1, 2, 3]
if (pixelFormat == .bgra8Unorm) {
    map = [2, 1, 0, 3]
}
vImagePermuteChannels_ARGB8888(&buffer, &buffer, map, 0)

guard let colorSpace = CGColorSpace(name: CGColorSpace.genericRGBLinear) else { return nil }
guard let context = CGContext(data: data, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue) else { return nil }
guard let cgImage = context.makeImage() else { return nil }

return NSImage(cgImage: cgImage, size: Size(width: width, height: height))
And viewed it.
The resulting image was quite saturated and I believe it was because of the CGImage to MTLTexture conversion which I have been fairly successful with in the past.
Please note that this texture was never rendered only converted.
You are probably wondering why I am using all of these conversions and that is a great point. My actual pipeline does not work anything like this HOWEVER it does require each of these conversion components to be working smoothly. This is not my actual use case just something to show the problem.
The problem here isn't the conversion from CGImage to MTLTexture. The problem is that you're assuming that the color space of the source image is linear. More likely than not, the image data is actually sRGB-encoded, so by creating a bitmap context with a generic linear color space, you're incorrectly telling CG that it should gamma-encode the image data before display, which leads to the desaturation you're seeing.
You can fix this by using the native color space of the original CGImage, or by otherwise accounting for the fact that your image data is sRGB-encoded.
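A minimal sketch of that fix, assuming the original CGImage (cg in the question) is still available when the bitmap context is created; only the color space changes, everything else stays as in the question's conversion code:
// Use the source image's own color space (typically sRGB) rather than a linear one.
let colorSpace = cg.colorSpace ?? CGColorSpace(name: CGColorSpace.sRGB)!

guard let context = CGContext(data: data,
                              width: width,
                              height: height,
                              bitsPerComponent: 8,
                              bytesPerRow: bytesPerRow,
                              space: colorSpace,
                              bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue) else { return nil }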

64-bit RGBA UIImage? CGBitmapInfo for 64-bit

I'm trying to save a 16-bit depth PNG image with the P3 color space from a Metal texture on iOS. The texture has pixelFormat = .rgba16Unorm, and I extract the data with this code:
func dataProviderRef() -> CGDataProvider? {
    let pixelCount = width * height
    var imageBytes = [UInt8](repeating: 0, count: pixelCount * bytesPerPixel)
    let region = MTLRegionMake2D(0, 0, width, height)
    getBytes(&imageBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
    return CGDataProvider(data: NSData(bytes: &imageBytes, length: pixelCount * bytesPerPixel * MemoryLayout<UInt8>.size))
}
I figured out that the way to save a PNG image on iOS would be to create a UIImage first, and to initialize it, I need to create a CGImage. The problem is I don't know what to pass for the CGBitmapInfo. In the documentation I can see you can specify the byte order for 32-bit formats, but not for 64-bit.
The function I use to convert the texture to a UIImage is this:
extension UIImage {
    public convenience init?(texture: MTLTexture) {
        guard let rgbColorSpace = texture.defaultColorSpace else {
            return nil
        }
        let bitmapInfo: CGBitmapInfo = [CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue)]
        guard let provider = texture.dataProviderRef() else {
            return nil
        }
        guard let cgim = CGImage(
            width: texture.width,
            height: texture.height,
            bitsPerComponent: texture.bitsPerComponent,
            bitsPerPixel: texture.bitsPerPixel,
            bytesPerRow: texture.bytesPerRow,
            space: rgbColorSpace,
            bitmapInfo: bitmapInfo,
            provider: provider,
            decode: nil,
            shouldInterpolate: false,
            intent: .defaultIntent
        )
        else {
            return nil
        }
        self.init(cgImage: cgim)
    }
}
Note that "texture" is using a series of attributes that do not exist in MTLTexture. I created a simple extension for convenience. The only interesting bit I guess it's the color space, that at the moment is simply,
public extension MTLTexture {
    var defaultColorSpace: CGColorSpace? {
        get {
            switch pixelFormat {
            case .rgba16Unorm:
                return CGColorSpace(name: CGColorSpace.displayP3)
            default:
                return CGColorSpaceCreateDeviceRGB()
            }
        }
    }
}
It looks like the image I'm creating with that code above is sampling 4 bytes per pixel, instead of 8. So I obviously end up with a funny looking image...
How do I create the appropriate CGBitmapInfo? Is it even possible?
P.S. If you want to see the full code with an example, it's all in github: https://github.com/endavid/VidEngine/tree/master/SampleColorPalette
The answer was to use byteOrder16Little. For instance, I've replaced bitmapInfo in the code above with this:
let isFloat = texture.bitsPerComponent == 16
let bitmapInfo: CGBitmapInfo = [isFloat ? .byteOrder16Little : .byteOrder32Big, CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue)]
(The alpha can be premultiplied as well).
The SDK documentation does not provide many hints as to why this is, but the book Programming with Quartz has a nice explanation of the meaning of these 16 bits:
The value byteOrder16Little specifies to Quartz that each 16-bit chunk of data supplied by your data provider should be treated in little endian order [...] For example, when using a value of byteOrder16Little for an image that specifies RGB format with 16 bits per component and 48 bits per pixel, your data provider supplies the data for each pixel where the components are ordered R, G, B, but each color component value is in little-endian order [...] For best performance when using byteOrder16Little, either the pixel size or the component size of the image must be 16 bits.
So for a 64-bit image in rgba16, the pixel size is 64 bits, but the component size is 16 bits. It works nicely :)
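To make the arithmetic concrete, here is a small sketch of the sizes involved for an .rgba16Unorm texture (texture here is assumed to be the MTLTexture from the question):
// Four 16-bit components per pixel.
let bitsPerComponent = 16
let bitsPerPixel = 64                   // 4 components * 16 bits
let bytesPerPixel = bitsPerPixel / 8    // 8 bytes
let bytesPerRow = texture.width * bytesPerPixel

// Each 16-bit component is stored little-endian, hence .byteOrder16Little.
let bitmapInfo: CGBitmapInfo = [.byteOrder16Little,
                                CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue)]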
(Thanks @warrenm!)
