CVMetalTextureCacheCreateTextureFromImage always returns null - iOS

I'm trying to render I420 (planar YCbCr) frames via MetalKit.
Most examples use a CMSampleBuffer that comes from the camera,
but my goal is to use raw I420 bytes that I already have.
I do something like this:
let data = NSMutableData(contentsOfURL: NSBundle.mainBundle().URLForResource("yuv_640_360", withExtension: "yuv")!)

// Cache for Y
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, self.device!, nil, &videoTextureCache)

var pixelBuffer: CVPixelBuffer?
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, Int(size.width), Int(size.height),
                             kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                             data.mutableBytes, Int(size.width), nil, nil,
                             [
                                 "kCVPixelBufferMetalCompatibilityKey": true,
                                 "kCVPixelBufferOpenGLCompatibilityKey": true,
                                 "kCVPixelBufferIOSurfacePropertiesKey": []
                             ],
                             &pixelBuffer)

// Y texture
var yTextureRef: Unmanaged<CVMetalTexture>?
let yWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
let yHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, (videoTextureCache?.takeUnretainedValue())!, pixelBuffer, nil, MTLPixelFormat.R8Unorm, yWidth, yHeight, 0, &yTextureRef)
Basically the code is almost the same as in other examples, except that I create the CVPixelBuffer myself.
I get no error when creating the CVPixelBuffer or the CVMetalTexture,
but the result is always null for yTexture.
How do I create the right CVPixelBuffer and use it to render?

Problem solved.
The IOSurface is the important part: I found that the IOSurface is always null if you create the CVPixelBuffer with CVPixelBufferCreateWithBytes or CVPixelBufferCreateWithPlanarBytes.
Once you use a CVPixelBuffer whose IOSurface is null, the Metal texture will always be null as well.
You should do something like this:
let result = CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_420YpCbCr8Planar,
                                 [
                                     String(kCVPixelBufferIOSurfacePropertiesKey): [
                                         "IOSurfaceOpenGLESFBOCompatibility": true,
                                         "IOSurfaceOpenGLESTextureCompatibility": true,
                                         "IOSurfaceCoreAnimationCompatibility": true,
                                     ]
                                 ], &self.pixelBuffer)

CVPixelBufferLockBaseAddress(self.pixelBuffer!, 0)
for index in 0...2 {
    // Copy each source plane into the IOSurface-backed buffer.
    // Note: this assumes each plane is tightly packed; if
    // CVPixelBufferGetBytesPerRowOfPlane differs from the plane width,
    // copy row by row instead.
    memcpy(CVPixelBufferGetBaseAddressOfPlane(self.pixelBuffer!, index), planesAddress[index], planesWidth[index] * planesHeight[index])
}
CVPixelBufferUnlockBaseAddress(self.pixelBuffer!, 0)
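With the IOSurface-backed buffer in place, the texture-creation call from the question returns a usable texture. Here is a minimal sketch in the same Swift style (assuming videoTextureCache and self.pixelBuffer from above); the MTLTexture to bind in the render pass comes out via CVMetalTextureGetTexture:

var yTextureRef: Unmanaged<CVMetalTexture>?
let yWidth = CVPixelBufferGetWidthOfPlane(self.pixelBuffer!, 0)
let yHeight = CVPixelBufferGetHeightOfPlane(self.pixelBuffer!, 0)
let status = CVMetalTextureCacheCreateTextureFromImage(
    kCFAllocatorDefault,
    (videoTextureCache?.takeUnretainedValue())!,
    self.pixelBuffer!,
    nil,
    MTLPixelFormat.R8Unorm, // each plane is a single 8-bit channel
    yWidth, yHeight,
    0,                      // plane index: 0 = Y, 1 = Cb, 2 = Cr
    &yTextureRef)
if status == kCVReturnSuccess {
    // Create-rule result, so take a retained value.
    let yMetalTexture = CVMetalTextureGetTexture(yTextureRef!.takeRetainedValue())
}

Repeat with plane indices 1 and 2 for the chroma planes of the three-plane format.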

Related

Writing encoded audio CMSampleBuffer not working

I'm using AudioConverter to convert uncompressed CMSampleBuffers captured via AVCaptureSession into an AudioBufferList:
let packetDescriptionsPtr = UnsafeMutablePointer<AudioStreamPacketDescription>.allocate(capacity: 1)
AudioConverterFillComplexBuffer(
    converter,
    inputDataProc,
    Unmanaged.passUnretained(self).toOpaque(),
    &ioOutputDataPacketSize,
    outOutputData.unsafeMutablePointer,
    packetDescriptionsPtr
)
I am then constructing a CMSampleBuffer containing compressed data using packet descriptions like so:
CMAudioSampleBufferCreateWithPacketDescriptions(
    allocator: kCFAllocatorDefault,
    dataBuffer: nil,
    dataReady: false,
    makeDataReadyCallback: nil,
    refcon: nil,
    formatDescription: formatDescription!,
    sampleCount: Int(data.unsafePointer.pointee.mNumberBuffers),
    presentationTimeStamp: presentationTimeStamp,
    packetDescriptions: &packetDescriptions,
    sampleBufferOut: &sampleBuffer)
When I tried saving the buffer using AVAssetWriter I got the following error:
-[AVAssetWriterInput appendSampleBuffer:] Cannot append sample buffer: First input buffer must have an appropriate kCMSampleBufferAttachmentKey_TrimDurationAtStart since the codec has encoder delay'
I decided to prime the first three buffers knowing that each is of consistent length:
if self.receivedAudioBuffers < 2 {
    let primingDuration = CMTimeMake(value: 1024, timescale: 44100)
    CMSetAttachment(sampleBuffer,
                    key: kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                    value: CMTimeCopyAsDictionary(primingDuration, allocator: kCFAllocatorDefault),
                    attachmentMode: kCMAttachmentMode_ShouldNotPropagate)
    self.receivedAudioBuffers += 1
} else if self.receivedAudioBuffers == 2 {
    let primingDuration = CMTimeMake(value: 64, timescale: 44100)
    CMSetAttachment(sampleBuffer,
                    key: kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                    value: CMTimeCopyAsDictionary(primingDuration, allocator: kCFAllocatorDefault),
                    attachmentMode: kCMAttachmentMode_ShouldNotPropagate)
    self.receivedAudioBuffers += 1
}
Now I no longer get the error, and appending the samples succeeds, but the audio doesn't play in the recording, and it also messes up the whole video file (the timing info seems to get corrupted).
Is there anything here that I'm missing? How should I correctly append audio CMSampleBuffer?

Convert pixelBuffer(kCVPixelFormatType_420YpCbCr8Planar) to CIImage

Now I am trying to convert a pixelBuffer to a CIImage, but it fails.
When the pixelBuffer type is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, the following code executes without errors.
let sourceImage = CIImage.init(cvPixelBuffer: imageBuffer, options: nil)
However, when the pixelBuffer type is kCVPixelFormatType_420YpCbCr8Planar, it fails.
The log is as follows:
[api] -[CIImage initWithCVPixelBuffer:options:] failed because its pixel format y420 is not supported.
Therefore, I want to know how to convert the pixelBuffer's type, or how to convert a kCVPixelFormatType_420YpCbCr8Planar pixelBuffer to a CIImage.
Please teach me.
Check out vImageConverter.
With it you can convert to ARGB8888 using vImageConvert_420Yp8_CbCr8ToARGB8888,
and you must create a vImage_Buffer for each plane.
Example of generating the conversion matrix:
/// This is the YCbCr-to-RGB conversion opaque object used by the convert function.
private var conversionMatrix: vImage_YpCbCrToARGB = {
    var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 0, CbCr_bias: 128, YpRangeMax: 255, CbCrRangeMax: 255, YpMax: 255, YpMin: 1, CbCrMax: 255, CbCrMin: 0)
    var matrix = vImage_YpCbCrToARGB()
    vImageConvert_YpCbCrToARGB_GenerateConversion(kvImage_YpCbCrToARGBMatrix_ITU_R_709_2, &pixelRange, &matrix, kvImage420Yp8_CbCr8, kvImageARGB8888, UInt32(kvImageNoFlags))
    return matrix
}()
Example code: CapturedImageSampler.swift
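To make the conversion step concrete, here is a minimal, untested sketch. Since kCVPixelFormatType_420YpCbCr8Planar has three planes, it uses the three-plane Accelerate variant vImageConvert_420Yp8_Cb8_Cr8ToARGB8888 (the matrix above would then be generated with kvImage420Yp8_Cb8_Cr8 instead of kvImage420Yp8_CbCr8); the function names come from Accelerate, everything else is illustrative:

import Accelerate

func makeARGBBuffer(from pixelBuffer: CVPixelBuffer) -> vImage_Buffer? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    // Wrap each plane of the pixel buffer in a vImage_Buffer.
    func plane(index: Int) -> vImage_Buffer {
        return vImage_Buffer(data: CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, index),
                             height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(pixelBuffer, index)),
                             width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(pixelBuffer, index)),
                             rowBytes: CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, index))
    }
    var srcYp = plane(index: 0)
    var srcCb = plane(index: 1)
    var srcCr = plane(index: 2)

    // Allocate the 32-bit-per-pixel ARGB destination.
    var dest = vImage_Buffer()
    guard vImageBuffer_Init(&dest, srcYp.height, srcYp.width, 32, vImage_Flags(kvImageNoFlags)) == kvImageNoError else {
        return nil
    }

    let error = vImageConvert_420Yp8_Cb8_Cr8ToARGB8888(&srcYp, &srcCb, &srcCr, &dest,
                                                       &conversionMatrix, nil, 255,
                                                       vImage_Flags(kvImageNoFlags))
    return error == kvImageNoError ? dest : nil
}

From dest you can then build a CGImage (e.g. with vImageCreateCGImageFromBuffer) and wrap that in a CIImage; free dest.data when you are done with it.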

How to generate an audio file from hex/binary (raw) data in iOS?

I am working on a BLE project where an audio recorder hardware continuously streams data to the iOS application. From the iOS application end, I need to read the transferred data.
The hardware sends hex data to the iOS application, and we need to create an .mp3/.wav file from it.
Does anyone have an idea how to create an audio file from binary/hex input data?
Note: I have to use the raw (hex) data to create the audio file.
Thanks
It's unclear from your question how the data is coming in, but I'm going to assume that you periodically receive a Data of linear PCM samples as signed integers that you want to append. If it's some other format, then you'll have to change the settings. This is all just general-purpose stuff; you will almost certainly have to modify it for your specific problem.
(Much of this code is based on Create a silent audio CMSampleBufferRef)
First you need a writer:
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .wav)
Then you need to know how your data is formatted (this quietly assumes that the data is a multiple of the frame size; if that isn't true, you'll need to keep track of the partial frames):
let numChannels = 1
let sampleRate = 44100
let bytesPerFrame = MemoryLayout<Int16>.size * numChannels
let frames = data.count / bytesPerFrame
let duration = Double(frames) / Double(sampleRate)
let blockSize = frames * bytesPerFrame
Then you need to know what the current frame is. This will update over time.
var currentFrame: Int64 = 0
Now you need a description of your data:
var asbd = AudioStreamBasicDescription(
    mSampleRate: Float64(sampleRate),
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kLinearPCMFormatFlagIsSignedInteger,
    mBytesPerPacket: UInt32(bytesPerFrame),
    mFramesPerPacket: 1,
    mBytesPerFrame: UInt32(bytesPerFrame),
    mChannelsPerFrame: UInt32(numChannels),
    mBitsPerChannel: UInt32(MemoryLayout<Int16>.size * 8),
    mReserved: 0
)
var formatDesc: CMAudioFormatDescription?
var status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &asbd, 0, nil, 0, nil, nil, &formatDesc)
assert(status == noErr)
And create your input adapter and add it to the writer
let settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVNumberOfChannelsKey: numChannels,
    AVSampleRateKey: sampleRate
]
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: settings, sourceFormatHint: formatDesc)
writer.add(input)
That's all the one-time setup; now it's time to start the writer:
writer.startWriting()
writer.startSession(atSourceTime: kCMTimeZero)
If all your data is the same size, you can create a reusable buffer (or you can create a new one each time):
var block: CMBlockBuffer?
status = CMBlockBufferCreateWithMemoryBlock(
    kCFAllocatorDefault,
    nil,        // memoryBlock (nil means allocate one)
    blockSize,  // blockLength
    nil,        // blockAllocator
    nil,        // customBlockSource
    0,          // offsetToData
    blockSize,  // dataLength
    0,          // flags
    &block
)
assert(status == kCMBlockBufferNoErr)
When data comes in, copy it into the buffer:
status = CMBlockBufferReplaceDataBytes(&inputData, block!, 0, blockSize)
assert(status == kCMBlockBufferNoErr)
Now create a sample buffer from the buffer and append it to the writer input:
var sampleBuffer: CMSampleBuffer?
status = CMAudioSampleBufferCreateReadyWithPacketDescriptions(
    kCFAllocatorDefault,
    block,                                       // dataBuffer
    formatDesc!,
    frames,                                      // numSamples
    CMTimeMake(currentFrame, Int32(sampleRate)), // sbufPTS
    nil,                                         // packetDescriptions
    &sampleBuffer
)
assert(status == noErr)
input.append(sampleBuffer!)
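One step the snippets above leave implicit (my addition, not part of the original snippets): after each append, advance the current frame by the number of frames just written, so the next buffer's presentation timestamp lines up:

currentFrame += Int64(frames) // keep the running frame count in sync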
When all the data has been written, finalize the writer:
input.markAsFinished()
writer.finishWriting {}

Core Audio Swift Equalizer adjusts all bands at once?

I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ:
var cd: AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Effect),
    componentSubType: OSType(kAudioUnitSubType_NBandEQ),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)

// Add the node to the graph
status = AUGraphAddNode(graph, &cd, &MyAppNode)
println(status)

// Once the graph has been opened get an instance of the equalizer
status = AUGraphNodeInfo(graph, self.MyAppNode, nil, &MyAppUnit)
println(status)

var eqFrequencies: [UInt32] = [32, 250, 500, 1000, 2000, 16000]
status = AudioUnitSetProperty(
    self.MyAppUnit,
    AudioUnitPropertyID(kAUNBandEQProperty_NumberOfBands),
    AudioUnitScope(kAudioUnitScope_Global),
    0,
    eqFrequencies,
    UInt32(eqFrequencies.count * sizeof(UInt32))
)
println(status)

status = AudioUnitInitialize(self.MyAppUnit)
println(status)

var ioUnitOutputElement: AudioUnitElement = 0
var samplerOutputElement: AudioUnitElement = 0
AUGraphConnectNodeInput(graph, sourceNode, sourceOutputBusNumber, self.MyAppNode, 0)
AUGraphConnectNodeInput(graph, self.MyAppNode, 0, destinationNode, destinationInputBusNumber)
and then, to apply changes to the frequency gains, my code is as follows:
if MyAppUnit == nil { return }

var bandValue0: Float32 = tenBands.objectAtIndex(0) as! Float32
var bandValue1: Float32 = tenBands.objectAtIndex(1) as! Float32
var bandValue2: Float32 = tenBands.objectAtIndex(2) as! Float32
var bandValue3: Float32 = tenBands.objectAtIndex(3) as! Float32
var bandValue4: Float32 = tenBands.objectAtIndex(4) as! Float32
var bandValue5: Float32 = tenBands.objectAtIndex(5) as! Float32

AudioUnitSetParameter(self.MyAppUnit, 0, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue0, 0)
AudioUnitSetParameter(self.MyAppUnit, 1, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue1, 0)
AudioUnitSetParameter(self.MyAppUnit, 2, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue2, 0)
AudioUnitSetParameter(self.MyAppUnit, 3, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue3, 0)
AudioUnitSetParameter(self.MyAppUnit, 4, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue4, 0)
AudioUnitSetParameter(self.MyAppUnit, 5, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue5, 0)
Can anyone point out what I am doing wrong here? I think it is related to the second argument of AudioUnitSetParameter. I have tried AudioUnitParameterID(0) and AudioUnitParameterID(kAUNBandEQParam_Gain + 1) for this value, but those don't seem to work at all. Any help is appreciated!
Adding this as an answer because a comment is insufficient.
The following code is in Objective-C, but it should help identify your problem.
There are a number of places this might fail. Firstly, you should check the status of AudioUnitSetParameter, and indeed of all the AudioUnit calls, as this will give you a clearer picture of where your code is failing.
I've done this successfully in Objective-C and have a test app I can make available if you need it, which shows the complete graph setup and sets the bands and gains by moving a slider. Back to your specific question: the following works just fine for me, and it might help you rule out a particular section.
You can try to obtain the current gain; this will indicate whether your bands are set up correctly.
- (AudioUnitParameterValue)gainForBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterValue gain;
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
    OSStatus status = AudioUnitGetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            &gain);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"gettingParamGainErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus Error on getting EQ Gain! Status returned %d.", (int)status]
                                     userInfo:nil];
    }
    return gain;
}
Setting the gain can then be done in the following way:
- (void)setGain:(AudioUnitParameterValue)gain forBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
    OSStatus status = AudioUnitSetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            gain,
                                            0);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"settingParamGainAudioErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus Error on setting EQ gain! Status returned %d.", (int)status]
                                     userInfo:nil];
    }
}
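Since your code is in Swift, a rough Swift equivalent of the setter might look like this (a sketch in the question's style; eqUnit stands for the AudioUnit you got back from AUGraphNodeInfo, and note that the parameter ID is kAUNBandEQParam_Gain plus the band index, not the bare band index you are passing now):

func setGain(gain: AudioUnitParameterValue, forBand band: UInt32) {
    // Per-band parameters are addressed as base constant + band index.
    let parameterID = AudioUnitParameterID(kAUNBandEQParam_Gain) + band
    let status = AudioUnitSetParameter(eqUnit, parameterID,
                                       AudioUnitScope(kAudioUnitScope_Global),
                                       0, gain, 0)
    if status != noErr {
        println("Error setting EQ gain: \(status)") // check every AudioUnit call
    }
}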
Finally, what value are you trying to set? If I'm not mistaken, the valid range is -125.0 to 25.0.

Call CGPatternCreate in Swift

I'm wondering how to convert the following Objective-C code to Swift:
CGPatternRef pattern = CGPatternCreate(NULL,
                                       rect,
                                       CGAffineTransformIdentity,
                                       24,
                                       24,
                                       kCGPatternTilingConstantSpacing,
                                       true,
                                       &callbacks);
My code:
let callbacks: CGPatternCallbacks = CGPatternCallbacks(version: 0)
let pattern: CGPatternRef = CGPatternCreate(nil,
                                            rect,
                                            CGAffineTransformIdentity,
                                            24,
                                            24,
                                            kCGPatternTilingConstantSpacing,
                                            true,
                                            callbacks)
But I got an error message:
'CGPatternCallbacks' is not convertible to 'CConstPointer'
Is there any sample code for this? Thanks
Something like this:
var callbacks: CGPatternCallbacks = CGPatternCallbacks(version: 0)
var pattern = CGPatternCreate(nil,
                              rect,
                              CGAffineTransformIdentity,
                              24,
                              24,
                              kCGPatternTilingConstantSpacing,
                              true,
                              &callbacks)
This solution is a problematic one:
The pointer registered within CGPatternCallbacks (for the function that draws the pattern) should be a CFunctionPointer<(UnsafeMutablePointer<Void>, CGContext) -> Void>.
This means the function pointer would have to be transformed to an UnsafeMutablePointer, then to a COpaquePointer, then to a CFunctionPointer.
Even then I was still getting an exception on the function call. There is a simpler solution:
http://www.raywenderlich.com/90695/modern-core-graphics-with-swift-part-3
Option 1
// Global function - outside of the class
func myDrawColoredPattern(info: UnsafeMutablePointer<Void>, context: CGContextRef?) -> Void {
    // draw the pattern using context...
}

var callbacks: CGPatternCallbacks = CGPatternCallbacks(version: 0, drawPattern: myDrawColoredPattern, releaseInfo: nil)
let pattern: CGPatternRef? = CGPatternCreate(nil, rect, CGAffineTransformIdentity, 24, 24, CGPatternTiling.ConstantSpacing, true, &callbacks)
Option 2 - the 'swift' way
let drawPattern: CGPatternDrawPatternCallback = { (_, context) in
    // draw the pattern using context...
}

var callbacks = CGPatternCallbacks(version: 0, drawPattern: drawPattern, releaseInfo: nil)
let pattern: CGPatternRef? = CGPatternCreate(nil, rect, CGAffineTransformIdentity, 24, 24, CGPatternTiling.ConstantSpacing, true, &callbacks)
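To actually paint with the pattern, you then install it as the context's fill pattern. A minimal sketch (assuming context is the current CGContext, pattern was created as above, and bounds is a hypothetical rect to fill):

// Pattern fills go through a pattern color space.
let patternSpace = CGColorSpaceCreatePattern(nil)
CGContextSetFillColorSpace(context, patternSpace)

var alpha: CGFloat = 1.0
CGContextSetFillPattern(context, pattern!, &alpha)
CGContextFillRect(context, bounds) // tiles the 24x24 cell across bounds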
