Core Audio Swift Equalizer adjusts all bands at once? - iOS

I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ:
var cd: AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Effect),
    componentSubType: OSType(kAudioUnitSubType_NBandEQ),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
// Add the node to the graph
status = AUGraphAddNode(graph, &cd, &MyAppNode)
println(status)
// Once the graph has been opened get an instance of the equalizer
status = AUGraphNodeInfo(graph, self.MyAppNode, nil, &MyAppUnit)
println(status)
var eqFrequencies: [UInt32] = [ 32, 250, 500, 1000, 2000, 16000 ]
status = AudioUnitSetProperty(
    self.MyAppUnit,
    AudioUnitPropertyID(kAUNBandEQProperty_NumberOfBands),
    AudioUnitScope(kAudioUnitScope_Global),
    0,
    eqFrequencies,
    UInt32(eqFrequencies.count * sizeof(UInt32))
)
println(status)
status = AudioUnitInitialize(self.MyAppUnit)
println(status)
var ioUnitOutputElement:AudioUnitElement = 0
var samplerOutputElement:AudioUnitElement = 0
AUGraphConnectNodeInput(graph, sourceNode, sourceOutputBusNumber, self.MyAppNode, 0)
AUGraphConnectNodeInput(graph, self.MyAppNode, 0, destinationNode, destinationInputBusNumber)
And then, to apply changes to the frequency gains, my code is as follows:
if (MyAppUnit == nil) { return }
else {
    var bandValue0: Float32 = tenBands.objectAtIndex(0) as! Float32
    var bandValue1: Float32 = tenBands.objectAtIndex(1) as! Float32
    var bandValue2: Float32 = tenBands.objectAtIndex(2) as! Float32
    var bandValue3: Float32 = tenBands.objectAtIndex(3) as! Float32
    var bandValue4: Float32 = tenBands.objectAtIndex(4) as! Float32
    var bandValue5: Float32 = tenBands.objectAtIndex(5) as! Float32
    AudioUnitSetParameter(self.MyAppUnit, 0, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue0, 0);
    AudioUnitSetParameter(self.MyAppUnit, 1, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue1, 0);
    AudioUnitSetParameter(self.MyAppUnit, 2, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue2, 0);
    AudioUnitSetParameter(self.MyAppUnit, 3, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue3, 0);
    AudioUnitSetParameter(self.MyAppUnit, 4, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue4, 0);
    AudioUnitSetParameter(self.MyAppUnit, 5, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue5, 0);
}
Can anyone point out what I am doing wrong here? I think it is related to the second argument of AudioUnitSetParameter. I have tried AudioUnitParameterID(0) and AudioUnitParameterID(kAUNBandEQParam_Gain + 1) for this value, but those don't seem to work at all. Any help is appreciated!

Comment added as an answer because comments are insufficient.
The following code is in Objective-C, but it should help you identify your problem.
There are a number of places this might fail. Firstly, you should check the status of AudioUnitSetParameter, and indeed of all the AudioUnit calls, as this will give you a clearer idea of where your code is failing.
I've done this successfully in Objective-C and have a test app I can make available if you need it, which shows the complete graph setup and setting the bands and gains by moving a slider ... but back to your specific question. The following works just fine for me; it might help you rule out a particular section.
You can try to obtain the current "gain"; this will indicate whether your bands are set up correctly.
- (AudioUnitParameterValue)gainForBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterValue gain;
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
    OSStatus status = AudioUnitGetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            &gain);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"gettingParamGainErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus error on getting EQ gain! (Status returned %d.)", (int)status]
                                     userInfo:nil];
    }
    return gain;
}
Then setting the gain can be done in the following way:
- (void)setGain:(AudioUnitParameterValue)gain forBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
    OSStatus status = AudioUnitSetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            gain,
                                            0);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"settingParamGainAudioErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus error on setting EQ gain! (Status returned %d.)", (int)status]
                                     userInfo:nil];
    }
}
Finally, what value are you trying to set? The valid range (if I'm not mistaken) is -125.0 to 25.0.
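Since the question is in Swift, here is a rough Swift translation of the same setter (a minimal sketch, assuming an equalizerUnit: AudioUnit that has already been created and initialized, matching the Objective-C above; error handling is reduced to a print):
func setGain(gain: AudioUnitParameterValue, forBandAtPosition bandPosition: UInt32) {
    // Band N's gain parameter ID is the base kAUNBandEQParam_Gain plus the band index.
    let parameterID = AudioUnitParameterID(kAUNBandEQParam_Gain) + bandPosition
    let status = AudioUnitSetParameter(equalizerUnit,
                                       parameterID,
                                       AudioUnitScope(kAudioUnitScope_Global),
                                       0,     // element
                                       gain,  // gain in dB
                                       0)     // buffer offset in frames
    if status != noErr {
        print("OSStatus error on setting EQ gain: \(status)")
    }
}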

Related

How to set an Instrument in Apple MIDI?

I followed the two posts below to play back musical notes using Apple MIDI, but the sound that comes out does not sound like a note being played; it is just the same frequency from a generator. Basically, I don't hear an instrument playing the note, just a generator producing the note's frequency.
Posts:
How do you change the music instrument for a MusicTrack?
Play musical notes in Swift Playground
My Code:
func playMusic() {
    var sequence: MusicSequence? = nil
    var musicSequence = NewMusicSequence(&sequence)
    var track: MusicTrack? = nil
    var musicTrack = MusicSequenceNewTrack(sequence!, &track)

    // Adding notes
    var time = MusicTimeStamp(0.0)
    for notee in TestOne().notes {
        var number = freqsScale[notee.frequency] ?? 0
        print(number)
        print(notee.distance)
        number += 11
        var note = MIDINoteMessage(channel: 0,
                                   note: UInt8(number),
                                   velocity: 64,
                                   releaseVelocity: 0,
                                   duration: notee.distance)
        musicTrack = MusicTrackNewMIDINoteEvent(track!, time, &note)
        time += Double(notee.distance)
    }

    var inMessage = MIDIChannelMessage(status: 0xB0, data1: 120, data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track!, 0, &inMessage)
    inMessage = MIDIChannelMessage(status: 0xC0, data1: 48, data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track!, 0, &inMessage)

    // Creating a player
    var musicPlayer: MusicPlayer? = nil
    var player = NewMusicPlayer(&musicPlayer)
    player = MusicPlayerSetSequence(musicPlayer!, sequence)
    player = MusicPlayerStart(musicPlayer!)
}
What am I doing wrong?
What is the correct way to set an instrument?
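For reference while debugging (a hedged note, not a verified fix for this exact code): in raw MIDI terms, the instrument is selected by a program-change message, whose status byte is 0xC0 plus the channel and whose data1 is the General MIDI program number; status 0xB0 with data1 = 120 is the "All Sound Off" controller. A minimal sketch of placing the program change at the very start of the track, before any note events, using the same API as above:
var programChange = MIDIChannelMessage(status: 0xC0, data1: 48, data2: 0, reserved: 0)
// Timestamp 0 puts the instrument selection before the first note-on.
MusicTrackNewMIDIChannelEvent(track!, 0, &programChange)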

Metal Depth Clamping

I want to disable clamping between far and close points. I already tried modifying the sampler to disable clamp-to-edge (constexpr sampler s(address::clamp_to_zero)), and it worked as expected for the edges, but coordinates between the farthest and closest points are still clamping.
Current unwanted result:
https://gph.is/g/ZyWjkzW
Expected result:
https://i.imgur.com/GjvwgyU.png
I also tried encoder.setDepthClipMode(.clip), but it didn't work.
Some portions of the code:
let descriptor = MTLRenderPipelineDescriptor()
descriptor.colorAttachments[0].pixelFormat = .rgba16Float
descriptor.colorAttachments[1].pixelFormat = .rgba16Float
descriptor.depthAttachmentPixelFormat = .invalid
let descriptor = MTLRenderPassDescriptor()
descriptor.colorAttachments[0].texture = outputColorTexture
descriptor.colorAttachments[0].clearColor = clearColor
descriptor.colorAttachments[0].loadAction = .load
descriptor.colorAttachments[0].storeAction = .store
descriptor.colorAttachments[1].texture = outputDepthTexture
descriptor.colorAttachments[1].clearColor = clearColor
descriptor.colorAttachments[1].loadAction = .load
descriptor.colorAttachments[1].storeAction = .store
descriptor.renderTargetWidth = Int(drawableSize.width)
descriptor.renderTargetHeight = Int(drawableSize.height)
guard let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor) else { throw RenderingError.makeDescriptorFailed }
encoder.setDepthClipMode(.clip)
encoder.setRenderPipelineState(pipelineState)
encoder.setFragmentTexture(inputColorTexture, index: 0)
encoder.setFragmentTexture(inputDepthTexture, index: 1)
encoder.setFragmentBuffer(uniformsBuffer, offset: 0, index: 0)
encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
encoder.endEncoding()
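One hedged observation on the snippet above (an assumption about intent, not a confirmed fix): .clip is already Metal's default depth-clip mode, and it is .clamp that disables clipping against the near/far planes. Also, because depthAttachmentPixelFormat is .invalid and depth is carried in a color attachment here, setDepthClipMode only affects rasterizer clipping, not the values the fragment shader writes into that texture:
encoder.setDepthClipMode(.clamp) // .clip is the default; .clamp disables near/far clipping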

How to generate audio file from Hex/Binary(raw data) value in iOS?

I am working on a BLE project where audio recorder hardware continuously streams data to an iOS application. On the iOS side, I need to read the transferred data.
The hardware sends hex data to the iOS application, and we need to create an .mp3/.wav file from it.
Does anyone have an idea how to create an audio file from binary/hex input data?
Note: I have to use the raw (hex) data to create the audio file.
Thanks
It's unclear from your question how the data is coming in, but I'm going to assume at this point that you periodically have a Data of linear PCM samples as signed integers that you want to append. If it's some other format, then you'll have to change the settings. This is all just general-purpose stuff; you will almost certainly have to modify it to your specific problem.
(Much of this code is based on Create a silent audio CMSampleBufferRef)
First you need a writer:
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .wav)
Then you need to know how your data is formatted (this is quietly assuming that the data is a multiple of the frame size; if this isn't true, you'll need to keep track of the partial frames, as in the sketch after this block):
let numChannels = 1
let sampleRate = 44100
let bytesPerFrame = MemoryLayout<Int16>.size * numChannels
let frames = data.count / bytesPerFrame
let duration = Double(frames) / Double(sampleRate)
let blockSize = frames * bytesPerFrame
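A minimal sketch of that partial-frame bookkeeping (the pending buffer and helper are my own illustration, not part of the original flow):
var pending = Data() // bytes left over from the previous delivery
func extractWholeFrames(appending incoming: Data) -> Data {
    pending.append(incoming)
    // Only hand back a whole number of frames; keep the remainder for next time.
    let usable = (pending.count / bytesPerFrame) * bytesPerFrame
    let whole = pending.prefix(usable)
    pending.removeFirst(usable)
    return Data(whole)
}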
Then you need to know what the current frame is. This will update over time.
var currentFrame: Int64 = 0
Now you need a description of your data:
var asbd = AudioStreamBasicDescription(
mSampleRate: Float64(sampleRate),
mFormatID: kAudioFormatLinearPCM,
mFormatFlags: kLinearPCMFormatFlagIsSignedInteger,
mBytesPerPacket: UInt32(bytesPerFrame),
mFramesPerPacket: 1,
mBytesPerFrame: UInt32(bytesPerFrame),
mChannelsPerFrame: UInt32(numChannels),
mBitsPerChannel: UInt32(MemoryLayout<Int16>.size*8),
mReserved: 0
)
var formatDesc: CMAudioFormatDescription?
var status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &asbd, 0, nil, 0, nil, nil, &formatDesc)
assert(status == noErr)
Then create your writer input and add it to the writer:
let settings: [String: Any] = [AVFormatIDKey: kAudioFormatLinearPCM,
                               AVNumberOfChannelsKey: numChannels,
                               AVSampleRateKey: sampleRate]
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: settings, sourceFormatHint: formatDesc)
writer.add(input)
That's all the one-time setup. Now it's time to start the writer:
writer.startWriting()
writer.startSession(atSourceTime: kCMTimeZero)
If all your data is the same size, you can create a reusable buffer (or you can create a new one each time):
var block: CMBlockBuffer?
status = CMBlockBufferCreateWithMemoryBlock(
kCFAllocatorDefault,
nil,
blockSize, // blockLength
nil, // blockAllocator
nil, // customBlockSource
0, // offsetToData
blockSize, // dataLength
0, // flags
&block
)
assert(status == kCMBlockBufferNoErr)
When data comes in, copy it into the buffer:
status = CMBlockBufferReplaceDataBytes(&inputData, block!, 0, blockSize)
assert(status == kCMBlockBufferNoErr)
Now create a sample buffer from the buffer and append it to the writer input:
var sampleBuffer: CMSampleBuffer?
status = CMAudioSampleBufferCreateReadyWithPacketDescriptions(
kCFAllocatorDefault,
block, // dataBuffer
formatDesc!,
frames, // numSamples
CMTimeMake(currentFrame, Int32(sampleRate)), // sbufPTS
nil, // packetDescriptions
&sampleBuffer
)
assert(status == noErr)
input.append(sampleBuffer!)
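One step the walkthrough relies on but never shows explicitly: after a successful append, advance the running frame counter so the next buffer gets the correct presentation time:
currentFrame += Int64(frames)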
When everything is finished, finalize the writer and you're done:
input.markAsFinished()
writer.finishWriting{}
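Note that finishWriting is asynchronous: the file is only guaranteed to be complete once the completion handler runs, so any follow-up work belongs inside that closure rather than immediately after the call.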

CVMetalTextureCacheCreateTextureFromImage always return null

I'm trying to render I420 (planar YCbCr) via MetalKit.
Most examples use a CMSampleBuffer that comes from the camera, but my goal is to use given I420 bytes.
I do something like this:
let data = NSMutableData(contentsOfURL: NSBundle.mainBundle().URLForResource("yuv_640_360", withExtension: "yuv")!)
// Cache for Y
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, self.device!, nil, &videoTextureCache)
var pixelBuffer: CVPixelBuffer?
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, Int(size.width), Int(size.height),
    kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, data.mutableBytes, Int(size.width), nil, nil,
    [
        "kCVPixelBufferMetalCompatibilityKey": true,
        "kCVPixelBufferOpenGLCompatibilityKey": true,
        "kCVPixelBufferIOSurfacePropertiesKey": []
    ],
    &pixelBuffer)
// Y texture
var yTextureRef : Unmanaged<CVMetalTexture>?
let yWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
let yHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, (videoTextureCache?.takeUnretainedValue())!, pixelBuffer, nil, MTLPixelFormat.R8Unorm, yWidth, yHeight, 0, &yTextureRef);
Basically the code is almost the same as in the other examples, but I create my own CVPixelBuffer myself.
I get no error when creating the CVPixelBuffer and the CVMetalTexture, but yTexture is always null.
How do I create the right CVPixelBuffer and use it to render?
Problem solved.
The IOSurface is important. I found that the IOSurface is always null if you create the CVPixelBuffer with CVPixelBufferCreateWithBytes or CVPixelBufferCreateWithPlanarBytes. Once you use a CVPixelBuffer whose IOSurface is null, the Metal texture will always be null as well.
You should do something like this instead:
let result = CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_420YpCbCr8Planar,
    [
        String(kCVPixelBufferIOSurfacePropertiesKey): [
            "IOSurfaceOpenGLESFBOCompatibility": true,
            "IOSurfaceOpenGLESTextureCompatibility": true,
            "IOSurfaceCoreAnimationCompatibility": true,
        ]
    ], &self.pixelBuffer)

CVPixelBufferLockBaseAddress(self.pixelBuffer!, 0)
for index in 0...2 {
    memcpy(CVPixelBufferGetBaseAddressOfPlane(self.pixelBuffer!, index),
           planesAddress[index],
           planesWidth[index] * planesHeight[index])
}
CVPixelBufferUnlockBaseAddress(self.pixelBuffer!, 0)
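Since the whole fix hinges on the buffer being IOSurface-backed, a quick sanity check is possible (a sketch; CVPixelBufferGetIOSurface is the relevant query):
if CVPixelBufferGetIOSurface(self.pixelBuffer) == nil {
    print("Pixel buffer has no IOSurface; the Metal texture will come back null")
}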

Swift glGetProgramInfoLog

Does anyone know how to implement this in Swift? The entire function call is:
glGetProgramInfoLog(
<#program: GLuint#>,
<#bufsize: GLsizei#>,
<#length: UnsafeMutablePointer<GLsizei>#>,
<#infolog: UnsafeMutablePointer<GLchar>#>)
I understand passing the pointers but not the buffer sizes. Android doesn't even have these parameters at all.
For anyone looking for an answer, you can use the following code, where program is let program = glCreateProgram().
Swift 2
var message = [CChar](count: 256, repeatedValue: CChar(0))
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
print(String(UTF8String: message))
Swift 3
var message = [CChar](repeating: CChar(0), count: 256)
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
var s = String.init(utf8String: message)!
if s.characters.count > 0 { print("Shader compile log: \(s)") } // only prints if the log isn't empty
Try this, which queries the actual log length before sizing the buffer:
var length = GLsizei(0)
glGetProgramiv(yourProgram, GLenum(GL_INFO_LOG_LENGTH), &length)
var log = [GLchar](count: Int(max(length, 1)), repeatedValue: 0)
glGetProgramInfoLog(yourProgram, length, &length, &log)
NSLog("Program info log: \n%@", String(UTF8String: log) ?? "")
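And the same dynamic-length approach in Swift 3 syntax, mirroring the Swift 2 / Swift 3 split above (a sketch):
var length = GLsizei(0)
glGetProgramiv(yourProgram, GLenum(GL_INFO_LOG_LENGTH), &length)
var log = [GLchar](repeating: 0, count: Int(max(length, 1)))
glGetProgramInfoLog(yourProgram, length, &length, &log)
print("Program info log: \(String(cString: log))")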
