I'm wondering how to convert the following Objective-C call to Swift:
CGPatternRef pattern = CGPatternCreate(NULL,
rect,
CGAffineTransformIdentity,
24,
24,
kCGPatternTilingConstantSpacing,
true,
&callbacks);
My code:
let callbacks : CGPatternCallbacks = CGPatternCallbacks(version: 0)
let pattern : CGPatternRef = CGPatternCreate(nil,
rect,
CGAffineTransformIdentity,
24,
24,
kCGPatternTilingConstantSpacing,
true,
callbacks)
But I got an error message:
'CGPatternCallbacks' is not convertible to 'CConstPointer'
Is there any sample code for this? Thanks
Something like this:
var callbacks : CGPatternCallbacks = CGPatternCallbacks(version: 0)
var pattern = CGPatternCreate(nil,
rect,
CGAffineTransformIdentity,
24,
24,
kCGPatternTilingConstantSpacing,
true,
&callbacks)
This solution is problematic:
The function pointer registered within CGPatternCallbacks (the one that draws the pattern) must be a CFunctionPointer<(UnsafeMutablePointer<Void>, CGContext) -> Void>.
This means the drawing function has to be converted to an UnsafeMutablePointer, then to a COpaquePointer, and finally to a CFunctionPointer.
Even then I was still getting an exception on the call. There is a simpler solution:
http://www.raywenderlich.com/90695/modern-core-graphics-with-swift-part-3
Option 1
//global function - outside of the class
func myDrawColoredPattern(info: UnsafeMutablePointer<Void>, context: CGContextRef?) -> Void {
//draw pattern using context....
}
var callbacks : CGPatternCallbacks = CGPatternCallbacks(version: 0, drawPattern: myDrawColoredPattern, releaseInfo: nil)
let pattern: CGPatternRef? = CGPatternCreate(nil, rect, CGAffineTransformIdentity, 24, 24, CGPatternTiling.ConstantSpacing, true, &callbacks)
Option 2 - the 'Swift' way
let drawPattern: CGPatternDrawPatternCallback = { (_, context) in
//draw pattern using context...
}
var callbacks = CGPatternCallbacks(version: 0, drawPattern: drawPattern, releaseInfo: nil)
let pattern: CGPatternRef? = CGPatternCreate(nil, rect, CGAffineTransformIdentity, 24, 24, CGPatternTiling.ConstantSpacing, true, &callbacks)
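For readers on current Swift (3 and later), the same call became the CGPattern initializer. A minimal sketch, assuming a 24×24 tile and some hypothetical drawing code:

```swift
import CoreGraphics

let drawPattern: CGPatternDrawPatternCallback = { _, context in
    // Hypothetical pattern cell: fill a small square in each tile.
    context.addRect(CGRect(x: 0, y: 0, width: 12, height: 12))
    context.fillPath()
}

var callbacks = CGPatternCallbacks(version: 0,
                                   drawPattern: drawPattern,
                                   releaseInfo: nil)

let rect = CGRect(x: 0, y: 0, width: 24, height: 24) // assumed tile bounds
let pattern = CGPattern(info: nil,
                        bounds: rect,
                        matrix: .identity,
                        xStep: 24, yStep: 24,
                        tiling: .constantSpacing,
                        isColored: true,
                        callbacks: &callbacks)
```

The initializer is failable, so `pattern` is a CGPattern? that you unwrap before handing it to CGContext.setFillPattern.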
The Swift syntax has changed over the years, and this code that used to work perfectly no longer compiles:
var zerosR = [Float](count: windowSizeOverTwo, repeatedValue: 0.0)
var zerosI = [Float](count: windowSizeOverTwo, repeatedValue: 0.0)
var cplxData = DSPSplitComplex( realp: &zerosR, imagp: &zerosI )
let xAsComplex = UnsafePointer<DSPComplex>( inputSignal.withUnsafeBufferPointer { $0.baseAddress } )
vDSP_ctoz( xAsComplex, 2, &cplxData, 1, vDSP_Length(windowSizeOverTwo) )
vDSP_fft_zrip( setup, &cplxData, 1, log2n, FFTDirection(kFFTDirection_Forward) )
Every line of this code shows an error under Swift 4
I was able to convert everything except for this line
let xAsComplex = UnsafePointer<DSPComplex>( inputSignal.withUnsafeBufferPointer { $0.baseAddress } )
that does not compile with this error
Cannot convert value of type 'UnsafePointer?' to expected
argument type 'UnsafePointer<_>?'
The pointer to the storage of Float elements in the inputSignal array must be rebound to point to an array of DSPComplex values:
let inputSignal: [Float] = ...
inputSignal.withUnsafeBufferPointer {
floatPtr in
floatPtr.withMemoryRebound(to: DSPComplex.self) {
cmplxPtr in
vDSP_ctoz(cmplxPtr.baseAddress!, 2, &cplxData, 1, vDSP_Length(windowSizeOverTwo) )
}
}
See also UnsafeRawPointer Migration for more information.
Note that those pointers are only valid during the execution of the closure, and must not be passed to the outside. What you did in
let xAsComplex = UnsafePointer<DSPComplex>( inputSignal.withUnsafeBufferPointer { $0.baseAddress } )
was actually relying on undefined behavior.
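A safe variant keeps all pointer use inside the closure and lets only plain values escape; an illustrative sketch:

```swift
let inputSignal: [Float] = [1, 2, 3, 4]

// Safe: the buffer pointer exists only for the duration of the closure,
// and only an ordinary Float value escapes it.
let sum = inputSignal.withUnsafeBufferPointer { buffer -> Float in
    buffer.reduce(0, +)
}
// sum == 10
```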
I'm trying to render I420 (YCbCr planar) via MetalKit.
Most examples use a CMSampleBuffer coming from the camera,
but my goal is to use raw I420 bytes that I already have.
I do something like this:
let data = NSMutableData(contentsOfURL: NSBundle.mainBundle().URLForResource("yuv_640_360", withExtension: "yuv")!)
// Cache for Y
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, self.device!, nil, &videoTextureCache)
var pixelBuffer: CVPixelBuffer?
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, Int(size.width), Int(size.height), kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, data.mutableBytes, Int(size.width), nil, nil, [
"kCVPixelBufferMetalCompatibilityKey": true,
"kCVPixelBufferOpenGLCompatibilityKey": true,
"kCVPixelBufferIOSurfacePropertiesKey": []
]
, &pixelBuffer)
// Y texture
var yTextureRef : Unmanaged<CVMetalTexture>?
let yWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
let yHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, (videoTextureCache?.takeUnretainedValue())!, pixelBuffer, nil, MTLPixelFormat.R8Unorm, yWidth, yHeight, 0, &yTextureRef);
Basically the code is almost the same as in other examples, except that I create the CVPixelBuffer myself.
I get no error when creating the CVPixelBuffer and the CVMetalTexture,
but the Y texture is always null.
How do I create the right CVPixelBuffer and use it to render?
Problem solved.
The IOSurface is the important part: I found that the IOSurface is always null if you create the CVPixelBuffer with CVPixelBufferCreateWithBytes or CVPixelBufferCreateWithPlanarBytes.
Once you use a CVPixelBuffer whose IOSurface is null, the MetalTexture will always be null as well.
You should do something like this instead:
let result = CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_420YpCbCr8Planar, [
String(kCVPixelBufferIOSurfacePropertiesKey): [
"IOSurfaceOpenGLESFBOCompatibility": true,
"IOSurfaceOpenGLESTextureCompatibility": true,
"IOSurfaceCoreAnimationCompatibility": true,
]
], &self.pixelBuffer)
CVPixelBufferLockBaseAddress(self.pixelBuffer!, 0)
for index in 0...2 {
memcpy(CVPixelBufferGetBaseAddressOfPlane(self.pixelBuffer!, index), planesAddress[index], planesWidth[index] * planesHeight[index])
}
CVPixelBufferUnlockBaseAddress(self.pixelBuffer!, 0)
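To confirm the fix, you can check the IOSurface right after creating the buffer. A minimal sketch in current Swift syntax (the dimensions and attribute set are assumptions; CVPixelBufferGetIOSurface is public on iOS 11 / macOS 10.13 and later):

```swift
import CoreVideo

var pixelBuffer: CVPixelBuffer?
let attrs: [String: Any] = [
    kCVPixelBufferIOSurfacePropertiesKey as String: [:],
    kCVPixelBufferMetalCompatibilityKey as String: true
]
let status = CVPixelBufferCreate(kCFAllocatorDefault, 640, 360,
                                 kCVPixelFormatType_420YpCbCr8Planar,
                                 attrs as CFDictionary, &pixelBuffer)
if status == kCVReturnSuccess, let buffer = pixelBuffer {
    // Non-nil means the buffer is IOSurface-backed and can feed a Metal texture.
    print("IOSurface-backed:", CVPixelBufferGetIOSurface(buffer) != nil)
}
```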
I'm working on a project that uses CoreText; I need to initialize a CTRunDelegateCallbacks:
var imageCallback = CTRunDelegateCallbacks(version: kCTRunDelegateCurrentVersion, dealloc: { (refCon) -> Void in
print("RunDelegate dealloc")
}, getAscent: { ( refCon) -> CGFloat in
return 0
}, getDescent: { (refCon) -> CGFloat in
return 0
}) { (refCon) -> CGFloat in
return 0
}
The refCon parameter is of type UnsafeMutablePointer<Void>, which is the void * type in C. I want to get the pointer's raw value. How do I do that?
I do not recommend converting the pointer to a non-pointer type. I'll show you how to do it at the end of this answer, but first I'll show you how you should really handle refCon.
Create a struct to hold whatever information you need to pass through to the callbacks. Example:
struct MyRunExtent {
let ascent: CGFloat
let descent: CGFloat
let width: CGFloat
}
Then create an UnsafeMutablePointer using its alloc class method, and initialize the allocated storage:
let extentBuffer = UnsafeMutablePointer<MyRunExtent>.alloc(1)
extentBuffer.initialize(MyRunExtent(ascent: 12, descent: 4, width: 10))
In your callbacks, convert the pointer argument to an UnsafePointer<MyRunExtent> and pull what you need out of its memory:
var callbacks = CTRunDelegateCallbacks(version: kCTRunDelegateVersion1, dealloc: { pointer in
pointer.dealloc(1)
}, getAscent: { pointer in
return UnsafePointer<MyRunExtent>(pointer).memory.ascent
}, getDescent: { pointer in
return UnsafePointer<MyRunExtent>(pointer).memory.descent
}, getWidth: { pointer in
return UnsafePointer<MyRunExtent>(pointer).memory.width
})
Now you can create your delegate, using callbacks and extentBuffer:
let delegate = CTRunDelegateCreate(&callbacks, extentBuffer)!
Here's a test:
let richText = NSMutableAttributedString(string: "hello \u{FFFC} world")
richText.addAttribute(kCTRunDelegateAttributeName as String, value: delegate, range: NSMakeRange(6, 1))
let line = CTLineCreateWithAttributedString(richText)
let runs = (CTLineGetGlyphRuns(line) as [AnyObject]).map { $0 as! CTRun }
runs.forEach {
var ascent = CGFloat(0), descent = CGFloat(0), leading = CGFloat(0)
let width = CTRunGetTypographicBounds($0, CFRangeMake(0, 0), &ascent, &descent, &leading)
print("width:\(width) ascent:\(ascent) descent:\(descent) leading:\(leading)")
}
The output (in a playground):
2015-12-21 12:26:00.505 iOS Playground[17525:8055669] -[__NSCFType encodeWithCoder:]: unrecognized selector sent to instance 0x7f94bcb01dc0
width:28.6875 ascent:9.240234375 descent:2.759765625 leading:0.0
width:10.0 ascent:12.0 descent:4.0 leading:0.0
width:32.009765625 ascent:9.240234375 descent:2.759765625 leading:0.0
The first line of output is because the playground execution process can't encode the delegate to send back to Xcode for display, and turns out to be harmless. Anyway, you can see that the bounds of the middle run were computed using my callbacks and the content of my extentBuffer.
And now, the moment you've been waiting for…
You can get the pointer's “raw value” this way, if the pointer and an Int are the same size on the running system:
let rawValue = unsafeBitCast(refCon, Int.self)
If they're different sizes, you'll get a fatal error at runtime.
You could cast it to a CGFloat this way, if the pointer and a CGFloat are the same size:
let float = unsafeBitCast(refCon, CGFloat.self)
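For readers on current Swift (4 and later), the whole round trip looks roughly like this; a sketch reusing the MyRunExtent struct from above:

```swift
import CoreText

let extentBuffer = UnsafeMutablePointer<MyRunExtent>.allocate(capacity: 1)
extentBuffer.initialize(to: MyRunExtent(ascent: 12, descent: 4, width: 10))

var callbacks = CTRunDelegateCallbacks(
    version: kCTRunDelegateVersion1,
    dealloc: { pointer in
        // The callback receives an UnsafeMutableRawPointer; rebind it to the struct type.
        let typed = pointer.assumingMemoryBound(to: MyRunExtent.self)
        typed.deinitialize(count: 1)
        typed.deallocate()
    },
    getAscent: { $0.assumingMemoryBound(to: MyRunExtent.self).pointee.ascent },
    getDescent: { $0.assumingMemoryBound(to: MyRunExtent.self).pointee.descent },
    getWidth: { $0.assumingMemoryBound(to: MyRunExtent.self).pointee.width })

// If you really want the pointer's numeric value, prefer this over unsafeBitCast:
let rawValue = Int(bitPattern: extentBuffer)
```

Int(bitPattern:) is defined for any pointer width, so it avoids the size-mismatch trap of unsafeBitCast.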
I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ:
var cd:AudioComponentDescription = AudioComponentDescription(componentType: OSType(kAudioUnitType_Effect),componentSubType: OSType(kAudioUnitSubType_NBandEQ),componentManufacturer: OSType(kAudioUnitManufacturer_Apple),componentFlags: 0,componentFlagsMask: 0)
// Add the node to the graph
status = AUGraphAddNode(graph, &cd, &MyAppNode)
println(status)
// Once the graph has been opened get an instance of the equalizer
status = AUGraphNodeInfo(graph, self.MyAppNode, nil, &MyAppUnit)
println(status)
var eqFrequencies: [UInt32] = [ 32, 250, 500, 1000, 2000, 16000 ]
status = AudioUnitSetProperty(
self.MyAppUnit,
AudioUnitPropertyID(kAUNBandEQProperty_NumberOfBands),
AudioUnitScope(kAudioUnitScope_Global),
0,
eqFrequencies,
UInt32(eqFrequencies.count*sizeof(UInt32))
)
println(status)
status = AudioUnitInitialize(self.MyAppUnit)
println(status)
var ioUnitOutputElement:AudioUnitElement = 0
var samplerOutputElement:AudioUnitElement = 0
AUGraphConnectNodeInput(graph, sourceNode, sourceOutputBusNumber, self.MyAppNode, 0)
AUGraphConnectNodeInput(graph, self.MyAppNode, 0, destinationNode, destinationInputBusNumber)
and then to apply changes in the frequency gains my code is as follows:
if (MyAppUnit == nil) {return}
else{
var bandValue0 :Float32 = tenBands.objectAtIndex(0) as! Float32
var bandValue1 :Float32 = tenBands.objectAtIndex(1) as! Float32
var bandValue2 :Float32 = tenBands.objectAtIndex(2) as! Float32
var bandValue3 :Float32 = tenBands.objectAtIndex(3) as! Float32
var bandValue4 :Float32 = tenBands.objectAtIndex(4) as! Float32
var bandValue5 :Float32 = tenBands.objectAtIndex(5) as! Float32
AudioUnitSetParameter(self.MyAppUnit, 0, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue0, 0);
AudioUnitSetParameter(self.MyAppUnit, 1, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue1, 0);
AudioUnitSetParameter(self.MyAppUnit, 2, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue2, 0);
AudioUnitSetParameter(self.MyAppUnit, 3, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue3, 0);
AudioUnitSetParameter(self.MyAppUnit, 4, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue4, 0);
AudioUnitSetParameter(self.MyAppUnit, 5, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue5, 0);
}
Can anyone point out what I am doing wrong here? I think it is related to the second variable in AudioUnitSetParameter. I have tried AudioUnitParameterID(0), and AudioUnitParameterID(kAUNBandEQParam_Gain + 1) for this Value but those don't seem to work at all. Any help is appreciated!
Adding this as an answer because a comment is insufficient.
The following code is in Objective-C, but it should help identify your problem.
There are a number of places this might fail. Firstly, you should check the status of AudioUnitSetParameter, and indeed of all the AudioUnit calls, as this will give you a clearer picture of where your code is failing.
I've done this successfully in Objective-C and have a test app I can make available if you need it, which shows the complete graph setup and sets the bands and gains by moving a slider. Back to your specific question: the following works just fine for me and might help you rule out a particular section.
You can try to read back the current gain; this will indicate whether your bands are set up correctly.
- (AudioUnitParameterValue)gainForBandAtPosition:(uint)bandPosition
{
AudioUnitParameterValue gain;
AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
OSStatus status = AudioUnitGetParameter(equalizerUnit,
parameterID,
kAudioUnitScope_Global,
0,
&gain);
if (status != noErr) {
@throw [NSException exceptionWithName:@"gettingParamGainErrorException"
reason:[NSString stringWithFormat:@"OSStatus Error on getting EQ Gain! Status returned %d).", (int)status]
userInfo:nil];
}
return gain;
}
Setting the gain can then be done in the following way:
- (void)setGain:(AudioUnitParameterValue)gain forBandAtPosition:(uint)bandPosition
{
AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
OSStatus status = AudioUnitSetParameter(equalizerUnit,
parameterID,
kAudioUnitScope_Global,
0,
gain,
0);
if (status != noErr) {
@throw [NSException exceptionWithName:@"settingParamGainAudioErrorException"
reason:[NSString stringWithFormat:@"OSStatus Error on setting EQ gain! Status returned %d).", (int)status]
userInfo:nil];
}
}
Finally, check what value you are trying to set; if I'm not mistaken, the valid range is -125.0 to 25.0.
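For completeness, a Swift translation of the two Objective-C helpers above might look like this (an untested sketch; equalizerUnit is assumed to be the AudioUnit you obtained from AUGraphNodeInfo):

```swift
import AudioToolbox

// Read back the gain for one band; nil means the call failed.
func gain(forBandAt position: UInt32, of equalizerUnit: AudioUnit) -> AudioUnitParameterValue? {
    var gain: AudioUnitParameterValue = 0
    let parameterID = AudioUnitParameterID(kAUNBandEQParam_Gain) + position
    let status = AudioUnitGetParameter(equalizerUnit, parameterID,
                                       AudioUnitScope(kAudioUnitScope_Global), 0, &gain)
    return status == noErr ? gain : nil
}

// Set the gain for one band; returns false if the call failed.
func setGain(_ gain: AudioUnitParameterValue,
             forBandAt position: UInt32,
             of equalizerUnit: AudioUnit) -> Bool {
    let parameterID = AudioUnitParameterID(kAUNBandEQParam_Gain) + position
    return AudioUnitSetParameter(equalizerUnit, parameterID,
                                 AudioUnitScope(kAudioUnitScope_Global), 0, gain, 0) == noErr
}
```

Checking the returned OSStatus after every call, as the Objective-C version does, is the quickest way to find where the setup goes wrong.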
Does anyone know how to implement this in swift? The entire function call is
glGetProgramInfoLog(
<#program: GLuint#>,
<#bufsize: GLsizei#>,
<#length: UnsafeMutablePointer<GLsizei>#>,
<#infolog: UnsafeMutablePointer<GLchar>#>)
I understand passing the pointers but not the buffer sizes. Android doesn't even have these parameters at all.
For anyone looking for an answer, you can use the following code, where program is let program = glCreateProgram().
Swift 2
var message = [CChar](count: 256, repeatedValue: CChar(0))
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
print(String(UTF8String: message))
Swift 3
var message = [CChar](repeating: CChar(0), count: 256)
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
var s = String(utf8String: message)!
if s.characters.count > 0 { print("Shader compile log: \(s)") } // only prints if the log isn't empty
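If you'd rather not guess at a 256-byte buffer, you can ask OpenGL for the actual log length first. A sketch in Swift 3 syntax (program is assumed to be your linked program object):

```swift
import OpenGLES // use `import OpenGL` on macOS

var logLength = GLint(0)
glGetProgramiv(program, GLenum(GL_INFO_LOG_LENGTH), &logLength)
if logLength > 0 {
    // Allocate exactly as much space as the driver reports, then read the log.
    var log = [GLchar](repeating: 0, count: Int(logLength))
    glGetProgramInfoLog(program, GLsizei(logLength), nil, &log)
    print(String(cString: log))
}
```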
Try this:
var length = GLint(0)
glGetShaderiv(yourShader, GLenum(GL_INFO_LOG_LENGTH), &length) // ask GL how big the log is
var log = [GLchar](count: Int(max(length, 1)), repeatedValue: 0)
glGetShaderInfoLog(yourShader, GLsizei(length), &length, &log)
NSLog("Shader compile log:\n%@", String(UTF8String: log) ?? "")