Swift glGetProgramInfoLog - ios

Does anyone know how to implement this in Swift? The entire function call is
glGetProgramInfoLog(
<#program: GLuint#>,
<#bufsize: GLsizei#>,
<#length: UnsafeMutablePointer<GLsizei>#>,
<#infolog: UnsafeMutablePointer<GLchar>#>)
I understand passing the pointers, but not the buffer sizes. The Android version doesn't even have these parameters.

For anyone looking for an answer, you can use the following code, where program is the result of let program = glCreateProgram():
Swift 2
var message = [CChar](count: 256, repeatedValue: CChar(0))
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
print(String(UTF8String: message))
Swift 3
var message = [CChar](repeating: CChar(0), count: 256)
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
let s = String(cString: message)
if !s.isEmpty { print("Shader compile log: \(s)") } // only prints if the log isn't empty
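In practice you usually only need the log when linking failed. A small Swift 3 sketch along those lines (not from the original answers; it assumes program has already been linked with glLinkProgram):
var linkStatus = GLint(GL_FALSE)
glGetProgramiv(program, GLenum(GL_LINK_STATUS), &linkStatus)
if linkStatus == GL_FALSE {
    // Ask OpenGL for the exact log length instead of guessing a buffer size.
    var logLength = GLint(0)
    glGetProgramiv(program, GLenum(GL_INFO_LOG_LENGTH), &logLength)
    var log = [GLchar](repeating: 0, count: Int(max(logLength, 1)))
    glGetProgramInfoLog(program, GLsizei(logLength), nil, &log)
    print("Program link log: \(String(cString: log))")
}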

Try this:
// First ask OpenGL how long the log actually is, then allocate exactly that much:
var length = GLsizei(0)
glGetProgramiv(yourProgram, GLenum(GL_INFO_LOG_LENGTH), &length)
if length > 0 {
    var log = [GLchar](count: Int(length), repeatedValue: 0)
    glGetProgramInfoLog(yourProgram, length, &length, &log)
    NSLog("Program info log: \n%@", String(UTF8String: log) ?? "")
}

Related

How to interleave arrays of real and complex numbers to use vDSP_ctoz?

The Swift syntax changed over the years, and this code that was working perfectly doesn't compile anymore...
var zerosR = [Float](count: windowSizeOverTwo, repeatedValue: 0.0)
var zerosI = [Float](count: windowSizeOverTwo, repeatedValue: 0.0)
var cplxData = DSPSplitComplex( realp: &zerosR, imagp: &zerosI )
let xAsComplex = UnsafePointer<DSPComplex>( inputSignal.withUnsafeBufferPointer { $0.baseAddress } )
vDSP_ctoz( xAsComplex, 2, &cplxData, 1, vDSP_Length(windowSizeOverTwo) )
vDSP_fft_zrip( setup, &cplxData, 1, log2n, FFTDirection(kFFTDirection_Forward) )
Every line of this code shows an error under Swift 4
I was able to convert everything except for this line
let xAsComplex = UnsafePointer<DSPComplex>( inputSignal.withUnsafeBufferPointer { $0.baseAddress } )
that does not compile with this error
Cannot convert value of type 'UnsafePointer<Float>?' to expected argument type 'UnsafePointer<_>?'
The pointer to the storage of Float elements in the inputSignal array must be rebound to point to an array of DSPComplex values:
let inputSignal: [Float] = ...
inputSignal.withUnsafeBufferPointer { floatPtr in
    floatPtr.withMemoryRebound(to: DSPComplex.self) { cmplxPtr in
        vDSP_ctoz(cmplxPtr.baseAddress!, 2, &cplxData, 1, vDSP_Length(windowSizeOverTwo))
    }
}
See also UnsafeRawPointer Migration for more information.
Note that those pointers are only valid during the execution of the closure, and must not be passed to the outside. What you did in
let xAsComplex = UnsafePointer<DSPComplex>( inputSignal.withUnsafeBufferPointer { $0.baseAddress } )
was actually relying on undefined behavior.
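The same caveat applies to the &zerosR / &zerosI pointers stored in the DSPSplitComplex. A fuller sketch that keeps every pointer inside its closure (not the poster's exact code; it assumes inputSignal, setup, log2n and windowSizeOverTwo are defined as in the question):
var realPart = [Float](repeating: 0.0, count: windowSizeOverTwo)
var imagPart = [Float](repeating: 0.0, count: windowSizeOverTwo)
realPart.withUnsafeMutableBufferPointer { realPtr in
    imagPart.withUnsafeMutableBufferPointer { imagPtr in
        var cplxData = DSPSplitComplex(realp: realPtr.baseAddress!, imagp: imagPtr.baseAddress!)
        inputSignal.withUnsafeBufferPointer { floatPtr in
            floatPtr.withMemoryRebound(to: DSPComplex.self) { cmplxPtr in
                // De-interleave the real input into split-complex form.
                vDSP_ctoz(cmplxPtr.baseAddress!, 2, &cplxData, 1, vDSP_Length(windowSizeOverTwo))
            }
        }
        // In-place real FFT; the results end up in realPart/imagPart.
        vDSP_fft_zrip(setup, &cplxData, 1, log2n, FFTDirection(kFFTDirection_Forward))
    }
}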

Size of the NSInputStream buffer

I'm trying to use NSInputStream to receive data over a TCP socket connection. On the server side I send the data size before sending the data itself. On the iOS client side I need to extract the first 4 bytes from the NSInputStream so I can check whether the data has been received completely, but I have a problem with it:
...
case NSStreamEvent.HasBytesAvailable:
    if ( aStream == inputstream){
        while (inputstream.hasBytesAvailable){
            var readBufferRef = UnsafeMutablePointer<UnsafeMutablePointer<UInt8>>()
            var readBufferLengthRef = 0
            let readBufferIsAvailable = inputstream.getBuffer(readBufferRef, length: &readBufferLengthRef)
            ...
        }
    }
    break
After receiving data, readBufferLengthRef always equals 0.
How can that be?
And how can I get the size of the NSInputStream buffer?
UPD:
Code:
case NSStreamEvent.HasBytesAvailable:
    NSLog("HasBytesAvaible")
    var buffer = [UInt8](count: 1024, repeatedValue: 0)
    if ( aStream == inputstream){
        while (inputstream.hasBytesAvailable){
            var readBufferRef: UnsafeMutablePointer<UInt8> = nil
            var readBufferLengthRef = 0
            let readBufferIsAvailable = inputstream.getBuffer(&readBufferRef, length: &readBufferLengthRef)
            //debugger: readBufferLengthRef = (int)0
        }
    }
    break
In your code, readBufferRef is defined as a "pointer to a pointer"
but never allocated, and therefore it is the NULL pointer.
What you should do is to pass the address of an
UnsafeMutablePointer<UInt8> as an inout argument to the function
(assuming Swift 2):
var readBufferRef: UnsafeMutablePointer<UInt8> = nil
var readBufferLengthRef = 0
let readBufferIsAvailable = inputStream.getBuffer(&readBufferRef, length: &readBufferLengthRef)
On return, readBufferRef is set to the read buffer of the stream (valid until the next read operation), and readBufferLengthRef contains
the number of available bytes.
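If the goal is just to read the 4-byte size prefix, NSInputStream's read(_:maxLength:) may be simpler than getBuffer(). A small sketch (Swift 2 syntax; it assumes the server sends the length as a big-endian UInt32):
var sizeBuffer = [UInt8](count: 4, repeatedValue: 0)
let bytesRead = inputstream.read(&sizeBuffer, maxLength: 4)
if bytesRead == 4 {
    // Assemble the 4 bytes into the expected payload size (big-endian byte order assumed).
    let expectedSize = sizeBuffer.reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
    NSLog("Expecting %u bytes of payload", expectedSize)
}
In a real client, read() can return fewer than 4 bytes, so you may need to accumulate bytes across HasBytesAvailable events until you have all four.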

NSProcessInfo().systemUptime is wrong

I have trouble getting the NSTimeInterval since the last system reboot. This function gives me a wrong result: the reported uptime resets when the device's battery is charged. The problem occurs on the iPhone 5s and the iPad 3 (the new one). How can I fix it?
I use this method and it helps me
public static func uptime() -> (days: Int, hrs: Int, mins: Int, secs: Int) {
    var currentTime = time_t()
    var bootTime = timeval()
    var mib = [CTL_KERN, KERN_BOOTTIME]
    // NOTE: Use strideof(), NOT sizeof(), to account for data structure
    // alignment (padding)
    // http://stackoverflow.com/a/27640066
    // https://devforums.apple.com/message/1086617#1086617
    var size = strideof(timeval)
    let result = sysctl(&mib, u_int(mib.count), &bootTime, &size, nil, 0)
    if result != 0 {
        #if DEBUG
            print("ERROR - \(__FILE__):\(__FUNCTION__) - errno = \(result)")
        #endif
        return (0, 0, 0, 0)
    }
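The snippet is cut off here; the rest of the function just converts the boot time into an uptime tuple. One possible completion (a sketch, not necessarily the original answer's exact code):
    // Seconds elapsed since boot, split into days/hours/minutes/seconds.
    time(&currentTime)
    var uptime = currentTime - bootTime.tv_sec
    let days = uptime / 86_400
    uptime %= 86_400
    let hrs = uptime / 3_600
    uptime %= 3_600
    let mins = uptime / 60
    let secs = uptime % 60
    return (days, hrs, mins, secs)
}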

Core Audio Swift Equalizer adjusts all bands at once?

I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ:
var cd: AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Effect),
    componentSubType: OSType(kAudioUnitSubType_NBandEQ),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
// Add the node to the graph
status = AUGraphAddNode(graph, &cd, &MyAppNode)
println(status)
// Once the graph has been opened get an instance of the equalizer
status = AUGraphNodeInfo(graph, self.MyAppNode, nil, &MyAppUnit)
println(status)
var eqFrequencies: [UInt32] = [ 32, 250, 500, 1000, 2000, 16000 ]
status = AudioUnitSetProperty(
self.MyAppUnit,
AudioUnitPropertyID(kAUNBandEQProperty_NumberOfBands),
AudioUnitScope(kAudioUnitScope_Global),
0,
eqFrequencies,
UInt32(eqFrequencies.count*sizeof(UInt32))
)
println(status)
status = AudioUnitInitialize(self.MyAppUnit)
println(status)
var ioUnitOutputElement:AudioUnitElement = 0
var samplerOutputElement:AudioUnitElement = 0
AUGraphConnectNodeInput(graph, sourceNode, sourceOutputBusNumber, self.MyAppNode, 0)
AUGraphConnectNodeInput(graph, self.MyAppNode, 0, destinationNode, destinationInputBusNumber)
and then to apply changes in the frequency gains my code is as follows:
if (MyAppUnit == nil) {return}
else{
var bandValue0 :Float32 = tenBands.objectAtIndex(0) as! Float32
var bandValue1 :Float32 = tenBands.objectAtIndex(1) as! Float32
var bandValue2 :Float32 = tenBands.objectAtIndex(2) as! Float32
var bandValue3 :Float32 = tenBands.objectAtIndex(3) as! Float32
var bandValue4 :Float32 = tenBands.objectAtIndex(4) as! Float32
var bandValue5 :Float32 = tenBands.objectAtIndex(5) as! Float32
AudioUnitSetParameter(self.MyAppUnit, 0, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue0, 0);
AudioUnitSetParameter(self.MyAppUnit, 1, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue1, 0);
AudioUnitSetParameter(self.MyAppUnit, 2, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue2, 0);
AudioUnitSetParameter(self.MyAppUnit, 3, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue3, 0);
AudioUnitSetParameter(self.MyAppUnit, 4, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue4, 0);
AudioUnitSetParameter(self.MyAppUnit, 5, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue5, 0);
}
Can anyone point out what I am doing wrong here? I think it is related to the second argument to AudioUnitSetParameter. I have tried AudioUnitParameterID(0) and AudioUnitParameterID(kAUNBandEQParam_Gain + 1) for this value, but those don't seem to work at all. Any help is appreciated!
Adding a comment as an answer because comments are insufficient.
The following code is in Objective-C, but it should help identify your problem.
There are a number of places this might fail. Firstly, you should check the status of AudioUnitSetParameter, and indeed of all the AudioUnit calls, as this will give you a clearer idea of where your code is failing.
I've done this successfully in Objective-C and have a test app I can make available if you need it; it shows the complete graph setup and setting the bands and gains by moving a slider. Back to your specific question: the following works just fine for me and might help you rule out a particular section.
You can try to obtain the current gain; this will indicate whether your bands are set up correctly.
- (AudioUnitParameterValue)gainForBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterValue gain;
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
    OSStatus status = AudioUnitGetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            &gain);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"gettingParamGainErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus Error on getting EQ Gain! (Status returned %d).", (int)status]
                                     userInfo:nil];
    }
    return gain;
}
Setting the gain can then be done in the following way:
- (void)setGain:(AudioUnitParameterValue)gain forBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;
    OSStatus status = AudioUnitSetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            gain,
                                            0);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"settingParamGainAudioErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus Error on setting EQ gain! (Status returned %d).", (int)status]
                                     userInfo:nil];
    }
}
Finally, what value are you trying to set? The valid range (if I'm not mistaken) is -125.0 to 25.0.
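For reference, a rough Swift equivalent of the setter above (a sketch only, using the question's MyAppUnit; names and error handling are illustrative):
func setGain(gain: AudioUnitParameterValue, forBandAtPosition band: UInt32) {
    // The parameter ID is kAUNBandEQParam_Gain offset by the band index,
    // not the band index itself (which is what the question's code passes).
    let parameterID = AudioUnitParameterID(kAUNBandEQParam_Gain) + band
    let status = AudioUnitSetParameter(self.MyAppUnit,
                                       parameterID,
                                       AudioUnitScope(kAudioUnitScope_Global),
                                       0,
                                       gain,
                                       0)
    if status != noErr {
        print("AudioUnitSetParameter failed with status \(status)")
    }
}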

Save binary information to a file

I receive binary information via a stream in Swift. Let's say the information is a picture. I now want to save the picture. How is this possible?
I tried the following:
let bufferSize = 154000
var buffer = [UInt8](count: bufferSize, repeatedValue: 0)
var bytesRead = inputStream?.read(&buffer, maxLength: bufferSize)
if bytesRead > 0 {
    var bytesWrittenSoFar = 0
    do {
        var diffbytes = bytesRead! - bytesWrittenSoFar
        fileStream?.open()
        fileStream?.write(UnsafePointer(&buffer[bytesWrittenSoFar]), maxLength: diffbytes)
    } while (bytesWrittenSoFar != bytesRead);
But when I try to write (fileStream?.write...) I get the following error: "Could not find an overload for 'init' that accepts the supplied arguments".
Thank you for your answer in advance!
The problem is with the initialization of UnsafePointer. In this case you don't need it at all: you can just pass &buffer[bytesWrittenSoFar], as that is an acceptable value to pass to a function that takes an UnsafePointer, per the discussion in the Apple book "Using Swift with Cocoa and Objective-C".
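Note also that the loop as written never updates bytesWrittenSoFar, so it would never terminate. A minimal sketch of a working read-and-write loop (Swift 2 syntax; it assumes inputStream and fileStream are non-optional, already-opened streams):
let bufferSize = 154000
var buffer = [UInt8](count: bufferSize, repeatedValue: 0)
let bytesRead = inputStream.read(&buffer, maxLength: bufferSize)
if bytesRead > 0 {
    var bytesWrittenSoFar = 0
    repeat {
        // write() may consume fewer bytes than requested, so advance by its return value.
        let bytesWritten = fileStream.write(&buffer[bytesWrittenSoFar],
                                            maxLength: bytesRead - bytesWrittenSoFar)
        if bytesWritten <= 0 { break }   // 0 or negative signals a full or failed stream
        bytesWrittenSoFar += bytesWritten
    } while bytesWrittenSoFar < bytesRead
}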
