Is Swift on OSX different from the iOS version?

I want to get the bounds of a CALayer.
On OSX I have to do this:
let bounds = rootLayer?.bounds
On iOS, this:
let bounds = rootLayer.bounds
And then to assign those bounds to another layer, I have to do this
on OSX:
anotherLayer.bounds = bounds!
and on iOS:
anotherLayer.bounds = bounds
Why? Is Swift different on iOS and OSX? That would be awful.
In both cases rootLayer is set like this:
let rootLayer = vista.layer

Swift the language is the same. The UI frameworks and APIs are not the same, which is expected, since the UI paradigms are not the same.
I've got no idea what API you are populating vista from. But somewhere along the line, you have an API that's defined slightly differently on OSX and iOS. On OSX, vista.layer returns an optional, and on iOS it returns a non-optional. When you find where that is, you can correct for it and unwrap the optional there.
Lastly, type inference is doing you no favors here. If you change your rootLayer declaration to this:
let rootLayer: MyLayerTypeHere = vista.layer
then you'll get a compile-time error right there if the value is returned as an optional, since MyLayerTypeHere and MyLayerTypeHere? are different types.
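If vista is a view, this is the well-known case: on AppKit, NSView.layer is declared CALayer? because views are not layer-backed by default, while on UIKit every UIView is layer-backed and layer is non-optional. A minimal sketch, assuming vista is an NSView on OSX and a UIView on iOS:
// macOS (AppKit): NSView.layer is CALayer?, since layer backing is opt-in
vista.wantsLayer = true                    // opt in to layer backing
if let rootLayer = vista.layer {           // unwrap once, at the source
    anotherLayer.bounds = rootLayer.bounds
}

// iOS (UIKit): UIView.layer is non-optional, so no unwrapping is needed
let rootLayer = vista.layer
anotherLayer.bounds = rootLayer.bounds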

Related

Accessing CFArray causes crash in Swift

The following code crashes with EXC_BAD_ACCESS, and I do not understand why. My initial understanding was that memory retention in this case should be automatic, but it seems I am wrong... Maybe someone can help. Thank you! The code is written in Swift 5 and runs on iOS 15.2 in Xcode 13.2.1.
Casting to NSArray causes trouble...
let someFont = CGFont("Symbol" as CFString)!
if let cfTags: CFArray = someFont.tableTags {
    let nsTags = cfTags as NSArray
    print(nsTags.count) // Prints: 16
    let tag0 = nsTags[0] // CRASH: Thread 1: EXC_BAD_ACCESS (code=257, ...)
}
Alternatively, using the CFArray API also causes trouble (the crash message is about a misaligned pointer, but the root cause seems to be the same bad access, which still occurs if I e.g. replace UInt32.self with UInt8.self and thereby eliminate the alignment problem).
let someFont = CGFont("Symbol" as CFString)!
if let cfTags: CFArray = someFont.tableTags {
    print(CFArrayGetCount(cfTags)) // Prints: 16
    let tag0Ptr: UnsafeRawPointer = CFArrayGetValueAtIndex(cfTags, 0)!
    tag0Ptr.load(as: UInt32.self) // CRASH: Thread 1: Fatal error: load from misaligned raw pointer
}
The issue here is that the CGFont API uses some advanced C-isms in its storage of table tags, which really doesn't translate to Swift: CGFontCopyTableTags() returns a CFArrayRef which doesn't actually contain objects, but integers. (This is technically allowed through CFArray's interface: in C it accepts void *s, into which you can stuff any integer that fits in a pointer, even if the pointer value is nonsense...)
Swift, however, expects CFArrays and NSArrays to only ever contain valid objects and pointers, and it treats the return values as such. This is why accessing via NSArray also fails: Swift expects an object, but the value isn't an object, so it can't be dereferenced like a pointer, or retained, or any of the other things the runtime might expect to do.
Your second code snippet is closer to how you'll need to access the value: CFArrayGetValueAtIndex appears to return a pointer, but the value you're getting back isn't a real pointer — it's actually an integer stored in the array directly, masquerading as a pointer.
The equivalent to the Obj-C example from the CGFontCopyTableTags docs of
tag = (uint32_t)(uintptr_t)CFArrayGetValue(table, k);
would be
let tag = unsafeBitCast(CFArrayGetValueAtIndex(cfTags, 0), to: UInt.self)
(Note that you need to cast to UInt and not UInt32 because unsafeBitCast requires that the input value and the output type have the same size.)
In my simulator, I'm seeing a tag with a value of 1196643650 (0x47535542), which definitely isn't a valid pointer (but I don't otherwise have domain knowledge to validate whether this is the tag you're expecting).
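For completeness, a small sketch under the same assumptions that walks the whole array and decodes each tag as its four-character code (table tags are four big-endian ASCII bytes packed into a UInt32; 0x47535542 is "GSUB"):
let someFont = CGFont("Symbol" as CFString)!
if let cfTags = someFont.tableTags {
    for k in 0..<CFArrayGetCount(cfTags) {
        // Each slot holds an integer masquerading as a pointer, so bit-cast it out.
        let raw = unsafeBitCast(CFArrayGetValueAtIndex(cfTags, k), to: UInt.self)
        let tag = UInt32(truncatingIfNeeded: raw)
        // Unpack the four ASCII bytes into a readable string like "GSUB".
        let bytes = [24, 16, 8, 0].map { UInt8(truncatingIfNeeded: tag >> $0) }
        print(String(bytes: bytes, encoding: .ascii) ?? String(tag, radix: 16))
    }
}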

How to get values for primary light intensity etc. from ARDirectionalLightEstimate

So I'm trying to use the front camera of an iPhone XR to get the approximate location of light sources. I decided to use ARDirectionalLightEstimate, but I can't figure out how to access it. I can easily access the lightEstimate property.
The docs say that the lightEstimate property of each frame holds an instance of ARDirectionalLightEstimate, but I can't access it using the dot operator. I even tried to type-cast it to ARDirectionalLightEstimate (like I saw someone doing; I can't find the link right now but I will update), but that didn't work either. I am inexperienced in Swift, so it's possible I messed up somewhere.
ARDirectionalLightEstimate is a subclass of ARLightEstimate, so to access it you need to downcast lightEstimate:
let lightEstimate = sceneView?.session.currentFrame?.lightEstimate
if let directionalLightEstimate = lightEstimate as? ARDirectionalLightEstimate {
    // add logic here
    let primaryLightIntensity = directionalLightEstimate.primaryLightIntensity
    // ...
}
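One more thing to check: ARKit only delivers ARDirectionalLightEstimate in face-tracking sessions, so the conditional cast will keep failing under a plain world-tracking configuration. A minimal setup sketch, assuming sceneView is your ARSCNView:
import ARKit

// Directional light estimates are only provided with face tracking.
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = true
sceneView.session.run(configuration)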

Casting UnsafeRawPointer to CGImage from C function that returns a CGImageRef Crashes Swift 4

We have a cross-platform image processing library, and there is a call that returns a CGImageRef as an UnsafeRawPointer, which I need to turn into a CGImage. I know the call works: when using it in Objective-C I can just cast the returned data using (CGImageRef) returnedData and I get the image. However, when trying to use it in our iOS app with Swift 4, I can't seem to cast it without getting an EXC_BREAKPOINT or BAD_ACCESS error. I've tried casting using as! CGImage as well as trying pointer.load(as: CGImage.self), and both give me the same result. Am I doing something wrong? Is there a better way to pass back an image from C code to Swift? I should mention that the same C function takes a CGImageRef as a parameter, and passing a CGImage in instead causes no issues whatsoever.
I think it's a memory management issue, e.g. the CGImageRef that you get is released before you use it. Use the "Product" > "Scheme" > "Edit Scheme" menu, select the "Run" section and the "Diagnostics" tab, and enable the "Zombie Objects" option. Then run the app. If some object is used after deallocation, you will get a message in the console.
If that’s the case, something like this should fix the issue:
// pointer is your UnsafeRawPointer for CGImageRef
let cgImage = Unmanaged<CGImage>.fromOpaque(pointer).takeRetainedValue()
Better yet, if you can edit or override header files for the library in question, you can annotate C functions to let Swift know how to handle the pointers properly.
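For illustration, a hypothetical annotated declaration (MyLibCreateImage is a made-up name; CF_RETURNS_RETAINED is the real Core Foundation ownership macro):
// In the library's C header, mark the return value as +1 retained:
//
//     CGImageRef MyLibCreateImage(void) CF_RETURNS_RETAINED;
//
// Swift then imports the function as returning a managed CGImage,
// so the manual Unmanaged dance above is no longer needed:
let cgImage = MyLibCreateImage()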

NativeScript: Get string from interop.reference

To start, here is my code:
var buffer = malloc(interop.sizeof(interop.types.UTF8CString));
var fillBuffer = mac.getBytes(buffer);
var bytes = new interop.Reference(interop.types.UTF8CString, buffer);
var hexMac = bytes[0];
The variable mac is an NSData object retrieved from CoreBluetooth. It is the scan response from a BLE device, which contains the peripheral's MAC address (00:0b:57:a2:fb:a0).
This problem is linked to THIS question I had posted earlier.
The solution provided is great; however, I cannot seem to implement this in NativeScript:
+ (instancetype)stringWithFormat:(NSString *)format, ...;
Intellisense tells me the method doesn't exist on type NSString.
Due to that issue, I decided to go another route (as you can tell): I am filling a buffer with the bytes of the MAC address. In the code above, bytes[0] equates to 0xb57a2fba0.
I am now trying to convert that (which is an interop.Reference) into a string that I can store on the back-end (preferably in the xx:xx:xx:xx:xx:xx format).
I have been at this all weekend and cannot seem to find a solution. I even broke down objc!foundation.d.ts to figure out if stringWithFormat was supported, to no avail.
The NativeScript community Slack was unable to provide a resolution as well.
Please help if you can!
I don't know anything about NativeScript at all, but given the other code you wrote, I assume you're calling +alloc first, and so mean to use -initWithFormat: (an instance method that initializes) rather than +stringWithFormat: (a class method which handles allocation and initialization).
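Setting NativeScript aside for a moment, for comparison here is a minimal Swift sketch of the formatting you're after, assuming mac holds the 6-byte scan response (the sample bytes below are the address from the question):
import Foundation

// Stand-in for the NSData scan response.
let mac = Data([0x00, 0x0b, 0x57, 0xa2, 0xfb, 0xa0])
// Format each byte as two hex digits and join with colons.
let macString = mac.map { String(format: "%02x", $0) }.joined(separator: ":")
print(macString) // 00:0b:57:a2:fb:a0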

Dynamic naming of objects in AudioKit (SpriteKit)

I am trying to create an app similar to the Reactable.
The user will be able to drag "modules" like an oscillator or filter from a menu into the "play area" and the module will be activated.
I am thinking of initializing the modules as they intersect with the "play area" background object. However, this requires me to name the modules automatically, i.e.:
let osci = AKOscillator()
where osci will automatically count up to be:
let osci1 = AKOscillator()
let osci2 = AKOscillator()
...
etc.
How will I be able to do this?
Thanks
Edit: I am trying to use an array, created as
var osciArray = [AKOscillator]()
and in my function to add an oscillator, this is my code:
let oscis = AKOscillator()
osciArray.append(oscis)
osciArray[oscCounter].frequency = freqValue
osciArray[oscCounter].amplitude = 0.5
osciArray[oscCounter].start()
selectedNode.userData = ["counter": oscCounter]
oscCounter += 1
currentOutput = osciArray[oscCounter]
AudioKit.output = currentOutput
AudioKit.start()
My app builds fine, but once it starts running in the Simulator I get the error: fatal error: Index out of range
I haven't used AudioKit, but I read about it a while ago and I have quite a big interest in it. From what I understand from the documentation, it's structured pretty much like SpriteKit: nodes connected together.
I guess then that most classes in the library derive from a base class, just like everything in SpriteKit derives from the SKNode class.
Since you are linking the AudioKit nodes with visual representations via SpriteKit nodes, why don't you simply subclass SKSpriteNode and add an optional audioNode property typed as the AudioKit base class?
That way you can just use SpriteKit to interact directly with the stored audio node property.
I think there's a lot of AudioKit-related code in your question, but to answer it, you only have to look at oscCounter. You don't show its initial value, but I am guessing it was zero. You increment it by 1 and then try to access osciArray[oscCounter], which has only one element, so it should be accessed as osciArray[0]. Move the increment below the array access and you'll be better off. Furthermore, your oscillators look like local variables, so they'll be lost once the scope ends. They should be declared as instance variables in your class, or whatever this is part of.
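A corrected sketch of the add-oscillator routine under those assumptions (selectedNode, currentOutput, and the AudioKit 4-era API come from the question; the method name is made up):
// Instance properties, so the oscillators outlive the function call:
var osciArray = [AKOscillator]()
var oscCounter = 0

func addOscillator(freqValue: Double) {
    let osci = AKOscillator()
    osci.frequency = freqValue
    osci.amplitude = 0.5
    osci.start()
    osciArray.append(osci)

    selectedNode.userData = ["counter": oscCounter]
    currentOutput = osciArray[oscCounter] // read before incrementing
    oscCounter += 1

    AudioKit.output = currentOutput
    AudioKit.start()
}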
