Thread Safety for a getter and setter in a singleton - ios

I have created a simple singleton in Swift 3:
class MySingleton {
    private var myName: String = ""

    private init() {}

    static let shared = MySingleton()

    func setName(_ name: String) {
        myName = name
    }

    func getName() -> String {
        return myName
    }
}
Since I made init() private, and also declared the shared instance as a static let, I think the initializer is thread safe. But what about the getter and setter functions for myName: are they thread safe?

You are correct to question it: the getter and setter you've written are not thread safe. In Swift, the simplest (read: safest) way to achieve thread safety at the moment is to use Grand Central Dispatch queues as a locking mechanism. The simplest (and easiest to reason about) approach is a basic serial queue.
class MySingleton {
    static let shared = MySingleton()

    // Serial dispatch queue used as a lock
    private let lockQueue = DispatchQueue(label: "MySingleton.lockQueue")

    private var _name: String

    var name: String {
        get {
            return lockQueue.sync {
                return _name
            }
        }
        set {
            lockQueue.sync {
                _name = newValue
            }
        }
    }

    private init() {
        _name = "initial name"
    }
}
Using a serial dispatch queue guarantees first in, first out execution and also acts as a "lock" on the data: the data cannot be read while it is being changed. In this approach, we use sync to execute the actual reads and writes, which means the caller is always forced to wait its turn, similar to other locking primitives.
Note: This isn't the most performant approach, but it is simple to read and understand. It is a good general purpose solution to avoid race conditions but isn't meant to provide synchronization for parallel algorithm development.
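As a quick illustration (a hypothetical test harness, not part of the original answer), you can hammer the synchronized property from many threads; with the Thread Sanitizer enabled, no race should be reported:
import Dispatch

// Concurrently mix reads and writes of the synchronized property.
DispatchQueue.concurrentPerform(iterations: 1_000) { i in
    if i % 2 == 0 {
        MySingleton.shared.name = "name \(i)"  // serialized write
    } else {
        _ = MySingleton.shared.name            // serialized read
    }
}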
Sources:
https://mikeash.com/pyblog/friday-qa-2015-02-06-locks-thread-safety-and-swift.html
What is the Swift equivalent to Objective-C's "@synchronized"?

A slightly different way to do it (and this is from an Xcode 9 Playground) is to use a concurrent queue rather than a serial queue.
final class MySingleton {
    static let shared = MySingleton()

    private let nameQueue = DispatchQueue(label: "name.accessor", qos: .default, attributes: .concurrent)

    private var _name = "Initial name"

    private init() {}

    var name: String {
        get {
            var name = ""
            nameQueue.sync {
                name = _name
            }
            return name
        }
        set {
            nameQueue.async(flags: .barrier) {
                self._name = newValue
            }
        }
    }
}
Using a concurrent queue means that multiple reads from multiple threads don't block each other. Since getting doesn't mutate anything, the value can be read concurrently, because...
We set the new value using an async dispatch with the .barrier flag. The block can be performed asynchronously because there is no need for the caller to wait for the value to be set. The block will not run until all the other blocks ahead of it in the concurrent queue have completed, so pending reads are not affected while this setter is waiting to run. The barrier means that once it starts running, no other blocks on the queue will run; it effectively turns the queue into a serial queue for the duration of the setter, and no further reads can be made until the block completes. Once the block has completed and the new value has been set, any getters added after this setter can again run concurrently.
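One way to generalize this readers-writer pattern is a small reusable box type; the following is my own sketch (names are illustrative), not part of the answer:
import Dispatch

// A generic readers-writer box: concurrent reads, exclusive (barrier) writes.
final class SynchronizedBox<Value> {
    private let queue = DispatchQueue(label: "SynchronizedBox", attributes: .concurrent)
    private var _value: Value

    init(_ value: Value) {
        self._value = value
    }

    var value: Value {
        get { queue.sync { self._value } }                                // concurrent read
        set { queue.async(flags: .barrier) { self._value = newValue } }   // exclusive write
    }
}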

Related

Atomic property wrapper only works when declared as class, not struct

I have created a "lock" in Swift and an Atomic property wrapper that uses that lock for my Swift classes, since Swift lacks Objective-C's atomic property attribute.
When I run my tests with the Thread Sanitizer enabled, it always reports a data race on a property that uses my Atomic property wrapper.
The only thing that worked was changing the declaration of the property wrapper from a struct to a class, and the main question here is: why does that work?
I added prints in the property wrapper and lock initializers to track the number of objects created; it was the same for struct and class. I also tried reproducing the issue in another project, without success. Below are the files that resemble the problem; let me know if you have any guesses as to why the class version works.
Lock
public class SwiftLock {
    init() { }

    public func sync<R>(execute: () throws -> R) rethrows -> R {
        objc_sync_enter(self)
        defer { objc_sync_exit(self) }
        return try execute()
    }
}
Atomic property wrapper
@propertyWrapper
struct Atomic<Value> {
    let lock: SwiftLock
    var value: Value

    init(wrappedValue: Value, lock: SwiftLock = SwiftLock()) {
        self.value = wrappedValue
        self.lock = lock
    }

    var wrappedValue: Value {
        get {
            lock.sync { value }
        }
        set {
            lock.sync { value = newValue }
        }
    }
}
Model (the data race should happen on the publicVariable2 property here)
class Model {
    @Atomic var publicVariable: TimeInterval = 0
    @Atomic var publicVariable2: TimeInterval = 0

    var sessionDuration: TimeInterval {
        min(0, publicVariable - publicVariable2)
    }
}
Update 1:
Full Xcode project: https://drive.google.com/file/d/1IfAsOdHKOqfuOp-pSlP75FLF32iVraru/view?usp=sharing
This question is answered in this PR: https://github.com/apple/swift-evolution/pull/1387
I think these are the lines that really explain it 💡
In Swift's formal memory access model, methods on a value type are considered to access the entire value, and so calling the wrappedValue getter formally reads the entire stored wrapper, while calling the setter of wrappedValue formally modifies the entire stored wrapper.
The wrapper's value will be loaded before the call to wrappedValue.getter and written back after the call to wrappedValue.setter. Therefore, synchronization within the wrapper cannot provide atomic access to its own value.
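For reference, a minimal sketch of the class-based wrapper that the question says passes the Thread Sanitizer (simplified here to always create its own SwiftLock): because the wrapper is now a reference type, accessing wrappedValue no longer counts as an exclusive access to the entire stored wrapper in the enclosing type, so the lock inside it can do its job.
@propertyWrapper
final class Atomic<Value> {
    private let lock = SwiftLock()
    private var value: Value

    init(wrappedValue: Value) {
        self.value = wrappedValue
    }

    var wrappedValue: Value {
        get { lock.sync { value } }           // serialized read
        set { lock.sync { value = newValue } } // serialized write
    }
}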

Racing condition on a custom concurrent queue sync for getter and setter

I have the following code, which to my understanding should provide thread-safe reads and writes of _predictions. My isolationQueue is a concurrent queue and needs to stay as such: I run multiple independent async operations on this queue to calculate predictions for various images. The only thing shared between the various calls is when the prediction is set.
var isolationQueue = DispatchQueue.global(qos: .default)
var _predictions: [Int: [Prediction]] = [:]

var predictions: [Int: [Prediction]] {
    get {
        var result: [Int: [Prediction]]!
        isolationQueue.sync {
            result = _predictions
        }
        return result
    }
    set(value) {
        isolationQueue.sync {
            self._predictions = value
        }
    }
}
However, for some reason the Thread Sanitizer seems to detect a race condition between the getter and the setter.
Am I missing something?
Global dispatch queues are concurrent queues, so they cannot be used to protect against concurrent access to a resource. You should define your own serial dispatch queue for that purpose:
var isolationQueue = DispatchQueue(label: "my.queue.identifier")
For your case, using DispatchSemaphore() is simpler and has less overhead. The code looks like
var isolationSem = DispatchSemaphore(value: 1)
var _predictions: [Int: [Prediction]] = [:]

var predictions: [Int: [Prediction]] {
    get {
        var result: [Int: [Prediction]]!
        isolationSem.wait()
        result = _predictions
        isolationSem.signal()
        return result
    }
    set(value) {
        isolationSem.wait()
        self._predictions = value
        isolationSem.signal()
    }
}
See this for the case where DispatchSemaphore is not suitable.
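If the isolation queue really does need to stay concurrent, as the question states, another option is a privately created concurrent queue with barrier writes. This is a sketch only (the queue label is illustrative, and a class context like the original code is assumed); note that barrier flags only take effect on queues you create yourself, not on the global queues, which is another reason the original code could not be fixed by simply adding .barrier:
let isolationQueue = DispatchQueue(label: "predictions.isolation", attributes: .concurrent)
var _predictions: [Int: [Prediction]] = [:]

var predictions: [Int: [Prediction]] {
    get {
        // Concurrent reads; sync returns the closure's value.
        return isolationQueue.sync { _predictions }
    }
    set(value) {
        // Barrier write: waits for in-flight reads, then runs exclusively.
        isolationQueue.async(flags: .barrier) { self._predictions = value }
    }
}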

How to implement a Singleton that takes data on initialisation in Swift?

In this post, it is very nicely explained how Singletons should be implemented in Swift; essentially, it can be done with two lines:
class TheOneAndOnlyKraken {
    static let sharedInstance = TheOneAndOnlyKraken()
    private init() {} // This prevents others from using the default '()' initializer for this class.
}
However, what happens if my Singleton is supposed to be initialised with some data? Maybe it needs to encapsulate an API key or other data that it can only receive from the outside. An example could look as follows:
class TheOneAndOnlyKraken {
    let secretKey: String
    static let sharedInstance = TheOneAndOnlyKraken()
    private init() {} // This prevents others from using the default '()' initializer for this class.
}
In that situation, we can't make the initializer private because we will have to create an initializer that takes a String as an argument to satisfy the compiler:
init(secretKey: String) {
    self.secretKey = secretKey
}
How can that be solved while still making sure that instantiation of the singleton is thread safe? Is there a way to avoid dispatch_once, or do we essentially have to fall back to the Objective-C pattern of using dispatch_once to make sure that the initializer really only gets called once?
First, note that the ObjC way you're implying is not thread-correct. It may be "safe" in that it doesn't crash and does not generate undefined behavior, but it silently ignores subsequent initializations with differing configuration. That is not expected behavior. Readers that are known to occur after the write will not receive the written data. That fails consistency. So put aside theories that such a pattern was correct.
So what would be correct? Correct would be something like this:
import Dispatch

class TheOneAndOnlyKraken {
    static let sharedInstanceQueue: DispatchQueue = {
        let queue = DispatchQueue(label: "kraken")
        queue.suspend()
        return queue
    }()

    private static var _sharedInstance: TheOneAndOnlyKraken! = nil

    static var sharedInstance: TheOneAndOnlyKraken {
        var result: TheOneAndOnlyKraken!
        sharedInstanceQueue.sync {
            result = _sharedInstance
        }
        return result
    }

    // Until this is called, all readers will block.
    static func initialize(withSecret secretKey: String) {
        // It is a programming error to call this twice. If you want to be able to change
        // it, you'll need another queue at least.
        precondition(_sharedInstance == nil)
        _sharedInstance = TheOneAndOnlyKraken(secretKey: secretKey)
        sharedInstanceQueue.resume()
    }

    private var secretKey: String

    private init(secretKey: String) {
        self.secretKey = secretKey
    }
}
This requires a single explicit call to TheOneAndOnlyKraken.initialize(withSecret:). Until someone makes that call, all requests for sharedInstance will block. A second call to initialize will crash.
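A usage sketch (the secret string and the call site are placeholders):
// Early in app startup, e.g. in application(_:didFinishLaunchingWithOptions:):
TheOneAndOnlyKraken.initialize(withSecret: "placeholder-secret")

// Later, from any thread. This blocks only until initialize(withSecret:) has run:
let kraken = TheOneAndOnlyKraken.sharedInstance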

Cycling using singleton in Swift

I have a singleton class:
class SomeManager {
    static let sharedInstance = SomeManager()
    let serverService = SomeServerService()
    let musicService = SomeMusicService()
}
I try to use:
class SomeMusicService {
    let serverService = SomeManager.sharedInstance.serverService // here it seems I get a cycle
}
Should I use lazy or some other kind of initialization?
As you can see, let musicService = SomeMusicService() initializes an object, and then that same SomeMusicService object tries to call sharedInstance on the SomeManager singleton to get another service during startup.
So this is a full listing:
import Foundation

class ServerService {
    func downloadMusic() {
        print("Download music and play it after that.")
    }
}

class MusicService {
    let serverService = Singleton.shared.serverService

    func playMusic() {
        serverService.downloadMusic()
    }
}

class Singleton {
    static let shared = Singleton()
    let serverService = ServerService()
    let musicService = MusicService()
}

let s = Singleton.shared
print("Hello, World!")
The print("Hello, World!") line is never reached.
You could use weak to avoid the retain cycle, but the better answer is a computed property:
class SomeMusicService {
    var serverService: SomeServerService { return SomeManager.sharedInstance.serverService }
}
I see from your updated code what the cycle is. Here's how it plays out:
1. Call Singleton.shared
2. Begin to construct Singleton
3. Construct ServerService (for the serverService property)
4. Begin to construct MusicService (for the musicService property)
5. Call Singleton.shared (for MusicService's serverService property)
6. Block, waiting for Singleton.shared to complete
The program is now deadlocked waiting on itself.
The right answer is to use a computed property so that there is no need to call Singleton.shared during construction. A lazy property would work as well, but seems a lot of trouble for this (and risks creating retain loops between the services).
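Applied to the full listing above, the fix might look like this (a sketch):
class MusicService {
    // Computed property: Singleton.shared is no longer touched while MusicService
    // itself is being constructed, so the deadlock during Singleton's init disappears.
    var serverService: ServerService { return Singleton.shared.serverService }

    func playMusic() {
        serverService.downloadMusic()
    }
}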

Does calling performBlockAndWait twice on a single thread cause a deadlock?

I have found something like this in the performBlockAndWait documentation:
This method may safely be called reentrantly.
My question is whether this means it will never cause a deadlock when, for example, I invoke it like this on a single context:
NSManagedObjectContext *context = ...
[context performBlockAndWait:^{
    // ... some stuff
    [context performBlockAndWait:^{
    }];
}];
You can try it yourself with a small code snippet ;)
But true, it won't deadlock.
I suspect the internal implementation uses a queue-specific token to identify the current queue on which the code executes (see dispatch_queue_set_specific and dispatch_queue_get_specific).
If it determines that the currently executing code is running on its own private queue or on a child queue, it simply bypasses submitting the block synchronously (which would cause a deadlock) and instead executes it directly.
A possible implementation may look like this:
func executeSyncSafe(f: () -> ()) {
    if isSynchronized() {
        f()
    } else {
        dispatch_sync(syncQueue, f)
    }
}

func isSynchronized() -> Bool {
    let context = UnsafeMutablePointer<Void>(Unmanaged<dispatch_queue_t>.passUnretained(syncQueue).toOpaque())
    return dispatch_get_specific(&queueIDKey) == context
}
And the queue might be created like this:
private var queueIDKey = 0 // global

init() {
    syncQueue = dispatch_queue_create("sync.queue", // queue label is illustrative
        dispatch_queue_attr_make_with_qos_class(DISPATCH_QUEUE_SERIAL,
            QOS_CLASS_USER_INTERACTIVE, 0))
    let context = UnsafeMutablePointer<Void>(Unmanaged<dispatch_queue_t>.passUnretained(syncQueue).toOpaque())
    dispatch_queue_set_specific(syncQueue, &queueIDKey, context, nil)
}
dispatch_queue_set_specific associates a token (here context, which is simply the pointer value of the queue) with a certain key for that queue. Later, you can try to retrieve that token for any queue by specifying the key, and check whether the current queue is the same queue or a child queue. If it is, you bypass dispatching to the queue and instead call the function f directly.
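For comparison, here is a sketch of the same queue-identification idea written against the current Dispatch API (my own translation; type and queue names are illustrative), using DispatchSpecificKey instead of the raw dispatch_queue_set_specific calls:
import Dispatch

final class SafeSync {
    private let syncQueue = DispatchQueue(label: "sync.queue")
    private let queueKey = DispatchSpecificKey<Void>()

    init() {
        // Tag the queue so code running on it can recognize it later.
        syncQueue.setSpecific(key: queueKey, value: ())
    }

    func executeSyncSafe(_ work: () -> Void) {
        if DispatchQueue.getSpecific(key: queueKey) != nil {
            // Already running on syncQueue: calling sync here would deadlock.
            work()
        } else {
            syncQueue.sync(execute: work)
        }
    }
}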
