How many NSOperationQueues should an app have? - ios

The question is whether an app should have a single queue instance for all async operations, or whether several queues can be created.
With one queue it's pretty simple, because all tasks are executed based on their assigned priority. So for me it's preferable, because there is no need to write extra code.
In the case of multiple queues, at least one of them should be the main one.
So some sort of queue manager would have to be implemented that can suspend the "sub" queues and allow operations from the main queue to execute when needed.
The analogy with a single database connection makes me think that one centralised queue should be used for all async operations.
So what would you recommend? What are the best practices?

After doing some searching and brainstorming I came up with a solution.
In my application I'm going to use two async queues:
one queue (NSOperationQueue) for all background operations (like parsing downloaded .json files) and another queue (already provided by the NSURLSession class) for requests (API requests, downloading images, etc.).
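As a rough illustration of that setup, here is a minimal sketch. The BackgroundWork type, the queue name, and the URL are placeholders, not from the original post:

```swift
import Foundation

// A minimal sketch of the two-queue setup described above.
final class BackgroundWork {
    // One OperationQueue for CPU-bound background work such as JSON parsing.
    static let parsingQueue: OperationQueue = {
        let queue = OperationQueue()
        queue.name = "com.example.parsing"   // hypothetical identifier
        queue.qualityOfService = .utility
        return queue
    }()

    // URLSession already maintains its own internal queues for network requests,
    // so no second OperationQueue is needed for API calls or image downloads.
    static let session = URLSession(configuration: .default)
}

// Usage: download on the session, then hand parsing to the background queue.
let url = URL(string: "https://example.com/posts.json")!   // placeholder URL
BackgroundWork.session.dataTask(with: url) { data, _, _ in
    guard let data = data else { return }
    BackgroundWork.parsingQueue.addOperation {
        let posts = try? JSONSerialization.jsonObject(with: data)
        // ... map `posts` into model objects, then hop to the main queue for UI updates
        _ = posts
    }
}.resume()
```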

Related

Serial Dispatch Queue with nested queues

I have a Swift class that runs a multi-step process, much like the one in this question: Simple GCD Serial Queue example like FIFO using blocks
So I have a single public method, func start(), and a bunch of private methods, each containing one "step" of the process.
I've spoken to a few people about this and I've been told I don't need to use a load of completion handlers and can instead use a serial dispatch queue.
However, if some of those steps run something that contains its own queue, then the method will end (and the next step will start) before it has really completed. So, for example, if func step1() uses Alamofire to download some data from an API, and step2() turns the response JSON into data models, step1() will return before the network response comes in. This is why I originally used callbacks, yet I'm being told I don't need them?
The only way I can see a serial queue working is if there were a way of manually telling the queue when a certain task is complete (which there isn't?).
Could anyone help me work out the best way to structure and run this multi-step process, preferably without a load of nested completion handlers?
Or in other words: how do you use serial dispatch queues if some of the tasks you push to the queue do things like network requests that return immediately?
You can use NSOperationQueue for that. NSOperationQueue is built on top of GCD and is specifically designed to manage a number of dependent jobs.
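To make that concrete, here is a hedged sketch of one way to do it. The AsyncOperation base class and FetchPostsOperation are invented names for this example, not system classes: each step stays "executing" until its asynchronous work calls finish(), and the steps are chained with dependencies on a serial queue, so step 2 cannot start before step 1's network response has arrived.

```swift
import Foundation

// A base class for operations whose work completes asynchronously.
// State changes are kept unsynchronised to keep the sketch short.
class AsyncOperation: Operation {
    private var _executing = false
    private var _finished = false

    override var isAsynchronous: Bool { true }
    override var isExecuting: Bool { _executing }
    override var isFinished: Bool { _finished }

    override func start() {
        guard !isCancelled else { finish(); return }
        willChangeValue(forKey: "isExecuting")
        _executing = true
        didChangeValue(forKey: "isExecuting")
        main()
    }

    // Subclasses call finish() when their asynchronous work completes.
    func finish() {
        willChangeValue(forKey: "isExecuting")
        willChangeValue(forKey: "isFinished")
        _executing = false
        _finished = true
        didChangeValue(forKey: "isExecuting")
        didChangeValue(forKey: "isFinished")
    }
}

// Step 1: an asynchronous network request that keeps the operation "executing"
// until the response arrives.
final class FetchPostsOperation: AsyncOperation {
    var result: Data?
    override func main() {
        let url = URL(string: "https://example.com/api/posts")!   // placeholder URL
        URLSession.shared.dataTask(with: url) { data, _, _ in
            self.result = data
            self.finish()   // only now does the queue consider step 1 done
        }.resume()
    }
}

// Wiring the steps together with dependencies on a serial queue.
let queue = OperationQueue()
queue.maxConcurrentOperationCount = 1

let step1 = FetchPostsOperation()
let step2 = BlockOperation { /* parse step1.result into data models */ }
step2.addDependency(step1)
queue.addOperations([step1, step2], waitUntilFinished: false)
```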

Swift, number of threads in app

I am making an app for a website. I use JSON to get the data. I want to load all posts in threads (one post per thread). How many threads can I make? Should I control the number of threads?
With Cocoa you usually don't work with threads directly. Grand Central Dispatch (GCD) is an API that handles this for you. You just have to partition your task into small manageable chunks, dispatch them onto a background queue, and the rest is handled for you. You don't need to worry about creating threads, how many are currently running, etc. When you dispatch enough work on one (or possibly more) queues, the CPU is going to run at maximum load.
You can also use NSOperationQueue, which has the ability to throttle the execution to some extent or cancel currently running tasks (not possible with GCD).
Unless you are doing anything unusual there is no need to use NSThread directly. Use GCD when you just need to perform a simple small task asynchronously. Use NSOperationQueue when you need more control, like cancelling submitted tasks or setting priorities. Its API is also a bit higher level and Objective-C based. GCD is a C-level API, so, for example, it can't catch Objective-C exceptions. NSOperationQueue uses GCD internally, so both should work equally well.
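For example, a minimal sketch of both approaches; the posts array and the commented-out loadPost call stand in for the poster's own data and code:

```swift
import Foundation

// GCD: dispatch each chunk of work onto a background queue and let the system
// manage the underlying threads.
let posts = Array(1...100)   // placeholder for the posts to load

for post in posts {
    DispatchQueue.global(qos: .userInitiated).async {
        // loadPost(post)   // fetch / decode one post here
        DispatchQueue.main.async {
            // update the UI for this post on the main queue
        }
    }
}

// Or, with OperationQueue, you get throttling and cancellation on top:
let queue = OperationQueue()
queue.maxConcurrentOperationCount = 4   // illustrative limit
for post in posts {
    queue.addOperation { /* loadPost(post) */ }
}
// queue.cancelAllOperations()   // cancellation is possible here, unlike with plain GCD blocks
```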

What is the difference between 'thread' and 'queue' in iOS development? [duplicate]

This question already has answers here:
Use of the terms "queues", "multicore", and "threads" in Grand Central Dispatch
(3 answers)
Closed 8 years ago.
I am new to iOS development. Now I am quite confused about the two concepts "thread" and "queue". All I know is that they both relate to multithreaded programming. Can anyone explain these two concepts and the difference between them for me?
Thanks in advance!
How NSOperationQueue and NSThread Work:
NSThread:
iOS developers have to write code for the work they want to perform, as well as for the creation and management of the threads themselves.
iOS developers have to carefully plan how threads will be used.
iOS developers have to handle possible problems like thread reuse and locking themselves.
Threads also consume more memory.
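For contrast, a minimal sketch of what working with threads directly looks like; the thread name is purely illustrative, and everything the thread does is your responsibility:

```swift
import Foundation

// You create the thread yourself and manage its work, locking, and lifetime.
let thread = Thread {
    print("running on a manually created thread")
}
thread.name = "com.example.worker"   // hypothetical name
thread.start()
```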
NSOperationQueue:
The NSOperation class is an abstract class which encapsulates the code and data associated with a single task.
Developers need to subclass NSOperation, or use one of the system-defined subclasses, to perform a task.
Add operations to an NSOperationQueue to execute them.
An NSOperationQueue runs operations on threads that it manages; with a maximum concurrent operation count of 1, it runs them serially in the order they are added.
Operation queues handle all of the thread management, ensuring that operations are executed as quickly and efficiently as possible.
An operation queue executes operations either directly by running them on secondary threads or indirectly using GCD (Grand Central Dispatch).
It takes care of all of the memory management and greatly simplifies the process.
If you don't want to use an operation queue, you can also execute an operation manually by calling its start method, but doing so may make your code more complex.
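Putting those points together, here is a small sketch; ParseOperation and its input are illustrative names, not a system class:

```swift
import Foundation

// The code and data for a single task, encapsulated in an Operation subclass.
final class ParseOperation: Operation {
    let input: Data
    init(input: Data) {
        self.input = input
        super.init()
    }

    override func main() {
        guard !isCancelled else { return }
        _ = try? JSONSerialization.jsonObject(with: input)   // the single task this operation performs
    }
}

let queue = OperationQueue()
queue.addOperation(ParseOperation(input: Data()))   // preferred: the queue manages the threads

// Possible, but then you own scheduling and threading yourself:
// ParseOperation(input: Data()).start()
```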
How To Use NSThread And NSOperationQueue:
NSThread:
Though operation queues are the preferred way to perform tasks concurrently, depending on the application there may still be times when you need to create custom threads.
Threads are still a good way to implement code that must run in real time.
Use threads for specific tasks that cannot be implemented in any other way.
If you need more predictable behavior from code running in the background, threads may still offer a better alternative.
NSOperationQueue:
Use NSOperationQueue when you have more complex operations you want to run concurrently.
NSOperation allows for subclassing, dependencies, priorities, cancellation, and supports a number of other higher-level features.
NSOperation actually uses GCD under the hood so it is as multi-core, multi-thread capable as GCD.
Now you should be aware of the advantages and disadvantages of NSThread and NSOperation. You can use either of them, depending on the needs of your application.
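As a brief, hedged illustration of those higher-level features, with operation names invented for the sketch:

```swift
import Foundation

// Dependencies, quality of service, priorities, and cancellation in one place.
let queue = OperationQueue()

let download = BlockOperation { /* fetch data */ }
let parse = BlockOperation { /* build models from the downloaded data */ }

parse.addDependency(download)            // parse waits for download to finish
download.qualityOfService = .userInitiated
parse.queuePriority = .high

queue.addOperations([download, parse], waitUntilFinished: false)

// Later, if the screen is dismissed, everything can be cancelled in one call:
// queue.cancelAllOperations()
```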
Before you read my answer you might want to consider reading this - Migrating away from Threads
I am keeping the discussion theoretical as your question does not have any code samples. Both these constructs are required for increasing app responsiveness & usability.
A message queue is a data structure for holding messages from the time they're sent until the time the receiver retrieves and acts on them. Generally queues are used as a way to 'connect' producers (of data) & consumers (of data).
A thread pool is a pool of threads that do some sort of processing. A thread pool will normally have some sort of thread-safe queue (refer message queue) attached to allow you to queue up jobs to be done. Here the queue would usually be termed 'task-queue'.
So, in a way, a thread pool could exist at your producer end (generating data) or your consumer end (processing the data), and the way to "pass" that data would be through queues. Why the need for this "middleman"?
It decouples the systems. Producers do not know about consumers and vice versa.
The consumers are not bombarded with data if there is a spike in producer data. The queue length would increase, but the consumers are safe.
Example:
In iOS the main thread, also called the UI thread, is very important because it is in charge of dispatching events to the appropriate widget, and this includes the drawing events; basically, everything the user sees and interacts with.
If you touch a button on screen, the UI thread dispatches the touch event to the app, which in turn sets its pressed state and posts a request to the event queue. The UI thread dequeues the request and notifies the widget to redraw itself.
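A hedged sketch of that flow in code, with the heavy work moved off the main thread and the redraw posted back to it; the view controller and label are illustrative:

```swift
import UIKit

class ViewController: UIViewController {
    let statusLabel = UILabel()

    @IBAction func buttonTapped(_ sender: UIButton) {
        // We are on the main (UI) thread here, delivered via the event queue.
        DispatchQueue.global(qos: .userInitiated).async {
            let result = "done"   // placeholder for real processing
            DispatchQueue.main.async {
                self.statusLabel.text = result   // UI updates must stay on the main thread
            }
        }
    }
}
```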

Is there any reason to share a dispatch queue?

I'm working on some code whose original developer I can't contact.
He passes one class a reference to another class's serial dispatch queue, and I'm not seeing any purpose in him not just creating another dispatch queue (each class is a singleton).
It's not creating issues but I'd like to understand it further, so any insight into the positive implications is appreciated, thank you.
Edit: I suggest reading all the answers here, they explain a lot.
It's actually not a good idea to share queues in this fashion (and no, not because they are expensive; they're not, quite the converse). The rationale is that it's not clear to anyone but a queue's creator just what the semantics of the queue are. Is it serial? Concurrent? High priority? Low priority? All are possible, and once you start passing around internal queues that were actually created for the benefit of a specific class, an external caller can schedule work on them that causes a mutual deadlock or otherwise behaves unexpectedly alongside the other items on the queue, because caller A expected concurrent behavior while caller B assumed it was a serial queue, free of the "gotchas" that concurrent execution implies.
Queues should therefore, wherever possible, be hidden implementation details of a class. The class can export methods for scheduling work against its internal queue as necessary, but the methods should be the only accessors since they are the only ones who know for sure how to best access and schedule work on that specific type of queue!
If it's a serial queue, then they may be intending to serialize access to some resource shared between all objects that share it.
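For instance, a small sketch of that pattern, assuming an invented SharedCache singleton and queue label: a single serial queue serialises every read and write to the shared storage, so any object that funnels access through it is safe.

```swift
import Foundation

final class SharedCache {
    static let shared = SharedCache()
    private let queue = DispatchQueue(label: "com.example.shared-cache")   // serial by default
    private var storage: [String: Data] = [:]

    func set(_ data: Data, forKey key: String) {
        queue.async { self.storage[key] = data }
    }

    func data(forKey key: String) -> Data? {
        queue.sync { storage[key] }
    }
}
```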
Dispatch queues are somewhat expensive to create, and tie up system resources. If you can share one, the system runs more efficiently. For that matter, if your app does work in the background, using a shared queue allows you to manage a single pool of tasks to be completed. So yes, there are good reasons for using a shared dispatch queue.

What is good practice for using NSOperation and NSOperationQueue?

I have just started using NSOperation and NSOperationQueue. I really want to know what the best practice is for using them.
Should I create only one queue for the whole app, or should I create one queue for each specific type of task? What is the effect if I create many queues?
It depends on what you're using them for. If you're using them for performance, maxing out the CPU, then there is little point going much beyond how many cores you expect to have.
If you're offloading processing just to keep the UI responsive, but you're not particularly concerned about maxing out performance, I'd just use the default background queue for simplicity.
If you're using sequential background queues to trigger actions on external events (network request completion, for example), then it doesn't do any harm to use a bunch, as they won't lead to a lot of threads trying to run concurrently.
There are many instances where you can simplify the design using an operation queue.
NSOperationQueues are built on top of GCD, and AFAIK, there is no performance impact of having loads of queues, as long as they're not trying to execute concurrently.
Read up on the Executing Operations section of the Apple Concurrency Guide. It depends, of course, on your implementation and application, but in summary there can only be a certain number of NSOperations running concurrently, and increasing the number of NSOperationQueues won't bring you any benefit. So my recommendation would be to opt for a single queue (or as few as possible) if your design allows it.
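As a hedged sketch of that recommendation, a single app-wide queue whose width is tied to the number of cores; the AppQueues name and label are illustrative:

```swift
import Foundation

enum AppQueues {
    static let background: OperationQueue = {
        let queue = OperationQueue()
        queue.name = "com.example.app.background"
        queue.maxConcurrentOperationCount = ProcessInfo.processInfo.activeProcessorCount
        queue.qualityOfService = .utility
        return queue
    }()
}

// Every part of the app submits its work to the same queue:
AppQueues.background.addOperation { /* parse a response, resize an image, ... */ }
```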
