I've inherited some code that throws any operation it can onto the main UI thread. For instance, when a network operation returns, the completion blocks are automatically run on the main thread and the programmers mix UI code with network requests willy-nilly.
How can I quantify the load or stress on the main UI thread, so that I can prove to a manager that putting everything on the main thread is a bad practice? I am hoping for a programmatic way to get this datum, because I also want to log it.
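One programmatic signal you could log is main-queue scheduling latency: how long a trivial block sits on the main queue before it actually runs. Below is a minimal sketch of that idea; the MainThreadWatchdog type, the interval, and the threshold are my own illustrative choices, not an Apple API. (MetricKit also reports hang data on recent OS versions, if you want system-collected numbers.)

import Foundation

// Illustrative helper, not an Apple API: measures how long a no-op block waits
// on the main queue before it runs. Large latencies mean the main thread is congested.
final class MainThreadWatchdog {
    private let timer = DispatchSource.makeTimerSource(queue: DispatchQueue.global(qos: .utility))

    func start(interval: TimeInterval = 1.0, threshold: TimeInterval = 0.1) {
        timer.schedule(deadline: .now(), repeating: interval)
        timer.setEventHandler {
            let enqueued = Date()
            DispatchQueue.main.async {
                let latency = Date().timeIntervalSince(enqueued)
                if latency > threshold {
                    // Replace print with your logging of choice.
                    print("Main queue latency: \(Int(latency * 1000)) ms")
                }
            }
        }
        timer.resume()
    }
}

Start it once at launch and keep a reference to it; the log then gives you a concrete number to show a manager: how often, and by how much, the main thread falls behind.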
I ran into a problem while describing the flow of OperationQueue to a non-iOS developer. We already know that with an OperationQueue we can spin up as many threads as we want to execute tasks synchronously or asynchronously.
But in practice, some people want proof that the OperationQueue's work is executed in the background and not on the UI (main) thread.
I just want to demonstrate that when an operation queue starts, it begins executing in the background.
I have already pointed out that when we set a qos on the OperationQueue we create, it accepts all of the global queue's QoS classes, viz. default, userInitiated, userInteractive, background and utility.
So that is already a good example in code suggesting that all the OperationQueue operations mentioned run on a global (background) thread, unless we use OperationQueue.main.
As Shadowrun said, you can add assertions for Thread.isMainThread. Also, if you ever want to add a test that you are not on the main queue, you can add a precondition:
dispatchPrecondition(.notOnQueue(.main))
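For instance, a tiny demonstration you could show the skeptics (just a sketch; the queue and its QoS are arbitrary):

import Foundation

let queue = OperationQueue()
queue.qualityOfService = .userInitiated

queue.addOperation {
    // Both checks pass for any operation queue other than OperationQueue.main.
    assert(!Thread.isMainThread, "expected to be off the main thread")
    dispatchPrecondition(.notOnQueue(.main))
    print("on main thread? \(Thread.isMainThread)")   // prints false
}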
But it should be noted that the whole purpose of creating an operation queue is to get things off the main thread. E.g., the old Concurrency Programming Guide: Operation Queues says [emphasis added]:
Operations are designed to help you improve the level of concurrency in your application. Operations are also a good way to organize and encapsulate your application’s behavior into simple discrete chunks. Instead of running some bit of code on your application’s main thread, you can submit one or more operation objects to a queue and let the corresponding work be performed asynchronously on one or more separate threads.
This is not to say that operation queues cannot contribute to main thread responsiveness problems. You can wait from the main thread for operations added to an operation queue (which is obviously a very bad idea). Or, if you neglect to set a reasonable maxConcurrentOperationCount and have thread explosion, that can introduce all sorts of unexpected behaviors. Or you can entangle the main thread with operation queues through a misuse of semaphores or dispatch groups.
But operations on an operation queue (other than OperationQueue.main) simply do not run on the main thread. Technically, you could get yourself in trouble if you started messing with the target queue of the underlying queue, but I can’t possibly imagine that you are doing that. If you are having main thread problems, your problem undoubtedly rests elsewhere.
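To make those pitfalls concrete, here is a sketch of the pattern to avoid and the asynchronous alternative; expensiveWork and updateUI(with:) are placeholders for your own code:

import Foundation

func expensiveWork() -> Int {                 // placeholder for your real work
    return (1...1_000_000).reduce(0, +)
}

func updateUI(with result: Int) {             // placeholder for your real UI update
    print("result ready: \(result)")
}

let queue = OperationQueue()
queue.maxConcurrentOperationCount = 4         // bound concurrency to avoid thread explosion

// Anti-pattern: blocking the main thread until the queue drains.
// queue.waitUntilAllOperationsAreFinished()

// Better: let the work run in the background and hop back to the main queue when done.
queue.addOperation {
    let result = expensiveWork()
    OperationQueue.main.addOperation {
        updateUI(with: result)
    }
}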
I'm working on a game to which I want to add cloud saves via GameKit. The original save code is based on synchronous file I/O and blocks the main queue. Moving away from this design would be a massive amount of work. Unfortunately, it seems like the GameKit APIs dispatch their callbacks on the main queue, which causes a deadlock in this case.
Given this, is there a way to manually process the blocks in a dispatch queue? That way the blocking code could process the main queue while waiting for the callbacks, eliminating the deadlock.
You can set up a chain of responsibility using Operations. Set up the dependencies between each operation, then drop the operations onto their respective queues using Grand Central Dispatch.
UI code should run on the main queue, and there are Quality of Service classes for background tasks and user-initiated activity. You can create your own dispatch queues, make them serial or concurrent, and submit work to them synchronously or asynchronously.
If you include code, I will post examples. Otherwise, there are several options for how to do the above.
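In the meantime, here is a minimal sketch of that idea, assuming hypothetical load/merge/notify steps: dependencies define the order, and only the UI step goes on the main queue.

import Foundation

let workQueue = OperationQueue()                 // background queue for the heavy lifting
workQueue.qualityOfService = .userInitiated

let load = BlockOperation { print("loading local save...") }
let merge = BlockOperation { print("merging with cloud save...") }
let notifyUI = BlockOperation { print("refreshing UI on the main queue") }

// Dependencies are honored even across queues.
merge.addDependency(load)
notifyUI.addDependency(merge)

workQueue.addOperations([load, merge], waitUntilFinished: false)
OperationQueue.main.addOperation(notifyUI)       // UI work stays on the main queue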
I'm building an application with Electron. What I'm seeing is that when the main thread gets busy running its own operations, the BrowserWindow's thread gets blocked (just as it does when the BrowserWindow itself is running JavaScript).
Are these sharing the same thread? If so, what is the best way to separate them?
First of all, it's not really Electron's main thread. It'd be more accurate to say that it's Node's thread.
Second, the Main process' main thread is used (among other things of course) to communicate between the Main process and the Renderer process that's used by the BrowserWindow, so if your main thread is performing a large synchronous operation, your main thread will block, and that could certainly affect the responsiveness of your window.
what is the best way to separate them?
I can't really provide a general solution that will be useful in all cases. You should present a specific example. What is your main thread busy doing?
You can look into using Web Workers.
In short, yes, Electron’s main thread can (somewhat counter-intuitively) “block” renderer UI.
Apparently[0], Electron relies heavily on ongoing main-renderer communication in the background, on top of any explicit IPC calls you make. So if the main thread is blocked on some operation, the UI will lag badly even if your own IPC calls are not blocked.
Ways around this:
You could use Node worker threads[1] in main.
You could use Web Workers, with or without spinning “hidden renderers”.
See also
[0] The Horror of Blocking Electron’s Main Process
[1] Gotchas on using Node worker threads in Electron: thread in #18540
I've worked in Java and am pretty clear on how threads and thread pools work.
I was wondering if anyone can explain how threads are created and allocated space in the thread pool in Swift.
Also, does
DispatchQueue.main.async {
// some code
}
create a new thread, or asynchronously execute the task?
Thanks in advance =)
Queues and threads are separate concepts. Queues are ordered (sometimes prioritized) sequences of blocks to execute. As (mostly) an implementation detail, blocks must be scheduled onto threads in order to execute, but this is not the major point of them.
So DispatchQueue.main.async dispatches (appends) a block to the main queue. The main queue is serial and somewhat special in that it is promised to also be run exclusively on the main thread (as noted by Paulw11). It also promises to be associated with the main run loop. Understanding this "appends a block to a queue" concept is critical, because it has a significant impact on how you design things in queues vs. how you design things in threads. async does not mean "start running this now." It means "stick this on a queue, but don't wait for it."
As a good example of how the designs can be different, placing something on a queue doesn't mean it will ever run (even without bugs or deadlocks). It is possible and useful to suspend queues so that it stops scheduling blocks. It's possible to tie queues to other queues so that when a queue "schedules" something, it just puts it onto another queue rather than executing it. There are lots of things you can do with queues unrelated to "run things in the background." You can attach completion handlers to blocks. You can use groups to wait on collections of blocks. GCD is a way of thinking about concurrency. Parallelism is just a side benefit. (A great discussion of this concept is Concurrency is not parallelism by Rob Pike. It's in Go, but the concepts still apply.)
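For example, a small sketch of two of those capabilities, suspending a queue and attaching a completion to a group of blocks (the queue label is arbitrary):

import Foundation

let queue = DispatchQueue(label: "com.example.work")
let group = DispatchGroup()

queue.suspend()                      // nothing below starts executing yet
queue.async(group: group) { print("task 1") }
queue.async(group: group) { print("task 2") }

group.notify(queue: .main) {         // completion handler for the whole batch
    print("all tasks finished")
}

queue.resume()                       // now the queue starts scheduling its blocks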
If you call DispatchQueue.main.async while running on the main queue, then that block is absolutely certain not to execute until the current block finishes. In UIKit and AppKit, "the current block finishes" often means "you return from a method that was called by the OS." While not implemented this way, you can pretend that every time you're called from the OS, it was wrapped in a call to DispatchQueue.main.async.
This is also why you must never call DispatchQueue.main.sync (note sync) from the main queue. That block will wait for you to return, and you'll wait until the block finishes. A classic deadlock.
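A tiny illustration of both points, assuming this code runs on the main queue of an app with a live run loop:

import Foundation

print("A")
DispatchQueue.main.async {
    print("C")   // enqueued now, but only runs after the current block returns
}
print("B")       // output: A, B, then C

// Never do this from the main queue: the block can't start until we return,
// and we won't return until the block finishes; a classic deadlock.
// DispatchQueue.main.sync { print("never reached") }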
As a rule, the thread pool is not your business in iOS. It is an implementation detail. Occasionally you need to think about it for performance reasons, but if you are thinking too much about it, you probably are designing your concurrency incorrectly.
If you're coming from Java, you definitely want to read Migrating Away From Threads in the Concurrency Programming Guide. It's the definitive resource for how to rethink thread-based patterns in queues.
Your code queues the block of code on the main queue (DispatchQueue.main) and returns immediately (.async), before executing the code.
You do not have control over which thread is used by the queue. Even if you create your own queue:
let serialQueue = DispatchQueue(label: "queuename")
serialQueue.async {
...
}
you do not know which thread your code will be running on.
Update:
As correctly stated by Paulw11 in the comment,
... if you dispatch a task on the main queue, it is guaranteed to execute on the main thread. If you dispatch a task on any other queue, you don't know which thread it will execute on; it may execute on the main thread or some other thread.
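A quick way to see this for yourself (the queue label is arbitrary):

import Foundation

DispatchQueue.main.async {
    print("main queue, on main thread? \(Thread.isMainThread)")      // always true
}

let backgroundQueue = DispatchQueue(label: "queuename")
backgroundQueue.async {
    print("own queue, on main thread? \(Thread.isMainThread)")       // usually false, but not guaranteed
}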
I'm having a bit of a problem. I want to display a progress form that just shows an animation while the main application performs heavy operations.
I've done this in a thread, and it works fine when the user isn't performing any operations. But it just stops when my main application is busy.
I'm not able to put Application.ProcessMessages in between the different lines of code, because I'm using 3rd-party components with heavy processing time.
My idea was to create a new process and, in that process, create a thread that executes the animation. Then the thread wouldn't stop executing when the main application performs heavy operations.
But as I see it, you can only create a new process by executing a new program.
Does anyone have a solution for how to make a thread continue executing even when the main application is busy?
/Brian
If your worker thread does not have a lower priority than the main thread, does not use the Synchronize() method, does not call SendMessage(), and does not try to acquire any synchronization object that the main GUI thread has already acquired, then it should continue to work.
Because the VCL isn't thread-safe, people often advise using Synchronize() to execute VCL-updating code synchronously in the context of the VCL thread. This, however, does not work if the VCL thread is itself busy: your worker thread will block until the main thread gets around to processing messages again.
Your application design is unfortunate, anyway. You should perform all lengthy operations in worker threads, and keep the main thread responsive for user interaction. Even with the fancy animation your app will appear hung to the user since it won't redraw while the VCL thread is busy doing other things and processes no messages. Try to put your lengthy code in worker threads and perform your animation in timer events in the main thread.
Your logic is backward. Your thread should be doing the "heavy work", and passing messages to your main application to update the progress or animation.
If you leave all the "heavy work" in your main application, the other thread won't get enough chances to execute, which means it won't get a chance to update anything. Besides, all access to the GUI (VCL controls) must happen in the application's main thread; the VCL isn't thread-safe. (Neither is Windows itself, when it comes to visual controls.)
If by "Does any one have a solution on how to make a thread continue executing even when the main application is busy?" you mean that main thread is busy you should move the code that is consumming main thread to another other thread. In other words main thread should be responsible for starting and stopping actions and not executing them.
Disclaymer:
Actually I don't know delphy but I think/hope the concepts are quite similar to C++ or C#.