I need to execute synchronous requests against an API using Swift. The requests must be queued: if one is already in progress and awaiting its response, it must not be cancelled or interrupted by the next request that enters the queue or is already in it.
Requests must be executed in the order they enter the queue (FIFO), and the next request must not start until the previous one has finished.
Also, every request in the queue must be executed until the queue is empty, and new requests can enter the queue at any time.
My plan is to implement a synchronous API client as a singleton that holds its own queue of pending requests. The requests must not freeze the UI; it has to stay responsive to user interaction at all times.
I know this can be done with semaphores, but unless you know exactly what you are doing and are completely sure how semaphores work, they are not the safest or best way to do it; otherwise potential bugs and crashes can appear.
I expect every request that enters the queue to be executed (in FIFO order, regardless of whether it returns success or an error as its response), with the UI updating immediately afterwards.
So, my question is: what is the best way to approach and solve this problem?
Thanks for your help and time.
You can create your own DispatchQueue and put your operations on it as DispatchWorkItems. It is serial by default. Just remember to call your completions on DispatchQueue.main if you plan to update the UI.
John Sundell has a wonderful article about DispatchQueues here:
https://www.swiftbysundell.com/articles/a-deep-dive-into-grand-central-dispatch-in-swift/
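For illustration, a minimal sketch of that idea could look like the following. The class name, the queue label and the use of URLSession are placeholders rather than a definitive implementation; each work item waits on a DispatchGroup for its response, which blocks only the background worker thread, never the main thread.

import Foundation

final class SyncAPIClient {
    static let shared = SyncAPIClient()
    private init() {}

    // A private serial queue: enqueued work items run strictly one at a
    // time, in FIFO order, off the main thread.
    private let queue = DispatchQueue(label: "com.example.SyncAPIClient")

    func enqueue(_ request: URLRequest,
                 completion: @escaping (Result<Data, Error>) -> Void) {
        queue.async {
            // Block this background worker (never the main thread) until
            // the current request has finished.
            let group = DispatchGroup()
            var result: Result<Data, Error> = .failure(URLError(.unknown))

            group.enter()
            URLSession.shared.dataTask(with: request) { data, _, error in
                if let error = error {
                    result = .failure(error)
                } else {
                    result = .success(data ?? Data())
                }
                group.leave()
            }.resume()
            group.wait()

            // Deliver the outcome on the main queue so the UI can update.
            DispatchQueue.main.async { completion(result) }
        }
    }
}

Because the queue is serial, the next request cannot start until the previous work item, including its wait for the response, has returned, which gives you the FIFO behaviour without exposing a semaphore in your own code.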
Related
As we know, Dart is a single-threaded language. According to the documentation, we can use Future/Stream to implement an async operation; it sends the time-consuming operation to the event queue.
What confuses me is where the event queue actually runs. Does it run on the Dart thread? If so, won't it block the app?
Another question: is the event queue a FIFO queue? Suppose I have two operations, one a networking request that takes one minute and the other a click event, and both are sent to the event queue.
Will the click event then be blocked by the networking request, because the queue is FIFO?
So where does the event queue run?
Thank you very much!
One thing to note is that asynchronous and multithreading are two different things. Dart uses Futures and async/await to achieve asynchronicity, but Dart is still inherently a single-threaded language.
The way it works is when a Future is created (either manually or via calling an async method), that process is added to an event queue, as you read. Then, in the middle of all the synchronous execution, whenever there is a lull, the event queue can take priority. It can then go through the processes and figure out if any of the Futures have been completed. If so, the result is passed along to any other asynchronous processes that are waiting on that resource, if any.
This also means that, yes, if your program hangs in the middle of an asynchronous operation (with the easy example of an endless loop via while (true) {}), it will freeze the entire program, including the synchronous code and other asynchronous processes still waiting to resolve (even if the conditions allowing them to resolve have already occurred).
However, in your case, this won't be an issue. If you fire an asynchronous process in the form of a network request followed by another in the form of a "click event" (not sure what you're referring to, but I'll assume it's asynchronous as well), they will both be added to the event queue in that order. But if the click event resolves before the network request, the event queue will merely recognize that the network request Future has not yet resolved and will move on to the click event that has.
As a side note, it's worth noting that Dart does have a multi-threading capability, albeit in a fairly roundabout way. Dart has something called an Isolate, which isn't a thread but a completely separate child program. This means that the Isolate cannot access any of the same data in memory as the root program itself. However, data can be passed between the two using SendPorts and ReceivePorts. This makes using Isolates slightly more complicated than threads, but it also means that, if no memory is shared, it virtually eliminates race conditions based on which thread accesses the memory first.
I want to know the proper way to use [self.operationQueue cancelAllOperations];
I am using a block operation, run asynchronously, to fetch results from my API. Sometimes I get the result of the first request after the second one, and that stale result is displayed to the user.
I am using the AFNetworking library for the operations. Any suggestion on how I can make sure that only one request (the latest one) is active at any given time, and that the previous one gets cancelled automatically?
When all operations in a queue are cancelled, it is the responsibility of each running operation to stop itself; the queue will only prevent operations that haven't started yet from starting. With block operations there isn't really any way to stop, as the block doesn't have access to the operation to check whether it has been cancelled.
It isn't clear exactly what you're using the operation for, but you would need to create an operation subclass, either to run or to wrap that logic, that at least checks for cancellation before running the final callback to return the result.
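As a rough Swift sketch of that idea (the names and the use of URLSession are placeholders, and a fuller version would also cancel the in-flight task when the operation is cancelled):

import Foundation

final class FetchOperation: Operation {
    private let request: URLRequest
    private let completion: (Data?, Error?) -> Void

    init(request: URLRequest, completion: @escaping (Data?, Error?) -> Void) {
        self.request = request
        self.completion = completion
        super.init()
    }

    override func main() {
        if isCancelled { return }            // cancelled before it even started

        // Perform the request and block this background operation thread
        // until the response arrives.
        let group = DispatchGroup()
        var data: Data?
        var error: Error?

        group.enter()
        URLSession.shared.dataTask(with: request) { d, _, e in
            data = d
            error = e
            group.leave()
        }.resume()
        group.wait()

        if isCancelled { return }            // cancelled while in flight: drop the result
        completion(data, error)
    }
}

With something like this, cancelAllOperations still won't abort the network transfer itself, but stale results are never delivered to the UI, which is usually what matters.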
I am in the process of converting a JavaScript-based hybrid app to a native iOS app. When I started developing the app with JavaScript, I was disappointed to find out that if you want to make an HTTP request, you have to do it asynchronously. I tried to get around this in various ways, basically:
var done = false;
$.post(url, data, function() { done = true; });
while (!done) {}
//Continue
But I came to find that this is ugly and just plain bad practice, so I got over it and just did it asynchronously.
So when I started with iOS I was excited with the idea that I might be able to do it synchronously, but again I was disappointed to find that the recommended practices are asynchronous, favoring closures or delegates to handle responses.
My question has two parts:
Why is it such common practice in almost every case for HTTP requests to be made asynchronously instead of synchronously?
Is there a way to make synchronous requests in iOS that isn't ugly or problematic?
Essentially, I've always wanted to be able to do something like:
var response = SubmitHTTPPostRequest(url, data)
Is this not really a thing? I never learned this kind of thing in school, so I apologize if this is a rudimentary question. I've just never understood why this is the way it's typically done.
You need to understand the whole process from sending a request to getting a response. The request will most likely go through a network adapter, to some server, back through the adapter and then back to your CPU. There is essentially never a case where only one processor is involved; in the path I just described there are at least three, and usually more. That means true synchronisation, in the sense of doing all the work in one process, is impossible, because multiple processors are involved. The only path to synchronisation (as already mentioned) is for your current thread to wait. That won't necessarily freeze your UI, but it will freeze your thread (which freezes the UI if that thread is the main thread). And even pushing the whole process onto another thread that waits for the response raises many other issues and questions, such as "should I create a thread for each request?" and "what about memory consumption if responses take too long to return?"
I understand that you want this synchronisation so you can do the whole operation in a single method, but in the end that is exactly what produces ugly code. Your method then creates the request, gets the response, processes the response and processes the received data, all in one place. This might seem like a good idea at the beginning, but when the method becomes too long you will want to refactor it into at least three methods, which is, by coincidence, exactly what you have to do with an asynchronous request anyway. So to answer your second question: very unlikely; the asynchronous approach looks much less ugly.
What you should do, and what is done in most cases, is create a class that handles your requests and responses, so that from the UI part of your code you only need to make a single call. Let's say you have a table view that displays a list of friends fetched from some social network. When the user first opens this list, you show an activity indicator to tell them the data is loading, then send an asynchronous request to get the friends without caring when, or whether, the response returns; when it does arrive, you simply remove the activity indicator and reload the table view with the new data (see the sketch after this answer). I hope you can see this is very elegant code, and it also allows the user to cancel the request just by pressing back.
So the main reason for making requests asynchronously is not to block threads, because that can generate multiple issues, including blocking the main thread, which blocks the UI; and if the main thread is blocked for too long, iOS kills the application (the watchdog). And the reasons to do it synchronously? In the long term I can't think of any; you should always break operations into several methods and use callbacks.
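Here is a rough sketch of that table view pattern. Friend and FriendsAPI are hypothetical stand-ins for whatever model and client you actually have, and the URL is a placeholder.

import UIKit

struct Friend { let name: String }

final class FriendsAPI {
    static let shared = FriendsAPI()

    func fetchFriends(completion: @escaping (Result<[Friend], Error>) -> Void) {
        // Placeholder endpoint; a real implementation would decode the
        // response body into [Friend].
        let request = URLRequest(url: URL(string: "https://example.com/friends")!)
        URLSession.shared.dataTask(with: request) { _, _, error in
            if let error = error { completion(.failure(error)); return }
            completion(.success([Friend(name: "Alice"), Friend(name: "Bob")]))
        }.resume()
    }
}

final class FriendsViewController: UITableViewController {
    private let spinner = UIActivityIndicatorView(style: .medium)
    private var friends: [Friend] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        spinner.startAnimating()
        tableView.backgroundView = spinner

        // Fire the request and return immediately; the UI stays responsive.
        FriendsAPI.shared.fetchFriends { [weak self] result in
            DispatchQueue.main.async {
                guard let self = self else { return }  // screen was dismissed
                self.spinner.stopAnimating()
                if case .success(let friends) = result {
                    self.friends = friends
                    self.tableView.reloadData()
                }
            }
        }
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        friends.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = UITableViewCell(style: .default, reuseIdentifier: nil)
        cell.textLabel?.text = friends[indexPath.row].name
        return cell
    }
}

The view controller fires the request and returns immediately, so scrolling and the back button keep working while the data loads.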
First of all, you should be very clear about the terms synchronous and asynchronous.
When a synchronous request is sent, the caller has to wait for the request to complete before doing anything else.
An asynchronous request does not make the caller wait for it to finish.
As per a Stack Overflow answer I read once:
When an HttpHandler is called, a thread pool thread is used to run that request and the same thread is used to process the entire request. If that request calls out to a database or another web service or anything else that can take time, the thread pool thread waits. This means thread pool threads spend time waiting on things when they could be used to process other requests.
In contrast, when an HttpAsyncHandler is used, a mechanism exists that allows the request to register a callback and return the thread pool thread to the pool before the request is fully processed. The thread pool thread starts doing some processing for the request; at that point, the thread that was processing the HTTP request is returned to the pool to process another HTTP request.
Your answers:
1. Because asynchronous requests do not wait for the task to complete: you send the request and, in the meantime, the thread can perform other work without waiting. I use ASIHTTPRequest in my iOS app.
2. You can send requests synchronously, but it is not common practice these days.
I use AFNetworking as the networking library in my app. Due to back-end restrictions, I cannot send two requests simultaneously when the app starts, because the server will raise a CookieTheftException (Grails). After a first successful connection I can make as many simultaneous requests as I want, but the first one needs to be serial.
How can I achieve that?
I thought of using a semaphore, but I can't block the main thread.
Edit 1
I tried the override below, but it didn't work. I think the operation queue doesn't wait for one request to finish (including its callback) before starting the next.
- (void)enqueueHTTPRequestOperation:(AFHTTPRequestOperation *)operation
{
    [self.operationQueue setMaxConcurrentOperationCount:1];
    [super enqueueHTTPRequestOperation:operation];
}
Edit 2
I realized that maxConcurrentOperationCount worked, and in fact only one operation is executed at a time. The problem is that the enqueued request has already been created without the cookies the server needs.
I don't know anything about Grails or the specific architecture of your system, but perhaps this could be solved by simply turning off cookies on that request, with NSMutableURLRequest -setHTTPShouldHandleCookies:.
Other than that, the best way to ensure that only one request operation is ever running for that initial call would be to ignore queues altogether, and simply have an AFHTTPRequestOperation property on your AFHTTPClient subclass. You could even get fancy with KVO to ensure that the operation queue is suspended until that initial request is finished.
I would recommend reading about GCD.
You can create your own dispatch queue and submit blocks to be executed on it.
This way:
It won't block the main thread.
Since all your networking blocks will be executed on one serial queue, no two blocks can ever run simultaneously.
You could set the maximum concurrent operations of the queue to 1. That way only one request will be made at a time.
[self.httpClient.operationQueue setMaxConcurrentOperationCount:1];
But since you only need to wait for the first request, why not just fire that request on its own, and only kick off the other requests once the first one has completed?
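A minimal sketch of that idea, using URLSession instead of AFNetworking and placeholder URLs, would look something like this:

import Foundation

let loginRequest = URLRequest(url: URL(string: "https://example.com/login")!)
let otherRequests = [
    URLRequest(url: URL(string: "https://example.com/profile")!),
    URLRequest(url: URL(string: "https://example.com/friends")!)
]

URLSession.shared.dataTask(with: loginRequest) { _, _, error in
    guard error == nil else { return }  // handle the login failure as appropriate
    // With the default configuration, the session has now stored the server's
    // cookies, so the remaining requests can safely run concurrently.
    for request in otherRequests {
        URLSession.shared.dataTask(with: request) { _, _, _ in
            // handle each response here
        }.resume()
    }
}.resume()

Because URLSession stores the cookies from the first response in HTTPCookieStorage by default, the follow-up requests no longer trip the server's check.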
What is the difference between adding an operation that makes a synchronous NSURLConnection request to an NSOperationQueue (or making a synchronous request from a thread other than the main thread), and making an asynchronous request from the main thread?
Neither will block the main thread, so the UI will remain responsive, but is there any advantage of one over the other? I know that with the latter approach I can track request progress and so on, but assume that progress and other HTTP details are not important here.
They are very similar. The biggest problem with synchronous requests is that they can't easily be cancelled. Depending on your application, that could be a problem: imagine you are downloading a big document and the user moves to another screen, so you no longer need that data. In our case, I actually chose to run asynchronous NSURLConnections on a secondary NSThread, which may be overkill for some apps. It is more complicated, but it gives us the ability both to cancel requests and to decode the JSON/XML/image data on secondary threads so they don't impact main-thread user interactivity.
Asynchronous requests are scheduled on the run loop and set up as a run loop source, so your code is triggered automatically only when data is received from the network (like any socket source).
A synchronous request running on an NSThread monopolizes that thread just to monitor the incoming data, which is generally overkill.
You can always cancel an NSURLConnection even if it has been executed asynchronously, using the cancel method.
I bet the new API that lets you send an asynchronous request on an NSOperationQueue (+sendAsynchronousRequest:queue:completionHandler:) uses GCD and dispatch_source_create, or something similar, under the hood, so that it behaves the same way as an NSURLConnection scheduled on the run loop and avoids using an additional thread (watch the WWDC '12 videos, which explain why threads are evil and their usage should be minimized); the only difference is that it allows you to use a block to be informed upon completion instead of the delegate mechanism.
Some years ago I created a class that wrapped NSURLConnection's asynchronous calls and delegate management in a nice block-based API (see OHURLLoader on my GitHub), which makes it easier to use (feel free to take a look). I bet the new API that uses NSOperationQueues follows the same principle, still performing asynchronous requests on the run loop but letting you use blocks instead of having to implement a delegate.
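For reference, the block-based call looks roughly like this in Swift. NSURLConnection's API has since been deprecated in favour of URLSession, and the URL here is only a placeholder.

import Foundation

let request = URLRequest(url: URL(string: "https://example.com/api")!)

// Deprecated block-based NSURLConnection API: no delegate, just a completion
// block delivered on the queue you pass in.
NSURLConnection.sendAsynchronousRequest(request, queue: .main) { response, data, error in
    print("Received \(data?.count ?? 0) bytes, error: \(String(describing: error))")
}

// The URLSession equivalent has the same shape.
URLSession.shared.dataTask(with: request) { data, response, error in
    print("Received \(data?.count ?? 0) bytes, error: \(String(describing: error))")
}.resume()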
The historical position was that there's an advantage in power consumption, and therefore battery life, in asynchronous requests — presumably including both the older delegate approach and the new block-based approach.