GCD :
A low-level C API for achieving concurrency.
dispatch_async :
- Returns immediately; the current thread does not wait for the task (block of code) to execute.
dispatch_sync :
- Does not return until the block has completed.
- Blocks the current thread, not the current queue.
- May result in deadlock.
Refs :
[https://www.appcoda.com/ios-concurrency/]
[https://www.raywenderlich.com/60749/grand-central-dispatch-in-depth-part-1]
[https://developer.apple.com/documentation/dispatch]
Advantages of learning or using concurrency:
•Utilize iOS devices’ hardware: All iOS devices now have a multi-core processor that allows developers to execute multiple tasks in parallel. You should utilize this feature and get the most out of the hardware.
•Better user experience: You have probably written code that calls web services, handles I/O, or performs other heavy tasks. As you know, doing these kinds of operations on the UI thread will freeze the app, making it unresponsive. Once users face this situation, the first thing they will do is kill/close your app without a second thought. With concurrency, all of these tasks can be done in the background without hanging the main thread or disturbing your users. They can still tap buttons, scroll, and navigate through your app while it handles the heavy work in the background.
•APIs like NSOperation and dispatch queues make concurrency easy: Creating and managing threads is not an easy task. This is why many developers get scared when they hear the terms concurrency and multi-threaded code. In iOS we have great, easy-to-use concurrency APIs that make your life easier. You don’t have to create threads or manage any low-level details; the APIs do all of that for you. Another important advantage of these APIs is that they help you achieve synchronization easily to avoid race conditions. A race condition happens when multiple threads access a shared resource at the same time, leading to unexpected results. By using synchronization, you protect a shared resource from being accessed by multiple threads at once.
Terminology:
Serial vs Concurrent
Tasks executed serially are always executed one at a time.
Tasks executed concurrently might be executed at the same time.
Synchronous vs Asynchronous
A synchronous function returns only after the completion of the task that it orders.
An asynchronous function, on the other hand, returns immediately, ordering the task to be done but not waiting for it. Thus, an asynchronous function does not block the current thread of execution from proceeding on to the next function.
Concurrency vs Parallelism
Multi-core devices execute multiple threads at the same time via parallelism; however, in order for single-core devices to achieve this, they must run a thread, perform a context switch, then run another thread or process. This usually happens quickly enough to give the illusion of parallel execution.
Critical Section
This is a piece of code that must not be executed concurrently, that is, from two threads at once. This is usually because the code manipulates a shared resource, such as a variable, that can become corrupt if it’s accessed by concurrent processes.
Race Condition
This is a situation where the behavior of a software system depends on a specific sequence or timing of events that execute in an uncontrolled manner, such as the exact order of execution of the program’s concurrent tasks. Race conditions can produce unpredictable behavior that isn’t immediately evident through code inspection.
Deadlock
Two (or sometimes more) items — in most cases, threads — are said to be deadlocked if they all get stuck waiting for each other to complete or perform another action. The first can’t finish because it’s waiting for the second to finish. But the second can’t finish because it’s waiting for the first to finish.
Thread Safe
Thread safe code can be safely called from multiple threads or concurrent tasks without causing any problems (data corruption, crashing, etc). Code that is not thread safe must only be run in one context at a time. An example of thread safe code is NSDictionary. You can use it from multiple threads at the same time without issue. On the other hand, NSMutableDictionary is not thread safe and should only be accessed from one thread at a time.
Context Switch
A context switch is the process of storing and restoring execution state when you switch between executing different threads on a single process. This process is quite common when writing multitasking apps, but comes at a cost of some additional overhead.
Options to Achieve Multithreading in iOS :
Option 1: GCD (Grand Central Dispatch)
GCD is the most commonly used API to manage concurrent code and execute operations asynchronously at the Unix level of the system. GCD provides and manages queues of tasks.
A. Dispatch_async queue :
Submits a block for asynchronous execution and returns immediately. GCD runs the block on a separate thread from its pool; the calling (current) thread does not wait for the block to complete.
A.1 Serial Queue
The queue can only execute one task at a time.
Serial queues are awesome for managing a shared resource. It provides guaranteed serialized access to the shared resource and prevents race conditions. Imagine that there is a single ticket booth but there are a bunch of people who want to buy cinema tickets; here the staff at the booth is a shared resource. It’ll be chaotic if the staff has to serve these people all at the same time. To handle this situation, people are required to queue up (serial queue), so that the staff can serve the customers one at a time.
Again, it doesn’t mean the cinema can only handle one customer at a time. If it sets up two more booths, it can serve three customers at one time. This is why I said you can still perform multiple tasks in parallel by using several serial queues.
The advantages of using serial queues are:
1. Guaranteed serialized access to a shared resource, which avoids race conditions.
2. Tasks are executed in a predictable order: when you submit tasks to a serial dispatch queue, they are executed in the same order as they were inserted.
3. You can create any number of serial queues.
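As a sketch of the above (the queue label is illustrative), creating your own serial queue and submitting tasks to it looks like this; the second block will not start until the first has finished:

```objc
// Passing NULL as the attribute creates a serial queue.
dispatch_queue_t serialQueue = dispatch_queue_create("com.example.serialQueue", NULL);

dispatch_async(serialQueue, ^{
    printf("task 1\n"); // runs first
});
dispatch_async(serialQueue, ^{
    printf("task 2\n"); // runs only after task 1 completes
});
```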
A.2 Concurrent Queue
Concurrent queues allow you to execute multiple tasks (blocks of code) in parallel.
Concurrent queues guarantee that tasks start in the same order they were added, but you will not know the order of completion, the execution time, or the number of tasks running at a given point.
For example, suppose you submit three tasks (task #1, #2 and #3) to a concurrent queue. The tasks are started in the order in which they were added to the queue, but they execute concurrently and their execution and finish times vary. Even if it takes some time for task #2 and task #3 to start, they can both complete before task #1. It’s up to the system to decide the execution of the tasks.
Using above queues in real life:
By default, the system provides each application with a single serial queue and four concurrent queues. The main dispatch queue is the globally available serial queue that executes tasks on the application’s main thread. It is used to update the app UI and perform all tasks related to the update of UIViews. There is only one task to be executed at a time and this is why the UI is blocked when you run a heavy task in the main queue.
Besides the main queue, the system provides four concurrent queues. We call them Global Dispatch queues. These queues are global to the application and are differentiated only by their priority level. To use one of the global concurrent queues, you have to get a reference of your preferred queue using the function dispatch_get_global_queue which takes in the first parameter one of these values:
•DISPATCH_QUEUE_PRIORITY_HIGH
•DISPATCH_QUEUE_PRIORITY_DEFAULT
•DISPATCH_QUEUE_PRIORITY_LOW
•DISPATCH_QUEUE_PRIORITY_BACKGROUND
These queue types represent the priority of execution. The queue with HIGH has the highest priority and BACKGROUND has the lowest priority. So you can decide the queue you use based on the priority of the task. Please also note that these queues are being used by Apple’s APIs so your tasks are not the only tasks in these queues.
Lastly, you can create any number of serial or concurrent queues. In the case of concurrent queues, I highly recommend using one of the four global queues, though you can also create your own.
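The common pattern described above — heavy work on a global queue, then hopping back to the main queue for UI updates — can be sketched as follows (the work itself is a placeholder):

```objc
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // perform the heavy task off the main thread here

    dispatch_async(dispatch_get_main_queue(), ^{
        // back on the main queue: safe to update UIViews here
    });
});
```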
Example for execution sequence for this code :
dispatch_async(_serialQueue, ^{
printf("1");
});
printf("2");
dispatch_async(_serialQueue, ^{
printf("3");
});
printf("4");
It may print 1234, 1243, 2134, 2143 or 2413, but 1 is always printed before 3.
It also won't print 1324, because printf("3") is dispatched after printf("2") is executed, and a task can only be executed after it has been dispatched.
B. Dispatch_sync queue:
Submits a block to a dispatch queue for synchronous execution. Unlike dispatch_async, this function does not return until the block has finished. Calling this function and targeting the current queue results in deadlock. Note that dispatch_sync blocks the current thread, not the current queue.
Example for execution sequence :
dispatch_sync(_serialQueue, ^{ printf("1"); });
printf("2");
dispatch_sync(_serialQueue, ^{ printf("3"); });
printf("4");
It always prints 1234.
Using a serial queue ensures the serial execution of tasks. The only difference is that dispatch_sync returns only after the block has finished, whereas dispatch_async returns as soon as the block is added to the queue, possibly before it has finished.
There are 4 cases we can derive :
async - concurrent: the code runs on a background thread. Control returns immediately to the main thread (and UI). The block can't assume that it's the only block running on that queue
async - serial: the code runs on a background thread. Control returns immediately to the main thread. The block can assume that it's the only block running on that queue
sync - concurrent: the code runs on a background thread but the main thread waits for it to finish, blocking any updates to the UI. The block can't assume that it's the only block running on that queue (I could have added another block using async a few seconds previously)
sync - serial: the code runs on a background thread but the main thread waits for it to finish, blocking any updates to the UI. The block can assume that it's the only block running on that queue
Caution : You should never call the dispatch_sync or dispatch_sync_f function from a task that is executing on the same queue that you are planning to pass to the function. This is particularly important for serial queues, which are guaranteed to deadlock, but should also be avoided for concurrent queues.
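A minimal sketch of the deadlock this caution describes (the queue label is illustrative): the outer dispatch_sync occupies the serial queue while waiting for the inner block, which can therefore never start.

```objc
dispatch_queue_t queue = dispatch_queue_create("com.example.serial", NULL);

dispatch_sync(queue, ^{
    // This inner dispatch_sync targets the queue we are already running on.
    // The serial queue cannot start the inner block until the outer block
    // returns, and the outer block cannot return until the inner block
    // finishes: deadlock.
    dispatch_sync(queue, ^{
        printf("never reached\n");
    });
});
```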
When you create a queue with dispatch_queue_create, passing NULL as the attribute parameter creates a serial queue.
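For comparison, both creation forms sketched side by side (labels illustrative); DISPATCH_QUEUE_SERIAL is defined as NULL, so the first two are equivalent:

```objc
dispatch_queue_t serialA    = dispatch_queue_create("com.example.serialA", NULL);
dispatch_queue_t serialB    = dispatch_queue_create("com.example.serialB", DISPATCH_QUEUE_SERIAL);
dispatch_queue_t concurrent = dispatch_queue_create("com.example.concurrent", DISPATCH_QUEUE_CONCURRENT);
```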
Option 2 : Operation Queues
Operation queues are built on top of GCD.
How it is different from GCD ?
1. Don’t follow FIFO: in operation queues, you can set an execution priority for operations and you can add dependencies between operations, which means you can specify that some operations will only be executed after the completion of others. This is why they don’t follow first-in-first-out order.
2. By default, they operate concurrently: while you can’t change their type to serial, there is still a workaround to execute tasks in an operation queue in sequence, by using dependencies between operations.
3.Operation queues are instances of class NSOperationQueue and its tasks are encapsulated in instances of NSOperation.
NSOperation is an abstract class, so we need to subclass it to use it. The iOS SDK provides two concrete subclasses of NSOperation that we can use directly:
2.1 : NSBlockOperation : Use this class to initiate an operation with one or more blocks. The operation itself can contain more than one block, and the operation is considered finished when all of its blocks have executed.
2.2 : NSInvocationOperation : Use this class to initiate an operation that consists of invoking a selector on a specified object.
Advantages of NSOperation :
First, they support dependencies through the method addDependency(op: NSOperation) in the NSOperation class. When you need to start an operation that depends on the execution of the other, you will want to use NSOperation.
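A hedged Objective-C sketch of a dependency between two operations (the download/parse blocks are placeholders):

```objc
NSOperationQueue *queue = [[NSOperationQueue alloc] init];

NSBlockOperation *download = [NSBlockOperation blockOperationWithBlock:^{
    // fetch data (placeholder)
}];
NSBlockOperation *parse = [NSBlockOperation blockOperationWithBlock:^{
    // parse the downloaded data (placeholder)
}];

[parse addDependency:download]; // parse will not start until download finishes
[queue addOperations:@[download, parse] waitUntilFinished:NO];
```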
Secondly, you can change the execution priority by setting the property queuePriority with one of these values:
public enum NSOperationQueuePriority : Int {
case VeryLow
case Low
case Normal
case High
case VeryHigh
}
The operations with high priority will be executed first.
- You can cancel a particular operation or all operations for any given queue.
When an operation is cancelled, one of the following happens:
•The operation has already finished. In that case, the cancel method has no effect.
•The operation is currently executing. In that case, the system will NOT force your operation code to stop; instead, the cancelled property will be set to true.
•The operation is still in the queue waiting to be executed. In that case, it will not be executed.
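Because the system does not force a running operation to stop, a custom NSOperation subclass should check the cancelled property itself; a sketch (itemCount and the loop body are placeholders):

```objc
- (void)main {
    for (NSInteger i = 0; i < itemCount; i++) {
        if (self.isCancelled) {
            return; // stop cooperatively once cancelled
        }
        // process item i (placeholder)
    }
}
```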
NSOperation has three helpful boolean properties: finished, cancelled, and ready. finished is set to true once the operation’s execution is done. cancelled is set to true once the operation has been cancelled. ready is set to true once the operation is about to be executed.
Any NSOperation can be given a completion block to be called once the task finishes, i.e. when the finished property is set to true.
Option 3 : NSThread
Use this class when you want to have an Objective-C method run in its own thread of execution. Threads are especially useful when you need to perform a lengthy task, but don’t want it to block the execution of the rest of the application.
Creating Thread :
Use the detachNewThreadSelector:toTarget:withObject: class method to spawn the new thread.
Create a new NSThread object and call its start method.
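The two creation paths above, sketched in Objective-C (doWork: is an assumed method on self):

```objc
// 1. Detach and start immediately:
[NSThread detachNewThreadSelector:@selector(doWork:) toTarget:self withObject:nil];

// 2. Create the thread object, then start it when ready:
NSThread *thread = [[NSThread alloc] initWithTarget:self
                                           selector:@selector(doWork:)
                                             object:nil];
[thread start];
```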
Thread Pool :
A thread pool is a collection of worker threads that efficiently execute asynchronous callbacks on behalf of the application. The thread pool is primarily used to reduce the number of application threads and provide management of the worker threads.
NSOperation is better than NSThread because:
[a] For example, you may have to cache 100 image items at once. NSOperationQueue handles this by creating only about as many threads as there are cores, whereas with NSThread, creating 100 threads for 100 image items is overkill and inefficient.
[b] Implementing cancellation is easier with NSOperationQueue.
[c] NSOperationQueue is built with sophisticated constructs such as priorities and dependencies.