With multithreading, you can more efficiently use computer resources. In this article, you’ll find out what multithreading is and how it works.

Basics of threads

Threads are an important topic in programming. Before diving into multithreading, let’s discuss what threads are and what the difference is between processes and threads.
When a program is executed, it spawns one or more processes, and processes contain threads: sequences of instructions handled by a scheduler. Each program may have several processes associated with it, and each process can have a number of threads executing in it.

A thread comprises a thread ID, a program counter, a register set, and a stack. Apart from this, it shares with other threads belonging to the same process its code section, data section, and other operating system resources such as open files and signals.
A traditional, or heavyweight, process has a single thread that controls its execution. If a process has multiple threads, it can perform more than one task at a time. As you can see, threads can be useful, as they make the system more efficient.

The benefits of multithreaded programming

The benefits of multithreaded programming can be broken down into four major categories.

Responsiveness

The first benefit is responsiveness. Multithreading an interactive application may allow a program to continue running even if part of it is blocked or is performing a lengthy operation, thereby increasing responsiveness to the user. For example, say there are different threads for displaying a web page, downloading a file, and interacting with the page. If there were only one thread, no other task could proceed until the current one finished, and the user would have to wait. With multithreading, different threads perform these tasks simultaneously, so the responsiveness of the application increases.

Resource sharing

The next benefit is resource sharing. By default, threads share the memory and the resources of the process to which they belong. The benefit of sharing code and data is that it allows an application to have several different threads of activity within the same address space. By sharing resources, we’re making our system more efficient because it doesn’t need to have separate or dedicated resources for each and every thread.
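As a minimal sketch of what this sharing looks like in practice (using GCD, which the rest of this article covers), two tasks running on background threads below increment the same `counter` variable, which lives in the shared memory of their process; access is funneled through a serial queue so the shared data isn't corrupted by a race:

```swift
import Dispatch

// Two tasks on background threads share the same variable, because
// threads in one process share its memory. A serial queue serializes
// access so the increments don't race.
let counterQueue = DispatchQueue(label: "counter")
var counter = 0

let group = DispatchGroup()
for _ in 0..<2 {
    DispatchQueue.global().async(group: group) {
        counterQueue.sync { counter += 1 }   // both threads see the same variable
    }
}
group.wait()          // block until both increments have finished
print(counter)        // 2
```

No copying or inter-process communication is needed: both threads simply read and write the same address space.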

Economy

The next benefit is economy. Allocating memory and resources for creating processes is costly. Because threads share resources of the process to which they belong, it’s more economical to create and context-switch threads.

Utilization of multiprocessor architectures

The benefits of multithreading can be greatly increased in a multiprocessor architecture, where threads may run in parallel on different processors. A single-threaded process can run on only one CPU, no matter how many are available. Multithreading on a multi-CPU machine increases concurrency: a process's threads can be scheduled across multiple processors at the same time, so work completes more quickly and efficiently.

What is multithreading and how does it work?

Multithreading is one of the most popular topics in interviews with developers, and mishandling it is one of the most common sources of errors when developing iOS applications. That’s why it’s very important to know how it works.

Let’s say you have an application that works in the main thread, which is responsible for executing code and showing the UI of your app. When the app launches, it runs on the main thread, also called the UI thread. If we try to do time-consuming tasks on the main thread, the UI will stop responding for a while. This is a situation the user never wants to face: from the user’s perspective, the app should always be responsive and fast. The answer to this problem for iOS apps is Grand Central Dispatch (GCD).

3 types of queues in Grand Central Dispatch

Grand Central Dispatch is an API provided by Apple that allows you to smoothly manage concurrent operations to avoid freezing and keep your app responsive.

There are three types of queues in GCD:

  1. Main queue
  2. Global queue
  3. Custom queues

#1 The main queue

The main queue has the highest priority and runs on the main thread.
All UI updates should be done on this thread; otherwise, UI lag and unpredictable crashes can occur.
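As a sketch of this rule, the snippet below does hypothetical heavy work on a background queue and hops back to the main queue before touching anything UI-related. `renderResult` is a made-up stand-in for a real UI update, such as setting a label's text:

```swift
import Dispatch
import Foundation

// Hypothetical stand-in for a real UI update (e.g. setting a label's text).
var rendered: String? = nil
func renderResult(_ text: String) {
    precondition(Thread.isMainThread, "UI updates must happen on the main thread")
    rendered = text
}

DispatchQueue.global(qos: .userInitiated).async {
    let result = "loaded"        // pretend this work took a while
    DispatchQueue.main.async {   // hop back to the main queue before touching UI state
        renderResult(result)
    }
}

// Command-line demo only: poll the main run loop so main-queue blocks run.
var attempts = 0
while rendered == nil && attempts < 100 {
    RunLoop.main.run(until: Date().addingTimeInterval(0.05))
    attempts += 1
}
```

In a real app you wouldn't need the run-loop polling at the end; the app's main run loop drains the main queue for you.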

#2 The global queue

The global queue is divided into four main QoS (Quality of Service) classes, plus a default, ordered from the highest priority to the lowest.

  • userInteractive is for tasks that interact with the user at this very moment and take very little time, such as animations, which must be performed instantly. Work here should finish as quickly as possible, since the user is interacting with the app right now. This queue has a high priority, though lower than the main queue.
  • userInitiated is for tasks that are initiated by the user and require feedback. Note that this doesn’t happen inside an interactive event. In this case, the user is waiting for feedback to continue the interaction, which may take several seconds. This type of queue has a high priority but lower than the previous.
  • default is not typically specified directly; it means the QoS will be inferred by the system.
  • utility is for tasks that require some time to complete and do not require immediate feedback: for example, downloading data or clearing databases. It’s for things that are being done that the user is not asking for but that are necessary for the application. These tasks may take from several seconds to several minutes. The priority of this type of queue is lower than the previous.
  • background is for tasks that take significant time, meaning minutes or hours. This is work that usually runs in the background, while the user isn’t actively waiting on it. This type has the lowest priority among all global queues. Both utility and background queues should be used for heavy operations so that they don’t block the main thread.
DispatchQueue.global(qos: .background).async {
    // do something
}
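To see the QoS classes side by side, this illustrative sketch submits the same kind of task to three global queues and waits for all of them with a DispatchGroup. Higher-QoS work tends to be scheduled sooner, but completion order is not guaranteed:

```swift
import Dispatch

let group = DispatchGroup()
let results = DispatchQueue(label: "results")   // serializes access to `finished`
var finished: [String] = []

for (name, qos) in [("userInitiated", DispatchQoS.QoSClass.userInitiated),
                    ("utility", .utility),
                    ("background", .background)] {
    DispatchQueue.global(qos: qos).async(group: group) {
        // Record completion on a serial queue to avoid a data race on the array.
        results.async(group: group) { finished.append(name) }
    }
}

group.wait()        // block until all three tasks have finished
print(finished)     // all three names, in an order chosen by the scheduler
```

The QoS hint only influences scheduling; all three tasks still run to completion.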

#3 Custom queues

Custom queues are queues you can create on your own and give whatever QOS and attributes you want:

let queue = DispatchQueue(label: "concurrentQueue", qos: .background, attributes: .concurrent, target: .main)
let queueSerial = DispatchQueue(label: "serialQueue", qos: .background)

Serial queue

A serial dispatch queue performs only one task at a time. Serial queues are often used to synchronize access to a specific value or resource to prevent data races.
A serial queue can be initialized as follows:

let serialQueue = DispatchQueue(label: "serialQueue")
serialQueue.async {
	print("Start 1")
	// Do some work..
	print("Finish 1")
}
serialQueue.async {
	print("Start 2")
	// Do some work..
	print("Finish 2")
}

/*
	Serial Queue prints:
	Start 1
	Finish 1
	Start 2
	Finish 2
*/

As you can see, the second task only starts after the first has finished.

Concurrent queue

A concurrent queue allows us to execute multiple tasks at once. Tasks will always start in the order they’re added, but they can finish in a different order as they can be executed in parallel. Tasks will run on distinct threads that are managed by the dispatch queue. The number of tasks running at the same time is variable and depends on system conditions.

A concurrent dispatch queue can be created by passing an attribute as a parameter to the DispatchQueue initializer:

let concurrentQueue = DispatchQueue(label: "concurrentQueue", attributes: .concurrent)

concurrentQueue.async {
	print("Start 1")
	// Do some work..
	print("Finish 1")
}
concurrentQueue.async {
	print("Start 2")
	// Do some work..
	print("Finish 2")
}

/*
	Concurrent Queue may print:
	Start 1
	Start 2
	Finish 1
	Finish 2
*/

As you can see, the second task has already started before the first has finished. This means that both tasks have run in parallel.

In a real-world scenario, you often need to use those queues together.
Let’s say you need to do a heavy operation like downloading an image and then displaying this image in an imageView.

let concurrentQueue = DispatchQueue(label: "concurrentQueue", attributes: .concurrent)
concurrentQueue.async {
    let myImage = downloadImageFromBackend()
    DispatchQueue.main.async {
        // Access and set the UI back on the main queue.
        self.imageView.image = myImage
    }
}

Sync and async dispatch

As you saw in the example above, we used the word async after each queue.
With GCD, you can dispatch a task synchronously or asynchronously.

  • Synchronously starting a task will block the calling thread until the task is finished.
  • Asynchronously starting a task will directly return to the calling thread without blocking.
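A small sketch of the difference: on a serial queue, a sync dispatch blocks the caller until the closure has run, while an async dispatch returns immediately and the closure runs later.

```swift
import Dispatch

let queue = DispatchQueue(label: "demo")     // serial queue
var order: [String] = []

queue.sync { order.append("sync task") }     // caller waits for this to finish
order.append("after sync")                   // therefore always runs second

queue.async { order.append("async task") }   // returns at once; runs later
queue.sync { order.append("flushed") }       // serial queue: this waits for the async task

// order is now ["sync task", "after sync", "async task", "flushed"]
```

The final sync call doubles as a flush: because the queue is serial, it cannot run until the earlier async task has finished, which makes the output order deterministic.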

Final thoughts

Now you know that multithreading is a property of a platform or application: a process spawned in an operating system can consist of several threads running in parallel, that is, without a prescribed order. When performing several tasks, this separation achieves more efficient use of computer resources.
