
Threads in Operating System (OS)

A thread is a single sequential flow of execution within a process, which is why it is also known as a thread of execution or thread of control. Every operating system provides some way of executing threads inside a process, and a process may contain more than one thread. Each thread of the same process has its own program counter, its own stack of activation records, and its own control block. A thread is often referred to as a lightweight process.

A process can be split into many threads. For example, each tab in a browser can be viewed as a thread, and MS Word uses many threads: one thread formats the text, another processes keyboard input, and so on.

Need for Threads:

o It takes far less time to create a new thread in an existing process than to create a new process.

o Threads share the process's data directly, so they do not need to use Inter-Process Communication (a minimal sketch of this follows the list).

o Context switching between threads is faster than context switching between processes.

o It takes less time to terminate a thread than a process.
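
A minimal Java sketch of the sharing point above: two threads inside one process read and update the same object directly, with no inter-process communication. The class and field names are illustrative, not part of any standard API.

// SharedDataDemo.java - two threads in one process share data directly (no IPC needed).
public class SharedDataDemo {
    // Visible to every thread in the process through shared memory.
    private static volatile String sharedMessage = "not set";

    public static void main(String[] args) throws InterruptedException {
        // Creating this thread is much cheaper than creating a whole new process.
        Thread writer = new Thread(() -> sharedMessage = "hello from the writer thread");
        writer.start();
        writer.join();   // wait for the writer to finish

        // The main thread sees the update without pipes, sockets, or shared-memory APIs.
        System.out.println("main thread read: " + sharedMessage);
    }
}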

Types of Threads
In the operating system, there are two types of threads.

1. Kernel level thread.

2. User-level thread.

User-level thread

The operating system does not recognize user-level threads. User-level threads are easy to implement, and they are implemented in user space rather than by the kernel. If a user-level thread performs a blocking operation, the whole process is blocked. The kernel knows nothing about user-level threads and manages the process as if it were single-threaded. Examples: Java threads, POSIX threads.

Advantages of User-level threads

1. User-level threads are easier to implement than kernel-level threads.

2. User-level threads can be used on operating systems that do not support threads at the kernel level.

3. They are faster and more efficient.

4. Context-switch time is shorter than for kernel-level threads.

5. They do not require modifications to the operating system.

6. The representation of user-level threads is very simple: the registers, PC, stack, and a small thread control block are all stored in the address space of the user process.

7. It is simple to create, switch, and synchronize threads without kernel intervention.

Disadvantages of User-level threads

1. User-level threads lack coordination between the thread and the kernel.

2. If a thread causes a page fault, the entire process is blocked.
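
On recent Java versions, virtual threads are a convenient analogue of the user-level threads described above: they are created and scheduled by the runtime, and the kernel does not manage a dedicated thread of its own for each virtual thread. A minimal sketch, assuming Java 21 or later; the printed output is only illustrative.

// VirtualThreadDemo.java - runtime-scheduled threads, in the spirit of user-level threads (Java 21+).
public class VirtualThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        // The JVM multiplexes many virtual threads onto a small pool of kernel-backed carrier threads.
        Thread vt = Thread.ofVirtual().start(() ->
                System.out.println("running in " + Thread.currentThread()));
        vt.join();
    }
}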


Kernel level thread

The operating system recognizes kernel-level threads. In the kernel-level model, the system keeps a thread control block for each thread in addition to the process control block for each process. Kernel-level threads are implemented by the operating system: the kernel knows about all the threads and manages them, and it offers system calls to create and manage threads from user space. Kernel threads are more difficult to implement than user threads, and their context-switch time is longer. However, if one kernel thread performs a blocking operation, another thread of the same process can continue executing. Examples: Windows, Solaris.
Advantages of Kernel-level threads

1. The kernel is fully aware of all threads.

2. The scheduler may decide to give more CPU time to a process that has a large number of threads.

3. Kernel-level threads are good for applications that block frequently.

Disadvantages of Kernel-level threads

1. The kernel has to manage and schedule every thread, which adds overhead.

2. Kernel threads are more difficult to implement than user threads.

3. Kernel-level thread operations are slower than user-level thread operations.
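
For contrast, an ordinary (platform) Java thread is a reasonable stand-in for a kernel-level thread: on mainstream JVMs each platform thread is backed by one native kernel thread that the operating system schedules directly. A minimal sketch; the class name is illustrative.

// PlatformThreadDemo.java - each platform thread maps to a kernel thread on mainstream JVMs.
public class PlatformThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread kernelBacked = new Thread(() ->
                System.out.println("scheduled by the OS: " + Thread.currentThread().getName()));
        kernelBacked.start();   // thread creation ultimately goes through an OS system call
        kernelBacked.join();
    }
}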

Components of Threads

Any thread has the following components.

1. Program counter

2. Register set
3. Stack space

Benefits of Threads

o Enhanced throughput of the system: When a process is split into many threads, and each thread is treated as a job, the number of jobs completed per unit of time increases, so the throughput of the system increases.

o Effective utilization of multiprocessor systems: When a process has more than one thread, the threads can be scheduled on more than one processor at the same time.

o Faster context switch: Switching between threads takes less time than switching between processes, so there is less overhead for the CPU.

o Responsiveness: When a process is split into several threads, the application can keep responding to the user while other threads finish their work.

o Communication: Communication between multiple threads is simple because they share the same address space, whereas communication between two processes requires dedicated inter-process communication mechanisms.

o Resource sharing: Resources such as code, data, and files can be shared among all threads within a process. Note: the stack and registers cannot be shared; each thread has its own stack and register set (see the sketch below).
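
A minimal Java sketch of the resource-sharing point above: the static array is one shared copy visible to both threads, while each thread's local variables live on that thread's own stack. Names are illustrative.

// ResourceSharingDemo.java - shared code/data versus per-thread stacks.
public class ResourceSharingDemo {
    private static final int[] sharedTable = {1, 2, 3};   // one copy, shared by all threads

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            int localSum = 0;                              // lives on this thread's own stack
            for (int value : sharedTable) {
                localSum += value;
            }
            System.out.println(Thread.currentThread().getName() + " computed " + localSum);
        };
        Thread a = new Thread(task, "worker-A");
        Thread b = new Thread(task, "worker-B");
        a.start(); b.start();
        a.join();  b.join();
    }
}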

Thread States in Operating Systems

As a thread moves through the system, it is always in one of five states (excluding its CREATION and FINISHED states):

(1) Ready
(2) Running
(3) Waiting
(4) Delayed
(5) Blocked

o When an application is to be processed, it creates a thread.

o The thread is then allocated the required resources (such as a network connection) and enters the READY queue.

o When the thread scheduler (like a process scheduler) assigns the thread to a processor, it enters the RUNNING state.

o When the thread needs some other event to be triggered, which is outside its control (such as another process completing), it transitions from RUNNING to WAITING.

o When the application is able to delay the processing of a thread, it can put the thread to sleep for a specific amount of time; the thread then transitions from RUNNING to DELAYED. An example of delaying a thread is snoozing an alarm: after it rings the first time and is not switched off by the user, it rings again after a specific amount of time, and during that time the thread is put to sleep.

o When a thread issues an I/O request and cannot proceed until the request is done, it transitions from RUNNING to BLOCKED.

o After its work is completed, the thread transitions from RUNNING to FINISHED.

o The difference between the WAITING and BLOCKED transitions is that in WAITING the thread waits for a signal from another thread or for another process to complete, so the waiting time is bounded, while in the BLOCKED state there is no specified time (it depends, for example, on when the user provides input).

o To execute all processes successfully, the system maintains the information about each thread in a Thread Control Block (TCB).

Multithreading in Operating System


What is Multithreading?
Multithreading is a feature in operating systems that allows a program to do several tasks at
the same time. Think of it like having multiple hands working together to complete different
parts of a job faster. Each “hand” is called a thread, and they help make programs run more
efficiently. Multithreading makes your computer work better by using its resources more
effectively, leading to quicker and smoother performance for applications like web browsers,
games, and many other programs you use every day.

How Does Multithreading Work?

Multithreading works by allowing a computer's processor to handle multiple tasks at the same time. Even though a single processor core can only do one thing at a time, it switches between threads from various programs so quickly that it looks like everything is happening at once.

Here is how it works, in simple terms:

 Processor Handling : The processor can execute only one instruction at a time, but it
switches between different threads so fast that it gives the illusion of simultaneous
execution.

 Thread Synchronization : Each thread is like a separate task within a program. Threads share resources and must coordinate their work so that programs run correctly and efficiently (a sketch of this follows the list).

 Efficient Execution : Threads in a program can run independently or wait for their
turn to process, making programs faster and more responsive.

 Programming Considerations : Programmers need to be careful about managing threads to avoid problems like conflicts or situations where threads get stuck waiting for each other.
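
A minimal Java sketch of the synchronization point above: two threads increment the same counter, and the synchronized block is what keeps the result correct. The counter, loop count, and class name are illustrative.

// CounterDemo.java - why shared updates need synchronization.
public class CounterDemo {
    private static int counter = 0;
    private static final Object lock = new Object();

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 100_000; i++) {
                synchronized (lock) {        // without this block, increments can be lost
                    counter++;
                }
            }
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start(); t2.start();
        t1.join();  t2.join();
        System.out.println("counter = " + counter);   // 200000 with synchronization
    }
}

Removing the synchronized block typically yields a total below 200000 on most runs, which is exactly the data-inconsistency problem that careful thread management is meant to prevent.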

Lifecycle of a Thread

There are various stages in the lifecycle of a thread. The following are the stages a thread goes through during its lifetime (the sketch after the list shows how these stages can be observed in code).

 New: The lifecycle of a newly created thread starts in this state. The thread remains here until the program starts it.

 Runnable : A thread becomes runnable after it is started. In this state it is either executing its task or ready to execute it.

 Waiting : While waiting for another thread to perform a task, the currently running thread moves into the waiting state, and it transitions back once it receives a signal from the other thread.

 Timed Waiting: A runnable thread enters this state for a specific time interval and transitions back when the interval expires or when the event it was waiting for occurs.

 Terminated (Dead) : A thread enters this state after completing its task.
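
The stage names above correspond closely to Java's built-in thread states, so the transitions can be observed directly with Thread.getState(). A minimal sketch; the sleep durations are illustrative.

// LifecycleDemo.java - observing NEW, TIMED_WAITING, and TERMINATED with Thread.getState().
public class LifecycleDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            try {
                Thread.sleep(200);                    // worker enters TIMED_WAITING here
            } catch (InterruptedException ignored) { }
        });

        System.out.println(worker.getState());        // NEW: created but not yet started
        worker.start();
        Thread.sleep(50);                              // give the worker time to reach sleep()
        System.out.println(worker.getState());        // TIMED_WAITING: sleeping for a fixed interval
        worker.join();                                 // the main thread waits here until the worker finishes
        System.out.println(worker.getState());        // TERMINATED: the task has completed
    }
}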

Drawbacks of Multithreading

Multithreading is complex and often difficult to handle. It has a few drawbacks. These are:

 If locking mechanisms are not used properly when coordinating data access, problems such as data inconsistency and deadlock can arise (a sketch of lock ordering follows this list).

 If many threads try to access the same data, there is a chance that thread starvation may arise. Resource contention is another problem that can trouble the user.

 Display issues may occur if threads lack coordination when displaying data.
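
One common way the deadlock mentioned above arises is two threads acquiring the same two locks in opposite orders. A minimal Java sketch of the usual remedy, a fixed lock-acquisition order; the lock and method names are illustrative.

// LockOrderingDemo.java - acquiring locks in one agreed order to avoid deadlock.
public class LockOrderingDemo {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();

    // Every thread takes lockA before lockB; taking them in opposite orders can deadlock.
    private static void doWork(String name) {
        synchronized (lockA) {
            synchronized (lockB) {
                System.out.println(name + " holds both locks");
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t1 = new Thread(() -> doWork("thread-1"));
        Thread t2 = new Thread(() -> doWork("thread-2"));
        t1.start(); t2.start();
        t1.join();  t2.join();
    }
}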

Benefits of Multithreading

 Multithreading can improve the performance and efficiency of a program by using the available CPU resources more effectively. By executing multiple threads concurrently, a program can take advantage of parallelism and reduce its overall execution time (see the thread-pool sketch after this list).

 Multithreading can enhance responsiveness in applications that involve user interaction. By moving time-consuming tasks off the main thread, the user interface can remain responsive instead of freezing.

 Multithreading can enable better resource utilization. For example, in a server application, multiple threads can handle incoming client requests simultaneously, allowing the server to serve more clients concurrently.

 Multithreading can facilitate better code organization and modularity by dividing complex tasks into smaller, manageable units of execution. Each thread can handle a specific part of the task, making the code easier to understand and maintain.
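
One common way to realize the first benefit is a fixed-size thread pool that spreads independent tasks across the available CPU cores. A minimal Java sketch using the standard ExecutorService; the task bodies and counts are illustrative.

// ThreadPoolDemo.java - running independent tasks in parallel on a fixed thread pool.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        for (int i = 0; i < 8; i++) {
            final int taskId = i;
            pool.submit(() -> System.out.println(
                    "task " + taskId + " ran on " + Thread.currentThread().getName()));
        }

        pool.shutdown();                              // stop accepting new tasks
        pool.awaitTermination(10, TimeUnit.SECONDS);  // wait for submitted tasks to finish
    }
}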
