Definition of Threads
In order to understand multithreading, we first have to understand threads. Each application or program running on a system is a process. Each process consists of one or more threads that are given processor time to do the process's work. Each thread contains all of the context information the system needs in order to execute the program instructions.
The operating system is responsible for scheduling and executing threads. Remember the days of Windows 3.x? Under its cooperative multitasking, a single process's thread could, and usually did, monopolize all of the processor time. The system would simply wait for that thread to complete before executing any other process.
Newer operating systems, such as Windows 2000, support pre-emptive multitasking, which allocates each thread a time slice. When the time slice of the currently executing thread elapses, the operating system suspends the thread, saves its context, loads the context of another thread, and that thread resumes execution from its previous state. This gives the appearance that multiple threads are executing at the same time, and it helps keep a single thread from making the system unresponsive (End Task, anyone?). On systems with more than one processor, threads are distributed across all of the processors, so multiple threads really are executing at the same time.
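The scheduling described above can be sketched in a few lines of Python (a language choice for illustration only; the thread names and workloads are invented). Each thread is started, competes for time slices, and the main thread waits for all of them to finish:

```python
import threading

results = []

def worker(name, count):
    # Each thread competes for time slices; the operating system decides
    # when each one runs and, on a multiprocessor system, on which processor.
    total = sum(range(count))
    results.append((name, total))

# Four threads, all apparently "executing at the same time" from the
# program's point of view.
threads = [threading.Thread(target=worker, args=(f"t{n}", 1000))
           for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # suspend the main thread until each worker completes

print(sorted(results))
```

The order in which the workers finish is up to the scheduler, which is why the results are sorted before printing.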
Threading Model Background
There are many threading models available and a great deal of theory behind them. I do not claim to be an expert on any of them, nor will I attempt to explain them in depth in this article. Instead, we will focus on the threading models common to Microsoft Win32-based environments, and I'll provide a brief explanation of each.
Single Threading
Single threading means there is only one thread within the process, and that thread does all of the work for the process. The process must wait for the thread's current operation to complete before it can perform another action.
Single threading results in system idle time and user frustration. For example, assume we are saving a file to a remote network location using a single threaded application. Since there is only one thread in the application, it cannot do anything else while the file is being stored in the remote location. The user waits and begins to wonder whether the application is ever going to resume.
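The difference a second thread makes can be sketched in Python (the `save_file` function, the file name, and the two-second sleep standing in for network latency are all invented for illustration):

```python
import threading
import time

def save_file(name):
    # Simulated slow save to a remote location; the sleep stands in
    # for network latency (an assumption for illustration).
    time.sleep(2)
    print(f"{name}: save complete")

# Single threaded: calling save_file("report.txt") directly would block
# here for the full two seconds, and the application could do nothing else.

# Multithreaded: the save runs on a background thread while the main
# thread stays free to respond to the user.
saver = threading.Thread(target=save_file, args=("report.txt",))
saver.start()
print("still responsive while saving...")
saver.join()  # a real application would keep servicing the user instead
```

Note that the "still responsive" message prints immediately, before the save completes, because the main thread is no longer waiting on the slow operation.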
Apartment Threading (Single Threaded Apartment)
Apartment threading means there are multiple threads within the application. In the single threaded apartment (STA) model, each thread is isolated in a separate apartment underneath the process. The process can have any number of apartments, which share data through a proxy. The application defines when, and for how long, the thread in each apartment should execute. All requests are serialized through the Windows message queue so that only a single apartment is accessed at a time, and thus only a single thread is executing at any one time. STA is the threading model most Visual Basic developers are familiar with, because it was the only threading model available to VB applications prior to VB.NET. You can think of it as an apartment building full of one-room apartments, each accessible one at a time through a single hallway. The advantage over single threading is that multiple commands can be issued at one time instead of just one, but the commands are still executed sequentially.
Free Threading (Multi Threaded Apartment)
Free threaded applications were limited to programming languages such as C++ until the release of Microsoft .NET. In the free threaded/multithreaded apartment (MTA) model, a single apartment is created underneath the process rather than multiple apartments, and that apartment holds multiple threads rather than just one. No message queue is required, because all of the threads belong to the same apartment and can share data without a proxy. You can think of it as a building with multiple rooms that are all accessible once you are inside the building. These applications typically execute faster than single threaded and STA applications because there is less system overhead, and they can be optimized to reduce system idle time.
These types of applications are more complex to program. The developer must provide thread synchronization in the code to ensure that threads do not access the same resources simultaneously. A race condition occurs when one thread modifies a shared resource, leaving it in an invalid state, and another thread accesses the resource and uses it in that invalid state before the first thread can return it to a valid state. It is therefore necessary to place a lock on a resource to prevent other threads from accessing it until the lock has been released. Locking, however, can lead to a deadlock, in which two threads compete for resources and neither can proceed. For example, thread #1 holds a lock on one resource and is waiting for a second resource that is currently locked by thread #2, while thread #2 is waiting for the resource locked by thread #1. Both threads are waiting on the other, and neither will ever proceed. The only way to avoid situations like these is through good design and testing.
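A lock looks like the following sketch (Python's `threading.Lock` here; the shared counter and thread counts are invented for illustration). Without the lock, two threads could read the same old value of `counter` and one update would be lost, which is exactly the race condition just described:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # counter += 1 is really read-modify-write: three steps that
        # another thread could interleave with. The lock serializes them.
        with lock:       # blocks here while another thread holds the lock
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 every run; without the lock, updates could be lost

# To avoid the deadlock scenario described above, a common rule is to
# always acquire multiple locks in the same fixed order in every thread.
```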
Using Multiple Threads
There are advantages and disadvantages to using multiple threads. Multithreading is not always a win, so you should determine for yourself whether there is enough benefit for your application(s).
Advantages of Multiple Threads
- Improved responsiveness — When it comes to application performance, I am a firm believer that perception is reality. If the user perceives that my application is slow, then in reality it is slow. If the application performs operations that take a perceivably long time to complete, those operations can be moved to a separate thread, allowing the application to remain responsive to the user.
- Faster application — Multiple threads can improve application performance. For example, if there are a number of calculations to perform, or the contents of a file to process, the application can be made faster by performing multiple operations at the same time.
- Prioritization — Threads can be assigned a priority, allowing higher priority tasks to take precedence over lower priority tasks.
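The "faster application" point is easiest to see with I/O-bound work. In this Python sketch (the 0.2-second sleep stands in for a network request or disk read, an assumption for illustration), five waits overlap instead of running back to back:

```python
import threading
import time

def fetch(results, i):
    time.sleep(0.2)      # simulated I/O wait (network request, disk read)
    results[i] = i * i   # each thread writes to its own slot: no race

results = [None] * 5

start = time.perf_counter()
threads = [threading.Thread(target=fetch, args=(results, i)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

print(results)  # [0, 1, 4, 9, 16]
print(f"{elapsed:.2f}s")  # near 0.2s, not 5 x 0.2s: the waits overlapped
```

The speedup here comes from overlapping waits; calculation-heavy work only runs truly in parallel when the threads are spread across multiple processors, as described earlier.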
Disadvantages of Multiple Threads
- Programming and debugging are more complex — With multithreaded applications, the programmer must account for race conditions and deadlocks.
- Threads add overhead to the system — Tracking a large number of threads consumes processor time. If there are too many threads, each one may not accomplish enough during its time slice, and each thread is scheduled less frequently because the available processor time is divided among more threads.