In
computer programming, a thread pool is a
software design pattern for achieving
concurrency
of execution in a computer program. Often also called a replicated workers or worker-crew model, a thread pool maintains multiple
threads
waiting for
tasks to be allocated for
concurrent execution by the supervising program. By maintaining a pool of threads, the model increases performance and avoids latency in execution due to frequent creation and destruction of threads for short-lived tasks. The number of available threads is tuned to the computing resources available to the program.
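The worker-crew model can be sketched in a few lines of Java. The following is an illustrative, simplified example (class and field names are invented for this sketch, and it omits shutdown and error handling): a fixed set of worker threads blocks on a shared queue and executes whatever tasks are submitted, so no thread is created or destroyed per task.
<syntaxhighlight lang="java">
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal worker-crew sketch: workers repeatedly take tasks from a shared queue.
public class SimpleThreadPool {
    private final BlockingQueue<Runnable> taskQueue = new LinkedBlockingQueue<>();

    public SimpleThreadPool(int numWorkers) {
        for (int i = 0; i < numWorkers; i++) {
            Thread worker = new Thread(() -> {
                try {
                    while (true) {
                        taskQueue.take().run(); // block until a task arrives, then execute it
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // interruption is the stop signal
                }
            });
            worker.setDaemon(true);
            worker.start();
        }
    }

    public void submit(Runnable task) {
        taskQueue.add(task); // enqueue the task for the next idle worker
    }
}
</syntaxhighlight>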
Performance
The size of a thread pool is the number of threads kept in reserve for executing tasks. It is usually a tunable parameter of the application, and choosing the optimal size is crucial to performance.
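As a rough illustration using Java's standard java.util.concurrent library (the multipliers here are arbitrary examples, not recommendations), a CPU-bound pool is often sized to the processor count reported by the runtime, while I/O-bound workloads typically use more threads because they spend most of their time waiting:
<syntaxhighlight lang="java">
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PoolSizing {
    public static void main(String[] args) {
        // One common starting point: one thread per available core for CPU-bound work.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService cpuBoundPool = Executors.newFixedThreadPool(cores);

        // I/O-bound tasks mostly wait on external resources, so a larger pool is typical.
        ExecutorService ioBoundPool = Executors.newFixedThreadPool(cores * 4);

        cpuBoundPool.shutdown();
        ioBoundPool.shutdown();
    }
}
</syntaxhighlight>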
One benefit of a thread pool over creating a new thread for each task is that thread creation and destruction overhead is restricted to the initial creation of the pool, which may result in better
performance
and better system
stability. Creating and destroying a thread and its associated resources can be an expensive process in terms of time. An excessive number of threads in reserve, however, wastes memory, and context-switching between the runnable threads invokes performance penalties. A socket connection to another network host, which might take many CPU cycles to drop and re-establish, can be maintained more efficiently by associating it with a thread that lives over the course of more than one network transaction.
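Because a pool thread lives across many tasks, per-thread state can hold such a long-lived resource. The sketch below assumes a hypothetical Connection class standing in for a real socket and uses Java's ThreadLocal so that each worker opens its connection once and reuses it for every task it runs:
<syntaxhighlight lang="java">
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PerThreadConnection {
    // Hypothetical stand-in for an expensive resource such as a socket connection.
    static class Connection {
        Connection() {
            System.out.println(Thread.currentThread().getName() + ": opening connection");
        }
        void send(String message) {
            System.out.println(Thread.currentThread().getName() + ": sending " + message);
        }
    }

    // Each pool thread lazily creates one Connection and keeps it for its lifetime.
    private static final ThreadLocal<Connection> CONNECTION =
            ThreadLocal.withInitial(Connection::new);

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        for (int i = 0; i < 6; i++) {
            final int id = i;
            pool.submit(() -> CONNECTION.get().send("request " + id)); // reuses the worker's connection
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
</syntaxhighlight>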
Using a thread pool may be useful even setting aside thread startup time. Many thread pool implementations make it trivial to queue up work, control concurrency, and synchronize threads at a higher level than can easily be done when managing threads manually. In these cases the performance benefits may be secondary.
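For example, with Java's ExecutorService (one such implementation), a batch of tasks can be queued, run with bounded concurrency, and joined in a single call:
<syntaxhighlight lang="java">
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class QueueAndJoin {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4); // at most four tasks run at once
        List<Callable<Integer>> tasks = List.of(
                () -> 1 + 1,
                () -> 2 * 21,
                () -> "thread pool".length());

        // invokeAll queues every task and blocks until all of them have completed.
        List<Future<Integer>> results = pool.invokeAll(tasks);
        for (Future<Integer> result : results) {
            System.out.println(result.get());
        }
        pool.shutdown();
    }
}
</syntaxhighlight>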
Typically, a thread pool executes on a single computer. However, thread pools are conceptually related to
server farms in which a master process, which might be a thread pool itself, distributes tasks to worker processes on different computers, in order to increase the overall throughput.
Embarrassingly parallel problems are highly amenable to this approach.
The number of threads may be dynamically adjusted during the lifetime of an application based on the number of waiting tasks. For example, a
web server
can add threads if numerous
web page requests come in and can remove threads when those requests taper down. The cost of having a larger thread pool is increased resource usage. The algorithm used to determine when to create or destroy threads affects the overall performance, and a sketch of one such policy follows the list below:
* Creating too many threads wastes resources and costs time creating the unused threads.
* Destroying too many threads requires more time later when creating them again.
* Creating threads too slowly might result in poor client performance (long wait times).
* Destroying threads too slowly may starve other processes of resources.
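One concrete policy, sketched here with Java's ThreadPoolExecutor (other implementations use different heuristics), grows the pool up to a maximum when tasks arrive faster than the current threads can accept them and destroys threads that have been idle longer than a keep-alive time:
<syntaxhighlight lang="java">
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class DynamicPool {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                         // corePoolSize: threads kept alive even when idle
                16,                        // maximumPoolSize: upper bound reached only under load
                30, TimeUnit.SECONDS,      // keepAliveTime: idle threads beyond the core die after this
                new SynchronousQueue<>()); // direct hand-off: a new thread is created if none is free

        pool.submit(() -> System.out.println("handled by " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
</syntaxhighlight>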
See also
*
Asynchrony (computer programming)
*
Object pool pattern
*
Concurrency pattern
*
Grand Central Dispatch
*
Parallel Extensions
*
Parallelization
*
Server farm
*
Staged event-driven architecture
External links
*
Query by Slice, Parallel Execute, and Join: A Thread Pool Pattern in Java by Binildas C. A.
*
Thread pools and work queues by Brian Goetz
*
A Method of Worker Thread Pooling by Pradeep Kumar Sahu
*
Work Queue by Uri Twig: C++ code demonstration of pooled threads executing a work queue.
*
Windows Thread Pooling and Execution Chaining
*
Smart Thread Pool by Ami Bar
*
Programming the Thread Pool in the .NET Framework by David Carmona
*
by Amir Kirsh
*
Practical Threaded Programming with Python: Thread Pools and Queues by Noah Gift
*
Optimizing Thread-Pool Strategies for Real-Time CORBA by Irfan Pyarali, Marina Spivak,
Douglas C. Schmidt and Ron Cytron
*
Deferred cancellation. A behavioral pattern by Philipp Bachmann
*
A C++17 Thread Pool for High-Performance Scientific Computing by Barak Shoshany