Balls into bins problem
The balls into bins (or balanced allocations) problem is a classic problem in probability theory that has many applications in computer science. The problem involves ''m'' balls and ''n'' boxes (or "bins"). Each time, a single ball is placed into one of the bins. After all balls are in the bins, we look at the number of balls in each bin; this number is called the ''load'' of the bin. The problem can be modelled using a multinomial distribution, and may involve asking questions such as: what is the expected number of bins with a ball in them?

Obviously, the maximum load can be made as small as ''m''/''n'' by putting each ball into the least loaded bin. The interesting case is when the bin is selected at random, or at least partially at random. A powerful balls-into-bins paradigm is the "power of two random choices", where each ball chooses two (or more) random bins and is placed in the least loaded of them. This paradigm has found wide practical application in shared-memory emulations, efficient hashing schemes, randomized load balancing of tasks on servers, and routing of packets within parallel networks and data centers.
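As a concrete illustration of the model, the sketch below (plain Python with illustrative function names, not taken from any reference) throws m balls into n bins uniformly at random and compares the observed number of non-empty bins with the exact expectation n\left(1-(1-1/n)^m\right), which follows from linearity of expectation.

<syntaxhighlight lang="python">
import random

def expected_occupied_bins(m, n):
    """Expected number of bins holding at least one ball: each bin is empty
    with probability (1 - 1/n)**m, so by linearity of expectation the
    expected number of occupied bins is n * (1 - (1 - 1/n)**m)."""
    return n * (1 - (1 - 1 / n) ** m)

def simulate_loads(m, n, rng=random):
    """Throw m balls into n bins uniformly at random; return the load vector."""
    loads = [0] * n
    for _ in range(m):
        loads[rng.randrange(n)] += 1
    return loads

if __name__ == "__main__":
    m = n = 1000
    loads = simulate_loads(m, n)
    occupied = sum(1 for load in loads if load > 0)
    print("occupied bins (one trial):", occupied)
    print("expected occupied bins   :", expected_occupied_bins(m, n))
</syntaxhighlight>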


Random allocation

When the bin for each ball is selected at random, independently of the other choices, the maximum load might be as large as m. However, it is possible to calculate a tighter bound that holds with high probability. Here "with high probability" means a probability of 1-o(1), i.e. a probability that tends to 1 as n grows to infinity. For the case m=n, with probability 1 - o(1) the maximum load is:

:\frac{\ln n}{\ln \ln n}\cdot(1+o(1)).

Gonnet gave a tight bound for the expected value of the maximum load, which for m=n is \Gamma^{-1}(n) - \frac{3}{2} + o(1), where \Gamma^{-1} is the inverse of the gamma function, and it is known that \Gamma^{-1}(n) = \frac{\ln n}{\ln \ln n} \cdot (1 + o(1)).

The maximum load can also be calculated for m \ne n: for example, for m > n \log n it is \frac{m}{n}+\Theta\left(\sqrt{\frac{m \log n}{n}}\right), and for m < n/\log n it is \Theta\left(\frac{\log n}{\log \frac{n \log n}{m}}\right), with high probability. Exact probabilities for small values of n with m = n can be computed as a(n)/n^n, where a(n) is defined by OEIS sequence A208250.
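The m=n bound can be checked empirically. The following sketch (illustrative Python, assuming nothing beyond the standard library) estimates the average maximum load over repeated trials and prints \ln n/\ln \ln n for comparison; for moderate n the two agree only roughly, since the bound is asymptotic and lower-order terms are not negligible.

<syntaxhighlight lang="python">
import math
import random

def average_max_load_single_choice(n, trials=200, rng=random):
    """Empirical average maximum load when n balls are thrown into n bins
    uniformly at random (one random choice per ball)."""
    total = 0
    for _ in range(trials):
        loads = [0] * n
        for _ in range(n):
            loads[rng.randrange(n)] += 1
        total += max(loads)
    return total / trials

if __name__ == "__main__":
    n = 10_000
    print("empirical average max load:", average_max_load_single_choice(n))
    print("ln n / ln ln n            :", math.log(n) / math.log(math.log(n)))
</syntaxhighlight>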


Partially random allocation

Instead of just selecting a random bin for each ball, it is possible to select two or more bins for each ball and then put the ball into the least loaded of them. This is a compromise between a deterministic allocation, in which all bins are checked and the least loaded bin is selected, and a totally random allocation, in which a single bin is selected without checking other bins. This paradigm, often called the "power of two random choices", has been studied in a number of settings, described below.

In the simplest case, if one allocates m balls into n bins (with m=n) sequentially one by one, and for each ball one chooses d \ge 2 random bins and then allocates the ball into the least loaded of the selected bins (ties broken arbitrarily), then with high probability the maximum load is:

:\frac{\ln \ln n}{\ln d}+\Theta(1)

which is almost exponentially smaller than with totally random allocation. This result can be generalized to the case m \ge n (with d \ge 2), when with high probability the maximum load is:

:\frac{m}{n}+\frac{\ln \ln n}{\ln d}+\Theta(1)

which is tight up to an additive constant. (All the bounds hold with probability at least 1 - 1/n^c for any constant c>0.) Note that for m > n \log n, the random allocation process gives a maximum load of \frac{m}{n}+\Theta\left(\sqrt{\frac{m \log n}{n}}\right) with high probability, a deviation from the average load that grows with m, whereas the multiple-choice process stays within O(\log \log n) of \frac{m}{n}; the improvement between the two processes is therefore especially visible for large values of m.

Other key variants of the paradigm are "parallel balls-into-bins", where n balls choose d random bins in parallel; "weighted balls-into-bins", where balls have non-unit weights; and "balls-into-bins with deletions", where balls can be deleted as well as added.
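A minimal simulation of the sequential d-choice process, assuming the d candidate bins are drawn independently and uniformly at random with replacement (function and parameter names are illustrative). Running it with d=1 versus d=2 makes the "power of two choices" gap directly visible.

<syntaxhighlight lang="python">
import random

def max_load_d_choices(m, n, d=2, rng=random):
    """Sequentially place m balls into n bins; each ball picks d random bins
    (independently, with replacement) and goes into the least loaded of them,
    with ties broken arbitrarily by min(). Returns the final maximum load."""
    loads = [0] * n
    for _ in range(m):
        candidates = [rng.randrange(n) for _ in range(d)]
        best = min(candidates, key=lambda i: loads[i])
        loads[best] += 1
    return max(loads)

if __name__ == "__main__":
    m = n = 100_000
    print("one random choice :", max_load_d_choices(m, n, d=1))
    print("two random choices:", max_load_d_choices(m, n, d=2))
</syntaxhighlight>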


Infinite stream of balls

Instead of just placing ''m'' balls, it is possible to consider an infinite process in which, at each time step, a single ball is added and a single ball is removed, so that the number of balls remains constant. For ''m''=''n'', after a sufficiently long time, with high probability the maximum load is similar to that in the finite version, both with random allocation and with partially random allocation.
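One possible instantiation of this infinite process is sketched below, under the assumption that the ball removed at each step is chosen uniformly at random among the balls currently in the system (other removal rules, e.g. adversarial deletions, are also studied) and that the re-insertion uses the d-choice rule; all names are illustrative.

<syntaxhighlight lang="python">
import random

def infinite_stream_max_load(n, steps, d=2, rng=random):
    """Keep exactly n balls in n bins: at each step remove one ball chosen
    uniformly at random from those present, then re-insert it with the
    d-choice rule. Returns the maximum load after the given number of steps."""
    loads = [0] * n
    balls = []                      # balls[i] = index of the bin holding ball i
    for _ in range(n):              # start from a one-choice random allocation
        b = rng.randrange(n)
        loads[b] += 1
        balls.append(b)
    for _ in range(steps):
        i = rng.randrange(n)        # remove a uniformly random ball
        loads[balls[i]] -= 1
        candidates = [rng.randrange(n) for _ in range(d)]
        best = min(candidates, key=lambda k: loads[k])
        loads[best] += 1            # re-insert it into the least loaded choice
        balls[i] = best
    return max(loads)

if __name__ == "__main__":
    print(infinite_stream_max_load(n=10_000, steps=200_000))
</syntaxhighlight>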


Repeated balls-into-bins

In a ''repeated'' variant of the process, m balls are initially distributed among n bins in an arbitrary way and then, in every subsequent step of a discrete-time process, one ball is chosen from each non-empty bin and re-assigned to one of the n bins uniformly at random. When m=n, it has been shown that with high probability the process converges to a configuration with maximum load \mathcal{O}(\log n) after \mathcal{O}(n) steps.
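The repeated process is straightforward to simulate. The sketch below (illustrative Python) starts from the extreme configuration in which all n balls sit in a single bin and runs n rounds of the process.

<syntaxhighlight lang="python">
import random

def repeated_balls_into_bins(loads, rounds, rng=random):
    """In each round, every non-empty bin releases exactly one ball, and each
    released ball is re-assigned to a bin chosen uniformly at random.
    Returns the load vector after the given number of rounds."""
    n = len(loads)
    loads = list(loads)
    for _ in range(rounds):
        released = sum(1 for load in loads if load > 0)
        loads = [load - 1 if load > 0 else 0 for load in loads]
        for _ in range(released):
            loads[rng.randrange(n)] += 1
    return loads

if __name__ == "__main__":
    n = 1000
    start = [0] * n
    start[0] = n                       # all n balls in a single bin
    final = repeated_balls_into_bins(start, rounds=n)
    print("max load after n rounds:", max(final))
</syntaxhighlight>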


Applications

Online load balancing: consider a set of ''n'' identical computers and ''n'' users who need computing services. The users are not coordinated: each user arrives on their own and selects which computer to use. Each user would of course like to select the least loaded computer, but this requires checking the load on every computer, which might take a long time. Another option is to select a computer at random; this leads, with high probability, to a maximum load of \frac{\ln n}{\ln \ln n} \cdot (1 + o(1)). A possible compromise is for each user to check only two computers and use the less loaded of the two. This leads, with high probability, to a much smaller maximum load of \log_2 \log n + \Theta(1).

Hashing: consider a hash table in which all keys mapped to the same location are stored in a linked list. The efficiency of accessing a key depends on the length of its list. If we use a single hash function which selects locations with uniform probability, then with high probability the longest chain has O\left(\frac{\log n}{\log \log n}\right) keys. A possible improvement is to use two hash functions and to put each new key into the shorter of the two lists. In this case, with high probability the longest chain has only O(\log \log n) elements.
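A toy version of such a two-choice chained hash table is sketched below. The two "hash functions" are simulated by salting Python's built-in hash, which merely stands in for the uniformly random hash functions assumed in the analysis; class and method names are illustrative.

<syntaxhighlight lang="python">
import random

class TwoChoiceChainedTable:
    """Chained hash table in which each new key is stored in the shorter of
    the two chains selected by two independent (salted) hash functions."""

    def __init__(self, n_buckets):
        self.buckets = [[] for _ in range(n_buckets)]
        self._salts = (random.random(), random.random())

    def _positions(self, key):
        # Two candidate bucket indices for the key, one per salted hash.
        return [hash((salt, key)) % len(self.buckets) for salt in self._salts]

    def insert(self, key, value):
        i, j = self._positions(key)
        shorter = self.buckets[i] if len(self.buckets[i]) <= len(self.buckets[j]) else self.buckets[j]
        shorter.append((key, value))

    def lookup(self, key):
        # A key may live in either of its two buckets, so check both chains.
        for pos in self._positions(key):
            for stored_key, value in self.buckets[pos]:
                if stored_key == key:
                    return value
        return None

if __name__ == "__main__":
    table = TwoChoiceChainedTable(n_buckets=1024)
    for x in range(1024):
        table.insert(x, x * x)
    print("longest chain:", max(len(bucket) for bucket in table.buckets))
    print("lookup(7) ->", table.lookup(7))
</syntaxhighlight>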
Fair cake-cutting: consider the problem of creating a partially proportional division of a heterogeneous resource among n people, such that each person receives a part of the resource which that person values as at least 1/(an) of the total, where a is some sufficiently large constant. The Edmonds–Pruhs protocol is a randomized algorithm whose analysis makes use of balls-into-bins arguments.{{Cite journal |last1=Edmonds |first1=Jeff |last2=Pruhs |first2=Kirk |date=2006 |title=Balanced Allocations of Cake |journal=2006 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS'06) |pages=623–634 |url=https://people.cs.pitt.edu/~kirk/papers/focs2006.pdf |doi=10.1109/FOCS.2006.17 |isbn=0-7695-2720-5 |s2cid=2091887}}

