Bernstein Polynomials

In the mathematical field of numerical analysis, a Bernstein polynomial is a polynomial that is a linear combination of Bernstein basis polynomials. The idea is named after Sergei Natanovich Bernstein. A numerically stable way to evaluate polynomials in Bernstein form is de Casteljau's algorithm. Polynomials in Bernstein form were first used by Bernstein in a constructive proof for the Weierstrass approximation theorem. With the advent of computer graphics, Bernstein polynomials, restricted to the interval [0, 1], became important in the form of Bézier curves.
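
A minimal Python sketch of de Casteljau's recurrence (the function name and calling convention are illustrative, not taken from any particular library) makes the stable evaluation concrete: each round replaces neighbouring Bernstein coefficients by their linear interpolation at ''x''.

def de_casteljau(beta, x):
    """Evaluate a polynomial in Bernstein form at x, given its
    Bernstein coefficients beta[0..n], via de Casteljau's recurrence."""
    b = list(beta)                 # work on a copy of the coefficients
    n = len(b) - 1                 # degree of the polynomial
    for r in range(1, n + 1):      # n rounds of linear interpolation
        for i in range(n - r + 1):
            b[i] = (1.0 - x) * b[i] + x * b[i + 1]
    return b[0]

# Example: the quadratic with Bernstein coefficients 3, 1, 2 evaluated at x = 0.5
print(de_casteljau([3.0, 1.0, 2.0], 0.5))   # 1.75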


Definition

The ''n''+1 Bernstein basis polynomials of degree ''n'' are defined as

:b_{\nu,n}(x) = \binom{n}{\nu} x^{\nu} \left(1 - x\right)^{n-\nu}, \quad \nu = 0, \ldots, n,

where \tbinom{n}{\nu} is a binomial coefficient. So, for example, b_{2,5}(x) = \tbinom{5}{2}x^2(1-x)^3 = 10x^2(1-x)^3.

The first few Bernstein basis polynomials for blending 1, 2, 3 or 4 values together are:

:\begin{align}
b_{0,0}(x) & = 1, \\
b_{0,1}(x) & = 1 - x, & b_{1,1}(x) & = x \\
b_{0,2}(x) & = (1 - x)^2, & b_{1,2}(x) & = 2x(1 - x), & b_{2,2}(x) & = x^2 \\
b_{0,3}(x) & = (1 - x)^3, & b_{1,3}(x) & = 3x(1 - x)^2, & b_{2,3}(x) & = 3x^2(1 - x), & b_{3,3}(x) & = x^3
\end{align}

The Bernstein basis polynomials of degree ''n'' form a basis for the vector space \Pi_n of polynomials of degree at most ''n'' with real coefficients. A linear combination of Bernstein basis polynomials

:B_n(x) = \sum_{\nu=0}^{n} \beta_{\nu} b_{\nu,n}(x)

is called a Bernstein polynomial or polynomial in Bernstein form of degree ''n''. The coefficients \beta_\nu are called Bernstein coefficients or Bézier coefficients.

The first few Bernstein basis polynomials from above in monomial form are:

:\begin{align}
b_{0,0}(x) & = 1, \\
b_{0,1}(x) & = 1 - 1x, & b_{1,1}(x) & = 0 + 1x \\
b_{0,2}(x) & = 1 - 2x + 1x^2, & b_{1,2}(x) & = 0 + 2x - 2x^2, & b_{2,2}(x) & = 0 + 0x + 1x^2 \\
b_{0,3}(x) & = 1 - 3x + 3x^2 - 1x^3, & b_{1,3}(x) & = 0 + 3x - 6x^2 + 3x^3, & b_{2,3}(x) & = 0 + 0x + 3x^2 - 3x^3, & b_{3,3}(x) & = 0 + 0x + 0x^2 + 1x^3
\end{align}
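
The definition translates directly into code. The following Python sketch (the helper names are illustrative, not from any particular library) evaluates one Bernstein basis polynomial and a polynomial in Bernstein form by summing the weighted basis.

from math import comb

def bernstein_basis(nu, n, x):
    """b_{nu,n}(x) = C(n, nu) * x^nu * (1 - x)^(n - nu)."""
    return comb(n, nu) * x**nu * (1 - x)**(n - nu)

def bernstein_poly(beta, x):
    """B_n(x) = sum_nu beta[nu] * b_{nu,n}(x) for coefficients beta[0..n]."""
    n = len(beta) - 1
    return sum(beta[nu] * bernstein_basis(nu, n, x) for nu in range(n + 1))

# b_{2,5}(x) = 10 x^2 (1 - x)^3, as in the example above
print(bernstein_basis(2, 5, 0.4))            # 10 * 0.16 * 0.216 = 0.3456
print(bernstein_poly([3.0, 1.0, 2.0], 0.5))  # 1.75, matching de Casteljau above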


Properties

The Bernstein basis polynomials have the following properties:

* b_{\nu,n}(x) = 0, if \nu < 0 or \nu > n.
* b_{\nu,n}(x) \ge 0 for x \in [0,\ 1].
* b_{\nu,n}\left(1 - x\right) = b_{n-\nu,n}(x).
* b_{\nu,n}(0) = \delta_{\nu,0} and b_{\nu,n}(1) = \delta_{\nu,n}, where \delta is the Kronecker delta function: \delta_{ij} = \begin{cases} 0 & \text{if } i \neq j, \\ 1 & \text{if } i = j. \end{cases}
* b_{\nu,n}(x) has a root with multiplicity \nu at point x = 0 (note: if \nu = 0, there is no root at 0).
* b_{\nu,n}(x) has a root with multiplicity \left(n - \nu\right) at point x = 1 (note: if \nu = n, there is no root at 1).
* The derivative can be written as a combination of two polynomials of lower degree: b'_{\nu,n}(x) = n\left(b_{\nu-1,n-1}(x) - b_{\nu,n-1}(x)\right).
* The ''k''-th derivative at 0: b_{\nu,n}^{(k)}(0) = \frac{n!}{(n-k)!} \binom{k}{\nu} (-1)^{\nu+k}.
* The ''k''-th derivative at 1: b_{\nu,n}^{(k)}(1) = (-1)^k b_{n-\nu,n}^{(k)}(0).
* The transformation of the Bernstein polynomial to monomials is b_{\nu,n}(x) = \binom{n}{\nu}\sum_{k=0}^{n-\nu} \binom{n-\nu}{k}(-1)^{k} x^{\nu+k} = \sum_{\ell=\nu}^{n} \binom{n}{\ell}\binom{\ell}{\nu}(-1)^{\ell-\nu} x^{\ell}, and by the inverse binomial transformation, the reverse transformation is x^k = \sum_{i=0}^{n-k} \binom{n-k}{i} \frac{1}{\binom{n}{i}} b_{n-i,n}(x) = \frac{1}{\binom{n}{k}} \sum_{j=k}^{n} \binom{j}{k} b_{j,n}(x).
* The indefinite integral is given by \int b_{\nu,n}(x) \, dx = \frac{1}{n+1} \sum_{j=\nu+1}^{n+1} b_{j,n+1}(x).
* The definite integral is constant for a given ''n'': \int_0^1 b_{\nu,n}(x) \, dx = \frac{1}{n+1} \quad \text{for all } \nu = 0, 1, \dots, n.
* If n \ne 0, then b_{\nu,n}(x) has a unique local maximum on the interval [0,\, 1] at x = \frac{\nu}{n}. This maximum takes the value \nu^{\nu} n^{-n} \left(n - \nu\right)^{n-\nu} \tbinom{n}{\nu}.
* The Bernstein basis polynomials of degree ''n'' form a partition of unity: \sum_{\nu=0}^{n} b_{\nu,n}(x) = \sum_{\nu=0}^{n} \binom{n}{\nu} x^{\nu} \left(1 - x\right)^{n-\nu} = \left(x + \left(1 - x\right)\right)^{n} = 1.
* By taking the first ''x''-derivative of (x+y)^n, treating y as constant, then substituting the value y = 1 - x, it can be shown that \sum_{\nu=0}^{n} \nu\, b_{\nu,n}(x) = nx.
* Similarly the second ''x''-derivative of (x+y)^n, with y again then substituted y = 1 - x, shows that \sum_{\nu=0}^{n}\nu(\nu-1)\, b_{\nu,n}(x) = n(n-1)x^2.
* A Bernstein polynomial can always be written as a linear combination of polynomials of higher degree: b_{\nu,n-1}(x) = \frac{n-\nu}{n} b_{\nu,n}(x) + \frac{\nu+1}{n} b_{\nu+1,n}(x).
* The expansion of the Chebyshev polynomials of the first kind into the Bernstein basis is T_n(u) = (2n-1)!! \sum_{k=0}^{n} \frac{(-1)^{n-k}}{(2k-1)!!\,(2n-2k-1)!!} b_{k,n}(u).
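
A few of these identities are easy to spot-check numerically. The sketch below reuses the illustrative bernstein_basis helper from the Definition section; it checks particular cases only and is not a proof.

from math import comb

def bernstein_basis(nu, n, x):      # as in the Definition sketch above
    return comb(n, nu) * x**nu * (1 - x)**(n - nu)

n, x = 5, 0.3

# Partition of unity: sum_nu b_{nu,n}(x) = 1
assert abs(sum(bernstein_basis(nu, n, x) for nu in range(n + 1)) - 1.0) < 1e-12

# First moment: sum_nu nu * b_{nu,n}(x) = n * x
assert abs(sum(nu * bernstein_basis(nu, n, x) for nu in range(n + 1)) - n * x) < 1e-12

# Degree elevation: b_{nu,n-1}(x) = (n-nu)/n * b_{nu,n}(x) + (nu+1)/n * b_{nu+1,n}(x)
nu = 2
lhs = bernstein_basis(nu, n - 1, x)
rhs = (n - nu) / n * bernstein_basis(nu, n, x) + (nu + 1) / n * bernstein_basis(nu + 1, n, x)
assert abs(lhs - rhs) < 1e-12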


Approximating continuous functions

Let ''ƒ'' be a continuous function on the interval [0, 1]. Consider the Bernstein polynomial

:B_n(f)(x) = \sum_{\nu=0}^{n} f\left(\frac{\nu}{n}\right) b_{\nu,n}(x).

It can be shown that

:\lim_{n \to \infty} B_n(f) = f

uniformly on the interval [0, 1] (Natanson 1964, p. 6). Bernstein polynomials thus provide one way to prove the Weierstrass approximation theorem that every real-valued continuous function on a real interval [''a'', ''b''] can be uniformly approximated by polynomial functions over \mathbb R (Natanson 1964, p. 3).

A more general statement for a function with continuous ''k''-th derivative is

:\left\| B_n(f)^{(k)} \right\|_\infty \le \frac{(n)_k}{n^k} \left\| f^{(k)} \right\|_\infty \quad \text{and} \quad \left\| f^{(k)} - B_n(f)^{(k)} \right\|_\infty \to 0,

where additionally

:\frac{(n)_k}{n^k} = \left(1 - \frac{0}{n}\right)\left(1 - \frac{1}{n}\right) \cdots \left(1 - \frac{k-1}{n}\right)

is an eigenvalue of ''B''_''n''; the corresponding eigenfunction is a polynomial of degree ''k''.
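
As an illustrative sketch (the test function and degrees are arbitrary choices, not from the source), the operator B_n(f) can be coded by sampling ''f'' at the nodes \nu/n and blending the samples with the basis polynomials; the maximum sampled error shrinks as ''n'' grows, in line with the uniform convergence stated above.

from math import comb, sin, pi

def bernstein_approx(f, n, x):
    """B_n(f)(x) = sum_{nu=0}^{n} f(nu/n) * b_{nu,n}(x)."""
    return sum(f(nu / n) * comb(n, nu) * x**nu * (1 - x)**(n - nu)
               for nu in range(n + 1))

f = lambda t: sin(pi * t)           # a continuous function on [0, 1]
for n in (5, 50, 500):
    err = max(abs(bernstein_approx(f, n, i / 200) - f(i / 200)) for i in range(201))
    print(n, err)                   # the maximum sampled error decreases with n

Convergence is slow compared with other polynomial approximations, which is consistent with the O(1/n) behaviour implied by the eigenvalue statement above.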


Probabilistic proof

This proof follows Bernstein's original proof of 1912. See also Feller (1966) or Koralov & Sinai (2007).

Suppose ''K'' is a random variable distributed as the number of successes in ''n'' independent Bernoulli trials with probability ''x'' of success on each trial; in other words, ''K'' has a binomial distribution with parameters ''n'' and ''x''. Then we have the expected value \operatorname{E}\left[\frac{K}{n}\right] = x and

:p(K) = \binom{n}{K} x^{K} \left(1 - x\right)^{n-K} = b_{K,n}(x).

By the weak law of large numbers of probability theory,

:\lim_{n \to \infty} P\left(\left|\frac{K}{n} - x\right| > \delta\right) = 0

for every ''δ'' > 0. Moreover, this relation holds uniformly in ''x'', which can be seen from its proof via Chebyshev's inequality, taking into account that the variance of \tfrac{1}{n}K, equal to \tfrac{1}{n}x(1-x), is bounded from above by \tfrac{1}{4n} irrespective of ''x''.

Because ''ƒ'', being continuous on a closed bounded interval, must be uniformly continuous on that interval, one infers a statement of the form

:\lim_{n \to \infty} P\left(\left|f\left(\frac{K}{n}\right) - f(x)\right| > \varepsilon\right) = 0

uniformly in ''x''. Taking into account that ''ƒ'' is bounded (on the given interval) one gets for the expectation

:\lim_{n \to \infty} \operatorname{E}\left(\left|f\left(\frac{K}{n}\right) - f(x)\right|\right) = 0

uniformly in ''x''. To this end one splits the sum for the expectation in two parts. On one part the difference does not exceed ''ε''; this part cannot contribute more than ''ε''. On the other part the difference exceeds ''ε'', but does not exceed 2''M'', where ''M'' is an upper bound for |''ƒ''(''x'')|; this part cannot contribute more than 2''M'' times the small probability that the difference exceeds ''ε''. Finally, one observes that the absolute value of the difference between expectations never exceeds the expectation of the absolute value of the difference, and

:\operatorname{E}\left[f\left(\frac{K}{n}\right)\right] = \sum_{K=0}^{n} f\left(\frac{K}{n}\right) p(K) = \sum_{K=0}^{n} f\left(\frac{K}{n}\right) b_{K,n}(x) = B_n(f)(x).
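
The closing identity, \operatorname{E}[f(K/n)] = B_n(f)(x), can be checked numerically. The sketch below assumes SciPy is available and uses scipy.stats.binom for the distribution of ''K''; the test function and parameters are arbitrary illustrative choices.

from math import comb
from scipy.stats import binom

n, x = 20, 0.3
f = lambda t: t * t                  # any continuous test function works

# Expectation of f(K/n) under K ~ Binomial(n, x)
expectation = sum(f(k / n) * binom.pmf(k, n, x) for k in range(n + 1))

# The same number written as a Bernstein polynomial B_n(f)(x)
bernstein_value = sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
                      for k in range(n + 1))

print(expectation, bernstein_value)  # agree up to rounding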


Elementary proof

The probabilistic proof can also be rephrased in an elementary way, using the underlying probabilistic ideas but proceeding by direct verification. The following identities can be verified:

# \sum_{k} \binom{n}{k} x^{k} (1-x)^{n-k} = 1 ("probability")
# \sum_{k} \frac{k}{n} \binom{n}{k} x^{k} (1-x)^{n-k} = x ("mean")
# \sum_{k} \left(x - \frac{k}{n}\right)^{2} \binom{n}{k} x^{k} (1-x)^{n-k} = \frac{x(1-x)}{n}. ("variance")

In fact, by the binomial theorem

:(1+t)^n = \sum_{k} \binom{n}{k} t^{k},

and this equation can be applied twice to t\frac{d}{dt}. The identities (1), (2), and (3) follow easily using the substitution t = x/(1 - x).

Within these three identities, use the above basis polynomial notation

:b_{k,n}(x) = \binom{n}{k} x^{k} (1-x)^{n-k},

and let

:f_n(x) = \sum_{k} f(k/n)\, b_{k,n}(x).

Thus, by identity (1),

:f_n(x) - f(x) = \sum_{k} \left[f(k/n) - f(x)\right] b_{k,n}(x),

so that

:|f_n(x) - f(x)| \le \sum_{k} |f(k/n) - f(x)|\, b_{k,n}(x).

Since ''f'' is uniformly continuous, given \varepsilon > 0, there is a \delta > 0 such that |f(a) - f(b)| < \varepsilon whenever |a - b| < \delta. Moreover, by continuity, M = \sup |f| < \infty. But then

:|f_n(x) - f(x)| \le \sum_{|x - k/n| < \delta} |f(k/n) - f(x)|\, b_{k,n}(x) + \sum_{|x - k/n| \ge \delta} |f(k/n) - f(x)|\, b_{k,n}(x).

The first sum is less than ''ε''. On the other hand, by identity (3) above, and since |x - k/n| \ge \delta, the second sum is bounded by 2''M'' times

:\sum_{|x - k/n| \ge \delta} b_{k,n}(x) \le \sum_{k} \delta^{-2} \left(x - \frac{k}{n}\right)^{2} b_{k,n}(x) = \delta^{-2}\, \frac{x(1-x)}{n} < \frac{1}{4} \delta^{-2} n^{-1}

(Chebyshev's inequality). It follows that the polynomials f_n tend to ''f'' uniformly.
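
Identities (1)–(3) are finite sums, so they can be spot-checked directly for particular ''n'' and ''x'' (an illustrative check only, not a substitute for the algebraic verification above).

from math import comb, isclose

n, x = 12, 0.37
b = [comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)]

assert isclose(sum(b), 1.0)                                    # (1) probability
assert isclose(sum(k / n * b[k] for k in range(n + 1)), x)     # (2) mean
assert isclose(sum((x - k / n)**2 * b[k] for k in range(n + 1)),
               x * (1 - x) / n)                                # (3) variance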


Generalizations to higher dimension

Bernstein polynomials can be generalized to ''k'' dimensions – the resulting polynomials have the form b_{i_1,n_1}(x_1)\, b_{i_2,n_2}(x_2) \cdots b_{i_k,n_k}(x_k). In the simplest case only products of the unit interval [0, 1] are considered; but, using affine transformations of the line, Bernstein polynomials can also be defined for products [a_1, b_1] \times [a_2, b_2] \times \cdots \times [a_k, b_k]. For a continuous function ''f'' on the ''k''-fold product of the unit interval, the proof that f(x_1, x_2, \dots, x_k) can be uniformly approximated by

:\sum_{i_1} \sum_{i_2} \cdots \sum_{i_k} \binom{n_1}{i_1} \binom{n_2}{i_2} \cdots \binom{n_k}{i_k} f\left(\frac{i_1}{n_1}, \frac{i_2}{n_2}, \dots, \frac{i_k}{n_k}\right) x_1^{i_1} (1-x_1)^{n_1-i_1}\, x_2^{i_2} (1-x_2)^{n_2-i_2} \cdots x_k^{i_k} (1-x_k)^{n_k-i_k}

is a straightforward extension of Bernstein's proof in one dimension.
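
For the two-dimensional case (''k'' = 2) the tensor-product approximation can be sketched as follows in Python; the function, degrees and evaluation point are arbitrary illustrative choices.

from math import comb

def bernstein_approx_2d(f, n1, n2, x1, x2):
    """Tensor-product Bernstein approximation of f on the unit square."""
    total = 0.0
    for i1 in range(n1 + 1):
        w1 = comb(n1, i1) * x1**i1 * (1 - x1)**(n1 - i1)
        for i2 in range(n2 + 1):
            w2 = comb(n2, i2) * x2**i2 * (1 - x2)**(n2 - i2)
            total += f(i1 / n1, i2 / n2) * w1 * w2
    return total

f = lambda u, v: u * v + v**2        # a continuous function on [0, 1]^2
print(bernstein_approx_2d(f, 40, 40, 0.25, 0.5), f(0.25, 0.5))  # close for large n1, n2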


See also

* Polynomial interpolation
* Newton form
* Lagrange form
* Binomial QMF (also known as Daubechies wavelet)


