Eigenvalues And Eigenvectors Of The Second Derivative
Explicit formulas for the eigenvalues and eigenvectors of the second derivative with different boundary conditions are provided both for the continuous and the discrete case. In the discrete case, the standard central difference approximation of the second derivative is used on a uniform grid. These formulas are used to derive the expressions for eigenfunctions of the Laplacian in the case of separation of variables, as well as to find the eigenvalues and eigenvectors of the multidimensional discrete Laplacian on a regular grid, which is presented as a Kronecker sum of one-dimensional discrete Laplacians.

The continuous case
The index j represents the jth eigenvalue or eigenvector and runs from 1 to \infty. Assuming the equation is defined on the domain x \in [0, L], the following are the eigenvalues and normalized eigenvectors. The eigenvalues are ordered in descending order.

Pure Dirichlet boundary conditions
:\lambda_j = -\frac{j^2 \pi^2}{L^2}
:v_j(x) = \sqrt{\frac{2}{L}} \sin\left(\frac{j \pi x}{L}\right)

Pure Neumann boundary conditions
:\lambda_j = -\frac{(j-1)^2 \pi^2}{L^2}
:v_j(x) = \begin{cases} L^{-1/2}, & j = 1 \\ \sqrt{\frac{2}{L}} \cos\left(\frac{(j-1) \pi x}{L}\right), & j \ge 2 \end{cases}
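For the discrete case, the closed-form eigenvalues of the central-difference second-derivative matrix with homogeneous Dirichlet boundary conditions can be checked numerically. Below is a minimal Python sketch, using the known formula \lambda_j = -\frac{4}{h^2}\sin^2\left(\frac{j\pi}{2(n+1)}\right); the grid size n and length L are arbitrary demo choices, not values from the article.

 import numpy as np

 # Minimal sketch: verify the closed-form eigenvalues of the 1D discrete
 # Laplacian (central differences, homogeneous Dirichlet BC) against NumPy.
 n, L = 8, 1.0          # number of interior grid points, domain length
 h = L / (n + 1)        # uniform grid spacing

 # Standard central-difference second-derivative matrix (Dirichlet BC).
 A = (np.diag(-2.0 * np.ones(n)) +
      np.diag(np.ones(n - 1), 1) +
      np.diag(np.ones(n - 1), -1)) / h**2

 j = np.arange(1, n + 1)
 lam_exact = -4.0 / h**2 * np.sin(j * np.pi / (2 * (n + 1)))**2

 lam_num = np.sort(np.linalg.eigvalsh(A))[::-1]   # descending, like lam_exact
 assert np.allclose(lam_num, lam_exact)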
Second Derivative
In calculus, the second derivative, or the second-order derivative, of a function f is the derivative of the derivative of f. Informally, the second derivative can be phrased as "the rate of change of the rate of change"; for example, the second derivative of the position of an object with respect to time is the instantaneous acceleration of the object, or the rate at which the velocity of the object is changing with respect to time. In Leibniz notation: a = \frac{dv}{dt} = \frac{d^2x}{dt^2}, where a is acceleration, v is velocity, t is time, x is position, and d is the instantaneous "delta" or change. The last expression \tfrac{d^2x}{dt^2} is the second derivative of position (x) with respect to time. On the graph of a function, the second derivative corresponds to the curvature or concavity of the graph. The graph of a function with a positive second derivative is upwardly concave, while the graph of a function with a negative second derivative curves in the opposite way.

Second derivative power rule
The power rule for the first derivative, applied twice, yields the second derivative power rule: \frac{d^2}{dx^2}\left[x^n\right] = n(n-1)x^{n-2}.
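A short SymPy sketch of the "derivative of the derivative" relation; the position function is my own example, not from the article.

 import sympy as sp

 # Sketch: the second derivative of position x(t) is acceleration.
 t = sp.symbols('t')
 x = t**3 + 2*t                 # hypothetical position function
 v = sp.diff(x, t)              # velocity dx/dt = 3*t**2 + 2
 a = sp.diff(v, t)              # acceleration d^2x/dt^2 = 6*t
 assert a == sp.diff(x, t, 2)   # derivative of the derivative, in one step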
Central Difference
A finite difference is a mathematical expression of the form f(x + b) - f(x + a). Finite differences (or the associated difference quotients) are often used as approximations of derivatives, such as in numerical differentiation. The difference operator, commonly denoted \Delta, is the operator that maps a function f to the function \Delta[f] defined by \Delta[f](x) = f(x+1) - f(x). A difference equation is a functional equation that involves the finite difference operator in the same way as a differential equation involves derivatives. There are many similarities between difference equations and differential equations. Certain recurrence relations can be written as difference equations by replacing iteration notation with finite differences. In numerical analysis, finite differences are widely used for approximating derivatives, and the term "finite difference" is often used as an abbreviation of "finite difference approximation of derivatives". Finite differences were introduced by Brook Taylor in 1715.
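A minimal Python sketch of the forward difference operator and of the central difference approximation of the second derivative used in the main article; the function names and step sizes are my own illustrative choices.

 import math

 # Sketch: forward difference Delta[f](x) = f(x+1) - f(x), and the
 # central difference approximation of the second derivative.
 def forward_difference(f, x, step=1.0):
     return f(x + step) - f(x)

 def central_second_difference(f, x, h=1e-4):
     # f''(x) is approximately (f(x+h) - 2*f(x) + f(x-h)) / h**2
     return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

 print(forward_difference(lambda u: u**2, 3.0))    # (3+1)**2 - 3**2 = 7.0
 print(central_second_difference(math.sin, 1.0))   # close to -sin(1) = -0.8415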
Eigenfunctions
In mathematics, an eigenfunction of a linear operator ''D'' defined on some function space is any non-zero function f in that space that, when acted upon by ''D'', is only multiplied by some scaling factor called an eigenvalue. As an equation, this condition can be written as Df = \lambda f for some scalar eigenvalue \lambda. The solutions to this equation may also be subject to boundary conditions that limit the allowable eigenvalues and eigenfunctions. An eigenfunction is a type of eigenvector.

Eigenfunctions
In general, an eigenvector of a linear operator ''D'' defined on some vector space is a nonzero vector in the domain of ''D'' that, when ''D'' acts upon it, is simply scaled by some scalar value called an eigenvalue. In the special case where ''D'' is defined on a function space, the eigenvectors are referred to as eigenfunctions.
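As a quick symbolic check of the defining relation Df = \lambda f, here is a SymPy sketch with D = d/dx and the classic eigenfunction e^{\lambda x}; the operator and function are my own example.

 import sympy as sp

 # Sketch of D f = lam * f with D = d/dx and f(x) = exp(lam*x).
 x, lam = sp.symbols('x lam')
 f = sp.exp(lam * x)
 assert sp.simplify(sp.diff(f, x) - lam * f) == 0   # D f equals lam * f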
Laplacian
In mathematics, the Laplace operator or Laplacian is a differential operator given by the divergence of the gradient of a scalar function on Euclidean space. It is usually denoted by the symbols \nabla\cdot\nabla, \nabla^2 (where \nabla is the nabla operator), or \Delta. In a Cartesian coordinate system, the Laplacian is given by the sum of second partial derivatives of the function with respect to each independent variable. In other coordinate systems, such as cylindrical and spherical coordinates, the Laplacian also has a useful form. Informally, the Laplacian of a function f at a point p measures by how much the average value of f over small spheres or balls centered at p deviates from f(p). The Laplace operator is named after the French mathematician Pierre-Simon de Laplace (1749–1827), who first applied the operator to the study of celestial mechanics: the Laplacian of the gravitational potential due to a given mass density distribution is a constant multiple of that density distribution.
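A small SymPy sketch of the Cartesian definition, a sum of second partial derivatives, evaluated on a harmonic function of my own choosing (for which the Laplacian vanishes).

 import sympy as sp

 # Sketch: Cartesian Laplacian as a sum of second partial derivatives.
 x, y = sp.symbols('x y')
 f = x**2 - y**2                # harmonic function: Laplacian should be 0
 laplacian = sp.diff(f, x, 2) + sp.diff(f, y, 2)
 assert sp.simplify(laplacian) == 0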
Separation Of Variables
In mathematics, separation of variables (also known as the Fourier method) is any of several methods for solving ordinary and partial differential equations, in which algebra allows one to rewrite an equation so that each of two variables occurs on a different side of the equation.

Ordinary differential equations (ODE)
A differential equation for the unknown f(x) is separable if it can be written in the form
:\frac{d}{dx} f(x) = g(x)h(f(x))
where g and h are given functions. This is perhaps more transparent when written using y = f(x) as:
:\frac{dy}{dx}=g(x)h(y).
So now as long as ''h''(''y'') ≠ 0, we can rearrange terms to obtain:
:\frac{dy}{h(y)} = g(x) \, dx,
where the two variables ''x'' and ''y'' have been separated. Note ''dx'' (and ''dy'') can be viewed, at a simple level, as just a convenient notation, which provides a handy mnemonic aid for assisting with manipulations. A formal definition of ''dx'' as a differential (infinitesimal) is somewhat advanced.
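A SymPy sketch of a separable ODE with g(x) = x and h(y) = y (my own example); separating gives dy/y = x dx, hence y = C e^{x^2/2}.

 import sympy as sp

 # Sketch: solve the separable ODE dy/dx = x*y.
 x = sp.symbols('x')
 y = sp.Function('y')
 sol = sp.dsolve(sp.Eq(y(x).diff(x), x * y(x)), y(x))
 print(sol)   # Eq(y(x), C1*exp(x**2/2))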
Eigenvalue
In linear algebra, an eigenvector or characteristic vector is a vector that has its direction unchanged (or reversed) by a given linear transformation. More precisely, an eigenvector \mathbf v of a linear transformation T is scaled by a constant factor \lambda when the linear transformation is applied to it: T\mathbf v=\lambda \mathbf v. The corresponding eigenvalue, characteristic value, or characteristic root is the multiplying factor \lambda (possibly a negative or complex number). Geometrically, vectors are multi-dimensional quantities with magnitude and direction, often pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon which it acts. A linear transformation's eigenvectors are those vectors that are only stretched or shrunk, with neither rotation nor shear. The corresponding eigenvalue is the factor by which an eigenvector is stretched or shrunk. If the eigenvalue is negative, the eigenvector's direction is reversed.
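A NumPy sketch of the relation T\mathbf v = \lambda \mathbf v on a small symmetric matrix of my own choosing.

 import numpy as np

 # Sketch: numerically confirm T v = lam * v for each eigenpair.
 T = np.array([[2.0, 1.0],
               [1.0, 2.0]])
 lams, vecs = np.linalg.eig(T)
 for lam, v in zip(lams, vecs.T):            # eigenvectors are the columns
     assert np.allclose(T @ v, lam * v)
 print(lams)                                 # eigenvalues 3.0 and 1.0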
Discrete Laplace Operator
In mathematics, the discrete Laplace operator is an analog of the continuous Laplace operator, defined so that it has meaning on a graph or a discrete grid (lattice). For the case of a finite-dimensional graph (having a finite number of edges and vertices), the discrete Laplace operator is more commonly called the Laplacian matrix. The discrete Laplace operator occurs in physics problems such as the Ising model and loop quantum gravity, as well as in the study of discrete dynamical systems. It is also used in numerical analysis as a stand-in for the continuous Laplace operator. Common applications include image processing, where it is known as the Laplace filter, and machine learning, for clustering and semi-supervised learning on neighborhood graphs.

Definitions
Graph Laplacians
There are various definitions of the ''discrete Laplacian'' for graphs, differing by sign and scale factor.
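A minimal NumPy sketch of one common convention for the graph Laplacian, L = D - A (degree matrix minus adjacency matrix), on a path graph of my own choosing; as the excerpt notes, sign and scaling conventions vary.

 import numpy as np

 # Sketch: graph Laplacian L = D - A for a path graph on four vertices.
 A = np.array([[0, 1, 0, 0],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [0, 0, 1, 0]], dtype=float)   # adjacency matrix
 D = np.diag(A.sum(axis=1))                  # degree matrix
 L = D - A
 assert np.allclose(L.sum(axis=1), 0.0)      # each row of L sums to zero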
Regular Grid
A regular grid is a tessellation of ''n''-dimensional Euclidean space by congruent parallelotopes (e.g. bricks). Its opposite is an irregular grid. Grids of this type appear on graph paper and may be used in finite element analysis, finite volume methods, finite difference methods, and in general for discretization of parameter spaces. Since the derivatives of field variables can be conveniently expressed as finite differences, structured grids mainly appear in finite difference methods. Unstructured grids offer more flexibility than structured grids and hence are very useful in finite element and finite volume methods. Each cell in the grid can be addressed by index (i, j) in two dimensions or (i, j, k) in three dimensions, and each vertex has coordinates (i\cdot dx, j\cdot dy) in 2D or (i\cdot dx, j\cdot dy, k\cdot dz) in 3D for some real numbers ''dx'', ''dy'', and ''dz'' representing the grid spacing.
Kronecker Sum Of Discrete Laplacians
In mathematics, the Kronecker sum of discrete Laplacians, named after Leopold Kronecker, is a discrete version of the separation of variables for the continuous Laplacian in a rectangular cuboid domain.

General form of the Kronecker sum of discrete Laplacians
In a general situation of the separation of variables in the discrete case, the multidimensional discrete Laplacian is a Kronecker sum of 1D discrete Laplacians.

Example: 2D discrete Laplacian on a regular grid with the homogeneous Dirichlet boundary condition
Mathematically, using the Kronecker sum:
:L = \mathbf{D_{xx}} \otimes \mathbf{I} + \mathbf{I} \otimes \mathbf{D_{yy}}, \,
where \mathbf{D_{xx}} and \mathbf{D_{yy}} are 1D discrete Laplacians in the ''x''- and ''y''-directions, correspondingly, and \mathbf{I} are the identities of appropriate sizes. Both \mathbf{D_{xx}} and \mathbf{D_{yy}} must correspond to the case of the homogeneous Dirichlet boundary condition at the end points of the ''x''- and ''y''-intervals, in order to generate the 2D discrete Laplacian ''L'' corresponding to the homogeneous Dirichlet boundary condition.
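A NumPy sketch of this construction (the grid sizes nx and ny are arbitrary demo choices, and the grid spacing is taken as h = 1), together with a check of the Kronecker-sum eigenvalue property: the eigenvalues of L are all pairwise sums of the 1D eigenvalues.

 import numpy as np

 # Sketch: 2D discrete Laplacian as a Kronecker sum of 1D Laplacians.
 def laplacian_1d(n):
     # 1D central-difference Laplacian, homogeneous Dirichlet BC, h = 1
     return (np.diag(-2.0 * np.ones(n)) +
             np.diag(np.ones(n - 1), 1) +
             np.diag(np.ones(n - 1), -1))

 nx, ny = 4, 3
 Dxx, Dyy = laplacian_1d(nx), laplacian_1d(ny)
 L = np.kron(Dxx, np.eye(ny)) + np.kron(np.eye(nx), Dyy)   # Kronecker sum

 # Eigenvalues of a Kronecker sum: all pairwise sums of the 1D eigenvalues.
 lx, ly = np.linalg.eigvalsh(Dxx), np.linalg.eigvalsh(Dyy)
 expected = np.sort(np.add.outer(lx, ly).ravel())
 assert np.allclose(np.sort(np.linalg.eigvalsh(L)), expected)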
Chebyshev Polynomials
The Chebyshev polynomials are two sequences of orthogonal polynomials related to the cosine and sine functions, notated as T_n(x) and U_n(x). They can be defined in several equivalent ways, one of which starts with trigonometric functions: The Chebyshev polynomials of the first kind T_n are defined by T_n(\cos \theta) = \cos(n\theta). Similarly, the Chebyshev polynomials of the second kind U_n are defined by U_n(\cos \theta) \sin \theta = \sin\big((n + 1)\theta\big). That these expressions define polynomials in \cos\theta is not obvious at first sight but can be shown using de Moivre's formula. The Chebyshev polynomials are polynomials with the largest possible leading coefficient whose absolute value on the interval [-1, 1] is bounded by 1. They are also the "extremal" polynomials for many other properties. In 1952, Cornelius Lanczos showed that the Chebyshev polynomials are important in approximation theory for the solution of linear systems; the roots of T_n(x), which are also called Chebyshev nodes, are used as matching points for optimizing polynomial interpolation.
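A NumPy sketch computing the first-kind polynomials via the standard three-term recurrence T_0 = 1, T_1 = x, T_{n+1} = 2x T_n - T_{n-1}, checked against the defining identity T_n(\cos\theta) = \cos(n\theta); the function name and sample points are my own choices.

 import numpy as np

 # Sketch: first-kind Chebyshev polynomials via the three-term recurrence.
 def chebyshev_T(n, x):
     t_prev, t = np.ones_like(x), x
     if n == 0:
         return t_prev
     for _ in range(n - 1):
         t_prev, t = t, 2.0 * x * t - t_prev
     return t

 theta = np.linspace(0.0, np.pi, 50)
 for n in range(6):
     # check T_n(cos(theta)) == cos(n*theta)
     assert np.allclose(chebyshev_T(n, np.cos(theta)), np.cos(n * theta))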
Operator Theory
In mathematics, operator theory is the study of linear operators on function spaces, beginning with differential operators and integral operators. The operators may be presented abstractly by their characteristics, such as bounded linear operators or closed operators, and consideration may be given to nonlinear operators. The study, which depends heavily on the topology of function spaces, is a branch of functional analysis. If a collection of operators forms an algebra over a field, then it is an operator algebra. The description of operator algebras is part of operator theory.

Single operator theory
Single operator theory deals with the properties and classification of operators, considered one at a time. For example, the classification of normal operators in terms of their spectra falls into this category.

Spectrum of operators
The spectral theorem is any of a number of results about linear operators or about matrices. In broad terms the spectral theorem provides conditions under which an operator or a matrix can be diagonalized, that is, represented as a diagonal matrix in some basis.