Distance is a numerical measurement of how far apart objects or points are. In physics or everyday usage, distance may refer to a physical length or to an estimation based on other criteria (e.g. "two counties over"). The distance from a point A to a point B is sometimes denoted as |AB|.[1] In most cases, "distance from A to B" is interchangeable with "distance from B to A".[2] In mathematics, a distance function or metric is a generalization of the concept of physical distance; it is a way of describing what it means for elements of some space to be "close to" or "far away from" each other. In psychology and the social sciences, distance is a non-numerical measurement; psychological distance is defined as "the different ways in which an object might be removed from" the self along dimensions such as "time, space, social distance, and hypotheticality".[3]

In statistics and information geometry, there are many kinds of statistical distances, notably divergences, especially Bregman divergences and f-divergences. These include and generalize many of the notions of "difference between two probability distributions", and allow them to be studied geometrically, as statistical manifolds. The most elementary is the squared Euclidean distance, which forms the basis of least squares; this is the most basic Bregman divergence. The most important in information theory is the relative entropy (Kullback–Leibler divergence), which allows one to analogously study maximum likelihood estimation.
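The two divergences mentioned above can be illustrated with a minimal sketch (the function names here are illustrative, not from any particular library). Note that the Kullback–Leibler divergence, unlike the squared Euclidean distance, is not symmetric in its arguments:

```python
import math

def squared_euclidean(p, q):
    """Squared Euclidean distance, the most basic Bregman divergence."""
    return sum((pi - qi) ** 2 for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Relative entropy (Kullback-Leibler divergence) between two discrete
    probability distributions given as sequences of probabilities.
    Terms with p_i = 0 contribute nothing by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(squared_euclidean(p, q))           # symmetric in p and q
print(kl_divergence(p, q), kl_divergence(q, p))  # asymmetric
```

Both quantities are zero exactly when the two distributions coincide, which is the minimal property shared by all statistical divergences.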

In terms of the distance from a point to a set (the infimum of the distances from the point to members of the set), the definition of the Hausdorff distance can be simplified: it is the larger of two values, one being the supremum, for a point ranging over one set, of the distance between the point and the other set, and the other value being likewise defined but with the roles of the two sets swapped.
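For finite point sets the suprema and infima become maxima and minima, so the simplified definition translates directly into code. This is a sketch for finite sets of points in the plane, not an efficient implementation:

```python
import math

def point_set_distance(p, S):
    """Distance from point p to finite set S: the minimum over members of S."""
    return min(math.dist(p, s) for s in S)

def hausdorff(A, B):
    """Hausdorff distance between finite point sets A and B:
    the larger of the two directed (one-sided) values."""
    d_ab = max(point_set_distance(a, B) for a in A)  # A ranging, B fixed
    d_ba = max(point_set_distance(b, A) for b in B)  # roles swapped
    return max(d_ab, d_ba)

A = [(0, 0), (1, 0)]
B = [(0, 1)]
print(hausdorff(A, B))  # sqrt(2): the point (1, 0) is farthest from B
```

The two directed values generally differ, which is why both must be computed; taking their maximum is what makes the result symmetric.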

In graph theory, the distance between two vertices is the length of the shortest path between those vertices.
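In an unweighted graph this distance can be computed by breadth-first search, which explores vertices in order of increasing path length. A minimal sketch, with the graph represented as an adjacency dictionary:

```python
from collections import deque

def graph_distance(adj, source, target):
    """Length of a shortest path between source and target in an
    unweighted graph given as an adjacency dict; None if unreachable."""
    if source == target:
        return 0
    seen = {source}
    queue = deque([(source, 0)])
    while queue:
        v, d = queue.popleft()
        for w in adj[v]:
            if w == target:
                return d + 1
            if w not in seen:
                seen.add(w)
                queue.append((w, d + 1))
    return None  # target not reachable from source

# A 4-cycle: two shortest paths of length 2 connect A and D.
adj = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(graph_distance(adj, "A", "D"))  # 2
```

On a connected graph this distance satisfies the axioms of a metric on the vertex set.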
