Sub-pixel Resolution
In digital image processing, sub-pixel resolution can be obtained in images constructed from sources with information exceeding the nominal pixel resolution of those images. For example, if the image of a ship 50 m in length, viewed side-on, is 500 pixels long, the nominal resolution (pixel size) on the side of the ship facing the camera is 10 cm. Sub-pixel resolution of well-resolved features can then measure ship movements that are an order of magnitude (10×) smaller. Movement is mentioned specifically because measuring absolute positions requires an accurate lens model and known reference points within the image to achieve sub-pixel position accuracy; small movements can, however, be measured (down to 1 cm) with simple calibration procedures. Specific fit functions often suffer from bias with respect to image pixel boundaries, so users should take care to avoid these "pixel locking" (or "peak locking") effects.
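As a rough illustration of how such sub-pixel measurements can be made, the sketch below estimates the shift between two one-dimensional brightness profiles by locating the peak of their cross-correlation with a three-point parabolic fit. It is a minimal, hypothetical example written for this summary (the function names and the synthetic Gaussian test feature are illustrative, not taken from any particular library), and a simple fit like this is exactly the kind of estimator that can show the pixel-locking bias mentioned above.

    import numpy as np

    def parabolic_subpixel_peak(profile):
        """Peak position of a 1-D profile to sub-pixel precision, from a parabola
        fitted through the maximum sample and its two neighbours.
        Note: simple three-point fits can bias estimates toward integer pixel
        positions ("peak locking"); fitting in log-intensity is a common remedy
        for Gaussian-like peaks."""
        i = int(np.argmax(profile))
        if i == 0 or i == len(profile) - 1:
            return float(i)                      # peak on the boundary: nothing to fit
        y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
        denom = y0 - 2.0 * y1 + y2
        if denom == 0.0:
            return float(i)
        return i + 0.5 * (y0 - y2) / denom       # vertex of the fitted parabola

    def shift_between(a, b):
        """Sub-pixel shift of profile b relative to profile a via cross-correlation."""
        corr = np.correlate(b, a, mode="full")   # zero lag sits at index len(a) - 1
        return parabolic_subpixel_peak(corr) - (len(a) - 1)

    x = np.arange(200, dtype=float)
    a = np.exp(-0.5 * ((x - 100.0) / 5.0) ** 2)  # bright feature centred at pixel 100
    b = np.exp(-0.5 * ((x - 100.3) / 5.0) ** 2)  # same feature moved by 0.3 pixels
    print(shift_between(a, b))                   # prints a value close to 0.3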
Digital Image Processing
Digital image processing is the use of a digital computer to process digital images through an algorithm. As a subcategory or field of digital signal processing, digital image processing has many advantages over analog image processing: it allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and distortion during processing. Since images are defined over two dimensions (perhaps more), digital image processing may be modeled in the form of multidimensional systems. The generation and development of digital image processing have been driven mainly by three factors: first, the development of computers; second, the development of mathematics (especially the creation and improvement of discrete mathematics theory); and third, the increasing demand for a wide range of applications in environment, agriculture, the military, industry and medical science.
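As a concrete, minimal sketch of such an algorithm (not taken from the text above), the snippet below treats a grayscale image as a two-dimensional array of numbers and smooths it with a small mean filter, reducing pixel-to-pixel noise; the function name and the synthetic noisy image are illustrative assumptions.

    import numpy as np

    def box_blur(image, radius=1):
        """Smooth a 2-D grayscale image with a (2*radius+1)^2 mean filter,
        treating the image as a two-dimensional (multidimensional) signal."""
        padded = np.pad(image.astype(float), radius, mode="edge")
        size = 2 * radius + 1
        out = np.zeros(image.shape, dtype=float)
        for dy in range(size):                    # sum the shifted copies ...
            for dx in range(size):
                out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
        return out / (size * size)                # ... and take their mean

    rng = np.random.default_rng(0)
    noisy = 100.0 + 10.0 * rng.standard_normal((64, 64))   # flat image plus noise
    print(noisy.std(), box_blur(noisy).std())     # the filter cuts the noise roughly 3x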
Image Resolution
Image resolution is the level of detail of an image. The term applies to digital images, film images, and other types of images. "Higher resolution" means more image detail. Image resolution can be measured in various ways. Resolution quantifies how close lines can be to each other and still be visibly resolved. Resolution units can be tied to physical sizes (e.g. lines per mm, lines per inch), to the overall size of a picture (lines per picture height, also known simply as lines, TV lines, or TVL), or to angular subtense. Instead of single lines, line pairs are often used, composed of a dark line and an adjacent light line; for example, a resolution of 10 lines per millimeter means 5 dark lines alternating with 5 light lines, or 5 line pairs per millimeter (5 LP/mm). Photographic lens resolution is most often quoted in line pairs per millimeter. The resolution of digital cameras can be described in many different ways; the term "resolution" is often considered equivalent to pixel count.
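The line/line-pair arithmetic above is simple enough to state as a worked example; the two helper functions below are purely illustrative (they are not standard library routines).

    def lines_per_mm_to_lp_per_mm(lines_per_mm):
        """One line pair = one dark line plus one adjacent light line."""
        return lines_per_mm / 2.0

    def smallest_resolved_line_mm(lp_per_mm):
        """Width of a single resolved line, i.e. half of one line-pair period."""
        return 1.0 / (2.0 * lp_per_mm)

    print(lines_per_mm_to_lp_per_mm(10))   # 10 lines/mm -> 5 LP/mm, as in the text
    print(smallest_resolved_line_mm(5))    # 5 LP/mm -> each line is 0.1 mm wide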
Point Spread Function
The point spread function (PSF) describes the response of a focused optical imaging system to a point source or point object. A more general term for the PSF is the system's impulse response: the PSF is the impulse response, or impulse response function (IRF), of a focused optical imaging system. In many contexts the PSF can be thought of as the shapeless blob in an image that should represent a single point object, i.e. a spatial impulse response. In functional terms, it is the spatial-domain version (the inverse Fourier transform) of the optical transfer function (OTF) of the imaging system. It is a useful concept in Fourier optics, astronomical imaging, medical imaging, electron microscopy and other imaging techniques such as 3D microscopy (as in confocal laser scanning microscopy) and fluorescence microscopy. The degree of spreading (blurring) in the image of a point object is a measure of the quality of an imaging system.
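The relationship between the PSF and the OTF can be sketched numerically. The example below is an illustrative stand-in (it assumes a Gaussian-shaped PSF rather than any particular optical system): it shows that a single bright pixel is imaged onto the PSF itself, and that the Fourier transform of the normalised PSF gives an OTF whose zero-frequency value is 1.

    import numpy as np

    def gaussian_psf(size=65, sigma=3.0):
        """A normalised 2-D Gaussian used as a stand-in point spread function."""
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
        return psf / psf.sum()

    psf = gaussian_psf()
    point = np.zeros_like(psf)
    point[32, 32] = 1.0                           # a single point object

    # Imaging acts as convolution with the PSF (done here via the FFT), so the
    # image of the point object is the PSF itself.
    kernel = np.fft.ifftshift(psf)                # move the PSF centre to index (0, 0)
    image = np.real(np.fft.ifft2(np.fft.fft2(point) * np.fft.fft2(kernel)))
    print(np.allclose(image, psf))                # True

    # The OTF is the Fourier transform of the PSF; for a normalised PSF its
    # zero-frequency (DC) value is 1.
    otf = np.fft.fft2(kernel)
    print(np.isclose(abs(otf[0, 0]), 1.0))        # True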
Acutance
In photography, acutance describes a subjective perception of visual acuity that is related to the edge contrast of an image. Acutance is related to the magnitude of the gradient of brightness. Because of the nature of the human visual system, an image with higher acutance appears sharper even though an increase in acutance does not increase real resolution. Historically, acutance was enhanced chemically during development of a negative (high-acutance developers) or by optical means in printing (unsharp masking). In digital photography, onboard camera software and image post-processing tools such as Photoshop or GIMP offer various sharpening facilities, the most widely used of which is known as "unsharp mask" because the algorithm is derived from the eponymous analog processing method. In the example image, two light gray lines were drawn on a gray background; as the transition is instantaneous, the line is as sharp as can be represented at this resolution.
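As a sketch of the unsharp-mask idea described above (a hypothetical one-dimensional version, not the Photoshop or GIMP implementation), the snippet below blurs a brightness profile, subtracts the blur to isolate the edge detail, and adds that detail back. This steepens the brightness gradient across the edge (higher acutance) without adding real resolution.

    import numpy as np

    def unsharp_mask_1d(profile, amount=1.0, radius=2):
        """sharpened = original + amount * (original - blurred)."""
        kernel = np.ones(2 * radius + 1) / (2 * radius + 1)      # simple mean blur
        padded = np.pad(profile.astype(float), radius, mode="edge")
        blurred = np.convolve(padded, kernel, mode="valid")
        return profile + amount * (profile - blurred)

    # Brightness profile across a soft edge: after sharpening, the largest
    # step between neighbouring pixels (the brightness gradient) is bigger,
    # and the profile shows the characteristic over- and undershoot.
    x = np.linspace(-4.0, 4.0, 41)
    edge = 100.0 + 100.0 / (1.0 + np.exp(-2.0 * x))
    sharpened = unsharp_mask_1d(edge)
    print(np.max(np.diff(edge)), np.max(np.diff(sharpened)))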
Hyperacuity
The sharpness of our senses is defined by the finest detail we can discriminate. Visual acuity is measured by the smallest letters that can be distinguished on a chart and is governed by the anatomical spacing of the mosaic of sensory elements on the retina. Yet spatial distinctions can be made on a finer scale still: misalignment of borders can be detected with a precision up to 10 times better than visual acuity, as already shown by Ewald Hering in 1899. This hyperacuity, transcending by far the size limits set by the retinal 'pixels', depends on sophisticated information processing in the brain. The distinction between acuity and hyperacuity is best illustrated in vision, for example when observing stars in the night sky. The first stage is the optical imaging of the outside world on the retina. Light impinges on the mosaic of receptor sense cells, rods and cones, which covers the retinal surface without gaps or overlap, much like the pixels of a digital camera sensor.