Computational photography refers to digital image capture and processing techniques that use digital computation instead of optical processes. Computational photography can improve the capabilities of a camera, introduce features that were not possible at all with film-based photography, or reduce the cost or size of camera elements. Examples of computational photography include in-camera computation of digital panoramas, high-dynamic-range images, and light field cameras. Light field cameras use novel optical elements to capture three-dimensional scene information, which can then be used to produce 3D images, enhanced depth of field, and selective de-focusing (or "post focus"). Enhanced depth of field reduces the need for mechanical focusing systems. All of these features use computational imaging techniques.

The definition of computational photography has evolved to cover a number of subject areas in computer graphics, computer vision, and applied optics. These areas are given below, organized according to a taxonomy proposed by Shree K. Nayar. Within each area is a list of techniques, and for each technique one or two representative papers or books are cited. Deliberately omitted from the taxonomy are image processing (see also digital image processing) techniques applied to traditionally captured images in order to produce better images. Examples of such techniques are image scaling, dynamic range compression (i.e. tone mapping), color management, image completion (a.k.a. inpainting or hole filling), image compression, digital watermarking, and artistic image effects. Also omitted are techniques that produce range data, volume data, 3D models, 4D light fields, 4D, 6D, or 8D BRDFs, or other high-dimensional image-based representations. Epsilon photography is a sub-field of computational photography.

Effect on photography

Computational photography can allow amateurs to produce photographs that rival the quality of work by professional photographers, but it does not currently outperform the use of professional-level equipment.

Computational illumination

This is the control of photographic illumination in a structured fashion, followed by processing of the captured images, to create new images. Applications include image-based relighting, image enhancement, image deblurring, and geometry/material recovery. High-dynamic-range imaging uses differently exposed pictures of the same scene to extend dynamic range. Other examples include processing and merging differently illuminated images of the same subject matter ("lightspace").
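
The exposure-merging idea behind high-dynamic-range imaging can be sketched in a few lines. The sensor model, exposure times, and hat-shaped weighting below are illustrative assumptions; real pipelines also recover the camera response curve and deal with noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy HDR merge: scene radiance spans far more range than a single
# exposure can hold, so several exposures are captured and merged.
radiance = 10 ** rng.uniform(-2, 2, 1000)   # hypothetical true scene radiance
exposures = [0.01, 0.1, 1.0, 10.0]          # relative exposure times

def capture(radiance, t):
    # Idealized noiseless sensor that clips at full well.
    return np.clip(radiance * t, 0.0, 1.0)

def merge(shots, times):
    # Weight each pixel by a hat function that favors mid-range values
    # (zero at the clipping limits), then average the radiance estimates.
    num = np.zeros_like(shots[0])
    den = np.zeros_like(shots[0])
    for img, t in zip(shots, times):
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * img / t
        den += w
    return num / np.maximum(den, 1e-12)

shots = [capture(radiance, t) for t in exposures]
hdr = merge(shots, exposures)
rel_err = np.median(np.abs(hdr - radiance) / radiance)
print(rel_err)
```

With this idealized sensor, every pixel is well exposed in at least one shot, so the merged estimate recovers the full four-decade radiance range that no single exposure could represent.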

Computational optics

This is capture of optically coded images, followed by computational decoding to produce new images.
Coded aperture imaging was mainly applied in astronomy or X-ray imaging to boost image quality. Instead of a single pinhole, a pinhole pattern is applied in imaging, and deconvolution is performed to recover the image. In coded exposure imaging, the on/off state of the shutter is coded to modify the kernel of motion blur. In this way motion deblurring becomes a well-conditioned problem. Similarly, in a lens-based coded aperture, the aperture can be modified by inserting a broadband mask, so that out-of-focus deblurring becomes a well-conditioned problem. The coded aperture can also improve the quality of light field acquisition using Hadamard transform optics. Coded aperture patterns can also be designed using color filters, in order to apply different codes at different wavelengths. This increases the amount of light that reaches the camera sensor, compared to binary masks.
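
The benefit of coded exposure can be illustrated with a one-dimensional toy model. The particular shutter code and regularization constant below are arbitrary choices for illustration, not values from the literature:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random(256)   # 1-D stand-in for one scanline of a moving scene

def blur(x, k):
    # Circular convolution via the FFT.
    K = np.fft.fft(k, len(x))
    return np.fft.ifft(np.fft.fft(x) * K).real

def deconvolve(y, k, eps=1e-3):
    # Regularized inverse filter; eps keeps near-zero frequencies in check.
    K = np.fft.fft(k, len(y))
    return np.fft.ifft(np.fft.fft(y) * K.conj() / (np.abs(K) ** 2 + eps)).real

# An open shutter gives a box blur whose spectrum has deep nulls; a coded
# ("fluttered") shutter keeps the spectrum flatter, so inversion is stable.
box = np.ones(32) / 32
code = np.array([1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0,
                 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0], float)
code /= code.sum()

noise = 1e-4 * rng.standard_normal(len(scene))
err_box = np.linalg.norm(deconvolve(blur(scene, box) + noise, box) - scene)
err_coded = np.linalg.norm(deconvolve(blur(scene, code) + noise, code) - scene)
print(err_box, err_coded)
```

The box kernel's Fourier transform has exact zeros, so those frequencies are unrecoverable; the coded kernel avoids deep nulls and deconvolves with much lower error, which is the sense in which coding makes deblurring well-conditioned.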

Computational imaging

Computational imaging is a set of imaging techniques that combine data acquisition and data processing to create the image of an object through indirect means, yielding enhanced resolution or additional information such as optical phase or 3D reconstruction. The information is often recorded without using a conventional optical microscope configuration, or with limited datasets. Computational imaging makes it possible to go beyond the physical limitations of optical systems, such as numerical aperture, or even eliminates the need for optical elements. For parts of the optical spectrum where imaging elements such as objectives are difficult to manufacture or image sensors cannot be miniaturized, computational imaging provides useful alternatives, in fields such as X-ray and THz radiation.

Common techniques

Among common computational imaging techniques are lensless imaging, computational speckle imaging (Katz et al., "Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations", ''Nature Photonics'' 8, 784–790, 2014), ptychography, and Fourier ptychography. Computational imaging techniques often draw on compressive sensing or phase retrieval, where the angular spectrum of the object is reconstructed. Other techniques are related to the field of computational imaging, such as digital holography, computer vision, and inverse problems such as tomography.
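
The phase retrieval step that several of these techniques rely on can be sketched with the classic Gerchberg-Saxton iteration, shown here on a toy one-dimensional signal (the signal size, iteration count, and unit-magnitude object are arbitrary assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
obj = np.exp(1j * rng.uniform(0, 2 * np.pi, n))  # object with unknown phase
obj_mag = np.abs(obj)                  # known magnitude in the object plane
far_mag = np.abs(np.fft.fft(obj))      # measured magnitude in the Fourier plane

x = obj_mag * np.exp(1j * rng.uniform(0, 2 * np.pi, n))  # random initial phase
err_init = np.linalg.norm(np.abs(np.fft.fft(x)) - far_mag)

# Gerchberg-Saxton: alternate between the two planes, enforcing the known
# magnitude in each while keeping the current phase estimate.
for _ in range(200):
    X = np.fft.fft(x)
    X = far_mag * np.exp(1j * np.angle(X))   # impose Fourier-plane magnitude
    x = np.fft.ifft(X)
    x = obj_mag * np.exp(1j * np.angle(x))   # impose object-plane magnitude

err_final = np.linalg.norm(np.abs(np.fft.fft(x)) - far_mag)
print(err_init, err_final)
```

The iteration only ever measures intensities in the two planes, yet the error in the Fourier-plane magnitude is guaranteed not to increase from one pass to the next, which is why intensity-only measurements can recover phase information.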

Computational processing

This is processing of non-optically-coded images to produce new images.

Computational sensors

These are detectors that combine sensing and processing, typically in hardware, like the oversampled binary image sensor.
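
A toy model suggests how an oversampled binary sensor trades spatial oversampling for bit depth. The photon statistics, detector count, and logarithmic inversion below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Oversampled binary sensor: each output pixel is covered by many tiny
# one-bit detectors that fire when they catch at least one photon.
intensity = 1.5    # mean photon count per tiny detector (hypothetical)
k = 4096           # binary detectors contributing to one output pixel
photons = rng.poisson(intensity, k)
bits = (photons >= 1).astype(float)    # 1-bit quantization per detector

# Fraction of detectors that fired estimates P(fire) = 1 - exp(-intensity),
# so inverting that relation recovers the light level from pure bits.
p = bits.mean()
estimate = -np.log(1.0 - p)
print(estimate)
```

Averaging thousands of one-bit measurements and inverting the Poisson firing probability recovers a multi-bit intensity value, which is the sensing-plus-processing combination the section describes.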

Early work in computer vision

Although computational photography is a currently popular buzzword in computer graphics, many of its techniques first appeared in the computer vision literature, either under other names or within papers aimed at 3D shape analysis.

Art history

Computational photography, as an art form, has been practiced through the capture of differently exposed pictures of the same subject matter and combining them together. This was the inspiration for the development of the wearable computer in the 1970s and early 1980s. Computational photography was inspired by the work of Charles Wyckoff, and thus computational photography datasets (e.g. differently exposed pictures of the same subject matter taken in order to make a single composite image) are sometimes referred to as Wyckoff Sets in his honor. Early work in this area (joint estimation of image projection and exposure value) was undertaken by Mann and Candoccia. Charles Wyckoff devoted much of his life to creating special kinds of 3-layer photographic films that captured different exposures of the same subject matter. A picture of a nuclear explosion, taken on Wyckoff's film, appeared on the cover of ''Life'' magazine and showed the dynamic range from the dark outer areas to the inner core.

See also

* Adaptive optics
* Multispectral imaging
* Simultaneous localization and mapping
* Super-resolution microscopy
* Time-of-flight camera



External links

* Nayar, Shree K. (2007). "Computational Cameras". ''Conference on Machine Vision Applications''.
* Raskar, R. and Tumblin, J. ''Computational Photography''. A.K. Peters. In press.
* Special issue on Computational Photography, ''IEEE Computer'', August 2006.
* Camera Culture and Computational Journalism: Capturing and Sharing Visual Experiences, ''IEEE CG&A'' Special Issue, Feb 2011.
* Szeliski, Rick (2010). ''Computer Vision: Algorithms and Applications''. Springer.
* Lukac, Rastislav (ed.) (2010). ''Computational Photography: Methods and Applications''. CRC Press.
* GJB-1: Increasing the dynamic range of a digital camera by using the Wyckoff principle
* Examples of wearable computational photography as an art form
* Siggraph Course in Computational Photography