Range Image
Range imaging is the name for a collection of techniques used to produce a 2D image showing the distance to points in a scene from a specific point, normally associated with some type of sensor device. The resulting range image has pixel values that correspond to the distance. If the sensor used to produce the range image is properly calibrated, the pixel values can be given directly in physical units, such as meters.

Types of range cameras

The sensor device used for producing the range image is sometimes referred to as a ''range camera'' or ''depth camera''. Range cameras can operate according to a number of different techniques, some of which are presented here.

Stereo triangulation

Stereo triangulation is an application of stereophotogrammetry in which the depth data of the pixels are determined from data acquired using a stereo or multi-camera setup. This way it is possible to determine the depth to points in the scene, for example, from ...
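As a hedged sketch of the stereo-triangulation approach just described, the following converts a disparity map into metric depth using the standard pinhole relation depth = focal_length × baseline / disparity. The focal length, baseline, and image file names are hypothetical placeholders, and OpenCV's StereoBM block matcher is only one of several ways to obtain disparities.

    import numpy as np
    import cv2

    # Hypothetical calibration values for a rectified stereo pair.
    FOCAL_LENGTH_PX = 700.0   # focal length in pixels (assumed)
    BASELINE_M = 0.12         # distance between the two cameras in meters (assumed)

    # Load a rectified stereo pair (placeholder file names).
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # Block matching returns disparity in fixed-point units (scaled by 16).
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0

    # Triangulation: depth is inversely proportional to disparity.
    # Pixels with no valid match (disparity <= 0) are marked invalid (NaN).
    depth = np.where(disparity > 0,
                     FOCAL_LENGTH_PX * BASELINE_M / disparity,
                     np.nan)
    print("median depth (m):", np.nanmedian(depth))

The resulting array is exactly a range image in the sense defined above: a 2D grid whose values are calibrated distances in meters.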



Depth Map
In 3D computer graphics and computer vision, a depth map is an image or image channel that contains information relating to the distance of the surfaces of scene objects from a viewpoint. The term is related (and may be analogous) to ''depth buffer'', ''Z-buffer'', ''Z-buffering'', and ''Z-depth'' (''Computer Arts / 3D World Glossary'', ftp://ftp.futurenet.co.uk/pub/arts/Glossary.pdf, retrieved 26 January 2011). The "Z" in these latter terms relates to a convention that the central axis of view of a camera is in the direction of the camera's Z axis, and not to the absolute Z axis of a scene.

Examples

File:Cubic Structure.jpg, Cubic Structure
File:Cubic Frame Stucture and Floor Depth Map.jpg, Depth Map: Nearer is darker
File:Cubic Structure and Floor Depth Map with Front and Back Delimitation.jpg, Depth Map: Nearer the Focal Plane is darker

Two different depth maps can be seen here, together with the original model from which they are derived. The first depth map shows lu ...
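To make the "nearer is darker" convention in the example images concrete, here is a minimal sketch that renders a depth array as an 8-bit grayscale image. The depth values and output file name are hypothetical.

    import numpy as np
    import cv2

    # Hypothetical depth map in meters (e.g., produced by a range camera).
    depth = np.random.uniform(0.5, 5.0, size=(480, 640)).astype(np.float32)

    # Normalize so the nearest surface maps to black (0) and the
    # farthest to white (255): "nearer is darker".
    near, far = depth.min(), depth.max()
    gray = ((depth - near) / (far - near) * 255.0).astype(np.uint8)

    cv2.imwrite("depth_map.png", gray)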



Samsung
The Samsung Group (or simply Samsung) ( ko, 삼성 ) is a South Korean multinational manufacturing conglomerate headquartered in Samsung Town, Seoul, South Korea. It comprises numerous affiliated businesses, most of them united under the ''Samsung'' brand, and is the largest South Korean ''chaebol'' (business conglomerate). Samsung has the eighth-highest global brand value. Samsung was founded by Lee Byung-chul in 1938 as a trading company. Over the next three decades, the group diversified into areas including food processing, textiles, insurance, securities, and retail. Samsung entered the electronics industry in the late 1960s and the construction and shipbuilding industries in the mid-1970s; these areas would drive its subsequent growth. Following Lee's death in 1987, Samsung was separated into five business groups – Samsung Group, Shinsegae Group, CJ Group, Hansol Group, and JoongAng Group. Notable Samsung industrial affiliates include Samsung Electronics (the wor ...


Optical Flow
Optical flow or optic flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and a scene. Optical flow can also be defined as the distribution of apparent velocities of movement of brightness patterns in an image. The concept of optical flow was introduced by the American psychologist James J. Gibson in the 1940s to describe the visual stimulus provided to animals moving through the world. Gibson stressed the importance of optic flow for affordance perception, the ability to discern possibilities for action within the environment. Followers of Gibson and his ecological approach to psychology have further demonstrated the role of the optical flow stimulus for the perception of movement by the observer in the world; the perception of the shape, distance, and movement of objects in the world; and the control of locomotion. The term optical flow is also used by roboticists, encompassing related techniq ...
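As an illustration only (Gibson's concept predates digital computation), the sketch below estimates the per-pixel distribution of apparent velocities between two video frames using OpenCV's Farnebäck dense optical flow; the frame file names are placeholders.

    import cv2

    # Two consecutive grayscale frames (placeholder file names).
    prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    # Dense optical flow: one (dx, dy) displacement vector per pixel,
    # estimated from the motion of brightness patterns between frames.
    flow = cv2.calcOpticalFlowFarneback(
        prev, curr, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Speed (pixels/frame) and direction of apparent motion at each pixel.
    magnitude, angle = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    print("mean apparent speed (px/frame):", magnitude.mean())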



Intensified Charge-coupled Device
A charge-coupled device (CCD) is an integrated circuit containing an array of linked, or coupled, capacitors. Under the control of an external circuit, each capacitor can transfer its electric charge to a neighboring capacitor. CCD sensors are a major technology used in digital imaging. In a CCD image sensor, pixels are represented by p-doped metal–oxide–semiconductor (MOS) capacitors. These MOS capacitors, the basic building blocks of a CCD, are biased above the threshold for inversion when image acquisition begins, allowing the conversion of incoming photons into electron charges at the semiconductor-oxide interface; the CCD is then used to read out these charges. Although CCDs are not the only technology to allow for light detection, CCD image sensors are widely used in professional, medical, and scientific applications where high-quality image data are required. In applications with less exacting quality demands, such as consumer and professional digital cameras, active ...
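To visualize the charge coupling described above, here is a toy software analogy (not device physics): each pixel's accumulated charge is shifted well-by-well toward a readout node, the way a CCD row transfers its charge packets to the output amplifier. The charge values are made up.

    import numpy as np

    # Toy model of a single CCD row: each element is the charge
    # accumulated in one MOS capacitor during exposure (arbitrary units).
    row = np.array([5.0, 0.0, 12.0, 7.0, 3.0])

    # Readout: repeatedly sense the packet at the output end, then shift
    # every remaining packet one coupled capacitor toward the output.
    readout = []
    for _ in range(len(row)):
        readout.append(row[-1])   # packet reaching the output amplifier
        row[1:] = row[:-1]        # each packet moves one well along the row
        row[0] = 0.0              # the entry well is left empty

    print("readout sequence:", readout)

Note the serial order: the pixel nearest the output is read first, which is why the external circuit must re-assemble the readout stream into the image.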



Structure From Motion
Structure from motion (SfM) is a photogrammetric range imaging technique for estimating three-dimensional structures from two-dimensional image sequences that may be coupled with local motion signals. It is studied in the fields of computer vision and visual perception. In biological vision, SfM refers to the phenomenon by which humans (and other living creatures) can recover 3D structure from the projected 2D (retinal) motion field of a moving object or scene.

Principle

Humans perceive a great deal of information about the three-dimensional structure of their environment by moving around it. When the observer moves, objects around them move by different amounts depending on their distance from the observer. This is known as motion parallax, and this depth information can be used to generate an accurate 3D representation of the world around them. Finding structure from motion presents a similar problem to finding structure from stereo vision. In both instances, the correspo ...
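A minimal two-view sketch of the geometric core of SfM, assuming a calibrated camera with a known intrinsic matrix K and two placeholder images: match features, estimate the essential matrix, recover the relative pose, and triangulate 3D points. A full SfM pipeline adds many views and bundle adjustment; the intrinsics and file names here are assumptions.

    import numpy as np
    import cv2

    # Assumed camera intrinsics and two frames from a moving camera.
    K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
    img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)

    # 1. Find and match local features (ORB + brute-force Hamming matching).
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # 2. Epipolar geometry: essential matrix, then the relative rotation R
    #    and translation t of the second view (t is recovered up to scale).
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # 3. Triangulate the matched correspondences into 3D structure.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    pts3d = (pts4d[:3] / pts4d[3]).T
    print("reconstructed", len(pts3d), "points")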


Laser Dynamic Range Imager
The Laser Dynamic Range Imager (LDRI) is a LIDAR range imaging device developed by Sandia National Laboratories for the US Space Shuttle program. The sensor was developed as part of NASA's "Return to Flight" effort following the Space Shuttle Columbia disaster to provide 2-D and 3-D images of the thermal protection system on the Space Shuttle Orbiter. The LDRI generates 3-D images from 2-D video: modulated laser illumination is demodulated by the receive optics, and the resulting video sequences can be processed to produce 3-D images. The modulation produces a flickering effect from frame to frame in the video imagery. As part of the Orbiter Boom Sensor System, the LDRI is mounted at the end of the boom on a pan-tilt unit (PTU) along with an intensified video camera (ITVC). During 2-D imaging of the reinforced carbon-carbon panels on the leading edge of the shuttle's wings, the LDRI is capable of seeing damage as small as a 0.020-inch crack. Dur ...



Time-of-flight Camera
A time-of-flight camera (ToF camera), also known as a time-of-flight sensor (ToF sensor), is a range imaging camera system for measuring distances between the camera and the subject for each point of the image based on time-of-flight, the round-trip time of an artificial light signal, as provided by a laser or an LED. Laser-based time-of-flight cameras are part of a broader class of scannerless LIDAR, in which the entire scene is captured with each laser pulse, as opposed to point-by-point with a laser beam such as in scanning LIDAR systems. Time-of-flight camera products for civil applications began to emerge around 2000, as semiconductor processes allowed the production of components fast enough for such devices. The systems cover ranges of a few centimeters up to several kilometers.

Types of devices

Several different technologies for time-of-flight cameras have been developed.

RF-modulated light sources with phase detectors

Photonic Mixer Devices (PMD), the Swiss Ranger, an ...
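For the RF-modulated (phase-detection) class just mentioned, one common scheme samples the returned signal at four phase offsets and recovers distance from the measured phase shift: phase = atan2(Q3 − Q4, Q1 − Q2) and distance = c · phase / (4π · f_mod). The sketch below uses made-up sample values and an assumed 20 MHz modulation frequency; exact bucket conventions vary between devices.

    import math

    C = 299_792_458.0          # speed of light (m/s)
    F_MOD = 20e6               # assumed modulation frequency: 20 MHz

    # Hypothetical correlation samples at 0°, 180°, 90°, 270° phase offsets
    # for one pixel (values are made up for illustration).
    q1, q2, q3, q4 = 0.80, 0.20, 0.65, 0.35

    # Phase shift between the emitted and received signal.
    phase = math.atan2(q3 - q4, q1 - q2)
    if phase < 0:
        phase += 2 * math.pi

    # Distance follows from the phase shift; it is unambiguous only up to
    # c / (2 * F_MOD), the wrap-around range of the modulation.
    distance = C * phase / (4 * math.pi * F_MOD)
    print(f"distance: {distance:.3f} m "
          f"(ambiguity interval {C / (2 * F_MOD):.1f} m)")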



Light-field Camera
A light field camera, also known as a plenoptic camera, is a camera that captures information about the ''light field'' emanating from a scene; that is, the intensity of light in a scene, and also the precise direction that the light rays are traveling in space. This contrasts with conventional cameras, which record only light intensity at various wavelengths. One type uses an array of micro-lenses placed in front of an otherwise conventional image sensor to sense intensity, color, and directional information. Multi-camera arrays are another type. Holograms are a type of film-based light field image.

History

Early research

The first light field camera was proposed by Gabriel Lippmann in 1908. He called his concept "integral photography". Lippmann's experimental results included crude integral photographs made by using a plastic sheet embossed with a regular array of microlenses, or by partially embedding small glass beads, closely packed in a random pattern, into the sur ...
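The directional information a plenoptic camera records is what enables computational refocusing. As a toy illustration (not any specific camera's pipeline), the sketch below refocuses a 4D light field L[u, v, y, x] by shifting each sub-aperture view in proportion to its angular position (u, v) and averaging; the light field array here is synthetic.

    import numpy as np

    # Synthetic 4D light field: (u, v) angular samples x (y, x) spatial pixels.
    U = V = 5
    H = W = 64
    rng = np.random.default_rng(0)
    lightfield = rng.random((U, V, H, W))

    def refocus(lf, alpha):
        """Shift-and-add refocusing: shift each sub-aperture image by an
        amount proportional to its angular offset, then average."""
        U, V, H, W = lf.shape
        out = np.zeros((H, W))
        for u in range(U):
            for v in range(V):
                dy = int(round(alpha * (u - U // 2)))
                dx = int(round(alpha * (v - V // 2)))
                out += np.roll(lf[u, v], (dy, dx), axis=(0, 1))
        return out / (U * V)

    # Images focused at two different (hypothetical) depths.
    near_focus = refocus(lightfield, alpha=1.0)
    far_focus = refocus(lightfield, alpha=-1.0)

A conventional camera discards the (u, v) dimensions at exposure time, which is why this kind of after-the-fact refocusing is impossible with an ordinary photograph.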



Lidar
Lidar (also LIDAR or LiDAR; sometimes LADAR) is a method for determining ranges (variable distance) by targeting an object or a surface with a laser and measuring the time for the reflected light to return to the receiver. It can also be used to make digital 3-D representations of areas on the Earth's surface and of the ocean bottom in the intertidal and near-coastal zone by varying the wavelength of light. It has terrestrial, airborne, and mobile applications. ''Lidar'' is an acronym of "light detection and ranging" or "laser imaging, detection, and ranging". It is sometimes called 3-D laser scanning, a special combination of 3-D scanning and laser scanning. Lidar is commonly used to make high-resolution maps, with applications in surveying, geodesy, geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, atmospheric physics, laser guidance, airborne laser swath mapping (ALSM), and laser altimetry. It is also used in control and navigation for som ...
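The core ranging arithmetic described here is a one-liner: distance is half the round-trip travel time multiplied by the speed of light. A minimal sketch with a made-up echo time:

    C = 299_792_458.0  # speed of light in vacuum (m/s)

    def lidar_range(round_trip_seconds: float) -> float:
        """Range from a timed laser echo: the light travels out and back,
        so the one-way distance is half the round trip."""
        return C * round_trip_seconds / 2.0

    # Example: an echo arriving 667 nanoseconds after the pulse
    # corresponds to a target roughly 100 m away.
    print(f"{lidar_range(667e-9):.1f} m")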



Laser Rangefinder
A laser rangefinder, also known as a laser telemeter, is a rangefinder that uses a laser beam to determine the distance to an object. The most common form of laser rangefinder operates on the time-of-flight principle by sending a laser pulse in a narrow beam towards the object and measuring the time taken by the pulse to be reflected off the target and returned to the sender. Due to the high speed of light, this technique is not appropriate for high-precision sub-millimeter measurements, where triangulation and other techniques are often used.

Pulse

The pulse may be coded to reduce the chance that the rangefinder can be jammed. It is possible to use Doppler effect techniques to judge whether the object is moving towards or away from the rangefinder, and if so, how fast.

Precision

The precision of the instrument is determined by the rise or fall time of the laser pulse and the speed of the receiver. One that uses very sharp laser pulses and has a very fast detector can rang ...
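To make the Doppler remark concrete: radial velocity follows from the frequency shift of the returned light, v = c · Δf / (2f), where the factor 2 accounts for the round trip. A small sketch with hypothetical numbers (optical Doppler shifts are tiny relative to the carrier, which is why coherent detection is needed in practice):

    C = 299_792_458.0  # speed of light (m/s)

    def radial_velocity(f_emitted_hz: float, f_received_hz: float) -> float:
        """Target speed along the beam from the Doppler shift of the echo.
        Positive result: target approaching; negative: receding."""
        return C * (f_received_hz - f_emitted_hz) / (2.0 * f_emitted_hz)

    # Hypothetical 1550 nm laser (~193.4 THz) and a measured 25 MHz up-shift.
    f0 = C / 1550e-9
    print(f"{radial_velocity(f0, f0 + 25e6):.1f} m/s")  # ~19.4 m/s, approaching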



Kinect
Kinect is a line of motion sensing input devices produced by Microsoft and first released in 2010. The devices generally contain RGB cameras, and infrared projectors and detectors that map depth through either structured light or time of flight calculations, which can in turn be used to perform real-time gesture recognition and body skeletal detection, among other capabilities. They also contain microphones that can be used for speech recognition and voice control. Kinect was originally developed as a motion controller peripheral for Xbox video game consoles, distinguished from competitors (such as Nintendo's Wii Remote and Sony's PlayStation Move) by not requiring physical controllers. The first-generation Kinect was based on technology from Israeli company PrimeSense, and unveiled at E3 2009 as a peripheral for Xbox 360 codenamed "Project Natal". It was first released on November 4, 2010, and would go on to sell eight million units in its first 60 days of availability. Th ...

