SixthSense
SixthSense is a gesture-based wearable computer system developed at MIT Media Lab by Steve Mann in 1994 and 1997 (headworn gestural interface) and 1998 (neckworn version), and further developed by Pranav Mistry (also at MIT Media Lab) in 2009; both developed hardware and software for the headworn and neckworn versions. It comprises a headworn or neck-worn pendant that contains both a data projector and a camera. Headworn versions built at MIT Media Lab in 1997 (by Steve Mann) combined cameras and illumination systems for interactive photographic art, and also included gesture recognition (e.g. finger-tracking using colored tape on the fingers). SixthSense is also a name for extra information supplied by a wearable computer, such as the devices called EyeTap (Mann), Telepointer (Mann), and "WuW" (Wear yoUr World) by Pranav Mistry.

Origin of the name: Sixth Sense technology (a camera combined with a light source) was developed in 1997 as a headworn d ...
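The finger-tracking approach mentioned above (colored tape on the fingertips, observed by the pendant's camera) can be approximated with straightforward color segmentation. The sketch below is a minimal illustration under assumed marker colors and HSV thresholds, not the original SixthSense software.

```python
# Minimal sketch of colored-marker fingertip tracking (illustrative only).
# Assumes a webcam and markers in a saturated color such as red tape;
# the HSV thresholds below are guesses chosen for demonstration.
import cv2
import numpy as np

LOWER_RED = np.array([0, 120, 120])    # assumed HSV lower bound for red tape
UPPER_RED = np.array([10, 255, 255])   # assumed HSV upper bound

def find_markers(frame, min_area=100):
    """Return centroids (x, y) of colored-tape blobs in a BGR frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # ignore small noise blobs
        m = cv2.moments(c)
        centroids.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centroids

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        for (x, y) in find_markers(frame):
            cv2.circle(frame, (x, y), 8, (0, 255, 0), 2)  # mark each detected fingertip
        cv2.imshow("markers", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

The detected centroids can then be fed to a gesture classifier such as the swipe example given under Gesture Recognition below.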


Pranav Mistry
Pranav Mistry (born 14 May 1981) is a computer scientist and inventor. He was the President and CEO of STAR Labs (Samsung Technology & Advanced Research Labs). He is best known for his work on SixthSense, Samsung Galaxy Gear and Project Beyond.

Career: Mistry joined Samsung as Director of Research in 2012 and went on to serve as its Global Vice President. He introduced the Samsung Galaxy Gear smart watch in September 2013. Before joining MIT, he worked as a UX Researcher with Microsoft, and he has also worked with Google, CMU, NASA, UNESCO, and the Japan Science & Technology Agency. He was the Global Senior Vice President of search at Samsung and head of its Think Tank Team, and became President and CEO of STAR Labs in October 2019. He was honored as a 'Young Global Leader 2013' by the World Economic Forum.

Inventions: Mistry is best known for his work on SixthSense.


MIT Media Lab
The MIT Media Lab is a research laboratory at the Massachusetts Institute of Technology, growing out of MIT's Architecture Machine Group in the School of Architecture. Its research is not restricted to fixed academic disciplines, but draws from technology, media, science, art, and design. The Media Lab's research groups include neurobiology, biologically inspired fabrication, socially engaging robots, emotive computing, bionics, and hyperinstruments. The Media Lab was founded in 1985 by Nicholas Negroponte and former MIT President Jerome Wiesner, and is housed in the Wiesner Building (designed by I. M. Pei), also known as Building E15. The Lab has been written about in the popular press since 1988, when Stewart Brand published ''The Media Lab: Inventing the Future at M.I.T.'', and its work was a regular feature of technology journals in the 1990s. In 2009, it expanded into a second building. The Media Lab came under scrutiny in 2019 due to its acceptance of donations from ...


Steve Mann (inventor)
William Stephen George Mann (born 8 June 1962) is a Canadian engineer, professor, and inventor who works in augmented reality, computational photography (particularly wearable computing), and high-dynamic-range imaging. Mann is sometimes labeled the "Father of Wearable Computing" for early inventions and continuing contributions to the field (Tech Giant "Father of Wearable Tech" Steve Mann "Goes for The Ride" to YYD ROBO!, YYD Corporate News, 2017-07-31; "Father of Wearable Computing, Steve Mann, to Keynote FITC Wearables", Nikolas Badminton, Medium, Toronto, 2014-11-11). He cofounded InteraXon, makers of the Muse brain-sensing headband, and is also a founding member of the IEEE Council on Extended Intelligence (CXI). Mann is currently CTO and cofounder at Blueberry X Technologies and Chairman of MannLab. He was born in Canada and currently lives in Toronto, Canada, with his wife and two children.

Early life and education: Mann holds a PhD in Media Arts and Sciences (1997) from t ...


Gesture Recognition
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. It is a subdiscipline of computer vision. Gestures can originate from any bodily motion or state, but commonly originate from the face or hand. Focuses in the field include emotion recognition from the face and hand-gesture recognition, since both are forms of expression. Users can make simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language; however, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques (a camera-based swipe example is sketched below). Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a better bridge between machines and humans than older text user interfaces or even GUIs (graphical user interf ...
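As a concrete illustration of the camera-based approach, the following sketch classifies a left or right swipe from a sequence of tracked hand centroids (for example, the marker positions produced by a vision pipeline such as the one above). It is a toy rule-based example under assumed displacement thresholds, not a production gesture-recognition algorithm.

```python
# Toy rule-based gesture classifier: decides whether a sequence of tracked
# hand positions is a horizontal swipe. Thresholds are illustrative assumptions.
from typing import List, Tuple

def classify_swipe(track: List[Tuple[float, float]],
                   min_dx: float = 100.0,
                   max_dy: float = 40.0) -> str:
    """Return 'swipe_right', 'swipe_left', or 'none' for a position track."""
    if len(track) < 2:
        return "none"
    dx = track[-1][0] - track[0][0]                            # net horizontal motion
    dy = max(y for _, y in track) - min(y for _, y in track)   # vertical spread
    if abs(dx) >= min_dx and dy <= max_dy:
        return "swipe_right" if dx > 0 else "swipe_left"
    return "none"

# Example: a mostly horizontal, rightward track is recognized as a swipe.
print(classify_swipe([(10, 200), (60, 205), (120, 198), (180, 202)]))  # swipe_right
```

Real systems replace such hand-written rules with trained models (hidden Markov models, neural networks) over richer features, but the pipeline shape, tracking followed by classification, is the same.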


Wearable Computer
A wearable computer, also known as a body-borne computer, is a computing device worn on the body. The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches. Wearables may be for general use, in which case they are just a particularly small example of mobile computing. Alternatively, they may be for specialized purposes such as fitness trackers. They may incorporate special sensors such as accelerometers, heart rate monitors, or on the more advanced side, electrocardiogram (ECG) and blood oxygen saturation (SpO2) monitors. The definition of wearable computers also covers novel user interfaces such as Google Glass, an optical head-mounted display controlled by gestures. It may be that specialized wearables will evolve into general all-in-one devices, as happened with the convergence of PDAs and mobile phones into smartphones. Wearables are typically worn on the wrist (e.g. fitness trackers), hung from the nec ...


Sense
A sense is a biological system used by an organism for sensation, the process of gathering information about the world through the detection of stimuli. (For example, in the human body, the brain, which is part of the central nervous system, receives signals from the senses, which continuously receive information from the environment, interprets these signals, and causes the body to respond, either chemically or physically.) Although traditionally five human senses were identified as such (namely sight, smell, touch, taste, and hearing), it is now recognized that there are many more. Senses used by non-human organisms are even greater in variety and number. During sensation, sense organs collect various stimuli (such as a sound or smell) for transduction, meaning transformation into a form that can be understood by the brain. Sensation and perception are fundamental to nearly every ...


Pattie Maes
Pattie Maes (born 1961) is a professor in MIT's program in Media Arts and Sciences. She founded and directed the MIT Media Lab's Fluid Interfaces Group. Previously, she founded and ran the Software Agents group. She served for several years as both the head and associate head of the Media Lab's academic program. Prior to joining the Media Lab, Maes was a visiting professor and a research scientist at the MIT Artificial Intelligence Lab. She holds bachelor's and PhD degrees in computer science from the Vrije Universiteit Brussel in Belgium. Maes' areas of expertise are human–computer interaction, intelligent interfaces and ubiquitous computing. Maes is the editor of three books, and is an editorial board member and reviewer for numerous professional journals and conferences. She has received several awards: Newsweek magazine named her one of the "100 people for the new century"; TIME Digital selected her as a member of the ''Cyber-Elite'' (the top 50 technological pioneers of ...


Augmented Reality
Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment). This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one. Augmented reality is largely synonymous with mixed reality. There is also overlap ...
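The "accurate 3D registration" mentioned above amounts to projecting virtual 3D points into the camera image using the estimated camera pose and intrinsics, so that overlaid content stays aligned with the real scene. The sketch below shows the standard pinhole projection with made-up intrinsic values; it is an illustration of the registration step, not any particular AR framework's API.

```python
# Pinhole-camera projection of a virtual 3D point into image coordinates.
# Intrinsics (focal lengths, principal point) and the camera pose are illustrative values.
import numpy as np

K = np.array([[800.0,   0.0, 320.0],   # assumed focal length and principal point
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_world, R=np.eye(3), t=np.zeros(3)):
    """Map a 3D world point to pixel (u, v) given camera rotation R and translation t."""
    p_cam = R @ np.asarray(point_world, dtype=float) + t   # world frame -> camera frame
    u, v, w = K @ p_cam                                    # camera frame -> image plane
    return u / w, v / w

# A virtual object 2 m in front of the camera and 0.1 m to the right
# lands slightly right of the image centre.
print(project([0.1, 0.0, 2.0]))  # approx (360.0, 240.0)
```

AR systems continuously re-estimate R and t (e.g. via visual-inertial tracking) so the projected overlay remains registered as the user moves.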


Virtual Reality
Virtual reality (VR) is a simulated experience that employs pose tracking and 3D near-eye displays to give the user an immersive feel of a virtual world. Applications of virtual reality include entertainment (particularly video games), education (such as medical or military training) and business (such as virtual meetings). Other distinct types of VR-style technology include augmented reality and mixed reality, sometimes referred to as extended reality or XR, although definitions are currently changing due to the nascence of the industry. Currently, standard virtual reality systems use either virtual reality headsets or multi-projected environments to generate realistic images, sounds and other sensations that simulate a user's physical presence in a virtual environment. A person using virtual reality equipment is able to look around the artificial world, move around in it, and interact with virtual features or items. The effect is commonly created by VR headsets consisting ...


Graphical User Interfaces
The GUI ( "UI" by itself is still usually pronounced . or ), graphical user interface, is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicator such as primary notation, instead of text-based UIs, typed command labels or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of CLIs (command-line interfaces), which require commands to be typed on a computer keyboard. The actions in a GUI are usually performed through direct manipulation of the graphical elements. Beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, smartphones and smaller household, office and industrial controls. The term ''GUI'' tends not to be applied to other lower-display resolution types of interfaces, such as video games (where HUD (''head-up display'') is preferred), or not including flat screens like volumetric displays because t ...


Surface Computing
Surface computing is the use of a specialized computer GUI in which traditional GUI elements are replaced by intuitive, everyday objects. Instead of a keyboard and mouse, the user interacts with a surface. Typically the surface is a touch-sensitive screen, though other surface types like non-flat three-dimensional objects have been implemented as well. It has been said that this more closely replicates the familiar hands-on experience of everyday object manipulation. Early work in this area was done at the University of Toronto, Alias Research, and MIT. Surface work has included customized solutions from vendors such as LM3LABS or GestureTek, and Applied Minds for Northrop Grumman. Major computer vendor platforms are in various stages of release: the iTable by PQLabs, Linux MPX, the Ideum MT-50, the interactive bar by spinTOUCH, and Microsoft PixelSense (formerly known as Microsoft Surface).

Surface types: Surface computing employs two broad categories of surface types, flat ...