Hands-on Computing

Hands-on computing is a branch of human-computer interaction research which focuses on computer interfaces that respond to human touch or expression, allowing the machine and the user to interact physically. Hands-on computing can make complicated computer tasks more natural to users by attempting to respond to motions and interactions that are natural to human behavior. Thus hands-on computing is a component of user-centered design, focusing on how users physically respond to virtual environments.


Implementations

* Keyboards
* Stylus pens and tablets
* Touchscreens
* Human signaling


Keyboards

Keyboards and typewriters are some of the earliest hands-on computing devices. These devices are effective because users receive kinesthetic, tactile, auditory, and visual feedback. The QWERTY layout of the keyboard is one of the earliest designs, dating to 1878.Baber, Christopher. ''Beyond the Desktop''. Academic Press. 1997. Newer designs such as the split keyboard increase the comfort of typing for users. Keyboards send input to the computer via keys; however, they do not allow the user direct interaction with the computer through touch or expression.
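As a minimal illustration of how each key press reaches software as a discrete event, the sketch below assumes a browser environment; the echo area is purely illustrative. It listens for keydown events and echoes printable characters back to the user as visual feedback:

    // Minimal sketch: echo typed characters so the user gets immediate
    // visual feedback for each key press (assumes a browser DOM).
    const output = document.createElement("pre");
    document.body.appendChild(output);

    document.addEventListener("keydown", (event: KeyboardEvent) => {
      if (event.key.length === 1) {
        // Printable characters carry a single-character `key` value.
        output.textContent = (output.textContent ?? "") + event.key;
      } else if (event.key === "Backspace") {
        output.textContent = (output.textContent ?? "").slice(0, -1);
      }
    });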


Stylus pens and tablets

Tablets are touch-sensitive surfaces that detect the pressure applied by a stylus pen. Magnetic tablets detect changes in a magnetic field, while resistive tablets work by pressing together two resistive sheets. Tablets allow users to interact with computers through a stylus pen, yet they do not respond directly to a user's touch.
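As a rough sketch of how stylus pressure can reach an application, the example below uses the Pointer Events available in web browsers; the canvas element and the mapping from pressure to stroke width are illustrative assumptions, not part of any particular tablet driver:

    // Minimal sketch: read stylus position and pressure via Pointer Events
    // (assumes a browser and a pressure-capable pen digitizer).
    const canvas = document.querySelector("canvas");
    if (canvas) {
      canvas.addEventListener("pointermove", (event: PointerEvent) => {
        if (event.pointerType === "pen") {
          // `pressure` is normalised by the browser to the range 0..1;
          // a drawing application might map it to stroke width.
          const strokeWidth = 1 + event.pressure * 9;
          console.log(`pen at (${event.clientX}, ${event.clientY}), width ${strokeWidth.toFixed(1)}`);
        }
      });
    }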


Touchscreens

Touchscreens allow users to interact with computers directly by touching the screen with a finger. It is natural for humans to point to objects in order to show a preference or make a selection, and touchscreens let users carry this natural action over to interacting with computers. Problems may arise due to inaccuracy: a user attempts to make a selection, but because of incorrect calibration the computer does not register the touch in the right place.
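One common way to reduce the calibration problem described above is to sample a few known reference points and fit a transform from raw sensor coordinates to screen coordinates. The sketch below is illustrative only (the function name and sample values are assumptions, not taken from any specific toolkit) and fits a simple per-axis scale and offset from two reference touches:

    interface Point { x: number; y: number; }

    // Fit a per-axis scale-and-offset transform from two calibration touches:
    // the user taps two known screen targets and the raw coordinates the
    // sensor reported for each tap are recorded.
    function makeCalibration(rawA: Point, screenA: Point, rawB: Point, screenB: Point) {
      const scaleX = (screenB.x - screenA.x) / (rawB.x - rawA.x);
      const scaleY = (screenB.y - screenA.y) / (rawB.y - rawA.y);
      return (raw: Point): Point => ({
        x: screenA.x + (raw.x - rawA.x) * scaleX,
        y: screenA.y + (raw.y - rawA.y) * scaleY,
      });
    }

    // Example: targets at (100, 100) and (900, 500), reported slightly off by the sensor.
    const toScreen = makeCalibration(
      { x: 112, y: 95 }, { x: 100, y: 100 },
      { x: 905, y: 512 }, { x: 900, y: 500 },
    );
    console.log(toScreen({ x: 500, y: 300 })); // raw touch mapped into screen space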


Human signaling

New developments in hands-on computing have led to the creation of interfaces that can respond to gestures and facial signaling. Often a haptic device such as a glove must be worn to translate the gesture into a recognizable command. The natural actions of pointing, grabbing, and tapping are common ways to interact with the computer interface. Recent studies include using eye tracking to indicate a selection or control a cursor; blinking and gaze direction are used to communicate selections. Computers can also respond to speech input: developments in this technology have made it possible for users to dictate phrases to the computer instead of typing them to display text on an interface. Utilizing human signal inputs allows more people to interact with computers in a natural way.
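As an illustration of how gaze data might drive a cursor and communicate a selection, the sketch below assumes a hypothetical eye tracker that delivers normalised gaze samples; it smooths the noisy samples with an exponential moving average and treats a sustained dwell on one spot as a selection (all names and thresholds are illustrative):

    // Minimal sketch: drive a cursor from gaze samples (hypothetical tracker API).
    interface Gaze { x: number; y: number; }   // normalised 0..1 screen coordinates

    const SMOOTHING = 0.2;      // weight given to each new sample
    const DWELL_MS = 800;       // hold the gaze this long to select
    const DWELL_RADIUS = 0.03;  // how still the gaze must stay to count as dwelling

    let cursor: Gaze = { x: 0.5, y: 0.5 };
    let dwellAnchor: Gaze = cursor;
    let dwellStart = Date.now();

    function onGazeSample(sample: Gaze): void {
      // Exponential moving average steadies the cursor against gaze jitter.
      cursor = {
        x: cursor.x + SMOOTHING * (sample.x - cursor.x),
        y: cursor.y + SMOOTHING * (sample.y - cursor.y),
      };

      // Dwell-based selection: staying near one point long enough acts as a click.
      const drift = Math.hypot(cursor.x - dwellAnchor.x, cursor.y - dwellAnchor.y);
      if (drift > DWELL_RADIUS) {
        dwellAnchor = cursor;
        dwellStart = Date.now();
      } else if (Date.now() - dwellStart > DWELL_MS) {
        console.log(`select at (${cursor.x.toFixed(2)}, ${cursor.y.toFixed(2)})`);
        dwellStart = Date.now();  // reset so one dwell produces one selection
      }
    }

    // A real eye tracker would call this at its sampling rate, e.g. 60 times per second.
    onGazeSample({ x: 0.42, y: 0.61 });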


Current problems

There are still many problems with hands-on computing interfaces, which are being addressed through continuing research and development. The main challenge is creating a simple, user-friendly interface that can be developed inexpensively and mass-produced. Because some interactions between human and machine are ambiguous, the machine's response is not always the result the user wanted: different hand gestures and facial expressions can lead the computer to interpret one command while the user wished to convey another one entirely. Solving this problem is currently one of the main focuses of research and development. Researchers are also working to find the best way to design hands-on computing devices so that consumers can use the products easily. Focusing on user-centered design while creating hands-on computing products helps developers make the best and easiest-to-use products.
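One simple way to cope with the ambiguity described above is to act on a recognised gesture only when the recogniser is clearly confident, and otherwise ask the user to confirm. The sketch below illustrates that idea; the recogniser output format and the thresholds are assumptions, not a specific product's behaviour:

    // Minimal sketch: act on a gesture only when the top guess is confident
    // and clearly ahead of the runner-up; otherwise report ambiguity.
    interface GestureGuess { command: string; confidence: number; }  // confidence in 0..1

    const MIN_CONFIDENCE = 0.75;
    const MIN_MARGIN = 0.2;  // gap required between the top two guesses

    function interpretGesture(guesses: GestureGuess[]): string | null {
      const ranked = [...guesses].sort((a, b) => b.confidence - a.confidence);
      const [best, second] = ranked;
      if (!best || best.confidence < MIN_CONFIDENCE) return null;
      if (second && best.confidence - second.confidence < MIN_MARGIN) return null;
      return best.command;
    }

    // Example: "swipe" and "wave" look similar here, so the system should not guess.
    const command = interpretGesture([
      { command: "swipe", confidence: 0.55 },
      { command: "wave", confidence: 0.45 },
    ]);
    console.log(command ?? "ambiguous gesture: ask the user to confirm");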


Research and development

This new field has much room for contributions in research and product development. Hands-on computing technologies require scientists and engineers to use a different problem-solving strategy, one that considers devices for interaction rather than just input, treats interaction devices as tools, and accounts for how interaction will mediate user performance and for the context in which the devices will be used. For a machine to be used successfully, people need to be able to transfer some of their current skill set to operating it. This can be done directly, by likening the interface to something known and familiar, or by helping the user draw new inferences through feedback. Users must be able to understand how to use and manipulate the interface in order to use it to its full capability. By applying their existing skills, users can operate the machine without learning entirely new concepts and approaches.Waern, Y. "Human Learning of Human-Computer Interaction: An Introduction." Cognitive Ergonomics: Understanding, Learning and Designing Human-Computer Interaction (1990): 69-84.


References


"ThinSight"
Microsoft Research and Development. 19 November 2008. * "Office XP Speaks Out". Microsoft PressPass. 18 Apr. 2001. Microsoft. 5 December 2008. {{reflist Human–computer interaction