Error-tolerant Design
An error-tolerant design (also: human-error-tolerant design) is one that does not unduly penalize user or human errors. It is the human equivalent of fault-tolerant design, which allows equipment to continue functioning in the presence of hardware faults, such as a "limp-in" mode for an automobile electronics unit that is employed if something like the oxygen sensor fails.

Use of behavior-shaping constraints to prevent errors
The use of forcing functions or behavior-shaping constraints is one technique in error-tolerant design. An example is the interlock or lockout of reverse in the transmission of a moving car. This prevents errors outright, and prevention of errors is the most effective technique in error-tolerant design. The practice is known as poka-yoke in Japan, where it was introduced by Shigeo Shingo as part of the Toyota Production System.

Mitigation of the effects of errors
The next most effective technique in error-tolerant design is the mitigation or limitation of the e ...
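As a rough illustration of a forcing function, the reverse-lockout interlock above can be sketched as a guard that simply refuses the erroneous action. The class, gear names, and speed threshold below are assumptions made up for illustration, not taken from any real transmission controller.

class Transmission:
    """Toy behavior-shaping constraint: reverse cannot be engaged while the
    vehicle is moving forward faster than a small threshold, so the error is
    prevented rather than handled after the fact."""

    REVERSE_LOCKOUT_KMH = 5  # illustrative threshold, not a real specification

    def __init__(self) -> None:
        self.gear = "neutral"

    def select_gear(self, requested: str, speed_kmh: float) -> bool:
        """Return True if the gear change was accepted, False if the interlock blocked it."""
        if requested == "reverse" and speed_kmh > self.REVERSE_LOCKOUT_KMH:
            return False          # forcing function: the mistake is made impossible
        self.gear = requested
        return True


t = Transmission()
assert t.select_gear("reverse", speed_kmh=40.0) is False  # error prevented
assert t.select_gear("reverse", speed_kmh=0.0) is True    # legitimate use still allowed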


Morgan Kaufmann Publishers
Morgan Kaufmann Publishers is a publisher based in Burlington, Massachusetts (in San Francisco, California, until 2008) specializing in computer science and engineering content. Since 1984, Morgan Kaufmann has published content on information technology, computer architecture, data management, computer networking, computer systems, human-computer interaction, computer graphics, multimedia information and systems, artificial intelligence, computer security, and software engineering. Morgan Kaufmann's audience includes the research and development communities, information technology (IS/IT) managers, and students in professional degree programs. The company was founded in 1984 by publishers Michael B. Morgan and William Kaufmann and computer scientist Nils Nilsson. It was held privately until 1998, when it was acquired by Harcourt General and became an imprint of Academic Press, a subsidiary of Harcourt. Harcourt was acquired by Reed Elsevier in 2001; Morgan Kaufmann is now an imprin ...


Mac OS
Two major families of Mac operating systems were developed by Apple Inc. In 1984, Apple debuted the operating system that is now known as the "Classic" Mac OS with its release of the original Macintosh System Software. The system, rebranded "Mac OS" in 1997, was preinstalled on every Macintosh until 2002 and offered on Macintosh clones for a short time in the 1990s. Noted for its ease of use, it was also criticized for its lack of modern technologies compared to its competitors. The current Mac operating system is macOS, named "Mac OS X" until 2012 and then "OS X" until 2016. Developed between 1997 and 2001 after Apple's purchase of NeXT, Mac OS X brought an entirely new architecture based on NeXTSTEP, a Unix system, which eliminated many of the technical challenges that the classic Mac OS faced. The current macOS is preinstalled with every Mac and receives a major update annually. It is the basis of Apple's current system software for its other devices – iOS, ...


Fault-tolerant Computer Systems
Fault tolerance is the property that enables a system to continue operating properly in the event of the failure of, or one or more faults within, some of its components. If its operating quality decreases at all, the decrease is proportional to the severity of the failure, whereas in a naively designed system even a small failure can cause total breakdown. Fault tolerance is particularly sought after in high-availability, mission-critical, or even life-critical systems. The ability to maintain functionality when portions of a system break down is referred to as graceful degradation. A fault-tolerant design enables a system to continue its intended operation, possibly at a reduced level, rather than failing completely when some part of the system fails. The term is most commonly used to describe computer systems designed to remain more or less fully operational, perhaps with a reduction in throughput or an increase in response time, in the event of some partial f ...
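The graceful-degradation idea can be sketched as a simple fallback path. The names below (a hypothetical personalised recommendation call falling back to a static list) are assumptions made up for illustration, not part of any particular system.

def recommendations(user_id, personalised_service, static_top_sellers):
    """Graceful degradation sketch: try the full-quality path first and, on
    failure, return a cheaper, lower-quality result instead of failing outright.
    Both callables are hypothetical stand-ins for real components."""
    try:
        return personalised_service(user_id)   # intended operation
    except Exception:
        return static_top_sellers()            # reduced, but still a working service


def broken_service(user_id):
    raise RuntimeError("backend unavailable")  # simulate a component failure


# The primary component is down, yet the system keeps operating in a degraded mode.
assert recommendations(42, broken_service, lambda: ["top-1", "top-2"]) == ["top-1", "top-2"]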


Error
An error (from the Latin ''error'', meaning "wandering") is an action which is inaccurate or incorrect. In some usages, an error is synonymous with a mistake. The etymology derives from the Latin verb ''errare'', meaning "to stray". In statistics, "error" refers to the difference between the value which has been computed and the correct value. An error could result in failure or in a deviation from the intended performance or behavior.

Human behavior
One reference differentiates between "error" and "mistake" as follows: in human behavior, the norms or expectations for behavior or its consequences can be derived from the intention of the actor, from the expectations of other individuals or a social grouping, or from social norms. (See deviance.) Gaffes and faux pas can be labels for certain instances of this kind of error. More serious departures from social norms carry labels such as misbehavior and labels from the legal system, such as misdemeanor and crime. Departures ...


The Design Of Everyday Things
''The Design of Everyday Things'' is a best-selling book by cognitive scientist and usability engineer Donald Norman about how design serves as the communication between object and user, and how to optimize that conduit of communication in order to make the experience of using the object pleasurable. One of the main premises of the book is that although people are often keen to blame themselves when objects appear to malfunction, the fault lies not with the user but with the lack of intuitive guidance that should be present in the design. The book was originally published in 1988 with the title ''The Psychology of Everyday Things''. Norman said his academic peers liked that title, but he believed the new title better conveyed the content of the book and better attracted interested readers. It is often referred to by the initialisms ''POET'' and ''DOET''. Norman uses case studies to describe the psychology behind what he deems good and bad design, and proposes design principles. Th ...


Donald A
Donald is a masculine given name derived from the Gaelic name ''Dòmhnall''. This comes from the Proto-Celtic *''Dumno-ualos'' ("world-ruler" or "world-wielder"). The final -''d'' in ''Donald'' is partly derived from a misinterpretation of the Gaelic pronunciation by English speakers, and partly associated with the spelling of similar-sounding Germanic names, such as ''Ronald''. A short form of ''Donald'' is ''Don''. Pet forms of ''Donald'' include ''Donnie'' and ''Donny''. The feminine given name ''Donella'' is derived from ''Donald''. ''Donald'' has cognates in other Celtic languages: Modern Irish ''Dónal'' (anglicised as ''Donal'' and ''Donall''); Scottish Gaelic ''Dòmhnall'', ''Domhnull'' and ''Dòmhnull''; Welsh ''Dyfnwal'' and Cumbric ''Dumnagual''. Although the feminine given name ''Donna'' is sometimes used as a feminine form of ''Donald'', the names are not etymologically related.

Variations
Kings and noblemen
Domnall or Domhnall is the name of many ancie ...


Human Reliability
Human reliability (also known as human performance or HU) is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields including manufacturing, medicine and nuclear power. Human performance can be affected by many factors such as age, state of mind, physical health, attitude, emotions, propensity for certain common mistakes, errors and cognitive biases. Human reliability is very important due to the contributions of humans to the resilience of systems and to the possible adverse consequences of human errors or oversights, especially when the human is a crucial part of large socio-technical systems, as is common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.

Common Traps of Human Nature
People tend to overestimate their ability to maintain control when they are doing work. The common characteristics of human nat ...


Human Factors
Human factors and ergonomics (commonly referred to as human factors) is the application of psychological and physiological principles to the engineering and design of products, processes, and systems. The four primary goals of human factors are to reduce human error, increase productivity, and enhance safety, system availability, and comfort, with a specific focus on the interaction between the human and the engineered system. The field is a combination of numerous disciplines, such as psychology, sociology, engineering, biomechanics, industrial design, physiology, anthropometry, interaction design, visual design, user experience, and user interface design. Human factors research employs methods and approaches from these and other knowledge disciplines to study human behavior and generate data relevant to the four primary goals above. In studying and sharing learning on the design of equipment, devices, and processes that fit the human body and its cognitive abilities, the ...


Metaphone
Metaphone is a phonetic algorithm, published by Lawrence Philips in 1990, for indexing words by their English pronunciation. It fundamentally improves on the Soundex algorithm by using information about variations and inconsistencies in English spelling and pronunciation to produce a more accurate encoding, which does a better job of matching words and names that sound similar. As with Soundex, similar-sounding words should share the same key. Metaphone is available as a built-in operator in a number of systems. Philips later produced a new version of the algorithm, which he named Double Metaphone. Unlike the original algorithm, whose application is limited to English, this version takes into account spelling peculiarities of a number of other languages. In 2009 Philips released a third version, called Metaphone 3, which achieves an accuracy of approximately 99% for English words, non-English words familiar to Americans, and first names and family names commonly found in ...
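A heavily simplified illustration of the idea (not Philips's actual rule set, which is far larger and order-sensitive) might rewrite a few spelling patterns into sound codes and then strip vowels, so that alternative spellings collapse to one key:

import re

def toy_phonetic_key(word: str) -> str:
    """Deliberately simplified stand-in for Metaphone: a handful of
    spelling-to-sound rewrites, doubled letters collapsed, then vowels
    dropped after the first letter. Purely illustrative."""
    w = word.lower()
    rules = [("ph", "f"), ("gh", "g"), ("ck", "k"), ("sch", "sk"),
             ("th", "0"), ("wr", "r"), ("kn", "n"), ("x", "ks")]
    for src, dst in rules:
        w = w.replace(src, dst)
    w = re.sub(r"(.)\1+", r"\1", w)                      # collapse doubled letters
    return (w[0] + re.sub(r"[aeiou]", "", w[1:])).upper()  # keep first letter, drop later vowels


# Similar-sounding spellings map to the same key:
assert toy_phonetic_key("philip") == toy_phonetic_key("fillip") == "FLP"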


Soundex
Soundex is a phonetic algorithm for indexing names by sound, as pronounced in English. The goal is for homophones to be encoded to the same representation so that they can be matched despite minor differences in spelling. The algorithm mainly encodes consonants; a vowel will not be encoded unless it is the first letter. Soundex is the most widely known of all phonetic algorithms (in part because it is a standard feature of popular database software such as IBM Db2, PostgreSQL, MySQL, SQLite, Ingres, MS SQL Server, Oracle, and SAP ASE). Improvements to Soundex are the basis for many modern phonetic algorithms.

History
Soundex was developed by Robert C. Russell and Margaret King Odell and patented in 1918 and 1922. A variation, American Soundex, was used in the 1930s for a retrospective analysis of the US censuses from 1890 through 1920. The Soundex code came to prominence in the 1960s when it was the subject of several articles in the ''Communications'' and ''Journal of the ...
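A minimal sketch of the classic encoding (keep the first letter, map the remaining consonants to six digit classes, skip vowels, collapse repeated classes, pad to four characters) might look like the following; it assumes a non-empty alphabetic input and only partially reproduces the census variant's edge cases.

def soundex(name: str) -> str:
    """Minimal Soundex sketch: first letter kept, remaining consonants mapped
    to the six classic digit classes, repeated classes collapsed, vowels
    dropped, result padded or truncated to four characters."""
    groups = ["bfpv", "cgjkqsxz", "dt", "l", "mn", "r"]
    code = {ch: str(i + 1) for i, grp in enumerate(groups) for ch in grp}

    name = name.lower()
    result = name[0].upper()
    prev = code.get(name[0], "")          # class of the last coded letter
    for ch in name[1:]:
        digit = code.get(ch, "")
        if digit and digit != prev:       # new consonant class: emit its digit
            result += digit
        if ch in "aeiouy":
            prev = ""                     # a vowel breaks a run, so the class can repeat
        elif digit:
            prev = digit                  # h and w leave the previous class in place
    return (result + "000")[:4]


# e.g. soundex("Robert") == "R163", soundex("Tymczak") == "T522"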


Edit Distance
In computational linguistics and computer science, edit distance is a string metric, i.e. a way of quantifying how dissimilar two strings (e.g., words) are to one another, measured by counting the minimum number of operations required to transform one string into the other. Edit distances find applications in natural language processing, where automatic spelling correction can determine candidate corrections for a misspelled word by selecting words from a dictionary that have a low distance to the word in question. In bioinformatics, it can be used to quantify the similarity of DNA sequences, which can be viewed as strings of the letters A, C, G and T. Different definitions of an edit distance use different sets of string operations. Levenshtein distance operations are the removal, insertion, or substitution of a character in the string. Being the most common metric, the term ''Levenshtein distance'' is often used interchangeably with ''edit distance''.

Types of edit dis ...
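As an illustration, the Levenshtein variant just described can be computed with the standard dynamic-programming recurrence over two rows; the sketch below assumes plain Python strings and counts insertions, deletions, and substitutions at unit cost.

def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character insertions, deletions, and
    substitutions needed to turn a into b (classic two-row dynamic program)."""
    prev = list(range(len(b) + 1))        # distances from the empty prefix of a to every prefix of b
    for i, ca in enumerate(a, start=1):
        curr = [i]                        # distance from a[:i] to the empty prefix of b
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # delete ca
                            curr[j - 1] + 1,      # insert cb
                            prev[j - 1] + cost))  # substitute (or match)
        prev = curr
    return prev[-1]


# e.g. levenshtein("kitten", "sitting") == 3  (k->s, e->i, insert g)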