Critical system

A critical system is a system which must be highly reliable and retain this reliability as it evolves without incurring prohibitive costs. There are four types of critical systems: safety critical, mission critical, business critical and security critical.


Description

For such systems, trusted methods and techniques must be used for development. Consequently, critical systems are usually developed using well-tested techniques rather than newer ones that have not been subject to extensive practical experience. Developers of critical systems are naturally conservative, preferring older techniques whose strengths and weaknesses are understood over new techniques that may appear better but whose long-term problems are unknown. Expensive software engineering techniques that are not cost-effective for non-critical systems may be justified for critical systems development. For example, formal mathematical methods of software development have been used successfully for safety- and security-critical systems. One reason formal methods are used is that they help reduce the amount of testing required. For critical systems, the costs of verification and validation are usually very high, typically more than 50% of the total system development costs.
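As a loose illustration of how a precise specification supports verification, the following Python sketch writes the pre- and postconditions of a hypothetical braking-distance routine as executable assertions. This is only a design-by-contract style approximation of what a formal method would state and prove mathematically, and all names and conditions here are illustrative, not taken from any particular standard or system.

```python
def braking_distance(speed_mps: float, deceleration_mps2: float) -> float:
    """Hypothetical routine with its specification written out as
    executable pre- and postconditions (a design-by-contract sketch;
    a formal method would prove these properties rather than check
    them at run time)."""
    # Precondition: speed is non-negative, deceleration strictly positive.
    assert speed_mps >= 0.0 and deceleration_mps2 > 0.0

    distance = (speed_mps ** 2) / (2.0 * deceleration_mps2)

    # Postcondition: distance is non-negative and is zero exactly when
    # the speed is zero.
    assert distance >= 0.0
    assert (distance == 0.0) == (speed_mps == 0.0)
    return distance
```

A formal development would discharge such conditions by proof over all inputs, which is one way the testing effort mentioned above can be reduced.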


Classification

A critical system is distinguished by the consequences associated with system or function failure. Critical systems are further divided into fail-operational and fail-safe systems, according to the tolerance they must exhibit to failures (a brief sketch of the two behaviours follows the list):

* Fail-operational systems are typically required to operate not only in nominal (expected) conditions, but also in degraded situations in which some parts are not working properly. Airplanes, for example, are fail-operational because they must be able to keep flying even if some components fail.
* Fail-safe systems must safely shut down in the case of single or multiple failures. Trains are fail-safe systems because stopping a train is normally sufficient to put it into a safe state.
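The following Python fragment is a minimal sketch of the two failure responses; the sensor channels, the SensorFault exception and the apply_brakes callback are hypothetical names introduced only for illustration. The fail-operational function keeps delivering a value as long as at least one redundant channel still works, whereas the fail-safe handler responds to any fault by driving the system to its safe state.

```python
from enum import Enum, auto


class SensorFault(Exception):
    """Raised by a (hypothetical) sensor channel that has stopped working."""


def read_redundant_speed(channels):
    """Fail-operational sketch: keep operating in a degraded state as long
    as at least one of several redundant speed-sensor channels responds."""
    readings = []
    for read in channels:
        try:
            readings.append(read())
        except SensorFault:
            continue  # a failed channel is tolerated; service continues
    if not readings:
        raise SensorFault("all channels lost")
    readings.sort()
    return readings[len(readings) // 2]  # median vote masks one bad reading


class TrainState(Enum):
    RUNNING = auto()
    STOPPED = auto()


def on_fault_fail_safe(apply_brakes):
    """Fail-safe sketch: on any detected fault, the only action taken is to
    move the system to its safe state (for a train, a standstill)."""
    apply_brakes()
    return TrainState.STOPPED
```

Real fail-operational designs rely on certified redundant hardware and software rather than a simple vote like this; the sketch only shows the shape of the two responses to failure.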


Safety critical

''Safety critical'' systems deal with scenarios that may lead to loss of life, serious personal injury, or damage to the natural environment. Examples of safety-critical systems include the control system of a chemical manufacturing plant, aircraft flight control systems, the controller of a driverless metro train, and the controller of a nuclear plant.


Mission critical

''Mission critical'' systems are designed to avoid failures that would prevent completion of the overall mission, the project objectives, or one of the goals for which the system was designed. Examples of mission-critical systems include the navigational system of a spacecraft and the software controlling the baggage-handling system of an airport.


Business critical

''Business critical'' systems are designed to avoid significant tangible or intangible economic costs, e.g. loss of business or damage to reputation, often caused by an interruption of service that leaves the system unusable. Examples of business-critical systems include a client accounting system for a bank, stock-trading systems, enterprise resource planning systems, search engines, etc. Which systems are business critical is often determined through a business impact analysis. The term is sometimes used interchangeably with 'mission critical'; however, business-critical systems can be defined as those that are not strictly needed during an incident, while mission-critical systems are regarded as essential to operations at all times.


Security critical

''Security critical'' systems deal with the protection of sensitive data against theft or accidental loss.


See also

* Reliability theory
* Reliable system design
* Redundancy (engineering)
* Factor of safety
* Formal methods

