Refinement (computing)
Refinement is a generic term of computer science that encompasses various approaches for producing correct computer programs and simplifying existing programs to enable their formal verification.

Program refinement
In formal methods, program refinement is the verifiable transformation of an ''abstract'' (high-level) formal specification into a ''concrete'' (low-level) executable program. ''Stepwise refinement'' allows this process to be done in stages. Logically, refinement normally involves implication, but there can be additional complications. The progressive just-in-time preparation of the product backlog (requirements list) in agile software development approaches, such as Scrum, is also commonly described as refinement.

Data refinement
Data refinement is used to convert an abstract data model (in terms of sets, for example) into implementable data structures (such as arrays). Operation refinement converts a specification of an operation on a system into an implementable ...
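As a small illustration of data refinement (a sketch, not drawn from the article), the abstract model below is stated in terms of a mathematical set and is refined to a concrete sorted-list implementation; the names AbstractStore, ConcreteStore, and retrieve are hypothetical.

# Illustrative sketch only: an abstract "set" specification refined to a
# concrete sorted-list implementation, related by a retrieve (abstraction)
# function. Not taken from any particular formal-methods tool.

class AbstractStore:
    """Abstract model: a mathematical set of integers."""
    def __init__(self):
        self.elems = set()

    def add(self, x):
        self.elems = self.elems | {x}

    def contains(self, x):
        return x in self.elems


class ConcreteStore:
    """Concrete refinement: a sorted list without duplicates."""
    def __init__(self):
        self.items = []          # representation invariant: sorted, no duplicates

    def add(self, x):
        if x not in self.items:
            self.items.append(x)
            self.items.sort()

    def contains(self, x):
        return x in self.items


def retrieve(concrete):
    """Retrieve (abstraction) function: maps a concrete state back to the
    abstract state it represents. Refinement requires each concrete operation
    to simulate its abstract counterpart under this mapping."""
    return set(concrete.items)


if __name__ == "__main__":
    a, c = AbstractStore(), ConcreteStore()
    for x in (3, 1, 3, 2):
        a.add(x)
        c.add(x)
    # Simulation check: the concrete behaviour matches the abstract model.
    assert retrieve(c) == a.elems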


Correctness (computer science)
In theoretical computer science, an algorithm is correct with respect to a specification if it behaves as specified. Best explored is ''functional'' correctness, which refers to the input-output behavior of the algorithm (i.e., for each input it produces an output satisfying the specification). Within that notion, ''partial correctness'', which requires that ''if'' an answer is returned it will be correct, is distinguished from ''total correctness'', which additionally requires that an answer ''is'' eventually returned, i.e. that the algorithm terminates. Correspondingly, to prove a program's total correctness, it is sufficient to prove its partial correctness and its termination. The latter kind of proof (a termination proof) can never be fully automated, since the halting problem is undecidable. For example, when successively searching through the integers 1, 2, 3, … to see whether an example of some phenomenon (say, an odd perfect number) can be found, it is quite easy to write a par ...
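A minimal Python sketch of that search (an illustration, not part of the excerpt): the function is partially correct, since any value it returns really is an odd perfect number, but whether it is totally correct is open, because it is not known whether the loop ever terminates.

# Partially correct search: if a value is returned it satisfies the
# specification, but termination (hence total correctness) is unknown,
# since the existence of an odd perfect number is an open problem.

def divisor_sum(n):
    """Sum of the proper divisors of n."""
    return sum(d for d in range(1, n) if n % d == 0)

def first_odd_perfect():
    n = 1
    while True:
        if n % 2 == 1 and divisor_sum(n) == n:   # n is an odd perfect number
            return n                             # postcondition holds on return
        n += 1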


Nondeterministic Algorithm
In computer programming, a nondeterministic algorithm is an algorithm that, even for the same input, can exhibit different behaviors on different runs, as opposed to a deterministic algorithm. There are several ways an algorithm may behave differently from run to run. A concurrent algorithm can perform differently on different runs due to a race condition. A probabilistic algorithm's behavior depends on a random number generator. An algorithm that solves a problem in nondeterministic polynomial time can run in polynomial time or exponential time depending on the choices it makes during execution. Nondeterministic algorithms are often used to find an approximation to a solution when the exact solution would be too costly to obtain using a deterministic one. The notion was introduced by Robert W. Floyd in 1967.

Use
Often in computational theory, the term "algorithm" refers to a deterministic algorithm. A nondeterministic algorithm is different from its more familiar determi ...
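As one illustrative case (a sketch, not from the excerpt), the probabilistic routine below consults a random number generator, so two runs on the same input generally produce different outputs.

# Illustrative sketch: a probabilistic algorithm whose observable behaviour
# differs between runs because it depends on a random number generator.
import random

def estimate_pi(samples=100_000):
    """Monte Carlo estimate of pi; successive runs give slightly different answers."""
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

if __name__ == "__main__":
    print(estimate_pi())   # e.g. 3.1408
    print(estimate_pi())   # a different value for the same input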


B-Method
The B method is a method of software development based on B, a tool-supported formal method based on an abstract machine notation, used in the development of computer software. Overview B was originally developed in the 1980s by Jean-Raymond Abrial in France and the UK. B is related to the Z notation (also originated by Abrial) and supports development of programming language code from specifications. B has been used in major safety-critical system applications in Europe (such as the automatic Paris Métro lines 14 and 1 and the Ariane 5 rocket). It has robust, commercially available tool support for specification, design, proof and code generation. Compared to Z, B is slightly more low-level and more focused on refinement to code rather than just formal specification — hence it is easier to correctly implement a specification written in B than one in Z. In particular, there is good tool support for this. The same language is used in specification, design and programming. Mec ...



FermaT Transformation System
FermaT is an industrial-strength program transformation system developed primarily by Martin Ward, based on a theory of formal, semantics-preserving transformations of programs expressed in the Wide Spectrum Language (WSL). It supports program comprehension, reverse engineering, and the migration of legacy code (for example, from assembler to higher-level languages) by applying proven transformation and refinement steps.




Hoare Logic
Hoare logic (also known as Floyd–Hoare logic or Hoare rules) is a formal system with a set of logical rules for reasoning rigorously about the correctness of computer programs. It was proposed in 1969 by the British computer scientist and logician Tony Hoare, and subsequently refined by Hoare and other researchers. The original ideas were seeded by the work of Robert W. Floyd, who had published a similar system for flowcharts.

Hoare triple
The central feature of Hoare logic is the Hoare triple. A triple describes how the execution of a piece of code changes the state of the computation. A Hoare triple is of the form {P} C {Q}, where P and Q are ''assertions'' and C is a ''command''. (Hoare originally wrote "P {C} Q" rather than "{P} C {Q}".) P is named the ''precondition'' and Q the ''postcondition'': when the precondition is met, executing the command establishes the postcondition. Assertions are formulae in predicate logic. Hoare logic provides axioms and inference rules for all the c ...
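For example (a standard illustration, not part of the excerpt), the assignment axiom schema of Hoare logic is {Q[E/x]} x := E {Q}: the precondition is obtained from the postcondition by substituting the assigned expression E for x. A concrete instance:

    {x + 1 = 5}  x := x + 1  {x = 5}

To guarantee that x = 5 holds after the assignment, it suffices that x + 1 = 5 (equivalently, x = 4) holds before it.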


Formal System
A formal system is an abstract structure used for inferring theorems from axioms according to a set of rules. These rules, which are used for carrying out the inference of theorems from axioms, are the logical calculus of the formal system. A formal system is essentially an "axiomatic system". In 1921, David Hilbert proposed to use such a system as the foundation for the knowledge in mathematics. A formal system may represent a well-defined system of abstract thought. The term ''formalism'' is sometimes a rough synonym for ''formal system'', but it also refers to a given style of notation, for example, Paul Dirac's bra–ket notation.

Background
Each formal system is described by primitive symbols (which collectively form an alphabet) used to finitely construct a formal language from a set of axioms through inferential rules of formation. The system thus consists of valid formulas built up through finite combinations of the ...
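As a toy illustration (not drawn from the excerpt), the well-known MIU system has the alphabet {M, I, U}, the single axiom MI, and four rewrite rules; the Python sketch below enumerates the theorems derivable within a bounded number of rule applications.

# Illustrative toy formal system (the MIU system): primitive symbols M, I, U,
# one axiom "MI", and four inference (rewrite) rules. Theorems are exactly the
# strings derivable from the axiom by applying the rules.
def successors(s):
    out = set()
    if s.endswith("I"):               # Rule 1: xI  -> xIU
        out.add(s + "U")
    if s.startswith("M"):             # Rule 2: Mx  -> Mxx
        out.add("M" + s[1:] * 2)
    for i in range(len(s) - 2):       # Rule 3: replace "III" with "U"
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])
    for i in range(len(s) - 1):       # Rule 4: delete "UU"
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])
    return out

def theorems(axiom="MI", depth=3):
    """All strings derivable from the axiom within `depth` rule applications."""
    current = {axiom}
    for _ in range(depth):
        current |= {t for s in current for t in successors(s)}
    return current

# theorems() yields e.g. {'MI', 'MIU', 'MII', 'MIIU', 'MIIII', ...}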


Abstraction (computer science)
In software engineering and computer science, abstraction is:
* the process of removing or generalizing physical, spatial, or temporal details or attributes in the study of objects or systems to focus attention on details of greater importance; it is similar in nature to the process of generalization;
* the creation of abstract concept-objects by mirroring common features or attributes of various non-abstract objects or systems of study – the result of the process of abstraction.
Abstraction, in general, is a fundamental concept in computer science and software development. The process of abstraction can also be referred to as modeling and is closely related to the concepts of ''theory'' and ''design''. Models can also be considered types of abstractions per their generalization of aspects of reality. Abstraction in computer science is closely related to abstraction in mathematics due to their common focus on building abstractions as objects, but is also related to other n ...
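As a concrete, illustrative instance of abstraction in code (the names Queue and ListQueue are hypothetical), an abstract interface exposes only the details that matter to clients and hides the representation behind it.

# Illustrative sketch of data abstraction: clients program against the abstract
# Queue interface; the physical representation (a Python list here) is a hidden
# detail that could be swapped out without affecting those clients.
from abc import ABC, abstractmethod

class Queue(ABC):
    @abstractmethod
    def enqueue(self, item): ...
    @abstractmethod
    def dequeue(self): ...

class ListQueue(Queue):
    def __init__(self):
        self._items = []          # representation detail, hidden from clients
    def enqueue(self, item):
        self._items.append(item)
    def dequeue(self):
        return self._items.pop(0)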




Retrenchment (computing)
Retrenchment is a technique associated with formal methods (mathematically rigorous techniques for the specification, development, and verification of software and hardware systems). It was introduced to address some of the perceived limitations of formal, model-based refinement, for situations in which refinement might be regarded as desirable in principle but turned out to be unusable, or nearly unusable, in practice. It was primarily developed at the School of Computer Science, University of Manchester.


Cliff Jones (computer scientist)
Clifford "Cliff" B. Jones (born 1 June 1944) is a British computer scientist, specializing in research into formal methods. He undertook a late DPhil at the Oxford University Computing Laboratory (now the Oxford University Department of Computer Science) under Tony Hoare, awarded in 1981. Jones' thesis proposed an extension to Hoare logic for handling concurrent programs, rely/guarantee. Prior to his DPhil, Jones worked for IBM, between the Hursley and Vienna Laboratories. In Vienna, Jones worked with Peter Lucas, Dines Bjørner and others on the Vienna Development Method (VDM), originally as a method for specifying the formal semantics of programming languages, and subsequently for specifying and verifying programs. Cliff Jones was a professor at the Victoria University of Manchester in the 1980s and early 1990s, worked in industry at Harlequin for a period, and is now a Professor of Computing Science at Newcastle University. He has been editor-in-chief of the ''Formal Aspe ...



Reification (computer science)
Reification is the process by which an abstract idea about a computer program is turned into an explicit data model or other object created in a programming language. A computable/addressable object—a resource—is created in a system as a proxy for a non-computable/addressable object. By means of reification, something that was previously implicit, unexpressed, and possibly inexpressible is explicitly formulated and made available to conceptual (logical or computational) manipulation. Informally, reification is often referred to as "making something a first-class citizen" within the scope of a particular system. Some aspect of a system can be reified at ''language design time'', which is related to reflection in programming languages. It can be applied as a stepwise refinement at ''system design time''. Reification is one of the most frequently used techniques of conceptual analysis and knowledge representation.

Reflective programming languages
In the context of programming ...
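As a small illustrative sketch (not from the excerpt, with a hypothetical AddCommand class), reification appears in code when an implicit action such as "add this amount later" is turned into an explicit first-class object that can be stored, inspected, and manipulated.

# Illustrative sketch of reification: an implicit action ("add x") becomes an
# explicit, first-class object that can be stored, logged, queued, or undone;
# without reification the call itself would leave no manipulable trace.
from dataclasses import dataclass

@dataclass
class AddCommand:
    amount: int

    def apply(self, value):
        return value + self.amount

    def invert(self):
        # Because the operation is reified as data, its inverse can be computed.
        return AddCommand(-self.amount)

history = [AddCommand(5), AddCommand(3)]        # operations as manipulable objects
value = 0
for cmd in history:
    value = cmd.apply(value)                    # value == 8
value = history[-1].invert().apply(value)       # undo the last operation -> 5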



Empty Set
In mathematics, the empty set is the unique set having no elements; its size or cardinality (count of elements in a set) is zero. Some axiomatic set theories ensure that the empty set exists by including an axiom of empty set, while in other theories, its existence can be deduced. Many possible properties of sets are vacuously true for the empty set. Any set other than the empty set is called non-empty. In some textbooks and popularizations, the empty set is referred to as the "null set". However, null set is a distinct notion within the context of measure theory, in which it describes a set of measure zero (which is not necessarily empty). The empty set may also be called the void set.

Notation
Common notations for the empty set include "{ }", "\emptyset", and "∅". The latter two symbols were introduced by the Bourbaki group (specifically André Weil) in 1939, inspired by the letter Ø in the Danish and Norwegian alphabets. In the past, "0" was occasionally used as a ...
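A brief Python illustration of the two points above (not part of the excerpt): the empty set has cardinality zero and universally quantified claims over it hold vacuously; note that in Python the literal {} denotes an empty dict, so the empty set is written set().

# Illustrative notes on the empty set in Python.
empty = set()
print(len(empty))                          # 0: cardinality of the empty set
print({} == empty)                         # False: {} is an empty dict, not a set
# Vacuous truth: a universally quantified claim over the empty set holds.
print(all(x > x + 1 for x in empty))       # True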