Unifying Theories Of Programming
''Unifying Theories of Programming'' (UTP) in computer science deals with program semantics. It shows how denotational semantics, operational semantics and algebraic semantics can be combined in a unified framework for the formal specification, design and implementation of programs and computer systems. The book of this title by C.A.R. Hoare and He Jifeng was published in the Prentice Hall International Series in Computer Science in 1998 and has been made freely available on the web. A UTP Symposium series was started in 2006.

Theories

The semantic foundation of the UTP is the first-order predicate calculus, augmented with fixed-point constructs from second-order logic. Following the tradition of Eric Hehner, programs are predicates in the UTP, and there is no distinction between programs and specifications at the semantic level. In the words of Hoare: A computer program is identified with the strongest predicate describing every relevant observation that can be made o ...
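
As a small illustration of the relational reading (paraphrasing the standard UTP presentation, with v the list of program variables, v' their final values, and y, ..., z the variables other than x), assignment and sequential composition can be written as predicates:

```latex
% assignment: x takes the value of e, every other variable is left unchanged
x := e \;\;\widehat{=}\;\; x' = e \;\land\; y' = y \;\land\; \cdots \;\land\; z' = z

% sequential composition: some intermediate observation v_0 links P and Q
P \mathbin{;} Q \;\;\widehat{=}\;\; \exists v_0 \bullet P[v_0/v'] \;\land\; Q[v_0/v]
```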

He Jifeng
He Jifeng (born August 1943) is a Chinese computer scientist. He Jifeng graduated from the mathematics department of Fudan University in 1965. From 1965 to 1985, he was an instructor at East China Normal University. During 1980–81, he was a visiting scholar at Stanford University and the University of San Francisco in California, United States. From 1984 to 1998, He Jifeng was a senior research fellow at the Programming Research Group in the Oxford University Computing Laboratory (now the Oxford University Department of Computer Science). He worked extensively on formal aspects of computing science. In particular, he worked with Prof. Sir Tony Hoare, latterly on Unifying Theories of Programming, resulting in a book of that name. Since 1986, He Jifeng has been professor of computer science at East China Normal University in Shanghai. In 1996, he also became professor of computer science at Shanghai Jiao Tong University. In 1998, he became a senior research fellow at the In ...

Eric Hehner
Eric "Rick" C. R. Hehner (born 16 September 1947) is a Canadian computer scientist. He was born in Ottawa. He studied mathematics and physics at Carleton University, graduating with a Bachelor of Science (B.Sc.) in 1969. He studied computer science at the University of Toronto, graduating with a Master of Science (M.Sc.) in 1970, and a Doctor of Philosophy (Ph.D.) in 1974. He then joined the faculty there, becoming a full professor in 1983. He became the Bell University Chair in software engineering in 2001, and retired in 2012. Hehner's main research area is formal methods of software design. His method, initially called predicative programming, later called Practical Theory of Programming, is to consider each specification to be a binary ( boolean) expression, and each programming construct to be a binary expression specifying the effect of executing the programming construct. Refinement is just implication. This is the simplest formal method, and the most general, applying t ...

Archive
An archive is an accumulation of historical records or materials, in any medium, or the physical facility in which they are located. Archives contain primary source documents that have accumulated over the course of an individual or organization's lifetime, and are kept to show the history and function of that person or organization. Professional archivists and historians generally understand archives to be records that have been naturally and necessarily generated as a product of regular legal, commercial, administrative, or social activities. They have been metaphorically defined as "the secretions of an organism", and are distinguished from documents that have been consciously written or created to communicate a particular message to posterity. In general, archives consist of records that have been selected for permanent or long-term preservation on the grounds of their enduring cultural, historical, or evidentiary value. Archival records are normally unpublished and a ...

Least Fixed Point
In order theory, a branch of mathematics, the least fixed point (lfp or LFP, sometimes also smallest fixed point) of a function from a partially ordered set ("poset" for short) to itself is the fixed point which is less than each other fixed point, according to the order of the poset. A function need not have a least fixed point, but if it does then the least fixed point is unique.

Examples

With the usual order on the real numbers, the least fixed point of the real function ''f''(''x'') = ''x''² is ''x'' = 0 (since the only other fixed point is 1 and 0 < 1). In contrast, ''f''(''x'') = ''x'' + 1 has no fixed points at all, so has no least one, and ''f''(''x'') = ''x'' has infinitely many fixed points, but has no least one. Let G = (V, A) be a directed graph and v be a vertex. The ...
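
The truncated graph example presumably concerns the set of vertices reachable from v, which arises as a least fixed point. A minimal sketch (the function names and the example graph are invented for illustration) computes a least fixed point on finite sets by iterating from the empty set, the bottom of the subset order:

```haskell
import qualified Data.Set as Set
import Data.Set (Set)

-- Iterate a monotone function on finite sets upward from the empty set (the
-- bottom of the subset order) until it stabilises: this is the least fixed point.
lfp :: Ord a => (Set a -> Set a) -> Set a
lfp f = go Set.empty
  where
    go s = let s' = f s in if s' == s then s else go s'

-- Example graph, given as a list of arcs (invented for illustration).
edges :: [(Int, Int)]
edges = [(1, 2), (2, 3), (3, 2), (4, 5)]

-- One step of reachability from vertex 1: keep what we have, add vertex 1,
-- and add every successor of an already-reached vertex.
step :: Set Int -> Set Int
step s = Set.unions [Set.singleton 1, s, successors]
  where
    successors = Set.fromList [w | (v, w) <- edges, v `Set.member` s]

main :: IO ()
main = print (lfp step)  -- fromList [1,2,3]
```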

Recursion
Recursion occurs when the definition of a concept or process depends on a simpler or previous version of itself. Recursion is used in a variety of disciplines ranging from linguistics to logic. The most common application of recursion is in mathematics and computer science, where a function being defined is applied within its own definition. While this apparently defines an infinite number of instances (function values), it is often done in such a way that no infinite loop or infinite chain of references can occur. A process that exhibits recursion is ''recursive''. Video feedback displays recursive images, as does an infinity mirror.

Formal definitions

In mathematics and computer science, a class of objects or methods exhibits recursive behavior when it can be defined by two properties:
* A simple ''base case'' (or cases) — a terminating scenario that does not use recursion to produce an answer
* A ''recursive step'' — a set of rules that reduce ...
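
A minimal illustration of these two properties, using the factorial function (defined here for non-negative integers):

```haskell
-- Base case: factorial 0 needs no recursion.
-- Recursive step: factorial n is reduced to factorial (n - 1), a strictly
-- smaller instance, so the chain of calls reaches the base case and stops.
factorial :: Integer -> Integer
factorial 0 = 1
factorial n = n * factorial (n - 1)

main :: IO ()
main = print (factorial 5)  -- 120
```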

Conditional (programming)
In computer science, conditionals (that is, conditional statements, conditional expressions and conditional constructs) are programming language constructs that perform different computations or actions or return different values depending on the value of a Boolean expression, called a ''condition''. Conditionals are typically implemented by selectively executing instructions. Although dynamic dispatch is not usually classified as a conditional construct, it is another way to select between alternatives at runtime.

Terminology

Conditional statements are imperative constructs executed for side-effect, while conditional expressions return values. Many programming languages (such as C) have distinct conditional statements and conditional expressions. Although in pure functional programming, conditional expressions do not have side-effects, many languages with conditional expressions (such as Lisp) support conditional side-effects.

If–then(–else)

The if–then or if ...
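
A small sketch of the distinction in Haskell, where if–then–else is a conditional expression (it returns a value, so the else branch is mandatory) and guards provide another conditional form:

```haskell
-- In Haskell, if-then-else is a conditional expression: it returns a value,
-- so an else branch is mandatory.
absolute :: Int -> Int
absolute x = if x < 0 then negate x else x

-- Guards are another conditional form: the first true condition is selected.
sign :: Int -> String
sign n
  | n > 0     = "positive"
  | n < 0     = "negative"
  | otherwise = "zero"

main :: IO ()
main = do
  print (absolute (-3))  -- 3
  putStrLn (sign 0)      -- zero
```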

Composition Of Relations
In the mathematics of binary relations, the composition of relations is the forming of a new binary relation from two given binary relations ''R'' and ''S''. In the calculus of relations, the composition of relations is called relative multiplication, and its result is called a relative product. Function composition is the special case of composition of relations where all relations involved are functions. The word uncle indicates a compound relation: for a person to be an uncle, he must be the brother of a parent. In algebraic logic it is said that the relation of Uncle (x U z) is the composition of relations "is a brother of" (x B y) and "is a parent of" (y P z).

U = BP \quad \text{means that} \quad xUz \text{ if and only if } \exists y\ xByPz.

Beginning with Augustus De Morgan, the traditional form of reasoning by syllogism has been subsumed by relational logical expressions and their composition.

Definition

If R \subseteq X \times Y and S \subseteq Y \times Z are two binary relations, then their compo ...
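
A minimal sketch of the relative product with relations represented as lists of pairs (the family data is invented for the example):

```haskell
-- A binary relation, represented extensionally as a list of pairs.
type Rel a b = [(a, b)]

-- Relative product: x is related to z when some y links x to z.
compose :: Eq b => Rel a b -> Rel b c -> Rel a c
compose r s = [(x, z) | (x, y) <- r, (y', z) <- s, y == y']

-- Toy data; the names are invented for the example.
brotherOf, parentOf :: Rel String String
brotherOf = [("Bob", "Alice")]        -- Bob is a brother of Alice
parentOf  = [("Alice", "Carol")]      -- Alice is a parent of Carol

-- Uncle is the composition of "is a brother of" and "is a parent of".
uncleOf :: Rel String String
uncleOf = compose brotherOf parentOf  -- [("Bob","Carol")]

main :: IO ()
main = print uncleOf
```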

Sequential Composition
In computer science, the process calculi (or process algebras) are a diverse family of related approaches for formally modelling concurrent systems. Process calculi provide a tool for the high-level description of interactions, communications, and synchronizations between a collection of independent agents or processes. They also provide algebraic laws that allow process descriptions to be manipulated and analyzed, and permit formal reasoning about equivalences between processes (e.g., using bisimulation). Leading examples of process calculi include CSP, CCS, ACP, and LOTOS. More recent additions to the family include the π-calculus, the ambient calculus, PEPA, the fusion calculus and the join-calculus.

Essential features

While the variety of existing process calculi is very large (including variants that incorporate stochastic behaviour, timing information, and specializations for studying molecular interactions), there are several features that all process calculi have i ...
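
As a hedged sketch of sequential composition in this setting (a toy trace model, not a faithful encoding of any particular calculus): a process built with sequential composition first performs the behaviour of its left operand and then that of its right operand.

```haskell
-- A toy process language (not a faithful rendering of any particular calculus):
-- a process terminates, performs an event and then continues, or runs two
-- processes in sequence.
data Proc = Skip | Prefix String Proc | Seq Proc Proc

-- Complete traces of a process. Sequential composition runs the first process
-- to termination and then hands over to the second.
traces :: Proc -> [[String]]
traces Skip         = [[]]
traces (Prefix e p) = [e : t | t <- traces p]
traces (Seq p q)    = [t1 ++ t2 | t1 <- traces p, t2 <- traces q]

main :: IO ()
main = print (traces (Seq (Prefix "a" Skip) (Prefix "b" (Prefix "c" Skip))))
-- [["a","b","c"]]
```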

Universal Closure
In mathematical logic, a universal quantification is a type of quantifier, a logical constant which is interpreted as "given any", "for all", "for every", or "given an arbitrary element". It expresses that a predicate can be satisfied by every member of a domain of discourse. In other words, it is the predication of a property or relation to every member of the domain. It asserts that a predicate within the scope of a universal quantifier is true of every value of a predicate variable. It is usually denoted by the turned A (∀) logical operator symbol, which, when used together with a predicate variable, is called a universal quantifier ("∀x", "∀(x)", or sometimes by "(x)" alone). Universal quantification is distinct from ''existential'' quantification ("there exists"), which only asserts that the property or relation holds for at least one member of the domain. Quantification in general is covered in the article on quantification (logic). The universal quantifier is encoded ...
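
For example, the universal closure of a formula binds every free variable with a universal quantifier, and over a finite domain a universal quantification is just a finite conjunction:

```latex
% the universal closure of a formula with free variables x and y
\text{universal closure of } P(x, y): \quad \forall x\, \forall y\; P(x, y)

% a concrete instance: commutativity of addition, closed over x and y
\forall x\, \forall y\; (x + y = y + x)

% over a finite domain, universal quantification is a finite conjunction
\forall x \in \{1, 2, 3\}\; P(x) \;\equiv\; P(1) \land P(2) \land P(3)
```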

Program Refinement
Refinement is a generic term of computer science that encompasses various approaches for producing correct computer programs and simplifying existing programs to enable their formal verification.

Program refinement

In formal methods, program refinement is the verifiable transformation of an ''abstract'' (high-level) formal specification into a ''concrete'' (low-level) executable program. ''Stepwise refinement'' allows this process to be done in stages. Logically, refinement normally involves implication, but there can be additional complications. The progressive just-in-time preparation of the product backlog (requirements list) in agile software development approaches, such as Scrum, is also commonly described as refinement.

Data refinement

Data refinement is used to convert an abstract data model (in terms of sets for example) into implementable data structures (such as arrays). Operation refinement converts a specification of an operation on a system into an implementab ...
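
A small, hedged sketch of data refinement along these lines (the representation and function names are chosen for the example): the abstract model is a set, the concrete representation is a list, a retrieve function relates the two, and the refinement is correct when abstract and concrete operations agree modulo retrieval.

```haskell
import qualified Data.Set as Set
import Data.Set (Set)

type Abstract = Set Int
type Concrete = [Int]

-- Retrieve (abstraction) function: maps a concrete state to the abstract
-- state it represents.
retrieve :: Concrete -> Abstract
retrieve = Set.fromList

-- Abstract operation and its concrete counterpart.
addAbstract :: Int -> Abstract -> Abstract
addAbstract = Set.insert

addConcrete :: Int -> Concrete -> Concrete
addConcrete x xs = if x `elem` xs then xs else x : xs

-- The refinement is correct for an input when running the concrete operation
-- and then abstracting equals abstracting and then running the abstract one.
commutes :: Int -> Concrete -> Bool
commutes x xs = retrieve (addConcrete x xs) == addAbstract x (retrieve xs)

main :: IO ()
main = print (and [commutes x xs | x <- [0 .. 3], xs <- [[], [1], [2, 2], [0, 1, 3]]])
-- prints True
```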

Predicate Transformer Semantics
Predicate transformer semantics were introduced by Edsger Dijkstra in his seminal paper "Guarded commands, nondeterminacy and formal derivation of programs". They define the semantics of an imperative programming paradigm by assigning to each ''statement'' in this language a corresponding ''predicate transformer'': a total function between two ''predicates'' on the state space of the statement. In this sense, predicate transformer semantics are a kind of denotational semantics. Actually, in guarded commands, Dijkstra uses only one kind of predicate transformer: the well-known weakest preconditions (see below). Moreover, predicate transformer semantics are a reformulation of Floyd–Hoare logic. Whereas Hoare logic is presented as a deductive system, predicate transformer semantics (either by weakest-preconditions or by strongest-postconditions, see below) are complete strategies to build valid deductions of Hoare logic. In other words, they provide an effective algorithm to ...
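
For instance, the weakest precondition of an assignment is obtained by substituting the assigned expression for the variable in the postcondition, and sequential composition composes the transformers (standard rules, stated here for illustration):

```latex
% assignment rule: substitute E for x in the postcondition R
wp(x := E,\; R) \;=\; R[E/x]

% a concrete instance over the integers
wp(x := x + 1,\; x > 0) \;=\; (x + 1 > 0) \;\equiv\; x \geq 0

% sequential composition: apply the transformer of S_2 first, then that of S_1
wp(S_1 ; S_2,\; R) \;=\; wp(S_1,\; wp(S_2,\; R))
```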

Idempotent
Idempotence is the property of certain operations in mathematics and computer science whereby they can be applied multiple times without changing the result beyond the initial application. The concept of idempotence arises in a number of places in abstract algebra (in particular, in the theory of projectors and closure operators) and functional programming (in which it is connected to the property of referential transparency). The term was introduced by American mathematician Benjamin Peirce in 1870 in the context of elements of algebras that remain invariant when raised to a positive integer power, and literally means "(the quality of having) the same power", from ''idem'' + ''potence'' (same + power).

Definition

An element x of a set S equipped with a binary operator \cdot is said to be ''idempotent'' under \cdot if x \cdot x = x. The ''binary operation'' \cdot is said to be ''idempotent'' if \forall x \in S,\ x \cdot x = x.

Examples

* In the monoid (\mathbb{N}, \times) of the natural numbers with multiplication, ...
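
A small finite check of the binary-operation form of the definition (illustrative only, since testing a sample of inputs is not a proof):

```haskell
-- An operation is idempotent on a sample of inputs if x `op` x == x for each x.
idempotentOn :: Eq a => [a] -> (a -> a -> a) -> Bool
idempotentOn xs op = all (\x -> (x `op` x) == x) xs

main :: IO ()
main = do
  print (idempotentOn [0 .. 10 :: Int] max)  -- True:  max x x == x for every x
  print (idempotentOn [0 .. 10 :: Int] (+))  -- False: x + x == x only when x == 0
```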