Dan Willard
Dan Edward Willard is an American computer scientist and logician, and a professor of computer science at the University at Albany.

Education and career
Willard did his undergraduate studies in mathematics at Stony Brook University, graduating in 1970. He went on to graduate studies in mathematics at Harvard University, earning a master's degree in 1972 and a doctorate in 1978. After leaving Harvard, he worked at Bell Labs for four years before joining the Albany faculty in 1983 (Curriculum vitae, accessed 2013-06-04).


Contributions

Although Willard was trained as a mathematician and is employed as a computer scientist, his most highly cited publication is in evolutionary biology. In 1973, with biologist ...


University At Albany
The State University of New York at Albany, commonly referred to as the University at Albany, UAlbany or SUNY Albany, is a public research university with campuses in Albany, Rensselaer, and Guilderland, New York. Founded in 1844, it is one of the four "university centers" of the State University of New York (SUNY) system. The university enrolls 16,648 students in nine schools and colleges, which offer 50 undergraduate majors and 125 graduate degree programs. The university's academic choices include new and emerging fields in public policy, homeland security, globalization, documentary studies, biotechnology, and informatics. Through the UAlbany and SUNY-wide exchange programs, students have more than 600 study-abroad programs to choose from, as well as government and business internship opportunities in New York's capital and surrounding region. It is classified among "R1: Doctoral Universities – Very high research activity". The research enterprise totaled expenditur ...


Information Processing Letters
''Information Processing Letters'' is a peer-reviewed scientific journal in the field of computer science, published by Elsevier. The aim of the journal is to enable fast dissemination of results in the field of information processing in the form of short papers. Submissions are limited to nine double-spaced pages. Both theoretical and experimental research is covered. The journal was established in 1971.


Gödel's Incompleteness Theorems
Gödel's incompleteness theorems are two theorems of mathematical logic that are concerned with the limits of provability in formal axiomatic theories. These results, published by Kurt Gödel in 1931, are important both in mathematical logic and in the philosophy of mathematics. The theorems are widely, but not universally, interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible. The first incompleteness theorem states that no consistent system of axioms whose theorems can be listed by an effective procedure (i.e., an algorithm) is capable of proving all truths about the arithmetic of natural numbers. For any such consistent formal system, there will always be statements about natural numbers that are true, but that are unprovable within the system. The second incompleteness theorem, an extension of the first, shows that the system cannot demonstrate its own consistency. Employing a diagonal argument, Gödel's inc ...
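Stated symbolically, in standard notation not present in the excerpt above (here T ranges over consistent, effectively axiomatized theories containing enough arithmetic):

```latex
% First theorem: some arithmetic sentence G_T is true in the standard model
% of the natural numbers but is not provable in T.
\exists\, G_T \;\; \bigl(\, \mathbb{N} \models G_T \;\wedge\; T \nvdash G_T \,\bigr)

% Second theorem: T cannot prove the sentence Con(T) that formalizes
% "T is consistent".
T \nvdash \mathrm{Con}(T)
```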


Self-verifying Theories
Self-verifying theories are consistent first-order systems of arithmetic, much weaker than Peano arithmetic, that are capable of proving their own consistency. Dan Willard was the first to investigate their properties, and he has described a family of such systems. According to Gödel's incompleteness theorem, these systems cannot contain the theory of Peano arithmetic or its weak fragment, Robinson arithmetic; nonetheless, they can contain strong theorems. In outline, the key to Willard's construction of his system is to formalise enough of the Gödel machinery to talk about provability internally without being able to formalise diagonalisation. Diagonalisation depends upon being able to prove that multiplication is a total function (and in the earlier versions of the result, addition also). Addition and multiplication are not function symbols of Willard's language; instead, subtraction and division are, with the addition and multiplication predicates being defined in terms of ...
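To make the last point concrete, the following is one illustrative encoding (a sketch, not necessarily the exact definitions used in Willard's systems), where x ∸ y denotes truncated subtraction max(x − y, 0) and ÷ denotes integer (floor) division:

```latex
% Addition as a predicate: Add(x, y, z) holds exactly when z = x + y.
% (If z >= y the first conjunct gives z = x + y; if z < y it forces x = 0,
% and then the second conjunct forces z = y, a contradiction.)
\mathrm{Add}(x, y, z) \;\equiv\; (z \mathbin{\dot-} y = x) \,\wedge\, (z \mathbin{\dot-} x = y)

% Multiplication as a predicate: Mult(x, y, z) holds exactly when z = x * y.
% For y, z >= 1, the quotient of z by y equals x and drops below x at z - 1
% precisely when y divides z with quotient x.
\mathrm{Mult}(x, y, z) \;\equiv\;
  \bigl((x = 0 \vee y = 0) \wedge z = 0\bigr)
  \;\vee\;
  \bigl(y \geq 1 \,\wedge\, z \div y = x \,\wedge\, (z \mathbin{\dot-} 1) \div y < x\bigr)
```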


Shortest Path Problem
In graph theory, the shortest path problem is the problem of finding a path between two vertices (or nodes) in a graph such that the sum of the weights of its constituent edges is minimized. The problem of finding the shortest path between two intersections on a road map may be modeled as a special case of the shortest path problem in graphs, where the vertices correspond to intersections and the edges correspond to road segments, each weighted by the length of the segment.

Definition
The shortest path problem can be defined for graphs whether undirected, directed, or mixed. It is defined here for undirected graphs; for directed graphs the definition of path requires that consecutive vertices be connected by an appropriate directed edge. Two vertices are adjacent when they are both incident to a common edge. A path in an undirected graph is a sequence of vertices P = ( v_1, v_2, \ldots, v_n ) \in V \times V \times \cdots \times V such that v_i is adjacent to v_{i+1} for 1 \le ...
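As a concrete illustration of the single-source case with non-negative edge weights, the following is a minimal sketch of Dijkstra's algorithm in Python (the adjacency-list representation, function name, and the small road-network example are illustrative assumptions, not taken from the text above):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest path distances for non-negative edge weights.

    graph: dict mapping vertex -> list of (neighbor, weight) pairs.
    Returns: dict mapping each reachable vertex to its shortest distance.
    """
    dist = {source: 0}
    heap = [(0, source)]                      # (distance, vertex) priority queue
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):     # stale entry, already improved
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Example: intersections as vertices, road segments as weighted edges.
roads = {"A": [("B", 4), ("C", 2)], "C": [("B", 1), ("D", 5)], "B": [("D", 1)], "D": []}
print(dijkstra(roads, "A"))   # {'A': 0, 'B': 3, 'C': 2, 'D': 4}
```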


Minimum Spanning Tree
A minimum spanning tree (MST) or minimum weight spanning tree is a subset of the edges of a connected, edge-weighted undirected graph that connects all the vertices together, without any cycles and with the minimum possible total edge weight. That is, it is a spanning tree whose sum of edge weights is as small as possible. More generally, any edge-weighted undirected graph (not necessarily connected) has a minimum spanning forest, which is a union of the minimum spanning trees for its connected components. There are many use cases for minimum spanning trees. One example is a telecommunications company trying to lay cable in a new neighborhood. If it is constrained to bury the cable only along certain paths (e.g. roads), then there would be a graph containing the points (e.g. houses) connected by those paths. Some of the paths might be more expensive, because they are longer, or require the cable to be buried deeper; these paths would be represented by edges with larger weights. ...
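A minimal sketch of Kruskal's algorithm in Python, one standard way to compute a minimum spanning tree by taking edges in increasing weight order and skipping any edge that would create a cycle (the function name and the small cable-laying example are illustrative):

```python
def minimum_spanning_tree(n, edges):
    """Kruskal's algorithm.

    n: number of vertices, labelled 0..n-1.
    edges: list of (weight, u, v) tuples for an undirected graph.
    Returns: (total_weight, list of chosen edges as (u, v, weight)).
    """
    parent = list(range(n))

    def find(x):                       # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0, []
    for w, u, v in sorted(edges):      # consider edges in increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                   # adding this edge creates no cycle
            parent[ru] = rv
            total += w
            chosen.append((u, v, w))
    return total, chosen

# Example: houses 0..3 with possible cable routes and their costs.
print(minimum_spanning_tree(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)]))
# -> (6, [(0, 1, 1), (2, 3, 2), (1, 2, 3)])
```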




Science News
''Science News'' (''SN'') is an American biweekly magazine devoted to articles about new scientific and technical developments, typically gleaned from recent scientific and technical journals.

History
''Science News'' has been published since 1922 by the Society for Science & the Public, a non-profit organization founded by E. W. Scripps in 1920. American chemist Edwin Slosson served as the publication's first editor. From 1922 to 1966, it was called ''Science News Letter''. The title was changed to ''Science News'' with the March 12, 1966 issue (vol. 89, no. 11). Tom Siegfried was the editor from 2007 to 2012. In 2012, Siegfried stepped down, and Eva Emerson became the Editor in Chief of the magazine. In 2017, Eva Emerson stepped down to become the editor of a new digital magazine published by Annual Reviews. On February 1, 2018, Nancy Shute became the Editor in Chief of the magazine. In April 2008, the magazine changed from a weekly format to the current biweekly format, and the website ...


Journal Of Computer And System Sciences
The ''Journal of Computer and System Sciences'' (JCSS) is a peer-reviewed scientific journal in the field of computer science. ''JCSS'' is published by Elsevier, and it was started in 1967. Many influential scientific articles have been published in ''JCSS''; these include five papers that have won the Gödel Prize, among them the 1993 and 2014 Gödel Prize winners. Its managing editor is ...
Priority Queue
In computer science, a priority queue is an abstract data type similar to a regular queue or stack data structure in which each element additionally has a ''priority'' associated with it. In a priority queue, an element with high priority is served before an element with low priority. In some implementations, if two elements have the same priority, they are served according to the order in which they were enqueued; in other implementations the ordering of elements with the same priority remains undefined. While coders often implement priority queues with heaps, they are conceptually distinct from heaps. A priority queue is a concept like a list or a map; just as a list can be implemented with a linked list or with an array, a priority queue can be implemented with a heap or with a variety of other methods such as an unordered array.

Operations
A priority queue must at least support the following operations (illustrated in the sketch below):
* ''is_empty'': check whether the queue has no elements.
* ''insert_ ...
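A minimal sketch in Python of these operations, backed by the standard-library heapq module; the tie-breaking counter is one way to serve equal-priority elements in insertion order, which, as noted, not every implementation guarantees (class and method names are illustrative):

```python
import heapq
import itertools

class PriorityQueue:
    """Min-priority queue: lower priority values are served first."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker: FIFO among equal priorities

    def is_empty(self):
        return not self._heap

    def insert(self, item, priority):
        heapq.heappush(self._heap, (priority, next(self._counter), item))

    def pull(self):
        """Remove and return the highest-priority (lowest value) element."""
        priority, _, item = heapq.heappop(self._heap)
        return item

# Usage: tasks are served by urgency, not by arrival order.
pq = PriorityQueue()
pq.insert("write report", priority=2)
pq.insert("fix outage", priority=0)
pq.insert("reply to email", priority=1)
while not pq.is_empty():
    print(pq.pull())   # fix outage, reply to email, write report
```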


Fusion Tree
In computer science, a fusion tree is a type of tree data structure that implements an associative array on w-bit integers over a finite universe, where each input integer is non-negative and less than 2^w. When operating on a collection of n key–value pairs, it uses O(n) space and performs searches in O(log_w n) time, which is asymptotically faster than a traditional self-balancing binary search tree, and also better than the van Emde Boas tree for large values of w. It achieves this speed by using certain constant-time operations that can be done on a machine word. Fusion trees were invented in 1990 by Michael Fredman and Dan Willard. Several advances have been made since Fredman and Willard's original 1990 paper. In 1999 it was shown how to implement fusion trees under a model of computation in which all of the underlying operations of the algorithm belong to AC0, a model of circuit complexity that allows addition and bitwise Boolean operations but does not allow the multi ...
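The kind of constant-time machine-word operation alluded to above can be illustrated with a packed comparison: several short keys are stored in one word and ranked against a query with a single subtraction. The sketch below uses plain Python integers to stand in for a w-bit word and is only a toy ingredient, not Fredman and Willard's actual node layout:

```python
def pack(keys, b):
    """Pack b-bit keys into one integer, one (b+1)-bit field per key,
    leaving each field's top (test) bit as 0."""
    word = 0
    for i, key in enumerate(keys):
        assert 0 <= key < (1 << b)
        word |= key << (i * (b + 1))
    return word

def rank(query, packed, k, b):
    """Number of packed keys <= query, found with one subtraction.

    Each field of the difference holds 2^b + query - key_i; its top bit is 1
    exactly when query >= key_i, and the leading 1 stops borrows from
    crossing field boundaries.
    """
    field = b + 1
    q_rep = sum(((1 << b) | query) << (i * field) for i in range(k))
    diff = q_rep - packed
    test_bits = diff & sum(1 << (i * field + b) for i in range(k))
    return bin(test_bits).count("1")          # popcount of the test bits

keys = [3, 7, 12, 25]                         # sorted 5-bit keys
w = pack(keys, b=5)
print(rank(10, w, k=4, b=5))                  # 2  (keys 3 and 7 are <= 10)
print(rank(30, w, k=4, b=5))                  # 4
```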


Transdichotomous Model
In computational complexity theory, and more specifically in the analysis of algorithms with integer data, the transdichotomous model is a variation of the random access machine in which the machine word size is assumed to match the problem size. The model was proposed by Michael Fredman and Dan Willard, who chose its name "because the dichotomy between the machine model and the problem size is crossed in a reasonable manner." In a problem such as integer sorting in which there are n integers to be sorted, the transdichotomous model assumes that each integer may be stored in a single word of computer memory, that operations on single words take constant time per operation, and that the number of bits that can be stored in a single word is at least log_2 n. The goal of complexity analysis in this model is to find time bounds that depend only on n and not on the actual size of the input values or the machine words. In modeling integer computation, it is necessary to assume that machine wo ...
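A small numerical illustration of the assumption (an illustrative sketch, not taken from the cited work): with n items and word size w >= log_2 n, every array index, and in this example every input value as well, fits in a single word, so a word operation on it is charged constant time.

```python
import math
import random

n = 1_000_000
w = math.ceil(math.log2(n))          # transdichotomous assumption: w >= log2(n)

# Every array index 0..n-1 fits in one w-bit word ...
assert n - 1 < (1 << w)

# ... and the model also lets the inputs themselves be single words, so an
# operation such as comparing or masking two of them counts as O(1).
values = [random.randrange(1 << w) for _ in range(10)]
print(w, all(v < (1 << w) for v in values))   # e.g. "20 True"
```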


Radix Sorting
In computer science, radix sort is a non-comparative sorting algorithm. It avoids comparison by creating and distributing elements into buckets according to their radix. For elements with more than one significant digit, this bucketing process is repeated for each digit, while preserving the ordering of the prior step, until all digits have been considered. For this reason, radix sort has also been called bucket sort and digital sort. Radix sort can be applied to data that can be sorted lexicographically, be they integers, words, punch cards, playing cards, or the mail.

History
Radix sort dates back as far as 1887 to the work of Herman Hollerith on tabulating machines. Radix sorting algorithms came into common use as a way to sort punched cards as early as 1923. (Donald Knuth, ''The Art of Computer Programming'', Volume 3: ''Sorting and Searching'', Third Edition. Addison-Wesley, 1997. Section 5.2.5: Sorting by Distribution, pp. 168–179.) The first memory-eff ...
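A minimal least-significant-digit-first radix sort sketch in Python, using base-10 digits and one stable bucketing pass per digit as described above (function name and example values are illustrative):

```python
def radix_sort(values, base=10):
    """LSD radix sort for non-negative integers.

    Each pass distributes elements into buckets by one digit, from least to
    most significant; because every pass is stable, the order produced by
    earlier passes is preserved among elements whose current digit is equal.
    """
    if not values:
        return []
    out = list(values)
    max_value = max(out)
    place = 1
    while place <= max_value:
        buckets = [[] for _ in range(base)]
        for v in out:
            buckets[(v // place) % base].append(v)   # stable: keep arrival order
        out = [v for bucket in buckets for v in bucket]
        place *= base
    return out

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```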