Mihalis Yannakakis
Mihalis Yannakakis (Greek: Μιχάλης Γιαννακάκης; born 13 September 1953 in Athens, Greece) is professor of computer science at Columbia University (Columbia University: CV: Mihalis Yannakakis, accessed 12 November 2009). He is noted for his work in computational complexity, databases, and other related fields. He won the Knuth Prize in 2005.


Athens
Athens (Greek: Αθήνα, Athína; Ancient Greek: Ἀθῆναι, Athênai (pl.)) is both the capital and largest city of Greece. With a population close to four million, it is also the seventh largest city in the European Union. Athens dominates and is the capital of the Attica region and is one of the world's oldest cities, with its recorded history spanning over 3,400 years and its earliest human presence beginning somewhere between the 11th and 7th millennia BC. Classical Athens was a powerful city-state. It was a centre for the arts, learning and philosophy, and the home of Plato's Academy and Aristotle's Lyceum. It is widely referred to as the cradle of Western civilization and the birthplace of democracy, largely because of its cultural and political influence on the European continent, particularly Ancient Rome. In modern times, Athens is a large cosmopolitan metropolis and central to economic, financial, industrial, maritime, political and cultural life in Greece ...


Journal Of The ACM
The ''Journal of the ACM'' is a peer-reviewed scientific journal covering computer science in general, especially theoretical aspects. It is an official journal of the Association for Computing Machinery. Its current editor-in-chief is Venkatesan Guruswami. The journal was established in 1954 and "computer scientists universally hold the ''Journal of the ACM'' in high esteem". See also ''Communications of the ACM'', the monthly journal of the Association for Computing Machinery (ACM), established in 1958 with Saul Rosen as its first managing editor; it is sent to all ACM members, with articles intended for readers with ...



Independent Set Problem
In graph theory, an independent set, stable set, coclique or anticlique is a set of vertices in a graph, no two of which are adjacent. That is, it is a set S of vertices such that for every two vertices in S, there is no edge connecting the two. Equivalently, each edge in the graph has at most one endpoint in S. A set is independent if and only if it is a clique in the graph's complement. The size of an independent set is the number of vertices it contains. Independent sets have also been called "internally stable sets", of which "stable set" is a shortening. A maximal independent set is an independent set that is not a proper subset of any other independent set. A maximum independent set is an independent set of largest possible size for a given graph G. This size is called the independence number of ''G'' and is usually denoted by \alpha(G). The optimization problem of finding such a set is called the maximum independent set problem. It is a strongly NP-hard problem. As such ...
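As a concrete companion to these definitions, here is a small sketch (hypothetical helper names, assuming the graph is given as an adjacency-set dictionary) that checks independence and grows a maximal independent set greedily; maximality in this greedy sense gives no size guarantee, since finding a maximum independent set is NP-hard.

```python
# Sketch: independence check and greedy maximal independent set.
# Assumes a graph given as {vertex: set(neighbors)}; names are illustrative.

def is_independent(graph, vertices):
    """True if no two vertices in `vertices` are adjacent."""
    vs = set(vertices)
    return all(graph[v].isdisjoint(vs - {v}) for v in vs)

def greedy_maximal_independent_set(graph):
    """Grow an independent set until no vertex can be added (maximal,
    not necessarily maximum -- the maximum version is NP-hard)."""
    chosen = set()
    for v in graph:                      # arbitrary vertex order
        if graph[v].isdisjoint(chosen):  # v has no chosen neighbor
            chosen.add(v)
    return chosen

# 4-cycle a-b-c-d-a: {a, c} and {b, d} are the maximum independent sets.
g = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"a", "c"}}
assert is_independent(g, {"a", "c"})
print(greedy_maximal_independent_set(g))  # e.g. {'a', 'c'}
```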



3SAT
In logic and computer science, the Boolean satisfiability problem (sometimes called propositional satisfiability problem and abbreviated SATISFIABILITY, SAT or B-SAT) is the problem of determining if there exists an interpretation that satisfies a given Boolean formula. In other words, it asks whether the variables of a given Boolean formula can be consistently replaced by the values TRUE or FALSE in such a way that the formula evaluates to TRUE. If this is the case, the formula is called ''satisfiable''. On the other hand, if no such assignment exists, the function expressed by the formula is FALSE for all possible variable assignments and the formula is ''unsatisfiable''. For example, the formula "''a'' AND NOT ''b''" is satisfiable because one can find the values ''a'' = TRUE and ''b'' = FALSE, which make (''a'' AND NOT ''b'') = TRUE. In contrast, "''a'' AND NOT ''a''" is unsatisfiable. SAT is the first problem that was proved to be NP-complete ...
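The "''a'' AND NOT ''b''" example can be checked mechanically. The sketch below (illustrative names; a deliberately naive exhaustive search, not how practical DPLL/CDCL-based solvers work) tries all 2^n assignments and returns a satisfying one if it exists:

```python
# Sketch: brute-force SAT check over all 2^n assignments (illustrative only;
# real solvers use DPLL/CDCL rather than exhaustive search).
from itertools import product

def find_satisfying_assignment(variables, formula):
    """Return a dict making `formula` TRUE, or None if unsatisfiable."""
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            return assignment
    return None

# "a AND NOT b" is satisfiable: a = TRUE, b = FALSE works.
print(find_satisfying_assignment(["a", "b"], lambda v: v["a"] and not v["b"]))
# "a AND NOT a" is unsatisfiable: None is returned.
print(find_satisfying_assignment(["a"], lambda v: v["a"] and not v["a"]))
```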


Christos Papadimitriou
Christos Charilaos Papadimitriou (Greek: Χρήστος Χαρίλαος "Χρίστος" Παπαδημητρίου; born August 16, 1949) is a Greek theoretical computer scientist and the Donovan Family Professor of Computer Science at Columbia University.

Education: Papadimitriou studied at the National Technical University of Athens, where in 1972 he received his Bachelor of Arts degree in electrical engineering. He then pursued graduate studies at Princeton University, where he received his Ph.D. in electrical engineering and computer science in 1976 after completing a doctoral dissertation titled "The complexity of combinatorial optimization problems."

Career: Papadimitriou has taught at Harvard, MIT, the National Technical University of Athens, Stanford, UCSD, and the University of California, Berkeley, and is currently the Donovan Family Professor of Computer Science at Columbia University. Papadimitriou co-authored a paper on pancake sorting with Bill Gates, then a Harvard undergraduate ...


ACM Symposium On Theory Of Computing
The Annual ACM Symposium on Theory of Computing (STOC) is an academic conference in the field of theoretical computer science. STOC has been organized annually since 1969, typically in May or June; the conference is sponsored by the Association for Computing Machinery special interest group SIGACT. The acceptance rate of STOC, averaged from 1970 to 2012, is 31%, with a rate of 29% in 2012. STOC and its annual IEEE counterpart FOCS (the Symposium on Foundations of Computer Science) are considered the two top conferences in theoretical computer science, considered broadly: they "are forums for some of the best work throughout theory of computing that promote breadth among theory of computing researchers and help to keep the community together." Regular attendance at STOC and FOCS has been described as one of several defining characteristics of theoretical computer scientists.

Awards: The Gödel Prize for outstanding papers in theoretical computer science is presented alternately ...




Hardness Of Approximation
In computer science, hardness of approximation is a field that studies the algorithmic complexity of finding near-optimal solutions to optimization problems.

Scope: Hardness of approximation complements the study of approximation algorithms by proving, for certain problems, a limit on the factors with which their solution can be efficiently approximated. Typically such limits show a factor of approximation beyond which a problem becomes NP-hard, implying that finding a polynomial-time approximation for the problem is impossible unless P = NP. Some hardness of approximation results, however, are based on other hypotheses, a notable one among which is the unique games conjecture.

History: Since the early 1970s it was known that many optimization problems could not be solved in polynomial time unless P = NP, but in many of these problems the optimal solution could be efficiently approximated to a certain degree. In the 1970s, Teofilo F. Gonzalez and Sartaj Sahni began the study of hardness of approximation ...
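As a concrete instance of a problem that can be "efficiently approximated to a certain degree", minimum vertex cover admits a simple factor-2 approximation via a greedy maximal matching, sketched below (illustrative names; graph given as an edge list). On the hardness side, approximating vertex cover within any factor better than 2 is known to be hard under the unique games conjecture.

```python
# Sketch: the classic factor-2 approximation for minimum vertex cover,
# via a greedy maximal matching (graph given as an edge list; names illustrative).

def vertex_cover_2approx(edges):
    """Return a vertex cover at most twice the optimum size."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            # Edge not yet covered: take BOTH endpoints. Any optimal cover
            # must contain at least one of them, giving the factor of 2.
            cover.update((u, v))
    return cover

# Path a-b-c-d: the optimum cover {b, c} has size 2; the greedy cover has size <= 4.
print(vertex_cover_2approx([("a", "b"), ("b", "c"), ("c", "d")]))
```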


PCP Theory
In computational complexity theory, a probabilistically checkable proof (PCP) is a type of proof that can be checked by a randomized algorithm using a bounded amount of randomness and reading a bounded number of bits of the proof. The algorithm is then required to accept correct proofs and reject incorrect proofs with very high probability. A standard proof (or certificate), as used in the verifier-based definition of the complexity class NP, also satisfies these requirements, since the checking procedure deterministically reads the whole proof, always accepts correct proofs and rejects incorrect proofs. However, what makes them interesting is the existence of probabilistically checkable proofs that can be checked by reading only a few bits of the proof using randomness in an essential way. Probabilistically checkable proofs give rise to many complexity classes depending on the number of queries required and the amount of randomness used. The class PCP[''r''(''n''), ''q''(''n'')] ...
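In this notation, the PCP theorem characterizes NP exactly:

```latex
% The PCP theorem (Arora-Safra; Arora-Lund-Motwani-Sudan-Szegedy):
% every NP statement has a proof checkable with O(log n) random bits
% and a constant number of queried proof bits.
\[
  \mathsf{NP} \;=\; \mathsf{PCP}\bigl[\, O(\log n),\; O(1) \,\bigr]
\]
```

This equality, combined with gap-preserving reductions, is the main engine behind the hardness-of-approximation results described in the previous entry.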



Algorithmic Graph Theory
In mathematics, graph theory is the study of ''graphs'', which are mathematical structures used to model pairwise relations between objects. A graph in this context is made up of ''vertices'' (also called ''nodes'' or ''points'') which are connected by ''edges'' (also called ''links'' or ''lines''). A distinction is made between undirected graphs, where edges link two vertices symmetrically, and directed graphs, where edges link two vertices asymmetrically. Graphs are one of the principal objects of study in discrete mathematics.

Definitions: Definitions in graph theory vary. The following are some of the more basic ways of defining graphs and related mathematical structures. Graph: In one restricted but very common sense of the term, a graph is an ordered pair G = (V, E) comprising:
* V, a set of vertices (also called nodes or points);
* E \subseteq \{ \{x, y\} \mid x, y \in V,\ x \neq y \}, a set of edges (also called links or lines), which are unordered pairs of vertices (that is, an edge is associated with ...
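This ordered-pair definition translates almost verbatim into code. A minimal sketch (vertex labels and names are illustrative) stores V as a set and E as a set of frozensets, since the edges of an undirected graph are unordered pairs:

```python
# Sketch: an undirected graph as the ordered pair G = (V, E), with edges
# stored as frozensets because {x, y} is an unordered pair.

V = {"a", "b", "c"}
E = {frozenset({"a", "b"}), frozenset({"b", "c"})}

# The definition requires each edge to be a 2-element subset of V
# (x != y rules out self-loops).
assert all(len(e) == 2 and e <= V for e in E)

def neighbors(v):
    """All vertices sharing an edge with v."""
    return {w for e in E if v in e for w in e if w != v}

print(neighbors("b"))  # {'a', 'c'}
```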


Database Theory
Database theory encapsulates a broad range of topics related to the study and research of the theoretical realm of databases and database management systems. Theoretical aspects of data management include, among other areas, the foundations of query languages, computational complexity and expressive power of queries, finite model theory, database design theory, dependency theory, foundations of concurrency control and database recovery, deductive databases, temporal and spatial databases, real-time databases, managing uncertain data and probabilistic databases, and Web data. Most research work has traditionally been based on the relational model, since this model is usually considered the simplest and most foundational model of interest. Corresponding results for other data models, such as object-oriented or semi-structured models, or, more recently, graph data models and XML, are often derivable from those for the relational model. A central focus of database theory is ...



Computational Complexity Theory
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer. A computational problem is solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and quantifying their computational complexity, i.e., the amount of resources needed to solve them, such as time and storage. Other measures of complexity are also used, such as the amount of communication (used in communication complexity), the number of gates in a circuit (used in circuit complexity) and the number of processors (used in parallel computing). One of the roles of computational complexity theory ...



H-index
The ''h''-index is an author-level metric that measures both the productivity and citation impact of an author's publications; it was initially used for individual scientists and scholars. The ''h''-index correlates with obvious success indicators such as winning the Nobel Prize, being accepted for research fellowships and holding positions at top universities. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. The index has more recently been applied to the productivity and impact of a scholarly journal as well as a group of scientists, such as a department or university or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality and is sometimes called the Hirsch index or Hirsch number.

Definition and purpose: The ''h''-index is defined as the maximum value of ''h'' such that the given author/ ...
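Concretely, ''h'' is the largest value such that the author has at least ''h'' papers cited at least ''h'' times each. A minimal sketch of that computation (illustrative function name, assuming a plain list of per-paper citation counts):

```python
# Sketch: compute the h-index as the largest h such that the author has
# at least h papers with at least h citations each.

def h_index(citations):
    cites = sorted(citations, reverse=True)
    h = 0
    while h < len(cites) and cites[h] >= h + 1:
        h += 1
    return h

# Five papers cited [10, 8, 5, 4, 3]: four papers have >= 4 citations,
# but there are not five with >= 5, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```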