This is a list of important publications in computer science, organized by field. Some reasons why a particular publication might be regarded as important:
*Topic creator – A publication that created a new topic
*Breakthrough – A publication that changed scientific knowledge significantly
*Influence – A publication that has significantly influenced the world or has had a massive impact on the teaching of computer science.


Artificial intelligence


''Computing Machinery and Intelligence''

*
Alan Turing
* Mind, 59:433–460, 1950.
Online copy
Description: This paper discusses the various arguments for why a machine cannot be intelligent and asserts that none of them is convincing. It also suggested the Turing test, which it calls "the imitation game": according to Turing, it is pointless to ask whether a machine can ''think'' intelligently, and checking whether it can ''act'' intelligently is sufficient.


''A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence''

* John McCarthy *
Marvin Minsky
* N. Rochester * C.E. Shannon
Online copy
Description: This summer research proposal inaugurated and defined the field. It contains the first use of the term artificial intelligence and this succinct description of the philosophical foundation of the field: "every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it." (See philosophy of AI.) The proposal invited researchers to the Dartmouth conference, which is widely considered the "birth of AI". (See history of AI.)


''Fuzzy sets''

* Lotfi Zadeh * Information and Control, Vol. 8, pp. 338–353 (1965). Description: This seminal 1965 paper details the mathematics of fuzzy set theory.
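
Zadeh's definitions are simple enough to state in a few lines of code. The sketch below is our own toy illustration (the "tall" set and its breakpoints are invented), showing a membership function and the pointwise complement/max/min operations the paper defines:

```python
def tall(height_cm):
    """Membership grade of height_cm in the fuzzy set 'tall' (a toy set)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30.0

# Zadeh's original operations on membership grades:
def fuzzy_not(mu):
    return 1.0 - mu

def fuzzy_or(mu, nu):      # union: pointwise maximum
    return max(mu, nu)

def fuzzy_and(mu, nu):     # intersection: pointwise minimum
    return min(mu, nu)

print(tall(175))                                   # 0.5: partial membership
print(fuzzy_and(tall(175), fuzzy_not(tall(175))))  # 0.5, not 0 as in crisp sets
```

The last line highlights a distinctive feature of fuzzy sets: an element can partially belong to a set and its complement at once.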


''Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference''

*
Judea Pearl
* Publisher: Morgan Kaufmann Pub, 1988 Description: This book introduced Bayesian methods to AI.


''Artificial Intelligence: A Modern Approach''

*
Stuart J. Russell and Peter Norvig
* Prentice Hall, Englewood Cliffs, New Jersey, 1995,
Textbook's website
Description: The standard textbook in artificial intelligence; the book's website lists over 1100 colleges.


Machine learning


''An Inductive Inference Machine''

* Ray Solomonoff * ''IRE Convention Record'', Section on Information Theory, Part 2, pp. 56–62, 1957. (A longer version, a privately circulated report from 1956, is online.) Description: The first paper written on machine learning. It emphasized the importance of training sequences and of using parts of previous solutions to problems in constructing trial solutions to new problems.


''Language identification in the limit''

* E. Mark Gold * ''Information and Control'', 10(5):447–474, 1967 * Online version (HTML)(PDF)
Description: This paper created the field of algorithmic learning theory.


''On the uniform convergence of relative frequencies of events to their probabilities''

* V. Vapnik, A. Chervonenkis * ''Theory of Probability and Its Applications'', 16(2):264–280, 1971. Description: Computational learning theory, VC theory, statistical uniform convergence, and the VC dimension. While the bounds in this paper are not the best possible, it took 50 years before the best bound was obtained, by Michael Naaman in 2021.



''A theory of the learnable''

* Leslie Valiant * ''Communications of the ACM'', 27(11):1134–1142 (1984). Description: The probably approximately correct (PAC) learning framework.
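
The flavor of the framework can be shown with the standard sample-complexity bound for a finite hypothesis class: with m ≥ (1/ε)(ln|H| + ln(1/δ)) examples, a consistent learner is, with probability at least 1 − δ, within error ε of the target. The function name and toy numbers below are our own:

```python
import math

def pac_sample_bound(h_size, eps, delta):
    """Examples sufficient for a consistent learner over a finite class H of
    size h_size to be probably (1 - delta) approximately (eps) correct."""
    return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / eps)

# 1024 hypotheses, 10% error tolerance, 95% confidence:
print(pac_sample_bound(2 ** 10, eps=0.1, delta=0.05))   # 100
```

Note the bound grows only logarithmically in the number of hypotheses, which is what makes learning over large hypothesis classes feasible.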


''Learning representations by back-propagating errors''

* David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams * Nature, 323, 533–536, 1986. Description: Seppo Linnainmaa's reverse mode of automatic differentiation (Linnainmaa, Seppo (1970). The representation of the cumulative rounding error of an algorithm as a Taylor expansion of the local rounding errors. Master's Thesis, Univ. Helsinki, 6–7; Griewank, Andreas (2012). Who Invented the Reverse Mode of Differentiation? Optimization Stories, Documenta Mathematica, Extra Volume ISMP, 389–400), first applied to neural networks by Paul Werbos, is used in experiments by David Rumelhart, Geoff Hinton and Ronald J. Williams to learn internal representations.
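
The chain-rule computation at the heart of backpropagation can be sketched for a single sigmoid neuron with squared error; the weights, inputs, and numerical-gradient check below are our own illustration, not the paper's experiments:

```python
import math

def forward(w, b, x):
    """Single sigmoid neuron: weighted sum plus bias, squashed to (0, 1)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

def backprop(w, b, x, y):
    """Gradients of E = 0.5 * (a - y)**2 w.r.t. each weight and the bias."""
    a = forward(w, b, x)
    delta = (a - y) * a * (1.0 - a)          # dE/dz by the chain rule
    return [delta * xi for xi in x], delta   # dE/dw_i, dE/db

# Sanity check against a central-difference numerical gradient.
w, b, x, y = [0.5, -0.3], 0.1, [1.0, 2.0], 1.0
grads_w, grad_b = backprop(w, b, x, y)
eps = 1e-6
for i in range(len(w)):
    wp, wm = list(w), list(w)
    wp[i] += eps
    wm[i] -= eps
    numeric = (0.5 * (forward(wp, b, x) - y) ** 2
               - 0.5 * (forward(wm, b, x) - y) ** 2) / (2 * eps)
    assert abs(numeric - grads_w[i]) < 1e-6
```

In a multi-layer network the same `delta` propagates backwards through each layer, which is exactly the reverse-mode sweep described above.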


''Induction of Decision Trees''

* J.R. Quinlan * Machine Learning, 1:81–106, 1986. Description: Decision trees are a common learning algorithm and a decision representation tool. Decision trees were developed by many researchers in many areas even before this paper, but this paper is one of the most influential in the field.
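
The split criterion at the core of Quinlan's ID3 can be sketched as information gain: pick the attribute whose split most reduces the entropy of the labels. The attribute name and toy data below are invented:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a multiset of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Expected entropy reduction from splitting the rows on `attr`."""
    gain, n = entropy(labels), len(rows)
    for value in set(r[attr] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[attr] == value]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

rows = [{'windy': 1}, {'windy': 1}, {'windy': 0}, {'windy': 0}]
labels = ['play', 'play', 'stay', 'stay']
print(information_gain(rows, labels, 'windy'))   # 1.0: a perfect split
```

ID3 applies this greedily: split on the highest-gain attribute, then recurse on each branch.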


''Learning Quickly When Irrelevant Attributes Abound: A New Linear-threshold Algorithm''

* Nick Littlestone * Machine Learning 2: 285–318, 1988
Online version(PDF)
Description: One of the papers that started the field of online learning. In this learning setting, a learner receives a sequence of examples, making a prediction after each one and receiving feedback after each prediction. Research in this area is remarkable because (1) the algorithms and proofs tend to be very simple and beautiful, and (2) the model makes no statistical assumptions about the data. In other words, the data need not be random (as in nearly all other learning models) but can be chosen arbitrarily by "nature" or even an adversary. Specifically, this paper introduced the winnow algorithm.
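
The winnow update is mistake-driven and multiplicative: weights start at 1 and are scaled up ("promotion") or down ("demotion") only when the learner errs. The sketch below, with an invented target concept and toy example sequence, follows that scheme:

```python
def winnow(examples, n, threshold=None, alpha=2.0):
    """Online learner; each x is a 0/1 tuple of length n, each y is 0 or 1."""
    theta = threshold if threshold is not None else n / 2
    w = [1.0] * n                     # all weights start at 1
    mistakes = 0
    for x, y in examples:
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
        if pred != y:
            mistakes += 1
            if y == 1:                # promotion: scale up active weights
                w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
            else:                     # demotion: scale down active weights
                w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return w, mistakes

# Target concept: x1 OR x2 over n = 4 boolean features.
examples = [((1, 0, 0, 0), 1), ((0, 1, 0, 0), 1), ((0, 0, 1, 1), 0),
            ((1, 0, 1, 0), 1), ((0, 0, 1, 0), 0)]
w, mistakes = winnow(examples, 4)
print(w, mistakes)   # relevant weights grow, irrelevant ones shrink
```

The multiplicative update is what gives winnow its signature property: the mistake bound grows only logarithmically in the number of irrelevant attributes.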


''Learning to Predict by the Methods of Temporal Differences''

* Richard S. Sutton * Machine Learning 3(1): 9–44, 1988
Online copy(PDF)
Description: The temporal difference method for reinforcement learning.
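
The core TD(0) update fits in a few lines: the value estimate of a state is nudged toward the one-step bootstrapped target r + γV(s′). The two-state chain below is our own toy example, not from the paper:

```python
def td0(episodes, n_states, alpha=0.1, gamma=1.0):
    """Tabular TD(0): episodes are lists of (state, reward, next_state) steps,
    with next_state = None at termination."""
    V = [0.0] * n_states
    for episode in episodes:
        for s, r, s_next in episode:
            target = r + (gamma * V[s_next] if s_next is not None else 0.0)
            V[s] += alpha * (target - V[s])   # nudge V(s) toward the TD target
    return V

# Chain 0 -> 1 -> terminal, reward 1 on the final step:
# both state values should converge to 1 under gamma = 1.
episodes = [[(0, 0.0, 1), (1, 1.0, None)]] * 1000
V = td0(episodes, 2)
print(V)   # both entries close to 1.0
```

Unlike Monte Carlo prediction, the update for state 0 uses the current estimate V(1) rather than waiting for the episode's final outcome; that bootstrapping is the paper's central idea.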


''Learnability and the Vapnik–Chervonenkis dimension''

* A. Blumer * A. Ehrenfeucht * D. Haussler * M. K. Warmuth * Journal of the ACM, 36(4):929–965, 1989. Description: The complete characterization of PAC learnability using the VC dimension.


''Cryptographic limitations on learning boolean formulae and finite automata''

* M. Kearns * L. G. Valiant * In Proceedings of the 21st Annual ACM Symposium on Theory of Computing, pages 433–444, New York. ACM, 1989. Description: Proving negative results for PAC learning.


''The strength of weak learnability''

* Robert E. Schapire * Machine Learning, 5(2):197–227, 1990.
Online version(PDF)
Description: Proving that weak and strong learnability are equivalent in the noise-free PAC framework. The proof introduced the boosting method.


''A training algorithm for optimum margin classifiers''

* Bernhard E. Boser * Isabelle M. Guyon * Vladimir N. Vapnik * Proceedings of the Fifth Annual Workshop on Computational Learning Theory 5 144–152, Pittsburgh (1992).
Online version(HTML)
Description: This paper presented support vector machines, a practical and popular machine learning algorithm. Support vector machines often use the kernel trick.
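
The kernel trick can be illustrated by checking that a degree-2 polynomial kernel computes the same value as an explicit inner product of monomial features, without ever constructing the feature space. The feature map and test vectors below are our own illustration:

```python
import math

def poly_kernel(x, y, degree=2):
    """Polynomial kernel: an inner product in monomial feature space."""
    return sum(a * b for a, b in zip(x, y)) ** degree

def explicit_features(x):
    """Degree-2 monomial map for a 2-d input (sqrt(2) scales the cross term)."""
    x1, x2 = x
    return [x1 * x1, x2 * x2, math.sqrt(2) * x1 * x2]

x, y = [1.0, 2.0], [3.0, 0.5]
lhs = poly_kernel(x, y)
rhs = sum(a * b for a, b in zip(explicit_features(x), explicit_features(y)))
assert abs(lhs - rhs) < 1e-9   # same value, without building the feature space
print(lhs)   # 16.0
```

For higher degrees or higher-dimensional inputs the explicit feature space grows combinatorially, while the kernel evaluation stays a single dot product and a power; this is what makes the optimum-margin formulation practical.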


''A fast learning algorithm for deep belief nets''

* Geoffrey E. Hinton * Simon Osindero * Yee-Whye Teh * Neural Computation (2006)
Online PDF
Description: This paper presented a tractable greedy layer-wise learning algorithm for deep belief networks which led to great advancement in the field of deep learning.


''Knowledge-based analysis of microarray gene expression data by using support vector machines''

* MP Brown * WN Grundy * D Lin * Nello Cristianini * CW Sugnet * TS Furey * M Ares Jr * David Haussler * PNAS, 2000 January 4; 97(1):262–7.
Description: The first application of supervised learning to gene expression data, in particular support vector machines. The method is now standard, and the paper is one of the most cited in the area.


Compilers


''On the translation of languages from left to right''

* Donald Knuth * Description: This paper introduced the LR parser, which does bottom-up parsing for deterministic context-free languages in linear time. Later derived parsers, such as the LALR parser, have been and continue to be standard practice, as in Yacc and its descendants.


''Semantics of Context-Free Languages.''

* Donald Knuth * ''Math. Systems Theory'' 2:2 (1968), 127–145. Description: On attribute grammars, the basis for yacc's S-attributed and zyacc's LR-attributed approaches.


''Compiler Construction for Digital Computers''

* David Gries * Description: The first textbook on compilers. Dick Grune says that entire generations of compiler constructors grew up with it and have not regretted it. One of the first texts produced from punched-card input to a formatting program; the cards and formatting program are preserved at the Stanford Computer History Exhibits. See David Gries#Textbooks for links.


''A Unified Approach to Global Program Optimization''

* Gary Kildall * Proceedings of ACM SIGACT-SIGPLAN 1973 Symposium on Principles of Programming Languages.
pdf
Description: Formalized the concept of data-flow analysis as fixpoint computation over lattices, and showed that most static analyses used for program optimization can be uniformly expressed within this framework.
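
The fixpoint scheme Kildall formalized can be sketched for one classic instance, reaching definitions: sets of definition ids form the lattice (ordered by inclusion, joined by union), and the transfer function per block is gen ∪ (in − kill). The three-block CFG below is our own invented example:

```python
def reaching_definitions(preds, gen, kill, n_blocks):
    """preds[b]: predecessors of block b; gen/kill[b]: sets of definition ids."""
    IN = [set() for _ in range(n_blocks)]
    OUT = [set(gen[b]) for b in range(n_blocks)]
    changed = True
    while changed:                    # iterate until a fixpoint is reached
        changed = False
        for b in range(n_blocks):
            IN[b] = set().union(*(OUT[p] for p in preds[b])) if preds[b] else set()
            new_out = gen[b] | (IN[b] - kill[b])    # transfer function
            if new_out != OUT[b]:
                OUT[b], changed = new_out, True
    return IN, OUT

# Toy CFG: block 0 defines d1 (killing d2), block 1 redefines the same
# variable as d2 (killing d1), block 2 is a join reachable from both.
preds = [[], [0], [0, 1]]
gen   = [{'d1'}, {'d2'}, set()]
kill  = [{'d2'}, {'d1'}, set()]
IN, OUT = reaching_definitions(preds, gen, kill, 3)
print(IN[2])   # both d1 and d2 reach the join block
```

Termination follows from the lattice view: the OUT sets only grow and the lattice is finite, so the iteration must reach a fixpoint.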


''A program data flow analysis procedure''

* Frances E. Allen, J. Cocke * Commun. ACM, 19, 137–147, 1976. Description: From the abstract: "The global data relationships in a program can be exposed and codified by the static analysis methods described in this paper. A procedure is given which determines all the definitions which can possibly reach each node of the control flow graph of the program and all the definitions that are live on each edge of the graph."


''YACC: Yet another compiler-compiler''

* Stephen C. Johnson * ''Unix Programmer's Manual'' Vol 2b, 1979
Online copy (HTML)
Description: Yacc is a tool that made compiler writing much easier.


''gprof: A Call Graph Execution Profiler''

* Susan L. Graham, Peter B. Kessler, Marshall Kirk McKusick * Proceedings of the ACM SIGPLAN 1982 Symposium on Compiler Construction, SIGPLAN Notices 17, 6, Boston, MA. June 1982.
Online copy (PDF)
Description: The gprof call-graph profiler.


''Compilers: Principles, Techniques and Tools''

* Alfred V. Aho * Ravi Sethi * Jeffrey D. Ullman * Monica Lam * Addison-Wesley, 1986. Description: This book became a classic in compiler writing. It is also known as the Dragon Book, after the (red) dragon that appears on its cover.


Computer architecture


''Colossus computer''

* T. H. Flowers * ''Annals of the History of Computing'', Vol. 5 (No. 3), 1983, pp. 239–252.
The Design of Colossus
Description: The ''Colossus'' machines were early computing devices used by British codebreakers to break German messages encrypted with the Lorenz cipher during World War II. Colossus was an early binary electronic digital computer. The design of Colossus was later described in the referenced paper.


''First Draft of a Report on the EDVAC''

* John von Neumann * June 30, 1945, the ENIAC project.
First Draft of a report on the EDVAC
(PDF) Description: It contains the first published description of the logical design of a computer using the stored-program concept, which has come to be known as the von Neumann architecture. See ''First Draft of a Report on the EDVAC''.


''Architecture of the IBM System/360''

* Gene Amdahl, Fred Brooks, G. A. Blaauw * IBM Journal of Research and Development, 1964. * ''Architecture of the IBM System/360''. Description: The IBM System/360 (S/360) is a mainframe computer system family announced by IBM on April 7, 1964. It was the first family of computers to make a clear distinction between architecture and implementation.


''The case for the reduced instruction set computer''

* DA Patterson, DR Ditzel * Computer Architecture News, vol. 8, no. 6, October 1980, pp. 25–33.
Online version (PDF)
Description: The ''reduced instruction set computer'' (''RISC'') CPU design philosophy, which favors a reduced set of simpler instructions.


''Comments on "the Case for the Reduced Instruction Set Computer"''

* DW Clark, WD Strecker * Computer Architecture News, 1980.
Online version(PDF)
Description:


''The CRAY-1 Computer System''

* RM Russell * Communications of the ACM, January 1978, volume 21, number 1, pages 63–72.
Online version (PDF)
Description: The Cray-1 was a supercomputer designed by a team including Seymour Cray for Cray Research. The first Cray-1 system was installed at Los Alamos National Laboratory in 1976, and it went on to become one of the best known and most successful supercomputers in history.


''Validity of the Single Processor Approach to Achieving Large Scale Computing Capabilities''

* Gene Amdahl * AFIPS 1967 Spring Joint Computer Conference, Atlantic City, N.J.
Online version (PDF)
Description: Amdahl's law.
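
The law itself is a one-line formula: if a fraction p of a workload can be spread over n processors, overall speedup is 1 / ((1 − p) + p/n), so the serial fraction caps the achievable speedup. A sketch (function name ours):

```python
def amdahl_speedup(p, n):
    """Speedup when a fraction p of the work runs in parallel on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

print(amdahl_speedup(0.95, 8))        # roughly 5.9x on 8 processors
print(amdahl_speedup(0.95, 10 ** 6))  # under 20x: the 5% serial part dominates
```

Even with a million processors, a 5% serial fraction bounds the speedup below 20x, which is exactly Amdahl's argument against relying on parallelism alone.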


''A Case for Redundant Arrays of Inexpensive Disks (RAID)''

* David A. Patterson, Garth Gibson, Randy H. Katz * In International Conference on Management of Data, pages 109–116, 1988.
Online version (PDF)
Description: This paper discusses the concept of RAID disks, outlines the different levels of RAID, and the benefits of each level. It is a good paper for discussing issues of reliability and fault tolerance of computer systems, and the cost of providing such fault tolerance.


''The case for a single-chip multiprocessor''

* Kunle Olukotun, Basem Nayfeh, Lance Hammond, Ken Wilson, Kunyung Chang * In SIGOPS Oper. Syst. Rev. 30, pages 2–11, 1996. Description: This paper argues that the approach of improving processor performance by adding multiple instruction issue and out-of-order execution cannot continue to provide speedups indefinitely. It lays out the case for single-chip processors that contain multiple "cores". With the mainstream introduction of multicore processors by Intel in 2005, and their subsequent domination of the market, this paper was shown to be prescient.


Computer graphics


''The Rendering Equation''

* J. Kajiya * SIGGRAPH: ACM Special Interest Group on Computer Graphics and Interactive Techniques, pages 143–150, 1986.


''Elastically deformable models''

* Demetri Terzopoulos, John Platt, Alan Barr, Kurt Fleischer * Computer Graphics, 21(4), 1987, 205–214, Proc. ACM SIGGRAPH '87 Conference, Anaheim, CA, July 1987.
Online version (PDF)
Description: The Academy of Motion Picture Arts and Sciences cited this paper as a "milestone in computer graphics".


''Sketchpad, a Man-Machine Graphical Communication System''

*
Ivan Sutherland Ivan Edward Sutherland (born May 16, 1938) is an American computer scientist and Internet pioneer, widely regarded as a pioneer of computer graphics. His early work in computer graphics as well as his teaching with David C. Evans in that subject ...

Online version(PDF)
Description: One of the founding works on computer graphics.


Computer vision


'' The Phase Correlation Image Alignment Method ''

* C.D. Kuglin and D.C. Hines * IEEE 1975 Conference on Cybernetics and Society, New York, September 1975, pp. 163–165. Description: A correlation method based upon the inverse
Fourier transform A Fourier transform (FT) is a mathematical transform that decomposes functions into frequency components, which are represented by the output of the transform as a function of frequency. Most commonly functions of time or space are transformed ...
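The method can be sketched in a few lines of NumPy (an illustrative reconstruction, not the authors' code): the normalized cross-power spectrum of two images, inverse-transformed, peaks at their relative translation.

```python
import numpy as np

def phase_correlation(a, b):
    """Estimate the integer (row, col) translation taking image a to image b."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    R = np.conj(A) * B
    R /= np.abs(R) + 1e-12          # keep only the phase information
    corr = np.fft.ifft2(R).real    # an impulse at the displacement
    return np.unravel_index(np.argmax(corr), corr.shape)
```

For a circular shift of an image by (3, 5), the correlation surface peaks exactly at (3, 5).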


''Determining Optical Flow''

*
Berthold K.P. Horn Berthold Klaus Paul Horn (born December 8, 1943) is an American scientist working in the field of artificial intelligence and computer vision. He is Professor of Electrical Engineering and Computer Science at Massachusetts Institute of Technology ...
, and B.G. Schunck * Artificial Intelligence, Volume 17, 185–203, 1981 * OA article here: Description: A method for estimating the image motion of world points between two frames of a video sequence.


''An Iterative Image Registration Technique with an Application to Stereo Vision''

* Lucas, B.D. and Kanade, T. * Proceedings of the 7th
International Joint Conference on Artificial Intelligence The International Joint Conference on Artificial Intelligence (IJCAI) is the leading conference in the field of Artificial Intelligence. The conference series has been organized by the nonprofit IJCAI Organization since 1969, making it the oldest pr ...
, 674–679, Vancouver, Canada, 1981 Description: This paper provides an efficient technique for image registration.


''The Laplacian Pyramid as a compact image code''

* Peter J. Burt and Edward H. Adelson *
IEEE Transactions on Communications ''IEEE Transactions on Communications'' is a monthly peer-reviewed scientific journal published by the IEEE Communications Society that focuses on all aspects of telecommunication technology, including telephone, telegraphy, facsimile, and point-to ...
, Vol. COM-31, No. 4, pp. 532–540, 1983. Description: A technique for image encoding using local operators of many scales.
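The encoding can be sketched as follows (illustrative Python/NumPy, using simple block averaging in place of the paper's Gaussian-weighted filtering): each level stores the difference between an image and an upsampled copy of its half-resolution version, so the pyramid is exactly invertible.

```python
import numpy as np

def shrink(img):
    """Halve resolution by 2x2 block averaging (a stand-in for Gaussian blur)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def expand(img):
    """Double resolution by pixel replication."""
    return np.kron(img, np.ones((2, 2)))

def laplacian_pyramid(img, levels):
    pyramid = []
    for _ in range(levels):
        small = shrink(img)
        pyramid.append(img - expand(small))   # band-pass residual level
        img = small
    pyramid.append(img)                       # low-pass top level
    return pyramid

def reconstruct(pyramid):
    img = pyramid[-1]
    for residual in reversed(pyramid[:-1]):
        img = residual + expand(img)
    return img
```

Because each level stores the exact residual, reconstruction recovers the original image bit-for-bit (up to floating-point rounding); the compression win comes from the residuals being small and cheaply quantizable.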


''Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images''

*
Stuart Geman Stuart Alan Geman (born March 23, 1949) is an American mathematician, known for influential contributions to computer vision, statistics, probability theory, machine learning, and the neurosciences. ikipediaList of important publications in comp ...
and
Donald Geman Donald Jay Geman (born September 20, 1943) is an American applied mathematician and a leading researcher in the field of machine learning and pattern recognition. He and his brother, Stuart Geman, are very well known for proposing the Gibbs samp ...
* IEEE Transactions on Pattern Analysis and Machine Intelligence, 1984 Description: Introduced (1) MRFs for image analysis and (2) the
Gibbs sampling In statistics, Gibbs sampling or a Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm for obtaining a sequence of observations which are approximated from a specified multivariate probability distribution, when direct sampling is diff ...
which revolutionized computational
Bayesian statistics Bayesian statistics is a theory in the field of statistics based on the Bayesian interpretation of probability where probability expresses a ''degree of belief'' in an event. The degree of belief may be based on prior knowledge about the event, ...
, and thus had a profound impact on many other fields in addition to computer vision.
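Gibbs sampling draws from a joint distribution by repeatedly sampling each variable from its conditional given the others. A minimal sketch for a standard bivariate normal with correlation rho, where each conditional is univariate normal (illustrative only, not tied to the paper's image-restoration setting):

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500):
    """Alternate sampling x | y and y | x for a standard bivariate normal."""
    x = y = 0.0
    cond_sd = math.sqrt(1.0 - rho * rho)   # conditional std. deviation
    samples = []
    for i in range(n_samples + burn_in):
        x = random.gauss(rho * y, cond_sd)
        y = random.gauss(rho * x, cond_sd)
        if i >= burn_in:
            samples.append((x, y))
    return samples
```

The empirical correlation of the retained samples converges to rho, even though no draw from the joint distribution is ever made directly.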


''Snakes: Active contour models''

* Michael Kass, Andrew Witkin, and
Demetri Terzopoulos Demetri Terzopoulos is an Academy Award winning Greek-Canadian-American computer scientist, university professor, author, and entrepreneur. He is best known for pioneering the physics-based approach to computer graphics and vision that has he ...
Description: An interactive variational technique for image segmentation and visual tracking.


''Condensation – conditional density propagation for visual tracking''

* M. Isard and A. Blake *
International Journal of Computer Vision The ''International Journal of Computer Vision'' (IJCV) is a journal published by Springer. The senior editor-in-chief is Jean Ponce, and the editors-in-chief are Jiri Matas, Yasuyuki Matsushita, and Svetlana Lazebnik Svetlana Lazebnik (born 197 ...
, 29(1):5–28, 1998. Description: A technique for visual tracking


''Object recognition from local scale-invariant features ''

* David Lowe *
International Conference on Computer Vision The International Conference on Computer Vision (ICCV) is a research conference sponsored by the Institute of Electrical and Electronics Engineers (IEEE) held every other year. It is considered to be one of the top conferences in computer vision, ...
, pp. 1150–1157, 1999 Description: A technique (
scale-invariant feature transform The scale-invariant feature transform (SIFT) is a computer vision algorithm to detect, describe, and match local ''features'' in images, invented by David Lowe in 1999. Applications include object recognition, robotic mapping and navigation, ima ...
) for robust feature description.


Concurrent, parallel, and distributed computing

Topics covered:
concurrent computing Concurrent computing is a form of computing in which several computations are executed '' concurrently''—during overlapping time periods—instead of ''sequentially—''with one completing before the next starts. This is a property of a syst ...
,
parallel computing Parallel computing is a type of computation in which many calculations or processes are carried out simultaneously. Large problems can often be divided into smaller ones, which can then be solved at the same time. There are several different f ...
, and
distributed computing A distributed system is a system whose components are located on different networked computers, which communicate and coordinate their actions by passing messages to one another from any system. Distributed computing is a field of computer sci ...
.


Databases


''A relational model for large shared data banks''

* E. F. Codd * ''
Communications of the ACM ''Communications of the ACM'' is the monthly journal of the Association for Computing Machinery (ACM). It was established in 1958, with Saul Rosen as its first managing editor. It is sent to all ACM members. Articles are intended for readers wi ...
'', 13(6):377–387, June 1970 Description: This paper introduced the relational model for databases, which became the dominant data model in commercial database systems.
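Codd's model treats data as relations (sets of tuples) queried with a small algebra. A toy sketch of three operators over Python dictionaries (illustrative only; the names are conventional, not Codd's notation):

```python
def select(relation, predicate):
    """Restriction: tuples satisfying a predicate."""
    return [row for row in relation if predicate(row)]

def project(relation, attributes):
    """Projection: keep named attributes, removing duplicate tuples."""
    seen, result = set(), []
    for row in relation:
        key = tuple((a, row[a]) for a in attributes)
        if key not in seen:
            seen.add(key)
            result.append(dict(key))
    return result

def natural_join(r, s):
    """Join tuples that agree on all shared attribute names."""
    common = set(r[0]) & set(s[0]) if r and s else set()
    return [{**a, **b} for a in r for b in s
            if all(a[c] == b[c] for c in common)]
```

The point of the model is that any query is a composition of a few such set-level operators, independent of how the data is physically stored.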


''Binary B-Trees for Virtual Memory''

* Rudolf Bayer * ACM-SIGFIDET Workshop 1971, San Diego, California, Session 5B, p. 219–235. Description: This paper introduced the
B-Tree In computer science, a B-tree is a self-balancing tree data structure that maintains sorted data and allows searches, sequential access, insertions, and deletions in logarithmic time. The B-tree generalizes the binary search tree, allowing for ...
s
data structure In computer science, a data structure is a data organization, management, and storage format that is usually chosen for efficient access to data. More precisely, a data structure is a collection of data values, the relationships among them, ...
. B-trees became the standard index structure in database systems.


''Relational Completeness of Data Base Sublanguages''

* E. F. Codd * In: R. Rustin (ed.): Database Systems: 65–98, Prentice Hall and IBM Research Report RJ 987, San Jose, California : (1972)
Online version (PDF)
Description: Defined relational completeness and showed that relational algebra and relational calculus have equivalent expressive power.


''The Entity Relationship Model – Towards a Unified View of Data''

*
Peter Chen Peter Pin-Shan Chen (; born 3 January 1947) is a Taiwanese American computer scientist. He is a (retired) distinguished career scientist and faculty member at Carnegie Mellon University and Distinguished Chair Professor Emeritus at LSU. He is ...
* '' ACM Transactions on Database Systems''
Vol. 1, No. 1, March 1976, pp. 9–36
Description: This paper introduced the entity–relationship diagram (ERD) method of database design.


''SEQUEL: A structured English query language''

*
Donald D. Chamberlin Donald D. Chamberlin is an American computer scientist who is one of the principal designers of the original SQL language specification with Raymond Boyce. He also made significant contributions to the development of XQuery. Chamberlin was e ...
, Raymond F. Boyce * International Conference on Management of Data, Proceedings of the 1974 ACM SIGFIDET (now SIGMOD) workshop on Data description, access and control, Ann Arbor, Michigan, pp. 249–264 Description: This paper introduced the SQL language.
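SEQUEL's declarative select/from/where style survives essentially intact in modern SQL. A quick illustration using Python's built-in sqlite3 module (the table and data here are hypothetical, and the syntax shown is today's SQL rather than the paper's original SEQUEL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO emp VALUES (?, ?, ?)",
    [("ada", "eng", 120), ("bob", "eng", 100), ("eve", "ops", 90)],
)
# A declarative query: say *what* to compute, not *how* to compute it.
rows = conn.execute(
    "SELECT dept, AVG(salary) FROM emp GROUP BY dept ORDER BY dept"
).fetchall()
```

The engine, not the programmer, chooses the access paths and evaluation order, which was the language's central design goal.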


''The notions of consistency and predicate locks in a database system''

* Kapali P. Eswaran, Jim Gray, Raymond A. Lorie, Irving L. Traiger * Communications of the ACM 19, 1976, 624–633 Description: This paper defined the concepts of transaction,
consistency In classical deductive logic, a consistent theory is one that does not lead to a logical contradiction. The lack of contradiction can be defined in either semantic or syntactic terms. The semantic definition states that a theory is consistent ...
and schedule. It also argued that a transaction needs to lock a logical rather than a physical subset of the database.


''Federated database systems for managing distributed, heterogeneous, and autonomous databases''

*
Amit Sheth Amit Sheth is a computer scientist at University of South Carolina in Columbia, South Carolina. He is the founding Director of the Artificial Intelligence Institute, and a Professor of Computer Science and Engineering. From 2007 to June 2019, h ...
, J.A. Larson * ''ACM Computing Surveys – Special issue on heterogeneous databases'', Volume 22, Issue 3, pp. 183–236, Sept. 1990
ACM source
Description: Introduced the federated database system concept, which has had a major impact on data interoperability and the integration of heterogeneous data sources.


''Mining association rules between sets of items in large databases''

* Rakesh Agrawal, Tomasz Imielinski, Arun Swami * ''Proc. of the
ACM SIGMOD Conference on Management of Data SIGMOD is the Association for Computing Machinery's Special Interest Group on Management of Data, which specializes in large-scale data management problems and databases. The annual ACM SIGMOD Conference, which began in 1975, is considered one of ...
'', pages 207–216, Washington, D.C., May 1993 Description: Association rules, a very common method for data mining.
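The core computation is support counting: the fraction of transactions containing an itemset. A compact sketch of the counting pass for 1- and 2-itemsets (the Apriori-style candidate pruning used to scale to larger itemsets is omitted):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Return itemsets of size 1 and 2 whose support meets the threshold."""
    counts = {}
    for t in transactions:
        for k in (1, 2):
            for itemset in combinations(sorted(t), k):
                counts[itemset] = counts.get(itemset, 0) + 1
    n = len(transactions)
    return {s: c / n for s, c in counts.items() if c / n >= min_support}
```

A rule such as bread → milk is then read off a frequent pair by comparing its support with the support of its antecedent (its confidence).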


History of computation


''The Computer from Pascal to von Neumann''

* Description: Perhaps the first book on the history of computation.


''A History of Computing in the Twentieth Century''

edited by: * Nicholas Metropolis * J. Howlett *
Gian-Carlo Rota Gian-Carlo Rota (April 27, 1932 – April 18, 1999) was an Italian-American mathematician and philosopher. He spent most of his career at the Massachusetts Institute of Technology, where he worked in combinatorics, functional analysis, proba ...
*
Academic Press Academic Press (AP) is an academic book publisher founded in 1941. It was acquired by Harcourt, Brace & World in 1969. Reed Elsevier bought Harcourt in 2000, and Academic Press is now an imprint of Elsevier. Academic Press publishes refer ...
, 1980, Description: Several chapters by pioneers of computing.


Information retrieval


''A Vector Space Model for Automatic Indexing''

*
Gerard Salton Gerard A. "Gerry" Salton (8 March 1927 in Nuremberg – 28 August 1995) was a Professor of Computer Science at Cornell University. Salton was perhaps the leading computer scientist working in the field of information retrieval during his time, an ...
, A. Wong, C. S. Yang * Commun. ACM 18(11): 613–620 (1975) Description: Presented the
vector space model Vector space model or term vector model is an algebraic model for representing text documents (and any objects, in general) as vectors of identifiers (such as index terms). It is used in information filtering, information retrieval, indexing an ...
.
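In the vector space model, documents and queries are vectors of term weights, and relevance is ranked by the cosine of the angle between them. A minimal sketch over sparse dictionaries (an illustration of the model, not the paper's code):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two sparse term-weight vectors."""
    dot = sum(w * v.get(term, 0.0) for term, w in u.items())
    norm_u = math.sqrt(sum(w * w for w in u.values()))
    norm_v = math.sqrt(sum(w * w for w in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0
```

Identical vectors score 1.0 and vectors with no shared terms score 0.0, giving a natural ranking in between.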


''Extended Boolean Information Retrieval''

*
Gerard Salton Gerard A. "Gerry" Salton (8 March 1927 in Nuremberg – 28 August 1995) was a Professor of Computer Science at Cornell University. Salton was perhaps the leading computer scientist working in the field of information retrieval during his time, an ...
, Edward A. Fox, Harry Wu * Commun. ACM 26(11): 1022–1036 (1983) Description: Presented the extended Boolean model, which combines Boolean queries with vector-space-style ranking.


''A Statistical Interpretation of Term Specificity and Its Application in Retrieval''

*
Karen Spärck Jones Karen Sparck Jones is a computer science researcher and innovator who pioneered the search engine algorithm known as inverse document frequency (IDF). While many early information scientists and computer engineers were focused on developing progr ...
* Journal of Documentation 28: 11–21 (1972). . Description: Conceived a statistical interpretation of term specificity called
Inverse document frequency Inverse or invert may refer to: Science and mathematics * Inverse (logic), a type of conditional sentence which is an immediate inference made from another conditional sentence * Additive inverse (negation), the inverse of a number that, when a ...
(IDF), which became a cornerstone of term weighting.
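The weighting assigns a term a weight that grows as the term becomes rarer across the collection, commonly written idf(t) = log(N / df(t)). A sketch with documents represented as token lists:

```python
import math

def idf(term, documents):
    """Inverse document frequency: log of (collection size / document frequency)."""
    df = sum(term in doc for doc in documents)
    return math.log(len(documents) / df) if df else 0.0

def tf_idf(term, document, documents):
    """Classic term weight: raw term frequency scaled by idf."""
    return document.count(term) * idf(term, documents)
```

A term appearing in every document gets weight 0, so ubiquitous words contribute nothing to ranking, which is exactly the specificity argument of the paper.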


Networking


A Protocol for Packet Network Intercommunication

*
Vint Cerf Vinton Gray Cerf (; born June 23, 1943) is an American Internet pioneer and is recognized as one of " the fathers of the Internet", sharing this title with TCP/IP co-developer Bob Kahn. He has received honorary degrees and awards that include ...
, Robert Kahn * IEEE Transactions on Communications, 1974.
Online copy (PDF)
Description: This paper contains many of the ideas that later became TCP and IP, the two foundational protocols of the Internet. Cerf and Kahn received the ACM Turing Award, in part for the work contained in this paper.


The Design Philosophy of the DARPA Internet Protocols

* David Clark * ACM SIGCOMM Computer Communications Review, Vol. 18, No. 4, pp. 106–114, August 1988.
Online copy (PDF)
Description: This paper describes some of the design principles behind the Internet, and how those design principles are realized in the Internet.


End-To-End Arguments in System Design

* Description: This paper presents the "end-to-end argument", a classic design principle widely used to guide the design of many of the Internet's protocols and systems.


Congestion Avoidance and Control

*
Van Jacobson Van Jacobson (born 1950) is an American computer scientist, renowned for his work on TCP/IP network performance and scaling.
, Michael J. Karels * ACM SIGCOMM, 1988.
Online copy (HTML)
Description: This paper identifies the problem of network congestion, and presents an algorithm for how protocols can reduce their sending rate to reduce congestion. This approach was incorporated into the TCP protocol, and influenced design of many other data transport protocols.
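The control rule that came out of this work is additive-increase/multiplicative-decrease (AIMD): grow the congestion window linearly while the network is uncongested, and cut it sharply when loss signals congestion. A schematic sketch (window in arbitrary segments, one step per round trip; a simplification of the full TCP machinery):

```python
def aimd(loss_events, window=1.0, increase=1.0, decrease=0.5):
    """Trace an AIMD congestion window over a sequence of round trips."""
    trace = []
    for loss in loss_events:           # True where a loss was detected
        window = window * decrease if loss else window + increase
        trace.append(window)
    return trace
```

The resulting sawtooth is what lets many competing connections converge toward a fair share of a bottleneck link.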


Analysis and Simulation of a Fair Queuing Algorithm

* Alan Demers,
Srinivasan Keshav Srinivasan Keshav is an American-Canadian Computer Scientist of Indian descent who is currently the Robert Sansom Professor of Computer Science at the University of Cambridge. Biography After undergraduate studies at the Indian Institute of Tech ...
,
Scott Shenker Scott J. Shenker (born January 24, 1956 in Alexandria, Virginia) is an American computer scientist, and professor of computer science at the University of California, Berkeley. He is also the leader of the Extensible Internet Group at the Intern ...
* ACM SIGCOMM CCR, Vol. 19, No. 4, September 1989.
Online copy (PDF)
Description: This paper presents "fair queuing", a buffer allocation algorithm nearly universally deployed on Internet routers.
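The algorithm approximates bit-by-bit round robin by stamping each packet with a virtual finish time and always transmitting the packet with the smallest stamp. A simplified sketch for packets that are all queued at once (the paper additionally maintains a global virtual clock to handle ongoing arrivals):

```python
import heapq

def fair_queue_order(packets):
    """packets: list of (flow_id, size). Return the transmission order."""
    last_finish = {}
    heap = []
    for seq, (flow, size) in enumerate(packets):
        tag = last_finish.get(flow, 0.0) + size   # virtual finish time
        last_finish[flow] = tag
        heapq.heappush(heap, (tag, seq, flow, size))
    order = []
    while heap:
        _, _, flow, size = heapq.heappop(heap)
        order.append((flow, size))
    return order
```

A flow sending one large packet cannot starve a flow sending small ones: the small packet's earlier finish tag lets it go first.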


Scalable High Speed IP Routing Lookups

* M. Waldvogel, G. Varghese, J. Turner, B. Plattner * ACM SIGCOMM, August 1997.
Online copy (PDF)
Description: This paper describes an algorithmic approach to finding the prefix (supernet) containing a particular IP address, a process now nearly universally used on Internet routers.
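The underlying operation is longest-prefix match. A straightforward sketch that scans prefix lengths from longest to shortest (the paper's contribution is to replace this linear scan with binary search over prefix lengths using hash tables and marker entries):

```python
def longest_prefix_match(table, address_bits):
    """table maps prefix bit-strings (e.g. '1010') to next hops;
    address_bits is the destination address as a bit string."""
    for length in range(len(address_bits), -1, -1):
        next_hop = table.get(address_bits[:length])
        if next_hop is not None:
            return next_hop
    return None
```

The empty-string prefix plays the role of the default route, matched only when no longer prefix does.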


Chord: A Scalable Peer-to-peer Lookup Service for Internet Applications

* Ion Stoica, Robert Morris, David Karger, M. Frans Kaashoek, Hari Balakrishnan * ACM SIGCOMM, August 2001
Online copy (PDF)
Description: This paper presents the concept of a Distributed Hash Table (DHT), a distributed data structure that has influenced the design of a number of
peer-to-peer Peer-to-peer (P2P) computing or networking is a distributed application architecture that partitions tasks or workloads between peers. Peers are equally privileged, equipotent participants in the network. They are said to form a peer-to-peer ...
systems, distributed filesystems, and other large-scale distributed systems.
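Chord's key idea: hash nodes and keys onto the same identifier ring, and assign each key to its successor, the first node clockwise from the key's identifier. A sketch of just the ring mapping (Chord itself also maintains finger tables for O(log n) routed lookups, omitted here):

```python
import bisect
import hashlib

def ring_id(name, bits=16):
    """Hash a name onto a 2**bits identifier circle (Chord uses SHA-1)."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % (2 ** bits)

class ChordRing:
    def __init__(self, nodes):
        self.points = sorted((ring_id(n), n) for n in nodes)

    def successor(self, key):
        """The node responsible for a key: first node at or after ring_id(key)."""
        i = bisect.bisect_left(self.points, (ring_id(key),))
        return self.points[i % len(self.points)][1]
```

The property that made this attractive for peer-to-peer systems: removing a node that is not a key's owner never changes where that key lives, so churn only remaps the keys of the departed node.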
Also see the "Top Ten Networking Papers" lists published in
ACM SIGCOMM CCR
* "10 Networking Papers: Recommended Reading," Jon Crowcroft
Online copy (PDF)
* "10 Papers for the Ph.D. Student in Networking," Craig Partridge
Online copy (PDF)
* "10 Networking Papers: Recommended Reading,"
Jim Kurose Jim Kurose (born 1956) is a Distinguished University Professor in the College of Information and Computer Sciences at the University of Massachusetts Amherst. He was born in Greenwich, Connecticut, USA. He received his B.A. degree from Wesleyan ...

Online copy (PDF)
* "10 Networking Papers: Reading for Protocol Design,"
David Wetherall
https://dl.acm.org/doi/pdf/10.1145/1140086.1140096 * "10 Networking Papers: A Blast from the Past," Mostafa H. Ammar.


Operating systems


''An experimental timesharing system.''

* Fernando J. Corbató, M. Merwin-Daggett, and R.C. Daley * Proceedings of the AFIPS FJCC, pages 335–344, 1962.
Online copy (HTML)
Description: This paper discusses
time-sharing In computing, time-sharing is the sharing of a computing resource among many users at the same time by means of multiprogramming and multi-tasking.DEC Timesharing (1965), by Peter Clark, The DEC Professional, Volume 1, Number 1 Its emergence ...
as a method of sharing computer resources. This idea changed how people interact with computer systems.


''The Working Set Model for Program Behavior''

*
Peter J. Denning Peter James Denning (born January 6, 1942) is an American computer scientist and writer. He is best known for pioneering work in virtual memory, especially for inventing the working-set model for program behavior, which addressed thrashing in ...
* Communications of the ACM, Vol. 11, No. 5, May 1968, pp 323–333
Online version(PDF)
Description: The beginning of
cache Cache, caching, or caché may refer to: Places United States * Cache, Idaho, an unincorporated community * Cache, Illinois, an unincorporated community * Cache, Oklahoma, a city in Comanche County * Cache, Utah, Cache County, Utah * Cache County ...
. For more information see
SIGOPS Hall of Fame


''Virtual Memory, Processes, and Sharing in MULTICS''

* Robert C. Daley, Jack B. Dennis * Communications of the ACM, Vol. 11, No. 5, May 1968, pp. 306–312.
Online version(PDF)
Description: The classic paper on
Multics Multics ("Multiplexed Information and Computing Service") is an influential early time-sharing operating system based on the concept of a single-level memory.Dennis M. Ritchie, "The Evolution of the Unix Time-sharing System", Communications of ...
, the most ambitious operating system in the early history of computing. Difficult reading, but it describes the implications of trying to build a system that takes information sharing to its logical extreme. Most operating systems since Multics have incorporated a subset of its facilities.


The nucleus of a multiprogramming system

* Per Brinch Hansen * Communications of the ACM, Vol. 13, No. 4, April 1970, pp. 238–242
Online version(PDF)
Description: Classic paper on the extensible nucleus architecture of the RC 4000 multiprogramming system, and what became known as the
operating system kernel The kernel is a computer program at the core of a computer's operating system and generally has complete control over everything in the system. It is the portion of the operating system code that is always resident in memory and facilitates ...
and
microkernel In computer science, a microkernel (often abbreviated as μ-kernel) is the near-minimum amount of software that can provide the mechanisms needed to implement an operating system (OS). These mechanisms include low-level address space management, ...
architecture.


Operating System Principles

* Per Brinch Hansen * Prentice Hall, Englewood Cliffs, NJ, July 1973
Online version (ACM Digital Library)
Description: The first comprehensive textbook on operating systems. Includes the first
monitor Monitor or monitor may refer to: Places * Monitor, Alberta * Monitor, Indiana, town in the United States * Monitor, Kentucky * Monitor, Oregon, unincorporated community in the United States * Monitor, Washington * Monitor, Logan County, West ...
notation (Chapter 7).


''A note on the confinement problem''

*
Butler W. Lampson Butler W. Lampson, ForMemRS, (born December 23, 1943) is an American computer scientist best known for his contributions to the development and implementation of distributed personal computing. Education and early life After graduating from th ...
* Communications of the ACM, 16(10):613–615, October 1973.
Online version(PDF)
Description: This paper addresses issues in constraining the flow of information from untrusted programs. It discusses covert channels, but more importantly it addresses the difficulty in obtaining full confinement without making the program itself effectively unusable. The ideas are important when trying to understand containment of malicious code, as well as aspects of trusted computing.


'' The UNIX Time-Sharing System''

* Dennis M. Ritchie and
Ken Thompson Kenneth Lane Thompson (born February 4, 1943) is an American pioneer of computer science. Thompson worked at Bell Labs for most of his career where he designed and implemented the original Unix operating system. He also invented the B programmi ...
*
Communications of the ACM ''Communications of the ACM'' is the monthly journal of the Association for Computing Machinery (ACM). It was established in 1958, with Saul Rosen as its first managing editor. It is sent to all ACM members. Articles are intended for readers wi ...
17(7), July 1974.
Online copy
Description: The
Unix Unix (; trademarked as UNIX) is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, whose development started in 1969 at the Bell Labs research center by Ken Thompson, Dennis Ritchie, ...
operating system An operating system (OS) is system software that manages computer hardware, software resources, and provides common daemon (computing), services for computer programs. Time-sharing operating systems scheduler (computing), schedule tasks for ef ...
and its principles were described in this paper. The paper's importance lies less in its content than in the operating system itself, which had a tremendous effect on operating systems and computer technology.


''Weighted voting for replicated data''

* David K. Gifford * Proceedings of the 7th ACM Symposium on Operating Systems Principles, pages 150–159, December 1979. Pacific Grove, California
Online copy (few formats)
Description: This paper describes the consistency mechanism known as quorum consensus. It is a good example of algorithms that provide a continuous set of options between two alternatives (in this case, between the read-one write-all, and the write-one read-all consistency methods). There have been many variations and improvements by researchers in the years that followed, and it is one of the consistency algorithms that should be understood by all. The options available by choosing different size quorums provide a useful structure for discussing the core requirements for consistency in distributed systems.
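Gifford's scheme assigns replicas votes; a read must gather r votes and a write w votes, with r + w > n so that every read quorum intersects every write quorum. A toy sketch with one vote per replica (for brevity the new version number is taken from a scan of all replicas; the real protocol derives it from a read quorum):

```python
import random

def write_quorum(replicas, w, value):
    """Install a new version of the value at any w replicas."""
    version = max(rep["version"] for rep in replicas) + 1
    for rep in random.sample(replicas, w):
        rep.update(version=version, value=value)

def read_quorum(replicas, r):
    """Read r replicas; the highest version seen is the current value."""
    return max(random.sample(replicas, r), key=lambda rep: rep["version"])["value"]
```

With n = 5, w = 3, r = 3, any read quorum overlaps the last write quorum, so a read always observes the latest value no matter which replicas are sampled.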


''Experiences with Processes and Monitors in Mesa''

*
Butler W. Lampson Butler W. Lampson, ForMemRS, (born December 23, 1943) is an American computer scientist best known for his contributions to the development and implementation of distributed personal computing. Education and early life After graduating from th ...
,
David D. Redell David (; , "beloved one") (traditional spelling), , ''Dāwūd''; grc-koi, Δαυΐδ, Dauíd; la, Davidus, David; gez , ዳዊት, ''Dawit''; xcl, Դաւիթ, ''Dawitʿ''; cu, Давíдъ, ''Davidŭ''; possibly meaning "beloved one". w ...
* Communications of the ACM, Vol. 23, No. 2, February 1980, pp. 105–117.
Online copy (PDF)
Description: This is the classic paper on synchronization techniques, including both alternate approaches and pitfalls.


''Scheduling Techniques for Concurrent Systems''

* J. K. Ousterhout * Proceedings of Third
International Conference on Distributed Computing Systems The International Conference on Distributed Computing Systems (ICDCS) is the oldest conference in the field of distributed computing systems in the world. It was launched by the IEEE Computer Society Technical Committee on Distributed Processing ...
, 1982, 22—30. Description: Algorithms for
coscheduling Coscheduling is the principle for concurrent systems of scheduling related processes to run on different processors at the same time (in parallel). There are various specific implementations to realize this. If an application consists of a coll ...
of related processes were given.


''A Fast File System for UNIX''

*
Marshall Kirk Mckusick Marshall Kirk McKusick (born January 19, 1954) is a computer scientist, known for his extensive work on BSD UNIX, from the 1980s to FreeBSD in the present day. He was president of the USENIX Association from 1990 to 1992 and again from 2002 to ...
, William N. Joy,
Samuel J. Leffler Samuel J Leffler is a computer scientist, known for his extensive work on BSD, from the 1980s to FreeBSD in the present day. Among other projects, he created HylaFAX, FlexFAX, LibTIFF, and the Comparison of open source wireless drivers#FreeBSD, ...
, Robert S. Fabry * ACM Transactions on Computer Systems, Vol. 2, No. 3, August 1984, pp. 181–197.
Online copy (PDF)
Description: The
file system In computing, file system or filesystem (often abbreviated to fs) is a method and data structure that the operating system uses to control how data is stored and retrieved. Without a file system, data placed in a storage medium would be one larg ...
of
UNIX Unix (; trademarked as UNIX) is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, whose development started in 1969 at the Bell Labs research center by Ken Thompson, Dennis Ritchie, ...
. One of the first papers discussing how to manage disk storage for high-performance file systems. Most file-system research since this paper has been influenced by it, and most high-performance file systems of the last 20 years incorporate techniques from this paper.


''The Design of the UNIX Operating System''

* Maurice J. Bach,
AT&T Bell Labs Nokia Bell Labs, originally named Bell Telephone Laboratories (1925–1984), then AT&T Bell Laboratories (1984–1996) and Bell Labs Innovations (1996–2007), is an American industrial research and scientific development company owned by mult ...
* Prentice Hall, 486 pp., published May 27, 1986. Description: This definitive description principally covered the System V Release 2 kernel, with some new features from Release 3 and BSD.


''The Design and Implementation of a Log-Structured File System''

*
Mendel Rosenblum Mendel Rosenblum (born 1962) is a professor of Computer Science at Stanford University and co-founder of VMware. Early life Mendel Rosenblum was born in 1962. He attended the University of Virginia, where he received a degree in mathematics. Whil ...
, J. K. Ousterhout * ACM Transactions on Computer Systems, Vol. 10, No. 1 (February 1992), pp. 26–52.
Online version
Description:
Log-structured file system A log-structured filesystem is a file system in which data and metadata are written sequentially to a circular buffer, called a log. The design was first proposed in 1988 by John K. Ousterhout and Fred Douglis and first implemented in 1992 by ...
.


''Microkernel operating system architecture and Mach''

* David L. Black, David B. Golub, Daniel P. Julin, Richard F. Rashid, Richard P. Draves, Randall W. Dean, Alessandro Forin, Joseph Barrera, Hideyuki Tokuda, Gerald Malan, David Bohman * Proceedings of the USENIX Workshop on Microkernels and Other Kernel Architectures, pages 11–30, April 1992. Description: This is a good paper discussing one particular
microkernel In computer science, a microkernel (often abbreviated as μ-kernel) is the near-minimum amount of software that can provide the mechanisms needed to implement an operating system (OS). These mechanisms include low-level address space management, ...
architecture and contrasting it with monolithic kernel design. Mach underlies
Mac OS X macOS (; previously OS X and originally Mac OS X) is a Unix operating system developed and marketed by Apple Inc. since 2001. It is the primary operating system for Apple's Mac computers. Within the market of desktop and lap ...
, and its layered architecture had a significant impact on the design of the
Windows NT kernel The Software architecture, architecture of Windows NT, a line of operating systems produced and sold by Microsoft, is a layered design that consists of two main components, User space, user mode and Protection ring#SUPERVISOR-MODE, kernel mode. ...
and modern microkernels like L4. In addition, its memory-mapped files feature was added to many monolithic kernels.


''An Implementation of a Log-Structured File System for UNIX''

* Margo Seltzer, Keith Bostic,
Marshall Kirk McKusick Marshall Kirk McKusick (born January 19, 1954) is a computer scientist, known for his extensive work on BSD UNIX, from the 1980s to FreeBSD in the present day. He was president of the USENIX Association from 1990 to 1992 and again from 2002 to ...
, Carl Staelin * Proceedings of the Winter 1993
USENIX Conference The USENIX Annual Technical Conference (USENIX ATC, or, canonically, USENIX) is a conference of computing professions sponsored by the USENIX association. The conference includes computing tutorials, and a single track technical session for presen ...
, San Diego, CA, January 1993, 307–326 Description: This paper presented the first production-quality implementation of the log-structured file system idea, and spawned much additional discussion of the viability and shortcomings of log-structured filesystems. While "The Design and Implementation of a Log-Structured File System" was certainly the first, this one was important in bringing the research idea to a usable system.


''Soft Updates: A Solution to the Metadata Update problem in File Systems''

* G. Ganger, M. McKusick, C. Soules, Y. Patt * ACM Transactions on Computer Systems 18, 2. pp 127–153, May 2000 * Online version
Description: A new way of maintaining filesystem consistency.


Programming languages


''The FORTRAN Automatic Coding System''

*
John Backus John Warner Backus (December 3, 1924 – March 17, 2007) was an American computer scientist. He directed the team that invented and implemented FORTRAN, the first widely used high-level programming language, and was the inventor of the Backu ...
et al. * Proceedings of the WJCC (Western Joint Computer Conference), Los Angeles, California, February 1957.
Online version(PDF)
Description: This paper describes the design and implementation of the first FORTRAN compiler by the IBM team. Fortran is a general-purpose, procedural,
imperative programming In computer science, imperative programming is a programming paradigm of software that uses statements that change a program's state. In much the same way that the imperative mood in natural languages expresses commands, an imperative program ...
language that is especially suited to numeric computation and scientific computing.


''Recursive functions of symbolic expressions and their computation by machine, part I''

* John McCarthy. * Communications of the ACM, 3(4):184–195, April 1960.
Several online versions
Description: This paper introduced
LISP Lisp is a family of programming languages with a distinctive, fully parenthesized prefix notation, first specified by John McCarthy in 1958. It is the second-oldest high-level programming language still in widespread use, after Fortran ...
, the first
functional programming language In computer science, functional programming is a programming paradigm where programs are constructed by applying and composing functions. It is a declarative programming paradigm in which function definitions are trees of expressions that ...
, which was used heavily in many areas of computer science, especially in AI. LISP also has powerful features for manipulating LISP programs within the language.
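
A defining feature of Lisp is that programs are ordinary data structures that an evaluator walks recursively. The following is a toy sketch in the spirit of (not a reproduction of) McCarthy's eval, for arithmetic S-expressions written as nested Python lists; the function name `seval` and the list encoding are choices of this example:

```python
import operator

# Supported binary operators; each S-expression is (op arg1 arg2).
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def seval(expr):
    """Recursively evaluate an arithmetic S-expression."""
    if isinstance(expr, (int, float)):   # an atom evaluates to itself
        return expr
    op, left, right = expr               # code is just a nested list
    return OPS[op](seval(left), seval(right))

# (* (+ 1 2) 4)
result = seval(["*", ["+", 1, 2], 4])
```

Because code and data share one representation, a program can build, inspect, or transform other programs before evaluating them.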


''ALGOL 60''


Revised Report on the Algorithmic Language Algol 60
by Peter Naur, et al. – The very influential ALGOL definition, and the first programming language with a formally defined syntax. *
Brian Randell Brian Randell (born 1936) is a British computer scientist, and Emeritus Professor at the School of Computing, Newcastle University, United Kingdom. He specialises in research into software fault tolerance and dependability, and is a noted aut ...
and L. J. Russell, ''ALGOL 60 Implementation: The Translation and Use of ALGOL 60 Programs on a Computer''. Academic Press, 1964. The design of the Whetstone Compiler. One of the early published descriptions of implementing a
compiler In computing, a compiler is a computer program that translates computer code written in one programming language (the ''source'' language) into another language (the ''target'' language). The name "compiler" is primarily used for programs tha ...
. See the related papers
Whetstone Algol Revisited
and
The Whetstone KDF9 Algol Translator
by
Brian Randell Brian Randell (born 1936) is a British computer scientist, and Emeritus Professor at the School of Computing, Newcastle University, United Kingdom. He specialises in research into software fault tolerance and dependability, and is a noted aut ...
* Edsger W. Dijkstra, ''Algol 60 translation: an Algol 60 translator for the X1 and making a translator for Algol 60'', report MR 35/61. Mathematisch Centrum, Amsterdam, 1961. Description: Algol 60 introduced block structure.


''The next 700 programming languages''

* Peter Landin * Communications of the ACM 9(3):157–65, March 1966 Description: This seminal paper proposed an ideal language
ISWIM ISWIM (acronym for If you See What I Mean) is an abstract computer programming language (or a family of languages) devised by Peter Landin and first described in his article "The Next 700 Programming Languages", published in the Communications ...
, which, though never implemented, influenced much of the later development of programming languages.


''Fundamental Concepts in Programming Languages''

*
Christopher Strachey Christopher S. Strachey (; 16 November 1916 – 18 May 1975) was a British computer scientist. He was one of the founders of denotational semantics, and a pioneer in programming language design and computer time-sharing.F. J. Corbató, et al. ...

pdf
Description: ''
Fundamental Concepts in Programming Languages ''Fundamental Concepts in Programming Languages'' were an influential set of lecture notes written by Christopher Strachey for the International Summer School in Computer Programming at Copenhagen in August, 1967. It introduced much programming l ...
'' introduced much programming language terminology still in use today, including ''R-values'', ''L-values'', ''parametric polymorphism'', and ''ad hoc polymorphism''.
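
Strachey's distinction between parametric and ad hoc polymorphism can be made concrete in any modern language. A hedged sketch in Python (the function names `first` and `describe` are invented for this example): a parametrically polymorphic function has one definition that works uniformly at every type, while an ad hoc polymorphic one selects behaviour per concrete type.

```python
from functools import singledispatch
from typing import List, TypeVar

T = TypeVar("T")

def first(xs: List[T]) -> T:
    """Parametric polymorphism: one definition, uniform at every
    element type, never inspecting what T actually is."""
    return xs[0]

@singledispatch
def describe(x) -> str:
    """Ad hoc polymorphism: a different behaviour per concrete type."""
    return "something"

@describe.register
def _(x: int) -> str:
    return "an integer"

@describe.register
def _(x: str) -> str:
    return "a string"
```

`first` has a single body for all types; `describe` is really a family of definitions sharing one name.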


''Lambda Papers''

*
Gerald Jay Sussman Gerald Jay Sussman (born February 8, 1947) is the Panasonic Professor of Electrical Engineering at the Massachusetts Institute of Technology (MIT). He received his S.B. and Ph.D. degrees in mathematics from MIT in 1968 and 1973 respectively. H ...
and Guy L. Steele, Jr. *
AI Memo The AI Memos are a series of influential research reports and technical memoranda published by the Massachusetts Institute of Technology's Artificial Intelligence Laboratory ...
s, 1975–1980
Links to pdf's
Description: This series of papers and reports first defined the influential
Scheme Scheme is a minimalist dialect of the Lisp family of programming languages, created in the 1970s at MIT by Gerald Jay Sussman and Guy L. Steele Jr. It was the first Lisp dialect to require lexical scoping and proper tail calls ...
programming language and questioned the prevailing practices in programming language design, employing
lambda calculus Lambda calculus (also written as ''λ''-calculus) is a formal system in mathematical logic for expressing computation based on function abstraction and application using variable binding and substitution. It is a universal model of computation th ...
extensively to model programming language concepts and guide efficient implementation without sacrificing expressive power.
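
One flavour of the Lambda Papers' programme is that even data structures can be modeled with functions alone. As an illustrative sketch (Church-encoded pairs, written here in Python rather than Scheme), a pair is just a function waiting for a selector:

```python
# Church pairs: a data structure built from nothing but lambdas,
# in the spirit of the Lambda Papers' use of the lambda calculus.
def cons(a, b):
    # The pair "remembers" a and b in its closure.
    return lambda select: select(a, b)

def car(pair):
    return pair(lambda a, b: a)   # select the first component

def cdr(pair):
    return pair(lambda a, b: b)   # select the second component

# A small list built from pairs: (1 . (2 . 3))
p = cons(1, cons(2, 3))
```

Closures here do the work that records or cons cells do in a conventional runtime, which is exactly the kind of modeling the papers used to guide efficient implementation.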


''Structure and Interpretation of Computer Programs''

* Harold Abelson and
Gerald Jay Sussman Gerald Jay Sussman (born February 8, 1947) is the Panasonic Professor of Electrical Engineering at the Massachusetts Institute of Technology (MIT). He received his S.B. and Ph.D. degrees in mathematics from MIT in 1968 and 1973 respectively. H ...
*
MIT Press The MIT Press is a university press affiliated with the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts (United States). It was established in 1962. History The MIT Press traces its origins back to 1926 when MIT publ ...
, 1984, 1996 Description: This textbook explains core computer programming concepts, and is widely considered a classic text in computer science.
Online course


''Comprehending Monads''

* Philip Wadler * Mathematical structures in computer science 2.04 (1992): 461–493.
Online copy
Description: This paper introduced monads to functional programming.
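
Wadler's monads can be transplanted into other languages. A hedged sketch of the Maybe monad in Python (the class and method names are choices of this example; `unit` and `bind` play the roles of Haskell's `return` and `>>=`): `bind` threads a possibly-absent value through a pipeline, so failure handling disappears from the call sites.

```python
class Maybe:
    """A value that may be absent; absence propagates through bind."""
    def __init__(self, value, ok=True):
        self.value, self.ok = value, ok

    @staticmethod
    def unit(value):                 # 'return' in Haskell terms
        return Maybe(value)

    @staticmethod
    def nothing():
        return Maybe(None, ok=False)

    def bind(self, f):               # '>>=' in Haskell terms
        # Apply f only if a value is present; otherwise short-circuit.
        return f(self.value) if self.ok else self

def safe_div(x, y):
    return Maybe.unit(x / y) if y != 0 else Maybe.nothing()

good = Maybe.unit(10).bind(lambda v: safe_div(v, 2)).bind(lambda v: safe_div(v, 5))
bad = Maybe.unit(10).bind(lambda v: safe_div(v, 0)).bind(lambda v: safe_div(v, 5))
```

The second pipeline fails at the division by zero and the remaining steps are skipped without any explicit checks.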


''Towards a Theory of Type Structure''

* John Reynolds * Programming Symposium. Springer Berlin Heidelberg, 1974.
online copy
Description: This paper introduced
System F System F (also polymorphic lambda calculus or second-order lambda calculus) is a typed lambda calculus that introduces, to simply typed lambda calculus, a mechanism of universal quantification over types. System F formalizes parametric polymorph ...
and created the modern notion of
Parametric polymorphism In programming languages and type theory, parametric polymorphism allows a single piece of code to be given a "generic" type, using variables in place of actual types, and then instantiated with particular types as needed. Parametrically polymorph ...


''An axiomatic basis for computer programming''

*
Tony Hoare Sir Charles Antony Richard Hoare (Tony Hoare or C. A. R. Hoare) (born 11 January 1934) is a British computer scientist who has made foundational contributions to programming languages, algorithms, operating systems, formal verification, and ...
* Communications of the ACM, Volume 12 Issue 10, Oct. 1969, Pages 576-580 Description: This paper introduced
Hoare logic Hoare logic (also known as Floyd–Hoare logic or Hoare rules) is a formal system with a set of logical rules for reasoning rigorously about the correctness of computer programs. It was proposed in 1969 by the British computer scientist and l ...
, which forms the foundation of program verification.
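
Hoare logic reasons with triples {P} S {Q}: if precondition P holds before statement S, postcondition Q holds afterwards. As an illustrative sketch (not from the paper itself), the triple for integer division can be encoded as runtime assertions around the statement it governs:

```python
def int_divide(a: int, b: int):
    """Hoare triple made executable:
    {b > 0 and a >= 0}  q, r := a div b, a mod b  {a = q*b + r and 0 <= r < b}"""
    assert b > 0 and a >= 0                  # precondition P
    q, r = a // b, a % b                     # program S
    assert a == q * b + r and 0 <= r < b     # postcondition Q
    return q, r
```

In Hoare's system the triple is proved once and for all; the assertions here merely check it on each run.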


Scientific computing



Computational linguistics

* Contains the first presentation of
stochastic context-free grammar Grammar theory to model symbol strings originated from work in computational linguistics aiming to understand the structure of natural languages. Probabilistic context free grammars (PCFGs) have been applied in probabilistic modeling of RNA stru ...
s. * The first published description of computational morphology using
finite state transducer A finite-state transducer (FST) is a finite-state machine with two memory ''tapes'', following the terminology for Turing machines: an input tape and an output tape. This contrasts with an ordinary finite-state automaton, which has a single tape ...
s. (Kaplan and Kay had previously done work in this field and presented it at a conference; the linguist Johnson had noted the possibility in 1972, but produced no implementation.) * An overview of
hidden Markov model A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process — call it X — with unobservable ("''hidden''") states. As part of the definition, HMM requires that there be an ...
s geared toward
speech recognition Speech recognition is an interdisciplinary subfield of computer science and computational linguistics that develops methodologies and technologies that enable the recognition and translation of spoken language into text by computers with the ...
and other NLP fields, describing the Viterbi and forward–backward algorithms. * Describes a now commonly used part-of-speech tagger based on transformation-based learning. * Textbook on statistical and probabilistic methods in NLP. * This survey documents the relatively under-explored use of lazy functional programming languages (i.e.
Haskell Haskell () is a general-purpose, statically-typed, purely functional programming language with type inference and lazy evaluation. Designed for teaching, research and industrial applications, Haskell has pioneered a number of programming lan ...
) to construct natural language processors and to accommodate many linguistic theories.


Software engineering


''Software engineering: Report of a conference sponsored by the NATO Science Committee''

* Peter Naur, Brian Randell (eds.) * Garmisch, Germany, 7–11 October 1968, Brussels, Scientific Affairs Division, NATO (1969) 231pp.
Online copy (PDF)
Description: The 1968 NATO conference that brought together leading figures of the software field; its report defined the field of
Software engineering Software engineering is a systematic engineering approach to software development. A software engineer is a person who applies the principles of software engineering to design, develop, maintain, test, and evaluate computer software. The term '' ...


''A Description of the Model-View-Controller User Interface Paradigm in the Smalltalk-80 System''Model View Controller History
. C2.com (2012-05-11). Retrieved on 2013-12-09.

* Krasner, Glenn E.; Pope, Stephen T. * ''
The Journal of Object-Oriented Programming ''The Journal of Object-Oriented Programming'' (JOOP) was a magazine covering object-oriented programming, published from 1988 until 2001 ...
'', Aug–Sep 1988
Online copy (PDF)
Description: A description of the system that originated the (now dominant) GUI programming paradigm of
Model–view–controller Model–view–controller (MVC) is a software architectural pattern commonly used for developing user interfaces that divide the related program logic into three interconnected elements. This is done to separate internal representations of info ...
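
The MVC split the paper describes can be rendered in miniature. A hedged sketch (class names invented for this example): the model holds state, the view renders it, and the controller maps user input onto model updates, so each part can change independently.

```python
class CounterModel:
    """Model: application state, with no knowledge of display or input."""
    def __init__(self):
        self.count = 0

class CounterView:
    """View: renders the model; holds no state of its own."""
    def render(self, model):
        return f"count = {model.count}"

class CounterController:
    """Controller: translates user actions into model updates."""
    def __init__(self, model):
        self.model = model

    def handle(self, action):
        if action == "increment":
            self.model.count += 1

model = CounterModel()
view = CounterView()
controller = CounterController(model)
controller.handle("increment")
controller.handle("increment")
screen = view.render(model)
```

Swapping in a different view (say, a GUI widget) requires no change to the model or controller, which is the point of the separation.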


''Go To Statement Considered Harmful''

* Dijkstra, E. W. * ''
Communications of the ACM ''Communications of the ACM'' is the monthly journal of the Association for Computing Machinery (ACM). It was established in 1958, with Saul Rosen as its first managing editor. It is sent to all ACM members. Articles are intended for readers wi ...
'', 11(3):147–148, March 1968
Online copy
Description: Don't use goto – the beginning of
structured programming Structured programming is a programming paradigm aimed at improving the clarity, quality, and development time of a computer program by making extensive use of the structured control flow constructs of selection ( if/then/else) and repetition ( ...
.
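
Dijkstra's point can be shown in miniature. Below, the same linear search is written twice (an illustrative sketch; Python has no goto, so the first version simulates one with a state variable): the structured version's correctness is evident from its shape, while the goto-style version forces the reader to trace the state machine.

```python
def find_goto_style(xs, target):
    """Linear search written as jumps between labeled states."""
    i, state = 0, "LOOP"
    while True:
        if state == "LOOP":
            if i >= len(xs):
                state = "FAIL"       # goto FAIL
            elif xs[i] == target:
                state = "DONE"       # goto DONE
            else:
                i += 1               # goto LOOP
        elif state == "DONE":
            return i
        else:  # "FAIL"
            return -1

def find_structured(xs, target):
    """The same search with structured control flow only."""
    for i, x in enumerate(xs):
        if x == target:
            return i
    return -1
```

Both compute the same function, but only the second keeps a simple correspondence between the program text and its execution.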


''On the criteria to be used in decomposing systems into modules''

*
David Parnas David Lorge Parnas (born February 10, 1941) is a Canadian early pioneer of software engineering, who developed the concept of information hiding in modular programming, which is an important element of object-oriented programming today. He is a ...
*
Communications of the ACM ''Communications of the ACM'' is the monthly journal of the Association for Computing Machinery (ACM). It was established in 1958, with Saul Rosen as its first managing editor. It is sent to all ACM members. Articles are intended for readers wi ...
, Volume 15, Issue 12:1053–1058, December 1972.
Online copy (PDF)
Description: The importance of modularization and
information hiding In computer science, information hiding is the principle of segregation of the ''design decisions'' in a computer program that are most likely to change, thus protecting other parts of the program from extensive modification if the design decisio ...
. Note that information hiding was first presented in a different paper by the same author – "Information Distribution Aspects of Design Methodology", Proceedings of IFIP Congress '71, 1971, Booklet TA-3, pp. 26–30.


''Hierarchical Program Structures''

*
Ole-Johan Dahl Ole-Johan Dahl (12 October 1931 – 29 June 2002) was a Norwegian computer scientist. Dahl was a professor of computer science at the University of Oslo and is considered to be one of the fathers of Simula and object-oriented programming along w ...
,
C. A. R. Hoare Sir Charles Antony Richard Hoare (Tony Hoare or C. A. R. Hoare) (born 11 January 1934) is a British computer scientist who has made foundational contributions to programming languages, algorithms, operating systems, formal verification, and c ...
* in Dahl, Dijkstra and Hoare, Structured Programming, Academic Press, London and New York, pp. 175–220, 1972. Description: The beginning of
Object-oriented programming Object-oriented programming (OOP) is a programming paradigm based on the concept of "objects", which can contain data and code. The data is in the form of fields (often known as attributes or ''properties''), and the code is in the form of ...
. This paper argued that programs should be decomposed into independent components with small and simple interfaces. They also argued that objects should have both data and related methods.


''A Behavioral Notion of Subtyping''

* Barbara H. Liskov, Jeannette M. Wing * ACM Transactions on Programming Languages and Systems (TOPLAS),1994 Description: Introduces
Liskov substitution principle The Liskov substitution principle (LSP) is a particular definition of a subtyping relation, called strong behavioral subtyping, that was initially introduced by Barbara Liskov in a 1988 conference keynote address titled ''Data abstraction and ...
and establishes behavioral subtyping rules.
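
The classic rectangle/square pair (an illustrative example, not one from the paper itself) shows what behavioral subtyping rules out: `Square` matches `Rectangle`'s interface, but it breaks the behavioral contract "setting the width leaves the height unchanged", so substituting it where a `Rectangle` is expected changes observable behavior.

```python
class Rectangle:
    def __init__(self, w, h):
        self.w, self.h = w, h

    def set_width(self, w):
        # Contract: changes w only; h is untouched.
        self.w = w

class Square(Rectangle):
    def __init__(self, side):
        super().__init__(side, side)

    def set_width(self, w):
        # Preserves the square invariant, but violates the
        # inherited contract by changing h as well.
        self.w = self.h = w

def height_preserved(rect: Rectangle) -> bool:
    """A client relying on Rectangle's contract."""
    h = rect.h
    rect.set_width(10)
    return rect.h == h

ok_for_rectangle = height_preserved(Rectangle(2, 3))
ok_for_square = height_preserved(Square(3))
```

Under Liskov and Wing's definition, `Square` is therefore not a behavioral subtype of `Rectangle`, even though the type checker accepts it.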


''A technique for software module specification with examples''

*
David Parnas David Lorge Parnas (born February 10, 1941) is a Canadian early pioneer of software engineering, who developed the concept of information hiding in modular programming, which is an important element of object-oriented programming today. He is a ...
* Comm. ACM 15, 5 (May 1972), 330–336.
Online copy (PDF)
Description:
software specification In computer science, formal specifications are mathematically based techniques whose purpose are to help with the implementation of systems and software. They are used to describe a system, to analyze its behavior, and to aid in its design by verif ...
.


''Structured Design''

* Wayne Stevens, Glenford Myers, and
Larry Constantine Larry LeRoy Constantine (born 1943) is an American software engineer, professor in the Center for Exact Sciences and Engineering at the University of Madeira Portugal, and considered one of the pioneers of computing. He has contributed numerous ...
* ''IBM Systems Journal, 13'' (2), 115–139, 1974.
On-line copy (PDF)
Description: Seminal paper on
Structured Design In software engineering, structured analysis (SA) and structured design (SD) are methods for analyzing business requirements and developing specifications for converting practices into computer programs, hardware configurations, and related manual ...
,
data flow diagram A data-flow diagram is a way of representing a flow of data through a process or a system (usually an information system). The DFD also provides information about the outputs and inputs of each entity and the process itself. A data-flow diagram h ...
,
coupling In software engineering, coupling is the degree of interdependence between software modules; a measure of how closely connected two routines or modules are. Low coupling is generally a sign of a well-structured design ...
, and cohesion.


''The Emperor's Old Clothes''

* C.A.R. Hoare * Communications of the ACM, Vol. 24, No. 2, February 1981, pp. 75–83.
Archived copy (PDF)
Description: Illustrates the "
second-system effect The second-system effect or second-system syndrome is the tendency of small, elegant, and successful systems to be succeeded by over-engineered, bloated systems, due to inflated expectations and overconfidence. The phrase was first used by Fred ...
" and the importance of simplicity.


''The Mythical Man-Month: Essays on Software Engineering''

* Brooks, Jr., F. P. * Addison Wesley Professional. 2nd edition, 1995. Description: Throwing more people at the task will not speed its completion...


''No Silver Bullet: Essence and Accidents of Software Engineering''

*
Fred Brooks Frederick Phillips Brooks Jr. (April 19, 1931 – November 17, 2022) was an American computer architect, software engineer, and computer scientist, best known for managing the development of IBM's System/360 family of computers and the O ...
* Description: Brooks argues that "there is no single development, in either technology or management technique, which by itself promises even one
order of magnitude An order of magnitude is an approximation of the logarithm of a value relative to some contextually understood reference value, usually 10, interpreted as the base of the logarithm and the representative of values of magnitude one. Logarithmic di ...
[tenfold] improvement within a decade in productivity, in reliability, in simplicity." He also states that "we cannot expect ever to see two-fold gains every two years" in software development, as there are in hardware development (
Moore's law Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. Moore's law is an observation and projection of a historical trend. Rather than a law of physics, it is an empi ...
).


''The Cathedral and the Bazaar''

* Raymond, E.S. * ''
First Monday ''First Monday'' is a monthly peer-reviewed open-access academic journal covering research on the Internet, established in 1996 ...
'', 3, 3 (March 1998)
Online copy (HTML)
Description:
Open source Open source is source code that is made freely available for possible modification and redistribution. Products include permission to use the source code, design documents, or content of the product. The open-source model is a decentralized so ...
methodology In its most common sense, methodology is the study of research methods. However, the term can also refer to the methods themselves or to the philosophical discussion of associated background assumptions. A method is a structured procedure for br ...
.


''Design Patterns: Elements of Reusable Object Oriented Software''

* E. Gamma, R. Helm, R. Johnson, J. Vlissides *
Addison–Wesley Addison-Wesley is an American publisher of textbooks and computer literature. It is an imprint of Pearson PLC, a global publishing and education company. In addition to publishing books, Addison-Wesley also distributes its technical titles throug ...
, Reading, Massachusetts, 1995. Description: This book was the first to define and list
design pattern A design pattern is the re-usable form of a solution to a design problem. The idea was introduced by the architect Christopher Alexander and has been adapted for various other disciplines, particularly software engineering. The "Gang of Four" boo ...
s in computer science.
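
As a taste of the book's catalogue, here is a minimal hedged sketch of one pattern it defines, Observer (class names invented for this example): a subject notifies registered observers of state changes without knowing their concrete types.

```python
class Subject:
    """Holds state and a list of observers to notify on change."""
    def __init__(self):
        self._observers = []
        self._state = None

    def attach(self, callback):
        # Any callable works; the subject never inspects observer types.
        self._observers.append(callback)

    def set_state(self, state):
        self._state = state
        for notify in self._observers:
            notify(state)

seen = []
s = Subject()
s.attach(seen.append)                   # observer 1: record the value
s.attach(lambda st: seen.append(st * 2))  # observer 2: record its double
s.set_state(3)
```

The subject depends only on the notification interface, which is what makes the pattern reusable across unrelated observers.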


''Statecharts: A Visual Formalism For Complex Systems''

*
David Harel David Harel ( he, דוד הראל; born 12 April 1950) is a computer scientist, currently serving as President of the Israel Academy of Sciences and Humanities. He has been on the faculty of the Weizmann Institute of Science in Israel since 1980, ...
* D. Harel. Statecharts: A visual formalism for complex systems. Science of Computer Programming, 8:231—274, 1987
Online version
Description:
Statechart A state diagram is a type of diagram used in computer science and related fields to describe the behavior of systems. State diagrams require that the system described is composed of a finite number of states; sometimes, this is indeed the case, ...
s are a visual modeling method. They are an extension of
state machine A finite-state machine (FSM) or finite-state automaton (FSA, plural: ''automata''), finite automaton, or simply a state machine, is a mathematical model of computation. It is an abstract machine that can be in exactly one of a finite number ...
s with hierarchy and concurrency, which can make models exponentially more compact. Statecharts therefore enable formal modeling of applications that were previously too complex to model. Statecharts are part of the
UML The Unified Modeling Language (UML) is a general-purpose, developmental modeling language in the field of software engineering that is intended to provide a standard way to visualize the design of a system. The creation of UML was originally m ...
diagrams.


Security and privacy


Anonymity and Privacy

* David Chaum
Untraceable electronic mail, return addresses, and digital pseudonyms
Communications of the ACM, 24(2):84–90, February 1981. * Cynthia Dwork, Frank McSherry, Kobbi Nissim, Adam Smith
Calibrating Noise to Sensitivity in Private Data Analysis
In Theory of Cryptography Conference (TCC), Springer, 2006. The
full version
appears in Journal of Privacy and Confidentiality, 7 (3), 17–51.
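
The Laplace mechanism from Dwork et al. can be sketched as follows (an illustrative toy: the function name is invented here, and the fixed seed exists only to make the example reproducible — a real deployment must use fresh randomness). To release f(D) with epsilon-differential privacy, add Laplace noise with scale sensitivity/epsilon, where sensitivity bounds how much f can change when one record changes.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random.Random(0)):
    """Return true_value + Laplace(0, sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse transform from u in (-0.5, 0.5).
    u = rng.random() - 0.5
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_value + noise

# Counting query: sensitivity 1, since adding or removing one person
# changes the count by at most 1.
noisy_count = laplace_mechanism(true_value=42, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means stronger privacy and larger noise; the calibration to sensitivity is the paper's central contribution.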


Cryptography

* Whitfield Diffie and Martin E. Hellman, New Directions in Cryptography, IEEE Transactions on Information Theory, November 1976 * R. L. Rivest and A. Shamir and L. M. Adleman, A Method For Obtaining Digital Signatures And Public-Key Cryptosystems, MIT/LCS/TM-82, 1977 * Merkle, R
Security, Authentication, and Public Key Systems
PhD Thesis, 1979 Stanford University.
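
The Diffie–Hellman exchange can be sketched with deliberately tiny toy numbers (real deployments use groups of 2048 bits or more, and random private keys; the fixed values below are illustrative only): both parties derive the same shared secret without ever transmitting it.

```python
p, g = 23, 5                  # public prime modulus and generator
a, b = 6, 15                  # private keys (normally chosen at random)

A = pow(g, a, p)              # Alice publishes g^a mod p
B = pow(g, b, p)              # Bob publishes g^b mod p

alice_secret = pow(B, a, p)   # Alice computes (g^b)^a mod p
bob_secret = pow(A, b, p)     # Bob computes (g^a)^b mod p
```

Both sides arrive at g^(ab) mod p; an eavesdropper sees only p, g, A, and B, and recovering the secret from those is the discrete logarithm problem.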


Passwords

* Morris, Robert and Thompson, Ken
Password security: a case history
Communications of the ACM CACM Homepage archive Volume 22 Issue 11, Nov. 1979 Pages 594–597
PDF


System security

* Dennis and Van Horn
Programming Semantics for Multiprogrammed Computations
ACM Conference on Programming Languages and Pragmatics (August 1965) * Saltzer and Schroeder
The Protection of Information in Computer Systems
ACM Symposium on Operating System Principles (October 1973)
HTML, HTML2
* Karger and Schell
Thirty Years later: Lessons from the Multics Security Evaluation
ACSAC 2002 * Lampson, Butler
A Note on the Confinement Problem
Communications of the ACM, 16:10 (Oct. 1973), pp. 613–615
PDF
* Thompson, Ken
Reflections on Trusting Trust
Communications of the ACM, 27:8, Aug 1984 * J.E. Forrester and B.P. Miller
An Empirical Study of the Robustness of Windows NT Applications Using Random Testing
4th USENIX Windows Systems Symposium, Seattle, August 2000.


Usable security

* Whitten, Alma and Tygar, J.D.
Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0
Proceedings of the 8th conference on USENIX Security Symposium, Volume 8, Pages 14–28 * Garfinkel, Simson and Shelat, Abhi
Remembrance of Data Passed
IEEE Security and Privacy, Volume 1 Issue 1, January 2003, Pages 17–27


Theoretical computer science

Topics covered:
theoretical computer science Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on mathematical aspects of computer science such as the theory of computation, lambda calculus, and type theory. It is difficult to circumscribe the ...
, including
computability theory Computability theory, also known as recursion theory, is a branch of mathematical logic, computer science, and the theory of computation that originated in the 1930s with the study of computable functions and Turing degrees. The field has sinc ...
,
computational complexity theory In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved ...
,
algorithm In mathematics and computer science, an algorithm () is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing ...
s,
algorithmic information theory Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated), such as str ...
,
information theory Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. ...
and
formal verification In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal met ...
.


See also

*
DBLP DBLP is a computer science bibliography website. Starting in 1993 at Universität Trier in Germany, it grew from a small collection of HTML files and became an organization hosting a database and logic programming bibliography site. Since Novem ...
(Digital Bibliography & Library Project in computer science) *
List of open problems in computer science This article is a list of notable unsolved problems in computer science. A problem in computer science is considered unsolved when no solution is known, or when experts in the field disagree about proposed solutions. Computational complexity * ...
*
List of computer science journals Below is a list of computer science journals. Alphabetic list of titles A * ACM Computing Reviews * ACM Computing Surveys * ACM Transactions on Algorithms * ACM Transactions on Computational Logic * ACM Transactions on Database Systems * A ...
*
List of computer science conferences This is a list of academic conferences in computer science. Only conferences with separate articles are included; within each field, the conferences are listed alphabetically by their short names. General * FCRC – Federated Computing Research ...
*
The Collection of Computer Science Bibliographies The Collection of Computer Science Bibliographies (founded 1993) is one of the oldest (if not the oldest) bibliography collections freely accessible on the Internet. It is a collection of bibliographies of scientific literature in computer science a ...
*
Paris Kanellakis Award The Paris Kanellakis Theory and Practice Award is granted yearly by the Association for Computing Machinery (ACM) to honor "specific theoretical accomplishments that have had a significant and demonstrable effect on the practice of computing". It wa ...
, a prize given to honor specific theoretical accomplishments that have had a significant and demonstrable effect on the practice of computing.


References

* * Randell, Brian (ed). (1982). ''The Origins of Digital Computers: Selected Papers.'' 3rd ed. Berlin: Springer-Verlag. . * Turning Points in Computing: 1962–1999, Special Issue, ''IBM Systems Journal, 38'' (2/3),1999. * Yourdon, Edward (ed.) (1979) ''Classics in Software Engineering.'' New York: Yourdon Press.


External links


ACM Classic Books Series

The 50 most influential ACM SIGPLAN papers published in PLDI from 1979 through 1999, organized into a special SIGPLAN proceedings.


Academic search engines


Google Scholar

CiteSeer

Live Academic

Odysci

ISI Web of Science