Douglas W. Jones
Douglas W. Jones is an American computer scientist at the University of Iowa. His research focuses primarily on computer security, particularly electronic voting. Jones received a B.S. in physics from Carnegie Mellon University in 1973, and an M.S. and a Ph.D. in computer science from the University of Illinois at Urbana-Champaign in 1976 and 1980, respectively. Jones' involvement with electronic voting research began in 1994, when he was appointed to the Iowa Board of Examiners for Voting Machines and Electronic Voting Systems. He chaired the board from 1999 to 2003, and has testified before the United States Commission on Civil Rights, the United States House Committee on Science, and the Federal Election Commission on voting issues. In 2005 he participated as an election observer for the presidential election in Kazakhstan. Jones was the technical advisor for HBO's documentary on electronic voting machine issues, ...



Computer Science
Computer science is the study of computation, automation, and information. Computer science spans theoretical disciplines (such as algorithms, theory of computation, information theory, and automation) to practical disciplines (including the design and implementation of hardware and software). Computer science is generally considered an area of academic research, distinct from computer programming. Algorithms and data structures are central to computer science. The theory of computation concerns abstract models of computation and general classes of problems that can be solved using them. The fields of cryptography and computer security involve studying the means for secure communication and for preventing security vulnerabilities. Computer graphics and computational geometry address the generation of images. Progr ...



Federal Election Commission
The Federal Election Commission (FEC) is an independent regulatory agency of the United States whose purpose is to enforce campaign finance law in United States federal elections. Created in 1974 through amendments to the Federal Election Campaign Act, the commission describes its duties as "to disclose campaign finance information, to enforce the provisions of the law such as the limits and prohibitions on contributions, and to oversee the public funding of Presidential elections." The commission was unable to function from late August 2019 to December 2020, with an exception for the period of May 2020 to July 2020, due to lack of a quorum. In the absence of a quorum, the commission could not vote on complaints or give guidance through advisory opinions. As of May 19, 2020, there were 350 outstanding matters on the agency's enforcement docket and 227 items waiting for action. In December 2020, three commissioners were appointed to restore a quorum; however, deadlocks arising ...



Data Compression
In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation. Any particular compression is either lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy; no information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. Typically, a device that performs data compression is referred to as an encoder, and one that performs the reversal of the process (decompression) as a decoder. The process of reducing the size of a data file is often referred to as data compression. In the context of data transmission, it is called source coding: encoding done at the source of the data before it is stored or transmitted. Source coding should not be confused with channel coding (for error detection and correction) or line coding (the means for mapping data onto a signal). ...
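One simple instance of lossless compression by eliminating statistical redundancy is run-length encoding, which this sketch illustrates (it is an illustrative example, not an algorithm named in the entry): the encoder collapses runs of a repeated symbol into (symbol, count) pairs, and the decoder reverses the process so that no information is lost.

```python
def rle_encode(data: str) -> list[tuple[str, int]]:
    """Encoder: collapse each run of a repeated symbol into (symbol, count)."""
    runs: list[tuple[str, int]] = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))              # start a new run
    return runs


def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Decoder: expand each (symbol, count) pair back into a run."""
    return "".join(ch * count for ch, count in runs)


original = "aaaabbbccd"
encoded = rle_encode(original)   # [('a', 4), ('b', 3), ('c', 2), ('d', 1)]
assert rle_decode(encoded) == original  # lossless: the round trip is exact
```

Run-length encoding only shrinks data whose redundancy takes the form of long runs; general-purpose lossless compressors exploit broader statistical structure.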


Communications Of The ACM
''Communications of the ACM'' is the monthly journal of the Association for Computing Machinery (ACM). It was established in 1958, with Saul Rosen as its first managing editor. It is sent to all ACM members. Articles are intended for readers with backgrounds in all areas of computer science and information systems. The focus is on the practical implications of advances in information technology and associated management issues; ACM also publishes a variety of more theoretical journals. The magazine straddles the boundary between a science magazine, a trade magazine, and a scientific journal. While the content is subject to peer review, the articles published are often summaries of research that may also be published elsewhere. Material published must be accessible and relevant to a broad readership. From 1960 onward, ''CACM'' also published algorithms, expressed in ALGOL. The collection of algorithms later became known as the Collected Algorithms of the ACM.

See also
* ''Journal of the A ...



Priority Queue
In computer science, a priority queue is an abstract data type similar to a regular queue or stack data structure in which each element additionally has a ''priority'' associated with it. In a priority queue, an element with high priority is served before an element with low priority. In some implementations, if two elements have the same priority, they are served according to the order in which they were enqueued; in other implementations, the ordering of elements with the same priority remains undefined. While priority queues are often implemented with heaps, a priority queue is conceptually distinct from a heap. A priority queue is an abstract concept like a list or a map; just as a list can be implemented with a linked list or with an array, a priority queue can be implemented with a heap or with a variety of other methods such as an unordered array.

Operations
A priority queue must at least support the following operations:
* ''is_empty'': check whether the queue has no elements.
* ''insert_wi ...
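The operations above can be sketched with one common implementation choice, a binary heap (here via Python's `heapq`); the class and method names follow the entry's operation names but are otherwise illustrative. A monotonically increasing counter breaks ties, giving the "served in enqueue order" behavior for equal priorities that the entry describes as one option.

```python
import heapq
import itertools


class PriorityQueue:
    """Priority queue backed by a binary heap (one of several possible backings).

    Lower number = higher priority; the counter breaks ties so that
    equal-priority elements are served in the order they were enqueued.
    """

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()

    def is_empty(self) -> bool:
        return not self._heap

    def insert_with_priority(self, item, priority):
        heapq.heappush(self._heap, (priority, next(self._counter), item))

    def pull_highest_priority_element(self):
        priority, _, item = heapq.heappop(self._heap)
        return item


pq = PriorityQueue()
pq.insert_with_priority("low", 5)
pq.insert_with_priority("high", 1)
pq.insert_with_priority("also high", 1)
print(pq.pull_highest_priority_element())  # prints "high": priority 1, enqueued first
```

Swapping the heap for an unordered array would keep the same interface while changing the cost profile: O(1) insert but O(n) pull, versus the heap's O(log n) for both.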



University Of Chicago Press
The University of Chicago Press is the largest and one of the oldest university presses in the United States. It is operated by the University of Chicago and publishes a wide variety of academic titles, including ''The Chicago Manual of Style'', numerous academic journals, and advanced monographs in academic fields. One of its quasi-independent projects is the BiblioVault, a digital repository for scholarly books. The Press building is located just south of the Midway Plaisance on the University of Chicago campus.

History
The University of Chicago Press was founded in 1890, making it one of the oldest continuously operating university presses in the United States. Its first published book was Robert F. Harper's ''Assyrian and Babylonian Letters Belonging to the Kouyunjik Collections of the British Museum''. The book sold five copies during its first two years, but by 1900 the University of Chicago Press had published 127 books and pamphlets and 11 scholarly journals, includ ...



Stanford University Centers And Institutes
Stanford University has many centers and institutes dedicated to the study of various specific topics. These centers and institutes may be within a department; within a school but across departments; an independent laboratory, institute, or center reporting directly to the dean of research and outside any school; or semi-independent of the university itself.

Independent laboratories, institutes and centers
These report directly to the vice-provost and dean of research and are outside any school, though any faculty involved in them must belong to a department in one of the schools. These include Bio-X and Spectrum in the area of Biological and Life Sciences; the Precourt Institute for Energy and the Woods Institute for the Environment in the Environmental Sciences area; the Center for Advanced Study in the Behavioral Sciences (CASBS), the Center for the Study of Language and Information (CSLI) (see below), the Freeman Spogli Institute for International Studies (FSI) (see below), Human-Sciences ...


Barbara Simons
Barbara Bluestein Simons (born January 26, 1941) is an American computer scientist and the former president of the Association for Computing Machinery (ACM). She is a Ph.D. graduate of the University of California, Berkeley and spent her early career working as an IBM researcher. She is the founder and former co-chair of USACM, the ACM U.S. Public Policy Council. Her main areas of research are compiler optimization, scheduling theory, and algorithm analysis and design. Simons has worked on technology regulation since 2002, advocating for the end of electronic voting. She now serves as the chairperson of the Verified Voting Foundation and coauthored a book on the flaws of electronic voting, ''Broken Ballots'', with Douglas W. Jones.

Early life
Simons was born in Boston, Massachusetts and grew up in Cincinnati, Ohio. In high school, she developed an interest in math and science while taking A.P. Math classes. She attended Wellesley College for a year, ...




Technical Guidelines Development Committee
The Technical Guidelines Development Committee (TGDC) of the National Institute of Standards and Technology supports the Election Assistance Commission in the United States by providing recommendations on voluntary standards and guidelines related to voting equipment and technologies.

Charter and Membership
The Technical Guidelines Development Committee (TGDC) assists the EAC in developing the Voluntary Voting System Guidelines. The chairperson of the TGDC is the director of the National Institute of Standards and Technology (NIST). The TGDC is composed of 14 other members appointed jointly by the EAC and the director of NIST from various standards boards, chosen for their technical and scientific expertise related to voting systems and equipment. NIST chairs and manages the TGDC, provides technical guidance in areas such as encryption, usability engineering and testing, and performs work to improve testing and conformity assessment programs for voting systems. As part of its development of ...


Election Assistance Commission
The Election Assistance Commission (EAC) is an independent agency of the United States government created by the Help America Vote Act of 2002 (HAVA). The Commission serves as a national clearinghouse and resource of information regarding election administration. It is charged with administering payments to states and developing guidance to meet HAVA requirements, adopting voluntary voting system guidelines, accrediting voting system test laboratories, and certifying voting equipment. It is also charged with developing and maintaining a national mail voter registration form.

Responsibilities
The EAC is tasked with performing a number of election-related duties, including:
* creating and maintaining the Voluntary Voting System Guidelines
* creating a national program for the testing, certification, and decertification of voting systems
* maintaining the National Mail Voter Registration Form required by the National Voter Registration Act of 1993 (NVRA)
* reporting to Congress ...



ACCURATE
Accuracy and precision are two measures of ''observational error''. ''Accuracy'' is how close a given set of measurements (observations or readings) is to their ''true value'', while ''precision'' is how close the measurements are to each other. In other words, ''precision'' is a description of ''random errors'', a measure of statistical variability. ''Accuracy'' has two definitions:
# More commonly, it is a description of only ''systematic errors'', a measure of statistical bias of a given measure of central tendency; low accuracy causes a difference between a result and a true value; ISO calls this ''trueness''.
# Alternatively, ISO defines accuracy as describing a combination of both types of observational error (random and systematic), so high accuracy requires both high precision and high trueness.
In the first, more common definition of "accuracy" above, the concept is independent of "precision", so a particular set of data can be said to be accurate, precise, both, ...
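The distinction above can be made concrete with a small sketch (the function name and sample readings are illustrative, not from the entry): trueness is the distance of the readings' central tendency from the true value, while precision is the spread of the readings about their own mean.

```python
import statistics


def trueness_and_precision(readings, true_value):
    """Separate the two kinds of observational error in a set of readings.

    Returns (systematic_error, random_error):
      systematic_error -- mean minus true value; large magnitude = low trueness
      random_error     -- sample standard deviation; large value = low precision
    """
    mean = statistics.fmean(readings)
    systematic_error = mean - true_value
    random_error = statistics.stdev(readings)
    return systematic_error, random_error


# Precise but not accurate: readings cluster tightly around 12.0,
# far from the assumed true value of 10.0.
bias, spread = trueness_and_precision([12.0, 12.1, 11.9, 12.0], true_value=10.0)
```

Here the large `bias` (2.0) signals low trueness even though the small `spread` (about 0.08) shows high precision, the "precise but not accurate" case the entry describes.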