Hard Core Predicate
In cryptography, a hard-core predicate of a one-way function ''f'' is a predicate ''b'' (i.e., a function whose output is a single bit) which is easy to compute (as a function of ''x'') but is hard to compute given ''f(x)''. In formal terms, there is no probabilistic polynomial-time (PPT) algorithm that computes ''b(x)'' from ''f(x)'' with probability significantly greater than one half over random choice of ''x''. Goldwasser, S. and Bellare, M., "Lecture Notes on Cryptography", summer course on cryptography, MIT, 1996–2001. In other words, if ''x'' is drawn uniformly at random, then given ''f(x)'', any PPT adversary can only distinguish the hard-core bit ''b(x)'' from a uniformly random bit with negligible advantage over the length of ''x''. A hard-core function can be defined similarly. That is, if ''x'' is chosen uniformly at random, then given ''f(x)'', any PPT algorithm can only distinguish the hard-core function value ''h(x)'' from uniformly random bits of length ''|h(x)|'' ...
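Written out as a probability bound, the condition above is commonly formalized as follows (this is a sketch of the standard textbook definition, not a quotation from the excerpted source; negl denotes a negligible function):

```latex
% b is a hard-core predicate of f if b is polynomial-time computable and,
% for every probabilistic polynomial-time adversary A, there is a negligible
% function negl such that
\[
  \Pr_{x \leftarrow \{0,1\}^n}\!\left[ A\!\left(f(x), 1^n\right) = b(x) \right]
  \le \frac{1}{2} + \mathrm{negl}(n).
\]
```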



Cryptography
Cryptography, or cryptology (from Ancient Greek ''kryptós'', "hidden, secret"; and ''graphein'', "to write", or ''-logia'', "study", respectively), is the practice and study of techniques for secure communication in the presence of adversarial behavior. More generally, cryptography is about constructing and analyzing protocols that prevent third parties or the public from reading private messages. Modern cryptography exists at the intersection of the disciplines of mathematics, computer science, information security, electrical engineering, digital signal processing, physics, and others. Core concepts related to information security (data confidentiality, data integrity, authentication, and non-repudiation) are also central to cryptography. Practical applications of cryptography include electronic commerce, chip-based payment cards, digital currencies, computer passwords, and military communications. Cryptography prior to the modern age was effectively synonymo ...


Leonid Levin
Leonid Anatolievich Levin (Russian: Леони́д Анато́льевич Ле́вин; Ukrainian: Леоні́д Анато́лійович Ле́він; born November 2, 1948) is a Soviet-American mathematician and computer scientist. He is known for his work in randomness in computing, algorithmic complexity and intractability, average-case complexity, foundations of mathematics and computer science, algorithmic probability, theory of computation, and information theory. He obtained his master's degree at Moscow University in 1970, where he studied under Andrey Kolmogorov, and completed the Candidate Degree academic requirements in 1972. He and Stephen Cook independently discovered the existence of NP-complete problems. This NP-completeness theorem, often called the Cook–Levin theorem, was a basis for one of the seven Millennium Prize Problems declared by the Clay Mathematics Institute, with a $1,000,000 prize offered. The Cook–Levin theorem was a breakthrough in computer s ...



Hadamard Code
The Hadamard code is an error-correcting code named after Jacques Hadamard that is used for error detection and correction when transmitting messages over very noisy or unreliable channels. In 1971, the code was used to transmit photos of Mars back to Earth from the NASA space probe Mariner 9. Because of its unique mathematical properties, the Hadamard code is not only used by engineers, but also intensely studied in coding theory, mathematics, and theoretical computer science. The Hadamard code is also known under the names Walsh code, Walsh family, and Walsh–Hadamard code in recognition of the American mathematician Joseph Leonard Walsh. The Hadamard code is an example of a linear code of length 2^m over a binary alphabet. Unfortunately, this term is somewhat ambiguous as some references assume a message length k = m while others assume a message length of k = m+1. In this article, the first case is called the Hadamard code while the second is called the augmented Hadamar ...
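As a concrete illustration of the k = m convention above (message length k, block length 2^k), here is a minimal Python sketch; the function name and example message are ours, and the augmented variant is not covered:

```python
# Minimal sketch of (non-augmented) Hadamard encoding: the codeword of a
# k-bit message lists its inner product mod 2 with every vector in {0,1}^k.
from itertools import product

def hadamard_encode(msg_bits):
    """Encode a k-bit message as its 2^k-bit Hadamard codeword."""
    k = len(msg_bits)
    return [sum(m * x for m, x in zip(msg_bits, point)) % 2
            for point in product([0, 1], repeat=k)]

# Example: a 3-bit message maps to a codeword of length 2^3 = 8.
print(hadamard_encode([1, 0, 1]))  # [0, 1, 0, 1, 1, 0, 1, 0]
```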


List-decoding
In coding theory, list decoding is an alternative to unique decoding of error-correcting codes for large error rates. The notion was proposed by Elias in the 1950s. The main idea behind list decoding is that the decoding algorithm, instead of outputting a single possible message, outputs a list of possibilities, one of which is correct. This allows for handling a greater number of errors than unique decoding allows. The unique decoding model in coding theory, which is constrained to output a single valid codeword from the received word, cannot tolerate a greater fraction of errors. This resulted in a gap between the error-correction performance for stochastic noise models (proposed by Shannon) and the adversarial noise model (considered by Richard Hamming). Since the mid-1990s, significant algorithmic progress by the coding theory community has bridged this gap. Much of this progress is based on a relaxed error-correction model called list decoding, wherein the decoder outp ...
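The decoding model itself can be illustrated with a brute-force sketch (purely illustrative and exponential in the code size, not one of the efficient algorithms referred to above; the function names and toy codebook are ours):

```python
def hamming_distance(a, b):
    """Number of positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def list_decode(received, codebook, radius):
    """Return every message whose codeword lies within `radius` of `received`."""
    return [msg for msg, cw in codebook.items()
            if hamming_distance(cw, received) <= radius]

# Toy example: with one error allowed, two candidate messages survive,
# whereas a unique decoder would be forced to pick a single codeword.
codebook = {
    "00": (0, 0, 0, 0),
    "01": (0, 1, 0, 1),
    "11": (1, 1, 1, 1),
}
print(list_decode((0, 1, 0, 0), codebook, radius=1))  # ['00', '01']
```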


Semantic Security
In cryptography, a semantically secure cryptosystem is one where only negligible information about the plaintext can be feasibly extracted from the ciphertext. Specifically, any probabilistic polynomial-time algorithm (PPTA) that is given the ciphertext of a certain message m (taken from any distribution of messages), and the message's length, cannot determine any partial information on the message with probability non-negligibly higher than all other PPTAs that only have access to the message length (and not the ciphertext). S. Goldwasser and S. Micali, "Probabilistic encryption & how to play mental poker keeping secret all partial information", Annual ACM Symposium on Theory of Computing, 1982. This concept is the computational complexity analogue to Shannon's concept of perfect secrecy. Perfect secrecy means that the ciphertext reveals no information at all about the plaintext, whereas semantic security implies that any information revealed cannot be feasibly extracted. Goldreich, ...
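The comparison with "all other PPTAs that only have access to the message length" is often written as a simulation requirement; one common sketch of the standard definition (not a quotation of the cited paper) reads: for every PPT algorithm A there exists a PPT algorithm A' such that for every message m and every partial-information function f,

```latex
% Whatever A computes about m from the ciphertext E_k(m) and |m|, a simulator
% A' can compute (up to a negligible difference) from |m| alone:
\[
  \Bigl|
    \Pr\!\left[ A\!\left(1^n, E_k(m), |m|\right) = f(m) \right]
    -
    \Pr\!\left[ A'\!\left(1^n, |m|\right) = f(m) \right]
  \Bigr|
  \le \mathrm{negl}(n).
\]
```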




One-way Permutation
In computer science, a one-way function is a function that is easy to compute on every input, but hard to invert given the image of a random input. Here, "easy" and "hard" are to be understood in the sense of computational complexity theory, specifically the theory of polynomial time problems. Not being one-to-one is not considered sufficient for a function to be called one-way (see Theoretical definition, below). The existence of such one-way functions is still an open conjecture. Their existence would prove that the complexity classes P and NP are not equal, thus resolving the foremost unsolved question of theoretical computer science. Oded Goldreich (2001). Foundations of Cryptography: Volume 1, Basic Tools (draft available from the author's site). Cambridge University Press. The converse is not known to be true, i.e. the existence of a proof that P≠NP would not directly imply the existence of one-way functions. In applied contexts, the terms "easy" and "hard" are usu ...
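The "easy to compute, hard to invert" requirement is usually made precise along the following lines (again a sketch of the standard definition, not a quotation of the cited text; negl denotes a negligible function):

```latex
% f is one-way if it is computable in polynomial time and, for every PPT
% algorithm A, the probability of outputting *any* preimage is negligible:
\[
  \Pr_{x \leftarrow \{0,1\}^n}\!\left[ A\!\left(1^n, f(x)\right) \in f^{-1}\!\left(f(x)\right) \right]
  \le \mathrm{negl}(n).
\]
```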


CSPRNG
A cryptographically secure pseudorandom number generator (CSPRNG) or cryptographic pseudorandom number generator (CPRNG) is a pseudorandom number generator (PRNG) with properties that make it suitable for use in cryptography. It is also loosely known as a cryptographic random number generator (CRNG) (see Random number generation § "True" vs. pseudo-random numbers). Most cryptographic applications require random numbers, for example key generation, nonces, and salts in certain signature schemes (including ECDSA and RSASSA-PSS). The "quality" of the randomness required for these applications varies. For example, creating a nonce in some protocols needs only uniqueness. On the other hand, the generation of a master key requires a higher quality, such as more entropy. And in the case of one-time pads, the information-theoretic guarantee of perfect secrecy only holds if the key material comes from a true random source with high entropy, and thus any kind of pseudorandom number genera ...
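As a brief usage illustration, Python's standard `secrets` module exposes an OS-backed CSPRNG suitable for the kinds of values listed above; the sizes chosen here are example values only, not recommendations:

```python
# Drawing keys, nonces, and salts from an OS-backed CSPRNG via the standard
# library `secrets` module (sizes below are illustrative, not prescriptive).
import secrets

key = secrets.token_bytes(32)    # 32 random bytes, e.g. a 256-bit symmetric key
nonce = secrets.token_hex(16)    # 16 random bytes, returned hex-encoded
salt = secrets.token_bytes(16)   # a per-password or per-signature salt

print(len(key), nonce, len(salt))
```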


Johan Håstad
Johan Torkel Håstad (born 19 November 1960) is a Swedish theoretical computer scientist most known for his work on computational complexity theory. He was the recipient of the Gödel Prize in 1994 and 2011 and the ACM Doctoral Dissertation Award in 1986, among other prizes. He has been a professor in theoretical computer science at KTH Royal Institute of Technology in Stockholm, Sweden since 1988, becoming a full professor in 1992. He has been a member of the Royal Swedish Academy of Sciences since 2001. He received his B.S. in Mathematics at Stockholm University in 1981, his M.S. in Mathematics at Uppsala University in 1984 and his Ph.D. in Mathematics from MIT in 1986. Håstad's thesis and 1994 Gödel Prize concerned his work on lower bounds on the size of constant-depth Boolean circuits for the parity function. After Andrew Yao proved that such circuits require exponential size, Håstad proved nearly optimal lower bounds on the necessary size through his switching lemma, which ...



Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual informat ...
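The coin-versus-die comparison can be made quantitative: for a uniform distribution over N equally likely outcomes the entropy is log2 N bits, so

```latex
% Entropy of a fair coin flip versus a fair die roll:
\[
  H_{\text{coin}} = \log_2 2 = 1\ \text{bit},
  \qquad
  H_{\text{die}} = \log_2 6 \approx 2.585\ \text{bits}.
\]
```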



Vector Space
In mathematics and physics, a vector space (also called a linear space) is a set whose elements, often called ''vectors'', may be added together and multiplied ("scaled") by numbers called '' scalars''. Scalars are often real numbers, but can be complex numbers or, more generally, elements of any field. The operations of vector addition and scalar multiplication must satisfy certain requirements, called ''vector axioms''. The terms real vector space and complex vector space are often used to specify the nature of the scalars: real coordinate space or complex coordinate space. Vector spaces generalize Euclidean vectors, which allow modeling of physical quantities, such as forces and velocity, that have not only a magnitude, but also a direction. The concept of vector spaces is fundamental for linear algebra, together with the concept of matrix, which allows computing in vector spaces. This provides a concise and synthetic way for manipulating and studying systems of linear eq ...
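For instance, the real coordinate space mentioned above is a vector space under componentwise addition and scalar multiplication (shown here for the plane, with real scalars):

```latex
% R^2 as a real vector space: componentwise addition and scaling.
\[
  (x_1, y_1) + (x_2, y_2) = (x_1 + x_2,\ y_1 + y_2),
  \qquad
  \lambda (x, y) = (\lambda x,\ \lambda y),
  \quad \lambda \in \mathbb{R}.
\]
```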



Inner Product Space
In mathematics, an inner product space (or, rarely, a Hausdorff pre-Hilbert space) is a real vector space or a complex vector space with an operation called an inner product. The inner product of two vectors in the space is a scalar, often denoted with angle brackets such as in \langle a, b \rangle. Inner products allow formal definitions of intuitive geometric notions, such as lengths, angles, and orthogonality (zero inner product) of vectors. Inner product spaces generalize Euclidean vector spaces, in which the inner product is the dot product or ''scalar product'' of Cartesian coordinates. Inner product spaces of infinite dimension are widely used in functional analysis. Inner product spaces over the field of complex numbers are sometimes referred to as unitary spaces. The first usage of the concept of a vector space with an inner product is due to Giuseppe Peano, in 1898. An inner product naturally induces an associated norm (denoted |x| and |y|); so, ...
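As a concrete instance of these remarks, on real n-dimensional coordinate space the dot product is an inner product, and it induces the usual Euclidean norm:

```latex
% Dot product on R^n and the norm it induces:
\[
  \langle x, y \rangle = \sum_{i=1}^{n} x_i y_i,
  \qquad
  \|x\| = \sqrt{\langle x, x \rangle} = \Bigl( \sum_{i=1}^{n} x_i^2 \Bigr)^{1/2}.
\]
```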


Oded Goldreich
Oded Goldreich (Hebrew: עודד גולדרייך; b. 1957) is a professor of Computer Science at the Faculty of Mathematics and Computer Science of the Weizmann Institute of Science, Israel. His research interests lie within the theory of computation and are, specifically, the interplay of randomness and computation, the foundations of cryptography, and computational complexity theory. He won the Knuth Prize in 2017 and was selected in 2021 to receive the Israel Prize in mathematics. Biography Goldreich received a DSc in Computer Science at Technion in 1983 under Shimon Even. Goldreich has contributed to the development of pseudorandomness, zero knowledge proofs, secure function evaluation, property testing (Oded Goldreich, Shafi Goldwasser, and Dana Ron, "Property Testing and its Connection to Learning and Approximation", ''Journal of the ACM'', 1998, pages 653–750), and other areas in cryptography and computational complexity. Goldreich has also authored several books including: ''Fou ...