Computational Indistinguishability
In computational complexity and cryptography, two families of distributions are computationally indistinguishable if no efficient algorithm can tell the difference between them except with negligible probability.

Formal definition

Let \{D_n\}_{n\in\N} and \{E_n\}_{n\in\N} be two distribution ensembles indexed by a security parameter ''n'' (which usually refers to the length of the input); we say they are computationally indistinguishable if for any non-uniform probabilistic polynomial time algorithm ''A'', the following quantity is a negligible function in ''n'':

: \delta(n) = \left| \Pr_{x \gets D_n}[A(x) = 1] - \Pr_{x \gets E_n}[A(x) = 1] \right|,

denoted D_n \approx E_n. In other words, the behavior of every efficient algorithm ''A'' does not significantly change when it is given samples according to D_n or E_n, in the limit as n \to \infty. Another interpretation of computational indistinguishability is that polynomial-time algorithms actively trying to distinguish between the two ensembles cannot do so: any such algorithm will perform only negligibly better than if one were to just guess.
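As a minimal illustrative sketch of the quantity \delta(n), the Python snippet below estimates the advantage of one fixed toy distinguisher against two hypothetical ensembles. The distributions and the distinguisher are assumptions chosen for illustration; they are statistically far apart so the gap is visible, whereas for genuinely computationally indistinguishable ensembles (e.g., pseudorandom versus truly random strings) no efficient distinguisher would achieve a non-negligible gap.

import random

def sample_D(n):
    # Hypothetical ensemble D_n: n independent fair bits.
    return [random.randint(0, 1) for _ in range(n)]

def sample_E(n):
    # Hypothetical ensemble E_n: n independent bits with Pr[1] = 1/2 + 1/n.
    p = 0.5 + 1.0 / n
    return [1 if random.random() < p else 0 for _ in range(n)]

def A(x):
    # Toy distinguisher: output 1 iff more than half of the bits are 1.
    return 1 if sum(x) > len(x) / 2 else 0

def advantage(n, trials=20000):
    # Monte Carlo estimate of delta(n) = |Pr[A(D_n)=1] - Pr[A(E_n)=1]|.
    p_d = sum(A(sample_D(n)) for _ in range(trials)) / trials
    p_e = sum(A(sample_E(n)) for _ in range(trials)) / trials
    return abs(p_d - p_e)

print(advantage(100))  # a clearly non-negligible gap for this toy pair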


Analysis Of Algorithms
In computer science, the analysis of algorithms is the process of finding the computational complexity of algorithms—the amount of time, storage, or other resources needed to execute them. Usually, this involves determining a function that relates the size of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity). An algorithm is said to be efficient when this function's values are small, or grow slowly compared to a growth in the size of the input. Different inputs of the same size may cause the algorithm to have different behavior, so best, worst and average case descriptions might all be of practical interest. When not otherwise specified, the function describing the performance of an algorithm is usually an upper bound, determined from the worst case inputs to the algorithm. The term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important part of a broader ...
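The best/worst-case distinction can be made concrete with a small sketch; the example below (a hypothetical instrumented linear search, not taken from the article) counts the comparisons performed on different inputs of the same size.

def linear_search(items, target):
    # Return (index of target or -1, number of comparisons performed).
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

data = list(range(100))
print(linear_search(data, 0))    # best case: 1 comparison
print(linear_search(data, 99))   # worst case: 100 comparisons
print(linear_search(data, -1))   # absent target: also 100 comparisons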


Cryptography
Cryptography, or cryptology (from Ancient Greek ''kryptós'', "hidden, secret", and ''graphein'', "to write", or ''-logia'', "study", respectively), is the practice and study of techniques for secure communication in the presence of adversarial behavior. More generally, cryptography is about constructing and analyzing protocols that prevent third parties or the public from reading private messages. Modern cryptography exists at the intersection of the disciplines of mathematics, computer science, information security, electrical engineering, digital signal processing, physics, and others. Core concepts related to information security (data confidentiality, data integrity, authentication, and non-repudiation) are also central to cryptography. Practical applications of cryptography include electronic commerce, chip-based payment cards, digital currencies, computer passwords, and military communications. Cryptography prior to the modern age was effectively synonymo ...


Distribution Ensemble
In cryptography, a distribution ensemble or probability ensemble is a family of distributions or random variables X = \{X_i\}_{i \in I} where I is a (countable) index set and each X_i is a random variable, or probability distribution. Often I=\N and it is required that each X_n have a certain property for ''n'' sufficiently large. For example, a uniform ensemble U = \{U_n\}_{n\in\N} is a distribution ensemble where each U_n is uniformly distributed over strings of length ''n''. In fact, many applications of probability ensembles implicitly assume that the probability spaces for the random variables all coincide, so every probability ensemble is also a stochastic process. See also * Provable security * Statistically close * Pseudorandom ensemble * Computational indistinguishability
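As an illustrative sketch (the representation is an assumption, not part of the definition), a distribution ensemble can be modeled as a family of samplers indexed by ''n''; here is the uniform ensemble U_n over bit strings of length ''n'':

import random

def sample_uniform(n):
    # U_n: uniformly distributed over bit strings of length n.
    return ''.join(random.choice('01') for _ in range(n))

# Draw one sample from each of U_1, ..., U_5.
for n in range(1, 6):
    print(n, sample_uniform(n))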


Security Parameter
In cryptography, a security parameter is a way of measuring how "hard" it is for an adversary to break a cryptographic scheme. There are two main types of security parameter: ''computational'' and ''statistical'', often denoted by \kappa and \lambda, respectively. Roughly speaking, the computational security parameter is a measure of the input size of the computational problem on which the cryptographic scheme is based, which determines its computational complexity, whereas the statistical security parameter is a measure of the probability with which an adversary can break the scheme (whatever that means for the protocol). Security parameters are usually expressed in unary representation - i.e., \kappa is expressed as a string of \kappa 1s, \kappa=1\cdots 1, conventionally written as 1^\kappa - so that the time complexity of the cryptographic algorithm is polynomial in the size of the input. Computational security The security of cryptographic primitives relies on the hardne ...
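The unary convention can be illustrated with a hypothetical key generator (the names and key-length choice are assumptions): because it receives 1^\kappa as input, a running time polynomial in \kappa is polynomial in the length of its input.

import secrets

def keygen(unary_kappa: str) -> bytes:
    # Expects the security parameter in unary, e.g. "1111..." (kappa ones).
    assert set(unary_kappa) <= {"1"}, "security parameter must be given in unary"
    kappa = len(unary_kappa)
    # Illustrative: output a uniformly random key of kappa bits.
    return secrets.token_bytes((kappa + 7) // 8)

key = keygen("1" * 128)  # kappa = 128
print(len(key), "bytes")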


Uniformity (complexity)
In theoretical computer science, circuit complexity is a branch of computational complexity theory in which Boolean functions are classified according to the size or depth of the Boolean circuits that compute them. A related notion is the circuit complexity of a recursive language that is decided by a uniform family of circuits C_1, C_2, \ldots (see below). Proving lower bounds on the size of Boolean circuits computing explicit Boolean functions is a popular approach to separating complexity classes. For example, a prominent circuit class P/poly consists of Boolean functions computable by circuits of polynomial size. Proving that \mathsf{NP} \not\subseteq \mathsf{P/poly} would separate P and NP (see below). Complexity classes defined in terms of Boolean circuits include AC0, AC, TC0, NC1, NC, and P/poly. Size and depth A Boolean circuit with n input bits is a directed acyclic graph in which every node (usually called a ''gate'' in this context) is either an input node of in-degree 0 labelle ...
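To make the DAG picture concrete, here is a small sketch (the encoding is a hypothetical choice) that evaluates a Boolean circuit and computes its depth; the example circuit computes XOR of two input bits.

def evaluate(circuit, inputs):
    # Gates are listed in topological order as (op, argument list).
    values = {}
    for name, (op, args) in circuit.items():
        if op == 'in':
            values[name] = inputs[args[0]]
        elif op == 'not':
            values[name] = not values[args[0]]
        elif op == 'and':
            values[name] = values[args[0]] and values[args[1]]
        elif op == 'or':
            values[name] = values[args[0]] or values[args[1]]
    return values

def depth(circuit, gate):
    # Depth = length of the longest path from an input node to this gate.
    op, args = circuit[gate]
    if op == 'in':
        return 0
    return 1 + max(depth(circuit, a) for a in args)

# XOR(x, y) = (x AND NOT y) OR (NOT x AND y)
xor = {
    'x':   ('in',  [0]),
    'y':   ('in',  [1]),
    'nx':  ('not', ['x']),
    'ny':  ('not', ['y']),
    'a1':  ('and', ['x', 'ny']),
    'a2':  ('and', ['nx', 'y']),
    'out': ('or',  ['a1', 'a2']),
}
print(evaluate(xor, [True, False])['out'])  # True
print(depth(xor, 'out'))                    # 3 (size: 5 non-input gates)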


Polynomial Time
In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor. Since an algorithm's running time may vary among different inputs of the same size, one commonly considers the worst-case time complexity, which is the maximum amount of time required for inputs of a given size. Less common, and usually specified explicitly, is the average-case complexity, which is the average of the time taken on inputs of a given size (this makes sense because there are only a finite number of possible inputs of a given size). In both cases, the time complexity is generally expresse ...
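Worst-case time complexity as "the maximum over all inputs of a given size" can be checked directly for tiny sizes; this illustrative sketch (assumptions mine: comparison counts stand in for elementary operations) exhausts all permutations of each size for insertion sort.

from itertools import permutations

def insertion_sort_comparisons(seq):
    # Count the element comparisons performed by insertion sort.
    a = list(seq)
    count = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            count += 1                      # one comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return count

for n in range(2, 7):
    worst = max(insertion_sort_comparisons(p) for p in permutations(range(n)))
    print(n, worst)  # n*(n-1)/2, i.e. quadratic worst-case growth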


Algorithm
In mathematics and computer science, an algorithm is a finite sequence of rigorous instructions, typically used to solve a class of specific computational problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can perform automated deductions (referred to as automated reasoning) and use mathematical and logical tests to divert the code execution through various routes (referred to as automated decision-making). Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus". In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result. As an effective method, an algorithm ca ...
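A classic concrete instance of "a finite sequence of rigorous instructions" is Euclid's algorithm for the greatest common divisor, sketched here purely for illustration:

def gcd(a: int, b: int) -> int:
    # Euclid's algorithm: terminates because the second argument
    # strictly decreases on every iteration.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # 21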




Negligible Function (cryptography)
In mathematics, a negligible function is a function \mu : \N \to \R such that for every positive integer ''c'' there exists an integer N_c such that for all x > N_c:

: |\mu(x)| < \frac{1}{x^c}.

Equivalently, we may also use the following definition: a function \mu : \N \to \R is negligible if for every positive polynomial \operatorname{poly}(\cdot) there exists an integer N_{\operatorname{poly}} > 0 such that for all x > N_{\operatorname{poly}}:

: |\mu(x)| < \frac{1}{\operatorname{poly}(x)}.

The concept traces back to the foundations of mathematical analysis: a function f is continuous at x = x_0 if for every \varepsilon > 0 there exists a positive number \delta > 0 such that |x - x_0| < \delta implies |f(x) - f(x_0)| < \varepsilon, and a continuous function \mu is infinitesimal (as x goes to infinity) if for every \varepsilon > 0 there exists N_\varepsilon such that for all x > N_\varepsilon, |\mu(x)| < \varepsilon. The definition of negligibility is obtained by replacing the constants \varepsilon > 0 by the functions 1/x^c where c > 0, or by 1/\operatorname{poly}(x) where \operatorname{poly}(x) is a positive polynomial. This leads to the definitions of negligible functions given at the top of this article. Since the constants \varepsilon > 0 can be expressed as 1/\operatorname{poly}(x) with a constant polynomial, this shows that negligible functions are a subset of the infinitesimal functions. Use in cryptography In complexity-based modern cryptography, a security scheme is ''provably secure'' if the probability of security failure (e.g., inverting a one-way function, distinguishing cryptographically strong pseudorandom bits from truly ran ...
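A small numeric illustration (the sampling bound is an arbitrary choice): \mu(x) = 2^{-x} eventually stays below 1/x^c for every fixed ''c'', while \mu(x) = 1/x^2 fails to do so once c \ge 2, which is why the former is negligible and the latter is not.

def last_violation(mu, c, limit=10000):
    # Heuristic check (not a proof): the largest sampled x with mu(x) >= 1/x**c.
    last = None
    for x in range(1, limit + 1):
        if mu(x) >= 1.0 / x**c:
            last = x
    return last

mu_negligible = lambda x: 2.0 ** (-x)   # drops faster than any 1/x^c
mu_inverse_poly = lambda x: 1.0 / x**2  # not negligible

for c in (1, 2, 3, 5):
    # Negligible mu: violations stop at a finite point.
    # Inverse-polynomial mu: violations persist up to the sampling limit for c >= 2.
    print(c, last_violation(mu_negligible, c), last_violation(mu_inverse_poly, c))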


Oded Goldreich
Oded Goldreich (Hebrew: עודד גולדרייך; b. 1957) is a professor of Computer Science at the Faculty of Mathematics and Computer Science of the Weizmann Institute of Science, Israel. His research interests lie within the theory of computation and are, specifically, the interplay of randomness and computation, the foundations of cryptography, and computational complexity theory. He won the Knuth Prize in 2017 and was selected in 2021 to receive the Israel Prize in mathematics. Biography Goldreich received a DSc in Computer Science at the Technion in 1983 under Shimon Even. Goldreich has contributed to the development of pseudorandomness, zero knowledge proofs, secure function evaluation, property testing (Oded Goldreich, Shafi Goldwasser, and Dana Ron, "Property Testing and its Connection to Learning and Approximation", ''Journal of the ACM'', 1998, pp. 653-750), and other areas in cryptography and computational complexity. Goldreich has also authored several books including: ''Fou ...


Random Oracle
In cryptography, a random oracle is an oracle (a theoretical black box) that responds to every ''unique query'' with a (truly) random response chosen uniformly from its output domain. If a query is repeated, it responds the same way every time that query is submitted. Stated differently, a random oracle is a mathematical function chosen uniformly at random, that is, a function mapping each possible query to a (fixed) random response from its output domain. Random oracles as a mathematical abstraction were first used in rigorous cryptographic proofs in the 1993 publication by Mihir Bellare and Phillip Rogaway. They are typically used when the proof cannot be carried out using weaker assumptions on the cryptographic hash function. A system that is proven secure when every hash function is replaced by a random oracle is described as being secure in the random oracle model, as opposed to secure in the standard model of cryptography. Applications Random oracles are typicall ...
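In proofs, the "same answer for repeated queries" behavior is commonly realized by lazy sampling; the sketch below (the class name and output length are assumptions) memoizes a fresh uniform response for each new query.

import secrets

class RandomOracle:
    # Lazily sample a uniform response for each new query and memoize it,
    # so repeated queries always receive the same answer.
    def __init__(self, out_len_bytes=32):
        self.out_len_bytes = out_len_bytes
        self.table = {}

    def query(self, x: bytes) -> bytes:
        if x not in self.table:
            self.table[x] = secrets.token_bytes(self.out_len_bytes)
        return self.table[x]

H = RandomOracle()
assert H.query(b"hello") == H.query(b"hello")  # consistent on repeats
print(H.query(b"hello").hex())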


Yehuda Lindell
Yehuda Lindell (born 24 February 1971) is a professor in the Department of Computer Science at Bar-Ilan University, where he conducts research on cryptography with a focus on the theory of secure computation and its application in practice. Lindell currently leads the cryptography team at Coinbase. Education and academic positions Lindell received BSc and MSc degrees in computer science from Bar-Ilan University. He then obtained a PhD in computer science from the Weizmann Institute of Science in 2002. Lindell received a Raviv Fellowship and spent two years at IBM's cryptography research group at the T.J. Watson Research Center. In 2004, he returned to Israel to take up an academic position at Bar-Ilan University. Lindell's work on secure computation was recognized by the award of an ERC starting grant in 2009 and an ERC consolidator grant in 2014. Lindell was appointed as an IACR Fellow in 2021. Industry experience Lindell worked from 2004 to 2014 as a permanent cryptographic ...




Silvio Micali
Silvio Micali (born October 13, 1954) is an Italian computer scientist, professor at the Massachusetts Institute of Technology and the founder of Algorand. Micali's research centers on cryptography and information security. In 2012, he received the Turing Award for his work in cryptography. Personal life Micali graduated in mathematics at La Sapienza University of Rome in 1978 and earned a PhD degree in computer science from the University of California, Berkeley in 1982, for research supervised by Manuel Blum. Micali has been on the faculty of MIT's Electrical Engineering and Computer Science Department since 1983. His research interests are cryptography, zero knowledge, pseudorandom generation, secure protocols, and mechanism design. Career Micali is best known for some of his fundamental early work on public-key cryptosystems, pseudorandom functions, digital signatures, oblivious transfer, and secure multiparty computation, and is one of the co-inventors of zero-knowledge proof ...