Discrete Logarithm
In mathematics, for given real numbers ''a'' and ''b'', the logarithm log''b'' ''a'' is a number ''x'' such that ''b''''x'' = ''a''. Analogously, in any group ''G'', powers ''b''''k'' can be defined for all integers ''k'', and the discrete logarithm log''b'' ''a'' is an integer ''k'' such that ''b''''k'' = ''a''. In number theory, the more commonly used term is index: we can write ''x'' = ind''r'' ''a'' (mod ''m'') (read "the index of ''a'' to the base ''r'' modulo ''m''") for ''r''''x'' ≡ ''a'' (mod ''m'') if ''r'' is a primitive root of ''m'' and gcd(''a'',''m'') = 1. Discrete logarithms are quickly computable in a few special cases. However, no efficient method is known for computing them in general. Several important algorithms in public-key cryptography, such as ElGamal, base their security on the assumption that the discrete logarithm problem over carefully chosen groups has no efficient solution. Definition Let ''G'' be any group. Denote its group operation by mu ...
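The definition lends itself to a small computational illustration. Below is a minimal sketch of solving a discrete logarithm in the multiplicative group of integers modulo a prime with the baby-step giant-step method; the modulus 17, base 3, and target 13 are illustrative values, not taken from the excerpt above.

```python
# Baby-step giant-step: find x with base**x ≡ target (mod modulus).
# Illustrative sketch; assumes gcd(base, modulus) == 1 so the modular inverse exists.
from math import isqrt

def discrete_log(base, target, modulus):
    """Return x with pow(base, x, modulus) == target % modulus, or None if none exists."""
    m = isqrt(modulus) + 1
    # Baby steps: remember base**j (mod modulus) for j = 0 .. m-1.
    table = {pow(base, j, modulus): j for j in range(m)}
    # Giant steps: repeatedly multiply the target by base**(-m) and look for a hit.
    step = pow(base, -m, modulus)      # modular inverse of base**m
    gamma = target % modulus
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = (gamma * step) % modulus
    return None

# Example: find x with 3**x ≡ 13 (mod 17); 3 is a primitive root modulo 17.
x = discrete_log(3, 13, 17)
print(x, pow(3, x, 17))   # 4 13
```

This runs in roughly the square root of the group order, which is still exponential in the bit length of the modulus; that gap is what the cryptographic assumption mentioned above relies on.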

Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in case of abstraction from nature, some basic properties that are considered true starting points of ...

Modular Arithmetic
In mathematics, modular arithmetic is a system of arithmetic for integers, where numbers "wrap around" when reaching a certain value, called the modulus. The modern approach to modular arithmetic was developed by Carl Friedrich Gauss in his book ''Disquisitiones Arithmeticae'', published in 1801. A familiar use of modular arithmetic is in the 12-hour clock, in which the day is divided into two 12-hour periods. If the time is 7:00 now, then 8 hours later it will be 3:00. Simple addition would result in 15:00, but clocks "wrap around" every 12 hours. Because the hour number starts over at zero when it reaches 12, this is arithmetic ''modulo'' 12. In terms of the definition below, 15 is ''congruent'' to 3 modulo 12, so "15:00" on a 24-hour clock is displayed "3:00" on a 12-hour clock. Congruence Given an integer ''n'' > 1, called a modulus, two integers ''a'' and ''b'' are said to be congruent modulo ''n'', if ''n'' is a divisor of their difference (that is, if there is an integer ''k'' such that ''a'' − ''b'' = ''kn''). Congruence modulo ...
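As a quick illustration of the clock example and the congruence definition, a short sketch using Python's built-in % operator; the numbers follow the 12-hour example in the text.

```python
# Congruence: a ≡ b (mod n) exactly when n divides a - b.

def congruent(a, b, n):
    """True if a and b are congruent modulo n."""
    return (a - b) % n == 0

# Clock arithmetic: 7:00 plus 8 hours wraps around to 3:00.
print((7 + 8) % 12)          # 3
print(congruent(15, 3, 12))  # True: 15 ≡ 3 (mod 12)
print(congruent(14, 3, 12))  # False
```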

Polynomial Time
In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor. Since an algorithm's running time may vary among different inputs of the same size, one commonly considers the worst-case time complexity, which is the maximum amount of time required for inputs of a given size. Less common, and usually specified explicitly, is the average-case complexity, which is the average of the time taken on inputs of a given size (this makes sense because there are only a finite number of possible inputs of a given size). In both cases, the time complexity is generally expresse ...
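As a toy illustration of worst-case versus average-case cost, the sketch below counts the elementary comparisons made by linear search; the input size of 1000 is an arbitrary choice.

```python
# Count comparisons made by linear search to contrast worst-case and average-case cost.

def linear_search(items, target):
    """Return (index or None, number of comparisons performed)."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return None, comparisons

data = list(range(1000))

# Worst case: the target is absent, so every element is compared -> n comparisons.
_, worst = linear_search(data, -1)

# Average case over all present targets: roughly n/2 comparisons.
average = sum(linear_search(data, t)[1] for t in data) / len(data)

print(worst, average)   # 1000 500.5
```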


Exponential Time
In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor. Since an algorithm's running time may vary among different inputs of the same size, one commonly considers the worst-case time complexity, which is the maximum amount of time required for inputs of a given size. Less common, and usually specified explicitly, is the average-case complexity, which is the average of the time taken on inputs of a given size (this makes sense because there are only a finite number of possible inputs of a given size). In both cases, the time complexity is generally expresse ...


Running Time
In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor. Since an algorithm's running time may vary among different inputs of the same size, one commonly considers the worst-case time complexity, which is the maximum amount of time required for inputs of a given size. Less common, and usually specified explicitly, is the average-case complexity, which is the average of the time taken on inputs of a given size (this makes sense because there are only a finite number of possible inputs of a given size). In both cases, the time complexity i ...


Finite Group
Finite is the opposite of infinite. It may refer to: * Finite number (other) * Finite set, a set whose cardinality (number of elements) is some natural number * Finite verb, a verb form that has a subject, usually being inflected or marked for person and/or tense or aspect * "Finite", a song by Sara Groves from the album ''Invisible Empires'' See also: * Nonfinite (other): Nonfinite is the opposite of finite * a nonfinite verb is a verb that is not capable of serving as the main verb in an independent clause * a non-finite clause: In linguistics, a non-finite clause is a dependent or embedded clause that represen ...

Order (group Theory)
In mathematics, the order of a finite group is the number of its elements. If a group is not finite, one says that its order is ''infinite''. The ''order'' of an element of a group (also called period length or period) is the order of the subgroup generated by the element. If the group operation is denoted as a multiplication, the order of an element ''a'' of a group is thus the smallest positive integer ''m'' such that ''a''''m'' = ''e'', where ''e'' denotes the identity element of the group, and ''a''''m'' denotes the product of ''m'' copies of ''a''. If no such ''m'' exists, the order of ''a'' is infinite. The order of a group ''G'' is denoted by ord(''G'') or |''G''|, and the order of an element ''a'' is denoted by ord(''a'') or |''a''|, instead of \operatorname{ord}(\langle a\rangle), where the brackets denote the generated group. Lagrange's theorem states that for any subgroup ''H'' of a finite group ''G'', the order of the subgroup divides the order of the group; that is, |''H''| is a divisor of |''G''|. In particular, the order of any element ''a'' is a divisor of |''G''|. Example The symmetric group S3 has th ...
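The definition can be checked directly on a small group. The sketch below computes element orders in the multiplicative group of integers modulo 7 (an illustrative choice, not from the excerpt) and confirms that each element order divides the group order, as Lagrange's theorem requires.

```python
# Order of an element a in the multiplicative group (Z/nZ)*: the smallest m with a**m ≡ 1 (mod n).
from math import gcd

def element_order(a, n):
    """Smallest positive m with pow(a, m, n) == 1; requires gcd(a, n) == 1."""
    if gcd(a, n) != 1:
        raise ValueError("a must be invertible modulo n")
    m, power = 1, a % n
    while power != 1:
        power = (power * a) % n
        m += 1
    return m

# (Z/7Z)* has order 6; every element order divides 6.
group_order = 6
for a in range(1, 7):
    print(a, element_order(a, 7), group_order % element_order(a, 7) == 0)
```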


Group Isomorphism
In abstract algebra, a group isomorphism is a function between two groups that sets up a one-to-one correspondence between the elements of the groups in a way that respects the given group operations. If there exists an isomorphism between two groups, then the groups are called isomorphic. From the standpoint of group theory, isomorphic groups have the same properties and need not be distinguished. Definition and notation Given two groups (G, *) and (H, \odot), a ''group isomorphism'' from (G, *) to (H, \odot) is a bijective group homomorphism from G to H. Spelled out, this means that a group isomorphism is a bijective function f : G \to H such that for all u and v in G it holds that f(u * v) = f(u) \odot f(v). The two groups (G, *) and (H, \odot) are isomorphic if there exists an isomorphism from one to the other. This is written (G, *) \cong (H, \odot). Often shorter and simpler notations can be used. When the relevant group operations are understood, they are omitted and one ...
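For small finite groups, the definition can be verified by brute force. The sketch below checks that the map k ↦ 3**k mod 7 is an isomorphism from (Z/6Z, +) onto ((Z/7Z)*, ×); the choice of groups and map is an illustrative assumption, not drawn from the excerpt.

```python
# Verify bijectivity and the homomorphism property f(u * v) = f(u) ⊙ f(v) exhaustively.
from itertools import product

G = range(6)                 # (Z/6Z, + mod 6)
H = [1, 2, 3, 4, 5, 6]       # ((Z/7Z)*, * mod 7)

def f(k):
    return pow(3, k, 7)      # 3 is a primitive root modulo 7

# Bijective: the image has no repeats and covers all of H.
image = [f(k) for k in G]
assert sorted(image) == sorted(H)

# Homomorphism: f(u + v) = f(u) * f(v) for every pair of elements.
assert all(f((u + v) % 6) == (f(u) * f(v)) % 7 for u, v in product(G, repeat=2))

print("f is a group isomorphism from (Z/6Z, +) to ((Z/7Z)*, *)")
```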

Generating Set Of A Group
In abstract algebra, a generating set of a group is a subset of the group set such that every element of the group can be expressed as a combination (under the group operation) of finitely many elements of the subset and their inverses. In other words, if ''S'' is a subset of a group ''G'', then ⟨''S''⟩, the ''subgroup generated by S'', is the smallest subgroup of ''G'' containing every element of ''S'', which is equal to the intersection over all subgroups containing the elements of ''S''; equivalently, ⟨''S''⟩ is the subgroup of all elements of ''G'' that can be expressed as the finite product of elements in ''S'' and their inverses. (Note that inverses are only needed if the group is infinite; in a finite group, the inverse of an element can be expressed as a power of that element.) If ''G'' = ⟨''S''⟩, then we say that ''S'' ''generates'' ''G'', and the elements in ''S'' are called ''generators'' or ''group generators''. If ''S'' is the empty set, then ⟨''S''⟩ is the trivial group {''e''}, since we consider th ...
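In a finite group the generated subgroup can be computed as a plain closure under the group operation, with no inverses needed, as the excerpt notes. The sketch below does this for permutations; the generators (a transposition and a 3-cycle in S3) are illustrative choices.

```python
# Compute <S> for permutations of {0, 1, 2} by closing the generators under composition.

def compose(p, q):
    """Permutation composition (apply q first, then p); permutations are tuples."""
    return tuple(p[q[i]] for i in range(len(p)))

def generated_subgroup(generators):
    """Smallest set containing the identity and closed under left multiplication by generators."""
    elements = {tuple(range(len(generators[0])))}   # start from the identity
    frontier = list(elements)
    while frontier:
        current = frontier.pop()
        for g in generators:
            new = compose(g, current)
            if new not in elements:
                elements.add(new)
                frontier.append(new)
    return elements

swap = (1, 0, 2)      # transposition (0 1)
cycle = (1, 2, 0)     # 3-cycle (0 1 2)
print(len(generated_subgroup([swap, cycle])))   # 6: the whole of S3
print(len(generated_subgroup([cycle])))         # 3: the cyclic subgroup generated by the 3-cycle
```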

Subgroup
In group theory, a branch of mathematics, given a group ''G'' under a binary operation ∗, a subset ''H'' of ''G'' is called a subgroup of ''G'' if ''H'' also forms a group under the operation ∗. More precisely, ''H'' is a subgroup of ''G'' if the restriction of ∗ to ''H'' × ''H'' is a group operation on ''H''. This is often denoted ''H'' ≤ ''G'', read as "''H'' is a subgroup of ''G''". The trivial subgroup of any group is the subgroup consisting of just the identity element. A proper subgroup of a group ''G'' is a subgroup ''H'' which is a proper subset of ''G'' (that is, ''H'' ≠ ''G''). This is often represented notationally by ''H'' < ''G'', read as "''H'' is a proper subgroup of ''G''". Some authors also exclude the trivial group from being proper (that is, ''H'' ≠ {''e''}). If ''H'' is a subgroup of ''G'', then ''G'' is sometimes called an overgroup of ''H''. The same definitions apply more generally when ''G'' is an arbitrary semigroup, but this article will only deal with subgroups of groups. Subgroup tests Suppose th ...
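For a finite nonempty subset of a group, closure under the group operation is already enough to be a subgroup. The sketch below applies this test in (Z/8Z, +); the subsets are illustrative.

```python
# Finite subgroup test: nonempty subset of the group, closed under the operation.

def is_subgroup(subset, group, op):
    """True if `subset` is a nonempty subset of `group` closed under `op` (finite case)."""
    s = set(subset)
    if not s or not s <= set(group):
        return False
    return all(op(a, b) in s for a in s for b in s)

Z8 = range(8)
add_mod8 = lambda a, b: (a + b) % 8

print(is_subgroup({0, 2, 4, 6}, Z8, add_mod8))  # True: the even residues
print(is_subgroup({0, 4}, Z8, add_mod8))        # True
print(is_subgroup({0, 2, 3}, Z8, add_mod8))     # False: 2 + 3 = 5 is not in the subset
```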

Surjection
In mathematics, a surjective function (also known as surjection, or onto function) is a function ''f'' such that every element ''y'' of the function's codomain can be mapped from some element ''x'' of its domain so that ''f''(''x'') = ''y''. In other words, every element of the function's codomain is the image of at least one element of its domain. It is not required that ''x'' be unique; the function ''f'' may map one or more elements of the domain to the same element of the codomain. The term ''surjective'' and the related terms ''injective'' and ''bijective'' were introduced by Nicolas Bourbaki, a group of mainly French 20th-century mathematicians who, under this pseudonym, wrote a series of books presenting an exposition of modern advanced mathematics, beginning in 1935. The French word '' sur'' means ''over'' or ''above'', and relates to the fact that the image of the domain of a surjective function completely covers the function's codomain. Any function induces a surjection by restricting its codomain to the image of its domain. Every surjective function has a right inverse assuming the axiom ...
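For finite sets, the definition reduces to checking that the image equals the codomain. A tiny sketch, with illustrative sets and functions:

```python
# A function between finite sets is surjective exactly when its image is the whole codomain.

def is_surjective(f, domain, codomain):
    return {f(x) for x in domain} == set(codomain)

domain = range(6)
codomain = range(3)

print(is_surjective(lambda x: x % 3, domain, codomain))  # True: every residue 0, 1, 2 is hit
print(is_surjective(lambda x: 0, domain, codomain))      # False: 1 and 2 are never reached
```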


Group Homomorphism
In mathematics, given two groups, (''G'', ∗) and (''H'', ·), a group homomorphism from (''G'', ∗) to (''H'', ·) is a function ''h'' : ''G'' → ''H'' such that for all ''u'' and ''v'' in ''G'' it holds that : h(u*v) = h(u) \cdot h(v) where the group operation on the left side of the equation is that of ''G'' and on the right side that of ''H''. From this property, one can deduce that ''h'' maps the identity element ''eG'' of ''G'' to the identity element ''eH'' of ''H'', : h(e_G) = e_H and it also maps inverses to inverses in the sense that : h\left(u^{-1}\right) = h(u)^{-1}. Hence one can say that ''h'' "is compatible with the group structure". Older notations for the homomorphism ''h''(''x'') may be ''x''_''h'' or ''x''^''h'', though this may be confused as an index or a general subscript. In automata theory, sometimes homomorphisms are written to the right of their arguments without parentheses, so that ''h''(''x'') becomes simply xh. In areas of mathematics where one ...
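The defining property and its consequences (identity to identity, inverses to inverses) can be checked exhaustively on small groups. The map h(k) = k mod 4 from (Z/12Z, +) to (Z/4Z, +) used below is an illustrative assumption, not drawn from the excerpt.

```python
# Exhaustively verify the homomorphism property for h(k) = k mod 4 on Z/12Z -> Z/4Z.
from itertools import product

G = range(12)                 # (Z/12Z, + mod 12)
h = lambda k: k % 4           # target group is (Z/4Z, + mod 4)

# h(u + v) = h(u) + h(v) for all u, v in G.
assert all(h((u + v) % 12) == (h(u) + h(v)) % 4 for u, v in product(G, repeat=2))

# The identity maps to the identity, and inverses map to inverses.
assert h(0) == 0
assert all(h((-u) % 12) == (-h(u)) % 4 for u in G)

print("h is a group homomorphism from Z/12Z to Z/4Z")
```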