An important aspect in the study of
elliptic curves is devising effective ways of counting points on the curve. There have been several approaches to do so, and the
algorithms devised have proved to be useful tools in the study of various fields such as
number theory, and more recently in
cryptography and Digital Signature Authentication (See
elliptic curve cryptography and
elliptic curve DSA). While in number theory they have important consequences in the solving of Diophantine equations, in cryptography they enable us to make effective use of the difficulty of the
discrete logarithm problem (DLP) for the group <math>E(\mathbb{F}_q)</math> of points of an elliptic curve over a
finite field <math>\mathbb{F}_q</math>, where ''q'' = ''p''<sup>''k''</sup> and ''p'' is a prime. The DLP, as it has come to be known, is a widely used approach to
public key cryptography, and the difficulty in solving this problem determines the
level of security
of the cryptosystem. This article covers algorithms to count points on elliptic curves over fields of large characteristic, in particular ''p'' > 3. For curves over fields of small characteristic more efficient algorithms based on ''p''-adic methods exist.
Approaches to counting points on elliptic curves
There are several approaches to the problem. Beginning with the naive approach, we trace the developments up to Schoof's definitive work on the subject, while also listing the improvements to Schoof's algorithm made by Elkies (1990) and Atkin (1992).
Several algorithms make use of the fact that groups of the form <math>E(\mathbb{F}_q)</math> are subject to an important theorem due to Hasse, which bounds the number of points to be considered.
Hasse's theorem states that if ''E'' is an elliptic curve over the finite field <math>\mathbb{F}_q</math>, then the cardinality
of <math>E(\mathbb{F}_q)</math> satisfies
:<math>\left| \, |E(\mathbb{F}_q)| - (q+1) \, \right| \le 2\sqrt{q}.</math>
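For instance, for any elliptic curve over <math>\mathbb{F}_5</math> the bound says that <math>|E(\mathbb{F}_5)|</math> differs from <math>5 + 1 = 6</math> by at most <math>2\sqrt{5} \approx 4.47</math>, so the curve has between 2 and 10 points; the curve used in the example below has exactly 9.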
Naive approach
The naive approach to counting points, which is the least sophisticated, involves running through all the elements of the field <math>\mathbb{F}_q</math> and testing which ones satisfy the Weierstrass form of the elliptic curve
:<math>y^2 = x^3 + Ax + B.</math>
Example
Let ''E'' be the curve ''y''<sup>2</sup> = ''x''<sup>3</sup> + ''x'' + 1 over <math>\mathbb{F}_5</math>. To count points on ''E'', we make a list of the possible values of ''x'', then of the quadratic residues ''x''<sup>2</sup> mod 5 (for lookup purposes only), then of ''x''<sup>3</sup> + ''x'' + 1 mod 5, then of the values of ''y'' satisfying ''y''<sup>2</sup> = ''x''<sup>3</sup> + ''x'' + 1 mod 5. This yields the points on ''E''.

{| class="wikitable"
|-
! ''x'' !! ''x''<sup>2</sup> !! ''x''<sup>3</sup> + ''x'' + 1 !! ''y'' !! Points
|-
| 0 || 0 || 1 || 1, 4 || (0, 1), (0, 4)
|-
| 1 || 1 || 3 || – || –
|-
| 2 || 4 || 1 || 1, 4 || (2, 1), (2, 4)
|-
| 3 || 4 || 1 || 1, 4 || (3, 1), (3, 4)
|-
| 4 || 1 || 4 || 2, 3 || (4, 2), (4, 3)
|}
E.g. the last row is computed as follows: if you insert ''x'' = 4 in the equation ''x''<sup>3</sup> + ''x'' + 1 mod 5 you get 4 as the result (3rd column). This result can be achieved only if ''y'' = 2 or ''y'' = 3 (the quadratic residues can be looked up in the 2nd column: 2<sup>2</sup> ≡ 3<sup>2</sup> ≡ 4 mod 5). So the points for the last row are (4, 2) and (4, 3).
Therefore, <math>E(\mathbb{F}_5)</math> has cardinality 9: the 8 points listed above and the point at infinity.
This algorithm requires running time ''O''(''q''), because all the values of <math>x \in \mathbb{F}_q</math> must be considered.
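As a purely illustrative sketch, the naive count can be written in a few lines of Python; the function name <code>naive_point_count</code> and the square-lookup table (playing the role of the quadratic-residue column above) are our own choices, not part of the standard presentation.

<syntaxhighlight lang="python">
def naive_point_count(A, B, p):
    # Count the points of y^2 = x^3 + A*x + B over F_p (p an odd prime)
    # by running through all x and looking the right-hand side up in a
    # table of squares (the quadratic-residue column of the table above).
    sqrt_table = {}
    for y in range(p):
        sqrt_table.setdefault((y * y) % p, []).append(y)
    count = 1  # start with the point at infinity
    for x in range(p):
        rhs = (x * x * x + A * x + B) % p  # right-hand side x^3 + A*x + B
        count += len(sqrt_table.get(rhs, []))
    return count

print(naive_point_count(1, 1, 5))  # 9, matching the example above
</syntaxhighlight>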
Baby-step giant-step
An improvement in running time is obtained using a different approach: we pick an element <math>P = (x, y) \in E(\mathbb{F}_q)</math> by selecting random values of <math>x</math> until <math>x^3 + Ax + B</math> is a square in <math>\mathbb{F}_q</math>, and then computing the square root of this value in order to get <math>y</math>.
Hasse's theorem tells us that <math>|E(\mathbb{F}_q)|</math> lies in the interval <math>[q + 1 - 2\sqrt{q},\, q + 1 + 2\sqrt{q}]</math>. By Lagrange's theorem the order of <math>P</math> divides <math>|E(\mathbb{F}_q)|</math>, so <math>|E(\mathbb{F}_q)| \cdot P = O</math>, where <math>O</math> denotes the point at infinity. Thus, finding a unique <math>M</math> lying in this interval and satisfying <math>MP = O</math> results in finding the cardinality of <math>E(\mathbb{F}_q)</math>. The algorithm fails if there exist two distinct integers <math>M</math> and <math>M'</math> in the interval such that <math>MP = M'P = O</math>. In such a case it usually suffices to repeat the algorithm with another randomly chosen point in <math>E(\mathbb{F}_q)</math>.
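The following Python sketch is only an illustration of the idea just described: the helper names (<code>ec_add</code>, <code>ec_mul</code>, <code>random_point</code>, <code>order_candidates</code>) are our own, we restrict to prime fields with ''p'' ≡ 3 (mod 4) so that square roots can be taken with a single exponentiation, and the search simply walks through the whole Hasse interval rather than using baby-step giant-step.

<syntaxhighlight lang="python">
import math
import random

# Points are (x, y) tuples over F_p; None stands for the point at infinity O.

def ec_add(P1, P2, A, p):
    # Affine addition on y^2 = x^3 + A*x + B over F_p.
    if P1 is None:
        return P2
    if P2 is None:
        return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                      # P2 = -P1, so the sum is O
    if P1 == P2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P, A, p):
    # Double-and-add scalar multiplication k*P.
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, A, p)
        P = ec_add(P, P, A, p)
        k >>= 1
    return R

def random_point(A, B, p):
    # Pick random x until x^3 + A*x + B is a square mod p (Euler's criterion),
    # then take a square root; assumes p % 4 == 3 so the root is
    # rhs^((p+1)/4) mod p.
    while True:
        x = random.randrange(p)
        rhs = (x * x * x + A * x + B) % p
        if rhs == 0:
            return (x, 0)
        if pow(rhs, (p - 1) // 2, p) == 1:
            return (x, pow(rhs, (p + 1) // 4, p))

def order_candidates(A, B, p):
    # All M in (a slight widening of) the Hasse interval with M*P = O for a
    # random point P.  A single candidate is |E(F_p)|; several candidates
    # mean the method failed and should be retried with another point.
    P = random_point(A, B, p)
    lo = max(1, p + 1 - 2 * math.isqrt(p) - 2)
    hi = p + 1 + 2 * math.isqrt(p) + 2
    Q = ec_mul(lo, P, A, p)              # Q = lo*P
    hits = []
    for M in range(lo, hi + 1):
        if Q is None:                    # M*P = O
            hits.append(M)
        Q = ec_add(Q, P, A, p)           # advance to (M+1)*P
    return hits
</syntaxhighlight>

For instance, the curve ''y''<sup>2</sup> = ''x''<sup>3</sup> + ''x'' + 1 over <math>\mathbb{F}_{11}</math> has 14 points, and <code>order_candidates(1, 1, 11)</code> returns <code>[14]</code> whenever the randomly chosen point happens to have full order; otherwise it returns several candidates, which is exactly the failure case described above.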
Trying all values of <math>M</math> in order to find the one that satisfies <math>MP = O</math> takes around <math>4\sqrt{q}</math> steps, since the Hasse interval contains roughly <math>4\sqrt{q}</math> integers.
However, by applying the baby-step giant-step algorithm to <math>E(\mathbb{F}_q)</math>, we are able to speed this up to around <math>4\sqrt[4]{q}</math> steps.
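To get a feeling for these bounds: for a field of size ''q'' ≈ 2<sup>256</sup>, the exhaustive search takes roughly <math>4\sqrt{q} \approx 2^{130}</math> steps, while baby-step giant-step reduces this to roughly <math>4\sqrt[4]{q} \approx 2^{66}</math> steps.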