Linked Timestamping
Linked timestamping is a type of trusted timestamping where issued time-stamps are related to each other.

Description
Linked timestamping creates time-stamp tokens which are dependent on each other, entangled in some authenticated data structure. Later modification of the issued time-stamps would invalidate this structure. The temporal order of issued time-stamps is also protected by this data structure, making backdating of the issued time-stamps impossible, even by the issuing server itself. The top of the authenticated data structure is generally "published" in some hard-to-modify and widely witnessed medium, like a printed newspaper or a public blockchain. There are no (long-term) private keys in use, avoiding PKI-related risks. Suitable candidates for the authenticated data structure include:
* Linear hash chain
* Merkle tree (binary hash tree)
* Skip list
The simplest linear hash chain-based time-stamping scheme is illustrated in the following diagram: The linking-based t ...
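As a concrete illustration of the hash-chain idea described above, here is a minimal sketch in Python. It is not taken from any particular standard or product: the field names, the use of SHA-256, and the fixed genesis value are assumptions made for the example. Each issued token binds the submitted document hash to the link value of the previous token, so modifying, removing, or reordering any earlier token breaks every later link; publishing the latest link value in a widely witnessed medium then fixes the whole history.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

class LinearHashChainTimestamper:
    """Toy linked-timestamping server built on a linear hash chain."""

    def __init__(self):
        # Fixed, publicly known starting value of the chain (an assumption).
        self.last_link = sha256(b"genesis")
        self.tokens = []

    def issue(self, document_hash: bytes) -> dict:
        """Issue a token that entangles the document hash with all earlier tokens."""
        link = sha256(self.last_link + document_hash)
        token = {
            "index": len(self.tokens),
            "document_hash": document_hash,
            "previous_link": self.last_link,
            "link": link,
        }
        self.tokens.append(token)
        self.last_link = link  # this value would be periodically "published"
        return token

def verify_chain(tokens, genesis=sha256(b"genesis")) -> bool:
    """Recompute every link; any modified or reordered token breaks the chain."""
    expected = genesis
    for token in tokens:
        if token["previous_link"] != expected:
            return False
        expected = sha256(expected + token["document_hash"])
        if token["link"] != expected:
            return False
    return True

# Usage: issue two time-stamps and check the chain.
server = LinearHashChainTimestamper()
server.issue(sha256(b"contract v1"))
server.issue(sha256(b"contract v2"))
assert verify_chain(server.tokens)
```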




Trusted Timestamping
Trusted timestamping is the process of securely keeping track of the creation and modification time of a document. Security here means that no one, not even the owner of the document, should be able to change it once it has been recorded, provided that the timestamper's integrity is never compromised. The administrative aspect involves setting up a publicly available, trusted timestamp management infrastructure to collect, process and renew timestamps.

History
The idea of timestamping information is centuries old. For example, when Robert Hooke discovered Hooke's law in 1660, he did not want to publish it yet, but wanted to be able to claim priority. So he published the anagram "ceiiinosssttuv" and later published the translation "ut tensio sic vis" (Latin for "as is the extension, so is the force"). Similarly, Galileo first published his discovery of the phases of Venus in anagram form. Sir Isaac Newton, in responding to questions from Leibniz in a letter in 1677, co ...
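A much simplified sketch of how such a service can be realized (an assumption for illustration, not a protocol stated in the text above): the client submits only a cryptographic hash of the document, and the timestamping authority binds that hash to the current time. Here an HMAC over a server-side secret stands in for the public-key signature a real authority would apply, which also means that in this toy version only the authority itself can verify the token.

```python
import hashlib
import hmac
import time

TSA_SECRET = b"demo-only secret"  # a real authority signs with a private key instead

def client_digest(document: bytes) -> str:
    """Client side: hash the document so the authority never sees its contents."""
    return hashlib.sha256(document).hexdigest()

def issue_timestamp(document_digest: str) -> dict:
    """Authority side: bind the digest to the current time and authenticate the pair."""
    issued_at = int(time.time())
    payload = f"{document_digest}|{issued_at}".encode()
    tag = hmac.new(TSA_SECRET, payload, hashlib.sha256).hexdigest()
    return {"digest": document_digest, "time": issued_at, "tag": tag}

def verify_timestamp(document: bytes, token: dict) -> bool:
    """Check that the document matches the token and that the tag is authentic."""
    digest = hashlib.sha256(document).hexdigest()
    payload = f"{digest}|{token['time']}".encode()
    expected = hmac.new(TSA_SECRET, payload, hashlib.sha256).hexdigest()
    return digest == token["digest"] and hmac.compare_digest(expected, token["tag"])

# Usage
doc = b"my document"
token = issue_timestamp(client_digest(doc))
assert verify_timestamp(doc, token)
```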



RSA (algorithm)
RSA (Rivest–Shamir–Adleman) is a public-key cryptosystem that is widely used for secure data transmission. It is also one of the oldest. The acronym "RSA" comes from the surnames of Ron Rivest, Adi Shamir and Leonard Adleman, who publicly described the algorithm in 1977. An equivalent system was developed secretly in 1973 at Government Communications Headquarters (GCHQ), the British signals intelligence agency, by the English mathematician Clifford Cocks. That system was declassified in 1997.

In a public-key cryptosystem, the encryption key is public and distinct from the decryption key, which is kept secret (private). An RSA user creates and publishes a public key based on two large prime numbers, along with an auxiliary value. The prime numbers are kept secret. Messages can be encrypted by anyone, via the public key, but can only be decoded by someone who knows the prime numbers. The security of RSA relies on the practical difficulty of factoring the product of two ...
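A toy numeric illustration of the scheme just described, with deliberately tiny primes; real RSA uses primes of at least around 1024 bits each and randomized padding such as OAEP, both of which are omitted here.

```python
from math import gcd

# Toy parameters: two small stand-ins for "large primes".
p, q = 61, 53
n = p * q                 # modulus, published as part of the public key
phi = (p - 1) * (q - 1)   # kept secret, derived from the secret primes
e = 17                    # public exponent (the "auxiliary value"); coprime to phi
assert gcd(e, phi) == 1

d = pow(e, -1, phi)       # private exponent: modular inverse of e mod phi (Python 3.8+)

def encrypt(m: int) -> int:
    """Anyone can encrypt with the public key (n, e)."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only someone who knows d (i.e. the primes) can decrypt."""
    return pow(c, d, n)

message = 42
assert decrypt(encrypt(message)) == message
```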


Request For Comments
A Request for Comments (RFC) is a publication in a series from the principal technical development and standards-setting bodies for the Internet, most prominently the Internet Engineering Task Force (IETF). An RFC is authored by individuals or groups of engineers and computer scientists in the form of a memorandum describing methods, behaviors, research, or innovations applicable to the working of the Internet and Internet-connected systems. It is submitted either for peer review or to convey new concepts, information, or, occasionally, engineering humor. The IETF adopts some of the proposals published as RFCs as Internet Standards. However, many RFCs are informational or experimental in nature and are not standards.

The RFC system was invented by Steve Crocker in 1969 to help record unofficial notes on the development of ARPANET. RFCs have since become official documents of Internet specifications, communications protocols, procedures, and events. According to Crocker, the doc ...


Internet Engineering Task Force
The Internet Engineering Task Force (IETF) is a standards organization for the Internet and is responsible for the technical standards that make up the Internet protocol suite (TCP/IP). It has no formal membership roster or requirements and all its participants are volunteers. Their work is usually funded by employers or other sponsors. The IETF was initially supported by the federal government of the United States but since 1993 has operated under the auspices of the Internet Society, an international non-profit organization.

Organization
The IETF is organized into a large number of working groups and birds of a feather informal discussion groups, each dealing with a specific topic. The IETF operates in a bottom-up task creation mode, largely driven by these working groups. Each working group has an appointed chairperson (or sometimes several co-chairs) and a charter that describes its focus, what it is expected to produce, and when. It is open to all who want to particip ...


ANSI ASC X9
The Accredited Standards Committee X9 (ASC X9, Inc.) is an ANSI (American National Standards Institute) accredited standards developing organization, responsible for developing voluntary open consensus standards for the financial services industry in the U.S. ASC X9 is the USA Technical Advisory Group (TAG) to the International Technical Committee on Financial Services ISO/TC 68 under the International Organization for Standardization (ISO), of Geneva, Switzerland, and submits X9 American National Standards to the international committee to be considered for adoption as international standards or ISO standards (https://share.ansi.org/Shared%20Documents/Standards%20Activities/International%20Standardization/ISO/US%20TAGs%20to%20ISO/ISOTAG_March2017.pdf, page 41, list of SDOs that operate TAGs for ANSI). Membership in ASC X9 is open to all U.S. domiciled companies and organizations in the financial services industry.

Domestic Role
The Accredited Standards Committee ...




American National Standards Institute
The American National Standards Institute (ANSI) is a private non-profit organization that oversees the development of voluntary consensus standards for products, services, processes, systems, and personnel in the United States. The organization also coordinates U.S. standards with international standards so that American products can be used worldwide. ANSI accredits standards that are developed by representatives of other standards organizations, government agencies, consumer groups, companies, and others. These standards ensure that the characteristics and performance of products are consistent, that people use the same definitions and terms, and that products are tested the same way. ANSI also accredits organizations that carry out product or personnel certification in accordance with requirements defined in international standards.

The organization's headquarters are in Washington, D.C. ANSI's operations office is located in New York City. The ANSI annual operating b ...


ISO 18014
ISO/IEC 18014 "Information technology — Security techniques — Time-stamping services" is an international standard that specifies time-stamping techniques. It comprises four parts:
* Part 1: Framework
* Part 2: Mechanisms producing independent tokens
* Part 3: Mechanisms producing linked tokens
* Part 4: Traceability of time sources

Part 1: Framework
In this first part of ISO/IEC 18014, several things are explained and developed:
* The identification of the objectives of a time authority.
* The description of a general model on which time stamping services are based.
* The definition of time stamping services.
* The definition of the basic protocols of time stamping.
* The specifications of the protocols between the involved entities.
Key words: audit, non-repudiation, security, time-stamp

Part 2: Mechanisms producing independent tokens
A time-stamping service provides evidence that a data item existed before a certain point in time. Time-stamp se ...



Hash Collision
In computer science, a hash collision or hash clash is when two pieces of data in a hash table share the same hash value. The hash value in this case is derived from a hash function which takes a data input and returns a fixed length of bits. Although hash algorithms have been created with the intent of being collision resistant, they can still sometimes map different data to the same hash (by virtue of the pigeonhole principle). Malicious users can take advantage of this to mimic, access, or alter data. Due to the possible negative applications of hash collisions in data management and computer security (in particular, cryptographic hash functions), collision avoidance has become an important topic in computer security.

Background
Hash collisions can be unavoidable depending on the number of objects in a set and whether or not the bit string they are mapped to is long enough. When there is a set of n objects, if n is greater than |R|, which in this ca ...
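A small worked example of the pigeonhole argument in the hash-table setting: with more keys than slots, at least two keys must land in the same slot. The modulus-based slot function and the particular keys are arbitrary choices for this demonstration.

```python
def slot(key: str, table_size: int = 8) -> int:
    """Map a key to one of table_size slots, as a hash table would."""
    return hash(key) % table_size

keys = [f"user{i}" for i in range(9)]  # 9 keys, only 8 slots
buckets = {}
for k in keys:
    buckets.setdefault(slot(k), []).append(k)

# By the pigeonhole principle, at least one slot now holds two or more keys.
colliding = {s: ks for s, ks in buckets.items() if len(ks) > 1}
assert colliding
print(colliding)
```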


Ahto Buldas
Ahto Buldas (born 17 January 1967) is an Estonian computer scientist. He is the inventor of Keyless Signature Infrastructure, co-founder and Chief Scientist at Guardtime, and Chair of the OpenKSI foundation.

Life and education
Buldas was born in Tallinn. After graduating from high school, he was conscripted into the Soviet Army, where he spent two years as an artillery officer in Siberia. After being discharged, he began studies at Tallinn University of Technology, where he defended his MSc degree in 1993 and his PhD in 1999. He currently lives in Tallinn with his wife and four children.

Career
Buldas was a leading contributor to the Estonian Digital Signature Act and ID-card from 1996 to 2002, currently the only national-level public-key infrastructure (PKI) which has achieved widespread adoption by a country's population for legally binding digital signatures. He published his first timestamping-related research in 1998 and has published over 30 academic papers on the subjec ...




Collision Resistance
In cryptography, collision resistance is a property of cryptographic hash functions: a hash function H is collision-resistant if it is hard to find two inputs that hash to the same output; that is, two inputs a and b where a ≠ b but H(a) = H(b) (Goldwasser, S. and Bellare, M., "Lecture Notes on Cryptography", summer course on cryptography, MIT, 1996–2001). The pigeonhole principle means that any hash function with more inputs than outputs will necessarily have such collisions; the harder they are to find, the more cryptographically secure the hash function is. The "birthday paradox" places an upper bound on collision resistance: if a hash function produces N bits of output, an attacker who computes only 2^{N/2} (or \sqrt{2^N}) hash operations on random input is likely to find two matching outputs. If there is an easier method than this brute-force attack, it is typically considered a flaw in the hash function (Pass, R., "Lecture 21: Col ...
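The birthday bound can be checked empirically by truncating a strong hash to a small output size; with N = 24 bits a collision typically shows up after on the order of 2^{N/2} = 4096 random inputs. The truncation and the parameters below are choices made only for this demonstration.

```python
import hashlib
import os

N_BITS = 24  # deliberately tiny output size so the search finishes quickly

def short_hash(data: bytes) -> int:
    """The top N_BITS of SHA-256, standing in for an N-bit hash function."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - N_BITS)

seen = {}
attempts = 0
while True:
    attempts += 1
    msg = os.urandom(16)
    h = short_hash(msg)
    if h in seen and seen[h] != msg:
        print(f"collision after {attempts} inputs (expected about {2 ** (N_BITS // 2)})")
        break
    seen[h] = msg
```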


Stuart Haber
Stuart Haber is an American cryptographer and computer scientist, known for his contributions in cryptography and privacy-preserving technologies and widely recognized as a co-inventor of the blockchain. His 1991 paper "How to Time-Stamp a Digital Document", co-authored with W. Scott Stornetta, won the 1992 Discover Award for Computer Software and is considered to be one of the most important papers in the development of cryptocurrencies.

Education
Haber studied at Harvard University, graduating magna cum laude in 1978 with a B.A. in Mathematics. Haber earned his PhD at Columbia University in 1987 under the supervision of Zvi Galil, with a thesis titled "Provably Secure Multi-party Cryptographic Computation: Techniques and Applications".

Career
In 1987, Haber joined Bell Communications Research (Bellcore) as a research scientist. In 1989, Haber met W. Scott Stornetta, his future scientific partner and collaborator, when Stornetta joined Bellcore. In 1994, Haber and Sto ...