Internet Standard

In computer network engineering, an Internet Standard is a normative specification of a technology or methodology applicable to the Internet. Internet Standards are created and published by the Internet Engineering Task Force (IETF). They allow interoperation of hardware and software from different sources, which allows internets to function. As the Internet became global, Internet Standards became the lingua franca of worldwide communications. Engineering contributions to the IETF start as an Internet Draft, may be promoted to a Request for Comments (RFC), and may eventually become an Internet Standard. An Internet Standard is characterized by technical maturity and usefulness. The IETF also defines a Proposed Standard as a less mature but stable and well-reviewed specification, and it formerly defined a Draft Standard as an intermediate step between a Proposed Standard and an Internet Standard; that level was discontinued in 2011. As put in RFC 2026:
In general, an Internet Standard is a specification that is stable and well-understood, is technically competent, has multiple, independent, and interoperable implementations with substantial operational experience, enjoys significant public support, and is recognizably useful in some or all parts of the Internet.


Overview

An Internet Standard is documented by a Request for Comments (RFC) or a set of RFCs. A specification that is to become a Standard or part of a Standard begins as an Internet Draft and is later, usually after several revisions, accepted and published by the RFC Editor as an RFC and labeled a ''Proposed Standard''. Later, an RFC is elevated to ''Internet Standard'', with an additional sequence number, when maturity has reached an acceptable level. Collectively, these stages are known as the ''Standards Track'' and are defined in RFC 2026 and RFC 6410. The label ''Historic'' is applied to deprecated Standards Track documents or obsolete RFCs that were published before the Standards Track was established. Only the IETF, represented by the Internet Engineering Steering Group (IESG), can approve Standards Track RFCs. The definitive list of Internet Standards is maintained in the Official Internet Protocol Standards list published by the RFC Editor. Previously, STD 1 maintained a snapshot of the list.


History and the purpose of Internet Standards

Internet standards are rules that devices must follow when they connect to a network. As the technology has evolved, the rules of engagement between computers have had to evolve with it; these are the protocols in use today. Most of them were developed long before the Internet Age, going as far back as the 1970s, not long after the creation of personal computers.

TCP/IP

The Transmission Control Protocol/Internet Protocol (TCP/IP) suite went into effect on January 1, 1983, the official date cited for when the first internet went live. ARPANET (Advanced Research Projects Agency Network) and the Defense Data Network were the first networks to implement the protocols. These protocols are considered an essential part of how the Internet works because they define the rules by which connections between hosts operate, and they are still used today in the various ways data is sent across global networks.

IPsec

Internet Protocol Security (IPsec) is a collection of protocols that provide integrity-protected and encrypted connections between devices, with the aim of securing traffic carried over public networks. According to the IETF Datatracker, the working group dedicated to its creation was proposed on 25 November 1992; the group was formed about half a year later, and the first draft was published in mid-1993.
HTTP

Hypertext Transfer Protocol (HTTP) is one of the most commonly used protocols today in the context of the World Wide Web. HTTP is a simple protocol that governs how documents written in HyperText Markup Language (HTML) are exchanged over networks. This protocol is the backbone of the Web, making the whole hypertext system practical. It was created by a team of developers spearheaded by Tim Berners-Lee, who proposed its creation in 1989 and published the first complete version of HTTP on a public forum on August 6, 1991, a date some consider the official birth of the World Wide Web. HTTP has been continually evolving since its creation, becoming more complex as networking technology has progressed. By default HTTP is not encrypted, so in practice HTTPS (Hypertext Transfer Protocol Secure) is used.
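As an illustration of the request/response exchange that HTTP defines, the following minimal sketch uses Python's standard http.client module to issue a GET request and read the reply; the host name is only an example and is assumed to be reachable.

```python
# Minimal HTTP GET request; "example.com" is only an example host.
import http.client

conn = http.client.HTTPConnection("example.com", 80, timeout=10)
conn.request("GET", "/")                 # send the request line and headers
response = conn.getresponse()            # status line and headers come back first
print(response.status, response.reason)  # e.g. 200 OK
body = response.read()                   # the HTML document itself
print(body[:200])
conn.close()
```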
TLS/SSL

Transport Layer Security (TLS) is a standard that enables two endpoints to communicate securely and privately. TLS came as a replacement for Secure Sockets Layer (SSL), which was introduced before the creation of HTTPS and was created by Netscape; in fact, HTTPS was based on SSL when it first came out. Because one common way of encrypting data was needed, the IETF specified TLS 1.0 in RFC 2246 in January 1999. It has been upgraded since; the latest version, TLS 1.3, was specified in RFC 8446 in August 2018.
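As a minimal sketch of how an application uses TLS in practice, the snippet below wraps an ordinary TCP connection in TLS using Python's standard ssl module; the host "example.com" is only an example and is assumed to accept TLS connections on port 443.

```python
# Wrap a TCP connection in TLS before any application data is exchanged.
import socket
import ssl

context = ssl.create_default_context()        # sensible defaults, certificate checking on
with socket.create_connection(("example.com", 443), timeout=10) as tcp_sock:
    with context.wrap_socket(tcp_sock, server_hostname="example.com") as tls_sock:
        print(tls_sock.version())                 # negotiated protocol version, e.g. TLSv1.3
        print(tls_sock.getpeercert()["subject"])  # who the server claims to be
```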
OSI model

The Open Systems Interconnection (OSI) model began its development in 1977. It was created by the International Organization for Standardization (ISO) and was officially published and adopted as a standard for use in 1979. It was then updated several times, and it took a few years for the model to be presented in its final form: ISO 7498 was published in 1984. Lastly, in 1995 the OSI model was revised again to satisfy the needs of ongoing development in the field of computer networking.
UDP

The goal of the User Datagram Protocol (UDP) was to provide a way for two computers to communicate as quickly and efficiently as possible. UDP was conceived and realized by David P. Reed in 1980. Essentially, data is packed into a datagram and sent point to point, without first establishing a connection and without any guarantee of delivery. Despite the drawback that datagrams can be lost or arrive out of order, UDP is still in use.
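The following minimal sketch, using Python's standard socket module, sends a single datagram between two sockets on the local machine; it is only an illustration of the connectionless, best-effort exchange described above.

```python
# One UDP datagram sent and received on the local machine; no connection is
# established and delivery is not guaranteed.
import socket

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))        # let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello in a single datagram", ("127.0.0.1", port))

data, addr = receiver.recvfrom(2048)   # one datagram is read in one call
print(data.decode(), "from", addr)

sender.close()
receiver.close()
```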


Standardization process

Becoming a standard is a two-step process within the Internet Standards Process: ''Proposed Standard'' and ''Internet Standard''. These are called ''maturity levels'' and the process is called the ''Standards Track''. If an RFC is part of a proposal that is on the Standards Track, then at the first stage the standard is proposed, and organizations subsequently decide whether to implement this Proposed Standard. After the criteria in RFC 6410 are met (two separate implementations, widespread use, no unresolved errata, etc.), the RFC can advance to Internet Standard. The Internet Standards Process is defined in several "Best Current Practice" documents, notably BCP 9 (RFC 2026 and RFC 6410). There were previously three standard maturity levels: ''Proposed Standard'', ''Draft Standard'' and ''Internet Standard''. RFC 6410 reduced this to two maturity levels.
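As a toy illustration of the two maturity levels and the advancement criteria summarized above (an illustrative sketch, not IETF tooling; the function and its parameter names are invented for the example):

```python
# Toy model of the Standards Track maturity levels described above.
from enum import Enum

class Maturity(Enum):
    PROPOSED_STANDARD = "Proposed Standard"
    INTERNET_STANDARD = "Internet Standard"

def can_advance(independent_interoperable_implementations: int,
                widespread_use: bool,
                unresolved_errata: bool) -> bool:
    """Paraphrase of the RFC 6410 criteria summarized in the text."""
    return (independent_interoperable_implementations >= 2
            and widespread_use
            and not unresolved_errata)

# Example: a Proposed Standard that meets the criteria may be advanced.
level = Maturity.PROPOSED_STANDARD
if can_advance(2, widespread_use=True, unresolved_errata=False):
    level = Maturity.INTERNET_STANDARD
print(level.value)  # Internet Standard
```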


Proposed Standard

RFC 2026 originally characterized Proposed Standards as immature specifications, but this stance was annulled by RFC 7127. A ''Proposed Standard'' specification is stable, has resolved known design choices, has received significant community review, and appears to enjoy enough community interest to be considered valuable. Usually, neither implementation nor operational experience is required for the designation of a specification as a Proposed Standard. Proposed Standards are of such quality that implementations can be deployed in the Internet. However, as with all technical specifications, Proposed Standards may be revised if problems are found or better solutions are identified as experience is gathered from deploying such technologies at scale. Many Proposed Standards are actually deployed on the Internet and used extensively as stable protocols. In practice, full progression through the sequence of standards levels is quite rare, and most popular IETF protocols remain at Proposed Standard.


Draft Standard

In October 2011, RFC 6410 merged the second and third maturity levels into one ''Internet Standard''. Existing older ''Draft Standards'' retain that classification, absent explicit action. For old ''Draft Standards'' two possible actions are available, and either must be approved by the IESG: a ''Draft Standard'' may be reclassified as an ''Internet Standard'' as soon as the criteria in RFC 6410 are satisfied; or, starting two years after RFC 6410 was approved as a BCP (October 2013), the IESG may choose to reclassify an old ''Draft Standard'' as a ''Proposed Standard''.


Internet Standard

An Internet Standard is characterized by a high degree of technical maturity and by a generally held belief that the specified protocol or service provides significant benefit to the Internet community. Generally, Internet Standards cover interoperability of systems on the Internet by defining protocols, message formats, schemas, and languages. An Internet Standard ensures that hardware and software produced by different vendors can work together. Having a standard makes it much easier to develop software and hardware that link different networks, because software and hardware can be developed one layer at a time. Normally, the standards used in data communication are called protocols. All Internet Standards are given a number in the STD series. The series was summarized in its first document, STD 1 (RFC 5000), until 2013, but this practice was retired in RFC 7100, and the definitive list of Internet Standards is now maintained online by the RFC Editor. Documents submitted to the RFC Editor and accepted as an RFC are not revised; if the document has to be changed, it is submitted again and assigned a new RFC number. When an RFC becomes an Internet Standard (STD), it is assigned an STD number but retains its RFC number. When an Internet Standard is updated, its STD number stays the same but refers to a different RFC or set of RFCs. For example, in 2007 RFC 3700 was an Internet Standard (STD 1), and in May 2008 it was replaced with RFC 5000: RFC 3700 received ''Historic'' status, and RFC 5000 became STD 1.
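A small sketch of the numbering relationship described above, using only the STD 1 example from the text (the mapping is illustrative; STD 1 itself has since been retired as a list document):

```python
# An STD number stays fixed while the RFC(s) it refers to can change over time.
std_to_rfcs = {1: {"RFC 3700"}}    # in 2007, STD 1 referred to RFC 3700
std_to_rfcs[1] = {"RFC 5000"}      # in May 2008, STD 1 came to refer to RFC 5000
# RFC 3700 was reclassified as Historic; the STD number itself never changed.
print(std_to_rfcs[1])
```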


Organizations of Internet Standards

The standardization process was historically divided into three steps:

# Proposed Standards are specifications that may be implemented and can still be changed at any time.
# The Draft Standard was carefully tested in preparation for becoming a future Internet Standard.
# Internet Standards are mature standards.

There are five principal Internet standards organizations: the Internet Engineering Task Force (IETF), the Internet Society (ISOC), the Internet Architecture Board (IAB), the Internet Research Task Force (IRTF), and the World Wide Web Consortium (W3C). All of these organizations must work in the common language of Internet standards to remain relevant as the Internet evolves. Some basic aims of the Internet Standards Process are technical excellence, prior implementation and testing, and clear, concise, easily understood documentation.

Creating and improving Internet Standards is an ongoing effort, and the Internet Engineering Task Force (IETF) plays a significant role in it. The standards are shaped and made available by the IETF, the leading Internet standards association, which uses well-documented procedures for creating them; once published, the standards are made freely accessible at no cost. Until 1993 the United States federal government supported the IETF; it is now overseen by the Internet Society's Internet Architecture Board (IAB). The IETF is a bottom-up organization that has no formal requirements for affiliation and no official membership procedure, and it works closely with the World Wide Web Consortium (W3C) and other standards development organizations. It relies heavily on working groups, which are chartered and proposed to an Area Director; each working group then operates under the direction of its Area Director and works toward consensus. After the proposed charter has been circulated on the IESG and IAB mailing lists and approved, it is forwarded to the wider IETF community. Complete agreement of all working group participants is not required to adopt a proposal; working groups only need to determine that the consensus is strong. Working groups produce documents in the form of RFCs, which are memoranda containing methods, behaviors, research, and innovations applicable to the working of the Internet and Internet-connected systems. In other words, Requests for Comments (RFCs) are primarily used to develop standard network protocols related to network communications. Some RFCs are intended to provide information, while others are intended to publish Internet standards. The final form of an RFC becomes the standard and is issued with a number; after that, no further comments or changes are accepted on that final form. This process is followed in every area to build consensus on a problem related to the Internet and to develop an Internet standard as its solution.

There are eight broad areas on which the IETF focuses, each using various working groups along with an area director. In the General area it works on the development of Internet standards themselves; in the Applications area it concentrates on Internet applications such as Web-related protocols; it also works on the development of Internet infrastructure, for example PPP extensions. The IETF also establishes the principles and descriptive standards that make up the Internet protocol suite (TCP/IP). The Internet Architecture Board (IAB), along with the Internet Research Task Force (IRTF), complements the work of the IETF by dealing with new technologies. The IETF is the standards-making organization concentrating on the creation of "standard" specifications of technology and their intended usage, and it focuses on matters associated with the evolution of the current Internet and TCP/IP technology. It is divided into numerous working groups (WGs), each of which is responsible for developing standards and technologies in a specific area, such as routing or security. Working group participants are volunteers who work in fields such as equipment vending, network operation, and research. Work begins by developing a common understanding of the requirements that the effort should address; an IETF working group is then formed, and the requirements are aired in the influential Birds of a Feather (BoF) sessions held at IETF conferences.


Internet Engineering Task Force

The Internet Engineering Task Force (IETF) is the premier Internet standards organization. It follows open and well-documented processes for setting Internet standards. The resources that the IETF offers include RFCs, Internet-Drafts, IANA functions, intellectual property rights, the standards process, and publishing and accessing RFCs.


RFCs

* Documents that contain technical specifications and notes for the Internet.
* The acronym RFC came from the phrase "Request For Comments"; the expansion is no longer used today, and the documents are now simply referred to as RFCs.
* The website ''RFC Editor'' is an official archive of Internet Standards, draft standards, and proposed standards.


Internet Drafts

* Working documents of the IETF and its working groups.
* Other groups may also distribute working documents as Internet-Drafts.


Intellectual property rights

* All IETF standards are freely available to view and read, and generally free to implement by anyone without permission or payment.


Standards Process

* The process of creating a standard is straightforward: a specification goes through an extensive review process by the Internet community and is revised based on experience.


Publishing and accessing RFCs

* Internet-Drafts that successfully complete the review process are submitted to the RFC Editor for publication.


Types of Internet Standards

There are two ways in which an Internet Standard is formed and can be categorized as one of the following: "de jure" standards and "de facto" standards. A de facto standard becomes a standard through widespread use within the tech community. A de jure standard is formally created by official standard-developing organizations. These standards undergo the Internet Standards Process. Common de jure standards include ASCII, SCSI, and the Internet protocol suite.


Internet Standard Specifications

Specifications subject to the Internet Standards Process fall into one of two categories: Technical Specification (TS) and Applicability Statement (AS). A Technical Specification is a statement describing all relevant aspects of a protocol, service, procedure, convention, or format, including its scope and its intent for use, or "domain of applicability". However, a TS's use within the Internet is defined by an Applicability Statement. An AS specifies how, and under what circumstances, TSs may be applied to support a particular Internet capability. An AS identifies the ways in which relevant TSs are combined and specifies the parameters or sub-functions of TS protocols. An AS also describes the domains of applicability of TSs, such as Internet routers, terminal servers, or datagram-based database servers. Finally, an AS applies one of the following "requirement levels" to each of the TSs to which it refers (a small sketch of this classification follows the list):

* Required: Implementation of the referenced TS is required to achieve interoperability. For example, Internet systems using the Internet protocol suite are required to implement IP and ICMP.
* Recommended: Implementation of the referenced TS is not required, but is desirable in the domain of applicability of the AS. Inclusion of the functions, features, and protocols of Recommended TSs in the development of systems is encouraged. For example, the Telnet protocol should be implemented by all systems that intend to use remote access.
* Elective: Implementation of the referenced TS is optional; the TS is only necessary in a specific environment. For example, the DECNET MIB could be seen as valuable in an environment where the DECnet protocol is used.
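A toy sketch of the classification just described, assuming a hypothetical Applicability Statement; the mapping below simply reuses the example protocols named in the list and is not taken from any real AS:

```python
# Toy sketch of an Applicability Statement's requirement levels, as described above.
from enum import Enum

class RequirementLevel(Enum):
    REQUIRED = "Required"        # must be implemented for interoperability (e.g. IP, ICMP)
    RECOMMENDED = "Recommended"  # not required, but desirable (e.g. Telnet for remote access)
    ELECTIVE = "Elective"        # optional, needed only in specific environments (e.g. DECNET MIB)

# A hypothetical AS mapping Technical Specifications to requirement levels.
applicability_statement = {
    "IP": RequirementLevel.REQUIRED,
    "ICMP": RequirementLevel.REQUIRED,
    "Telnet": RequirementLevel.RECOMMENDED,
    "DECNET MIB": RequirementLevel.ELECTIVE,
}
for ts, level in applicability_statement.items():
    print(f"{ts}: {level.value}")
```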


Common Standards


Web Standards

Web standards are a type of Internet standard which define aspects of the World Wide Web. They allow for the building and rendering of websites. The three key standards used by the World Wide Web are the Hypertext Transfer Protocol (HTTP), HTML, and URLs. Respectively, they specify the transfer of data between a browser and a web server, the content and layout of a web page, and what web page identifiers mean.
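As a small illustration of the third of these standards, the sketch below splits a URL into the parts the URL standard defines, using Python's standard urllib.parse module; the URL is only an example value.

```python
# Splitting a URL into the parts defined by the URL standard.
from urllib.parse import urlparse

url = "https://www.example.com:443/path/page.html?lang=en#section-2"
parts = urlparse(url)
print(parts.scheme)    # "https": protocol used to retrieve the resource
print(parts.hostname)  # "www.example.com": the web server
print(parts.port)      # 443
print(parts.path)      # "/path/page.html": resource on that server
print(parts.query)     # "lang=en"
print(parts.fragment)  # "section-2"
```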


Network Standards

Network standards are a type of Internet standard which define rules for data communication in networking technologies and processes. They specify how a device communicates to and from other devices. In reference to the TCP/IP model, common standards and protocols in each layer are as follows (a short sketch follows the list):

* Transport layer: TCP and SPX
* Network layer: IP and IPX
* Data link layer: IEEE 802.3 for LANs and Frame Relay for WANs
* Physical layer: 8P8C modular connectors and V.92
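As a minimal sketch of how the transport and network layers in this list are used from application code, the snippet below opens a TCP (transport layer) connection over IPv4 (network layer) using Python's standard socket module; the address 127.0.0.1:9000 is only an example where some server is assumed to be listening, and the lower layers are handled by the operating system and hardware.

```python
# Minimal TCP client: SOCK_STREAM selects TCP at the transport layer,
# AF_INET selects IPv4 at the network layer.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.connect(("127.0.0.1", 9000))          # example address of an assumed server
    sock.sendall(b"hello over TCP/IP\n")
    reply = sock.recv(1024)
    print(reply.decode(errors="replace"))
```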


The future of Internet Standards

The Internet has been viewed as an open playground, free for people to use and communities to monitor. However, large companies have shaped and molded it to best fit their needs, and the future of Internet standards will be no different. Currently, there are widely used but insecure protocols such as the Border Gateway Protocol (BGP) and the Domain Name System (DNS). This reflects common practices that focus more on innovation than on security. Companies have the power to improve these issues; with the Internet in the hands of the industry, users must depend on businesses to protect vulnerabilities present in these standards. Ways to make BGP and DNS safer already exist, but they are not widespread. For example, there is an existing BGP safeguard called Resource Public Key Infrastructure (RPKI), a database of routes that are known to be safe and have been cryptographically signed. Users and companies submit routes and check other users' routes for safety; if RPKI were more widely adopted, more routes could be added and confirmed. RPKI is picking up momentum: as of December 2020, Google had registered 99% of its routes with RPKI, making it easier for other businesses to adopt BGP safeguards. DNS also has a security protocol with a low adoption rate: DNS Security Extensions (DNSSEC). Essentially, at every stage of the DNS lookup process, DNSSEC adds a signature to the data to show it has not been tampered with. Some companies have taken the initiative to secure Internet protocols; it is up to the rest to make these protections more widespread.
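To illustrate the idea behind DNSSEC described above (a signature over DNS data lets a resolver detect tampering), here is a conceptual sketch. It uses a shared-key HMAC purely as a stand-in for DNSSEC's public-key signatures over resource record sets, so it illustrates the principle rather than the actual protocol; the key and record values are made up for the example.

```python
# Conceptual sketch: "sign the DNS data, then verify it was not tampered with".
# Real DNSSEC uses public-key signatures (RRSIG/DNSKEY records); an HMAC with
# a made-up key is used here only to keep the example self-contained.
import hashlib
import hmac

key = b"zone-signing-key (illustrative only)"

def sign(record: str) -> bytes:
    return hmac.new(key, record.encode(), hashlib.sha256).digest()

def verify(record: str, signature: bytes) -> bool:
    return hmac.compare_digest(sign(record), signature)

record = "example.com. 3600 IN A 192.0.2.1"
signature = sign(record)

print(verify(record, signature))                             # True: data intact
print(verify("example.com. 3600 IN A 192.0.2.99", signature))  # False: tampered
```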


See also

* Standardization
* Web standards




External links


RFC Editor