Link rot (also called link death, link breaking, or reference rot) is the phenomenon of hyperlinks tending over time to cease to point to their originally targeted file, web page, or server due to that resource being relocated to a new address or becoming permanently unavailable. A link that no longer points to its target, often called a ''broken'' or ''dead'' link (or sometimes ''orphan'' link), is a specific form of dangling pointer. The rate of link rot is a subject of study and research due to its significance to the internet's ability to preserve information. Estimates of that rate vary dramatically between studies.


Prevalence

A number of studies have examined the prevalence of link rot within the World Wide Web, in academic literature that uses URLs to cite web content, and within digital libraries. A 2003 study found that on the Web, about one link out of every 200 broke each week, suggesting a half-life of 138 weeks (an illustrative calculation of such a figure is given below). This rate was largely confirmed by a 2016–2017 study of links in Yahoo! Directory (which had stopped updating in 2014 after 21 years of development) that found the half-life of the directory's links to be two years. A 2004 study showed that subsets of Web links (such as those targeting specific file types or those hosted by academic institutions) could have dramatically different half-lives.

The URLs selected for publication appear to have greater longevity than the average URL. A 2015 study by Weblock analyzed more than 180,000 links from references in the full-text corpora of three major open access publishers and found a half-life of about 14 years, generally confirming a 2005 study that found that half of the URLs cited in ''D-Lib Magazine'' articles were still active 10 years after publication. Other studies have found higher rates of link rot in academic literature but typically suggest a half-life of four years or greater. A 2013 study in ''BMC Bioinformatics'' analyzed nearly 15,000 links in abstracts from Thomson Reuters's Web of Science citation index and found that the median lifespan of web pages was 9.3 years and that just 62% were archived. A 2021 study of external links in 1996–2019 ''New York Times'' articles found that 25% of links were inaccessible; of a sample of 4,500 links that were still accessible, 13% no longer led to the original content, a phenomenon called ''content drift''.

A 2002 study suggested that link rot within digital libraries is considerably slower than on the web, finding that about 3% of the objects were no longer accessible after one year (equating to a half-life of nearly 23 years).
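Half-life figures of this kind follow from treating link survival as exponential decay. As an illustrative calculation only (the cited studies do not necessarily use exactly this model, and the helper name ''half_life'' is hypothetical), a constant breakage rate of one link in 200 per week implies:

    import math

    def half_life(p_per_period):
        """Periods until half the links survive, assuming a constant
        fraction p_per_period of the remaining links breaks each period."""
        # Solve (1 - p) ** t = 0.5 for t.
        return math.log(0.5) / math.log(1.0 - p_per_period)

    print(half_life(1 / 200))        # ~138.3 weeks, matching the 2003 estimate
    print(half_life(1 / 200) / 52)   # roughly 2.7 years

The same formula applied to a 3% annual loss rate gives the figure of nearly 23 years quoted for digital libraries.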


Causes

Link rot can result from several occurrences. A target web page may be removed. The server that hosts the target page could fail, be removed from service, or relocate to a new domain name. A domain name's registration may lapse or be transferred to another party. Some causes result in the link failing to find any target and returning an error such as HTTP 404; others cause a link to target content other than what the link's author intended (a sketch of how these two outcomes appear to a client follows the list below). Other reasons for broken links include:
* the restructuring of websites that changes URLs (e.g. a page being moved to a new path)
* the relocation of formerly free content behind a paywall
* a change in server architecture that causes code such as PHP to function differently
* dynamic page content, such as search results, that changes by design
* the presence of user-specific information (such as a login name) within the link
* deliberate blocking by content filters or firewalls
* the expiration of a domain name registration
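The following is a minimal sketch, using only Python's standard library, of how the two broad outcomes surface to a client: a removed page typically produces an error response, while a relocated or drifted page may still answer with 200 OK. The function name ''probe'' and the URL are hypothetical; real link checkers are considerably more elaborate.

    import urllib.error
    import urllib.request

    def probe(url):
        """Report whether a link fails outright or still answers,
        possibly with relocated or drifted content."""
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                final_url = response.geturl()        # redirects are followed automatically
                if final_url != url:
                    return f"relocated: now served from {final_url}"
                return "200 OK (the content may nonetheless have drifted)"
        except urllib.error.HTTPError as exc:
            return f"hard failure: HTTP {exc.code}"  # e.g. 404 Not Found, 410 Gone
        except urllib.error.URLError as exc:
            return f"unreachable: {exc.reason}"      # e.g. DNS lookup or connection failure

    print(probe("https://example.com/old-page"))     # hypothetical target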


Prevention and detection

Strategies for preventing link rot can focus on placing content where its likelihood of persisting is higher, authoring links that are less likely to break, taking steps to preserve existing links, or repairing links whose targets have been relocated or removed. The creation of URLs that will not change with time is the fundamental method of preventing link rot. Preventive planning has been championed by Tim Berners-Lee and other web pioneers.

Strategies pertaining to the authorship of links include:
* linking to primary rather than secondary sources and prioritizing stable sites
* avoiding links that point to resources on researchers' personal pages
* using clean URLs or otherwise employing URL normalization or URL canonicalization
* using permalinks and persistent identifiers such as ARKs, DOIs, Handle System references, and PURLs
* avoiding linking to documents other than web pages
* avoiding deep linking
* linking to web archives such as the Internet Archive, WebCite, archive.today, Perma.cc, or Amber
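As an example of the last strategy above, an archived copy can be looked up programmatically before linking. The sketch below queries the Internet Archive's Wayback Machine availability endpoint; the page being looked up is a placeholder, the helper name ''closest_wayback_snapshot'' is hypothetical, and the endpoint's exact response format should be confirmed against current Internet Archive documentation.

    import json
    import urllib.parse
    import urllib.request

    def closest_wayback_snapshot(url):
        """Return the URL of the closest archived snapshot of ``url``, or None."""
        api = "https://archive.org/wayback/available?" + urllib.parse.urlencode({"url": url})
        with urllib.request.urlopen(api, timeout=10) as response:
            data = json.load(response)
        snapshot = data.get("archived_snapshots", {}).get("closest")
        if snapshot and snapshot.get("available"):
            return snapshot["url"]
        return None

    print(closest_wayback_snapshot("https://example.com/old-page"))  # hypothetical page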
Strategies pertaining to the protection of existing links include:
* using redirection mechanisms such as HTTP 301 to automatically refer browsers and crawlers to relocated content
* using content management systems that can automatically update links when content within the same site is relocated, or that automatically replace links with canonical URLs
* integrating search resources into HTTP 404 pages

The detection of broken links may be done manually or automatically. Automated methods include plug-ins for content management systems as well as standalone broken-link checkers such as Xenu's Link Sleuth. Automatic checking may not detect links that return a soft 404 or links that return a 200 OK response but point to content that has changed.
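Soft 404s evade simple checkers because the server reports success while serving an error page. A common heuristic, sketched below under the assumption that such a page closely resembles whatever the host returns for a deliberately nonexistent path, is to compare the two responses. The URLs and the function name ''looks_like_soft_404'' are hypothetical, and production checkers use fuzzier similarity measures than exact equality.

    import secrets
    import urllib.error
    import urllib.parse
    import urllib.request

    def fetch_body(url):
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.read()

    def looks_like_soft_404(url):
        """Heuristic check: does the page match the host's response for a
        random path that almost certainly does not exist?"""
        body = fetch_body(url)                      # raises HTTPError on a *hard* 404
        parts = urllib.parse.urlsplit(url)
        bogus_path = "/" + secrets.token_hex(16)    # vanishingly unlikely to exist
        bogus_url = urllib.parse.urlunsplit((parts.scheme, parts.netloc, bogus_path, "", ""))
        try:
            bogus_body = fetch_body(bogus_url)
        except urllib.error.HTTPError:
            return False                            # host returns real error codes for bad paths
        return body == bogus_body                   # identical pages suggest a soft 404

    print(looks_like_soft_404("https://example.com/old-page"))  # hypothetical target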


See also

* Software rot
* Digital preservation
* Deletionism and inclusionism in Wikipedia
* Archive Team, web archiving team


External links


* Future-Proofing Your URIs
* Nielsen, Jakob (14 June 1998). "Fighting Linkrot". http://www.useit.com/alertbox/980614.html (archived at https://web.archive.org/web/20121223011620/http://www.useit.com/alertbox/980614.html, 23 December 2012)