The Zettabyte Era or Zettabyte Zone is a period of human and computer science history that started in the mid-2010s. The precise starting date depends on whether it is defined as the point when global IP traffic first exceeded one zettabyte, which happened in 2016, or when the amount of digital data in the world first exceeded a zettabyte, which happened in 2012. A zettabyte is a multiple of the unit byte that measures digital storage, and it is equivalent to 1,000,000,000,000,000,000,000 (10^21) bytes.
According to Cisco Systems, an American multinational technology conglomerate, global IP traffic reached an estimated 1.2 zettabytes (an average of 96 exabytes (EB) per month) in 2016. Global IP traffic refers to all digital data that passes over an IP network, which includes, but is not limited to, the public Internet. The largest contributing factor to the growth of IP traffic is video traffic (including online streaming services like Netflix and YouTube).
The Zettabyte Era can also be understood as an age of growth of all forms of digital data that exist in the world, which includes not only the public Internet but also other forms of digital data, such as stored data from security cameras or voice data from cell-phone calls.
Taking into account this second definition of the Zettabyte Era, it was estimated that in 2012 upwards of 1 zettabyte of data existed in the world and that by 2020 there would be more than 40 zettabytes of data in the world at large.
The Zettabyte Era has created difficulties for data centers trying to keep up with the explosion of data consumption, creation and replication. In 2015, the Internet and all its components consumed 2% of total global power, so energy efficiency in data centers has become a central problem of the Zettabyte Era.
The zettabyte
A zettabyte is a digital unit of measurement. One zettabyte is equal to one sextillion bytes, or 10^21 (1,000,000,000,000,000,000,000) bytes; equivalently, one zettabyte is equal to a trillion gigabytes. To put this into perspective, consider that "if each terabyte in a zettabyte were a kilometre, it would be equivalent to 1,300 round trips to the moon and back (768,800 kilometers)". As former Google CEO Eric Schmidt puts it, from the very beginning of humanity to the year 2003, an estimated 5 exabytes of information was created, which corresponds to 0.5% of a zettabyte. In 2013, that amount of information (5 exabytes) took only two days to create, and that pace is continuously growing.
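To make these prefix conversions concrete, here is a minimal sketch in Python, assuming the decimal (SI) prefixes used throughout this article:

```python
# Decimal (SI) byte prefixes: each step up is a factor of 1,000.
GB = 10**9   # gigabyte
TB = 10**12  # terabyte
EB = 10**18  # exabyte
ZB = 10**21  # zettabyte

print(ZB // GB)       # 1,000,000,000,000 -> a trillion gigabytes in a zettabyte
print(ZB // TB)       # 1,000,000,000 -> a billion terabytes in a zettabyte
print(1300 * 768800)  # 999,440,000 km -> the moon analogy: ~one km per terabyte
print(5 * EB / ZB)    # 0.005 -> Schmidt's 5 exabytes is 0.5% of a zettabyte
```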
Definitions
The concept of the Zettabyte Era can be separated into two distinct categories:
# In terms of IP traffic: This first definition refers to the total amount of data to traverse global IP networks such as the public Internet. In Canada, for example, there was an average growth of 50.4% of data downloaded by residential Internet subscribers from 2011 to 2016. According to this definition, the Zettabyte Era began in 2016, when global IP traffic surpassed one zettabyte, estimated to have reached roughly 1.2 zettabytes.
# In terms of all forms of digital data: In this second definition, the Zettabyte Era refers to the total amount of all the digital data that exists in any form, from digital films to transponders that record highway usage to SMS text messages. According to this definition, the Zettabyte Era began in 2012, when the amount of digital data in the world surpassed one zettabyte.
Cisco report: ''The Zettabyte Era: Trends and Analysis''
In 2016, Cisco Systems stated that the Zettabyte Era was now a reality when global IP traffic reached an estimated 1.2 zettabytes. Cisco also provided predictions of future global IP traffic in its report ''The Zettabyte Era: Trends and Analysis'', which uses current and past global IP traffic statistics to forecast trends between 2016 and 2021. Some of the report's predictions for 2021 include:
* Global IP traffic will triple and is estimated to reach 3.3 ZB per annum.
* In 2016 video traffic (e.g. Netflix and YouTube) accounted for 73% of total traffic. In 2021 this will increase to 82%.
* The number of devices connected to IP networks will be more than three times the global population.
* It would take one person 5 million years to watch all the video that will traverse global IP networks in one month.
* Smartphone traffic will exceed PC traffic: PCs will account for 25% of total IP traffic, while smartphones will account for 33%.
* There will be a twofold increase in broadband speeds.
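Taken together, the 2016 baseline (1.2 ZB per annum) and the 2021 forecast (3.3 ZB per annum) imply a compound annual growth rate, which a short sketch makes explicit:

```python
# Implied compound annual growth rate (CAGR) from the figures above.
traffic_2016 = 1.2  # zettabytes per annum (2016 estimate)
traffic_2021 = 3.3  # zettabytes per annum (2021 forecast)
years = 2021 - 2016

cagr = (traffic_2021 / traffic_2016) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~22.4% growth per year
```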
Factors that led to the Zettabyte Era
Many factors brought about the rise of the Zettabyte Era. Increases in video streaming, mobile phone usage, broadband speeds and data center storage all contributed to the rise (and continuation) of data consumption, creation and replication.
Increased video streaming
The large and ever-growing consumption of multimedia, including video streaming, on the Internet has contributed to the rise of the Zettabyte Era. In 2011 it was estimated that roughly 25%–40% of IP traffic was taken up by video streaming services. Since then, video IP traffic has nearly doubled to an estimated 73% of total IP traffic. Furthermore, Cisco has predicted that this trend will continue into the future, estimating that by 2021, 82% of total IP traffic will come from video traffic.
The amount of data used by video streaming services depends on the quality of the video. Android Central breaks down how much data is used (on a smartphone) at different video resolutions: per hour, video between 240p and 320p resolution uses roughly 0.3 GB, while standard-definition video at 480p uses approximately 0.7 GB.
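Such per-hour figures follow directly from a stream's average bitrate. Here is a minimal sketch of the conversion; the bitrates are illustrative assumptions, not figures from the source:

```python
# Convert an average video bitrate (Mbit/s) into data used per hour (GB).
# Decimal units throughout, matching the article (1 GB = 10**9 bytes).
def gb_per_hour(bitrate_mbps: float) -> float:
    bits_per_hour = bitrate_mbps * 1e6 * 3600
    return bits_per_hour / 8 / 1e9  # bits -> bytes -> gigabytes

print(gb_per_hour(0.7))  # ~0.32 GB/h, consistent with the 240p-320p figure
print(gb_per_hour(1.5))  # ~0.68 GB/h, consistent with the 480p figure
print(gb_per_hour(6.7))  # ~3.0 GB/h, in line with the Netflix HD figure cited below
```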
Netflix and YouTube are at the top of the list of the most globally streamed video services online. In 2016, Netflix represented 32.72% of all video streaming IP traffic, while YouTube represented 17.31%. The third spot was taken by Amazon Prime Video, whose global data usage came in at 4.14%.
Netflix
Currently, Netflix is the largest video streaming service in the world, accessible in over 200 countries and with more than 80 million subscribers. Streaming high-definition video content through Netflix uses roughly 3 GB of data per hour, while standard definition takes up around 1 GB of data per hour. In North America, during peak bandwidth consumption hours (around 8 PM), Netflix uses about 40% of total network bandwidth. This vast amount of data marks an unparalleled period in time and is one of the major contributing factors that have led the world into the Zettabyte Era.
YouTube
YouTube is another large video streaming (and video uploading) service, whose data consumption rate across both fixed and mobile networks remains quite large. In 2016, the service was responsible for about 20% of total Internet traffic and 40% of mobile traffic. In 2016, 100 hours of video content was uploaded to YouTube every 60 seconds; YouTube not only offers content for streaming but also allows users to upload their own content, and as of 2018, 300 hours of YouTube video content is uploaded every minute.
Increased wireless and mobile traffic
The usage of mobile technologies to access IP networks has resulted in an increase in overall IP traffic in the Zettabyte Era. In 2016, the majority of devices that moved IP traffic and other data streams were hard-wired devices. Since then, wireless and mobile traffic have increased and are predicted to continue to increase rapidly. Cisco predicts that by the year 2021, wired devices will account for 37% of total traffic, while the remaining 63% will be accounted for by wireless and mobile devices. Furthermore, smartphone traffic is expected to surpass PC traffic by 2021; PCs are predicted to account for 25% of total traffic, down from 46% in 2016, whereas smartphone traffic is expected to increase from 13% to 33%.
According to the Organisation for Economic Co-operation and Development (OECD), mobile broadband penetration rates are ever-growing. Between June 2016 and December 2016, there was an average mobile broadband penetration rate increase of 4.43% across all OECD countries. Poland had the largest increase, at 21.55%, while Latvia had the lowest, its penetration rate having declined by 5.71%. The OECD calculated that there were 1.27 billion total mobile broadband subscriptions in 2016; 1.14 billion of these subscriptions had both voice and data included in the plan.
Increased broadband speeds
Broadband connects Internet users to the Internet, so the speed of a broadband connection is directly correlated with IP traffic: the greater the broadband speed, the more traffic can traverse IP networks. Cisco estimates that broadband speeds will double by 2021. In 2016, global average fixed broadband speeds reached as high as 27.5 Mbit/s, and they are expected to reach 53 Mbit/s by 2021. Between the fourth quarter of 2016 and the first quarter of 2017, average fixed broadband speeds globally equated to 7.2 Mbit/s. South Korea was at the top of the list in terms of broadband speeds; in that period its broadband speeds increased 9.3%.
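To see what such averages mean in practice, consider a small sketch, assuming an illustrative 50 GB transfer (the file size is an assumption, not a figure from the source):

```python
# Time to move a fixed amount of data at a given line rate.
# Decimal units; real-world throughput would be somewhat below the line rate.
def hours_to_transfer(gigabytes: float, mbit_per_s: float) -> float:
    bits = gigabytes * 8e9
    return bits / (mbit_per_s * 1e6) / 3600

print(hours_to_transfer(50, 27.5))  # ~4.0 h at the 2016 average of 27.5 Mbit/s
print(hours_to_transfer(50, 53.0))  # ~2.1 h at the forecast 2021 average of 53 Mbit/s
```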
High-bandwidth applications need significantly higher broadband speeds. Certain broadband technologies, including fiber-to-the-home (FTTH), high-speed digital subscriber line (DSL) and cable broadband, are paving the way for increased broadband speeds. FTTH can offer broadband speeds that are ten times (or even a hundred times) faster than DSL or cable.
Internet service providers in the Zettabyte Era
The Zettabyte Era has affected Internet service providers (ISPs) with the growth of data flowing from all directions. Congestion occurs when too much data flows in and the quality of service (QoS) weakens. In both China and the U.S., some ISPs store and handle exabytes of data. The response by certain ISPs is to implement so-called network management practices in an attempt to accommodate the never-ending data surge of Internet subscribers on their networks. Furthermore, the technologies being implemented by ISPs across their networks are evolving to address the increase in data flow.
Network management practices have brought about debates relating to net neutrality in terms of fair access to all content on the Internet. According to The European Consumer Organisation, network neutrality can be understood as an aim that "all Internet should be treated equally, without discrimination or interference. When this is the case, users enjoy the freedom to access the content, services, and applications of their choice, using any device they choose".
According to the Canadian Radio-television and Telecommunications Commission (CRTC) Telecom Regulatory Policy 2009-657, there are two forms of Internet network management practices in Canada. The first are economic practices, such as data caps; the second are technical practices, like bandwidth throttling and blocking. According to the CRTC, technical practices are put in place by ISPs to address and solve congestion issues in their networks; however, the CRTC states that ISPs are not to employ Internet traffic management practices (ITMPs) for preferential or unjustly discriminatory reasons.
In the United States, during the Obama administration, under the Federal Communications Commission's (FCC) 15–24 policy, there were three bright-line rules in place to protect net neutrality: no blocking, no throttling, no paid prioritization. On 14 December 2017, the FCC voted 3–2 to remove these rules, allowing ISPs to block, throttle and give fast-lane access to content on their networks.
In an attempt to aid ISPs in dealing with large data flows in the Zettabyte Era, in 2008 Cisco unveiled a new router, the Aggregation Services Router (ASR) 9000, which at the time was supposed to offer six times the speed of comparable routers. In one second the ASR 9000 router would, in theory, be able to process and distribute 1.2 million hours of DVD traffic. By 2011, with the coming of the Zettabyte Era, Cisco had continued work on the ASR 9000 so that it could handle 96 terabits per second, up significantly from the 6.4 terabits per second it could handle in 2008.
Data centers
Energy consumption
Data centers attempt to accommodate the ever-growing rate at which data is produced, distributed, and stored. Data centers are large facilities used by enterprises to store immense datasets on servers. In 2014 it was estimated that in the U.S. alone there were roughly 3 million data centers, ranging from small centers located in office buildings to large complexes of their own. Increasingly, data centers are storing more data than end-user devices. By 2020 it is predicted that 61% of total data will be stored via cloud applications (data centers) in contrast to 2010 when 62% of data storage was on end-user devices. An increase in data centers for data storage coincides with an increase in energy consumption by data centers.
In 2014, data centers in the U.S. accounted for roughly 1.8% of total U.S. electricity consumption, which equates to 70 billion kWh. Between 2010 and 2014, electricity consumption by data centers increased by 4%, and this upward trend is predicted to continue through 2014–2020. In 2011, energy consumption from all data centers equated to roughly 1.1% to 1.5% of total global energy consumption. Information and communication technologies, including data centers, are responsible for creating large quantities of emissions.
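As a quick sanity check, the two U.S. figures above are mutually consistent: 70 billion kWh at a 1.8% share implies a national total of roughly 3.9 trillion kWh, which matches published U.S. electricity consumption for 2014.

```python
# Cross-check: U.S. data-center consumption (2014) against its stated share.
data_center_kwh = 70e9  # 70 billion kWh
share = 0.018           # 1.8% of total U.S. electricity consumption
print(data_center_kwh / share)  # ~3.9e12 kWh implied national total
```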
Google's green initiatives
The energy used by data centers does not only power their servers; most data centers spend about half of their energy on non-computing functions such as cooling and power conversion. Google's data centers have been able to reduce non-computing costs to 12%. Furthermore, as of 2016, Google has used its artificial intelligence unit, DeepMind, to manage the amount of electricity used to cool its data centers, resulting in a cost reduction of roughly 40%. Google claims that its data centers use 50% less energy than ordinary data centers.
According to Google's Senior Vice President of Technical Infrastructure, Urs Hölzle, Google's data centers (as well as its offices) would reach 100% renewable energy for their global operations by the end of 2017. Google plans to accomplish this milestone by buying enough wind and solar electricity to account for all the electricity its operations consume globally. The reason for these green initiatives is to address climate change and Google's carbon footprint. Furthermore, these green initiatives have become cheaper, with the cost of wind energy falling by 60% and the cost of solar energy falling by 80%.
In order to improve a data center's energy efficiency, reduce costs and lower the impact on the environment, Google provides five best practices for data centers to implement:
# Measure the power usage effectiveness (PUE), a ratio used by the industry to measure the energy used for non-computing functions, to track a data center's energy use (see the worked example after this list).
# Using well-designed containment methods, try to stop cold and hot air from mixing. Also, use backing plates for empty spots on the rack and eliminate hot spots.
# Keep the aisle temperatures cold for energy savings.
# Use free cooling methods to cool data centers, including a large thermal reservoir or evaporating water.
# Eliminate as many power conversion steps as possible to lower power distribution losses.
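PUE, named in the first practice above, is total facility energy divided by the energy delivered to the IT equipment itself, so an ideal value is 1.0. A minimal worked example follows; the kWh figures are illustrative assumptions, not from the source:

```python
# Power usage effectiveness (PUE): total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt goes to computing; real facilities run higher.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures (assumed): 1,120 kWh consumed overall, 1,000 kWh by IT gear.
print(pue(1120.0, 1000.0))  # 1.12 -> 12% non-computing overhead, as Google reports
```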
The Open Compute Project
In 2010, Facebook launched a new data center designed in such a way that allowed it to be 38% more efficient and 24% less expensive to build and run than the average data center. This development led to the formation of the Open Compute Project (OCP) in 2011. The OCP members collaborate to build new technological hardware that is more efficient, economical and sustainable in an age where data is ever-growing. The OCP is currently working on several projects, including one specifically focused on data centers. This project aims to guide the way new data centers are built, but also to help existing data centers improve thermal and electrical energy use as well as maximize mechanical performance. The OCP's data center project focuses on five areas: facility power, facility operations, layout and design, facility cooling, and facility monitoring and control.
Further reading
* Floridi, Luciano (2010). ''Information: A Very Short Introduction''. Oxford University Press. pp. 6–8. ISBN 9780191609541. https://books.google.com/books?id=H6viR4Fs7lYC&q=zettabyte+era&pg=PP2