Koomey's Law
Koomey's law describes a trend in the history of computing hardware: for about a half-century, the number of computations per joule of energy dissipated doubled about every 1.57 years. Professor Jonathan Koomey described the trend in a 2010 paper in which he wrote that "at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half." This trend had been remarkably stable since the 1950s (''R''² of over 98%). But in 2011, Koomey re-examined the data and found that after 2000, the doubling slowed to about once every 2.6 years. This is related to the slowing of Moore's law (the ability to build smaller transistors) and the end, around 2005, of Dennard scaling (the ability to build smaller transistors with constant power density). "The difference between these two growth rates is substantial. A doubling every year and a half results in a 100-fold increase in efficiency every decade. A doubling every two and a half years yields just a 16-fold increase."
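The arithmetic behind the quoted comparison can be made explicit. As an illustrative sketch (not part of the original article), a doubling every T years compounds to 2^(10/T) over a decade:

```python
# Fold-improvement per decade implied by a given doubling period.
# Illustrative only; the doubling periods are those quoted above.

def decade_fold(doubling_period_years: float) -> float:
    """Factor of efficiency improvement accumulated over 10 years."""
    return 2 ** (10 / doubling_period_years)

print(f"1.57-year doubling: {decade_fold(1.57):5.1f}x per decade")  # ~82.6x
print(f"1.5-year doubling:  {decade_fold(1.5):5.1f}x per decade")   # ~101.6x
print(f"2.6-year doubling:  {decade_fold(2.6):5.1f}x per decade")   # ~14.4x
print(f"2.5-year doubling:  {decade_fold(2.5):5.1f}x per decade")   # 16.0x
```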
Texas Instruments
Texas Instruments Incorporated (TI) is an American multinational semiconductor company headquartered in Dallas, Texas. It is one of the top 10 semiconductor companies worldwide based on sales volume. The company's focus is on developing analog chips and embedded processors, which account for more than 80% of its revenue. TI also produces digital light processing (DLP) technology and education technology products including calculators, microcontrollers, and multi-core processors. Texas Instruments emerged in 1951 after a reorganization of Geophysical Service Incorporated, a company founded in 1930 that manufactured equipment for use in the seismic industry, as well as defense electronics. TI produced the world's first commercial silicon transistor in 1954, and the same year designed and manufactured the first transistor radio. Jack Kilby invented the integrated circuit in 1958 while working at TI's Central Research Labs. TI also invented the hand-held calculator in 1967, and intr ...
Communications Of The ACM
''Communications of the ACM'' (''CACM'') is the monthly journal of the Association for Computing Machinery (ACM).

History
It was established in 1958, with Saul Rosen as its first managing editor. It is sent to all ACM members. Articles are intended for readers with backgrounds in all areas of computer science and information systems. The focus is on the practical implications of advances in information technology and associated management issues; ACM also publishes a variety of more theoretical journals. The magazine straddles the boundary between a science magazine, a trade magazine, and a scientific journal. While the content is subject to peer review, the articles published are often summaries of research that may also be published elsewhere. Material published must be accessible and relevant to a broad readership. From 1960 onward, ''CACM'' also published algorithms, expressed in ALGOL. The collection of algorithms later became known as the Collected Algorithms of the ACM. CA ...
Swanson's Law
Swanson's law is the observation that the price of solar photovoltaic modules tends to drop 20 percent for every doubling of cumulative shipped volume. At present rates, costs go down about 75% every 10 years.

Origin
It is named after Richard Swanson, the founder of SunPower Corporation, a solar panel manufacturer. The term ''Swanson's Law'' appears to have originated with an article in ''The Economist'' published in late 2012; Swanson had been presenting such curves at technical conferences for several years. Swanson's law has been compared to Moore's law, which predicts the growing computing power of processors. Swanson's law is a solar-industry-specific application of the more general Wright's law, which states that there will be a fixed percentage cost reduction for each doubling of manufacturing volume.

Technical Background
The method used by Swanson is more commonly referred to as a ''learning curve'' or, more precisely, an ''experience curve'' ...
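As a hedged illustration (not from the original article), Wright's law can be written as cost(n) = cost(1) · n^(−b); Swanson's 20% drop per doubling corresponds to a progress ratio of 0.8, i.e. b = −log₂(0.8) ≈ 0.322. The first-unit cost and volumes below are made-up numbers:

```python
import math

PROGRESS_RATIO = 0.8            # cost multiplier per doubling of cumulative volume
b = -math.log2(PROGRESS_RATIO)  # learning exponent, ~0.322

def module_cost(cumulative_volume: float, first_unit_cost: float = 100.0) -> float:
    """Cost per module after a given cumulative shipped volume (arbitrary units)."""
    return first_unit_cost * cumulative_volume ** (-b)

for doublings in range(7):
    print(f"{doublings} doublings: cost = {module_cost(2 ** doublings):6.2f}")
# After ~6.2 doublings the cost has fallen by 75% (0.8**6.2 ~= 0.25), which is
# consistent with the "75% about every 10 years" figure quoted above.
```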
Performance Per Watt
In computing, performance per watt is a measure of the energy efficiency of a particular computer architecture or computer hardware. Literally, it measures the rate of computation that can be delivered by a computer for every watt of power consumed. When comparing computing systems, this rate is typically measured by performance on the LINPACK benchmark; an example is the Green500 list of supercomputers. Performance per watt has been suggested to be a more sustainable measure of computing than Moore's law. System designers building parallel computers often pick CPUs based on their performance per watt of power, because the cost of powering the CPU outweighs the cost of the CPU itself. Spaceflight computers have hard limits on the maximum power available and also have hard requirements on minimum real-time performance. A ratio of processing speed to required electrical power is more useful than raw processing speed. D. J. Shirley and M. K. McLellan ...
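As a simple illustration of the metric (the systems and figures below are hypothetical, not real Green500 entries), performance per watt is just sustained throughput divided by power draw:

```python
# Hypothetical systems; the numbers are illustrative only.
systems = {
    "system_a": {"linpack_gflops": 1_500_000.0, "power_kw": 2_000.0},
    "system_b": {"linpack_gflops": 400_000.0, "power_kw": 350.0},
}

for name, s in systems.items():
    # GFLOPS per watt, the ratio the Green500 list ranks systems by.
    gflops_per_watt = s["linpack_gflops"] / (s["power_kw"] * 1_000)
    print(f"{name}: {gflops_per_watt:.3f} GFLOPS/W")
```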
Limits Of Computation
The limits of computation are governed by a number of different factors. In particular, there are several physical and practical limits to the amount of computation or data storage that can be performed with a given amount of mass, volume, or energy.

Hardware limits or physical limits

Processing and memory density
* The Bekenstein bound limits the amount of information that can be stored within a spherical volume to the entropy of a black hole with the same surface area.
* Thermodynamics limits the data storage of a system based on its energy, number of particles, and particle modes. In practice, it is a stronger bound than the Bekenstein bound.

Processing speed
* Bremermann's limit is the maximum computational speed of a self-contained system in the material universe, and is based on mass–energy versus quantum uncertainty constraints.

Communication delays
* The Margolus–Levitin theorem sets a bound on the maximum computational speed per unit of energy: 6 × 10³³ operations per second per joule ...
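To give a sense of scale, here is an illustrative sketch (not part of the original article) that evaluates the Margolus–Levitin bound, which for a system of mean energy E is 2E/(πℏ) operations per second:

```python
import math

HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s

def margolus_levitin_rate(energy_joules: float) -> float:
    """Maximum operations per second for a system with the given mean energy,
    where an 'operation' is a transition to an orthogonal quantum state."""
    return 2 * energy_joules / (math.pi * HBAR)

# One joule of energy: ~6e33 operations per second, the figure quoted above.
print(f"{margolus_levitin_rate(1.0):.2e} ops/s per joule")
```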
Beyond CMOS
Beyond CMOS refers to possible future digital logic technologies beyond the scaling limits of CMOS technology, which limits device density and speed due to heating effects. ''Beyond CMOS'' is the name of one of the 7 focus groups in ITRS 2.0 (2013) and in its successor, the International Roadmap for Devices and Systems. CPUs using CMOS were released from 1986 (e.g. the 12 MHz Intel 80386). As CMOS transistor dimensions shrank, clock speeds also increased. Since about 2004, CMOS CPU clock speeds have leveled off at about 3.5 GHz. CMOS device sizes continue to shrink – see Intel's process–architecture–optimization model (and the older tick–tock model) and the ITRS:
* 22 nanometer Ivy Bridge in 2012
* first 14 nanometer processors shipped in Q4 2014
* in May 2015, Samsung Electronics showed a 300 mm wafer of 10 nanometer FinFET chips
It is not yet clear if CMOS transistors will still work below 3 nm. See 3 nanometer.

Comparisons of techno ...
Reversible Computing
Reversible computing is any model of computation where every step of the process is time-reversible. This means that, given the output of a computation, it is possible to perfectly reconstruct the input. In systems that progress deterministically from one state to another, a key requirement for reversibility is a one-to-one correspondence between each state and its successor. Reversible computing is considered an unconventional approach to computation and is closely linked to quantum computing, where the principles of quantum mechanics inherently ensure reversibility (as long as quantum states are not measured or "collapsed").

Reversibility
There are two major, closely related types of reversibility that are of particular interest for this purpose: physical reversibility and logical reversibility. A process is said to be ''physically reversible'' if it results in no increase in physical entropy; it is isentropic. There is a style of circuit design ideally exhibiting this ...
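As an illustrative sketch (not from the original article), logical reversibility can be demonstrated with the Toffoli (controlled-controlled-NOT) gate, a universal reversible gate that is a bijection on three-bit states and its own inverse:

```python
from itertools import product

def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Toffoli (CCNOT) gate: flips c if and only if both a and b are 1."""
    return a, b, c ^ (a & b)

# Applying the gate twice recovers every input, so each output state
# corresponds to exactly one input state: the map is one-to-one.
for state in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*state)) == state

# All 8 outputs are distinct, confirming the bijection.
assert len({toffoli(*state) for state in product((0, 1), repeat=3)}) == 8
```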
Landauer's Principle
Landauer's principle is a physical principle pertaining to a lower theoretical limit of energy consumption of computation. It holds that an irreversible change in information stored in a computer, such as merging two computational paths, dissipates a minimum amount of heat to its surroundings. It is hypothesized that energy consumption below this lower bound would require the development of reversible computing. The principle was first proposed by Rolf Landauer in 1961.

Statement
Landauer's principle states that the minimum energy needed to erase one bit of information is proportional to the temperature at which the system is operating. Specifically, the energy needed for this computational task is given by
:E \geq k_\text{B} T \ln 2,
where k_\text{B} is the Boltzmann constant and T is the temperature in kelvins. At room temperature, the Landauer limit represents an energy of approximately 0.018 eV (2.9 × 10⁻²¹ J). Modern computers use about a billion times as much energy per operation.

History
Rolf Lan ...
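The room-temperature figure follows directly from the formula; a quick illustrative check (not part of the original article):

```python
import math

K_B = 1.380_649e-23     # Boltzmann constant, J/K
EV = 1.602_176_634e-19  # joules per electronvolt

def landauer_limit(temperature_k: float) -> float:
    """Minimum energy in joules to erase one bit at the given temperature."""
    return K_B * temperature_k * math.log(2)

e_room = landauer_limit(300.0)  # room temperature, ~300 K
print(f"{e_room:.2e} J ({e_room / EV:.3f} eV)")  # ~2.87e-21 J, ~0.018 eV
```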
Second Law Of Thermodynamics
The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter (or 'downhill' in terms of the temperature gradient). Another statement is: "Not all heat can be converted into work in a cyclic process." (Young, H. D.; Freedman, R. A. (2004). ''University Physics'', 11th edition. Pearson. p. 764.) The second law of thermodynamics establishes the concept of entropy as a physical property of a thermodynamic system. It predicts whether processes are forbidden despite obeying the requirement of conservation of energy as expressed in the first law of thermodynamics, and provides necessary criteria for spontaneous processes. For example, the first law allows the process of a cup falling off a table and breaking on the floor, as well as allowi ...
Efficient Energy Use
Efficient energy use, or energy efficiency, is the process of reducing the amount of energy required to provide products and services. There are many technologies and methods available that are more energy efficient than conventional systems. For example, insulating a building allows it to use less heating and cooling energy while still maintaining a comfortable temperature. Another method is to remove energy subsidies that promote high energy consumption and inefficient energy use. Improved energy efficiency in buildings, industrial processes, and transportation could reduce the world's energy needs in 2050 by one third. There are two main motivations to improve energy efficiency. The first is to achieve cost savings during the operation of the appliance or process. However, installing an energy-efficient technology comes with an upfront cost, the ...
IEEE Micro
''IEEE Micro'' is a bimonthly peer-reviewed scientific journal published by the IEEE Computer Society covering small systems and semiconductor chips, including integrated circuit processes and practices, project management, development tools and infrastructure, as well as chip design and architecture, empirical evaluations of small system and IC technologies and techniques, and human and social aspects of system development.

Editors-in-chief
The following people are or have been editor-in-chief:
* 2024–present: Hsien-Hsin Sean Lee
* 2019–2023: Lizy Kurian John
* 2015–2018: Lieven Eeckhout
* 2011–2014: Erik R. Altman
* 2007–2010: David H. Albonesi
* 2003–2006: Pradip Bose
* 1999–2001: Ken Sakamura
* 1995–1998: Steve Diamond
* 1991–1994: Dante Del Corso
* 1987–1990: Joe Hootman
* 1985–1987: James J. Farrell III
* 1983–1984: Peter Rony and Tom Cain
* 1980–1982: Richard C. Jaeger