Network analyzer (AC power)

From 1929 to the late 1960s, large alternating current power systems were modelled and studied on AC network analyzers (also called alternating current network calculators or AC calculating boards) or transient network analyzers. These special-purpose analog computers were an outgrowth of the DC calculating boards used in the very earliest power system analysis. By the middle of the 1950s, fifty network analyzers were in operation. AC network analyzers were widely used for power flow studies, short-circuit calculations, and system stability studies, but were ultimately replaced by numerical solutions running on digital computers. While the analyzers could provide real-time simulation of events, with no concerns about the numeric stability of algorithms, they were costly, inflexible, and limited in the number of buses and lines that could be simulated. Eventually powerful digital computers replaced analog network analyzers for practical calculations, but analog physical models for studying electrical transients are still in use.

# Calculating methods

As AC power systems became larger at the start of the 20th century, with more interconnected devices, calculating the expected behavior of a system became more difficult. Manual methods were practical only for systems with a few sources and nodes; the complexity of practical problems made manual calculation too laborious or inaccurate to be useful. Many mechanical aids to calculation were developed to solve problems in power networks. DC calculating boards used resistors and DC sources to represent an AC network: a resistor modelled the inductive reactance of a circuit, while the actual series resistance of the circuit was neglected. The principal disadvantage was the inability to model complex impedances, but for short-circuit fault studies the effect of the resistance component was usually small. DC boards produced results with errors of around 20 percent, sufficient for some purposes.

Artificial lines were used to analyze transmission lines. These carefully constructed replicas of the distributed inductance, capacitance, and resistance of a full-size line were used to investigate the propagation of impulses and to validate theoretical calculations of transmission line properties. An artificial line was made by winding layers of wire around a glass cylinder, with interleaved sheets of tin foil, to give the model proportionally the same distributed inductance and capacitance as the full-size line. Later, lumped-element approximations of transmission lines were found to give adequate precision for many calculations.

Laboratory investigations of the stability of multiple-machine systems were constrained by the use of direct-operated indicating instruments (voltmeters, ammeters, and wattmeters): to ensure that the instruments did not appreciably load the model system, substantial machine power levels were needed.
Some workers in the 1920s used three-phase model generators rated up to 600 kVA and 2300 volts to represent a power system; General Electric developed model systems using generators rated at 3.75 kVA. It was difficult to keep multiple generators in synchronism, and the size and cost of the units were a constraint. While transmission lines and loads could be accurately scaled down to laboratory representations, rotating machines could not be miniaturized while keeping the same dynamic characteristics as full-sized prototypes; the ratio of machine inertia to machine frictional loss did not scale.
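The reactance-only approximation behind the DC calculating boards is easy to check numerically. The sketch below, with purely illustrative circuit values (not taken from any historical board), shows why neglecting series resistance barely changes a fault-current magnitude when reactance dominates:

```python
# Sketch (hypothetical values): a DC board modelled only reactance.
# When X >> R, |Z| = sqrt(R^2 + X^2) is close to X, so dropping the
# resistance changes the computed fault current only slightly.
import math

V = 13_800 / math.sqrt(3)   # phase voltage of an assumed 13.8 kV feeder
R, X = 0.5, 5.0             # assumed series resistance and reactance, ohms

i_exact = V / math.sqrt(R**2 + X**2)   # fault current with full impedance
i_approx = V / X                       # reactance-only model (DC board)

error = (i_approx - i_exact) / i_exact
print(f"exact {i_exact:.0f} A, reactance-only {i_approx:.0f} A, "
      f"error {error:.1%}")
```

With reactance ten times the resistance, the reactance-only figure is high by only about half a percent; the 20-percent-class errors of the boards came from the modelling compromises as a whole, not from this approximation alone.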

# The MIT network analyzer

The network analyzer installed at the Massachusetts Institute of Technology (MIT) grew out of a 1924 thesis project by Hugh H. Spencer and Harold Locke Hazen, investigating a power system modelling concept proposed by Vannevar Bush. Instead of miniature rotating machines, each generator was represented by a transformer with adjustable voltage and phase, all fed from a common source. This eliminated the poor accuracy of models built from miniature machines. The 1925 publication of this thesis attracted attention at General Electric, where Robert Doherty was interested in modelling problems of system stability. He asked Hazen to verify that the model could accurately reproduce the behavior of machines during load changes.

Design and construction were carried out jointly by General Electric and MIT. When first demonstrated in June 1929, the system had eight phase-shifting transformers to represent synchronous machines. Other elements included 100 variable line resistors, 100 variable reactors, 32 fixed capacitors, and 40 adjustable load units. The analyzer was described in a 1930 paper by H. L. Hazen, O. R. Schurig, and M. F. Gardner. The base quantities for the analyzer were 200 volts and 0.5 amperes. Sensitive portable thermocouple-type instruments were used for measurement. The analyzer occupied four large panels, arranged in a U-shape, with tables in front of each section to hold measuring instruments.

While primarily conceived as an educational tool, the analyzer saw considerable use by outside firms, who paid to use the device. The American Gas and Electric Company, the Tennessee Valley Authority, and many other organizations studied problems on the MIT analyzer in its first decade of operation. In 1940 the system was moved and expanded to handle more complex systems. By 1953 the MIT analyzer was beginning to fall behind the state of the art.
Digital computers were first applied to power system problems as early as 1949, on MIT's Whirlwind I. Unlike most of the forty other analyzers in service by that point, the MIT instrument was energized at 60 Hz rather than 440 or 480 Hz, making its components large and expansion to new types of problems difficult. Many utility customers had bought their own network analyzers. The MIT system was dismantled and sold to the Puerto Rico Water Resources Authority (later the Puerto Rico Electric Power Authority) in 1954.
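The phase-shifting-transformer idea translates directly into phasor arithmetic: each machine is simply a source with a settable magnitude and angle, and the power carried by a line follows from complex algebra. The sketch below uses illustrative per-unit values (not the MIT machine's actual parameters) to compute the real power transferred over a line reactance between two such sources:

```python
# Sketch: a "generator" as an adjustable-magnitude, adjustable-angle
# source, the role played by each phase-shifting transformer. The real
# power over a pure line reactance is recovered from the phasor math.
import cmath
import math

def line_power(v1, delta1, v2, delta2, x):
    """Real power flowing from bus 1 to bus 2 over a pure reactance x."""
    e1 = cmath.rect(v1, delta1)         # sending-end voltage phasor
    e2 = cmath.rect(v2, delta2)         # receiving-end voltage phasor
    i = (e1 - e2) / complex(0.0, x)     # line current phasor
    s = e1 * i.conjugate()              # complex power injected at bus 1
    return s.real

# Illustrative case: equal 1.0 pu voltages, 10 degree angle difference,
# 0.5 pu line reactance.
p = line_power(1.0, math.radians(10), 1.0, 0.0, 0.5)
print(f"P = {p:.4f} pu")
```

The printed value agrees with the familiar relation P = (V1·V2/X)·sin δ, which is exactly the quantity stability studies on such analyzers turned on.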

# Commercial manufacturers

By 1947, fourteen network analyzers had been built at a total cost of about two million US dollars. General Electric built two full-scale network analyzers for its own work and for services to its clients. Westinghouse built systems for its internal use and provided more than 20 analyzers to utility and university clients. After the Second World War, analyzers were known to be in use in France, the UK, Australia, Japan, and the Soviet Union. Later models had improvements such as centralized control of switching, central measurement bays, and chart recorders to automatically provide permanent records of results.

General Electric's Model 307 was a miniaturized AC network analyzer with four generator units and a single electronically amplified metering unit. It was targeted at utility companies needing to solve problems too large for hand computation but not worth the expense of renting time on a full-size analyzer. Like the Iowa State College analyzer, it used a system frequency of 10 kHz instead of 60 Hz or 480 Hz, allowing much smaller radio-style capacitors and inductors to be used to model power system components. The 307 was cataloged from 1957 and had a list of about 20 utility, educational, and government customers; in 1959 its list price was \$8,590.

In 1953, the Metropolitan Edison Company and a group of six other electrical companies purchased a new Westinghouse AC network analyzer for installation at the Franklin Institute in Philadelphia. The system, described as the largest ever built, cost \$400,000. In Japan, network analyzers were installed starting in 1951; the Yokogawa Electric company introduced a model energized at 3980 Hz in 1956.

# Other applications

## Transient analyzer

A "transient network analyzer" was an analog model of a transmission system especially adapted to study high-frequency transient surges (such as those due to lightning or switching) rather than AC power-frequency currents. Like an AC network analyzer, it represented apparatus and lines with scaled inductances and resistances. A synchronously driven switch repeatedly applied a transient impulse to the model system, and the response at any point could be observed on an oscilloscope or recorded on an oscillograph. Some transient analyzers are still in use for research and education, sometimes combined with digital protective relays or recording instruments.
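What a transient analyzer displayed on its oscilloscope can be imitated with a short numerical integration. The sketch below, using arbitrary illustrative values rather than any real apparatus data, applies a step surge to a single lumped R-L-C section and tracks the peak capacitor voltage, reproducing the classic overshoot a switching surge produces at a lightly damped termination:

```python
# Sketch: step-surge response of one lumped R-L-C section (illustrative
# values). Semi-implicit Euler integration of the series loop:
#   L di/dt = V - R i - v_c,   C dv_c/dt = i
L, C, R = 1e-3, 1e-6, 5.0    # henries, farads, ohms (light damping)
dt, steps = 1e-7, 20_000     # time step and step count
v_src = 1.0                  # 1 V step applied at t = 0

i, v_c = 0.0, 0.0
peak = 0.0
for _ in range(steps):
    di = (v_src - R * i - v_c) / L    # KVL around the loop
    i += di * dt
    v_c += (i / C) * dt               # capacitor integrates the current
    peak = max(peak, v_c)

print(f"peak capacitor voltage: {peak:.2f} V")
```

With the light damping assumed here the capacitor voltage peaks at roughly 1.8 times the source voltage; in the undamped limit it would reach exactly twice, the overshoot that made switching surges a design concern.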

## Anacom

The Westinghouse ''Anacom'' was an AC-energized electrical analog computer system used extensively for problems in mechanical design, structural elements, lubrication oil flow, and various transient problems, including those due to lightning surges in electric power transmission systems. The excitation frequency of the computer could be varied. The Anacom constructed in 1948 was used for engineering calculations until the early 1990s; its original cost was \$500,000. The system was periodically updated and expanded; by the 1980s the Anacom could run through many simulation cases unattended, under the control of a digital computer that automatically set up initial conditions and recorded the results. Westinghouse built a replica Anacom for Northwestern University and sold an Anacom to ABB; twenty or thirty similar computers by other makers were used around the world.

## Physics and chemistry

Since the multiple elements of the AC network analyzer formed a powerful analog computer, problems in physics and chemistry were occasionally modelled on it in the late 1940s, before general-purpose digital computers became readily available, by researchers such as Gabriel Kron of General Electric. Another application was water flow in water distribution systems. The forces and displacements of a mechanical system could be readily modelled with the voltages and currents of a network analyzer, which allowed easy adjustment of properties such as the stiffness of a spring by, for example, changing the value of a capacitor.
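The mechanical-electrical correspondence described above is the classical force-voltage analogy: mass maps to inductance and spring compliance (1/k) to capacitance, so stiffening a spring amounts to shrinking a capacitor. A minimal sketch with made-up values:

```python
# Sketch of the force-voltage analogy: mass <-> inductance, spring
# compliance 1/k <-> capacitance. Both systems then share the same
# natural frequency. All numbers are illustrative.
import math

m, k = 2.0, 800.0           # mass (kg), spring stiffness (N/m)
L, C = m, 1.0 / k           # electrical analog: L in henries, C in farads

w_mech = math.sqrt(k / m)           # mechanical natural frequency, rad/s
w_elec = 1.0 / math.sqrt(L * C)     # LC resonant frequency, rad/s
print(w_mech, w_elec)
```

Both expressions give the same frequency, which is what allowed vibration problems to be patched onto an analyzer's inductors and capacitors; doubling the stiffness k halves the analog capacitance and raises both frequencies by the same factor.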

## Structures

The David Taylor Model Basin operated an AC network analyzer from the late 1950s until the mid-1960s. The system was used on problems in ship design. An electrical analog of the structural properties of a proposed ship, shaft, or other structure could be built, and tested for its vibrational modes. Unlike AC analyzers used for power systems work, the exciting frequency was made continuously variable so that mechanical resonance effects could be investigated.

# Decline and obsolescence

Even during the Depression and the Second World War, many network analyzers were constructed because of their great value in solving calculations related to electric power transmission. By the mid-1950s, about thirty analyzers were available in the United States, representing an oversupply. Institutions such as MIT could no longer justify operating analyzers when paying clients barely covered operating expenses (James S. Small, ''The Analogue Alternative: The Electronic Analogue Computer in Britain and the USA, 1930-1975'', Routledge, 2013, ISBN 1134699026, pages 35-40).

Once digital computers of adequate performance became available, the solution methods developed on analog network analyzers were migrated to the digital realm, where plugboards, switches, and meter pointers were replaced with punch cards and printouts. The same general-purpose digital computer hardware that ran network studies could easily be dual-tasked with business functions such as payroll. Analog network analyzers faded from general use for load-flow and fault studies, although some persisted in transient studies for a while longer. Analog analyzers were dismantled and either sold to other utilities, donated to engineering schools, or scrapped.

The fate of a few analyzers illustrates the trend. The analyzer purchased by American Electric Power was replaced by digital systems in 1961 and donated to Virginia Tech. The Westinghouse network analyzer purchased by the State Electricity Commission of Victoria, Australia, in 1950 was taken out of utility service in 1967 and donated to the engineering department at Monash University; by 1985, even instructional use of the analyzer was no longer practical and the system was finally dismantled (photograph of part of a Westinghouse network analyzer, Museums Victoria, https://collections.museumvictoria.com.au/items/1763754, retrieved 3 August 2017).

One factor contributing to the obsolescence of analog models was the increasing complexity of interconnected power systems. Even a large analyzer could represent only a few machines and perhaps a few score lines and buses; digital computers routinely handled systems with thousands of buses and transmission lines.