Interlaced video (also known as interlaced scan) is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured consecutively. This enhances motion perception for the viewer and reduces flicker by taking advantage of the phi phenomenon.
This effectively doubles the time resolution (also called ''temporal resolution'') as compared to non-interlaced footage (for frame rates equal to field rates). Interlaced signals require a display that is natively capable of showing the individual fields in sequential order. CRT displays and ALiS plasma displays are made for displaying interlaced signals.
Interlaced scan refers to one of two common methods for "painting" a video image on an electronic display screen (the other being
progressive scan) by scanning or displaying each line or row of pixels. This technique uses two fields to create a frame. One field contains all odd-numbered lines in the image; the other contains all even-numbered lines.
A Phase Alternating Line (PAL)-based television set display, for example, scans 50 ''fields'' every second (25 odd and 25 even). The two sets of 25 fields work together to create a full ''frame'' every 1/25 of a second (or 25 frames per second), but with interlacing create a new half frame every 1/50 of a second (or 50 fields per second). To display interlaced video on progressive scan displays, playback applies deinterlacing to the video signal (which adds input lag).
The European Broadcasting Union has argued against interlaced video in production and broadcasting. They recommend 720p 50 fps (frames per second) for the current production format, and are working with the industry to introduce 1080p 50 as a future-proof production standard. 1080p 50 offers higher vertical resolution, better quality at lower bitrates, and easier conversion to other formats, such as 720p 50 and 1080i 50.
The main argument is that no matter how complex the deinterlacing algorithm may be, the artifacts in the interlaced signal cannot be completely eliminated because some information is lost between frames.
Despite arguments against it, television standards organizations continue to support interlacing. It is still included in digital video transmission formats such as DV, DVB, and ATSC. New video compression standards like High Efficiency Video Coding are optimized for progressive scan video, but sometimes do support interlaced video.
Description
Progressive scan captures, transmits, and displays an image in a path similar to text on a page—line by line, top to bottom.
The interlaced scan pattern in a standard definition CRT display also completes such a scan, but in two passes (two fields). The first pass displays the first and all odd numbered lines, from the top left corner to the bottom right corner. The second pass displays the second and all even numbered lines, filling in the gaps in the first scan.
This scan of alternate lines is called ''interlacing''. A ''field'' is an image that contains only half of the lines needed to make a complete picture.
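The relationship between fields and frames can be illustrated with a short sketch (assuming frames held as NumPy arrays indexed as rows × columns; the function names are illustrative only, not part of any standard): one function splits a frame into its two fields, and another weaves two fields back into a full frame.
<syntaxhighlight lang="python">
import numpy as np

def split_fields(frame: np.ndarray):
    """Split a full frame into its two fields.

    Rows 0, 2, 4, ... form the field of odd-numbered picture lines and
    rows 1, 3, 5, ... the field of even-numbered picture lines.
    """
    top_field = frame[0::2]      # every other line, starting with the first
    bottom_field = frame[1::2]   # the lines in between
    return top_field, bottom_field

def weave_fields(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave two fields back into one full frame (the 'weave' operation)."""
    height = top_field.shape[0] + bottom_field.shape[0]
    frame = np.empty((height, *top_field.shape[1:]), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame
</syntaxhighlight>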
Persistence of vision
makes the eye perceive the two fields as a continuous image. In the days of CRT displays, the afterglow of the display's phosphor aided this effect.
Interlacing provides full vertical detail with the same bandwidth that would be required for a full progressive scan, but with twice the perceived
frame rate and
refresh rate. To prevent flicker, all analog
broadcast television systems
used interlacing.
Format identifiers like 576i50 and 720p50 specify the frame rate for progressive scan formats, but for interlaced formats they typically specify the field rate (which is twice the frame rate). This can lead to confusion, because industry-standard
SMPTE timecode
formats always deal with frame rate, not field rate. To avoid confusion, SMPTE and EBU always use frame rate to specify interlaced formats, e.g., 480i60 is 480i/30, 576i50 is 576i/25, and 1080i50 is 1080i/25. This convention assumes that one complete frame in an interlaced signal consists of two fields in sequence.
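The two naming conventions differ only by the factor of two between field rate and frame rate. A minimal sketch of that conversion (the function name and label format are assumptions made for illustration, not part of any standard) might look like this:
<syntaxhighlight lang="python">
def field_rate_to_frame_rate_notation(label: str) -> str:
    """Convert e.g. '1080i50' (field-rate notation) to '1080i/25' (frame-rate notation)."""
    lines, rate = label.lower().split("i")
    frame_rate = int(rate) / 2          # one interlaced frame consists of two fields
    frame_rate = int(frame_rate) if frame_rate.is_integer() else frame_rate
    return f"{lines}i/{frame_rate}"

print(field_rate_to_frame_rate_notation("1080i50"))  # -> 1080i/25
print(field_rate_to_frame_rate_notation("480i60"))   # -> 480i/30
</syntaxhighlight>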
Benefits of interlacing
One of the most important factors in analog television is signal bandwidth, measured in megahertz. The greater the bandwidth, the more expensive and complex the entire production and broadcasting chain. This includes cameras, storage systems, broadcast systems—and reception systems: terrestrial, cable, satellite, Internet, and end-user displays (TVs and computer monitors).
For a fixed bandwidth, interlace provides a video signal with twice the display refresh rate for a given line count (versus progressive scan video at a similar frame rate, for instance 1080i at 60 half-frames per second vs. 1080p at 30 full frames per second). The higher refresh rate improves the appearance of an object in motion, because it updates its position on the display more often, and when an object is stationary, human vision combines information from multiple similar half-frames to produce the same perceived resolution as that provided by a progressive full frame. This technique is only useful, though, if source material is available at higher refresh rates. Cinema movies are typically recorded at 24 fps and therefore do not benefit from interlacing, a solution which reduces the maximum video bandwidth to 5 MHz without reducing the effective picture scan rate of 60 Hz.
Given a fixed bandwidth and high refresh rate, interlaced video can also provide a higher spatial resolution than progressive scan. For instance, 1920×1080 pixel resolution interlaced
HDTV
with a 60 Hz field rate (known as
1080i60 or 1080i/30) has a similar bandwidth to 1280×720 pixel progressive scan HDTV with a 60 Hz frame rate (720p60 or 720p/60), but achieves approximately twice the spatial resolution for low-motion scenes.
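The rough equivalence can be checked with a back-of-the-envelope calculation of raw, uncompressed pixel throughput (ignoring blanking intervals):
<syntaxhighlight lang="python">
# Uncompressed pixel throughput, ignoring horizontal and vertical blanking.
pixels_1080i = 1920 * 1080 * 30   # 30 full frames/s (60 fields/s) ~ 62.2 million pixels/s
pixels_720p  = 1280 * 720 * 60    # 60 full frames/s               ~ 55.3 million pixels/s

print(pixels_1080i / 1e6, pixels_720p / 1e6)  # 62.208 vs 55.296
</syntaxhighlight>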
However, bandwidth benefits only apply to an analog or ''uncompressed'' digital video signal. With digital video compression, as used in all current digital TV standards, interlacing introduces additional inefficiencies. EBU tests have shown that the bandwidth savings of interlaced video over progressive video are minimal, even with twice the frame rate: a 1080p50 signal produces roughly the same bit rate as a 1080i50 (1080i/25) signal, and 1080p50 actually requires less bandwidth to be perceived as subjectively better than its 1080i/25 (1080i50) equivalent when encoding a "sports-type" scene.
Interlacing can be exploited to produce 3D TV programming, especially with a CRT display and especially for
color filtered glasses by transmitting the color keyed picture for each eye in the alternating fields. This does not require significant alterations to existing equipment.
Shutter glasses
can be used as well, though they require synchronisation with the display. If a progressive scan display is used to view such programming, any attempt to deinterlace the picture will render the effect useless. For color filtered glasses the picture has to be either buffered and shown as if it were progressive with alternating color keyed lines, or each field has to be line-doubled and displayed as discrete frames. The latter procedure is the only way to suit shutter glasses on a progressive display.
Interlacing problems
Interlaced video is designed to be captured, stored, transmitted, and displayed in the same interlaced format. Because each interlaced video frame is two fields captured at different moments in time, interlaced video frames can exhibit motion artifacts known as ''interlacing effects'', or ''combing'', if recorded objects move fast enough to be in different positions when each individual field is captured. These artifacts may be more visible when interlaced video is displayed at a slower speed than it was captured, or in still frames.
While there are simple methods to produce somewhat satisfactory progressive frames from the interlaced image, for example by doubling the lines of one field and omitting the other (halving vertical resolution), or by anti-aliasing the image in the vertical axis to hide some of the combing, there are sometimes methods of producing results far superior to these. If there is only sideways (X axis) motion between the two fields and this motion is even throughout the full frame, it is possible to align the scanlines and crop the left and right ends that exceed the frame area to produce a visually satisfactory image. Minor Y axis motion can be corrected similarly by aligning the scanlines in a different sequence and cropping the excess at the top and bottom. The middle of the picture is usually the most important area to get right; whether X axis correction, Y axis correction, or both are applied, most remaining artifacts will occur towards the edges of the picture. However, even these simple procedures require motion tracking between the fields, and a rotating or tilting object, or one that moves in the Z axis (away from or towards the camera), will still produce combing, possibly looking even worse than if the fields were joined by a simpler method.
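Two of the approaches described above can be sketched as follows (an illustrative toy implementation assuming fields held as NumPy arrays, not a production deinterlacer): line-doubling a single field, and weaving two fields after a uniform horizontal alignment shift.
<syntaxhighlight lang="python">
import numpy as np

def bob_deinterlace(field):
    """Line-double a single field into a full-height frame.
    Halves the vertical resolution but uses only one moment in time,
    so no combing can occur."""
    return np.repeat(field, 2, axis=0)

def weave_with_x_alignment(top_field, bottom_field, shift_px):
    """Weave two fields after shifting the later field sideways by an
    estimated uniform horizontal motion, then crop the columns at both
    edges that no longer line up (the X-axis alignment described above)."""
    aligned = np.roll(bottom_field, -shift_px, axis=1)
    frame = np.empty((top_field.shape[0] * 2,) + top_field.shape[1:],
                     dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = aligned
    crop = abs(shift_px)
    return frame[:, crop:frame.shape[1] - crop] if crop else frame
</syntaxhighlight>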
Some
deinterlacing
processes can analyze each frame individually and decide the best method. In these cases the only perfect conversion is to treat each frame as a separate image, but that may not always be possible. For frame-rate conversions and zooming it is generally ideal to line-double each field to produce a double rate of progressive frames, resample the frames to the desired resolution, and then re-scan the stream at the desired rate, either in progressive or interlaced mode.
Interline twitter
Interlace introduces a potential problem called interline twitter, a form of moiré. This aliasing effect only shows up under certain circumstances, when the subject contains vertical detail that approaches the vertical resolution limit of the video format. For instance, a finely striped jacket on a news anchor may produce a shimmering effect. This is ''twittering''. Television professionals avoid wearing clothing with fine striped patterns for this reason.
Professional video cameras or computer-generated imagery systems apply a low-pass filter to the vertical resolution of the signal to prevent interline twitter.
Interline twitter is the primary reason that interlacing is less suited for computer displays. Each scanline on a high-resolution computer monitor typically displays discrete pixels, each of which does not span the scanline above or below. When the overall interlaced field rate is 60 fields per second, a pixel (or more critically for e.g. windowing systems or underlined text, a horizontal line) that spans only one scanline in height is visible for the 1/60 of a second that would be expected of a 60 Hz progressive display, but is then followed by 1/60 of a second of darkness (whilst the opposite field is scanned), reducing the per-line/per-pixel refresh rate to 30 Hz with quite obvious flicker.
To avoid this, standard interlaced television sets typically do not display sharp detail. When computer graphics appear on a standard television set, the screen is either treated as if it were half the resolution of what it actually is (or even lower), or rendered at full resolution and then subjected to a low-pass filter in the vertical direction (e.g. a "motion blur" type with a 1-pixel distance, which blends each line 50% with the next, maintaining a degree of the full positional resolution and preventing the obvious "blockiness" of simple line doubling whilst actually reducing flicker to less than what the simpler approach would achieve). If text is displayed, it is large enough so that any horizontal lines are at least two scanlines high. Most
fonts
for television programming have wide, fat strokes, and do not include fine-detail
serifs that would make the twittering more visible; in addition, modern character generators apply a degree of anti-aliasing that has a similar line-spanning effect to the aforementioned full-frame low-pass filter.
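The vertical low-pass filtering mentioned above can be sketched as a 50% blend of each line with the line below it (an illustrative example assuming an image held as a NumPy array):
<syntaxhighlight lang="python">
import numpy as np

def vertical_lowpass(image):
    """Blend each line 50% with the line below it (a 1-pixel vertical
    'motion blur'), suppressing detail that spans a single scanline and
    therefore reducing interline twitter."""
    softened = image.astype(np.float32)
    softened[:-1] = 0.5 * softened[:-1] + 0.5 * softened[1:]
    return softened.astype(image.dtype)
</syntaxhighlight>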
Deinterlacing
ALiS plasma panels and older CRTs can display interlaced video directly, but modern computer video displays and TV sets are mostly based on LCD technology, which mostly uses progressive scanning.
Displaying interlaced video on a progressive scan display requires a process called
deinterlacing. This is an imperfect technique that generally lowers resolution and causes various artifacts, particularly in areas with objects in motion. Providing the best picture quality for interlaced video signals requires expensive and complex devices and algorithms. For television displays, deinterlacing systems are integrated into progressive scan TV sets that accept interlaced signals, such as broadcast SDTV signals.
Most modern computer monitors do not support interlaced video, besides some
legacy medium-resolution modes (and possibly 1080i as an adjunct to 1080p), and support for standard-definition video (480/576i or 240/288p) is particularly rare given its much lower line-scanning frequency vs typical "VGA"-or-higher analog computer video modes. Playing back interlaced video from a DVD, digital file or analog capture card on a computer display instead requires some form of
deinterlacing
in the player software and/or graphics hardware, which often uses very simple methods to deinterlace. This means that interlaced video often has visible artifacts on computer systems. Computer systems may be used to edit interlaced video, but the disparity between computer video display systems and interlaced television signal formats means that the video content being edited cannot be viewed properly without separate video display hardware.
Currently manufactured TV sets employ a system of intelligently extrapolating the extra information that would be present in a progressive signal entirely from an interlaced original. In theory, this should simply be a matter of applying the appropriate algorithms to the interlaced signal, as all information should be present in that signal. In practice, results are currently variable, and depend on the quality of the input signal and the amount of processing power applied to the conversion. The biggest impediment, at present, is artifacts in the lower quality interlaced signals (generally broadcast video), as these are not consistent from field to field. On the other hand, high bit rate interlaced signals such as from HD camcorders operating in their highest bit rate mode work well.
Deinterlacing algorithms temporarily store a few frames of interlaced images and then extrapolate extra frame data to make a smooth flicker-free image. This frame storage and processing results in a slight
display lag
that is visible in business showrooms with a large number of different models on display. Unlike the old unprocessed NTSC signal, the screens do not all follow motion in perfect synchrony. Some models appear to update slightly faster or slower than others. Similarly, the audio can have an echo effect due to different processing delays.
History
When motion picture film was developed, the movie screen had to be illuminated at a high rate to prevent visible
flicker. The exact rate necessary varies by brightness — 50 Hz is (barely) acceptable for small, low brightness displays in dimly lit rooms, whilst 80 Hz or more may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame of film three times using a three-bladed shutter: a movie shot at 16 frames per second illuminated the screen 48 times per second. Later, when sound film became available, the higher projection speed of 24 frames per second enabled a two-bladed shutter to produce 48 times per second illumination—but only in projectors incapable of projecting at the lower speed.
This solution could not be used for television. To store a full video frame and display it twice requires a
frame buffer—electronic memory (RAM)—sufficient to store a video frame. This method did not become feasible until the late 1980s, with digital technology. In addition, avoiding on-screen
interference patterns caused by studio lighting and the limits of
vacuum tube
technology required that CRTs for TV be scanned at
AC line frequency. (This was 60 Hz in the US, 50 Hz in Europe.)
In 1930, German Telefunken engineer Fritz Schröter first formulated and patented the concept of breaking a single video frame into interlaced lines. In the USA, RCA engineer Randall C. Ballard patented the same idea in 1932.
Commercial implementation began in 1934 as cathode-ray tube screens became brighter, increasing the level of flicker caused by
progressive (sequential) scanning.
[R.W. Burns, ''Television: An International History of the Formative Years'', IET, 1998, p. 425.]
In 1936, when the UK was setting analog standards, early
thermionic valve-based CRT drive electronics could only scan at around 200 lines in 1/50 of a second (i.e. approximately a 10 kHz repetition rate for the sawtooth horizontal deflection waveform). Using interlace, a pair of 202.5-line fields could be superimposed to become a sharper
405 line frame (with around 377 used for the actual image, and yet fewer visible within the screen bezel; in modern parlance, the standard would be "377i"). The vertical scan frequency remained 50 Hz, but visible detail was noticeably improved. As a result, this system supplanted
John Logie Baird's 240 line mechanical progressive scan system that was also being trialled at the time.
From the 1940s onward, improvements in technology allowed the US and the rest of Europe to adopt systems using progressively higher line-scan frequencies and more radio signal bandwidth to produce higher line counts at the same frame rate, thus achieving better picture quality. However the fundamentals of interlaced scanning were at the heart of all of these systems. The US adopted the
525 line system, later incorporating the composite color standard known as
NTSC, Europe adopted the
625 line system, and the UK switched from its idiosyncratic 405 line system to (the much more US-like) 625 to avoid having to develop a (wholly) unique method of color TV. France switched from its similarly unique
819 line monochrome system to the more European standard of 625. Europe in general, including the UK, then adopted the
PAL
color encoding standard, which was essentially based on NTSC, but inverted the color carrier phase with each line (and frame) in order to cancel out the hue-distorting phase shifts that dogged NTSC broadcasts. France instead adopted its own unique, twin-FM-carrier based
SECAM
system, which offered improved quality at the cost of greater electronic complexity, and was also used by some other countries, notably Russia and its satellite states. Though the color standards are often used as synonyms for the underlying video standard - NTSC for 525i/60, PAL/SECAM for 625i/50 - there are several cases of inversions or other modifications; e.g. PAL color is used on otherwise "NTSC" (that is, 525i/60) broadcasts in
Brazil, as well as vice versa elsewhere, along with cases of PAL bandwidth being squeezed to 3.58 MHz to fit in the broadcast waveband allocation of NTSC, or NTSC being expanded to take up PAL's 4.43 MHz.
Interlacing was ubiquitous in displays until the 1970s, when the needs of
computer monitors resulted in the reintroduction of progressive scan, including on regular TVs or simple monitors based on the same circuitry; most CRT based displays are entirely capable of displaying both progressive and interlace regardless of their original intended use, so long as the horizontal and vertical frequencies match, as the technical difference is simply that of either starting/ending the vertical sync cycle halfway along a scanline every other frame (interlace), or always synchronising right at the start/end of a line (progressive). Interlace is still used for most standard definition TVs, and the
1080i
HDTV
broadcast standard, but not for
LCD, micromirror (
DLP), or most
plasma displays; these displays do not use a
raster scan
to create an image (their panels may still be updated in a left-to-right, top-to-bottom scanning fashion, but always in a progressive fashion, and not necessarily at the same rate as the input signal), and so cannot benefit from interlacing (where older LCDs use a "dual scan" system to provide higher resolution with slower-updating technology, the panel is instead divided into two ''adjacent'' halves that are updated ''simultaneously''): in practice, they have to be driven with a progressive scan signal. The
deinterlacing
circuitry to get progressive scan from a normal interlaced broadcast television signal can add to the cost of a television set using such displays. Currently, progressive displays dominate the HDTV market.
Interlace and computers
In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line
NTSC
signal was well beyond the graphics abilities of low cost computers, so these systems used a simplified video signal that made each video field scan directly on top of the previous one, rather than each line between two lines of the previous field, along with relatively low horizontal pixel counts. This marked the return of
progressive scanning not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this
240p on NTSC sets, and
288p on
PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video like this. Computer monitor standards such as the TTL-RGB mode available on the
CGA and e.g.
BBC Micro
were further simplifications to NTSC, which improved picture quality by omitting modulation of color, and allowing a more direct connection between the computer's graphics system and the CRT.
By the mid-1980s, computers had outgrown these video systems and needed better displays. Most home and basic office computers suffered from the use of the old scanning method, with the highest display resolution being around 640x200 (or sometimes 640x256 in 625-line/50 Hz regions), resulting in a severely distorted tall narrow
pixel
shape, making the display of high resolution text alongside realistic proportioned images difficult (logical "square pixel" modes were possible but only at low resolutions of 320x200 or less). Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7 and 8
MHz
of bandwidth that NTSC and PAL signals were confined to. IBM's
Monochrome Display Adapter
and
Enhanced Graphics Adapter
as well as the
Hercules Graphics Card
and the original
Macintosh
computer generated video signals of 342 to 350p, at 50 to 60 Hz, with approximately 16 MHz of bandwidth, some enhanced
PC clones such as the
AT&T 6300
(aka Olivetti M24) as well as computers made for the Japanese home market managed 400p instead at around 24 MHz, and the
Atari ST pushed that to 71 Hz with 32 MHz bandwidth - all of which required dedicated high-frequency (and usually single-mode, i.e. not "video"-compatible) monitors due to their increased line rates. The
Commodore Amiga
instead created a true interlaced 480i60/576i50
RGB
signal at broadcast video rates (and with a 7 or 14 MHz bandwidth), suitable for NTSC/PAL encoding (where it was smoothly decimated to 3.5~4.5 MHz). This ability (plus built-in
genlocking) resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail is required, with "flicker-fixer" scan-doubler peripherals plus high-frequency RGB monitors (or Commodore's own specialist scan-conversion A2024 monitor) being popular, if expensive, purchases amongst power users. 1987 saw the introduction of
VGA, on which PCs soon standardized, as well as Apple's
Macintosh II
range which offered displays of similar, then superior resolution and color depth, with rivalry between the two standards (and later PC quasi-standards such as XGA and SVGA) rapidly pushing up the quality of display available to both professional and home users.
In the late 1980s and early 1990s, monitor and graphics card manufacturers introduced newer high resolution standards that once again included interlace. These monitors ran at higher scanning frequencies, typically allowing a 75 to 90 Hz field rate (i.e. 37.5 to 45 Hz frame rate), and tended to use longer-persistence phosphors in their CRTs, all of which was intended to alleviate flicker and shimmer problems. Such monitors proved generally unpopular, outside of specialist ultra-high-resolution applications such as
CAD
and
DTP which demanded as many pixels as possible, with interlace being a necessary evil and better than trying to use the progressive-scan equivalents. Whilst flicker was often not immediately obvious on these displays, eyestrain and lack of focus nevertheless became a serious problem, and the trade-off for a longer afterglow was reduced brightness and poor response to moving images, leaving visible and often off-colored trails behind. These colored trails were a minor annoyance for monochrome displays, and the generally slower-updating screens used for design or database-query purposes, but much more troublesome for color displays and the faster motions inherent in the increasingly popular window-based operating systems, as well as the full-screen scrolling in WYSIWYG word-processors, spreadsheets, and of course for high-action games. Additionally, the regular, thin horizontal lines common to early GUIs, combined with low color depth that meant window elements were generally high-contrast (indeed, frequently stark black-and-white), made shimmer even more obvious than with otherwise lower fieldrate video applications. As rapid technological advancement made it practical and affordable, barely a decade after the first ultra-high-resolution interlaced upgrades appeared for the IBM PC, to provide sufficiently high pixel clocks and horizontal scan rates for hi-rez progressive-scan modes in first professional and then consumer-grade displays, the practice was soon abandoned. For the rest of the 1990s, monitors and graphics cards instead made great play of their highest stated resolutions being "non-interlaced", even where the overall framerate was barely any higher than what it had been for the interlaced modes (e.g. SVGA at 56p versus 43i to 47i), and usually including a top mode technically exceeding the CRT's actual resolution (number of color-phosphor triads) which meant there was no additional image clarity to be gained through interlacing and/or increasing the signal bandwidth still further. This experience is why the PC industry today remains against interlace in HDTV, and lobbied for the 720p standard, and continues to push for the adoption of 1080p (at 60 Hz for NTSC legacy countries, and 50 Hz for PAL); however, 1080i remains the most common HD broadcast resolution, if only for reasons of backward compatibility with older HDTV hardware that cannot support 1080p - and sometimes not even 720p - without the addition of an external scaler, similar to how and why most SD-focussed digital broadcasting still relies on the otherwise obsolete
MPEG2
standard embedded into e.g.
DVB-T.
See also
* Field (video): In interlaced video, one of the many still images displayed sequentially to create the illusion of motion on the screen.
* 480i: standard-definition interlaced video usually used in traditionally NTSC countries (North and parts of South America, Japan)
* 576i: standard-definition interlaced video usually used in traditionally PAL and SECAM countries
* 1080i: high-definition television (HDTV) digitally broadcast in 16:9 (widescreen) aspect ratio standard
* Progressive scan: the opposite of interlacing; the image is displayed line by line.
* Deinterlacing: converting an interlaced video signal into a non-interlaced one
* Progressive segmented frame: a scheme designed to acquire, store, modify, and distribute progressive-scan video using interlaced equipment and media
* Telecine: a method for converting film frame rates to television frame rates using interlacing
* Federal Standard 1037C: defines interlaced scanning
* Moving image formats
* Wobulation: a variation of interlacing used in DLP displays
* Screen tearing
References
External links
Fields: Why Video Is Crucially Different from Graphics – an article that describes field-based, interlaced, digitized video and its relation to frame-based computer graphics, with many illustrations
An article that explains with diagrams how the field order of PAL and NTSC has arisen, and how PAL and NTSC video is digitized
100FPS.COM – Video Interlacing/Deinterlacing
Interlace / Progressive Scanning - Computer vs. Video
Sampling theory and synthesis of interlaced video
{{Video formats}}
Film and video technology
Television technology
Video formats
1925 introductions