Serial digital interface (SDI) is a family of digital video interfaces first standardized by SMPTE (the Society of Motion Picture and Television Engineers) in 1989.
For example, ITU-R BT.656 and SMPTE 259M define digital video interfaces used for broadcast-grade video. A related standard, known as high-definition serial digital interface (HD-SDI), is standardized in SMPTE 292M; this provides a nominal data rate of 1.485 Gbit/s.
Additional SDI standards have been introduced to support increasing video resolutions (HD, UHD and beyond), frame rates, stereoscopic (3D) video, and color depth. Dual link HD-SDI consists of a pair of SMPTE 292M links, standardized by SMPTE 372M in 1998; this provides a nominal 2.970 Gbit/s interface used in applications (such as digital cinema or HDTV 1080P) that require greater fidelity and resolution than standard HDTV can provide. 3G-SDI (standardized in SMPTE 424M) consists of a single 2.970 Gbit/s serial link that allows replacing dual link HD-SDI. 6G-SDI and 12G-SDI standards were published on March 19, 2015.
These standards are used for transmission of uncompressed, unencrypted digital video signals (optionally including embedded audio and time code) within television facilities; they can also be used for packetized data. SDI is used to connect different pieces of equipment such as recorders, monitors, PCs and vision mixers. Coaxial variants of the specification are limited in run length. Fiber-optic variants of the specification, such as SMPTE 297M, allow for long-distance transmission limited only by maximum fiber length or repeaters.
SDI and HD-SDI are usually available only in professional video equipment because various licensing agreements restrict the use of unencrypted digital interfaces, such as SDI, prohibiting their use in consumer equipment. Several professional video and HD-video capable DSLR cameras and all uncompressed-video-capable consumer cameras use the HDMI interface, often called clean HDMI. There are various mod kits for existing DVD players and other devices, such as splitters that ignore HDCP, which allow a user to add a serial digital interface to these devices.
Electrical interface
The various serial digital interface standards all use (one or more) coaxial cables with BNC connectors, with a nominal impedance of 75 ohms.
This is the same type of cable used in analog composite video setups, potentially allowing for easier "drop-in" equipment upgrades (although, at high bit rates and/or long distances, it may be necessary for older, oxidising, or lower-grade cable to be replaced with optical fibre). The specified signal amplitude at the source is 800 mV (±10%) peak-to-peak; far lower voltages may be measured at the receiver owing to attenuation. Using equalization at the receiver, it is possible to send 270 Mbit/s SDI over long cable runs without use of repeaters, but shorter lengths are preferred. The HD bit rates have a shorter maximum run length.
Uncompressed digital component signals are transmitted. Data is encoded in NRZI format, and a linear feedback shift register is used to scramble the data to reduce the likelihood that long strings of zeroes or ones will be present on the interface. The interface is self-synchronizing and self-clocking. Framing is done by detection of a special synchronization pattern, which appears on the (unscrambled) serial digital signal as a sequence of ten ones followed by twenty zeroes (twenty ones followed by forty zeroes in HD); this bit pattern is not legal anywhere else within the data payload.
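The scrambling and NRZI coding described above can be sketched in a few lines. The generator polynomials commonly documented for these interfaces are x^9 + x^4 + 1 for the self-synchronizing scrambler and x + 1 for the NRZI stage; the all-zero initial register state below is an illustrative assumption (because the scrambler is self-synchronizing, the initial state only affects the first nine decoded bits).

```python
def scramble_nrzi(bits):
    """Scramble a bit sequence with the self-synchronizing polynomial
    x^9 + x^4 + 1, then apply NRZI coding (a 1 toggles the line level)."""
    state = 0   # 9-bit history of scrambled bits (bit 0 = most recent)
    level = 0   # current NRZI line level
    out = []
    for b in bits:
        # Feedback taps correspond to delays of 9 and 4 bits
        s = b ^ ((state >> 8) & 1) ^ ((state >> 3) & 1)
        state = ((state << 1) | s) & 0x1FF
        level ^= s            # NRZI: toggle on 1, hold on 0
        out.append(level)
    return out

def descramble_nrzi(line_bits):
    """Invert scramble_nrzi: NRZI-decode (a level change means 1),
    then unscramble with the same feedback taps."""
    state = 0
    prev_level = 0
    out = []
    for b in line_bits:
        s = b ^ prev_level    # NRZI decode
        prev_level = b
        out.append(s ^ ((state >> 8) & 1) ^ ((state >> 3) & 1))
        state = ((state << 1) | s) & 0x1FF
    return out
```

Because the descrambler rebuilds the same shift-register history from the received bits, the round trip recovers the original data without any shared state beyond the first nine bits.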
Standards
Bit rates
Several bit rates are used in serial digital video signal:
* For standard-definition applications, as defined by SMPTE 259M, the possible bit rates are 270 Mbit/s, 360 Mbit/s, 143 Mbit/s, and 177 Mbit/s. 270 Mbit/s is by far the most commonly used, though the 360 Mbit/s interface (used for widescreen standard definition) is sometimes encountered. The 143 and 177 Mbit/s interfaces were intended for transmission of composite-encoded (NTSC or PAL) video digitally and are now considered obsolete.
* For enhanced-definition applications (mainly 525P), there are several 540 Mbit/s interfaces defined, as well as an interface standard for a dual-link 270 Mbit/s interface. These are rarely encountered.
* For HDTV applications, the serial digital interface is defined by SMPTE 292M. Two bit rates are defined, 1.485 Gbit/s and 1.485/1.001 Gbit/s. The factor of 1/1.001 is provided to allow SMPTE 292M to support video formats with frame rates of 59.94 Hz, 29.97 Hz, and 23.98 Hz, in order to be compatible with existing NTSC systems. The 1.485 Gbit/s version of the standard supports other frame rates in widespread use, including 60 Hz, 50 Hz, 30 Hz, 25 Hz, and 24 Hz. It is common to collectively refer to both standards as using a nominal bit rate of 1.5 Gbit/s.
* For very high-definition applications, requiring greater resolution, frame rate, or color fidelity than the HD-SDI interface can provide, the SMPTE 372M standard defines the dual link interface. As the name suggests, this interface consists of two SMPTE 292M interconnects operating in parallel. In particular, the dual link interface supports 10-bit, 4:2:2, 1080P formats at frame rates of 60 Hz, 59.94 Hz, and 50 Hz, as well as 12-bit color depth, RGB encoding, and 4:4:4 color sampling.
* A nominal 3 Gbit/s interface (more accurately, 2.97 Gbit/s, but commonly referred to as "3 gig") was standardized by SMPTE as 424M in 2006. Revised in 2012 as SMPTE ST 424:2012, it supports all of the features supported by the dual 1.485 Gbit/s interface but requires only one cable rather than two.
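The rates above can be summarized in a short table, together with the 1/1.001 scaling used for NTSC-compatible frame rates. The dictionary labels below are informal shorthand rather than standards text, and the attribution of the 540 Mbit/s ED rate to SMPTE 344M is an assumption based on the standards cross-referenced above.

```python
# Nominal serial bit rates (Mbit/s) per interface family (informal labels)
RATES = {
    "SD-SDI (SMPTE 259M)": [143, 177, 270, 360],
    "ED-SDI (SMPTE 344M)": [540],
    "HD-SDI (SMPTE 292M)": [1485],
    "3G-SDI (SMPTE 424M)": [2970],
}

def fractional_rate(nominal_mbps):
    """The 1/1.001-scaled rate used with NTSC-compatible frame rates
    (59.94, 29.97, 23.98 Hz)."""
    return nominal_mbps / 1.001

print(round(fractional_rate(1485), 3))   # 1483.516 Mbit/s
print(round(fractional_rate(2970), 3))   # 2967.033 Mbit/s
```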
Other interfaces
SMPTE 297-2006 defines an optical fiber system for transmitting bit-serial digital signals. It is intended for transmitting SMPTE ST 259 signals (143 through 360 Mbit/s), SMPTE ST 344 signals (540 Mbit/s), SMPTE ST 292-1/-2 signals (1.485 Gbit/s and 1.485/1.001 Gbit/s) and SMPTE ST 424 signals (2.970 Gbit/s and 2.970/1.001 Gbit/s). In addition to the optical specification, ST 297 also mandates laser safety testing and requires that all optical interfaces be labelled to indicate safety compliance, application and interoperability.
An 8-bit parallel digital interface is defined by ITU-R Rec. 601; this is obsolete (however, many clauses in the various standards accommodate the possibility of an 8-bit interface).
Data format
In SD and ED applications, the serial data format is defined to be 10 bits wide, whereas in HD applications, it is 20 bits wide, divided into two parallel 10-bit datastreams (known as Y and C). The SD datastream multiplexes the luma and chroma samples into a single sequence:
: Cb0 Y0 Cr0 Y1 Cb1 Y2 Cr1 Y3 …
whereas the HD datastreams carry them separately:
; Y: Y0 Y1 Y2 Y3 …
; C: Cb0 Cr0 Cb1 Cr1 …
For all serial digital interfaces (excluding the obsolete composite encodings), the native color encoding is 4:2:2 YCbCr format. The luminance channel (Y) is encoded at full bandwidth (13.5 MHz in 270 Mbit/s SD, ~75 MHz in HD), and the two chrominance channels (Cb and Cr) are subsampled horizontally and encoded at half bandwidth (6.75 MHz or 37.5 MHz). The Y, Cr, and Cb samples are ''co-sited'' (acquired at the same instant in time), and the Y' sample is acquired at the time halfway between two adjacent Y samples.
In the above, Y refers to luminance samples, and C to chrominance samples. Cr and Cb further refer to the red and blue "color difference" channels; see Component video for more information. This section only discusses the native color encoding of SDI; other color encodings are possible by treating the interface as a generic 10-bit data channel. The use of other colorimetry encodings, and the conversion to and from RGB colorspace, is discussed below.
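The half-bandwidth chroma sampling explains the SD bit rate directly: one 13.5 MHz luma stream plus two 6.75 MHz chroma streams, at 10 bits per sample, account for the full 270 Mbit/s. The arithmetic:

```python
Y_RATE = 13.5e6    # luma sample rate, Hz
C_RATE = 6.75e6    # sample rate of each chroma channel (Cb, Cr), Hz
BITS = 10          # bits per sample on the interface

total = (Y_RATE + 2 * C_RATE) * BITS
print(total / 1e6)  # 270.0 Mbit/s
```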
Video payload (as well as ancillary data payload) may use any 10-bit word in the range 4 to 1,019 (004 to 3FB) inclusive; the values 0–3 and 1,020–1,023 (3FC–3FF) are reserved and may not appear anywhere in the payload. These reserved words have two purposes; they are used both for synchronization packets and for ancillary data headers.
Synchronization packets
A synchronization packet (commonly known as the timing reference signal or TRS) occurs immediately before the first active sample on every line, and immediately after the last active sample (and before the start of the horizontal blanking region). The synchronization packet consists of four 10-bit words; the first three words are always the same: 0x3FF, 0, 0. The fourth consists of three flag bits, along with an error-correcting code. As a result, there are eight different synchronization packets possible.
In the HD-SDI and dual link interfaces, synchronization packets ''must'' occur simultaneously in both the Y and C datastreams. (Some delay between the two cables in a dual link interface is permissible; equipment which supports dual link is expected to buffer the leading link in order to allow the other link to catch up). In SD-SDI and enhanced definition interfaces, there is only one datastream, and thus only one synchronization packet at a time. Other than the issue of how many packets appear, their format is the same in all versions of the serial-digital interface.
The flag bits found in the fourth word (commonly known as the XYZ word) are known as H, F, and V. The H bit indicates the start of horizontal blank; the synchronization packet immediately preceding the horizontal blanking region must have H set to one. Such packets are commonly referred to as End of Active Video, or EAV packets. Likewise, the packet appearing immediately before the start of the active video has H set to 0; this is the Start of Active Video or SAV packet.
Likewise, the V bit is used to indicate the start of the vertical blanking region; an EAV packet with V=1 indicates that the following line (lines are deemed to start at EAV) is part of the vertical interval, while an EAV packet with V=0 indicates that the following line is part of the active picture.
The F bit is used in interlaced and segmented-frame formats to indicate whether the line comes from the first or second field (or segment). In progressive scan formats, the F bit is always set to zero.
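The error-correcting code mentioned above has a fixed layout in the SMPTE/ITU definitions: bit 9 of the XYZ word is always one, bits 8–6 carry F, V and H, bits 5–2 carry four XOR-derived protection bits, and bits 1–0 are zero. A sketch of that layout:

```python
def xyz_word(f, v, h):
    """Build the fourth TRS word from the F, V, H flags.
    Bit layout (MSB first): 1 F V H P3 P2 P1 P0 0 0."""
    p3 = v ^ h
    p2 = f ^ h
    p1 = f ^ v
    p0 = f ^ v ^ h
    return (1 << 9) | (f << 8) | (v << 7) | (h << 6) \
         | (p3 << 5) | (p2 << 4) | (p1 << 3) | (p0 << 2)

# With F=V=H=0 this yields 0x200 (an SAV during active video, field 1);
# with H=1 it yields 0x274 (the corresponding EAV).
```

Enumerating all combinations of the three flags yields exactly the eight legal synchronization packets mentioned above.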
Line counter and CRC
In the high definition serial digital interface (and in dual-link HD), additional check words are provided to increase the robustness of the interface. In these formats, the four samples immediately following the EAV packets (but not the SAV packets) contain a cyclic redundancy check field and a line count indicator. The CRC field provides a CRC of the preceding line (CRCs are computed independently for the Y and C streams) and can be used to detect bit errors in the interface. The line count field indicates the line number of the current line.
The CRC and line counts are not provided in the SD and ED interfaces. Instead, a special ancillary data packet known as an EDH packet may be optionally used to provide a CRC check on the data.
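The HD line CRC is commonly documented with the generator polynomial x^18 + x^5 + x^4 + 1. A bitwise sketch follows; the zero initial register value and LSB-first bit ordering shown here are assumptions that should be checked against SMPTE 292M itself rather than taken as the normative procedure.

```python
def crc18(words):
    """CRC over a line of 10-bit words, generator x^18 + x^5 + x^4 + 1,
    register initialized to zero, each word fed LSB first."""
    crc = 0
    for w in words:
        for i in range(10):          # feed the 10 bits of the word
            bit = (w >> i) & 1
            fb = bit ^ (crc & 1)     # feedback from the register output
            crc >>= 1
            if fb:
                crc ^= 0x23000       # reflected taps of the generator
    return crc
```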
Line and sample numbering
Each sample within a given datastream is assigned a unique line and sample number. In all formats, the first sample immediately following the SAV packet is assigned sample number 0; the next sample is sample 1; all the way up to the XYZ word in the following SAV packet. In SD interfaces, where there is only one datastream, the 0th sample is a Cb sample; the 1st sample a Y sample, the 2nd sample a Cr sample, and the third sample is the Y' sample; the pattern repeats from there. In HD interfaces, each datastream has its own sample numbering—so the 0th sample of the Y datastream is the Y sample, the next sample the Y' sample, etc. Likewise, the first sample in the C datastream is Cb, followed by Cr, followed by Cb again.
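The sample numbering above implies a simple de-multiplexing rule for an SD active line: even-numbered samples alternate between Cb and Cr, while every odd-numbered sample is luma. A sketch (the 720-pixel line length in the example is only an illustration):

```python
def demux_sd(words):
    """Split multiplexed SD active-line samples (Cb Y Cr Y' Cb Y ...)
    into separate luma and chroma lists, per the numbering above."""
    cb = words[0::4]          # samples 0, 4, 8, ...
    cr = words[2::4]          # samples 2, 6, 10, ...
    y = words[1::2]           # samples 1, 3, 5, ... (Y and Y')
    return y, cb, cr

line = list(range(1440))      # a 720-pixel active line: 1440 words
y, cb, cr = demux_sd(line)
assert len(y) == 720 and len(cb) == len(cr) == 360
```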
Lines are numbered sequentially, starting from 1, up to the number of lines per frame of the indicated format (typically 525, 625, 750, or 1125 (Sony HDVS)). Determination of line 1 is somewhat arbitrary; however, it is unambiguously specified by the relevant standards. In 525-line systems, the first line of vertical blank is line 1, whereas in other interlaced systems (625 and 1125-line), the first line after the F bit transitions to zero is line 1.
Note that lines are deemed to start at EAV, whereas sample zero is the sample following SAV. This produces the somewhat confusing result that the first sample in a given line of 1080i video is sample number 1920 (the first EAV sample in that format), and the line ends at the following sample 1919 (the last active sample in that format). Note that this behavior differs somewhat from analog video interfaces, where the line transition is deemed to occur at the sync pulse, which occurs roughly halfway through the horizontal blanking region.
Link numbering
Link numbering is only an issue in multi-link interfaces. The first link (the ''primary'' link) is assigned a link number of 1, subsequent links are assigned increasing link numbers; so, the second (''secondary'') link in a dual-link system is link 2. The link number of a given interface is indicated by a VPID packet located in the vertical ancillary data space.
Note that the data layout in dual link is designed so that the primary link can be fed into a single-link interface, and still produce usable (though somewhat degraded) video. The secondary link generally contains things like additional LSBs (in 12-bit formats), non-cosited samples in 4:4:4 sampled video (so that the primary link is still valid 4:2:2), and alpha or data channels. If the second link of a 1080P dual link configuration is absent, the first link still contains a valid 1080i signal.
In the case of 1080p60, 59.94, or 50 Hz video over a dual link; each link contains a valid 1080i signal at the same field rate. The first link contains the 1st, 3rd, and 5th lines of odd fields and the 2nd, 4th, 6th, etc. lines of even fields, and the second link contains the even lines on the odd fields, and the odd lines on the even fields. When the two links are combined, the result is a progressive-scan picture at the higher frame rate.
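The line alternation just described reduces to a parity rule: link 1 carries the odd lines of odd fields and the even lines of even fields, and link 2 carries the complement. A sketch (the 1-based field and line numbering is an assumption here):

```python
def link_for_line(field, line_in_field):
    """Return which link (1 or 2) of a dual-link pair carries a given
    line, following the alternation described above.  Fields and lines
    are counted from 1."""
    odd_field = field % 2 == 1
    odd_line = line_in_field % 2 == 1
    # Link 1: odd lines of odd fields, even lines of even fields
    return 1 if odd_line == odd_field else 2

# Field 1 (odd): lines 1, 3, 5 go to link 1; lines 2, 4, 6 to link 2.
# Field 2 (even): the assignment flips.
```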
Ancillary data
Like SMPTE 259M, SMPTE 292M supports the SMPTE 291M standard for ancillary data. Ancillary data is provided as a standardized transport for non-video payload within a serial digital signal; it is used for things such as embedded audio, closed captions, timecode, and other sorts of metadata. Ancillary data is indicated by a three-word packet consisting of 0, 3FF, 3FF (the opposite of the synchronization packet header), followed by a two-word identification code, a data count word (indicating 0–255 words of payload), the actual payload, and a one-word checksum. Other than in their use in the header, the codes prohibited to video payload are also prohibited to ancillary data payload.
Specific applications of ancillary data include embedded audio, EDH, VPID and SDTI.
In dual link applications, ancillary data is mostly found on the primary link; the secondary link is to be used for ancillary data only if there is no room on the primary link. One exception to this rule is the VPID packet; both links must have a valid VPID packet present.
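The packet layout described above (ancillary data flag, identification, data count, payload, checksum) can be sketched as follows. The checksum rule (a nine-bit sum of everything after the flag words, with bit 9 set to the inverse of bit 8) follows SMPTE 291M; the per-word parity bits that the standard also places on the header words are omitted here for brevity, and the example DID/SDID values are hypothetical.

```python
def anc_packet(did, sdid, payload):
    """Assemble an ancillary-data packet: ADF (000, 3FF, 3FF), DID,
    second ID word (SDID or DBN, depending on DID type), data count,
    payload, checksum.  Header parity bits (b8, b9) are omitted."""
    words = [did, sdid, len(payload)] + list(payload)
    chk = sum(words) & 0x1FF              # nine-bit sum of DID..last word
    chk |= ((~chk >> 8) & 1) << 9         # bit 9 = inverse of bit 8
    return [0x000, 0x3FF, 0x3FF] + words + [chk]

pkt = anc_packet(0x61, 0x01, [1, 2, 3])   # hypothetical DID/SDID
```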
Embedded audio
Both the HD and SD serial interfaces provide for 16 channels of embedded audio. The two interfaces use different audio encapsulation methods: SD uses the SMPTE 272M standard, whereas HD uses the SMPTE 299M standard. In either case, an SDI signal may contain up to sixteen embedded audio channels (eight pairs) along with the video. Typically, 48 kHz, 24-bit (20-bit in SD, but extendable to 24-bit) PCM audio is stored, in a manner directly compatible with the AES3 digital audio interface. These are placed in the (horizontal) blanking periods, when the SDI signal carries nothing useful, since the receiver generates its own blanking signals from the TRS.
In dual-link applications, 32 channels of audio are available, as each link may carry 16 channels.
SMPTE ST 299-2:2010 extends the 3G SDI interface to be able to transmit 32 audio channels (16 pairs) on a single link.
EDH
As the standard definition interface carries no checksum, CRC, or other data integrity check, an EDH (Error Detection and Handling) packet may be optionally placed in the vertical interval of the video signal. This packet includes CRC values for both the active picture and the entire field (excluding those lines at which switching may occur, and which should contain no useful data); equipment can compute its own CRC and compare it with the received CRC in order to detect errors.
EDH is typically only used with the standard definition interface; the presence of CRC words in the HD interface makes EDH packets unnecessary.
VPID
VPID (or video payload identifier) packets are increasingly used to describe the video format. In early versions of the serial digital interface, it was always possible to uniquely determine the video format by counting the number of lines and samples between H and V transitions in the TRS. With the introduction of dual link interfaces, and segmented-frame standards, this is no longer possible; thus the VPID standard (defined by SMPTE 352M) provides a way to uniquely and unambiguously identify the format of the video payload.
Video payload and blanking
The active portion of the video signal is defined to be those samples which follow an SAV packet and precede the next EAV packet; where the corresponding EAV and SAV packets have the V bit set to zero. It is in the active portion that the actual image information is stored.
Color encoding
Several color encodings are possible in the serial digital interface. The default (and most common case) is 10-bit linearly sampled video data encoded as 4:2:2 YCbCr. (YCbCr is a digital representation of the YPbPr colorspace.) Samples of video are stored as described above. Data words correspond to signal levels of the respective video components, as follows:
* The luma (Y) channel is defined such that a signal level of 0 mV is assigned the codeword 64 (40 hex), and 700 millivolts (full scale) is assigned the codeword 940 (3AC hex).
* For the chroma channels, 0 mV is assigned the code word 512 (200 hex), −350 mV is assigned a code word of 64 (40 hex), and +350 mV is assigned a code word of 960 (3C0 hex).
Note that the scaling of the luma and chroma channels is ''not'' identical. The minimum and maximum of these ranges represent the preferred signal limits, though the video payload may venture outside these ranges (providing that the reserved code words of 0–3 and 1020–1023 are ''never'' used for video payload). In addition, the corresponding analog signal may have excursions further outside of this range.
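The asymmetric scaling noted above is easy to make explicit with a pair of converters: the luma range spans 876 code words over 700 mV, while the chroma range spans 896.

```python
def luma_mv(code):
    """10-bit luma code word to millivolts: 64 -> 0 mV, 940 -> 700 mV
    (876 code-word steps across 700 mV)."""
    return (code - 64) * 700 / 876

def chroma_mv(code):
    """10-bit chroma code word to millivolts: 512 -> 0 mV,
    64 -> -350 mV, 960 -> +350 mV (896 steps across 700 mV)."""
    return (code - 512) * 700 / 896
```

For example, luma_mv(940) gives 700.0 mV while chroma_mv(960) gives 350.0 mV, showing that equal code-word excursions do not correspond to equal voltages on the two channel types.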
Colorimetry
As YPbPr (and YCbCr) are both derived from the RGB
The RGB color model is an additive color model in which the red, green, and blue primary colors of light are added together in various ways to reproduce a broad array of colors. The name of the model comes from the initials of the three ...
colorspace, a means of converting is required. There are three colorimetries typically used with digital video:
* SD and ED applications typically use a colorimetry matrix specified in ITU-R Rec. 601.
* Most HD, dual link, and 3 Gbit/s applications use a different matrix, specified in ITU-R Rec. 709.
* The 1035-line MUSE HD standards specified by SMPTE 260M (primarily used in Japan, and now largely considered obsolete) used a colorimetry matrix specified by SMPTE 240M. This colorimetry is nowadays rarely used, as the 1035-line formats have been superseded by 1080-line formats.
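The Rec. 601 and Rec. 709 matrices differ only in their luma coefficients, from which the chroma scalings follow. A minimal sketch of the conversion (function name and structure are ours; the coefficients are the published Kr/Kb values):

```python
def rgb_to_ycbcr(r, g, b, kr, kb):
    """Convert gamma-corrected R'G'B' (each 0..1) to Y' (0..1) and
    Cb/Cr (-0.5..+0.5) using the given luma coefficients."""
    kg = 1.0 - kr - kb
    y = kr * r + kg * g + kb * b
    cb = (b - y) / (2.0 * (1.0 - kb))
    cr = (r - y) / (2.0 * (1.0 - kr))
    return y, cb, cr

# Luma coefficients (Kr, Kb) for the two common colorimetries:
REC601 = (0.299, 0.114)    # SD and ED applications
REC709 = (0.2126, 0.0722)  # most HD, dual link, and 3 Gbit/s applications
```

Pure red, for instance, yields a luma of 0.299 under Rec. 601 but only 0.2126 under Rec. 709, which is why decoding with the wrong matrix visibly shifts colors.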
Other color encodings
The dual-link and 3 Gbit/s interfaces additionally support other color encodings besides 4:2:2 YCbCr, namely:
* 4:2:2 and 4:4:4 YCbCr, with an optional ''alpha'' (used for linear keying, a.k.a. alpha compositing) or data (used for non-video payload) channel
* 4:4:4 RGB, also with an optional alpha or data channel
* 4:2:2 YCbCr, 4:4:4 YCbCr, and 4:4:4 RGB, with 12 bits of color information per sample, rather than 10. Note that the interface itself is still 10 bit; the additional 2 bits per channel are multiplexed into an additional 10-bit channel on the second link.
If an RGB encoding is used, the three primaries are all encoded in the same fashion as the Y channel; a value of 64 (40 hex) corresponds to 0 mV, and 940 (3AC hex) corresponds to 700 mV.
12-bit applications are scaled in a similar fashion to their 10-bit counterparts; the additional two bits are considered to be LSBs.
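The MSB/LSB split described above can be sketched as follows (the exact word layout on the second link is defined by SMPTE 372M; this only illustrates the arithmetic):

```python
def split_12bit(sample):
    """Split a 12-bit sample into its 10 MSBs (carried where the
    10-bit sample would go) and its 2 LSBs (multiplexed onto an
    additional channel on the second link)."""
    assert 0 <= sample <= 0xFFF
    return sample >> 2, sample & 0x3

def join_12bit(msb10, lsb2):
    """Recombine the 10 MSBs and 2 LSBs into the original 12-bit sample."""
    return (msb10 << 2) | lsb2
```

Dropping the two LSBs degrades a 12-bit signal gracefully to its 10-bit counterpart, which is why the scaling of the two bit depths is kept compatible.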
Vertical and horizontal blanking regions
For portions of the vertical and horizontal blanking regions which are not used for ancillary data, it is recommended that the luma samples be assigned the code word 64 (40 hex) and the chroma samples 512 (200 hex), both of which correspond to 0 mV. It is permissible to encode analog vertical interval information (such as vertical interval timecode or vertical interval test signals) without breaking the interface, but such usage is nonstandard (and ancillary data is the preferred means for transmitting metadata). Conversion of analog sync and burst signals into digital form, however, is not recommended, nor is it necessary in the digital interface.
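A sketch of the recommended quiescent blanking values, interleaved in the usual Cb, Y, Cr, Y order of a 4:2:2 stream (illustrative only; real blanking also carries timing reference signals and possibly ancillary data):

```python
def blanking_samples(luma_count):
    """Generate the 4:2:2 data words for `luma_count` luma samples of
    quiescent digital blanking: chroma at 512 (200 hex) and luma at
    64 (40 hex), both corresponding to 0 mV, interleaved as Cb, Y, Cr, Y."""
    words = []
    for _ in range(luma_count // 2):
        words += [512, 64, 512, 64]  # Cb, Y, Cr, Y'
    return words
```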
Different picture formats have different requirements for digital blanking; for example, all so-called 1080-line HD formats have 1080 active lines but 1125 total lines, with the remainder being vertical blanking.
Supported video formats
The various versions of the serial digital interface support numerous video formats:
* The 270 Mbit/s interface supports 525-line interlaced video at a 59.94 Hz field rate (29.97 Hz frame rate), and 625-line, 50 Hz interlaced video. These formats are highly compatible with NTSC and PAL-B/G/D/K/I systems respectively, and the terms NTSC and PAL are often (incorrectly) used to refer to these formats. (PAL is a composite color encoding scheme, and the term does not define the line standard, though it is most usually encountered with 625i; the serial digital interface, other than the obsolete 143 Mbit/s and 177 Mbit/s forms, is a component standard.)
* The 360 Mbit/s interface supports 525i and 625i widescreen. It can also be used to support 525p, if 4:2:0 sampling is used.
* The various 540 Mbit/s interfaces support 525p and 625p formats.
* The nominal 1.485 Gbit/s interfaces support most high-definition video formats. Supported formats include 1080/60i, 1080/59.94i, 1080/50i, 1080/30p, 1080/29.97p, 1080/25p, 1080/24p, 1080/23.98p, 720/60p, 720/59.94p, and 720/50p. In addition, there are several 1035i formats (an obsolete Japanese television standard), half-bandwidth 720p standards such as 720/24p (used in some film conversion applications, and unusual because it has an odd number of samples per line), and various 1080psf (progressive, segmented frame) formats. Progressive segmented-frame formats appear as interlaced video but contain progressively scanned video. This is done to support analog monitors and televisions, many of which are incapable of locking to low field rates such as 30 Hz and 24 Hz.
* The 2.97 Gbit/s dual link HD interface supports 1080/60p, 1080/59.94p, and 1080/50p, as well as 4:4:4 encoding, greater color depth, RGB encoding, alpha channels, and nonstandard resolutions (often encountered in computer graphics or digital cinema).
* A quad-link interface of 3G-SDI supports the UHDTV-1 resolution, 2160/60p.
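The nominal rates above follow directly from the raster dimensions: in a 10-bit 4:2:2 stream, each sample period carries a 10-bit luma word plus a 10-bit multiplexed chroma word, i.e. 20 bits. A sketch of the arithmetic (the function is ours; the total samples per line and total lines, including blanking, are the standard raster values):

```python
def sdi_bit_rate(samples_per_line, total_lines, frame_rate):
    """Nominal serial bit rate of a 10-bit 4:2:2 interface:
    20 bits per sample period (10-bit Y + 10-bit multiplexed Cb/Cr)."""
    return samples_per_line * total_lines * frame_rate * 20

# 1080-line formats: 2200 samples/line x 1125 lines x 30 frames/s
# 720-line formats:  1650 samples/line x  750 lines x 60 frames/s
# Both work out to the HD-SDI rate of 1.485 Gbit/s; 525i at 30000/1001
# frames/s (858 samples, 525 lines) gives the 270 Mbit/s SD rate.
```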
Related interfaces
In addition to the regular serial digital interface described here, there are several other interfaces which are similar to, or are contained within, a serial digital interface.
SDTI
There is an expanded specification called SDTI (''Serial Data Transport Interface''), which allows compressed (e.g. DV, MPEG and others) video streams to be transported over an SDI line. This allows for multiple video streams in one cable or faster-than-realtime (2x, 4x, ...) video transmission. A related standard, known as HD-SDTI, provides similar capability over an SMPTE 292M interface.
The SDTI interface is specified by SMPTE 305M. The HD-SDTI interface is specified by SMPTE 348M.
ASI
The asynchronous serial interface (ASI) specification describes how to transport an MPEG transport stream (MPEG-TS), containing multiple MPEG video streams, over 75-ohm copper coaxial cable or multi-mode optical fiber.
ASI is a popular way to transport broadcast programs from the studio to the final transmission equipment before they reach viewers at home.
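An MPEG transport stream, as carried over ASI, is a sequence of fixed-size 188-byte packets, each beginning with the sync byte 0x47. A minimal alignment check, as an illustrative sketch:

```python
TS_SYNC = 0x47    # MPEG-TS packet sync byte
TS_PACKET = 188   # MPEG-TS packet size in bytes

def ts_packet_offsets(data):
    """Return the byte offsets of the transport-stream packets in a
    packet-aligned buffer; raise if sync is lost."""
    offsets = []
    for off in range(0, len(data) - TS_PACKET + 1, TS_PACKET):
        if data[off] != TS_SYNC:
            raise ValueError("lost MPEG-TS sync at offset %d" % off)
        offsets.append(off)
    return offsets
```

Real receivers resynchronize by searching for a run of 0x47 bytes spaced 188 bytes apart rather than assuming alignment, but the packet structure is the same.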
The ASI standard is part of the Digital Video Broadcasting (DVB) standard.
SMPTE 349M
The standard ''SMPTE 349M: Transport of Alternate Source Image Formats through SMPTE 292M'', specifies a means to encapsulate non-standard and lower-bitrate video formats within an HD-SDI interface. This standard allows, for example, several independent standard-definition video signals to be multiplexed onto an HD-SDI interface and transmitted down one wire. This standard doesn't merely adjust EAV and SAV timing to meet the requirements of the lower-bitrate formats; instead, it provides a means by which an entire SDI format (including synchronization words, ancillary data, and video payload) can be ''encapsulated'' and transmitted as ordinary data payload within a 292M stream.
HDMI
The HDMI interface is a compact audio/video interface for transferring uncompressed video data and compressed or uncompressed digital audio data from an HDMI-compliant device to a compatible computer monitor, video projector, digital television, or digital audio device. It is mainly used in the consumer area, but is increasingly used in professional devices; such uncompressed video output is often called clean HDMI.
G.703
The G.703 standard is another high-speed digital interface, originally designed for telephony.
HDcctv
The HDcctv standard embodies the adaptation of SDI for video surveillance applications; it is not to be confused with TDI, a similar but different format for video surveillance cameras.
CoaXPress
The CoaXPress standard is another high-speed digital interface, originally designed for use with industrial cameras. The data rates for CoaXPress go up to 12.5 Gbit/s over a single coaxial cable. A 41 Mbit/s uplink channel and power over coax are also included in the standard.
Standards
* Society of Motion Picture and Television Engineers: ''SMPTE 274M-2005: Image Sample Structure, Digital Representation and Digital Timing Reference Sequences for Multiple Picture Rates''
* Society of Motion Picture and Television Engineers: ''SMPTE 292M-1998: Bit-Serial Digital Interface for High Definition Television''
* Society of Motion Picture and Television Engineers: ''SMPTE 291M-1998: Ancillary Data Packet and Space Formatting''
* Society of Motion Picture and Television Engineers: ''SMPTE 372M-2002: Dual Link 292M Interface for 1920 x 1080 Picture Raster''
External links
Standards of SMPTE
HDcctv Alliance (Security organization supporting SDI for security surveillance)