  • A and B Rolls
    The process used to create optical effects, such as dissolves or wipes in film or videotape. The A roll contains the outgoing scene and the B roll contains the incoming scene. The point where the A and B scenes begin to overlap is the start of the effect. The length of the overlap is known as the duration of the effect.
  • Access Unit
    AUs are coded representations of audio/video frames.
  • Active Server Pages
    Microsoft's technology for making HTML pages dynamic and interactive by embedding scripts, i.e. either VBScript or JScript, Microsoft's alternative to JavaScript. Since the scripts in ASP pages (suffix .asp) are processed by the server, any browser can work with ASP pages regardless of its support for the scripting language used therein.
  • ActiveX
    Phrase appearing everywhere COM technology is used, e.g. ActiveXControl, ActiveXServer, etc.
  • ActiveXControl
    A COM object that can be loaded from a server via the Internet or an intranet and run on any NT client. In contrast to Java applets, ActiveXControls do not run in a sandbox, i.e. the security restrictions are much weaker. Downloading ActiveXControls is therefore only recommended within intranets. Examples of ActiveXControls are XMLDOM and XMLHTTP.
  • Adaptive Coding and Modulation
    A technology which can automatically change the modulation and forward error correction or FEC of a link (referred to as MODCOD) to compensate for changes in link conditions
  • Adaptive Field Frame
    The AFF encode type, in contrast to frame-only encoding (FOE), performs both frame and adaptive field/frame encoding.
  • Additive White Gaussian Noise
    A basic noise model used in Information theory to mimic the effect of many random processes that occur in nature.
  • Advanced Audio Codec
    One of the most technically advanced audio compression algorithms. It is part of the MPEG-4 specifications.
  • Advanced Streaming Format
    Advanced Streaming Format is a MS Windows Media Player file format for audio and video on the Internet and other networks. It is a highly flexible and compressed format that contains streaming audio, video, slide shows, and synchronized events. ASF files are streaming, which means that they can begin to play before they have been completely downloaded.
  • Advanced Television System Committee
    Formed to establish technical standards for advanced television systems, including high definition television.
  • AES/EBU
    Colloquial for the two-channel digital audio interface, defined by AES and EBU. Standards name: AES-3. Self-clocking transmission of 2 channels with an audio word length of up to 24 bits. Balanced transmission for up to 100-m distance.
  • Algorithms
    A programmed set of mathematical formulas developed for a computer environment to perform a specific function.
  • Alpha Channel
    A fourth channel in a 32-bit image that allows for transparency. Alpha masks can be created in another program such as Adobe Photoshop and then used by Sorenson Video to mask out an area of a video clip.
  • American National Standards Institute
    http://www.ansi.org/
  • American Standard Code for Information Interchange
    A character-encoding scheme originally based on the English alphabet that encodes 128 specified characters - the numbers 0-9, the letters a-z and A-Z, some basic punctuation symbols, some control codes that originated with Teletype machines, and a blank space - into 7-bit binary integers.
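    As an illustration (a minimal Python sketch, not part of the original definition), the mapping between characters and their 7-bit codes can be inspected directly:
        # ASCII assigns each of the 128 characters an integer in the range 0-127.
        for ch in "Az 9":
            print(ch, ord(ch))                     # 'A' -> 65, 'z' -> 122, ' ' -> 32, '9' -> 57
        print(chr(65))                             # the reverse mapping gives back 'A'
        print(all(ord(c) < 128 for c in "Hello"))  # True: plain English text fits in 7 bits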
  • Amplitude Imbalance
    The maximum peak-to-peak amplitude difference (in dB) between the output ports of a power divider or hybrid coupler over the specified frequency range.
  • Analog
    A form of transmitting information characterized by continuously variable quantities, as opposed to digital transmission, which is characterized by discrete bits of information in numerical steps. An analog signal is responsive to changes in light, sound, heat and pressure.
  • Anti-aliasing
    The smoothing and removing of aliasing effects by filtering and other techniques. Edges on character generators and DVEs contain anti-aliasing facilities.
  • Apple Quicktime
    MOV is the Apple Quicktime file format.
  • Application Program Interface
    A set of calling conventions defining how a service is invoked through a software package.
  • Artifact
    Distortion, not present in the original uncompressed source, that occurs in an audio or video file that has been compressed with a codec.
  • Aspect Ratio
    The ratio of the width of the display screen to height. For most TV sets, this ratio is 4:3. For HDTV, the ratio is 16:9. The aspect ratio, along with the number of vertical scan lines that make up the image determines what sample rate should be used to digitize the video signal.
  • Asymmetric
    Unequal. Refers to the time spent encoding vs. decoding. Encoding can take much longer than decoding.
  • Asynchronous Serial Interface
    A streaming data format which often carries an MPEG Transport Stream (MPEG-TS).
  • Asynchronous Transfer Mode
    A method for transmitting voice, video, and data over high-speed local-area networks.
  • Attribute
    A parameter or setting of a tag that defines the way the tag is rendered.
  • Audio Channel
    Audio data intended to be delivered to a single audio speaker.
  • Audio Sample Clock
    Original clock related to the audio input samples at the Source or the generated clock used to time the output of audio samples.
  • Audio Stream ID
    The audio elementary stream identifier.
  • Audio Video Interleave
    The file format for Microsoft's Video for Windows architecture.
  • Automatic Frequency Control
    A method or circuit to automatically keep a resonant circuit tuned to the frequency of an incoming radio signal. It is primarily used in radio receivers to keep the receiver tuned to the frequency of the desired station.
  • Backbone
    The primary connectivity mechanism of a hierarchical distributed system. All systems which have a connectivity to an intermediate system on the backbone are assured of connectivity to each other. This does not prevent systems from setting up private arrangements with each other to bypass the backbone for reasons of cost, performance, or security.
  • Backwards-Compatible
    Referred to the system allowing partial stream reception by DVB-S receivers.
  • Balanced Input/Output
    Symmetrical input/output. The signal is transmitted on a pair of shielded wires; one wire carries the signal in opposite phase so that parasitic signals are suppressed.
  • Bandwidth
    Analog: The difference between the upper and lower frequencies in a continuous set of frequencies. Digital: A measurement of bit-rate of available or consumed data communication resources expressed in bits per second or multiples of it (bit/s, kbit/s, Mbit/s, Gbit/s, etc.).
  • Base PID
    A packet identifier of fixed value 0x1FFB.
  • Baseband
    The basic direct output signal in an intermediate frequency band, obtained directly from a television camera, satellite television receiver, or video tape recorder. Baseband signals can be viewed only on studio monitors. To display the baseband signal on a conventional television set, a modulator is required to convert the baseband signal to one of the VHF or UHF television channels which the television set can be tuned to receive.
  • Basic Multilingual Plane
    The first plane, plane 0, the Basic Multilingual Plane (BMP), is where most characters have been assigned so far in Unicode. The BMP contains characters for almost all modern languages, and a large number of special characters.
  • Bayonet Neill Concelman
    A connector system used for high frequency signals like in television and digital audio, named after its inventor.
  • BCH
    Binary block code error correction technique named after the developers Bose, Chauduri, and Hocquenghem.
  • Betacam
    Portable camera/recorder system using 1/2-inch tape developed by Sony. Betacam uses a version of the Y, R-Y, B-Y color difference signal set. Betacam is a registered trademark of the Sony Corporation.
  • Betacam SP
    A superior performance version of Betacam. Betacam SP uses metal particle tape and a wider bandwidth recording system.
  • Bidirectional Pictures (B-Pictures)
    The coded pictures of the MPEG standard that use both past and future pictures as reference in a technique called bidirectional prediction.
  • Bidirectional-Frames
    Bidirectional pictures.
  • Binary
    A computer language developed with only two letters in its alphabet.
  • Binary Coded Decimal
    A class of binary encodings of decimal numbers where each decimal digit is represented by a fixed number of bits, usually four or eight, although other sizes (such as six bits) have been used historically. Special bit patterns are sometimes used for a sign or for other indications (e.g., error or overflow).
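    As an illustration (a minimal Python sketch, not part of the original definition), packed BCD stores one decimal digit per 4-bit nibble:
        def to_packed_bcd(number_str: str) -> bytes:
            """Encode a decimal string as packed BCD: one 4-bit nibble per digit."""
            if len(number_str) % 2:
                number_str = "0" + number_str          # pad to an even number of digits
            return bytes((int(hi) << 4) | int(lo)
                         for hi, lo in zip(number_str[0::2], number_str[1::2]))

        print(to_packed_bcd("59").hex())               # '59' -> nibbles 0101 1001 -> hex 59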
  • Binary Phase Shift Keying
    The simplest form of phase shift keying (PSK). It uses two phases which are separated by 180 degrees.
  • Bit
    A binary digit, either a 0 or 1. Eight bits make up one byte. Bits are used in the measure of transmission capacity.
  • Bit Error Probability
    The expectation value of the BER. The BER can be considered as an approximate estimate of the bit error probability. This estimate is accurate for a long time interval and a high number of bit errors.
  • Bit Error Rate
    The frequency with which bit errors occur in a transmitted or stored bit stream, usually expressed as errors per unit time or as the ratio of errored bits to total bits transferred (see Bit Error Ratio).
  • Bit Error Rate Test
    A testing method for digital communication circuits that uses predetermined stress patterns consisting of a sequence of logical ones and zeros generated by a test pattern generator.
  • Bit Error Ratio
    The number of bit errors divided by the total number of transferred bits during a studied time interval. BER is a unitless performance measure, often expressed as a percentage.
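    As a worked illustration (a minimal Python sketch, not part of the original definition), the ratio is simply errored bits divided by total bits:
        def bit_error_ratio(sent: bytes, received: bytes) -> float:
            """BER = number of differing bits / total number of transferred bits."""
            errors = sum(bin(a ^ b).count("1") for a, b in zip(sent, received))
            return errors / (8 * len(sent))

        # One flipped bit in 1000 bytes (8000 bits) gives a BER of 1.25e-4, i.e. 0.0125 %.
        print(bit_error_ratio(b"\x00" * 1000, b"\x00" * 999 + b"\x01"))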
  • Bitrate
    The rate at which the compressed bit stream is delivered from the channel to the input of a decoder.
  • Black Level
    The video signal level corresponding to black areas in a scene. For a composite signal, black is standardized at +7.5 units as viewed on a waveform monitor IRE scale. This elevated black level is often referred to as setup and serves as a guard band between video and sync. For component video there is no setup, and black sits at 0 IRE.
  • Blanking
    An ordinary television signal consists of 30 separate still pictures or frames sent every second. They occur so rapidly, the human eye blurs them together to form an illusion of moving pictures. This is the basis for television and motion picture systems. The blanking interval is that portion of the television signal which occurs after one picture frame is sent and before the next one is transmitted. During this period of time special data signals can be sent which will not be picked up on an ordinary television receiver.
  • BLOB
    Binary large object, something that (e.g. within the context of XML) is treated and esp. gets stored as a whole without further structuring efforts.
  • Block Refresh
    Block refresh and keyframes work on a similar premise. Where interframes are frames that depend upon keyframes for decoding video information, intra blocks are small sections of a video frame (16x16 pixels) that accurately describe the video at that point in time. Forcing a block refresh forces a section of the video frame to be accurate so that the same block in the following frame can approximate changes more accurately. The advantage of a block refresh is that refreshing random blocks of 16x16 pixels creates a picture of similar image quality to keyframes at a much lower data rate. Additionally, if a packet containing an original block is lost, it won't compromise the entire movie the way losing a keyframe would.
  • Bouquet
    Collection of services marketed as a single entity.
  • Bouquet Association Table
    A table describing a bouquet of programs offered by a broadcaster. DVB only.
  • bridge component
    OSI layer 2 connecting component, that connects two or more link layer components, not necessarily using different technologies
  • Broadband
    Refers to networks capable of high bandwidth delivery. Broadband networks are used by Cable Television and range from 550 MHz to 1GHz. A single TV channel requires 6MHz. In the digital domain, all content is digitized and hence the bandwidth is usually measured in bits-per-second (BPS).
  • Broadcast
    A packet delivery system where a copy of a given packet is given to all hosts attached to the network.
  • Broadcast Quality
    A nebulous term used to describe the output of a manufacturer's product. Usually at least means that the technical specifications meet the FCC rules for broadcasting.
  • Broadcast Satellite Service
    A term used to refer to satellite television broadcasts intended for home reception.
  • Broadcast Service
    Bandwidth of the frequency Slot allocated to a service.
  • Broadcaster
    Organization which assembles a sequence of events or programs to be delivered to the viewer based upon a schedule
  • bslbf
    bit serial, leftmost bit first
  • Burst (Color)
    A color reference signal included as part of the overall composite video signal. Eight to ten cycles of color sub-carrier (3.579545 MHz, often abbreviated as 3.58) are inserted before the start of every horizontal line. It can be seen just following the H sync pulse in the H blanking interval. It provides color-synchronizing information for the color decoding circuits in monitors, receivers and other TV equipment. These have color killer circuits that disable the color decoder if the burst is not present, so that the signal is processed as monochrome. The burst must be precisely timed in relation to H sync.
  • Byte
    One byte of digital video information is a packet of bits, usually but not always eight. In the digital video domain, a byte is used to represent the luminance or chrominance level. One thousand bytes is one kilobyte (KB) and one million bytes is one megabyte (MB).
  • Byte aligned
    A bit in a coded bit stream is byte-aligned if its position is a multiple of 8-bits from the first bit in the stream.
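    As an illustration (a minimal Python sketch, not part of the original definition):
        def is_byte_aligned(bit_position: int) -> bool:
            """A bit is byte-aligned if its offset from the first bit is a multiple of 8."""
            return bit_position % 8 == 0

        print([p for p in range(20) if is_byte_aligned(p)])   # [0, 8, 16]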
  • Caching
    A form of replication in which information learned during a previous transaction is used to process later transactions.
  • Capture Device
    A capture device transforms analog audio or video data to digital data.
  • Carrier Noise Interference
    Carrier to Noise plus Interference ratio.
  • Carrier/Noise
    C/N ratio of RF or IF signal power to noise power.
  • CCIR-601
    International standard for digital television source resolution. Also sometimes called D1 after the VTR format that first used this signal.
  • CDS HNED storage
    Storage on the HNED dedicated to CDSs of a single service provider.
  • CEA Extension
    A 128 byte EDID 1.3-compatible extension block defined in CEA-861-D, designed to allow declaration of audio formats, additional video formats (beyond those in the base EDID structure) and other characteristics of the Sink.
  • CEC Root (Device)
    A device, generally a display (Sink) device, formally defined by the following rule: A device that has no HDMI output or, a device that has chosen to take the physical address 0.0.0.0.
  • Cell
    Geographical area that is covered with DVB-T signals delivering one or more particular transport streams throughout the area by means of one or more transmitters. The cell may in addition contain repeaters. Two neighboring cells may be intersecting or fully overlapping. The cell_id that is used to uniquely identify a cell is unique within each original_network_id. For hand-over purposes it is more convenient if the transport streams associated with the cell cover exactly the same area, or only one transport stream per cell is used.
  • CH1, CH2
    Channel numbers relating to analog audio sources.
  • Channel
    A digital medium that stores or transports an ITU-T Rec. H.222.0 | ISO/IEC 13818-1 stream.
  • Checksum
    A small-size datum computed from an arbitrary block of digital data for the purpose of detecting errors that may have been introduced during its transmission or storage.
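    As an illustration (a minimal Python sketch of one simple additive scheme; real systems often use stronger checks such as CRCs):
        def checksum8(data: bytes) -> int:
            """A simple 8-bit additive checksum: the low byte of the sum of all bytes."""
            return sum(data) & 0xFF

        block = b"example payload"
        print(hex(checksum8(block)))                              # stored or sent with the block
        print(checksum8(block) == checksum8(b"exbmple payload"))  # False: the corruption is detected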
  • Chrominance
    The color information part of the video signal, usually defined in terms of hue and saturation. The video signal is made up of chrominance (color) and luminance (brightness) information. The symbols are Cr and Cb.
  • Client Pull
    The HTTP-EQUIV attribute can be used to automatically load another document and thereby re-direct the user. Although this use is not recommended it is useful when a page or site has moved to a new location.
  • Closed Group of Pictures
    Selection so that all GOPs are closed and can be decoded separately.
  • Codec
    Coder/decoder system for digital transmission. A codec is a program which encodes and decodes digital data with the purpose of data reduction.
  • Coded representation
    A data element as represented in its encoded form.
  • Color Bars
    An electronically generated standard set of colors used as a reference for proper equipment setup. Color bars include the three additive primary colors (red, green, and blue) and their complements (cyan, magenta, and yellow) displayed in vertical rows, plus gray and black. The bars appear left to right in order of decreasing luminance - yellow, cyan, green, magenta, red, and blue. Videotape machines, cameras, telecine chains, and monitors all use color bars as a reference for proper setup. The waveform can be examined on a waveform monitor or vectorscope to verify that the encoding process was proper and/or that it was not changed by any subsequent transmission or recording process. Color bars are readily available and easily interpreted and so have become a de-facto set standard. There are several variations of color bars. Full field bars show the color bars running the full height of the screen. EIA (Electronic Industries Association) bars assigns the bottom third of the pattern to the -I, Q, and black level set signals, which are useful for setup of camera encoders. SMPTE bars are the same as EIA, but insert another set of short color bars above the I and Q bars. These bars run in the reverse color order of the regular bars and are a convenient aid for the setup of color monitors. All versions allow the overall amplitude of the subcarrier (chrominance) to be set at 100% or 75% of the standard value. The 75% value is usually used for recording to avoid possible overload problems from highly saturated yellows and cyans. The amplitude of the gray bar can be set at 100% or 75% IRE units.
  • Common Interchange Format
    CIF is a video resolution of 352x288 pixels: a quarter of PAL resolution.
  • Communication Port
    Communications input/output ports for cable connections between a serial port and VTR(s). Usually COM1 and COM2.
  • Comp In
    Composite video signal input.
  • Comp Out
    Composite video signal output.
  • Compact Disc
    Compact Disc, a 12-cm disc containing information; colloquially used for Digital Audio CDs (CD-DA). According to the standard (Red Book), one disc contains up to 79 minutes of digital audio, and discs with up to 80 minutes can be found on the market. Producing long-playing CDs is more critical, and some old-generation players may show playability problems. A CD typically has a storage capacity of 700 Mbytes.
  • Compact Disc - Read Only Memory
    Compact disc format used for holding text, graphics, and high fidelity stereo sound.
  • Compact Disc - Recordable
    Recordable CD. A CD-R contains a dye that is sensitive to a laser beam. The laser writes the information (pits) into an optical pregroove; the dye changes the reflectivity of the gold layer and forms the pits. The pregroove contains time information and is used to guide the laser during recording. A CD-R cannot be erased: it is a WORM (Write Once, Read Many times) medium. A finalized (fix-up) CD-R becomes a CD that can be played in conventional players.
  • Component
    Specific set of functionalities.
  • Component (Elementary Stream)
    One or more entities which together make up an event.
  • Component Video
    A video signal in which luminance and chrominance information is kept separate rather than being combined as in the composite video signal. Component processing and routing requires three wires to route the signal, and component recording requires the use of separate tracks on magnetic tape (Y, R-Y, B-Y or Y, U, V).
  • Composite Baseband
    The unclamped and unfiltered output of the satellite receiver's demodulator circuit, containing the video information as well as all transmitted subcarriers.
  • Composite Video
    Standard video that combines chrominance and luminance information by encoding the output of the red, green, and blue channels into the Y, I, and Q signals. Composite video includes blanking and sync and is the standard for broadcast transmissions of video signals.
  • Compressed (Audio)
    All audio formats carried by HDMI other than L-PCM and One Bit Audio.
  • Compression
    Reduction of dynamic range. Used in broadcasting to achieve greater or more uniform loudness. Digital compression involves the use of algorithms to reduce the bandwidth necessary to store or transmit a digital signal.
  • Compressor
    An audio or video device that reduces the amplitude range of a video clip or an audio track. The effect of the audio compressor is to make the loud parts of a signal softer and to make the very soft parts louder.
  • Conditional Access System
    Controls subscriber access to services, programs and events.
  • Conditional Access Table
    Contains a directory listing of all ITU-T Rec. H.222 entitlement management message streams used by Program Map Tables.
  • Connecting component
    Component which is used to connect link layer components with each other.
  • Constant Bit Rate
    A variety of MPEG video compression where the amount of compression does not change.
  • Constrained System Parameter Stream
    A Program Stream for which the constraints defined in 2.7.9 of ITU-T Rec. H.222.0 | ISO/IEC 13818-1 apply.
  • Content Download Service
    Service that provides download delivery of content items to the local storage of the HNED.
  • Content item
    An editorially coherent grouping of one or more audiovisual or generic data files which are intended to be consumed in conjunction with each other.
  • Content on Demand
    Program provided at the request of the end user for direct consumption (real-time streaming).
  • Content provider
    Entity that owns or is licensed to sell content or content assets.
  • Content Service Provider
    Entity which acquires/licenses content from Content Providers and packages this into a service.
  • Control Track
    A guide pulse acting as an electronic sprocket hole recorded on the videotape. Control track pulses are used by servo systems to maintain a tape speed that allows precise playback head tracking. Control track pulses are recorded one per television frame and are used in lieu of time code by very basic editing systems to locate edit points and make edits. Missing control track signals may cause the video signal to break up or mistrack. This type of problem is analogous to that resulting from torn sprocket holes on film.
  • Copper Distributed Data Interface
    The version of FDDI that uses twisted pair lines rather than optical fiber.
  • Counter
    This is the indicator which tells you how far through a tape you are. A real-time counter is one which gives a reading in hours, minutes and seconds.
  • Cropping
    Cropping is the process of cutting the borders of a picture or video.
  • CTL
    Control signal in the form of regular pulses recorded along a longitudinal track on the videotape. By counting these pulses, it is possible to determine the number of frames, and hence the tape's running time. Used mainly to adjust the tracking position of video heads, and to achieve time code continuity in continuous recording.
  • CTS
    1) Cycle Time Stamp; 2) Clear To Send
  • Cyclic Redundancy Check
    Verifies the correctness of data.
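    As an illustration (a minimal Python sketch using the standard library's CRC-32; actual systems differ in the exact CRC parameters they use):
        import zlib

        payload = b"packet payload"
        crc = zlib.crc32(payload)                # 32-bit check value computed by the sender
        print(hex(crc))

        received = payload                       # the receiver recomputes the CRC and compares
        print(zlib.crc32(received) == crc)       # True -> no corruption detected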
  • Data element
    An item of data as represented before encoding and after decoding.
  • Data Stream Disparity
    Integer indicating DC-offset level of link. A positive value represents the excess number of "1"s that have been transmitted. A negative value represents the excess number of "0"s that have been transmitted.
  • Decibel (dB)
    The standard unit used to express the ratio of two power levels. It is used in communications to express either a gain or loss in power between the input and output devices.
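    As a worked illustration (a minimal Python sketch, not part of the original definition), the ratio of two power levels in dB is 10·log10(P2/P1):
        import math

        def power_ratio_db(p_out: float, p_in: float) -> float:
            """Gain (positive) or loss (negative) in decibels between two power levels."""
            return 10 * math.log10(p_out / p_in)

        print(power_ratio_db(2.0, 1.0))   # doubling the power is about +3.01 dB
        print(power_ratio_db(0.5, 1.0))   # halving the power is about -3.01 dB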
  • Decibels referenced to one milliwatt
    A unit for expressing power level in decibels with reference to a power of one milliwatt.
  • Decoded stream
    The decoded reconstruction of a compressed bit stream.
  • Decoder
    A television set-top device which enables the home subscriber to convert an electronically scrambled television picture into a viewable signal. This should not be confused with a digital coder/decoder known as a CODEC which is used in conjunction with digital transmissions.
  • Decoding (process)
    The process that reads an input coded bitstream and produces decoded pictures or audio samples.
  • Decoding time-stamp; DTS (system)
    A field that may be present in a PES packet header that indicates the time that an access unit is decoded in the system target decoder.
  • Delay
    The time it takes for a signal to go from the sending station through the satellite to the receiving station. This transmission delay for a single hop satellite connection is very close to one-quarter of a second.
  • Delivery Network
    Network connecting the Delivery Network Gateway (DNG) and service providers.
  • Delivery Network Gateway
    Device that is connected to one or multiple delivery networks and one or multiple home network segments.
  • Delivery system
    Physical medium by which one or more multiplexes are transmitted. EXAMPLE: Satellite system, wide-band coaxial cable, fibre optics, terrestrial channel of one emitting point.
  • Demodulator
    A satellite receiver circuit which extracts or demodulates the "wanted" signals from the received carrier.
  • Descriptor
    A data structure of the format: descriptor_tag, descriptor_length, and a variable amount of data. The tag and length fields are each 8 bits. The length specifies the length of data that begins immediately following the descriptor_length field itself. A descriptor whose descriptor_tag identifies a type not recognized by a particular decoder shall be ignored by that decoder. Descriptors can be included in certain specified places within PSIP tables, subject to certain restrictions. Descriptors may be used to extend data represented as fixed fields within the tables. They make the protocol very flexible because they can be included only as needed. New descriptor types can be standardized and included without affecting receivers that have not been designed to recognize and process the new types.
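    As an illustration (a minimal Python sketch of the tag/length/data layout described above; the tag values are arbitrary examples):
        def parse_descriptors(buf: bytes):
            """Walk a descriptor loop: 8-bit tag, 8-bit length, then 'length' bytes of data."""
            i = 0
            while i + 2 <= len(buf):
                tag, length = buf[i], buf[i + 1]
                yield tag, buf[i + 2 : i + 2 + length]
                i += 2 + length

        # Two back-to-back descriptors: tag 0x0A with two data bytes, tag 0x52 with one.
        loop = bytes([0x0A, 0x02, 0x65, 0x6E, 0x52, 0x01, 0x07])
        for tag, data in parse_descriptors(loop):
            print(hex(tag), data.hex())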
  • Destination Transport Address
    Combination of the IP destination address and destination UDP port.
  • Digital
    Conversion of information into bits of data for transmission through wire, fiber optic cable, satellite, or over air techniques. Method allows simultaneous transmission of voice, data or video.
  • Digital - Video Home System
    A digital enhancement of the Video Home System ( VHS ) videocassette recording ( VCR ) technology that was developed by Japan Victor Company (JVC) in 1976. D-VHS is a joint effort among JVC, Hitachi, Matsushita, and Sony (who developed Betamax VCR systems).
  • Digital D-1
    The CCIR-601 digital video format for high-end digital video systems in broadcast and production environments. This is a SMPTE 259M component serial digital video format. Digital video using separate color components, such as YCrCB or RGB.
  • Digital Disk Recorder
    A digital video recording device based on high speed computer disk drives. Commonly used as a means to get video into and out from computers and for editing.
  • Digital Storage Media
    A digital storage or transmission device or system.
  • Digital Television
    Refers to transmitting a broadcast signal by encoding it as zeroes and ones - the digital code used in computers. DTV can be compressed to provide four, five or more channels in the same bandwidth required for one channel of the current standard television.
  • Digital Video (or Versatile) Disk
    A disc the size of a CD, but with a storage capacity of up to 17 Gbyte. The single-layer, one-sided DVD stores up to 4.7 Gbyte, more than eight times as much as a CD. It is an ideal medium for video and multi-channel audio applications. Sometimes DVD is translated as Digital Video Disk. With MPEG-2 coding, a 3-hour film with 7+1 audio channels can be stored on one disc in excellent audio and video quality.
  • Digital Video Cassette
    A digital tape recording format using approximately 5:1 compression to produce near-Betacam quality on a very small cassette. Originated as a consumer product, but being used professionally as exemplified by Panasonic's variation, DVC-Pro, and Sony's variation DVCam.
  • Digital Video Effects
    A black box which digitally manipulates the video to create special effects, for example, the ADO (Ampex Digital Optics) system. Common DVE effects include inverting the picture, shrinking it, moving it around within the frame of another picture, spinning it, and a great many more.
  • Digital Visual Interface
    A video display interface developed by the Digital Display Working Group (DDWG). The digital interface is used to connect a video source to a display device, such as a computer monitor. It was developed with the intention of creating an industry standard for the transfer of digital video content.
  • Direct Broadcast Satellite
    Satellites powerful enough (approximately 120 watts on the Ku-band) to transmit a signal directly to a medium to small receiving dish (antenna). DBS does not require reception and distribution by an intermediate broadcasting facility, but transmits directly to the end user.
  • Direct Stream Transport
    An audio format which is a lossless compression of Direct Stream Digital (DSD), as used in Super Audio CD. DST is described in ISO/IEC 14496, part 3, Amendment 6: Lossless coding of oversampled audio.
  • DirectShow
    DirectShow is part of Microsoft's DirectX framework. DirectShow is the media streaming architecture.
  • DirectX
    Microsoft DirectX is a set of application programming interfaces (APIs) for creating games and other high-performance multimedia applications.
  • Disc Array
    A group of two or more hard discs being used as a single drive to either increase performance or redundancy. Also known as RAID (Redundant Array of Independent Discs).
  • Dissolve
    The gradual change from one picture to another, allowing the pictures to be superimposed during the transition.
  • Dolby
    Dolby A, B, C, and Dolby SR are all types of electronic processing used to increase signal-to-noise ratio and reduce unwanted frequencies, specifically the tape hiss inherent in the recording process of analog signals.
  • Dolby AC-3
    A perceptual digital audio coding technique capable of delivering multichannel digital surround sound. It incorporates 6 (5.1) discrete channels; each channel can carry a different signal simultaneously (left front, right front, center, left rear, right rear, sub-woofer). AC-3 is the key component of Dolby Stereo Digital film sound format introduced to theaters in June 1992 and Dolby Surround Digital which is the consumer equivalent of Dolby Stereo Digital. Chosen as the audio standard for HDTV.
  • Domain
    In the Internet, a part of a naming hierarchy. Syntactically, an Internet domain name consists of a sequence of names (labels) separated by periods (dots). In OSI, domain is generally used as an administrative partition of a complex distributed system.
  • Domain Name System
    The distributed name/address mechanism used in the Internet.
  • Downstream
    In the direction of the primary audio and video data flow, i.e. towards the Sink (e.g. display).
  • Drop Frame Time Code
    An SMPTE Time Code (TC) option that allows indicated TC to agree with clock time. The frame rate for color TV is actually 29.97 frames per second, not 30, so that over a 60 minute period a TC reader would count 108 frames (3.6 seconds) short. To correct this situation, at the beginning of each minute frame numbers 0 and 1 are dropped from the count, except for every tenth minute.
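    As a worked illustration (a minimal Python sketch of the arithmetic, not part of the original definition):
        nominal_fps = 30
        actual_fps = 30000 / 1001                      # 29.97... frames per second for color TV
        frames_labelled = nominal_fps * 3600           # 108000 frame numbers in an hour of counting
        frames_shown = actual_fps * 3600               # ~107892 frames actually displayed
        print(frames_labelled - frames_shown)          # ~108 frames, i.e. ~3.6 s of drift per hour

        # Dropping frame numbers 0 and 1 each minute, except every tenth minute,
        # removes 2 * (60 - 6) = 108 labels per hour and keeps the time code on the clock.
        print(2 * (60 - 6))                            # 108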
  • Dropped Frames
    Frames of a video lost during the capture process. This can be caused by several factors, including: slow hard drive, lack of RAM, or a slow capture card.
  • DSM
    A set of one or more digital elementary streams. See virtual channel.
  • DTD
    1) Detailed Timing Descriptor; 2) Document Type Definition. Schema specification method for SGML and XML documents. DTDs are either contained in the document or belong to its external subset and are then referenced from within the document's document type declaration per URI. Known DTDs are e.g. DocBook, CML, IBTWSH, and HTML. dtd2html generates HTML documentation for SGML DTDs. For XML, DTDs will be replaced by the new XML Schema specification method.
  • Dublin Core
    A metadata format defined on the basis of international consensus which has defined a minimal information resource description, generally for use in a WWW environment. http://dublincore.org/
  • Duration
    The time between the starting point and the ending point of an encoding process.
  • DV
    DV is a standard for digital video recording. DV videos can be imported by special DV codecs via FireWire.
  • DV25
    The most common type of DV compression; DV25 has a fixed data rate of 25 Megabits per second.
  • DVB-C
    Digital Video Broadcasting baseline system for digital cable television (EN 300 429 [6])
  • DVB-CS
    Digital Video Broadcasting baseline system for SMATV distribution systems (EN 300 473 [13])
  • DVB-MC
    Digital Video Broadcasting baseline system for Multi-point Video Distribution Systems below 10 GHz (EN 300 749 [16])
  • DVB-MS
    Digital Video Broadcasting baseline system for Multi-point Video Distribution Systems at 10 GHz and above (EN 300 748 [15])
  • DVB-S
    Digital Video Broadcasting baseline system for digital satellite television (EN 300 421 [5])
  • DVB-S2
    DVB-S2 system as specified in EN 302 307.
  • DVB-T
    Digital Video Broadcasting baseline system for digital terrestrial television (EN 300 744 [9])
  • DVD-Audio
    Disk format conforming to any version of DVD Specifications for Read-Only Disc, Part 4: AUDIO SPECIFICATION.
  • Editing
    The process by which one or more compressed bit streams are manipulated to produce a new compressed bit stream. Edited bit streams meet the same requirements as streams which are not edited.
  • Electronic Program Guide
    An on-screen guide, assembled from transmitted service information, that lists current and scheduled programs and lets the viewer navigate and select them.
  • Elementary Stream
    A generic term for one of the coded video, coded audio or other coded bit streams.
  • Elementary Stream Clock Reference; ESCR (system)
    A time stamp in the PES Stream from which decoders of PES streams may derive timing.
  • Encapsulation
    The technique used by layered protocols in which a layer adds header information to the protocol data unit (PDU) from the layer above.
  • Encoder
    An embodiment of an encoding process.
  • Encoding (process)
    A process, not specified in ITU-T Rec. H.222.0 | ISO/IEC 13818-1, that reads a stream of input pictures or audio samples and produces a coded bit stream conforming to ITU-T Rec. H.222.0 | ISO/IEC 13818-1.
  • Encrypt
    To scramble the contents of a file or message in such a way as to make it unreadable to everyone except those with a key or code. The code makes it possible to unscramble the encrypted file or message.
  • Encryption
    The science of encoding data so that it cannot be interpreted by anybody or any machine that does not have the key or code. This process is also called scrambling.
  • Entitlement Control Message
    Private conditional access information which specify control words and possibly other, typically stream-specific, scrambling and/or control parameters.
  • Entitlement Management Message Stream
    Defines access rights for each individual decoder. The EMM stream is processed with the access control device, but the user processor is responsible for buffering EMMs and feeding them via an interface to the access control device. Authorization levels may be addressed to single decoders or groups of decoders.
  • Entropy coding
    Variable length lossless coding of the digital representation of a signal to reduce redundancy.
  • Event
    A collection of elementary streams with a common time base, an associated start time, and an associated end time. An event is equivalent to the common industry usage of "television program".
  • eXtensible Stylesheet Language
    XSL is a language for creating a style sheet that describes how data sent over the Web using the Extensible Markup Language (XML) is to be presented to the user.
  • eXtensible Markup Language
    Extensible Markup Language, a modification of the SGML standard. In contrast to SGML documents, XML documents may exist without having their schema described in a DTD. XML documents consist (mainly) of text and tags, and the tags impose a tree structure upon the document. If the XML document is properly structured, i.e. the tags nest, it is said to be 'well-formed'. If, in addition, there is a DTD to which the document conforms, it is called 'valid'.
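    As an illustration (a minimal Python sketch using the standard library parser; checking validity against a DTD would require a validating parser, which is not shown):
        import xml.etree.ElementTree as ET

        well_formed = "<note><to>Alice</to><body>Tags nest properly.</body></note>"
        ET.fromstring(well_formed)              # parses without error: the document is well-formed

        broken = "<note><to>Alice</body></note>"
        try:
            ET.fromstring(broken)
        except ET.ParseError as exc:
            print("not well-formed:", exc)      # mismatched tags are rejected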
  • Extract PTS
    Indicates that Presentation Time Stamps are to be extracted from the elementary streams to achieve lip-sync.
  • Fast forward playback (video)
    The process of displaying a sequence, or parts of a sequence, of pictures in display-order faster than real-time.
  • Fiber Distributed Data Interface
    The ANSI specification for fiber-optic networks that transmit at a speed of up to 100 megabits per second over a dual, counter-rotating, token ring topology.
  • Fiber Optic Cable
    A cable, consisting of a center glass core surrounded by layers of plastic, that transmits data using light rather than electricity. It has the ability to carry more information over much longer distances.
  • Field
    Half of a video frame, 262.5 horizontal lines (NTSC).
  • FIFO
    First-In, First-Out shift register
  • File Server
    A computer connected to the network that contains primary files/applications and shares them as requested with the other computers on the network. If the file server is dedicated for that purpose only, it is connected to a client/server network. All the computers connected to a peer-to-peer network are capable of being the file server.
  • File Transfer Protocol
    File Transfer Protocol (FTP), a standard Internet protocol, is the simplest way to exchange files between computers on the Internet. Like the Hypertext Transfer Protocol (HTTP), which transfers displayable Web pages and related files, and the Simple Mail Transfer Protocol (SMTP), which transfers e-mail, FTP is an application protocol that uses the Internet's TCP/IP protocols. FTP is commonly used to transfer Web page files from their creator to the computer that acts as their server for everyone on the Internet. It's also commonly used to download programs and other files to your computer from other servers. As a user, you can use FTP with a simple command line interface (for example, from the Windows MS-DOS Prompt window) or with a commercial program that offers a graphical user interface. Your Web browser can also make FTP requests to download programs you select from a Web page. Using FTP, you can also update (delete, rename, move, and copy) files at a server. You need to log on to an FTP server. However, publicly available files are easily accessed using anonymous FTP. Basic FTP support is usually provided as part of a suite of programs that come with TCP/IP. However, any FTP client program with a graphical user interface usually must be downloaded from the company that makes it.
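    As an illustration (a minimal Python sketch using the standard library's ftplib; the host name and file are hypothetical):
        from ftplib import FTP

        with FTP("ftp.example.com") as ftp:      # hypothetical anonymous-access server
            ftp.login()                          # anonymous login
            ftp.cwd("/pub")
            with open("readme.txt", "wb") as fh:
                ftp.retrbinary("RETR readme.txt", fh.write)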
  • Filter
    A filter is used to remove spurious data from a search. Typically used when looking for a specific type of data in a list box.
  • Firewall
    A device used to keep uninvited guests from entering a private network. Because progressively downloaded videos are sent via the same method as Web content (HTTP, FTP), they are less likely to be blocked by a firewall than a streaming broadcast.
  • First Clock Reference
    The first time stamp in a system or program stream is used by the decoder for timing the decode start.
  • Forbidden
    When used in the clauses defining the coded bit stream, indicates that the value shall never be used.
  • Frame
    One complete video picture. In NTSC, takes place in one-thirtieth of a second and is made up of 525 lines and two fields.
  • Frame Only Encode
    When selected, the video stream complies with MPEG-2 syntax. The encode prediction is frame-based only as opposed to Adaptive Field Frame (AFF).
  • Frames (Pictures)
    These two terms are used interchangeably in the film industry. Pictures is the term used primarily in theoretical explanations; the term "frames" is used in descriptions of the practical applications.
  • Frequency
    The number of times that an alternating current goes through its complete cycle in one second of time. One cycle per second is also referred to as one hertz; 1000 cycles per second, one kilohertz; 1,000,000 cycles per second, one megahertz; and 1,000,000,000 cycles per second, one gigahertz.
  • Gateway
    The original Internet term for what is now called router or, more precisely, IP router. In modern usage, the terms "gateway" and "application gateway" refer to systems which do translation from some native format to another.
  • Generation
    A copy or replication of an audio or video signal. Each generation or copy is further removed from the original signal. In the analog realm, each generation introduces degradation of the signal quality. In digital, each generation is exactly the same as its source.
  • Genlock
    To reference a signal generator to a signal normally external to and separate from the system controlled by the signal generator. The purpose is to bring a system, such as a studio, into proper timing with an external source, such as a live remote camera, so that special effects as wipes, keys, and dissolves can be done using the external source in combination with internal sources.
  • Gigabit Ethernet
    Gigabit Ethernet, a transmission technology based on the Ethernet frame format and protocol used in local area networks (LANs), provides a data rate of 1 billion bits per second (one gigabit). Gigabit Ethernet is defined in the IEEE 802.3 standard and is currently being used as the backbone in many enterprise networks. Gigabit Ethernet is carried primarily on optical fiber (with very short distances possible on copper media). Existing Ethernet LANs with 10 and 100 Mbps cards can feed into a Gigabit Ethernet backbone. An alternative technology that competes with Gigabit Ethernet is ATM. A newer standard, 10-Gigabit Ethernet, is also becoming available.
  • Gigabyte
    One billion bytes of information. One thousand megabytes.
  • Global Motion Compensation
    GMC is an MPEG-4 video encoding tool.
  • GOP Structure
    The encoded structure of frames within the selected GOP.
  • Graphical User Interface
    Graphics-based user interface that incorporates screens (windows), icons, pulldown menus, etc., and uses a mouse for navigation.
  • H.264
    A block-oriented motion-compensation-based video compression standard developed by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC JTC1 Moving Picture Experts Group (MPEG). The project partnership effort is known as the Joint Video Team (JVT). The ITU-T H.264 standard and the ISO/IEC MPEG-4 AVC standard (formally, ISO/IEC 14496-10 - MPEG-4 Part 10, Advanced Video Coding) are jointly maintained and technically identical.
  • Half-Horizontal Resolution (HHR)
    Resolutions are 352x480 pixels (NTSC) and 352x576 pixels (PAL).
  • Hard Copy
    A paper printout of computer data. In editing, a printout of an EDL.
  • Header
    A block of data in the coded bitstream containing the coded representation of a number of data elements pertaining to the coded data that follow the header in the bitstream.
  • Headroom
    A measure, in dB, of the difference between the actual peak level and the maximum clip level.
  • Hertz (Hz)
    The name given to the basic measure of radio frequency characteristics. An electromagnetic wave completes a full oscillation from its positive to its negative pole and back again in what is known as a cycle. A single Hertz is thus equal to one cycle per second.
  • hh:mm:ss:ff
    Hours, minutes, seconds and frames as used in certain video operations, including preroll and postroll.
  • High Definition Television
    HDTV (high definition television) is a television display technology that provides picture quality similar to 35 mm movies with sound quality similar to that of today's compact disc. Some television stations have begun transmitting HDTV broadcasts to users on a limited number of channels. HDTV generally uses digital rather than analog signal transmission. However, in Japan, the first analog HDTV program was broadcast on June 3, 1989. The first image to appear was the Statue of Liberty and the New York Harbor. It required a 20 MHz channel, which is why analog HDTV broadcasting is not feasible in most countries. HDTV and standard definition television (SDTV) are the two categories of display formats for digital television (DTV) transmissions, which are becoming the standard. HDTV provides a higher quality display with a vertical resolution from 720p to 1080i. The p stands for progressive scanning, which means that each scan includes every line for a complete picture, and the i stands for interlaced scanning which means that each scan includes alternate lines for half a picture. These rates translate into a frame rate of up to 60 frames per second, twice that of conventional television. One of HDTV's most prominent features is its wider aspect ratio (the width to height ratio of the screen) of 16:9, compared with 4:3 for conventional television.
  • Hint Track
    The hint track is one track of a MP4 file. It contains information needed by ISO-compliant streaming servers.
  • Hinted Video
    Video clips are made up of frames. Frames can become quite large for transmission, affecting efficiency. Also, if any of these frames are lost in a streaming transmission, it can have a very negative effect on quality. When video data is hinted, frames are divided into smaller chunks of data called packets to increase speed of transfer and reduce the impact of lost data. All QuickTime movies created for streaming must be hinted.
  • Horizontal Resolution
    The number of elements, or pixels, on a horizontal line.
  • hyperlink
    On the Web or other hypertext systems, hyperlink is a synonym for both link and hypertext link. Possibly, the term originated because "link" was not felt to be specific enough. And it's shorter than "hypertext link".
  • Hypertext Markup Language
    A document type consisting of text and tags that govern e.g. layout and hyperlinking. An HTML file has the extension '.htm' or '.html'; the content type is 'html'. Java applets can be included using the applet tag. The current version is HTML 4.0. Extensions of HTML are CSS (a stylesheet language that may govern the layout without touching the content), DHTML (which adds dynamic behavior), XML (semantic instead of presentational tags), and XHTML (XML-compatible HTML).
  • Hypertext Transfer Protocol
    An application protocol, i.e. a set of rules for exchanging files on the World Wide Web, that runs on top of TCP/IP. The latest version is HTTP 1.1, published as an IETF RFC. The W3C pursues HTTP within its architecture domain. Access to HTTP requests is, among others, enabled using the XMLHTTP ActiveXControl.
  • I-Pictures
    I-Pictures are coded without reference to any other picture. When encoding full motion video, I-pictures are used as the base reference for subsequent pictures.
  • IEEE 1394
    An interface standard that allows for direct transfers of DV or other digital data from one device to another. See i.Link and FireWire.
  • IFIP
    A research organization that performs substantive pre-standardization work for OSI. IFIP is noted for having formalized the original Message Handling System (MHS) model.
  • IGMP
    The Internet Group Management Protocol (IGMP) is an Internet protocol that provides a way for an Internet computer to report its multicast group membership to adjacent routers. Multicasting allows one computer on the Internet to send content to multiple other computers that have identified themselves as interested in receiving the originating computer's content. Multicasting can be used for such applications as updating the address books of mobile computer users in the field, sending out company newsletters to a distribution list, and broadcasting high-bandwidth programs of streaming media to an audience that has "tuned in" by setting up a multicast group membership. Using the Open Systems Interconnection (OSI) communication model, IGMP operates at the network layer.
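    As an illustration (a minimal Python sketch; the group address and port are arbitrary examples), joining a multicast group makes the host's IP stack send the IGMP membership report on the application's behalf:
        import socket
        import struct

        GROUP, PORT = "239.1.1.1", 5004                      # example multicast group and port

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", PORT))

        # IP_ADD_MEMBERSHIP triggers the IGMP membership report to adjacent routers.
        mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

        data, addr = sock.recvfrom(2048)                     # first datagram sent to the group
        print(len(data), "bytes from", addr)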
  • IMUX
    1) Input Multiplexer - Filter; 2) Inverse Multiplex
  • Inclination
    The angle between the orbital plane of a satellite and the equatorial plane of the earth.
  • Indexing
    Indexing programs, spiders, robots, and crawlers record and analyze the occurrences of keywords in the text of a page and/or the contents of the TITLE and META tags. This information is stored in databases, which can be searched by users of search engines.
  • InfoFrame
    A data structure defined in CEA-861-D that is designed to carry a variety of auxiliary data items regarding the audio or video streams or the source device and is carried from Source to Sink across HDMI.
  • Infrared
    Electromagnetic waves whose frequency range is above that of microwaves, but below that of the visible spectrum.
  • Institute of Electrical and Electronic Engineers
    A professional organization, which, as a part of its services to the community, performs some pre-standardization work for OSI.
  • Internet Engineering Steering Group
    The executive committee of the IETF. Consists of each Area Director plus the IETF Chairman.
  • Interior Gateway Protocol
    The protocol used to exchange routing information between collaborating routers in the Internet. RIP and OSPF are examples of IGPs.
  • Interlaced Video
    The combining of two sequential television fields that make up a complete frame in the NTSC system. Field 1 contains the odd numbered scan lines, field 2 the even numbered lines. When combined by interlacing, line 2 falls between 1 and 3, line 4 falls between 3 and 5, and so on. Interlaced scanning solves the problem of flicker at reasonable frame rates. An image repetition rate of less than 48 per second appears to flicker under typical viewing conditions. Interlacing provides a field repetition rate of 60 per second, well above the flicker threshold. The PAL system, having a field rate of 50 per second, often exhibits flicker to those not accustomed to it. Motion picture projection gets around the problem by using a 2-blade shutter assisted by viewing in a relatively dark room. Interlace brings with it motion artifacts that are noticeable at times. While exhibiting superior motion rendition compared to 24-frame film, interlaced video may show a double image, particularly noticeable in sporting events or other fast action, because of the 1/60th of a second offset between fields. The equalizing and vertical serrated pulses in the sync signal are essential to achieve perfect interlace.
  • Intermediate System
    An OSI system which is not an end system, but which serves instead to relay communications between end systems.
  • Internet
    A collection of networks interconnected by a set of routers which allow them to function as a single, large virtual network.
  • Internet Address
    A 32-bit address assigned to hosts using TCP/IP.
  • Internet Architecture Board
    Formerly called the Internet Activities Board. The technical body that oversees the development of the Internet suite of protocols (commonly referred to as TCP/IP). It has two task forces (the IRTF and the IETF), each charged with investigating a particular area.
  • Internet Assigned Numbers Authority
    The entity responsible for assigning numbers in the Internet Suite of Protocols.
  • Internet Control Message Protocol
    The protocol used to handle errors and control messages at the IP layer.
  • Internet Engineering Task Force
    One of the task forces of the IAB. The IETF is responsible for solving short-term engineering needs of the Internet.
  • Internet Gateway Routing Protocol
    A proprietary IGP used by Cisco System's routers.
  • Internet Protocol Version 6
    A new version of IP which is designed to be an evolutionary step from IPv4
  • Interoperability Technology Association for Information Processing
    The technical organization which has the official charter to develop Japan's OSI profiles and conformance tests.
  • Intra-Frame
    I-Frames are coded without reference to any other picture. When encoding full motion video, I-Frames are used as the base reference for subsequent pictures.
  • Intra-Pictures
    I-Pictures are coded without reference to any other picture. When encoding full motion video, I-pictures are used as the base reference for subsequent pictures.
  • Intranet
    A private network that is contained within an enterprise using TCP/IP, HTTP, and other Internet protocols. Intranets may include connections through gateway computers to the Internet using firewall servers for security.
  • Inverse 3:2
    Automatically detects and suppresses the encoding of repeated frames while encoding motion pictures. Movie film is projected at a rate of 24 frames per second, while the television signal runs at 30 frames per second. Inverse 3:2 reconciles the frame rates of the two mediums much as the 3:2 pulldown projector claw does in analog television systems using optical projectors.
  • IP
    Internet Protocol, the protocol in the TCP/IP internet layer for communication between nets and their hosts. An IP address uniquely identifies each network and each of its hosts on the Internet. Addresses consist of four bytes that can be represented by four integers (0 to 255) separated by dots, e.g. 157.189.162.75. Depending on the address's class and a subnet mask, the address is split into a network part and a host part.
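    As an illustration (a minimal Python sketch using the standard library; the /24 mask is an arbitrary example), the address from the definition splits into network and host parts:
        import ipaddress

        iface = ipaddress.ip_interface("157.189.162.75/255.255.255.0")
        print(iface.network)                  # 157.189.162.0/24 -> the network part
        print(int(iface.ip) & 0xFF)           # 75               -> the host part within that network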
  • ISAPI
    Internet Information Server API, API supported by the MS's Internet Information Server (IIS).
  • Java
    Object oriented programming language, Java classes compile into Java-bytecode. This code can be executed on any platform that implements the Java Virtual Machine (JVM). Executable Java-objects are either applets or applications. Java supports multithreading.
  • JScript
    Microsoft's script language for HTML pages. It adheres to the ECMAScript standard and is Microsoft's equivalent to Netscape's JavaScript.
  • Key
    A video special effect in which the level of a video signal is used as a switch which allows selective substitution of picture information from one source with picture information from a different source. Luminance keyers use the amplitude of the monochrome portion of the signal, while chroma keyers use the amplitude of a specific color or hue. Most printed material is inserted into video using luminance keys, while picture information is inserted using chroma keys.
  • Kilobits per second
    Number of bits, or binary digits, transmitted every second, increased in multiples of 1024 bits per second.
  • Layer (video and systems)
    One of the levels in the data hierarchy of the video and system specifications defined in Parts 1 and 2 of ITU-T Rec. H.222.0 | ISO/IEC 13818-1.
  • Letterbox Format
    A technique for showing a wider aspect ratio format on a narrower aspect ratio screen while preserving its original aspect ratio. The resulting effect of letterboxing in a video system is black bars at the top and bottom of the picture.
  • Local Area Network
    Communications network that serves users within a confined geographical area.
  • Longitudinal Time Code (LTC)
    Time code that is recorded on the videotape on a linear audio track rather than in the vertical blanking interval of the video signal. Longitudinal time code is an audio signal that uses 80 bits to assign a numerical value expressed in hours, minutes, seconds, and frames to each and every frame of video.
  • Lossless
    A compression scheme where the reconstructed image is exactly the same as the original. No data is lost.
  • Lossy
    A compression scheme where some of the less important or less visible data is discarded in order to make the image smaller. The reconstructed image is close to the original, but not exactly the same.
  • LTC
    Time code that is recorded on the videotape on a linear audio track rather than in the vertical blanking interval of the video signal. Longitudinal time code is an audio signal that uses 80 bits to assign a numerical value expressed in hours, minutes, seconds, and frames to each and every frame of video.
  • Luminance
    The brightness information part of the video signal. Luminance is often designated by the symbol Y. The luminance level of a video signal is measured on a waveform monitor by viewing in the L-Pass or IRE mode.
  • Macroblock
    The four 8x8 blocks of luminance data and the two (for 4:2:0 chroma format), four (for 4:2:2 chroma format) or eight (for 4:4:4 chroma format) corresponding 8x8 blocks of chrominance data coming from a 16 by 16 section of the luminance component of the picture. Macroblock is sometimes used to refer to the pel (picture element) data and sometimes to the coded representation of the pel values and other data elements defined in the macroblock header.
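    The block counts above can be tallied directly; the short sketch below is illustrative only and simply restates the figures given in the definition.

      # Illustrative only: 8x8 blocks per 16x16 macroblock for each chroma format
      # (4 luminance blocks plus the chroma blocks listed in the definition).
      chroma_blocks = {"4:2:0": 2, "4:2:2": 4, "4:4:4": 8}
      for fmt, chroma in chroma_blocks.items():
          print(f"{fmt}: {4 + chroma} blocks per macroblock")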
  • Major channel
    The first number in a two-part number used to identify a virtual channel. Each virtual channel carries one service, such as a television program. The major channel in the U.S. for current NTSC broadcasters is usually their NTSC channel number.
  • Master
    This is the recording VTR in an edit suite. To function as a master machine a VTR needs to be capable of insert editing.
  • Media Key
    For videos where you want to keep your content secure, QuickTime allows you to create a Media Key. It is essentially a password that you need to see a movie. If you try to view the movie without supplying the password, you'll get audio without a picture. To supply passwords for your QuickTime player, go to the Control Panel (Windows) and double click the QuickTime Icon and choose Media Keys from the drop down menu. Press the Add button to supply new passwords.
  • Megahertz
    One million cycles, or Hertz, per second.
  • Message
    A data structure that is variable length, starting with a type byte and ending with a CRC. Messages may be shorter or longer than a transport packet; however, they are typically shorter than a PES packet.
  • Metadata
    Data that describes data. Data which provides information about a resource or asset.
  • Minor channel
    The second number in a two-part number used to identify a virtual channel. The minor number changes for each different service that is or will be present in a DTV transport stream.
  • Most Significant Bit
    In DVB-S2, the MSB is always transmitted first.
  • Motion Compensation
    The use of motion vectors to improve the efficiency of the prediction of pel values. The prediction uses motion vectors to provide offsets into the past and/or future reference pictures containing previously decoded pel values that are used to form the prediction error signal.
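    The sketch below is a simplified, hypothetical illustration of motion-compensated prediction for a single block: a motion vector offsets a read from the reference picture, and only the prediction error need be coded. The array sizes and names are invented for the example.

      # Hypothetical sketch: form a prediction for one block from a reference
      # picture using a motion vector (dy, dx), then compute the residual.
      def predict_block(reference, top, left, dy, dx, size=2):
          """Copy a size x size block from the reference at the offset position."""
          return [[reference[top + dy + r][left + dx + c] for c in range(size)]
                  for r in range(size)]

      def prediction_error(current_block, predicted_block):
          """The residual that actually gets coded (current minus prediction)."""
          return [[c - p for c, p in zip(crow, prow)]
                  for crow, prow in zip(current_block, predicted_block)]

      reference = [[1, 2, 3, 4],
                   [5, 6, 7, 8],
                   [9, 10, 11, 12],
                   [13, 14, 15, 16]]
      current_block = [[6, 7],
                       [10, 11]]            # block at (0, 0) in the current picture

      predicted = predict_block(reference, 0, 0, dy=1, dx=1)   # motion vector (1, 1)
      print(prediction_error(current_block, predicted))        # -> [[0, 0], [0, 0]]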
  • Motion Estimation
    The process of estimating motion vectors during the encoder process.
  • Motion JPEG
    A series of JPEG images played by a video player.
  • Motion Vector
    A two-dimensional vector used for motion compensation that provides an offset from the coordinate position in the current picture to the coordinates in a reference picture.
  • Moving Pictures Experts Group
    An international technical committee that sets standards for compression of full-motion video and associated audio. These standards are followed by virtually all developers and manufacturers of video encoder/decoder equipment and software.
  • MP4
    The MP4 file format is the official MPEG-4 file format.
  • MPEG-1
    MPEG compression techniques that roughly approximate VHS quality video (320x240 pixels) at bit-rates of about 1.54 Mbps.
  • MPEG-2
    Broadcast-quality video can be obtained using MPEG-2 compression schemes. Transfer rates are approximately 3 Mbps to 15 Mbps and higher. MPEG-2 can also be used to create scalable video systems - those in which the picture quality of each decoded subset is directly proportional to the amount of information bits that the subset carries.
  • MPEG-4
    MPEG-4 (MP4) was defined using the standards for encoding video in a digital compressed format as specified by the Moving Picture Experts Group (MPEG). It supports video, audio, and system components that are compliant with the International Organization for Standardization (ISO) MPEG-4 standards. The MPEG-4 open standards are a set of specifications that are used to build products for production, encoding and delivery of audio/video content over many kinds of networks to a variety of clients such as personal computers, wireless devices, Web browsers, and many more. Device manufacturers prefer using MPEG-4 because the open architecture and codec do not lock them into specific formats or players. Instead of having to develop for three or four separate formats, which is logistically difficult and costly, providers can build on MPEG-4's single format.
  • Multi-channel
    Audio with more than 2 channels. Typically this term is applied to 6 (5.1) channel streams. Also called surround formats.
  • Multicast
    A special form of broadcast where copies of the packet are delivered to only a subset of all possible destinations.
  • MultiCrypt
    Is used to describe the simultaneous operation of several CA systems.
  • MUX/Multiplex
    Stream of all the digital data carrying one or more services within a single physical channel
  • n Phase Shift Keying
    n-valued Phase Shift Keying uses points on the constellation diagram, equispaced around a circle. For instance, with four phases, QPSK can encode two bits per symbol.
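    As a quick, non-normative illustration, the n-PSK constellation points can be generated as equispaced points on the unit circle; with n = 4 (QPSK) each symbol carries log2(4) = 2 bits.

      # Sketch only: the n equispaced n-PSK constellation points on the unit circle.
      import cmath, math

      def psk_constellation(n):
          return [cmath.exp(2j * math.pi * k / n) for k in range(n)]

      for point in psk_constellation(4):                    # QPSK
          print(round(point.real, 3), round(point.imag, 3))
      print("bits per symbol:", int(math.log2(4)))          # -> 2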
  • Namespaces
    W3C activity concerning XML to enable documents to use names specified in foreign DTDs. A namespace declaration within an XML document points to a namespace identified by a URI.
  • National Television Systems Committee
    Formed to determine the guidelines and technical standards for monochrome and color television. Also used to describe the 525-line, 59.94 Hz color television signal used in North America and several other parts of the world.
  • Network
    Collection of MPEG-2 Transport Stream (TS) multiplexes transmitted on a single delivery system. EXAMPLE: All digital channels on a specific cable system.
  • Network Interface Card (NIC)
    A board that provides network communication capabilities to and from a computer.
  • Non-Drop Frame Time Code
    A time-based reference system for video and audio that was developed and standardized by the SMPTE. The system assigns each frame a distinct eight-digit number that is composed of hours, minutes, seconds, and frames. Because of the nature of the NTSC color television system, non-drop frame time code does not agree with clock time: after one hour of clock time, non-drop frame time code reads approximately 3.6 seconds less than one hour. Although still in use for industrial and educational applications, non-drop frame time code has been replaced by drop frame time code by the television networks and most independent stations.
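    The 3.6-second figure can be checked with a short calculation, assuming the NTSC frame rate of 30000/1001 (about 29.97) frames per second.

      # Worked check: NTSC video runs at 30000/1001 fps, but non-drop frame
      # time code counts frames as if the rate were exactly 30 fps.
      actual_fps = 30000 / 1001                   # ~29.97 frames per second
      frames_in_one_hour = actual_fps * 3600      # frames shown in an hour of clock time
      timecode_seconds = frames_in_one_hour / 30  # what the NDF counter reads
      print(3600 - timecode_seconds)              # ~3.6 seconds short of one hour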
  • Non-Linear Editor
    An editing system based on storage of video and audio on computer disk, where the order or lengths of scenes can be changed without the necessity of re-assembling or copying the program.
  • Offline Editing
    The work print or decision-making stage of videotape editing. The resulting tape is not considered to be a broadcast quality master but is used to create program continuity and generate accurate time code data that will be stored and used afterward to conform a master quality tape from unedited production material. An offline work print may have visible time code numbers burned into the picture area for reference. This is called a window dub.
  • One Bit Audio
    1-bit Delta-Sigma modulated signal stream such as that used by Super Audio CD.
  • Online Editing
    The last stage of videotape editing, resulting in a final master tape. The equipment used during online editing is generally designed to produce broadcast quality tapes and costs much more than the equipment used in offline editing. Online editing rates are about three to four times those of offline editing. Nearly all online editing is performed on one-inch videotape. These VTRs are capable of slow motion, still-framing with broadcast quality and search and wind at 50 times play speed.
  • Open Systems Interconnection
    An effort to standardize computer networking that was started in 1977 by the International Organization for Standardization (ISO), along with the ITU-T
  • Optical Disk
    Also referred to as a laser disk. A semi-rigid plastic disk containing aluminum or other substrate in which a laser beam embeds digital data, allowing information to be stored. The substrate itself is sandwiched between plastic to prevent damage to the data when the disk is handled. The most commonly used is the WORM disk, meaning Write Once Read Many.
  • original_network_id
    Unique identifier of a network.
  • Pack (system)
    A pack consists of a pack header followed by zero or more packets. It is a layer in the system coding syntax described in 2.5.3.3 of ITU-T Rec. H.222.0 | ISO/IEC 13818-1.
  • Packet
    A collection of contiguous bits of data.
  • Packet data
    Contiguous bytes of data from an elementary stream present in a packet.
  • Packet Identifier
    A unique integer value used to identify elementary streams of a program in a single or multi-program Transport Stream as described in 2.4.3 of ITU-T Rec. H.222.0 | ISO/IEC 13818-1.
  • Packet Stream
    A sequential flow of packets.
  • Packetized Elementary Stream
    Packets of elementary stream data. Elementary stream is a generic term for coded video, audio, or other streams. Each packet contains a header and a data payload.
  • Padding (audio)
    A method to adjust the average length of an audio frame in time to the duration of the corresponding PCM samples, by conditionally adding a slot to the audio frame.
  • Pan and Scan
    A technique for changing the field of view of a motion picture or down-converted HDTV images that have been transferred to a smaller viewing format.
  • Payload
    Payload refers to the bytes which follow the header bytes in a packet. For example, the payload of some Transport Stream packets includes a PES_packet_header and its PES_packet_data_bytes, or pointer_field and PSI sections, or private data; but a PES_packet_payload consists of only PES_packet_data_bytes. The Transport Stream packet header and adaptation fields are not payload.
  • PES packet
    The data structure used to carry elementary stream data. A PES packet consists of a PES packet header followed by a number of contiguous bytes from an elementary data stream. It is a layer in the system coding syntax described in 2.4.3.6 of ITU-T Rec. H.222.0 | ISO/IEC 13818-1.
  • PES packet header
    The leading fields in a PES packet up to and not including the PES_packet_data_byte fields, where the stream is not a padding stream. In the case of a padding stream the PES packet header is similarly defined as the leading fields in a PES packet up to and not including padding_byte fields.
  • PES stream
    A PES Stream consists of PES packets, all of whose payloads consist of data from a single elementary stream, and all of which have the same stream_id. Specific semantic constraints apply. Refer to Intro. 4 of ITU-T Rec. H.222.0 | ISO/IEC 13818-1.
  • Phase Alternate Line
    The European television broadcasting video standard.
  • Phase Shift Keying
    A digital modulation scheme that conveys data by changing, or modulating, the phase of a reference signal (the carrier wave).
  • Physical channel
    A generic term used to refer to each of the 6 MHz frequency bands where television signals are embedded for transmission. Also known as the physical transmission channel (PTC). One analog virtual channel fits in one PTC, but multiple digital virtual channels typically coexist in one PTC.
  • Pixel
    A word derived from Picture Elements. This is the smallest unique point of a digital video image. In a Digital Video, a picture is divided up into thousands of Pixels, each specified by Luminance, Chrominance, and position information.
  • Pixel Encoding
    Bit placement and sequencing for the components of a pixel for a particular color space and chroma sampling.
  • Port
    Used in TCP/IP to distinguish between different data streams traveling to a single IP address. Communication can be established at different ports to keep conversations separate.
  • Pre-Roll
    This refers to the sequence of events before an edit begins, during which the player and recording VTRs are synchronized.
  • Predicted Picture
    Predicted Pictures are coded with respect to the nearest I or P Picture. P Pictures feature forward prediction incorporating motion compensation. This results in better compression (ideally, fewer bits per pixel) than I Pictures; however, P Pictures can still propagate errors.
  • Prediction
    The use of a predictor to provide an estimate of the pel value or data element currently being decoded.
  • Presentation Time Stamp
    A field that may be present in a PES packet header that indicates the time that a presentation unit is presented in the system target decoder.
  • Presentation Unit
    A decoded Audio Access Unit or a decoded picture.
  • printf
    Standard library function in the C programming language used to write formatted output.
  • Program
    In MPEG terminology, a collection of program elements. Program elements may be streams of data such as video, data, and audio. Program elements need not have any defined time base; those that do have a common time base are intended for synchronized presentation. The term program is also commonly used in the context of a television program, such as a scheduled daily news broadcast.
  • Program element
    A generic term for one of the elementary streams or other data streams that may be included in a program. For example: audio, video, data, and so on.
  • Program Specific Information
    MPEG-2 Program Specific Information (as defined in ISO/IEC 13818-1 [1])
  • Progressive Video
    Lines of spatial information of a video signal. Progressive video lines contain samples starting from one time instant and continuing through successive lines to the bottom of the frame.
  • Protocol
    A set of rules that two or more computers on a network must agree upon in order to communicate. Think of protocols as languages. If two computers speak TCP/IP, they will understand each other.
  • PTS Offset
    Value of the first generated presentation time stamp.
  • Pulse Code Modulation
    Pulse Code Modulation represents an analogue signal as a sequence of sampled, quantized values. A PCM file stores raw, uncompressed audio data.
  • QCIF
    QCIF is a video resolution, one quarter of CIF. The size is 176 x 144 pixels.
  • Quantization
    Quantization reduces the precision of sample values or transform coefficients, discarding information in order to achieve higher compression.
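    As a simple, codec-agnostic illustration, dividing values by a step size and rounding discards precision; coarser steps discard more information and compress better. The values and step size below are arbitrary.

      # Illustrative only: quantizing with a step size discards precision, which
      # is the information reduction that enables higher compression.
      def quantize(values, step):
          return [round(v / step) for v in values]

      def dequantize(levels, step):
          return [q * step for q in levels]

      coefficients = [127, 64, 31, 12, 5, 2, 1, 0]
      levels = quantize(coefficients, step=16)
      print(levels)                         # [8, 4, 2, 1, 0, 0, 0, 0]  (many zeros -> cheap to code)
      print(dequantize(levels, step=16))    # [128, 64, 32, 16, 0, 0, 0, 0]  (close to the originals)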
  • Random Access Memory
    The working memory of a computer where data and programs are temporarily stored. RAM only holds information when the computer is on.
  • Read Only Memory
    A permanent, non-volatile memory from which data can be read. This information cannot be deleted or altered.
  • Real Networks Content (RA, RM, RAM)
    Real Networks content is software developed by RealNetworks that streams live or pre-recorded audio/video to a client, such as a Web browser, by decompressing it dynamically so that it can be played back in real time. Delivering Real Networks content requires use of RealSystem Server Basic (free), Plus, Intranet, or Professional software. The RealSystem Server Basic allows for up to 25 concurrent viewers. As your requirement for simultaneous viewers increases, so does the cost of the server.
  • Real-time
    The reception, processing and delivery of data in such a way that it appears to be instantaneous.
  • Real-time Transport Protocol
    A data transport protocol defined to deliver live media capture to one or more clients at the same time. Requires an RTP server to capture, compress, packetize, and transmit the data over a network in real time. The transport protocol used by RTSP streaming. RTP/RTSP are network protocols designed for real-time streaming.
  • Receiver
    A component that is responsible for receiving the four differential TMDS input pairs at the input to an HDMI Sink and converting those signals into a digital output indicating a 24 bit, 12 bit, or 6 bit TMDS decoded word and indicating the TMDS coding mode used to decode those bits. This digital output may be contained within a semiconductor device or may be output from a semiconductor device.
  • Red/Green/Blue
    The primary colors of light. Computers and some analog component devices use separate red, green, and blue color channels to keep the full bandwidth and therefore the highest quality picture.
  • Redundant Array of Inexpensive Disks
    A configuration of multiple disks designed to preserve data after a disk casualty.
  • Reference Video Signal
    A video signal which contains a sync signal or sync and burst signals, used as a reference for synchronization of video equipment.
  • Region
    As used in the PSIP document, a region is a geographical area consisting of one or more countries.
  • Relay Server
    A relay server receives unicast streams from the Internet and multicasts them to a Local Area Network (LAN).
  • Remote Connector
    This is either the 36-pin parallel or 9-pin serial connector on the back of a VTR through which the machine is controlled by an external device.
  • Repeater
    1) Equipment which receives and re-transmits a DVB-T signal. It cannot change the TPS bits and thus the cell_id. 2) A device with one or more digital AV inputs and one or more digital AV outputs. Repeater devices shall simultaneously behave as both a Sink and a Source.
  • Replication Server
    A replication server is used for sending multiple unicast streams across the Internet. Each recipient receives their own unique stream, which creates an inefficient use of bandwidth.
  • Reserved
    When used in the clause defining the coded bit stream, indicates that the value may be used in the future for ISO defined extensions
  • reserved_future_use
    When used in the clause defining the coded bit stream, indicates that the value may be used in the future for ETSI defined extensions
  • Resolution
    A measure of the sharpness of an image. The ability of a system to reproduce fine detail and sharp edges.
  • Reverse Play
    The ability to play the tape backwards.
  • Router
    A system responsible for making decisions about which of several paths network (or Internet) traffic will follow. To do this it uses a routing protocol to gain information about the network, and algorithms to choose the best route based on several criteria known as routing metrics. In OSI terminology, a router is a network-layer (Layer 3) intermediate system.
  • RS-232
    A recommended standard (RS) interface to define the electrical and functional characteristics used in synchronous and asynchronous transmissions between a computer and a peripheral device.
  • RS-422
    A recommended standard (RS) interface to define the electrical and functional characteristics used in a balanced serial interface, but does not specify a connector.
  • Safe Title and Safe Action Areas
    Geometric boundaries within the television viewing area used as a guide to ensure the correct placement of graphics, titles, or other types of artwork so as not to lose the desired action or title information as seen on a television receiver.
  • Sample Rate
    Similar to the frame rate for a film, this represents the number of times, per second, that the audio signal is captured. In general, the higher the sample rate, the higher the sound quality of the resulting audio, but the larger the audio file. The sample rate is very important in determining the highest frequency that can be captured. The human ear can hear frequencies from 20 Hz to a maximum of 20,000 Hz. For a frequency to be captured in an audio file, the sample rate must be more than twice that frequency. For instance, the sample rate of CD audio is 44,100 Hz (44.1 kHz) so that the highest frequency it can represent is 22,050 Hz. This clearly covers the range of human hearing.
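    The relationship described above is easy to check: the highest frequency a recording can represent is half its sample rate.

      # Quick check of the rule above: highest representable frequency = sample rate / 2.
      for sample_rate in (8000, 22050, 44100, 48000):
          print(f"{sample_rate} Hz sampling -> frequencies up to {sample_rate / 2:.0f} Hz")
      # 44100 Hz sampling -> frequencies up to 22050 Hz, just above the 20 kHz hearing limit.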
  • Sampling
    The process associated with analog-digital conversion in which the continuous analog waveform is divided into discrete moments in time.
  • Sampling Frequency
    The number of sample measurements taken from an analogue signal in a second, generally expressed in megahertz. These samples are then converted into digital numeric values to create the digital signal.
  • Saturation
    The intensity of the color in a video picture. Intensity may range from pale to vivid or intense. The greater the color (chroma) saturation, the more intense the color. Excessive saturation leads to a form of transmission or recording distortion called over-deviation or bearding, or to color streaking sometimes known as bleeding color. The amount of color saturation may be seen on a vectorscope.
  • Search Mode
    A VTR mode used when searching for specific scenes, by viewing the video output or time codes while playing back the tape at various speeds in forward or reverse direction.
  • Section
    A data structure comprising a portion of an ISO/IEC 13818-1 (MPEG Systems) defined table, such as the Program Association Table (PAT), Conditional Access Table (CAT), or Program Map Table (PMT). All sections begin with the table_id and end with the CRC_32 field, and their starting points within a packet payload are indicated by the pointer_field mechanism defined in the ISO/IEC 13818-1 International Standard.
  • Sequence Headers/GOP
    (AFF only) The number of Group of Picture headers that should be between each pair of sequence headers.
  • Sequentiel Couleur A Memoire
    European video standard with image format 4:3, 625 lines, 50 Hz and 6MHz video bandwidth with a total 8 MHz of video channel width. The major difference between PAL and SECAM is that the chrominance is FM modulated in SECAM.
  • Serial D-1 Video
    Refers to video adhering to the SMPTE 259M component serial digital video standard.
  • service_id
    unique identifier of a service within a TS
  • Sibilance
    Voice recordings often contain excessive loudness of the voice sibilants, and this effect is sometimes called sibilance. The most difficult sibilants to reproduce accurately are the sounds "s" and "sh". This effect is reduced by the use of a de-esser.
  • Signal/Noise
    Signal-to-Noise ratio. The relation of the strength of the desired signal to the accompanying electronic interference, the noise. If S/N is high, sounds are reproduced with less noise and pictures are reproduced clearly without snow.
  • Simple Mail Transfer Protocol
    Simple Mail Transfer Protocol (SMTP), the standard protocol for sending e-mail.
  • SimulCrypt
    A process that facilitates using several Conditional Access (CA) systems in parallel, in conjunction with the DVB common scrambling algorithm, to control access to pay-TV services. It involves the inter-operation of two or more CA streams in a DVB environment.
  • Sink
    A device with a digital AV input.
  • Small Computer System Interface
    A computer hardware interface that allows up to fifteen devices to be connected to a single host adapter board. Devices can include hard disk drives, CD-ROM players, sound cards, scanners, etc.
  • SMPTE 259M
    A Serial Digital Interface standard for bringing composite and component 4:2:2 digital video to and from video peripherals
  • SNMP
    Simple Network Management Protocol. The network management protocol of choice for TCP/IP-based internets.
  • SOAP
    Simple Object Access Protocol, a method invented by Microsoft to use RPC over the internet via HTTP calls. SOAP is now published as a W3C Note and implemented, among others, as part of the Apache XML Project.
  • Socket
    A pairing of an IP address and a port number.
  • Sony Digital Interface
    Digital transmission format for digital audio, outdated today. As this is an old de-facto industry standard, it is available as an option for the D827 MCH. Transmission format not recommended for longer distances. Allows for transmission of 16-bit audio (the standard allows for 20 bits). Two standards are common: 2-channel transmission, unbalanced, on 75 Ω BNC cable (used e.g. in Sony PCM-1630 devices), or multichannel transmission on balanced lines with RS485 characteristics.
  • Source
    A device with a digital AV output.
  • Source Input Format
    When SIF is selected, the automatic scene detection and frequency domain filtering features are available. It also activates the SIF bit rate fields and SIF resolutions. SIF resolution defaults to 352x240 (NTSC) or 352x288 (PAL).
  • Spectrum
    The range of electromagnetic radio frequencies used in transmission of voice, data and television.
  • Speed of Data Transfer
    The rate at which information travels through a network, usually measured in megabits per second.
  • SQL
    SQL is a standard interactive and programming language for getting information from and updating a database. Although SQL is both an ANSI and an ISO standard, many database products support SQL with proprietary extensions to the standard language.
  • Standard Definition
    Standard definition television (SDTV) is a digital television (DTV) format that provides a picture quality similar to digital versatile disk (DVD). SDTV and high definition television (HDTV) are the two categories of display formats for digital television (DTV) transmissions, which are becoming the standard. HDTV provides a higher quality display, with a vertical resolution display from 720p to 1080i and higher and an aspect ratio (the width to height ratio of the screen) of 16:9, for a viewing experience similar to watching a movie. In comparison, SDTV has a range of lower resolutions and no defined aspect ratio. New television sets will be either HDTV-capable or SDTV-capable, with receivers that can convert the signal to their native display format. SDTV, in common with HDTV, uses the MPEG-2 file compression method. Because a compressed SDTV digital signal is smaller than a compressed HDTV signal, broadcasters can transmit up to five SDTV programs simultaneously instead of just one HDTV program. This is multicasting. Multicasting is an attractive feature because television stations can receive additional revenue from the additional advertising these extra programs provide. With today's analog television system, only one program at a time can be transmitted. When the United States decided to make the transition from analog television to DTV, the Federal Communications Commission decided to let broadcasters decide whether to broadcast SDTV or HDTV programs. Most have decided to broadcast SDTV programs in the daytime and to broadcast HDTV programs during prime time broadcasting. Both SDTV and HDTV are supported by the Digital Video Broadcasting (DVB) and Advanced Television Systems Committee (ATSC) set of standards.
  • Standards Converter
    A device used to translate one television standard to another. For example, videotape made using the NTSC standard cannot be shown in a country that uses either PAL or SECAM unless the tape is first copied through a standards converter. Tapes may be copied from any one standard to any other standard through this device.
  • Stereo
    2 channel audio.
  • Stream
    1) An ordered series of bytes. The usual context for the term stream is the series of bytes extracted from transport stream packet payloads which have a common unique PID value (e.g., video PES packets or Program Map Table sections). 2) A time-ordered set of digital data originating from one Source and terminating at zero or more Sinks. A stream is characterized by bounded bandwidth requirements.
  • Streaming Download
    Streaming downloads are an RTP/RTSP-based delivery method in which the video is played as it is received. The user has immediate access to any point in the movie without downloading it first. Streaming downloads are especially well suited to long broadcasts because frames are discarded as soon as they are viewed. This means that no storage space is taken up on the client side to view the file. It also acts as a form of content protection, since the user has no means of saving and redistributing the file. If a connection can't keep pace with the data rate, information can be dropped to help to maintain real-time playback. Streaming is the only way to present live feeds and supports broadcasts and multicasts.
  • Streaming Media
    Streaming media is sound (audio) and pictures (video) that are transmitted on the Internet in a streaming or continuous fashion, using data packets. The most effective reception of streaming media requires some form of broadband technology such as cable modem or DSL.
  • Streaming Server
    A specialized server that can use RTP/RTSP to broadcast audio and video over the Internet. Examples include QuickTime Streaming Server, Darwin Streaming Server, the Real System Servers, or Windows Media Services in Windows 2000 Servers.
  • Subcell
    Geographical area that is part of the cell's coverage area and that is covered with DVB-T signals by means of a transposer. In conjunction with the cell_id, the cell_id_extension is used to uniquely identify a subcell.
  • Super Video CD
    Super Video CD builds on the Video CD standard, delivering higher quality audio and video on a CD disc.
  • Sync
    Any of the signals used to generate and control a television picture, but, specifically, the portion of the composite video signal from zero to minus forty IRE units consisting of vertical and horizontal timing pulses and equalizing signals to maintain the proper relationship of the two fields of video making up each frame.
  • System/Service Information
    Digital data describing the delivery system, content and scheduling/timing of broadcast data streams, etc. It includes MPEG-2 PSI together with independently defined extensions.
  • Table
    Composed of a number of sub_tables with the same value of table_id.
  • Table, instance
    Tables are identified by the table_id field. However, in cases such as the RRT and EIT, several tables with different content can be defined simultaneously; each of these is a table instance. All instances have the same PID and table_id but a different table_id_extension.
  • Table, PSIP
    A collection of tables describing virtual channel attributes, event features, and other elements. PSIP tables are compliant with the private section syntax of ISO/IEC 13818-1.
  • Tbit
    Time duration of a single bit carried across the TMDS data channels.
  • Tcharacter
    Time duration of a single TMDS character carried across the TMDS data channels. This is equal to 10 × Tbit.
  • Telecine Formats
    The most popular type of television projector/camera configuration is called a flying spot scanner. Most film is transferred in the 35mm format in the form of filmed television programs or feature films. Wide screen film formats such as Cinemascope and Panavision may be transferred to videotape for use on television by using a sophisticated pan and scan method to select the most desirable areas of a scene. This preprogrammed information and selection of scene framing or panning is stored in a computer and may then be duplicated in real time as the film is recorded on videotape. Color balance and scene density information may be stored and applied in the same way.
  • Television World Standards
    There are more than 165 countries in the world that offer television to their citizens. Of these, 18% transmit black and white (monochrome), color transmissions in NTSC = 23%, PAL = 35%, and SECAM = 20%. The chart in Section 10 outlines in detail the various formats and other specifications used around the world.
  • Terabyte
    1 trillion bytes.
  • The International Maritime Satellite Organization
    Operates a network of satellites for international transmissions for all types of international mobile services including maritime, aeronautical, and land mobile.
  • The International Telecommunications Satellite Organization
    Operates a network of satellites for international transmissions.
  • Interference
    Energy which tends to interfere with the reception of the desired signals, such as fading from airline flights, RF interference from adjacent channels, or ghosting from reflecting objects such as mountains and buildings.
  • The Internet Group Management Protocol
    The Internet Group Management Protocol (IGMP) is an Internet protocol that provides a way for an Internet computer to report its multicast group membership to adjacent routers. Multicasting allows one computer on the Internet to send content to multiple other computers that have identified themselves as interested in receiving the originating computer's content. Multicasting can be used for such applications as updating the address books of mobile computer users in the field, sending out company newsletters to a distribution list, and broadcasting high-bandwidth programs of streaming media to an audience that has "tuned in" by setting up a multicast group membership. In the Open Systems Interconnection (OSI) communication model, IGMP is part of the network layer (Layer 3).
  • Time Base Corrector
    An electronic processing device connected to the output of a VCR or VTR that removes or masks the jitter generated by unavoidable mechanical inaccuracies in helical scan recorders. This is accomplished by automatically delaying the video signal so that each line starts at the proper time. The TBC can be an integral part of the recorder (professional) or may be an external stand-alone device (industrial/consumer). It may also include a video-processing amplifier that allows the adjustment of video and color levels in the corrected signal.
  • Time Base Error
    The horizontal and/or vertical jitter inherent in most videotape recording equipment. Broadcast standards require a horizontal line-to-line timing accuracy of better than one part in thousands, which is impossible to attain in a mechanical tape-scanning device. An electronic automatic time delay device, known as a Time Base Corrector (TBC), is necessary in all helical scan VTRs to compensate for this inherent problem if the video is to be broadcast or composited (dissolve, wipe, matte) in any way. Consumer and industrial machines cannot afford this feature and rely on fast horizontal automatic frequency control (AFC) circuits in monitors and receivers to cover this fault.
  • Time Code (TC)
    A standardized numbering system referenced to a 24-hour clock by which audio or video material is specifically identified for editing or reference purposes. The system assigns a unique, eight-digit number to every frame on an audio or videotape, in the form hours:minutes:seconds:frames, with values up to 23 hours, 59 minutes, 59 seconds, and 29 frames. The maximum time that could be displayed is therefore 23:59:59:29; after that, the display starts over again at zero. A typical time code would be displayed as 14:23:06:17.
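    A time code of this form maps directly to an absolute frame count. The sketch below assumes a 30 frames-per-second, non-drop frame count purely for illustration, using the example value from the entry above.

      # Sketch (30 fps, non-drop frame assumed): HH:MM:SS:FF <-> absolute frame count.
      FPS = 30

      def timecode_to_frames(hh, mm, ss, ff):
          return ((hh * 60 + mm) * 60 + ss) * FPS + ff

      def frames_to_timecode(frames):
          ff = frames % FPS
          ss = frames // FPS % 60
          mm = frames // (FPS * 60) % 60
          hh = frames // (FPS * 3600)
          return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

      n = timecode_to_frames(14, 23, 6, 17)
      print(n)                        # 1553597 frames
      print(frames_to_timecode(n))    # 14:23:06:17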
  • TMDS Character
    A 10-bit TMDS-encoded value. One such value is carried on each of the three data channels for each cycle of the TMDS clock.
  • Transmission Control Protocol
    Transmission Control Protocol, a protocol from the TCP/IP transport layer. TCP is reliable, i.e. each packet is acknowledged. It is furthermore connection-oriented, i.e. the connection is established via a handshake before any data is transmitted. TCP serves for communication between applications that are identified by their port numbers on a host.
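    As a minimal, loopback-only sketch (arbitrary port number, not a production pattern), the example below shows a connection being established before any data flows, with the two applications identified by port numbers; the operating system's TCP stack handles the acknowledgements.

      # Minimal loopback sketch: connection-oriented, reliable transfer over TCP.
      import socket, threading

      ready = threading.Event()

      def server():
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
              srv.bind(("127.0.0.1", 50007))   # arbitrary example port
              srv.listen(1)
              ready.set()                      # tell the client it may connect
              conn, _ = srv.accept()           # handshake completes here
              with conn:
                  conn.sendall(b"hello over TCP")

      threading.Thread(target=server, daemon=True).start()
      ready.wait()

      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
          cli.connect(("127.0.0.1", 50007))    # client side of the handshake
          print(cli.recv(1024))                # delivered reliably and in order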
  • Transmission Control Protocol / Internet Protocol
    Transmission Control Protocol / Internet Protocol, a bundle of network protocols, e.g. the Internet Protocol IP, the transport protocol TCP, and application protocols such as TELNET and FTP.
  • Transmitter
    1) A component that is responsible for driving the four differential TMDS output pairs into an HDMI output and for clocking the data driven into those four output pairs. 2) Equipment that modulates a baseband transport stream and broadcasts it on one frequency.
  • Transport Stream
    Data structure defined in ISO/IEC 13818-1 [19]. It is the basis of the DVB standards.
  • transport_stream_id
    Unique identifier of a TS within an original network.
  • Transposer
    Type of repeater which receives a DVB-T signal and re-transmits it on a different frequency.
  • TSID
    Transport stream ID or transmission signal ID.
  • TV-Anytime
    Specifications developed by the TV-Anytime Forum (www.tv-anytime.org) for metadata and content referencing that allow audio-visual content to be found, recorded, and accessed from local consumer storage.
  • Unbalanced
    This refers to a way of connecting audio signals using wires carrying a single core and screen.
  • Unicast
    Unicast is communication between a single sender and a single receiver over a network. The term exists in contradistinction to multicast, communication between a single sender and multiple receivers, and anycast, communication between any sender and the nearest of a group of receivers in a network. An earlier term, point-to-point communication, is similar in meaning to unicast. The new Internet Protocol version 6 (IPv6) supports unicast as well as anycast and multicast.
  • Unicode
    A computing industry standard for the consistent encoding, representation and handling of text expressed in most of the world's writing systems.
  • URL (Uniform Resource Locator)
    The global address of a file on the Internet. The address of the window you are currently viewing is an example of a URL.
  • User Datagram Protocol
    A transport protocol in the Internet suite of protocols. UDP, like TCP, uses IP for delivery; however, unlike TCP, UDP provides for exchange of datagrams without acknowledgments or guaranteed delivery.
  • Variable Bit Rate
    In video coding, the bitrate of the encoded video stream varies over time. Sometimes the video is encoded with a fixed quantization factor resulting in varying bitrates and varying perceived quality. Sometimes the video is encoded with a fixed perceived quality which usually results in varying bitrate.
  • Vector
    A 16x16 pixel block from a video frame.
  • Version number
    A number that increments each time there is a change in a referenced table.
  • Vertical Blanking
    The portion of the video signal that occurs between the end of one field and the beginning of the next. During this time, the electron beams in the cameras and monitors are turned off so that they can return from the bottom of the screen to the top without showing traces of movement on the screen. When the position of V-blanking is not adjusted correctly, a horizontal black bar appears on the screen.
  • Vertical Interval
    Non-visible (off-screen) part of a video picture signal.
  • Vertical Interval Time Code
    Time code that is recorded within the vertical blanking interval of the video signal. VITC uses a 90-bit word to assign a value to each frame of video.
  • Video Buffering Verifier
    The VBV defines the maximum size of the decoder buffer; the default value is 1,835,008 bits for main profile at main level (MP@ML).
  • Video Burst Sync
    A composite signal consisting of video signal, burst signal and sync signal.
  • Video CD
    Video CD is a standard for storing audio and video information on CD discs. A 74-minute video sequence can be stored on a single disc.
  • Video Field
    The period from one VSYNC active edge to the next VSYNC active edge.
  • Video for Windows
    A format developed by Microsoft Corporation for storing video and audio information.
  • Video Format
    A video format is sufficiently defined such that when it is received at the monitor, the monitor has enough information to properly display the video to the user. The definition of each format includes a Video Format Timing, the picture aspect ratio, and a colorimetry space.
  • Video Format Timing
    The waveform associated with a video format. Note that a specific Video Format Timing may be associated with more than one Video Format (e.g., 720X480p@4:3 and 720X480p@16:9).
  • Video Stream ID
    This identifies the video elementary stream to be included in the multiplexed stream.
  • Video Stream PID
    The unique integer value associated with the video elementary stream of a program in a single or multi-program transport stream.
  • Video-on-Demand
    VOD is pre-recorded video stored on a server for access at the user's convenience.
  • VIDEO_TS
    Inside this folder are the .IFO files (navigation files carrying information such as programs, chapters, buttons, etc.), .BUP files (backup copies of the IFO files in case the originals on the DVD are damaged), and .VOB files (the video, audio, and subtitle data).
  • Virtual Channel
    The designation, usually a number, which is recognized by the user as the single entity that will provide access to a TV program. It is called virtual because its identification (name and number) may be defined independently from its physical (RF) location. Examples of virtual channels include: digital radio (audio only), a typical cable analog TV channel, a typical digital TV channel (composed of one audio and one video stream), multi-visual digital channels (composed of several video streams and one or more audio tracks), or a data broadcast channel (composed of one or more data streams). In the case of an analog TV channel, the virtual channel designation will link to a specific physical transmission channel. In the case of a digital TV channel, the virtual channel designation will link both to the physical transmission channel and to the particular video and audio streams within that physical transmission channel that make up the event currently on that VC.
  • Virtual Channel Table
    Used in reference to either TVCT or CVCT.
  • Virtual Private Network
    A way to use a public telecommunication infrastructure, such as the Internet, to provide remote offices or individual users with secure access to their organization's network. A virtual private network can be contrasted with an expensive system of owned or leased lines that can only be used by one organization. The goal of a VPN is to provide the organization with the same capabilities, but at a much lower cost. A VPN works by using the shared public infrastructure while maintaining privacy through security procedures and tunneling protocols such as the Layer Two Tunneling Protocol (L2TP). In effect, the protocols, by encrypting data at the sending end and decrypting it at the receiving end, send the data through a tunnel that cannot be "entered" by data that is not properly encrypted. An additional level of security involves encrypting not only the data but also the originating and receiving network addresses.
  • Watermark
    A still image such as a logo added to a video clip to show identity or to protect content.
  • Windows Media Audio
    MS Windows Media Audio is an audio codec designed by Microsoft for use with streaming content at CD quality. It is designed to resist data loss that could cause signal degradation and can improve download times for audio.
  • Windows Media Video
    WMV is a MS Windows Media file with audio and/or video and is used to download and play files or to stream content. The WMV format is similar to the ASF format. Please see the ASF file documentation for more information about the capabilities of these files.
  • Wipe
    The transition between television picture sources in which each picture source is displayed on only a portion of the screen, that portion being determined by an electronically generated pattern which can be sized and positioned using a special effects generator.
  • World Wide Web
    An easy but powerful global information system, based on a combination of information retrieval and hypertext techniques.
  • XLR
    A three-pin audio connector used with balanced lines in microphone and line level audio applications.
  • XSL FO
    XSL Formatting Objects, the formatting vocabulary that forms one half of XSL (the other half being XSLT).
  • XML Parser
    An XML parser is a processor that reads an XML document and determines the structure and properties of the data. It breaks the data up into parts and provides them to other components. If the parser goes beyond the XML rules for proper formatting and validates the document against an XML DTD, the parser is said to be a validating parser.
  • XML Schema
    The most recent design for providing XML schemata as released within two W3C working drafts currently in Last-Call. Schemata serve for describing the structure and constraining the contents of XML documents and associating datatypes with XML element types and attributes. Previous schema designs include DTDs, XML-Data, XDR, RDF, SOX, DCDs, XSchema and DDML. Although the XML Schema WG is overdue to bring their WDs to Recommendation, there are still problems with the design. One of the efforts to overcome those is the MSL effort.
  • XML Schema Definition
    XML Schema Definition (XSD), the suffix of files containing a description of an XML schema according to the XML Schema specification.
  • XSL Transformations
    XSL Transformations, a language for transforming XML documents into other XML documents. XSLT is designed for use as part of XSL, which is a stylesheet language for XML. In addition to XSLT, XSL includes an XML vocabulary for specifying formatting. XSL specifies the styling of an XML document by using XSLT to describe how the document is transformed into another XML document that uses the formatting vocabulary.
  • YCBCR
    Digital representation of any video signal using one of several luminance/color-difference color spaces.
  • YIQ
    Like YUV, but standard only for NTSC Video and TV.
  • YUV
    Analog component video used for the PAL or NTSC broadcast television standard. Y is the luminance vector, U and V are the color difference vectors.
  • YUV-12
    A color encoding scheme composed of 8x8 pixel color blocks, used to save space when encoding graphics and video. Since the human eye is less sensitive to color than it is to light, for every 16 luminance samples (Y), where luminance is the brightness of an image or object, there are only 8 chrominance samples (U, V), where chrominance is the perception of color. It is downward compatible with black-and-white television. It is used on PAL TV and video and is the standard for digital video (CCIR 601). It is equivalent to the YCbCr color model.
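    As an illustrative count (arbitrary 720x576 frame size), the 2:1 luminance-to-chrominance sampling described above averages 12 bits per pixel with 8-bit samples, consistent with the "12" in the name.

      # Illustrative only: samples per frame for YUV-12 (4:2:0) versus full 4:4:4.
      width, height = 720, 576
      luma = width * height                          # one Y sample per pixel
      chroma_420 = 2 * (width // 2) * (height // 2)  # U and V at quarter resolution
      chroma_444 = 2 * width * height

      print("4:2:0 samples per frame:", luma + chroma_420)   # 622080  -> 12 bits/pixel
      print("4:4:4 samples per frame:", luma + chroma_444)   # 1244160 -> 24 bits/pixel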