Unit 21: Principles of Video Technology Assignment

Media 101: TV standards and Aspect Ratios

What's in the Picture: an Introduction to TV standards and Aspect Ratios

Television standards, as we call them, are the encoding standards for the recording and reception of video: the "synergy between the video and how we play it".

The 3 Standards are:

NTSC = National Television System Committee

PAL = Phase Alternating Line

SECAM = Séquentiel Couleur à Mémoire (Sequential Colour with Memory)

A TV's video output is made up of 25 to 30 frames displayed every second (25 for PAL and SECAM, roughly 30 for NTSC), and each frame is made up of scan lines: 525 for NTSC, 625 for PAL and SECAM.
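As a rough sketch (the frame rates here are nominal; NTSC colour actually runs at about 29.97 fps), the numbers above multiply out like this:

```python
# Rough arithmetic: scan lines drawn per second under each standard.
standards = {
    "NTSC": {"lines": 525, "fps": 30},
    "PAL": {"lines": 625, "fps": 25},
    "SECAM": {"lines": 625, "fps": 25},
}

for name, s in standards.items():
    print(f"{name}: {s['lines'] * s['fps']} lines per second")
```

So despite the different frame rates, all three standards end up drawing a similar number of lines per second.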

NTSC is the TV standard of the Americas. It was created in 1941 as the standard for black-and-white TV broadcasts.

Strengths: there is less flicker and smoother motion in the video output because of the higher frame rate, and it produces less noise.

Weaknesses: the lower number of scan lines means the picture isn't as clear as PAL's, and the contrast level is poor. Colour levels can fluctuate from frame to frame.

PAL was developed in the early 1960s and first used for UK broadcasting in 1967.

Strengths: it has more scan lines, so pictures have higher detail, higher contrast and better colour reproduction than NTSC.

Weaknesses: the lower frame rate means picture motion can appear to flicker more than NTSC's 30 frames per second, and colour saturation can vary from frame to frame.

SECAM was created in France in 1967; it is now the least widely used of the three standards.

Strengths: the high number of scan lines means a good-quality picture, the colour hues stay at a constant saturation, and it has stable colour reproduction.

Weaknesses: like PAL there is more motion flicker, and pattern effects can crop up on the picture from time to time. There are also many variants of SECAM from country to country, a lot of which are incompatible with one another.

Aspect Ratios

Aspect ratios by definition are the relationship between the width and the height of a film (or video) frame. An example would be Terminator 2, which has an aspect ratio of 2.20:1; this means the image is 2.2 times as wide as it is high. There are many different aspect ratios, and some cameras will only be able to film in certain ones. What follows are a few examples of popular aspect ratios.
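The arithmetic behind a ratio like 2.20:1 is just a division; a minimal sketch:

```python
# An aspect ratio is simply width divided by height.
def aspect_ratio(width, height):
    return width / height

# Terminator 2's 2.20:1 frame is 2.2 times as wide as it is high.
print(aspect_ratio(2.20, 1.0))             # 2.2
# A 1920x1080 television frame works out to 16:9, about 1.78:1.
print(round(aspect_ratio(1920, 1080), 2))  # 1.78
```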

Academy Standard 1.33:1 (or 1.37:1), also known as 4:3

One of the earliest aspect ratios, it was popular throughout Hollywood's "Golden Age".

It came into use after silent films had fallen out of fashion. NTSC used the Academy standard for TV screens as it made transferring films to TV easier. A famous example of a movie filmed in the Academy standard would be Casablanca from 1942.

Standard Flat 1.85:1 (or 1.66:1)

In the early 50s widescreen cinema became very popular, which made life difficult for film makers as it required specialised cameras and projectors that cost a lot of money, so a wider solution was devised: a new standard called Standard Flat. Many theatrical films use Standard Flat to this day; an example of a modern film would be Trainspotting, released in 1996.

Anamorphic Scope 2.35:1 also known as true widescreen

Feature films are traditionally shot on film reels. The drawback of this practice is that there is only so much room in the frame, so to shoot widescreen you would otherwise need a wider film stock. In anamorphic scope the film is shot with an "anamorphic" lens attached to the camera. An optic inside the lens squeezes the light as it enters, compressing the image on its horizontal axis to half its original width; for example, a 2.35:1 image would be compressed to roughly 1.18:1. Later, when the movie is played back at the cinema, the picture is stretched back out to its full width. An example of a movie filmed in anamorphic scope would be the live-action 101 Dalmatians from 1996.
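The squeeze arithmetic can be sketched in a few lines, assuming the common 2x anamorphic squeeze:

```python
# A 2x horizontal squeeze halves the aspect ratio on the negative,
# and projection through a matching lens reverses it exactly.
SQUEEZE = 2.0

def squeeze(ratio):
    return ratio / SQUEEZE

def unsqueeze(ratio):
    return ratio * SQUEEZE

print(round(squeeze(2.35), 3))   # 1.175 (the ~1.18:1 on the negative)
print(unsqueeze(squeeze(2.35)))  # 2.35 restored at projection
```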

VistaVision 1.66:1 / 1.85:1 / 2.0:1

More flexible than other film formats, it allows for more aspect ratios. It was created in 1954 by engineers at Paramount. It was filmed with a special camera mounted on its side; its image quality was better than standard 35 mm, but it did require a special projector. Movies shot in VistaVision include Vertigo, North by Northwest and White Christmas.


IMAX 1.43:1 / 1.78:1 / 1.9:1

Widescreen in nature, the system originated in Canada at the Expo 67 fair in Montreal, where a group of film makers and entrepreneurs designed a new system using a single powerful projector rather than the multiple projectors that were the industry standard at the time. This revolutionised large-format projection. Movies shot in IMAX include Contagion, Real Steel and Puss in Boots.



Cinerama 2.60:1 (2.35:1 when transferred)

This uses 3 cameras and interlocks the 3 images together, which creates an extremely wide 2.60:1 presentation; when it is transferred to video its aspect ratio becomes 2.35:1, as the film is reduced to a 35mm anamorphic print.

Movies shot in Cinerama include How the West Was Won, The Wonderful World of the Brothers Grimm, and Seven Wonders of the World.

All images used came from Google Images.

Media 101: Video Connectors

Making the Connection: a beginner's guide to video connectors.

Visual Interface standards

Visual interface standards are the format and method by which the video signal is transmitted between its source and the display. Simply put, it's the cable that connects the player to the screen; we call these video connectors.

These days there are two main high-definition video connectors (visual interfaces), they are HDMI and DVI.

HDMI: High Definition Multimedia Interface.

HDMI was developed by Philips, Panasonic, Sony, Silicon Image, Hitachi and Toshiba, and was officially released in 2002. HDMI is an audio-video interface with a huge bandwidth of around 4 Gbps, which sends the data faster, more reliably and with less loss of data in transmission, giving you a crisper picture. In technical terminology, the specs of HDMI are:

  • Supports 1080p.
  • Can send up to 8 channels of high-res audio (which is better than DVD audio which only supports 6 channels).
  • Supports the colour spaces RGB 4:4:4, YCbCr 4:4:4, YCbCr 4:2:2.
  • Supports the video formats: SDTV 720x576i (PAL) 4:3 & 16:9; EDTV 640x480p (VGA), 720x576p (PAL) 4:3 & 16:9; HDTV 1280x720p, 1920x1080i, 1920x1080p 16:9
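A back-of-envelope calculation, assuming 24 bits per pixel and ignoring blanking intervals and encoding overhead, shows why gigabit bandwidth matters for the formats listed above:

```python
# Raw (uncompressed) pixel data rate for 1080p at 60 Hz, 24 bits per
# pixel. Blanking and TMDS encoding push the real rate higher still.
width, height, refresh_hz, bits_per_pixel = 1920, 1080, 60, 24

bits_per_second = width * height * refresh_hz * bits_per_pixel
print(f"{bits_per_second / 1e9:.2f} Gbps of raw pixel data")  # 2.99 Gbps
```

Even before overheads, full HD video needs nearly 3 Gbps, which is why HDMI was designed with multi-gigabit bandwidth.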

Currently the standard HDMI connector is known as "Type A". There is a second connector called "Type B"; however, this connector has not yet entered production, as it is being reserved for future higher-resolution devices.


DVI connector

There are 3 types of DVI cable: DVI-D, DVI-A and DVI-I. The definitions for each cable are as follows:

  • DVI-D (digital only) delivers a faster, higher-quality image than analogue because of its digital format. All video cards produce a digital video signal first, which can be converted to an analogue signal if the connection requires it. DVI-D eliminates the need to convert to analogue, making it less lossy. DVI-D sends information using a digital format called TMDS (Transition-Minimised Differential Signalling).
  • DVI-A (analogue only) carries a DVI signal to an analogue display. Converting the signal to analogue leads to a loss in quality. Used for older monitors such as CRT and TFT.
  • DVI-I (can carry both analogue and digital signals). The digital-to-digital signal sends information using TMDS. If you are trying to connect an analogue source to a digital display you'll need to buy a converter. The main benefit of DVI-I is that it is a more versatile cable in terms of what it can connect to.
  • Single vs dual link: the DVI-D and DVI-I formats are available in single and/or dual link. With single link there is one TMDS transmitter; dual link has two transmitters, which in effect doubles the transmission capacity. This means that dual link provides quicker speed and superior signal quality, so it can display images at higher resolutions than single link.
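As a sketch of the single- versus dual-link difference: the 165 MHz single-link pixel clock is the usual DVI figure, and the calculation below ignores blanking intervals, so it is an upper bound rather than an exact limit.

```python
# Single-link DVI is limited to a 165 MHz pixel clock; dual link runs
# two TMDS transmitters, effectively doubling the pixel budget.
SINGLE_LINK_PIXEL_CLOCK_HZ = 165_000_000

def pixels_per_frame_budget(pixel_clock_hz, refresh_hz):
    # How many pixels per frame the clock can push at a given refresh
    # rate (ignoring blanking, so this is an upper bound).
    return pixel_clock_hz // refresh_hz

single = pixels_per_frame_budget(SINGLE_LINK_PIXEL_CLOCK_HZ, 60)
dual = pixels_per_frame_budget(2 * SINGLE_LINK_PIXEL_CLOCK_HZ, 60)
print(1920 * 1200 <= single)  # True: 1920x1200 fits single link
print(2560 * 1600 <= dual)    # True: 2560x1600 needs dual link
```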

There are also formats which came before the high definition connectors, they are:

Composite video

This connector carries a video signal only, which provides an image but no sound. The cable is usually yellow and carries the information required to create a picture: the chrominance, which is a combination of saturation and hue, and the luminance, which is the brightness, combined into a single signal. The main drawback is that you require separate audio cables in order to get any sound. This type of connector was widely used throughout the 1980s in older games consoles, and it is still widely used for video, DVD and games consoles.


S-video

S-video is a 4-pin connector which carries two separate signals, one for the luminance and one for the chrominance, which delivers a better picture than composite video signals do. Like composite, however, S-video carries no audio, so separate audio cables are still needed.



Component video

This type of video came after composite and S-video and carries the picture information in 3 signal-carrying wires. One wire carries the luminance and the other two carry the chrominance, which is split into two signals, one red-difference and one blue-difference. These are often referred to as YUV, where Y is the luma and U and V are the chrominance signals. The connectors are marked Y (which carries the luminance signal, or luma) and Cb and Cr (which both carry chrominance signals). The advantage of using component video is that it gives a clean image with less colour bleed than the two previous connectors.
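The split into one luma and two chroma signals can be sketched with the standard-definition (BT.601) weighting; R, G and B here are normalised to the range 0.0 to 1.0:

```python
# Luma is a weighted sum of R, G and B (BT.601 weights for SD video);
# the two chroma signals are scaled colour differences from that luma.
def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance (brightness)
    cb = (b - y) / 1.772                   # blue-difference chroma
    cr = (r - y) / 1.402                   # red-difference chroma
    return y, cb, cr

# Pure red: low luma, strong red-difference chroma.
y, cb, cr = rgb_to_ycbcr(1.0, 0.0, 0.0)
print(round(y, 3), round(cb, 3), round(cr, 3))  # 0.299 -0.169 0.5
```

This is why a bright green area looks brighter than an equally "loud" blue one on the Y signal: green contributes the most to luma.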

Media 101: Broadcast Technology

From Radio Waves to Internet Streaming: An introduction to broadcast technology.

The definition of broadcasting is "to sow over a wide area…" In broadcast technology this means communicating over television or radio to an audience, using video and audio transmission. The main methods of television broadcasting are:

  • Analogue
  • Cable
  • Satellite
  • Digital
  • VOD (Video On Demand)



Analogue uses electromagnetic waves, which are a type of energy. Electromagnetic waves can carry information as they move, which can be used to transfer information from one point to another. Electromagnetic waves are a type of radiation which can vary in size, e.g. radio waves, light waves and X-rays. It is with radio waves that broadcasters are able to transfer information.

Broadcasters can use radio waves to transfer their programmes to a viewer's television. Analogue works by sending the information as radio waves from a transmitter (towers/masts) to a receiver (aerials). The receiver catches the signal from the transmitter and sends it down a wire to the TV. TV channels are tuned in on the TV according to their frequency. The 2 main frequency bands employed by analogue are UHF (Ultra High Frequency) and VHF (Very High Frequency).

The information is sent as variations in the amplitude, frequency and phase of the electromagnetic signal. This simply means the TV programme is sent over radio waves from a TV mast to the aerial on top of your home (or wherever it is).
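As an illustrative toy model (the frequencies and sample rate are made up, nothing like real broadcast figures), amplitude modulation puts the programme information onto a carrier wave by varying the carrier's strength:

```python
import math

# The low-frequency "signal" stands in for the programme information;
# it rides on a high-frequency carrier by varying its amplitude.
def am_sample(t, carrier_hz=1000.0, signal_hz=10.0, depth=0.5):
    signal = math.sin(2 * math.pi * signal_hz * t)    # information
    carrier = math.sin(2 * math.pi * carrier_hz * t)  # radio wave
    return (1 + depth * signal) * carrier             # modulated wave

samples = [am_sample(n / 8000.0) for n in range(8000)]  # one second
print(min(samples), max(samples))  # swings roughly between -1.5 and 1.5
```

The receiver's job is the reverse: recover the slow envelope (the programme) from the fast carrier.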

Terrestrial Broadcast History

1929: the BBC begins broadcasting "over the air".

1961: a conference in Stockholm (ST61) gives each European country its own frequencies for broadcasting its programmes.

1985: the UK stops using VHF to broadcast.

2010 – 2012: the UK digital switchover takes effect; all analogue terrestrial broadcasting in the EU is set to end in 2012.


Strengths: cheap; able to broadcast to both local and national audiences; can serve a variety of reception modes (handheld, aerial on the house or TV); transmission quality stays the same regardless of the size of the audience.


Weaknesses: limited interactivity; line-of-sight obstructions can weaken the signal strength; it takes up considerable airwave spectrum, which puts it at odds with other users of the airwaves; image quality can suffer "ghosting".

So analogue terrestrial broadcasting will not exist for much longer, as the digital age has come.


Cable is a method of broadcasting where the programme's information is transmitted by cables rather than through the air. The transmission begins at a head-end facility, where the programmes are processed and then transmitted down a large coaxial cable, named a trunk cable, which runs through the middle of a town or city. Feeder cables branch off from the trunk cable rather like a tree's roots and take the signal into a neighbourhood; drop cables then branch off from the feeder cables and carry the signal into a set-top box or cable modem, which processes the signal. This is then sent to the TV along a video connector.

Cable Broadcast History

CATV (Community Antenna Television) is often used to mean "cable TV". Cable TV originated in 1948, when certain areas were out of terrestrial broadcast range or couldn't get a good signal, so large antennas were built that used cables to feed the signals into homes.

Cable is most common in North America, Europe, Australia and East Asia.


Strengths: reliability is a factor, as it can deliver signals where dishes can't, e.g. mountainous areas.

Lightwave and Fibre-optic Cables provide on demand and interactive services.


Weaknesses: a "lossy" signal, meaning parts of the sound and picture can be lost over a certain distance. Because of this, many companies have had to install signal amplifiers at half-mile intervals to boost the signal's strength, and that costs a lot of money. Because of the expense, many companies have switched to lightwave signals, which travel down a fibre-optic cable straight to a node in a neighbourhood. The node then converts the signal to an RF signal, which gives a stronger signal and reduces distortion and noise.

Overall, cable TV is still popular today, and one of the most popular cable providers in the UK is Virgin Media.


Satellite works by transmitting signals to uplink dishes, which transmit the electromagnetic signal to a specific satellite high in the planet's orbit. The satellite then transmits the signal back down to Earth on a different frequency, where it is caught by a parabolic dish on the customer's home. The dish amplifies the signal, which has become weak after travelling tens of thousands of miles, and the signal is then sent down a cable to a receiver, e.g. a Sky box. The signal is then taken from the receiver to the TV by a video connector.

History of Satellite Broadcasting

In 1975 the first hints of satellite TV emerged, when the heavyweight boxing match known as the "Thrilla in Manila" was broadcast via satellite.

Satellite TV truly came into being during the 80s; it was initially very expensive, but the price gradually went down throughout the decade.

In the 90s four cable companies launched a DBS (Direct Broadcast Satellite) service named Primestar, which kick-started the small satellite dish era.

Recently the number of satellite TV subscribers has reached more than 18 million, and it's still growing.


Strengths: it offers a lot of channels; companies are very competitive, so many offer free installation and low fees; and uplinks can be portable.


Weaknesses: the satellites cost a considerable amount of money to maintain and provide for; subscription and installation fees; atmospheric interference; signal strength.


According to many, digital is the future of broadcasting. It can be sent by cable, satellite or terrestrial (over-the-air) transmission. The information is encoded as an "MPEG transport stream", which includes both audio and video data; it also contains error-correction information and provides a higher picture quality than any analogue signal. DVB-T is the European standard for digital terrestrial TV.
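A minimal sketch of how that transport stream is organised: the 188-byte packet size and the 0x47 sync byte come from the MPEG-TS specification, but the packet below is fabricated purely for illustration.

```python
# An MPEG transport stream is a sequence of fixed 188-byte packets,
# each starting with the sync byte 0x47 so receivers can lock on.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def looks_like_ts_packet(packet: bytes) -> bool:
    return len(packet) == TS_PACKET_SIZE and packet[0] == SYNC_BYTE

fake_packet = bytes([SYNC_BYTE]) + bytes(TS_PACKET_SIZE - 1)
print(looks_like_ts_packet(fake_packet))  # True
```

The regular sync byte is part of what makes digital reception robust: if the receiver loses its place, it can scan forward for the next 0x47 and resynchronise.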

History of Digital Broadcasting

Digital is an emerging form of broadcasting and will soon take over Britain and most of Europe in the "digital switchover".


Strengths: digital signals are better than analogue because they take up less bandwidth, which leaves more room for other information and interactive services; the way the signal is packaged also means it should not be as "lossy" as an analogue signal.


Weaknesses: it costs a lot of money to switch over from analogue to digital; the "cliff effect" means that if the signal degrades beyond a certain point the receiver cannot decode it at all; switching channels is slower because of delays in decoding the signal; and you may require extra equipment and a new antenna.

VOD (Video On Demand)

VOD is used to download and/or stream videos to a computer or a similar device so people can watch them whenever they want.

It is a new way to broadcast films and TV to an audience digitally, when they want it, and it is quickly catching on.

VOD gives viewers full control of what they watch and when; some services are free, such as BBC iPlayer, while others charge, like Netflix and Zune.

VOD is also known as internet TV, also referred to as "catch-up TV"; it is a way of broadcasting TV programmes over an internet connection. Viewers have the choice of either streaming programmes or downloading them directly onto their PC or other device. Among the most popular internet TV providers are BBC iPlayer, 4OD and 5 on Demand.

4OD, for example, is an internet broadcast service that offers streams and downloads of programmes from Channel 4, E4 and More4 for 7 – 30 days after they are first broadcast. 4OD operates on Windows (XP, Vista, 7), Mac OS X and Linux, and is available at channel4.com, Virgin Media, TalkTalk TV, BT Vision, PlayStation 3 and iPad, although the number of days the programmes are available may differ between them.

According to http://www.rapidtvnews.com, Channel 4 recently (at the time of writing, 14th November) reported that the number of 4OD views had gone up by 2 million, a continued rise in viewing across all platforms from the 31 million recorded in the previous month; Hollyoaks is ahead in the popularity charts with 2.3 million views.

More generally, http://thenextweb.com reported that "Visits to online video sites grew by over a third in the uk over the past year, with 785 million visits to websites such as youtube in september alone".

People have been interested in the idea of internet TV ever since the creation of the modern internet, but it never came to be until now, when the technology has caught up with the idea and subsequently boomed.

Streaming is when a video is converted into a specific code that is put into a container bitstream, which is then streamed continuously to a client computer (the audience).
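A toy model of the difference between downloading and streaming (the byte string and chunk size are made up for illustration): a download hands over the whole file at once, while a stream yields it piece by piece so the viewer can start watching before the transfer finishes.

```python
# A generator stands in for the continuous stream: the client
# consumes each small chunk as it arrives instead of waiting
# for the whole file.
def stream(video_bytes, chunk_size=4):
    for i in range(0, len(video_bytes), chunk_size):
        yield video_bytes[i:i + chunk_size]

video = b"pretend-this-is-video-data"
chunks = list(stream(video))
print(len(chunks), b"".join(chunks) == video)  # 7 True
```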


Strengths: choice, as the viewer is given free rein to watch whatever they want at any time; much of it is free; and accessibility is not a problem, as it can be viewed on trains, boats and planes.


Weaknesses: loss of advertising takes a hit out of the broadcast provider's revenue, as people would rather have something for free than sit through adverts. You need a large bandwidth, otherwise the programme can be disrupted by loss of connection, and only recently have large bandwidths become readily available. It costs money to set up, run and maintain in order to keep up with consumer demand. Illegal streaming robs providers of their profits; sites like tvlinks and surfthechannel are examples of such sites.






Based on the way the television industry is heading, it seems Digital is becoming the dominant broadcasting service. The digital switchover occurring in Britain is scheduled to end on the 24th of October 2012. However, consumers are becoming more and more keen on VOD (Video On Demand), which allows them to watch what they want, when and where they want. This is good news for consumers but not so great for the advertisers who rely on revenue from broadcast advertising.

Media 101: Digital Recording

The Sony DSR-250P is primarily a professional broadcast camera. This camera can be used both on location (outdoor) and in a studio (indoor).

The 250P has 3 CCDs (Charge-Coupled Devices); each of these chips reads the light values of one of the primary colours (red, green, blue). The advantage of having 3 CCDs is that you get 450,000 pixels per chip, which contributes to the high sensitivity, good signal-to-noise ratio and reduced smear of the DSR-250P.

In order to use the camera you must first turn the power switch (2) to the on position and then turn the “mode” switch (3) to camera.


It is possible to swap between manual and automatic features. To be in manual mode switch off all Auto switches as displayed in the image below.

In order to focus the camera manually, switch focus to manual and turn the focus ring to adjust the focus. To zoom in and out manually, rotate the zoom ring until you have the desired framing for the shot (refer to the image above for the position of the features mentioned in this paragraph). Other zoom options include the "rocker buttons", which enable a much smoother zoom (see below).

In order to control how light or dark an image is, use the iris ring. If the image is still too dark or bright, check the ND filter switch; for indoor shooting it should be off.

The F-stop is the F number on the viewfinder monitor, and it describes how much light is getting through the iris. A low F number means the iris is open wide and lots of light is coming in, which can over-expose the picture; a high F number, e.g. 11 to 20, means the aperture is nearly closed and not letting in much light, which creates a dark image.
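The relationship can be sketched numerically: the light admitted falls with the square of the F number (a simplified model that ignores lens transmission losses).

```python
# Light admitted relative to f/1.0; halving the F number lets in
# four times the light, and each standard one-stop step
# (e.g. f/4 -> f/5.6) roughly halves it.
def relative_light(f_number, reference=1.0):
    return (reference / f_number) ** 2

for f in [2.0, 2.8, 4.0, 5.6, 8.0, 11.0]:
    print(f"f/{f}: {relative_light(f):.4f} of the light at f/1.0")
```

Each step down the printed list admits roughly half the light of the one before, which is why small changes of the iris ring have such a visible effect on exposure.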

In order to use white balance you must find the white balance button (5); press the button and an icon will appear on the viewfinder monitor, showing that white balance is on. Then zoom in on a white object until the icon on the viewfinder monitor stops flashing, then release the switch.

There are presets that represent outdoor light (the sun, 5800 K) and indoor light (a lightbulb, 3200 K). This enables you to swap the white balance quickly between the two different types of lighting (see diagram 3 above).

In order to control the brightness of a shot without having to rely on natural interior/exterior lighting, you can use the gain switch located next to the white balance button. It is marked H, M and L (High, Medium, Low) and can be used to boost the signal from the camera to artificially create a brighter image.

The Sony DSR-250P records in the following formats: DVCAM, DV and Mini DV cassette. You can also take still images by switching from camera to memory mode. The images can be stored on M2 memory cards or cassettes.

External power sources can be connected by plugging into a socket on the back of the camera. To put in battery packs there is a slot just above the external power socket.

The Sony DSR-250p camera also has an external microphone. The input for the microphone is in the socket marked mic in 48v (see Diagram below).

To adjust audio levels, use the two small wheels on the side of the camera near the back, marked CH-1 and CH-2. More audio controls can be found beneath the drop-down door where the audio wheels are also located (see picture below).

In order to view your footage straight away, the camera has an LCD monitor/viewfinder, or it can be plugged into an external monitor. The magenta wire in the picture below shows the location of the video-out sockets; this is commonly used when filming in a studio.

The Sony DSR-250P camera will accept the following visual interfaces: S-video, coaxial and composite. The camera can also shoot in the following TV standards: PAL and NTSC.
Media 101: Digital Editing
Exporting by definition means "to take away"; when we export a file out of video editing software we are putting the movie into a format which we can take out of the program. Exporting is the final phase of the post-production process. Everything in the video (shooting, editing, special effects, audio, colouring, lighting, titles) has been accounted for and must now be exported. For this course we use Final Cut Pro, which has many different export options. The file format you use will depend on the task the file is going to be used for and the quality required.

The definition of compression is to reduce the size of something; in media production this means reducing the file size. In media you will sometimes be required to compress a video, for example if you want to put it on YouTube, but you need to be careful because the video quality can be degraded during the compression process.
File Types:
Video – for the motion pictures the file type is MPG. An MPG file is derived from the MPEG (Moving Picture Experts Group) video/audio compression format, which is designed to provide broadcast-quality motion video. MPG files carry the extension .mpg, which you will see in programs like Final Cut Pro, which I used to make my video production.
Still images – for the still image files I used a file type called JPG, which comes from JPEG (Joint Photographic Experts Group). JPG files are very small and easy to move around on memory sticks, and suitable to send to people via email. JPG can compress a file down to 1/10 of its original size, but this comes at a price, as JPG is lossy: during compression some of the image's quality is lost.
Sound – all of the sound files for my video production are .wav files. Also known as waveform audio files or wave files, they were first introduced in 1991 with Microsoft Windows 3.1.
Data transfer – FireWire, for example, is a cable that can be connected to a camera and/or a tape deck. It can be used to log and capture files, transferring the video footage onto a PC.
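The 1/10 compression figure mentioned above is a typical ballpark rather than a guarantee (the real size depends on the image content and quality setting), but the arithmetic shows why it matters; the image dimensions here are an assumed example.

```python
# An uncompressed 24-bit (3 bytes per pixel) image versus the ~1/10
# file size a typical JPEG achieves.
width, height, bytes_per_pixel = 1920, 1080, 3

uncompressed = width * height * bytes_per_pixel       # bytes
jpeg_estimate = uncompressed // 10

print(f"uncompressed: {uncompressed / 1e6:.1f} MB")   # 6.2 MB
print(f"JPEG (~1/10): {jpeg_estimate / 1e6:.1f} MB")  # 0.6 MB
```

That order-of-magnitude saving is what makes JPGs practical to email or carry on a memory stick.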

Evaluation: The Journey of making a Final Cut Pro Video

When I was working on my Sony DSR-250P project, we were required to make a video with audio that guided people through the camera's functions. I was nervous, as I seldom get along with technology in a working environment, and I was unsure if I would be able to keep up with all the details and references I would have to make in order to get a distinction grade.

When writing a script for the video production I used the production manual and Iain Bruce's own words to help me get a straightforward explanation of where parts of the apparatus could be located and what their functions were. At the time of writing it has been at most 3 months since I finished the video recording, and I am a little sketchy on some of the details, but I will do the best I can to remember how I completed the project.

Iain Bruce (our tutor) was taking us out of class in pairs to record our videos. I was paired with Cherie, and we used a Mac with an inbuilt microphone to record the audio; after the audio was recorded we transferred it to Iain's memory stick. We then took it in turns to record and take pictures of the various features of a Sony DSR-250P camera, using the exact same kind of camera. I recorded all the parts we were required to cover in the assignment, including white balance, focus and the 3 CCDs.

After I obtained the footage on cassette I returned to the classroom, and Iain provided me with a tape deck to transfer the footage to the Macs in the classroom. I then took the video and audio files and put them into Final Cut Pro, where I arranged them to make my video tutorial for the Sony DSR-250P camera. After the video and audio were arranged, the result was compressed into a MOV file before being exported and uploaded to my YouTube channel under the name DSR 250p tutorial; it can be viewed on my blog.

In conclusion I feel that I have gained a basic understanding of recording technology and how to apply it to the creation and production process of a media product. I understand some basic technical details although I will need to hone them so I don’t have to constantly call upon others for help.

