A format war has been raging across Earth for years. What can be done to bring peace to the world of HDTV?
A lot’s been written about HDTV, high definition television, but the New York Times tech correspondent put it in a nutshell: “On each programme,” he wrote upon receiving his HDTV set, “we counted the pores on the host’s nose.”
An HDTV set has many more lines than a standard set, and the more lines you have, the better the picture. We don’t have it in the UK yet, but if you’ve been to the States, South Korea, Japan, or Australia in the past few years you’ve probably seen it for yourself. Going from normal British telly to high definition TV is like going from black and white to colour.
Your bog-standard TV set in the UK has 625 lines, of which you can only see 576. This is the PAL standard. A TV in the States has 100 fewer at 525 and is known as the NTSC standard. High definition TV sets have up to 1,080 lines. This gives a much more detailed picture.
After years of format wars across the globe, and in particular in the US, the HDTV world is settling down – a bit. The US has adopted a Common Image Format based on using the maximum number of lines, 1,080. But there is more to it than that.
Clear as mud
Three image formats currently dominate HDTV production: 1,920-x-1,080 in 50 or 60i varieties; 1,920-x-1,080 in 24, 25, or 30p; and 1,280-x-720 in 60p. Or, to use their shorthand, 1080i, 1080p and 720p.
The first number refers to the resolution. So 1080i is 1,920 pixels wide by 1,080 lines deep. The second number is the number of fields (or, for progressive formats, frames) displayed on the screen per second. Because electricity in the States runs at a frequency of 60Hz, HDTV in the US displays at either 60 or 30 fields a second; in Europe (50Hz) it’s 50 or 25 fields a second. 24 comes from film.
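That naming scheme can be unpacked mechanically. A small sketch (the lookup table and function names here are illustrative, not any broadcaster’s actual code) showing how the shorthand maps to full resolution and scan type:

```python
# Illustrative only: unpack HDTV shorthand like "1080i" or "720p"
# into the full resolution and scan type named in the article.
FORMATS = {
    "1080i": (1920, 1080, "interlaced"),
    "1080p": (1920, 1080, "progressive"),
    "720p":  (1280, 720,  "progressive"),
}

def describe(shorthand, rate):
    """Return a human-readable description of a format at a given rate."""
    width, height, scan = FORMATS[shorthand]
    # Interlaced rates count fields; progressive rates count whole frames.
    unit = "fields" if scan == "interlaced" else "frames"
    return f"{width}x{height} {scan}, {rate} {unit} per second"

print(describe("1080i", 50))  # 1920x1080 interlaced, 50 fields per second
print(describe("720p", 60))   # 1280x720 progressive, 60 frames per second
```

So “1080i/50” and “720p/60” are just compressed spellings of those two lines of output.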
The letter at the end is either “i” for interlaced, or “p” for progressive. These are two different ways of getting a picture onto a screen. Interlacing is where the display writes alternate lines – lines one, three, five, and so on, then lines two, four, six and so on – to build up the whole picture on screen. Half the picture is drawn with every refresh, resulting in a complete frame being drawn 25 times per second.
The technology was developed because early TV tubes couldn’t draw the whole picture before the top began to fade. This is how standard definition works. It’s also why TVs “flicker”.
Progressive is where the entire image is written in line order and then displayed on the screen, so lines one, two, three, four, up to 1,080 are written and then the image is displayed. This gives a smoother image without flicker, and it looks like film. This is how your computer monitor displays.
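The difference between the two scan types boils down to the order in which lines reach the screen. A toy illustration (not broadcast code – just the line ordering, for a hypothetical six-line frame):

```python
# Toy illustration of scan order for a tiny six-line frame.

def progressive_order(lines):
    """Progressive: every line in sequence, top to bottom."""
    return list(range(1, lines + 1))

def interlaced_order(lines):
    """Interlaced: odd-numbered lines first, then the even-numbered ones."""
    odd = list(range(1, lines + 1, 2))   # first field: lines 1, 3, 5...
    even = list(range(2, lines + 1, 2))  # second field: lines 2, 4, 6...
    return odd + even

print(progressive_order(6))  # [1, 2, 3, 4, 5, 6]
print(interlaced_order(6))   # [1, 3, 5, 2, 4, 6]
```

Each interlaced “field” carries only half the lines, which is why two fields at 50 a second add up to 25 complete frames.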
Sony kicks off
So far, so clear. The confusion – and wading through the websites, newsgroups and industry press releases, it is clear that the HDTV world is very confused – comes when you start to look at how these three different standards are being applied.
1080i has been adopted as the common image format by the States and, to a large degree, by Australia and Asia. The big technology developers and manufacturers like Sony have also been happy to adopt this standard. But, just to make things more complicated, Europe has not.
Instead, at a European Broadcasting Union (EBU) conference towards the end of last year, the EBU came down firmly in favour of 720p. This nearly gave Sony Europe’s director of strategic planning, John Ive, a heart attack: “We don’t need this debate,” he said. “Movies, entertainment, kids, current affairs, docs, and even sport work wonderfully well in interlaced form.”
He would say that of course – his company is the biggest supplier of 1080i production and broadcast gear in the world.
In turn, that made Philip Laven, director of the EBU’s technical department, perform a sharp U-turn. The final decision was turned into a “work in progress” and the issue was left fudged, again.
“Why is Europe promoting 720 progressive while the rest of the world is getting on with 1080 interlaced?” you might ask. Well, to clarify their position after their bun-fight with Sony, EBU released a statement in January this year. It said that most consumers in Europe are moving towards widescreen, non-CRT, flat panel TVs. All these flat panel displays and HDTV projectors will be progressively scanned.
Because the displays are progressively scanned, said the EBU, broadcasters should broadcast in progressive. When you convert from interlaced to progressive you lose quality, and this conversion is done in the consumer’s equipment – it is the quality of these filters which determines the quality of the image. Much better to broadcast in progressive and display in progressive.
Another good reason – and perhaps the main one – for recommending 720p over 1080i is bandwidth. With current compression technologies it is less bandwidth-heavy to broadcast 720p.
So, despite Sony’s heart attack, EBU went on to recommend that the preferred standard for HDTV emission in Europe is 720p/50. However, it also mentioned the need for “flexibility” and the need to be aware of and support “the multiplicity of HDTV formats”.
Hang on a minute…
Let’s think about this for a minute. The States and the rest of the world are broadcasting in 1080i. Some of the US sports channels broadcast in 720p/60 because they get better motion portrayal. The nascent HDTV industry in Europe is also working in 1080i: The BBC test broadcasts are in 1080i; Sky is promising to broadcast in 1080i and 720p in time for the 2006 World Cup in Germany; and a dedicated HDTV satellite channel called, wait for it, Euro 1080 launched in early 2004.
So why is EBU adamant that progressive is better? It would argue that it’s looking towards the longer term. In the future, the argument goes, compression technologies will be such that 1080p (the best quality of the three formats) will be easily piped into homes.
It also argues that the difference in quality to the viewer is negligible: 720 and 1080 lines deliver the same subjective vertical resolution, and the “interline twitter” of interlaced images reduces image quality.
The EBU agrees that 1080i gives a wider image, but argues that cameras and displays today only offer 1,440 pixels and use funky technology to stretch it out, not the 1,920 promised. Sony of course points out that the screens of tomorrow will be the full 1,920 pixels wide.
EBU argues that progressive gives much improved motion portrayal, especially for slow motion – you don’t get the blur of interlaced images. And finally it argues that it’s easier to convert from progressive to interlaced than vice versa.
“Suppliers of HDTV equipment have complained that EBU’s support for progressive scanning is damaging the case for 1080i/25 and the 1,920-x-1,080 common image format,” EBU’s Philip Laven said in defence of his position. “In fact, EBU has recognized that 1080i/25 services will operate alongside 720p/50 services – and strongly hopes that 1080p/50 will eventually become the norm.”
Where does this leave you and me? Well, as consumers we should be OK. A new “HD Ready” label has been produced by the European Information and Communications Technology Industry Association (EICTA). Supported by all the major Euro broadcasters, including Sky, the label guarantees that technology from different manufacturers is future-proof.
If a screen has an HD Ready label it has a minimum resolution of 720 lines, and is capable of accepting 720p/50/60 and 1080i/50/60.
But for programme-makers things are trickier. The EBU, in its well-funded, fat-bottomed, helpful kind of way, has suggested that HDTV programme makers buy equipment that “should include, at a minimum, 720p/50, 1080i/25 and 1080p/25 systems”. It adds: “HDTV production equipment in the longer term will need to include all of the above and 1080p/50.”
So the future of HDTV in Europe looks like it will be multi-format. Broadcasters will be able to choose, on a programme-by-programme basis, whether to broadcast in 720p or 1080i and consumers shouldn’t need to worry. But what if you’re a programme maker?
Choosing what to shoot
“Forget about 720, 1080 is real high definition,” says Doug Hammond, director of operations at Shooting Partners Group in London. They’ve been shooting HD since 1990 and their definition of HD is 1,920-x-1,080.
“You’ll never make a movie on 720,” says Hammond, “you can on 1080”. From his point of view the EBU format war about 720 isn’t even worth discussing. His industry works on 1080 and that’s it. “We’ve sent some cameras out to Africa for Discovery Channel in the States. They’ll be shooting in 1080/30p or 1080/60i.”
He advises: “Use 1080, shoot one higher and down convert from 60 to 50 rather than bump it up.”

In the States, that bellwether of the broadcast world – the LA porn industry – has been at it for years. At the AVN Adult Entertainment Expo in Las Vegas in January, Bob Christian of Adam & Eve Productions said they’d been shooting in HD for three years. “We shoot at the highest 1080p and then edit on HD equipment. The DVDs are released at standard definition but the all-HD process results in a higher quality image.”
However, not everyone thinks that’s such a good idea. Nina Hartley, a 21-year veteran of the industry who has been in over 650 movies said she wasn’t sure if HD would benefit porn. “HD is not adult friendly,” she said. “Most women in porn are average looking, the same for the guys. I’m not sure how that will hold up.”
|Camera hire (8 weeks)|£10,600|£20,800|£12,700|
|Stock and working copies|£3,100|£36,480|£5,100|
|Shooting sub-total|£13,700|£57,280|£17,800|
|Post (neg cut, grade, conform, finish & masters)|£14,320|£22,720|£14,320|
|Source: Shooting Partners. Rough costs involved in shooting HD, based on an eight-week shoot using 60 rolls of stock.|