Video Frame Rates and Display Refresh Rates for Beginners

Jan 26 2011

We hope you've been enjoying our series of Beginner's Guides for HTPC and Home Theater. As part of the series, we’ve previously discussed video resolutions and how video information is displayed on a screen for a frame of video in our guide, Video Resolutions for Beginners. What we didn’t delve into much was the rate at which video frames are captured, or, in other words, the video frame rate. This guide covers the basics of frame rates and how displays handle them. We’ll try to cut through the marketing buzzwords like 120Hz, 240Hz, 600Hz sub-field drive, etc. so that you can make a more informed decision when purchasing your next display and ensure an optimal viewing experience.

 

What is frame rate?

Frame rate is simply the rate at which frames of video are captured and transmitted, generally expressed in frames per second (fps). The most common film and video frame rates are 23.976 fps, 25 fps, 29.97 fps, 50 fps and 59.94 fps.

When dealing with interlaced video, such as the NTSC 480i and ATSC 1080i standards with an effective frame rate of 29.97 fps, the video fields are captured and transmitted at twice the effective frame rate, for a field rate of 59.94 fields per second.
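As a tiny illustration (the function name is our own, not from any standard), the field rate is simply twice the effective frame rate:

```python
# Field rate of interlaced video: two fields are sent per effective frame.
def field_rate(effective_fps):
    return 2 * effective_fps

print(field_rate(29.97))  # 59.94
```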

 

If only half of an image is captured and sent, how does this interlacing stuff even work?

When dealing with interlaced video fields, a progressive display (which describes most, if not all, consumer displays on today’s market) performs what is called de-interlacing to reconstruct a whole frame from each half-frame, or field. De-interlacing is a complex process, and the algorithms that perform it improve all the time. If you’ve read recent MissingRemote reviews of GPUs and other devices, you will have seen that we use the HQV benchmark which, in part, measures a device’s ability to deinterlace.

As pointed out in Video Resolutions for Beginners, interlacing is a form of compression because only half the vertical lines of the entire image (the size of a single field) are conveyed at the field rate. This compression works well for static images because the frame can be reconstructed perfectly; in fact, it is effectively lossless in that case. With fast motion in the image, interlacing becomes more problematic because the missing lines of each field must be estimated by the device that deinterlaces the image.

The following image shows two consecutive fields of a moving picture displayed as a single frame (a simplistic deinterlacing method known as weaving). Notice the artifacts, specifically the jagged lines or “jaggies”. More sophisticated deinterlacing algorithms can considerably improve this image.
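To make the weave method concrete, here is a minimal sketch in Python (our own illustration, assuming each field is simply a list of rows of pixel values):

```python
# "Weave" deinterlacing: interleave the lines of two consecutive fields.
# The top field holds the even-numbered lines, the bottom field the odd ones.
def weave(top_field, bottom_field):
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)     # even line from the first field
        frame.append(bottom_row)  # odd line from the second field
    return frame

top = [[1, 1], [3, 3]]      # lines 0 and 2
bottom = [[2, 2], [4, 4]]   # lines 1 and 3
print(weave(top, bottom))   # [[1, 1], [2, 2], [3, 3], [4, 4]]
```

Because the two fields were captured at different moments in time, weaving a moving object produces exactly the comb-like “jaggies” described above.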

 

Why are there so many frame rates? Why not just settle on a single frame rate to make this easy?

The top reasons for the various frame rates in use today are history, technological limitations, cost and artistic choice.

Historically, the frame rate for film was 24 fps, chosen as a minimum acceptable rate due to the high cost of film stock. A higher frame rate is said to have better temporal resolution than a lower one, where temporal resolution describes how well a frame rate captures moving objects. The historical film frame rate has poor temporal resolution compared to faster rates such as 59.94 fps, but using the faster rate would have consumed more than twice as much film for the same running time.

When television came along, there were technological reasons why the old NTSC standard chose an effective frame rate of 29.97 fps. The technology and bandwidth required to capture, transmit and display video acceptably on televisions were major factors in the choice.

Later on, the PAL standard was developed in Europe and used an effective frame rate of 25 fps. Again, the frame rate was chosen and limited for some of the same reasons as NTSC.

When high-definition video standards were developed, frame rates were chosen yet again due to the state of technology and bandwidth limitations as well as for compatibility with film and NTSC or PAL video sources. HD frame rates can be as high as 59.94 fps.

In broadcast video today, the most commonly used formats are 720p, specified at 59.94 fps, and 1080i, specified with an effective frame rate of 29.97 fps. In Europe, Africa and much of Asia, 25 fps and 50 fps are used in lieu of 29.97 fps and 59.94 fps. Networks have settled on either 720p or 1080i for transmitting their content. Before we look at the tradeoff, consider spatial resolution, the term used to describe the number of pixels of information in the video frame. A vertical resolution of 1080 lines is said to have superior spatial resolution compared to a vertical resolution of 720 lines. With that out of the way, we can see that broadcasters choose between spatial resolution and temporal resolution: 720p has the superior temporal resolution for fast-moving video, while 1080i has the superior spatial resolution.
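A back-of-the-envelope calculation makes the tradeoff concrete. The sketch below (our own illustration, using the nominal resolutions of each format) compares raw pixels conveyed per second; remember that a 1080i field carries only half of the frame's lines:

```python
# Raw pixels conveyed per second for a given format (illustrative only;
# ignores compression, blanking intervals, etc.).
def pixels_per_second(width, height, rate, interlaced=False):
    per_image = width * height // 2 if interlaced else width * height
    return per_image * rate

print(pixels_per_second(1280, 720, 59.94))                    # 720p
print(pixels_per_second(1920, 1080, 59.94, interlaced=True))  # 1080i fields
```

The two totals come out in the same ballpark, which is why the choice between the formats is really a choice between motion handling and spatial detail rather than raw data rate.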

With Blu-ray disc, the most often used frame rate is 23.976 fps with 1080 lines of vertical resolution.

 

What about the display refresh rate?

While video frames are captured and transmitted at some rate, a display redraws its image at a rate called the refresh rate, which may differ from the frame rate. Refresh rate is generally expressed in cycles per second, or Hertz (Hz). In today’s display market, televisions typically accept a maximum input rate of 60 fps.

You might be wondering about the refresh rate figures bandied about by display manufacturers, such as 120Hz, 240Hz, 480Hz, 600Hz sub-field drive, etc., and what they mean if the display only accepts 60 fps at most. Mostly, these figures are marketing specmanship, so it is important to understand exactly how a display reaches those refresh rates. Liquid crystal display (LCD) technology, including LED-backlit LCD, is typically marketed as 120 Hz or 240 Hz. These displays accept a 60 fps signal and either show each frame multiple times or interpolate frames to a higher rate and display the interpolated frames. Interpolation is an attempt to increase temporal resolution, but it also causes what is often called the “soap opera effect”, which can look especially unnatural with film-based content. Plasma displays today tout a 600Hz figure to one-up the LCD marketing, even though the technology has always been superior where motion is concerned.
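The two strategies can be sketched as follows (a toy illustration using brightness values, not any vendor’s actual algorithm):

```python
# Two ways a 120 Hz panel can fill its extra refresh slots from 60 fps input.

def repeat_frames(frames, factor=2):
    """Show each incoming frame 'factor' times; no new information is added."""
    return [f for f in frames for _ in range(factor)]

def interpolate_frames(frames):
    """Insert a synthetic in-between frame (here, a simple average of
    neighboring values) -- the source of the "soap opera effect"."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2)  # synthesized frame
    out.append(frames[-1])
    return out

print(repeat_frames([10, 20], 2))        # [10, 10, 20, 20]
print(interpolate_frames([10, 20, 30]))  # [10, 15.0, 20, 25.0, 30]
```

Real motion interpolation estimates object motion between frames rather than blending pixel values, but the principle is the same: the display is inventing frames the source never contained.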

As a final note, 24p content, which is almost always 23.976 fps, is accepted by displays and either shown at a rate of 59.94 Hz using a telecine process known as 2:3 pulldown, or at an even multiple such as 96 Hz. A detailed discussion of 24p and how it should be properly viewed can be found in our article 24p: What You Should Know.
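The 2:3 pulldown cadence itself is simple to sketch: alternating film frames are held for 2 and then 3 video fields, so every 4 film frames become 10 fields (turning 24 fps into 60 fields per second). A minimal illustration:

```python
# 2:3 pulldown: film frames A, B, C, D become 10 video fields.
def pulldown_2_3(frames):
    fields = []
    for i, frame in enumerate(frames):
        count = 2 if i % 2 == 0 else 3  # alternate holding 2 and 3 fields
        fields.extend([frame] * count)
    return fields

print(pulldown_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

The uneven 2-then-3 hold times are what cause the telecine judder mentioned later in the article; displaying 24p at an even multiple like 96 Hz holds every frame for the same duration and avoids it.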

 

I’ve got a home theater PC (HTPC), set-top box (STB) or other source device. What is the frame rate of the video coming out? What should I set the rate to?

First, note that source devices such as an HTPC often refer to the rate sent to the display as the refresh rate and express it in Hz. An HTPC must be set to a fixed output rate. In the US, the best rate for most content is 59 Hz, which is shorthand for 59.94 Hz; this setting allows 29.97 fps and 59.94 fps video to be displayed properly. It will also display 24p content, though subject to telecine judder, so when a display can properly handle 24p, it is often desirable to select a rate of 23 Hz (meaning 23.976 Hz) instead. For traditional PAL countries, 50 Hz will be preferred.
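That advice can be condensed into a hypothetical helper (the function and its parameters are our own illustration, not a setting in any particular HTPC software):

```python
# Pick an HTPC output refresh rate for given content (US-centric defaults).
def pick_refresh_hz(content_fps, display_handles_24p=False, pal_region=False):
    if pal_region:
        return 50.0      # 25 and 50 fps material in traditional PAL countries
    if abs(content_fps - 23.976) < 0.01 and display_handles_24p:
        return 23.976    # native 24p, avoids telecine judder
    return 59.94         # covers 29.97 and 59.94 fps material

print(pick_refresh_hz(23.976))                            # 59.94 (2:3 pulldown)
print(pick_refresh_hz(23.976, display_handles_24p=True))  # 23.976
```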

Other source devices can often be set to output the content’s native resolution and frame rate. This can be desirable because an external device, such as the video processor in an audio-video receiver (AVR) or the display itself, may offer superior scaling and processing of the image. The only way to know for sure is to experiment and determine whether the source device or the display offers the best performance.

If there's a question we didn't answer or something is unclear, let us know in the comments below.

Comments

Greetings all.  I could use some help.  I'm new to the 1080/24p world as I just upgraded my 720p plasma to a new LED 1080p 240Hz TV (LG 476500).  I have an HTPC that I just can't seem to get set properly.

I use it for CableCARD/HD content in Media Center as a dedicated HTPC/DVR.  It is also my primary Blu-ray player via ArcSoft TMT5.

The graphics card is an ATI 5570, with HDMI 1.3b cables running from the HTPC > AVR > TV.

Should I be setting the ATI settings to 24 or 60?  It appears that TMT5 auto-switches to 1080/24p when I play Blu-rays; but then it has a bad choppy effect every few seconds.

GRRRR!  What should this all get set to?!  Thanks!

Remember that the TMT auto-switching refresh rate feature is still beta. I have read several reports that using the latest Catalyst driver from AMD fixed some issues with the auto-switch feature. If you're still seeing the issue, you can also turn the feature off for now.

You may also find this post regarding 24p of interest. You will want to make sure that your LG display is using the "Real Cinema" mode for proper 24p handling. Another thing to be aware of is that the PC refresh rates available will be 23Hz and 24Hz. You really want 23Hz for BD playback; it is the rate that equals 23.976.

Thanks Aaron-- I knew you could point me in the right direction.  I tried setting the panel to Real Cinema-- no change in effect.  Setting ATI CCC to 23 won't hold-- it jumps right to 24Hz.

So, I think for now I'll disable the auto-switch feature in TMT and wait until the technology catches up on the playback side a bit.
