So I remember talking to Noen about this a while back, and I need a little bit of clarification. It's my understanding that movies, TV shows, etc. are typically filmed at 24 frames per second, some reality shows are filmed at 30 frames per second, and they're going to film the Hobbit at 48 frames per second. The way this translates to displays (TVs, monitors, projectors, etc.) is that the display either adjusts to the fps of the material or the processor uses some type of pulldown. So if I were watching a 24fps movie on a 60hz display I would get 3:2 pulldown (first frame displayed 3 times, next frame displayed 2 times) to compensate for 60 not dividing evenly by 24. That causes jitter, and it's why you see it when watching a movie on a computer monitor. Ideally you want a 1:1 ratio, so if you're watching 24p content you want 24, 48, 72, 96, 120, etc. for 1:1, 2:2, 3:3, 4:4, 5:5 pulldown.

The part I want to clarify is that all the TV manufacturers are advertising "48hz, 72hz, or 120hz compatible" LCDs. Why? Assuming all that true-motion crap is off, what's the advantage of having 48hz (or 120hz) over 24hz for watching 24p content? An LCD doesn't "flash" like a CRT or a movie theater projector, so there's no need for 2:2 or 3:3 pulldown. I understand that 120hz is the ideal rate because you can do 30x4 (4:4) or 24x5 (5:5), but if I'm watching 24p content on an LCD there's no advantage to 48, 72, or 120 over 24hz... correct? I understand that movie theaters (film) or projectors (not LCD?) will often run at 48hz (every frame displayed twice) or 72hz (every frame displayed 3 times) simply because it reduces flicker, but that's because the frames are actually disappearing and reappearing. With an LCD there's no point in anything higher than 1:1 pulldown. Or am I totally wrong?

Basically the only two types of content I have are 24 and 30. I have my TV set to 24 by default, and for the tiny bit of 30fps material I have, it will switch the TV to 60hz (2:2, because it doesn't support a 30hz refresh). Almost all the time I watch everything at 24hz, 1:1.
9/10/2012 4:00:25 PM
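To put numbers on the pulldown described above, here's a quick Python sketch. It's only an illustration (the pulldown_cadence helper and its tick-counting are made up for this example, not how any TV actually schedules frames): it maps source-frame boundaries onto refresh ticks and reports how many refresh cycles each frame gets held for.

```python
from fractions import Fraction

def pulldown_cadence(refresh_hz, fps, frames=6):
    """Return how many refresh cycles each of the next few source frames is held for."""
    ratio = Fraction(refresh_hz, fps)      # refresh cycles per source frame, e.g. 60/24 = 2.5
    cadence = []
    shown = 0
    for i in range(1, frames + 1):
        ticks = int(i * ratio) - shown     # refresh ticks that land inside this frame
        cadence.append(ticks)
        shown += ticks
    return cadence

print(pulldown_cadence(60, 24))   # [2, 3, 2, 3, 2, 3] -> the uneven 3:2 cadence (the jitter)
print(pulldown_cadence(120, 24))  # [5, 5, 5, 5, 5, 5] -> even 5:5, every frame held equally
print(pulldown_cadence(120, 30))  # [4, 4, 4, 4, 4, 4] -> even 4:4
```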
Lots of marketing confusing all of it. Watch TV and then a Blu-ray on a 240hz TV, a 120hz TV, and then a 60hz TV (assuming LED/LCD). I prefer watching at 60hz for TV, otherwise I see that ultra-fast unnatural motion. Blu-rays I'll watch at 240 sometimes but often will leave it at 60. 120hz is universally bad at all times to me.
9/10/2012 4:22:30 PM
I think you're missing the point of my question. Technically, Blu-rays are usually 24p content, and you want to watch them at 24hz on the TV. Watching them at 48, 72, or 120hz won't make a difference because the source content is still 24p, and LCDs don't "flicker" like CRTs or older projectors, so there's no need to watch Blu-rays at anything higher than 24hz. I think what you're talking about is the "smooth motion" effect that a lot of TVs have on by default. 120hz and 240hz+ TVs aren't actually getting source material any higher than 24, 25, or 30fps; they're just using interpolation to create fake frames and throw them in to fake a higher refresh rate. 120hz is the best rate for everything because it allows 5:5 pulldown for Blu-rays (24p content) and 4:4 pulldown for 30p content (reality TV shows, sports, etc.). If you're watching Blu-rays at 60hz then you're watching them with jitter (3:2 pulldown): one frame shown 3 times, the next 2 times.

And just to add this in: HDTVs don't currently accept an input higher than 60hz unless it's 3D (in which case it's 120hz). Anything higher than 60hz is just the processor in the TV "faking" frames. You can, on the other hand, buy a true 120hz computer monitor that receives a true 120hz signal (from a computer) via dual-link DVI. The only need for it is gaming or 3D movies, though.

[Edited on September 10, 2012 at 4:30 PM. Reason : s]
9/10/2012 4:27:16 PM
Although an LCD does not flash like a CRT (where the electron gun scans the screen to excite the phosphors one at a time), it still refreshes pixels at a certain rate. If the refresh rate is 60hz, then you run into the pulldown issues (some frames are visible for a slightly longer duration than others). At 120hz both 30fps (most non-movie programming) and 24fps (most movies) can be shown with each frame visible for exactly the same duration (4 or 5 refresh cycles each, respectively) without the TV having to adjust its refresh rate.

Now, why all 60hz LCD TVs aren't also 48hz capable, I do not know. Some 60hz TVs do slow down to 48hz when they detect 24p content; in that case, just like at a movie theater, each frame lasts 2 refresh cycles. Rates higher than 120hz are useless for 2D programming, unless you are into artificial smoothing, because the highest content rate commonly available is 60fps. For 3D content a higher refresh rate may reduce flicker.

There are also smoothing algorithms that take 30 or 60fps content and create in-between frames that smooth the motion and make sports and such look more lifelike. This smoothing makes movies look like complete shit, though. The first thing you do when you get a 120hz TV is turn the smoothing off, unless you watch primarily ESPN.
9/10/2012 4:37:24 PM
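Since both of the posts above lean on the fact that 120 divides evenly by both 24 and 30, here's a tiny sketch of that arithmetic (requires Python 3.9+ for math.lcm; the rate list and loop are just for illustration):

```python
from math import lcm

# Smallest refresh rate that shows both 24fps and 30fps content with a
# uniform cadence (every frame held for the same whole number of cycles).
print(lcm(24, 30))  # 120

# Check a few advertised panel rates against common content rates.
for hz in (48, 60, 72, 96, 120, 240):
    even = {fps: hz % fps == 0 for fps in (24, 30, 60)}
    print(hz, even)
# 60hz divides evenly for 30 and 60 but not 24 (hence 3:2 pulldown);
# 120hz is the first rate that handles 24, 30, and 60 all evenly.
```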
Thanks, that's a very thorough explanation, but it still doesn't address my main question: I can't currently set my TV to 48, 72, 96, or 120hz at 1080p. It will only do 1080p24. http://www.highdefdigest.com/news/show/1015
9/10/2012 4:49:43 PM
I think I'm getting what you're saying, but I'm not sure, so I'll try to provide some input.

LCD information is a little bit skewed and marketed poorly. LCDs typically suffer from ghosting and blurring issues because they are slow to refresh. LCD, as you probably know, stands for Liquid Crystal Display. Realigning the liquid crystal matrix, which determines the frequency of light (color) allowed to pass through from the backlight, is a slower process than lighting or adjusting a diode or a plasma pixel. The response of the screen is therefore more or less fixed and slower than that of some other display technologies.

What LCD makers have done is change the signal processing to try to improve the perceived refresh: they manipulate how many frames, and which ones, get sent to the panel, creating interpolated transition images so the picture stays clearer where pixels or segments of pixels are changing rapidly. The 120hz reflects the frequency at which this happens, so 120hz signal processing is actually very different from 60hz because the image is sampled (and replaced) for these purposes more often. If your TV allows adjustment, you'd actually be getting different frames altogether, not just different numbers of the frames from your signal.

This may be a clearer explanation, since I'm not sure I did that all that well: http://gizmodo.com/290237/the-trouble-with-lcd-tvs-motion-blur-and-the-120hz-solution

Just go plasma: better color, no ghosting, deeper blacks, cheaper set.
9/10/2012 5:22:41 PM
That's why I said visually there's no difference.

You're confusing technologies. Backlight strobing (probably better discussed alongside pixel response time) is different from image processing (refresh rate). Pixel response time is native to the type/quality of the panel, whereas image processing is part of the TV's CPU or whatever it's called. You could have 480hz, but if pixel response isn't up to par you'll see ghosting and image blurring. It's not directly related to your question; I just brought it up because it's a newer way of judging how well LCDs respond.

I dunno, this is my understanding but maybe I'm wrong. It makes sense to me.

[Edited on September 10, 2012 at 6:19 PM. Reason : .]
9/10/2012 6:11:40 PM
I just meant you slow it down by changing the settings. The native refresh is probably 60Hz. And yea, computers can change all of the output on the fly just like the TV.
9/10/2012 7:19:34 PM
Yeah the only thing that is ever displayed on my HDTV is coming from a computer. I never thought I could notice 3:2 pulldown until I actually watched something at 24hz and noticed how much smoother it was.
9/10/2012 7:24:58 PM
So I guess my point then is that most TVs more than a couple of years old can't accept a true 24p signal from a computer or Blu-ray player, so either the TV or the device (the Blu-ray player) is doing 3:2 pulldown to show a movie, which gives the picture a very subtle stuttering effect. I really never thought about it until I actually switched my TV to 24hz last year and everything was suddenly smooth.
9/10/2012 7:58:34 PM
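For what it's worth, that "very subtle stuttering effect" can be put in milliseconds. This is just a back-of-the-envelope sketch using the 3:2 cadence discussed earlier, not a measurement of any particular TV:

```python
# How long each source frame stays on screen, in milliseconds, when 24fps
# content is shown with 3:2 pulldown on a 60hz panel vs. natively at 24hz.
refresh_ms = 1000 / 60                 # one 60hz refresh cycle, about 16.7 ms
pulldown_32 = [3 * refresh_ms if i % 2 == 0 else 2 * refresh_ms for i in range(6)]
native_24 = [1000 / 24] * 6            # every frame held about 41.7 ms

print([round(t, 1) for t in pulldown_32])  # [50.0, 33.3, 50.0, 33.3, 50.0, 33.3]
print([round(t, 1) for t in native_24])    # [41.7, 41.7, 41.7, 41.7, 41.7, 41.7]
# The alternating 50ms/33ms hold times are the subtle stutter being described;
# at 24hz (or any whole multiple of 24) every frame is held for the same time.
```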
24p should not be "smooth". Are you sure the smoothing is not on?
9/10/2012 10:52:39 PM
^Smooth as in I don't see the 3:2 pulldown stutter I see on most people's TVs that are playing 24p content on a 60hz HDTV. Mine is set at 24hz.

^^I still prefer LCD for various reasons. At the time I bought my LED Samsung it was exactly what I wanted. Very happy with it.

And I'm pretty sure LED-backlit LCD panels have surpassed plasma (yes, for a premium), but I'll take the lower power consumption and lighter weight over the cheaper cost of a plasma any day. Pretty sure the gold standard for black levels is now the Sharp Elite Pro LED LCD series, which surpasses any plasma.

[Edited on September 10, 2012 at 11:43 PM. Reason : Hm]
9/10/2012 11:26:59 PM
Looking at the previous posts, I think you are under the assumption that an LCD monitor does not refresh the screen when several identical frames come in a row. It does. I am not sure about the exact physics behind why, maybe the backlight is actually pulsing, or maybe the LCD pixels are flashing, but it does flicker even when the image does not change from cycle to cycle. I see it all the time when filming at a <200fps shutter speed. If you have access to a digital camera, point it at your computer LCD while it's displaying a still picture and see what happens.

So Hz and frames per second (fps) should not be lumped together, for the sake of clarity, although technically Hz means just that, cycles per second. Refresh rate in Hz is the frequency of the monitor "flashing" the image, and frame rate in fps is the quantity of individual frames that come out of your source every second. So you could have watched the TV at 24fps, but most likely at 48Hz (or 96 or 120), as 24Hz would have induced a noticeable flicker. Watching a 24fps signal at 60Hz would have created a slight stutter due to pulldown.
9/11/2012 2:43:51 AM
^I'd be curious to see the methods they use. The measurement is candela per square meter (cd/m2). This is a measure of luminance ("brightness," or energy output) over a region, but depending on the size of the region and the type of images used for the testing it could be misleading.

In a region of half blacks and half full whites I expect a plasma will put out more total luminance than an LCD in pretty much every case, but depending on the pixel arrangement that could have huge implications for how the blacks are perceived and judged. If the pixels are interspersed evenly, say

BWBWBWBWBWBWBWBWBWBWBWBW

then the black pixels will get washed out regardless, and whichever screen has the lowest overall luminance will have the best blacks.

But let's say we have a region like

BBBBBWWWW
BBBBBWWWW
BBBBBWWWW

Here the pixels along the boundary get washed out, but the others show whatever the base luminance level for blacks is. On a plasma this should ultimately be 0 cd/m2 since the pixel is off. Realistically there would be reflection from ambient light, so a pure 0 is impossible, but you get the idea. Even the best backlit system is probably going to let some light through. Therefore the block of black pixels on the left would look blacker even if the total luminance over the region were the same for both displays.

The key then is defining what the average case is, between larger regions and more interspersed regions. And for all I know maybe they used a pure black screen for the test, but I don't know.
9/11/2012 10:49:41 AM
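One way to see why the measurement pattern matters: a toy model in Python with invented luminance numbers (the 100 / 1.0 / 0.05 / 0.0 cd/m2 figures below are assumptions for illustration only, not real panel measurements):

```python
# Toy numbers only: assume a white pixel is 100 cd/m2, an LCD "black" pixel
# leaks 0.05 cd/m2, a plasma black is 0.0, and any black pixel directly next
# to a white one gets washed up to about 1.0 cd/m2 by glow/scatter.

def darkest_black(pattern, panel_black, glow=1.0):
    """Darkest black a meter would see in a 1-D pixel pattern like 'BWBW' or 'BBBWWW'."""
    levels = []
    for i, p in enumerate(pattern):
        if p != "B":
            continue
        neighbours = pattern[max(0, i - 1):i + 2]
        levels.append(glow if "W" in neighbours else panel_black)
    return min(levels)

for name, pattern in (("interspersed", "BW" * 12), ("block", "BBBBBWWWW")):
    print(name, "plasma:", darkest_black(pattern, 0.0), "lcd:", darkest_black(pattern, 0.05))
# interspersed -> both read about 1.0 (glow dominates, the panels look the same)
# block        -> plasma 0.0 vs lcd 0.05 (interior pixels expose the true black level)
```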
Ok, I see your point. So I know that "full array" (truly backlit) LED LCDs, not edge-lit like most thin HDTVs, have local dimming and actually shut off portions of the backlight for pure black.
9/11/2012 10:56:51 AM
^Yeah, if they can set the filter to allow no light from the backlight then there's no reason a backlit screen couldn't get the same blacks as a plasma on a single image.

The problem is how good they are, on the fly, at adjusting to moving images and regions of black so they can shut that region off. Does the region have to fit a certain shape or size, for instance?

Let's say a panda is walking across the screen; that's reasonably slow and could be handled well. But then let's say a soccer ball is rolling off a pass and moving fast; here the processing is going to have to be really good to dim something that small and fast-moving. I just think it's harder to accomplish over moving images. Also, half the battle is a visual perception thing that varies slightly from person to person depending on their own vision, so once things get to a certain point it's a moot point anyway.

I just like talking about this stuff (<< CS grad with a focus on Visualization and Perception).

[Edited on September 11, 2012 at 11:19 AM. Reason : can't spell]
9/11/2012 11:03:15 AM
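To illustrate the soccer-ball case, here's a rough sketch with a hypothetical 8x4 grid of dimming zones (real panels differ in zone count and in how they map objects to zones; the geometry below is made up for the example):

```python
# Hypothetical 8x4 grid of backlight dimming zones on a 1920x1080 panel.
# A small bright object (the "soccer ball") only covers part of a zone, but the
# whole zone's backlight has to come on, raising the black level around it.

ZONE_COLS, ZONE_ROWS = 8, 4
SCREEN_W, SCREEN_H = 1920, 1080

def zones_lit(x, y, radius):
    """Return the set of (col, row) dimming zones a bright disc at (x, y) touches."""
    zone_w, zone_h = SCREEN_W / ZONE_COLS, SCREEN_H / ZONE_ROWS
    cols = range(int((x - radius) // zone_w), int((x + radius) // zone_w) + 1)
    rows = range(int((y - radius) // zone_h), int((y + radius) // zone_h) + 1)
    return {(c, r) for c in cols for r in rows
            if 0 <= c < ZONE_COLS and 0 <= r < ZONE_ROWS}

print(zones_lit(300, 500, 20))   # small ball well inside one zone -> 1 zone lit
print(zones_lit(960, 540, 20))   # same ball sitting on a zone boundary -> 4 zones lit
# Either way the lit zone is far bigger than the ball, so the surrounding
# "black" pixels in that zone get backlight anyway.
```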
Ah, that's awesome! Yeah, I'm not in the field but I've always been interested in stuff like this. I don't really watch sports, and most of my content is 24p or 30p, so I'm perfectly content with 24hz on an LCD for 95% of my viewing. Gaming is a different story. Ever since I got a 120hz monitor I can tell a difference up to about 90 frames per second; 90-120 is pretty indistinguishable for me. That jump from 60 to 90 is noticeable in faster-motion stuff, though.
9/11/2012 11:14:00 AM
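The 60-to-90 versus 90-to-120 observation lines up with the frame-time arithmetic. A quick sketch (plain math, nothing monitor-specific):

```python
# Frame time (ms) at each rate, and the time saved per step, to show why
# 60 -> 90 fps is easier to notice than 90 -> 120: the absolute saving shrinks.
rates = [24, 30, 60, 90, 120]
times = {fps: 1000 / fps for fps in rates}
for fps in rates:
    print(f"{fps:3d} fps -> {times[fps]:5.1f} ms per frame")
for lo, hi in zip(rates, rates[1:]):
    print(f"{lo} -> {hi} fps saves {times[lo] - times[hi]:.1f} ms per frame")
# 60 -> 90 shaves about 5.6 ms off every frame; 90 -> 120 only shaves about 2.8 ms,
# which matches the diminishing returns described above.
```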