On 15/01/07, <b class="gmail_sendername">Daniel Kristjansson</b> <<a href="mailto:danielk@cuymedia.net">danielk@cuymedia.net</a>> wrote:<div><span class="gmail_quote"></span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
On Sun, 2007-01-14 at 21:44 +0100, Wim Fokkens wrote:<br>> Hi Daniel<br>><br>> > You will never get a perfect picture with a progressive display..<br>><br>> I have to disagree with that.<br><br>I'm sorry, but that is impossible.
<br><br>> And the picture quality is absolutely perfect and easily beats the picture<br>> quality of my STB connected with RGB scart. I think the main reason for this<br>> easy win is the fact that my radeon card can do very good deinterlacing in
<br>> hardware. And PowerDVD codec makes good use of this.<br><br>Very good deinterlacing is possible. And with material that was<br>originally not interlaced, like movies, you can reconstruct<br>the original progressive frame. But no physically constructible
<br>device can display it perfectly. You would need to flash the<br>backlight on for an infinitesimally short time once per frame,<br>which would require infinite power. You could get close enough,<br>say by projecting a light which flickers on for a few
<br>milliseconds through film 24 times per second. But LCDs and<br>plasma displays are far from this ideal. Plasma phosphors glow<br>for milliseconds, and, worse, the image is updated row by row.<br>LCD backlights blink at something like 40,000 Hz, so they
<br>effectively give off light constantly. This gives us a nice<br>flicker-free display for static images, but it also means you<br>see the row-by-row update, and the image is constant for so<br>long that your eye cannot reconstruct it into a moving image as
<br>smooth as the moving image your mind reconstructs when you<br>see images that flash onto the screen whole for a short period<br>of time. You can construct a device which shows progressive<br>material better using three DLPs or three LCDs and a shuttered
<br>HID lamp, but these devices are very expensive, and you need<br>to watch them in a darkened room.<br><br>> I also have a MythTV pc with the following specs:<br>> I am trying very hard to get the same picture<br>
> quality as my MyTheater system. But I don't even come close.<br>> I can get to the same level as James Buckley did.<br>> I also tried XVMC but judging from the picture it is only doing Hardware<br>> Motion Compensation but no hardware deinterlacing.
<br>> Why is this? Is there no support for hardware deinterlacing in the Nvidia<br>> driver?<br><br>There is some support. The XvMC API allows us to display the<br>odd field, even field, or both fields of an image. This is
<br>enough for us to do either bob or one-field deinterlacing.<br>But it is not enough for us to do something like 3-2 pulldown,<br>i.e. perfect reconstruction of material that was originally<br>not interlaced. We also can't do anything like Lanczos, kernel,
<br>or linear blend with XvMC. nVidia has another library which<br>they call PureVideo; it is much better and is probably what<br>is being used in your MyTheater system, but it is not available<br>on Linux unless you pay a good deal of money for it.
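<br><br>For illustration, bob is simple enough to sketch in a few lines
of Python (a toy model, not MythTV's actual code): split the frame
into its two fields, then line-double each field back to full frame
height and show it as its own frame.

```python
# Toy bob deinterlacer. A "frame" is a list of rows (each row a list
# of pixel values); the top field is the even rows, the bottom field
# the odd rows.

def split_fields(frame):
    """Return (top_field, bottom_field) from an interlaced frame."""
    return frame[0::2], frame[1::2]

def bob(field):
    """Line-double one field back to full frame height."""
    out = []
    for row in field:
        out.append(row)
        out.append(row)  # naive duplication; real bob often interpolates
    return out

frame = [[10], [20], [30], [40]]   # 4-line toy frame, one pixel per line
top, bottom = split_fields(frame)
assert top == [[10], [30]]
assert bob(top) == [[10], [10], [30], [30]]
```

A real bob deinterlacer usually interpolates between the duplicated
lines and offsets the two fields by half a line to reduce the
characteristic bobbing artifact.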
<br><br>Mark Kendall has started on an OpenGL video renderer which<br>should allow us to implement some really slick deinterlacers<br>in hardware. But this isn't very far along yet, and we will<br>not be writing an XvMC replacement in OpenGL fragment
<br>programs anytime soon; we may eventually do this since it<br>doesn't look like nVidia will be adding MPEG-4 AVC support<br>to XvMC...<br><br></blockquote><div><br>Just to add, I have no intention of using anything but MythTV! :P
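<br><br>For anyone curious about the deinterlacers Daniel mentions, linear
blend is easy to sketch too — here is a toy Python model (assumed
single-channel rows, not the MythTV implementation): each line is
averaged with the line below it to soften interlace combing.

```python
# Toy linear-blend deinterlacer. A "frame" is a list of rows (each
# row a list of pixel values); the last row is blended with itself.

def linear_blend(frame):
    """Average each line with the following line to reduce combing."""
    out = []
    for i, row in enumerate(frame):
        nxt = frame[min(i + 1, len(frame) - 1)]
        out.append([(a + b) // 2 for a, b in zip(row, nxt)])
    return out

combed = [[10], [30], [10], [30]]   # alternating-field "comb" pattern
assert linear_blend(combed) == [[20], [20], [20], [30]]
```

The trade-off is visible even in the toy: the comb pattern is
flattened, but vertical detail is blurred, which is why kernel and
motion-adaptive deinterlacers exist.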
<br> </div><br></div><br>