General Discussion: General discussion about SageTV and related companies, products, and technologies.
#1
Yet another ffdshow thread
I've been using SageTV for a while now, since 2.0 (?), and back then I read that ffdshow really won't help my PQ since I'm using a standard TV (display at 800x600). Is this still the case? I've been playing with the idea of using my 350's output, but then I get the menu/remote-control lag. The 350 looks much better than my nVidia decoders, but I don't like the lag. DVDs look pretty good, but the recorded TV isn't all that great, and I want to see if I can do better.
Any idea where I can find an "ffdshow for Dummies" wiki/FAQ? I've been to the SourceForge.net site and didn't really see what I was looking for. Any help will be appreciated.
__________________
386DX, 40MB HDD, 5-1/4" & 3-1/2" Floppies, 14.4K baud modem, DOS 6.2 and Windows 3.1 on a Samsung 55" LCD
#2
I think a large part of how good your display looks comes down to what video card you have and how good it is.
I use an Nvidia 6600GT with the PureVideo decoders, and it's much better than my 350 could ever hope to display. I now use DVI output, of course, but even so I could easily equal the output of the 350 no matter what mode I used, and I don't use ffdshow either. I've tried it, but it's more taxing on the CPU and doesn't give a much better picture, so I don't use it. Maybe when TT and Sage play nice I'll play with it some more, but right now there's no reason for me to. What is your hardware, by the way? I ask because if you do enable the TV-out you may find your system becoming unstable with the PVR-350, especially if you have the UI out as well.
#3
cummings66: My hardware is an Intel 650 CPU, Intel mobo, eVGA 6600 heatsink edition, 1 GB DDR2, 3 x SATA HDDs, 1 x SATA DVD, Audigy 2 ZS, and 2 x PVR-350 (I know, I don't need two, but when I bought them I wasn't 100% sure what the difference was, so I went ahead and got two. Wish I had a 500 instead.) I bought TT but don't use it; I just bought it for the NVIDIA decoders. I use the NVIDIA video and audio decoders together, and I'm pretty sure I'm at version 1.88. My video drivers are the 81.85 version "b". This is a dedicated HTPC machine.
I keep hearing people say how wonderful the PQ is, but my standard TV's tuner looks better than my HTPC tuners. Playing a DVD looks alright, like it should. I used to have a 6800 (standard) in there, but I took it out and put it into another game machine, and that's when I put in the passively cooled 6600. I don't see a PQ difference between the 6600 and 6800.
I have the s-video out of the video card going to a small breakout adapter (that came with the 6600) that has a 4-pin s-video connector as well as HDTV RCA-style jacks. Since my TV doesn't support s-video, I have a converter changing it from s-video to the yellow composite RCA jack. In the video driver I tell it to output as s-video, which does look better than "auto" and "composite" (I believe that is the third option). I've tried changing the flicker control and that helped somewhat.
I'm running video at 800x600 and I believe that *could* have something to do with the general blurriness of the picture. I was reading a bit about aspect ratios (thanks Nielm) and other stuff, and from what I read, if my TV is at resolution A and my video card is outputting resolution B, the signal won't line up correctly when shown on the screen and it'll look blurry. It's worth a try. What's the standard TV resolution again? 750x480 or something like that?
__________________
386DX, 40MB HDD, 5-1/4" & 3-1/2" Floppies, 14.4K baud modem, DOS 6.2 and Windows 3.1 on a Samsung 55" LCD
#4
720 x 480 interlaced (I think). I am in a similar situation; I just switched to a 6600GT from the PVR-350's TV output and am overall quite happy with it. The main reason I switched was to end the stability problems with the 350 decoder and the EOF bug. I also wanted to be able to play DVDs and alternate video formats through Sage, which I couldn't do via the 350's output.
One thing to keep in mind is that a standard old TV set will not display progressive images, so your nVidia PureVideo decoder will not do you any good when displayed on your TV. Your TV simply cannot display the extra information from a progressive source. It should look better on your computer monitor, however. I think my 6600GT may have a composite video output; if yours does, I would try that instead of converting the s-video signal back to composite.
The screen resolution you send to your TV will make a difference in its overall appearance. I run at 800 x 600 and for whatever reason it looks better to me than running at 720 x 480. I have tried other resolutions too, but they tend to make my TV freak out completely. Theoretically, 720 x 480 should give the best results, since it should be the native resolution your TV card captures at. Then there wouldn't be any need to scale the image up to fit an 800 x 600 screen and have your video card squish it back down to fit the TV's resolution (rough numbers in the sketch at the end of this post).
There can also be an issue with your recording quality, and/or the quality of the signal making it to the TV card. I have two Hauppauge cards and am generally impressed with the PQ, though for a time one card seemed better than the other, which I tracked down to a bad coaxial patch cable. I think there are some other adjustments that can be made to the Hauppauge cards to tweak their picture quality, but I have always just left mine stock.
The bottom line is that it's high time to get a new TV. Your video card and CPU have to do all kinds of tricks to format video for your TV set. The 350 decoder, and your TV's tuner for that matter, only accept source video that is formatted perfectly for them. If the 6600 were designed specifically to render 720 x 480 interlaced at 60Hz only, it could probably do a better job than it is doing for you currently, but my guess is that nVidia wouldn't sell as many gaming cards that way.
When you get a display that can deliver more detail than your video source has available, you get a whole different set of problems. That is when you start needing those PureVideo adaptive de-interlacing capabilities. The good news is that you already have a couple of video cards that can do it for you, once you have an HDTV that can display the benefits. For now, I am stuck with you in SDTV land. Hopefully I will be able to find the $$ and make the switch to an HDTV set soon. My old set is starting to get nice purple and green spots on it from not being very well shielded. The more it annoys my wife, the sooner the new TV arrives.
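Just to put rough numbers on that double-scaling (720x480 recording -> 800x600 desktop -> squeezed back down by the TV encoder), here's a quick, purely illustrative Python sketch. It is not what the driver literally does; it only shows the non-integer resample factors involved.
Code:
# Illustrative only: the double resample when a 720x480 recording is shown on
# an 800x600 desktop that the TV encoder then squeezes back down to ~480 lines.

capture = (720, 480)    # native frame size of the tuner card's recordings
desktop = (800, 600)    # Windows desktop resolution sent to the TV-out
tv_visible_lines = 480  # roughly what an SD set can actually display

up_x = desktop[0] / capture[0]          # ~1.111 horizontal upscale
up_y = desktop[1] / capture[1]          # 1.25   vertical upscale
down_y = tv_visible_lines / desktop[1]  # 0.8    vertical downscale in the encoder

print(f"upscale to desktop:  {up_x:.3f} x {up_y:.3f}")
print(f"downscale to TV:     {down_y:.3f} (vertical)")
print(f"net vertical factor: {up_y * down_y:.3f}")  # ~1.0, but resampled twice
A 720x480 desktop avoids both resamples, which is the theoretical argument for running at the native capture resolution.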
#5
Humanzee: Thanks for the info. I agree with you about the new TV! Would you mind telling my wife that, too? Unfortunately, it's that time of year when the big department stores put their large items (i.e., leather sofas, etc.) on sale, and my TV money (or so I thought it was TV money) just got sucked into a new sofa/loveseat combo that I had no idea we needed. So now we get to watch a blurry TV on great, comfortable couches!
I will try to set up a custom resolution at 720 x 480 and see if I can output composite directly. Thanks again. AWS
__________________
386DX, 40MB HDD, 5-1/4" & 3-1/2" Floppies, 14.4K baud modem, DOS 6.2 and Windows 3.1 on a Samsung 55" LCD
#6
Quote:
I use "deinterlacing" in quotes because when the video is rendered to the framebuffer (where the TV out pulls the data from) it's progressive at the resolution and framerate you specify in Windows. For example, if your desktop is set to 800x600@60Hz, the video will be converted from 720x480@29.97Hz interlaced to that resolution/framerate. The reason the PureVideo decoders are/can be beneficial is because they are about the best deinterlacer available for PC. Since the video is getting "deinterlaced" one way or another, you will want to do it the best you possible can. That will minimize the issues of going from one interlaced format (source) to a different one (output). The only thing worse than going interlaced -> progressive -> interlaced, is going interlaced -> interlaced, especially if theres a resolution change in there. |
#7
Quote:
#8
Yeah, it's all kind of confusing, especially when you get into using video card TV-outs.
#9
Quote:
#10
While I was trying to find the correct resolution for my standard 4:3 TV, I came across this article on 480i:
"480i refers to a video mode where the vertical resolution consists of 480 lines, and the i stands for interlaced. It usually has a horizontal resolution of 640 and a 4:3 aspect ratio. It has a field rate of 60 hertz, properly named 480i60. 640x480 with an aspect ratio of 4:3 is Standard Definition Television (SDTV). It is used in NTSC countries also in analog."
I've been told a number of times that 720x480 is my TV's resolution. Why is this article saying it is 640x480? Are they just wrong? So when I create my custom resolution, I should make it 720x480, right? What about the refresh rate? Isn't that something around 59Hz... not quite 60Hz?
__________________
386DX, 40MB HDD, 5-1/4" & 3-1/2" Floppies, 14.4K baud modem, DOS 6.2 and Windows 3.1 on a Samsung 55" LCD
#11
Quote:
First, NTSC was defined before there were pixels. NTSC is defined as 525 lines (of which about 480 contain picture information) refreshed at 29.97Hz, interlaced into two 262.5-line fields displayed at 59.94Hz. That gives a horizontal scan frequency of 525 * 29.97 = 15.734kHz. Each channel is given a bandwidth of 6MHz. Note that this is all analog and there are no pixels; it is displayed on a 4:3 display. This is the way analog TV is broadcast.
Now comes digital, and there needs to be a way to represent that. Go through a bit of sampling theory (that I doubt anyone cares about, and I can't remember well enough to explain this late) and what you've got is that, with the given parameters (namely the 6MHz bandwidth), you can represent somewhere in the neighborhood of 300-350 by 525 pixels.
OK, so where does 720x480 come from? That comes from ITU-R BT.601, which specifies digital encoding of video. It specifies a 13.5MHz sampling rate, which results in the 720-sample horizontal resolution. This is the way DVDs are stored. So where does 640x480 come from? It comes from computers: 480 * 4/3 = 640, because PCs use square pixels. Video, conversely, uses slightly rectangular pixels (0.9:1), so 480 * 4/3 * 1/0.9 ~= 720.
So, to directly answer your questions: horizontal resolution isn't incredibly important because the TV just sees analog waveforms; refresh rate and horizontal scan rate are what the TV cares about. The refresh rate is 59.94Hz interlaced, the horizontal scan rate is 15.734kHz, and the resolution of your recordings is 720x480. So yes, I would aim for 720x480 @ 59.94Hz interlaced with a horizontal scan rate of 15.734kHz.
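If it helps to see the arithmetic written out, here is the same handful of numbers as a tiny Python script; nothing beyond what's already in the post above.
Code:
# The NTSC numbers from the post, worked out explicitly.

total_lines = 525          # lines per frame (about 480 carry picture)
frame_rate = 30 / 1.001    # 29.97Hz frame rate
field_rate = 2 * frame_rate            # 59.94Hz: two interlaced fields per frame
h_scan = total_lines * frame_rate      # horizontal scan frequency in Hz

print(f"field rate:      {field_rate:.2f} Hz")    # ~59.94
print(f"horizontal scan: {h_scan/1000:.3f} kHz")  # ~15.734

# Why 640 vs 720 for the same 480 lines:
square_pixel_width = 480 * 4 / 3             # 640 -> PC assumption of square pixels
bt601_width = square_pixel_width / 0.9       # ~711 -> stored as 720 per ITU-R BT.601
print(f"square-pixel width: {square_pixel_width:.0f}")
print(f"BT.601-ish width:   {bt601_width:.0f} (stored as 720)")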
#12
...too bad there isn't an icon for bowing down in awe to another member. :-) Thanks, Stanger, that makes it all make sense now.
__________________
386DX, 40MB HDD, 5-1/4" & 3-1/2" Floppies, 14.4K baud modem, DOS 6.2 and Windows 3.1 on a Samsung 55" LCD
#13
I'm unworthy...
#14
Goodness! I just changed from "S-Video" out (via my NVIDIA driver) to "Composite" and then changed the resolution to 720x480... my, what a (good) difference! I can actually read the text on the Windows part of the HTPC... it looks great now! Thanks for all your help!!!
__________________
386DX, 40MB HDD, 5-1/4" & 3-1/2" Floppies, 14.4K baud modem, DOS 6.2 and Windows 3.1 on a Samsung 55" LCD
#15
And in the end, something totally illogical and unintuitive saves the day.