SageTV Community  

#1 - 03-09-2006, 03:50 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
Athlon64 3200+ too little for HDTV?

I have a socket 754 Athlon64 3200+ CPU; see my signature below. My monitor/TV is set to 1920x1080 and all is well, until I start trying to play back HD material. Our local HD broadcasts come down in 1080i, so the computer needs to do some de-interlacing before the picture gets presented to my set in 1080p. I have the nVidia PureVideo decoders, all my options are set to use hardware acceleration, and I am using FSE (better with it on than off). Still, my CPU gets pegged at 100% and I get stuttering and pixelation on HD content. Before you ask: I have all the latest graphics/chipset/peripheral drivers that I can find installed.

My OTA signal strength is in the high 90s, so I don't think the recording is the issue. However, I downloaded some .ts files that seem to play better than both some HD WMV files I also downloaded and the MPEG HD files recorded by Sage.

Possible Theories:
1) My CPU is just not fast enough to do HD fluidly. I got this chip and motherboard right when the first Athlon64 chips hit the market; it's probably almost two years old, single core. I tried some overclocking but ran into stability problems.

2) Inadequate bus speed and bandwidth. I have three SATA-300 drives hooked up to a PCI Promise SATA300 controller card. I notice that when copying more than one file at a time to any one or combination of these drives, there is a significant slowdown in transfer speed, i.e. more than half again slower. Could Sage be having problems accessing the data fast enough? (I doubt it.) These drives also show up in Device Manager as USB removable devices for some reason.

3) AGP just won't cut it anymore. Maybe my CPU load will drop if I get a new motherboard with a PCIe graphics card.

4) 1080p will give any MB/CPU/GPU combination a serious work out. I am expecting too much.

5) Software and/or Driver Configuration Problems.
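A few back-of-the-envelope numbers help weigh theories 1, 2, and 4. The figures below are nominal ATSC and PCI specs, not measurements from this particular system; a quick Python sketch:

```python
# Nominal spec figures, not measurements from this machine.

# Theory 2 (bus/bandwidth): an ATSC HD stream is tiny next to PCI bandwidth.
atsc_mbit_s = 19.39              # max ATSC transport-stream bitrate, Mbit/s
pci_mbyte_s = 133.0              # classic 32-bit/33 MHz PCI bus, MByte/s
stream_mbyte_s = atsc_mbit_s / 8
print(f"One HD recording: {stream_mbyte_s:.1f} MB/s of a ~{pci_mbyte_s:.0f} MB/s PCI bus")

# Theories 1 and 4 (CPU/GPU): deinterlacing 1080i60 to 1080p60 means producing
# 60 full 1920x1080 frames every second.
pixels_per_frame = 1920 * 1080
frames_out = 60
print(f"Deinterlaced output: {pixels_per_frame * frames_out / 1e6:.0f} Mpixels/s")
```

Roughly 2.4 MB/s per recording against a ~133 MB/s bus suggests theory 2 is an unlikely bottleneck, while 124 Mpixels/s of deinterlaced output is exactly the kind of load that pins a single-core CPU when the GPU isn't carrying its share.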

Possible Solutions:
Option 1) Rebuild the HTPC with updated Motherboard, CPU, RAM, PCIe Graphics Card, and native support for at least four SATA-300 drives.

Option 2) Build a client PC with a faster CPU and PCIe graphics. Maybe something with a small form factor and passive cooling? Set up the existing system as a server and only let it record and run commercial detection.

Option 3) Wait to fix whatever mysterious problems are creating these issues. Maybe a driver update, maybe a Sage update, maybe something else.

Option 4) Turn the existing machine into a client for the SDTV set and build a new server per Option 1. HD files should take less processing when displayed at 800x600.

Option 5) Wait a while, and then go for Options 1, 2 or 4. HTPC oriented hardware is about to get a whole lot faster and cheaper. (maybe)

Questions:
Is my current chip really so slow that it is incapable of handling HD content? It is a couple of years old, but the top socket 754 chip right now is only the 3700+, at over $300.

Will a multi-core chip help things drastically?

What would you do?

Would you consider an integrated motherboard graphics solution like the nForce4/6150 combos for a client? Or would those be no better than what I have now?

AMD or Intel?
#2 - 03-09-2006, 05:09 PM
dadams
Sage Advanced User
Join Date: Feb 2004
Location: Oklahoma
Posts: 195
FWIW... I have the same processor as you and I can watch an HD file while another is recording with no problems (processor 40-50%), so I doubt it is your processor, although I am not running at 1920x1080. I can also view HD files on my 32" Samsung in another room; it is running a 3200+ Sempron. For HD recording I use 2 VBox Cat's Eye DT150s.

If you try overlay do you get the same problems?
#3 - 03-09-2006, 05:51 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
I've been avoiding overlay because I prefer the look and color depth of VMR9 (transparent menu items and no crushed blacks), but I'll have another look at it tonight.
#4 - 03-09-2006, 07:22 PM
wazkaren
Sage Advanced User
Join Date: May 2004
Location: Rochester, NY
Posts: 155
I'm running the Sage client on an AMD Athlon XP3000+, and I can play back HD in FSE/VMR9 while recording from 3 Hauppauge cards at the same time (the HD I record on a different server). Originally I had an ATI 9600 video card and found that I could only use overlay with that card; with VMR9 I would get very jerky playback. I have since upgraded to an nVidia 6600 pro and now get very smooth playback in VMR9. So the video card is what did the trick for me.


Thanks,
Greg
__________________
SageTV 6.44, Windows XP Pro, ASUS A7N266, AMD Athlon XP1900+, 768 MB RAM, Avermedia A180, FusionHDTV 5 Lite, HDHomeRun.
SageTV 6.44, Windows XP Pro, Chaintech 7NJL6, AMD Athlon XP3000+, 1 GB RAM, SPDIF via on-board audio to Sony STR-DE575 surround sound, BFG 6600GT OC to a Sony KF42WE610 TV, 2 x Hauppauge 250, 1 x Hauppauge 150
SageTV Client 6.44, Windows XP Pro, MSI K7T Pro, AMD Duron 1 GHz, 512 MB RAM, Linksys WMP54GS, ATI 9600SE

#5 - 03-09-2006, 07:39 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
I have a 6600GT, but it is an AGP card. I wonder if AGP vs. PCIe makes much difference? Have not tried overlay yet.

On a side note, does anybody know anything about overclocking the graphics card? The MSI drivers came with a utility that you could use to dynamically overclock the GPU, but it was part of the driver package, I think. Since I have switched to the latest drivers from nVidia, I don't know if I should go backwards and install the MSI software just to get this feature.
#6 - 03-09-2006, 08:02 PM
Polypro
Sage Icon
Join Date: Jun 2005
Posts: 1,804
Google "Cool Bits"; it's a registry entry that adds overclocking controls to the NVIDIA control panel. 3000+ Venice here, no probs with 3 tuners (1 HD, 2 SD). AGP vs. PCI-E: no difference speed-wise... at this time.

P
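For reference, the Cool Bits tweak is a registry entry along these lines. This is a hedged sketch: the key path is as commonly reported for ForceWare-era drivers, and the exact DWORD value that unlocks the clock controls varied between driver versions, so back up the registry before trying it.

```
Windows Registry Editor Version 5.00

; Commonly reported location for the Cool Bits flag (ForceWare-era drivers).
; The DWORD value varied by driver version; 3 is one frequently cited value.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```

After importing it and reopening the NVIDIA control panel, a clock-frequency page should appear on driver versions that honor the flag.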
#7 - 03-09-2006, 08:37 PM
wazkaren
Sage Advanced User
Join Date: May 2004
Location: Rochester, NY
Posts: 155
Quote:
Originally Posted by Humanzee
I have a 6600GT but it is an AGP card.
Mine is also an AGP card. So it sounds like maybe the video card is not your problem. Oh well, it was just a thought.

Greg
#8 - 03-09-2006, 08:39 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
Quote:
Originally Posted by Polypro
Google "Cool Bits", it's a reg entry that adds OC to the NVIDIA control panel. 3000+ Venice here, no probs with 3 tuners (1 HD, 2 SD). AGP/PCI-E, no difference speed wise...at this time.

P
Thanks, I just found that myself; I was able to gain a couple hundred MHz in memory frequency and about 53 MHz in core clock using the "Detect Optimal Frequencies" option. We'll see if it helps. Found a good article about overclocking the 6600GT at http://www.anandtech.com/video/showdoc.aspx?i=2295
#9 - 03-09-2006, 08:46 PM
stanger89
SageTVaholic
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Out of curiosity, when you're playing an HD file, does the decoder say DXVA Mode or Software YUV mode?
#10 - 03-09-2006, 08:47 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
I found more info about my CPU: it is a 3200+, code name CLAWHAMMER. Never heard of that one before myself. Maybe I'll try overclocking it again. The Gigabyte overclocking software was a bit hokey; I'll look into clocking it in the BIOS if the graphics overclock doesn't help.
Attached image: CPUZ.JPG (47.5 KB)

#11 - 03-09-2006, 09:02 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
Quote:
Originally Posted by stanger89
Out of curiosity, when you're playing an HD file, does the decoder say DXVA Mode or Software YUV mode?
The PureVideo decoder? I have De-interlace Control set to "Smart", De-interlace Mode set to "Best Available", and the decoder format says "DirectX VA mode A (idct) Video Mixing", at least on recorded HD. I have a downloaded MPG video that shows decoder mode "YUY2 Video Mixing".
#12 - 03-09-2006, 09:59 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
OK, the GPU overclocking with Cool Bits seems to have fixed about 80% of the stuttering and pixelation. I still get 100% CPU utilization, though. I'm going to look into overclocking the CPU a little, and maybe the GPU a little more than it is now.
#13 - 03-09-2006, 10:25 PM
Wheemer
Sage Icon
Join Date: Dec 2004
Location: Deer Lake, NL, Canada
Posts: 1,493
Is your tv set to be the primary monitor?
#14 - 03-09-2006, 11:35 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
Yup, figured that one out on day one with my 6600GT. I don't think FSE will work otherwise.
#15 - 03-10-2006, 08:26 AM
Commodore 64
Sage User
Join Date: Nov 2005
Posts: 72
I don't know what the difference between 1080i and 1080p is, except that interlaced vs. progressive relates to how the picture is drawn. I've seen many people say that progressive is 60 Hz and interlaced is 30 Hz.

FWIW, I have been running my Philips 30PW8520 (old CRT-based 30-inch widescreen) at 1920 x 1080, 60 Hz for several months now. I have inadvertently started flame wars in IRC chat rooms when stating this. People say it is impossible and that I can't be viewing 1080p on a CRT. I don't think I am viewing 1080p, though, because when I get a bad flag or something and the deinterlacer craps out, I see interlacing lines. I think I'm just running at 60 Hz, but not necessarily 1080p.


Anyway, I'm not sure how all this relates to your issue, but I get OTA ATSC broadcasts with my VBox DTA-150 and Fusion 5 Lite with no stuttering. I think my system is similar to yours in overall specs; I also use an AGP 6600GT.
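The "progressive is 60 Hz, interlaced is 30 Hz" rule of thumb can be made concrete with standard ATSC frame geometry (nothing here is specific to any one setup); a quick Python sketch:

```python
# Both signals tick at 60 Hz; the difference is what each tick carries.
width = 1920

fields_per_s = 60                        # 1080i60: 60 fields/s, each 540 lines
i_pixels_per_s = width * 540 * fields_per_s

frames_per_s = 60                        # 1080p60: 60 full 1080-line frames/s
p_pixels_per_s = width * 1080 * frames_per_s

print(f"1080i60 carries {i_pixels_per_s/1e6:.1f} Mpixels/s (30 full frames of content)")
print(f"1080p60 carries {p_pixels_per_s/1e6:.1f} Mpixels/s, {p_pixels_per_s//i_pixels_per_s}x as much")
```

So a CRT can happily refresh at 60 Hz while only ever being fed half-height fields, which is consistent with seeing interlacing lines when the deinterlacer drops out.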
__________________
Hardware: P4 3.0E; Asus I865PE mobo; 1 GB PC3200; Gigabyte 6600GT; VBox DTA 150; Fusion 5 Lite; MCE Remote; USB-UIRT; Philips 30" CRT @ 1920 x 1080, 60 Hz; JVC D201S receiver

Software: XP Pro SP2; SageTV 4; nVidia Video Decoder; nVidia Audio Decoder
#16 - 03-10-2006, 10:27 AM
Jesse
Sage Fanatic
Join Date: Feb 2005
Location: Marietta, Ga.
Posts: 813
Hi,

I have never figured out exactly how to compare an Intel chip to an AMD (I've read all sorts of stuff), but FWIW I play back HD using VMR9/FSE on a 42" HD plasma via component, using a 6600GT (AGP), a P4 3.0E (which I think should be in the same ballpark as an AMD 3200+), and the PureVideo decoders. I am still using the 7xxx drivers, have them set to 720p, and then had them build a custom resolution for my TV (rectangular pixels). I get very smooth playback. I used to get some barely perceptible stuttering, but shutting down all the other apps that were running cured this. The only things running now are Sage and Girder.

I don't know if this is any help. Maybe if I tried 1080p I would be having problems too.

Good luck.

Jesse
#17 - 03-10-2006, 01:00 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
Quote:
Originally Posted by Commodore 64
FWIW, I have been running my philips 30pw8520 (old CRT based 30 inch widescreen) at 1980 x 1080, 60hz for several months now. I have inadvertently started flame wars in IRC chat rooms when stating this. People say it is impossible and that I can't be viewing 1080p on a CRT. I don't think I am viewing 1080p though, because when I get a bad flag or something and the deinterlacer craps out, I see interlacing lines. I think I'm just running at 60hz, but not necessarily 1080p.
Stanger could explain it all, but from what I understand there is currently little benefit to having a set that can actually do 1080p at the moment. My use of the word "set" here may be misleading: my 37" Westinghouse is actually more of a computer monitor than an HDTV set, and it can actually do 1080p. I say it may be of little benefit only because there is currently no broadcast HDTV content in 1080p; it tops out at 1080i. Because my set can do 1080p, it makes an excellent HTPC monitor for all the applications other than video that I might use on my computer, games and whatnot. To make video look good on a progressive screen you pretty much need to use some sort of de-interlacing so that you don't get the comb effect. In an HDTV this is done by hardware in the set; in my case it is done by the HTPC and the PureVideo decoder.

If you de-interlace an interlaced signal and then display it on a set that can only display an interlaced signal (a standard old tube TV), you don't gain much if anything, but you also shouldn't lose much. Check out http://www.100fps.com for a good rundown on de-interlacing. Your set is unlikely to truly render 1080p. Even though you are sending it a progressive signal, the set is probably either downscaling it to 720p or re-interlacing it to 1080i. The fact that you see the comb effect probably has more to do with what your HTPC is sending the TV than with the TV's internal decoding.

All that being said, there is some 1080p content out there in the form of HD WMV files and some IMAX films that are available as .ts files designed for use on your PC or HTPC. When I play a .ts file that is 1080p I only use about 50% CPU time, as the graphics card and CPU don't have to do any de-interlacing. HD WMV files are compressed, so you still need some CPU to decode them on the fly.

Alternatively, I could probably force my 6600GT to output a 1080i signal, but I feel that would be counterproductive. I suppose I could also try de-interlacing the file itself with ffmpeg, but that would require more disk space and patience, as well as some way to schedule the process for when I wouldn't otherwise be using the machine.
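For anyone curious about that offline route, an ffmpeg invocation would look something like the following. The flags are a sketch and vary by build (very old builds used a bare -deinterlace switch, later builds use the yadif filter), and the file names are just placeholders.

```shell
# Old-style builds: simple (lower quality) deinterlacer switch.
ffmpeg -i recording.ts -deinterlace -sameq deinterlaced.mpg

# Later builds: yadif filter; mode 1 outputs one frame per field (60p from 1080i).
ffmpeg -i recording.ts -vf yadif=1 -qscale:v 2 deinterlaced.mpg
```

The trade-off is exactly as described: the transcode burns disk space and CPU time up front so playback needs no on-the-fly deinterlacing.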

De-interlacing on the fly seems to be the way to go for me. I nearly have all these issues licked by overclocking my 6600GT, but the wife asked me to help her clean the apartment last night before her folks come over for the weekend. A few more hours of overclocking my hardware and I suspect I shall overcome.
#18 - 03-10-2006, 02:15 PM
stanger89
SageTVaholic
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
Quote:
Originally Posted by Humanzee
Stanger could explain it all
I see I've been successful at fooling you all into thinking I know what I'm talking about.

Quote:
but from what I understand there is currently little benefit to having a set that can actually do 1080p at the moment.
Yes and no. We'll start from the top. 1080i content, if properly deinterlaced (which the latest nVidia decoders/cards/drivers provide), should look better on a 1080p display than on a 720p display (and possibly better than on a 1080i display). For that to hold true, the display needs to accept 1080p directly (not 1080i, like some 1080p displays).

Now for content that starts out at less than 1080 lines (720p and below), there's definitely a diminishing-returns phenomenon as you increase display resolution.

Of course, the size of the display factors in too: 1080p is less of a benefit on smaller displays than on larger ones, and probably doesn't really shine until you're talking front projection.

Also, don't forget about HD-DVD and Blu-ray, both of which are 1080p native formats (for movies at least).

Quote:
To make video look good on a progressive screen you pretty much need to use some sort of de-interlacing so that you don't get the comb effect. In an HDTV this is done by hardware in the set; in my case it is done by the HTPC and the PureVideo decoder
And for 1080i content, you need a card that can handle all the features for HD; that means the 7900, 7800, and 6800 Ultra, and hopefully the 7600 will fall into that list as well.

Quote:
If you de-interlace an interlaced signal and then display it on a set that can only display an interlaced signal (a standard old tube TV), you don't gain much if anything, but you also shouldn't lose much.
That depends: if your output resolution doesn't match the source resolution, good deinterlacing is essential, since you can't resize without deinterlacing.

Quote:
check out http://www.100fps.com for a good rundown on de-interlacing. Your set is unlikely to truly render 1080p. Even though you are sending it a progressive signal, the set is probably either downscaling it to 720p or re-interlacing it to 1080i.
Actually, I think we determined on AVS that he's running 1080i60.

Quote:
The fact that you see the comb effect probably has more to do with what your HTPC is sending the TV than with the TV's internal decoding.
Yup.
#19 - 03-10-2006, 05:37 PM
Humanzee
Sage Fanatic
Join Date: Sep 2004
Location: North Idaho
Posts: 752
Quote:
Originally Posted by stanger89
And for 1080i content, you need a card that can handle all the features for HD, that means 7900, 7800, 6800 Ultra, hopefully the 7600 will fall into that list as well.
Stanger, so do you think this is the root of my problem then? Am I running too weak a graphics card with the 6600GT? All our local broadcasts here in Seattle are 1080i.
#20 - 03-10-2006, 05:48 PM
stanger89
SageTVaholic
Join Date: May 2003
Location: Marion, IA
Posts: 15,188
I don't know what your problem is; the 6600GT should be fine. The only difference between the 6600GT and a "better" card is the missing HD IVTC/bad-edit correction. I was referring to optimal PQ requirements, not general playback requirements.

You really shouldn't be seeing 100% CPU usage. I was just playing a couple of files on my 6600GT (in my 4200) and would pull under 20% usage (I'd estimate it used 20-30% of the core it was running on, though I think there's something hosed, as it only plays 1 in 6 frames).

I get smooth playback even with full software decoding, though (50% utilization with neither core maxed). Double-check that you're actually using DXVA.