Building My MythTV Box, Part 2: Software

nVidia Driver and X Server Configuration

For maximum performance on an nVidia graphics card, you need to use a binary-only kernel module. Some glue code needs to be compiled against the current kernel, but the build process is quick. The driver can be retrieved directly from nVidia's website, but Gentoo also makes it available as a package. Just type "emerge nvidia-driver" to install it.

By default, the package system installs the latest driver marked stable, which is version 6629 from November 2004. The newest driver, version 7667 from June 2005, is much faster, but it is marked unstable and requires special configuration to unmask. In a simple frame-rate test with glxgears, the 7667 driver beat the 6629 driver by roughly 40 percent (2,000 frames per second versus 1,400 frames per second).
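Unmasking comes down to telling Portage to accept the testing keyword for the package. A minimal sketch of what this looks like, assuming the package name from above and the standard /etc/portage layout (the category is my guess; adjust to match your tree):

        # Accept the "unstable" keyword for the driver package
        echo "media-video/nvidia-driver ~amd64" >> /etc/portage/package.keywords
        emerge nvidia-driver

        # Rough frame-rate comparison before and after the upgrade
        glxgears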

One nifty feature of the nVidia X driver, which proved handy during troubleshooting, is that it can output to multiple devices simultaneously through the same card. I have an external monitor hooked up to the VGA port, as well as a TV hooked up to the S-Video port. Each is a "screen" to the X server. I have configured an nVidia extension called TwinView for simultaneous display on both the TV and the monitor.

Configuring TwinView is, for the most part, straightforward. Each display is given a mode line, and the TV display must be told what type of TV is attached. Mine uses NTSC, the U.S. standard, though the driver supports many other TV standards from around the world. The one wrinkle with simultaneous display is that TwinView can confuse programs using the X server, because it works in part by assembling one virtual display from the physical displays. 16:9 HD broadcasts watched on a standard-definition TV will be distorted unless the Xinerama information that TwinView normally exports is disabled. The key modification to my X configuration is in the display device section, which enables TwinView:

Section "Device"
        Identifier  "Videocard0"
        # -- NVidia-accelerated driver
        Driver      "nvidia"
        VendorName  "Giga-Byte"
        BoardName   "NVIDIA GeForce FX 5200"

        # -- Clone the desktop to the VGA monitor and the S-Video TV
        Option      "TwinView"
        Option      "TwinViewOrientation" "Clone"
        Option      "ConnectedMonitor" "CRT,TV"
        Option      "SecondMonitorHorizSync" "30-70"
        Option      "SecondMonitorVertRefresh" "60"
        Option      "MetaModes" "1024x768, 1024x768"
        Option      "TVStandard" "NTSC-M"
        Option      "TVOutFormat" "COMPOSITE"
        # -- Don't export the TwinView layout via the Xinerama extension
        Option      "NoTwinViewXineramaInfo" "true"
EndSection
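After restarting X, the server log shows whether TwinView actually came up on both heads. One quick way to check (assuming the usual Xorg log location):

        # Look for the driver's TwinView messages in the X server log
        grep -i twinview /var/log/Xorg.0.log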

One of the best ways to reduce the resource requirements for displaying HDTV is to offload some of the graphics processing from the system CPU to the graphics card. When the XVideo-Motion Compensation (XvMC) extension is used, parts of the video decode are done in the graphics hardware. Unfortunately, there is a bug in the latest nVidia AMD64 driver that prevents XvMC from working. For now, I am stuck with software decode. (nVidia released a new driver on August 9, but I have not had a chance to test it yet.)
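For reference, enabling XvMC on a working driver is mostly a matter of pointing the XvMC wrapper library at nVidia's implementation. A sketch, assuming the wrapper reads the usual /etc/X11/XvMCConfig file:

        # /etc/X11/XvMCConfig -- name of the vendor XvMC library to load
        libXvMCNVIDIA_dynamic.so.1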

LIRC and the Remote Control

Although I am old enough to remember television without remote controls, I do not have fond memories of it. Without a remote, MythTV is cumbersome. Getting up to walk to the keyboard to fast-forward through commercials defeats the purpose of having a powerful TV-playback device.

Linux Infrared Remote Control (LIRC) software turns a PC into a universal receiver. Once LIRC has been programmed with the codes from your remote, the PC can respond to any remote you already own. That flexibility presents a challenge, though. I use a universal remote control, which is designed to adapt to the equipment it is configured to control. If the equipment is a universal receiver, it will expect to adapt to the remote control, presenting a slight chicken-and-egg dilemma. To get basic functions working, I used my TiVo remote, though I plan to set the system up with my universal remote in the future.
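Teaching LIRC the codes is done with LIRC's irrecord tool, which walks through the buttons interactively and writes out a configuration file. A sketch of capturing the TiVo remote, assuming the receiver setup described below; -H selects the LIRC driver and -d the serial port:

        # Record button codes from the TiVo remote into a config file
        irrecord -H irman -d /dev/ttyS0 tivo.lircd.conf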

As a receiver, I selected the Home Electronics IRA-3. Hardware installation of the IRA-3 is a snap because it simply plugs into a serial port and draws power from it. There is a small red light that comes on when the receiver is powered up, and it blinks when an IR flash is detected. The IRA-3 uses the IRMAN protocol, which requires the use of libirman on Linux. However, one slight modification is necessary. The IRA-3 has a longer startup delay than the brand-name IRMAN device, as described by Pratap Pereria in a message to the LIRC mailing list.

To use LIRC with the IRA-3, first build the libirman library with this patch: ira3.patch.txt. I tinkered with the emerge build instructions for libirman so that the patch is applied automatically as part of the installation process. By default, LIRC builds with support for generic serial devices, so the build process must be changed to support irman devices instead. On Gentoo, this is done by setting a USE flag; when compiling from source, it is an option to the configure script.
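From source, the whole sequence looks roughly like this (directory names and versions are placeholders; on Gentoo, the patching and driver selection happen inside the ebuilds instead):

        # Patch and build libirman with the longer IRA-3 startup delay
        cd libirman-0.4.2
        patch -p1 < ../ira3.patch.txt
        ./configure && make && make install

        # Build LIRC against the irman driver instead of the serial default
        cd ../lirc-0.7.2
        ./configure --with-driver=irman
        make && make install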

On startup, the LIRC daemon, lircd, needs to be passed the -H flag to select the right driver. I have also found that the serial driver needs to be in control of the port. If lircd doesn't start, try putting the port under control of the serial driver with a command like "setserial /dev/ttyS0 uart 16550."
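Putting it together, my startup amounts to two commands (the serial port is an example; use whichever port the IRA-3 is plugged into):

        # Give the serial driver control of the port, then start the
        # LIRC daemon with the irman driver on that port
        setserial /dev/ttyS0 uart 16550
        lircd -H irman -d /dev/ttyS0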

