Using the 1.0-8178 driver from ports and an FX 5200, I cannot reliably start X. Starting X locks the display about 80% of the time, freezing input and leaving a corrupted screen. I can usually ssh into the box after it freezes and shut everything down manually, but a small fraction of the time the machine is frozen solid. When I can get in, Xorg is running wild, consuming nearly 100% of the CPU.
Most puzzling to me is that when X starts successfully, it seems to be completely stable. On this particular boot the machine has been up for a week with no problems, running xscreensaver's entire suite of GL screensavers flawlessly all weekend.
Initially, I built a kernel with no agp, enabled Nvidia's AGP in xorg.conf (Option "NvAGP" "1") and fired it up. Freeze. I then added the hint (hint.agp.0.disabled="1") in /boot/device.hints, thinking the module might be getting loaded anyway. It froze again. Sooo... I set xorg back to "nv" from "nvidia" and did some research.
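For reference, the relevant pieces of that first attempt looked roughly like this (the Identifier is just a placeholder from my setup; yours may differ):

```
# /etc/X11/xorg.conf (excerpt)
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"
    Option     "NvAGP" "1"    # 1 = use Nvidia's internal AGP driver
EndSection

# /boot/device.hints - keep the kernel agp(4) driver out of the way
hint.agp.0.disabled="1"
```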
Next I tried building the driver with support for AGPGART and went through the same series of steps with that setup, trying agp both in the kernel and as a module. My first successful launch of X with the nvidia driver came with agp.ko loaded and the nvidia module built with AGPGART support (and Option "NvAGP" "2"). GL speed was great and I thought the problem solved - until the next boot. That was when I discovered that X would start, but only sometimes.
Since then I've tried it with AGP in the kernel, as a module and disabled. I've not started X enough times to give detailed results, but in the thirty or forty boot/launch cycles I have done I have seen no noticeable advantage of one mode over the other. They all seem to have about the same failure rate, which is "most of the time." Because of that, I've settled on the most recommended (from what I've read) mode - NvAGP 1, no kernel AGP, nvidia driver built without AGPGART support.
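For anyone wanting to reproduce that "settled" configuration, it amounts to roughly the following (I believe the port's AGPGART knob is WITH_FREEBSD_AGP, but check the port's Makefile to be sure):

```
# Kernel config: no "device agp" line at all.

# Build the port without AGPGART support (the default, i.e. without
# defining the WITH_FREEBSD_AGP knob):
cd /usr/ports/x11/nvidia-driver && make install clean

# xorg.conf: let the driver use its own internal AGP code
Option "NvAGP" "1"
```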
I've also tried it with ACPI both enabled and disabled. I've disabled hyperthreading on the motherboard and am using the 4BSD scheduler. Running in SMP mode with hyperthreading on fared no better.
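For completeness, the ACPI toggle I used was the standard loader hint (hyperthreading I turned off in the BIOS itself):

```
# /boot/loader.conf - boot with ACPI disabled
hint.acpi.0.disabled="1"
```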
Some relevant sysctl output:

hw.nvidia.agp.card.rates: 8x 4x
hw.nvidia.version: NVIDIA FreeBSD x86 NVIDIA Kernel Module 1.0-8178 Wed Dec 14 17:04:30 PST 2005
hw.nvidia.cards.0.model: GeForce FX 5200
I note that I see both "hw.nvidia.agp.status.sba: enabled" and "hw.nvidia.registry.EnableAGPSBA: 0" in there, and that seems a little odd. I'm going to try adding "hw.nvidia.registry.EnableAGPSBA=1" to /etc/sysctl.conf, but don't expect much from doing so.
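Concretely, the line I'm adding is just:

```
# /etc/sysctl.conf - ask the driver to enable AGP sideband addressing
hw.nvidia.registry.EnableAGPSBA=1
```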
I have not tried changing the scheduler. Should I bother?
I'm hoping someone sees something I missed. I've used this machine as a Gentoo Linux workstation for almost four years, about one and a half of those with this very card. Over the last couple of years, though, I've been converting my Gentoo servers to FreeBSD for a number of reasons. I decided to give it a shot for my workstation and absolutely love it. It just feels so much snappier and cleaner - and I love the way it handles memory. It would be guaranteed a permanent place on my desktop - except for this one problem. I know 3D isn't usually critical for a workstation, but I've grown attached to it. I want to stay with FreeBSD, but I think the call of pretty eye-candy will become overwhelming, and I can't have a workstation that cannot be reliably rebooted from remote.
Well, modifying the EnableAGPSBA setting accomplished nothing other than making the registry setting agree with what the driver reports as effective.

I tried changing the IRQ of the AGP card in the BIOS, changing the AGP aperture to every available BIOS setting, dropping the AGP speed to 4x in the BIOS, turning off the Nvidia splash logo, and building the kernel with the ULE scheduler. (Which, as an aside, improved things for me. With 4BSD I had to nice portmanager or any compilation process, or video playback would hitch slightly every 10-15 seconds. With ULE I can compile and play video - even several videos at the same time - with no hitching whatsoever.) I also played with a number of Nvidia's option settings - nothing helped.

I tried, but cannot, give the card its own IRQ - the motherboard insists on sharing the AGP slot's IRQ with the USB controller. The motherboard also has a "game accelerator" feature which alters memory timing to enhance performance. It cannot be disabled, only set to force more demanding settings as opposed to its default of "auto".

After two weeks of working on this, I am forced to conclude that this particular combination of hardware is not compatible with 6.0-RELEASE and the 8178 Nvidia drivers.
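For the record, the scheduler switch was just the usual kernel config change (the two options are mutually exclusive, so swap one for the other):

```
# Kernel configuration file
#options  SCHED_4BSD   # removed
options   SCHED_ULE    # the ULE scheduler
```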
This is an Abit IC7 motherboard with a 2.4GHz P4. The AGP card is a Mad Dog Multimedia Conqueror FX 5200 Plus. 1.5G of RAM is installed (3x512M), as is a National Semiconductor Gigabit Ethernet card and a positively ancient SBLive. If anyone finds a trick to making this combo work, please let me know.