Old 12-15-06, 09:35 PM   #121
breggy
Breggy
 
Join Date: Dec 2006
Location: America
Posts: 32
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

Just chiming in. I've got a 7300 GS that works fine in Windows but is flaky in Linux, and I too am eagerly awaiting the release of this fix, which apparently is not in the latest beta drivers as of Dec. 2006.
breggy is offline   Reply With Quote
Old 12-15-06, 09:55 PM   #122
netllama
NVIDIA Corporation
 
Join Date: Dec 2004
Posts: 8,763
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

The problem is being worked around in the driver; however, it's not clear what is actually causing it, and the investigation is still in progress.

This bug is also known to exist on some GeForce 7300GS cards on motherboards with the same ATI-RS48x chipset.
netllama is offline   Reply With Quote
Old 12-15-06, 10:00 PM   #123
breggy
Breggy
 
Join Date: Dec 2006
Location: America
Posts: 32
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

@netllama:

I am your guinea pig. I'll help any way that I can. I need this to work.
breggy is offline   Reply With Quote
Old 12-17-06, 08:52 AM   #124
breggy
Breggy
 
Join Date: Dec 2006
Location: America
Posts: 32
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

You know, I've heard of some modem manufacturers cutting costs on their hardware by taking shortcuts and letting Windows do some of the grunt work that would normally be done in hardware. Could that be the case with these TurboCache cards? Could that be why there is no real answer and why all the lockups? Could Nvidia be trying to emulate in Linux what comes natively in Windows?

This 7300GS was a pretty darn good deal for its performance when I bought it. Could it be letting Windows do some of the dirty work to keep the manufacturing costs down?
breggy is offline   Reply With Quote
Old 12-18-06, 06:26 PM   #125
lithiumfx
Registered User
 
Join Date: Dec 2006
Posts: 13
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

Quote:
Originally Posted by breggy
You know, I've heard of some modem manufacturers cutting costs on their hardware by taking shortcuts and letting Windows do some of the grunt work that would normally be done in hardware. Could that be the case with these TurboCache cards? Could that be why there is no real answer and why all the lockups? Could Nvidia be trying to emulate in Linux what comes natively in Windows?

This 7300GS was a pretty darn good deal for its performance when I bought it. Could it be letting Windows do some of the dirty work to keep the manufacturing costs down?
It wouldn't surprise me. The TurboCache feature seems to be bringing out this bug. Whilst I'm fairly sure this is a software bug that only appears under the environments mentioned in this topic (i.e. the ATI chipset plus an Nvidia TurboCache card), there's no excuse for the time Nvidia have taken to fix it. If this were a Windows bug, it would have been a priority fix.

Perhaps the TurboCache cards are similar to WinModems - they can work under Linux but give you hell unless they have a very good driver. I hope that the next driver release (9742, or if the workaround isn't added, >9742) does fix this bug; however, I'm concerned that Nvidia don't seem to be very forthcoming about why the bug occurs (due to it still being researched) or what their 'workaround' is.
__________________
Lithium FX
lithiumfx is offline   Reply With Quote
Old 12-19-06, 10:58 PM   #126
NthDegree
Registered User
 
Join Date: Dec 2006
Posts: 4
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

HP Pavilion 3700+
Model: EJ188AA-ABU
Prod #: t3245.uk

Here are my problems with it and my big list of workarounds, none of which nvidia corp. will recommend!

My assessment of the nVidia 6200SE TurboCache shows that the card automatically changes clock speeds when it enters Xorg and when fullscreen games start. This change of speed can make the card heat up quickly; in the 68-78C range, games are several times more likely to cause a kernel panic, and Xorg with only 2D apps is about twice as likely to cause one. The heat-up only appears to occur when running OpenGL games or applications that make heavy use of direct-rendered 3D acceleration.

So a temporary workaround that nvidia might be able to implement in their drivers is to lower the slowdown threshold, which is currently 145C on the 6200SE TurboCache, or to offer a way to adjust the threshold in their "NVIDIA X Server Settings" application.

For users, here is a set of ways to lessen the load and make the system more stable while using OpenGL 3D applications:

There is an unofficial nvclock application that can show the clock speed of nvidia cards; the latest version (0.8 at the time of writing) is capable of adjusting the clock speed. After Xorg has loaded, and after loading a fullscreen OpenGL game, nvclock can be used to lower the clock speed, reducing the level of heat and lessening the load on the card. Unfortunately nvclock can't currently change the 6200SE TurboCache's fan speed, which would help lessen heat slightly. Instead, users should attempt to add extra cooling to the computer in general, which will lessen stress on the card and thus reduce the likelihood of a crash.
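For example, something along these lines; the flag names are from memory of nvclock 0.8, so check nvclock --help before relying on them, and pick values below your card's stock clocks:

Code:
nvclock -s          # show the current core/memory clock speeds
nvclock -i          # show detailed card information
nvclock -n 300      # lower the core clock to 300 MHz (example value only)
nvclock -m 400      # lower the memory clock to 400 MHz (example value only)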

For gaming, running a minimalist distribution such as Gentoo and implementing the above strategies is incredibly effective at reducing hard lock-ups and kernel panics; in fact, at low temperatures in the 50-55C range very few panics/lock-ups have occurred with the current drivers (1.0-9631 at the time of writing).

For normal PC usage, using a distribution such as Red Hat Enterprise Linux or CentOS and implementing the above will dramatically lower the chance of a crash; so far NO lockups/panics have occurred with the current drivers (1.0-9631 at the time of writing) under standard usage after implementing the above. The system successfully runs CentOS x86_64 with an average uptime of 3 days under common usage (IRC, IM, web browsing, etc.).

Using FreeBSD 6.1-RELEASE: VERY frequent lock-ups, likely to be kernel panics, and not limited to the use of 3D direct rendering or OpenGL. The frequency of the lock-ups suggests that older versions of the nvidia drivers, and those on slightly less-used systems, are much less stable. An average uptime of 1 hour between lock-ups without following the above, and an average of 8 hours when following it, suggests driver stability is to blame.

Using Windows XP Home Edition Service Pack 2 and Windows Server 2003 Enterprise R2: NO crashes of any kind have occurred even under VERY HIGH load, which demonstrates that the drivers under Windows maintain stability. This could, however, be down to the Windows kernel, which has over time developed a level of resilience to kernel-level crashes caused by 3rd-party drivers.

In conclusion, one can easily tell that the nvidia drivers under Linux and FreeBSD lose stability under high load and/or when the 6200SE TurboCache becomes hot. Stability can be greatly enhanced by lessening the load on the graphics card and adding extra cooling to keep it from exceeding 65C, which is the threshold at which the current Linux drivers (1.0-9631) appear to start causing kernel panics and hard lock-ups.
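If anyone wants to keep an eye on the temperature while testing, the closed driver can usually report it through nvidia-settings, assuming the 6200SE actually exposes the GPUCoreTemp attribute (not every card does):

Code:
nvidia-settings -q GPUCoreTemp        # one-off temperature query
nvidia-settings -q GPUCoreTemp -t     # terse output (just the number), handy for scripts
while true; do nvidia-settings -q GPUCoreTemp -t; sleep 5; done    # crude 5-second poll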

GNU/Linux distributions tested:
CentOS (x86_64)
Fedora Core 5 (x86_64) (with some modifications)
Gentoo Linux (x86_64)
Gentoo Linux (i386)

Other Systems Tested:
Windows XP Home Edition Service Pack 2 (x86)
Windows Server 2003 Enterprise R2 (x86)
FreeBSD 6.1-RELEASE (i386)

-----------------------------

This is my detailed end-user report on the issue, and I expect this FIXED. I will keep the CentOS x86_64 system in normal use and post an update to report any crashes. I am posting this report using this system ;-)

*EXTRA*

On older nvidia drivers the monitor appears to go into a form of suspend mode upon a crash; on the current Linux drivers that has yet to happen. This may suggest that part of the problem has already been fixed.

Last edited by NthDegree; 12-19-06 at 11:41 PM.
NthDegree is offline   Reply With Quote
Old 12-20-06, 02:50 AM   #127
mmartinmate
Registered User
 
Join Date: Apr 2006
Posts: 7
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

Hi NthDegree,
Very good work. Nvidia could have done this test a long time ago; I suppose they know whether they overclock the GPU.
I have experienced the same behavior you describe.
I have a question (not for you): why don't I experience this with the open source 'nv' driver, or when using the closed driver without any 3D features?
In tests with the current driver (1.0-9631) and glxgears, I could see the gears turning more slowly than with previous drivers (perhaps the overclocking is less aggressive!!).
I would suggest another test: Google Earth with the closed driver. The system freezes after two searches or sooner (when the 3D world appears); used in a VMware session with Debian 3.2, the application works perfectly.
One question for Nvidia: if the problem is heat, how can we fit a cooler on the video card?
A suggestion: if the problem is a kernel panic when the temperature is above 65C and the nvidia driver considers the limit to be 145C, the driver could tell a lie to the kernel: it could report real temperature * 65/145 (though I think some day we may find a GPU melted like cheese on a slice of pizza).
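For reference, a quick sanity check before these tests (standard X/Mesa utilities, nothing Nvidia-specific; the output wording may vary slightly by distribution):

Code:
glxinfo | grep "direct rendering"    # should report "Yes" with the closed driver loaded
glxinfo | grep "OpenGL renderer"     # confirms which driver/GPU is actually in use
glxgears                             # prints a frames-per-second figure every few seconds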

Bye,
Miguel

-------------------------------------------------------
NthDegree, I have reproduced your post in Novell Bugzilla.
https://bugzilla.novell.com/show_bug.cgi?id=228224
-------------------------------------------------------

Last edited by mmartinmate; 12-20-06 at 07:37 AM.
mmartinmate is offline   Reply With Quote
Old 12-20-06, 04:09 AM   #128
NthDegree
Registered User
 
Join Date: Dec 2006
Posts: 4
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

Adding the ability to slow down the clock speed and save the setting, so the lowered clock persists, courtesy of the "NVIDIA X Server Settings" application would be nice. Automatic throttling at a temperature the USER specifies, with a fallback to the maximum temperature the card can take, would also be nice for power users wanting maximum performance from their card. So would GPU fan speed controls.

Why these features have not been added I have no idea.
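For what it's worth, some driver versions will at least expose manual clock controls in nvidia-settings if the Coolbits option is set in the Device section of xorg.conf; I'm not certain it applies to the 6200SE or the 9631 driver, and it doesn't throttle automatically, but it may be worth a try:

Code:
Section "Device"
    Identifier "NVIDIA Card"
    Driver     "nvidia"
    Option     "Coolbits" "1"    # may add a "Clock Frequencies" page to nvidia-settings
EndSection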

"By allowing the graphics processing unit (GPU) to share the capacity and bandwidth of dedicated video memory and dynamically available system memory, TurboCache turbocharges performance and provides larger total graphics memory."

The problem probably has less to do with the chipset specifically and more to do with system memory usage, since TurboCache technology uses system memory to speed up performance in games and stressful 3D operations. All kernel modules on Linux/BSD effectively run with a form of superuser privilege, unlike on Windows; one wrong move and the kernel panics.
NthDegree is offline   Reply With Quote

Old 12-20-06, 07:36 AM   #129
mmartinmate
Registered User
 
Join Date: Apr 2006
Posts: 7
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

Hi,

I have found some information about cooling and system locks.

http://www.nvnews.net/vbulletin/showthread.php?t=63318
http://www.nvnews.net/vbulletin/showthread.php?p=780750

Perhaps netllama has more information.
mmartinmate is offline   Reply With Quote
Old 12-20-06, 11:22 AM   #130
netllama
NVIDIA Corporation
 
Join Date: Dec 2004
Posts: 8,763
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

This problem has nothing to do with heat. Underclocking the GPU should not have any impact on the instability.
netllama is offline   Reply With Quote
Old 12-20-06, 06:47 PM   #131
lithiumfx
Registered User
 
Join Date: Dec 2006
Posts: 13
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

NthDegree,

I'm impressed by the detail of your testing process, and the wide variety of OSes you've used to test the 6200. I too notice improved stability when lowering clock speeds; however, I believe this is a fairly generic fix. Lowering clock speeds means the card is put under less stress, and thus the bug in the driver that seems to arise under high load is less likely to occur. I do agree that underclocking a card typically improves stability, though.
Regardless of whether stability improves with better cooling, the main point about your tests is that they are workarounds. We shouldn't need to apply them, since a bug that causes the host system to lock up should simply be fixed. Hopefully the next driver release will address this issue, but it'd be interesting to see what Nvidia's development team are doing as a workaround.
__________________
Lithium FX
lithiumfx is offline   Reply With Quote
Old 12-20-06, 11:46 PM   #132
NthDegree
Registered User
 
Join Date: Dec 2006
Posts: 4
Re: Nvidia 6200 and ATI RS480/482 chipset incompatibilities

Right, I've had the first crash on CentOS x86_64; it was triggered by Firefox browsing "The Cult of the Dead Cow" site. The monitor went into suspend (and wouldn't leave suspend), and since my hard disk depends on APIC and ACPI, I can't test disabling those to see if it has any effect. I believe these are bad writes to memory, where something is overwriting system memory where it shouldn't.

It would be helpful if everyone posted their dmesg output so I can see the errors, because on boot I suffer from the following Linux bugs:

"..MP-BIOS bug: 8254 timer not connected to IO-APIC
failed.
timer doesn't work through the IO-APIC - disabling NMI Watchdog!
works."

(enable_8254_timer fixes this but makes the clock 15x faster and causes jerky graphics and rendering issues on some games. NOTE: 8254 routing gets automatically disabled on ATI motherboards, wonder why?!)

"Uhhuh. NMI received for unknown reason 2d.
Dazed and confused, but trying to continue
Do you have a strange power saving mode enabled?"

(not sure about this one but I assume enable_8254_timer helps there too)

"APIC error on CPU0: 00(40)
APIC error on CPU0: 40(40)"

(that is sorta fixed with noapic but I can't do that or my hard disk doesn't work)

Yes, ATI does provide users with some very cr@p hardware; even Windows XP needs special hacks (approximately 8MB of extras included with XP by default to make ATI hardware work). Is there some hackish way I can get the kernel to perform debug dumps? That way it would show what goes wrong when these panics/lock-ups occur and might help solve whatever it is that is causing the issues.
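One thing I might try is getting the console output off the box so the panic text isn't lost; these are the standard kernel facilities, and the addresses/ports below are placeholders only:

Code:
# serial console: add to the kernel line in grub.conf
#   console=tty0 console=ttyS0,115200n8
# network console (needs a second machine listening on UDP port 6666):
modprobe netconsole netconsole=6665@192.168.0.10/eth0,6666@192.168.0.20/00:11:22:33:44:55
# enable the magic SysRq key so a hung box can still dump task/memory state:
echo 1 > /proc/sys/kernel/sysrq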
NthDegree is offline   Reply With Quote