01-16-10, 08:38 AM   #1
OneOfOne
Registered User
 
Join Date: Jan 2005
Posts: 6
8800GT PowerMizer issues

I can't remember the last time I tried to overclock (probably a good 6+ months ago). Now that I'm back into gaming I tried to OC again, but the card is stuck on a single performance level. nvclock reports:
Quote:
-- General info --
Card: nVidia Geforce 8800GT
Architecture: G92 A2
PCI id: 0x611
GPU clock: 601.712 MHz
Bustype: PCI-Express

-- Shader info --
Clock: 1512.000 MHz
Stream units: 112 (01111111b)
ROP units: 16 (1111b)
-- Memory info --
Amount: 512 MB
Type: 256 bit DDR3
Clock: 899.996 MHz

-- PCI-Express info --
Current Rate: 16X
Maximum rate: 16X

-- Sensor info --
Sensor: Analog Devices ADT7473
Board temperature: 51C
GPU temperature: 65C
Fanspeed: 890 RPM
Fanspeed mode: auto
PWM duty cycle: 28.6%

-- VideoBios information --
Version: 62.92.1f.00.01
Signon message: GeForce 8800 GT VGA BIOS
Performance level 0: gpu 600MHz/shader 1500MHz/memory 900MHz/0.00V/100%
VID mask: 3
Voltage level 0: 0.95V, VID: 0
Voltage level 1: 1.00V, VID: 1
Voltage level 2: 1.05V, VID: 2
Voltage level 3: 1.10V, VID: 3
Running nvidia-settings -q ALL -s gives:
Quote:
ERROR: Error while querying valid values for attribute 'NvControlVersion' on UnimatrixZero:0[gpu:0] (Bad argument).
ERROR: Error while querying valid values for attribute 'GLXServerVersion' on UnimatrixZero:0[gpu:0] (Bad argument).
ERROR: Error while querying valid values for attribute 'GLXClientVersion' on UnimatrixZero:0[gpu:0] (Bad argument).
ERROR: Error while querying valid values for attribute 'OpenGLVersion' on UnimatrixZero:0[gpu:0] (Bad argument).
ERROR: Error while querying valid values for attribute 'XRandRVersion' on UnimatrixZero:0[gpu:0] (Bad argument).
ERROR: Error while querying valid values for attribute 'XF86VidModeVersion' on UnimatrixZero:0[gpu:0] (Bad argument).
ERROR: Error while querying valid values for attribute 'XvVersion' on UnimatrixZero:0[gpu:0] (Bad argument).
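If it helps, these are the kind of per-attribute queries I can run to cross-check which performance level the driver thinks it is in (attribute names taken from the nvidia-settings documentation, so they may not be exact on 195.30):
Code:
# list the performance modes the driver exposes and the level currently in use
nvidia-settings -q [gpu:0]/GPUPerfModes -q [gpu:0]/GPUCurrentPerfLevel

# current core/memory clocks and GPU temperature as reported by the driver
nvidia-settings -q [gpu:0]/GPUCurrentClockFreqs -q [gpu:0]/GPUCoreTemp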
Installed versions:
Quote:
media-video/nvidia-settings-195.30
x11-drivers/nvidia-drivers-195.30
x11-base/xorg-server-1.7.4
Basically, if I try to change the clocks manually they just revert to the defaults, with both nvclock and nvidia-settings.
Any ideas? CoolBits is enabled in xorg.conf.
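In case my setup is part of the problem, here is roughly what the relevant xorg.conf section looks like (reconstructed from memory, so treat it as a sketch; the Identifier is just a placeholder):
Code:
# xorg.conf: Coolbits is what exposes the clock controls
Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"
    Option     "Coolbits" "1"
EndSection
And the manual clock changes I have been attempting look something like this (the target clocks are only example values, and the nvidia-settings attribute names may differ on 195.30):
Code:
# nvclock: set core/memory clocks, then read the speeds back
nvclock -n 650 -m 950
nvclock -s

# nvidia-settings: the Coolbits route
nvidia-settings -a [gpu:0]/GPUOverclockingState=1 \
                -a [gpu:0]/GPU3DClockFreqs=650,950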
Attached Files
File Type: gz nvidia-bug-report.log.gz (41.9 KB, 98 views)