Go Back   nV News Forums > Linux Support Forums > NVIDIA Linux

Old 05-03-12, 12:04 PM   #13
Spyke
Foxie
Join Date: Aug 2004
Location: Canada
Posts: 618
Re: GTX 680 + linux, What is the max pixelclock?

To make matters worse, the latest 300 series drivers will reset your custom refresh rates back to 60 Hz the moment you start up a 3D game or anything else...

Any way to disable that nonsense? Seriously, who makes these decisions?
__________________
Gaming:
Intel i7 980X @ 4GHz | ASUS Rampage III Extreme | GTX 480 3Way SLI @ 900Mhz | Koolance VID-NX480 | Corsair Obsidian 700D
Corsair H70 CPU Cooler | EK-FB RE3 | Corsair AX1200 | Black Ice SR1 360 | 240GB OCZ Revodrive X2 SSD | Windows 7 Ultimate
12GB Corsair Dominator GT @ 8-8-8-24-1T DDR3-1600 | Onkyo TX SR-707 | 70" Sharp Aquos LCD | KEF Audio 5.1 C3/C6LCR/C7
Workstation:
Intel i7 920 D0 @ 4GHz | ASUS Rampage II Extreme | GTX 480 @ 800Mhz | Koolance VID-NX480 | Lian-Li V1200B | Corsair HX1000
EK NB ASUS HP | Watercool HeatKiller 3.0 | Feser Extreme X-360 | 2x160GB Intel X25-M SSD RAID0 | 4x2TB WD20EARS RAID10
12GB Corsair Dominator @ 8-8-8-24-1T DDR3-1600 | Creative X-Fi Titanium | Pioneer DVR-212D | Gentoo Linux
Dell 3008WFP | JohnBlue JB3 | CityPulse DA2.03e II DAC | KingRex T20U w/ Modded Auricaps & PSU | Glow Audio Sub One
Server:
Quad Socket Opteron 8356 (16 cores) @ 2.3GHz | Supermicro H8QMi-2 | Supermicro 2U Chassis | Redundant 1200W PSU
32GB 2GBx16 DDR2-667 ECC | 4x 300GB Cheetah SAS RAID10 | Adaptec 5805 512MB w/BBU | Colocated | Gentoo Linux
Old 05-03-12, 12:07 PM   #14
Xevious
Registered User
Join Date: Aug 2002
Posts: 291

Quote:
Originally Posted by Spyke
To make matters worse, the latest 300 series drivers will reset your custom refresh rates back to 60 Hz the moment you start up a 3D game or anything else...

Any way to disable that nonsense? Seriously, who makes these decisions?
This seems really simple to get around on Linux if you just remove the modeline or disable EDID, like I already have to do.
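For reference, the xorg.conf route looks roughly like this — the section and option names come from the NVIDIA driver README, but the identifiers and the modeline timings below are placeholders, not values for any particular monitor:

```
Section "Device"
    Identifier "nvidia0"        # placeholder identifier
    Driver     "nvidia"
    # Ignore the monitor's EDID when building the mode list
    Option     "UseEDID" "false"
EndSection

Section "Monitor"
    Identifier "monitor0"       # placeholder identifier
    # Custom modeline; these timings are illustrative only
    Modeline "2560x1440_67" 270.00 2560 2592 2612 2692 1440 1443 1448 1480 +hsync -vsync
EndSection
```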
Old 05-03-12, 12:21 PM   #15
Spyke

Quote:
Originally Posted by Xevious
This seems really simple to get around on Linux if you just remove the modeline or disable EDID, like I already have to do.
Unfortunately it's not that easy. I always had EDID disabled; now the driver simply wants to screw with my refresh rates every time something 3D starts up, even setting both monitors to 100 Hz when one of them doesn't support it...

I can go into nvidia-settings and force the second monitor back to 68 Hz after starting Minecraft and not run into any problems, so the drivers do still support independent refresh rates per screen...

So NVIDIA, how do I get the new 300 series driver to stop 'conveniently' managing my refresh rates whenever it feels like it?
Old 05-03-12, 12:46 PM   #16
AaronP
NVIDIA Corporation
Join Date: Mar 2005
Posts: 2,487

The driver doesn't just change modes or refresh rates on its own. Something must be requesting that it change to a 60 Hz mode.
Old 05-03-12, 12:54 PM   #17
Spyke

Quote:
Originally Posted by AaronP
The driver doesn't just change modes or refresh rates on its own. Something must be requesting that it change to a 60 Hz mode.
It's something to do with xrandr. However, I'm now trying to force modes through xrandr, and for some reason it's not accepting my custom modelines...

> xrandr --newmode 2560x1440_100 400 2560 2592 2612 2692 1440 1443 1448 1480 +HSync -VSync

> xrandr --newmode 2560x1440_67 270 2560 2592 2612 2692 1440 1443 1448 1480 +HSync -VSync

> xrandr --current
Screen 0: minimum 8 x 8, current 5120 x 1440, maximum 16384 x 16384
DVI-I-0 disconnected (normal left inverted right x axis y axis)
DVI-I-1 disconnected (normal left inverted right x axis y axis)
DVI-I-2 connected 2560x1440+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
2560x1440 60.0*+
2048x1536 85.0 75.0 60.0
1920x1440 85.0 75.0 60.0
1856x1392 75.0 60.0
1792x1344 75.0 60.0
1600x1200 85.0 75.0 70.0 65.0 60.0
1400x1050 74.8 60.0
1280x1024 85.0 75.0 60.0
1280x960 85.0 60.0
1152x864 75.0
1024x768 85.0 43.5 75.0 70.1 60.0
832x624 74.6
800x600 85.1 75.0 72.2 60.3 56.2
720x400 85.0
700x525 149.5 120.0
640x480 85.0 75.0 72.8 59.9
640x400 85.1
640x350 85.1
512x384 140.1 87.1 120.0
400x300 144.4
320x240 145.6 120.1
320x175 170.5
HDMI-0 disconnected (normal left inverted right x axis y axis)
DVI-I-3 connected 2560x1440+2560+0 (normal left inverted right x axis y axis) 0mm x 0mm
2560x1440 60.0*+
2048x1536 85.0 75.0 60.0
1920x1440 85.0 75.0 60.0
1856x1392 75.0 60.0
1792x1344 75.0 60.0
1600x1200 85.0 75.0 70.0 65.0 60.0
1400x1050 74.8 60.0
1280x1024 85.0 75.0 60.0
1280x960 85.0 60.0
1152x864 75.0
1024x768 85.0 43.5 75.0 70.1 60.0
832x624 74.6
800x600 85.1 75.0 72.2 60.3 56.2
720x400 85.0
700x525 149.5 120.0
640x480 85.0 75.0 72.8 59.9
640x400 85.1
640x350 85.1
512x384 140.1 87.1 120.0
400x300 144.4
320x240 145.6 120.1
320x175 170.5
2560x1440_100 (0x2be) 400.0MHz
h: width 2560 start 2592 end 2612 total 2692 skew 0 clock 148.6KHz
v: height 1440 start 1443 end 1448 total 1480 clock 100.4Hz
2560x1440_67 (0x2bf) 270.0MHz
h: width 2560 start 2592 end 2612 total 2692 skew 0 clock 100.3KHz
v: height 1440 start 1443 end 1448 total 1480 clock 67.8Hz


> xrandr --output DVI-I-2 --mode 2560x1440_100 --pos 0x0 --output DVI-I-3 --mode 2560x1440_67 --pos 2560x0
xrandr: cannot find mode 2560x1440_100

EDIT: It's not adding them to the first screen, apparently; checking...
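As a sanity check on those numbers, the rates xrandr prints follow directly from the modeline: vertical refresh = pixel clock / (horizontal total × vertical total), and horizontal frequency = pixel clock / horizontal total. A quick sketch with the values from the modelines above:

```python
# Modeline arithmetic for the two custom modes above.
# vrefresh (Hz)  = pixel_clock / (h_total * v_total)
# hfreq    (kHz) = pixel_clock / h_total / 1000

def mode_rates(pixel_clock_hz, h_total, v_total):
    """Return (horizontal kHz, vertical Hz) for a modeline."""
    hfreq_khz = pixel_clock_hz / h_total / 1e3
    vrefresh_hz = pixel_clock_hz / (h_total * v_total)
    return hfreq_khz, vrefresh_hz

# 2560x1440_100: 400 MHz clock, 2692 x 1480 total
h100, v100 = mode_rates(400e6, 2692, 1480)
# 2560x1440_67: 270 MHz clock, same totals
h67, v67 = mode_rates(270e6, 2692, 1480)

print(round(h100, 1), round(v100, 1))  # matches the 148.6 kHz / 100.4 Hz xrandr reports
print(round(h67, 1), round(v67, 1))    # matches the 100.3 kHz / 67.8 Hz xrandr reports
```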
Old 05-03-12, 01:03 PM   #18
Spyke

I don't know; the modes are in the list, it just won't add them.

2560x1440_100 (0x2be) 400.0MHz
h: width 2560 start 2592 end 2612 total 2692 skew 0 clock 148.6KHz
v: height 1440 start 1443 end 1448 total 1480 clock 100.4Hz
2560x1440_67 (0x2bf) 270.0MHz
h: width 2560 start 2592 end 2612 total 2692 skew 0 clock 100.3KHz
v: height 1440 start 1443 end 1448 total 1480 clock 67.8Hz

> xrandr --addmode DVI-I-2 2560x1440_100
X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 152 (RANDR)
Minor opcode of failed request: 18 (RRAddOutputMode)
Serial number of failed request: 33
Current serial number in output stream: 34

> xrandr --addmode DVI-I-3 2560x1440_67
X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 152 (RANDR)
Minor opcode of failed request: 18 (RRAddOutputMode)
Serial number of failed request: 33

EDIT: From what I gather, the driver is trying to validate them, even though I have that disabled in xorg.conf... any ideas?
Old 05-03-12, 01:47 PM   #19
Spyke

This is what I actually have right now, but when I start Minecraft or anything else OpenGL, it switches the second display to 100.4 Hz. Is there a way to actually remove the 100.4 Hz mode from DVI-I-3? That would solve this, I think...

~> xrandr
Screen 0: minimum 8 x 8, current 5120 x 1440, maximum 16384 x 16384
DVI-I-0 disconnected (normal left inverted right x axis y axis)
DVI-I-1 disconnected (normal left inverted right x axis y axis)
DVI-I-2 connected 2560x1440+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
800x600 60.3 +
2560x1440 100.4* 67.8
HDMI-0 disconnected (normal left inverted right x axis y axis)
DVI-I-3 connected 2560x1440+2560+0 (normal left inverted right x axis y axis) 0mm x 0mm
800x600 60.3 +
2560x1440 100.4 67.8*
Old 05-03-12, 02:18 PM   #20
Spyke

Is it really that difficult to force a specific modeline onto a single display without making it available globally to all displays?

The displays have the proper refresh rates when X starts up, since they are specifically stated in the MetaModes, but why does X make the modelines available across all displays?

And SecondMonitorVertRefresh doesn't seem to do anything in the new drivers...

EDIT: Disabled RANDR completely, which fixed all the issues. Not the "fix" I had in mind, but I'll take it at this point.
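For anyone else wanting that workaround: assuming the stock X server extension mechanism (not an NVIDIA-specific option), RANDR can be disabled with a small xorg.conf section:

```
# xorg.conf: turn off the RANDR extension server-wide
Section "Extensions"
    Option "RANDR" "Disable"
EndSection
```

Note this also prevents well-behaved clients from querying outputs, so it is a blunt instrument.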

Old 05-03-12, 04:23 PM   #21
AaronP

Adding modes through RandR is not supported yet, but you should still be able to add them through xorg.conf. The driver still provides a lot of detail about where various modes come from if you start X with the -logverbose 6 option or turn on the ModeDebug option. Check /var/log/Xorg.0.log to see where your unwanted modes are coming from, and then you should be able to use the various ModeValidation options to turn off implicit modes, such as NoXServerModes, NoVesaModes, NoPredefinedModes, and NoEdidModes.
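Putting those suggestions together, a sketch of what the relevant xorg.conf Device section might look like — the identifier is a placeholder, and the option names are the ones listed above (see the NVIDIA driver README for the full ModeValidation token list):

```
Section "Device"
    Identifier "nvidia0"        # placeholder identifier
    Driver     "nvidia"
    # Log full mode-validation detail to /var/log/Xorg.0.log
    Option     "ModeDebug" "true"
    # Suppress implicit mode sources so only explicit modelines remain
    Option     "ModeValidation" "NoXServerModes, NoVesaModes, NoPredefinedModes, NoEdidModes"
EndSection
```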
Old 05-03-12, 05:09 PM   #22
Spyke

Quote:
Originally Posted by AaronP
Adding modes through RandR is not supported yet, but you should still be able to add them through xorg.conf. The driver still provides a lot of detail about where various modes come from if you start X with the -logverbose 6 option or turn on the ModeDebug option. Check /var/log/Xorg.0.log to see where your unwanted modes are coming from, and then you should be able to use the various ModeValidation options to turn off implicit modes, such as NoXServerModes, NoVesaModes, NoPredefinedModes, and NoEdidModes.
Thank you for the reply. That makes sense if it's not yet supported.

The idea is that I want a different refresh rate on each head, which I can do just fine through xorg.conf. The problem is that RANDR then also advertises these two specific modes for both of my monitors, which I do not want; I need to force each monitor to a specific refresh rate.

Minecraft, for example, somehow sets the mode via RANDR, which causes my secondary monitor to get set to the same refresh rate as my first monitor. If I were able to remove the incorrect refresh rate from the secondary monitor's RANDR mode list, that would be okay, but it's not possible.

It would be possible if we could set custom modes through RANDR, since then we could force specific modelines onto a specific screen and no longer use xorg.conf to set them. The other option, of course, is to disable RANDR entirely until this is supported. (Games can then no longer change the modes, which is fine, as I don't run stuff in fullscreen anyhow.)

Hope that explains it well.
Old 05-11-12, 04:02 AM   #23
Xevious

Quote:
Originally Posted by AaronP
Adding modes through RandR is not supported yet, but you should still be able to add them through xorg.conf. The driver still provides a lot of detail about where various modes come from if you start X with the -logverbose 6 option or turn on the ModeDebug option. Check /var/log/Xorg.0.log to see where your unwanted modes are coming from, and then you should be able to use the various ModeValidation options to turn off implicit modes, such as NoXServerModes, NoVesaModes, NoPredefinedModes, and NoEdidModes.

Care to comment on > 400 MHz pixelclock on GeForce 600 series hardware?
Old 05-22-12, 11:56 AM   #24
waperboy
Registered User
Join Date: Sep 2010
Posts: 11

Quote:
Originally Posted by Xevious
Care to comment on > 400 MHz pixelclock on GeForce 600 series hardware?
The 295.40 and 295.49 drivers on Linux ignore modelines with a pixel clock > 400 MHz on the GTX 680 as well.

I have this Catleap monitor as well; it ran fine at 2560x1440 @ 103 Hz on a GTX 460. I purchased the 680 purely to get 120 Hz, only to hit a wall because of the driver? It is well established that it can be made to run at 120+ Hz on Windows.
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.