Old 10-01-02, 02:45 AM   #1
bigredlinux
Registered User
 
Join Date: Oct 2002
Posts: 5
understanding /etc/X11/XF86Config(-4)

I have a few questions about the XF86Config file: its purpose, resolutions, and accelerator modules.

First of all (this probably gets asked a lot), why two files? What is XF86Config-4 for versus XF86Config, and can I delete XF86Config if I only use XFree86 4? I know the -4 file is read first, then the other, but that is all I can discern.

Okay, so my default resolution is 1600x1200, and I know you press Ctrl-Alt-+ or - to switch to smaller resolutions, but then my desktop is huge and I have to scroll around to see it all... I am assuming this has to do with modelines. Red Hat seems to put tons of them in the XF86Config file and I am not sure which ones I actually need.

Finally, this all came about because with a GeForce3 and an Athlon 850 I am only getting 234 fps in the ssystem -bench program, and over at Tom's Hardware they got 285 fps with a GeForce2 GTS, so I figured maybe there was something I needed to do to optimize.

Any input would be most appreciated.
Old 10-01-02, 03:29 AM   #2
bigredlinux
Registered User
 
Join Date: Oct 2002
Posts: 5
beginning to understand

Okay, I did some research and I see that changing the resolution does not change the virtual desktop size, and that to set the virtual desktop size you use the option

Virtual 1024 768

in the Display subsection of your Screen section. So I guess my modeline settings are okay after all.
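
For the record, I think the relevant part of the file ends up looking roughly like this (the identifiers and sizes are just examples from my reading, not something I'm sure everyone should copy):

Section "Screen"
    Identifier "Screen0"
    Device     "My GeForce3"
    Monitor    "My Monitor"
    SubSection "Display"
        Depth   24
        Modes   "1024x768" "800x600"
        Virtual 1024 768     # size of the virtual desktop; smaller modes scroll around it
    EndSubSection
EndSection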

My question now becomes: how do I get better performance? Does KDE slow things down, and how do I take advantage of page-flipping?
Old 10-01-02, 09:38 AM   #3
bwkaz
Registered User
 
Join Date: Sep 2002
Posts: 2,262

Yes, KDE slows you down. Especially if it was:

1) Compiled with an old binutils (that doesn't combine relocation information), or loaded with an old ld.so (one that doesn't cache the last symbol resolution)

AND, if it was:

2) Not objprelinked. But objprelink only works on i386-class processors (Intel/AMD), and it can cause a lot more segfaults than normal, so distributions are highly unlikely to enable it.

If it was prelinked, or if it was compiled with a newer binutils and you're using a newer ld.so, then you'll see less of a slowdown, but the fact is that KDE is pretty big, and therefore fairly slow. Gnome isn't much better.

twm doesn't have all the "desktop" functionality, it's a bare-minimum window manager. Actually, it's barely even that. It comes with X, so as long as you have X installed, you can use it if you want. To use it, create a .xinitrc file (the dot at the beginning is important) in your home directory, containing the following lines:

twm &                     # start the minimal window manager in the background
xterm &                   # one terminal to work in
exec xterm -name login    # exiting this last xterm ends the X session


When you exit the xterm named "login", your X session will end. You will also have to place the xterm windows yourself when X starts up (there are no default positions unless you pass a -geometry argument to xterm -- click the mouse to put them down somewhere), but if you're only doing this to run a benchmark, I don't think it'd be a huge deal. When you want KDE back, mv the .xinitrc file to some other name and restart X.

There are two X config files because XFree86 version 3 used a syntax incompatible with the one XFree86 version 4 uses. So the XFree86 developers made version 4 look for its own file first, and fall back to the old one if the -4 file isn't found. That way you can have and use X 3 and X 4 on the same system.
__________________
Registered Linux User #219692
Old 10-02-02, 01:42 AM   #4
bigredlinux
Registered User
 
Join Date: Oct 2002
Posts: 5
thanks for the advice

Okay, the information provided there was excellent... I now have a good grasp of my whole XF86Config-4 file. Now, just a few more questions:

It seems to me that XFree86 4.x doesn't need the Modeline directives under the Monitor section at all... I no longer have an XF86Config file (only the -4 one), the modelines are not in it, and everything runs fine. What is the main purpose of those lines, and if I wanted to add them in, how would I know what to type?

Running with no desktop environment (just twm), I get 4700 fps from glxgears at Depth 16 at most resolutions with my GeForce3 Ti200 card. Is that good performance, or should I expect more?

Finally, Depth 32 seems not to work, but everyone doing benchmarks always talks about 1024x768x32... what is the 32, and why would Depth 32 give me problems?
Old 10-02-02, 06:24 AM   #5
bwkaz
Registered User
 
Join Date: Sep 2002
Posts: 2,262

XFree 4 calculates modelines itself, from (at least) the monitor's EDID information. I'm pretty sure it gets some info from elsewhere also, but I'm not that familiar with it.

The XFree86 Video Timings HOWTO explains some of this.
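
If you ever want to add one by hand anyway, Modelines go in the Monitor section. As an example, the standard VESA timing for 1024x768 at 60 Hz would look like this (the sync ranges below are made-up placeholders; take the real ones from your monitor's manual, and be careful with hand-written timings in general):

Section "Monitor"
    Identifier  "My Monitor"
    HorizSync   31.5-64.3    # placeholder range; use your monitor's real specs
    VertRefresh 50-90        # placeholder range; use your monitor's real specs
    # dot clock (MHz), then horizontal timings, then vertical timings, then sync polarity
    Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806  -hsync -vsync
EndSection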

That's pretty good. glxgears runs at about 2800fps here (GF4 4200, 24-bit mode).

Depth 32 doesn't work because there really is no such thing. In "32-bit mode" you have 8 bits of color info per channel per pixel, which is 8*3 = 24 bits of actual information per pixel; that's 24-bit color. There's a variation on this mode that adds 8 bits of filler so every pixel sits on a 32-bit boundary in video memory, which makes accesses faster, and that's (for some reason) what gets called 32-bit mode, even though there are still only 24 bits of color information per pixel. Some cards actually use the extra 8 bits for an alpha channel, but it seems X doesn't use that. Or maybe it can and I just don't know how to make it.

If you change your depth to 24, you'll get "32-bit mode" with all its benefits (i.e. more colors).

Put another way, it's the difference between "depth" and "bits per pixel". The depth is still 24 bits, no matter what, because that's the number of bits that affect color. The bits per pixel is what Windows (for one) seems to enjoy calling "depth", which is pretty much just wrong. But a lot of people have gotten confused because of it.
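
In XF86Config-4 terms, the depth is the only thing you set yourself. Inside the Screen section, the depth-related lines would look something like this (just a fragment, and the resolutions are only an example):

    DefaultDepth 24              # what X calls depth: bits of real color info per pixel
    SubSection "Display"
        Depth 24
        Modes "1024x768" "800x600"
    EndSubSection
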
__________________
Registered Linux User #219692
Old 10-03-02, 05:42 AM   #6
Webgraph
Registered User
 
Join Date: Oct 2002
Posts: 19

32-bit means there are 256 red levels, 256 green levels, 256 blue levels, and 256 transparency levels. Usually, the number of colours is a power of two; for example, if the display were 20-bit, the number of colours would be 2^20, or 1 048 576.

About these benchmark programs, where can I get them?

Also, could someone check the XFree86 configuration file I attached to this post and see if there is anything wrong with it? I'm running a GeForce4 MX 420 (64 MB), but Mandrake Linux 9.0 reads it as a GeForce2 DDR (generic). The reason I want it checked is that ever since I installed the kernel and GL drivers via the tarball method and modified this file (originally XF86Config-4, just renamed so you can tell what it is), KDE no longer works and none of the KDE-based applications work either. That would also be why the X login window doesn't show at start-up: it's KDE-based. Any help would be appreciated.
Attached Files
File Type: txt xf86-4.2.1 configuration.txt (2.9 KB, 175 views)
__________________
Webgraph
robnet10@mail.com
Old 10-03-02, 06:24 AM   #7
bigredlinux
Registered User
 
Join Date: Oct 2002
Posts: 5
afaik

AFAIK, it doesn't matter what label it gives your Device, as long as the same label is used everywhere it is referenced. All that matters is that the nvidia module gets loaded, and the driver will take it from there. From the command line, what you should do is log in and type

startx

and see if X starts... if it doesn't, it will tell you clearly why. A lot of times I find that after first installing the driver I have to start kdm (or whatever login manager) by hand once, since the first time after the install the system is sitting at runlevel 3... but that was just my experience.
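
For example, this is all that has to line up (the names here are arbitrary; I just made them up for illustration):

Section "Device"
    Identifier "nv card"      # the label can be anything
    Driver     "nvidia"       # this line is what actually loads the driver
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "nv card"      # just has to match the Identifier in the Device section
    Monitor    "My Monitor"
EndSection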

Continuing with the thread...so are we saying that
when people are talking about a resolution of 1024x768x32 they set the Depth to 24? Is that the short of it?
Old 10-03-02, 08:18 AM   #8
bwkaz
Registered User
 
Join Date: Sep 2002
Posts: 2,262
Re: afaik

Quote:
Originally posted by bigredlinux
so are we saying that
when people are talking about a resolution of 1024x768x32 they set the Depth to 24? Is that the short of it?
In Linux, yes. You can't set X to a 32-bit depth, because that doesn't actually exist (it will sometimes set itself to 32 bpp, but visually there's no difference). Even if it did exist, you wouldn't see any difference between it and 24-bit depth anyway: IIRC we can hardly distinguish the 16 million colors a 24-bit mode already gives us, so we don't need 4 billion.
__________________
Registered Linux User #219692

Old 10-04-02, 02:15 PM   #9
Webgraph
Registered User
 
Join Date: Oct 2002
Posts: 19
Re: afaik

Quote:
Originally posted by bigredlinux
AFAIK, it doesn't matter what label it gives your Device, as long as the same label is used everywhere it is referenced. All that matters is that the nvidia module gets loaded, and the driver will take it from there. From the command line, what you should do is log in and type

startx

and see if X starts... if it doesn't, it will tell you clearly why. A lot of times I find that after first installing the driver I have to start kdm (or whatever login manager) by hand once, since the first time after the install the system is sitting at runlevel 3... but that was just my experience.

Continuing with the thread...so are we saying that
when people are talking about a resolution of 1024x768x32 they set the Depth to 24? Is that the short of it?
I tried the startx command and it didn't work. I get errors from the following programs:

kdeinit
ksmserver

There are only two possibilities I see that could fix the problem without reinstalling Linux. The problem is that I don't know how to act on either possibility. The first possibility is to uninstall and reinstall KDE. The other possibility is what you recommended, setting the runlevel to 3. So how would I be able to do this?
__________________
Webgraph
robnet10@mail.com
Old 10-04-02, 03:11 PM   #10
bigredlinux
Registered User
 
Join Date: Oct 2002
Posts: 5
if you really want help...

If you really want help on this, you need to do two things. One, start a new thread, so that people actually read your problem and don't think it is just more discussion of the XF86Config-4 file and how to understand it. Two, post the actual errors you get, not just the names of the programs that give them to you. For instance, if I hit an error while compiling a program, I don't just say...

'I am getting an error from gcc'

and post that... I post the actual error so people can do something about it.

You don't need to reinstall anything, all you need to do is track down the problem. Reinstalling is a M$ fallacy.
Old 10-05-02, 02:51 PM   #11
Webgraph
Registered User
 
Join Date: Oct 2002
Posts: 19

Already started the thread.
__________________
Webgraph
robnet10@mail.com