nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   NVIDIA Linux (http://www.nvnews.net/vbulletin/forumdisplay.php?f=14)
-   -   GPU gets not utilized (G84) (http://www.nvnews.net/vbulletin/showthread.php?t=101958)

Strunkenbold 11-08-07 03:58 PM

GPU gets not utilized (G84)
 
1 Attachment(s)
I'm a little disappointed by the performance of my new Sparkle Calibre 8600GT graphics board.
Compared to my old 7600GT, it renders even slower in some situations! :thumbdwn:
Through the special display on the Sparkle card I can watch the temperatures during rendering. (There is a Zalman VF700 mounted on the card, so don't wonder about the cool values; the stock cooler gets much hotter.)
(Wine 0.9.48, Linux driver 100.14.19, Windows driver 163.x)

CS:Source stress test benchmark (Linux vs. Windows):
7600GT: 78 fps --> 164 fps on Windows
8600GT: 72 fps --> 226 fps on Windows

Selected tests from 3DMark03, Linux (8600GT, 7600GT) vs. Windows (8600GT, 7600GT):
fillrate2: (4393 fps, 5015 fps) vs. (8864 fps, 6404 fps)
ragtroll: (35 fps, 43 fps) vs. (72 fps, 53 fps)

During that time the temperature of the card rose from 39 °C (idle) to 42 °C.
For comparison: the FUR rendering benchmark stressed the card to about 50 °C, and even glxgears managed 49 °C. Normal rendering seems to be around 46 °C.
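
(If your card has no such display: the driver can report the same reading. A minimal sketch using nvidia-settings and its GPUCoreTemp attribute, assuming a driver of this generation exposes it:)

Code:

# read the GPU core temperature through the NVIDIA driver
nvidia-settings -q GPUCoreTemp

# terse output (just the value), polled every 2 seconds during a benchmark
while true; do nvidia-settings -t -q GPUCoreTemp; sleep 2; done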

I'm also the creator of:
http://bugs.winehq.org/show_bug.cgi?id=9768
Maybe you can have a look.

Strunkenbold 11-08-07 04:17 PM

Re: GPU gets not utilized (G84)
 
1 Attachment(s)
Uploaded the old log...

I currently have Compiz Fusion up and running; during testing it was disabled, along with some settings in the xorg.conf file.

Lithorus 11-09-07 02:36 AM

Re: GPU gets not utilized (G84)
 
You should test with a native Linux game, not things running through Wine. Wine has to do on-the-fly conversions, and your CPU is not exactly the fastest...

Strunkenbold 11-09-07 10:20 AM

Re: GPU gets not utilized (G84)
 
Quote:

Originally Posted by Lithorus
You should test with a native Linux game, not things running through Wine. Wine has to do on-the-fly conversions, and your CPU is not exactly the fastest...

You must be joking. It's a 3 GHz Core 2!
And it doesn't matter how fast the CPU is, since it does the same conversions in exactly the _same_ time with the 7600GT. And still the 7600GT is faster; how can that be?
Oh, and please prove me wrong that the conversions are what is so costly, and _not_ the GL extensions used.
Wait, I'll do that for you:
Code:

CPU: Core 2, speed 2925 MHz (estimated)
Counted CPU_CLK_UNHALTED events (Clock cycles when not halted) with a unit mask of 0x00 (Unhalted core cycles) count 100000
samples  %        image name              app name                symbol name
847168  65.0536  libGLcore.so.100.14.11  libGLcore.so.100.14.11  (no symbols)
94536    7.2594  libc-2.6.1.so            libc-2.6.1.so            (no symbols)
84299    6.4733  no-vmlinux              no-vmlinux              (no symbols)
79074    6.0721  speeddemo.exe            speeddemo.exe            (no symbols)
50023    3.8412  wined3d.dll.so          wined3d.dll.so          shader_glsl_load_constantsF
15812    1.2142  libGL.so.100.14.11      libGL.so.100.14.11      (no symbols)
8229      0.6319  libmad.so.0.2.1          libmad.so.0.2.1          (no symbols)
7040      0.5406  wined3d.dll.so          wined3d.dll.so          vertexdeclaration
6842      0.5254  ntdll.dll.so            ntdll.dll.so            RtlLeaveCriticalSection
5578      0.4283  wined3d.dll.so          wined3d.dll.so          shader_glsl_load_constants
5227      0.4014  opera                    opera                    (no symbols)
5136      0.3944  ntdll.dll.so            ntdll.dll.so            RtlEnterCriticalSection
4781      0.3671  libqt-mt.so.3.3.8        libqt-mt.so.3.3.8        (no symbols)
4329      0.3324  nvidia_drv.so            nvidia_drv.so            (no symbols)

Even though it's an old oprofile log, it should tell quite a lot.
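
(In case anyone wants to reproduce it: such a log is collected roughly like this with oprofile's opcontrol interface. A sketch; the exact event setup on your machine may differ:)

Code:

# system-wide profiling with the default cycle event, no kernel symbols
opcontrol --init
opcontrol --no-vmlinux
opcontrol --start

# ... run the benchmark under Wine while samples are collected ...

opcontrol --dump      # flush the sample buffers
opreport -l           # per-symbol breakdown like the one pasted above
opcontrol --shutdown
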
Please excuse me; I invested quite some time in testing and didn't expect such an answer ("The error is not in nvidia-drivers, the error is in Wine... stop bugging us!").
You can really trust the Wine developers. They are the ones using the most recent and rare GL extensions, not these three-year-old native games.

It was already shown on Phoronix that the .19 drivers didn't solve the performance gap between Windows and Linux on G8x cards, and this is just one more example of it.

Strunkenbold 11-11-07 04:12 AM

Re: GPU gets not utilized (G84)
 
Is there any response from NVIDIA to this problem? Do you need anything more from me?

tx2rx 11-11-07 02:23 PM

Re: GPU gets not utilized (G84)
 
When you ran the test with Wine, how loaded was your CPU? Was one of the cores fully maxed out?

As I understand it, Wine takes the DirectX calls and converts them into OpenGL calls. This compatibility layer is CPU-bound, not GPU-bound. Do you get similar results from an OpenGL game (say Doom 3) running natively on Windows and the same game running under Wine?

If the problem is CPU-based, then there's nothing the NVIDIA driver team can do (dual core probably won't help much unless the game supports multi-core CPUs). The real test is to run a recent native Linux OpenGL game on both cards and compare the results.
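
A quick way to check that is to watch the per-core load while the benchmark runs; a sketch (mpstat is part of the sysstat package):

Code:

# per-core utilization, updated every second
mpstat -P ALL 1

# alternatively: run top and press '1' to show each core separately
top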

Lithorus 11-12-07 03:23 AM

Re: GPU gets not utilized (G84)
 
In the bug report it says:
Quote:

model name : Intel(R) Core(TM)2 CPU 4300 @ 1.80GHz

Either way, you should really test with a native game instead of Wine, to at least rule out that the problem is in Wine.

Strunkenbold 11-12-07 11:37 AM

Re: GPU gets not utilized (G84)
 
Dear Lithorus,
you are missing one point:
model name : Intel(R) Core(TM)2 CPU 4300 @ 1.80GHz
cpu MHz : 2924.999

You can also read these things in the pasted logs.
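
Both lines come straight from /proc/cpuinfo, so anyone can check the same thing:

Code:

# stock model name vs. the clock the kernel actually measured
grep -E "model name|cpu MHz" /proc/cpuinfo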

And again, there's no need to prove things with native games; that is a complete waste of time.
I am not reporting a problem that affects only me, I am reporting a general problem.
And I'm quite sure NVIDIA is smart enough to test native games for performance regressions.
Also keep in mind that the rendering speed of the 8600GT is in most cases superior to that of the 7600GT; I only showed you two corner cases that need to be fixed. And consider that one of them involves HL2....

One last thing: as a user I want support. I don't buy a 150€ card just to spend five hours debugging and testing. Unless NVIDIA pays me for that. :D

Strunkenbold 11-12-07 01:25 PM

Re: GPU gets not utilized (G84)
 
I had an interesting talk with a well-known Wine dev. To sum things up:

- it's not the conversions (DX -> OpenGL) that make Wine slow; they are quite efficient today
- some DX features are hard to implement in OpenGL, because the equivalent OpenGL calls are sometimes missing
- also, for compatibility reasons, the most efficient OpenGL calls are not always the ones used

This means the rendering speed of Wine doesn't depend that strongly on the CPU.
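
(If you want to see for yourself which GL path wined3d takes, Wine can log its Direct3D layer. A sketch using the standard WINEDEBUG channels, with the speeddemo.exe from my oprofile log as the example app:)

Code:

# very verbose: log wined3d activity to a file while the app runs
WINEDEBUG=+d3d,+d3d_shader wine speeddemo.exe 2> d3d.log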

But we're still on the old 100.x codebase on Linux. Some people might remember that the 100.x drivers on Windows had similar speed issues, and the speed difference between the 7600GT and the 8600GT was not that big there.
So the problems will hopefully be fixed when the next major driver bump appears.

Lithorus 11-13-07 02:46 AM

Re: GPU gets not utilized (G84)
 
First of all, the E4300's stock speed is 1.8 GHz, which means you overclocked. Yes, I didn't notice the CPU MHz; I just assumed you would use stock speeds when testing.

If you are really that certain that Wine is not the problem, why are you so reluctant to prove your claim by running native vs. native?

Also, the version numbers of the Linux drivers don't really map directly to the Windows driver numbers, AFAIK.

Btw, check this link:
http://www.phoronix.com/scan.php?pag...item=882&num=2

Edit:
One more thing: could you test whether your overclocked CPU is REALLY running at that speed? I've seen posts elsewhere saying that overclocked CPUs can get reported wrongly. For instance, what is the CPU speed score in 3DMark03?

Strunkenbold 11-13-07 03:55 PM

Re: GPU gets not utilized (G84)
 
Quote:

Originally Posted by Lithorus
If you are really that certain that Wine is not the problem, why are you so reluctant to prove your claim by running native vs. native?

As I said, I don't want to put more effort into this. And it seems you still don't get it; maybe it's me making things a little unclear (sorry for that). Whenever the speed is low, the display on my graphics card also indicates it by showing low temperature values. Native games show correct values, as do most Wine applications. Correct means 48-50 °C (heavy load), incorrect 38-42 °C (idle, 2D mode).
So it's not about speed; it's about a GPU that doesn't seem to operate correctly and pushes most of the work onto the CPU. Since it does operate correctly under Windows, I assume my hardware is OK but the driver needs to be fixed.
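
You can also confirm from the driver side whether the card ever leaves its 2D clocks; a sketch, assuming the driver exposes the GPUCurrentClockFreqs attribute for this card:

Code:

# current core,memory clocks in MHz; they should jump up under real 3D load
nvidia-settings -q GPUCurrentClockFreqs
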
Why don't we play the game the other way around: prove to me that the nvidia drivers are correct!?! :p
:D

Quote:

Originally Posted by Lithorus
Also, the version numbers of the Linux drivers don't really map directly to the Windows driver numbers, AFAIK.

And why do they put a 100 in front of it? For fun? They explicitly changed their driver naming scheme to reflect that. Also look at the latest released legacy drivers.
No, NVIDIA still hasn't released the 160.x drivers for Linux users, and I'm waiting for them...

Quote:

Originally Posted by Lithorus
Edit:
One more thing: could you test whether your overclocked CPU is REALLY running at that speed? I've seen posts elsewhere saying that overclocked CPUs can get reported wrongly. For instance, what is the CPU speed score in 3DMark03?

It's working correctly. Phoronix gets 110 fps in their benchmark; I get 164 fps.
They have a Pentium D 820 (2.8 GHz) and I have a Core 2 (2.9 GHz).

