Old 05-21-07, 01:16 PM   #1
knefas
Registered User
 
Join Date: Jul 2005
Posts: 26
Default Interrupts and wakeups: CPU waste.

Hi, I've tried the PowerTOP tool released by Intel to try to maximize my battery life, and it reports that the top cause of CPU wakeups is the nvidia module.

Code:
Top causes for wakeups:
  76.6% (61.0)       <interrupt> : nvidia
  10.3% ( 8.2)       <interrupt> : acpi
   5.3% ( 4.2)       <interrupt> : ohci1394, eth0

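(Side note for anyone who wants to confirm the raw interrupt rate outside of PowerTOP: the per-device counters in /proc/interrupts can be sampled directly. A rough sketch, assuming the nvidia IRQ line is actually labelled "nvidia" on your system:)
Code:
# Sample the nvidia interrupt counter twice, ten seconds apart, to
# estimate the interrupt rate (roughly 60/s would match the report above):
grep -i nvidia /proc/interrupts; sleep 10; grep -i nvidia /proc/interrupts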
Someone on the powertop mailing list hypothesizes that the problem could be some kind of VBLANK sync (I get ~60 wakeups per second, which would make sense). For other cards the option "NoDRI" solves this, but the nvidia module seems to ignore it:

Code:
(WW) NVIDIA(0): Option "NoDRI" is not used
I've tried searching the web and the manual, but haven't found anything. Does anyone have a suggestion for stopping the nvidia module from generating these interrupts?
Old 05-21-07, 06:33 PM   #2
STrRedWolf
Registered User
 
Join Date: Feb 2005
Posts: 5
Default Re: Interrupts and wakeups: CPU waste.

It's the same here on my GeForce Go 7300-equipped Dell Inspiron E1505, using the latest 9800-series drivers. The nvidia driver is constantly at the top of interrupt usage, waking the CPU up and using a ton of power.

We're discussing this on Intel's Power mailing list. If there's a way to turn off the interrupt usage in 2D (turning it back on in 3D when needed), let us know. Extra bonus points if one of NVidia's engineers gets on the list. Intel has already fixed their driver.
Old 05-21-07, 11:30 PM   #3
netllama
NVIDIA Corporation
 
Join Date: Dec 2004
Posts: 8,763
Default Re: Interrupts and wakeups: CPU waste.

Syncing to VBLANK is not enabled by default with the NVIDIA driver. If it's enabled on your system, then either you've opted to enable it, or you're using a third-party repackaging of the NVIDIA driver which has it enabled by default (Livna did this in the past; I'm not sure if they still do). The nvidia driver does not use the DRI infrastructure. Please see the driver README's discussion of the __GL_SYNC_TO_VBLANK variable for more information.
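(For reference, that variable can be checked and forced off per-application from a shell before launching an OpenGL program; a minimal sketch, assuming a bash-style shell and glxgears as the test program:)
Code:
# Show whether anything in the environment is forcing sync-to-vblank on:
echo "__GL_SYNC_TO_VBLANK=${__GL_SYNC_TO_VBLANK:-unset}"
# Explicitly disable it for a single application:
__GL_SYNC_TO_VBLANK=0 glxgears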

IRQ usage cannot be disabled with the nvidia driver.
Old 05-22-07, 04:11 AM   #4
zbiggy
Registered User
 
Join Date: Sep 2002
Posts: 623
Default Re: Interrupts and wakeups: CPU waste.

Maybe this will help:
Code:
Option "UseEvents" "1"
Old 05-22-07, 05:20 AM   #5
STrRedWolf
Registered User
 
Join Date: Feb 2005
Posts: 5
Default Re: Interrupts and wakeups: CPU waste.

Quote:
Originally Posted by netllama
Syncing to VBLANK is not enabled by default with the NVIDIA driver. If it's enabled on your system, then either you've opted to enable it, or you're using a third-party repackaging of the NVIDIA driver which has it enabled by default (Livna did this in the past; I'm not sure if they still do). The nvidia driver does not use the DRI infrastructure. Please see the driver README's discussion of the __GL_SYNC_TO_VBLANK variable for more information.

IRQ usage cannot be disabled with the nvidia driver.
That's very bad, netllama, since IRQ usage in 2D isn't really needed (3D, granted, needs it, and that should be automatic). Besides, IRQ usage == CPU usage == power usage, and on any laptop less power is good.

I'm not using sync-to-VBlank in Gentoo. Yet 90% of my IRQs are still pinpointed to NVIDIA's driver, and that means I only get 3 hours out of a supposed 4-hour battery (and 4 out of a 5.5, all under 2D usage -- I expect to burn more battery in 3D).
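(A quick way to double-check that nothing is turning sync-to-vblank back on, assuming nvidia-settings is installed and the attribute name matches your driver version:)
Code:
# Query the current OpenGL sync-to-vblank setting (0 = off, 1 = on):
nvidia-settings --query SyncToVBlank
# And make sure the environment isn't overriding it:
echo "${__GL_SYNC_TO_VBLANK:-unset}"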
Old 05-22-07, 06:36 AM   #6
chunkey
#!/?*
 
Join Date: Oct 2004
Posts: 662
Default Re: Interrupts and wakeups: CPU waste.

No, you don't want any software 2D! That EATS your battery too!
Furthermore, disabling IRQs is rubbish. Or are you the kind of person who turns off the clock generator of a pacemaker just to save its batteries?
(Yes, believe it or not, if the IRQ is disabled the driver needs to POLL the device, and that really burns energy!)

So listen, if you want to do something, then you have to sue the laptop vendor for "false advertising".
Old 05-22-07, 07:51 AM   #7
STrRedWolf
Registered User
 
Join Date: Feb 2005
Posts: 5
Default Re: Interrupts and wakeups: CPU waste.

Quote:
Originally Posted by chunkey
No, you don't want any software 2D! That EATS your battery too!
Furthermore, disabling IRQs is rubbish. Or are you the kind of person who turns off the clock generator of a pacemaker just to save its batteries?
(Yes, believe it or not, if the IRQ is disabled the driver needs to POLL the device, and that really burns energy!)

So listen, if you want to do something, then you have to sue the laptop vendor for "false advertising".
Actually, I want my laptop to be a mobile Second Life client. But I gotta work for my Lindens.

But in all seriousness, kernel 2.6.21 has the option of going "tickless": if the kernel doesn't need to wake up for anything in the next few seconds, it cancels the periodic timer tick for that span. That lets the CPU snooze in a low-power state a bit longer and stretches the battery safely (we all know how badly putting a Toshiba 115CS battery under a rolling asphalt packer turned out).
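(A quick way to check whether the running kernel was actually built tickless, assuming your distro installs the kernel config under /boot or enables /proc/config.gz:)
Code:
# Look for CONFIG_NO_HZ=y in the running kernel's configuration:
grep NO_HZ /boot/config-$(uname -r)
# or, if /proc/config.gz is available:
zgrep NO_HZ /proc/config.gz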

Intel was able to do this with their driver in 2D mode, which is where most usage time is spent. Remember the reports of Vista's Aero interface cutting battery life by half, if not two-thirds? Disable Aero and power usage goes back to XP levels. NVIDIA is in the same position Intel was in before Intel patched their driver to behave like a tickless kernel.

Besides, 60-72 Hz? Most video doesn't go beyond 30 Hz anyway, and GIF animations are limited to 10 Hz. What's so important that the CPU has to wake up 60 to 72 times a second and listen to the video card when you're on a laptop with a static display?

(Oh, and suing the laptop vendor? I think they covered their butt on that with the legal documents, plus I already blew away the warranty on the Dell E1505 -- I replaced the crappy ATI card with an NVIDIA one. What's better, 64 MB of VRAM and a constant 5 fps, or 256 MB of VRAM and an average of 25 fps?)
Old 05-22-07, 10:08 AM   #8
chunkey
#!/?*
 
Join Date: Oct 2004
Posts: 662
Default Re: Interrupts and wakeups: CPU waste.

Quote:
Originally Posted by STrRedWolf
But in all seriousness, kernel 2.6.21 has the option of going "tickless": if the kernel doesn't need to wake up for anything in the next few seconds, it cancels the periodic timer tick for that span. That lets the CPU snooze in a low-power state a bit longer and stretches the battery safely (we all know how badly putting a Toshiba 115CS battery under a rolling asphalt packer turned out).
... but with a POLLING driver you will never get into a C3 or deeper state either, because the CPU is always checking the state of the graphics card.

Quote:
Originally Posted by STrRedWolf
Intel was able to do this with their driver in 2D mode, which is where most usage time is spent. Remember the reports of Vista's Aero interface cutting battery life by half, if not two-thirds? Disable Aero and power usage goes back to XP levels. NVIDIA is in the same position Intel was in before Intel patched their driver to behave like a tickless kernel.
AFAIK, NVIDIA has invented some fancy mobile technologies to "extend" the user experience (well, no, they just invented PR slogans for old technology).

Have you ever tried Thunderbird's nvclock (or enabled Coolbits)? You can use these tools to _lower_ your 2D clock speeds and _raise_ your runtime.

Quote:
Originally Posted by STrRedWolf
Besides, 60-72 Hz? Most video doesn't get beyond 30 Hz anyway, and GIF graphics are limited to 10 Hz. What's so important to wake up 60 to 72 times a second and listen to the video card when you're on a laptop and a static display?
There are several good reasons... if you still own a CRT.
As for me, I can't stand displays running at less than 85 Hz, yes, even TFTs.

Quote:
Originally Posted by STrRedWolf
(Oh, and suing the laptop vendor? I think they covered their butt on that with the legal documents, plus I already blew away the warranty on the Dell E1505 -- I replaced the crappy ATI card with an NVIDIA one. What's better, 64 MB of VRAM and a constant 5 fps, or 256 MB of VRAM and an average of 25 fps?)
Ahh, now that may be the REAL reason why your batteries are "only" lasting 3 hours instead of 4.

Old 05-22-07, 12:31 PM   #9
pe1chl
Registered User
 
Join Date: Aug 2003
Posts: 1,026
Default Re: Interrupts and wakeups: CPU waste.

Quote:
Originally Posted by chunkey
... but with a POLLING driver you will never get into a C3 or deeper state either, because the CPU is always checking the state of the graphics card.
But is this really necessary?
Maybe you need to stop and reconsider the state of things. That is what the tickless kernel developers did, and with some lateral thinking they were able to accomplish some nice things.

For example, in the classic system there is a timer tick 100 or 1000 times per second, because we always believed it was required for scheduling and timekeeping. But they were able to get rid of that and use programmed timer interrupts only when they are really required.
When more than one task is ready to run and they need to compete for the CPU, a scheduled tick is required to interrupt the current task and switch to the other. But when there is no ready task at all, why interrupt all the time just to see if you need to schedule? The CPU can remain asleep until a device interrupt arrives indicating the mouse was moved, a key was typed, etc.
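(A concrete way to see who is still programming wakeups on such a kernel, assuming it was built with CONFIG_TIMER_STATS; this is the same interface PowerTOP reads:)
Code:
# Collect timer/wakeup statistics for ten seconds, then read them back (as root):
echo 1 > /proc/timer_stats
sleep 10
echo 0 > /proc/timer_stats
sort -rn /proc/timer_stats | head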

Probably a similar thing is possible in the 2D (or even 3D) video driver. When nothing has changed on the display and the GPU is not busy doing anything, I find it hard to believe that you need an interrupt at every vertical retrace just "to check the state". Maybe when you are waiting for something to complete, but not all the time.
So a little lateral thinking can save a lot of energy, and dismissing the possibility from the outset is not a very constructive approach.
Old 05-22-07, 01:17 PM   #10
chunkey
#!/?*
 
Join Date: Oct 2004
Posts: 662
Default Re: Interrupts and wakeups: CPU waste.

So, where's the big performance/power gain? I can't see any.

At "best", it's maybe a half minute more "battery life". But, stuff like
Cool 'n' Quiet or Speedstep "enlarge" it up to a hour!
Old 05-22-07, 02:28 PM   #11
STrRedWolf
Registered User
 
Join Date: Feb 2005
Posts: 5
Default Re: Interrupts and wakeups: CPU waste.

Quote:
Originally Posted by chunkey
At "best", it's maybe a half minute more "battery life". But, stuff like
Cool 'n' Quiet or Speedstep "enlarge" it up to a hour!
Already underclocking with SpeedStep, and I've killed a few other items too. The system is running mostly in the C2/C3 range.

Running with Coolbits and UseEvents on. Interrupts on the driver are still at 60 Hz (a few extra because the Intel wireless card in here is sharing the interrupt), but I did get the power consumption down to between 18.6 W and 15.2 W (roughly 5.3 hours) in 2D.

In 3D mode, throw those figures out the window anyway. But most of the time I'm in 2D, so I want to squeeeeze as much as I can out of that, so that when I go into 3D (Second Life over WiFi at a mom-and-pop coffee shop, wheee!) I can play for as long as I can and not have to pay for the power with another mint mocha frappuccino.

But as mentioned above, it's not "interrupts from the card are EVIL" or "timer interrupts are BAD"; it's "are we waking up to do nothing?" Waking up to do nothing at all is bad for power consumption. I've already tweaked my setup so it runs on less power (tickless kernel, ACPI/SpeedStep, no IRQ balancing, killing a few other items). I want to tweak it down mooooore, and NVIDIA is suspect #1 now (followed by USB if I plug anything in there, though I suspect FireWire will be lighter on power because it uses DMA).
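(For anyone repeating these measurements: on a 2007-era kernel the instantaneous draw can be read straight from the ACPI battery interface while unplugged; the battery name BAT0 is an assumption and varies by machine:)
Code:
# Instantaneous discharge rate while on battery (reported in mW on most machines):
grep "present rate" /proc/acpi/battery/BAT0/state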
Old 06-20-07, 07:56 AM   #12
Cougar81
Registered User
 
Join Date: Jun 2007
Location: Helsinki, Finland
Posts: 6
Default Re: Interrupts and wakeups: CPU waste.

Quoted from http://linuxpowertop.org/known.php#intelgfx:
Quote:
The Intel graphics driver used to have a bug where it set up the hardware to generate an interrupt (called VBLANK) each time the screen finished refreshing (typically at 60Hz or 72Hz). In normal 2D mode, this interrupt isn't actually used. PowerTOP unveiled this bug and the current Intel graphics driver no longer has this behavior.
I wonder if this could be done for the NVidia driver as well?