Originally Posted by Roadhog
Logic... isn't your strong point, is it?
How long do you think it will take you to make up the money spent on a new gpu in power savings?
Let's say the 7870 costs $300 and draws 150 watts under 100% load.
The price of your current 5870 doesn't matter, but let's say it draws 250 watts under 100% load. That's 100 watts more than the 7870.
Now, if you were to keep the GPU under 100% load 24 hours a day, at an insanely high rate of $0.20/kWh, it would cost you $1.20 a day, or $36 a month, to run your 5870. Compare that to the 7870 at 150 watts, 100% load: $0.72 a day, or $21.60 a month. That's an amazing savings of $14.40 a month. To break even on the 7870's $300 price, it would take you 20.8 months.
And that's the worst case. Drop the rate to a much more reasonable $0.10/kWh and you're looking at 41.7 months to break even. Since you don't actually run your GPU 24/7, it will take far longer than that in practice.
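The arithmetic above is easy to check yourself. Here's a small sketch (function and parameter names are my own, not from the post) that reproduces the 20.8- and 41.7-month figures:

```python
def breakeven_months(card_price, old_watts, new_watts, dollars_per_kwh, hours_per_day=24):
    """Months of continuous use needed for the power savings to repay the new card.

    Assumes a 30-day month, as the post does ($1.20/day -> $36/month).
    """
    saved_kwh_per_day = (old_watts - new_watts) * hours_per_day / 1000
    saved_per_month = saved_kwh_per_day * dollars_per_kwh * 30
    return card_price / saved_per_month

# Worst case from the post: $0.20/kWh, card under full load 24/7
print(round(breakeven_months(300, 250, 150, 0.20), 1))  # 20.8
# More reasonable rate: $0.10/kWh
print(round(breakeven_months(300, 250, 150, 0.10), 1))  # 41.7
```

Cut `hours_per_day` down to a realistic gaming schedule (say 4 hours) and the break-even point stretches out roughly sixfold.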
Hey! We agree on something Roadie. I'm always bewildered when someone mentions electricity savings as a factor as well.
I offered to sell a coworker a 50" 720p plasma the other day for $250, and he replied "Don't those use a lot of electricity?!" and didn't buy it. Meanwhile he's watching his old-style tube TV with a much smaller picture that probably draws close to the same juice.
Electricity is way too cheap here to be a factor in anything unless you're literally watching every nickel, in which case you should be gaming on a Nintendo DS, not a computer.