Does 1 MB of L2 cache really make a difference?

saturnotaku
01-31-06, 09:31 AM
I've been reading and looking at some comparisons, but I haven't seen much that directly shows the benefits of 512 KB vs. 1 MB of L2 cache on a CPU. So I thought I'd put the question out here. What, if any, difference does this make in terms of performance? Is it negligible to the point where spending the extra money for 1 MB isn't totally worth it?

CaptNKILL
01-31-06, 09:45 AM
I hope it's negligible, because that would make my CPU easily faster than a 4000+ ;) :D

I've always heard people associate more cache with "smoother" system performance, but since the A64 was released I haven't heard that too much (how much smoother can it get?). I really don't think it has that much effect after looking at some benchmarks, but I've never used a 1 MB cache A64 myself, so I can't really comment any further than that.

|MaguS|
01-31-06, 09:48 AM
I think it mainly comes into play in CPU-heavy applications and games.

Riptide
01-31-06, 10:00 AM
It doesn't help much with games. It doesn't even help much with other applications, unless you consider 6% or less a big difference.

Now, going from 256 KB to 512 KB, there is probably a bigger difference to be had. Diminishing returns.
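If anyone wants to see for themselves where cache size kicks in, a pointer-chase microbenchmark shows it directly: time dependent random loads over growing working sets and watch the ns-per-access jump once the set outgrows L2. A minimal C sketch (illustrative only; the sizes and hop count are arbitrary assumptions, not taken from any article linked in this thread):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Pointer-chase microbenchmark. Each slot holds the index of the next slot
 * to visit, shuffled into one big cycle so the hardware prefetcher can't
 * guess ahead. Average time per hop rises sharply once the working set no
 * longer fits in L2. */
static double ns_per_access(size_t n_elems, size_t hops)
{
    size_t *next = malloc(n_elems * sizeof *next);
    size_t i, j;

    for (i = 0; i < n_elems; i++)
        next[i] = i;
    /* Sattolo's algorithm: always swapping with r < i guarantees one single
     * cycle covering every element. Two rand() calls are combined because
     * RAND_MAX can be as small as 32767. */
    for (i = n_elems - 1; i > 0; i--) {
        size_t r = ((size_t)rand() * ((size_t)RAND_MAX + 1) + (size_t)rand()) % i;
        size_t tmp = next[i]; next[i] = next[r]; next[r] = tmp;
    }

    clock_t start = clock();
    for (i = 0, j = 0; i < hops; i++)
        j = next[j];                    /* serialized dependent loads */
    clock_t stop = clock();

    {
        volatile size_t sink = j;       /* keep the loop from being optimized out */
        (void)sink;
    }
    free(next);
    return (double)(stop - start) / CLOCKS_PER_SEC * 1e9 / (double)hops;
}

int main(void)
{
    size_t kb;
    srand(1234);
    for (kb = 64; kb <= 4096; kb *= 2)
        printf("%4lu KB working set: %6.1f ns/access\n",
               (unsigned long)kb,
               ns_per_access(kb * 1024 / sizeof(size_t), 10000000));
    return 0;
}

On a 512 KB part the latency jump should land between the 512 KB and 1 MB rows; on a 1 MB part, one row later. That gap is the whole difference being argued about, and it only matters when a program's hot data happens to fall in that size range.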

Lfctony
01-31-06, 10:11 AM
Article:
http://www.phoronix.com/scan.php?page=article&item=219&num=1

jeffmd
01-31-06, 10:11 AM
Rip, 6% is a big difference. ^^

That's pretty much it, though. I've seen benchmarks say it helps in most games, but it doesn't help so much in raw CPU applications like video compression. We saw the same thing when the XP Barton came out, doubling the L2 cache to 512 KB.

Riptide
01-31-06, 10:14 AM
6% is a best case scenario. In many cases it's around 3-4%. That isn't even worth $50 to me.

I'd lay $100 on the table and bet anyone here that in a blind test they couldn't tell a difference. That to me just indicates how worthless it really is.

Especially at high resolutions, which are still normally GPU-bound in modern games, it's completely and totally worthless.

CaptNKILL
01-31-06, 10:48 AM
Article:
http://www.phoronix.com/scan.php?page=article&item=219&num=1
Bleh, they used a 6600GT for the tests... why the hell would they use a mid-range GPU with a high-end CPU :o

$n][pErMan
01-31-06, 12:43 PM
Bleh, they used a 6600GT for the tests... why the hell would they use a mid-range GPU with a high-end CPU :o
Because it forces the CPU to do more work.. :p

CaptNKILL
01-31-06, 01:01 PM
Because it forces the CPU to do more work.. :p
Yeah, that's how it works... yeah... :p

Riptide
01-31-06, 01:09 PM
Which of course is a pretty unrealistic scenario. CPU-limited scenarios are rare these days.

We need better GPUs, not CPUs.

ViN86
01-31-06, 01:21 PM
Which of course is a pretty unrealistic scenario. CPU-limited scenarios are rare these days.

We need better GPUs, not CPUs.
Well, most high-end cards get CPU-bottlenecked at low resolutions; that's when you crank up the AA/AF.

I also like how you practice what you preach. I see that X2 4200+ ;)
Hey, you're the reason I went with a 3800+ instead of a 4400+ and saved myself hundreds. Not a single regret, I might add.

Tr1cK
01-31-06, 02:03 PM
Save the money, go with 2 gigs of system RAM, and the performance is smooth.

Not a single regret, I might add
^QFT

Riptide
01-31-06, 02:13 PM
I also like how you practice what you preach. I see that X2 4200+ ;)
Thanks man, I learned my lesson with the FX-53. Good chip, but holy, it was not even close to worth the premium I paid vs. the 3800+. Big mistake.

The ultra high end is almost never a good value. This is just how it is.

But honestly... If I had the $$ to waste I'd have an FX-60 on the way today. ;)

saturnotaku
01-31-06, 02:32 PM
Then let me ask this, would it be a worthwhile investment to swap out my single-core Venice 3500+ (2.2 GHz, 512 KB L2) for an X2 of the same specs? Or should I try to go for an X2 with a faster clock speed and same L2 (4600+) or the same clock speed and more L2 (4400+)?

Thanks for the replies thus far.

Riptide
01-31-06, 02:56 PM
Clock speed should take priority, and an X2 is a nicety, though as pointed out many times before, it isn't really going to bump up that many games. Some do benefit, but it's not like a 50% improvement.

I'd get the 4600+ if you can swing it. Stay away from the 4800+ and the FX-60.

4200+ if you want to save coin.

Lfctony
01-31-06, 05:31 PM
I went from a 3500+ to an Opty 150, which is the equivalent of a 4000+: 2.4 GHz/1 MB compared to the 3500+'s 2.2 GHz/512 KB. While I could measure a slight difference in games, it didn't feel faster. I ended up selling the bugger and got a 4200+ X2 for the same money. I also encouraged my brother to go with a 3500+ instead of a 4000+ and spend the extra money on the video card. Needless to say, a 3500+/X1800XT combo is much better than a 4000+/7800GT combo.

j0j081
02-01-06, 12:00 AM
My Athlon 64 w/1 MB cache pwns at 2.4 GHz!

Rytr
02-01-06, 12:36 AM
The price difference in the X2 line makes those with 512 KB definitely a better buy. I tested a 3400+ CH against a 3200+ NC with both running at 2.2 GHz in the same 754 system. About the best I got was a 3% overall increase in Doom and UT4.

retsam
02-01-06, 02:19 AM
If you're gonna run server software, then the extra core and extra cache are worth it, but if you're not, keep the money in your pocket and go for a faster video card.

saturnotaku
02-01-06, 05:04 AM
That being the case, I think I'll just stick with my 3500+ and look into upgrading my monitor instead. Thanks. :)

grey_1
02-01-06, 12:30 PM
That being the case, I think I'll just stick with my 3500+ and look into upgrading my monitor instead. Thanks. :)
This thread just saved me some cash :D, time for a monitor here as well. Thanks guys.

Mr_LoL
02-01-06, 01:54 PM
I bought mine for naming reasons. 4000+ sounds better than 3800+ :D Though it was 60 more expensive than the 3800+ :(

agentkay
02-01-06, 02:14 PM
Is the cache automatically utilized by the OS? I ask because I read last month that XP and 2K use only 256 KB unless you add or change a certain registry value and enter the amount of cache your CPU has.

CaptNKILL
02-01-06, 02:18 PM
Is the cache automatically utilized by the OS? I ask because I read last month that XP and 2K use only 256 KB unless you add or change a certain registry value and enter the amount of cache your CPU has.
I don't think that's true. There are performance differences between CPUs with different amounts of cache, so the cache is clearly being used. The L2 cache is managed by the CPU hardware itself, not the OS, so Windows couldn't leave it idle even if it wanted to. Maybe that tweak applied to Windows 9x?
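If memory serves, the tweak being described is the SecondLevelDataCache DWORD under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management, and Windows only falls back to it when the HAL can't report the L2 size (ancient pre-P6 hardware); on anything recent it's ignored. A quick C sketch to check what's set on your box (illustrative and untested; assumes the standard Win32 registry API, link with advapi32):

/* Read the SecondLevelDataCache registry value. 0 (the default) means
 * Windows takes the cache size from the HAL/CPU instead of the registry. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    DWORD value = 0, size = sizeof value, type = 0;

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
            "SYSTEM\\CurrentControlSet\\Control\\Session Manager\\Memory Management",
            0, KEY_READ, &key) != ERROR_SUCCESS) {
        printf("Could not open the Memory Management key.\n");
        return 1;
    }
    if (RegQueryValueExA(key, "SecondLevelDataCache", NULL, &type,
            (LPBYTE)&value, &size) == ERROR_SUCCESS && type == REG_DWORD)
        printf("SecondLevelDataCache = %lu KB (0 = let Windows detect it)\n",
               (unsigned long)value);
    else
        printf("Value not set; Windows detects the cache size itself.\n");
    RegCloseKey(key);
    return 0;
}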