View Full Version : At what point do CPU speeds become irrelevant?


Omega53
04-09-04, 12:48 AM
Ok, so I went out and bought a mobile 2500+ just like everyone and their mother, and I was able to hit 2.44 GHz (195*12.5) until I get some better RAM. My question is: when does CPU speed start to become irrelevant? I mean, do we really need 2.44 GHz speeds? Not that I'm going to stop tweaking this baby until it maxes out :D but I don't know why I do it. It's like I'm obsessed without a reason for being obsessed. I haven't noticed any huge difference over my old 2600+ (133*16 @ 2.13 GHz). I just feel like no matter how fast my CPU is, it will never be fast enough :(
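For reference, the clocks quoted above just come from FSB times multiplier. A quick sanity check in Python, using only the numbers from the post:

```python
# Overclock arithmetic: CPU clock = FSB (MHz) * multiplier.
print(195 * 12.5)   # 2437.5 MHz, i.e. the ~2.44 GHz quoted
print(133 * 16)     # 2128 MHz, the old 2600+ at ~2.13 GHz
```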

Sazar
04-09-04, 12:55 AM
800MHz...

unless I am gaming, it is irrelevant for me to have something faster for normal usage...

:cool:

Omega53
04-09-04, 01:18 AM
800MHz...

unless I am gaming, it is irrelevant for me to have something faster for normal usage...

:cool:

I meant: in gaming, when does it become irrelevant? :D

Sazar
04-09-04, 01:20 AM
some people do it for a big pen0s factor...

:shrug:

I personally don't really care anymore... :)

I am not going to notice a 2-3 fps difference and the like... there is little logical reason for me to do it...

Clay
04-09-04, 01:22 AM
So you're asking about video card scaling in relation to CPU speed? The anecdotal evidence I've seen suggests it has not become irrelevant yet. Even with the fastest offerings from Intel and AMD, the highest-end video cards from ATI and NVIDIA continue to scale well.

Clay
04-09-04, 01:25 AM
some people do it for a big pen0s factor...

:shrug:

I personally don't really care anymore... :)

I am not going to notice a 2-3 fps difference and the like... there is little logical reason for me to do it...

I love common sense like this...just wish there was more of it. :) I couldn't agree more; deltas of <10~15fps mean little to me around the 40fps mark, and even <20-30fps mean little once you're above 60fps. It never ceases to amaze me how some people can get so hung up on splitting hairs (be it FPS, IQ minutiae, or the like). Sorry Neo_Radeon9700, that's not directed at you...I'm just talking to myself. :p

jAkUp
04-09-04, 01:30 AM
It depends on the game, really... with newer games you aren't gonna see much of an improvement from bumping up the CPU... they are mostly video card dependent.

Omega53
04-09-04, 01:49 AM
It depends on the game, really... with newer games you aren't gonna see much of an improvement from bumping up the CPU... they are mostly video card dependent.

says the man with the 3.8GHz P4 :p

jAkUp
04-09-04, 02:28 AM
ROFL Yea... Use me as an example :D

tieros
04-24-04, 01:55 AM
The answer is "when they are not the bottleneck" in the system.

Right now there is very little reason to beef up your CPU because the current bottlenecks are the GPU and the AGP bus. So the market focuses on the weak links, and delivers new technology (like PCI-Express and the NV40) that shifts the bottleneck back to the CPU, or perhaps RAM or the FSB.

And at no point in history have computers ever been too powerful for the applications we would like to run on them, especially in gaming.

Developers will always expand their products to abuse the resources given them, so don't expect to be able to stop buying stuff anytime soon :type:

PaiN
04-25-04, 10:28 AM
I OC because I luv it...I'm into making the fastest parts faster :D
As for the speed, that's just frosting on the cake..yumm
I've benched this CPU at around 3.85GHz stable but, as with all my systems, for "everyday" use I back off from the max speed (especially on a $1000 CPU ;) ) and find a comfortable level to game my heart out at.

devnull
04-28-04, 10:18 PM
OCing is actually a novelty. Most people do it for the sheer joy of it. I know I get a big charge out of a 210MHz overclock. (XP2600+ from 2.09GHz to 2.3GHz) I bet more than half of you who are into beefing up cars don't have stock parts. What do you need a V8 engine for? :P

Besides, most of the 32bit AMD CPUs cost FAR less than newer videocards. The FX6800 will cost well into the $500 range, but an AMD XP 3200+ can be found at pricewatch.com for $168. And I can guarantee you that with that kind of CPU, you can get away with having a subpar videocard. My dad has a 3200+ with an FX5600, and can run most DX9 games without a flicker. Painkiller, UT2004, Far Cry... they all run perfectly. Try THAT with a subpar CPU.

Of course, we're not gonna be able to do that with games like Doom 3 and Half Life 2. Once those two games come out, that's when I believe CPU speed will no longer matter... at least as much as your videocard.

Daneel Olivaw
04-28-04, 11:08 PM
I study at the University of Montreal, and a big part of my bachelor's degree has to do with algorithms. So I can tell you right away that a 10% difference in CPU speed makes a noticeable difference in games, but in the grand scheme of things it makes no difference at all.

Suppose you have an input of size n. If you have an algorithm (AI in general, shortest path for a concrete example) that runs in time ~n^2, whether your computer is running at 2GHz or 2.2GHz makes no difference to a researcher (though it might to a gamer). If such an algorithm takes 100 years on an input of size 10,000, a 10% overclock is not going to make the problem any more solvable.

Researchers are better off finding an algorithm that does the same work in time ~n*log(n) (as opposed to my example of n^2) instead of upgrading/overclocking their systems. In my example, the algorithm would go from taking 100 years to well under a year.
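To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The cost model (work proportional to the operation count, clock speed dividing it linearly) and the 2.0 GHz baseline are my own illustrative assumptions, not benchmarks:

```python
import math

# Compare what a 10% overclock buys you versus replacing an n^2 algorithm
# with an n*log(n) one, for an input of size n = 10,000.
n = 10000
quadratic_work = n ** 2          # ~1e8 abstract "operations"
nlogn_work = n * math.log2(n)    # ~1.3e5 abstract "operations"

overclock_speedup = 2.2 / 2.0    # 2.0 GHz -> 2.2 GHz
algorithm_speedup = quadratic_work / nlogn_work

print("10% overclock speedup : %.2fx" % overclock_speedup)
print("n^2 -> n log n speedup: %.0fx" % algorithm_speedup)
# Roughly 1.10x from the overclock versus ~750x from the better algorithm,
# which is why the "100 years" problem drops to well under a year.
```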

For gamers: take an algorithm that runs in hundreds of CPU cycles (such as, possibly, parts of the AI in UT) instead of years, and you'll see this logic applies to gamers too.

New instruction sets can sometimes bring improvements like that. DivX encoding will probably improve better than linearly on the NV40, for example, which is not a question of clock speed but rather of circuit 'optimization' (in the true sense, not in the nv drivers sense). Circuit optimization, new instructions, and new algorithms are much more pertinent than clock speed.

Imagine a core that only has circuits to add or subtract integers. How would you go about multiplying? Of course you could, but it would take quite a few cycles in which you would do additions/subtractions. Whereas if you simply added a multiplier to the circuit, you'd be able to multiply in a single cycle, at the cost of bigger circuitry.
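As a toy illustration of that point, here's a shift-and-add multiply sketched in Python. The function and its names are my own; it just shows what an add-only core would have to do in software (a shift is essentially free wiring in hardware):

```python
def multiply_by_adds(a, b):
    """Toy shift-and-add multiply: builds a*b using only additions and bit
    shifts, the way a core with no multiplier circuit would have to do it.
    Every loop iteration costs cycles; a hardware multiplier collapses the
    whole thing into a single instruction (at the cost of more circuitry).
    Assumes non-negative b, which is enough for a sketch."""
    result = 0
    while b > 0:
        if b & 1:        # low bit of b set: add the current shifted copy of a
            result += a
        a <<= 1          # doubling via a left shift
        b >>= 1
    return result

print(multiply_by_adds(13, 11))  # 143, after several add/shift steps
```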

Well, if anyone read through all that, cool. :) Just my 2 cents.

alucard_x
05-04-04, 04:33 PM
Ew, too much data structures and algorithms for you. Thank god I'm done with that class.

Answer: CPU speeds will never be irrelevant, as long as there's new software that continues to push the system.