
View Full Version : Geforce 6800U to have NEW clock speeds!



V0ices
05-01-04, 10:45 AM
yehaa: nv40.com (http://www.nv40.com)

They will clock their cards faster to beat ATI.

Bad_Boy
05-01-04, 11:40 AM
http://www.nvnews.net/vbulletin/showthread.php?t=28105 ;)

Snooka
05-01-04, 12:17 PM
Wow! nvidia really wants the crown back.

BigE4u
05-01-04, 03:01 PM
Not to throw a monkey wrench in the works but... it shouldn't have to be about higher clock speeds, it should be more about efficiency. It's like Intel: they raise the clocks up really high to make them lQQk faster, but AMD can do just about the same work at comparably lower speeds because they are more efficient. Don't get me wrong, I've always been a big fan of nVidia for as long as I can remember, but they need to start thinking differently, like giving us better IQ without having to push the clocks.

PS: It's like having two car engines with the same horsepower rating, one with a carburetor, the other with fuel injection... now who do you think will be more efficient? Point is, you have to learn to adapt instead of being "old school" all your life or making excuses to justify the means.

Casper
05-01-04, 03:14 PM
At this phase it's the only thing they can do short of taking their NV40 back to the drawing board...

Both ATI and nVidia would want to clock their cards as low as possible to increase yields; that is why we see these higher numbers now. When ATI has shown they have a faster card, nVidia has to follow... And then ATI will try to up it even further, and so on.

Great for us customers.

DJB
05-01-04, 03:46 PM
In what way do you think that the 6800 series is NOT more efficient?
Clearly Nvidia is already going that route. The cards tested achieve twice the performance of the 5900 series, with relatively young drivers, at a lower core clock. They are already "AMDing" it.
That doesn't mean that they are not going to increase core speeds as well in order to improve performance of the new architecture. It makes perfect sense to do so if they can achieve high enough yields.

Take NV40 back to the drawing board? ROFL. It already is a new architecture. I would expect very little change in the NV40 architecture other than perhaps a die shrink in the next refresh.

DJB

reever2
05-01-04, 03:49 PM
they raise the clocks up really high to make them lQQk faster, but AMD can do just about the same work at comparably lower speeds because they are more efficient

CPU-to-GPU analogies don't work. GPUs are constantly changing architectures; CPUs hardly change at all with new generations, just a die shrink and raised clock speeds. It's how it has always been.

dave_
05-01-04, 04:18 PM
CPU-to-GPU analogies don't work. GPUs are constantly changing architectures; CPUs hardly change at all with new generations, just a die shrink and raised clock speeds. It's how it has always been.

That's absolute nonsense. Compare an AMD Duron / Athlon / Athlon XP / AMD64 all at the same clock speed and then tell me the differences are just "die shrink and raised clock speeds".

reever2
05-01-04, 04:48 PM
That's absolute nonsense. Compare an AMD Duron / Athlon / Athlon XP / AMD64 all at the same clock speed and then tell me the differences are just "die shrink and raised clock speeds".

Yeah, and cache increases, bus increases, memory speed raises, peripheral additions made to the mobo and proc (IMC), and IPC tweaks. None of those CPUs differ drastically when it comes to what actually makes up the CPU. CPU makers aren't going to double any part of the CPU every generation like we have with GPUs, or make any huge increases in transistor counts; the actual CPU (minus cache), back then and now, still only takes less than 40 million transistors.

SuLinUX
05-01-04, 05:48 PM
Not to throw a monkey wrench in the works but... it shouldn't have to be about higher clock speeds, it should be more about efficiency. It's like Intel: they raise the clocks up really high to make them lQQk faster, but AMD can do just about the same work at comparably lower speeds because they are more efficient. Don't get me wrong, I've always been a big fan of nVidia for as long as I can remember, but they need to start thinking differently, like giving us better IQ without having to push the clocks.

PS: It's like having two car engines with the same horsepower rating, one with a carburetor, the other with fuel injection... now who do you think will be more efficient? Point is, you have to learn to adapt instead of being "old school" all your life or making excuses to justify the means.

Pushing the clock speeds up on the NV40 has got NOTHING to do with IQ; maybe when you see PS 3.0/SM3 running at 32-bit FP then you may just shut up.

The Intel and AMD clock speed debate doesn't even compare to this, as the NV40 has an efficient architecture, and so will the R350.

Mr Mean
05-01-04, 05:55 PM
I think they are throwing everything at ATI to get the speed crown and the most feature set card available. Not a bad move indeed :)

dave_
05-01-04, 06:22 PM
I think they are throwing everything at ATI to get the speed crown and the most feature set card available. Not a bad move indeed :)

The first 6800 reviews said that the huge heatsink remained quite cool; maybe they intentionally made it unnecessarily large to accommodate this "surprise" clock bump later.

Nv40
05-01-04, 06:23 PM
Don't get me wrong, I've always been a big fan of nVidia for as long as I can remember, but they need to start thinking differently, like giving us better IQ without having to push the clocks.

huh?

Nvidia's focus this time was not merely the "clocks". Let's see: new AA modes (more soon to come), new AF up to 16x, full-speed FP32 precision, full support for Shader Model 3.0, plus floating-point blending and texture filtering. All of that together means not only more efficiency in programming, but more advanced graphics than ever before. As a bonus, fully programmable hardware dedicated to video. And its performance with beta drivers on a reference board is up to 2x the speed of the fastest card at the moment. What else do you want? :)

Well... it would be nice to have a video card that can do Lord of the Rings movie graphics in real time for just $199, but I guess we will need to wait some time for that :D

dave_
05-01-04, 06:38 PM
huh?

Nvidia's focus this time was not merely the "clocks". Let's see: new AA modes (more soon to come), new AF up to 16x, full-speed FP32 precision, full support for Shader Model 3.0, plus floating-point blending and texture filtering. All of that together means not only more efficiency in programming, but more advanced graphics than ever before. As a bonus, fully programmable hardware dedicated to video. And its performance with beta drivers on a reference board is up to 2x the speed of the fastest card at the moment. What else do you want? :)

Well... it would be nice to have a video card that can do Lord of the Rings movie graphics in real time for just $199, but I guess we will need to wait some time for that :D

Yeah, but they pulled loads of IQ BS with the NV3x, and there are already hints of the same for the NV4x in Far Cry and 3DMark03 :( Hopefully it'll be different this time.

Nv40
05-01-04, 06:51 PM
Yeah, but they pulled loads of IQ BS with the NV3x, and there are already hints of the same for the NV4x in Far Cry and 3DMark03 :( Hopefully it'll be different this time.


Yeah... all the other 99% of the benchmarks in PS 2.0 and PS 1.x don't count. So the NV4x will not be able to run DX9 games at full speed without quality issues... right? :rolleyes:

dave_
05-01-04, 07:01 PM
Yeah... all the other 99% of the benchmarks in PS 2.0 and PS 1.x don't count. So the NV4x will not be able to run DX9 games at full speed without quality issues... right? :rolleyes:

You on crack or something? Some reviews have questioned the "purity" of NV4x IQ, is all. I didn't say anything about performance or DX9 or anything else you're babbling on about.

Or can you only understand something if you can make it into "nvidia fanboy BS vs. ATI fanboy BS"?

Dirty
05-01-04, 07:07 PM
:clap: :clap: :nana: :nana: :smoking2: :beer: (bud) :thumbsup:

Dirty
05-01-04, 07:08 PM
go back to rage3d.com

jimmyjames123
05-01-04, 07:22 PM
Yeah, but they pulled loads of IQ BS with the NV3x, and there are already hints of the same for the NV4x in Far Cry and 3DMark03

I'll say it again: the NV40 was being detected as NV3x hardware in Far Cry using the v1.1 patch! The NV3x setting is used to give NV3x cards a boost in performance, and the tradeoff is IQ. The NV40 is an entirely different animal. In fact, some tests have suggested that the NV40's PS 2.0 speed is at times higher than its PS 1.1 speed! So, obviously, the NV40 will not stand to gain much from substituting PS 2.0 shaders with PS 1.1 shaders. As for 3DMark03, the DH article was more FUD than anything, really. Several reviews have compared IQ on the NV40 vs. current-gen cards, and there were no striking differences in 3DMark03. Most likely, the so-called issue can be explained by different rendering techniques and beta Forceware drivers.

dave_
05-01-04, 07:25 PM
I'll say it again: the NV40 was being detected as NV3x hardware in Far Cry using the v1.1 patch! The NV3x setting is used to give NV3x cards a boost in performance, and the tradeoff is IQ. The NV40 is an entirely different animal. In fact, some tests have suggested that the NV40's PS 2.0 speed is at times higher than its PS 1.1 speed! So, obviously, the NV40 will not stand to gain much from substituting PS 2.0 shaders with PS 1.1 shaders. As for 3DMark03, the DH article was more FUD than anything, really. Several reviews have compared IQ on the NV40 vs. current-gen cards, and there were no striking differences in 3DMark03. Most likely, the so-called issue can be explained by different rendering techniques and beta Forceware drivers.

nice reply thanks.

Dirty
05-01-04, 07:25 PM
As for 3DMark03, the DH article was more FUD than anything, really.

AMEN :cool3:

:D

kev13dd
05-01-04, 07:31 PM
Current series out right now:

ATI has lower clocks, and better performance on most things.

Series coming out:

Nvidia has practically a new design. This stuff is coming from 3DFX more than Nvidia, and they are incorporating new ideas and new designs to get better performance.

ATI isn't coming out with too much brand-new revolutionary stuff besides more pipelines. How do they make the performance go up? Increase the clock speeds!

Haven't we learned that higher clock speeds hardly mean better performance when it comes to comparing cards made by different companies?

K

john19055
05-01-04, 08:03 PM
It's looking more and more like my next card will be Nvidia.

-=DVS=-
05-01-04, 08:14 PM
If the Ultra will be, let's say, 450 or even 475MHz, it would be pretty cool, not a stinky 400 :p

theshape1978
05-01-04, 09:04 PM
Enough... enough... give me the card already.