
512-bit graphics cards?



Geforce4ti4200
05-22-03, 03:41 AM
The GeForce classic had a 256-bit core; isn't it time to move on to 512 bits?

bkswaney
05-22-03, 03:46 AM
Why?

mreman4k
05-22-03, 12:41 PM
Originally posted by bkswaney
Why?

Because I said so..;) :p

Hanners
05-22-03, 12:46 PM
Do you mean the core architecture? I think you'll find that's already at 512 bits.

silence
05-22-03, 02:28 PM
I think he meant the memory bus... did you?

mreman4k
05-22-03, 02:29 PM
Originally posted by Geforce4ti4200
The geforce classic had a 256 bit core, isnt it time to move on to 512 bits?

He means the core... I wonder what the benefits of a 512-bit core are.

StealthHawk
05-22-03, 02:41 PM
Originally posted by silence
i think he meant memory bus....did you?
No. The original GeForce did not have a 256-bit bus... neither does any NVIDIA card besides the NV35.

DivotMaker
05-22-03, 02:47 PM
Originally posted by StealthHawk
No. The original GeForce did not have a 256-bit bus... neither does any NVIDIA card besides the NV35.

I think the original poster may be confused because the original GF was called "GeForce 256"...

Uttar
05-22-03, 03:00 PM
I think you're all confused :P
The GeForce 256 was named that way because the core was 256-bit. Nothing to do with the memory bus.
Don't know much about all that core width stuff, though.


Uttar

Solomon
05-22-03, 03:56 PM
Only card I know of with a 512-bit core is the Matrox Parhelia. At least that's what they advertise.

Regards,
D. Solomon Jr.
*********.com

StealthHawk
05-22-03, 04:12 PM
Originally posted by DivotMaker
I think the original poster may be confused because the original GF was called "GeForce 256"...

Naw, I think it is pretty clear that a GeForce "classic" refers to the original GeForce. You wouldn't call a new card like NV35 a "classic."

deejaya
05-22-03, 06:47 PM
I think they may start appearing with NV40 (or the ATI equivalent), or the generation after, now that a 256-bit memory bus is "normal". And Solomon is right, the Parhelia is supposedly a 512-bit core.

Geforce4ti4200
05-23-03, 12:04 AM
The Riva 128 was the first with a 128-bit core, and the TNT2 Ultra was the last. Then the GeForce's 256-bit core made it quite a fast card. It's time for a 512-bit core to really push those 256 bits of RAM :angel:

StealthHawk
05-23-03, 05:09 AM
As someone else asked... why? What advantages does a 512-bit core have over a 256-bit core?

Geforce4ti4200
05-23-03, 05:58 AM
Twice the pipelines or bandwidth or something? The GeForce 256 went to a 256-bit core...

Dazz
05-23-03, 07:35 AM
Originally posted by Solomon
Only card I know of with a 512-bit core is the Matrox Parhelia. At least that's what they advertise.

Regards,
D. Solomon Jr.
*********.com

Yeah, 128-bit x 4, while the GeForce FX is 64-bit x 4. Not too sure about the Radeon 9800; I think it's 32-bit x 8?
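
A quick sanity check of those partition figures, as a minimal Python sketch; the partition counts and widths are just the numbers quoted in this post, not confirmed specs:

def total_bus_width(partitions, bits_per_partition):
    # total external memory bus width = partitions x width of each
    return partitions * bits_per_partition

print(total_bus_width(4, 128))  # Parhelia as quoted: 512-bit
print(total_bus_width(4, 64))   # GeForce FX as quoted: 256-bit
print(total_bus_width(8, 32))   # Radeon 9800 as quoted: 256-bit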

deejaya
05-23-03, 08:49 AM
Originally posted by StealthHawk
As someone else asked... why? What advantages does a 512-bit core have over a 256-bit core?

Having a 512-bit core compared to a 256-bit core should let the core do more operations, moving more bits per cycle, independently of the CPU. But I'm no video card engineer; maybe some things are different with the current generation of cards.

EDIT: Oops.

Nebuchadnezzar
05-23-03, 12:15 PM
Just the 'core' side of the memory controller: a 256-bit DDR bus = a 512-bit internal core bus. R300, R350, Parhelia, NV35, etc. are 512-bit chips, although only Matrox used to mention this.
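
A minimal Python sketch of the arithmetic behind that claim, assuming standard DDR signalling (one transfer per clock edge); the memory clock used below is hypothetical:

external_bus_bits = 256            # external DDR bus width
transfers_per_clock = 2            # DDR transfers on both clock edges
internal_path_bits = external_bus_bits * transfers_per_clock
print(internal_path_bits)          # 512, the "internal core bus" figure

mem_clock_hz = 400e6               # hypothetical 400 MHz memory clock
bandwidth_gb_s = (external_bus_bits / 8) * transfers_per_clock * mem_clock_hz / 1e9
print(round(bandwidth_gb_s, 1))    # 25.6 GB/s with these assumed numbers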

Hanners
05-23-03, 12:19 PM
Originally posted by Nebuchadnezzar
Just the 'core' side of the memory controller: a 256-bit DDR bus = a 512-bit internal core bus. R300, R350, Parhelia, NV35, etc. are 512-bit chips, although only Matrox used to mention this.

I thought so too, but this page (http://www.nvidia.com/view.asp?PAGE=fx_5900) lists the 5900 as having a 256-bit core. :confused:

deejaya
05-23-03, 12:32 PM
Matrox Parhelia = 512-bit core
FX = 256-bit core
9700/9800 = 256-bit core

JGro
05-23-03, 03:58 PM
A 256-bit core means that internally it works on data in 256-bit chunks (i.e. the registers are 256 bits wide, among other things). It has nothing to do with the memory controller.

There is only an advantage to moving to a 512-bit core if you're using 512-bit data, and as far as I know, there is currently no use for 512-bit data in GPUs.

Look at the x86-64 CPUs. The reason they perform better is the integrated memory controller, the extra registers, etc. The 64-bitness only helps in applications where 64-bit data is actually being used, like some scientific applications.
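
To make that concrete, here is a toy Python sketch (a deliberate simplification, not how a real GPU pipeline works): sweeping a buffer in register-wide chunks only gets faster with wider registers if each chunk can actually be filled with useful data:

def ops_to_sweep(total_bits, register_bits):
    # number of register-wide operations needed to touch all the data once
    return (total_bits + register_bits - 1) // register_bits

frame_bits = 1024 * 768 * 32          # hypothetical frame of 32-bit pixels
print(ops_to_sweep(frame_bits, 256))  # 98304 ops with a 256-bit core
print(ops_to_sweep(frame_bits, 512))  # 49152 ops, but only if 16 pixels
                                      # really can be batched per operation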

MrNasty
05-23-03, 04:07 PM
I think he means that consoles are 128-bit at the moment, and he's saying cards should be 4x what the consoles offer, i.e. 512-bit.

:\

Nebuchadnezzar
05-23-03, 04:07 PM
I could swear I saw lots of spec sheets at the R300 launch mentioning it as a 512-bit chip. Dunno. :confused:

StealthHawk
05-23-03, 07:30 PM
Originally posted by Dazz
Yeah, 128-bit x 4, while the GeForce FX is 64-bit x 4. Not too sure about the Radeon 9800; I think it's 32-bit x 8?

Are you talking about the memory controller?

StealthHawk
05-23-03, 07:32 PM
Originally posted by Geforce4ti4200
Twice the pipelines or bandwidth or something? The GeForce 256 went to a 256-bit core...

No, the core width has nothing to do with bandwidth... the memory bus does, at least as long as external memory is being used. Things might change with on-chip memory, but I'm not sure; I'm not a hardware engineer.

The bus width of the core has nothing to do with the number of pipelines either. Die size does (i.e. how much you can fit on the chip).
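
To close with a minimal Python sketch of that separation: fill rate scales with pipelines and core clock, while memory bandwidth scales with bus width and memory clock. The numbers below are the original GeForce 256 SDR's advertised launch specs, quoted from memory, so treat them as illustrative:

pipelines = 4                    # pixel pipelines
core_clock_hz = 120e6            # 120 MHz core clock
fill_rate_mpix = pipelines * core_clock_hz / 1e6
print(fill_rate_mpix)            # 480.0 Mpixels/s

bus_bits = 128                   # 128-bit memory bus
mem_clock_hz = 166e6             # 166 MHz SDR memory
bandwidth_gb_s = (bus_bits / 8) * mem_clock_hz / 1e9
print(round(bandwidth_gb_s, 2))  # ~2.66 GB/s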