PS3 Pics/Specs Leaked?



jAkUp
05-18-05, 03:24 AM
So does this mean they dropped 32 bit precision?

fivefeet8
05-18-05, 03:26 AM
So does this mean they dropped 32 bit precision?

No. FP32 color precision and FP16 blending/filtering precision are different things. FP32 color precision is also called 128-bit floating-point precision by Nvidia marketing. ;)

FP32 color = 128-bit FP precision
FP16 color = 64-bit FP precision
FP16 blending/filtering = 64-bit FP blending/filtering
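
To make the marketing terms concrete, here's a quick sketch (plain Python; the numbers come from this thread, not from any NVIDIA spec sheet) of how per-component precision turns into the "128-bit" figure:

```python
# Per-pixel precision behind the marketing names: RGBA = 4 components.
def marketing_bits(bits_per_component: int, components: int = 4) -> int:
    """Total bits per pixel for an RGBA color value."""
    return bits_per_component * components

print(marketing_bits(32))  # FP32 color -> 128 ("128-bit floating point")
print(marketing_bits(16))  # FP16 color -> 64  ("64-bit floating point")
```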

retsam
05-18-05, 04:24 AM
hey guys...
Attached: 7800gtxpower.jpg, 7800gtxspec.jpg, coverimage.jpg

You know I love you, jAkUp.



Why don't you start your own thread about your experiences at E3?

retsam
05-18-05, 04:41 AM
Here are two different approaches: the Xbox 360's R500 chip relies on 1080i and the PS3's RSX chip relies on 1080p, which is why the R500 needs eDRAM to handle 1080i while the RSX doesn't really need it for 1080p. The disadvantage of 1080i versus 1080p is that 1080i uses twice the signal bandwidth and delivers only a half frame every 1/60th of a second, i.e. full frames at 30fps. That's where the eDRAM comes in, to supply the missing half and give the full 60fps that the Xbox 360 needs for gaming. Within a year 1080p will become the new standard format and replace 1080i; its advantages are that it saves half the signal bandwidth compared to 1080i, which can instead be used to carry more image data, and it delivers a full frame every 1/60th of a second, i.e. full frames at 60fps.


Dude, where are you getting your information from? If this were true, why didn't the broadcasters broadcast in 1080p and skip 1080i, instead of using 1080i first? This makes no sense at all. The reason the broadcasters and the cable guys aren't using 1080p is because it's so bandwidth intensive, twice that of 1080i. I've heard somewhere it's around 30 Mbps even with compression...

evilchris
05-18-05, 04:58 AM
Gates has to be losing sleep over dropping nVidia and going with ATI. What a mistake...


Yeah it must suck being a billionaire like Bill.

evilchris
05-18-05, 05:06 AM
Dude, where are you getting your information from? If this were true, why didn't the broadcasters broadcast in 1080p and skip 1080i, instead of using 1080i first? This makes no sense at all. The reason the broadcasters and the cable guys aren't using 1080p is because it's so bandwidth intensive, twice that of 1080i. I've heard somewhere it's around 30 Mbps even with compression...

All of a sudden there are all these "1080p" experts ever since the PS3 press release. 1080p is now the "sUPER l33t HDTEEVEE moDE, eVERYTHING eLSE SUXORs!"

Fact is, it's a buzzword and has no point. It isn't even part of the broadcast standard and precisely zero broadcasters will use it, ever. HDTV = 720p or 1080i, everything else is just whizbang extra mode support that won't ever have a real use. You won't see anything other than 720p or 1080i until the next TV revolution, by which time 1080 will be crap. I wish I had a damn PS3 now so I could prove this to people by running it at 1080i versus 1080p and having them guess which is which.

Retsam, you are right, btw: full HDTV 1080i is 19 Mbps, and 1080p is roughly twice that, which is why it won't be used. Half the broadcasters are using 720p, the other half 1080i.
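
The factor of two falls straight out of the pixel math. Here's a rough sketch (raw, uncompressed pixel rates only; broadcast figures like 19 Mbps are what's left after MPEG-2 compression):

```python
# Raw (uncompressed) pixel rates; broadcast bitrates scale roughly with these.
def pixel_rate(width: int, height: int, rate: float, interlaced: bool) -> float:
    """Pixels per second; an interlaced signal carries half a frame per field."""
    return width * height * rate * (0.5 if interlaced else 1.0)

for name, args in {
    "720p60":  (1280, 720, 60, False),
    "1080i60": (1920, 1080, 60, True),   # 60 fields/s = 30 full frames/s
    "1080p60": (1920, 1080, 60, False),
}.items():
    print(f"{name}: {pixel_rate(*args) / 1e6:.1f} Mpixels/s")
```

720p60 and 1080i60 land within about 12% of each other, while 1080p60 needs exactly twice the pixel rate of 1080i60.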

AthlonXP1800
05-18-05, 06:07 AM
Dude, where are you getting your information from? If this were true, why didn't the broadcasters broadcast in 1080p and skip 1080i, instead of using 1080i first? This makes no sense at all. The reason the broadcasters and the cable guys aren't using 1080p is because it's so bandwidth intensive, twice that of 1080i. I've heard somewhere it's around 30 Mbps even with compression...

Well, the reason broadcasters didn't skip 1080i is that 1080p doesn't become an industry standard until next year, in 2006. Broadcasters have found it increasingly difficult to allocate bandwidth for extra TV channels with 1080i because it uses twice the bandwidth. All broadcasters favour 1080p because it uses half the bandwidth of 1080i, so it frees up bandwidth, and they will be happy to get rid of 1080i and fit more TV channels into 1080p than they could with 1080i.

retsam
05-18-05, 07:09 AM
Well, the reason broadcasters didn't skip 1080i is that 1080p doesn't become an industry standard until next year, in 2006. Broadcasters have found it increasingly difficult to allocate bandwidth for extra TV channels with 1080i because it uses twice the bandwidth. All broadcasters favour 1080p because it uses half the bandwidth of 1080i, so it frees up bandwidth, and they will be happy to get rid of 1080i and fit more TV channels into 1080p than they could with 1080i.

First, 1080p... is the 1080p24 standard... do you know what the 24 stands for?
Second, 1080i... is the 1080i30 standard... do you know what the 30 stands for?

ATI_Dude
05-18-05, 07:09 AM
Here are two different approaches: the Xbox 360's R500 chip relies on 1080i and the PS3's RSX chip relies on 1080p, which is why the R500 needs eDRAM to handle 1080i while the RSX doesn't really need it for 1080p. The disadvantage of 1080i versus 1080p is that 1080i uses twice the signal bandwidth and delivers only a half frame every 1/60th of a second, i.e. full frames at 30fps. That's where the eDRAM comes in, to supply the missing half and give the full 60fps that the Xbox 360 needs for gaming. Within a year 1080p will become the new standard format and replace 1080i; its advantages are that it saves half the signal bandwidth compared to 1080i, which can instead be used to carry more image data, and it delivers a full frame every 1/60th of a second, i.e. full frames at 60fps.

No, the RSX is not in trouble. After watching the Xbox 360 conference, the R500 turned out to be the biggest disappointment, after all the hype about it being rumoured to be the most powerful and fastest chip and Bill Gates believing he'd got things right with the Xbox 360 and would beat the PS3 this time. I watched all the game trailers and thought, "Is this a joke? Where are the breakthrough graphics?" None of it excited or blew me away; most of the trailers looked like they could easily be done on last- and current-generation PC video cards or the PS2, and some of the trailers with very simple graphics could easily be done on a Dreamcast. I think ATI and Microsoft heavily underestimated Nvidia and the RSX chip. Three years ago, when the NV30 came out and failed to catch up with the R300, they thought Nvidia would never catch up and would fall like 3dfx. Now they've got it all wrong: the RSX turned out to be the most advanced GPU, and the R500 already looks obsolete.

I'm not sure I follow you. I was merely speculating on peak performance in games with fully lit, fully skinned polygons and full-scene anti-aliasing. ATI's architecture is laid out better, since it has the extra data cache (eDRAM) and a unified shader architecture for greater flexibility. I'm not the only one who's skeptical: Anandtech made some interesting comments on the E3 PS3 demos. First of all, they noticed aliasing. Second, some demos only ran in 720p. My point is that FSAA takes a comparatively higher performance hit on the PS3, and, as I see it, there's no way the PS3 can run games in 1080p (1920x1080) with 4x FSAA and heavy use of shaders while maintaining a solid 60 fps. The Xbox 360, on the other hand, is built to run games at 720p with FSAA and maintain a consistent frame rate. But again, I was merely thinking out loud, so take it for what it's worth :)
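
A back-of-the-envelope framebuffer calculation shows roughly what 10 MB of eDRAM buys (a sketch assuming 4 bytes of color plus 4 bytes of Z/stencil per sample; the consoles' actual render-target layouts may differ):

```python
# Framebuffer footprint: (color + depth) bytes per sample, times MSAA samples.
def framebuffer_mb(width: int, height: int, msaa: int = 1,
                   bytes_per_sample: int = 8) -> float:
    """Size in MB, assuming 4 bytes color + 4 bytes Z/stencil per sample."""
    return width * height * msaa * bytes_per_sample / 2**20

EDRAM_MB = 10
for name, (w, h, aa) in {
    "720p, no AA":  (1280, 720, 1),
    "720p, 4x AA":  (1280, 720, 4),
    "1080p, no AA": (1920, 1080, 1),
}.items():
    size = framebuffer_mb(w, h, aa)
    verdict = "fits in" if size <= EDRAM_MB else "exceeds"
    print(f"{name}: {size:.1f} MB ({verdict} {EDRAM_MB} MB eDRAM)")
```

A plain 720p framebuffer (~7 MB) fits, while 720p with 4x AA or anything at 1080p blows past 10 MB, which is consistent with the 360 being aimed at 720p and needing tricks like tiled rendering for AA.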

retsam
05-18-05, 07:19 AM
Here is a good write-up on how HD works... and I have to correct myself: all the HD standards are really just guidelines, not hard standards.

So actually there is a 1556p and a 3132p...

http://www.spectsoft.com/support/whitepapers/videorates/

Knot3D
05-18-05, 08:41 AM
What is this? Is the 1080i vs 1080p discussion the new 'SM2.0 vs SM3.0' thingy? lol

AthlonXP1800
05-18-05, 08:41 AM
First, 1080p... is the 1080p24 standard... do you know what the 24 stands for?
Second, 1080i... is the 1080i30 standard... do you know what the 30 stands for?

Both the 24 and the 30 stand for fps (frames per second). I've read that these standards are used in digital film for HD editing and post-production, which I don't know much about because I don't have a digital HD video camera. The only people using these two digital film formats are movie and TV studios, but none of the UK studios will use them yet, probably until 2008 or later, because so far no UK channel supports HD; next year, in 2006, Sky will start showing some imported US shows in HD for the first time.
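
The naming convention is regular enough to unpack mechanically. A small sketch (my own helper, not any official notation) that splits a format string into lines, scan type, and frame/field rates:

```python
import re

def parse_hd_format(fmt: str) -> dict:
    """Split e.g. '1080i30' into active lines, scan type, and rates."""
    lines, scan, rate = re.fullmatch(r"(\d+)([ip])(\d+)", fmt).groups()
    lines, rate = int(lines), int(rate)
    fields = rate * 2 if scan == "i" else rate  # interlaced: 2 fields per frame
    return {"lines": lines, "scan": scan,
            "frames_per_sec": rate, "fields_per_sec": fields}

print(parse_hd_format("1080p24"))  # film-rate progressive: 24 full frames/s
print(parse_hd_format("1080i30"))  # 30 frames/s, delivered as 60 fields/s
```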

Slybri
05-18-05, 11:33 AM
I was watching the tech parts of the Sony presentation again, and I noticed that the Cell chip has 8 SPEs but only uses 7 of them. The 8th is apparently reserved for "redundancy."

Do you think later on down the line, they will unlock the 8th SPE and UP the power? You know how the graphics on consoles get better near the end of their cycles as programmers learn more tricks with the system. Maybe PS3 has a little more power in reserve for when it's older.
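
The "redundancy" angle is about manufacturing yield: with eight SPEs on the die but only seven required to work, a chip with one dead SPE is still sellable. A quick sketch of the math (the per-SPE defect probability here is made up purely for illustration):

```python
from math import comb

def yield_with_redundancy(n: int, need: int, p_good: float) -> float:
    """Probability that at least `need` of `n` independent units work."""
    return sum(comb(n, k) * p_good**k * (1 - p_good)**(n - k)
               for k in range(need, n + 1))

p = 0.90  # hypothetical chance that any single SPE comes out defect-free
print(f"all 8 SPEs good:      {yield_with_redundancy(8, 8, p):.1%}")  # ~43%
print(f"at least 7 of 8 good: {yield_with_redundancy(8, 7, p):.1%}")  # ~81%
```

Under those made-up numbers, tolerating one dead SPE nearly doubles the usable chips per wafer, which also suggests Sony is unlikely to ever "unlock" the 8th SPE: on many shipped chips it genuinely doesn't work.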

Another thought: While the PS3 is capable of running graphics like KILLZONE 2, it would still cost 100 million or more to create a game like that, not to mention the years of development. I think games of that quality will be rare.

Anyway, I'm glad the PS3 is looking so much better than the 360. Seems like M$'s box is nothing more than a way to sell stuff to gamers online. I always liked the PlayStation better anyway. Still, the 360 has a 6-month lead on the PS3, which could be crucial.

The Nintendo Revolution I just don't get... yet.

newlinuxguy
05-18-05, 11:47 AM
http://www.theinquirer.net/?article=23325

Nvidia is still keeping quiet about the final spec of the chip but we know it's a 90 nanometre chip and that it's likely that it will have at least 10MB of cache memory. ATI's Xbox 360 chip has exactly the same 10MB as you need to have enough buffer to render HDTV picture quality games. That's what the new consoles are all about.

WHAT THE $#$%

MUYA
05-18-05, 11:52 AM
That is highly doubtful, since no one else has mentioned it, and you cannot just slap eDRAM on the die... well, unless Sony gave NV the know-how. Ack, who knows.

ATI_Dude
05-18-05, 11:54 AM
That is highly doubtful, since no one else has mentioned it, and you cannot just slap eDRAM on the die... well, unless Sony gave NV the know-how. Ack, who knows.

Ditto. They'll have to redesign the chip almost from scratch to add eDRAM.

Vagrant Zero
05-18-05, 12:04 PM
Ditto. They'll have to redesign the chip almost from scratch to add eDRAM.

The PC version won't have the RAM. That's for the PS3 version, to run games at 1080p.

gmontem
05-18-05, 12:11 PM
Well, the reason broadcasters didn't skip 1080i is that 1080p doesn't become an industry standard until next year, in 2006. Broadcasters have found it increasingly difficult to allocate bandwidth for extra TV channels with 1080i because it uses twice the bandwidth. All broadcasters favour 1080p because it uses half the bandwidth of 1080i, so it frees up bandwidth, and they will be happy to get rid of 1080i and fit more TV channels into 1080p than they could with 1080i.

1080i uses less video bandwidth than 1080p. Interestingly enough, 1080i uses about the same video bandwidth as 720p. However, 1080i is less stressful on CRT tubes because the required horizontal scanning frequency is lower than for 720p (~35 kHz vs ~46 kHz), which is yet another reason some broadcasters favor 1080i over 720p. You can verify this on a CRT display that can show its scanning frequency.

Where in the world did you get the idea that interlaced signals use more bandwidth than progressive signals?
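
Those scan-rate numbers check out against the standard line counts. A sketch (using the nominal totals of 1125 scan lines per frame for the 1080-line formats and 750 for 720p, blanking included):

```python
# Horizontal scan frequency = total scan lines per frame * frames per second.
# (Interlacing halves the lines per field but doubles the field rate, so the
#  line frequency is the same either way for a given frame rate.)
FORMATS = {
    "1080i30": (1125, 30),  # ~33.8 kHz: easy on a CRT's deflection circuitry
    "720p60":  (750, 60),   # 45 kHz
    "1080p60": (1125, 60),  # 67.5 kHz: beyond most consumer CRTs of the era
}
for name, (total_lines, fps) in FORMATS.items():
    print(f"{name}: {total_lines * fps / 1000:.1f} kHz line rate")
```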

H3avyM3tal
05-18-05, 12:27 PM
I asked this on another thread, but I got no answer, so I'll ask this here:

I think NVIDIA said at the Sony conference that the RSX is capable of 128-bit floating-point precision, no? If so, isn't that about double the 7800???

:confused:

ynnek
05-18-05, 01:12 PM
There's a whole bunch of videos floating around...

The Killzone video I saw was supposedly coded/scripted with the actual game code and Cell programming on the dev box, then run through and rendered on that dev box to give an approximation of what the PS3 SHOULD be able to do. The dev systems aren't as powerful as the final PS3 hardware, so it takes time to render, frame by frame, some of the high-res wowgee videos we are seeing. That's what I meant by not realtime.

evilchris
05-18-05, 01:12 PM
Well, the reason broadcasters didn't skip 1080i is that 1080p doesn't become an industry standard until next year, in 2006. Broadcasters have found it increasingly difficult to allocate bandwidth for extra TV channels with 1080i because it uses twice the bandwidth. All broadcasters favour 1080p because it uses half the bandwidth of 1080i, so it frees up bandwidth, and they will be happy to get rid of 1080i and fit more TV channels into 1080p than they could with 1080i.

Dude, 1080p uses MORE BANDWIDTH than 1080i. NO BROADCASTERS want to use 1080p. You are totally wrong man, please stop posting this BS.

ATI_Dude
05-18-05, 01:25 PM
The PC version won't have the RAM. That's for the PS3 version, to run games at 1080p.

It hasn't been confirmed. The specs leaked at E3 don't mention eDRAM.

DaveW
05-18-05, 01:33 PM
Dude, 1080p uses MORE BANDWIDTH than 1080i. NO BROADCASTERS want to use 1080p. You are totally wrong man, please stop posting this BS.

Yep.

"Children, interlaced video is bad ...mkay?"

God knows why any of the HDTV standards are interlaced; it's like an old habit that the TV industry had a hard time letting go of.

MUYA
05-18-05, 01:39 PM
I asked this on another thread, but I got no answer, so I'll ask this here:

I think NVIDIA said at the Sony conference that the RSX is capable of 128-bit floating-point precision, no? If so, isn't that about double the 7800???

:confused:
RSX is capable of FP32: 32 bits for each of the R, G, and B components plus the alpha component, so 4 x 32 = 128 bits, like the FP32 used from NV3x upwards.

H3avyM3tal
05-18-05, 01:41 PM
Oh, and what's the difference between this and the 7800?
I am clueless...