nV News Forums > Software Forums > Gaming Central > Console World

Old 06-13-05, 02:07 PM   #1
|MaguS|
Guest
 
Posts: n/a
Default Another Kutaragi Interview, Decent Info.

All bias aside, I think this was one of the better interviews about the PS3 and the next gen in general. Kutaragi didn't really slam the 360 the way MS/ATI have been doing; he just spoke honestly about technology. If you put your biases aside, a lot of what he says makes a lot of sense.

He does keep stating that the PS3 is a computer now and not a console... but that's their perception.

Here are points I found interesting.

Quote:
IPW: We were predicting that eDRAM was going to be used for the graphics memory, but after hearing that the PS3 will support the use of two HDTVs, we understood why it wasn't being used.

KK: Fundamentally, the GPU can run without graphics memory, since it can use Redwood (the high-speed interface between Cell and the RSX GPU) and YDRAM (the code name for XDR DRAM). YDRAM is unified memory. However, there's still the question of whether the [bandwidth and cycle time] should be wasted by accessing memory that's located far away when processing the graphics or using the shader. And there's also no reason to use up the Cell's memory bandwidth for normal graphics processes. The shader does a lot of calculations of its own, so it will require its own memory. A lot of VRAM will especially be required to control two HDTV screens in full resolution (1,920 x 1,080 pixels). For that, eDRAM is no good. eDRAM was good for the PS2, but for two HDTV screens, it's not enough. If we tried to fit a large enough volume of eDRAM [to support two HDTV screens] onto a 200mm x 300mm chip, there wouldn't be enough room for the logic, and we'd have had to cut down on the number of shaders. It's better to use the logic in full, and to add on a lot of shaders.
A good explanation of why they didn't go with eDRAM, and I believe it's valid. Adding that much eDRAM as general-purpose VRAM would be costly. Why do you think CPUs and other devices don't carry much cache? It's very fast, but expensive...
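Kutaragi's capacity point is easy to sanity-check with back-of-the-envelope arithmetic (a rough sketch; the 32-bit color, 32-bit depth and double-buffering assumptions are mine, not from the interview):

```python
# Rough framebuffer size for dual 1080p output, assuming 4 bytes per
# pixel and three full-resolution buffers (front + back + depth).
def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers

one_screen = framebuffer_bytes(1920, 1080)
two_screens = 2 * one_screen

print(round(one_screen / 2**20, 1))   # 23.7 MiB
print(round(two_screens / 2**20, 1))  # 47.5 MiB
```

Nearly 50 MiB for two screens, versus the 4 MB of eDRAM the PS2 managed on-die - which is roughly his argument that the logic would be crowded out.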

Quote:
IPW: Microsoft decided to use a unified-shader GPU by ATI for its Xbox 360. Isn't unified-shader more cutting-edge when it comes to programming?

KK: The vertex shader and pixel shader are unified in ATI's architecture, and it looks good at a glance, but I think it will have some difficulties. For example, some question where the results from the vertex processing will be placed, and how they will be sent to the shader for pixel processing. If one point gets clogged, everything is going to get stalled. Reality is different from what's painted on canvas. If we take a realistic look at efficiency, I think Nvidia's approach is superior.
I believe this as well. While it's a technically optimal system, it will cause a lot of problems in the beginning as developers learn it. It will be a lot like the PS2's EE (waits for the laughter to die), which caused so many headaches because it was new tech that nobody had any experience with. Developers will have to learn how to balance the available pipes to achieve optimal performance, and that will take time. The RSX is a lot like current-gen GPUs, so there's not much they don't already know how to handle.
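The trade-off KK describes can be sketched with a toy throughput model (all unit counts and workload numbers below are made up for illustration, not real RSX or R500 specs):

```python
# Toy model: fixed vertex/pixel pipes vs an ideal unified shader pool.
def frame_time_fixed(vertex_work, pixel_work, v_units, p_units):
    # Fixed pipes: stages overlap, so the slower stage dominates.
    return max(vertex_work / v_units, pixel_work / p_units)

def frame_time_unified(vertex_work, pixel_work, total_units):
    # Ideal unified pool: every unit shares the combined workload.
    return (vertex_work + pixel_work) / total_units

# A pixel-heavy frame with made-up numbers:
print(round(frame_time_fixed(80, 400, 8, 24), 2))  # 16.67
print(round(frame_time_unified(80, 400, 32), 2))   # 15.0
```

The fixed split stalls on whichever stage is the bottleneck, while an ideal pool absorbs the imbalance - KK's argument is essentially that the "ideal" scheduling part is the hard bit.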

I wonder how they will accomplish hardware backwards compatibility. What part of the PS3 is similar to the PS2? I hope they don't plan to include the EE in the PS3. That would be, um... a poor decision.

Sadly, one thing this interview doesn't cover is the actual specs of the RSX. Its design should be complete by now, so I would think they'd want to put out numbers and info, but the lack of any is a downer.

http://www.gamespot.com/news/2005/06...s_6127392.html
Old 06-14-05, 10:11 AM   #2
ATI_Dude
Registered User
 
 
Join Date: Aug 2003
Location: Copenhagen, Denmark
Posts: 396
Default Re: Another Kutaragi Interview, Decent Info.

I don't buy the argument about stalling the graphics pipeline. ATI are not amateurs, and I believe they thought of this when they designed the R500 chip. The reality is that we know very little about how the two consoles work in practice. We only have vague information about clock speeds and general architecture, but no actual console to work with. M$ and ATI chose to opt for a unified shader architecture, while Sony and nVidia went the trusty PS/VS way. We'll just have to wait and see who made the best choices.
__________________
Regards,
ATI_Dude

Desktop: | Intel Core i7 2600K@ 3.4 GHz | Asus P8P67 Pro | 8 GB DDR3 1600 Corsair (2x4 GB) | Asus GeForce GTX 580 | Creative X-Fi Fatal1ty FPS soundcard | Seagate Momentus XT 500 GB Hybrid SATA HDD | Samsung SyncMaster T220HD 22'' LCD | 650 watt Corsair HX650 PSU |

Laptop1: | MacBook Pro Uni-body | Core2 Duo 2.53 GHz (FSB 1066 MHz) | GeForce 9400M 256 MB DDR3 | 4 GB DDR3 1066 RAM | 500 GB Hitachi 5400 RPM HDD | 13'' LED 1280x800 | Dual Boot Mac OS X Snow Leopard & Windows 7 Home x64 |

Laptop2: | Dell Inspiron XPS Gen 2 | Pentium M Centrino (Dothan) 2.13 GHz (FSB 533 MHz)| GeForce Go 6800 Ultra 256 MB DDR3 450@1063 MHz (12 PS, 5 VS)| 1 GB DDR2 533 RAM | 100 GB Fujitsu 5400 RPM HDD | 17'' WUXGA LCD 1920x1200 | Creative Soundblaster Audigy 2 ZS Notebook |
Old 06-14-05, 02:00 PM   #3
Vagrant Zero
I'm a Back Door Man
 
Join Date: Jun 2004
Location: Los Angeles
Posts: 1,750
Default Re: Another Kutaragi Interview, Decent Info.

They've said that the PS3 will be hardware backwards-compatible with the PS2 and software backwards-compatible [emulation] with the PS1.
__________________
o <---- Dev
|\_o <---- Paladin
// \
Old 06-14-05, 03:15 PM   #4
Ninjaman09
Takin 'er easy
 
 
Join Date: Jul 2004
Location: Jowjah
Posts: 6,687
Default Re: Another Kutaragi Interview, Decent Info.

Magus, I have to ask - what the hell is going on in your avatar?
__________________
Core i7 920 @ 3.2 | ASUS P6T Deluxe V2
6GB Mushkin DDR3-1600 RAM
eVGA GTX 570 SC | Auzen X-Fi Prelude 7.1
CORSAIR CMPSU-850TX
Dell U2410
Old 06-14-05, 07:35 PM   #5
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default Re: Another Kutaragi Interview, Decent Info.

Quote:
Originally Posted by |MaguS|
All bias aside, I think this was one of the better interviews about the PS3 and the next gen in general. Kutaragi didn't really slam the 360 the way MS/ATI have been doing; he just spoke honestly about technology. If you put your biases aside, a lot of what he says makes a lot of sense.
Well, he has numerous times in the past. I can point to at least three articles where he refers to the Xbox 360 as "Xbox 1.5," and even one where he calls the PS3 a "PS3.5" in comparison.

Quote:
He does keep stating about how the PS3 is a Computer now and not a console... but thats thier perception.
Well, the PS2 was also called a computer, so I don't see this as entirely new. I think it's more a reaction to the press (Gamepro, for instance) covering the Xbox 360 as a do-it-all unit for the average Joe while the PS3 was just a game console.

Quote:
A good explanation of why they didn't go with eDRAM, and I believe it's valid. Adding that much eDRAM as general-purpose VRAM would be costly. Why do you think CPUs and other devices don't carry much cache? It's very fast, but expensive...
I think the true reason stems from two things: first, nVidia has had no experience with eDRAM, and creating a GPU that works hand-in-hand with it would probably have pushed back the PS3's launch quite a bit. Second, the 90nm G70 GPU that the PS3's RSX will be based on is not designed to work with eDRAM, so it would likely have been much more costly and time-consuming to have NV totally redesign the chip.

In essence, the GPU maker they picked simply has no experience with this technology and no parts that use it, while ATI does. This likely happened because originally the GPU in the PS3 was going to be the Graphics Synthesizer 3 by Sony/Toshiba, and when that never worked out, they picked nVidia late in the game instead.

While what KK said is true, I believe the PS3's memory setup is primarily there because it won't cost too much to implement, and it gives a nice PR reason why eDRAM wasn't used. In the end, though, for HD gaming it's likely the eDRAM-loaded machine will have the advantage, due to the free 4xMSAA at 720p.
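Some rough arithmetic puts that in perspective (assuming 4 bytes of color and 4 of depth/stencil per sample; the 10 MB eDRAM figure comes from ATI's announced specs, and the "free" AA reportedly involves rendering in tiles):

```python
# Size of a 720p render target with 4x multisampling, assuming
# 8 bytes (32-bit color + 32-bit depth/stencil) per sample.
EDRAM_BYTES = 10 * 2**20   # announced Xenos eDRAM capacity

def msaa_target_bytes(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample

target = msaa_target_bytes(1280, 720, 4)
print(target / 2**20)        # 28.125 MiB
print(target > EDRAM_BYTES)  # True: the full target won't fit in one pass
```

So even 10 MB doesn't hold a whole 4xMSAA 720p frame at once, which is why the tiling approach matters to how "free" the MSAA really is.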

Quote:
I believe this as well. While it's a technically optimal system, it will cause a lot of problems in the beginning as developers learn it. It will be a lot like the PS2's EE (waits for the laughter to die), which caused so many headaches because it was new tech that nobody had any experience with. Developers will have to learn how to balance the available pipes to achieve optimal performance, and that will take time. The RSX is a lot like current-gen GPUs, so there's not much they don't already know how to handle.
I disagree here. Microsoft has the best development tools and software in the industry. Even if unified shaders are something new to learn, the quality and ease of use of Microsoft's tools will likely still make programming for the Xbox 360 easier and more efficient than programming for the PS3. While Sony and Nvidia make decent software, they are simply not in the same league as Microsoft. Not to mention, effectively utilizing Cell is likely going to be much more difficult than working with unified shaders. Plus, the XNA development tools for the X360 can also be used to port every game made with them to the PC (and vice versa) easily, so developers who become familiar with them can maximize revenue by releasing on both X360 and PC without more costly and time-consuming porting methods. Being familiar with XNA will therefore be high on most gaming studios' priority lists.
__________________
We're all in it together.

Intel Core 2 Quad Q6700 2.66GHz CPU | Intel G965WH mobo | 8GB (4x2GB) DDR2-667mhz CAS5 RAM (1066MHz FSB) | BFG GeForce 285 GTX OC 1GB | Dell E228WFP 22" DVI-HDCP LCD Monitor | 1TB Western Digital RE3 SATA2 Main Drive | 500GBx2 Western Digital RE3 SATA2 Scratch Drives in RAID0 | Western Digital RE3 1TB SATA2 Media Drive | External 2TB Western Digital MyBook Backup Drive | Adaptec eSATA 3.0gbps PCI-E interface | Sandisk External 12-in-1 Flash Card Reader | LG GGC-H20L HD DVD/BD reader, DVD writer | LG GGW-H20L HD DVD/BD reader, DVD/BD writer | Microsoft E4000 Ergonomic Keyboard | Logitech Trackman Wheel | Antec P182 ATX Case | Thermaltake ToughPower XT 850w modular PSU | KRK RP-8 Rokit Studio Monitors | Windows Vista Ultimate x64
Old 06-15-05, 12:38 AM   #6
Subtestube
Anisymbolic
 
 
Join Date: Aug 2004
Location: Wellington, New Zealand
Posts: 1,365
Default Re: Another Kutaragi Interview, Decent Info.

Before I say this, I should say that I really like the idea of the Cell - I've been waiting for true general-purpose parallel vector co-processors for CPUs for a while, so my tendency is to think it's a good idea. That said, I think Ruined is right - it probably won't be that easy to program for at first. There's no way it'll be transparent to the developers (unfortunately - but then, nor is true multi-threaded programming, which can be damn hard, so don't think the XBOX 360 is totally innocent in this regard). More importantly, and to actually address the point at hand - I doubt the unified shader architecture will be hard to program for. It's supposed to be entirely transparent to developers, with only Tier 1 devs having access to its low-level workings. In other words, I don't think the R500 being hard to program for will be anywhere near as valid a complaint as the difficulties that will invariably arise from writing games with genuinely balanced parallelism (on both Cell and XBOX 360) as a base feature.

Edit: Ruined also makes a valid point with regard to dev tools - Microsoft's tools are second to none. That said, nVIDIA's are very, very good - much better than ATi's, in my humble experience. Sure, not as good as Microsoft's, but then, MS's aren't targeted at graphics dev, where nVIDIA's are. I know stuff-all about Sony's dev tools, and I doubt they'll compare to MS's, but who knows - maybe they'll be just as good. They do have nV on board for this, and the NV developer network is excellent - so Sony might get some useful hints about how developers like their software.
__________________
Dr Possible: Core 2 Duo E6400 on Gigabyte GA-965P-DS4. Galaxy GeForce 7600GT. 2GB Corsair XMS 2 DDR2-6400 RAM (CL5). ATi Theatre 550 Pro. Windows XP MCE. All stored in Piano black Antec Sonata II, with a broken door.

Mobile: ASUS M2400N, Pentium M 1.5 GHz. 512 MB DDR RAM. Intel EXTREME graphics. Windows XP SP 2 / Ubuntu 5.10.

Ridiculous DOES not have an 'e' in it. It comes from "ridicule" and has less than nothing to do with the colour red.
Old 06-15-05, 01:30 AM   #7
steamedrice
Registered User
 
Join Date: Oct 2004
Posts: 24
Default Re: Another Kutaragi Interview, Decent Info.

Well, with all this talk about eDRAM, what did the 4 MB of eDRAM do for the PS2? Helped it keep up with the Xbox/GameCube despite its lackluster GPU (the Graphics Synthesizer)?

I thought eDRAM was something new, but I guess not, after reading that my PS2 has 4 MB of it, lol.
Old 06-15-05, 02:09 AM   #8
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default Re: Another Kutaragi Interview, Decent Info.

Quote:
Originally Posted by Subtestube
Before I say this, I should say that I really like the idea of the Cell - I've been waiting for true general-purpose parallel vector co-processors for CPUs for a while, so my tendency is to think it's a good idea. That said, I think Ruined is right - it probably won't be that easy to program for at first. There's no way it'll be transparent to the developers (unfortunately - but then, nor is true multi-threaded programming, which can be damn hard, so don't think the XBOX 360 is totally innocent in this regard).
While I see that the X360 is similar in some ways, the primary difference is that it presents three fully functional CPU cores, instead of just one fully functional core with 7 SPEs. I just see the former as less formidable and more traditional for devs to program for, since it's plain old multithreading with 3 physical/6 logical CPUs. The Cell is more complex than that if you want to get its full potential out of it. It also seems that the X360's Xenon CPU will be superior to Cell at non-graphical calculations such as physics and AI, while Cell will be superior to Xenon at graphical calculations (source: Ars Technica). However, with the killer GPUs both consoles have, I think the advantage Cell has there becomes less significant, and the advantage the X360 has becomes more significant, since most of the graphics calculations will be done by the GPU.


Quote:
More importantly, and to actually address the point at hand - I doubt the unified shader architecture will be hard to program for - it's supposed to be entirely transparent to developers, with only Tier 1 devs having access to the low level workings of it. In other words - I don't think that the R500 being hard to program for will be anywhere near as valid a complaint as the difficulties that will invariably arise from writing games with genuine evenly balanced parallelism (on both Cell and XBOX 360) as a base feature.
I also heard unified shaders will be transparent to devs.

Old 06-15-05, 06:08 AM   #9
|MaguS|
Guest
 
Posts: n/a
Default Re: Another Kutaragi Interview, Decent Info.

Quote:
Originally Posted by Ruined
I think the true reason stems from two things: first, nVidia has had no experience with eDRAM, and creating a GPU that works hand-in-hand with it would probably have pushed back the PS3's launch quite a bit. Second, the 90nm G70 GPU that the PS3's RSX will be based on is not designed to work with eDRAM, so it would likely have been much more costly and time-consuming to have NV totally redesign the chip.

In essence, the GPU maker they picked simply has no experience with this technology and no parts that use it, while ATI does. This likely happened because originally the GPU in the PS3 was going to be the Graphics Synthesizer 3 by Sony/Toshiba, and when that never worked out, they picked nVidia late in the game instead.
Please show me where ATI has had previous experience with eDRAM? Oh wait, they also had none until just now. Please show me where ATI had experience with unified pipelines... oh wait... they have none. The G70 is actually based on the RSX; the RSX is supposedly ahead of the G70 in technology.

Everything negative you said about Nvidia is complete, utter BS. They are using technology that they know and have years of experience with; ATI are the ones who are experimenting with new technology and don't have any experience with it. No desktop GPU to date has used eDRAM and unified pipelines, so no one knows how well it will perform in the real world.

Quote:
I disagree here. Microsoft has the best development tools and software in the industry. Even if unified shaders are something new to learn, the quality and ease of use of Microsoft's tools will likely still make programming for the Xbox 360 easier and more efficient than programming for the PS3. While Sony and Nvidia make decent software, they are simply not in the same league as Microsoft. Not to mention, effectively utilizing Cell is likely going to be much more difficult than working with unified shaders. Plus, the XNA development tools for the X360 can also be used to port every game made with them to the PC (and vice versa) easily, so developers who become familiar with them can maximize revenue by releasing on both X360 and PC without more costly and time-consuming porting methods. Being familiar with XNA will therefore be high on most gaming studios' priority lists.
Ah, again so much speculation without any backup. If MS's tools are so easy and great, why weren't they available by E3? Sony had already shipped out a couple of development units for E3; hell, two of the most impressive REALTIME demos were done in two months or less for the presentation (UT2007 and FF7). Sweeney already stated that the PS3 is extremely easy to develop for; it's OpenGL-based, so all developers should know the language. Oh, and btw, the dev units were all coded with Cell processing the graphics, so no one has messed with the RSX yet - the Cell CPU was rendering everything. I guess it must be easy to program for and access, since it was all done on Cell in less than two months...

MS had dev units that didn't even house the same CPUs the 360 will, while every Sony dev unit used an actual Cell CPU. Unified pipelines are an unknown factor; no one knows how well they will perform or how easy or hard they are to program for, since the approach is 100% new. Developers will have to learn a lot about the hardware to take full advantage of it. They need to learn load balancing and the timing of data passing through the pipelines. Since the RSX is a lot like current GPUs, they have nothing much to learn except how far they can push it. They know how to access most of its features and functions without worrying about load balancing or overloading the pipes. Developers will have a much easier time with the RSX than with the R500.

Who cares about ports? I would rather have original software on my console than some port. This is why I love my PS2 - it has original games...
Old 06-15-05, 07:26 AM   #10
Subtestube
Anisymbolic
 
 
Join Date: Aug 2004
Location: Wellington, New Zealand
Posts: 1,365
Default Re: Another Kutaragi Interview, Decent Info.

I should stress that when I was complimenting MS's dev tools, I actually meant Visual Studio and XNA - in other words, the software rather than the hardware. Additionally (and again, I should say that at present I still have suspicions [no knowledge, obviously] that the PS3 will end up being the more powerful of the consoles), I personally don't trust Sweeney's testimony - it may be true that the Cell will be very easy to develop for, but one man's word is not enough. It probably is true that it's easy for Sweeney and his skilled crew to develop for the PS3. My suspicion is that experienced developers (like Epic) won't have significant problems with a dramatically new CPU architecture because, basically, they're very good at what they do. Newer kids might find the Cell a bit more of a learning curve.

For all that, I still think that the tri-core masterpiece in the XBOX 360 may not prove to be that much easier to deal with. Call it 'good old threaded programming' all you like, but it's extremely new in terms of games programming. Why? Data integrity is a big one. Anyone who's ever dealt with real-time multi-threaded systems will tell you that just getting used to the ways the different threads interact is a very nasty learning curve indeed. I suppose it'll mean that game devs have to be tidier, and again, this won't affect the big dev shops. The smaller ones, however, who are working on tight time budgets, will find this substantially harder than the good ol' delta-based "while(true)" game loop.
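The data-integrity problem in question is the textbook shared-state race (a generic Python sketch, nothing console-specific):

```python
# Classic hazard in threaded game code: several threads updating the
# same state. Without the lock, the read-modify-write of `score`
# can interleave and lose updates; with it, the result is deterministic.
import threading

score = 0
lock = threading.Lock()

def add_points(n, times):
    global score
    for _ in range(times):
        with lock:
            score += n

threads = [threading.Thread(target=add_points, args=(1, 100_000))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(score)  # 400000 with the lock; unpredictable without it
```

Getting every shared structure guarded like this (without deadlocking or serializing the whole frame) is exactly the learning curve being described.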

Finally, even with Ars Technica's points (and yes, they're a very solid tech source), I still have doubts about the ability of the XBOX 360 CPU to outperform the Cell at anything vector-based - and that includes physics. Simply put, the Cell (AFAIK) treats vectors as first-class types at the hardware level, and pretty much every physics calculation you can name is based on vector math. This is, of course, just my opinion, and since we won't see 'apples to apples' comparisons any time soon, it's all pretty useless conjecture.
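The kind of math in question is the bread and butter of physics code - for instance a semi-implicit Euler step, sketched here in plain Python with tuples standing in for hardware vector types:

```python
# Semi-implicit Euler integration: the per-entity vector math a physics
# engine runs every frame (tuples stand in for hardware vector registers).
def vec_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def vec_scale(v, s):
    return tuple(x * s for x in v)

def step(pos, vel, accel, dt):
    vel = vec_add(vel, vec_scale(accel, dt))  # v += a * dt
    pos = vec_add(pos, vec_scale(vel, dt))    # p += v * dt
    return pos, vel

gravity = (0.0, -9.81, 0.0)
pos, vel = step((0.0, 10.0, 0.0), (5.0, 0.0, 0.0), gravity, 0.1)
print(pos)  # x advances with velocity, y starts to fall under gravity
```

Hardware that does the add and scale across a whole vector in one instruction is precisely where a vector-first CPU would shine.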
Old 06-15-05, 05:52 PM   #11
ATI_Dude
Registered User
 
 
Join Date: Aug 2003
Location: Copenhagen, Denmark
Posts: 396
Default Re: Another Kutaragi Interview, Decent Info.

Quote:
Originally Posted by Ruined
While I see that the X360 is similar in some ways, the primary difference is that it presents three fully functional CPU cores, instead of just one fully functional core with 7 SPEs. I just see the former as less formidable and more traditional for devs to program for, since it's plain old multithreading with 3 physical/6 logical CPUs. The Cell is more complex than that if you want to get its full potential out of it. It also seems that the X360's Xenon CPU will be superior to Cell at non-graphical calculations such as physics and AI, while Cell will be superior to Xenon at graphical calculations (source: Ars Technica). However, with the killer GPUs both consoles have, I think the advantage Cell has there becomes less significant, and the advantage the X360 has becomes more significant, since most of the graphics calculations will be done by the GPU.
I must admit that the PS3 is a bit of a mystery to me. It seems like a strange mix of the old PS2 architecture, where the CPU did all the vertex processing, and the traditional architecture, where the GPU does most of the vertex processing. I simply don't understand why Sony puts in a CPU that excels at FPU calculations when the system already has a dedicated graphics processing unit. Strong FPU performance is not really needed to manage AI, physics, etc. Maybe this will become clearer when the system is finalized and the first games are released.
Old 06-15-05, 09:00 PM   #12
evilchris
 
 
Join Date: Nov 2003
Location: San Diego, CA
Posts: 4,411
Default Re: Another Kutaragi Interview, Decent Info.

The R300 was BRAND NEW tech for ATi, and it kicked ass right out of the gate. The idea that you have to go through a few generations to get something right is ignorant.

As for ports, the PS2 has tons of them - Splinter Cell comes to mind. And other games, such as KOTOR, were just flat-out too much for it and didn't even get a port!

To suggest which console is easier to develop for at this time is ludicrous; an educated statement cannot be made on the subject since neither box exists yet.
__________________
--Communist Party of America--


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.