The console hasn't even released yet and it's already using 3-year-old tech... so by the time it releases, its GPU will be 4 years old. They better not be thinking about charging a lot for the system; I can only see the CPU and the new drive being the most costly things about the unit.
TIME’s "TechLand" blog claims that the console, set to launch in 2012, will pack a R700 series variant from Advanced Micro Devices, Inc. (AMD), built on a 32 nm process with 1 GB of video memory. R700 GPUs are found in AMD's two-generations-old Radeon 4000 Series -- the R700 architecture launched in 2008.
While the GPU may seem a bit underpowered by modern PC gaming standards, consider that the PlayStation 3 from Sony Corp. (TYO:6758) uses a modified version of the NVIDIA Corp. (NVDA) chip found inside the GeForce 7800 (2005-era) and the Xbox 360 from Microsoft Corp. (MSFT) uses a "Xenos" AMD GPU, which falls somewhere between an R520 (2005-era) and an R600 (2006-era) GPU. In other words, by console standards, the Wii U's reported GPU is quite advanced, with its architecture surpassing those found in the PS3 or Xbox 360.
So the PS3 and X360 launched with GPUs based on current or less-than-a-year-old technology. So while the Wii U is more powerful because it's newer, it's pretty pathetic that they couldn't use more current technology.
Oh, and about the disc format: they won't say what it is other than that it's not Blu-Ray and holds 25 GB of data.
In an interview with Kotaku, Nintendo designer Katsuya Eguchi confirmed that the Wii U will use a proprietary high-density optical disc format that isn't Blu-Ray. That can't make Sony too happy. Reportedly the discs will pack up to 25 GB -- the same as the maximum for a single-layer Blu-Ray disc. Mr. Eguchi declined to reveal whether standard DVD playback would be supported, whether double-layer (50 GB) discs would be supported, or whether we might see movies shipping in this new format.
Anyone wanna bet they're using HD DVD tech?