Graphics cards have been multicore for ages, well before CPUs. They are already specialized processors that handle a ton of calculations in parallel. The reason companies go "multi-chip" is yield: on one massive monolithic die, a single defect can ruin the entire chip, and a big die wastes a larger portion of the wafer when it fails. Split the same die area across two chips, and one defect only kills one of them. What is needed is better abstraction of "multi-chip" designs to software and better chip-to-chip communication.
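To put rough numbers on the yield argument, here's a quick sketch using the classic Poisson yield model (yield = e^(-defect_density x die_area)). The defect density and die sizes below are made-up illustrative values, not figures for any real process:

```python
import math

def poisson_yield(defect_density, die_area):
    """Fraction of dies expected to be defect-free under the Poisson yield model."""
    return math.exp(-defect_density * die_area)

# Hypothetical process: 0.2 defects per cm^2, converted to per mm^2
d = 0.2 / 100

big = poisson_yield(d, 600)    # one 600 mm^2 monolithic die
small = poisson_yield(d, 300)  # each of two 300 mm^2 dies covering the same area

print(f"big die yield:   {big:.1%}")    # ~30.1% -- one defect scraps the whole chip
print(f"small die yield: {small:.1%}")  # ~54.9% -- a defect scraps only half the silicon
```

With the same total silicon, the smaller dies come out defect-free far more often, which is exactly why vendors eat the cost of the extra chip-to-chip interconnect.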
Originally Posted by AngelGraves13
What we need is dual-core graphics cards! I have a feeling that will be the next "big thing"