Originally Posted by AthlonXP1800
Not sure about Intel's Sandy Bridge, but AMD already created a 128-bit CPU instruction set in Bulldozer, since R&D started in 2007.
I still don't fully understand why 128-bit is necessary in the general computing space, though. 64-bit is pretty easy to understand: mainstream software (i.e. everyday use for everyday consumers) is starting to run into memory limits that 32-bit can only work around with kernel tricks like PAE, and those workarounds are inefficient and more prone to bugs. And IMO it's good that AMD started that migration when they did (in spite of several Intel fanboys criticizing them for it), because it gave us the time we needed to migrate now that the limits of 32-bit are actually being hit in the consumer space.
128-bit, though, doesn't make any sense right now. We're nowhere near hitting the limits of 64-bit in the consumer space; not even remotely close. Hell, 64-bit addressing is way more than enough even for today's hard drives. Even if we doubled the capacity of hard disks every year for the next 20 years (2 TB × 2^20 ≈ 2 EiB, versus the 16 EiB limit of 64-bit addressing), we still wouldn't reach the limits of 64-bit. And that's just hard disks; volatile memory (where CPU address width would really make a difference) is much smaller.
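Just to sanity-check that back-of-envelope math, here's a quick Python sketch (treating "TB" as binary TiB for simplicity, so the units line up with the 2^64-byte limit):

```python
# Start from a 2 TB drive and double its capacity once a year for 20 years,
# then compare against the 64-bit byte-addressing ceiling (2**64 bytes).
TiB = 2**40
EiB = 2**60

capacity = 2 * TiB  # today's 2 TB drive
for _ in range(20):
    capacity *= 2   # one doubling per year

limit_64bit = 2**64  # 16 EiB

print(capacity // EiB)         # capacity after 20 doublings, in EiB -> 2
print(limit_64bit // EiB)      # 64-bit addressing limit, in EiB -> 16
print(capacity < limit_64bit)  # still under the ceiling -> True
```

So even after 20 straight years of doubling you'd sit at 2 EiB, an eighth of what 64-bit can address.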