Originally posted by Edge
That's pretty far off, it was actually running at a full 30 FPS on a 1.4 GHz computer with a GeForce 3 card in it, and detail was very high (hard to say if it was "1/10th" or not, the hair and lighting weren't as good, but the character models looked almost exactly the same as they did in the movie). Check out GameSpot's article about it; they have a video of the demo running on a computer at 30 FPS back in 2001.
And the ONLY thing I could see that would be very difficult for current hardware to replicate is the lighting, particularly the radiosity. There is simply no way that any hardware in the next couple of years will be able to do real-time radiosity on every object, which will make things look a bit unrealistic.
What he said.
Doing Final Fantasy in real time requires more than just high poly counts and texture resolutions. The Final Fantasy nVidia demo didn't do "real" lighting. I'm talking about when you shine a light on a wall, and some of the light reflects off it to shine on the floor, which reflects some of it up onto the ceiling, and so on. The lights also need to fall off over distance and cast soft shadows. Real lighting is very math intensive and probably accounts for 99% of the time it takes to render a movie like Final Fantasy, and nVidia skipped that bit for their demo.
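To make the "math intensive" bit concrete, here's a toy sketch of why bounced light is so expensive: each surface patch has to gather light from every other patch, so a single bounce is already O(n²) work, and a full solution needs many bounces. The scene, numbers, and patch layout below are made up purely for illustration, this isn't how the nVidia demo or any real renderer is structured:

```python
import math

def falloff(intensity, distance):
    # Light intensity falls off with the square of distance (inverse-square law).
    return intensity / (distance ** 2)

# Toy "patches": position, reflectance (0..1), and current light.
# The first patch is a light source; the others start dark.
patches = [
    {"pos": (0.0, 2.0, 0.0), "refl": 0.0, "light": 100.0},  # light source
    {"pos": (0.0, 0.0, 0.0), "refl": 0.5, "light": 0.0},    # floor
    {"pos": (2.0, 1.0, 0.0), "refl": 0.5, "light": 0.0},    # wall
]

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def bounce(patches):
    """One radiosity-style gather: every patch collects light from every
    other lit patch, attenuated by distance and scaled by reflectance."""
    gathered = []
    for i, p in enumerate(patches):
        incoming = sum(
            falloff(q["light"], dist(p["pos"], q["pos"]))
            for j, q in enumerate(patches)
            if i != j and q["light"] > 0
        )
        gathered.append(p["refl"] * incoming)
    # Apply all the gathered light at once (Jacobi-style update).
    for p, extra in zip(patches, gathered):
        p["light"] += extra

bounce(patches)  # floor and wall are now lit directly by the source...
bounce(patches)  # ...and each also picks up light bounced off the other
```

Real radiosity also needs visibility tests and form factors based on patch orientation and area, which is exactly the part that eats render-farm hours.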
IMO even CGI scenes from old movies like Tron and The Last Starfighter had better lighting than today's games. Games rely on cheats like light maps to give the illusion of a properly lit scene (which looks great in screenshots), but once you interact with the environment it still looks fake. The Doom 3 engine is a first step in the right direction, but is only maybe 1/10th of the way there.
If graphics cards continue to evolve at their current rate (each new generation of card is only about 50% faster than the previous one), we will all be long dead before we see Final Fantasy truly rendered in real time.
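As a back-of-the-envelope check on that 50%-per-generation claim, you can compute how many generations compound growth takes to close a given gap. The 1000x gap and the ~18 months per generation are assumptions I'm plugging in for illustration, not figures from the post:

```python
import math

speedup_per_gen = 1.5   # "only about 50% faster" per generation
target = 1000.0         # assumed gap between real time and offline movie rendering
months_per_gen = 18     # assumed release cadence

# Generations needed so that speedup_per_gen ** gens >= target.
gens = math.ceil(math.log(target) / math.log(speedup_per_gen))
years = gens * months_per_gen / 12
```

With these assumed numbers it works out to 18 generations, or roughly 27 years, so whether that counts as "long dead" depends entirely on how big you think the gap really is.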