Recently, I've begun to see a trend where developers, specifically on consoles, try to "cheat" hardware limitations by relying heavily on the image post-processing built into TVs themselves.
Most 120 Hz TVs nowadays have a feature that "smooths" the video you're watching by interpolating extra frames to simulate full 60 frames-per-second motion. The problem is that the technology isn't perfect: not everything that happens on screen is predictable by the image processor, so we get a chopped-up effect where part of the image moves at 60 fps while the rest moves at 24 or 30 fps. It creates an almost disconnected feel, where the game or video seems disjointed or uneven.
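To make the idea concrete, here's a toy sketch of what frame interpolation does. This is an assumption-laden simplification: a real TV's image processor estimates motion vectors per region, while this version just blends pixel values linearly, which is exactly why unpredictable motion (a hard cut, a flickering HUD element) produces ghosting artifacts instead of smooth motion. Frames are modeled as flat lists of grayscale values.

```python
def interpolate_frame(frame_a, frame_b, t=0.5):
    """Synthesize an intermediate frame between two consecutive frames.

    t is the blend position: 0.0 returns frame_a, 1.0 returns frame_b.
    Pixels are blended independently, with no motion estimation.
    """
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Two consecutive 30 fps frames of a 4-pixel "image".
frame1 = [0, 0, 100, 100]
frame2 = [0, 100, 100, 0]

# Inserting a synthesized frame between them doubles the effective
# frame rate to 60 fps, but the new frame is only a guess: a pixel
# that jumped from 100 to 0 becomes a ghostly 50 instead of moving.
middle = interpolate_frame(frame1, frame2)
print(middle)  # [0.0, 50.0, 100.0, 50.0]
```

Where the guess is right, motion looks smoother; where it's wrong, you get the half-smooth, half-juddery look described above.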
If you go into an electronics retailer that sells games and TVs, such as Best Buy, you will notice that virtually all of the consoles nowadays are hooked up to 120 Hz televisions with the image-smoothing technology turned on. While you're playing, the game appears to run almost perfectly smoothly, but in reality it is only rendering at 30 fps or less. The only time this doesn't happen is with games that actually run at 60 fps, such as Forza Motorsport and other titles that are heavily reliant on fluid motion. Another place I noticed this was here (the video):
Watch the video. The majority of it is rendered at 30 fps, but there are blips of smoothness in there that make it look like full motion.
What is your opinion on this trend? Is it a good or a bad thing?