The setup and detection of 1080i devices seems to work fine now. I didn't need to set up modelines or validation options; it's pretty much a default xorg.conf.
However, displaying 1080i video through the 1080i output still doesn't work correctly, and the video still needs to be deinterlaced. http://www.nvnews.net/vbulletin/show...ighlight=60.05
nvidia-settings -q RefreshRate still shows 60.05, which is wrong wrong wrong. My guess is that 60.05 Hz is hardcoded as the timing used for interlaced modes, and that mismatch causes the 'jittery'/'juddering' display; that would explain why 1080i video content periodically blends the current and next lines.
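A rough back-of-the-envelope sketch of why a small rate mismatch shows up as periodic judder. This assumes the content is NTSC-style 1080i at 59.94 fields/s (an assumption; the source rate isn't stated above): if the display actually runs at 60.05 Hz, a field must be repeated or dropped at the beat frequency between the two rates.

```python
# Illustrative only: beat period between an assumed 59.94 fields/s source
# and the 60.05 Hz refresh rate reported by nvidia-settings.
display_hz = 60.05   # rate reported by nvidia-settings -q RefreshRate
content_hz = 59.94   # ASSUMED NTSC-style 1080i field rate

# The rates drift apart by (display_hz - content_hz) cycles per second,
# so roughly once per beat period a field gets repeated or dropped.
beat_period = 1.0 / (display_hz - content_hz)
print(f"visible hiccup roughly every {beat_period:.2f} s")  # ~9.09 s
```

If that's what's happening, the periodic line blending would recur on the order of every nine seconds rather than being constant, which matches a timing mismatch better than a pure deinterlacing bug.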