[url=http://www.missingremote.com/index.php?option=com_content&task=view&id=2127&Itemid=1]Original Article Link[/url]
[i]And for the record, 1080i is higher resolution than 720p, those who think otherwise can not use a calculator, a 1080i field is bigger than a 720p frame, and there are twice as many of them.[/i]
Someone please correct me if I am wrong about this… but here is how I understand it.
720p: 720 horizontal (left to right) scan lines per vertical (top to bottom) pass, scanned progressively down the screen at 60 (actually 59.94, but close enough) frames per second. Since every pass paints every line, that is 60 full, complete images per second.
1080i: 1080 horizontal scan lines, but “interlaced” at 60 fields per second. Each frame is split into 2 fields of 540 scan lines each (the odd lines 1, 3, 5, 7, etc. and the even lines 2, 4, 6, 8, etc.), interlaced together for a total of 60 fields per second (30 odd + 30 even = 60).
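Here is a tiny Python sketch (mine, not from the article, just to picture it) showing how the two 540-line fields weave back together into one 1080-line picture:

[code]
# Illustration only: a 1080i frame is woven from an odd field and an even field,
# each holding 540 of the 1080 scan lines.
odd_field  = [f"line {n}" for n in range(1, 1081, 2)]   # lines 1, 3, 5, ... 1079
even_field = [f"line {n}" for n in range(2, 1081, 2)]   # lines 2, 4, 6, ... 1080

# Interleave the two fields back into a full 1080-line frame
full_frame = [line for pair in zip(odd_field, even_field) for line in pair]

print(len(odd_field), len(even_field), len(full_frame))  # 540 540 1080
[/code]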
Now for a little simple math…
720p: 720×1280 = 921,600 pixels per frame × 60 per sec = 55,296,000 pixels per second.
1080i: 540×1920 = 1,036,800 pixels per field × 60 per sec = 62,208,000 pixels per second.
So yes, 1080i delivers 6,912,000 more pixels per second than 720p, but not by much, and it is interlaced.
Now compare that with the pixel count for 1080p:
1080p: 1080×1920 = 2,073,600 pixels per frame × 60 per sec = 124,416,000 pixels per sec. (Yes, I know… duh… exactly twice 1080i.)
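If anyone wants to double-check the arithmetic, here is a quick Python throwaway (just my own numbers plugged in, nothing official) that redoes the sums above:

[code]
# Pixel throughput = pixels per field (or frame) x fields (or frames) per second
def pixels_per_second(width, lines, rate):
    return width * lines * rate

p720  = pixels_per_second(1280, 720, 60)    # progressive: 720 full lines, 60 frames/sec
i1080 = pixels_per_second(1920, 540, 60)    # interlaced: 540-line fields, 60 fields/sec
p1080 = pixels_per_second(1920, 1080, 60)   # progressive: 1080 full lines, 60 frames/sec

print(f"720p:  {p720:,} pixels/sec")        # 55,296,000
print(f"1080i: {i1080:,} pixels/sec")       # 62,208,000
print(f"1080p: {p1080:,} pixels/sec")       # 124,416,000
print(f"1080i beats 720p by {i1080 - p720:,} pixels/sec")  # 6,912,000
[/code]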
I have both a 1080i set and a 720p set. On the 1080i set I notice that fonts are almost impossible to read (I know that's the HTPC desktop, not actual TV content) with severely noticeable flicker, while the 720p set is rock solid. But for viewing OTA HD, or even DVDs upconverted with ffdshow, I notice little if any difference in quality. So going by the numbers you are not wrong to say 1080i has higher resolution, but since they are so close, it really comes down to the material being viewed and the person viewing it. (And since I don't own a 1080p set, and probably won't any time soon, so much for 1080p for now.)
Do I understand this correctly???