I have always had a problem with these charts, just like the claim that the human eye can't detect anything faster than 30fps, so 60fps doesn't matter... I even used to repeat that FPS line myself... until I realized that no matter what the math and tests say, I could tell the difference, and so could most people.
Most of these tests are done by measuring the eye's and brain's ability to isolate a certain detail, and then using that threshold to determine when you can and cannot see a difference.
For instance, when they say you can't see the difference between 1080p and 720p, they are usually using some calculation of how small a dot the eye can perceive per degree of field of vision. But what is not accounted for is that you can see the EFFECT of smaller dots even if you can't make out the individual dots themselves.
Sometimes the loss of detail is noticeable, especially in fine repeating patterns. Sometimes it is a perceived hardness of an edge... sometimes certain kinds of motion make it more noticeable.
Kind of like the human eye's ability to pick up more than 30fps. You might not be able to accurately recognize a shape of a certain complexity or isolate an image at more than 30fps, but you can certainly tell the difference, especially on a display in the absence of motion blur.
I personally can tell the difference between 720p and 1080p on good content from much farther away than most charts or math say I should be able to.
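For what it's worth, here is roughly the math those charts are built on (a minimal sketch assuming the usual 1-arcminute acuity figure, about 60 pixels per degree, and a 16:9 screen; the function name and the 50-inch example are just mine for illustration):

```python
import math

# Chart assumption: the eye resolves ~1 arcminute of detail.
# This is the figure I'm skeptical of, not a hard limit on what
# you can *notice*.
ACUITY_ARCMIN = 1.0

def chart_distance_inches(diagonal_in, horizontal_px, aspect=16 / 9):
    """Viewing distance beyond which, per the charts, one pixel
    subtends less than ACUITY_ARCMIN and extra resolution is 'wasted'."""
    # Screen width from the diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horizontal_px
    # A pixel subtends exactly 1 arcminute at this distance
    return pixel_in / math.tan(math.radians(ACUITY_ARCMIN / 60))

for label, px in [("720p", 1280), ("1080p", 1920)]:
    feet = chart_distance_inches(50, px) / 12  # a 50-inch set
    print(f"{label}: 'wasted' beyond ~{feet:.1f} ft, say the charts")
```

By that math a 50-inch 1080p picture stops mattering around six and a half feet, which is exactly the kind of figure my own eyes keep disagreeing with.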
I still say CRTs have some unbeatable image qualities...
However, the irony of your statement is that 1080i really has fewer lines of data on screen at any given time than 720p (each field draws only 540 lines, which alternate to build up the 1080). And since progressive PAL (576p) puts all 576 of its lines up at once, it technically has more lines on screen at any given instant than 1080i does. (Regular broadcast PAL is interlaced too, mind you, so it only manages 288 lines per field.)
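To spell that line counting out (a quick sketch; the per-field halving is just the definition of interlaced scanning, and the format list is my own choice of examples):

```python
# Lines of picture actually drawn in one field (interlaced) or one
# frame (progressive). Interlaced formats paint every other line per
# field, so only half the nominal count is new at any instant.
formats = {
    "1080i":      (1080, True),
    "720p":       (720,  False),
    "576i (PAL)": (576,  True),   # regular broadcast PAL
    "576p (PAL)": (576,  False),  # e.g. progressive-scan DVD output
}

for name, (lines, interlaced) in formats.items():
    on_screen = lines // 2 if interlaced else lines
    print(f"{name}: {on_screen} lines per field/frame")
```

Phosphor persistence and deinterlacers muddy what "on screen at any given time" really means, but those raw per-field numbers are where the irony comes from.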