That only pertains to 1080p/30 vs 1080i/60. If the original material is 1080p/60 and is then converted to 1080i/60, you've permanently lost half the image.
Correct. And since the industry standard is 1080i at 60 fields per second and 1080p at 30 frames per second, properly deinterlacing 1080i/60 material that originated as 1080p/30 gives you back the original 1080p signal.
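To make that concrete, here is a toy sketch (plain Python with frames treated as line arrays, nothing like real broadcast hardware) of what the two conversions amount to: going 1080p/60 to 1080i/60 throws away half the lines of every frame for good, while weaving a 1080i/60 signal that started life as 1080p/30 puts the original frames back together.

```python
import numpy as np

def p60_to_i60(frames_60p):
    """Interlace 1080p/60: keep only the even lines of one frame and the
    odd lines of the next. Half the spatial data of every frame is
    discarded and cannot be recovered later."""
    return [frame[i % 2::2] for i, frame in enumerate(frames_60p)]  # 60 fields/s

def i60_to_p30_weave(fields_60i):
    """Deinterlace by weaving: pair each even field with the following odd
    field to rebuild full frames. If both fields came from the same original
    1080p/30 frame, this reconstructs it exactly."""
    frames_30p = []
    for even, odd in zip(fields_60i[0::2], fields_60i[1::2]):
        frame = np.empty((even.shape[0] + odd.shape[0], even.shape[1]), even.dtype)
        frame[0::2] = even
        frame[1::2] = odd
        frames_30p.append(frame)
    return frames_30p  # 30 frames/s
```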
It is true for the examples I gave, but not for the example you gave:
You can have 15fps film quadrupled up to 60fps and it will look like utter poop.
Correct. The human visual system more easily detects the lack of true motion below about 24 unique frames per second, so the picture starts to look like a fast-moving slide show, which in reality is exactly what film and video are. That is one reason the old silent films, most of which were shot at around 16fps
(most were hand cranked, so the frame rate could vary quite a lot), look the way they do, even when they are shown, as is often the case, at much higher speeds than they were shot at in order to reduce the risk of the nitrate film stock catching fire from longer exposure to the lamp. You can also see how a lower frame rate looks from what most webcams and video phones produce, which is often just 15fps at higher resolutions.
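The quadrupling itself is trivial, which is exactly the problem: each source frame is just shown four times, so the 60Hz output carries no more motion information than the 15fps input. A minimal sketch (the helper name is made up for illustration):

```python
def repeat_frames(frames_15fps, factor=4):
    """Pad 15fps material up to a 60Hz presentation by showing each frame
    four times in a row. No new motion information is created; the result
    still moves like a 15fps slide show, it is just flashed more often."""
    output_60hz = []
    for frame in frames_15fps:
        output_60hz.extend([frame] * factor)
    return output_60hz
```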
Doubled or tripled refresh rates are done for reasons of brightness. Even a still image can look dim if the refresh rate is low enough. Simplify it down to a single light bulb. Use the wall switch to turn it on and off at some given rate. Observe the room (not the light); it should seem noticeably dimmer than usual. Now increase that on/off rate, and the room will increase in brightness. Your average wall dimmer does exactly this, only much faster.
Except that leaving the lamp on continuously, never switching it off and on at all, no matter how fast, produces the highest possible lumen output... so no, the main purpose of refreshing an image has nothing to do with brightness; it has to do with the way our vision works.
In fact, the brighter the image, the higher the refresh rate needs to be, because our sensitivity to flicker increases with brightness. You can see this by lowering the refresh rate on a computer's CRT monitor to 60Hz and then cranking up the brightness: the brighter the monitor is set, the more visible the flicker becomes. Notice that while you can increase the brightness, the refresh rate stays the same... the two are not related, at least not in any significant manner.
If we watched 24fps film with the lamp continually on for maximum brightness, we would see more motion jitter than if each frame were flashed at 2x or 3x the frame rate. Refresh rates are used to fool the eye into thinking it is seeing more frames per second than really exist. The more "frames", or flashed images, we see within a second, the smoother the transition between frames becomes.
The primary reason we need higher refresh rates is to avoid seeing the flicker of the light source switching on and off; at 48Hz, the flash rate is already high enough that most frame transitions look smooth to the human eye.
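That is exactly what a film projector's multi-blade shutter does: the frame rate stays at 24fps, only the flash rate changes. A back-of-the-envelope sketch of the arithmetic (the helper name is made up for illustration):

```python
def flash_rate(frames_per_second, shutter_blades):
    """Flashes per second seen by the audience: each frame is shown once per
    blade opening, so a 2-blade shutter doubles and a 3-blade shutter triples
    the flicker rate without adding a single new frame."""
    return frames_per_second * shutter_blades

print(flash_rate(24, 2))  # 48 flashes per second
print(flash_rate(24, 3))  # 72 flashes per second
```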
At around 24fps, the average person perceives somewhat fluid motion. On a large screen (10' or more), the average person perceives the on/off dimming effect of lowered light output. Double up the refresh rate (show every frame twice) and you are able to double the light output to the screen, thus increasing the amount of light the viewer observes.
I'm very curious where you read this. Flashing a lamp that produces a maximum of, say, 5000 lumens isn't going to magically produce 10,000 or even 15,000 lumens because you double or triple the refresh rate. That's absolutely untrue, and it goes against the laws of physics.
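To put rough numbers on that (a back-of-the-envelope sketch using the 5000-lumen figure from above): the time-averaged output depends only on the lamp's peak output and the fraction of time it is on, not on how often it is switched.

```python
def average_lumens(peak_lumens, on_time_per_cycle, cycle_length):
    """Time-averaged output of a flashed lamp: peak output scaled by the
    fraction of each cycle the lamp is actually on (the duty cycle)."""
    return peak_lumens * (on_time_per_cycle / cycle_length)

# A 5000-lumen lamp at a 50% duty cycle averages 2500 lumens whether it is
# flashed 48 times a second or 96 times a second.
print(average_lumens(5000, 1 / 96, 1 / 48))   # 2500.0
print(average_lumens(5000, 1 / 192, 1 / 96))  # 2500.0
```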
But the motion sure isn't any smoother, since you haven't added any new data.
What is smoother is the transition from one flashed image to the next; the transitions are smoother, not the actual motion, which is what higher unique frame rates are for. Ideally, as stated before, a higher fps is best, but that comes at a steep price, both in the ability to capture (or, in the case of a game, render) all those extra frames and in transmitting all that additional data.
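The transmission side of that price is easy to quantify: uncompressed, doubling the frame rate doubles the bit rate. A rough sketch, assuming plain 8-bit 4:2:0 video (12 bits per pixel) before any compression:

```python
def raw_bitrate_mbps(width, height, frames_per_second, bits_per_pixel=12):
    """Uncompressed video bit rate in megabits per second.
    12 bits per pixel corresponds to 8-bit 4:2:0 sampling."""
    return width * height * bits_per_pixel * frames_per_second / 1e6

print(raw_bitrate_mbps(1920, 1080, 30))  # ~746 Mbit/s
print(raw_bitrate_mbps(1920, 1080, 60))  # ~1493 Mbit/s
```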