Originally posted by LoudMusic
Heh heh, I already tried. The movement died when Yahoo! refused to hand write the search results pages. They said something about, "we're not gonna hand code no 100,000 stinking pages a minute you stupid punk!" ... or something. Don't quote me on that (:
But seriously, I've been to sites whose welcome page was a couple hundred kilobytes, and it had about 40 linked images. At 5 KB/s (and that's good for 56k), it would take over a minute just to get the stupid welcome page! Every byte counts, and in HTML a byte equals a character - including returns and spaces. It's definitely something to think about when you're designing large scale sites.
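Quick back-of-the-envelope (the ~200 KB page, 40 images, and 5 KB/s come from above; the half-second of per-request overhead is just my guess for dial-up latency):

```python
# Rough download time for a heavy welcome page on a good 56k connection.
page_kb = 200                # "a couple hundred kilobytes"
images = 40                  # linked images on the page
throughput_kbps = 5          # KB per second, about the best 56k will do
request_overhead_s = 0.5     # assumed extra latency per image request

transfer_time = page_kb / throughput_kbps       # 40 seconds just moving the bytes
overhead_time = images * request_overhead_s     # 20 more seconds of round trips
print(f"~{transfer_time + overhead_time:.0f} seconds to load")  # roughly a minute
```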
If you look at Google's approach, they knew their service was going to be very popular and require loads of bandwidth. The welcome page is quick and simple, because it gets requested hundreds of thousands of times a day. Every character they can remove is worth almost a megabyte of bandwidth a day. So they made it as simple as possible, then removed all the extra returns. Probably saved them a gigabyte of throughput a day ... EVERY DAY! Then when you take a gander at the search results, it's more of the same. Stripped-down code to make it as efficient as possible. You'll also notice how few images there are. The same goes for Yahoo. I remember when they only had four images on their main page - the banner, and the three credit card icons.
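Rough math behind that gigabyte-a-day figure, assuming somewhere around a million front-page hits a day and on the order of a thousand characters stripped (both my ballpark figures, not Google's actual numbers):

```python
# One character removed = one byte saved per request.
requests_per_day = 1_000_000   # assumed order of magnitude for a hugely popular page
chars_removed = 1_000          # assumed: returns, spaces, and extra markup stripped

bytes_saved = requests_per_day * chars_removed
print(f"{bytes_saved / 1_000_000_000:.1f} GB of throughput saved per day")  # ~1 GB/day
```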
And besides, it's easier to go back and fix/change things when you've actually written it by hand. I saw a page that a guy had re-edited so many times that it had overlapping, repeating, conflicting code and wouldn't load correctly in any browser. It was amazing ...
~LoudMusic