Not really. What we're interested in here is the trend. And they got the trend wrong over the last few years.
I don't think they have - the 'trend' in the CRU data is not a five year mean. The five year mean is clearly shown in the Hansen (NASA GISS) data, however, and has clearly been rising since ~1970...
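For anyone unsure what a 'five year mean' actually does to an annual series, here's a minimal Python sketch of a centred five-year running mean - the anomaly values in the example are placeholders for illustration, not real GISS figures.

```python
# Minimal sketch of a centred five-year running mean over annual anomalies.
# The values in the usage example are placeholders, NOT real GISS data.

def running_mean(values, window=5):
    """Return the centred running mean of `values`; years without a full
    window of neighbours are reported as None rather than being smoothed."""
    half = window // 2
    means = []
    for i in range(len(values)):
        if i < half or i > len(values) - half - 1:
            means.append(None)  # not enough neighbouring years for a full window
        else:
            chunk = values[i - half:i + half + 1]
            means.append(sum(chunk) / window)
    return means

if __name__ == "__main__":
    # Illustrative annual anomalies in degrees C (placeholders only).
    anomalies = [0.10, 0.18, 0.05, 0.22, 0.30, 0.25, 0.35, 0.40]
    print(running_mean(anomalies))
```

The point of the smoothing is simply to make the underlying trend visible through the year-to-year noise.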
Isn't the trend supposed to be taking off exponentially or somesuch?
No. Even the most alarmist predictions are nothing like 'exponential'.
Aren't we supposed to be looking at a catastrophic departure into the abyss here?
Again, I'd say you've been listening to the catastrophists too much. Although global warming has the potential to cause major problems for our species, it doesn't mean that all consequences are bad by default. Indeed, the projected increase in deaths from heatwaves is smaller than the number of lives expected to be saved by milder winters...
It's worth noting at this point that it's only since 1880 that we've really been able to measure the global surface temperature - and with improving accuracy since then. It's no coincidence that the Industrial Revolution led to a scientific revolution. It's also worth noting that our measurements of global surface temperature carry a margin of error of around 0.2 degrees - and the 125-year period we're talking about spans a range of only about 1.0 degree between its lowest and highest points.
The degree of confidence in climate data is getting higher (and the margin of error is getting lower) all the time. The most recent data actually has an error of just +/- 0.05 degrees. I also think there is more significance in the data than you are pointing out - even with a margin of error of 0.2 degrees (i.e. +/- 0.1), the data clearly show a significant change, well clear of the margin of error. The warming observed in the 20th Century is unequivocal.
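To put the 'well clear of the margin of error' point in concrete terms, here's a back-of-the-envelope Python sketch, assuming a simple symmetric +/- 0.1 degree error per reading and the ~1.0 degree range mentioned above - it illustrates the arithmetic only, it is not an analysis of the actual datasets.

```python
import math

# Back-of-the-envelope check: does the observed change clear the measurement
# uncertainty?  Figures are the rough ones from the discussion above
# (+/- 0.1 C per reading, ~1.0 C between the lowest and highest points);
# this is an illustration of the arithmetic, not an analysis of real datasets.

per_reading_error = 0.1   # +/- degrees C on a single annual value
observed_change = 1.0     # degrees C between the two endpoints

# The uncertainty of a difference of two independent readings adds in quadrature.
difference_error = math.sqrt(2) * per_reading_error

print(f"observed change : {observed_change:.2f} C")
print(f"uncertainty     : +/- {difference_error:.2f} C")
print("well clear of the margin of error"
      if observed_change > 2 * difference_error else "within the noise")
```

Even with the older, larger error bars, the observed change is several times bigger than the uncertainty in the measurement.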
We have no real clue what happened before then, beyond marginally less accurate data which seems to say that just before the 1880s we left a low-grade Ice Age.
If by 'real' clue you mean that we cannot go back in time and measure the 'real' climate, then yes - I'd agree... but it's also a tad misleading. That's akin to saying that we have no real clue what dinosaurs looked like because we can only reconstruct them from fossil evidence. However true that may be, it categorically does not mean that we cannot get a very good idea of what they or the climate was like from the evidence that is available today.
There are multiple lines of evidence that give us a very real indication of past climate - there are many different types of 'proxies' of past climate, including some very tangible/real indicators of atmospheric gas composition (e.g. gas bubbles trapped in ancient ice sheets). Of course, the error attributed to such data is inherently larger, but even so, not only is the margin of error still low enough to show that 20th Century warming is unusual with respect to the past 1000-2000 years, but the degree of confidence in paleoclimate data is also increasing all the time as more data and better models are developed. To pretend that climate proxy data will ever be as accurate as direct measurements would be wrong. But by the same token, to dismiss climate proxies as useless or meaningless is equally wrong.
This graph, showing reconstructions of paleoclimate (brown) and instrumentally measured climate data (black line), used at least 12 separate sources of proxy data to reconstruct past temperature... we can see clearly that the margin of error is indeed a lot higher for the modelled data (about +/- 0.5 degrees across most regions). So although the paleoclimate record is indeed less accurate, I'd say it is very far from having no 'real' clue... This paleoclimate modelling must be considered in the context of climate forcings. What we know about natural forcings up to and including the 20th Century is that they alone cannot explain why the temperature is rising in the way it is now.
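As a rough illustration of how several independent proxies can be combined - and why the combined margin of error can still be useful even when each individual proxy is noisy - here's a toy inverse-variance-weighting sketch in Python. The proxy values are invented, not the twelve real series behind the graph.

```python
# Toy sketch: combining several independent proxy temperature estimates for
# the same year with inverse-variance weighting.  The proxy values are
# invented for illustration only - they are NOT the real series behind the
# reconstruction discussed above.

def combine(estimates):
    """estimates: list of (anomaly, one_sigma_error) tuples."""
    weights = [1.0 / (err ** 2) for _, err in estimates]
    total = sum(weights)
    mean = sum(w * val for w, (val, _) in zip(weights, estimates)) / total
    combined_error = (1.0 / total) ** 0.5
    return mean, combined_error

# Four made-up proxy estimates of the same year's anomaly (degrees C, +/- C).
proxies = [(-0.3, 0.5), (-0.1, 0.6), (-0.4, 0.5), (-0.2, 0.7)]
value, err = combine(proxies)
print(f"combined anomaly: {value:+.2f} C +/- {err:.2f} C")
```

The more independent proxies you fold in, the tighter the combined estimate gets - which is why a reconstruction built on a dozen sources is far better than 'no real clue', even if it never matches a thermometer.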
The underlying point here is that the planet is always changing its climate, so any analysis which takes any temperature as a "zero" is fundamentally inaccurate.
I don't dispute the fact that Earth's climate always has changed and always will change in any number of different ways, over any number of different timescales. You may be right that comparing absolute temperatures is pointless - but that is not what we are doing here. The take-home message is that the current period of warming is unusual in the context of the past millennium, and no naturally occurring phenomena (natural forcings) can fully explain it. The fact that factoring in anthropogenic forcings provides a much better explanation, however, is significant.
I'm also wondering what happened between 1940 and 1980 that kept global mean surface temperatures so low and unchanging, especially given that carbon dioxide emissions from sources attributed to mankind increased year-on-year at almost exactly the same rate as they have since.
I guess the answer to that is 'nothing happened'... although anthropogenic global warming is largely considered to have 'begun' at the start of the Industrial Revolution, the fact that anthropogenic forcings have become more dominant drivers of global climate than natural forcings is a relatively new phenomenon.
It's analogous to a point Danoff made some time ago about the suggestion that it took the US ten years to put a man on the moon - as he said, it didn't take ten years, or a hundred years... it took the entire history of mankind to put a man on the moon. Similarly, although anthropogenic radiative forcings are a problem now, that hasn't always been the case. What the climate record between 1940 and 1980 shows is that, despite increasing GHG emissions, natural forcings and anthropogenic forcings were balanced during that period (note that this doesn't mean anthropogenic forcings weren't around). Since then, however, anthropogenic forcings - such as massive deforestation and continuing global rises in GHG emissions - have become more significant, and the warming in the late 20th Century (from ~1975 onward) is attributable in the main to anthropogenic forcings.
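To make the balance argument explicit, here's a toy Python sketch of the arithmetic: warming only shows up when the net forcing (natural plus anthropogenic) moves away from zero. The forcing numbers are invented purely for illustration - they are not real radiative-forcing estimates.

```python
# Toy illustration of the balance argument: warming only shows up when the
# NET forcing (natural + anthropogenic) is pushed away from zero.  The
# numbers are invented purely to show the arithmetic - they are not real
# radiative-forcing estimates.

periods = {
    # period: (natural forcing, anthropogenic forcing) in arbitrary units
    "1940-1975": (-0.3, +0.3),   # roughly offsetting -> flat temperatures
    "1975-2005": (+0.0, +0.8),   # anthropogenic term dominates -> warming
}

for period, (natural, anthropogenic) in periods.items():
    net = natural + anthropogenic
    verdict = "warming" if net > 0 else ("cooling" if net < 0 else "flat")
    print(f"{period}: net forcing {net:+.1f} -> {verdict}")
```

So a flat mid-century record isn't evidence that anthropogenic forcings were absent - only that, for a while, they weren't the dominant term in the sum.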