This is the season of overcrowded and overheated shops, office Christmas parties, long-lost relatives – including those who you pray would get lost – and false bonhomie.
But it is also, and this is much worse than any of that, the forecasting season. This is the time when normally sensible people cannot resist the urge to look into the crystal ball. In newspapers and magazines all over the country, journalists are putting together features on the outlook for 2005 and beyond, safe only in the knowledge that by about January 10 everybody will have forgotten them.
Nor are they operating in a vacuum. Around now, every other press release contains so-and-so’s "top 10 HR predictions for 2005", or simply "Outlook 2005". Given that China remains the hot economics story, there will be many predictions of the "What will the year of the rooster bring?" variety. The Economist, which seemingly cannot make room for all its predictions in a normal issue, has for some years put together a separate publication. The World in 2005 sells for £4.95 in WHSmith.
For connoisseurs of forecasting, however, there is one problem with the current wave of predictions. Too many people these days are so unwilling to risk putting their foot in their mouth that there is a dull uniformity to this huge output of forecasts. Most are so blindingly and boringly obvious that they have little chance of achieving the only real accolade in this area – being recognised by future historians as a truly awful forecast.
Collectors of truly awful forecasts have had no shortage of material over the years. For economists, Malthus’s Essay on the Principle of Population, which had the effect of making economics the "dismal science", was published more than 200 years ago but has enduring power. Malthus, of course, predicted that the world would run out of food. "Population, when unchecked, increases in geometrical ratio," he wrote. "Subsistence only increases in an arithmetical ratio. A slight acquaintance with the numbers will show the immensity of the first power in comparison with the second."
Malthus failed to take account of sharp improvements in agricultural productivity and methods. Even today, when there are millions of starving people in the world, the problem is not a shortage of food but its distribution.
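Malthus’s arithmetic is easy to reproduce. A minimal sketch (the doubling ratio and the unit step are illustrative choices, not Malthus’s own figures) shows how quickly the gap between the two growth rates opens up:

```python
# Malthus's claim: population grows geometrically (here, doubling each
# generation) while subsistence grows arithmetically (a fixed increment
# per generation). From equal starting points the gap explodes.

def geometric(start, ratio, periods):
    return [start * ratio**n for n in range(periods)]

def arithmetic(start, step, periods):
    return [start + step * n for n in range(periods)]

population = geometric(1, 2, 6)    # 1, 2, 4, 8, 16, 32
subsistence = arithmetic(1, 1, 6)  # 1, 2, 3, 4, 5, 6

for gen, (p, s) in enumerate(zip(population, subsistence)):
    print(f"generation {gen}: population {p}, subsistence {s}")
```

After only six generations the geometric series is more than five times the arithmetic one, which is the whole of Malthus’s point; what the sketch cannot capture is the productivity growth that made the prediction wrong.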
That has not stopped Malthus-type forecasts from appearing over the years. Just over 30 years ago an international think-tank called the Club of Rome published The Limits to Growth, which argued that the global economy would be forced to slow down because of a shortage of natural resources. Even now, after a period in which oil prices have hit a record high of $50 a barrel, there are predictions that the world is about to run out of oil. It is not, although we may have to get used to paying a bit more for it.
Laura Lee, an American journalist, wrote a whole book, Bad Predictions, on the subject of truly awful forecasts. Many, like those infamous Y2K predictions five years ago, when we faced disaster at the hands of the millennium bug, involve technology.
Ever since the Roman engineer Sextus Julius Frontinus declared, in the first century AD, that mankind had run out of things to invent, even forecasters paid to know about such things have got it wrong on technology. In 1952, IBM famously predicted a total market for computers of 52 units. Thirty years later, with the advent of the PC, it had sensibly raised this to 200,000. Today, that is roughly the number it ships each week.
Perhaps IBM was merely being modest about its product, as was Alexander Graham Bell when he predicted that one day there would be a telephone in every American city. A common forecasting error is to assume, like Sextus Julius Frontinus, that no further progress is possible, or likely. John von Neumann suggested in 1949 that we might have reached the limits of computer technology, while Arthur L Samuel wrote in the New Scientist in 1964 that computers were unlikely to get any faster. Even Bill Gates, the great computer visionary, is reported to have said in 1981 that "640K ought to be enough for anybody".
Transport is another favoured area for the truly awful forecast. In 1902 Harper’s Weekly told its readers: "The actual building of roads devoted to motor cars is not for the near future, in spite of many rumours to that effect." A year later the president of Michigan Savings Bank told Henry Ford’s lawyer not to risk investing in Ford’s company because "the horse is here to stay; the automobile is only a novelty".
Forecasters and futurologists fall into two timing traps. The first is to pitch things too far into the future. In August 1948, Science Digest opined that it would take mankind 200 years to land on the moon. In the event it took just over 20.
The other trap is to exaggerate the pace of change. Many of us grew up expecting that by now we would be commuting in flying cars and routinely holidaying on other planets. Some of us have probably imagined a future in which travel would consist of being beamed across continents. These things may happen, but not for a while. Arthur C Clarke predicted in Vogue in 1966 that by 2001 houses would be made of ultra-lightweight material and be capable of flying. "Whole communities may migrate south for the winter," he said. Perhaps he was simply having a bit of fun at the expense of the fashionistas.
It is easy to scoff, but it gets a little uncomfortable when the scoffing comes too close to home. Some of the strongest candidates for the roll of honour of truly awful forecasts relate to the job market and the way we work. And not all of them date back to the golden age of futurology, the 1950s and 1960s.
Many readers will remember the 1994 book, JobShift: How to Prosper in a Workplace Without Jobs. Author William Bridges argued that the growth of outsourcing, temping and consultancy spelt an end to "the job as we know it" – those "boxes on the organisational chart with regular duties, hours and salaries". He described the phenomenon as "de-jobbing" and compared the shift to a second industrial revolution, leading to mingled disappointment and relief a decade later among the millions who still toiled from nine-ish to five-ish with a salary and job description.
As if to trump this, the following year Jeremy Rifkin published The End of Work. This was on a broad canvas. After considering how work had been a part of human existence since Palaeolithic times, the book declared: "Now, for the first time, human labour is being systematically eliminated from the production process." Within less than a century, he asserted, mass work in the market sector was likely to be phased out in virtually all industrialised nations.
John Philpott, the CIPD’s chief economist, votes this one of his favourite truly awful forecasts.
"This ‘classic’ was published just as the US economy started to move back to full employment. Rifkin seems to have transferred his attention to ‘the coming environmental crisis’, although I imagine his earlier work will be given a retread in the light of equally absurd talk in the US at the moment of permanent ‘jobless growth’," he says. "Without being too defensive, I hope, serious economists seldom espouse absolute nonsense. This is mostly confined to popular futurologists."
Where futurologists – and some employers – have got it most wrong is on how working hours would evolve. Automation, labour-saving technology and the inexorable decline in the average working week from more than 65 hours in the 1850s appeared to have only one possible result.
Computer scientist Christopher Evans, in a 1978 piece for Science Fact entitled "Computers and artificial intelligence", predicted that "by 1990 people will be retiring at 40 or thereabouts". We can laugh but, in the 1990s, with occupational pension schemes apparently in a state of permanently rude health, many companies had policies of retiring people at 50.
The godfather of forecasters, the man for whom the truly awful forecast came very easily, was Herman Kahn. Kahn, a robust and quirky individual who is said to have been the model for Dr Strangelove, was a man of strong views and boundless imagination, particularly when it came to predictions. His 1967 book The Year 2000, still available on Amazon.com, genuinely is a classic.
Kahn did not do too badly when it came to some of his predictions, getting it more or less right on home computers, mobile phones, video recorders and satellite dishes, although overdoing it with his forecasts of underwater cities, new forms of energy and house-cleaning robots.
He probably thought his safest predictions involved work. By 2000, he suggested, nobody would be working more than 30 hours a week and 13 weeks of annual holiday would be the norm, even in workaholic America. Like many futurologists, Kahn thought the challenge in advanced societies would be filling the many hours of leisure created by technological and economic advance, although his solution was less conventional than most. He predicted that by 2000 humans would routinely hibernate through the darkest and coldest months of winter. That may be how it sometimes feels, but it is some way away from reality.
On a more mundane level, there is a phenomenon well known in newspaper offices. If a journalist knows one person who is doing something different to the norm, he or she is merely out of the ordinary. If there are two, that becomes an interesting phenomenon worthy of note. When there are three, it is a trend.
As Philpott puts it: "Many current scares are really versions of myths that have been around for a long time. These include the end of the ‘job for life’, the death of permanent full-time employment, and the end of the job as we know it and the rise of self-employed portfolio careers for all. These myths derive from over-extrapolating the experience of some individuals or groups in the economy."
Others derive from scanning one graph and assuming that the trend it illustrates will carry on in the same direction while everything else remains the same. This is what happened to 1960s pundits who predicted the population would become uncontrollable, just before the Pill came on the market. And it hit the doom-mongers of the 1980s who littered every conference with warnings of the "demographic time bomb" as the working population was about to slump. In the event, with baby-boomers and women staying longer in the labour market and the recession of the early 1990s, the bomb has failed to detonate.
Perhaps the job market trips us up so often because, more than some parts of the economy, it is so intimately tied up with human behaviour. Behaviour can and does change, often in ways we do not expect.
Ahead of Labour’s 1997 election victory and the introduction of a national minimum wage, most economists thought it would lead to higher unemployment. Michael Howard, leading the charge for the Conservative Party, said it would destroy two million jobs. Employers in certain sectors, particularly retail and catering, warned of the dire consequences of the policy.
As far as it is possible to tell, however, the minimum wage had a negligible impact on jobs or, if it did, the effect was swamped by that of a generally strong job market. The direst of the predictions were based on the belief that workers further up the income scale would seek to maintain differentials with those benefiting from the minimum wage. That has not happened.
On a more fundamental level, few of us expected the steady fall in unemployment that has characterised Labour’s period in office. And in many ways we were right to be sceptical.
"Any structural break in behaviour can make models based on past relationships unreliable," says Philpott. "Having observed the unusually early shake-out of jobs and associated strong productivity growth when the UK economy went into recession in 1990-91 – interpreting this as a sign of a more flexible ‘hire and fire’ labour market – I expected a similar response and some rise in unemployment when the economy slowed in the late 1990s. In the event, unemployment continued to fall, mainly because employers preferred to hoard rather than fire staff in what by then was a relatively tight labour market."
The job market has caught out economists in another way. For years an essential weapon in their armoury was the notion of a certain level of unemployment at which wage settlements, and therefore inflation, would start to take off. This, the clumsily named non-accelerating inflation rate of unemployment (Nairu), was thought to be 7 or 8 per cent of the workforce, two million or more on the wide Labour Force Survey (LFS) jobless measure. But unemployment has come down to 5 per cent on the LFS measure, and under 3 per cent on the claimant count (meeting the traditional definition of full employment), without triggering a new bout of inflation.
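The arithmetic behind those Nairu figures is straightforward. A rough sketch (the labour force of about 28.5 million is an assumption implied by the article’s own figures, not a number stated in it):

```python
# Rough arithmetic linking unemployment rates to headcounts.
# Assumption: a UK labour force of roughly 28.5 million, which is what
# "7 or 8 per cent ... two million or more" implies.
labour_force = 28_500_000

nairu_low = 0.07 * labour_force   # ~2.0 million jobless at a 7% Nairu
nairu_high = 0.08 * labour_force  # ~2.3 million at 8%
lfs_now = 0.05 * labour_force     # ~1.4 million at the 5% LFS rate

print(round(nairu_low / 1e6, 1))   # millions unemployed at 7%
print(round(nairu_high / 1e6, 1))  # millions unemployed at 8%
print(round(lfs_now / 1e6, 1))     # millions unemployed at 5%
```

On these figures, unemployment is now half a million or more below the level at which wage inflation was once thought certain to take off.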
"This version of doom and gloom was again proved wrong," says Philpott, "although there are still disputes as to whether the Nairu theory was wrong, whether the Nairu was below two million to start with, or whether it has since fallen."
So we have got it wrong, over and over again, although sometimes we have got it right. Where are we likely to be getting it wrong when it comes to the current batch of predictions about the future?
One useful rule of thumb is to beware the herd. When everybody expects something, there is a good chance it will turn out to be wrong. The trouble is that it passes unnoticed when the consensus is right, or we forget that there was a healthy debate at the time. Take the collapse in dot-com and other technology shares five years ago. Hindsight tells us that we all got sucked into this mood of irrational exuberance. In fact, many economists and market analysts warned repeatedly of the danger – and some fund managers lost their jobs by staying out of technology shares – but failed to convince enough investors.
Are we getting it wrong on some of the really big things? Another of those healthy debates, on the question of global warming and climate change, is in full swing. While most of the scientific establishment now argues that the case is unanswerable, others take a different view. Bjorn Lomborg, author of The Sceptical Environmentalist, has assembled a group of Nobel prize-winning economists and others to argue that, even if we accept the science of global warming, it is far from clear that this should be a priority.
And what about the consensus that says Europe is destined to become an economic backwater, shackled by its ageing population and inability to embrace economic reform?
A high-level group chaired by Wim Kok, the former Dutch prime minister, and including our own Will Hutton, head of the Work Foundation, as its rapporteur, recently added its voice to the gloom on Europe. The EU, it said, had failed to respond to the reform initiatives launched at Lisbon four years ago. A failure to make the EU economy more flexible and responsive would, it said, threaten Europe’s very civilisation.
The consensus may be right. One worry for the new EU entrants from eastern Europe, says Philpott, is that they are adopting "the full panoply of EU laws and employment regulations," which will survive and hold them back long after their labour cost advantage has gone.
But these things change. At the end of the 1980s, America was regarded as a lumbering giant, beset with low productivity growth and a lack of dynamism. It was only a matter of time before the US would be overhauled by dynamic, rapidly-growing Japan. But the 1990s was the American decade, while Japan suffered weak economic growth.
What about the consensus on retirement? In a few short years, we have moved from a position where earlier and earlier retirement was thought possible, indeed preferable. Now, as the government embraces EU age discrimination legislation and digests Adair Turner’s recommendations, the ground is being steadily prepared for retirement at 70.
But perhaps the shift has been overdone. The great futurologists, even those who gave us some of those truly awful forecasts, all believed that technical progress and rising productivity would enable shorter working hours and earlier retirement. Maybe they were not entirely wrong. And maybe some of the gloom about "working till you drop" has been overdone. Most forecasts, after all, are wrong.
Top five forecasters’ excuses
- "The statistics were wrong." A favourite one this, particularly among economists. If recent history is clouded in uncertainty, even the most talented forecaster will struggle to get the future right. But statistical revisions often work to the forecasters’ advantage. Funnily, they never seem to mention that.
- "It was a bolt from the blue." Otherwise known as the act of God, or the "all bets are off" excuse: an unexpected shock so big it would throw any forecast off its tracks. Sounds reasonable enough, except that these bolt-from-the-blue events are often the excuse for more forecasting howlers. After the attacks on America on 11 September 2001, almost everybody predicted a deep recession. The US economy, in fact, had already been in recession for several months and the action taken in the wake of 9/11 lifted it out.
- "They heeded my warnings." Otherwise known as the Y2K excuse, the consultants, technical experts and IT companies who made billions collectively out of fear of the millennium bug argued that without this expenditure there would indeed have been a technological disaster when the clock struck midnight on 31 December 1999. It looks like an uncheckable claim, except for the fact that countries such as Italy, which did not spend, also avoided Y2K disaster.
- "I’ll be right in the end." The old forecasters’ adage – give a forecast and a date but never the two together – is rather too close to the truth for comfort. It is a feature of the predictive art that a forecast is rarely wrong, simply ahead of its time.
- "If the facts change, I change my mind. What do you do, sir?" The language, as you might expect from somebody as cultured as John Maynard Keynes, is elegant. Behind it, however, is the perfect catch-all forecasters’ excuse: people did not behave as our forecasting model said they would but, never mind, we will know next time.
David Smith is economics editor of The Sunday Times. He will be speaking at the CIPD’s Annual Reward Conference on 8-10 February at London’s Olympia