This little gem comes from the 1979 Biblical Doomsday “documentary” The Late, Great Planet Earth:
‘I am one of those scientists who finds it hard to see how the human race is to bring itself and bring the human enterprise much past the year 2000.’
That dire prediction was made by the famed scientist Dr. George Wald, by all accounts a brilliant man who won a Nobel Prize for his work.
The phrase “much past” makes Wald’s dooms-date ambiguous, though I consider it a failed prediction at this point, since we’re 17 years into the new century without civilization collapsing and without any evidence that it’s about to. On the contrary, since Wald’s quoted statement, we’ve managed to add three billion more humans to the planet while also sharply reducing global rates of malnourishment and absolute poverty. Across a wide variety of metrics, the human race has grown larger, healthier, richer, and less violent, and there are no signs these trends will abate anytime soon.
Making accurate predictions about the future is always fraught with uncertainty, but it becomes especially speculative when people make predictions about things outside their areas of expertise. Wald’s mastery of biochemistry left him with no better a grasp of the human race’s trajectory than the average person’s, and his inclusion in this religious doomsday documentary is an example of the “Appeal to Authority” logical fallacy, in which a person’s credentials are erroneously substituted for reasoned and fact-based argumentation.
In my recent blog entry about Richard Branson, I pointed out that predictions should not be trusted when the person making them stands to tangibly benefit from other people believing them. To that I’ll add that predictions should not be trusted when the person making them lacks relevant expertise. Moreover, name-dropping and credential-dropping should never substitute for independently verifiable facts and transparent methodologies.
UPDATE (8/28/2017): Coincidentally, I just came across the article “A Nobel Doesn’t Make You an Expert: Lessons in Science and Spin.” The author, a former New York Times science editor, uses the example of James Watson, who won a Nobel Prize for co-discovering the structure of DNA, to show that the opinions and predictions of “experts” are often of little value when they pertain to subjects outside their areas of expertise. In 1998, Dr. Watson erroneously predicted that cancer would be cured within two years. The author also sets forth a few tips for evaluating predictions from “experts,” which partly overlap with my own and which I’ll summarize here:
- Ensure that the person’s education and professional credentials are relevant. A useful measure of a scientist’s level of expertise is the quantity and quality of the peer-reviewed papers they have produced.
- Be suspicious when experts have conflicts of interest that may bias their opinions and predictions.
- Remember that experts whose theories fall far outside the scientific mainstream are usually (but not always) wrong.
- Be very suspicious of scientists and other experts who feel aggrieved or persecuted by the mainstream of their professions. If an expert with an outlier theory also believes there is a conspiracy against him or her, it should raise a red flag in your mind.
Links:
https://undark.org/article/cornelia-dean-making-sense-of-science/