
The good scientist

Science is the one culture that all humans share. What would it mean to create a scientifically literate future together?

by Martin Rees

Scientists working extra hours during the COVID-19 pandemic at the Laboratory of Infectious Diseases in Athens, Greece. 18 March 2020. Photo by Enri Canaj/Magnum

The Royal Society, the United Kingdom’s academy of sciences, was founded in 1660. At its earliest meetings, scientists shared travellers’ tales, peered through newly invented microscopes, and experimented with air pumps, explosions and poisons. Its early fellows included the polymaths Christopher Wren and Robert Hooke, along with enthusiastic amateurs such as the prolific diarist Samuel Pepys. Sometimes gatherings turned gruesome: Pepys recorded a blood transfusion from a sheep to a man – who, amazingly, survived. Health and safety rules render Royal Society meetings somewhat duller these days, but the guiding spirit remains. Right from the start, the Society recognised that science was international and multidisciplinary.

Science and technology, of course, expanded hugely over the following centuries. As a result, the Royal Society’s present-day fellows are specialised professionals. This specialisation deepens the barrier between science and the public, as well as the barriers between different specialisms. As a physical scientist, I owe most of my own all-too-limited knowledge of modern biology to ‘popular’ books on the subject.

The sharp demarcation between scientists and humanities scholars would have perplexed intellectuals such as Wren, Hooke and Pepys. In 1959 the novelist, critic and chemist C P Snow bemoaned this divide in his iconic lecture on the ‘Two Cultures’, presented at the University of Cambridge. There was (and still is) much truth in his analysis; we are all too narrow in our cultural reach. However, Snow presented the dichotomy too starkly – a consequence, perhaps, of the social milieu in which he moved. He felt an affinity with scientists and engineers who had been part of the war effort in the Second World War, and retained a robust sense of optimism about the role of science in human betterment. That generation had ‘the future in their bones’, he said, and roamed what he elsewhere called the ‘corridors of power’. They influenced, among others, the UK’s prime minister Harold Wilson, who extolled ‘the white heat of this technological revolution’ in a celebrated speech at the 1963 Labour Party conference. In contrast, the humanities scholars whom Snow knew best – and who typified, for him, the literary culture of the 1950s – had been intellectually straitjacketed by schooling with a strong focus on Classical languages, often followed by three years in the narrow social world of Oxford or Cambridge.

The issues that concerned Snow loom even larger today. Societies are increasingly dependent on advanced technology; science pervades our lives more than ever. But the glad optimism about science has faded. In many quarters, observers view the impact of new breakthroughs with more ambivalence than excitement. Since Snow’s time, our ‘marvellous’ new technologies have created fresh hazards and raised new ethical quandaries. Many commentators are anxious that science is getting out of hand, such that neither politicians nor the public can assimilate or cope with it. The stakes are higher now too: science offers huge opportunities, but future generations will be vulnerable to risks – nuclear, genetic, algorithmic – powerful enough to jeopardise the very survival of our civilisation.

In a later publication based on his original lecture, Snow suggested that there was a ‘third culture’, one embracing the social sciences. Today it might be truer to say that the very idea of ‘culture’ has many interweaving strands. Nonetheless, intellectual narrowness and ignorance remain endemic, and science is a closed book to a worrying number of people in politics and the media. But just as many people are ignorant of the history and literature of their own nation. Scientists don’t have a special reason to moan; in fact, it’s really quite remarkable how many people are interested in subjects as blazingly irrelevant to practical life as dinosaurs, the Higgs boson and cosmology. There is a surprising and gratifying interest in fundamental big questions – such as the origins of consciousness, of life, and of the cosmos itself.

Charles Darwin’s ideas, for example, have been culturally and philosophically resonant ever since they were first unveiled in 1859. Indeed, they’ve never provoked more vibrant debates than they do today. Darwin was perhaps the last scientist who could present his research in a way accessible to general readers; today, it’s hard to present original findings without a forbidding array of equations, or a specialised vocabulary. ‘On the Origin of Species’, which he described as ‘one long argument’ underpinning his theory, ranks highly as a work of literature. It changed our perception of human beings by revealing that we were an outcome of a grand evolutionary process that can be traced back to the beginning of life on Earth. The great work closes with the words:

There is grandeur in this view of life … having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

When asked about religion, Darwin offered a diffident response:

[T]he whole subject is too profound for the human intellect. A dog might as well speculate on the mind of Newton. Let each man hope and believe what he can.

His reply strikes a chord with me. Science should straddle all faiths. Modern scientists evince a variety of religious attitudes; there are traditional believers as well as hardline atheists among them. If we learn anything from the pursuit of science, it’s that even something as basic as an atom is quite hard to understand. This truth should induce skepticism about any dogma, or any claim to have achieved more than a very incomplete and metaphorical insight into some profound aspect of existence. Atheist scientists are surely aware that many of their colleagues follow some religious practices. Rather than attacking pro-science mainstream religions, they should strive for peaceful coexistence with them, and thereby broaden the alliance against brands of fundamentalism that are indeed hostile to science.

Astronomy and cosmology now play a pivotal cultural role alongside Darwinism. Both biology and the study of celestial objects are intrinsically appealing, and not only because both subjects involve beautiful images and fascinating ideas. Their allure also comes from a certain positive and non-threatening popular image. Meanwhile, genetics and nuclear physics might be equally interesting, but the public is ambivalent about them because they have downsides as well as benefits.

Today, it’s a real intellectual deprivation to be blind to the marvellous vision offered by Darwinism and by modern cosmology – the chain of emergent complexity leading from a ‘big bang’ to stars, planets, biospheres, and human brains able to ponder the wonder and the mystery of it all. Concepts such as these should be part of the public conversation. So too should some conception of the natural environment and the principles that govern the biosphere and climate. Science is the one culture that all humans can share: protons, proteins and Pythagoras’ theorem are the same the world over.

Scientists don’t fall into a single mould. Isaac Newton’s mental powers seem to have been really ‘off the scale’, and his concentration was as exceptional as his intellect. When asked how he cracked such deep problems as gravity, he said: ‘by thinking on it continually’. He was solitary and reclusive when young, but became vain and vindictive in his later years. By contrast, Darwin was agreeable and sympathetic to the end, and modest in his self-assessment. ‘I have a fair share of invention,’ he wrote in his autobiography, ‘and of common sense or judgment, such as every fairly successful lawyer or doctor must have, but not, I believe, in any higher degree.’ Darwin reminds us that the thought processes of most scientists are not intrinsically different from those of other professionals, nor indeed from those of a detective assessing the evidence at a crime scene. They’re not monolithic, either. It’s simplistic to refer to ‘the scientific method’; the methodology varies widely depending on the topic, and demands a different mix of mathematical modelling, experiments and fieldwork. Each subfield requires diverse styles of thinking and attracts different personality types. Some scientists see themselves as intellectuals, others as technocrats.

The heterogeneous business of science is always a ‘work in progress’. Some theories are supported by overwhelming evidence; others are provisional and tentative. But, however confident we might be in a theory, we should keep our minds ajar if not open to the possibility that some intellectual revolution will offer a drastically different perspective.

Scientists tend to be severe critics of other people’s work. Those who overturn an established consensus to contribute something unexpected and original tend to garner the greatest esteem. But scientists should be equally critical of their own output: they must not become too enamoured of ‘pet’ theories or be influenced by wishful thinking. Many of us find that very hard. Scholars who have invested years of their life in a project are bound to be strongly committed to its importance, to the extent that it’s a traumatic wrench if the whole effort comes to nought. But initially tentative ideas firm up only after intense scrutiny and criticism – for instance, the link between smoking and lung cancer, or between HIV and AIDS. Scrutiny and criticism are also how seductive theories get destroyed by harsh facts. That’s why the American historian and sociologist Robert Merton in 1942 described science as ‘organised skepticism’.

Noisy controversy on a scientific topic doesn’t mean that the arguments are evenly balanced. Nonetheless, broad and open debate has shown itself to be the best route to clarity. That’s how science advances. Benign developments in communications mean that more people, worldwide, can participate in science. In particular, the best scientific journalists and bloggers are plugged into an extensive network and can help to calibrate controversies in areas such as climate science, cold fusion and epidemiology.

When rival theories fight it out, only one winner is left standing. A crucial piece of evidence can clinch the case. That happened in 1965 for Big Bang cosmology: weak microwaves were discovered that had no plausible interpretation other than as an afterglow of a hot, dense ‘beginning’. Or take the discovery of ‘seafloor spreading’, again in the 1960s, which converted almost all geophysicists to a belief in continental drift. In other cases, an idea gains only a gradual ascendancy; alternative views are marginalised until their proponents die off. Sometimes, the field itself moves on, and what once seemed an epochal issue is bypassed or sidelined. In general, the more remarkable a claim is – the more intrinsically unlikely, the more incompatible with a package of well-corroborated ideas – the more skeptical and less credulous it is appropriate to be. As the American astronomer Carl Sagan said: ‘Extraordinary claims require extraordinary evidence.’


The path towards a consensual scientific understanding can be winding, with many blind alleys explored along the way. Occasionally, a maverick is vindicated. We all enjoy seeing this happen – but such instances are rare, and rarer than the popular press would have us believe. Sometimes a prior consensus is overturned, but most advances transcend and generalise the concepts that went before, rather than contradicting them. Albert Einstein didn’t ‘overthrow’ Newton, for instance. He transcended Newton, offering a new perspective with broader scope and deeper insights into space, time and gravity.

What about ideas ‘beyond the fringe’? As an astronomer, I haven’t found it fruitful to engage in dialogue with astrologers or creationists. We don’t share the same methods, nor play by the same evidence-based rules. We shouldn’t let a craving for certainty – for the easy answers that science can seldom provide – drive us towards the illusory comfort and reassurance that these pseudosciences appear to offer.

If you ask scientists what they’re working on, they will rarely say that they are ‘trying to understand the Universe’ or ‘curing cancer’. Their normal response is something very narrow and specific. They realise that the big questions are important, but they must be tackled in a step-by-step way. Only cranks or geniuses try to solve the big questions in one go; the rest of us tackle a problem that’s bitesize, and hope to make incremental progress that way.

An occupational risk is that scientists can forget that these narrow problems are worthwhile only insofar as they’re a step towards answering some big question. This is why it’s good for scientists to engage with general audiences. I would derive far less satisfaction from my own research if I could talk about it only to a few other specialists. When one discusses the ‘great unknowns’, there’s less of a gap between the expert and the public – neither one has a clue. The experts are perhaps confused at a deeper level, but that’s all. Even if we explain ourselves badly, we benefit from exposure to general audiences who focus on the big questions and remind us of how much we still don’t know.

Sometimes, the most familiar questions are the ones that baffle us most – whereas some of the best-understood phenomena are far away in the cosmos. Astronomers can confidently explain black holes crashing together a billion light-years away. Yet our grasp of everyday matters that affect us all – diet and childcare, for instance – is still so meagre that ‘expert’ advice changes from year to year. But it isn’t necessarily paradoxical that we’ve understood some arcane cosmic phenomena while being flummoxed by everyday things. What challenges us is complexity, not mere size. The smallest insect is structured far more intricately than a star or a galaxy, and offers deeper mysteries.

A theme of Snow’s lecture was that scholars in the humanities failed to appreciate the creativity and imagination that the practice of science involves. But it can’t be denied that there are differences in what those things mean for the artist, as opposed to the scientist. An artist’s work might be individual and distinctive, but it generally doesn’t last. On the other hand, even a journeyman scientist should be able to add a few durable bricks to the corpus of ‘public knowledge’, even if our contributions as scientists will probably lose their identity. If A didn’t discover something, in general B soon would; indeed, there are many cases of near-simultaneous discovery. Einstein made a more distinctive imprint on 20th-century science than any other individual – but, had he never existed, all his insights would have been revealed by now, though probably by several people rather than by one great mind. Any scientist is ‘replaceable’, in a way that an individual artist is not. As the British immunologist Peter Medawar remarked, when Richard Wagner diverted his energies for 10 years in the middle of the Ring cycle to compose Tristan und Isolde and Die Meistersinger von Nürnberg, the German composer wasn’t worried that someone would scoop him on Götterdämmerung.

Einstein’s image as the benign and unkempt sage has made him as iconic as Wagner, with fame that extends far beyond his field. He was one of the few who really did achieve public celebrity. Einstein’s impact on general culture, though, has sometimes involved a misunderstanding of his actual meaning. In some ways, it’s a pity that he called his theory ‘relativity’; its essence is that the local laws of physics are the same in every frame of reference. ‘Theory of invariance’ might have been a more apt choice, and would have staunched the misleading analogies with relativism in the humanities and social sciences. But in terms of cultural fallout, he’s fared no worse than others. Werner Heisenberg’s uncertainty principle – a mathematically precise concept, the keystone of quantum mechanics – has been hijacked by adherents of oriental mysticism. Darwin has likewise suffered tendentious distortions, especially with regard to racial differences, eugenics and claims that Darwinism offers a basis for morality.

Societies now confront difficult questions such as: who should access the ‘readout’ of our personal genetic code? How will lengthening lifespans affect society? Should we build nuclear power stations – or wind farms – to keep the lights on without triggering ecological collapse? Should we plant genetically modified crops? Should the law allow ‘designer babies’ or cognition-enhancing drugs? These questions cannot be addressed without scientific expertise, but expertise alone cannot settle them: they straddle the practical business of policy too.

The gap between scientific expertise and practical affairs was more easily bridged in the 17th century. The scientists who founded the Royal Society described themselves as ‘ingenious and curious’ in the Philosophical Transactions that recorded their activities. But they were also immersed in the practical agenda of their era – improving navigation, exploring the New World, and rebuilding London after the Great Fire. They were, in the words of Francis Bacon, ‘merchants of light’, but were also committed to ‘the relief of man’s estate’.

Now that the impact of research can be so much greater, scientists have a still-deeper responsibility to engage with society than they did in previous centuries. At the same time, they must accept that they speak as citizens and not as experts when it comes to the economic, social and ethical aspects of policy. Yet if the discussion is to rise above mere sloganeering, everyone needs enough of a ‘feel’ for science to avoid becoming bamboozled by propaganda and bad statistics, and to sidestep over-deference to experts. The need for proper debate will become more acute in the future, as the pressures on the environment and from misdirected technology get more diverse and threatening.

Politicians need the best ‘in-house’ scientific advice. But, more than that, public debate must be enriched by ‘scientific citizens’ – scientists engaging with the media, and with a public attuned to the scope and limits of science. They can do this via campaigning groups, via blogging and journalism, or via NGO or political activity – all of which can catalyse a better-informed debate.


You would be a poor parent if you didn’t care about what happened to your children in adulthood, even though you might have little control over them. Likewise, no scientist should be indifferent to the fruits of their ideas – their creations. They should try to foster benign spin-offs, commercial or otherwise. They should resist, so far as they can, dubious or threatening applications of their work, and alert the public and politicians to perceived dangers.

We have some fine exemplars from the past. Take the atomic scientists who developed the first nuclear weapons during the Second World War. Fate had assigned them a pivotal role in history. Though many of them returned with relief to peacetime academic pursuits, the ivory tower didn’t serve as a sanctuary. They continued not just as academics but as engaged citizens, promoting efforts to control the power they had helped to unleash. Prominent among them were the physicists Joseph Rotblat, Hans Bethe and Rudolf Peierls. Because science straddles ideologies, they could sustain contact and trust with their Soviet counterparts and help to lay the groundwork for the 1960s arms control treaties.

Another example comes from the UK, where a dialogue between scientists and parliamentarians (initiated especially by the philosopher Mary Warnock in the 1980s) produced a widely admired legal framework regulating the use of embryos. Similar dialogue led to guidelines for stem-cell research. But in the UK and Europe, there have been failures too. For instance, the debate on GM crops was left until too late – until opinion was already polarised between eco-campaigners on the one side and commercial interests on the other. This has led to excessive caution in Europe – despite evidence that hundreds of millions of Americans have eaten GM crops for decades, without manifest harm.

Today, genetics and robotics are advancing apace, confronting us with a range of new contexts where regulation is required on ethical or prudential grounds. These are rightly prompting wide public discussion, but professionals have a special obligation to engage. Universities can use their staff’s expertise, and their convening power, to assess which scary scenarios – from ecothreats to misapplied genetics or cybertech – can be dismissed as science fiction, and which deserve serious attention.

Scientists who have worked in government often end up frustrated. It’s hard to get politicians to prioritise long-term issues – or measures that will mainly benefit people in remote parts of the world – when there’s a pressing agenda at home. Even the best politicians focus mainly on more urgent and parochial matters, and on getting re-elected. Scientists can often gain more leverage indirectly. Sagan, for instance, was a preeminent exemplar of the concerned scientist; in an age before social media, he had immense influence through his writings, broadcasts, lectures and campaigns. He would have been a leader in our contemporary age of protests and marches, electrifying crowds through his passion and his eloquence. We need such figures today.

Unlike our 17th-century forebears, we know a great deal about our world – and indeed about what lies beyond. Technologies that our ancestors couldn’t have conceived enrich our lives and our understanding. Many phenomena still make us fearful, but the advance of science spares us from irrational dread. We know that we are stewards of a precious ‘pale blue dot’ in a vast cosmos – a planet with a future measured in billions of years, whose fate depends on humanity’s collective actions. But all too often, the focus is parochial and short-term. We discount too heavily the problems we’ll leave for our grandchildren. And we downplay what’s happening even now in countries that seem far away.

Snow’s celebrated lecture in fact contained an impassioned plea for the deployment of science to lessen the gulf between rich and poor nations. This message deserves to be remembered at least as much as the two-cultures thesis that preceded it.

The same year that Snow delivered his lecture, Medawar concluded his Reith lectures for the BBC with the words:

The bells which toll for mankind are … like the bells on Alpine cattle; they are attached to our own necks, and it must be our fault if they do not make a cheerful and melodious sound.

Today, six decades later, the promise that science offers is greater than ever; but so too are the threats from its misuse. The stakes are getting higher, and the world is getting more interconnected. To harness the benefits while avoiding the dangers and ethical tradeoffs demands international collaboration, guided by values that science itself can’t provide.