
Should we be messaging the stars?

posted by Jason Kottke   Jun 30, 2017


Humans currently have the capability to blast powerful messages into space that will travel hundreds or thousands of light years and fall on the ears, eyes, and radio telescopes of possibly habitable worlds. A group called METI wants to do just that…to say a bracing “hello” to whoever is out there listening. But not everyone, starting with Elon Musk and Stephen Hawking, thinks this is such a good idea. In a piece for the NY Times Magazine called Greetings, E.T. (Please Don’t Murder Us.), Steven Johnson details the effort to communicate with possible alien worlds and the potential pitfalls.

The anti-METI movement is predicated on a grim statistical likelihood: If we do ever manage to make contact with another intelligent life-form, then almost by definition, our new pen pals will be far more advanced than we are. The best way to understand this is to consider, on a percentage basis, just how young our own high-tech civilization actually is. We have been sending structured radio signals from Earth for only the last 100 years. If the universe were exactly 14 billion years old, then it would have taken 13,999,999,900 years for radio communication to be harnessed on our planet. The odds that our message would reach a society that had been tinkering with radio for a shorter, or even similar, period of time would be staggeringly long. Imagine another planet that deviates from our timetable by just a tenth of 1 percent: If they are more advanced than us, then they will have been using radio (and successor technologies) for 14 million years. Of course, depending on where they live in the universe, their signals might take millions of years to reach us. But even if you factor in that transmission lag, if we pick up a signal from another galaxy, we will almost certainly find ourselves in conversation with a more advanced civilization.

It is this asymmetry that has convinced so many future-minded thinkers that METI is a bad idea. The history of colonialism here on Earth weighs particularly heavy on the imaginations of the METI critics. Stephen Hawking, for instance, made this observation in a 2010 documentary series: “If aliens visit us, the outcome would be much as when Columbus landed in America, which didn’t turn out well for the Native Americans.” David Brin echoes the Hawking critique: “Every single case we know of a more technologically advanced culture contacting a less technologically advanced culture resulted at least in pain.”
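The back-of-envelope arithmetic in Johnson’s passage is worth making explicit. A quick sketch (the round numbers are illustrative, matching the quote, not precise cosmology):

```python
# Rough timescale arithmetic from the quoted passage above.
# All figures are the article's round illustrative numbers.

UNIVERSE_AGE_YEARS = 14_000_000_000  # "exactly 14 billion years"
RADIO_AGE_YEARS = 100                # ~100 years of structured radio on Earth

# How long the universe existed before Earth harnessed radio:
pre_radio_years = UNIVERSE_AGE_YEARS - RADIO_AGE_YEARS
print(pre_radio_years)  # 13999999900

# A civilization whose timetable deviates from ours by just 0.1%
# would have a head start of roughly:
deviation = 0.001  # a tenth of 1 percent
head_start_years = int(UNIVERSE_AGE_YEARS * deviation)
print(head_start_years)  # 14000000 -> 14 million years of radio
```

Even a vanishingly small deviation in when a civilization develops radio translates into millions of years of technological lead, which is the asymmetry the anti-METI argument rests on.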

As Johnson notes, concerns like these will likely grow in number and magnitude in the coming decades. The march of technology is placing more and more potential power into the hands of fewer people.

Wrestling with the METI question suggests, to me at least, that the one invention human society needs is more conceptual than technological: We need to define a special class of decisions that potentially create extinction-level risk. New technologies (like superintelligent computers) or interventions (like METI) that pose even the slightest risk of causing human extinction would require some novel form of global oversight. And part of that process would entail establishing, as Denning suggests, some measure of risk tolerance on a planetary level. If we don’t, then by default the gamblers will always set the agenda, and the rest of us will have to live with the consequences of their wagers.

Since 1945, the global community has successfully navigated the nuclear danger to human civilization but has struggled to identify and address the threat of climate change set in motion by the Industrial Revolution. Things like superintelligence and cheap genetic modification (as with CRISPR) aren’t even on the global agenda, and something like METI sounds like science fiction to even the tech-savviest politicians. Technology can move very fast sometimes, and the pace is quickening. Politics? Not so much.

P.S. If you want to read some good recent sci-fi about the consequences of extraterrestrial communication, try Liu Cixin’s The Three-Body Problem.

Update: Johnson’s long piece was even longer pre-publication, and he shared some of the outtakes on his site, including a reference to The Three-Body Problem.

Even with 8,000 words to play with, we had to cut a number of passages that might still be of interest. About fifty people asked me on Twitter (or in the Times comments section) why the piece didn’t reference The Dark Forest trilogy, and particularly its opening novel The Three Body Problem, which features a METI-style outreach that goes spectacularly wrong. We did originally have a nod to the books, but just didn’t have the space to keep it; but they are indeed terrific (and haunting) and well worth reading if you’re interested in this question.