How would humanity change if we knew aliens existed?

We have yet to discover any signs of an extraterrestrial civilization — a situation that could quite literally change overnight. Should that happen, our sense of ourselves and our place in the cosmos would be forever shaken. It could even change the course of human history. Or would it?


Last week, SETI's Seth Shostak claimed that we'll detect an alien civilization by 2040. Personally, I don't believe this will happen (for reasons I can elucidate in a future post — but the Fermi Paradox is definitely a factor, as is the problem of receiving coherent radio signals across interstellar distances). But it got me wondering: What, if anything, would change in the trajectory of a civilization's development if it had definitive proof that intelligent extraterrestrials (ETIs) were real?

Finding a World Much Like Our Own

As I thought about this, I assumed a scenario with three basic elements.

First, that humanity would make this historic discovery within the next several years or so. Second, that we wouldn't actually make contact with the other civilization (just the receipt, say, of a radio transmission — something like a Lucy Signal that would cue us to their existence). And third, that the ETI in question would be at roughly our own level of technological development — not too much more advanced than we are. (That said, if the signal came from an extreme distance, like hundreds or thousands of light-years away, these aliens would probably have advanced appreciably by now. Or they could be gone altogether, the victims of a self-inflicted disaster.)

I tossed this question over to my friend and colleague Milan Cirkovic. He's a Senior Research Associate at the Astronomical Observatory of Belgrade and a leading expert on SETI.

"Well, that's a very practical question, isn't it?" he responded. "Because people have been expecting something like this since 1960 when SETI was first launched — they haven't really been expecting to find billion-year-old supercivilizations or just some stupid bacteria."

Indeed, the underlying philosophy of SETI over the course of its 50-year history has been that we'll likely detect a civilization roughly equal to our own — for better or worse. And no doubt, in retrospect it started to look "for worse" when the hopes of an early success were dashed. Frank Drake and his colleagues thought they would find signs of ETIs fairly quickly, but that turned out not to be the case (though Drake's echo can still be heard in the unwarranted contact optimism of Seth Shostak).
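That early optimism and its later collapse can both be traced to the Drake equation, which Drake introduced in 1961 to frame exactly this question. A minimal sketch of it follows; all the parameter values here are illustrative assumptions, not measured quantities, which is precisely why estimates of N have ranged from thousands of civilizations down to effectively zero.

```python
def drake_estimate(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake equation: N, the number of detectable civilizations
    in the galaxy, as a product of seven factors:
      r_star   -- rate of star formation (stars/year)
      f_p      -- fraction of stars with planets
      n_e      -- habitable planets per planetary system
      f_l      -- fraction of those where life arises
      f_i      -- fraction of those developing intelligence
      f_c      -- fraction of those emitting detectable signals
      lifetime -- years such a civilization remains detectable
    """
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Optimistic inputs in the spirit of early SETI hopes (assumed values):
optimistic = drake_estimate(r_star=10, f_p=0.5, n_e=2, f_l=1.0,
                            f_i=0.5, f_c=0.5, lifetime=10_000)

# Pessimistic inputs (assumed values): rare, short-lived civilizations.
pessimistic = drake_estimate(r_star=1, f_p=0.2, n_e=1, f_l=0.1,
                             f_i=0.01, f_c=0.1, lifetime=500)

print(optimistic)   # tens of thousands of detectable civilizations
print(pessimistic)  # effectively none
```

The equation doesn't predict anything by itself — it just makes the disagreement explicit: pick hopeful fractions and the galaxy should be noisy with signals; pick cautious ones and half a century of silence is exactly what you'd expect.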

"Enormous Implications"

"Some people argued that a simple signal wouldn't mean much for humanity," added Cirkovic, "but I think Carl Sagan, as usual, had a good response to this."

Specifically, Sagan said that the very understanding that we are not unique in the universe would have enormous implications for all those fields in which anthropocentrism reigns supreme.

"Which means, I guess, half of all the sciences and about 99% of the other, non-scientific discourse," said Cirkovic.

Sagan also believed that the detection of a signal would reignite enthusiasm for space in general, both in terms of research and eventually the colonization of space.

"The latter point was quite prescient, actually, because at the time he said this there wasn't much enthusiasm about it and it was much less visible and obvious than it is today," he added.

No doubt — this would likely generate tremendous excitement and enthusiasm for space exploration. In addition to expanding into space ourselves, there would be added impetus to reach out and meet them.

At the same time, however, some here on Earth might counter that we should stay home and hide from potentially dangerous civilizations (ah, but what if everybody did this?). Ironically, some might even argue that we should significantly ramp up our space and military technologies to meet potential alien threats.

Developmental Trajectories

In response to my query about the detection of ETIs affecting the developmental trajectory of civilizations, Cirkovic replied that both of Sagan's points can be generalized to any civilization in the early stages of its development.

He believes that overcoming speciesist biases, along with a constant interest in and interaction with the cosmic environment, must be desirable for any (even remotely) rational actors anywhere. But Cirkovic says there may be exceptions — like species who emerge from radically different environments, say, the atmospheres of Jovian planets. Such species would likely show little interest in surrounding space, which would be invisible to them practically 99% of the time.

So if Sagan is correct, detecting an alien civilization at this point in our history would likely be a good thing. In addition to fostering science and technological development, it would motivate us to explore and colonize space. And who knows, it could even instigate significant cultural and political changes (including the advent of political parties both in support of and in opposition to all this). It could even lead to new religions, or eliminate them altogether.

Another possibility is that nothing would change. Life on Earth would go on as usual as people work to pay their bills and keep a roof over their heads. There could be a kind of detachment from the whole thing, leading to a certain ambivalence.

At the same time, however, it could lead to hysteria and paranoia. Even worse, and in twisted irony, the detection of a civilization equal to our own (or any life less advanced than us, for that matter) could be used to fuel the Great Filter hypothesis, a proposed resolution of the Fermi Paradox. According to Oxford's Nick Bostrom, such a detection would be a strong indication that doom awaits us in the (likely) near future — a filter that affects all civilizations at or near our current technological stage. The reason, says Bostrom, is that in the absence of a Great Filter, the galaxy should be teeming with super-advanced ETIs by now. Which it's clearly not.

Yikes. Stupid Fermi Paradox — always getting in the way of our future plans.

Follow me on Twitter: @dvorsky