These are the days of fever dreams, whether induced by an actual virus or by the slow-motion stresses of a world dealing with a pandemic. One dream in particular that I know I’ve had involves discovering that this was all, well, a dream. Except, when I really do wake up, I remember that there are ideas about the nature of reality that go beyond even this. The trickiest of these is the simulation hypothesis: the proposition that we far more likely exist within a virtual reality than in a physical one.
The proposition that the world is a sham is not new; it has cropped up for thousands of years across different cultures, from ancient China to ancient Greece, and in thinkers like Descartes, with his famous doubts about whether the senses can be trusted. But the more recent version, based around computation—or at least artificial reconstruction—bubbled up in 2003 with the publication of a paper titled “Are You Living in a Computer Simulation?” by the philosopher Nick Bostrom. In essence, Bostrom argues that if any extremely advanced civilizations develop the capacity to run “ancestor simulations” (to learn about their own pasts), the simulated ancestral entities would likely far outnumber actual sentient entities in the universe. With a little probabilistic hand-waving, it is then possible to argue that we are most likely simulated.
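That probabilistic hand-waving can be caricatured in a few lines of arithmetic. The sketch below is a toy illustration, not Bostrom's formal argument, and every number in it is invented:

```python
# Toy version of Bostrom-style reasoning: if simulated minds vastly
# outnumber unsimulated ones, a randomly chosen observer is almost
# certainly simulated. All quantities here are made up for illustration.

def prob_simulated(real_minds, sims_per_civ, minds_per_sim):
    """Fraction of all minds that are simulated."""
    simulated = sims_per_civ * minds_per_sim
    return simulated / (simulated + real_minds)

# Suppose one "real" civilization of 10 billion minds runs 1,000
# ancestor simulations containing 10 billion minds each.
p = prob_simulated(real_minds=10**10, sims_per_civ=1000, minds_per_sim=10**10)
print(f"{p:.4f}")  # close to 1: nearly all minds are simulated ones
```

The point of the toy model is only that the conclusion is driven by the ratio of simulated to real minds, which is why the argument leans so heavily on the assumption that ancestor simulations would be numerous.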
All of which is good fun if you’ve had a few beers or spent a few too many hours cowering under your bedclothes. But while you might love or hate this hypothesis, the simple fact is that before judging it we should really apply the criteria we use for assessing any hypothesis, and the first step in that process is to ask whether it can be assessed in any reasonable way.
Intriguingly, the simulation hypothesis might be testable, under certain assumptions. For example, we might suppose that a simulation has its limitations. The most obvious one, extrapolating from the current state of digital computation, is simply that a simulation will have to make approximations to save on information storage and calculation overheads. In other words: it would have limits on accuracy and precision.
One way that those limits could manifest themselves is in the discretization of the world, perhaps showing up in spatial and temporal resolution barriers. Although we do think that there are some absolute limits in what constitutes meaningful small distances or time intervals—the Planck scale and Planck time—that has to do with the limits of our current understanding of physics rather than the kind of resolution limits on your pixelated screen. Nonetheless, recent research suggests that the true limit of meaningful intervals of time might be orders of magnitude larger than the traditional Planck time (which itself is 10⁻⁴³ seconds). Perhaps future physics experiments could reveal an unexpected chunkiness to time and space.
But the neatest test of the hypothesis would be to crash the system that runs our simulation. Naturally, that sounds a bit ill-advised, but if we’re all virtual entities anyway does it really matter? Presumably a quick reboot and restore might bring us back online as if nothing had happened, but possibly we’d be able to tell, or at very least have a few microseconds of triumph just before it all shuts down.
The question is: How do you bring down a simulation of reality from the inside? The most obvious strategy would be to try to cause the equivalent of a stack overflow—demanding more space than is available in the memory set aside for a program's active calculations—by creating an infinitely, or at least excessively, recursive process. And the way to do that would be to build our own simulated realities, designed so that within those virtual worlds are entities creating their version of a simulated reality, which is in turn doing the same, and so on all the way down the rabbit hole. If all of this worked, the universe as we know it might crash, revealing itself as a mirage just as we winked out of existence.
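As a toy illustration (in Python, and of course not a claim about how any real host computer would behave), unbounded recursion of this kind is exactly what exhausts a program's call stack:

```python
import sys

# A minimal analogue of the nested-simulations idea: each simulated
# level spawns one level below it. With no safeguard, the recursion
# exhausts the call stack; Python surfaces this as a RecursionError.

def simulate(depth=0):
    """Each simulated world runs its own simulation one level down."""
    return simulate(depth + 1)  # unbounded recursion: guaranteed to fail

try:
    simulate()
except RecursionError:
    print(f"crashed somewhere below {sys.getrecursionlimit()} nested levels")
```

Notably, Python's interpreter survives this crash by enforcing a recursion limit, which is precisely the kind of safeguard a cautious simulator might build in.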
You could argue that any species capable of simulating a reality (likely similar to its own) would surely anticipate this eventuality and build in some safeguards to prevent it happening. For instance, we might discover that it is strangely and inexplicably impossible to actually make simulated universes of our own, no matter how powerful our computational systems are—whether generalized quantum computers or otherwise. That in itself could be a sign that we already exist inside a simulation. Of course, the original programmers might have anticipated that scenario too and found some way to trick us, perhaps just streaming us information from other simulation runs rather than letting us run our own.
But interventions like this risk undermining the reason for a species running such simulations in the first place, which would be to learn something deep about their own nature. Perhaps letting it all crash is simply the price to pay for the integrity of the results. Or perhaps they’re simply running the simulation containing us to find out whether they themselves are within a fake reality.
Too often, the towering figures of science remain stick figures in the history books, known for their discoveries and accomplishments but not as the complicated, all-too-human people behind those achievements. The stick-figure version of Marie Curie, one of the most famous scientists of all time, describes a pioneering researcher on radioactivity who discovered two new elements and whose revolutionary findings about the atom had widespread applications throughout the 20th century—from medicine to the atomic bomb. Curie was the first woman to win a Nobel Prize, the first person and only woman to win two Nobel Prizes, and the only person to win them in two different scientific fields, physics (1903) and chemistry (1911). These were phenomenal achievements, regardless of gender.
But while the new film Radioactive rightly celebrates Madame Curie’s brilliance, it also reveals her courage as a female scientist struggling against a male-dominated scientific community. She had to fight for even the most rudimentary laboratory space and face down those who stood in her way. Fortunately, she found a scientific partner and later husband, Pierre Curie, who shared her passions and fought alongside her for scientific justice.
The movie also allows Curie to step down from her scientific pedestal as she faces the tragic early death of Pierre in 1906 at 46 and an international scandal over her 1911 affair with a married colleague, Paul Langevin, which drew punishing newspaper headlines and an angry mob at her doorstep, screaming epithets and urging her to “go home” to her native Poland.
The film is not a nuts-and-bolts science lesson, but it does provide a window into the importance of the Curies’ discoveries and the challenging lives of scientists in the late 19th and early 20th centuries. We watch the husband-wife team as they conducted painstaking experiments in their underfunded labs and endured back-breaking labor to shovel, crush and boil tons of pitchblende ore to measure signs of radioactivity hidden within.
Born Maria Salomea Skłodowska in Warsaw, Poland, on November 7, 1867, she emigrated to Paris in 1891, at age 24, to study physics, chemistry and math at the University of Paris. She finally managed to get space in Pierre Curie’s lab; their joint scientific work brought them together, resulting in marriage on July 26, 1895. Her bridal costume was a practical navy blue. Marie is reported to have told Pierre: “I have no dress except the one I wear every day. If you are going to be kind enough to give me one, please let it be practical and dark so that I can put it on afterwards to go to the laboratory.”
The Curies loved long-distance bike trips. With a romantic flourish, the new film shows them riding side by side into the countryside, stopping along the way, stripping down to swim nude in a lake and lying naked on a blanket beside the shore (What? Scientists have sex too?).
The movie portrays the Curies as scientific equals, although Marie was often the leader in the early understanding of radiation—she coined the word “radioactivity”—and the discovery of the new elements radium and polonium (named after her native Poland). However, the inevitable sexism of the time nearly resulted in Marie initially being left out of the 1903 Nobel in Physics, which Pierre was to share with physicist Henri Becquerel, whose accidental discovery of a new form of radiation preceded the Curies’ work.
Fortunately, Pierre got advance notice of the commendation and insisted that Marie share in the honor as well. “I told them if there’s a Nobel to be won, we’ll win it together,” he announces in one film scene. One gets the sense that if Marie had not been added, there would have been hell to pay in the Curies’ personal and professional worlds. In the film, their marriage is strong, and Pierre is the ultimate champion of his wife’s achievements, telling her, “You did the extraordinary. You changed the world.”
Madame Curie is portrayed with admirable “don’t mess with me” strength by the remarkable British actress Rosamund Pike. Her Madame Curie is bold—even arrogant—and not afraid to speak her mind. At one point, she says to her husband, “You have one of the finest minds I’ve ever met. It just so happens that mine is finer.” After the tragic death of Pierre, who was trampled by a horse-drawn wagon, she loses her stoicism, privately breaking down in heart-wrenching sobs of despair. “Here is this brilliant, quite severe, sometimes odd creature who underneath has this well of emotion and love that most people never saw,” noted Pike in an interview.
As a film, Radioactive has met with mixed reviews, in part because the Iranian-French graphic novelist and director Marjane Satrapi chose a risky device to show how the Curies’ work later impacted the world. Her didactic “back to the future” approach jumps from the historic time of Marie Curie’s work forward to the use of radiation therapy in the late 1950s to treat a young boy suffering from cancer. It also spells out how the Curies’ basic research eventually led to the atomic bombs dropped 75 years ago over Hiroshima and Nagasaki in August 1945, to nuclear testing in Nevada in 1961 and to the Chernobyl nuclear reactor accident in 1986. Connecting the dots in this way is distracting—and a bit awkward—but it does show the astounding impacts the Curies’ transformative work would have on the history of the world.
In her own time, Madame Curie saw both the positive and negative health impacts of radiation, including its ability to shrink tumors. Before his untimely death, Pierre, plagued by a hacking cough, was already showing signs of illness from repeated exposure to radiation in their research. She, too, fell prey to radiation-related ailments, leading to her death at 66 on July 4, 1934, from aplastic anemia, a blood disease likely due to exposure to large amounts of radiation over her lifetime.
Many books, plays and films have drawn portraits of the First Lady—and First Couple—of science. What I liked about Radioactive was the complex, nuanced way in which Pike portrays the driven Marie Curie and her ambition, determination and imperfections in pursuing a life befitting her brilliant mind. The stick-figure image I had of Marie Curie is replaced with a flesh-and-blood woman who conducts her painstaking science wearing the suffocating high-necked, floor-length dresses of the time.
But when she takes those clothes off, we see her as a woman whose romantic and sexual desires led her to risk her illustrious reputation for an ill-fated love affair. Marie Curie was a major celebrity in her time, idolized by the public and then viciously torn down by the press—a cycle we’re all too familiar with today. Idols fall hard, and Marie Curie suffered the scorn of France and the world, yet went on to win a second Nobel Prize that year.
Madame Curie also aided the French war effort, fighting for funding and even offering to melt down the gold in her Nobel medals for mobile x-ray units that could be taken to the battlefield to help reduce the number of unnecessary amputations. The film shows her driving such a unit—they were dubbed petites Curies (little Curies)—joined by Irène, one of her two daughters, who was working in a hospital and beginning her own scientific research career. (Irène Joliot-Curie and her husband Frédéric Joliot-Curie would go on to win a Nobel Prize in Chemistry in 1935 for their discovery of artificial radioactivity. They, too, died from illnesses related to overexposure to radiation).
Marie Curie stared down the sexism and obstacles she faced in her day, providing a legacy of achievement and recognition that has inspired generations of scientists, particularly women interested in pursuing research. She would likely have been surprised at the slow pace of achieving equality in the sciences, particularly in her fields of physics and chemistry, that has continued to this day.
In the midst of the COVID-19 pandemic, Radioactive, recently released on Amazon Prime Video, is a timely reminder of the importance of science and scientists in our society. “The movie can be seen not only as a biopic of Marie Curie but also as a spirited defense of science itself,” said a Los Angeles Times article titled “Why It’s Time for Scientists to Become Cinematic Superheroes.” Director Satrapi and actress Pike (both former Oscar nominees) wanted the film to be heroic and inspirational, showing Madame Curie as a scientific superstar, as well as wife and mother who is relatable to a nonscientific audience. As Pike noted, “We presume a child will relate to Wonder Woman more readily than she’ll relate to Marie Curie. But why?”
Even today, says Satrapi, more than 150 years after her birth, Marie Curie “is a woman of the future.”
New research could let scientists co-opt biology’s basic building block—the cell—to construct materials and structures within organisms. A study, published in March in Science and led by Stanford University psychiatrist and bioengineer Karl Deisseroth, shows how to make specific cells produce electricity-carrying (or blocking) polymers on their surfaces. The work could someday allow researchers to build large-scale structures within the body or improve brain interfaces for prosthetic limbs.
In the medium term, the technique may be useful in bioelectric medicine, which involves delivering therapeutic electrical pulses. Researchers in this area have long been interested in incorporating polymers that conduct or inhibit electricity without damaging surrounding tissues. Stimulating specific cells—to intervene in a seizure, for instance—is much more precise than flooding the whole organism with drugs, which can cause broad side effects. But current bioelectric methods, such as those using electrodes, still affect large numbers of cells indiscriminately.
The new technique uses a virus to deliver genes to desired cell types, instructing them to produce an enzyme (Apex2) on their surface. The enzyme sparks a chemical reaction between precursor molecules and hydrogen peroxide, infused in the space between cells; this reaction causes the precursors to fuse into a polymer on the targeted cells. “What’s new here is the intertwining of various emerging fields in one application,” says University of Florida biomedical engineer Kevin Otto, who was not involved in the research but co-authored an accompanying commentary in Science. “The use of conductive polymers assembled [inside living tissue] through synthetic biology, to enable cell-specific interfacing, is very novel.”
The researchers tested the process and tracked cell function in rodent brain cells, artificially grown human brain models, and living worms. They also injected the ingredients into living mice’s brains to show they were not toxic.
The commentary authors say this work could pave the way for improved treatments for depression or Parkinson’s disease by increasing the precision with which neurons are stimulated. It could also precisely target cells that carry information to the brain, potentially giving amputees sensations in a prosthetic limb.
Deisseroth sees the research having even broader uses. “We’ve been able to build new structures inside cells we target genetically, so we have only the cells of interest construct something for us; that’s pretty exciting and very, very general,” he says. “It’s a basic science exploration of: What can we do? What can we build within biological structures using their structural complexity?”
Obstacles remain, however. “There are regulatory hurdles associated with gene therapy in humans,” Otto says. The durability of changes, as well as viability of the technique in higher species, also needs to be demonstrated, he adds.
In a recent op-ed in the Wall Street Journal, Lawrence Krauss bemoans what he sees as contemporary science’s “ideological corruption.” He blames this corruption on humanities scholars for pointing out how science can be “tainted by ideological biases due to race, sex or economic dominance.” His complaint rests on a basic mistake. Krauss confuses what he calls ideological corruption—ideology leading science astray from facts—with ideological awareness.
Ideological commitments and social and political values have always influenced scientific research. Such values can light the way for science or lead into darkness. For most of the history of medical research, studies have disproportionately focused on men. As a result, we know far less about how various ailments manifest in women, and how to treat those ailments with appropriate drugs at appropriate dosages.
This is a problematic influence of values, one that has had deadly outcomes. Currently, a tremendous amount of scientific research has pivoted to address different facets of COVID-19; this is a laudable shift that reflects our collective priority of managing and ultimately ending this pandemic. Science does not occur in a social vacuum. Rather, scientific research reflects the priorities, unquestioned assumptions, and blind spots of individual scientists and the broader cultures they participate in.
This influence of social and political values on science only becomes problematic when one of two circumstances arises. First, pernicious values can shape scientific research, as regrettably has been the case for racism and sexism for far too much of science’s history. Thus, the failure to devote proportionate attention in medical research to women’s health, and the recurrent efforts over centuries to establish the biological inferiority of people of color—genetic, neural or otherwise. Second, whether values are pernicious or positive, they can lead scientific research astray if they wield improper influence on study design, data analysis or other elements of scientific research.
A well-known example of this is when the tobacco industry funded scientific experts to conduct misleading research about cigarettes’ role in lung cancer. What we fervently wish to be true, or what would enrich a corporation if it were true, should not shape scientific findings. This is why so much attention is spent ensuring the funding sources of scientific research don’t improperly influence the findings.
But the influence of values on scientific research is much more pervasive than these kinds of problematic instances. Scientists’ and societies’ values shape what research questions are posed, how many resources are devoted to answering those questions, what the exact aims of the research consist in, and more. Even for science’s greatest successes, these social values are a motivating influence. Einstein’s revolutionary theories of physics were in part inspired by his concerns about how to set clocks at different train stations to the same time. In 2006, I attended a talk in which the preeminent evolutionary biologist Richard Levins said that his scientific work begins with the question of what science will do for the children. Our interests and our values are the engine of scientific discovery.
Ideological awareness is thus essential to our understanding of science. The failure to recognize the pervasive influence of values on science is a danger because problematic roles for values can proceed unchecked if their role is not acknowledged. It took decades for the gender bias in medical research I mentioned above to gain recognition and even longer for any steps to be taken to mitigate it. Just recently, a Hastings Center report found that a myopic focus on genomics has crowded out research on social determinants of health, at a cost to racial equity. Disclosing how values influence science is the first step to analyzing their influence. Ideology cannot be challenged if it remains concealed.
It seems what Krauss is criticizing is not actually the role of ideological influence at all, but rather ideological awareness, the recognition of ideological bias. He’s right about one thing, though: periodically through science’s long history, there are abhorrent instances of scientific research going wrong for ideological reasons. This is one reason why we need the humanities. We need philosophers to help lay bare and analyze how values shape science; we need historians to reveal science’s broader societal context.
What science needs is not a return to ideological obliviousness but growing ideological awareness: a collective move to uncover the social and political values that influence our scientific research in order to critically evaluate those values and the roles they play. More intentional pursuit of ideological awareness is not a corrupting influence on science but a shift that will only result in better scientific research and a clearer understanding of the significance of our scientific findings.
A Lunar “Tablespoonful”
“In the broad, flat lunar maria, or ‘seas’ (such as Mare Tranquillitatis, the site of the Apollo 11 manned landing), the depths of craters that have reached bedrock indicate a regolith thickness of from five to 10 meters. Thus the Apollo 11 astronauts Neil A. Armstrong and Edwin E. Aldrin, Jr., did not come within several meters of solid rock at Tranquillity Base, and the geology picks they had brought along for the purpose of chipping specimens off outcrops were superfluous. They stood and walked on top of the regolith, and the lunar sample they returned was collected, with scoop and tongs, from this layer of rock debris. Our own group at the Smithsonian Institution Astrophysical Observatory has been working with 16 grams (about a tablespoonful) of the soil. —John A. Wood”
The Mango Bears Fruit
“The U.S. Department of Agriculture has secured through its agricultural explorers and by exchange with the British East Indian departments of agriculture one of the largest collection of mango varieties in the world, and now has in fruit, at its plant introduction station at Miami, Fla., about 20 varieties. It is said that these selected varieties strikingly belie the many unkind things that have been said about the mango. Some of them have hardly more fiber in them than a freestone peach, and can be cut open lengthwise and eaten as easily with a spoon as a cantaloupe.”
Schools and the Army
“The National Research Council announces that the mental tests which were used with striking success in the Army during the war are to be used on a large scale in American public schools. A program of group tests has been worked out which will make it possible to conduct wholesale surveys of schools annually, or even semiannually, so that grade classification and individual educational treatment can be adjusted with desirable frequency.”
Of Toadstools and Whales
“It is a simple matter of fact and of every day observation that all forms of animal work are the result of the reception and assimilation of a few cubic feet of oxygen, a few ounces of water, of starch, of fat, and of flesh. In a chemical point of view man may be defined to be something of this sort. That great authority, Professor [Thomas] Huxley, has lately been discussing what he calls ‘protoplasm,’ or ‘the physical basis of life.’ He seeks for that community of faculty which exists between the mossy, rock-incrusted lichen, and the painter or botanist that studies it. Professor Huxley has not proved, and it is impossible for him to prove, that these protoplasms may not have essential points of difference. Physiologists cannot yet tell us how it is that ‘of four cells absolutely identical in organic structure and composition, one will grow into Socrates, another into a toadstool, one into a cockchafer, another into a whale.’”
1915: This cover imagined what the surface of Saturn’s moon Titan would look like. Ninety years later the Huygens probe touched down on Titan and sent back images of the actual landscape. Credit: Scientific American, Vol. CXII, No. 12; March 20, 1915
Planets and stars are as much a product of “nature’s laboratory” as Homo sapiens. And while our species examines the information gleaned from the universe that surrounds us, we also have an emotional sense of wonder at revealing the secrets of nature and our place within it. (This magazine may tend to leave emotion to neuroscience, as our focus is on scientific data—but we also publish poetry.) From the invention of the first telescope in 1608 to the discovery of the Milky Way galaxy’s place in Laniakea—a great river of galaxies streaming toward a giant hidden gravity source—our exploration of the universe so far follows a trend: we explore with instruments and with our imaginations. And one day we will go ourselves, leaving our footprints across the universe. —D.S.
An enormous explosion in Beirut’s seaport area this week destroyed nearby buildings and shattered windows across the city, killing scores of people and wounding thousands more. As videos of the disaster spread on social media, people around the world immediately began speculating about the cause.
Lebanese government sources eventually said it was 2,750 metric tons of ammonium nitrate, a chemical routinely used as agricultural fertilizer and, when mixed with fuel oil, as a mining explosive. The stockpile had been sitting in a warehouse since 2014, when the ship that carried it into port was abandoned. This incident is not the first time ammonium nitrate has caused devastation. In 2013, for instance, a stockpile a tenth the size of the one in Beirut exploded at the West Fertilizer Company facility in West, Texas.
Experts such as engineers Suzanne Smyth and Russell Ogle investigate the causes and origins of such fires and explosions in an attempt to prevent future disasters. Smyth and Ogle are a managing engineer and practice director, respectively, at Exponent, an American multidisciplinary engineering and scientific consulting company. Scientific American spoke with them about how explosion detectives get to the roots of an incident, and what conditions can trigger an ammonium nitrate detonation. [An edited transcript of the interview follows.]
How do you investigate the cause of a particular explosion or fire?
SMYTH: What we’re essentially following is the scientific method. We go through and gather a bunch of data, which could come from videos, documents, interviews, looking at evidence on the ground. And then, once you have those data, you start hypothesizing. One thing we’ve been trying to be very careful about is not hypothesizing too early in the process. We don’t have a ton of information [for Beirut], but we’ve seen the videos of the explosion. Seeing the shock wave—it’s very rare to see that. What we’re talking about is that really thin, spherical white shape moving away from the explosion. What you’re actually seeing is water vapor condensing out of the air, because of really low pressure right behind the high-pressure shock wave. And then you can see it disappears right away, because it’s evaporating once the pressure equalizes. You can see the actual shock wave, so you know that it detonated—and only certain things can get you to a detonation.
What do you mean by detonation?
OGLE: A shock wave travels faster than the speed of sound, and that’s the hallmark of a detonation. There are two kinds of decomposition reactions that you can find in ammonium nitrate when it’s starting to build up enough pressure to cause damage. The first is called a deflagration. It’s a wave—literally a chemical reaction wave—that is traveling through the material slower than the speed of sound. As it continues to travel, it accelerates. And if it gets to the point where it hits the speed of sound, that’s what we call a detonation. A detonation yields far more damaging mechanisms against things such as structures and buildings.
What else can you learn from the video footage from Beirut?
SMYTH: A lot of the times when we’re analyzing video, we’re looking for the sequence and timing of events. When we’re investigating an explosion, there’s usually both explosion damage and fire damage. It’s always one of our goals to figure out: Was it fire, then explosion or explosion, then fire? The video we see that shows a fire before the explosion is really useful.
OGLE: There’s a very distinctive reddish-brown cloud that’s rising up, following the explosion. It’s not the same thing as performing a chemical analysis, but [the cloud] is very distinctive and would be consistent with the decomposition products of ammonium nitrate—the primary one being nitrogen oxide. I think, at least visually, there’s a potential confirmation there that ammonium nitrate [is] participating in the overall reaction.
What can cause ammonium nitrate to explode?
OGLE: It’s stable under normal conditions, but you can do things to it that will cause it to misbehave. The main trigger is an external heat source. Depending on how you want to count it, there have been probably somewhere between 20 and 30 major catastrophic explosions with ammonium nitrate since it came on the scene as a commercial product in the 1920s. And fire is a frequent trigger. It’s the heat of the fire that warms up the ammonium nitrate that can become a problem. If it is heated by a large heat source like a fire, the ammonium nitrate will begin to decompose—and that decomposition can be mild and harmless, or it can be catastrophic.
The difference between the two is whether or not the ammonium nitrate is pushed together in a stockpile. Think of it like a bonfire with a bunch of logs. When you build up that bonfire, those logs are trapping the heat, which accelerates the burning and makes the big fire. Whereas if you spread them out, the heat escapes to the atmosphere harmlessly. The same thing is true with the ammonium nitrate if it’s loaded up in, for example, what they call supersacks (these flexible containers that can [often] hold about one [metric] ton each). If you pile them up with no airflow in between, then any heat that gets generated during a decomposition is trapped and can’t get out. That heat raises the temperature and accelerates the decomposition, and there’s nothing to stop it.
With the stockpile of ammonium nitrate in Beirut, what precautions should have been taken?
OGLE: In the U.S., we turn to [a nonprofit] organization called the National Fire Protection Association to give us guidance on how to safely handle things like hazardous materials. If you exceed a threshold quantity of material—one half of a [metric] ton [of ammonium nitrate]—then you need to take a much more sophisticated approach to how you store and handle the material to keep it safe. If you have 2,750 [metric] tons, first and foremost, the thing you need to do is move that material far away from the population. It represents a significant hazard.
When investigating an explosion, what other clues do you look for?
SMYTH: The presence of a crater is another indication of the size of the explosion and what could potentially be involved. And the radii of damage: How far do you have minor structural damage? Broken windows? Major structural damage? Looking at how far away things are damaged can help you estimate how much power or energy or force was released. Those types of information can be used to back out what specifically happened.

If you’re looking at a smaller explosion within a building, you can look for directional indications: where this wall was blown out toward the north and this wall was blown out toward the east. You can also look at fragments, missiles or shrapnel that were thrown during the explosion to both estimate how much force you’d need to move that fragment, as well as where they are coming from. Oftentimes we’re looking for fragments of certain pieces of equipment to put them back together like a puzzle: you might be able to look at those pieces and understand why they fragmented the way they did.

Time lines are really important to our work. Understanding how long something was exposed to a fire, when someone was last in the room [and] noted that everything was normal, when people first see smoke—that type of [establishment of] a time line can be really helpful to help us [set bounds on] what’s going on, as well as potentially eliminate different hypotheses.
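One standard way investigators turn damage radii into an energy estimate is cube-root (Hopkinson-Cranz) blast scaling. The interview doesn't specify Exponent's methods, so the sketch below is only a generic, illustrative application of that scaling law, and the damage threshold and distance in it are invented:

```python
# Hopkinson-Cranz scaling: explosions produce comparable damage at
# distances proportional to the cube root of their TNT-equivalent
# yield W, i.e. at equal scaled distance Z = R / W**(1/3).
# Numbers below are illustrative, not an analysis of any real event.

def scaled_distance(radius_m, yield_kg_tnt):
    """Scaled distance Z (m/kg^(1/3)) for a damage radius and yield."""
    return radius_m / yield_kg_tnt ** (1 / 3)

def yield_from_radius(radius_m, z_damage):
    """Invert the scaling law: the TNT-equivalent yield (kg) that
    places a damage threshold of fixed Z at the observed radius."""
    return (radius_m / z_damage) ** 3

# Hypothetical: if a window-breakage threshold of Z ~ 40 m/kg^(1/3)
# is observed out to 5 km, the implied yield is:
w_kg = yield_from_radius(5000, 40)
print(f"~{w_kg / 1e6:.1f} kilotons TNT equivalent")
```

In practice investigators fit many damage radii (glass, minor structural, major structural) against published scaled-distance curves rather than relying on a single threshold, which is why Smyth emphasizes mapping several radii of damage.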
OGLE: You’re searching for little nuggets of gold through a bunch of other kinds of fluff. But that requires interviews and searching through documents. One of the things, for example, that there’s been some talk about in the news media is whether or not there were any fireworks stored either nearby or, potentially, in the same warehouse [as the ammonium nitrate in Beirut]. Given the devastation that you’re looking at, it would be probably difficult to find physical evidence of that. You’re going to have to rely on people, and maybe documents, that will help you establish whether or not some materials were being stored in adjacent warehouses or in the same one.
Below Venus’s toxic clouds of sulfuric acid is an apocalyptic world, with temperatures hot enough to melt lead and pressures that could crush heavy machinery. But it might not always have been so.
In 2016 Michael Way of NASA’s Goddard Institute for Space Studies and his colleagues applied the first three-dimensional climate model to early Venus. They found it could once have been so temperate that liquid water pooled in vast oceans—the key to life as we know it. Now Way and Anthony Del Genio, also at Goddard, have developed a framework for the planet’s evolution based on more complex data that incorporates various topographies and different amounts of sunlight. Their study, published in May in the Journal of Geophysical Research: Planets, puts forward a new explanation for how Venus could have remained habitable for nearly three billion years before morphing into today’s blistering hellscape.
Many scientists have postulated that Venus was bone-dry from the beginning and never hosted liquid water. Roughly 4.5 billion years ago, when the solar system formed, the second planet from the sun would have received enough sunlight that any atmospheric water was lost to space—and the radiation would have thwarted the formation of life as it exists on Earth. “There would have been nothing,” Way says, without some mitigating factor. That factor, he and Del Genio argue, could have been a supersized cloud that developed early in the planet’s evolution and cooled the world.
Unlike Earth, Venus does not rotate once on its axis every 24 hours but instead does so once every 243 Earth days. Given that it orbits the sun on a similar timescale (once every 225 Earth days), one side of the planet typically basks in sunlight, while the other faces a lengthy darkness. A thick atmosphere could easily circulate heat from the dayside to the nightside, keeping Venus hot. But in Way and Del Genio’s model, a giant cloud on the dayside would act as a bright shield, reflecting incoming sunlight and keeping temperatures cool enough for liquid water.
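The two periods quoted above determine how long each hemisphere stays in daylight. A quick calculation (using the standard relation for a retrograde rotator such as Venus, which the article does not spell out) gives a solar day of roughly 117 Earth days:

```python
# Venus rotates retrograde, so its solar day D (noon to noon) satisfies
# 1/D = 1/P_rot + 1/P_orb.  (For a prograde rotator like Earth the
# second term carries a minus sign instead.)
P_rot = 243.0  # sidereal rotation period, in Earth days
P_orb = 225.0  # orbital period, in Earth days

solar_day = 1.0 / (1.0 / P_rot + 1.0 / P_orb)
print(f"solar day: {solar_day:.0f} Earth days")  # prints "solar day: 117 Earth days"
```

So any given spot on Venus sees close to two Earth months of continuous sunlight followed by an equally long night, which is why a persistent reflective cloud on the slowly shifting dayside could plausibly dominate the planet's energy balance.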
Many researchers have already considered the idea that Venus was once habitable, but the new model further shows how the planet could have transformed into today’s hothouse—and it tosses conventional wisdom aside. “There’s a story about Venus that we tell ourselves. We teach it in introductory astronomy classes, and we write about it in books,” says David Grinspoon, an astrobiologist at the Planetary Science Institute, who was not involved in the study, although he was a co-author on the 2016 paper. “And it turns out that story is wrong.” The idea is that the sun slowly increased in brightness, causing the planet to grow so warm that it could no longer maintain a stable ocean. In other words, it pushed the inner edge of the so-called habitable zone—the orbital region where liquid water can create conditions conducive to life—past the solar system’s second planet. But Way and Del Genio’s model suggests cloud cover would have provided enough shade to keep liquid water on the surface of Venus even until today—had something not tipped the planet into its current state.
The authors propose a violent mechanism best understood by looking at the young Earth. Roughly 250 million years ago deep gashes opened in Earth’s crust, pouring lava onto the surface and spewing enough carbon dioxide into the atmosphere to kill 96 percent of marine species and 70 percent of terrestrial species in the largest mass extinction in history. These volcanic events, which leave deposits called large igneous provinces, produce at least 100,000 cubic kilometers of lava over one million years. “We’re talking about an affront to God in terms of the amount of lava that comes out per unit time,” says Paul Byrne, a planetary geologist at North Carolina State University, who was not involved in the study.
Although these eruptions have rocked Earth on several occasions, often resulting in mass extinctions, multiple events have never happened at once. “That’s fortunate for life on Earth,” Way says, but scientists see no reason why more than one event could not happen simultaneously. And if such multiple events did occur on Venus, they would have dumped enough carbon dioxide into the atmosphere to drive the planet into an apocalyptic greenhouse state, researchers say.
The hypothesis is attractive: “There’s something romantically tragic about a world so like our own that was killed,” Byrne says. “I want so much for it to be true that one day we’ll touch down and find fossils from a shallow sea of a Venusian ecosystem.” He notes, however, that there is no direct evidence to support this notion.
The authors argue that large-scale volcanism would have continued to pave much of the planet in volcanic rock, a state visible today. But Vicki Hansen, a geologist at the University of Minnesota Duluth, who was not involved in the study, says measurements from the Magellan spacecraft, which orbited Venus in the early 1990s, do not support a resurfacing from one catastrophic event: “If you look at the data, it flies in the face of all that,” she says. According to her team’s analysis, “We can identify three distinct eras in the evolution of Venus; if you have catastrophic resurfacing, that doesn’t work, because [it] would wipe out all earlier histories.”
There is no question that the issue is contentious. Indeed, a number of scientists still argue that Venus was never fit for life.
To find out, researchers will need to peer more closely at our neighbor. “We could do models until the cows come home; that doesn’t make anything right,” Hansen says. “We have to test what the results of those models are.”
Byrne says we should send a fleet of spacecraft to Venus, including orbiters, landers, balloons, aerial platforms and even blimps. The planet’s atmosphere holds clues about how much water has been lost, and the surface could reveal whether and when volcanic eruptions punctured it. Future missions could help settle the debate about whether or not Venus was ever hospitable to life and could push astronomers to expand their search for livable planets across the galaxy.
“If this scenario is correct, it says Venus-like planets actually have the potential for life, so we shouldn’t ignore them,” says Adrian Lenardic, a geophysicist at Rice University, who was also not involved in the research. “We should look there.”
“Mr. Francis Galton affirms that ‘the patterns of the papillary ridges upon the bulbous palmar surfaces of the terminal phalanges of the fingers and thumbs are absolutely unchangeable throughout life, and show in different individuals an infinite variety of forms and peculiarities. The chance of two finger-prints being identical is less than one in sixty-four thousand millions. If, therefore, two finger-prints are compared and found to coincide exactly, it is practically certain that they are prints of the same finger of the same person; if they differ, they are made by different fingers.’ —Lancet”
—Scientific American, June 1894
More gems from Scientific American’s first 175 years can be found on our anniversary archive page.