Causal reasoning is ubiquitous — from physics to medicine, economics and social sciences, as well as in everyday life. Whenever we press the button, the bell rings, and we think that the pressing of the button causes the bell to ring. Normally, causal influence is assumed to only go one way — from cause to effect — and never back from the effect to the cause: the ringing of the bell does not cause the pressing of the button that triggered it. Now researchers from the University of Oxford and the Université libre de Bruxelles have developed a theory of causality in quantum theory, according to which cause-effect relations can sometimes form cycles. This theory offers a novel understanding of exotic processes in which events do not have a definite causal order. The study has been published in Nature Communications.
One of the ways in which quantum theory defies classical intuitions is by challenging our ideas of causality. Quantum entanglement can be used to produce correlations between distant experiments that are known to evade satisfactory causal explanations within the framework of classical causal models. Furthermore, a unification of quantum theory and gravity is expected to allow situations in which the causal structure of spacetime is subject to quantum indefiniteness, suggesting that events need not be causally ordered at all. Recently, a team of researchers from Oxford and Brussels has developed a theory of causality in quantum theory, in which causal concepts are defined in intrinsically quantum terms rather than pertaining to an emergent classical level of measurement outcomes. This has offered, in particular, a causal understanding of the correlations produced by entangled states. Now, they have generalized the theory to allow causal influence to go in cycles, providing a causal understanding of processes with events in indefinite causal order.
“The key idea behind our proposal is that causal relations in quantum theory correspond to influence through so-called unitary transformations — these are the types of transformations that describe the evolutions of isolated quantum systems. This is closely analogous to an approach to classical causal models that assumes underlying determinism and situates causal relations in functional dependences between variables,” says Jonathan Barrett from the University of Oxford.
The main idea of the new study is to apply the same principle to processes in which the order of operations can be dynamic or even indefinite, since a large class of these processes can also be understood as arising from unitary transformations — just not ones that unfold in an ordinary sequence.
“Previously, processes with indefinite causal order were typically regarded as simply incompatible with any causal account. Our work shows that a major class of them — those that can be understood as arising from unitary processes and which are believed to be the ones that could have a physical realization in nature — could in fact be seen as having a definite causal structure, albeit one involving cycles,” says Robin Lorenz, a corresponding author of the study.
“The idea of cyclic causal structures may seem counterintuitive, but the quantum process framework within which it is formulated guarantees that it is free of logical paradoxes, such as the possibility of going back in time and killing your younger self,” explains Ognyan Oreshkov from the Université libre de Bruxelles. “Exotic as they appear, some of these scenarios are actually known to have experimental realizations in which the variables of interest are delocalized in time.”
Does this mean that spacetime does not have the acyclic causal structure it is normally assumed to have? Not exactly, since in the mentioned experiments the events that are causally related in a cyclic fashion are not local in spacetime. However, the researchers believe that the causal structure of spacetime itself could become cyclic in this quantum way at the intersection of quantum theory and general relativity, where analogous processes to those realizable in the lab are expected, but with the events being local in their respective spacetime reference frames.
Reference: “Cyclic quantum causal models” by Jonathan Barrett, Robin Lorenz and Ognyan Oreshkov, 9 February 2021, Nature Communications.
UCLA materials scientists and colleagues have discovered that perovskites, a class of promising materials that could be used for low-cost, high-performance solar cells and LEDs, have a previously unutilized molecular component that can further tune their electronic properties.
Named after Russian mineralogist Lev Perovski, perovskite materials have a crystal-lattice structure of inorganic molecules like that of ceramics, along with organic molecules that are interlaced throughout. Up to now, these organic molecules were thought to serve only a structural function, without directly contributing to perovskites’ electronic performance.
Led by UCLA, a new study shows that when the organic molecules are designed properly, they not only can maintain the crystal lattice structure, but also contribute to the materials’ electronic properties. This discovery opens up new possibilities to improve the design of materials that will lead to better solar cells and LEDs. The study detailing the research was recently published in Science.
“This is like finding an old dog that can play new tricks,” said Yang Yang, the Carol and Lawrence E. Tannas Jr. Professor of Engineering at the UCLA Samueli School of Engineering, who is the principal investigator on the research. “In materials science, we look all the way down to the atomic structure of a material for efficient performance. Our postdocs and graduate students didn’t take anything for granted and dug deeper to find a new pathway.”
In order to make a better-performing perovskite material, the researchers incorporated a specially designed organic molecule, a pyrene-containing organic ammonium. On its exterior, the positively charged ammonium molecule connected to molecules of pyrene — four fused rings of carbon atoms. This molecular design offered additional electronic tunability of perovskites.
“The unique property of perovskites is that they have the advantage of high-performance inorganic semiconductors, as well as easy and low-cost processability of polymers,” said study co-lead author Rui Wang, a UCLA postdoctoral scholar in materials science and engineering. “This newly enhanced perovskite material now offers opportunities for improved design concepts with better efficiency.”
To demonstrate the enhanced perovskite’s effectiveness, the team built a photovoltaic (PV) cell prototype with the material and tested it under continuous light for 2,000 hours. The new cell continued to convert light to energy at 85% of its original efficiency. By contrast, a PV cell made of the same materials, but without the specially designed organic molecule, retained only 60% of its original efficiency.
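The reported retention figures can be turned into a rough stability comparison. The sketch below assumes simple linear degradation (real PV degradation curves are generally nonlinear, so this is only an illustrative back-of-the-envelope calculation, not the study's analysis):

```python
def retention_rate(retained_frac, hours):
    """Average fraction of initial efficiency lost per 1000 hours,
    assuming linear decay (an illustrative simplification)."""
    return (1.0 - retained_frac) / hours * 1000.0

def hours_to_t80(retained_frac, hours):
    """Linearly extrapolated time (hours) to fall to 80% of initial efficiency."""
    return 0.20 / retention_rate(retained_frac, hours) * 1000.0

new_cell = retention_rate(0.85, 2000)  # 0.075, i.e. 7.5% lost per 1000 h
control = retention_rate(0.60, 2000)   # 0.20, i.e. 20% lost per 1000 h
```

Under this crude linear model, the modified cell would take roughly 2,700 hours to reach the common "T80" benchmark, versus about 1,000 hours for the control.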
Reference: “Reconfiguring the band-edge states of photovoltaic perovskites by conjugated organic cations” by Jingjing Xue, Rui Wang, Xihan Chen, Canglang Yao, Xiaoyun Jin, Kai-Li Wang, Wenchao Huang, Tianyi Huang, Yepin Zhao, Yaxin Zhai, Dong Meng, Shaun Tan, Ruzhang Liu, Zhao-Kui Wang, Chenhui Zhu, Kai Zhu, Matthew C. Beard, Yanfa Yan and Yang Yang, 5 February 2021, Science.
The other co-lead authors on the study are Jingjing Xue, a materials science postdoctoral scholar at UCLA; and Xihan Chen of the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) in Colorado. The other corresponding authors include Matthew Beard, a senior research fellow at NREL and the director of its Center for Hybrid Organic Inorganic Semiconductors for Energy; and Yanfa Yan, a professor of physics and astronomy at the University of Toledo.
Other authors are from UCLA; NREL; the University of Toledo; Yangzhou University, China; Soochow University, China; Monash University, Australia; and Lawrence Berkeley National Laboratory.
The research was funded by the U.S. Department of Energy.
How Rocks Rusted on Earth and Turned Red – Important Phenomenon Could Help Assess Future Climate Change
How did rocks rust on Earth and turn red? A Rutgers-led study has shed new light on the important phenomenon and will help address questions about the Late Triassic climate more than 200 million years ago, when greenhouse gas levels were high enough to be a model for what our planet may be like in the future.
“All of the red color we see in New Jersey rocks and in the American Southwest is due to the natural mineral hematite,” said lead author Christopher J. Lepre, an assistant teaching professor in the Department of Earth and Planetary Sciences in the School of Arts and Sciences at Rutgers University–New Brunswick. “As far as we know, there are only a few places where this red hematite phenomenon is very widespread: one being the geologic ‘red beds’ on Earth and another is the surface of Mars. Our study takes a significant step forward toward understanding how long it takes for redness to form, the chemical reactions involved and the role hematite plays.”
The research by Lepre and a Columbia University scientist is in the journal Proceedings of the National Academy of Sciences. It challenges conventional thinking that hematite has limited use for interpreting the ancient past because it is a product of natural chemical changes that occurred long after the beds were initially deposited.
Lepre demonstrated that hematite concentrations faithfully track 14.5 million years of Late Triassic monsoonal rainfall over the Colorado Plateau of Arizona when it was on the ancient supercontinent of Pangea. With this information, he assessed the interrelationships between environmental disturbances, climate and the evolution of vertebrates on land.
Lepre examined part of a 1,700-foot-long rock core from the Chinle Formation in the Petrified Forest National Park in Arizona (the Painted Desert) that is housed at Rutgers. Rutgers–New Brunswick Professor Emeritus Dennis V. Kent examined the same core for a Rutgers-led study that found that gravitational tugs from Jupiter and Venus slightly elongate Earth’s orbit every 405,000 years and influenced Earth’s climate for at least 215 million years, allowing scientists to better date events like the spread of dinosaurs.
Lepre measured the visible light spectrum to determine the concentration of hematite within red rocks. To the scientists’ knowledge, it is the first time this method has been used to study rocks this old, dating to the Late Triassic epoch more than 200 million years ago. Many scientists thought the redness was caused much more recently by the iron in rocks reacting with air, just like rust on a bicycle. So for decades, scientists have viewed hematite and its redness as largely unimportant.
“The hematite is indeed old and probably resulted from the interactions between the ancient soils and climate change,” Lepre said. “This climate information allows us to sort out some causes and effects – whether they were due to climate change or an asteroid impact at Manicouagan in Canada, for example – for land animals and plants when the theropod dinosaurs (early ancestors of modern birds and Tyrannosaurus rex) were rising to prominence.”
The scientists, in collaboration with Navajo Nation members, have submitted a multi-million dollar grant proposal to retrieve more cores at the Colorado Plateau that will include rocks known to record a very rapid atmospheric change in carbon dioxide similar to its recent doubling as a result of human activity.
Reference: “Hematite reconstruction of Late Triassic hydroclimate over the Colorado Plateau” by Christopher J. Lepre and Paul E. Olsen, 16 February 2021, Proceedings of the National Academy of Sciences.
Funding: National Science Foundation, Lamont Climate Center.
Almost 80 years after its discovery, a large shell from the ornate Marsoulas Cave in the Pyrenees has been studied by a multidisciplinary team from the CNRS, the Muséum de Toulouse, the Université Toulouse – Jean Jaurès and the Musée du quai Branly – Jacques-Chirac: it is believed to be the oldest wind instrument of its type. Scientists reveal how it sounds in a study published in the journal Science Advances on February 10th, 2021.
The Marsoulas Cave, between Haute-Garonne and Ariège, was the first decorated cave to be found in the Pyrenees. Discovered in 1897, the cave bears witness to the beginning of the Magdalenian culture in this region, at the end of the Last Glacial Maximum. During an inventory of the material from the archaeological excavations, most of which is kept in the Muséum de Toulouse, scientists examined a large Charonia lampas (sea snail) shell, which had been largely overlooked when discovered in 1931.
The tip of the shell is broken, forming a 3.5 cm diameter opening. As this is the hardest part of the shell, the break is clearly not accidental. At the opposite end, the shell opening shows traces of retouching (cutting) and a tomography scan has revealed that one of the first coils is perforated. Finally, the shell has been decorated with a red pigment (hematite), characteristic of the Marsoulas Cave, which indicates its status as a symbolic object.
To confirm the hypothesis that this conch was used to produce sounds, scientists enlisted the help of a horn player, who managed to produce three sounds close to the notes C, C-sharp and D. As the opening is irregular and covered with an organic coating, the researchers assume that a mouthpiece was also attached, as is the case for more recent conches in the collection of the Musée du quai Branly – Jacques Chirac. 3D impressions of the conch will enable this lead to be explored and to verify whether the instrument can be used to produce other notes.
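For readers curious what those three notes correspond to in frequency, the sketch below computes modern equal-temperament pitches. Note the assumptions: the A440 reference and the fourth-octave placement (C4, C#4, D4) are ours — the article does not state the octave, and Magdalenian players obviously had no tuning standard:

```python
import math

A4 = 440.0  # Hz, modern concert-pitch reference (an assumption)

def midi_to_hz(n):
    """Equal-temperament frequency of MIDI note number n (A4 = 69)."""
    return A4 * 2 ** ((n - 69) / 12)

# C4, C-sharp-4 and D4 -- the octave is our assumption, not stated in the article
notes = {"C4": 60, "C#4": 61, "D4": 62}
freqs = {name: round(midi_to_hz(n), 2) for name, n in notes.items()}
# roughly 261.63, 277.18 and 293.66 Hz -- about a semitone apart each
```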
Listen to the sound of the Marsoulas conch, as it may have been played 18,000 years ago. Credit: © Carole Fritz et al. 2021 / playing: Jean-Michel Court / recording: Julien Tardieu
The first carbon-14 dating of the cave, carried out on a piece of charcoal and a fragment of bear bone from the same archaeological level as the shell, provided a date of around 18,000 years. This makes the Marsoulas conch the oldest wind instrument of its type: to date, only flutes have been discovered in earlier European Upper Palaeolithic contexts; the conches found outside Europe are much more recent.
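The "around 18,000 years" figure follows from the standard radioactive-decay relation. A minimal sketch, using the Cambridge half-life of carbon-14 (the illustrative 11.3% remaining fraction is our choice, worked backwards from the reported age; the actual measured fractions are not given in the article):

```python
import math

HALF_LIFE = 5730.0  # years, Cambridge half-life of carbon-14

def radiocarbon_age(fraction_remaining):
    """Uncalibrated radiocarbon age in years, from the fraction of
    original C-14 still present in the sample."""
    return HALF_LIFE / math.log(2) * math.log(1.0 / fraction_remaining)

# a sample retaining ~11.3% of its original C-14 dates to roughly 18,000 years
age = radiocarbon_age(0.113)
```

Published dates of this kind are additionally calibrated against tree-ring and other records, so the raw decay calculation is only the first step.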
In addition to immersing us in the sounds produced by our Magdalenian ancestors, this shell reinforces the idea of exchanges between the Pyrenees and the Atlantic coast, more than 200 kilometers away.
The laboratories involved are the Travaux et recherches archéologiques sur les cultures, les espaces et les sociétés (CNRS/Université Toulouse – Jean Jaurès/Ministère de la Culture), the Maison des sciences de l’homme et de la société de Toulouse (CNRS/Université Fédérale de Toulouse) and the Laboratoire d’archéologie moléculaire et structurale (CNRS/Sorbonne Université). The Magdalenian culture, covering a period between around 21,000 and 14,000 years BP, is characterized by worked animal bones and antlers and extensive exchange networks; the Altamira and Lascaux caves are its most famous examples. As for the organic coating on the conch’s opening, the quantity is too limited for analyses to identify its nature.
Reference: 10 February 2021, Science Advances.
On April 7, NASA’s OSIRIS-REx mission will give asteroid Bennu one last glance before saying farewell. Before departing for Earth on May 10, the OSIRIS-REx spacecraft will perform a final flyby of Bennu – capturing its last images of sample collection site Nightingale to look for transformations on Bennu’s surface after the October 20, 2020, sample collection event.
The OSIRIS-REx mission team recently completed a detailed safety analysis of a trajectory to observe sample site Nightingale from a distance of approximately 2.4 miles (3.8 kilometers). The spacecraft’s flight path is designed to keep OSIRIS-REx a safe distance from Bennu, while ensuring the science instruments can collect precise observations. The single flyby will mimic one of the observation sequences conducted during the mission’s Detailed Survey phase in 2019. OSIRIS-REx will image Bennu for a full 4.3-hour rotation to obtain high-resolution images of the asteroid’s northern and southern hemispheres and its equatorial region. The team will then compare these new images with the previous high-resolution imagery of Bennu obtained during 2019.
This final flyby of Bennu was not part of the original mission schedule, but the observation run will provide the team an opportunity to learn how the spacecraft’s contact with Bennu’s surface altered the sample site. Bennu’s surface was considerably disturbed after the Touch-and-Go (TAG) sample collection event, with the collector head sinking 1.6 feet (48.8 centimeters) into the asteroid’s surface while firing a pressurized charge of nitrogen gas. The spacecraft’s thrusters also mobilized a substantial amount of surface material during the back-away burn.
During this new mission phase, called the Post-TAG Observation (PTO) phase, the spacecraft will perform five separate navigation maneuvers in order to return to the asteroid and position itself for the flyby. OSIRIS-REx executed the first maneuver on Jan. 14, which acted as a braking burn and put the spacecraft on a trajectory to rendezvous with the asteroid one last time. Since October’s sample collection event, the spacecraft has been slowly drifting away from the asteroid, and ended up approximately 1,635 miles (2,200 km) from Bennu. After the braking burn, the spacecraft is now slowly approaching the asteroid and will perform a second approach maneuver on March 6, when it is approximately 155 miles (250 km) from Bennu. OSIRIS-REx will then execute three subsequent maneuvers, which are required to place the spacecraft on a precise trajectory for the final flyby on April 7.
OSIRIS-REx is scheduled to depart Bennu on May 10 and begin its two-year journey back to Earth. The spacecraft will deliver the samples of Bennu to the Utah Test and Training Range on September 24, 2023.
NASA’s Goddard Space Flight Center in Greenbelt, Maryland, provides overall mission management, systems engineering, and the safety and mission assurance for OSIRIS-REx. Dante Lauretta of the University of Arizona, Tucson, is the principal investigator, and the University of Arizona also leads the science team and the mission’s science observation planning and data processing. Lockheed Martin Space in Denver built the spacecraft and provides flight operations. Goddard and KinetX Aerospace are responsible for navigating the OSIRIS-REx spacecraft. OSIRIS-REx is the third mission in NASA’s New Frontiers Program, which is managed by NASA’s Marshall Space Flight Center in Huntsville, Alabama, for the agency’s Science Mission Directorate in Washington.
For nearly a century, scientists have worked to unravel the mystery of dark matter — an elusive substance that spreads through the universe and likely makes up much of its mass, but has so far proven impossible to detect in experiments. Now, a team of researchers have used an innovative technique called “quantum squeezing” to dramatically speed up the search for one candidate for dark matter in the lab.
The findings, published today in the journal Nature, center on an incredibly lightweight and as-of-yet undiscovered particle called the axion. According to theory, axions are likely billions to trillions of times lighter than electrons and may have been created during the Big Bang in enormous numbers — enough to potentially explain the existence of dark matter.
Finding this promising particle, however, is a bit like looking for a single quantum needle in one really big haystack.
There may be some relief in sight. Researchers on a project called, fittingly, the Haloscope At Yale Sensitive To Axion Cold Dark Matter (HAYSTAC) experiment report that they’ve improved the efficiency of their hunt past a fundamental obstacle imposed by the laws of thermodynamics. The group includes scientists at JILA, a joint research institute of the University of Colorado Boulder and the National Institute of Standards and Technology (NIST).
“It’s a doubling of the speed from what we were able to do before,” said Kelly Backes, one of two lead authors of the new paper and a graduate student at Yale University.
The new approach allows researchers to better separate the incredibly faint signals of possible axions from the random noise that exists at extremely small scales in nature, sometimes called “quantum fluctuations.” The team’s chances of finding the axion over the next several years are still about as likely as winning the lottery, said study coauthor Konrad Lehnert, a NIST Fellow at JILA. But those odds are only going to get better.
“Once you have a way around quantum fluctuations, your path can just be made better and better,” said Lehnert, also a professor adjoint in the Department of Physics at CU Boulder.
HAYSTAC is led by Yale and is a partnership with JILA and the University of California, Berkeley.
Daniel Palken, the co-first author of the new paper, explained that what makes the axion so difficult to find is also what makes it such an ideal candidate for dark matter — it’s lightweight, carries no electric charge and almost never interacts with normal matter.
“They don’t have any of the properties that make a particle easy to detect,” said Palken, who earned his PhD from JILA in 2020.
But there’s one silver lining: If axions pass through a strong enough magnetic field, a small number of them may transform into waves of light — and that’s something that scientists can detect. Researchers have launched efforts to find those signals in powerful magnetic fields in space. The HAYSTAC experiment, however, is keeping its feet planted on Earth.
The project, which published its first findings in 2017, employs an ultra-cold facility on the Yale campus to create strong magnetic fields, then try to detect the signal of axions turning into light. It’s not an easy search. Scientists have predicted that axions could exhibit an extremely wide range of theoretical masses, each of which would produce a signal at a different frequency of light in an experiment like HAYSTAC. In order to find the real particle, then, the team may have to rifle through a large range of possibilities — like tuning a radio to find a single, faint station.
“If you’re trying to drill down to these really feeble signals, it could end up taking you thousands of years,” Palken said.
Some of the biggest obstacles facing the team are the laws of quantum mechanics themselves — namely, the Heisenberg Uncertainty Principle, which limits how accurately scientists can observe particles. In this case, the team can’t accurately measure two different properties of the light produced by axions at the same time.
The HAYSTAC team, however, has landed on a way to slip past those immutable laws.
The trick comes down to using a tool called a Josephson parametric amplifier. Scientists at JILA developed a way to use these small devices to “squeeze” the light they were getting from the HAYSTAC experiment.
Palken explained that the HAYSTAC team doesn’t need to detect both properties of incoming light waves with precision — just one of them. Squeezing takes advantage of that by shifting uncertainties in measurements from one of those variables to another.
“Squeezing is just our way of manipulating the quantum mechanical vacuum to put ourselves in a position to measure one variable very well,” Palken said. “If we tried to measure the other variable, we would find we would have very little precision.”
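The trade-off Palken describes can be illustrated numerically. In a simple Gaussian model of the vacuum, squeezing shrinks the noise in one quadrature while amplifying it in the other, leaving the uncertainty product untouched. This toy simulation is our own illustration of the principle, not the HAYSTAC apparatus:

```python
import numpy as np

rng = np.random.default_rng(0)
VACUUM_VAR = 0.5  # variance of each quadrature of the vacuum state (hbar = 1)

def squeezed_quadratures(r, n=200_000):
    """Sample the two quadratures of a squeezed vacuum state.

    Squeezing by parameter r shrinks the noise variance in X by e^(-2r)
    while amplifying it in P by e^(+2r); the product of the two variances
    -- the Heisenberg bound -- is unchanged.
    """
    x = rng.normal(0.0, np.sqrt(VACUUM_VAR * np.exp(-2 * r)), n)
    p = rng.normal(0.0, np.sqrt(VACUUM_VAR * np.exp(+2 * r)), n)
    return x, p

x, p = squeezed_quadratures(r=1.0)
# var(x) is ~7x below vacuum noise, var(p) ~7x above; var(x)*var(p) stays ~0.25
```

Measuring only the squeezed quadrature is what lets the experiment dig below what would otherwise be an unavoidable quantum noise floor.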
To test out the method, the researchers did a trial run at Yale to look for the particle over a certain range of masses. They didn’t find it, but the experiment took half the time that it usually would, Backes said.
“We did a 100-day data run,” she said. “Normally, this paper would have taken us 200 days to complete, so we saved a third of a year, which is pretty incredible.”
Lehnert added that the group is eager to push those bounds even farther — coming up with new ways to dig for that ever-elusive needle.
“There’s a lot of meat left on the bone in just making the idea work better,” he said.
Reference: 10 February 2021, Nature.
Machines, thanks to novel algorithms and advances in computer technology, can now learn complex models and even generate high-quality synthetic data such as photo-realistic images or even resumes of imaginary humans. A study recently published in the international journal PLOS Genetics uses machine learning to mine existing biobanks and generate chunks of human genomes that do not belong to real humans but have the characteristics of real genomes.
“Existing genomic databases are an invaluable resource for biomedical research, but they are either not publicly accessible or shielded behind long and exhausting application procedures due to valid ethical concerns. This creates a major scientific barrier for researchers. Machine-generated genomes, or artificial genomes as we call them, can help us overcome the issue within a safe ethical framework,” said Burak Yelmen, first author of the study and Junior Research Fellow of Modern Population Genetics at the University of Tartu.
The multidisciplinary team performed multiple analyses to assess the quality of the generated genomes compared to real ones. “Surprisingly, these genomes emerging from random noise mimic the complexities that we can observe within real human populations and, for most properties, they are not distinguishable from other genomes from the biobank we used to train our algorithm, except for one detail: they do not belong to any gene donor,” said Dr Luca Pagani, one of the senior authors of the study and a Mobilitas Pluss fellow.
The study additionally involves the assessment of the proximity of artificial genomes to real genomes to test whether the privacy of the original samples is preserved. “Although detecting privacy leaks among thousands of genomes could appear as looking for a needle in a haystack, combining multiple statistical measures allowed us to check all models carefully. Excitingly, the detailed exploration of complex leakage patterns can lead to improvements in generative model evaluation and design, and will fuel back the machine learning field,” said Dr Flora Jay, the coordinator of the study and CNRS researcher in the Interdisciplinary computer science laboratory (LRI/LISN, Université Paris-Saclay, French National Centre for Scientific Research).
All in all, machine learning approaches have already provided faces, biographies and multiple other features for a handful of imaginary humans; now we know more about their biology. These imaginary humans with realistic genomes could serve as proxies for all the real genomes that are not publicly available or that require long application procedures or collaborations, hence removing an important accessibility barrier in genomic research, in particular for underrepresented populations.
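To make the task concrete: the input is a matrix of binary haplotypes (rows are donors, columns are biallelic sites) and the output is a matrix of the same kind belonging to no one. The sketch below is a deliberately naive baseline that only matches per-site allele frequencies, ignoring the linkage between sites that the study's generative neural networks are designed to capture — it illustrates the shape of the problem, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(42)

def naive_artificial_haplotypes(real_haplotypes, n_synthetic):
    """Sample synthetic binary haplotypes from per-site allele frequencies.

    This toy baseline treats every site independently, so it discards the
    correlation structure (linkage disequilibrium) that makes real genomes
    hard to imitate -- exactly what generative networks are used for.
    """
    freqs = real_haplotypes.mean(axis=0)  # allele frequency at each site
    return (rng.random((n_synthetic, freqs.size)) < freqs).astype(np.int8)

# toy "biobank": 100 haplotypes over 50 biallelic sites
real = (rng.random((100, 50)) < 0.3).astype(np.int8)
synthetic = naive_artificial_haplotypes(real, n_synthetic=20)
```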
Reference: “Creating artificial human genomes using generative neural networks” by Burak Yelmen, Aurélien Decelle, Linda Ongaro, Davide Marnetto, Corentin Tallec, Francesco Montinaro, Cyril Furtlehner, Luca Pagani and Flora Jay, 4 February 2021, PLOS Genetics.
People traditionally think that lungs and limbs are key innovations that came with the vertebrate transition from water to land. But in fact, the genetic basis of air-breathing and limb movement was already established in our fish ancestor 50 million years earlier. This, according to a recent genome mapping of primitive fish conducted by the University of Copenhagen, among others. The new study changes our understanding of a key milestone in our own evolutionary history.
There is nothing new about humans and all other vertebrates having evolved from fish. The conventional understanding has been that certain fish moved landwards roughly 370 million years ago and gave rise to primitive, lizard-like animals known as tetrapods. According to this understanding, our fish ancestors made the transition from water to land by converting their fins into limbs and switching from breathing under water to breathing air.
However, limbs and lungs are not innovations that appeared as recently as once believed. Our common fish ancestor, which lived 50 million years before the first tetrapod came ashore, already carried the genetic codes for limb-like forms and the air breathing needed for landing. These genetic codes are still present in humans and in a group of primitive fishes.
This has been demonstrated by recent genomic research conducted by the University of Copenhagen and its partners. The new research reports that the evolution of these ancestral genetic codes might have contributed to the vertebrate water-to-land transition, changing the traditional view of the sequence and timeline of this big evolutionary jump. The study has been published in the scientific journal Cell.
“The water-to-land transition is a major milestone in our evolutionary history. The key to understanding how this transition happened is to reveal when and how the lungs and limbs evolved. We are now able to demonstrate that the genetic basis underlying these biological functions arose much earlier, long before the first animals came ashore,” said professor and lead author Guojie Zhang, from the Villum Centre for Biodiversity Genomics at the University of Copenhagen’s Department of Biology.
A group of ancient living fishes might hold the key to explaining how tetrapods ultimately managed to grow limbs and breathe air. The group includes the bichir, which lives in shallow freshwater habitats in Africa. These fishes differ from most other extant bony fishes by carrying traits that our early fish ancestors might have had over 420 million years ago — traits that are also present in, for example, humans. Through genomic sequencing, the researchers found that the genes needed for the development of lungs and limbs had already appeared in these primitive species.
Our synovial joints evolved from a fish ancestor
Using pectoral fins with a locomotor function like limbs, the bichir can move about on land in a similar way to the tetrapod. Researchers have for some years believed that pectoral fins in bichir represent the fins that our early fish ancestors had.
The new genome mapping shows that the joint connecting the so-called metapterygium bone with the radial bones in the bichir’s pectoral fin is homologous to synovial joints in humans — the joints that connect the upper arm and forearm bones. The DNA sequence that controls the formation of our synovial joints already existed in the common ancestor of bony fishes and is still present in these primitive fishes and in terrestrial vertebrates. At some point, this DNA sequence and the synovial joint were lost in the most common group of bony fishes — the so-called teleosts.
“This genetic code and the joint allow our bones to move freely, which explains why the bichir can move around on land,” says Guojie Zhang.
First lungs, then swim bladder
Moreover, the bichir and a few other primitive fishes have a pair of lungs that anatomically resembles ours. The new study reveals that the lungs in both the bichir and the alligator gar also function in a similar manner and express the same set of genes as human lungs.
At the same time, the study demonstrates that the tissues of the lungs and swim bladders of most extant fishes are very similar in gene expression, confirming that they are homologous organs, as Darwin predicted. But while Darwin suggested that swim bladders converted into lungs, the study suggests it is more likely that swim bladders evolved from lungs. The research suggests that our early bony fish ancestors had primitive, functional lungs. Through evolution, one branch of fishes preserved the lung functions that are more adapted to air breathing, ultimately leading to the evolution of tetrapods. The other branch modified the lung structure and evolved swim bladders, leading to the evolution of teleosts. Swim bladders allow these fishes to maintain buoyancy and perceive pressure, and thus survive better under water.
“The study enlightens us as to where our body organs came from and how their functions are encoded in the genome. Some of the functions related to lungs and limbs thus did not evolve at the time of the water-to-land transition, but are encoded by ancient gene regulatory mechanisms that were already present in our fish ancestors long before landing. It is interesting that these genetic codes are still present in these ‘living fossil’ fishes, which offer us the opportunity to trace the roots of these genes,” concludes Guojie Zhang.
FACT BOX 1: Not just limbs and lungs, but also the heart
Primitive fish and humans also share a common and critical function in the cardio-respiratory system: the conus arteriosus, a structure in the right ventricle of the heart that may allow it to deliver oxygen efficiently to the whole body, and which is also found in the bichir. However, the vast majority of bony fish have lost this structure. The researchers discovered a genetic element that appears to control the development of the conus arteriosus. Transgenic experiments with mice showed that when researchers removed this genetic element, the mutant mice died due to thinner, smaller right ventricles, which led to congenital heart defects and compromised heart function.
FACT BOX 2:
The vast majority of extant fish species belong to the ray-finned fishes, a subclass of the bony fishes. These are typically fish with gills, fins and a swim bladder. The terrestrial vertebrates are known as tetrapods. Tetrapods include all vertebrates descended from the first animals that adapted to life on land by developing four limbs and lungs, i.e., all mammals, birds, reptiles and amphibians. The researchers’ theory is that the air-breathing ability of these primitive fishes allowed them to survive the second mass extinction, roughly 375-360 million years ago, when oxygen depletion in Earth’s oceans wiped out a majority of species. Lungs allowed some fish to survive on land. The study has been published in the scientific journal Cell. The research team also contributed to another paper, which reported the genome of another primitive fish, the lungfish; it is the biggest vertebrate genome decoded so far. That paper was published in Cell at the same time. The research is supported by the Villum Foundation, among others.
Reference: “Tracing the genetic footprints of vertebrate landing in non-teleost ray-finned fishes” by Xupeng Bi, Kun Wang, Liandong Yang, Hailin Pan, Haifeng Jiang, Qiwei Wei, Miaoquan Fang, Hao Yu, Chenglong Zhu, Yiran Cai, Yuming He, Xiaoni Gan, Honghui Zeng, Daqi Yu, Youan Zhu, Huifeng Jiang, Qiang Qiu, Huanming Yang, Yong E. Zhang, Wen Wang, Min Zhu, Shunping He and Guojie Zhang, 4 February 2021, Cell.
Entry, Descent, and Landing – often referred to as “EDL” – is the shortest and most intense phase of the Mars 2020 mission. It begins when the spacecraft reaches the top of the Martian atmosphere, traveling nearly 12,500 miles per hour (20,000 kilometers per hour). It ends about seven minutes later, with Perseverance stationary on the Martian surface. To safely go from those speeds down to zero, in that short amount of time, while hitting a narrow target on the surface, requires “slamming on the brakes” in a very careful, creative, and challenging way.
Experience Entry, Descent, and Landing
Real-Time Simulation: Fly alongside Perseverance in this 3D demo of its Entry, Descent, and Landing.
Landing on Mars is hard. Only about 40 percent of the missions ever sent to Mars – by any space agency – have been successful. Hundreds of things have to go just right during this nail-biting drop. What’s more, Perseverance has to handle everything by itself. During the landing, it takes more than 11 minutes to get a radio signal back from Mars, so by the time the mission team hears that the spacecraft has entered the atmosphere, in reality, the rover is already on the ground. So, Perseverance is designed to complete the entire EDL process by itself – autonomously.
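The 11-minute delay quoted above is just the one-way light travel time between the planets. As a rough sketch (the Earth-Mars separation of about 2.09e11 m assumed here is illustrative; the true distance varies between roughly 0.5 and 2.5 AU over the synodic cycle):

```python
# One-way radio signal delay from Mars to Earth: distance / speed of light.
# Assumption: Earth-Mars separation of ~2.09e11 m (~1.4 AU) around landing.

SPEED_OF_LIGHT_M_S = 299_792_458        # speed of light in vacuum (m/s)
EARTH_MARS_DISTANCE_M = 2.09e11         # assumed separation at landing (m)

delay_seconds = EARTH_MARS_DISTANCE_M / SPEED_OF_LIGHT_M_S
delay_minutes = delay_seconds / 60

print(f"One-way signal delay: {delay_minutes:.1f} minutes")
```

Since the whole EDL sequence takes about seven minutes, the rover has landed (or crashed) well before its "entering the atmosphere" signal even reaches Earth, which is why it must fly itself.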
How the Landing Plays Out
Ten minutes before entering the atmosphere, the spacecraft sheds its cruise stage, which houses solar panels, radios, and fuel tanks used during its flight to Mars. Only the protective aeroshell – with rover and descent stage inside – makes the trip to the surface. Before entering the atmosphere, the vehicle fires small thrusters on the backshell to reorient itself and make sure the heat shield is facing forward for what comes next.
As the spacecraft enters the Martian atmosphere, the drag produced drastically slows it down – but these forces also heat it up dramatically. Peak heating occurs about 80 seconds after atmospheric entry, when the temperature at the external surface of the heat shield reaches about 2,370 degrees Fahrenheit (about 1,300 degrees Celsius). Safe in the aeroshell, however, the rover gets up to only about room temperature.
As it begins to descend through the atmosphere, the spacecraft encounters pockets of air that are more or less dense, which can nudge it off course. To compensate, it fires small thrusters on its backshell that adjust its angle and direction of lift. This “guided entry” technique helps the spacecraft stay on the path to its downrange target.
The heat shield slows the spacecraft to under 1,000 miles per hour (1,600 kilometers per hour). At that point, it’s safe to deploy the supersonic parachute. To nail the timing of this critical event, Perseverance uses a new technology – Range Trigger – to calculate its distance to the landing target and open the parachute at the ideal time to hit its mark.
The parachute, which is 70.5 feet (21.5 meters) in diameter, deploys about 240 seconds after entry, at an altitude of about 7 miles (11 kilometers) and a velocity of about 940 mph (1,512 kph).
Zeroing In on Landing
Twenty seconds after parachute deployment, the heat shield separates and drops away. The rover is exposed to the atmosphere of Mars for the first time, and key cameras and instruments can begin to lock onto the fast-approaching surface below. Its landing radar bounces signals off the surface to figure out its altitude. Meanwhile, another new EDL technology — Terrain-Relative Navigation — kicks in.
Using a special camera to quickly identify features on the surface, the rover compares these to an onboard map to determine exactly where it’s heading. Mission team members have mapped in advance the safest areas of the landing zone. If Perseverance can tell that it’s headed for more hazardous terrain, it picks the safest spot it can reach and gets ready for the next dramatic step.
In the thin Martian atmosphere, the parachute is only able to slow the vehicle to about 200 miles per hour (320 kilometers per hour). To get to its safe touchdown speed, Perseverance must cut itself free of the parachute, and ride the rest of the way down using rockets.
Directly above the rover, inside the backshell, is the rocket-powered descent stage. Think of it as a kind of jetpack with eight engines pointed down at the ground. Once it’s about 6,900 feet (2,100 meters) above the surface, the rover separates from the backshell, and fires up the descent stage engines.
The descent stage quickly diverts to one side or the other, to avoid being impacted by the parachute and backshell coming down behind it. The direction of its divert maneuver is determined by the safe target selected by the computer that runs Terrain-Relative Navigation.
As the descent stage levels out and slows to its final descent speed of about 1.7 miles per hour (2.7 kilometers per hour), it initiates the “skycrane” maneuver. With about 12 seconds before touchdown, at about 66 feet (20 meters) above the surface, the descent stage lowers the rover on a set of cables about 21 feet (6.4 meters) long. Meanwhile, the rover unstows its mobility system, locking its legs and wheels into landing position.
As soon as the rover senses that its wheels have touched the ground, it quickly cuts the cables connecting it to the descent stage. This frees the descent stage to fly off to make its own uncontrolled landing on the surface, a safe distance away from Perseverance.
Save the Date
Known by the team as the “Seven Minutes of Terror,” the Entry, Descent, and Landing for Perseverance will be broadcast live as the rover arrives at Mars on Feb. 18, 2021.
New capabilities developed by an international team of astronomers make it possible to directly image planets that could potentially harbor life within the habitable zone of a neighboring star system.
It is now possible to capture images of planets that could potentially sustain life around nearby stars, thanks to advances reported by an international team of astronomers in the journal Nature Communications.
Using a newly developed system for mid-infrared exoplanet imaging, in combination with a very long observation time, the study’s authors say they can now use ground-based telescopes to directly capture images of planets about three times the size of Earth within the habitable zones of nearby stars.
Efforts to directly image exoplanets — planets outside our solar system — have been hamstrung by technological limitations, resulting in a bias toward the detection of easier-to-see planets that are much larger than Jupiter and are located around very young stars and far outside the habitable zone — the “sweet spot” in which a planet can sustain liquid water. If astronomers want to find alien life, they need to look elsewhere.
“If we want to find planets with conditions suitable for life as we know it, we have to look for rocky planets roughly the size of Earth, inside the habitable zones around older, sun-like stars,” said the paper’s first author, Kevin Wagner, a Sagan Fellow in NASA’s Hubble Fellowship Program at the University of Arizona’s Steward Observatory.
The method described in the paper provides more than a tenfold improvement over existing capabilities to directly observe exoplanets, Wagner said. Most exoplanet-imaging studies have looked at infrared wavelengths shorter than 10 microns, stopping just short of the range where such planets shine brightest.
“There is a good reason for that because the Earth itself is shining at you at those wavelengths,” Wagner said. “Infrared emissions from the sky, the camera and the telescope itself are essentially drowning out your signal. But the good reason to focus on these wavelengths is that’s where an Earthlike planet in the habitable zone around a sun-like star is going to shine brightest.”
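Why roughly 10 microns? Wien's displacement law gives the wavelength at which a thermal emitter is brightest, and a temperate habitable-zone planet sits near Earth-like temperatures. A quick illustrative calculation (the temperatures chosen here are assumptions for the sketch, not values from the study):

```python
# Wien's displacement law: lambda_peak = b / T.
# A temperate planet (T ~ 255-300 K) emits most strongly near 10 microns,
# just beyond where most previous imaging surveys stopped.

WIEN_B_MICRON_K = 2897.77  # Wien displacement constant (micron * kelvin)

# 255 K: Earth's equilibrium temperature; 288 K: Earth's mean surface
# temperature; 300 K: a slightly warmer habitable-zone planet.
for temp_k in (255, 288, 300):
    peak_micron = WIEN_B_MICRON_K / temp_k
    print(f"T = {temp_k} K -> thermal emission peaks near {peak_micron:.1f} microns")
```

All three cases peak in the 9-12 micron window, which is exactly the mid-infrared range the team targeted despite the bright thermal background there.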
The team used the Very Large Telescope, or VLT, of the European Southern Observatory in Chile to observe our closest neighbor star system: Alpha Centauri, just 4.4 light-years away. Alpha Centauri is a triple star system; it consists of two stars — Alpha Centauri A and B — that are similar to the sun in size and age and orbit each other as a binary system. The third star, Alpha Centauri C, better known as Proxima Centauri, is a much smaller red dwarf orbiting its two siblings at a great distance.
A planet not quite twice the size of Earth, orbiting in the habitable zone around Proxima Centauri, has already been detected indirectly through observations of the star’s radial velocity variation — the tiny wobble a star exhibits under the tug of an unseen planet. Alpha Centauri A and B could host similar planets, the study’s authors say, but indirect detection methods are not yet sensitive enough to find rocky planets in their more widely separated habitable zones.
“With direct imaging, we can now push beneath those detection limits for the first time,” he said.
To boost the sensitivity of the imaging setup, the team used a so-called adaptive secondary telescope mirror that can correct for the distortion of the light by the Earth’s atmosphere. In addition, the researchers used a starlight-blocking mask that they optimized for the mid-infrared light spectrum to block the light from one of the stars at a time. To enable observing both stars’ habitable zones simultaneously, they also pioneered a new technique to switch back and forth between observing Alpha Centauri A and Alpha Centauri B very rapidly.
“We’re moving one star on and one star off the coronagraph every tenth of a second,” Wagner said. “That allows us to observe each star for half of the time, and, importantly, it also allows us to subtract one frame from the subsequent frame, which removes everything that is essentially just noise from the camera and the telescope.”
Using this approach, the undesired starlight and “noise” — unwanted signal from within the telescope and camera — become essentially random background noise, which can be reduced further by stacking images and subtracting the noise with specialized software.
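The chop-and-stack idea can be sketched in a few lines of NumPy. This is a toy illustration of the principle, not the NEAR pipeline: the arrays, the injected "planet" brightness, and the noise levels are all made up for the demonstration.

```python
import numpy as np

# Sketch of chopping: alternate frames see star A and star B on the
# coronagraph. Subtracting each frame from the next cancels the shared,
# slowly varying thermal background from sky, telescope and camera;
# averaging ("stacking") the many difference frames then beats down the
# remaining random noise until a faint source stands out.

rng = np.random.default_rng(0)
n_frames, size = 1_000, 64

# Hypothetical data: constant thermal background plus random noise,
# plus a faint point source present only in the even (star-A) frames.
background = 100.0
frames = background + rng.normal(0.0, 1.0, (n_frames, size, size))
frames[0::2, 32, 40] += 0.5           # faint "planet" in star-A frames

# Pairwise subtraction removes the shared background ...
diffs = frames[0::2] - frames[1::2]   # shape (n_frames // 2, size, size)

# ... and stacking averages down the random noise, revealing the source.
stacked = diffs.mean(axis=0)
print(f"signal at planet pixel: {stacked[32, 40]:.2f} (injected 0.5)")
print(f"background pixel rms:   {stacked.std():.3f}")
```

With 500 difference frames, the per-pixel noise drops by a factor of about sqrt(500), so a source far too faint to see in any single frame emerges clearly in the stack — the same logic that let the team build sensitivity over nearly 100 hours of data.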
Much like noise-canceling headphones, which let soft music be heard over the steady drone of a jet engine, the technique allowed the team to remove as much of the unwanted noise as possible and detect the much fainter signals of potential planet candidates inside the habitable zone.
The team observed the Alpha Centauri system for nearly 100 hours over the course of a month in 2019, collecting more than 5 million images. They collected about 7 terabytes of data, which they made publicly available at http://archive.eso.org.
“This is one of the first dedicated multi-night exoplanet imaging campaigns, in which we stacked all of the data we accumulated over nearly a month and used that to achieve our final sensitivity,” Wagner said.
After removing so-called artifacts — false signals created by the instrumentation and residual light from the coronagraph — the final image revealed a light source designated as “C1” that could potentially hint at the presence of an exoplanet candidate inside the habitable zone.
“There is one point source that looks like what we would expect a planet to look like, that we can’t explain with any of the systematic error corrections,” Wagner said. “We are not at the level of confidence to say we discovered a planet around Alpha Centauri, but there is a signal there that could be that with some subsequent verification.”
Simulations of what planets within the data are likely to look like suggest that “C1” could be a Neptune- to Saturn-sized planet at a distance from Alpha Centauri A similar to the distance between the Earth and the sun, Wagner said. However, the authors clearly state that without subsequent verification, they cannot yet rule out the possibility that C1 is an unknown artifact of the instrument itself.
Finding a potentially habitable planet within Alpha Centauri has been the goal of the initiative Breakthrough Watch/NEAR, which stands for New Earths in the Alpha Centauri Region. Breakthrough Watch is a global astronomical program looking for Earthlike planets around nearby stars.
“We are very grateful to the Breakthrough Initiatives and ESO for their support in achieving another steppingstone towards the imaging of Earthlike planets around our neighbor stars,” said Markus Kasper, lead scientist of the NEAR project and a co-author on the paper.
The team intends to embark on another imaging campaign in a few years, in an attempt to catch this potential exoplanet at a different point along its orbit and to check whether its position is consistent with models of its expected orbit. Further clues may come from follow-up observations using different methods.
The next generation of extremely large telescopes, such as the European Southern Observatory’s Extremely Large Telescope and the Giant Magellan Telescope, for which the University of Arizona produces the primary mirrors, is expected to increase the number of nearby stars whose habitable zones can be directly imaged by a factor of 10, Wagner explained. Candidates include Sirius, the brightest star in the night sky, and Tau Ceti, which hosts an indirectly observed planetary system that Wagner and his colleagues will try to image directly.
“Making the capability demonstrated here a routine observing mode — to be able to pick up heat signatures of planets orbiting within the habitable zones of nearby stars — will be a game changer for the exploration of new worlds and for the search for life in the universe,” said study co-author Daniel Apai, a UArizona associate professor of astronomy and planetary science who leads the NASA-funded Earths in Other Solar Systems program that partly supported the study.
Reference: 10 February 2021, Nature Communications.
Funding for NEAR was provided primarily by the Breakthrough Watch program and the European Southern Observatory (ESO). Breakthrough Watch is managed by the Breakthrough Initiatives, sponsored by the Breakthrough Foundation. Breakthrough Watch provided the instrument upgrades that made the observations possible, and ESO contributed the telescope time.
For a full set of authors and institutions, and funding information, please see the research paper “Imaging low-mass planets within the habitable zone of Alpha Centauri.”