A massive simulation of the cosmos and a nod to the next generation of computing.
A team of physicists and computer scientists from the U.S. Department of Energy’s (DOE) Argonne National Laboratory performed one of the five largest cosmological simulations ever. Data from the simulation will inform sky maps to aid leading large-scale cosmological experiments.
The simulation, called the Last Journey, follows the distribution of mass across the universe over time — in other words, how gravity causes a mysterious invisible substance called “dark matter” to clump together to form larger-scale structures called halos, within which galaxies form and evolve.
The scientists performed the simulation on Argonne’s supercomputer Mira. The same team of scientists ran a previous cosmological simulation called the Outer Rim in 2013, just days after Mira turned on. After running simulations on the machine throughout its seven-year lifetime, the team marked Mira’s retirement with the Last Journey simulation.
The Last Journey demonstrates how far observational and computational technology has come in just seven years, and it will contribute data and insight to experiments such as the Stage-4 ground-based cosmic microwave background experiment (CMB-S4), the Legacy Survey of Space and Time (carried out by the Rubin Observatory in Chile), the Dark Energy Spectroscopic Instrument and two NASA missions, the Roman Space Telescope and SPHEREx.
“We worked with a tremendous volume of the universe, and we were interested in large-scale structures, like regions of thousands or millions of galaxies, but we also considered dynamics at smaller scales,” said Katrin Heitmann, deputy division director for Argonne’s High Energy Physics (HEP) division.
Argonne’s Mira supercomputer was recently retired after seven years of enabling groundbreaking science. Credit: Argonne National Laboratory
The code that constructed the cosmos
The six-month span for the Last Journey simulation and major analysis tasks presented unique challenges for software development and workflow. The team adapted some of the same code used for the 2013 Outer Rim simulation with some significant updates to make efficient use of Mira, an IBM Blue Gene/Q system that was housed at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility.
Specifically, the scientists used the Hardware/Hybrid Accelerated Cosmology Code (HACC) and its analysis framework, CosmoTools, to enable incremental extraction of relevant information at the same time as the simulation was running.
“Running the full machine is challenging because reading the massive amount of data produced by the simulation is computationally expensive, so you have to do a lot of analysis on the fly,” said Heitmann. “That’s daunting, because if you make a mistake with analysis settings, you don’t have time to redo it.”
The team took an integrated approach to carrying out the workflow during the simulation. HACC would run the simulation forward in time, determining the effect of gravity on matter during large portions of the history of the universe. Once HACC determined the positions of trillions of computational particles representing the overall distribution of matter, CosmoTools would step in to record relevant information — such as finding the billions of halos that host galaxies — to use for analysis during post-processing.
“When we know where the particles are at a certain point in time, we characterize the structures that have formed by using CosmoTools and store a subset of data for further use down the line,” said Adrian Pope, physicist and core HACC and CosmoTools developer in Argonne’s Computational Science (CPS) division. “If we find a dense clump of particles, that indicates the location of a dark matter halo, and galaxies can form inside these dark matter halos.”
The scientists repeated this interwoven process — where HACC moves particles and CosmoTools analyzes and records specific data — until the end of the simulation. The team then used features of CosmoTools to determine which clumps of particles were likely to host galaxies. For reference, around 100 to 1,000 particles represent a single galaxy in the simulation.
“We would move particles, do analysis, move particles, do analysis,” said Pope. “At the end, we would go back through the subsets of data that we had carefully chosen to store and run additional analysis to gain more insight into the dynamics of structure formation, such as which halos merged together and which ended up orbiting each other.”
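The interleaved pattern Pope describes can be captured in a short sketch. The following Python pseudocode is purely illustrative: the function names (`advance_particles`, `find_halos`), the analysis cadence, and the subsampling are hypothetical stand-ins, not the actual HACC or CosmoTools interfaces.

```python
# Minimal sketch of the in-situ simulate/analyze loop described above.
# All function names and parameters are hypothetical stand-ins.
import numpy as np

def advance_particles(positions, velocities, dt):
    """Placeholder gravity step: a real N-body code would do a full
    force solve here, not simple free streaming."""
    return positions + velocities * dt, velocities

def find_halos(positions):
    """Placeholder halo finder: a real code would use a
    friends-of-friends or spherical-overdensity algorithm."""
    return positions[::1000]  # pretend dense clumps were identified

def run_simulation(positions, velocities, n_steps, dt, analysis_every=10):
    stored_catalogs = []
    for step in range(n_steps):
        # "Move particles": evolve the matter distribution forward in time.
        positions, velocities = advance_particles(positions, velocities, dt)
        # "Do analysis": extract halos on the fly instead of writing the
        # prohibitively large full particle snapshot to disk.
        if step % analysis_every == 0:
            stored_catalogs.append((step, find_halos(positions)))
    return stored_catalogs  # reduced data for post-processing
```

The design choice the sketch mirrors is that only reduced halo catalogs, never the full particle snapshots, are retained for post-processing, which is why on-the-fly analysis was essential at this scale.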
Using the optimized workflow with HACC and CosmoTools, the team ran the simulation in half the expected time.
The Last Journey simulation will provide data necessary for other major cosmological experiments to use when comparing observations or drawing conclusions about a host of topics. These insights could shed light on topics ranging from cosmological mysteries, such as the role of dark matter and dark energy in the evolution of the universe, to the astrophysics of galaxy formation across the universe.
“This huge data set they are building will feed into many different efforts,” said Katherine Riley, director of science at the ALCF. “In the end, that’s our primary mission — to help high-impact science get done. When you’re able to not only do something cool, but to feed an entire community, that’s a huge contribution that will have an impact for many years.”
The team’s simulation will address numerous fundamental questions in cosmology and is essential for enabling the refinement of existing models and the development of new ones, impacting both ongoing and upcoming cosmological surveys.
“We are not trying to match any specific structures in the actual universe,” said Pope. “Rather, we are making statistically equivalent structures, meaning that if we looked through our data, we could find locations where galaxies the size of the Milky Way would live. But we can also use a simulated universe as a comparison tool to find tensions between our current theoretical understanding of cosmology and what we’ve observed.”
Looking to exascale
“Thinking back to when we ran the Outer Rim simulation, you can really see how far these scientific applications have come,” said Heitmann, who performed Outer Rim in 2013 with the HACC team and Salman Habib, CPS division director and Argonne Distinguished Fellow. “It was awesome to run something substantially bigger and more complex that will bring so much to the community.”
As Argonne works towards the arrival of Aurora, the ALCF’s upcoming exascale supercomputer, the scientists are preparing for even more extensive cosmological simulations. Exascale computing systems will be able to perform a billion billion calculations per second — 50 times faster than many of the most powerful supercomputers operating today.
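For a sense of scale, “a billion billion” calculations per second is 10^18 floating-point operations per second, or one exaflop. A quick back-of-the-envelope check of the “50 times faster” figure, assuming a leading system of the era sustains roughly 20 petaflops (an illustrative round number, not a value from the article):

```python
# "A billion billion" calculations per second = 1e18 FLOP/s (one exaflop).
exascale_flops = 1e18
# Assumed round figure for a leading ~20-petaflop supercomputer of the era.
petascale_flops = 20e15
print(exascale_flops / petascale_flops)  # -> 50.0, matching the quoted speedup
```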
“We’ve learned and adapted a lot during the lifespan of Mira, and this is an interesting opportunity to look back and look forward at the same time,” said Pope. “When preparing for simulations on exascale machines and a new decade of progress, we are refining our code and analysis tools, and we get to ask ourselves what we weren’t doing because of the limitations we have had until now.”
The Last Journey was a gravity-only simulation, meaning it did not consider interactions such as gas dynamics and the physics of star formation. Gravity is the major player in large-scale cosmology, but the scientists hope to incorporate other physics in future simulations to observe the differences they make in how matter moves and distributes itself through the universe over time.
“More and more, we find tightly coupled relationships in the physical world, and to simulate these interactions, scientists have to develop creative workflows for processing and analyzing,” said Riley. “With these iterations, you’re able to arrive at your answers — and your breakthroughs — even faster.”
A paper on the simulation, titled “The Last Journey. I. An extreme-scale simulation on the Mira supercomputer,” was published on January 27, 2021, in the Astrophysical Journal Supplement Series. The scientists are currently preparing follow-up papers to generate detailed synthetic sky catalogs.
Reference: “The Last Journey. I. An Extreme-scale Simulation on the Mira Supercomputer” by Katrin Heitmann, Nicholas Frontiere, Esteban Rangel, Patricia Larsen, Adrian Pope, Imran Sultan, Thomas Uram, Salman Habib, Hal Finkel, Danila Korytov, Eve Kovacs, Silvio Rizzi, Joe Insley and Janet Y. K. Knowles, 27 January 2021, Astrophysical Journal Supplement Series.
The work was a multidisciplinary collaboration between high energy physicists and computer scientists from across Argonne and researchers from Los Alamos National Laboratory.
Funding for the simulation is provided by DOE’s Office of Science.
It’s clear that rising greenhouse gas emissions are the main driver of global warming. But on a regional level, several other factors are at play. That’s especially true in the Arctic — a massive oceanic region around the North Pole that is warming two to three times faster than the rest of the planet. One consequence of the melting of the Arctic ice cap is a reduction in albedo, the fraction of incoming solar radiation that a surface reflects. Earth’s bright surfaces, such as glaciers, snow, and clouds, are highly reflective. As snow and ice decrease, albedo decreases and more radiation is absorbed by the Earth, leading to a rise in near-surface temperature.
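The feedback described above reduces to simple arithmetic: a surface absorbs a fraction (1 − albedo) of the sunlight falling on it. A minimal illustration, using typical textbook albedo values and a rough global-mean solar flux (neither taken from this article):

```python
# Ice-albedo arithmetic: absorbed flux = (1 - albedo) * incoming flux.
# Albedo values and the incoming flux are typical textbook figures,
# used here only for illustration.
INCOMING_FLUX = 340.0  # approximate global-mean solar flux, W/m^2

surfaces = {"fresh snow": 0.8, "open ocean": 0.06}
for name, albedo in surfaces.items():
    absorbed = (1.0 - albedo) * INCOMING_FLUX
    print(f"{name}: albedo={albedo:.2f}, absorbed ~{absorbed:.0f} W/m^2")

# Swapping snow/ice for open water raises absorbed energy several-fold,
# warming the surface and melting more ice: a positive feedback.
```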
The other regional, yet much more complex, factor that scientists need to pay detailed attention to is how clouds and aerosols interact. Aerosols are tiny particles suspended in the air; they come in a wide range of sizes and compositions and can occur naturally — such as from sea spray, marine microbial emissions, or forest fires (like those in Siberia) — or be produced by human activity, for example from the combustion of fossil fuels or from agriculture. Without aerosols, clouds cannot form, because aerosols serve as the surfaces on which water molecules condense into droplets. Owing to this role, and more specifically to how they affect the amount of solar radiation that reaches Earth’s surface and the amount of terrestrial radiation that leaves it, aerosols are an essential element in regulating the climate, and the Arctic climate in particular.
“A lot of question marks”
In a paper published in Nature Climate Change on February 8, 2021, Julia Schmale, the head of EPFL’s Extreme Environments Research Laboratory, alerts the scientific community to the need for a better understanding of aerosol-related processes. “How albedo is affected by ice is fairly well understood — there are established maximum and minimum values, for example,” says Schmale. “But when it comes to groups of aerosols, there are many variables to consider: will they reflect or absorb light, will they form a cloud, are they natural or anthropogenic, will they stay local or travel long distances, and so on. There are a lot of question marks out there, and we need to find the answers.” She worked on the paper with two coauthors: Paul Zieger and Annica M. L. Ekman, both from the Bolin Centre for Climate Research at Stockholm University.
Schmale has carried out several research expeditions to the North Pole, most recently in early 2020 on the German icebreaker Polarstern. She saw first-hand that the Arctic climate tends to change fastest in the winter — despite there being no sunlight, and hence no albedo effect, during this period of 24-hour darkness. Scientists still don’t know why. One reason could be that clouds present in winter reflect the Earth’s heat back down to the ground; this occurs to varying degrees depending on natural cycles and the amount of aerosol in the air. That would lift temperatures above the Arctic ice mass, but the process is extremely complicated due to the wide range of aerosol types and differences in their capacity to reflect and absorb light. “Few observations have been made on this phenomenon because, in order to conduct research in the Arctic in the wintertime, you have to block off an icebreaker, scientists, and research equipment for the entire season,” says Schmale.
Improving weather models
Although many research expeditions have already been carried out in the Arctic, a lot remains to be explored. One option could be to collect all the discoveries made so far on Arctic warming and use them to improve existing weather models. “A major effort is needed right away, otherwise we’ll always be one step behind in understanding what’s going on. The observations we’ve already made could be used to improve our models. A wealth of information is available, but it hasn’t been sorted through in the right way to establish links between the different processes. For instance, our models currently can’t tell us what kinds of aerosols contribute the most to climate change, whether local or anthropogenic,” says Schmale.
In their paper, the research team puts forth three steps that could be taken to gain better insight into the Arctic climate and the role played by aerosols. They suggest creating an interactive, open-source, virtual platform that compiles all Arctic knowledge to date. They point to the International Arctic Systems for Observing the Atmosphere (IASOA) program as an example; the IASOA coordinates the activities of individual Arctic observatories to provide a collaborative international network for Arctic atmospheric research and operations. “We need to improve our climate models because what’s happening in the Arctic will eventually spread elsewhere. It’s already affecting the climate in other parts of the northern hemisphere, as we’ve seen with the melting glaciers and rising sea levels in Greenland. And to develop better models, a better understanding of aerosols’ role will be crucial. They have a major impact on the climate and on human health,” says Schmale.
Reference: “Aerosols in current and future Arctic climate” by Julia Schmale, Paul Zieger and Annica M. L. Ekman, 8 February 2021, Nature Climate Change.
Researchers have identified a new form of magnetism in so-called magnetic graphene, which could point the way toward understanding superconductivity in this unusual type of material.
The researchers, led by the University of Cambridge, were able to control the conductivity and magnetism of iron thiophosphate (FePS3), a two-dimensional material that undergoes a transition from an insulator to a metal when compressed. This class of magnetic materials offers new routes to understanding the physics of new magnetic states and superconductivity.
Using new high-pressure techniques, the researchers have shown what happens to magnetic graphene during the transition from insulator to conductor and into its unconventional metallic state, realized only under ultra-high pressure conditions. When the material becomes metallic, it remains magnetic, which is contrary to previous results and provides clues as to how the electrical conduction in the metallic phase works. The newly discovered high-pressure magnetic phase likely forms a precursor to superconductivity, so understanding its mechanisms is vital.
Their results, published in the journal Physical Review X, also suggest a way that new materials could be engineered to have combined conduction and magnetic properties, which could be useful in the development of new technologies such as spintronics, which could transform the way in which computers process information.
Properties of matter can alter dramatically with changing dimensionality. For example, graphene, carbon nanotubes, graphite, and diamond are all made of carbon atoms, but have very different properties due to their different structure and dimensionality.
“But imagine if you were also able to change all of these properties by adding magnetism,” said first author Dr Matthew Coak, who is jointly based at Cambridge’s Cavendish Laboratory and the University of Warwick. “A material which could be mechanically flexible and form a new kind of circuit to store information and perform computation. This is why these materials are so interesting, and because they drastically change their properties when put under pressure, we can control their behaviour.”
In a previous study by Sebastian Haines of the Cavendish Laboratory and the Department of Earth Sciences, researchers established that the material becomes a metal at high pressure, and outlined how the crystal structure and arrangement of atoms in the layers of this 2D material change through the transition.
“The missing piece has remained however, the magnetism,” said Coak. “With no experimental techniques able to probe the signatures of magnetism in this material at pressures this high, our international team had to develop and test our own new techniques to make it possible.”
The researchers used new techniques to measure the magnetic structure up to record-breaking high pressures, using specially designed diamond anvils and neutrons to act as the probe of magnetism. They were then able to follow the evolution of the magnetism into the metallic state.
“To our surprise, we found that the magnetism survives and is in some ways strengthened,” said co-author Dr. Siddharth Saxena, group leader at the Cavendish Laboratory. “This is unexpected, as the newly-freely-roaming electrons in a newly conducting material can no longer be locked to their parent iron atoms, generating magnetic moments there — unless the conduction is coming from an unexpected source.”
In their previous paper, the researchers showed these electrons were ‘frozen’ in a sense. But when they made them flow or move, they started interacting more and more. The magnetism survives, but gets modified into new forms, giving rise to new quantum properties in a new type of magnetic metal.
How a material behaves, whether conductor or insulator, is mostly based on how the electrons, or charge, move around. However, the ‘spin’ of the electrons has been shown to be the source of magnetism. Spin makes electrons behave a bit like tiny bar magnets and point a certain way. Magnetism from the arrangement of electron spins is used in most memory devices: harnessing and controlling it is important for developing new technologies such as spintronics, which could transform the way in which computers process information.
“The combination of the two, the charge and the spin, is key to how this material behaves,” said co-author Dr David Jarvis from the Institut Laue-Langevin, France, who carried out this work as the basis of his PhD studies at the Cavendish Laboratory. “Finding this sort of quantum multi-functionality is another leap forward in the study of these materials.”
“We don’t know exactly what’s happening at the quantum level, but at the same time, we can manipulate it,” said Saxena. “It’s like those famous ‘unknown unknowns’: we’ve opened up a new door to properties of quantum information, but we don’t yet know what those properties might be.”
There are more potential chemical compounds to synthesize than could ever be fully explored and characterized. But by carefully selecting and tuning materials with special properties, it is possible to point the way towards the creation of such compounds and systems without having to apply huge amounts of pressure.
Additionally, gaining fundamental understanding of phenomena such as low-dimensional magnetism and superconductivity allows researchers to make the next leaps in materials science and engineering, with particular potential in energy efficiency, generation and storage.
As for the case of magnetic graphene, the researchers next plan to continue the search for superconductivity within this unique material. “Now that we have some idea what happens to this material at high pressure, we can make some predictions about what might happen if we try to tune its properties through adding free electrons by compressing it further,” said Coak.
“The thing we’re chasing is superconductivity,” said Saxena. “If we can find a type of superconductivity that’s related to magnetism in a two-dimensional material, it could give us a shot at solving a problem that’s gone back decades.”
Reference: “Emergent Magnetic Phases in Pressure-Tuned van der Waals Antiferromagnet FePS3” by Matthew J. Coak, David M. Jarvis, Hayrullo Hamidov, Andrew R. Wildes, Joseph A. M. Paddison, Cheng Liu, Charles R. S. Haines, Ngoc T. Dang, Sergey E. Kichanov, Boris N. Savenko, Sungmin Lee, Marie Kratochvílová, Stefan Klotz, Thomas C. Hansen, Denis P. Kozlenko, Je-Geun Park and Siddharth S. Saxena, 5 February 2021, Physical Review X.
Scientists have discovered the first evidence for a rare type of stellar explosion, or supernova, in the Milky Way. This intriguing object lies near the center of our galaxy in a supernova remnant called Sagittarius A East (Sgr A East). Chandra data revealed that Sgr A East may belong to a special group of Type Ia supernovas. This result helps astronomers understand the different ways that white dwarf stars can explode.

A composite image of the remnant combines data from NASA’s Chandra X-ray Observatory (blue) and the NSF’s Very Large Array (red). Sgr A East is located very close to the supermassive black hole in the Milky Way’s center, and likely overruns the disk of material surrounding the black hole.
Researchers were able to use Chandra observations targeting the supermassive black hole and the region around it for a total of about 35 days to study Sgr A East and find the unusual pattern of elements in the X-ray signature, or spectrum. An ellipse on the annotated version of the images outlines the region of the remnant where the Chandra spectra were obtained.
The X-ray spectrum of Sgr A East shows that it is a strong candidate for the remains of a so-called Type Iax supernova, a special class of the Type Ia supernova explosions that astronomers use to accurately measure distances across space and study the expansion of the Universe.
Astronomers are still debating the cause of Type Iax supernova explosions, but the leading theory is that they involve thermonuclear reactions that travel much more slowly through the star than in normal Type Ia supernovas. This relatively slow propagation of the blast leads to weaker explosions and, hence, to different amounts of elements being produced. The researchers found this distinctive pattern of elements in the Chandra observations of Sgr A East.
In other galaxies, scientists observe that Type Iax supernovas occur at a rate that is about one third that of Type Ia supernovas. In the Milky Way, there have been three confirmed Type Ia supernova remnants and two candidates that are younger than 2,000 years. If Sgr A East is younger than 2,000 years and is a Type Iax supernova, this study suggests that our Galaxy is in line with the relative numbers of Type Iax supernovas seen in other galaxies.
Previous studies had argued that Sgr A East was the remnant of the collapse of a massive star, a wholly different category of supernova, although a normal Type Ia supernova had not been ruled out. The latest study, conducted with this deep Chandra data, argues against both the massive-star and the normal Type Ia interpretations.
These results will be published on Wednesday, February 10, 2021, in The Astrophysical Journal, and a preprint is available online. The authors of the paper are Ping Zhao (Nanjing University in China, and previously at the University of Amsterdam), Shing-Chi Leung (California Institute of Technology), Zhiyuan Li (Nanjing University), Ken’ichi Nomoto (The University of Tokyo in Japan), Jacco Vink (University of Amsterdam), and Yang Chen (Nanjing University).
NASA’s Marshall Space Flight Center manages the Chandra program. The Smithsonian Astrophysical Observatory’s Chandra X-ray Center controls science from Cambridge, Massachusetts, and flight operations from Burlington, Massachusetts.
New immunotherapy drug activates the body’s innate immune system to fight cancer.
A new nanoparticle-based drug can boost the body’s innate immune system and make it more effective at fighting off tumors, researchers at UT Southwestern have shown. Their study, published in Nature Biomedical Engineering, is the first to successfully target the immune molecule STING with nanoparticles about one millionth the size of a soccer ball that can switch immune activity on and off in response to their physiological environment.
“Activating STING by these nanoparticles is like exerting perpetual pressure on the accelerator to ramp up the natural innate immune response to a tumor,” says study leader Jinming Gao, Ph.D., a professor in UT Southwestern’s Harold C. Simmons Comprehensive Cancer Center and a professor of otolaryngology – head and neck surgery, pharmacology, and cell biology.
For more than a decade, researchers and pharmaceutical companies have been racing to develop drugs that target STING, which stands for “stimulator of interferon genes.” The STING protein, discovered in 2008, helps mediate the body’s innate immune system — the collection of immune molecules that act as first responders when a foreign agent circulates in the body, including cancer DNA. Research has suggested that activating STING can make the innate immune system more powerful at fighting tumors or infections. However, results from earlier clinical trials involving first-generation compounds targeting STING for activation failed to demonstrate an impressive clinical effect.
“A major limitation of conventional small molecule drugs is that after injection into tumors, they are washed out from the tumor site by blood perfusion, which can reduce antitumor efficacy while causing systemic toxicities,” explains Gao.
Gao and his colleagues at UTSW took an approach different from the earlier, first-generation STING agonists, which use synthetic cyclic dinucleotides to activate STING in the body. Gao and his team aimed to design a polymer — a manmade macromolecule that can self-assemble into nanoparticles — to effectively deliver cyclic GMP-AMP (cGAMP), a natural small-molecule activator of STING, to the protein target. But one polymer they synthesized, PC7A, produced an unexpected and novel effect: it activated STING even without cGAMP. The group reported the initial results in 2017, not knowing at the time exactly how PC7A worked; the polymer didn’t resemble any other drugs that activated STING.
In the new paper, Gao’s team showed that PC7A binds to a different site on the STING molecule from known drugs. Moreover, its effect on the STING protein is different. While existing drugs activate the protein over the course of about six hours, PC7A forms polyvalent condensates with STING for over 48 hours, causing a more sustained effect on STING. This longer innate immune activation, they showed, leads to a more effective T cell response against multiple solid tumors. Mice survived longer and had slower tumor growth when they received a combination of PC7A and cGAMP, the researchers found.
The polymer also has other advantages. When circulating in the bloodstream, the polymers are present as small round nanoparticles that do not bind to STING. It’s only when those nanoparticles enter immune cells that they separate, attach to STING, and activate the immune response. That means that PC7A might be less likely to cause side effects throughout the body than other STING-targeting drugs, says Gao, although clinical trials will be needed to prove that.
Because PC7A binds to a different site of the STING molecule, the compound might work in patients for whom typical STING-targeting drugs do not. Up to 20 percent of people have inherited a slightly different gene for STING; the variant makes the STING protein resistant to several cyclic dinucleotide drugs. Gao and his team demonstrated that PC7A can still activate cells that express these STING variants.
“There’s been a lot of excitement about therapies that target STING and the potential role these compounds could play in expanding the benefits of immunotherapies for cancer patients,” says Gao. “We believe that our new nanotechnology approach offers a way to activate STING without some of the limitations we’ve seen with earlier STING agonist drugs in development.”
Reference: “Polycarbonate-based ultra-pH sensitive nanoparticles improve therapeutic window” by Xu Wang, Jonathan Wilhelm, Wei Li, Suxin Li, Zhaohui Wang, Gang Huang, Jian Wang, Houliang Tang, Sina Khorsandi, Zhichen Sun, Bret Evers and Jinming Gao, 17 November 2020, Nature Communications.
Other UTSW researchers who contributed to this study were Suxin Li, Min Luo, Zhaohui Wang, Qiang Feng, Jonathan Wilhelm, Xu Wang, Wei Li, Jian Wang, Agnieszka Cholka, Yang-xin Fu, Baran Sumer, and Hongtao Yu.
This research was supported by funds from the National Institutes of Health (U54 CA244719) and Mendelson-Young Endowment for Cancer Therapeutics.
Gao holds the Elaine Dewey Sammons Distinguished Chair in Cancer Research, in honor of Eugene P. Frenkel, M.D. at UTSW.
Molecular iodine, a major emission from the ocean, can quickly convert to iodine oxoacids even under weak daylight conditions. These oxoacids rapidly lead to aerosol particles that significantly affect climate and human health.
Iodine-containing vapors that are emitted from oceans are a major source of aerosol particles. “Despite their importance to the climate, the formation of marine particles has been poorly understood,” says Siddharth Iyer, Postdoctoral Researcher in Aerosol Physics Laboratory at Tampere University.
In this research, the formation of aerosol particles from iodine-containing vapours under marine boundary layer conditions was studied. The experiments were carried out in the ultra-clean CLOUD chamber at CERN, where the nucleation and growth rates, as well as the composition of freshly formed particles from iodine oxoacids (iodic acid and iodous acid), were measured.
These vapours derive from the photolysis and oxidation of molecular iodine, for which the ocean surface is a major source. The conversion to iodine oxoacids was found to be extremely fast, even under weak daylight conditions. Although iodic acid was identified as the key vapour, a related species – iodous acid – was also found to play an important stabilizing role in the initial steps of neutral (uncharged) particle formation.
“Sulfuric acid is known to be a key player in new particle formation, but our results indicate that iodine oxoacid particle formation can compete with sulfuric acid in pristine regions of the atmosphere. This significantly advances our understanding of aerosol formation,” Iyer sums up.
The article “Role of iodine oxoacids in atmospheric aerosol nucleation” was published in Science.
Reference: “Role of iodine oxoacids in atmospheric aerosol nucleation” by Xu-Cheng He, Yee Jun Tham, Lubna Dada, Mingyi Wang, Henning Finkenzeller, Dominik Stolzenburg, Siddharth Iyer, Mario Simon, Andreas Kürten, Jiali Shen, Birte Rörup, Matti Rissanen, Siegfried Schobesberger, Rima Baalbaki, Dongyu S. Wang, Theodore K. Koenig, Tuija Jokinen, Nina Sarnela, Lisa J. Beck, João Almeida, Stavros Amanatidis, António Amorim, Farnoush Ataei, Andrea Baccarini, Barbara Bertozzi, Federico Bianchi, Sophia Brilke, Lucía Caudillo, Dexian Chen, Randall Chiu, Biwu Chu, António Dias, Aijun Ding, Josef Dommen, Jonathan Duplissy, Imad El Haddad, Loïc Gonzalez Carracedo, Manuel Granzin, Armin Hansel, Martin Heinritzi, Victoria Hofbauer, Heikki Junninen, Juha Kangasluoma, Deniz Kemppainen, Changhyuk Kim, Weimeng Kong, Jordan E. Krechmer, Aleksander Kvashin, Totti Laitinen, Houssni Lamkaddam, Chuan Ping Lee, Katrianne Lehtipalo, Markus Leiminger, Zijun Li, Vladimir Makhmutov, Hanna E. Manninen, Guillaume Marie, Ruby Marten, Serge Mathot, Roy L. Mauldin, Bernhard Mentler, Ottmar Möhler, Tatjana Müller, Wei Nie, Antti Onnela, Tuukka Petäjä, Joschka Pfeifer, Maxim Philippov, Ananth Ranjithkumar, Alfonso Saiz-Lopez, Imre Salma, Wiebke Scholz, Simone Schuchmann, Benjamin Schulze, Gerhard Steiner, Yuri Stozhkov, Christian Tauber, António Tomé, Roseline C. Thakur, Olli Väisänen, Miguel Vazquez-Pufleau, Andrea C. Wagner, Yonghong Wang, Stefan K. Weber, Paul M. Winkler, Yusheng Wu, Mao Xiao, Chao Yan, Qing Ye, Arttu Ylisirniö, Marcel Zauner-Wieczorek, Qiaozhi Zha, Putian Zhou, Richard C. Flagan, Joachim Curtius, Urs Baltensperger, Markku Kulmala, Veli-Matti Kerminen, Theo Kurtén, Neil M. Donahue, Rainer Volkamer, Jasper Kirkby, Douglas R. Worsnop and Mikko Sipilä, 5 February 2021, Science.
Bd’s actin structures likely play roles in causing a skin disease threatening amphibians worldwide.
Researchers at the University of Massachusetts Amherst have gained new insight into the biological processes of a chytrid fungus responsible for a deadly skin infection devastating frog populations worldwide.
Led by cell biologist Lillian Fritz-Laylin, the team describes in a paper published today (February 8, 2021) in Current Biology how the actin networks of Batrachochytrium dendrobatidis (Bd) also serve as an “evolutionary Rosetta Stone,” revealing the loss of cytoskeletal complexity in the fungal kingdom.
“Fungi and animals seem so different, but they are actually pretty closely related,” says Fritz-Laylin, whose lab studies how cells move, which is a central activity in the progression and prevention of many human diseases. “This project, the work of Sarah Prostak in my lab, shows that during early fungal evolution, fungi probably had cells that looked something like our cells, and which could crawl around like our cells do.”
Chytrids, including Bd, encompass more than 1,000 species of fungi deep on the phylogenetic, or evolutionary, tree. The researchers used chytrids, which share features of animal cells that have been lost in yeast and other fungi, to explore the evolution of the actin cytoskeleton, which helps cells keep their shape and organization and carry out movement, division, and other crucial functions.
Prostak, a research associate in Fritz-Laylin’s lab, is the lead author of the paper, which she initially wrote as her undergraduate honors biology thesis, then expanded and finished after graduation. Other authors are Margaret Titus, professor of genetics, cell biology and development at the University of Minnesota, and Kristyn Robinson, a UMass Amherst Ph.D. candidate in Fritz-Laylin’s lab.
“Bd is more closely related to animal cells than more typically studied fungi so it can tell us a lot about the animal lineage and the fungal lineage and can also provide a lot of insight into human actin networks,” Prostak says. “We can use it to study animal-like regulation in a similar system rather than actually studying it in animal cells, which is very complicated because animal cells have so many actin regulators.”
The research team used a combination of genomics and fluorescence microscopy to show that chytrids’ actin cytoskeleton has features of both animal cells and yeast. “How these complex actin regulatory networks evolved and diversified remain key questions in both evolutionary and cell biology,” the paper states.
The biologists explored the two developmental stages in Bd’s life cycle. In the first stage, Bd zoospores swim with a flagellum and build actin structures similar to those of animal cells, including pseudopods that propel the organisms forward. In the reproductive stage, Bd sporangia assemble actin shells, as well as actin patches, which are similar to those of yeast.
The disease chytridiomycosis, caused by Bd, ravages the skin of frogs, toads, and other amphibians, eventually disrupting fluid regulation and leading to heart failure. Huge losses of biodiversity have been attributed to this disease, including dozens of presumed population declines and extinctions over the past 50 years, though exactly how many species have been affected is subject to debate.
The UMass Amherst biologists say Bd’s actin structures they observed likely play important roles in causing the disease. “This model suggests that actin networks underlie the motility and rapid growth that are key to the pathology and pathogenicity of Bd,” the paper concludes.
Prostak, an animal lover drawn to Fritz-Laylin’s lab because of its focus on pathogens, hopes their research advancing the knowledge about Bd will lead to measures that slow the deadly damage of chytridiomycosis.
“Figuring out the basic biology of Bd will hopefully give insight into disease mitigation in the future,” Prostak says.
Reference: 8 February 2021, Current Biology.
Funding: Pew Charitable Trust, National Institutes of Health
Why businesses need the climate-equivalent of a ‘weather service.’
New findings published in the prestigious journal Nature Climate Change call on businesses, the financial services industry, and regulators to work more closely with climate scientists.
Regulators and governments — both domestic and international — are increasingly requiring that businesses assess and disclose their vulnerability to the physical effects of climate change, for example, increased drought, bushfires, and sea level rise.
“People are making strategically material decisions on a daily basis, and raising debt or capital to finance these, but the decisions may not have properly considered climate risk,” said lead author Dr. Tanya Fiedler from the University of Sydney Business School.
“To assess the physical risks of climate change, businesses are referencing climate models, which are publicly available but complex. The problem arises when this information is used for the purpose of assessing financial risk, because the methodologies of those undertaking the risk assessment can be ‘black boxed’ and in some instances are commercial in confidence. This means the market is unable to form a view.”
Co-author on the paper, Professor Andy Pitman from the University of New South Wales, said: “Businesses want to know which of their assets and operations are at risk of flooding, cyclones or wind damage and when, but providing that information using existing global climate models is a struggle. There is, of course, very useful information available from climate models, but using it in assessing business risk requires a bespoke approach and a deep engagement between business and climate modelers.”
Professor Pitman, Director of the ARC Centre of Excellence for Climate Extremes, added: “A whole host of issues can trip up the unwary, ranging from the type of model, how it was set up, how changes in greenhouse gases were represented, what time period is being considered, and how ‘independent’ of each other the different models truly are.”
To address the gap between science and business, a paradigm shift is needed.
Professor Christian Jakob from Monash University, another co-author of the study, said: “Climate modeling needs to be elevated from a largely research-focused activity to a level akin to that of operational weather forecasting — a level that is of tangible and practical value to business.”
Without such an approach, the paper highlights some of the unintended consequences arising from climate information being used inappropriately.
“As with any form of decision-making, businesses could be operating under a false sense of security that arises when non-experts draw conclusions believed to be defensible, when they are not,” Dr. Fiedler, an expert at the University of Sydney’s Discipline of Accounting, said.
“Our study proposes a new approach with deep engagement between governments, business and science to create information that is fit for purpose. Until this happens, your best bet is to go to the source — the climate modelers themselves.”
Reference: 8 February 2021, Nature Climate Change.
In January 2021, the sky over Sweden delivered an early valentine to people on the ground in the form of a heart-shaped hole in the clouds. Unfortunately, the sky appeared heartless to NASA satellites looking down from above. But other cases of the unusual atmospheric display—so-called “fallstreak holes”—dotted the sky that month over the southern United States.
The natural-color (above) and false-color (below) images on this page, acquired on January 29, 2021, show fallstreak holes west of Atlanta, Georgia. The natural-color image further below shows a similar scene on January 7, 2021, northwest of Miami, Florida. The locations are more than 500 miles apart, but the physics behind the phenomenon is the same. All of the images were acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite.
Fallstreak holes, also known as hole-punch clouds, are the result of cold air temperatures and atmospheric instability. Viewed from below, it can appear as if part of the cloud is falling out of the sky. As it turns out, that is actually what’s happening.
The phenomenon occurs in mid-level clouds composed of liquid water droplets that are super-cooled; that is, the droplets remain liquid even when temperatures are below the typical freezing point of water (32°F, or 0°C). But even super-cooled droplets have their limits. The additional cooling that occurs over the wings of aircraft, for example, can push the droplets to the point of freezing as an airplane passes through the cloud layer. Ice crystals beget more ice crystals as the liquid droplets continue to freeze. They eventually grow heavy enough that the ice crystals fall out of the sky, leaving behind a void in the cloud layer.
The falling ice crystals are often visible in the center of the holes. They are especially apparent in the false-color image at the top of this page, which uses a combination of infrared and visible light (MODIS bands 7-2-1) to distinguish between water and ice. In this view, ice clouds (blue) appear centered within the voids in the water clouds (white).
Both ascending and descending aircraft are common triggers for fallstreak holes and their longer, skinnier cousins, canal clouds. It is no coincidence that the holes are located near busy airport hubs.
NASA Earth Observatory images by Joshua Stevens, using MODIS data from NASA EOSDIS LANCE and GIBS/Worldview.
In order to study a wide range of astronomical phenomena, the VLA has several shapes or configurations, each with its own advantages.
When the Very Large Array was completed forty years ago, it was a different kind of radio telescope. Rather than having a single antenna dish, the VLA has 27. The data these antennas gather is combined in such a way that they act as a single radio telescope. As a radio array, the virtual dish of the VLA can cover an area roughly the size of Disney World. But the VLA can also do something ordinary telescopes can’t do: it can change shape.
The antennas of the VLA are arranged along three long arms, each with nine antennas. Each arm has a rail track, allowing the antennas to be moved to different locations along the arm by a 200-ton transporter. Thus, the antennas can be spaced widely apart or clustered close together. Although each antenna can be moved individually, they are typically positioned in standard arrangements, or configurations. In many ways, each configuration is its own radio telescope. By moving antennas into these different configurations, the VLA can serve as many observatories rolled into one.
The power of a telescope largely depends upon two factors: the faintness of the light it can detect, known as its sensitivity, and the sharpness of the images it can produce, known as its resolution. These two factors are often in tension. To capture faint objects, a telescope needs to collect lots of light over a long time, but this can make images blurry. To capture a sharp image, you often need a brighter source. It is similar to the way our own eyes adapt to brightness: it’s one of the reasons you can see clearly in bright daylight, while things can look blurrier in dim light. By arranging its antennas into different configurations, the VLA can overcome this challenge, capturing both sharp images and faint objects depending on the needs of astronomers.
There are four primary configurations used by the VLA, each assigned a letter from A to D depending on the spread of the antennas. Configuration A, spanning more than 22 miles, is where the antennas are most widely spaced; Configuration D is where they are closest together, with the antennas clustered into an area less than a mile wide. The VLA cycles through these configurations, staying in each one for several months.
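The resolution half of this trade-off follows from basic interferometry: an array's angular resolution is roughly θ ≈ λ / B, where λ is the observing wavelength and B is the longest baseline. A back-of-the-envelope comparison of the A and D configurations, using the approximate spans quoted above and an assumed 6 cm (C-band) observing wavelength:

```python
import math

# Rough interferometer resolution: theta ~ lambda / B (radians).
# Baselines approximate the configuration spans quoted above; the 6 cm
# wavelength is an assumed example, not a value from the article.
RAD_TO_ARCSEC = (180.0 / math.pi) * 3600.0

wavelength_m = 0.06             # 6 cm, C band (assumed)
baselines_m = {"A": 36_000.0,   # ~22 miles ~ 36 km
               "D": 1_000.0}    # under ~1 mile ~ 1 km

for config, baseline in baselines_m.items():
    theta = (wavelength_m / baseline) * RAD_TO_ARCSEC
    print(f"Configuration {config}: ~{theta:.2f} arcsec resolution")
# A resolves ~36x finer detail than D; D's compact layout instead
# maximizes sensitivity to faint, extended emission.
```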
The largest configuration gives the VLA its highest resolution. Radio astronomers often want to see fine details in a radio image, which is why Configuration A is the most requested. But smaller configurations have their own uses. Configuration D gives the VLA the greatest sensitivity. This makes it particularly useful in the study of diffuse hydrogen gas in nearby galaxies, and in capturing images of faint radio nebulae.
Configuration B is a workhorse configuration. It is a third the width of Configuration A and therefore strikes a balance between sensitivity and resolution. It is mostly used for the VLA Sky Survey (VLASS), which is a 7-year project to map 80% of the sky in radio light. When it is finished it will have a catalog of more than 10 million radio sources. VLASS also uses an additional hybrid configuration known as BnA. In this arrangement, the antennas in the north arm are arranged in A configuration, while antennas in the other two arms are kept in B Configuration. This gives the virtual dish of the VLA an oval shape.
Configuration BnA is used to see the southernmost region of the sky. Objects in the far south of the sky are near the horizon, and their light comes in at a low angle. By stretching the northern arm, the VLA can “circularize” the images gathered so they aren’t distorted by their low angle.
If you happen to visit the VLA in the future, you may find the antennas scattered near the horizon, or huddled close to the visitor center. If you visit at another time, you will likely see the antennas in a different configuration. All because the VLA shape-shifts to see the universe in wondrous new ways.