Scientific Frontline

Thursday, September 2, 2021

A Black Hole Triggers a Premature Supernova

 

Dillon Dong, with a 27-meter radio dish at Caltech's Owens Valley Radio Observatory in the background.

In 2017, a particularly luminous and unusual source of radio waves was discovered in data taken by the Very Large Array (VLA) Sky Survey, a project that scans the night sky in radio wavelengths. Now, led by Caltech graduate student Dillon Dong (MS '18), a team of astronomers has established that the bright radio flare was caused by a black hole or neutron star crashing into its companion star in a never-before-seen process.

"Massive stars usually explode as supernovae when they run out of nuclear fuel," says Gregg Hallinan, professor of astronomy at Caltech. "But in this case, an invading black hole or neutron star has prematurely triggered its companion star to explode." This is the first time a merger-triggered supernova has ever been confirmed.

Bright Flares in the Night Sky

Hallinan and his team look for so-called radio transients—short-lived sources of radio waves that flare
brightly and burn out quickly like a match lit in a dark room. Radio transients are an excellent way to identify unusual astronomical events, such as massive stars that explode and blast out energetic jets or the mergers of neutron stars.

As Dong sifted through the VLA's massive dataset, he singled out an extremely luminous source of radio waves from the VLA survey called VT 1210+4956. This source is tied for the brightest radio transient ever associated with a supernova.

Dong determined that the source of the bright radio energy was originally a star surrounded by a thick, dense shell of gas. This shell had been cast off the star a few hundred years earlier. VT 1210+4956, the radio transient, occurred when the star finally exploded in a supernova and the material ejected by the explosion slammed into the gas shell. Yet the gas shell itself, and the timescale on which it had been cast off, were unusual, so Dong suspected that there might be more to the story of this explosion.

Two Unusual Events

Following Dong's discovery, Caltech graduate student Anna Ho (PhD '20) suggested that this radio transient be compared with a different catalog of brief bright events in the X-ray spectrum. Some of these X-ray events were so short-lived that they were only present in the sky for a few seconds of Earth time. By examining this other catalog, Dong discovered a source of X-rays that originated from the same spot in the sky as VT 1210+4956. Through careful analysis, Dong established that the X-rays and the radio waves were likely coming from the same event.


"The X-ray transient was an unusual event—it signaled that a relativistic jet was launched at the time of the explosion," says Dong. "And the luminous radio glow indicated that the material from that explosion later crashed into a massive torus of dense gas that had been ejected from the star centuries earlier. These two events have never been associated with each other, and on their own they're very rare."

A Mystery Solved

So, what happened? After careful modeling, the team determined the most likely explanation—an event that involved some of the same cosmic players that are known to generate gravitational waves.

They speculated that a leftover compact remnant of a star that had previously exploded—that is, a black hole or a neutron star—had been closely orbiting around a star. Over time, the black hole had begun siphoning away the atmosphere of its companion star and ejecting it into space, forming the torus of gas. This process dragged the two objects ever closer until the black hole plunged into the star, causing the star to collapse and explode as a supernova.

The X-rays were produced by a jet launched from the core of the star at the moment of its collapse. The radio waves, by contrast, were produced years later, as material from the explosion reached the torus of gas that had been ejected by the inspiraling compact object.

Astronomers know that a massive star and a companion compact object can form what is called a stable orbit, in which the two bodies gradually spiral closer and closer over an extremely long period of time. This process forms a binary system that is stable for millions to billions of years but that will eventually collide and emit the kind of gravitational waves that were discovered by LIGO in 2015 and 2017.
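For context, the standard quadrupole-radiation result for a circular binary (Peters 1964), which is textbook gravitational-wave physics rather than anything from this study, shows why such binaries can remain stable for so long:

$$ t_{\mathrm{merge}} = \frac{5}{256}\,\frac{c^5 a^4}{G^3\, m_1 m_2\,(m_1 + m_2)} $$

Because the merger time grows as the fourth power of the orbital separation $a$, a wide binary takes millions to billions of years to spiral together, while a close pair merges almost immediately.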

However, in the case of VT 1210+4956, the two objects instead collided immediately and catastrophically, producing the observed blasts of X-rays and radio waves. Although collisions like this have been predicted theoretically, VT 1210+4956 provides the first concrete evidence that they happen.

Serendipitous Surveying

The VLA Sky Survey produces enormous amounts of data about radio signals from the night sky, but sifting through that data to discover a bright and interesting event such as VT 1210+4956 is like finding a needle in a haystack. Finding this particular needle, Dong says, was, in a way, serendipitous.

"We had ideas of what we might find in the VLA survey, but we were open to the possibility of finding things we didn't expect," explains Dong. "We created the conditions to discover something interesting by conducting loosely constrained, open-minded searches of large data sets and then taking into account all of the contextual clues we could assemble about the objects that we found. During this process you find yourself pulled in different directions by different explanations, and you simply let nature tell you what's out there."

The paper is titled "A transient radio source consistent with a merger-triggered core collapse supernova." Dillon Dong is the first author. In addition to Hallinan and Ho, co-authors are Ehud Nakar, Andrew Hughes, Kenta Hotokezaka, Steve Myers (PhD '90), Kishalay De (MS '18, PhD '21), Kunal Mooley (PhD '15), Vikram Ravi, Assaf Horesh, Mansi Kasliwal (MS '07, PhD '11), and Shri Kulkarni. Funding was provided by the National Science Foundation, the United States–Israel Binational Science Foundation, the I-Core Program of the Planning and Budgeting Committee and the Israel Science Foundation, Canada's Natural Sciences and Engineering Research Council, the Miller Institute for Basic Research in Science at UC Berkeley, the Japan Society for the Promotion of Science Early-Career Scientists Program, the National Radio Astronomy Observatory, and the Heising-Simons Foundation.

A paper about the findings will appear in the journal Science on September 3.

Source/Credit: California Institute of Technology / Lori Dajose

sn090221_02

Z Turns Twenty-five Years Old

 

An open-shutter photo showing electrical energy coursing through the transmission line sections of Sandia National Laboratories’ Z machine. (Photo by Randy Montoya)

Sandia National Laboratories is celebrating 25 years of research conducted at its Z Pulsed Power Facility — a gymnasium-sized accelerator commonly referred to as Z or the Z machine.

Z began with a simple idea — running large pulsed electrical currents through targets at the center of the machine — that has resulted in startling science even after 25 years.

“We have seen continuous innovation over the history of Z, and we still have about another decade of exciting research lined up,” says Dan Sinars, Sandia’s pulsed power sciences director.

The adventure began 25 years ago, said former director Don Cook, when Sandia researchers modified a machine built in 1985 called the Particle Beam Fusion Accelerator. That machine — Z’s ancestor, in a sense — employed a very high voltage and smaller current to make lithium-ion beams for fusion research. The experimental output was powerful, about 15 terawatts, but had hardly increased in a decade of testing.

So, trying a different approach, the machine was restructured to deliver very high currents and lower voltages. Currents 100 times larger than those in a bolt of lightning efficiently vaporized arrays of tiny wires into clouds of ions. Then the powerful magnetic field accompanying the electric current slammed the ions into each other, a process that emitted copious X-rays that could be used for fusion research and other applications.
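A rough order-of-magnitude sketch (my illustration; the 20-megaampere current and the radii are assumed round numbers, not Sandia's specifications) shows why currents on this scale produce the crushing pressures described below:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T·m/A
ATM = 101325.0            # pascals per standard atmosphere

def pinch_pressure(current_amps, radius_m):
    """Field at the edge of a current-carrying column, B = mu0*I/(2*pi*r),
    and the corresponding magnetic pressure, P = B^2/(2*mu0)."""
    b_field = MU0 * current_amps / (2 * math.pi * radius_m)
    pressure = b_field ** 2 / (2 * MU0)
    return b_field, pressure

# Assumed values: a Z-class ~20 MA pulse, before and after the column pinches.
for radius_m in (0.01, 0.001):  # 10 mm and 1 mm
    b, p = pinch_pressure(20e6, radius_m)
    print(f"r = {radius_m * 1000:.0f} mm: B ≈ {b:,.0f} T, P ≈ {p / ATM:,.0f} atm")
```

As the column pinches from 10 mm down to 1 mm, the magnetic pressure climbs from hundreds of thousands to tens of millions of atmospheres, consistent with the "millions of atmospheres" quoted below.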

The new method, attempted first on a smaller Sandia machine called Saturn, immediately increased the output to 40 terawatts, and led to many experiments to improve the number, size, material choice and placement of succeeding arrays.

“Once it was confirmed in experiments in 1996 on a machine temporarily called PBFA II-Z that enormous pressures (millions of atmospheres) and very high temperatures (millions of degrees Celsius) could be produced by z-pinches, we renamed the machine simply Z in 1996. So, 2021 is the 25th anniversary of Z,” said Cook.

Researchers around the world marveled at the huge output increase, which quickly reached more than 200 terawatts, said former Sandia vice president and early Z leader Gerry Yonas. The Z-pinch work — called Z because the operation occurs along the Z axis in three-dimensional graphs — generated data for the U.S. nuclear stockpile.

Z hasn’t yet created fusion ignition, though the effort to increase its fusion output continues. “Achieving nuclear fusion in the lab isn’t for people who give up easily,” said Sandia fellow Keith Matzen, Z director from 2005-2013 and again from 2015-2019, who cautions it will take a bigger version of Z to demonstrate that the fusion energy emitted by the process is equal to the electrical energy stored in the facility, a milestone known as break-even.

Meanwhile, Z researchers have delved into other areas, including determining where life elsewhere in our galaxy may have evolved; investigating the existence of diamonds on Neptune and liquid helium on Saturn; determining the age of white dwarf stars and the behavior of black holes in space; and measuring the amount of water in the universe and its age, said Sinars.

To achieve these unusual capabilities, he said, researchers over decades reimagined work on Z so that the huge magnetic fields naturally accompanying Z’s powerful electrical discharges became instruments in their own right, testing materials by creating pressures exceeding those at Earth’s core or aiding in the effort to create breakeven nuclear fusion by pre-compressing the target fuel environs.

“In the meantime,” said Cook, “Z has become the most energetic source of X-rays for fusion research and for stockpile stewardship on the planet. Its capabilities as a pre-eminent research facility for high energy density sciences are known and appreciated worldwide.”

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

Press Release
Source/Credit: Sandia National Laboratories

tn090221_01

What's Up September 2021

 


Source/Credit: NASA/JPL

Wednesday, September 1, 2021

Physicists find ‘magnon’ origins in 2D magnet

 

Rice University physicists Pengcheng Dai (left) and Lebing Chen have discovered that unusual magnetic features they previously noticed in 2D chromium triiodide arise from topological features. (Photo by Jeff Fitlow/Rice University)

Rice physicists have confirmed the topological origins of magnons, magnetic features they discovered three years ago in a 2D material that could prove useful for encoding information in the spins of electrons.

The discovery, described in a study published online this week in the American Physical Society journal PRX, provides a new understanding of topology-driven spin excitations in materials known as 2D van der Waals magnets. The materials are of growing interest for spintronics, a movement in the solid-state electronics community toward technologies that use electron spins to encode information for computation, storage and communications.

Spin is an intrinsic feature of quantum objects and the spins of electrons play a key role in bringing about magnetism.

Rice physicist Pengcheng Dai, co-corresponding author of the PRX study, said inelastic neutron-scattering experiments on the 2D material chromium triiodide confirmed the topological origin of the spin excitations, called magnons, that his group and others discovered in the material in 2018.

The group’s latest experiments at Oak Ridge National Laboratory’s (ORNL) Spallation Neutron Source showed “spin-orbit coupling induces asymmetric interactions between spins” of electrons in chromium triiodide, Dai said. “As a result, the electron spins feel the magnetic field of moving nuclei differently, and this affects their topological excitations.”
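For orientation, the asymmetric spin interaction that spin-orbit coupling induces is conventionally written as a Dzyaloshinskii-Moriya (DM) term added to ordinary Heisenberg exchange; this is the standard model for topological magnons on a honeycomb lattice, offered here as background rather than as the paper's exact Hamiltonian:

$$ H = -J \sum_{\langle i,j \rangle} \mathbf{S}_i \cdot \mathbf{S}_j \;+\; \sum_{\langle\langle i,j \rangle\rangle} \mathbf{D}_{ij} \cdot \left( \mathbf{S}_i \times \mathbf{S}_j \right) $$

Because the cross product changes sign when the two spins are exchanged, the DM term is antisymmetric; it can open gaps where the magnon bands cross, giving them the topological character probed in the experiments.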

Graduate student Lebing Chen displays chromium triiodide crystals he made in a Rice University laboratory. (Photo by Jeff Fitlow/Rice University)

In van der Waals materials, atomically thin 2D layers are stacked like pages in a book. The atoms within layers are tightly bonded, but the bonds between layers are weak. The materials are useful for exploring unusual electronic and magnetic behaviors. For example, a single 2D sheet of chromium triiodide has the same sort of magnetic order that makes magnetic decals stick to a metal refrigerator. Stacks of three or more 2D layers also have that magnetic order, which physicists call ferromagnetic. But two stacked sheets of chromium triiodide have an opposite order called antiferromagnetic.

That strange behavior led Dai and colleagues to study the material. Rice graduate student Lebing Chen, the lead author of this week’s PRX study and of the 2018 study in the same journal, developed methods for making and aligning sheets of chromium triiodide for experiments at ORNL. By bombarding these samples with neutrons and measuring the resulting spin excitations with neutron time-of-flight spectrometry, Chen, Dai and colleagues can discern unknown features and behaviors of the material.

In their previous study, the researchers showed chromium triiodide makes its own magnetic field thanks to magnons that move so fast they behave as if they are moving without resistance. Dai said the latest study explains why a stack of two 2D layers of chromium triiodide has antiferromagnetic order.

“We found evidence of a stacking-dependent magnetic order in the material,” Dai said. Discovering the origins and key features of the state is important because it could exist in other 2D van der Waals magnets.

Additional co-authors include Bin Gao of Rice, Jae-Ho Chung of Korea University, Matthew Stone, Alexander Kolesnikov, Barry Winn, Ovidiu Garlea and Douglas Abernathy of ORNL, and Mathias Augustin and Elton Santos of the University of Edinburgh.

The research was funded by the National Science Foundation (1700081), the Welch Foundation (C-1839), the National Research Foundation of Korea (2020R1A5A1016518, 2020K1A3A7A09077712), the United Kingdom’s Engineering and Physical Sciences Research Council and the University of Edinburgh, and made use of facilities provided by the United Kingdom’s ARCHER National Supercomputing Service and the Department of Energy’s Office of Science.

News Release
Source/Credit: Rice University / Jade Boyd

phy090121_01

Glacial Ice Cores Reveal 15,000-Year-Old Microbes

 

Extensive glaciation at high altitudes in the Tibetan Plateau. Source: Reurinkjan

Known as the world’s “Third Pole,” the Tibetan Plateau holds a vast amount of Earth’s ice. More than 46,000 glaciers blanket the arid, elevated landscape, which is part of the expansive Hindu Kush Himalaya (HKH) mountain range. These mountains and their icefields collectively hold the largest volume of snow and ice outside the Arctic and Antarctic. Given the ancient and inaccessible depths to which the ice descends, one might easily assume it is sterile and devoid of life. However, a new investigation of Tibetan glacial ice cores reveals quite the opposite: these immense glaciers in fact hold a rich chronological record of frozen, unique microbial life.

Zhi-Ping Zhong is a postdoctoral paleoclimatology researcher at Ohio State University’s Byrd Polar and Climate Research Center, and the lead author of a new publication in the journal Microbiome outlining his team’s investigation of nearly 15,000-year-old microbes in Tibetan ice. Their innovation is in their methodology — it is notoriously difficult to isolate and preserve ancient microbial DNA well enough to resolve individual genomes, while simultaneously avoiding contamination or degradation of the sample. In addition, glacier ice contains very low levels of biomass, making contamination by today’s microbes and viruses an even more imposing risk. Zhong and his team pioneered a new approach that accomplished this difficult task with remarkable precision, permitting them to see right down to the ancient genes.

“We developed clean methods to remove the contaminants on glacier ice core surfaces,” Zhong explained in an interview with GlacierHub. “This helps guarantee we obtain the ‘real’ microbes and viruses that were archived in glacier ice, not contaminants.” The team’s methods involved meticulously shaving and disinfecting the cores down to their innermost ice, isolating relatively uncontaminated material for analysis. They expanded upon previous work by first validating their methods on artificial cores laced with known bacteria, allowing them to measure how much of the mock contaminants remained. With concrete data on the efficacy of their approach, they proceeded to clean and process the actual cores.

The ice cores used in the investigation were drilled by Lonnie Thompson and colleagues in 2015 from the Guliya Ice Cap. Thompson, a renowned paleoclimatologist and professor at Ohio State University since 1991, began building the Byrd Polar and Climate Research Center’s ice core collection several decades ago, alongside Ellen Mosley-Thompson. Zhong emphasizes that glacier ice archives not only past climates and chemical information about Earth’s atmosphere but also entire microbial ecosystems, providing a preserved biological record going back untold thousands of years.

The research team’s meticulous contamination prevention and reduction methods, both outside and inside the lab, revealed groups of bacteria commonly found in glacier ice, such as Janthinobacterium, Polaromonas, and Sphingomonas. Investigation of viral genetic material, however, uncovered entire genetic sequences unique to the study, revealing 28 novel genera. This rate of 88 percent novel genera in the glacier ice is much higher than the rates found by viral analyses of ocean environments (52 percent unique genera) and soils (61 percent unique genera). Such discoveries at exceptional levels of detail are integral to Zhong’s goals for the study. He hopes to understand the mutation rates of microbes over long periods of time by comparing the frozen genomes with those of present-day bacteria and viruses. “These efforts will provide us the possibility of using a sort of molecular clock to help date the ice.”
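As a quick consistency check on those figures (the 28 genera and 88 percent are from the article; the total is inferred):

```python
novel_genera = 28        # viral genera unique to this study (from the article)
novel_fraction = 0.88    # reported share of novel genera (from the article)

# If 28 genera are 88 percent of all genera recovered, the total is about 32.
total_genera = novel_genera / novel_fraction
print(round(total_genera))  # -> 32
```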

The potential applications of Zhong et al.'s methods don’t end on this planet, either. Extremophilic life on Earth (including hardy ice-dwelling bacteria and other microbes) is frequently studied as a potential model for extraterrestrial life on other planets and moons. Numerous bodies in our solar system harbor water ice, albeit in more extreme climatic conditions, leading to the astrobiological assumption that such ice may be sufficient to provide habitable conditions for life. Because the team’s protocol was developed for microbial and viral extraction from high-elevation, cold, and dry environments on Earth, Zhong noted how similar techniques “may one day be applied in the search for life in the Martian polar regions as well as in other icy worlds in our solar system.”

These techniques hold great promise for expanding our understanding of microbial history and evolution, but alongside this field’s emergence comes the existential threat of climate change. A quarter of the Third Pole has melted since 1970, and according to a 2019 IPCC report, two-thirds of its glaciers are predicted to disappear within the next 80 years. These catastrophic trends are global to varying degrees, and with the melt comes the Earth-wide loss of a biological history going back hundreds of thousands of years, unsalvageable as these records transition to meltwater. 

Aware of this threat, the Byrd Polar and Climate Research Center has collected and preserved more than 7,000 meters of ice core sections over its 40 years of glacier ice analysis across the globe. The frozen room at the Byrd Center is a time capsule preserving histories of the world that soon may not be accessible anywhere else. Both the archived ice cores and Zhong’s methods may serve as a foundation for the next generation of researchers, working in a world where the only views of once magnificent and biology-rich glaciers are in shelved cylinders of ice, each four inches across and about a yard long. Scientists have barely begun to read the vast genetic tome that is contained in Earth’s glaciers — these new methods of recovering frozen genomes and preserving threatened ice are now facing a fruitful, fateful race against time.

Source/Credit: Columbia University Climate School / by Daniel Burgess

en090121_01

Using Liquid Metal to Turn Motion into Electricity

 

Photo credit: Veenasri Vallem

Researchers at North Carolina State University have created a soft and stretchable device that converts movement into electricity and can work in wet environments.

“Mechanical energy – such as the kinetic energy of wind, waves, body movement and vibrations from motors – is abundant,” says Michael Dickey, corresponding author of a paper on the work and Camille & Henry Dreyfus Professor of Chemical and Biomolecular Engineering at NC State. “We have created a device that can turn this type of mechanical motion into electricity. And one of its remarkable attributes is that it works perfectly well underwater.”

The heart of the energy harvester is a liquid metal alloy of gallium and indium. The alloy is encased in a hydrogel – a soft, elastic polymer swollen with water.

The water in the hydrogel contains dissolved salts called ions. The ions assemble at the surface of the metal, which can induce charge in the metal. Increasing the area of the metal provides more surface to attract charge. This generates electricity, which is captured by a wire attached to the device.
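A minimal sketch of the variable-area electrical-double-layer idea (my own illustration, not the authors' model): treat the metal-electrolyte interface as a capacitor whose capacitance scales with contact area, so changing the area at roughly constant interfacial potential pushes charge through the external wire. The capacitance-per-area and voltage below are assumed, typical-order values.

```python
C_DL = 0.1   # assumed double-layer capacitance, farads per m^2 (typical order)
V_INT = 0.5  # assumed interfacial potential, volts

def harvested_current(area_start_m2, area_end_m2, duration_s):
    """Average current driven through the external wire when the
    metal/electrolyte contact area changes at roughly constant
    interfacial potential: I = V * dC/dt, with C = C_DL * area."""
    d_capacitance = C_DL * (area_end_m2 - area_start_m2)
    return V_INT * d_capacitance / duration_s

# Example: squishing doubles a 1 cm^2 contact patch over 0.1 seconds.
i_avg = harvested_current(1e-4, 2e-4, 0.1)
print(f"average current ≈ {i_avg * 1e6:.0f} µA")
```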


“Since the device is soft, any mechanical motion can cause it to deform, including squishing, stretching and twisting,” Dickey says. “This makes it versatile for harvesting mechanical energy. For example, the hydrogel is elastic enough to be stretched to five times its original length.”

In experiments, the researchers found that deforming the device by only a few millimeters generates a power density of approximately 0.5 mW/m². This output is comparable to that of several popular classes of energy-harvesting technologies.

“However, other technologies don’t work well, if at all, in wet environments,” Dickey says. “This unique feature may enable applications from biomedical settings to athletic wear to marine environments. Plus, the device is simple to make.

“There is a path to increase the power, so we consider the work we described here a proof-of-concept demonstration.”

The researchers already have two related projects under way.

One project is aimed at using the technology to power wearable devices by increasing the harvester’s power output. The second project evaluates how this technology could be used to harvest wave power from the ocean.

The paper, “A Soft Variable-Area Electrical-Double-Layer Energy Harvester,” is published in the journal Advanced Materials. First author of the paper is Veenasri Vallem, a Ph.D. student at NC State. Co-authors include Erin Roosa and Tyler Ledinh, who were undergrads at NC State when the work was done; Sahar Rashid-Nadimi and Abolfazl Kiani, who were visiting scholars at NC State and are now at California State University, Bakersfield; and Woojin Jung and Tae-il Kim of Sungkyunkwan University in South Korea, who worked on the project while visiting NC State.

The work was done with support from NC State’s ASSIST Center, which is funded by the National Science Foundation under grant EEC-1160483. Additional support came from the Coastal Studies Institute of North Carolina and the Fostering Global Talents for Innovative Growth Program supervised by the Korea Institute for Advancement of Technology.

Source/Credit: North Carolina State University

tn090121_01

Tuesday, August 31, 2021

New biomarkers identified to detect consumption of emerging illicit drug

Professor Eric Chan (middle) from the NUS Department of Pharmacy led the research which was conducted in collaboration with the Health Sciences Authority (HSA).
The research team included Ms Moy Hooi Yan (extreme left), HSA’s Laboratory Director of the Analytical Toxicology Lab - Drug Abuse Testing, and Dr Wang Ziteng (extreme right), Research Fellow at the NUS Department of Pharmacy.

 A team of researchers from the National University of Singapore (NUS) has come up with a new solution to boost the surveillance of designer drug abuse. Led by Professor Eric Chan from the NUS Department of Pharmacy, the team has identified three new urinary biomarkers that could be used to detect consumption of ADB-BUTINACA, an emerging synthetic cannabinoid which is a type of new psychoactive substance (NPS). The innovative approach used to identify the biomarkers can be applied to other existing and new synthetic cannabinoids.

NPS are drugs designed to mimic the effects of illegal substances such as cannabis, cocaine, heroin, ‘Ice’, Ecstasy and LSD. Clandestine laboratories introduce synthetic cannabinoids with different chemical structures in an attempt to circumvent legislative bans.

Over the past two years, users of NPS made up the third largest proportion of drug abusers in Singapore, while synthetic cannabinoids have dominated Singapore’s NPS market for the past four years. As most synthetic cannabinoids are extensively metabolized in the body after consumption, they become virtually undetectable in urine samples.

Commenting on the significance of the team’s research, Prof Chan said, “Prior to our study, the metabolism and urinary biomarkers of ADB-BUTINACA were unclear. Our discovery and unique methodology offer assistance to the forensic fraternity who is constantly being challenged by the emergence of novel synthetic cannabinoids, and can also bring benefits to the international public communities to tackle the increasing abuse of this synthetic cannabinoid. This will bring us closer to the goal of having a drug-free world.”

The study, which was carried out in collaboration with the Analytical Toxicology Laboratory of Singapore’s Health Sciences Authority, was first published in the journal Clinical Chemistry on 13 August 2021.

New biomarkers for accurate detection of synthetic drug abuse

ADB-BUTINACA is a new synthetic cannabinoid that was first identified in Europe in 2019, and it entered Singapore’s drug scene last year. Although three existing metabolites of ADB-BUTINACA are available as reference standards for routine forensic monitoring, they have been found to be absent or detected at lower concentrations in some urine samples of abusers. This created an impetus to identify other potential metabolites for use as urinary biomarkers for the cannabinoid’s consumption.

Instead of using the conventional and more time-consuming method of chemically synthesizing metabolites of ADB-BUTINACA, Prof Chan and his team introduced an innovative method to identify the cannabinoid’s unique metabolites using the concepts of drug metabolism and pharmacokinetics.

The team synthesized key metabolites of ADB-BUTINACA using human liver enzymes in the laboratory to investigate their disposition and identify novel biomarker metabolites in urine. From these studies, a total of 15 metabolites of ADB-BUTINACA and their respective pathways of biotransformation in the body were identified for the first time.

Of the 15 new metabolites, the researchers proposed four as urinary biomarkers on account of their metabolic stability, including one whose reference standard is currently available. A panel comprising one or a combination of these four newly established urinary biomarkers was developed for diagnosing consumption of ADB-BUTINACA.

Moving forward, the team plans to extend their current strategy to better understand the disposition of novel metabolites of synthetic cannabinoids by kidneys and their eventual occurrence in urine.

Source/Credit: National University of Singapore

med083121_02

Sandia uncovers hidden factors that affect solar farms during severe weather

Sandia National Laboratories researchers Thushara Gunda, front, and Nicole Jackson examine solar panels at Sandia’s Photovoltaic Systems Evaluation Laboratory as summer monsoon clouds roll by. Using machine learning and data from solar farms across the U.S., they found that the age of a solar farm and the amount of cloud cover have pronounced effects on performance during severe weather.
(Photo by Randy Montoya)

Sandia National Laboratories researchers combined large sets of real-world solar data and advanced machine learning to study the impacts of severe weather on U.S. solar farms and to sort out which factors affect energy generation. Their results were published earlier this month in the scientific journal Applied Energy.

Hurricanes, blizzards, hailstorms and wildfires all pose risks to solar farms both directly in the form of costly damage and indirectly in the form of blocked sunlight and reduced electricity output. Two Sandia researchers scoured maintenance tickets from more than 800 solar farms in 24 states and combined that information with electricity generation data and weather records to assess the effects of severe weather on the facilities. By identifying the factors that contribute to low performance, they hope to increase the resiliency of solar farms to extreme weather.

“Trying to understand how future climate conditions could impact our national energy infrastructure is exactly what we need to be doing if we want our renewable energy sector to be resilient under a changing climate,” said Thushara Gunda, the senior researcher on the project. “Right now, we’re focused on extreme weather events, but eventually we’ll extend into chronic exposure events like consistent extreme heat.”

Hurricanes and snow and storms, oh my!

The Sandia research team first used natural-language processing, a type of machine learning used by smart assistants, to analyze six years of solar maintenance records for key weather-related words. The analysis methods they used for this study have since been published and are freely available for other photovoltaic researchers and operators.
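The published pipeline is more sophisticated, but a minimal sketch of the keyword-screening step might look like this (the ticket text and keyword list are invented for illustration):

```python
import re
from collections import Counter

# Invented examples; the real study parsed six years of maintenance tickets.
tickets = [
    "Inverter offline after hurricane flooding blocked site access",
    "Snow accumulation on array 4, production reduced",
    "Routine inspection, no issues found",
]

WEATHER_TERMS = {"hurricane", "snow", "storm", "lightning", "wind", "hail"}

def weather_mentions(texts):
    """Count occurrences of weather-related keywords across tickets."""
    counts = Counter()
    for text in texts:
        for token in re.findall(r"[a-z]+", text.lower()):
            if token in WEATHER_TERMS:
                counts[token] += 1
    return counts

print(weather_mentions(tickets))  # Counter({'hurricane': 1, 'snow': 1})
```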

“Our first step was to look at the maintenance records to decide which weather events we should even look at,” said Gunda. “The photovoltaic community talks about hail a lot, but the data in the maintenance records tell a different story.”

While hailstorms tend to be very costly, they did not appear in solar farm maintenance records, likely because operators tend to document hail damage in the form of insurance claims, Gunda said. Instead, she found that hurricanes were mentioned in almost 15% of weather-related maintenance records, followed by other weather terms such as snow, storm, lightning and wind.

“Some hurricanes damage racking — the structure that holds up the panels — due to the high winds,” said Nicole Jackson, the lead author on the paper. “The other major issue we’ve seen from the maintenance records and talking with our industry partners is flooding blocking access to the site, which delays the process of turning the plant back on.”

Using machine learning to find the most important factors

Next, they combined more than two years of real-world electricity production data from more than 100 solar farms in 16 states with historical weather data to assess the effects of severe weather on solar farms. They used statistics to find that snowstorms had the highest effect on electricity production, followed by hurricanes and a general group of other storms.

Then they used a machine learning algorithm to uncover the hidden factors that contributed to low performance from these severe weather events.

“Statistics gives you part of the picture, but machine learning was really helpful in clarifying what are those most important variables,” said Jackson, who primarily conducted statistical analysis and the machine learning portion of the project. “Is it where the site is located? Is it how old the site is? Is it how many maintenance tickets were submitted on the day of the weather event? We ended up with a suite of variables and machine learning was used to home in on the most important ones.”
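As an illustration of that step (the feature names, data and model here are hypothetical; the study's actual variables and methods are described in the paper), a random-forest feature ranking in scikit-learn looks like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-event features for solar farms during storms.
site_age_years = rng.uniform(0, 10, n)
cloud_cover_pct = rng.uniform(0, 100, n)
snowfall_cm = rng.uniform(0, 30, n)
X = np.column_stack([site_age_years, cloud_cover_pct, snowfall_cm])

# Toy performance-loss target, driven mostly by age and cloud cover.
y = 0.5 * site_age_years + 0.03 * cloud_cover_pct + rng.normal(0, 0.5, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, score in zip(["site_age", "cloud_cover", "snowfall"],
                       model.feature_importances_):
    print(f"{name}: {score:.2f}")
```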

She found that across the board, older solar farms were affected the most by severe weather. One possibility for this is that solar farms that had been in operation for more than five years had more wear-and-tear from being exposed to the elements longer, Jackson said.

Gunda agreed, adding, “This work highlights the importance of ongoing maintenance and further research to ensure photovoltaic plants continue to operate as intended.”

For snowstorms, which unexpectedly were the type of storm with the highest effect on electricity production, the next most important variables were low sunlight levels at the location due to cloud cover and the amount of snow, followed by several geographical features of the farm.

For hurricanes — principally hurricanes Florence and Michael — the amount of rainfall and the timing of the nearest hurricane had the next highest effect on production after age. Surprisingly, low wind speeds were significant. This is likely because when high wind speeds are predicted, solar farms are preemptively shut down so that employees can evacuate, leading to no production, Gunda said.

Expanding the approach to wildfires, the grid

As an impartial research institution in this space, Sandia was able to collaborate with multiple industry partners to make this work feasible. “We would not have been able to do this project without those partnerships,” Gunda said.

The research team is working to extend the project to study the effect of wildfires on solar farms. Since wildfires aren’t mentioned in maintenance logs, they were not able to study them for this paper. Operators don’t stop to write a maintenance report when their solar farm is being threatened by a wildfire, Gunda said. “This work highlights the reality of some of the data limitations we have to grapple with when studying extreme weather events.”

“The cool thing about this work is that we were able to develop a comprehensive approach of integrating and analyzing performance data, operations data and weather data,” Jackson said. “We’re extending the approach into wildfires to examine their performance impacts on solar energy generation in greater detail.”

The researchers are currently expanding this work to look at the effects of severe weather on the entire electrical grid, add in more production data, and answer even more questions to help the grid adapt to the changing climate and evolving technologies.

This research was supported by the Department of Energy’s Solar Energy Technologies Office and was conducted in partnership with the National Renewable Energy Laboratory.

Source/Credit: Sandia National Laboratories

tn083121_01

Extreme sea levels to become much more common as Earth warms

 

Extreme sea levels along coastlines across the world will become 100 times more frequent by the end of the century. Image: Pexels

Global warming will cause extreme sea levels to occur almost every year by the end of the century, impacting major coastlines worldwide, according to new research from an international team of scientists.

Published today in Nature Climate Change, the research predicts that because of rising temperatures, extreme sea levels along coastlines across the world will become 100 times more frequent by the end of the century in about half of the 7,283 locations studied.
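A hedged sketch of the underlying arithmetic (my illustration; the study's methodology is far more detailed, and the distribution parameters and sea-level rise below are invented): if annual-maximum sea levels follow a Gumbel distribution, raising mean sea level shifts the whole distribution upward, so the historical 100-year level is exceeded far more often.

```python
import math

def gumbel_cdf(z, mu, beta):
    """Gumbel CDF, a common model for annual-maximum sea levels."""
    return math.exp(-math.exp(-(z - mu) / beta))

mu, beta = 1.0, 0.1  # assumed location/scale of annual maxima, meters
rise = 0.6           # assumed mean sea-level rise, meters

# Historical 100-year return level: exceeded with probability 1/100 each year.
z100 = mu - beta * math.log(-math.log(1 - 1 / 100))

# After the rise, the same absolute level sits much lower in the new
# distribution and is exceeded almost every year.
p_new = 1 - gumbel_cdf(z100 - rise, mu, beta)
print(f"old 100-year level now exceeded in ~{p_new * 100:.0f}% of years "
      f"(~{p_new / 0.01:.0f}x more frequent)")
```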

Co-author of the study, University of Melbourne ocean engineering researcher Dr Ebru Kirezci, said areas where the frequency of extreme sea levels is expected to increase fastest include the Southern Hemisphere and subtropical areas, the Mediterranean Sea and the Arabian Peninsula, the southern half of North America’s Pacific Coast, and areas including Hawaii, the Caribbean, the Philippines and Indonesia.

“What we can also infer from this study is that most of the eastern, southern and southwestern coastlines of Australia will be impacted, with these extreme sea levels occurring at almost an annual frequency by 2100,” Dr Kirezci said.

“This increased frequency of extreme sea levels will occur even with a global temperature increase of 1.5 degrees Celsius. And the changes are likely to come sooner than the end of the century, with many locations experiencing a 100-fold increase in extreme events even by 2070.”

Lead author of the study, climate scientist at the US Department of Energy’s Pacific Northwest National Laboratory, Dr Claudia Tebaldi said it was no surprise that sea level rise will be dramatic even at 1.5 degrees and will have substantial effects on extreme sea level frequencies and magnitude.

“This study gives a more complete picture around the globe. We were able to look at a wider range of warming levels in very fine spatial detail,” Dr Tebaldi said.

The researchers called for more detailed studies to understand how the changes will impact communities within different countries. They added that the physical changes that the study describes will have varying impacts at local scales, depending on several factors, including how vulnerable the site is to rising waters and how prepared a community is for change.

“Public policy makers should take note of these studies and work towards improving coastal protection and mitigation measures. Building dykes and sea walls, retreating from shorelines, and deploying early warning systems are some of the steps which can be taken to adapt to this change,” Dr Kirezci said.

The research was led by the US based Joint Global Change Research Institute in collaboration with researchers from the University of Melbourne, IHE Delft Institute for Water Education in the Netherlands, the European Joint Research Centre in Italy, Princeton University, the University of Illinois, Rutgers University and the University of Bologna.

The study was funded by the US Environmental Protection Agency and the US Department of Energy’s Office of Science.


Source/Credit: University of Melbourne

en083121_01

Vaccine candidates for Ebola

Axel Lehrer in his lab at the John A. Burns School of Medicine.

Researchers at the University of Hawaiʻi at Mānoa John A. Burns School of Medicine (JABSOM) have demonstrated the efficacy in monkeys of multiple vaccine candidates targeting three filoviruses that cause life-threatening infections in humans: Ebola virus, Sudan virus and Marburg virus. The new findings were published in Frontiers in Immunology on August 18.

Associate Professor Axel Lehrer of the Department of Tropical Medicine, Medical Microbiology and Pharmacology leads the JABSOM team, working on this project in collaboration with late-stage biopharmaceutical company Soligenix, Inc., and with the local development partner, Hawaii Biotech, Inc. The team also reported another breakthrough in the journal Vaccine, demonstrating successful thermostabilization of filovirus vaccines in single vials.

“Filoviruses are endemic in areas of the world where the power supply can be uncertain, making a thermostable vaccine particularly valuable,” said Lehrer. “Our work to date has demonstrated not only the feasibility of rapid and efficient manufacturing, but also the applicability of thermostabilization of multiple antigens with the potential for a broadly applicable and easily distributed vaccine.”

Lehrer’s work has focused on creating shelf-stable vaccines that require no refrigeration or freezing, which is key to eradicating viruses in tropical countries, and allows equitable distribution of much needed vaccines to communities around the globe.

According to Lehrer, once developed, such a vaccine may be able to rapidly address emerging outbreaks, such as the Marburg virus infection that appeared in Guinea recently. The collaborators believe that this technology may be an important contribution to National Institute of Allergy and Infectious Diseases Director Anthony Fauci’s proposed idea to develop prototype vaccines against the top 20 viral families that may also cause pandemics.

“Having such a platform available would likely enable broader and faster worldwide vaccination campaigns addressing future health emergencies. In addition, the ability to combine antigens in the formulation also enables generation of potentially broader protective vaccines,” Lehrer said.

COVID-19 vaccine update

Since March 2020, Lehrer has also been working with Soligenix on a promising thermostable COVID-19 vaccine. “While much progress has been made since the initial announcement of our collaborative research, we are actively working on further analysis of the neutralizing potential of the vaccine candidate against a number of virus variants,” he said. The vaccine is being developed using the same thermostable platform that was used for the filovirus vaccines and has demonstrated promising results in mice and non-human primates.

Source/Credit: University of Hawaiʻi

med083121_01
