Scientific Frontline: August 2021

Tuesday, August 31, 2021

New biomarkers identified to detect consumption of emerging illicit drug

Professor Eric Chan (middle) from the NUS Department of Pharmacy led the research which was conducted in collaboration with the Health Sciences Authority (HSA).
The research team included Ms Moy Hooi Yan (extreme left), HSA’s Laboratory Director of the Analytical Toxicology Lab - Drug Abuse Testing, and Dr Wang Ziteng (extreme right), Research Fellow at the NUS Department of Pharmacy.

 A team of researchers from the National University of Singapore (NUS) has come up with a new solution to boost the surveillance of designer drug abuse. Led by Professor Eric Chan from the NUS Department of Pharmacy, the team has identified three new urinary biomarkers that could be used to detect consumption of ADB-BUTINACA, an emerging synthetic cannabinoid which is a type of new psychoactive substance (NPS). The innovative approach used to identify the biomarkers can be applied to other existing and new synthetic cannabinoids.

NPS are drugs designed to mimic the effects of illegal substances such as cannabis, cocaine, heroin, ‘Ice’, Ecstasy and LSD. Clandestine laboratories introduce synthetic cannabinoids with different chemical structures in an attempt to circumvent legislative bans.

Over the past two years, users of NPS made up the third largest proportion of drug abusers in Singapore, while synthetic cannabinoids have dominated Singapore’s NPS market for the past four years. As most synthetic cannabinoids are extensively metabolized in the body after consumption, they become virtually undetectable in urine samples.

Commenting on the significance of the team’s research, Prof Chan said, “Prior to our study, the metabolism and urinary biomarkers of ADB-BUTINACA were unclear. Our discovery and unique methodology offer assistance to the forensic fraternity who is constantly being challenged by the emergence of novel synthetic cannabinoids, and can also bring benefits to the international public communities to tackle the increasing abuse of this synthetic cannabinoid. This will bring us closer to the goal of having a drug-free world.”

The study, which was carried out in collaboration with the Analytical Toxicology Laboratory of Singapore’s Health Sciences Authority, was first published in the journal Clinical Chemistry on 13 August 2021.

New biomarkers for accurate detection of synthetic drug abuse

ADB-BUTINACA is a new synthetic cannabinoid that was first identified in Europe in 2019, and it entered Singapore’s drug scene last year. Although three existing metabolites of ADB-BUTINACA are available as reference standards for routine forensic monitoring, they have been found to be absent or detected at lower concentrations in some urine samples of abusers. This created an impetus to identify other potential metabolites for use as urinary biomarkers for the cannabinoid’s consumption.

Instead of using the conventional and more time-consuming method of chemically synthesizing metabolites of ADB-BUTINACA, Prof Chan and his team introduced an innovative method to identify the cannabinoid’s unique metabolites using the concepts of drug metabolism and pharmacokinetics.

The team synthesized key metabolites of ADB-BUTINACA using human liver enzymes in the laboratory to investigate their disposition and identify novel biomarker metabolites in urine. From these studies, a total of 15 metabolites of ADB-BUTINACA and their respective pathways of biotransformation in the body were identified for the first time using this method.

Of the 15 new metabolites, the researchers proposed four as urinary metabolite biomarkers due to their metabolic stability, including one metabolite whose reference standard is currently available. A panel comprising either one or a combination of these four newly established urinary biomarkers was developed for diagnosing the consumption of ADB-BUTINACA.
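
The article does not describe the assay logic; purely as an illustration of how a one-or-more-of-four biomarker panel could be applied, here is a minimal sketch in which the biomarker names and cutoff values are hypothetical placeholders, not figures from the study.

```python
# Illustrative sketch only: a panel-based screening call for ADB-BUTINACA consumption.
# Biomarker names and cutoffs are hypothetical placeholders, not values from the study.

REPORTING_CUTOFFS_NG_PER_ML = {
    "metabolite_A": 0.5,   # hypothetical cutoff
    "metabolite_B": 0.5,
    "metabolite_C": 1.0,
    "metabolite_D": 1.0,
}

def panel_is_positive(measured_ng_per_ml: dict) -> bool:
    """Return True if any panel biomarker meets or exceeds its reporting cutoff."""
    return any(
        measured_ng_per_ml.get(name, 0.0) >= cutoff
        for name, cutoff in REPORTING_CUTOFFS_NG_PER_ML.items()
    )

# Example: only metabolite_C exceeds its cutoff, so the panel reports positive.
print(panel_is_positive({"metabolite_A": 0.1, "metabolite_C": 2.3}))  # True
```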

Moving forward, the team plans to extend their current strategy to better understand the disposition of novel metabolites of synthetic cannabinoids by kidneys and their eventual occurrence in urine.

Source/Credit: National University of Singapore

med083121_02

Sandia uncovers hidden factors that affect solar farms during severe weather

Sandia National Laboratories researchers Thushara Gunda, front, and Nicole Jackson examine solar panels at Sandia’s Photovoltaic Systems Evaluation Laboratory as summer monsoon clouds roll by. Using machine learning and data from solar farms across the U.S., they found that the age of a solar farm, as well as the amount of cloud cover, has pronounced effects on farm performance during severe weather.
(Photo by Randy Montoya)

 Sandia National Laboratories researchers combined large sets of real-world solar data and advanced machine learning to study the impacts of severe weather on U.S. solar farms, and sort out what factors affect energy generation. Their results were published earlier this month in the scientific journal Applied Energy.

Hurricanes, blizzards, hailstorms and wildfires all pose risks to solar farms both directly in the form of costly damage and indirectly in the form of blocked sunlight and reduced electricity output. Two Sandia researchers scoured maintenance tickets from more than 800 solar farms in 24 states and combined that information with electricity generation data and weather records to assess the effects of severe weather on the facilities. By identifying the factors that contribute to low performance, they hope to increase the resiliency of solar farms to extreme weather.

“Trying to understand how future climate conditions could impact our national energy infrastructure is exactly what we need to be doing if we want our renewable energy sector to be resilient under a changing climate,” said Thushara Gunda, the senior researcher on the project. “Right now, we’re focused on extreme weather events, but eventually we’ll extend into chronic exposure events like consistent extreme heat.”

Hurricanes and snow and storms, oh my!

The Sandia research team first used natural-language processing, a type of machine learning used by smart assistants, to analyze six years of solar maintenance records for key weather-related words. The analysis methods they used for this study have since been published and are freely available for other photovoltaic researchers and operators.
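
The paper’s exact text-processing pipeline is not given here; as a minimal sketch of the kind of keyword screen described, the following counts how often weather-related terms appear in maintenance ticket text. The term list and ticket examples are assumptions for illustration, not the published method.

```python
# Minimal sketch of screening maintenance tickets for weather-related terms.
# The term list and ticket format are illustrative assumptions, not the published method.
from collections import Counter
import re

WEATHER_TERMS = ["hurricane", "snow", "storm", "lightning", "wind", "hail", "flood"]

def count_weather_mentions(tickets):
    """Count how many tickets mention each weather term at least once."""
    counts = Counter()
    for text in tickets:
        lowered = text.lower()
        for term in WEATHER_TERMS:
            if re.search(rf"\b{term}\w*", lowered):  # match 'storm', 'storms', etc.
                counts[term] += 1
    return counts

tickets = [
    "Racking damaged by hurricane winds, site flooded",
    "Inverter fault after lightning strike",
    "Snow cover on array, production reduced",
]
counts = count_weather_mentions(tickets)
total = len(tickets)
for term, n in counts.most_common():
    print(f"{term}: {n}/{total} tickets ({100 * n / total:.0f}%)")
```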

“Our first step was to look at the maintenance records to decide which weather events we should even look at,” said Gunda. “The photovoltaic community talks about hail a lot, but the data in the maintenance records tell a different story.”

While hailstorms tend to be very costly, they did not appear in solar farm maintenance records, likely because operators tend to document hail damage in the form of insurance claims, Gunda said. Instead, she found that hurricanes were mentioned in almost 15% of weather-related maintenance records, followed by other weather terms such as snow, storm, lightning and wind.

“Some hurricanes damage racking — the structure that holds up the panels — due to the high winds,” said Nicole Jackson, the lead author on the paper. “The other major issue we’ve seen from the maintenance records and talking with our industry partners is flooding blocking access to the site, which delays the process of turning the plant back on.”

Using machine learning to find the most important factors

Next, they combined more than two years of real-world electricity production data from more than 100 solar farms in 16 states with historical weather data to assess the effects of severe weather on solar farms. They used statistics to find that snowstorms had the highest effect on electricity production, followed by hurricanes and a general group of other storms.

Then they used a machine learning algorithm to uncover the hidden factors that contributed to low performance from these severe weather events.

“Statistics gives you part of the picture, but machine learning was really helpful in clarifying what are those most important variables,” said Jackson, who primarily conducted statistical analysis and the machine learning portion of the project. “Is it where the site is located? Is it how old the site is? Is it how many maintenance tickets were submitted on the day of the weather event? We ended up with a suite of variables and machine learning was used to home in on the most important ones.”
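
The article does not name the algorithm; a common way to rank candidate variables, and a plausible stand-in for the analysis described, is a tree-ensemble feature-importance ranking. The sketch below uses invented feature names and synthetic data purely for illustration.

```python
# Hedged sketch: ranking candidate variables by importance with a random forest.
# Feature names and synthetic data are illustrative; the paper's actual method may differ.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
features = ["site_age_years", "cloud_cover_pct", "snowfall_cm", "latitude", "tickets_on_event_day"]
X = rng.random((n, len(features)))

# Synthetic target: performance loss driven mostly by site age and cloud cover.
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * rng.random(n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(features, model.feature_importances_), key=lambda p: -p[1]):
    print(f"{name}: {score:.2f}")
```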

She found that across the board, older solar farms were affected the most by severe weather. One possibility for this is that solar farms that had been in operation for more than five years had more wear-and-tear from being exposed to the elements longer, Jackson said.

Gunda agreed, adding, “This work highlights the importance of ongoing maintenance and further research to ensure photovoltaic plants continue to operate as intended.”

For snowstorms, which unexpectedly were the type of storm with the highest effect on electricity production, the next most important variables were low sunlight levels at the location due to cloud cover and the amount of snow, followed by several geographical features of the farm.

For hurricanes — principally hurricanes Florence and Michael — the amount of rainfall and the timing of the nearest hurricane had the next highest effect on production after age. Surprisingly, low wind speeds were significant. This is likely because when high wind speeds are predicted, solar farms are preemptively shut down so that employees can evacuate, leading to no production, Gunda said.

Expanding the approach to wildfires, the grid

As an impartial research institution in this space, Sandia was able to collaborate with multiple industry partners to make this work feasible. “We would not have been able to do this project without those partnerships,” Gunda said.

The research team is working to extend the project to study the effect of wildfires on solar farms. Since wildfires aren’t mentioned in maintenance logs, they were not able to study them for this paper. Operators don’t stop to write a maintenance report when their solar farm is being threatened by a wildfire, Gunda said. “This work highlights the reality of some of the data limitations we have to grapple with when studying extreme weather events.”

“The cool thing about this work is that we were able to develop a comprehensive approach of integrating and analyzing performance data, operations data and weather data,” Jackson said. “We’re extending the approach into wildfires to examine their performance impacts on solar energy generation in greater detail.”

The researchers are currently expanding this work to look at the effects of severe weather on the entire electrical grid, add in more production data, and answer even more questions to help the grid adapt to the changing climate and evolving technologies.

This research was supported by the Department of Energy’s Solar Energy Technologies Office and was conducted in partnership with the National Renewable Energy Laboratory.

Source/Credit: Sandia National Laboratories

tn083121_01

Extreme sea levels to become much more common as Earth warms

 

Extreme sea levels along coastlines across the world will become 100 times more frequent by the end of the century.
Image: Pexels
Global warming will cause extreme sea levels to occur almost every year by the end of the century, impacting major coastlines worldwide, according to new research from an international team of scientists.

Published today in Nature Climate Change, the research predicts that because of rising temperatures, extreme sea levels along coastlines across the world will become 100 times more frequent by the end of the century in about half of the 7,283 locations studied.

Co-author of the study, University of Melbourne ocean engineering researcher Dr Ebru Kirezci, said areas where the frequency of extreme sea levels is expected to increase fastest include the Southern Hemisphere and subtropical areas, the Mediterranean Sea and the Arabian Peninsula, the southern half of North America’s Pacific Coast, and areas including Hawaii, the Caribbean, the Philippines and Indonesia.

“What we can also infer from this study is that most of the eastern, southern and southwestern coastlines of Australia will be impacted with almost an annual frequency of these extreme sea levels by 2100,” Dr Kirezci said.

“This increased frequency of extreme sea levels will occur even with a global temperature increase of 1.5 degrees Celsius. And the changes are likely to come sooner than the end of the century, with many locations experiencing a 100-fold increase in extreme events even by 2070.”

Lead author of the study, climate scientist at the US Department of Energy’s Pacific Northwest National Laboratory, Dr Claudia Tebaldi said it was no surprise that sea level rise will be dramatic even at 1.5 degrees and will have substantial effects on extreme sea level frequencies and magnitude.

“This study gives a more complete picture around the globe. We were able to look at a wider range of warming levels in very fine spatial detail,” Dr Tebaldi said.

The researchers called for more detailed studies to understand how the changes will impact communities within different countries. They added that the physical changes that the study describes will have varying impacts at local scales, depending on several factors, including how vulnerable the site is to rising waters and how prepared a community is for change.

“Public policy makers should take note of these studies and work towards improving coastal protection and mitigation measures. Building dykes and sea walls, retreating from shorelines, and deploying early warning systems are some of the steps which can be taken to adapt to this change,” Dr Kirezci said.

The research was led by the US based Joint Global Change Research Institute in collaboration with researchers from the University of Melbourne, IHE Delft Institute for Water Education in the Netherlands, the European Joint Research Centre in Italy, Princeton University, the University of Illinois, Rutgers University and the University of Bologna.

The study was funded by the US Environmental Protection Agency and the US Department of Energy’s Office of Science.


Source/Credit: University of Melbourne

en083121_01

Vaccine candidates for Ebola

Axel Lehrer in his lab at the John A. Burns School of Medicine.

Researchers at the University of Hawaiʻi at Mānoa John A. Burns School of Medicine (JABSOM) have demonstrated the efficacy in monkeys of multiple vaccine candidates targeting three filoviruses that cause life-threatening infections in humans: Ebola virus, Sudan virus and Marburg virus. The new findings were published in Frontiers in Immunology on August 18.

Associate Professor Axel Lehrer of the Department of Tropical Medicine, Medical Microbiology and Pharmacology leads the JABSOM team, working in collaboration on this project with late-stage biopharmaceutical company Soligenix, Inc., and with the local development partner, Hawaii Biotech, Inc. The team also reported another breakthrough, demonstrating successful thermostabilization of filovirus vaccines in single vials, in the journal Vaccine.

“Filoviruses are endemic in areas of the world where the power supply can be uncertain, making a thermostable vaccine particularly valuable,” said Lehrer. “Our work to date has demonstrated not only the feasibility of rapid and efficient manufacturing, but also the applicability of thermostabilization of multiple antigens with the potential for a broadly applicable and easily distributed vaccine.”

Lehrer’s work has focused on creating shelf-stable vaccines that require no refrigeration or freezing, which is key to eradicating viruses in tropical countries, and allows equitable distribution of much needed vaccines to communities around the globe.

According to Lehrer, once developed, such a vaccine may be able to rapidly address emerging outbreaks, such as the Marburg virus infection that appeared in Guinea recently. The collaborators believe that this technology may be an important contribution to National Institute of Allergy and Infectious Diseases Director Anthony Fauci’s proposed idea to develop prototype vaccines against the top 20 viral families that may also cause pandemics.

“Having such a platform available would likely enable broader and faster worldwide vaccination campaigns addressing future health emergencies. In addition, the ability to combine antigens in the formulation also enables generation of potentially broader protective vaccines,” Lehrer said.

COVID-19 vaccine update

Since March 2020, Lehrer has also been working with Soligenix on a promising thermostable COVID-19 vaccine. “While much progress has been made since the initial announcement of our collaborative research, we are actively working on further analysis of the neutralizing potential of the vaccine candidate against a number of virus variants,” he said. The vaccine is being developed using the same thermostable platform that was used for the filovirus vaccines and has demonstrated promising results in mice and non-human primates.

Source / Credit: University of Hawaiʻi

med083121_01

Monday, August 30, 2021

Pathways to production

 Biologists at Sandia National Laboratories developed comprehensive software that will help scientists in a variety of industries create engineered chemicals more quickly and easily. Sandia is now looking to license the software for commercial use, researchers said.

Sandia’s stand-alone software RetSynth uses a novel algorithm to sort through large, curated databases of biological and chemical reactions, which could help scientists synthetically engineer compounds used in the production of biofuels, pharmaceuticals, cosmetics, industrial chemicals, dyes, scents and flavors.

A graphic illustration of the kind of retrosynthetic analysis conducted by RetSynth software developed at Sandia National Laboratories. Using a novel algorithm, the software identifies the biological or chemical reactions needed to create a desired biological product or compound.
(Graphic by Laura Hatfield)

The software platform uses retrosynthetic analysis to help scientists identify possible pathways to production — the series of biological and chemical reactions, or steps, needed to engineer and modify the molecules in a cell — to create the desired biological product or compound. By using the software to rapidly analyze all pathways, scientists can determine the production sequence with the fewest steps, the sequences that can be completed with available resources or the most economically viable process.
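
RetSynth’s actual algorithm and database are not detailed here; purely as an illustration of searching a reaction network for the pathway with the fewest steps, the sketch below runs a breadth-first search over a toy set of reactions. The reaction entries are invented and are not RetSynth’s curated data.

```python
# Illustrative sketch of finding the fewest-step pathway to a target compound.
# The reaction database is a toy example, not RetSynth's curated data or algorithm.
from collections import deque

# Each reaction: (set of substrates, product)
REACTIONS = [
    ({"glucose"}, "pyruvate"),
    ({"pyruvate"}, "acetyl-CoA"),
    ({"acetyl-CoA"}, "mevalonate"),
    ({"mevalonate"}, "isoprene"),
    ({"glucose"}, "isoprene"),  # hypothetical engineered shortcut
]

def fewest_step_pathway(start_compounds, target):
    """Breadth-first search over reactions; returns the shortest list of steps to the target."""
    queue = deque([(frozenset(start_compounds), [])])
    seen = {frozenset(start_compounds)}
    while queue:
        available, path = queue.popleft()
        if target in available:
            return path
        for substrates, product in REACTIONS:
            if substrates <= available and product not in available:
                nxt = frozenset(available | {product})
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [f"{' + '.join(substrates)} -> {product}"]))
    return None

print(fewest_step_pathway({"glucose"}, "isoprene"))  # ['glucose -> isoprene']
```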

Synthetic biology involves redesigning organisms for useful purposes by engineering them to have new abilities. Researchers and companies around the world are using synthetic biology to harness the power of nature to solve problems in medicine — such as the development of vaccines, antibodies and therapeutic treatments — as well as in manufacturing and agriculture.

“Synthetic biology is becoming a critical capability for U.S. manufacturing. It has the potential to dramatically reduce waste, eliminate or curtail emissions and create next-generation therapeutics and materials,” said Corey Hudson, a computational biologist at Sandia. “That is where people will see RetSynth have the biggest impact.”

“The diverse functionality of RetSynth opens a lot of opportunities for researchers, giving them multiple options, including biological, chemical or hybrid pathways to production,” Hudson said. “All the while, the software is accelerating the research and development process associated with bioproduction. Traditionally, this process has been relatively slow and complex.”

RetSynth is designed to save researchers time and money by suggesting process modifications to maximize theoretical yield, or the amount of bioproduct that could be produced, Hudson said. All available pathways are rendered using clear visual images, enabling software users to quickly interpret results.

Commercial licensing for broader impact

The RetSynth software was originally developed as part of the Department of Energy’s Co-Optimization of Fuels & Engines initiative, a consortium of national lab, university and industry researchers who are creating innovative fuels and combining them with high-efficiency engines to reduce emissions and boost fuel economy.

Today, RetSynth has been expanded to support a variety of diverse applications, and Sandia is ready to license the software to an industry partner for commercial use, Hudson said.

Source / Credit: Sandia National Laboratories

tn083021_02

Carbon nanotube fibers woven into clothing gather accurate EKGs

 

Rice University graduate student Lauren Taylor shows a shirt with carbon nanotube thread that provides constant monitoring of the wearer’s heart. Photo by Jeff Fitlow
There’s no need to don uncomfortable smartwatches or chest straps to monitor your heart if your comfy shirt can do a better job.

That’s the idea behind “smart clothing” developed by a Rice University lab, which employed its conductive nanotube thread to weave functionality into regular apparel.

The Brown School of Engineering lab of chemical and biomolecular engineer Matteo Pasquali reported in the American Chemical Society journal Nano Letters that it sewed nanotube fibers into athletic wear to monitor the heart rate and take a continual electrocardiogram (EKG) of the wearer.

The fibers are just as conductive as metal wires, but washable, comfortable and far less likely to break when a body is in motion, according to the researchers.

On the whole, the shirt they enhanced was better at gathering data than a standard chest-strap monitor taking live measurements during experiments. When matched with commercial medical electrode monitors, the carbon nanotube shirt gave slightly better EKGs.

“The shirt has to be snug against the chest,” said Rice graduate student Lauren Taylor, lead author of the study. “In future studies, we will focus on using denser patches of carbon nanotube threads so there’s more surface area to contact the skin.”

The researchers noted nanotube fibers are soft and flexible, and clothing that incorporates them is machine washable. The fibers can be machine-sewn into fabric just like standard thread. The zigzag stitching pattern allows the fabric to stretch without breaking them.

The fibers not only provided steady electrical contact with the wearer’s skin but also served as electrodes to connect electronics such as Bluetooth transmitters to relay data to a smartphone, or to connect to a Holter monitor that can be stowed in a user’s pocket, Taylor said.

Pasquali’s lab introduced carbon nanotube fiber in 2013. Since then the fibers, each containing tens of billions of nanotubes, have been studied for use as bridges to repair damaged hearts, as electrical interfaces with the brain, for use in cochlear implants, as flexible antennas and for automotive and aerospace applications. Their development is also part of the Rice-based Carbon Hub, a multiuniversity research initiative led by Rice and launched in 2019.

A Rice lab uses a custom device that weaves carbon nanotube fibers into larger threads for sewing. Photo by Jeff Fitlow

The original nanotube filaments, at about 22 microns wide, were too thin for a sewing machine to handle. Taylor said a rope-maker was used to create a sewable thread, essentially three bundles of seven filaments each, woven into a size roughly equivalent to regular thread.

“We worked with somebody who sells little machines designed to make ropes for model ships,” said Taylor, who at first tried to weave the thread by hand, with limited success. “He was able to make us a medium-scale device that does the same.”

She said the zigzag pattern can be adjusted to account for how much a shirt or other fabric is likely to stretch. Taylor said the team is working with Dr. Mehdi Razavi and his colleagues at the Texas Heart Institute to figure out how to maximize contact with the skin.

Fibers woven into fabric can also be used to embed antennas or LEDs, according to the researchers. Minor modifications to the fibers’ geometry and associated electronics could eventually allow clothing to monitor vital signs, force exertion or respiratory rate.

Taylor noted other potential uses could include human-machine interfaces for automobiles or soft robotics, or as antennas, health monitors and ballistic protection in military uniforms. “We demonstrated with a collaborator a few years ago that carbon nanotube fibers are better at dissipating energy on a per-weight basis than Kevlar, and that was without some of the gains that we’ve had since in tensile strength,” she said.

“We see that, after two decades of development in labs worldwide, this material works in more and more applications,” Pasquali said. “Because of the combination of conductivity, good contact with the skin, biocompatibility and softness, carbon nanotube threads are a natural component for wearables.”

He said the wearable market, although relatively small, could be an entry point for a new generation of sustainable materials that can be derived from hydrocarbons via direct splitting, a process that also produces clean hydrogen. Development of such materials is a focus of the Carbon Hub.

“We’re in the same situation as solar cells were a few decades ago,” Pasquali said. “We need application leaders that can provide a pull for scaling up production and increasing efficiency.”

Co-authors of the paper are Rice graduate students Steven Williams and Oliver Dewey, and alumni J. Stephen Yan, now at Boston Consulting Group, and Flavia Vitale, an assistant professor of neurology at the University of Pennsylvania. Pasquali is director of the Carbon Hub and the A.J. Hartsook Professor of Chemical and Biomolecular Engineering and a professor of chemistry and of materials science and nanoengineering.

The research was supported by the U.S. Air Force (FA9550-15-1-0370), the American Heart Association (15CSA24460004), the Robert A. Welch Foundation (C-1668), the Department of Energy (DE-EE0007865, DE-AR0001015), the Department of Defense (32 CFR 168a) and a Riki Kobayashi Fellowship from the Rice Department of Chemical and Biomolecular Engineering.


NEWS RELEASE
Source/Credit: Rice University

tn083021_01

Space Weather Questions

 We'll be bringing back Space Weather to Scientific Frontline soon, in a reduced version of the original website. So for those who don't know much about space weather, here is a very informative video.


Sunday, August 29, 2021

Will it be safe for humans to fly to Mars?

 

Credit: NASA
Sending human travelers to Mars would require scientists and engineers to overcome a range of technological and safety obstacles. One of them is the grave risk posed by particle radiation from the sun, distant stars and galaxies.

Answering two key questions would go a long way toward overcoming that hurdle: Would particle radiation pose too grave a threat to human life throughout a round trip to the red planet? And, could the very timing of a mission to Mars help shield astronauts and the spacecraft from the radiation?

In a new article published in the peer-reviewed journal Space Weather, an international team of space scientists, including researchers from UCLA, answers those two questions with a “no” and a “yes.”

That is, humans should be able to safely travel to and from Mars, provided that the spacecraft has sufficient shielding and the round trip is shorter than approximately four years. And the timing of a human mission to Mars would indeed make a difference: The scientists determined that the best time for a flight to leave Earth would be when solar activity is at its peak, known as the solar maximum.

The scientists’ calculations demonstrate that it would be possible to shield a Mars-bound spacecraft from energetic particles from the sun because, during solar maximum, the most dangerous and energetic particles from distant galaxies are deflected by the enhanced solar activity.

A trip of that length would be conceivable. The average flight to Mars takes about nine months, so depending on the timing of launch and available fuel, it is plausible that a human mission could reach the planet and return to Earth in less than two years, according to Yuri Shprits, a UCLA research geophysicist and co-author of the paper.

“This study shows that while space radiation imposes strict limitations on how heavy the spacecraft can be and the time of launch, and it presents technological difficulties for human missions to Mars, such a mission is viable,” said Shprits, who also is head of space physics and space weather at GFZ Research Centre for Geosciences in Potsdam, Germany.

The researchers recommend a mission not longer than four years because a longer journey would expose astronauts to a dangerously high amount of radiation during the round trip — even assuming they went when it was relatively safer than at other times. They also report that the main danger to such a flight would be particles from outside of our solar system.

Shprits and colleagues from UCLA, MIT, Moscow’s Skolkovo Institute of Science and Technology and GFZ Potsdam combined geophysical models of particle radiation for a solar cycle with models for how radiation would affect both human passengers — including its varying effects on different bodily organs — and a spacecraft. The modeling determined that having a spacecraft’s shell built out of a relatively thick material could help protect astronauts from radiation, but that if the shielding is too thick, it could actually increase the amount of secondary radiation to which they are exposed.

The two main types of hazardous radiation in space are solar energetic particles and galactic cosmic rays; the intensity of each depends on solar activity. Galactic cosmic ray activity is lowest within the six to 12 months after the peak of solar activity, while solar energetic particles’ intensity is greatest during solar maximum, Shprits said.

Source / Credit: UCLA

spn082921_01

Saturday, August 28, 2021

Exposure to air pollution linked with increased mental health issues

 

Exposure to traffic-related air pollution is associated with increased mental health service-use among people recently diagnosed with psychotic and mood disorders such as schizophrenia and depression, a study on data from over 13,000 people has found.

Increased use of mental health services reflects mental illness severity, suggesting that initiatives to lessen air pollution could improve outcomes for those with these disorders and reduce costs of the healthcare needed to support them.

The research was published in the British Journal of Psychiatry and funded by the National Institute for Health Research (NIHR) Maudsley Biomedical Research Centre.

In 2019, 119,000 people in London were living with illegal levels of air pollution. Previous research has found that adults exposed to high levels of traffic-related air pollution are more likely to experience common mental health disorders such as anxiety and mild depression but, until now, little was known about whether air pollution exposure contributes to the course and severity of illness after the onset of more serious mental illness.

Researchers at King’s College London, University of Bristol and Imperial College London analyzed data from 13,887 people aged 15 years and over who had face-to-face contact with South London and Maudsley NHS Foundation Trust (SLaM) services between 2008 and 2012. Individuals were followed from the date of their first face-to-face contact for up to seven years.

Anonymized electronic mental health records were linked with quarterly average modelled concentrations of air pollutants (20x20 meter grid points) at the residential address of the participants. These included nitrogen dioxide and nitrogen oxides (NO2 and NOx) and fine and coarse particulate matter (PM2.5 and PM10).

The study found people exposed to higher residential levels of air pollutants used mental healthcare services more frequently in the months and years following their initial presentation to secondary mental healthcare services compared to those exposed to lower air pollution.

The researchers found that for every 3 micrograms per cubic meter increase in very small particulate matter (PM2.5) and every 15 micrograms per cubic meter increase in nitrogen dioxide (NO2) over a one-year period, there was an 11 per cent and an 18 per cent increased risk, respectively, of having an inpatient stay. Results also showed that the same increases in PM2.5 and NO2 were associated with a 7 per cent and a 32 per cent increased risk, respectively, of requiring community-based mental healthcare over the same period. These findings were also replicated over a seven-year period.
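
As a hedged worked example, if the reported 11 per cent increase per 3 micrograms per cubic meter of PM2.5 is assumed to scale log-linearly (a common modelling assumption, not something stated in the article), the implied relative risk for larger increments can be computed as follows.

```python
# Worked example under an assumed log-linear exposure-response relationship.
# The 11% per 3 ug/m3 figure is from the article; the scaling assumption is illustrative.
def relative_risk(increase_ug_m3, rr_per_unit=1.11, unit_ug_m3=3.0):
    """Relative risk for a given PM2.5 increase, assuming log-linear scaling."""
    return rr_per_unit ** (increase_ug_m3 / unit_ug_m3)

print(f"{relative_risk(3):.2f}")  # 1.11 -> the reported 11% increase
print(f"{relative_risk(6):.2f}")  # about 1.23 under the log-linear assumption
```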

Dr Ioannis Bakolis, Senior Lecturer in Biostatistics and Epidemiology at the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) King’s College London and lead author of the study, said: ‘There is already evidence linking air pollution to the incidence of mental disorders, but our novel findings suggest that air pollution could also play a role in the severity of mental disorders for people with pre-existing mental health conditions.’

He continued: ‘Our research indicates that air pollution is a major risk factor for increased severity of mental disorders. It is also a risk factor that is easily modifiable which suggests more public health initiatives to reduce exposure such as low emission zones could improve mental health outcomes as well as reduce the high healthcare costs caused by long-term chronic mental illness.’

According to the researchers, if the UK urban population’s exposure to PM2.5 was reduced by just a few units to the World Health Organization's recommended annual limit (10 micrograms per cubic metre), this would reduce usage of mental health services by around two per cent, thereby saving tens of millions of pounds each year in associated healthcare costs.

Dr Joanne Newbury, Sir Henry Wellcome Research Fellow, Bristol Medical School (PHS), and the study’s first author, added: ‘We observed these findings for both mood disorders and psychotic disorders, as well as for both inpatient and community-based mental healthcare, and over seven years follow-up. This suggests that air pollution may contribute to a broad range of mental health problems, across a wide spectrum of clinical need, and over long periods of time.

‘We now plan to examine whether air pollution is associated with a broader range of mental health, neurodevelopmental, and educational outcomes, particularly among children, who might be especially vulnerable to air pollution.’

South London and Maudsley NHS Foundation Trust provides comprehensive secondary mental healthcare to approximately 1.36 million people within the London boroughs of Croydon, Lambeth, Lewisham and Southwark. These are inner-city areas with high traffic flows and high average air pollution concentrations compared with other UK urban areas, and they reflect London’s diversity in terms of ethnicity and wealth.

The researchers controlled the analyses for a number of potential variables that could influence the association between air pollution and service use, such as deprivation, population density, age, season, marital status and ethnicity. However, they cautioned that the study does not prove cause and effect, and further research is needed to demonstrate exactly how air pollution might increase the severity of mental health problems.

Dr Adrian James, President of the Royal College of Psychiatrists, said: ‘The environmental and climate emergency is also a mental health emergency. Our health is fundamentally linked to the quality of our environment, whether that's about cleaner air, access to green spaces or protection from extreme weather.

‘If air pollution is exacerbating pre-existing serious mental illnesses, such as schizophrenia, bipolar disorder and major depression, then improving air quality could reduce the pressure on mental health services. As we look ahead to our post-pandemic future, it is vital that we find ways to build back greener and prevent poor health. This important research presents a clear example where these go hand-in-hand.’

The research was funded by the National Institute for Health Research (NIHR), the NIHR Maudsley Biomedical Research Centre, Wellcome, the Economic and Social Research Council, and the UK Medical Research Council.

Press Release
Source / Credit: University of Bristol

Rice lab dives deep for DNA’s secrets

 The poor bacteriophages in Yang Gao’s lab are about to have a lot of bad days.

Yang Gao

That’s all to the good for the structural biologist, who has received a prestigious Maximizing Investigators’ Research Award for New and Early Stage Investigators from the National Institutes of Health to make the lives of viruses harder so ours can be better.

The five-year grant for $1.9 million, administered by the National Institute of General Medical Sciences, will help Gao and his group detail the mechanisms of proteins that produce copies of genomic DNA, and what can go awry when they’re either subjected to stress or face other barriers.

A better understanding of the structural framework of DNA replication, stress response and repair at the atomic level could help find new ways to target processes involved in a host of diseases, including cancer.

“We’re interested in the basic question of how DNA is replicated,” said Gao, an assistant professor of biosciences who joined Rice in 2019 with the backing of a grant from the Cancer Prevention and Research Institute of Texas. “We’ve known for a long time that DNA is a fragile molecule and subject to many different assaults, environmental and physiological, like ultraviolet from sunlight and oxidative species.

“So many things damage DNA,” he said. “Despite that, DNA replication has to keep on going, even if there are errors, with an enzyme called DNA polymerase and a motor called the helicase.”

A study of stress on bacteriophage T7 will help Rice structural biologist Yang Gao and his team to reveal the atomic-scale mechanisms of DNA replication. (Credit: Yang Gao Lab/Rice University)

These are part of the replisome, a complex chain of proteins that carry out DNA replication and help repair DNA on the fly. Part of their normal function is to catch and fix coding errors. “When they see something bad they call for help, either before or after replication,” Gao said. “But how that works is still unknown, and we want to figure it out.”

The lab will start with the T7 bacteriophage, a virus whose infection mechanism in Escherichia coli bacteria is a good analog for what happens in humans.

“During my postdoc, we solved the first structure of T7 replisome to show how T7 comes together at a replication site,” he said. “We’ve continued that work at Rice, and we’re using the system to explore how it deals with different damages.”

The lab will then study the structure of mitochondria, the “power plants” inside cells, to see how DNA mutations produced there could lead to genetic diseases. “These two systems are mechanistically similar, and because we have experience with T7 and we’ve recently established a mitochondrial hub, we’re in a good position to start this investigation,” Gao said.

He noted he will continue to collaborate with Rice physicist Peter Wolynes and his group, which produces models that advance the theory of DNA replication. The lab also plans to make use of a new transmission electron microscope pegged for Rice’s BioScience Research Collaborative.

Press Release
Source / Credit: Rice University

Friday, August 27, 2021

Covid-19, not vaccination, presents biggest blood clot risks

Researchers from the University of Oxford have today announced the results of a study into thrombocytopenia (a condition with low platelet counts) and thromboembolic events (blood clots) following vaccination for Covid-19, some of the same events which have led to restricted use of the Oxford-AstraZeneca vaccine in a number of countries.

  1. Study compares risks of thrombocytopenia and thromboembolic events following ChAdOx1 nCov-19 (Oxford-AstraZeneca), BNT162b2 mRNA (Pfizer-BioNTech) vaccination, and SARS-CoV-2 (Covid-19) infection
  2. Study shows increased risk of thrombocytopenia and venous thromboembolism with ChAdOx1 nCoV-19, and increased risk of arterial thromboembolism following BNT162b2 mRNA
  3. Risks of these adverse events were however much higher following SARS-CoV-2 infection
  4. Study authors fully independent from Oxford vaccine developers


Writing in the British Medical Journal (BMJ), they detail the findings from over 29 million people vaccinated with first doses of either the ChAdOx1 nCov-19 ‘Oxford-AstraZeneca’ vaccine or the BNT162b2 mRNA ‘Pfizer-BioNTech’ vaccine. They conclude that with both of these vaccines, for short time intervals following the first dose, there are increased risks of some hematological and vascular adverse events leading to hospitalization or death.

Julia Hippisley-Cox, Professor of Clinical Epidemiology and General Practice at the University of Oxford, lead author of the paper, said:

‘People should be aware of these increased risks after Covid-19 vaccination and seek medical attention promptly if they develop symptoms, but also be aware that the risks are considerably higher and over longer periods of time if they become infected with SARS-CoV-2’ 

The authors further note that the risk of these adverse events is substantially higher and for a longer period of time, following infection from the SARS-CoV-2 ‘coronavirus’ than after either vaccine.

All of the coronavirus vaccines currently in use have been tested in randomized clinical trials, which are unlikely to be large enough to detect very rare adverse events. When rare events are uncovered, regulators perform a risk-benefit analysis of the medicine to compare the risks of the adverse events if vaccinated against the benefits of avoiding the disease – in this case, Covid-19.

In this paper, the team of authors from the University of Oxford, University of Leicester, Guys and St Thomas’ NHS Foundation Trust, the Intensive Care National Audit & Research Centre, the London School of Hygiene and Tropical Medicine, the University of Cambridge, the University of Edinburgh and the University of Nottingham, compared rates of adverse events after vaccination with Pfizer-BioNTech and Oxford-AstraZeneca vaccines with rates of the same events after a positive SARS-CoV-2 test result.

For this, they used routinely collected electronic health records to evaluate the short-term risks (within 28 days) of hospital admission with thrombocytopenia, venous thromboembolism (VTE) and arterial thromboembolism (ATE), using data collected from across England between December 1, 2020 and April 24, 2021. Other outcomes studied were cerebral venous sinus thrombosis (CVST), ischemic stroke, myocardial infarction and other rare arterial thrombotic events.

Prof. Hippisley-Cox added:

‘This research is important as many other studies, while useful, have been limited by small numbers and potential biases. Electronic healthcare records, which contain detailed recording of vaccinations, infections, outcomes and confounders, have provided us with a rich source of data with which to perform a robust evaluation of these vaccines, and compare to risks associated with Covid-19 infection.’

The authors detail the following limitations to their study:

  1. restricting the analysis to first vaccine dose only
  2. a short vaccination exposure window
  3. the lack of formal adjudication of routinely acquired outcomes, and the potential for misclassification of outcomes or exposures
  4. the exclusion of admissions where patients were still in hospital by the study end date

However, they believe that any bias, if present, is unlikely to differ between the vaccines, and so the comparisons between vaccines are unlikely to be affected.

Andrew Morris, Director of Health Data Research UK and Lead of the Data and Connectivity National Core Study, said:

‘Congratulations to the team at Oxford who have worked with colleagues across the UK on this important research. This is truly health data science in action – the use of secure, large scale, linked datasets to develop real-world insights on the safety of COVID-19 vaccines. The analyses in this paper are a vital addition to all of the work enabled by HDR UK to enhance our understanding of the virus, and a key output from the Data and Connectivity National Core Study.’

Aziz Sheikh, Professor of Primary Care Research & Development and Director of the Usher Institute at The University of Edinburgh and a co-author of the paper, said:

‘This enormous study, using data on over 29 million vaccinated people, has shown that there is a very small risk of clotting and other blood disorders following first dose Covid-19 vaccination.  Though serious, the risk of these same outcomes is much higher following SARS-CoV-2 infection.

‘On balance, this analysis therefore clearly underscores the importance of getting vaccinated to reduce the risk of these clotting and bleeding outcomes in individuals, and because of the substantial public health benefit that Covid-19 vaccinations offer.’

Source / Credit: University of Oxford

Together, but still apart

Social disconnection is a lack of social, emotional and physical engagement with other people. This results in isolation and loneliness. Risk factors such as the shrinking of family sizes, lack of family support and declining health have made it hard for older adults to keep up with social and economic activities and maintain social connections, which ultimately results in social disconnection and isolation. The social distancing measures brought about by the ongoing COVID-19 pandemic have exacerbated social isolation, especially among the elderly.

The Singapore Chinese Health Study, conducted by a team led by Professor Koh Woon Puay from the Healthy Longevity Translational Research Program at the NUS Yong Loo Lin School of Medicine and Associate Professor Feng Qiushi of the Faculty of Arts and Social Sciences, examined 16,943 community-dwelling seniors and showed that 78.8% of socially disconnected older adults were living with family, compared with 14.4% who were living alone. Hence, although older adults living alone are more likely to be socially disconnected, in Singapore the majority of socially isolated older adults still live with their families. Among those living alone, men were twice as likely as women to experience social disconnection. This study was published in Gerontology on 16 June 2021.

The team also studied the factors associated with social isolation in this cohort, to see if they had similar effects among those living alone and those living with their families. The salient findings were:

Regardless of the living arrangements, factors such as low education level, cognitive impairment, fair or poor self-rated health, depression, and limitations with daily living activities were independently associated with social disconnection.

Among those living alone, men were twice as likely to experience social disconnection compared to women.

From these findings, Prof Koh recommends targeting community interventions at elderly men living alone, and extending their scope to older adults in poor health who live with their families. The Singapore Government has made much effort in the area of eldercare, which has helped most older people to stay socially connected. Despite this, social alienation is increasingly present due to the demographic trends of population ageing and solo living, and extra effort is needed to help vulnerable individuals, especially older men. Interventions that encourage individual and personal productivity, such as paid work, volunteerism and learning new skills, should be promoted among older adults to create opportunities for social interaction and maintenance of cognitive functions, Prof Koh adds.

In addition to social isolation, older adults are also at increased risk of chronic age-related diseases, as well as gradual loss of bodily functions and independence in activities of daily living. Prof Koh and Associate Professor Feng have collaborated with other scientists within NUS and other research institutions to establish the SG70 Towards Healthy Longevity cohort study, the next study to examine the effects of biological, lifestyle and socioeconomic factors that prevent people from ageing healthily and productively. This cohort will recruit 3,000 participants aged 65 to 75 years, comprising the three major ethnic groups in Singapore. This age group has been identified as the vulnerable period during which the average Singaporean may transition from good health to poor health, and the research team will study this ageing process in the SG70 participants for the next 10 to 15 years.

The eventual aim of this SG70 cohort study is to gather scientific evidence that will form the basis for intervention studies in the near future that may slow, halt or reverse the ageing process, in order to help people age more healthily, avoid age-related diseases and maintain a good quality of life in their twilight years.

Source / Credit: NUS Yong Loo Lin School of Medicine

Thursday, August 26, 2021

Plants evolved ability to actively control water-loss earlier than previously thought

 

Fern stomata Credit: University of Birmingham
New research has shed light on when plants first evolved the ability to respond to changing humidity in the air around them; this ability was probably a feature of a common ancestor of both flowering plants and ferns.


Key to the regulation mechanism are tiny holes, or pores, on the surface of leaves, called stomata. These enable the plant to regulate the uptake of CO2 gas as fuel for photosynthesis, and the loss of water vapour – a constant balancing act that requires the pores to open and close according to changing conditions. This ability is important to agriculture because it helps crops to use less water to grow.

Plants first evolved stomata soon after they moved from water to land, some 450 million years ago, but scientists are still uncertain about the evolutionary pathway they took and the point at which plants became able to choose whether to open or close the pores in response to their environment.

In the most recently evolved plants – flowering plants – stomata closure in response to drought is actively triggered by a number of internal signals, including a hormone called abscisic acid (ABA), but scientists have been struggling to understand if this mechanism is also present in older groups of plants. In a new study, published in Current Biology, researchers at the University of Birmingham have found evidence that the fern species Ceratopteris richardii actively closes its stomata using similar signals.

This semi-aquatic tropical fern has recently become the first model for exploring genetic control of development in the fern family, and is now helping scientists to unpick the long evolutionary history between the earliest land-living plants (mosses, liverworts and hornworts) and the modern flowering plants that dominate today’s ecosystems.

The team used RNA sequencing technology to identify the genetic mechanisms behind different stomatal responses and was able to demonstrate that the fern’s ability to close stomata in response to low humidity or to ABA involves copies of genes already known to control stomata in flowering plants.

The results suggest that both ferns and flowering plants evolved similar stomatal closure mechanisms. This indicates that these mechanisms were present – at least in some form – in the stomata of the last common ancestor of both groups.

Dr Andrew Plackett, of the University of Birmingham’s School of Biosciences, led the research in collaboration with groups at the University of Bristol and the University of Oxford. He said: “We know that plants have possessed stomata for most of their evolutionary history, but the point in evolution where plants became able to actively open and close them has been controversial.

“We’ve been able to show the same active closure mechanisms found in flowering plants are also present in ferns, a much older group of plants. Being able to better understand how these mechanisms have changed during plant evolution gives us useful tools to learn more about how they work. This will be important for helping our crops to adapt to future environmental changes.”

Prof Alistair Hetherington of Bristol’s School of Biological Sciences added: “This new work confirms that the earliest plants were able to actively control the water they lost through the microscopic valve-like structures on the surfaces of leaves known as stomata. This is important because it shows that the intracellular machinery allowing stomata to open and close was present in the earliest land plants. The research also shows that whether stomata respond actively or passively is dictated by the environment in which the plants lived.”

Source / Credit: University of Bristol

Farmed carnivores may become disease reservoirs posing human health risk

 Farming large numbers of carnivores, like mink, could allow the formation of undetected ‘disease reservoirs’, in which a pathogen could spread to many animals and mutate to become a risk to human health.

Research led by the University of Cambridge has discovered that carnivores have a defective immune system, which makes them likely to be asymptomatic carriers of disease-causing pathogens.

Three key genes in carnivores that are critical for gut health were found to have lost their function. If these genes were working, they would produce protein complexes called inflammasomes to activate inflammatory responses and fight off pathogens. The study is published today in the journal Cell Reports.

The researchers say that the carnivorous diet, which is high in protein, is thought to have antimicrobial properties that could compensate for the loss of these immune pathways in carnivores – any gut infection is expelled by the production of diarrhoea. But the immune deficiency means that other pathogens can reside undetected elsewhere in these animals.

“We’ve found that a whole cohort of inflammatory genes is missing in carnivores - we didn’t expect this at all,” said Professor Clare Bryant in the University of Cambridge’s Department of Veterinary Medicine, senior author of the paper. 

She added: “We think that the lack of these functioning genes contributes to the ability of pathogens to hide undetected in carnivores, to potentially mutate and be transmitted becoming a human health risk.”

Zoonotic pathogens are those that live in animal hosts before jumping to infect humans. The COVID-19 pandemic, thought to originate in a wild animal, has shown the enormous damage that can be wrought by a novel human disease. Carnivores include mink, dogs, and cats, and are the biggest carriers of zoonotic pathogens. 

Two of the genes appear to be in the process of being lost entirely in carnivores: the DNA is still present but it is not expressed, meaning they have become ‘pseudogenes’ and are not functioning. A third gene important for gut health has developed a unique mutation, causing two proteins called caspases to be fused together, changing their function so that they can no longer respond to some pathogens in the animal’s body.

“When you have a large population of farmed carnivorous animals, like mink, they can harbour a pathogen - like SARS-CoV-2 and others - and it can mutate because the immune system of the mink isn’t being activated. This could potentially spread into humans,” said Bryant.

The researchers say that the results are not a reason to be concerned about COVID-19 being spread by dogs and cats. There is no evidence that these domestic pets carry or transmit COVID-19. It is when large numbers of carnivores are kept together in close proximity that a large reservoir of the pathogen can build up amongst them, and potentially mutate.

This research was funded by Wellcome.

Source / Credit: University of Cambridge

Climate change is accelerating, according to comprehensive study

 Climate change is happening and accelerating. Earth will continue to warm. And these changes are unequivocally caused by human activities. Those are among the conclusions of the report published by the Intergovernmental Panel on Climate Change (IPCC), with University of Hawaiʻi at Mānoa Assistant Professor of Oceanography Malte Stuecker as a contributing author.

Ocean temperature (blue=cold, red=warm) simulated at ultra-high resolution. Photo credit: IBS/ICCP‘s Aleph

“The latest IPCC report shows clearly that if we do not drastically curb our emissions, we will head towards temperatures that Earth has not seen in millions of years,” Stuecker summarizes. “Moreover, we can now say with certainty that all of the global warming that occurred since the mid-19th century is due to human activity. While these are sobering facts, we should certainly not despair. In fact, if societies choose a pathway of large reductions in greenhouse gas emissions now, the report also shows that we will avoid the worst possible future outcomes and Earth will experience only moderate additional warming over this century that we can likely adapt to.”

In addition to global warming, regional climate in many parts of the world is impacted by the cycling between warm El Niño and cold La Niña conditions in the eastern Pacific Ocean, commonly referred to as the El Niño-Southern Oscillation (ENSO). ENSO has persisted without major interruptions for thousands to millions of years. This may change in a future warmer world, though the recent IPCC report highlights uncertainties in potential changes in ENSO.

Rainfall will be more extreme as the climate warms. (Photo credit: Max LaRochelle via Unsplash)

Two additional studies

Continuing a long tradition of developing theories and advancing climate models of ENSO, researchers from the UH Mānoa School of Ocean and Earth Science and Technology (SOEST) recently published two additional studies addressing the complexity of this important climate phenomenon.

SOEST atmospheric scientists, Associate Professor Christina Karamperidou and Professor Fei-Fei Jin, and Stuecker co-authored a review paper published in Nature Reviews Earth & Environment wherein they synthesized recent advancements in research on ENSO.

Simulations of future climate under strong greenhouse gas emissions, run with the most recent generation of climate models, show an emerging consensus that ENSO sea surface temperature variability may increase as the climate warms.

“There is however still much uncertainty on the degree to which ENSO may change and the time at which these potential changes will emerge from ENSO’s natural variability,” said Karamperidou. “This is partly due to incomplete understanding of the phenomenon, partly due to known limitations of models in representing and resolving relevant processes, and partly due to the inherent limitations on our understanding imposed by the short length of the instrumental record.”

Additionally, led by researchers at the IBS Center for Climate Physics in Korea, Stuecker co-authored another study published in Nature Climate Change that produced a series of global climate model simulations with unprecedented spatial resolution. Boosted by the power of one of South Korea’s fastest supercomputers (Aleph), the new ultra-high-resolution simulations realistically represented processes that are usually missing from other models, though they play fundamental roles in the generation and termination of El Niño and La Niña events.

“From the highest-resolution future climate model simulation done to date, we conclude that it’s possible that ENSO variability could collapse under strong greenhouse warming in the future,” said Stuecker.

Further investigation is needed

This apparent contradiction in findings raises many interesting questions and highlights the need for further investigation.

“Regardless of the details of how El Niño changes in the future, rainfall and drought will become more extreme in the future due to the fact that we will be living in a warmer world with a hydrological cycle on steroids,” said Stuecker.

“Despite the spread of model projections on how ENSO may change under strong anthropogenic forcing, both the IPCC report and the Nature Reviews article demonstrate that its impacts on rainfall are very likely to be enhanced, which has significant implications across the globe and the Pacific, including Hawaiʻi,” said Karamperidou.

Source / Credit: University of Hawaiʻi

Wednesday, August 25, 2021

Astrophysical data to unravel the universe’s mysteries

 The University of Washington and Carnegie Mellon University have announced an expansive, multiyear collaboration to create new software platforms to analyze large astronomical datasets generated by the upcoming Legacy Survey of Space and Time, or LSST, which will be carried out by the Vera C. Rubin Observatory in northern Chile. The open-source platforms are part of the new LSST Interdisciplinary Network for Collaboration and Computing — known as LINCC — and will fundamentally change how scientists use modern computational methods to make sense of big data.

Rubin Observatory summit facility in Cerro Pachón, Chile. Photo credit: Rubin Observatory/NSF/AURA

Through the LSST, the Rubin Observatory, a joint initiative of the National Science Foundation and the Department of Energy, will collect and process more than 20 terabytes of data each night — and up to 10 petabytes each year for 10 years — and will build detailed composite images of the southern sky. Over its expected decade of observations, astrophysicists estimate the Department of Energy’s LSST Camera will detect and capture images of an estimated 30 billion stars, galaxies, stellar clusters and asteroids. Each point in the sky will be visited around 1,000 times over the survey’s 10 years, providing researchers with valuable time series data.
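For a rough sense of scale, the nightly data rate quoted above can be converted into yearly and full-survey totals. The short sketch below is only an illustrative back-of-the-envelope calculation; the number of observing nights per year is an assumption, not a figure from the article.

    # Back-of-the-envelope check of the LSST data volumes quoted above.
    # Assumption (not stated in the article): roughly 300 observing nights per year.
    NIGHTLY_RATE_TB = 20             # terabytes collected and processed per night
    OBSERVING_NIGHTS_PER_YEAR = 300  # assumed; weather and maintenance reduce the total
    SURVEY_YEARS = 10

    yearly_pb = NIGHTLY_RATE_TB * OBSERVING_NIGHTS_PER_YEAR / 1000  # TB -> PB
    total_pb = yearly_pb * SURVEY_YEARS

    print(f"~{yearly_pb:.0f} PB per year")          # ~6 PB, within the "up to 10 PB" quoted
    print(f"~{total_pb:.0f} PB over {SURVEY_YEARS} years")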

Scientists plan to use this data to address fundamental questions about our universe, such as the formation of our solar system, the course of near-Earth asteroids, the birth and death of stars, the nature of dark matter and dark energy, the universe’s murky early years and its ultimate fate, among other things.

“Tools that utilize the power of cloud computing will allow any researcher to search and analyze data at the scale of the LSST, not just speeding up the rate at which we make discoveries but changing the scientific questions that we can ask,” said Andrew Connolly, a UW professor of astronomy, director of the eScience Institute and former director of the Data Intensive Research in Astrophysics and Cosmology Institute — commonly known as the DiRAC Institute.

The Rubin Observatory will produce an unprecedented data set through the LSST. To take advantage of this opportunity, the LSST Corporation created the LSST Interdisciplinary Network for Collaboration and Computing, whose launch was announced Aug. 9 at the Rubin Observatory Project & Community Workshop. One of LINCC’s primary goals is to create new and improved analysis infrastructure that can accommodate the scale and complexity of the LSST data and yield meaningful, useful pipelines of discovery.

“Many of the LSST’s science objectives share common traits and computational challenges. If we develop our algorithms and analysis frameworks with forethought, we can use them to enable many of the survey’s core science objectives,” said Rachel Mandelbaum, professor of physics and member of the McWilliams Center for Cosmology at Carnegie Mellon.

Connolly and Mandelbaum will co-lead the project, which will consist of programmers and scientists based at the UW and Carnegie Mellon, who will create platforms using professional software engineering practices and tools. Specifically, they will create a “cloud-first” system that also supports high-performance computing systems in partnership with the Pittsburgh Supercomputing Center, a joint effort of Carnegie Mellon and the University of Pittsburgh, and the National Science Foundation’s NOIRLab. The LSST Corporation will run programs to engage the LSST Science Collaborations and broader science community in the design, testing and use of the new tools.

The complete focal plane of the LSST Camera is more than 2 feet wide and contains 189 individual sensors that will produce 3200-megapixel images. Photo credit: Jacqueline Orrell/SLAC National Accelerator Laboratory/NSF/DOE/Rubin Observatory/AURA

The LINCC analysis platforms are supported by Schmidt Futures, a philanthropic initiative founded by Eric and Wendy Schmidt that “bets early on exceptional people making the world better.” This project is part of Schmidt Futures’ work in astrophysics, which aims to accelerate our knowledge about the universe by supporting the development of software and hardware platforms to facilitate research across the field of astronomy.

“Many years ago, the Schmidt family provided one of the first grants to advance the original design of the Vera C. Rubin Observatory. We believe this telescope is one of the most important and eagerly awaited instruments in astrophysics in this decade. By developing platforms to analyze the astronomical datasets captured by the LSST, Carnegie Mellon University and the University of Washington are transforming what is possible in the field of astronomy,” said Stuart Feldman, chief scientist at Schmidt Futures.

“The software funded by this gift will magnify the scientific return on the public investment by the National Science Foundation and the Department of Energy to build and operate Rubin Observatory’s revolutionary telescope, camera and data systems,” said Adam Bolton, director of the Community Science and Data Center at NSF’s NOIRLab. The center will collaborate with LINCC scientists and engineers to make the LINCC framework accessible to the broader astronomical community.

Through this project, new algorithms and processing pipelines developed at LINCC can be used across fields within astrophysics and cosmology to sift through false signals, filter out noise in the data and flag potentially important objects for follow-up observations. The tools developed by LINCC will support a “census of our solar system” that will chart the courses of asteroids; help researchers to understand how the universe changes with time; and build a 3D view of the universe’s history.

“Our goal is to maximize the scientific output and societal impact of Rubin LSST, and these analysis tools will go a huge way toward doing just that,” said Jeno Sokoloski, director for science at the LSST Corporation. “They will be freely available to all researchers, students, teachers and members of the general public.”

Northwestern University and the University of Arizona, in addition to the UW and Carnegie Mellon, are hub sites for LINCC. The University of Pittsburgh will partner with the Carnegie Mellon hub.


Source / Credit: University of Washington

Antarctic Ocean Flows: an excerpt from Atlas of a Changing Earth


In this visualization of Antarctica, we cruise along the coastline of the Amundsen Sea from Cape Dart to the Pine Island Glacier. Initially we pass the massive Getz Ice Shelf on our right stretching over 300 miles (500 km) along the coast. As we approach the Smith Glacier and the Dotson Ice Shelf, the sea surface becomes transparent allowing us to see the ocean flows moving under the surface. These flows portray the direction, speed and temperature of the ocean circulation based on version 3 of the ECCO ocean circulation model. The flows are colored by temperature, spanning the range from 29.75 degrees Fahrenheit (-1.25 degrees Celsius) shown in blue to 34.25 degrees Fahrenheit (+1.25 degrees Celsius) shown in red. We see the ocean flows circulating around the Pine Island Bay and under the adjacent floating ice tongue of the Thwaites Glacier.  
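One way to picture the color scale described above is to normalize temperature onto a diverging blue-to-red colormap. The snippet below is a minimal sketch of that mapping using matplotlib; it is not the actual pipeline NASA's Scientific Visualization Studio used to render the animation.

    # Minimal sketch of the blue-to-red temperature coloring described above,
    # normalizing ocean temperature onto a diverging colormap over the
    # -1.25 C to +1.25 C range. Illustrative only.
    import matplotlib.pyplot as plt
    import matplotlib.colors as mcolors

    norm = mcolors.Normalize(vmin=-1.25, vmax=1.25)  # coldest -> 0.0, warmest -> 1.0
    cmap = plt.get_cmap("coolwarm")                  # blue (cold) through red (warm)

    def temperature_to_rgba(temp_c: float):
        """Return an RGBA color for a temperature in degrees Celsius."""
        return cmap(norm(temp_c))

    print(temperature_to_rgba(-1.25))  # deep blue
    print(temperature_to_rgba(+1.25))  # deep red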

As we approach the Pine Island Glacier, we dip below the surface of the bay and see the stratification of the temperature in the ocean flows, with the coldest water shown in blue near the surface and the warmer water shown in red at lower depths. We move forward under the floating ice of the Pine Island Glacier and see how the warmer ocean flows are circulating under the glacier's floating tongue, eroding the ice from beneath.  

The topography in this visualization has been exaggerated by 4x above sea level and 15x below sea level in order to more clearly observe the change in ocean temperature at various depths.
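The asymmetric exaggeration amounts to a simple piecewise scaling of elevation. As a minimal illustration under that reading (not the code actually used to produce the visualization), it could be expressed as:

    # Minimal sketch of the vertical exaggeration described above: elevations above
    # sea level are stretched 4x and depths below sea level 15x for display.
    # Illustrative only; not the rendering code behind the visualization.
    def exaggerate_elevation(z_meters: float) -> float:
        """Scale an elevation (positive above sea level, negative below) for display."""
        return z_meters * 4 if z_meters >= 0 else z_meters * 15

    print(exaggerate_elevation(250))   # 1000 -> a 250 m peak is drawn 1 km tall
    print(exaggerate_elevation(-100))  # -1500 -> a 100 m depth is drawn 1.5 km deep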


Source / Credit: NASA's Scientific Visualization Studio

Peabody fossils illuminate dinosaur evolution in eastern North America

Tyrannosaurus rex, the fearsome predator that once roamed what is now western North America, appears to have had an East Coast cousin.

A new study by Yale undergraduate Chase Doran Brownstein describes two dinosaurs that once roamed the eastern United States from fossils housed at the Yale Peabody Museum of Natural History: an herbivorous hadrosaur (depicted in the silhouette) and a tyrannosaur.

A new study by Yale undergraduate Chase Doran Brownstein describes two dinosaurs that inhabited Appalachia — a once isolated land mass that today composes much of the eastern United States — about 85 million years ago: a herbivorous duck-billed hadrosaur and a carnivorous tyrannosaur. The findings were published Aug. 25 in the journal Royal Society Open Science.

Chase Doran Brownstein

The two dinosaurs, which Brownstein described from specimens housed at Yale’s Peabody Museum of Natural History, help fill a major gap in the North American fossil record from the Late Cretaceous and provide evidence that dinosaurs in the eastern portion of the continent evolved distinctly from their counterparts in western North America and Asia, Brownstein said.

“These specimens illuminate certain mysteries in the fossil record of eastern North America and help us better understand how geographic isolation — large water bodies separated Appalachia from other landmasses — affected the evolution of dinosaurs,” said Brownstein, who is entering his junior year at Yale College. “They’re also a good reminder that while the western United States has long been the source of exciting fossil discoveries, the eastern part of the country contains its share of treasures.”

For most of the second half of the Cretaceous, which ended 66 million years ago, North America was divided into two land masses, Laramidia in the West and Appalachia in the East, with the Western Interior Seaway separating them. While famous dinosaur species like T. rex and Triceratops lived throughout Laramidia, much less is known about the animals that inhabited Appalachia. One reason is that Laramidia’s geographic conditions were more conducive to the formation of sediment-rich fossil beds than Appalachia’s, Brownstein explained.

The specimens described in the new study were discovered largely during the 1970s in the Merchantville Formation in present-day New Jersey and Delaware. They constitute one of the only known dinosaur assemblages from the late Santonian to early Campanian stages of the Late Cretaceous in North America. The fossil record from this period, about 85 to 72 million years ago, is limited, Brownstein noted.

Brownstein examined a partial skeleton of a large predatory theropod, concluding that it is probably a tyrannosaur. He noted that the fossil shares several features in its hind limbs with Dryptosaurus, a tyrannosaur that lived about 67 million years ago in what is now New Jersey. The dinosaur has different hands and feet than T. rex, including massive claws on its forelimbs, suggesting that it represents a distinct family of the predators that evolved solely in Appalachia.

“Many people believe that all tyrannosaurs must have evolved a specific set of features to become apex predators,” Brownstein said. “Our fossil suggests they evolved into giant predators in a variety of ways as it lacks key foot or hand features that one would associate with western North American or Asian tyrannosaurs.”

The partial skeleton of the hadrosaur provided important new information on the evolution of the shoulder girdle in that group of dinosaurs, Brownstein found. The hadrosaur fossils also provide one of the best records of this group from east of the Mississippi and include some of the only infant/perinate (very young) dinosaur fossils found in this region.

Brownstein, who works as a research associate at the Stamford Museum and Nature Center in Stamford, Connecticut, has previously published his paleontological research in several peer-reviewed journals, including Scientific Reports, the Journal of Paleontology, and the Zoological Journal of the Linnean Society. In addition to eastern North American fossils, he currently focuses his research on the evolution of fishes, lizards, and birds. He is particularly interested in how geographic change and other factors influence how fast different types of living things evolve.

He currently works in the lab of Thomas J. Near, curator of the Peabody Museum’s ichthyology collections and professor and chair of the Department of Ecology and Evolutionary Biology at Yale. Brownstein also collaborates with Yale paleontologists Jacques Gauthier and Bhart-Anjan Bhullar in the Department of Earth and Planetary Sciences.

While Brownstein is considering pursuing an academic career in evolutionary biology, he says his research is driven by enjoyment.

“Doing research and thinking about these things makes me happy,” he said. “Like biking, it’s something I love to do.”

Source / Credit: Yale University
By Mike Cummings


Tuesday, August 24, 2021

Emerging from the deep: Stawell’s dark matter lab takes shape

Construction of the Southern Hemisphere’s first dark matter underground physics laboratory is progressing with the concrete slab now in place and the world-class facility on schedule to welcome scientists by Christmas.

The ancillary area where scientists can shower before going into the lab to work.
Image: Stawell Gold Mines


Dr Leonie Walsh, Victoria’s first lead scientist, the first woman to serve as president of the Australian Innovation Research Group and a representative on the Forum of Australian Chief Scientists, is interim chair of the company that will operate and manage the Stawell Underground Physics Laboratory (SUPL).

Dr Walsh recently visited the underground laboratory in regional Victoria, seeing first-hand the work underway to ensure that the lab, one kilometer underground, has an excellent chance of detecting the universe’s elusive dark matter.

“We saw the cavern walls where the lab is being built being sprayed with a product called Tekflex to reduce the potential for interference in experiments from background radon gas in the rock mass,” Dr Walsh said. “As an industrial scientist, I have worked across a broad range of industrial sites around the world, but none as unique as SUPL.”

It takes half an hour to journey underground to the site of the lab. Dr Walsh completed the journey after undergoing the strict safety induction and personal protective equipment (PPE) fit-out.

“Researchers will start their day with a 10km drive down a maze of tunnels in protective equipment to the cavernous laboratory, 1100 meters underground, to work on their dark matter experiments with equipment designed and built for the purpose of finding dark matter – this thing that makes up 85 per cent of the matter in our universe, but which continues to be a mystery,” she said.

“The disused section of Stawell’s gold mine in regional Victoria has turned out to be the ideal location to progress our understanding of dark matter.”

The 33-meter-long, 10-meter-wide lab is funded by a $10 million grant from the Federal and State Governments, supported by a $35 million Australian Research Council grant for the Centre of Excellence for Dark Matter Particle Physics, based at the University of Melbourne.

Tom Kelly, the University of Melbourne’s Senior Project Manager, said that major pieces of plant, as well as plumbing, electrical and communications cabling, and mechanical ductwork and piping, are expected to be in place by early October.

“We anticipate the handover to be on time and to commence the installation of experimental equipment before Christmas,” Mr Kelly said.

The five research institutions that will work at Stawell are the University of Melbourne, Swinburne University of Technology, the University of Adelaide, the Australian National University and the Australian Nuclear Science and Technology Organization (ANSTO).

ANSTO’s representative on the SUPL company board, Professor Andrew Peele, accompanied Dr Walsh on the inspection and said: “Science goes to extreme lengths to find answers, and in this case, to a very sheltered environment a kilometer underground. It is impressive to see the progress made first-hand and pleasing to see the preparations for the range of activities that will advance our understanding of dark matter.

“ANSTO is delighted to be part of this project and to share our expertise in ultra-sensitive radiation measurement. This is critical to the operation of the instruments that will be housed in SUPL and will also make possible high-precision radiation measurements needed to better understand environmental and other samples.”

Dr Walsh is pictured above at left with Professor Elisabetta Barberio, the Director of the Stawell Underground Physics Laboratory, and board member designate, Professor Andrew Peele, from the Australian Nuclear Science and Technology Organization (ANSTO).

Source / Credit: The University of Melbourne
