. Scientific Frontline

Tuesday, August 31, 2021

New biomarkers identified to detect consumption of emerging illicit drug

Professor Eric Chan (middle) from the NUS Department of Pharmacy led the research which was conducted in collaboration with the Health Sciences Authority (HSA).
The research team included Ms Moy Hooi Yan (extreme left), HSA’s Laboratory Director of the Analytical Toxicology Lab - Drug Abuse Testing, and Dr Wang Ziteng (extreme right), Research Fellow at the NUS Department of Pharmacy.

 A team of researchers from the National University of Singapore (NUS) has come up with a new solution to boost the surveillance of designer drug abuse. Led by Professor Eric Chan from the NUS Department of Pharmacy, the team has identified three new urinary biomarkers that could be used to detect consumption of ADB-BUTINACA, an emerging synthetic cannabinoid which is a type of new psychoactive substance (NPS). The innovative approach used to identify the biomarkers can be applied to other existing and new synthetic cannabinoids.

NPS are drugs designed to mimic the effects of illegal substances such as cannabis, cocaine, heroin, ‘Ice’, Ecstasy and LSD. Clandestine laboratories introduce synthetic cannabinoids with different chemical structures in an attempt to circumvent legislative bans.

Over the past two years, users of NPS made up the third largest proportion of drug abusers in Singapore, while synthetic cannabinoids have dominated Singapore’s NPS market for the past four years. As most synthetic cannabinoids are extensively metabolized in the body after consumption, they become virtually undetectable in urine samples.

Commenting on the significance of the team’s research, Prof Chan said, “Prior to our study, the metabolism and urinary biomarkers of ADB-BUTINACA were unclear. Our discovery and unique methodology offer assistance to the forensic community, which is constantly challenged by the emergence of novel synthetic cannabinoids, and can also benefit international public health efforts to tackle the increasing abuse of this synthetic cannabinoid. This brings us closer to the goal of a drug-free world.”

The study, which was carried out in collaboration with the Analytical Toxicology Laboratory of Singapore’s Health Sciences Authority, was first published in the journal Clinical Chemistry on 13 August 2021.

New biomarkers for accurate detection of synthetic drug abuse

ADB-BUTINACA is a new synthetic cannabinoid that was first identified in Europe in 2019, and it entered Singapore’s drug scene last year. Although three existing metabolites of ADB-BUTINACA are available as reference standards for routine forensic monitoring, they have been found to be absent or detected at lower concentrations in some urine samples of abusers. This created an impetus to identify other potential metabolites for use as urinary biomarkers for the cannabinoid’s consumption.

Instead of using the conventional and more time-consuming method of chemically synthesizing metabolites of ADB-BUTINACA, Prof Chan and his team introduced an innovative method to identify the cannabinoid’s unique metabolites using the concepts of drug metabolism and pharmacokinetics.

The team synthesized key metabolites of ADB-BUTINACA in the laboratory using human liver enzymes, then investigated their disposition to identify novel biomarker metabolites in urine. Using this method, a total of 15 metabolites of ADB-BUTINACA, and their respective biotransformation pathways in the body, were identified for the first time.

Of the 15 new metabolites, the researchers proposed four as urinary metabolite biomarkers due to their metabolic stability, including one metabolite whose reference standard is currently available. A panel comprising one or a combination of these four newly established urinary biomarkers was developed for diagnosing consumption of ADB-BUTINACA.

Moving forward, the team plans to extend their current strategy to better understand the disposition of novel metabolites of synthetic cannabinoids by kidneys and their eventual occurrence in urine.

Source/Credit: National University of Singapore


Sandia uncovers hidden factors that affect solar farms during severe weather

Sandia National Laboratories researchers Thushara Gunda, front, and Nicole Jackson examine solar panels at Sandia’s Photovoltaic Systems Evaluation Laboratory as summer monsoon clouds roll by. Using machine learning and data from solar farms across the U.S., they found that the age of a solar farm and the amount of cloud cover have pronounced effects on farm performance during severe weather.
(Photo by Randy Montoya)

 Sandia National Laboratories researchers combined large sets of real-world solar data and advanced machine learning to study the impacts of severe weather on U.S. solar farms, and sort out what factors affect energy generation. Their results were published earlier this month in the scientific journal Applied Energy.

Hurricanes, blizzards, hailstorms and wildfires all pose risks to solar farms both directly in the form of costly damage and indirectly in the form of blocked sunlight and reduced electricity output. Two Sandia researchers scoured maintenance tickets from more than 800 solar farms in 24 states and combined that information with electricity generation data and weather records to assess the effects of severe weather on the facilities. By identifying the factors that contribute to low performance, they hope to increase the resiliency of solar farms to extreme weather.

“Trying to understand how future climate conditions could impact our national energy infrastructure is exactly what we need to be doing if we want our renewable energy sector to be resilient under a changing climate,” said Thushara Gunda, the senior researcher on the project. “Right now, we’re focused on extreme weather events, but eventually we’ll extend into chronic exposure events like consistent extreme heat.”

Hurricanes and snow and storms, oh my!

The Sandia research team first used natural-language processing, a type of machine learning used by smart assistants, to analyze six years of solar maintenance records for key weather-related words. The analysis methods they used for this study have since been published and are freely available for other photovoltaic researchers and operators.
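Sandia’s published toolchain is considerably more sophisticated, but the core idea — scanning free-text maintenance tickets for weather-related terms and tallying how often each appears — can be sketched in a few lines of Python. The ticket strings and keyword list below are invented for illustration, not taken from the Sandia dataset:

```python
from collections import Counter
import re

# Hypothetical maintenance tickets; the real records number in the thousands.
tickets = [
    "Inverter offline after hurricane-force winds damaged racking",
    "Snow accumulation on panels, output reduced",
    "Lightning strike tripped breaker at combiner box",
    "Routine inspection, no issues found",
    "Storm flooding blocked site access road",
]

# Weather-related keywords to flag (an assumed list, not Sandia's).
WEATHER_TERMS = {"hurricane", "snow", "storm", "lightning", "wind", "hail", "flood"}

def weather_mentions(texts, terms):
    """Count how many tickets mention each weather term (prefix match
    so 'flooding' counts toward 'flood', 'winds' toward 'wind')."""
    counts = Counter()
    for text in texts:
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        for term in terms:
            if any(tok.startswith(term) for tok in tokens):
                counts[term] += 1
    return counts

counts = weather_mentions(tickets, WEATHER_TERMS)
weather_related = sum(1 for t in tickets if weather_mentions([t], WEATHER_TERMS))
print(counts.most_common())
```

A tally like this is what lets analysts decide which weather events are worth modeling before any heavier machine learning is applied.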

“Our first step was to look at the maintenance records to decide which weather events we should even look at,” said Gunda. “The photovoltaic community talks about hail a lot, but the data in the maintenance records tell a different story.”

While hailstorms tend to be very costly, they did not appear in solar farm maintenance records, likely because operators tend to document hail damage in the form of insurance claims, Gunda said. Instead, she found that hurricanes were mentioned in almost 15% of weather-related maintenance records, followed by other weather terms such as snow, storm, lightning and wind.

“Some hurricanes damage racking — the structure that holds up the panels — due to the high winds,” said Nicole Jackson, the lead author on the paper. “The other major issue we’ve seen from the maintenance records and talking with our industry partners is flooding blocking access to the site, which delays the process of turning the plant back on.”

Using machine learning to find the most important factors

Next, they combined more than two years of real-world electricity production data from more than 100 solar farms in 16 states with historical weather data to assess the effects of severe weather on solar farms. They used statistics to find that snowstorms had the highest effect on electricity production, followed by hurricanes and a general group of other storms.

Then they used a machine learning algorithm to uncover the hidden factors that contributed to low performance from these severe weather events.

“Statistics gives you part of the picture, but machine learning was really helpful in clarifying what are those most important variables,” said Jackson, who primarily conducted statistical analysis and the machine learning portion of the project. “Is it where the site is located? Is it how old the site is? Is it how many maintenance tickets were submitted on the day of the weather event? We ended up with a suite of variables and machine learning was used to home in on the most important ones.”
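The article does not name the specific algorithm, but the “which variables matter most” question Jackson describes is commonly answered with permutation importance: shuffle one variable at a time and measure how much a model’s error grows. Here is a minimal, library-free sketch on an invented toy dataset (the feature names, data, and stand-in model are all assumptions for illustration):

```python
import random

random.seed(0)

# Invented toy data: weather-event performance loss depends strongly on
# farm age, weakly on cloud cover, and not at all on ticket count.
n = 200
age = [random.uniform(0, 20) for _ in range(n)]
cloud = [random.uniform(0, 1) for _ in range(n)]
tickets = [float(random.randint(0, 5)) for _ in range(n)]
y = [2.0 * a + 0.5 * c for a, c in zip(age, cloud)]

features = {"age": age, "cloud_cover": cloud, "tickets": tickets}

def predict(a, c, t):
    # Stand-in for a trained model; here it reproduces the true relationship.
    return 2.0 * a + 0.5 * c

def mse(cols):
    preds = [predict(a, c, t) for a, c, t in zip(*cols)]
    return sum((p - true) ** 2 for p, true in zip(preds, y)) / n

baseline = mse([features["age"], features["cloud_cover"], features["tickets"]])

# Permutation importance: shuffle one feature at a time and measure
# how much the model's error grows relative to the baseline.
importance = {}
for name in features:
    cols = {k: list(v) for k, v in features.items()}
    random.shuffle(cols[name])
    importance[name] = mse([cols["age"], cols["cloud_cover"], cols["tickets"]]) - baseline

ranked = sorted(importance, key=importance.get, reverse=True)
print(ranked)
```

Shuffling the unused `tickets` column leaves the error unchanged, while shuffling `age` degrades it most — exactly the kind of ranking that lets analysts home in on the variables that drive low performance.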

She found that across the board, older solar farms were affected the most by severe weather. One possibility for this is that solar farms that had been in operation for more than five years had more wear-and-tear from being exposed to the elements longer, Jackson said.

Gunda agreed, adding, “This work highlights the importance of ongoing maintenance and further research to ensure photovoltaic plants continue to operate as intended.”

For snowstorms, which unexpectedly were the type of storm with the highest effect on electricity production, the next most important variables were low sunlight levels at the location due to cloud cover and the amount of snow, followed by several geographical features of the farm.

For hurricanes — principally hurricanes Florence and Michael — the amount of rainfall and the timing of the nearest hurricane had the next highest effect on production after age. Surprisingly, low wind speeds were significant. This is likely because when high wind speeds are predicted, solar farms are preemptively shut down so that employees can evacuate, leading to no production, Gunda said.

Expanding the approach to wildfires, the grid

As an impartial research institution in this space, Sandia was able to collaborate with multiple industry partners to make this work feasible. “We would not have been able to do this project without those partnerships,” Gunda said.

The research team is working to extend the project to study the effect of wildfires on solar farms. Since wildfires aren’t mentioned in maintenance logs, they were not able to study them for this paper. Operators don’t stop to write a maintenance report when their solar farm is being threatened by a wildfire, Gunda said. “This work highlights the reality of some of the data limitations we have to grapple with when studying extreme weather events.”

“The cool thing about this work is that we were able to develop a comprehensive approach of integrating and analyzing performance data, operations data and weather data,” Jackson said. “We’re extending the approach into wildfires to examine their performance impacts on solar energy generation in greater detail.”

The researchers are currently expanding this work to look at the effects of severe weather on the entire electrical grid, add in more production data, and answer even more questions to help the grid adapt to the changing climate and evolving technologies.

This research was supported by the Department of Energy’s Solar Energy Technologies Office and was conducted in partnership with the National Renewable Energy Laboratory.

Source/Credit: Sandia National Laboratories


Extreme sea levels to become much more common as Earth warms


Extreme sea levels along coastlines across the world will become 100 times more frequent by the end of the century.
Image: Pexels
Global warming will cause extreme sea levels to occur almost every year by the end of the century, impacting major coastlines worldwide, according to new research from an international team of scientists.

Published today in Nature Climate Change, the research predicts that because of rising temperatures, extreme sea levels along coastlines across the world will become 100 times more frequent by the end of the century in about half of the 7,283 locations studied.

Co-author of the study, University of Melbourne ocean engineering researcher Dr Ebru Kirezci, said areas where the frequency of extreme sea levels is expected to increase fastest include the Southern Hemisphere and subtropical areas, the Mediterranean Sea and the Arabian Peninsula, the southern half of North America’s Pacific Coast, and areas including Hawaii, the Caribbean, the Philippines and Indonesia.

“What we can also infer from this study is that most of the eastern, southern and southwestern coastlines of Australia will be impacted, with almost an annual frequency of these extreme sea levels by 2100,” Dr Kirezci said.

“This increased frequency of extreme sea levels will occur even with a global temperature increase of 1.5 degrees Celsius. And the changes are likely to come sooner than the end of the century, with many locations experiencing a 100-fold increase in extreme events even by 2070.”

Lead author of the study, climate scientist at the US Department of Energy’s Pacific Northwest National Laboratory, Dr Claudia Tebaldi said it was no surprise that sea level rise will be dramatic even at 1.5 degrees and will have substantial effects on extreme sea level frequencies and magnitude.

“This study gives a more complete picture around the globe. We were able to look at a wider range of warming levels in very fine spatial detail,” Dr Tebaldi said.

The researchers called for more detailed studies to understand how the changes will impact communities within different countries. They added that the physical changes that the study describes will have varying impacts at local scales, depending on several factors, including how vulnerable the site is to rising waters and how prepared a community is for change.

“Public policy makers should take note of these studies and work towards improving coastal protection and mitigation measures. Building dykes and sea walls, retreating from shorelines, and deploying early warning systems are some of the steps which can be taken to adapt to this change,” Dr Kirezci said.

The research was led by the US based Joint Global Change Research Institute in collaboration with researchers from the University of Melbourne, IHE Delft Institute for Water Education in the Netherlands, the European Joint Research Centre in Italy, Princeton University, the University of Illinois, Rutgers University and the University of Bologna.

The study was funded by the US Environmental Protection Agency and the US Department of Energy’s Office of Science.

Source/Credit: University of Melbourne


Vaccine candidates for Ebola

Axel Lehrer in his lab at the John A. Burns School of Medicine.

Researchers at the University of Hawaiʻi at Mānoa John A. Burns School of Medicine (JABSOM) have demonstrated the efficacy in monkeys of multiple vaccine candidates targeting three filoviruses that cause life-threatening infections in humans: Ebola virus, Sudan virus and Marburg virus. The new findings were published in Frontiers in Immunology on August 18.

Associate Professor Axel Lehrer of the Department of Tropical Medicine, Medical Microbiology and Pharmacology leads the JABSOM team, working in collaboration on this project with late-stage biopharmaceutical company Soligenix, Inc., and with the local development partner, Hawaii Biotech, Inc. The team also reported another breakthrough in demonstrating successful thermostabilization in single vials of Filovirus vaccines in Vaccine.

“Filoviruses are endemic in areas of the world where the power supply can be uncertain, making a thermostable vaccine particularly valuable,” said Lehrer. “Our work to date has demonstrated not only the feasibility of rapid and efficient manufacturing, but also the applicability of thermostabilization of multiple antigens with the potential for a broadly applicable and easily distributed vaccine.”

Lehrer’s work has focused on creating shelf-stable vaccines that require no refrigeration or freezing, which is key to eradicating viruses in tropical countries, and allows equitable distribution of much needed vaccines to communities around the globe.

According to Lehrer, once developed, such a vaccine may be able to rapidly address emerging outbreaks, such as the Marburg virus infection that appeared in Guinea recently. The collaborators believe that this technology may be an important contribution to National Institute of Allergy and Infectious Diseases Director Anthony Fauci’s proposed idea to develop prototype vaccines against the top 20 viral families that may also cause pandemics.

“Having such a platform available would likely enable broader and faster worldwide vaccination campaigns addressing future health emergencies. In addition, the ability to combine antigens in the formulation also enables generation of potentially broader protective vaccines,” Lehrer said.

COVID-19 vaccine update

Since March 2020, Lehrer has also been working with Soligenix on a promising thermostable COVID-19 vaccine. “While much progress has been made since the initial announcement of our collaborative research, we are actively working on further analysis of the neutralizing potential of the vaccine candidate against a number of virus variants,” he said. The vaccine is being developed using the same thermostable platform that was used for filovirus vaccines and has demonstrated promising results in mice and non-human primates.

Source / Credit: University of Hawaiʻi


Monday, August 30, 2021

Pathways to production

 Biologists at Sandia National Laboratories developed comprehensive software that will help scientists in a variety of industries create engineered chemicals more quickly and easily. Sandia is now looking to license the software for commercial use, researchers said.

Sandia’s stand-alone software RetSynth uses a novel algorithm to sort through large, curated databases of biological and chemical reactions, which could help scientists synthetically engineer compounds used in the production of biofuels, pharmaceuticals, cosmetics, industrial chemicals, dyes, scents and flavors.

A graphic illustration of the kind of retrosynthetic analysis conducted by RetSynth software developed at Sandia National Laboratories. Using a novel algorithm, the software identifies the biological or chemical reactions needed to create a desired biological product or compound.
(Graphic by Laura Hatfield)

The software platform uses retrosynthetic analysis to help scientists identify possible pathways to production — the series of biological and chemical reactions, or steps, needed to engineer and modify the molecules in a cell — to create the desired biological product or compound. By using the software to rapidly analyze all pathways, scientists can determine the production sequence with the fewest steps, the sequences that can be completed with available resources or the most economically viable process.
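RetSynth’s own algorithm and databases are not public, but the “fewest steps” search described above is conceptually a shortest-path problem over a graph whose nodes are compounds and whose edges are known reactions. A toy breadth-first-search sketch — the compound names and reaction database below are invented for illustration:

```python
from collections import deque

# Hypothetical reaction database: product -> list of (precursor, reaction name).
reactions = {
    "target_compound": [("intermediate_B", "rxn3"), ("intermediate_C", "rxn4")],
    "intermediate_B":  [("intermediate_A", "rxn2")],
    "intermediate_C":  [("feedstock", "rxn5")],
    "intermediate_A":  [("feedstock", "rxn1")],
}

def shortest_pathway(target, available):
    """Search backwards (retrosynthetically) from the target until an
    available feedstock is reached; BFS guarantees the fewest steps."""
    queue = deque([(target, [])])
    seen = {target}
    while queue:
        compound, path = queue.popleft()
        if compound in available:
            return list(reversed(path))  # forward order: feedstock -> target
        for precursor, rxn in reactions.get(compound, []):
            if precursor not in seen:
                seen.add(precursor)
                queue.append((precursor, path + [rxn]))
    return None  # no route from the available compounds

pathway = shortest_pathway("target_compound", {"feedstock"})
print(pathway)
```

Here the two-step route through `intermediate_C` wins over the three-step route through `intermediate_A` and `intermediate_B`; a production tool would additionally weight edges by cost, yield, or resource availability rather than treating every step equally.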

Synthetic biology involves redesigning organisms for useful purposes by engineering them to have new abilities. Researchers and companies around the world are using synthetic biology to harness the power of nature to solve problems in medicine — such as the development of vaccines, antibodies and therapeutic treatments — as well as in manufacturing and agriculture.

“Synthetic biology is becoming a critical capability for U.S. manufacturing. It has the potential to dramatically reduce waste, eliminate or curtail emissions and create next-generation therapeutics and materials,” said Corey Hudson, a computational biologist at Sandia. “That is where people will see RetSynth have the biggest impact.”

“The diverse functionality of RetSynth opens a lot of opportunities for researchers, giving them multiple options, including biological, chemical or hybrid pathways to production,” Hudson said. “All the while, the software is accelerating the research and development process associated with bioproduction. Traditionally, this process has been relatively slow and complex.”

RetSynth is designed to save researchers time and money by suggesting process modifications to maximize theoretical yield, or the amount of bioproduct that could be produced, Hudson said. All available pathways are rendered using clear visual images, enabling software users to quickly interpret results.

Commercial licensing for broader impact

The RetSynth software was originally developed as part of the Department of Energy’s Co-Optimization of Fuels & Engines initiative, a consortium of national lab, university and industry researchers who are creating innovative fuels and combining them with high-efficiency engines to reduce emissions and boost fuel economy.

Today, RetSynth has been expanded to support a variety of diverse applications, and Sandia is ready to license the software to an industry partner for commercial use, Hudson said.

Source / Credit: Sandia National Laboratories


Carbon nanotube fibers woven into clothing gather accurate EKG


Rice University graduate student Lauren Taylor shows a shirt with carbon nanotube thread that provides constant monitoring of the wearer’s heart. Photo by Jeff Fitlow
There’s no need to don uncomfortable smartwatches or chest straps to monitor your heart if your comfy shirt can do a better job.

That’s the idea behind “smart clothing” developed by a Rice University lab, which employed its conductive nanotube thread to weave functionality into regular apparel.

The Brown School of Engineering lab of chemical and biomolecular engineer Matteo Pasquali reported in the American Chemical Society journal Nano Letters that it sewed nanotube fibers into athletic wear to monitor the heart rate and take a continual electrocardiogram (EKG) of the wearer.

The fibers are just as conductive as metal wires, but washable, comfortable and far less likely to break when a body is in motion, according to the researchers.

On the whole, the shirt they enhanced was better at gathering data than a standard chest-strap monitor taking live measurements during experiments. When matched with commercial medical electrode monitors, the carbon nanotube shirt gave slightly better EKGs.

“The shirt has to be snug against the chest,” said Rice graduate student Lauren Taylor, lead author of the study. “In future studies, we will focus on using denser patches of carbon nanotube threads so there’s more surface area to contact the skin.”

The researchers noted nanotube fibers are soft and flexible, and clothing that incorporates them is machine washable. The fibers can be machine-sewn into fabric just like standard thread. The zigzag stitching pattern allows the fabric to stretch without breaking them.

The fibers provided not only steady electrical contact with the wearer’s skin but also served as electrodes to connect electronics like Bluetooth transmitters to relay data to a smartphone or connect to a Holter monitor that can be stowed in a user’s pocket, Taylor said.

Pasquali’s lab introduced carbon nanotube fiber in 2013. Since then the fibers, each containing tens of billions of nanotubes, have been studied for use as bridges to repair damaged hearts, as electrical interfaces with the brain, for use in cochlear implants, as flexible antennas and for automotive and aerospace applications. Their development is also part of the Rice-based Carbon Hub, a multiuniversity research initiative led by Rice and launched in 2019.

A Rice lab uses a custom device that weaves carbon nanotube fibers into larger threads for sewing. Photo by Jeff Fitlow

The original nanotube filaments, at about 22 microns wide, were too thin for a sewing machine to handle. Taylor said a rope-maker was used to create a sewable thread, essentially three bundles of seven filaments each, woven into a size roughly equivalent to regular thread.

“We worked with somebody who sells little machines designed to make ropes for model ships,” said Taylor, who at first tried to weave the thread by hand, with limited success. “He was able to make us a medium-scale device that does the same.”

She said the zigzag pattern can be adjusted to account for how much a shirt or other fabric is likely to stretch. Taylor said the team is working with Dr. Mehdi Razavi and his colleagues at the Texas Heart Institute to figure out how to maximize contact with the skin.

Fibers woven into fabric can also be used to embed antennas or LEDs, according to the researchers. Minor modifications to the fibers’ geometry and associated electronics could eventually allow clothing to monitor vital signs, force exertion or respiratory rate.

Taylor noted other potential uses could include human-machine interfaces for automobiles or soft robotics, or as antennas, health monitors and ballistic protection in military uniforms. “We demonstrated with a collaborator a few years ago that carbon nanotube fibers are better at dissipating energy on a per-weight basis than Kevlar, and that was without some of the gains that we’ve had since in tensile strength,” she said.

“We see that, after two decades of development in labs worldwide, this material works in more and more applications,” Pasquali said. “Because of the combination of conductivity, good contact with the skin, biocompatibility and softness, carbon nanotube threads are a natural component for wearables.”

He said the wearable market, although relatively small, could be an entry point for a new generation of sustainable materials that can be derived from hydrocarbons via direct splitting, a process that also produces clean hydrogen. Development of such materials is a focus of the Carbon Hub.

“We’re in the same situation as solar cells were a few decades ago,” Pasquali said. “We need application leaders that can provide a pull for scaling up production and increasing efficiency.”

Co-authors of the paper are Rice graduate students Steven Williams and Oliver Dewey, and alumni J. Stephen Yan, now at Boston Consulting Group, and Flavia Vitale, an assistant professor of neurology at the University of Pennsylvania. Pasquali is director of the Carbon Hub and the A.J. Hartsook Professor of Chemical and Biomolecular Engineering and a professor of chemistry and of materials science and nanoengineering.

The research was supported by the U.S. Air Force (FA9550-15-1-0370), the American Heart Association (15CSA24460004), the Robert A. Welch Foundation (C-1668), the Department of Energy (DE-EE0007865, DE-AR0001015), the Department of Defense (32 CFR 168a) and a Riki Kobayashi Fellowship from the Rice Department of Chemical and Biomolecular Engineering.

Source/Credit: Rice University


Space Weather Questions

We’ll be bringing back Space Weather to Scientific Frontline soon in a reduced version of the original web site. So for those who don’t know much about space weather, here is a very informative video.

Sunday, August 29, 2021

Will it be safe for humans to fly to Mars?


Credit: NASA
Sending human travelers to Mars would require scientists and engineers to overcome a range of technological and safety obstacles. One of them is the grave risk posed by particle radiation from the sun, distant stars and galaxies.

Answering two key questions would go a long way toward overcoming that hurdle: Would particle radiation pose too grave a threat to human life throughout a round trip to the red planet? And, could the very timing of a mission to Mars help shield astronauts and the spacecraft from the radiation?

In a new article published in the peer-reviewed journal Space Weather, an international team of space scientists, including researchers from UCLA, answers those two questions with a “no” and a “yes.”

That is, humans should be able to safely travel to and from Mars, provided that the spacecraft has sufficient shielding and the round trip is shorter than approximately four years. And the timing of a human mission to Mars would indeed make a difference: The scientists determined that the best time for a flight to leave Earth would be when solar activity is at its peak, known as the solar maximum.

The scientists’ calculations demonstrate that it would be possible to shield a Mars-bound spacecraft from energetic particles from the sun because, during solar maximum, the most dangerous and energetic particles from distant galaxies are deflected by the enhanced solar activity.

A trip of that length would be conceivable. The average flight to Mars takes about nine months, so depending on the timing of launch and available fuel, it is plausible that a human mission could reach the planet and return to Earth in less than two years, according to Yuri Shprits, a UCLA research geophysicist and co-author of the paper.

“This study shows that while space radiation imposes strict limitations on how heavy the spacecraft can be and the time of launch, and it presents technological difficulties for human missions to Mars, such a mission is viable,” said Shprits, who also is head of space physics and space weather at GFZ Research Centre for Geosciences in Potsdam, Germany.

The researchers recommend a mission not longer than four years because a longer journey would expose astronauts to a dangerously high amount of radiation during the round trip — even assuming they went when it was relatively safer than at other times. They also report that the main danger to such a flight would be particles from outside of our solar system.

Shprits and colleagues from UCLA, MIT, Moscow’s Skolkovo Institute of Science and Technology and GFZ Potsdam combined geophysical models of particle radiation for a solar cycle with models for how radiation would affect both human passengers — including its varying effects on different bodily organs — and a spacecraft. The modeling determined that having a spacecraft’s shell built out of a relatively thick material could help protect astronauts from radiation, but that if the shielding is too thick, it could actually increase the amount of secondary radiation to which they are exposed.

The two main types of hazardous radiation in space are solar energetic particles and galactic cosmic rays; the intensity of each depends on solar activity. Galactic cosmic ray activity is lowest within the six to 12 months after the peak of solar activity, while solar energetic particles’ intensity is greatest during solar maximum, Shprits said.

Source / Credit: UCLA


Saturday, August 28, 2021

Exposure to air pollution linked with increased mental health issues


Exposure to traffic-related air pollution is associated with increased mental health service-use among people recently diagnosed with psychotic and mood disorders such as schizophrenia and depression, a study on data from over 13,000 people has found.

Increased use of mental health services reflects mental illness severity, suggesting that initiatives to lessen air pollution could improve outcomes for those with these disorders and reduce costs of the healthcare needed to support them.

The research was published in the British Journal of Psychiatry and funded by the National Institute for Health Research (NIHR) Maudsley Biomedical Research Centre.

In 2019, 119,000 people in London lived with illegal levels of air pollution. Previous research has found that adults exposed to high levels of traffic-related air pollution are more likely to experience common mental health disorders such as anxiety and mild depression, but until now little was known about whether air pollution exposure affects the course and severity of illness after the onset of more serious mental disorders.

Researchers at King’s College London, University of Bristol and Imperial College London analyzed data from 13,887 people aged 15 years and over who had face-to-face contact with South London and Maudsley NHS Foundation Trust (SLaM) services between 2008 and 2012. Individuals were followed from the date of their first face-to-face contact for up to seven years.

Anonymized electronic mental health records were linked with quarterly average modelled concentrations of air pollutants (20x20 meter grid points) at the residential address of the participants. These included nitrogen dioxide and nitrogen oxides (NO2 and NOx) and fine and coarse particulate matter (PM2.5 and PM10).
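The paper’s linkage method is described only as matching records to modelled concentrations on a 20x20 meter grid; one common way to implement such a match is a nearest-grid-point lookup at each residential address. A minimal sketch — the coordinates and concentration values here are invented, and nearest-point assignment is my assumption, not necessarily the study’s exact procedure:

```python
import math

# Hypothetical 20 m x 20 m grid of modelled quarterly NO2 concentrations
# (µg/m³), keyed by grid-point easting/northing in meters. Values invented.
grid = {
    (1000, 2000): 38.2,
    (1020, 2000): 41.5,
    (1000, 2020): 36.9,
    (1020, 2020): 40.1,
}

def nearest_grid_value(x, y, grid):
    """Assign a residential address at coordinates (x, y) the modelled
    concentration of its nearest grid point."""
    point = min(grid, key=lambda p: math.hypot(p[0] - x, p[1] - y))
    return grid[point]

# An address at (1013, 2004) is closest to the grid point at (1020, 2000).
print(nearest_grid_value(1013, 2004, grid))
```

In a real linkage the grid would hold millions of points, so a spatial index (k-d tree or similar) would replace the linear scan, but the assignment logic is the same.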

The study found people exposed to higher residential levels of air pollutants used mental healthcare services more frequently in the months and years following their initial presentation to secondary mental healthcare services compared to those exposed to lower air pollution.

The researchers found that for every 3 micrograms per cubic meter increase in fine particulate matter (PM2.5) and every 15 micrograms per cubic meter increase in nitrogen dioxide (NO2) over a one-year period, the risk of an inpatient stay increased by 11 per cent and 18 per cent respectively. The same increases in PM2.5 and NO2 were associated with a 7 per cent and 32 per cent increased risk, respectively, of requiring community-based mental healthcare over the same period. These findings were also replicated over a seven-year period.

Dr Ioannis Bakolis, Senior Lecturer in Biostatistics and Epidemiology at the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) King’s College London and lead author of the study, said: ‘There is already evidence linking air pollution to the incidence of mental disorders, but our novel findings suggest that air pollution could also play a role in the severity of mental disorders for people with pre-existing mental health conditions.’

He continued: ‘Our research indicates that air pollution is a major risk factor for increased severity of mental disorders. It is also a risk factor that is easily modifiable which suggests more public health initiatives to reduce exposure such as low emission zones could improve mental health outcomes as well as reduce the high healthcare costs caused by long-term chronic mental illness.’

According to the researchers, if the UK urban population’s exposure to PM2.5 was reduced by just a few units to the World Health Organization's recommended annual limit (10 micrograms per cubic metre), this would reduce usage of mental health services by around two per cent, thereby saving tens of millions of pounds each year in associated healthcare costs.

Dr Joanne Newbury, Sir Henry Wellcome Research Fellow, Bristol Medical School (PHS), and the study’s first author, added: ‘We observed these findings for both mood disorders and psychotic disorders, as well as for both inpatient and community-based mental healthcare, and over seven years follow-up. This suggests that air pollution may contribute to a broad range of mental health problems, across a wide spectrum of clinical need, and over long periods of time.

‘We now plan to examine whether air pollution is associated with a broader range of mental health, neurodevelopmental, and educational outcomes, particularly among children, who might be especially vulnerable to air pollution.’

South London and Maudsley NHS Foundation Trust provides comprehensive secondary mental healthcare to approximately 1.36 million people within the London boroughs of Croydon, Lambeth, Lewisham and Southwark. These are inner-city areas with high traffic flows and, compared with other UK urban areas, high average air pollution concentrations; they also reflect London’s diversity in terms of ethnicity and wealth.

The researchers controlled the analyses for a number of variables that could confound the association between air pollution and service use, such as deprivation, population density, age, season, marital status and ethnicity. However, they cautioned that the study does not prove cause and effect, and further research is needed to demonstrate exactly how air pollution might increase the severity of mental health problems.

Dr Adrian James, President of the Royal College of Psychiatrists, said: ‘The environmental and climate emergency is also a mental health emergency. Our health is fundamentally linked to the quality of our environment, whether that's about cleaner air, access to green spaces or protection from extreme weather.

‘If air pollution is exacerbating pre-existing serious mental illnesses, such as schizophrenia, bipolar disorder and major depression, then improving air quality could reduce the pressure on mental health services. As we look ahead to our post-pandemic future, it is vital that we find ways to build back greener and prevent poor health. This important research presents a clear example where these go hand-in-hand.’

The research was funded by the National Institute for Health Research (NIHR), the NIHR Maudsley Biomedical Research Centre, Wellcome, the Economic and Social Research Council, and the UK Medical Research Council.

Press Release
Source / Credit: University of Bristol

Rice lab dives deep for DNA’s secrets

 The poor bacteriophages in Yang Gao’s lab are about to have a lot of bad days.

Yang Gao

That’s all to the good for the structural biologist, who has received a prestigious Maximizing Investigators’ Research Award for New and Early Stage Investigators from the National Institutes of Health to make the lives of viruses harder so ours can be better.

The five-year grant for $1.9 million, administered by the National Institute of General Medical Sciences, will help Gao and his group detail the mechanisms of proteins that produce copies of genomic DNA, and what can go awry when they’re either subjected to stress or face other barriers.

A better understanding of the structural framework of DNA replication, stress response and repair at the atomic level could help find new ways to target processes involved in a host of diseases, including cancer.

“We’re interested in the basic question of how DNA is replicated,” said Gao, an assistant professor of biosciences who joined Rice in 2019 with the backing of a grant from the Cancer Prevention and Research Institute of Texas. “We’ve known for a long time that DNA is a fragile molecule and subject to many different assaults, environmental and physiological, like ultraviolet radiation from sunlight and oxidative species.

“So many things damage DNA,” he said. “Despite that, DNA replication has to keep on going, even if there are errors, with an enzyme called DNA polymerase and a motor called the helicase.”

A study of stress on bacteriophage T7 will help Rice structural biologist Yang Gao and his team to reveal the atomic-scale mechanisms of DNA replication. (Credit: Yang Gao Lab/Rice University)

These are part of the replisome, a complex chain of proteins that carry out DNA replication and help repair DNA on the fly. Part of their normal function is to catch and fix coding errors. “When they see something bad they call for help, either before or after replication,” Gao said. “But how that works is still unknown, and we want to figure it out.”

The lab will start with the T7 bacteriophage, a virus whose infection mechanism in Escherichia coli bacteria is a good analog for what happens in humans.

“During my postdoc, we solved the first structure of the T7 replisome to show how T7 comes together at a replication site,” he said. “We’ve continued that work at Rice, and we’re using the system to explore how it deals with different types of damage.”

The lab will then study the structure of mitochondria, the “power plants” inside cells, to see how DNA mutations produced there could lead to genetic diseases. “These two systems are mechanistically similar, and because we have experience with T7 and we’ve recently established a mitochondrial hub, we’re in a good position to start this investigation,” Gao said.

He noted that he will continue to collaborate with Rice physicist Peter Wolynes and his group, which produces models that advance the theory of DNA replication. The lab also plans to make use of a new transmission electron microscope slated for Rice’s BioScience Research Collaborative.

Press Release
Source / Credit: Rice University
