Scientific Frontline

Tuesday, September 7, 2021

Koala killer being passed to joeys from mum


A deadly koala virus that can cause immune depletion and cancer, known as koala retrovirus, is being transferred to joeys from their mothers, according to University of Queensland scientists.

Associate Professor Keith Chappell, from UQ’s School of Chemistry and Molecular Biosciences, said the virus predisposes koalas to chlamydia and other diseases, and was having a large impact on wild koala populations across Queensland and New South Wales.

“Koala retrovirus – also known as KoRV – and associated diseases are another threat facing koalas, along with climate change and habitat loss.

“The virus causes immune depletion, likely making it much harder for koalas to cope with these other, already-detrimental environmental stressors.

“All northern koalas share a single, highly conserved version of KoRV that is integrated into the koala genome; however, until now, we weren’t certain how other disease-causing variants are spread.

“By sequencing variations of the virus DNA in 109 captive koalas, we finally revealed how the virus spreads – from mother to joey.

“It seems that transmission between mother and joey occurs due to close proximity, via a joey’s exposure to a mother’s potentially infectious fluids, such as milk.

“Mothers were sharing their virus variants three times more often than fathers, suggesting this is the dominant pathway of spread for the virus.

“And, unlike other diseases affecting koalas like chlamydia, there’s no evidence of sexual transmission.”

The 109 koalas were housed at two sites in south-east Queensland, and the study identified a total of 421 unique koala retrovirus sequences.

Lead author and PhD candidate Briony Joyce said the research may lead to a rethink of how conservation plans are executed.

“This work will be highly informative for koala conservation, as it suggests that captive breeding programs focused on mothers that carry a low number of retrovirus variants could result in healthier animals for release,” Ms Joyce said.

“Also, we propose that antiretroviral treatment – if shown to be safe in koalas and effective against KoRV – could be used specifically in mothers during breeding seasons to prevent transmission.

“This work helps pave the way for evidence-based conservation, increasing koala resilience to help them cope with a changing and challenging environment.

“We must do everything we can to ensure the survival of this culturally important species.”

Source/Credit: University of Queensland


Scientists awarded $6 million to plan brain-inspired computer that runs on probability


Conventional computers can look at the optical illusion on the left and normally only see a vase or two faces. Sandia National Laboratories is laying the groundwork for a computer that, like our brains, can glance many times and see both.
(Image by Laura Hatfield)

If you’ve ever asked a car mechanic how long a part will last until it breaks, odds are they shrugged their shoulders. They know how long parts last on average, and they can see when one is close to breaking. But knowing how many miles are left is extremely difficult, even using a supercomputer, because the exact moment a belt snaps or a battery dies is to some extent random.

Scientists at Sandia National Laboratories are creating a concept for a new kind of computer for solving complex probability problems like this. They propose that a “probabilistic computer” could not only create smarter maintenance schedules but also help scientists analyze subatomic shrapnel inside particle colliders, simulate nuclear physics experiments and process images faster and more accurately than is possible with conventional computers.

As part of a new microelectronics codesign research program, the Department of Energy’s Office of Science recently awarded the project $6 million over the next three years to develop the idea. Sandia will be working with Oak Ridge National Laboratory, New York University, the University of Texas at Austin and Temple University in Philadelphia.

A codesign microelectronics project involves multidisciplinary collaboration that takes into account the interdependencies among materials, physics, architectures and software. Researchers also will look at ways to incorporate machine learning methods.

The concept for a probabilistic computer runs opposite to how computers are normally built and programmed, Sandia scientist Brad Aimone said. Instead of making one that is perfectly predictable, Sandia wants one with built-in randomness that computes information differently every time.

“To a large degree, and at a great energy cost, we engineer computers to eliminate randomness. What we want to do in this project is to leverage randomness. Instead of fighting it, we want to use it,” said Aimone, who leads the project he and his team call COINFLIPS (short for CO-designed Improved Neural Foundations Leveraging Inherent Physics Stochasticity).

“What if, when I’m communicating with you, I flip a coin?” Aimone said. “If heads, you act on my message; if tails, you ignore it. We want to discover how you can use randomness like this to solve problems where probability is important.”
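Aimone’s coin-flip analogy can be sketched in a few lines of code. This is only a toy illustration, not Sandia’s actual design: each receiver flips a coin and acts on a message only on heads, so any single run is random, yet the aggregate response is predictable on average.

```python
import random

def broadcast(message_strength, n_receivers, p_heads=0.5):
    """Toy coin-flip communication: each receiver independently flips a
    coin and acts on the message only if it comes up heads."""
    acted = sum(1 for _ in range(n_receivers) if random.random() < p_heads)
    return acted * message_strength

random.seed(42)
# Any single broadcast differs, but averaged over many runs the received
# total approaches p_heads * n_receivers * message_strength (here, ~500).
runs = [broadcast(1.0, 1000) for _ in range(200)]
average = sum(runs) / len(runs)
```

The point of the sketch is that useful, repeatable statistics can emerge from individually unpredictable events, which is the property a probabilistic computer would exploit.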

Concept modeled after unpredictable connections between brain cells

Aimone is an expert in technology that mimics the brain, including machine learning. He got his idea for a probabilistic computer from how brain cells talk to each other.

Inside your brain there are billions of cells called neurons that pass information across trillions of cell-to-cell connections called synapses, Aimone said. Whenever one neuron has a message, it sends a signal to lots of other neurons at the same time. But only a random fraction of the receiving neurons carry the message on to more cells. Neuroscientists don’t agree on why, but Aimone thinks it could be a reason why brains do some tasks better than computers, such as learning and adapting, or why they use less energy.

To imitate this brain behavior, scientists need to figure out how to generate trillions of random numbers at a time. That much randomness is too complex and takes too much power for computers, said Sandia’s Shashank Misra, who leads the COINFLIPS hardware team.

“We will need to get creative with new approaches, including new materials, atomic-scale control and machine learning-driven designs to generate the sheer volume of randomness needed and to make it useful for computation,” Misra said.

COINFLIPS will also identify tasks that benefit from randomness.

Probabilistic computers are part of a larger effort at Sandia to explore what computers in the future might look like. Researchers around the world have recognized that the rate at which computers are improving is slowing down, Aimone said. To break past the apparent limits of computers, scientists are looking at new, original ways of designing them.

Conrad James, the Sandia manager of the COINFLIPS team, said: “Several of us at Sandia have been exploring brain-inspired computing and new design approaches for years. Encouraging more communication between mathematicians, algorithm developers and device physicists led to the formation of this team and research proposal.”

Sandia adds to other efforts to rethink computers

COINFLIPS was one of only 10 proposals selected nationwide to receive funding to design new, energy-efficient microelectronics. Separately, Sandia is lending its expertise in nanotechnology and computer modeling to another selected project led by Lawrence Berkeley National Laboratory.

These researchers will be redesigning nanosized sensors used in communications, imaging, remote sensing and surveillance technologies to be more compact, efficient and integrated into a computer processor.

“The photon absorption, the transduction to an electrical event and the measurement will all be part of one quantum system,” said Sandia physicist François Léonard, who is a member of the collaboration.

They will also attempt to enhance these sensors with advanced materials, such as carbon nanotubes, hollow carbon straws that are 100,000 times thinner than a strand of hair.

A third Sandia team consisting of researchers Alec Talin and Matt Marinella will be supporting another selected project that Oak Ridge National Laboratory is leading. Their research could help improve the energy efficiency of processing information from sensors in autonomous vehicles, handheld devices and satellites.

Most of the time and energy that a computer chip needs are spent shuttling information between where it is stored and where it is processed, Talin said. But it might be possible to slash the power computers use by combining these two elements using brain-inspired devices developed at Sandia.

“The key idea is that in the brain, the memory and the logic (processing) are co-located in the same basic element, the neuron,” Talin said.

Fast, energy-efficient systems could potentially process complex tasks, such as recognizing images and translating languages in real time, on portable devices like smartphones without needing the computing power of the cloud, Talin said.

Source/Credit: Sandia National Laboratories


Monday, September 6, 2021

Messengers from gut to brain


Thomas Korn is a professor of Experimental Neuroimmunology at TUM.
Image: Magdalena Jooss / TUM
Scientists have long been aware of a link between the gut microbiome and the central nervous system (CNS). Until now, however, the immune cells that move from the gut into the CNS and thus the brain had not been identified. A team of researchers in Munich has now succeeded in using violet light to make these migrating T cells visible for the first time. This opens up avenues for developing new treatment options for diseases such as multiple sclerosis (MS) and cancer.

The link between the gut microbiome and the CNS, known as the gut/brain axis (GBA), is believed to be responsible for many things: a person’s body weight, autoimmune diseases, depression, mental illnesses and Alzheimer’s disease. Researchers at the Technical University of Munich (TUM) and LMU University Hospital Munich have now succeeded in making this connection visible for the first time. This is cause for hope – for those suffering from MS, for example. It may offer ways to adapt treatments, and T cells could perhaps be modified before reaching the brain.

The immune system is shaped by environmental factors – and in MS patients, this extends to the central nervous system. This autoimmune disease progresses in repeated flare-ups, which patients experience as the worsening or improvement of their condition. T cells collect information and, in MS patients, carry it to the central nervous system (the brain and spinal cord), where an immune response is triggered. Until now, however, it was uncertain how and from where the T cells were traveling to the CNS.

The team working with Thomas Korn, a professor of experimental neuroimmunology at TUM, has developed a method for marking immune cells in mice using photoconvertible proteins. The T cells can then be made visible with violet light. The researchers successfully tested this method with the mouse model in lymph nodes, both in the gut and the skin. They were able to track the movement of the T cells from those locations into the central nervous system.

T cells from the skin migrated into the gray and white matter of the CNS, while almost all T cells from the gut ended up in the white matter. Even for T cells that had already reached the brain, it was still possible to determine their origin. “What makes these insights so important is that they demonstrate for the first time that environmental influences impact the T cells in lymph nodes in the gut and the skin, which then carry this information into the distant organs,” says Prof. Thomas Korn. “The characteristics of the T cells are sufficiently stable for us to determine whether immune responses are influenced by skin or gut T cells,” adds LMU researcher Dr. Eduardo Beltrán, who performed the bioinformatic analyses in this study.

An important insight for MS patients: “If gut or skin cells were known to be the cause, the T cells could be treated at the source of the disease and predictions could be made on the progress of the chronic inflammation and autoimmune condition,” says first author Michael Hiltensperger. The results of the study could also mean a breakthrough for research on other autoimmune diseases or cancer.

The paper was published in the journal Nature Immunology.

Source/Credit: Technical University of Munich


Spread of Delta SARS-CoV-2 variant driven by combination of immune escape and increased infectivity


Visualization of the COVID-19 virus
Credit: Fusion Medical Animation via Unsplash
The Delta variant of SARS-CoV-2, which has become the dominant variant in countries including India and the UK, has most likely spread through its ability to evade neutralizing antibodies and its increased infectivity, say an international team of researchers.

The findings are reported today in Nature.

As SARS-CoV-2 replicates, errors in its genetic makeup cause it to mutate. Some mutations make the virus more transmissible or more infectious, some help it evade the immune response, potentially making vaccines less effective, while others have little effect. One such variant, labelled the B.1.617.2 Delta variant, was first observed in India in late 2020. It has since spread around the globe – in the UK, it is responsible for nearly all new cases of coronavirus infection.

Professor Ravi Gupta from the Cambridge Institute of Therapeutic Immunology and Infectious Disease at the University of Cambridge, one of the study’s senior authors, said: “By combining lab-based experiments and epidemiology of vaccine breakthrough infections, we’ve shown that the Delta variant is better at replicating and spreading than other commonly-observed variants. There’s also evidence that neutralizing antibodies produced as a result of previous infection or vaccination are less effective at stopping this variant.

“These factors are likely to have contributed to the devastating epidemic wave in India during the first quarter of 2021, where as many as half of the cases were individuals who had previously been infected with an earlier variant.”

To examine how well the Delta variant was able to evade the immune response, the team extracted serum from blood samples collected as part of the COVID-19 cohort of the NIHR BioResource. The samples came from individuals who had previously been infected with the coronavirus or who had been vaccinated with either the Oxford/AstraZeneca or Pfizer vaccines. Serum contains antibodies raised in response to infection or vaccination. The team found that the Delta variant virus was 5.7-fold less sensitive to the sera from previously-infected individuals, and as much as eight-fold less sensitive to vaccine sera, compared with the Alpha variant - in other words, it takes eight times as many antibodies from a vaccinated individual to block the virus.
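The fold-change reported above is simply a ratio of neutralization titers (the serum dilution that blocks half the virus in the assay). A minimal sketch, using hypothetical NT50 values chosen only to reproduce the eight-fold figure, not the study’s measured data:

```python
# Hypothetical 50% neutralization titers (NT50) for vaccine sera.
# Illustrative numbers only -- not values from the study.
nt50 = {"Alpha": 800.0, "Delta": 100.0}

# Sensitivity fold-change: how much more antibody is needed to
# neutralize Delta compared with Alpha.
fold_change = nt50["Alpha"] / nt50["Delta"]  # 8.0 -> "eight times as many antibodies"
```

A higher fold-change means the serum must be less diluted (i.e., more antibody is required) to achieve the same neutralization, which is why an eight-fold drop in sensitivity translates to needing eight times the antibodies.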

Consistent with this, an analysis of over 100 infected healthcare workers at three Delhi hospitals, nearly all of whom had been vaccinated against SARS-CoV-2, found the Delta variant to be transmitted between vaccinated staff to a greater extent than the Alpha variant.

SARS-CoV-2 is a coronavirus, so named because spike proteins on its surface give it the appearance of a crown (‘corona’). The spike proteins bind to ACE2, a protein receptor found on the surface of cells in our body. Both the spike protein and ACE2 are then cleaved, allowing genetic material from the virus to enter the host cell. The virus manipulates the host cell’s machinery to allow the virus to replicate and spread.

Using 3D airway organoids – ‘mini-organs’ grown from cells from the airway, which mimic its behaviour – the team studied what happens when the virus reaches the respiratory tract. Working under secure conditions, the team used both a live virus and a ‘pseudotyped virus’ – a synthetic form of the virus that mimicked key mutations on the Delta variant – and used this to infect the organoids. They found that the Delta variant was more efficient at breaking into the cells compared with other variants, as it carried a larger number of cleaved spikes on its surface. Once inside the cells, the variant was also better able to replicate. Both of these factors give the virus a selection advantage compared with other variants, helping explain why it has become so dominant.

Dr Partha Rakshit from the National Centre for Disease Control, Delhi, India, joint senior author, said: “The Delta variant has spread widely to become the dominant variant worldwide because it spreads faster and is better at infecting individuals than most other variants we’ve seen. It is also better at getting around existing immunity – either through previous exposure to the virus or to vaccination – though the risk of moderate to severe disease is reduced in such cases.”

Professor Anurag Agrawal from the CSIR Institute of Genomics and Integrative Biology, Delhi, India, joint senior author, added: “Infection of vaccinated healthcare workers with the Delta variant is a significant problem. Although they themselves may only experience mild COVID, they risk infecting individuals who have suboptimal immune responses to vaccination due to underlying health conditions – and these patients could then be at risk of severe disease. We urgently need to consider ways of boosting vaccine responses against variants among healthcare workers. It also suggests infection control measures will need to continue in the post-vaccine era.”

The research was largely supported in India by the Ministry of Health and Family Welfare, the Council of Scientific and Industrial Research, and the Department of Biotechnology; and in the UK by Wellcome, the Medical Research Council and the National Institute for Health Research.

Credit/Source: University of Cambridge


Hubble Discovers Hydrogen-Burning White Dwarfs Enjoying Slow Ageing

To investigate the physics underpinning white dwarf evolution, astronomers compared cooling white dwarfs in two massive collections of stars: the globular clusters M3 and M13. These two clusters share many physical properties such as age and metallicity but the populations of stars which will eventually give rise to white dwarfs are different. This makes M3 and M13 together a perfect natural laboratory in which to test how different populations of white dwarfs cool.
Image credit: ESA/Hubble & NASA, G. Piotto et al.

Could dying stars hold the secret to looking younger? New evidence from the NASA/ESA Hubble Space Telescope suggests that white dwarfs could continue to burn hydrogen in the final stages of their lives, causing them to appear more youthful than they actually are. This discovery could have consequences for how astronomers measure the ages of star clusters.

The prevalent view of white dwarfs as inert, slowly cooling stars has been challenged by observations from the NASA/ESA Hubble Space Telescope. An international group of astronomers have discovered the first evidence that white dwarfs can slow down their rate of ageing by burning hydrogen on their surface.

“We have found the first observational evidence that white dwarfs can still undergo stable thermonuclear activity,” explained Jianxing Chen of the Alma Mater Studiorum Università di Bologna and the Italian National Institute for Astrophysics, who led this research. “This was quite a surprise, as it is at odds with what is commonly believed.”

White dwarfs are the slowly cooling stars which have cast off their outer layers during the last stages of their lives. They are common objects in the cosmos; roughly 98% of all the stars in the Universe will ultimately end up as white dwarfs, including our own Sun [1]. Studying these cooling stages helps astronomers understand not only white dwarfs, but also their earlier evolutionary stages.

To investigate the physics underpinning white dwarf evolution, astronomers compared cooling white dwarfs in two massive collections of stars: the globular clusters M3 and M13 [2]. These two clusters share many physical properties such as age and metallicity [3] but the populations of stars which will eventually give rise to white dwarfs are different. In particular, the overall colour of stars at an evolutionary stage known as the Horizontal Branch are bluer in M13, indicating a population of hotter stars. This makes M3 and M13 together a perfect natural laboratory in which to test how different populations of white dwarfs cool.

“The superb quality of our Hubble observations provided us with a full view of the stellar populations of the two globular clusters,” continued Chen. “This allowed us to really contrast how stars evolve in M3 and M13.”

Using Hubble’s Wide Field Camera 3 the team observed M3 and M13 at near-ultraviolet wavelengths, allowing them to compare more than 700 white dwarfs in the two clusters. They found that M3 contains standard white dwarfs which are simply cooling stellar cores. M13, on the other hand, contains two populations of white dwarfs: standard white dwarfs and those which have managed to hold on to an outer envelope of hydrogen, allowing them to burn for longer and hence cool more slowly.

Comparing their results with computer simulations of stellar evolution in M13, the researchers were able to show that roughly 70% of the white dwarfs in M13 are burning hydrogen on their surfaces, slowing down the rate at which they are cooling. 

This discovery could have consequences for how astronomers measure the ages of stars in the Milky Way. The evolution of white dwarfs has previously been modelled as a predictable cooling process. This relatively straightforward relationship between age and temperature has led astronomers to use the white dwarf cooling rate as a natural clock to determine the ages of star clusters, particularly globular and open clusters. However, white dwarfs burning hydrogen could cause these age estimates to be inaccurate by as much as 1 billion years.
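As a back-of-the-envelope illustration of this bias (using a simplified Mestel-style power-law cooling relation, not the study’s actual stellar-evolution models), the sketch below shows why residual hydrogen burning makes a cooling clock read young: the extra luminosity causes a naive inversion of the cooling law to return a smaller age than the true one. All numbers are hypothetical.

```python
def naive_cooling_age(luminosity, k=1.0):
    """Invert a Mestel-style cooling law, L = k * t**(-7/5), to infer
    an age from observed luminosity (arbitrary units)."""
    return (k / luminosity) ** (5.0 / 7.0)

true_age = 10.0                               # Gyr, hypothetical
l_cooling = 1.0 * true_age ** (-7.0 / 5.0)    # luminosity from cooling alone
l_burning = 0.5 * l_cooling                   # hypothetical extra H-burning light

# A naive cooling clock sees the total luminosity and reads the star young.
inferred_age = naive_cooling_age(l_cooling + l_burning)
age_bias = true_age - inferred_age            # roughly 2.5 Gyr in this toy case
```

Even a modest amount of extra luminosity shifts the inferred age by billions of years in this toy model, consistent in spirit with the up-to-1-billion-year inaccuracy the researchers describe.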

“Our discovery challenges the definition of white dwarfs as we consider a new perspective on the way in which stars get old,” added Francesco Ferraro of the Alma Mater Studiorum Università di Bologna and the Italian National Institute for Astrophysics, who coordinated the study. “We are now investigating other clusters similar to M13 to further constrain the conditions which drive stars to maintain the thin hydrogen envelope which allows them to age slowly”. 


[1] The Sun is only 4.6 billion years through its roughly 10-billion-year lifetime. Once it exhausts hydrogen in its core, the Sun will swell into a red giant, engulfing the inner planets and searing the Earth’s surface. It will then throw off its outer layers, and the exposed core of the Sun will be left as a slowly cooling white dwarf. This stellar ember will be incredibly dense, packing a large fraction of the mass of the Sun into a roughly Earth-sized sphere.

[2] M3 contains roughly half a million stars and lies in the constellation Canes Venatici. M13 — occasionally known as the Great Globular Cluster in Hercules — contains slightly fewer stars, only several hundred thousand. White dwarfs are often used to estimate the ages of globular clusters, and so a significant amount of Hubble time has been dedicated to exploring white dwarfs in old and densely populated globular clusters. Hubble directly observed white dwarfs in globular star clusters for the first time in 2006.

[3] Astronomers use the word “metallicity” to describe the proportion of a star which is composed of elements other than hydrogen and helium. The vast majority of matter in the Universe is either hydrogen or helium — to take the Sun as an example, 74.9% of its mass is hydrogen, 23.8% is helium, and the remaining 1.3% is a mixture of all the other elements, which astronomers refer to as “metals”.

Source/Credit: ESA/Hubble


Hidden air pollutants on the rise in cities in India and the UK


Satellite data helped researchers discover rising levels of air pollutants
Levels of air pollutants in cities in India are on the rise, according to scientists using observations from instruments on satellites that scan the global skies every day.

Researchers used a long record of data gathered by space-based instruments to estimate trends in a range of air pollutants for 2005 to 2018, timed to coincide with well-established air quality policies in the UK and rapid development in India.
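At its core, a trend estimate of this kind boils down to fitting a line through annual-mean pollutant values. A minimal sketch with invented PM2.5 numbers for one city (the study itself used actual satellite retrievals for 2005 to 2018):

```python
def linear_trend(xs, ys):
    """Ordinary least-squares slope: average change in y per unit x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

years = list(range(2005, 2019))  # 2005-2018 inclusive
# Invented annual-mean PM2.5 values, rising over the record
pm25 = [60, 62, 61, 64, 66, 65, 68, 70, 71, 73, 74, 76, 78, 79]

slope = linear_trend(years, pm25)  # ~1.5 units per year: pollution rising
```

A positive slope indicates worsening air quality over the record; the real analysis additionally has to account for retrieval uncertainty and year-to-year variability, which this sketch omits.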

The study was led by the University of Birmingham and UCL and included an international team of contributors from Belgium, India, Jamaica and the UK. The researchers published their findings in the journal Atmospheric Chemistry and Physics, noting that fine particles (PM2.5) and nitrogen dioxide (NO2), both hazardous to health, are increasing in Kanpur and Delhi.

Delhi is a fast-growing megacity and Kanpur was ranked by the WHO in 2018 as the most polluted city in the world. The researchers speculated that increases in PM2.5 and NO2 in India reflect increasing vehicle ownership, industrialization and the limited effect of air pollution policies to date.

This contrasts with trends in the UK cities London and Birmingham, which show modest but ongoing declines in PM2.5 and NOx, reflecting the success of policies targeting sources that emit these pollutants.

They also found increases in the air pollutant formaldehyde in Delhi, Kanpur and London. Formaldehyde is a marker for emissions of volatile organic compounds that include a large contribution from vehicle emissions in India, and, in the UK, an increasing contribution from personal care and cleaning products and a range of other household sources.

Karn Vohra, study lead author and PhD student at the University of Birmingham, commented: “We wanted to demonstrate the utility of satellite observations to monitor city-wide air pollution in the UK where ground-based measurements are in abundance and in India where they are not. Our approach will be able to provide useful information about air quality trends in cities with limited surface monitoring capabilities. This is critical as the WHO estimates that outdoor air pollution causes 4.2 million deaths a year."

Study co-author Professor William Bloss, also from the University of Birmingham, commented “We were surprised to see the increase in formaldehyde above Delhi, Kanpur and London – a clue that emissions of other volatile organic compounds may be changing, potentially driven by economic development and changes in domestic behavior. Our results emphasize the need to monitor our air for the unexpected, and the importance of ongoing enforcement of measures for cleaner air.”

“There is more than a decade of freely available observations from instruments in space to monitor and assess air quality in cities throughout the world. Greater use of these in the UK, India, and beyond is paramount to successful air quality policies”, stated Dr Eloise Marais, Earth observation expert at UCL and conceptual lead of the study.

Source/Credit: University of Birmingham


Blue-tongue vs red-bellied black


Scientists have discovered that the humble blue-tongue lizard is largely resistant to the venom of the deadly red-bellied black snake, while giant carnivorous monitor lizards, which feed on Australia’s most venomous snakes, are not.

The surprising finding was revealed after University of Queensland scientists compared the effects of various reptile blood plasmas when exposed to the venom.

UQ PhD candidate Nicholas Youngman said mammalian – and particularly, human – reactions had been heavily investigated, but very little was known about snake venom effects on other reptiles.

“It was a shock discovering that the eastern blue-tongue, along with the shingleback, showed resistance specifically to red-bellied black snake venom,” Mr Youngman said.

“Since their resistance was so specific to only this snake species, it seems these lizards have evolved a special plasma component – known as a serum factor – in their blood.

“This prevents specific toxins in red-bellied black snake venom from clotting the lizards’ plasma, which would lead to a rapid death in most other animals.

“This resistance doesn’t mean they’re completely immune, but it would give them a greater chance of survival, allowing them to escape or fight back.

“Much like how a COVID-19 vaccine doesn’t mean you don’t get sick at all, it just means you are less likely to die.”

The research team analyzed the effects of seven different Australian snake venoms on the plasma of two species of blue-tongued skinks and three species of monitor lizards that would interact with these snakes in the wild.

Associate Professor Bryan Fry, who heads UQ’s Venom Evolution Lab, said the results also revealed that monitor lizards – or goannas – were not resistant to the snake venoms.

“You’d think that a goanna would be significantly resistant to the venom of any snake it was hunting and eating, but that isn’t the case,” Dr Fry said.

“Snake venom can only cause harm to goannas if it’s injected into their bodies by the snake’s fangs; it can’t be absorbed directly through the skin.

“Goannas are heavily armored and their scales act like medieval chain mail, with each containing a piece of bone, meaning venomous snakes’ fangs struggle to pierce this armor.

“So – unlike the slow, vulnerable blue-tongue lizard – there’s no pressure for goannas to evolve resistance; natural selection has invested in their armor and it’s clearly working for them.

“These two divergent forms of resistance are fascinating examples of evolutionary novelty.”

The research has been published in Toxins.

Source/Credit: University of Queensland


How climatic changes influence the evolution of oceanic insects

Research team members (from left) Mr Marc Chang and Assistant Professor Danwei Huang examining ocean skater specimens
The open oceans are harsh and hostile environments where insects might not be expected to thrive. In fact, only one insect group, ocean skaters, or water striders, has adapted to life on the open seas.

How these insects evolved to conquer the high seas, however, was not known.

Now, a study of the genetics of skaters provides a clue. The answer has to do with when major currents in the eastern Pacific Ocean came into existence, with each species of skater evolving to match the unique conditions of those currents.

Scientists from the National University of Singapore (NUS) and Scripps Institution of Oceanography at UC San Diego examined the genetics of three ocean skater species collected with dip nets across the eastern Pacific between Hawaii and Peru. The results of the study revealed that the skaters became specialized on different current systems, as those currents changed into their modern configurations.

The findings could unravel the mystery of how each skater species came to occupy habitats vastly different from those of other insects, and also deepen our understanding of how climate change affects ocean-dwelling organisms.

“It is amazing how the ocean skater’s genetic history is closely tied to that of our oceans,” said study leader Dr Wendy Wang, an entomologist from the Lee Kong Chian Natural History Museum at NUS. “The open ocean is an extremely hostile environment, with direct sunlight throughout the daytime, strong winds and limited food. The abilities of their body covering, or cuticle, to protect their internal organs from heat and ultraviolet damage, and of the insects to survive violent storms and find food in this unique habitat where no other insect could, demonstrate their unique ecological roles in the ocean. These characteristics make them fascinating subjects of study for materials science and extreme biological adaptations.”

The research team first reported their findings in the journal Marine Biology on 5 September 2021.

Linking genetic data with climatic changes

Ocean skaters live their entire lives perpetually running about on the surface film of the open seas, enduring lashing storms and feeding on tiny prey trapped on or just below the ocean surface. Currently, there are five known oceanic species of the genus Halobates. While information about where they can be found is well established, little is known about their genetic variation, or how physical factors like ocean currents, temperature and winds affect their distribution.

The research team conducted a genetic study of three of those skater species collected from offshore Mexico to Peru, and as far out to sea as Hawaii. Most of the specimens were skimmed from the ocean surface with dip nets by Dr Lanna Cheng, a marine biologist at Scripps Oceanography and study co-author. A world expert on Halobates, Dr Cheng has been studying the genus for almost five decades.

Dr Wang led the gene sequencing and genetic analysis of nearly 400 specimens across the three species. The researchers uncovered distinct genetic variations among the species that illustrate very different stories of population growth and development during ancient times.

The oldest of them, Halobates splendens, was found to have expanded its population nearly a million years ago. The other two younger species, Halobates micans and Halobates sobrinus, were found to have increased in abundance 100,000 to 120,000 years ago.

These formative dates match past climate events. H. splendens is now found in the rich, productive waters of the cold tongue that originates off the coast of South America as the Peru Current. Climatological data showed that this feature of cold surface water came into existence a million years ago, coinciding with the period of growth in the genetic diversity and populations of H. splendens.

The other two species H. sobrinus and H. micans were determined to have diversified in the warm, relatively unproductive waters of Central America. The populations of both species expanded when El Niño climate patterns caused warm ocean water to move into the eastern Pacific Ocean. The El Niño effects were especially strong in the habitats of both H. micans and H. sobrinus about 100,000 years ago, coinciding with the time these species developed their modern genetic patterns and population sizes.

“With no apparent physical boundaries in the open ocean to stop them, Halobates can skate practically from the coast of California across the entire Pacific Ocean to Japan and beyond,” Dr Cheng said. “Two of the species studied in this paper, H. sobrinus and H. splendens, however, have never been found to venture beyond the eastern Pacific Ocean and we didn't know why. This paper gave us the clue from their ancestry."

Scripps Oceanography’s Professor Richard Norris is a paleontologist who, for the study, matched the expansion of ocean skater populations to the time periods when the fossil record suggests the modern currents first formed.

“The genetics show that the three species we studied each had periods of population growth that fit eerily well with geological evidence for when the current systems they live in came into existence,” said Prof Norris. “Perhaps I shouldn’t be surprised, since it is common for marine creatures to specialize on particular ocean conditions, but these skaters live on top of the ocean. Apparently, even the character of the sea spray and water surface film is different enough between currents to matter to these guys.”

Further research

Dr Wang elaborated, “The findings of our study highlight the deep influence of climatic conditions on marine populations. The results also contribute towards understanding the fates of ocean-dwelling organisms as ongoing climate change accelerates in the coming decades.”

To expand their knowledge, the researchers will continue to examine the population dynamics of this enigmatic marine insect by studying their genomes.

“Drawing on the key insights from this study, together with our ongoing work, we aim to connect the evolutionary origins of the various Halobates species, and uncover how they came to occupy the surface ocean and coastal habitats in the present day,” shared marine biologist and co-author Assistant Professor Huang Danwei from the NUS Department of Biological Sciences, and an alumnus of Scripps Oceanography.

Co-author Mr Marc Cheng added, “Having genetic data is especially useful for organisms such as the ocean skaters, which we are unable to observe in their natural environment to track their populations.” He is a doctoral student at NUS who is using DNA sequencing methods to uncover the genetic basis of life on the sea surface.

Source/Credit: National University of Singapore


From racehorse to therapy horse


Photo by Jennifer Murray from Pexels
A new study will examine the selection, training and welfare of thoroughbred horses as they transition from racetrack to therapy horse. The pioneering project, led by academics at the University of Bristol’s Veterinary School in collaboration with Racing to Relate, will develop a recognized global welfare standard for former racehorses who are moving into Equine Assisted Therapy (EAT).

Thoroughbreds are recognized for their sensitivity and this project will provide a research-based approach to retraining them for therapy work. EAT careers could include work with a diverse group of people, from veterans and disabled children to those struggling with mental health issues. The research, which is funded by the John Pearce Foundation, is the first of its kind to study EAT across many countries and will look at practices in the UK, USA, France and Ireland, to understand the impact of EAT on the horses.

Claire Neveux, Bristol Vet School PhD student for the project, said: "I have worked with thoroughbreds for about 20 years, mainly with broodmares and young horses, and I have always been amazed by their high reactivity and sensitivity. I'm also fascinated by the human-horse relationship. I had a few opportunities to participate in Equine Assisted Therapy programs as an intern during my graduate studies. That's why, when I met Jennifer Twomey from Racing to Relate, I took the opportunity to be part of this pioneering and collaborative project, and I'm thrilled to contribute to this research. I'm convinced that a better understanding of the thoroughbred personality traits and suitability of horses for EAT is essential for equine and human welfare."

The main aim of the research is to create a global standard for selection and training, to help the racing industry improve welfare support for off-track racehorses going into a career in EAT. The research will help industry and stakeholders improve Thoroughbred welfare through a successful transition to their new career in EAT.

Little research has been carried out on the welfare of horses within EAT programs, and especially on the impact such programs may have on their wellbeing. In particular, this research will analyze the educational process for all horses within the EAT sector, to gain a clearer picture of why and how horses are selected for particular roles. The aim is to fully understand the current selection and training methods within the sector and identify the specific characteristics of the thoroughbred that are suited to a career in EAT. The study will also explore details of the life and routine of equines within EAT, examining existing perceptions and considerations of horse welfare.

Dr Mathilde Valenchon, Research Fellow at the Bristol Vet School and co-supervisor of the PhD project, added: "I am delighted we successfully developed this research project to understand and facilitate the involvement of ex-racehorses in EAT activities. I have been studying equine behaviour, cognition and welfare for the past 12 years. I have always been impressed by the thoroughbred's sensitivity and adaptability. I am thrilled to contribute to a better knowledge of their suitability for EAT and the development of standards, as this will significantly and positively impact the horses' welfare, as well as people’s. I am especially proud that our research includes the horse's perspective."

Dr Siobhan Mullan, Senior Research Fellow at Bristol Vet School, and co-supervisor of the PhD project, said: "Thoroughbred horses involved in EAT programs are performing a really special and valuable role in society, and yet little formal research has been done to understand how to optimize their welfare throughout their transition from racehorse to therapy horse and in the course of their new career. I'm heartened by the interest around the world in using the results of our research to develop standards which will have a long-lasting impact on horse welfare."

Source/Credit: University of Bristol


Sunday, September 5, 2021

Many fruits, vegetables sold in U.S. at high risk of being products of forced labor


A study published in Nature Food involving academics at The University of Nottingham calls attention to the need for better systems to track forced labor in food supply chains.

The study reports on the development of a new scoring system that identifies the risk of forced labor for fruits and vegetables sold in the United States. It finds a high risk of forced labor, but also scattered and incomplete data sources that limit action.

The study was led by Dr Nicole Tichenor Blackstone in the Agriculture, Food and Environment program at the Friedman School of Nutrition Science & Policy at Tufts, and Dr Jessica Decker Sparks, Associate Director at the University of Nottingham Rights Lab, leading its Ecosystems and Environment Program.

“Sustainability research on the food supply typically focuses on promoting human health and protecting the environment,” said first and corresponding author Dr Blackstone. “But social sustainability provides a different perspective on our food sources, including issues of labor rights and equity. Globally, agriculture has one of the highest incidences of forced labor.”

Responsible procurement

The study developed a new forced labor risk scoring method that draws upon original data compiled by the authors as well as a range of governmental and non-profit data. The research team then coded each food and country-of-origin combination as either very high risk, high risk, medium risk, or low risk for forced labor having occurred at some point in the growing and harvesting of each item. Previously, there have been short lists of commodities suspected of being produced with forced labor, or case studies of foods produced in one country, such as Mexico.

“What we’ve done, for the first time, is to look at all of the major fruits and vegetables consumed in the U.S., as well as all of the countries these foods come from, including the U.S., and assess the possibility that somewhere in the production process forced labor could have been involved,” said Dr Blackstone.

The scoring method is not meant to be a consumer tool but could help industry and policy makers interested in the development of systems and protocols for the responsible procurement of foods.

The final data set included 93 fruits and vegetables in 307 food-country combinations. The results of the qualitative coding show that most food-country combinations (85%) were coded as high risk for forced labor at some point. Seven percent were coded as very high risk, 4.5% were coded as medium risk, and 3.5% were coded as low risk.
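As a rough illustration only (not the authors' actual pipeline, and with invented example labels and data), the coding step described above amounts to assigning each food and country-of-origin pair one of four risk labels, then tallying the share of pairs per label:

```python
from collections import Counter

# Hypothetical food-country combinations with invented risk labels.
# The actual study coded 307 such combinations: 85% high risk,
# 7% very high, 4.5% medium, and 3.5% low risk.
codings = {
    ("tomatoes", "Mexico"): "high",
    ("strawberries", "USA"): "high",
    ("peppers", "Mexico"): "very high",
    ("onions", "Peru"): "medium",
    ("apples", "New Zealand"): "low",
}

def risk_summary(codings):
    """Tally the percentage of food-country combinations per risk level."""
    counts = Counter(codings.values())
    total = len(codings)
    return {level: round(100 * n / total, 1) for level, n in counts.items()}

print(risk_summary(codings))
# → {'high': 40.0, 'very high': 20.0, 'medium': 20.0, 'low': 20.0}
```

The tally itself is trivial; the substance of the study lies in the qualitative judgment behind each label, drawn from governmental and non-profit data sources.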

“This is an extraordinary percentage at high risk, but it reflects that there are very limited or coarse data,” said senior author Dr Sparks. “There are major structural issues with how agricultural labor is set up that make workers vulnerable. To us, this reflects systemic issues in food supply chains that have not been addressed."

Agricultural work often takes place in remote and isolated environments with demanding labor requirements. There are typically inadequate legal protections, with piece-rate pay systems tied to productivity, and reliance on migrant labor.

As defined by the International Labor Organization, “forced labor can be understood as work that is performed involuntarily and under the menace of any penalty. It refers to situations in which persons are coerced to work through the use of violence or intimidation, or by more subtle means such as manipulated debt, retention of identity papers or threats of denunciation to immigration authorities.”

“Forced labor in agriculture is a threat to the sustainability of food systems. However, the scarcity of data noted limits holistic analysis and action. Future research should prioritize data and model development to enable analyses of forced labor and other labor-related social risks (e.g., wages, child labor) across the life cycles of a wide range of foods. These efforts can help ensure that the rights and dignity of “the hands that feed us” are centered in the transformation of food systems,” concluded the authors in the study.

Source/Credit: University of Nottingham

