Scientific Frontline

Friday, April 8, 2022

In the race to build quantum computing hardware, silicon begins to shine

Silicon-based device in development for use in quantum computers. Gate electrodes shown in blue, red, and green are used to define the quantum dot potentials while the micromagnet on top provides a magnetic field gradient. The image was taken using scanning electron microscopy and the colors were applied for clarity. 
Image credit: Adam Mills, Princeton University

Research conducted by Princeton University physicists is paving the way for the use of silicon-based technologies in quantum computing, especially as quantum bits – the basic units of quantum computers. This research promises to accelerate the use of silicon technology as a viable alternative to other quantum computing technologies, such as superconductors or trapped ions.

In research published in the journal Science Advances, Princeton physicists used a two-qubit silicon quantum device to achieve an unprecedented level of fidelity. At above 99 percent, this is the highest fidelity thus far achieved for a two-qubit gate in a semiconductor and is on par with the best results achieved by competing technologies. Fidelity, which is a measure of a qubit’s ability to perform error-free operations, is a key feature in the quest to develop practical and efficient quantum computing.
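
A fidelity above 99 percent matters because gate errors compound across a circuit: with per-gate fidelity F, a circuit of N gates succeeds roughly F^N of the time. A back-of-the-envelope sketch of this compounding (an illustration of the general principle, not a calculation from the paper):

```python
# Naive error-compounding estimate: a circuit of n_gates two-qubit gates,
# each with fidelity F, succeeds with probability roughly F**n_gates.
import math

def circuit_success(fidelity: float, n_gates: int) -> float:
    """Rough probability that every gate in the circuit succeeds."""
    return fidelity ** n_gates

for f in (0.98, 0.99, 0.999):
    depth_to_half = math.log(0.5) / math.log(f)  # gates until ~50% success
    print(f"F={f}: {circuit_success(f, 100):.2f} after 100 gates, "
          f"~{depth_to_half:.0f} gates to reach 50%")
```

At 98 percent, a 100-gate circuit already fails more often than it succeeds; at 99.9 percent, circuits roughly ten times deeper than at 99 percent remain viable, which is why each fraction of a percent is hard-won.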

Researchers around the world are trying to figure out which technologies — such as superconducting qubits, trapped ions or silicon spin qubits — can best be employed as the basic units of quantum computing. And, just as significant, researchers are exploring which technologies will be able to scale up most efficiently for commercial use.

“Silicon spin qubits are gaining momentum [in the field],” said Adam Mills, a graduate student in the Department of Physics at Princeton University and the lead author of the recently published study. “It’s looking like a big year for silicon overall.”

A ‘cautionary tale’ about location tracking

New research out of the University of Rochester shows that data collected from your acquaintances and even strangers can predict your location.

Data about our habits and movements are constantly collected via mobile phone apps, fitness trackers, credit card logs, websites visited, and other means.

But if we turn off data tracking on our devices, aren’t we untraceable?

No, according to a new study.

“Switching off your location data is not going to entirely help,” says Gourab Ghoshal, an associate professor of physics, mathematics, and computer science and the Stephen Biggar ’92 and Elizabeth Asaro ’92 Fellow in Data Science at the University of Rochester.

Ghoshal, joined by colleagues at the University of Exeter, the Federal University of Rio de Janeiro, Northeastern University, and the University of Vermont, applied techniques from information theory and network science to find out just how far-reaching a person’s data might be. The researchers discovered that even if individual users turned off data tracking and didn’t share their own information, their mobility patterns could still be predicted with surprising accuracy based on data collected from their acquaintances.

“Worse,” says Ghoshal, “almost as much latent information can be extracted from perfect strangers that the individual tends to co-locate with.”
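
The flavor of the underlying idea can be shown with a toy cross-entropy estimate: how well does one person's location history predict another's? (The estimator below is a simplified Lempel-Ziv-style stand-in with invented data, not the study's actual method or dataset.)

```python
import math

def contains(seq: list[str], sub: list[str]) -> bool:
    """True if sub occurs as a contiguous run inside seq."""
    return any(seq[j:j + len(sub)] == sub
               for j in range(len(seq) - len(sub) + 1))

def cross_entropy(ego: list[str], alter: list[str]) -> float:
    """Match-length cross-entropy estimate (bits per visit) of ego's
    location sequence given alter's; lower means more predictable."""
    n = len(ego)
    total = 0
    for i in range(n):
        # Shortest substring of ego starting at i that alter never produced.
        k = 1
        while i + k <= n and contains(alter, ego[i:i + k]):
            k += 1
        total += k
    return n * math.log2(n) / total

ego      = "home work cafe work home gym home work cafe work".split()
partner  = "work cafe work home gym home work cafe work home".split()
stranger = "mall park lib park mall lib mall park lib mall".split()
print(cross_entropy(ego, partner))   # low: partner's trace predicts ego well
print(cross_entropy(ego, stranger))  # high: no shared structure
```

Low cross-entropy from acquaintances' (or co-located strangers') traces is what leaves an individual predictable even after switching off their own tracking.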

The researchers published their findings in Nature Communications.

Researchers accurately identify people with PTSD through text data alone

A study participant is interviewed by Ellie, an artificial character, to gather text data.
Credit: Jonathan Gratch, USC Institute for Creative Technologies 

University of Alberta researchers have trained a machine learning model to identify people with post-traumatic stress disorder with 80 per cent accuracy by analyzing text data. The model could one day serve as an accessible and inexpensive screening tool to support health professionals in detecting and diagnosing PTSD or other mental health disorders through telehealth platforms.

Psychiatry PhD candidate Jeff Sawalha, who led the project, performed a sentiment analysis of text from a dataset created by Jonathan Gratch at USC’s Institute for Creative Technologies. Sentiment analysis involves taking a large body of data, such as the contents of a series of tweets, and categorizing them — for example, seeing how many are expressing positive thoughts and how many are expressing negative thoughts.
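
In its simplest lexicon-based form, sentiment analysis reduces to counting emotionally loaded words. A deliberately minimal sketch (the word lists and example are invented for illustration; the study's actual pipeline is more sophisticated):

```python
# Tiny hand-built sentiment lexicons; real systems use far larger
# lexicons or learned models.
POSITIVE = {"calm", "hopeful", "happy", "safe", "grateful"}
NEGATIVE = {"afraid", "numb", "angry", "hopeless", "nightmares"}

def sentiment_score(text: str) -> float:
    """Share of sentiment-bearing words that are positive, in [0, 1]."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos / (pos + neg) if (pos + neg) else 0.5  # 0.5 = neutral

answer = "some nights I feel numb and afraid and the nightmares keep coming"
print(sentiment_score(answer))  # 0.0 -> strongly negative emotional content
```

A model like the one in the study then learns, from many labeled interviews, how such emotional-content features separate participants with PTSD from those without.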

“We wanted to strictly look at the sentiment analysis from this dataset to see if we could properly identify or distinguish individuals with PTSD just using the emotional content of these interviews,” said Sawalha.

The text in the USC dataset was gathered through 250 semi-structured interviews conducted by an artificial character, Ellie, over video conferencing calls with 188 people without PTSD and 87 with PTSD.

New spin lasers for ultra-fast data transfer

Martin Hofmann receives funding as part of a Reinhart-Koselleck project for the development of spin lasers.
Credit: RUB, Marquard

Conventional Internet data transmission is rapidly approaching fundamental physical limits. The process can only become faster by relying on a different principle, and that is what Bochum researchers are doing.

Data transfer today is based on light pulses sent through fiber-optic cables. The faster the light intensity varies, the faster information can be transferred. However, fundamental physical limits of the lasers that generate the modulated light prevent the process from becoming much faster than it currently is. The team led by Prof. Dr. Martin Hofmann, chair of photonics and terahertz technology at Ruhr University Bochum, is pursuing an alternative: with the help of spin lasers, the researchers want to encode information in the polarization of light rather than in its intensity. The German Research Foundation will fund the work for five years with 1.25 million euros as part of a Reinhart-Koselleck project.
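
The conceptual shift fits in a few lines: the bits ride on which polarization state the laser emits, while the intensity can stay constant. A toy sketch of the encoding idea (mine, not the project's actual modulation scheme):

```python
# Map bits onto two orthogonal polarization states instead of two
# intensity levels.
TO_POLARIZATION = {0: "left-circular", 1: "right-circular"}

def encode(bits: list[int]) -> list[str]:
    return [TO_POLARIZATION[b] for b in bits]

def decode(states: list[str]) -> list[int]:
    return [1 if s == "right-circular" else 0 for s in states]

message = [1, 0, 1, 1, 0]
assert decode(encode(message)) == message
```

The hoped-for speed-up comes from the physics rather than the mapping: in spin lasers, the polarization can oscillate far faster than the intensity can be modulated.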

Thursday, April 7, 2022

Climate change increases risk of devastating debris flows after wildfires in western U.S.

Part of Interstate 70 in Colorado was washed away by flooding and was shut down for repairs for six months during 2020 and 2021.
Credit: Colorado Department of Transportation

In the early morning hours of January 9, 2018, intense rainfall loosened debris and mud in the Santa Ynez Mountains in Santa Barbara County, which had been torched by the Thomas Fire just months before.

The resulting debris flow killed 23 people, injured another 167 and damaged at least 400 homes. UCLA climate scientist Daniel Swain witnessed the aftermath in person.

“It gives you a sense of the physical forces involved,” he said. “You see cars up in trees and boulders the size of trucks strewn about as if they were pebbles in someone’s garden.”

According to a new research paper co-authored by Swain, events like that one could begin occurring more frequently in the western U.S. because of climate change. In the coming years, hilly or mountainous regions within wildfire burn areas will face a higher risk for debris flows, mudslides and flash floods — all of which are likelier to occur on fire-scorched hillsides without vegetation.

That’s because climate change is projected to increase the conditions — higher temperatures, low humidity and precipitation extremes, both wet and dry — that lead to those disasters, according to the study, which was published in Science Advances.

Are people more willing to empathize with animals or with other humans?

Credit: Photo by David Clode on Unsplash

Stories about animals such as Harambe the gorilla and Cecil the lion often sweep the media as they pull at people’s heartstrings. But are people more likely to feel empathy for animals than for humans?

A new Penn State study led by Daryl Cameron, associate professor of psychology and senior research associate at the Rock Ethics Institute, found that the answer is complicated. The findings could have implications for how public messaging about issues such as new environmental policies is framed.

The researchers found that when people were asked to choose between empathizing with a human stranger or an animal — in this study, a koala bear — the participants were more likely to choose empathizing with a fellow human.

However, in a second pair of studies, the researchers had participants take part in two separate tasks: one in which they could choose whether or not they wanted to empathize with a person, and one in which they could choose whether or not they wanted to empathize with an animal. This time, people were more likely to choose empathy when faced with an animal than when faced with a person.

Cameron said the findings — recently published in a special issue on empathy in the Journal of Social Psychology — suggest that when people are deciding whether to engage in empathy, context matters.

“It’s possible that if people are seeing humans and animals in competition, it might lead to them preferring to empathize with other humans,” Cameron said. “But if you don’t see that competition, and the situation is just deciding whether to empathize with an animal one day and a human the other, it seems that people don’t want to engage in human empathy but they're a little bit more interested in animals.”

Engineered crystals could help computers run on less power

Researchers at the University of California, Berkeley, have created engineered crystal structures that display an unusual physical phenomenon known as negative capacitance. Incorporating this material into advanced silicon transistors could make computers more energy efficient.
Credit: UC Berkeley image by Ella Maru Studio

Computers may be growing smaller and more powerful, but they require a great deal of energy to operate. The total amount of energy the U.S. dedicates to computing has risen dramatically over the last decade and is quickly approaching that of other major sectors, like transportation.

In a study published online this week in the journal Nature, University of California, Berkeley, engineers describe a major breakthrough in the design of a component of transistors — the tiny electrical switches that form the building blocks of computers — that could significantly reduce their energy consumption without sacrificing speed, size or performance. The component, called the gate oxide, plays a key role in switching the transistor on and off.

“We have been able to show that our gate-oxide technology is better than commercially available transistors: What the trillion-dollar semiconductor industry can do today — we can essentially beat them,” said study senior author Sayeef Salahuddin, the TSMC Distinguished Professor of Electrical Engineering and Computer Sciences at UC Berkeley.

This boost in efficiency is made possible by an effect called negative capacitance, which helps reduce the amount of voltage that is needed to store charge in a material. Salahuddin theoretically predicted the existence of negative capacitance in 2008 and first demonstrated the effect in a ferroelectric crystal in 2011.
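
The effect is easiest to see in a lumped-element caricature: placing a stabilized negative-capacitance layer in series with an ordinary dielectric raises the effective capacitance instead of lowering it. A back-of-the-envelope sketch (illustrative numbers, not the paper's device model):

```python
def series_capacitance(c1: float, c2: float) -> float:
    """Two capacitors in series: 1/C_total = 1/c1 + 1/c2."""
    return 1.0 / (1.0 / c1 + 1.0 / c2)

c_dielectric = 1.0   # ordinary gate-oxide layer (arbitrary units)
c_ferro = -2.0       # stabilized negative-capacitance ferroelectric layer

print(series_capacitance(c_dielectric, c_ferro))  # 2.0, larger than 1.0

# Since Q = C * V, a larger effective capacitance stores the same channel
# charge at a lower gate voltage, which is where the energy savings enter.
```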

Most precise ever measurement of W boson mass found to be in tension with the Standard Model

The Collider Detector at Fermilab recorded high-energy particle collisions produced by the Tevatron collider from 1985 to 2011. About 400 scientists at 54 institutions in 23 countries are still working on the wealth of data collected by the experiment.
Credit: Fermilab

After 10 years of careful analysis and scrutiny, scientists of the CDF collaboration at the U.S. Department of Energy’s Fermi National Accelerator Laboratory announced today that they have achieved the most precise measurement to date of the mass of the W boson, one of nature’s force-carrying particles. Using data collected by the Collider Detector at Fermilab, or CDF, scientists have now determined the particle’s mass with a precision of 0.01% — twice as precise as the previous best measurement. It corresponds to measuring the weight of an 800-pound gorilla to 1.5 ounces.
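
The gorilla comparison is the 0.01% figure restated; a quick arithmetic check (using the article's own numbers):

```python
gorilla_lb = 800
relative_precision = 0.0001  # 0.01%
print(gorilla_lb * relative_precision * 16)  # 1.28 oz, roughly the quoted 1.5
```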

The new precision measurement, published in the journal Science, allows scientists to test the Standard Model of particle physics, the theoretical framework that describes nature at its most fundamental level. The result: The new mass value shows tension with the value scientists obtain using experimental and theoretical inputs in the context of the Standard Model.

“The number of improvements and extra checking that went into our result is enormous,” said Ashutosh V. Kotwal of Duke University, who led this analysis and is one of the 400 scientists in the CDF collaboration. “We took into account our improved understanding of our particle detector as well as advances in the theoretical and experimental understanding of the W boson’s interactions with other particles. When we finally unveiled the result, we found that it differed from the Standard Model prediction.”

World’s Largest International Dark Sky Reserve Created by McDonald Observatory

The Milky Way soars over the domes of McDonald Observatory's Mount Locke, showcasing the region's dark skies.
Credit: Stephen Hummel/McDonald Observatory

The world’s largest International Dark Sky Reserve is coming to Texas and Mexico, thanks to a partnership between The University of Texas at Austin’s McDonald Observatory, The Nature Conservancy, the International Dark-Sky Association (IDA) and many others. The designation, granted by the IDA, recognizes the commitment of organizations, governments, businesses and residents in the region to maintaining dark skies. The move will benefit not only astronomical research, but also wildlife, ecology and tourism.

The new Greater Big Bend International Dark Sky Reserve will encompass more than 15,000 square miles in portions of western Texas and northern Mexico. It is the only such reserve to cross an international border.

“This reserve protects both the scientific research and public education missions of McDonald Observatory,” said Taft Armandroff, director of UT Austin’s McDonald Observatory. “Since 1939, the observatory has enabled the study of the cosmos by faculty, students and researchers at UT Austin and other Texas institutions of higher learning, with topics ranging from planets orbiting nearby stars to the accelerating expansion of the universe.”

Mini-Livers on a Chip

Researchers at Gladstone Institutes designed a new platform for studying how the human immune system responds to hepatitis C infection by combining microfluidic technology with liver organoids.
Credit: Gladstone Institutes

A vaccine for hepatitis C has eluded scientists for more than 30 years, for several reasons. For one, the virus that causes the disease comes in many genetic forms, complicating the creation of a widely effective vaccine. For another, studying hepatitis C has been difficult because animal models are limited and lab methods using infected cells have not adequately reflected the real-life dynamics of infection.

Now, researchers at Gladstone Institutes have developed a new platform for studying how the human immune system responds to hepatitis C infection. The method, presented in the scientific journal Open Biology, marries microfluidic technology (which allows scientists to precisely manipulate fluid at a microscopic scale) with liver organoids (three-dimensional cell clusters that mimic the biology of real human livers).

“The 3D structure and cellular composition of liver organoids allows us to study viral entry and replication in a highly relevant physiological manner,” says Gladstone Senior Investigator Todd McDevitt, PhD, a senior author of the new study.

“Our approach enables a more controlled and accurate investigation into the immune response to hepatitis C infection,” says Melanie Ott, MD, PhD, director of the Gladstone Institute of Virology and another senior author of the study. “We hope our method will accelerate the discovery of a much-needed vaccine.”
