Scientific Frontline: Computer Science
Showing posts with label Computer Science. Show all posts

Friday, April 14, 2023

Location intelligence shines a light on disinformation

Each dot represents a Twitterer discussing COVID-19 from April 16 to April 22, 2021. The closer the dots are to the center, the greater the influence. The brighter the color, the stronger the intent.
Image Credit: ORNL

Using disinformation to create political instability and battlefield confusion dates back millennia.

However, today’s disinformation actors use social media to amplify disinformation that users knowingly or, more often, unknowingly perpetuate. Such disinformation spreads quickly, threatening public health and safety. Indeed, the COVID-19 pandemic and recent global elections have given the world a front-row seat to this form of modern warfare.

A group at ORNL now studies such threats thanks to the evolution at the lab of location intelligence, or research that uses open data to understand places and the factors that influence human activity in them. In the past, location intelligence has informed emergency response, urban planning, transportation planning, energy conservation and policy decisions. Now, location intelligence at ORNL also helps identify disinformation, or shared information that is intentionally misleading, and its impacts.

Wednesday, April 12, 2023

ORNL, NOAA launch new supercomputer for climate science research

Photo Credit: Genevieve Martin/ORNL

Oak Ridge National Laboratory, in partnership with the National Oceanic and Atmospheric Administration, is launching a new supercomputer dedicated to climate science research. The new system is the fifth supercomputer to be installed and run by the National Climate-Computing Research Center at ORNL.

The NCRC was established in 2009 as part of a strategic partnership between NOAA and the U.S. Department of Energy and is responsible for the procurement, installation, testing and operation of several supercomputers dedicated to climate modeling and simulations. The goal of the partnership is to increase NOAA’s climate modeling capabilities to further critical climate research. To that end, the NCRC has installed a series of increasingly powerful computers since 2010, each of them formally named Gaea. The latest system, also referred to as C5, is an HPE Cray machine with a peak theoretical performance of more than 10 petaflops, or 10 million billion calculations per second, almost double the power of the two previous systems combined.

Monday, February 27, 2023

Hackers could try to take over a military aircraft; can a cyber shuffle stop them?

Sandia National Laboratories cybersecurity expert Chris Jenkins sits in front of a whiteboard with the original sketch of the moving target defense idea for which he is the team lead. When the COVID-19 pandemic hit, Jenkins began working from home, and his office whiteboard remained virtually undisturbed for more than two years.
Photo Credit: Craig Fritz

A cybersecurity technique that shuffles network addresses like a blackjack dealer shuffles playing cards could effectively befuddle hackers gambling for control of a military jet, commercial airliner or spacecraft, according to new research. However, the research also shows these defenses must be designed to counter increasingly sophisticated algorithms used to break them.

Many aircraft, spacecraft and weapons systems have an onboard computer network known as military standard 1553, commonly referred to as MIL-STD-1553, or even just 1553. The network is a tried-and-true protocol for letting systems like radar, flight controls and the heads-up display talk to each other.

Securing these networks against a cyberattack is a national security imperative, said Chris Jenkins, a Sandia National Laboratories cybersecurity scientist. If a hacker were to take over 1553 midflight, he said, the pilot could lose control of critical aircraft systems, and the impact could be devastating.

Jenkins is not alone in his concerns. Many researchers across the country are designing defenses for systems that utilize the MIL-STD-1553 protocol for command and control. Recently, Jenkins and his team at Sandia partnered with researchers at Purdue University in West Lafayette, Indiana, to test an idea that could secure these critical networks.
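
The article does not describe Sandia's implementation in detail, but the core moving-target idea (periodically re-dealing device addresses so that any address an attacker has learned quickly goes stale) can be sketched in a few lines. The class and parameter names below are hypothetical and chosen only for illustration, not taken from the Sandia or Purdue work.

```python
import random

# Illustrative sketch of a moving target defense (MTD) for a small bus network:
# logical devices keep stable names, but their bus addresses are periodically
# re-dealt, so an address an attacker has learned goes stale. All names and
# parameters here are hypothetical, not Sandia's design.

class AddressShuffler:
    def __init__(self, devices, address_pool):
        if len(address_pool) < len(devices):
            raise ValueError("need at least one address per device")
        self.devices = list(devices)
        self.address_pool = list(address_pool)
        self.rng = random.SystemRandom()  # non-deterministic shuffling
        self.mapping = {}
        self.reshuffle()

    def reshuffle(self):
        """Deal a fresh random device -> address mapping."""
        addresses = self.rng.sample(self.address_pool, len(self.devices))
        self.mapping = dict(zip(self.devices, addresses))

    def address_of(self, device):
        """Look up a device's current (temporary) address."""
        return self.mapping[device]


# Example: remote terminals on a 1553-style bus typically use addresses 0-30.
shuffler = AddressShuffler(
    devices=["radar", "flight_controls", "heads_up_display"],
    address_pool=range(31),
)
print(shuffler.mapping)   # e.g. {'radar': 17, 'flight_controls': 4, ...}
shuffler.reshuffle()      # in practice this would run on a timer
print(shuffler.mapping)   # previously learned addresses no longer apply
```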

Monday, February 13, 2023

Efficient technique improves machine-learning models’ reliability

Researchers from MIT and the MIT-IBM Watson AI Lab have developed a new technique that can enable a machine-learning model to quantify how confident it is in its predictions, but does not require vast troves of new data and is much less computationally intensive than other techniques.
Image Credit: MIT News, iStock
Creative Commons Attribution Non-Commercial No Derivatives license

Powerful machine-learning models are being used to help people tackle tough problems such as identifying disease in medical images or detecting road obstacles for autonomous vehicles. But machine-learning models can make mistakes, so in high-stakes settings it’s critical that humans know when to trust a model’s predictions.

Uncertainty quantification is one tool that improves a model’s reliability; the model produces a score along with the prediction that expresses a confidence level that the prediction is correct. While uncertainty quantification can be useful, existing methods typically require retraining the entire model to give it that ability. Training involves showing a model millions of examples so it can learn a task. Retraining then requires millions of new data inputs, which can be expensive and difficult to obtain, and also uses huge amounts of computing resources.
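
The summary above does not detail the new method's internals, so as a point of reference, here is a minimal generic sketch of what an uncertainty score looks like for a classifier: the predicted class is returned together with its probability and the entropy of the output distribution. This baseline is illustrative only and is not the MIT-IBM technique.

```python
import numpy as np

# Minimal sketch of uncertainty quantification for a classifier: report the
# predicted class together with a confidence score derived from the model's
# output probabilities. Generic baseline for illustration only.

def softmax(logits):
    z = logits - logits.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def predict_with_confidence(logits):
    probs = softmax(np.asarray(logits, dtype=float))
    prediction = int(np.argmax(probs))
    confidence = float(probs[prediction])                      # top-class probability
    entropy = float(-(probs * np.log(probs + 1e-12)).sum())    # spread of belief
    return prediction, confidence, entropy

# Example: a confident prediction vs. an uncertain one.
print(predict_with_confidence([4.0, 0.5, 0.1]))   # high confidence, low entropy
print(predict_with_confidence([1.1, 1.0, 0.9]))   # low confidence, high entropy
```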

Researchers at MIT and the MIT-IBM Watson AI Lab have now developed a technique that enables a model to perform more effective uncertainty quantification, while using far fewer computing resources than other methods, and no additional data. Their technique, which does not require a user to retrain or modify a model, is flexible enough for many applications.

Wednesday, February 1, 2023

Learning with all your senses: Multimodal enrichment as the optimal learning strategy of the future

Illustration Credit: John Hain

Neuroscientist Katharina von Kriegstein from Technische Universität Dresden and Brian Mathias from the University of Aberdeen have compiled extensive interdisciplinary findings from neuroscience, psychology, computer modelling and education on the topic of "learning" in a recent review article in the journal Trends in Cognitive Sciences. The review reveals the mechanisms the brain uses to achieve improved learning outcomes when multiple senses or movements are combined during learning. These benefits apply to a wide variety of domains, such as letter and vocabulary acquisition, reading, mathematics, music, and spatial orientation.

Many educational approaches assume that integrating complementary sensory and motor information into the learning experience can enhance learning; for example, gestures help in learning new vocabulary in foreign language classes. In their recent publication, von Kriegstein and Mathias summarize these methods under the term "multimodal enrichment," meaning enrichment with multiple senses and movement. Numerous current scientific studies show that multimodal enrichment can enhance learning outcomes, and experiments in classrooms show similar results.

Tuesday, January 31, 2023

A fresh look at restoring power to the grid

Sandia National Laboratories computer scientists Casey Doyle, left, and Kevin Stamber stand in front of an electrical switching station. Their team has developed a computer model to determine the best way to restore power to a grid after a disruption, such as a complete blackout caused by extreme weather.
Photo Credit: Craig Fritz

Climate change can alter extreme weather events, and these events have the potential to strain, disrupt or damage the nation’s grid.

Sandia National Laboratories computer scientists have been working on an innovative computer model to help grid operators quickly restore power to the grid after a complete disruption, a process called “black start.”

Their model combines a restoration-optimization model with a computer model of how grid operators would make decisions when they don’t have complete knowledge of every generator and distribution line. The model also includes a physics-based understanding of how the individual power generators, distribution substations and power lines would react during the process of restoring power to the grid.

“We’ve spent a lot of time thinking about how we go beyond simply looking at this as a multi-layered optimization problem,” said project lead Kevin Stamber. “When we start to discuss disruptions to the electric grid, being able to act on the available information and provide a response is critical. The operator still has to work that restoration solution against the grid and see whether or not they are getting the types of reactions from the system that they expect to see.”

The overarching model also can simulate black starts triggered by human-caused disruptions such as a successful cyberattack.
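
To make the restoration-optimization layer concrete, the toy sketch below orders generator restarts so that units needing cranking power only come online once enough capacity is already running. It is a deliberately simplified illustration of one ingredient of a black start, not Sandia's multi-layer model; all names and numbers are invented.

```python
# Toy illustration of one layer of a black-start problem: choosing an order in
# which to restart generators when non-black-start units need "cranking" power
# from units that are already online. Simplified sketch for intuition only.

generators = [
    # name,            capacity_mw, cranking_need_mw, black_start_capable
    ("hydro_unit",      50,          0,                True),
    ("gas_peaker",      80,          5,                False),
    ("combined_cycle", 300,         20,                False),
    ("coal_unit",      400,         35,                False),
]

def greedy_black_start_order(units):
    online, order = 0.0, []
    # Start with black-start-capable units, then the cheapest units to crank.
    remaining = sorted(units, key=lambda u: (not u[3], u[2]))
    for name, capacity, crank, black_start in remaining:
        if black_start or crank <= online:
            order.append(name)
            online += capacity
        else:
            order.append(f"{name} (delayed: needs {crank} MW, only {online} MW online)")
    return order

print(greedy_black_start_order(generators))
```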

Thursday, December 15, 2022

Greenland’s Glaciers Might Be Melting 100 Times As Fast As Previously Thought

A melting glacier on the coast of Greenland.
Photo Credit: Dr. Lorenz Meire, Greenland Climate Research Center.

Researchers at the Oden Institute for Computational Engineering and Sciences at The University of Texas at Austin have created a computer model that determines the rate at which Greenland’s glacier fronts are melting.

Published in the journal Geophysical Research Letters, the model is the first designed specifically for vertical glacier fronts – where ice meets the ocean at a sharp angle. It reflects recent observations of an Alaskan glacier front melting up to 100 times as fast as previously assumed. According to the researchers, the model can be used to improve both ocean and ice sheet models, which are crucial elements of any global climate model.

“Up to now, glacier front melt models have been based on results from the Antarctic, where the system is quite different,” said lead author Kirstin Schulz, a research associate in the Oden Institute’s Computational Research in Ice and Ocean Systems Group (CRIOS). “By using our model in an ocean or climate model, we can get a much better idea of how vertical glacier fronts are melting.”

Tuesday, December 13, 2022

AI model predicts if a COVID-19 test might be positive or not

Xingquan “Hill” Zhu, Ph.D., (left) senior author and a professor; and co-author Magdalyn E. Elkin, a Ph.D. student, both in FAU’s Department of Electrical Engineering and Computer Science.
Photo Credit: Florida Atlantic University

COVID-19 and its latest Omicron strains continue to cause infections across the country as well as globally. Serology (blood) and molecular tests are the two most commonly used methods for rapid COVID-19 testing. Because COVID-19 tests use different mechanisms, they vary significantly. Molecular tests measure the presence of viral SARS-CoV-2 RNA while serology tests detect the presence of antibodies triggered by the SARS-CoV-2 virus.

There is no existing study on the correlation between serology and molecular tests or on which COVID-19 symptoms play a key role in producing a positive test result. A study from Florida Atlantic University’s College of Engineering and Computer Science using machine learning provides important new evidence for understanding how molecular and serology tests are correlated, and which features are the most useful in distinguishing between positive and negative COVID-19 test outcomes.

Researchers from the College of Engineering and Computer Science trained five classification algorithms to predict COVID-19 test results. They created an accurate predictive model using easy-to-obtain symptom and demographic features, such as number of days post-symptom onset, fever, temperature, age and gender.
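
The study's data and exact algorithms are not reproduced here; the sketch below simply shows the general workflow the article describes, training a classifier on symptom and demographic features to predict a test result. The data are synthetic and the model choice is arbitrary, so the numbers mean nothing beyond illustrating the pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Sketch of the kind of symptom-based classifier described above, trained on
# synthetic data. Feature names mirror those mentioned in the article; the
# data, model choice, and numbers are illustrative, not FAU's study.

rng = np.random.default_rng(0)
n = 2000
days_since_onset = rng.integers(0, 14, n)
fever            = rng.integers(0, 2, n)            # 0 = no, 1 = yes
temperature_f    = rng.normal(98.6, 1.2, n) + fever
age              = rng.integers(18, 90, n)
gender           = rng.integers(0, 2, n)

X = np.column_stack([days_since_onset, fever, temperature_f, age, gender])

# Synthetic label: fever and high temperature raise the odds of a positive test.
logit = -2.0 + 1.5 * fever + 0.8 * (temperature_f - 98.6) + 0.05 * days_since_onset
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 3))
```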

Monday, December 12, 2022

Sandia, Intel seek novel memory tech to support stockpile mission

Developed at Sandia National Laboratories, a high-fidelity simulation of the hypersonic turbulent flow over a notional hypersonic flight vehicle, colored grey, depicts the speed of the air surrounding the body, with red as high and blue as low. The turbulent motions that impose harsh, unsteady loading on the vehicle body are depicted in the back portion of the vehicle. Accurately predicting these loads is critical to vehicle survivability, and for practical applications, billions of degrees of freedom are required to predict the physics of interest, inevitably requiring massive computing capabilities for realistic turnaround times. The work conducted as part of this research and development contract targets improving memory performance characteristics that can greatly benefit this and other mission applications.
Simulation Credit: Cory Stack

In pursuit of novel advanced memory technologies that would accelerate simulation and computing applications in support of the nation’s stockpile stewardship mission, Sandia National Laboratories, in partnership with Los Alamos and Lawrence Livermore national labs, has announced a research and development contract awarded to Intel Federal LLC, a wholly owned subsidiary of Intel Corporation.

Funded by the National Nuclear Security Administration’s Advanced Simulation and Computing program, the three national labs will collaborate with Intel Federal LLC on the project.

“ASC’s Advanced Memory Technology research projects are developing technologies that will impact future computer system architectures for complex modeling and simulation workloads,” said ASC program director Thuc Hoang. “We have selected several technologies that have the potential to deliver more than 40 times the application performance of our forthcoming NNSA Exascale systems.”

Sandia project lead James H. Laros III, a Distinguished Member of Technical Staff, said “this effort will focus on improving bandwidth and latency characteristics of future memory systems, which should have a direct impact on application performance for a wide range of ASC mission codes.”

Thursday, December 1, 2022

New Stanford chip-scale laser isolator could transform photonics

From left, Alexander White, Geun Ho Ahn, and Jelena Vučković with the nanoscale isolator.
Photo Credit: Hannah Kleidermacher

Using well-known materials and manufacturing processes, researchers have built an effective, passive, ultrathin laser isolator that opens new research avenues in photonics.

Lasers are transformational devices, but one technical challenge prevents them from being even more so. The light they emit can reflect back into the laser itself and destabilize or even disable it. At real-world scales, this challenge is solved by bulky devices that use magnetism to block harmful reflections. At chip scale, however, where engineers hope lasers will one day transform computer circuitry, effective isolators have proved elusive.

Against that backdrop, researchers at Stanford University say they have created a simple and effective chip-scale isolator that can be laid down in a layer of semiconductor-based material hundreds of times thinner than a sheet of paper.

“Chip-scale isolation is one of the great open challenges in photonics,” said Jelena Vučković, a professor of electrical engineering at Stanford and senior author of the study appearing Dec. 1 in the journal Nature Photonics.

“Every laser needs an isolator to stop back reflections from coming into and destabilizing the laser,” said Alexander White, a doctoral candidate in Vučković’s lab and co-first author of the paper, adding that the device has implications for everyday computing, but could also influence next-generation technologies, like quantum computing.

Tuesday, November 29, 2022

Breaking the scaling limits of analog computing

MIT researchers have developed a technique that greatly reduces the error in an optical neural network, which uses light to process data instead of electrical signals. With their technique, the larger an optical neural network becomes, the lower the error in its computations. This could enable them to scale these devices up so they would be large enough for commercial uses.
Credit: SFLORG stock photo

As machine-learning models become larger and more complex, they require faster and more energy-efficient hardware to perform computations. Conventional digital computers are struggling to keep up.

An analog optical neural network could perform the same tasks as a digital one, such as image classification or speech recognition, but because computations are performed using light instead of electrical signals, optical neural networks can run many times faster while consuming less energy.

However, these analog devices are prone to hardware errors that can make computations less precise. Microscopic imperfections in hardware components are one cause of these errors. In an optical neural network that has many connected components, errors can quickly accumulate.

Even with error-correction techniques, due to fundamental properties of the devices that make up an optical neural network, some amount of error is unavoidable. A network that is large enough to be implemented in the real world would be far too imprecise to be effective.
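
A quick numerical toy makes the accumulation problem concrete: if every layer of an analog network applies its weights with a small random perturbation, the deviation from the ideal output grows with depth. The simulation below only illustrates that growth; it does not model optical hardware or the MIT correction described next.

```python
import numpy as np

# Toy numerical illustration of why analog errors compound with network size:
# each layer applies a matrix whose entries are slightly perturbed, and the
# deviation from the ideal output grows with the number of layers.

rng = np.random.default_rng(1)

def relative_output_error(n_layers, size=64, component_error=0.01):
    x_ideal = x_noisy = rng.normal(size=size)
    for _ in range(n_layers):
        w = rng.normal(size=(size, size)) / np.sqrt(size)    # ideal layer weights
        noise = rng.normal(scale=component_error, size=w.shape)
        x_ideal = w @ x_ideal
        x_noisy = (w + noise) @ x_noisy                      # imperfect hardware
    return np.linalg.norm(x_noisy - x_ideal) / np.linalg.norm(x_ideal)

for depth in (2, 8, 32):
    print(f"{depth:>2} layers -> relative error ~ {relative_output_error(depth):.3f}")
```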

MIT researchers have overcome this hurdle and found a way to effectively scale an optical neural network. By adding a tiny hardware component to the optical switches that form the network’s architecture, they can reduce even the uncorrectable errors that would otherwise accumulate in the device.

Tuesday, November 22, 2022

Researchers use blockchain to increase electric grid resiliency

A team led by Raymond Borges Hink has developed a method using blockchain to protect communications between electronic devices in the electric grid, preventing cyberattacks and cascading blackouts.
Photo Credit: Genevieve Martin/ORNL, U.S. Dept. of Energy

Although blockchain is best known for securing digital currency payments, researchers at the Department of Energy’s Oak Ridge National Laboratory are using it to track a different kind of exchange: It’s the first time blockchain has ever been used to validate communication among devices on the electric grid.

The project is part of the ORNL-led Darknet initiative, funded by the DOE Office of Electricity, to secure the nation’s electricity infrastructure by shifting its communications to increasingly secure methods.

Cyber risks have increased with two-way communication between grid power electronics equipment and new edge devices ranging from solar panels to electric car chargers and intelligent home electronics. By providing a trust framework for communication among electrical devices, an ORNL research team led by Raymond Borges Hink is increasing the resilience of the electric grid. The team developed a framework to detect unusual activity, including data manipulation, spoofing and illicit changes to device settings. These activities could trigger cascading power outages as breakers are tripped by protection devices.
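
The team's framework is not spelled out in this summary, but the basic tamper-evidence property a blockchain provides can be shown with a minimal hash chain: each record stores a hash of the previous one, so an illicit change to any earlier device message breaks validation. Device names and settings below are invented for illustration and are not ORNL's design.

```python
import hashlib
import json
import time

# Minimal sketch of the tamper-evidence idea behind a ledger of grid-device
# messages: each record carries a hash of the previous record, so altering any
# earlier entry breaks the chain. Generic illustration, not ORNL's framework.

def record_hash(record):
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append_record(chain, device_id, payload):
    prev = chain[-1]["hash"] if chain else "0" * 64
    record = {"device": device_id, "payload": payload,
              "time": time.time(), "prev_hash": prev}
    record["hash"] = record_hash(record)
    chain.append(record)

def chain_is_valid(chain):
    for i, record in enumerate(chain):
        body = {k: v for k, v in record.items() if k != "hash"}
        if record["hash"] != record_hash(body):
            return False                      # record was altered after the fact
        if i > 0 and record["prev_hash"] != chain[i - 1]["hash"]:
            return False                      # chain linkage broken
    return True

ledger = []
append_record(ledger, "inverter_07", {"setting": "voltage_limit", "value": 1.05})
append_record(ledger, "ev_charger_12", {"setting": "max_current_a", "value": 32})
print(chain_is_valid(ledger))                 # True
ledger[0]["payload"]["value"] = 1.20          # illicit settings change
print(chain_is_valid(ledger))                 # False: tampering detected
```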

Sunday, November 20, 2022

Electronic/Photonic Chip Sandwich Pushes Boundaries of Computing and Data Transmission Efficiency

The chip sandwich: an electronics chip (the smaller chip on the top) integrated with a photonics chip, sitting atop a penny for scale.
Photo Credit: Arian Hashemi Talkhooncheh

Engineers at Caltech and the University of Southampton in England have collaboratively designed an electronics chip integrated with a photonics chip (which uses light to transfer data)—creating a cohesive final product capable of transmitting information at ultrahigh speed while generating minimal heat.

Though the two-chip sandwich is unlikely to find its way into your laptop, the new design could influence the future of data centers that manage very high volumes of data communication.

"Every time you are on a video call, stream a movie, or play an online video game, you're routing data back and forth through a data center to be processed," says Caltech graduate student Arian Hashemi Talkhooncheh (MS '16), lead author of a paper describing the two-chip innovation that was published in the IEEE Journal of Solid-State Circuits. "There are more than 2,700 data centers in the U.S. and more than 8,000 worldwide, with towers of servers stacked on top of each other to manage the load of thousands of terabytes of data going in and out every second."

Just as your laptop heats up on your lap while you use it, the towers of servers in data centers that keep us all connected also heat up as they work, just at a much greater scale. Some data centers are even built underwater to cool the whole facility more easily. The more efficient they can be made, the less heat they will generate, and ultimately, the greater the volume of information that they will be able to manage.

Tuesday, November 15, 2022

Solving brain dynamics gives rise to flexible machine-learning models

Studying the brains of small species recently helped MIT researchers better model the interaction between neurons and synapses — the building blocks of natural and artificial neural networks — into a class of flexible, robust machine-learning models that learn on the job and can adapt to changing conditions.
Image Credit: Ramin Hasani/Stable Diffusion

Last year, MIT researchers announced that they had built “liquid” neural networks, inspired by the brains of small species: a class of flexible, robust machine learning models that learn on the job and can adapt to changing conditions, for real-world safety-critical tasks like driving and flying. The flexibility of these “liquid” neural nets yielded better decision-making for many tasks involving time-series data, such as brain and heart monitoring, weather forecasting, and stock pricing.

But these models become computationally expensive as their number of neurons and synapses increases, and they require clunky computer programs to solve their underlying, complicated math. And all of this math, similar to many physical phenomena, becomes harder to solve with size, requiring the computation of many small steps to arrive at a solution.

Now, the same team of scientists has discovered a way to alleviate this bottleneck by solving the differential equation behind the interaction of two neurons through synapses, unlocking a new type of fast and efficient artificial intelligence algorithm. These models have the same characteristics as liquid neural nets — flexible, causal, robust, and explainable — but are orders of magnitude faster and scalable. This type of neural net could therefore be used for any task that involves getting insight into data over time, as they’re compact and adaptable even after training — while many traditional models are fixed.
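
For intuition about the bottleneck being removed, the toy sketch below steps a single simplified "liquid"-style neuron forward with a conventional Euler solver, which takes thousands of tiny steps to reach the answer. The equation is a stand-in chosen for illustration, not the exact model or the closed-form solution from the paper.

```python
import math

# Toy illustration of why numerical ODE solvers are costly for "liquid" neurons:
# a single leaky neuron driven by a synaptic input is stepped forward with many
# tiny Euler steps. The equation below is a simplified stand-in, not the exact
# model from the MIT work.

def simulate_neuron(t_end=1.0, dt=1e-4, tau=0.1, w=2.0, input_current=1.0):
    """Euler integration of dx/dt = -x/tau + w * sigmoid(I) * (1 - x)."""
    x, steps, t = 0.0, 0, 0.0
    synaptic_drive = w / (1.0 + math.exp(-input_current))   # sigmoid synapse
    while t < t_end:
        dxdt = -x / tau + synaptic_drive * (1.0 - x)
        x += dt * dxdt
        t += dt
        steps += 1
    return x, steps

state, steps = simulate_neuron()
print(f"final state {state:.4f} after {steps} solver steps")   # ~10,000 steps
```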

Wednesday, November 2, 2022

Study urges caution when comparing neural networks to the brain

Image Credits: Christine Daniloff | Massachusetts Institute of Technology

Neural networks, a type of computing system loosely modeled on the organization of the human brain, form the basis of many artificial intelligence systems for applications such as speech recognition, computer vision, and medical image analysis.

In the field of neuroscience, researchers often use neural networks to try to model the same kind of tasks that the brain performs, in hopes that the models could suggest new hypotheses regarding how the brain itself performs those tasks. However, a group of researchers at MIT is urging caution when interpreting these models.

In an analysis of more than 11,000 neural networks that were trained to simulate the function of grid cells — key components of the brain’s navigation system — the researchers found that neural networks only produced grid-cell-like activity when they were given very specific constraints that are not found in biological systems.

“What this suggests is that in order to obtain a result with grid cells, the researchers training the models needed to bake in those results with specific, biologically implausible implementation choices,” says Rylan Schaeffer, a former senior research associate at MIT.

Tuesday, October 11, 2022

Bristol researchers make important breakthrough in quantum computing


Researchers from the University of Bristol, quantum start-up Phasecraft, and Google Quantum AI have revealed properties of electronic systems that could be used for the development of more efficient batteries and solar cells.

The findings, published today in Nature Communications, describe how the team has taken an important first step towards using quantum computers to determine low-energy properties of strongly correlated electronic systems that cannot be solved by classical computers. They did this by developing the first truly scalable algorithm for observing ground-state properties of the Fermi-Hubbard model on a quantum computer. The Fermi-Hubbard model is a way of discovering crucial insights into the electronic and magnetic properties of materials.

Modeling quantum systems of this form has significant practical implications, including the design of new materials that could be used in the development of more effective solar cells and batteries, or even high-temperature superconductors. However, doing so remains beyond the capacity of the world’s most powerful supercomputers. The Fermi-Hubbard model is widely recognized as an excellent benchmark for near-term quantum computers because it is the simplest materials system that includes non-trivial correlations beyond what is captured by classical methods. Approximately producing the lowest-energy (ground) state of the Fermi-Hubbard model enables the user to calculate key physical properties of the model.
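
For readers who want the model itself, the textbook Fermi-Hubbard Hamiltonian (the standard form, not reproduced from the paper) combines a hopping term with an on-site interaction term:

```latex
H = -t \sum_{\langle i,j \rangle,\,\sigma}
      \left( c^{\dagger}_{i\sigma} c_{j\sigma} + c^{\dagger}_{j\sigma} c_{i\sigma} \right)
    + U \sum_{i} n_{i\uparrow}\, n_{i\downarrow}
```

Here t is the amplitude for an electron hopping between neighboring lattice sites, U is the energy penalty when two electrons of opposite spin occupy the same site, and n counts the electrons of a given spin on each site; the competition between these two terms is what makes the ground state so hard to compute classically.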

In the past, researchers have only succeeded in solving small, highly simplified Fermi-Hubbard instances on a quantum computer. This research shows that much more ambitious results are possible. Leveraging a new, highly efficient algorithm and better error-mitigation techniques, they successfully ran an experiment that is four times larger – and consists of 10 times more quantum gates – than anything previously recorded.

Thursday, October 6, 2022

As ransomware attacks increase, new algorithm may help prevent power blackouts

Saurabh Bagchi, a Purdue professor of electrical and computer engineering, develops ways to improve the cybersecurity of power grids and other critical infrastructure.
Credit: Purdue University photo/Vincent Walter

Millions of people could suddenly lose electricity if a ransomware attack just slightly tweaked energy flow onto the U.S. power grid.

No single power utility company has enough resources to protect the entire grid, but maybe all 3,000 of the grid’s utilities could fill in the most crucial security gaps if there were a map showing where to prioritize their security investments.

Purdue University researchers have developed an algorithm to create that map. Using this tool, regulatory authorities or cyber insurance companies could establish a framework that guides the security investments of power utility companies to parts of the grid at greatest risk of causing a blackout if hacked.
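
The Purdue algorithm itself is not described in this summary; as a much simpler stand-in for the "map of where to invest first" idea, the sketch below ranks hypothetical grid components by likelihood of compromise times blackout impact and spends a limited hardening budget on the top of the list.

```python
# Toy sketch of prioritizing security spending across grid components by a
# simple risk score (likelihood of compromise times blackout impact). The
# Purdue algorithm is far more sophisticated; this only illustrates the idea
# of a prioritization map. All names and numbers are invented.

components = [
    # name,              compromise_likelihood, blackout_impact_mw
    ("substation_A",     0.30,                  1200),
    ("control_center_B", 0.10,                  5000),
    ("feeder_C",         0.55,                   150),
    ("generator_D",      0.20,                  2500),
]

budget_slots = 2   # how many components we can afford to harden this cycle

ranked = sorted(components, key=lambda c: c[1] * c[2], reverse=True)
for name, likelihood, impact in ranked[:budget_slots]:
    print(f"harden {name}: expected unserved load ~ {likelihood * impact:.0f} MW")
```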

Power grids are a type of critical infrastructure, which is any network – whether physical like water systems or virtual like health care record keeping – considered essential to a country’s function and safety. The biggest ransomware attacks in history have happened in the past year, affecting most sectors of critical infrastructure in the U.S. such as grain distribution systems in the food and agriculture sector and the Colonial Pipeline, which carries fuel throughout the East Coast.

Thursday, September 22, 2022

Conventional Computers Can Learn to Solve Tricky Quantum Problems

Hsin-Yuan (Robert) Huang
Credit: Caltech

There has been a lot of buzz about quantum computers and for good reason. The futuristic computers are designed to mimic what happens in nature at microscopic scales, which means they have the power to better understand the quantum realm and speed up the discovery of new materials, including pharmaceuticals, environmentally friendly chemicals, and more. However, experts say viable quantum computers are still a decade away or more. What are researchers to do in the meantime?

A new Caltech-led study in the journal Science describes how machine learning tools, run on classical computers, can be used to make predictions about quantum systems and thus help researchers solve some of the trickiest physics and chemistry problems. While this notion has been proposed before, the new report is the first to mathematically prove that the method works in problems that no traditional algorithms could solve.

"Quantum computers are ideal for many types of physics and materials science problems," says lead author Hsin-Yuan (Robert) Huang, a graduate student working with John Preskill, the Richard P. Feynman Professor of Theoretical Physics and the Allen V. C. Davis and Lenabelle Davis Leadership Chair of the Institute for Quantum Science and Technology (IQIM). "But we aren't quite there yet and have been surprised to learn that classical machine learning methods can be used in the meantime. Ultimately, this paper is about showing what humans can learn about the physical world."

Tuesday, September 20, 2022

Supercomputing and 3D printing capture the aerodynamics of F1 cars

A photo of the 3D color printed McLaren 17D Formula One front wing endplate. The colors visualize the complex flow a fraction of a millimeter away from the wing's surface.
Photo credit: KAUST

In Formula One race car design, the manipulation of airflow around the car is the most important factor in performance. A 1% gain in aerodynamics performance can mean the difference between first place and a forgotten finish, which is why teams employ hundreds of people and spend millions of dollars perfecting this manipulation.

Of special interest is the design of the front wing endplate, which is critical for the drag and lift of the car. Dr. Matteo Parsani, associate professor of applied mathematics and computational science at King Abdullah University of Science and Technology (KAUST), has led a multidisciplinary team of scientists and engineers to simulate and 3D color print the solution of the McLaren 17D Formula One front wing endplate. The work is the result of a massive high-performance computing simulation, with contributing expertise from research scientist Dr. Lisandro Dalcin of the KAUST Extreme Computing Research Center (ECRC), directed by Dr. David Keyes, as well as the Advanced Algorithm and Numerical Simulations Lab (AANSLab) and the Prototyping and Product Development Core Lab (PCL).

Wednesday, September 7, 2022

As threats to the U.S. power grid surge

WVU Lane Department of Computer Science and Electrical Engineering students Partha Sarker, Paroma Chatterjee and Jannatul Adan, discuss a power grid simulation project led by Anurag Srivastava, professor and department chair, in the GOLab.
Photo credit: Brian Persinger | WVU

The electrical grid faces a mounting barrage of threats that could trigger a butterfly effect – floods, superstorms, heat waves, cyberattacks, not to mention its own ballooning complexity and size – that the nation is unprepared to handle, according to one West Virginia University scientist.

But Anurag Srivastava, professor and chair of the Lane Department of Computer Science and Electrical Engineering, has plans to prevent and respond to potential power grid failures, thanks to a pair of National Science Foundation-funded research projects.

“In the grid, we have the butterfly effect,” Srivastava said. “This means that if a butterfly flutters its wings in Florida, that will cause a windstorm in Connecticut because things are synchronously connected, like dominos. In the power grid, states like Florida, Connecticut, Illinois and West Virginia are all part of the eastern interconnection and linked together.

“If a big event happens in the Deep South, it is going to cause a problem up north. To stop that, we need to detect the problem area as soon as possible and gracefully separate that part out so the disturbance does not propagate through the whole.”
