Scientific Frontline: Computer Science

Friday, May 27, 2022

Same symptom – different cause?

Head of the LipiTUM research group Dr. Josch Konstantin Pauling (left) and PhD student Nikolai Köhler (right) interpret the disease-related changes in lipid metabolism using a newly developed network.
Credit: LipiTUM

Machine learning is playing an ever-increasing role in biomedical research. Scientists at the Technical University of Munich (TUM) have now developed a new method that uses molecular data to extract subtypes of illnesses. In the future, this method could help to support studies of larger patient groups.

Nowadays doctors define and diagnose most diseases on the basis of symptoms. However, that does not necessarily mean that the illnesses of patients with similar symptoms will have identical causes or demonstrate the same molecular changes. In biomedicine, one often speaks of the molecular mechanisms of a disease. This refers to changes in the regulation of genes, proteins or metabolic pathways at the onset of illness. The goal of stratified medicine is to classify patients into various subtypes at the molecular level in order to provide more targeted treatments.

New machine learning algorithms can help to extract disease subtypes from large pools of patient data. They are designed to independently recognize patterns and correlations in extensive clinical measurements. The LipiTUM junior research group, headed by Dr. Josch Konstantin Pauling of the Chair for Experimental Bioinformatics, has developed an algorithm for this purpose.
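The LipiTUM algorithm itself is not described in detail here; as a hedged illustration of the general idea of molecular subtyping, the sketch below clusters synthetic "lipidomics" profiles into candidate subtypes with scikit-learn. All data and parameter choices are made up for the example.

```python
# Minimal sketch of molecular subtype extraction -- NOT the LipiTUM algorithm,
# just a generic illustration on synthetic "lipidomics" features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic data: 200 patients x 50 lipid species, drawn from two hidden subtypes
# that share the same clinical symptom but differ in their molecular profile.
subtype_a = rng.normal(loc=0.0, scale=1.0, size=(100, 50))
subtype_b = rng.normal(loc=1.5, scale=1.0, size=(100, 50))
X = np.vstack([subtype_a, subtype_b])

# Standardize each lipid species, then cluster patients into candidate subtypes.
X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)

print("patients per candidate subtype:", np.bincount(labels))
```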

Friday, May 20, 2022

Neuromorphic Memory Device Simulates Neurons and Synapses

A neuromorphic memory device consisting of bottom volatile and top nonvolatile memory layers emulating neuronal and synaptic properties, respectively
Credit: KAIST

Researchers have reported a nano-sized neuromorphic memory device that emulates neurons and synapses simultaneously in a unit cell, another step toward completing the goal of neuromorphic computing designed to rigorously mimic the human brain with semiconductor devices.

Neuromorphic computing aims to realize artificial intelligence (AI) by mimicking the mechanisms of neurons and synapses that make up the human brain. Inspired by the cognitive functions of the human brain that current computers cannot provide, neuromorphic devices have been widely investigated. However, current Complementary Metal-Oxide Semiconductor (CMOS)-based neuromorphic circuits simply connect artificial neurons and synapses without synergistic interactions, and the concomitant implementation of neurons and synapses remains a challenge. To address these issues, a research team led by Professor Keon Jae Lee from the Department of Materials Science and Engineering emulated the brain's biological working mechanisms by introducing neuron-synapse interactions in a single memory cell, rather than taking the conventional approach of electrically connecting separate artificial neuronal and synaptic devices.
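As a loose software analogy (not a model of the KAIST device), the sketch below couples a volatile "neuron" state that integrates and fires with a nonvolatile "synapse" weight that is updated whenever the neuron fires. All constants are arbitrary placeholder values.

```python
# Toy software analogue of a coupled neuron + synapse in one unit -- only an
# illustration of the two roles: a volatile neuronal state that integrates and
# fires, and a nonvolatile synaptic weight updated by a crude Hebbian-style rule.
def simulate(inputs, threshold=1.0, leak=0.9, learn_rate=0.05):
    membrane = 0.0   # volatile neuronal state (decays each step)
    weight = 0.5     # nonvolatile synaptic state (persists between spikes)
    spikes = []
    for x in inputs:
        membrane = leak * membrane + weight * x     # leaky integration of weighted input
        if membrane >= threshold:                   # fire and reset
            spikes.append(1)
            membrane = 0.0
            weight = min(1.0, weight + learn_rate)  # potentiate the synapse on firing
        else:
            spikes.append(0)
    return spikes, weight

spikes, final_weight = simulate([0.6] * 20)
print(sum(spikes), "spikes, final synaptic weight:", round(final_weight, 2))
```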

Artificial intelligence predicts patients’ race from their medical images

Researchers demonstrated that medical AI systems can easily learn to recognize racial identity in medical images, and that this capability is extremely difficult to isolate or mitigate.
 Credit: Massachusetts Institute of Technology

The miseducation of algorithms is a critical problem; when artificial intelligence mirrors the unconscious thoughts, racism, and biases of the humans who generated these algorithms, it can lead to serious harm. Computer programs, for example, have wrongly flagged Black defendants as twice as likely to reoffend as white defendants. When an AI used cost as a proxy for health needs, it falsely labeled Black patients as healthier than equally sick white patients, because less money was spent on them. Even an AI used to write a play relied on harmful stereotypes for casting.

Removing sensitive features from the data seems like a viable tweak. But what happens when it’s not enough?
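One way to see why dropping the sensitive column is often not enough: correlated proxy features can still carry the signal. The sketch below is a hedged illustration on synthetic data, not the medical-imaging study itself.

```python
# Hedged illustration (synthetic data, not the imaging study): even after the
# sensitive attribute is dropped, a classifier can recover it from a correlated
# proxy feature, so "removing the column" is not sufficient mitigation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
sensitive = rng.integers(0, 2, size=n)             # hidden attribute (0 or 1)
proxy = sensitive + rng.normal(0, 0.5, size=n)      # feature correlated with it
noise = rng.normal(0, 1.0, size=(n, 5))             # unrelated features
X = np.column_stack([proxy, noise])                 # sensitive column NOT included

X_tr, X_te, y_tr, y_te = train_test_split(X, sensitive, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("accuracy recovering the 'removed' attribute:", round(clf.score(X_te, y_te), 2))
```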

Wednesday, May 18, 2022

A component for brain-inspired computing

Scientists aim to perform machine-learning tasks more efficiently with processors that emulate the working principles of the human brain.
Image: Unsplash

Researchers from ETH Zurich, Empa and the University of Zurich have developed a new material for an electronic component that can be used in a wider range of applications than its predecessors. Such components will help create electronic circuits that emulate the human brain and that are more efficient than conventional computers at performing machine-learning tasks.

Compared with computers, the human brain is incredibly energy-efficient. Scientists are therefore drawing on how the brain and its interconnected neurons function for inspiration in designing innovative computing technologies. They foresee that these brain-inspired computing systems will be more energy-efficient than conventional ones, as well as better at performing machine-learning tasks.

Much like neurons, which are responsible for both data storage and data processing in the brain, scientists want to combine storage and processing in a single type of electronic component, known as a memristor. Their hope is that this will help to achieve greater efficiency because moving data between the processor and the storage, as conventional computers do, is the main reason for the high energy consumption in machine-learning applications.
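As a rough illustration of why combining storage and processing helps (not the ETH/Empa/UZH device itself), a memristor crossbar can be thought of as a stored conductance matrix that performs a matrix-vector multiply in place when read voltages are applied, so the weights never leave the memory. The numbers below are placeholders.

```python
# Sketch of the idea behind in-memory computing with memristor crossbars:
# the stored conductances G (the "memory") multiply an applied voltage vector v
# in place, I = G @ v, so weights are not shuttled to a separate processor.
import numpy as np

rng = np.random.default_rng(2)
G = rng.uniform(1e-6, 1e-4, size=(4, 8))   # conductances in siemens (stored weights)
v = rng.uniform(0.0, 0.2, size=8)           # read voltages applied to the columns

I = G @ v   # per-row current sums: the multiply-accumulate happens "in memory"
print("output currents (amperes):", I)
```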

Tuesday, May 17, 2022

New Approach Allows for Faster Ransomware Detection

Photo credit: Michael Geiger

Engineering researchers have developed a new approach for implementing ransomware detection techniques, allowing them to detect a broad range of ransomware far more quickly than previous systems.

Ransomware is a type of malware. When a system is infiltrated by ransomware, the ransomware encrypts that system’s data, making the data inaccessible to users. The people responsible for the ransomware then extort the affected system’s operators, demanding payment in exchange for restoring the users’ access to their own data.
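The researchers' specific detection technique is not detailed here; as a generic, hedged illustration, one common heuristic flags files whose contents suddenly look like ciphertext, since freshly encrypted data has near-maximal byte entropy.

```python
# Generic illustration only -- not the detection approach described in the article.
# Freshly encrypted files have near-maximal byte entropy (~8 bits/byte), so a
# sudden jump in entropy after a write can flag suspicious activity.
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

plain = b"The quick brown fox jumps over the lazy dog. " * 100
random_like = bytes(range(256)) * 20   # stands in for ciphertext
print("plaintext entropy:", round(byte_entropy(plain), 2))
print("ciphertext-like entropy:", round(byte_entropy(random_like), 2))
```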

Ransomware extortion is hugely expensive, and instances of ransomware extortion are on the rise. The FBI reports receiving 3,729 ransomware complaints in 2021, with costs of more than $49 million. What’s more, 649 of those complaints were from organizations classified as critical infrastructure.

Saturday, April 30, 2022

Researchers Create Self-Assembled Logic Circuits from Proteins


In a proof-of-concept study, researchers have created self-assembled, protein-based circuits that can perform simple logic functions. The work demonstrates that it is possible to create stable digital circuits that take advantage of an electron’s properties at quantum scales.

One of the stumbling blocks in creating molecular circuits is that as the circuit size decreases, the circuits become unreliable. This is because the electrons needed to create current behave like waves, not particles, at the quantum scale. For example, in a circuit with two wires that are one nanometer apart, an electron can “tunnel” between the two wires and effectively be in both places simultaneously, making it difficult to control the direction of the current. Molecular circuits can mitigate these problems, but single-molecule junctions are short-lived or low-yielding due to challenges associated with fabricating electrodes at that scale.
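To get a feel for the scale involved, a back-of-the-envelope estimate (illustrative numbers only, not from the paper) shows how sharply the probability of tunneling through a simple rectangular barrier falls off with distance, which is why it matters at roughly one nanometer but not at everyday scales.

```python
# Back-of-the-envelope estimate with illustrative numbers: tunneling through a
# rectangular barrier of width d and height phi falls off roughly as
# T ~ exp(-2*kappa*d) with kappa = sqrt(2*m*phi)/hbar.
import math

HBAR = 1.054_571_817e-34   # J*s
M_E = 9.109_383_7015e-31   # electron mass, kg
EV = 1.602_176_634e-19     # J per eV

def tunneling_probability(width_nm: float, barrier_ev: float = 1.0) -> float:
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * width_nm * 1e-9)

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"d = {d} nm -> T ~ {tunneling_probability(d):.2e}")
```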

“Our goal was to try and create a molecular circuit that uses tunneling to our advantage, rather than fighting against it,” says Ryan Chiechi, associate professor of chemistry at North Carolina State University and co-corresponding author of a paper describing the work.

Friday, April 29, 2022

Fermilab engineers develop new control electronics for quantum computers that improve performance and cut costs

Gustavo Cancelo led a team of Fermilab engineers to create a new compact electronics board: it has the capabilities of an entire rack of equipment, is compatible with many designs of superconducting qubits, and costs a fraction as much.
Photo: Ryan Postel, Fermilab

When designing a next-generation quantum computer, a surprisingly large problem is bridging the communication gap between the classical and quantum worlds. Such computers need specialized control and readout electronics to translate back and forth between the human operator and the quantum computer’s languages — but existing systems are cumbersome and expensive.

However, a new system of control and readout electronics, known as the Quantum Instrumentation Control Kit, or QICK, developed by engineers at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, has been shown to drastically improve quantum computer performance while cutting the cost of control equipment.
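As a hedged sketch of one job such control electronics perform, the snippet below synthesizes a Gaussian-shaped drive pulse as digital I/Q samples for a DAC. It is a generic numpy illustration, not the QICK firmware or its Python API; the sample rate and frequencies are made-up placeholder values.

```python
# Minimal illustration of one task of qubit control electronics: generating a
# shaped drive tone as digital in-phase/quadrature samples for a DAC.
import numpy as np

fs = 1.0e9          # DAC sample rate, samples/s (assumed)
f_if = 100e6        # intermediate frequency of the drive tone, Hz (assumed)
length = 200e-9     # pulse length, s
sigma = 40e-9       # Gaussian envelope width, s

t = np.arange(0, length, 1 / fs)
envelope = np.exp(-0.5 * ((t - length / 2) / sigma) ** 2)
i_samples = envelope * np.cos(2 * np.pi * f_if * t)   # in-phase channel
q_samples = envelope * np.sin(2 * np.pi * f_if * t)   # quadrature channel

print(len(t), "samples per channel ready for upload to the DAC")
```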

“The development of the Quantum Instrumentation Control Kit is an excellent example of U.S. investment in joint quantum technology research with partnerships between industry, academia and government to accelerate pre-competitive quantum research and development technologies,” said Harriet Kung, DOE deputy director for science programs for the Office of Science and acting associate director of science for high-energy physics.

Tuesday, April 12, 2022

Cloud server leasing can leave sensitive data up for grabs


Renting space and IP addresses on a public server has become standard business practice, but according to a team of Penn State computer scientists, current industry practices can lead to "cloud squatting," which can create a security risk, endangering sensitive customer and organization data intended to remain private.

Cloud squatting occurs when a company, such as a bank, leases space and IP addresses (unique addresses that identify individual computers or computer networks) on a public server, uses them, and then releases the space and addresses back to the public server company, a standard pattern seen every day. The public server company, such as Amazon, Google, or Microsoft, then assigns the same addresses to a second company. If that second company is a bad actor, it can receive information still arriving at those addresses and intended for the original company (for example, when a customer unknowingly follows an outdated link when interacting with the bank) and use it to its own advantage. This is cloud squatting.
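A minimal defensive check, sketched below under assumptions (the hostnames and IP ranges are placeholders, not real infrastructure), is to resolve the hostnames customers might still be using and flag any whose address is no longer inside the ranges the organization currently leases.

```python
# Hedged sketch of a simple check against "cloud squatting": resolve hostnames
# that clients may still reference and flag any whose address is no longer in
# the IP ranges the organization currently leases.
import ipaddress
import socket

CURRENTLY_LEASED = [ipaddress.ip_network("203.0.113.0/24")]   # example range (TEST-NET-3)
REFERENCED_HOSTS = ["api.example.com", "assets.example.com"]  # names still in old links/configs

for host in REFERENCED_HOSTS:
    try:
        addr = ipaddress.ip_address(socket.gethostbyname(host))
    except socket.gaierror:
        print(f"{host}: does not resolve")
        continue
    if any(addr in net for net in CURRENTLY_LEASED):
        print(f"{host} -> {addr}: still points at an address we lease")
    else:
        print(f"{host} -> {addr}: stale reference, traffic may reach someone else's server")
```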

"There are two advantages to leasing server space," said Eric Pauley, doctoral candidate in computer science and engineering. "One is a cost advantage, saving on equipment and management. The other is scalability. Leasing server space offers an unlimited pool of computing resources so, as workload changes, companies can quickly adapt." As a result, the use of clouds has grown exponentially, meaning almost every website a user visits takes advantage of cloud computing.

Saturday, April 9, 2022

‘Frustrated’ nanomagnets order themselves through disorder

Source/Credit: Yale University

Extremely small arrays of magnets with strange and unusual properties can order themselves by increasing entropy, or the tendency of physical systems to disorder, a behavior that appears to contradict standard thermodynamics—but doesn’t.

“Paradoxically, the system orders because it wants to be more disordered,” said Cristiano Nisoli, a physicist at Los Alamos and coauthor of a paper about the research published in Nature Physics. “Our research demonstrates entropy-driven order in a structured system of magnets at equilibrium.”

The system examined in this work, known as tetris spin ice, was studied as part of a long-standing collaboration between Nisoli and Peter Schiffer at Yale University, with theoretical analysis and simulations led at Los Alamos and experimental work led at Yale. The research team includes scientists from a number of universities and academic institutions.

Nanomagnet arrays, like tetris spin ice, show promise as circuits of logic gates in neuromorphic computing, a leading-edge computing architecture that closely mimics how the brain works. They also have possible applications in a number of high-frequency devices using “magnonics” that exploit the dynamics of magnetism on the nanoscale.

Thursday, April 7, 2022

Engineered crystals could help computers run on less power

Researchers at the University of California, Berkeley, have created engineered crystal structures that display an unusual physical phenomenon known as negative capacitance. Incorporating this material into advanced silicon transistors could make computers more energy efficient.
Credit: UC Berkeley image by Ella Maru Studio

Computers may be growing smaller and more powerful, but they require a great deal of energy to operate. The total amount of energy the U.S. dedicates to computing has risen dramatically over the last decade and is quickly approaching that of other major sectors, like transportation.

In a study published online this week in the journal Nature, University of California, Berkeley, engineers describe a major breakthrough in the design of a component of transistors — the tiny electrical switches that form the building blocks of computers — that could significantly reduce their energy consumption without sacrificing speed, size or performance. The component, called the gate oxide, plays a key role in switching the transistor on and off.

“We have been able to show that our gate-oxide technology is better than commercially available transistors: What the trillion-dollar semiconductor industry can do today — we can essentially beat them,” said study senior author Sayeef Salahuddin, the TSMC Distinguished Professor of Electrical Engineering and Computer Sciences at UC Berkeley.

This boost in efficiency is made possible by an effect called negative capacitance, which helps reduce the amount of voltage that is needed to store charge in a material. Salahuddin theoretically predicted the existence of negative capacitance in 2008 and first demonstrated the effect in a ferroelectric crystal in 2011.
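To see how negative capacitance can reduce the required voltage, consider two capacitors in series: the internal node sees V_int = V_applied * C_FE / (C_FE + C_MOS), which exceeds the applied voltage when C_FE is negative and |C_FE| > C_MOS. The toy values below are illustrative, not device parameters.

```python
# Toy numbers, not device values: with an ordinary dielectric the internal node
# sees less than the applied gate voltage, but a series layer with effective
# negative capacitance amplifies it, so the same channel charge needs less
# applied voltage.
def internal_voltage_gain(c_fe: float, c_mos: float) -> float:
    """Voltage-divider gain V_internal / V_applied for two capacitors in series."""
    return c_fe / (c_fe + c_mos)

print("ordinary dielectric (C_FE = +2, C_MOS = 1):", internal_voltage_gain(2.0, 1.0))    # 0.67
print("negative capacitance (C_FE = -2, C_MOS = 1):", internal_voltage_gain(-2.0, 1.0))  # 2.0
```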

Microcavities as a sensor platform

Nanoparticles trapped between mirrors might be a promising platform for quantum sensors.
Credit: IQOQI Innsbruck

Sensors are a pillar of the Internet of Things, providing the data to control all sorts of objects. Here, precision is essential, and this is where quantum technologies could make a difference. Researchers in Innsbruck and Zurich are now demonstrating how nanoparticles in tiny optical resonators can be transferred into the quantum regime and used as high-precision sensors.

Advances in quantum physics offer new opportunities to significantly improve the precision of sensors and thus enable new technologies. A team led by Oriol Romero-Isart of the Institute of Quantum Optics and Quantum Information at the Austrian Academy of Sciences and the Department of Theoretical Physics at the University of Innsbruck and a team led by Romain Quidant of ETH Zurich are now proposing a new concept for a high-precision quantum sensor. The researchers suggest that the motional fluctuations of a nanoparticle trapped in a microscopic optical resonator could be reduced significantly below the zero-point motion by exploiting the fast unstable dynamics of the system.
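For a sense of scale (illustrative parameters, not values from the proposal), the zero-point motion of a harmonically trapped particle is x_zpf = sqrt(ħ / (2 m ω)), which for a levitated nanoparticle works out to roughly picometers.

```python
# Rough sense of scale only, with made-up but plausible parameters:
# zero-point motion of a harmonically trapped nanoparticle.
import math

HBAR = 1.054_571_817e-34           # J*s
density = 2200                     # kg/m^3, silica-like
radius = 50e-9                     # m, ~100 nm diameter particle
mass = density * (4 / 3) * math.pi * radius**3
omega = 2 * math.pi * 100e3        # trap frequency ~100 kHz

x_zpf = math.sqrt(HBAR / (2 * mass * omega))
print(f"mass ~ {mass:.2e} kg, zero-point motion ~ {x_zpf:.2e} m")
```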

Wednesday, April 6, 2022

Does this artificial intelligence think like a human?

MIT researchers developed a method that helps a user understand a machine-learning model’s reasoning, and how that reasoning compares to that of a human.
Credits: Christine Daniloff, MIT

In machine learning, understanding why a model makes certain decisions is often just as important as whether those decisions are correct. For instance, a machine-learning model might correctly predict that a skin lesion is cancerous, but it could have done so using an unrelated blip on a clinical photo.

While tools exist to help experts make sense of a model’s reasoning, often these methods only provide insights on one decision at a time, and each must be manually evaluated. Models are commonly trained using millions of data inputs, making it almost impossible for a human to evaluate enough decisions to identify patterns.

Now, researchers at MIT and IBM Research have created a method that enables a user to aggregate, sort, and rank these individual explanations to rapidly analyze a machine-learning model’s behavior. Their technique, called Shared Interest, incorporates quantifiable metrics that compare how well a model’s reasoning matches that of a human.
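One simple metric of this kind, sketched below as a hedged illustration rather than the actual set of Shared Interest metrics, is the overlap (intersection over union) between a thresholded model-saliency mask and a human-annotated region.

```python
# Generic illustration of comparing model reasoning with a human annotation:
# intersection over union between a thresholded saliency mask and a human mask.
import numpy as np

rng = np.random.default_rng(3)
saliency = rng.random((8, 8))               # model attribution map (toy)
human_mask = np.zeros((8, 8), dtype=bool)
human_mask[2:6, 2:6] = True                 # region a human marked as relevant

model_mask = saliency > 0.7                 # keep the most salient pixels
intersection = np.logical_and(model_mask, human_mask).sum()
union = np.logical_or(model_mask, human_mask).sum()
print("IoU between model saliency and human annotation:", round(intersection / union, 2))
```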

Shared Interest could help a user easily uncover concerning trends in a model’s decision-making — for example, perhaps the model often becomes confused by distracting, irrelevant features, like background objects in photos. Aggregating these insights could help the user quickly and quantitatively determine whether a model is trustworthy and ready to be deployed in a real-world situation.

Monday, March 28, 2022

Making quantum dots grow in regular patterns

With this experimental setup, the researchers check the quality of the quantum dots. Green laser light is used to stimulate the quantum dots, which then emit infrared light.
© İsmail Bölükbaşı

With the previous manufacturing process, the density of the structures was difficult to control. Now researchers can create a kind of checkerboard pattern, a step towards applications such as quantum computers.

Quantum dots could one day form the basic information units of quantum computers. Researchers at Ruhr University Bochum (RUB) and the Technical University of Munich (TUM), together with colleagues from Copenhagen and Basel, have significantly improved the manufacturing process for these tiny semiconductor structures. The quantum dots are grown on a wafer, a thin semiconductor crystal disc. So far, the density of the structures on the wafer has been difficult to control. Now scientists can create specific arrangements, an important step towards an applicable component that will require a large number of quantum dots.

The team published the results on March 28, 2022, in the journal Nature Communications. The work was carried out by a group led by Nikolai Bart, Prof. Dr. Andreas Wieck and Dr. Arne Ludwig from the RUB Chair for Applied Solid State Physics, together with the team around Christian Dangel and Prof. Dr. Jonathan Finley from the TUM working group on semiconductor nanostructures and quantum systems, as well as colleagues from the universities of Copenhagen and Basel.

A tool for predicting the future

MIT researchers created a tool that enables people to make highly accurate predictions using multiple time-series data with just a few keystrokes. The powerful algorithm at the heart of their tool can transform multiple time series into a tensor, which is a multi-dimensional array of numbers (pictured).
Credit: Figure courtesy of the researchers / MIT

Whether someone is trying to predict tomorrow’s weather, forecast future stock prices, identify missed opportunities for sales in retail, or estimate a patient’s risk of developing a disease, they will likely need to interpret time-series data, which are a collection of observations recorded over time.

Making predictions using time-series data typically requires several data-processing steps and the use of complex machine-learning algorithms, which have such a steep learning curve they aren’t readily accessible to nonexperts.

To make these powerful tools more user-friendly, MIT researchers developed a system that directly integrates prediction functionality on top of an existing time-series database. Their simplified interface, which they call tspDB (time series predict database), does all the complex modeling behind the scenes so a nonexpert can easily generate a prediction in only a few seconds.

The new system is more accurate and more efficient than state-of-the-art deep learning methods when performing two tasks: predicting future values and filling in missing data points.
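The tspDB implementation itself is not shown here; as a hedged illustration of the underlying idea, the sketch below stacks related time series into a matrix and uses its low-rank structure (via a truncated SVD) to fill in hidden entries.

```python
# Generic illustration, not the tspDB implementation: stack related time series
# into a matrix and exploit its low-rank structure to impute missing points.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(200)
series = np.vstack([np.sin(0.1 * t + phase) for phase in (0.0, 0.5, 1.0)])
series += 0.05 * rng.normal(size=series.shape)

missing = rng.random(series.shape) < 0.2        # hide 20% of the entries
observed = np.where(missing, 0.0, series)        # simple zero-fill before factorizing

U, s, Vt = np.linalg.svd(observed, full_matrices=False)
rank = 2
reconstructed = (U[:, :rank] * s[:rank]) @ Vt[:rank]

err = np.abs(reconstructed[missing] - series[missing]).mean()
print("mean absolute error on the hidden entries:", round(err, 3))
```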
