Scientific Frontline: Computer Science

Tuesday, April 12, 2022

Cloud server leasing can leave sensitive data up for grabs


Renting space and IP addresses on a public server has become standard business practice, but according to a team of Penn State computer scientists, current industry practices can lead to "cloud squatting," which can create a security risk, endangering sensitive customer and organization data intended to remain private.

Cloud squatting occurs when a company, such as a bank, leases space and IP addresses (the unique addresses that identify individual computers or computer networks) on a public server, uses them, and then releases them back to the cloud provider, a standard practice seen every day. The provider, such as Amazon, Google, or Microsoft, then assigns the same addresses to a second company. If that second company is a bad actor, it can receive traffic still directed at those addresses and intended for the original company, for example when a customer unknowingly follows an outdated link while interacting with the bank, and exploit that information. This is cloud squatting.
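The risk is easy to illustrate. The short Python sketch below shows how an organization might audit its own DNS records for entries that still point at cloud addresses it no longer leases; the hostnames and address list are hypothetical, not taken from the Penn State study.

```python
import socket

# Addresses the organization currently leases from its cloud provider.
# (Illustrative values; in practice this list would come from the
# provider's inventory or API.)
CURRENTLY_LEASED = {"198.51.100.10", "198.51.100.11"}

# Public DNS names the organization controls. A record that resolves to
# an address outside the leased set may now belong to another tenant,
# the "cloud squatting" scenario described above.
RECORDS_TO_AUDIT = ["api.example-bank.com", "files.example-bank.com"]

for hostname in RECORDS_TO_AUDIT:
    try:
        ip = socket.gethostbyname(hostname)
    except socket.gaierror:
        print(f"{hostname}: does not resolve")
        continue
    verdict = "OK" if ip in CURRENTLY_LEASED else "STALE: points at a released address"
    print(f"{hostname} -> {ip}  [{verdict}]")
```

Records flagged as stale correspond to the "outdated links" in the bank example: traffic sent to them is delivered to whoever holds the address now.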

"There are two advantages to leasing server space," said Eric Pauley, doctoral candidate in computer science and engineering. "One is a cost advantage, saving on equipment and management. The other is scalability. Leasing server space offers an unlimited pool of computing resources so, as workload changes, companies can quickly adapt." As a result, the use of clouds has grown exponentially, meaning almost every website a user visits takes advantage of cloud computing.

Saturday, April 9, 2022

‘Frustrated’ nanomagnets order themselves through disorder

Source/Credit: Yale University

Extremely small arrays of magnets with strange and unusual properties can order themselves by increasing entropy, the quantity that measures a physical system's tendency to disorder. This behavior appears to contradict standard thermodynamics, but doesn't.

“Paradoxically, the system orders because it wants to be more disordered,” said Cristiano Nisoli, a physicist at Los Alamos and coauthor of a paper about the research published in Nature Physics. “Our research demonstrates entropy-driven order in a structured system of magnets at equilibrium.”
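The apparent contradiction dissolves in one textbook relation (standard thermodynamics, not an equation specific to the paper): at fixed temperature T, an equilibrium system minimizes its Helmholtz free energy,

```latex
F = U - TS
```

so a configuration with higher entropy S can have the lowest free energy even if part of the system looks more ordered. In tetris spin ice, ordering one subset of the magnets lets the rest of the system fluctuate more, and the entropy gain pays for the order.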

The system examined in this work, known as tetris spin ice, was studied as part of a long-standing collaboration between Nisoli and Peter Schiffer at Yale University, with theoretical analysis and simulations led at Los Alamos and experimental work led at Yale. The research team includes scientists from a number of universities and academic institutions.

Nanomagnet arrays, like tetris spin ice, show promise as circuits of logic gates in neuromorphic computing, a leading-edge computing architecture that closely mimics how the brain works. They also have possible applications in a number of high-frequency devices using “magnonics” that exploit the dynamics of magnetism on the nanoscale.

Thursday, April 7, 2022

Engineered crystals could help computers run on less power

Researchers at the University of California, Berkeley, have created engineered crystal structures that display an unusual physical phenomenon known as negative capacitance. Incorporating this material into advanced silicon transistors could make computers more energy efficient.
Credit: UC Berkeley image by Ella Maru Studio

Computers may be growing smaller and more powerful, but they require a great deal of energy to operate. The total amount of energy the U.S. dedicates to computing has risen dramatically over the last decade and is quickly approaching that of other major sectors, like transportation.

In a study published online this week in the journal Nature, University of California, Berkeley, engineers describe a major breakthrough in the design of a component of transistors — the tiny electrical switches that form the building blocks of computers — that could significantly reduce their energy consumption without sacrificing speed, size or performance. The component, called the gate oxide, plays a key role in switching the transistor on and off.

“We have been able to show that our gate-oxide technology is better than commercially available transistors: What the trillion-dollar semiconductor industry can do today — we can essentially beat them,” said study senior author Sayeef Salahuddin, the TSMC Distinguished Professor of Electrical Engineering and Computer Sciences at UC Berkeley.

This boost in efficiency is made possible by an effect called negative capacitance, which helps reduce the amount of voltage that is needed to store charge in a material. Salahuddin theoretically predicted the existence of negative capacitance in 2008 and first demonstrated the effect in a ferroelectric crystal in 2011.
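A simplified way to see why negative capacitance lowers the operating voltage is the series-capacitor picture (a textbook illustration, not the device model from the Nature paper). If the gate oxide contributes a negative capacitance C_NC in series with the rest of the transistor's capacitance C_S, then

```latex
\frac{1}{C_{\mathrm{total}}} = \frac{1}{C_{\mathrm{NC}}} + \frac{1}{C_{\mathrm{S}}},
\qquad C_{\mathrm{NC}} < 0 \;\text{and}\; |C_{\mathrm{NC}}| > C_{\mathrm{S}}
\;\Rightarrow\; C_{\mathrm{total}} > C_{\mathrm{S}}.
```

Because the charge stored on the channel is Q = C_total V, a larger effective capacitance reaches the same charge at a lower applied voltage, which is the efficiency gain described above.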

Microcavities as a sensor platform

Nanoparticles trapped between mirrors might be a promising platform for quantum sensors.
Credit: IQOQI Innsbruck

Sensors are a pillar of the Internet of Things, providing the data to control all sorts of objects. Here, precision is essential, and this is where quantum technologies could make a difference. Researchers in Innsbruck and Zurich are now demonstrating how nanoparticles in tiny optical resonators can be transferred into the quantum regime and used as high-precision sensors.

Advances in quantum physics offer new opportunities to significantly improve the precision of sensors and thus enable new technologies. A team led by Oriol Romero-Isart of the Institute of Quantum Optics and Quantum Information at the Austrian Academy of Sciences and the Department of Theoretical Physics at the University of Innsbruck, and a team led by Romain Quidant of ETH Zurich, are now proposing a new concept for a high-precision quantum sensor. The researchers suggest that the motional fluctuations of a nanoparticle trapped in a microscopic optical resonator could be reduced significantly below the zero-point motion by exploiting the fast unstable dynamics of the system.
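For scale, the zero-point motion mentioned here is the irreducible position spread of a harmonic oscillator in its quantum ground state (a standard textbook expression, not a result derived in the paper):

```latex
x_{\mathrm{zpf}} = \sqrt{\frac{\hbar}{2 m \Omega}}
```

where m is the nanoparticle's mass and Ω the trap frequency. Reducing the position uncertainty below this scale is what would let the device resolve smaller signals.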

Wednesday, April 6, 2022

Does this artificial intelligence think like a human?

MIT researchers developed a method that helps a user understand a machine-learning model’s reasoning, and how that reasoning compares to that of a human.
Credits: Christine Daniloff, MIT

In machine learning, understanding why a model makes certain decisions is often just as important as whether those decisions are correct. For instance, a machine-learning model might correctly predict that a skin lesion is cancerous, but it could have done so using an unrelated blip on a clinical photo.

While tools exist to help experts make sense of a model’s reasoning, often these methods only provide insights on one decision at a time, and each must be manually evaluated. Models are commonly trained using millions of data inputs, making it almost impossible for a human to evaluate enough decisions to identify patterns.

Now, researchers at MIT and IBM Research have created a method that enables a user to aggregate, sort, and rank these individual explanations to rapidly analyze a machine-learning model’s behavior. Their technique, called Shared Interest, incorporates quantifiable metrics that compare how well a model’s reasoning matches that of a human.

Shared Interest could help a user easily uncover concerning trends in a model’s decision-making — for example, perhaps the model often becomes confused by distracting, irrelevant features, like background objects in photos. Aggregating these insights could help the user quickly and quantitatively determine whether a model is trustworthy and ready to be deployed in a real-world situation.
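The paper defines its own quantifiable metrics, but the core idea, scoring the overlap between the region a saliency method highlights and the region a human annotated, can be sketched in a few lines of Python (the function name and the specific overlap statistics are illustrative, not Shared Interest's actual definitions):

```python
import numpy as np

def overlap_scores(saliency_mask: np.ndarray, human_mask: np.ndarray) -> dict:
    """Compare a model's salient region with a human-annotated region.

    Both inputs are boolean arrays of the same shape (True = relevant pixel).
    """
    intersection = np.logical_and(saliency_mask, human_mask).sum()
    union = np.logical_or(saliency_mask, human_mask).sum()
    return {
        # How much of the human-annotated evidence the model also used.
        "human_coverage": intersection / max(human_mask.sum(), 1),
        # How much of the model's salient region humans consider relevant.
        "saliency_precision": intersection / max(saliency_mask.sum(), 1),
        # Overall agreement between the two regions.
        "iou": intersection / max(union, 1),
    }
```

Aggregated over a whole dataset, consistently low saliency precision would flag exactly the failure mode described above: a model that leans on irrelevant background features.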

Monday, March 28, 2022

Getting quantum dots to grow in regular patterns

With this experimental setup, the researchers check the quality of the quantum dots. Green laser light is used to stimulate the quantum dots that then emit infrared light.
© İsmail Bölükbaşı

With the previous manufacturing process, the density of the structures was difficult to control. Now researchers can create a kind of checkerboard pattern, a step toward applications such as quantum computers.

Quantum dots could one day form the basic information units of quantum computers. Researchers at Ruhr University Bochum (RUB) and the Technical University of Munich (TUM), together with colleagues from Copenhagen and Basel, have significantly improved the manufacturing process for these tiny semiconductor structures. The quantum dots are generated on a wafer, a thin semiconductor crystal disc. So far, the density of the structures on it has been difficult to control. Now scientists can create specific arrangements, an important step toward a practical component containing a large number of quantum dots.

The team published its results on March 28, 2022, in the journal Nature Communications. The collaboration brought together a group led by Nikolai Bart, Prof. Dr. Andreas Wieck, and Dr. Arne Ludwig of the RUB Chair for Applied Solid State Physics; the team of Christian Dangel and Prof. Dr. Jonathan Finley from the TUM research group for semiconductor nanostructures and quantum systems; and colleagues from the universities of Copenhagen and Basel.

A tool for predicting the future

MIT researchers created a tool that enables people to make highly accurate predictions from multiple time series with just a few keystrokes. The powerful algorithm at the heart of their tool can transform multiple time series into a tensor, a multi-dimensional array of numbers (pictured). Credit: Figure courtesy of the researchers. Source: MIT

Whether someone is trying to predict tomorrow’s weather, forecast future stock prices, identify missed opportunities for sales in retail, or estimate a patient’s risk of developing a disease, they will likely need to interpret time-series data, collections of observations recorded over time.

Making predictions using time-series data typically requires several data-processing steps and the use of complex machine-learning algorithms, which have such a steep learning curve they aren’t readily accessible to nonexperts.

To make these powerful tools more user-friendly, MIT researchers developed a system that directly integrates prediction functionality on top of an existing time-series database. Their simplified interface, which they call tspDB (time series predict database), does all the complex modeling behind the scenes so a nonexpert can easily generate a prediction in only a few seconds.

The new system is more accurate and more efficient than state-of-the-art deep learning methods when performing two tasks: predicting future values and filling in missing data points.
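The tensor idea from the figure caption can be illustrated with a generic low-rank sketch: stack several related series into one array, then use a truncated SVD to estimate missing entries. This is a simplified stand-in, not tspDB's actual algorithm, and all values below are synthetic.

```python
import numpy as np

# Three related time series (rows), 100 steps each, with ~10% of the
# entries knocked out to simulate missing data.
rng = np.random.default_rng(0)
t = np.arange(100)
series = np.stack([np.sin(0.1 * t), np.sin(0.1 * t + 1.0), np.cos(0.1 * t)])
series[rng.random(series.shape) < 0.1] = np.nan

# Fill gaps with each series' own mean, then keep only the top singular
# components; the low-rank structure shared across the series supplies
# estimates for the missing points.
filled = np.where(np.isnan(series), np.nanmean(series, axis=1, keepdims=True), series)
U, s, Vt = np.linalg.svd(filled, full_matrices=False)
rank = 2
estimate = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

# Replace only the missing entries with their low-rank estimates.
imputed = np.where(np.isnan(series), estimate, series)
```

Extending the same low-rank estimate forward in time is the basic ingredient of forecasting with matrix-estimation methods, the family of techniques the caption alludes to.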
