
Thursday, September 23, 2021

An experimental loop for simulating nuclear reactors in space

 
Will Searight is conducting research in nuclear thermal propulsion, which could enable faster and more efficient space travel. Image: ISTOCK/@3DSCULPTOR
Nuclear thermal propulsion, which uses heat from nuclear reactions to energize a propellant, could be used one day in human spaceflight, possibly even for missions to Mars. Its development, however, poses a challenge: the materials used must be able to withstand high heat and regular bombardment by high-energy particles.

Will Searight, a nuclear engineering doctoral student at Penn State, is contributing to research that could make these advancements more feasible. He published findings from a preliminary design simulation in Fusion Science and Technology, a publication of the American Nuclear Society. 

To better investigate nuclear thermal propulsion, Searight simulated a small-scale laboratory experiment known as a hydrogen test loop. The setup mimics a reactor's operation in space, where flowing hydrogen travels through the core and propels the rocket — at temperatures up to nearly 2,200 degrees Fahrenheit. Searight developed the simulation using dimensions from detailed drawings of tie tubes, the components that make up much of the test loop through which hydrogen flows. Industry partner Ultra Safe Nuclear Corporation (USNC) provided the drawings.

“Understanding how USNC’s components behave in a hot hydrogen environment is crucial to bringing our rockets to space,” Searight said. “We’re thrilled to be working with one of the main reactor contractors for NASA’s space nuclear propulsion project, which is seeking to produce a demonstration nuclear thermal propulsion engine within a decade.”

Advised by Leigh Winfrey, associate professor and undergraduate program chair of nuclear engineering, Searight used Ansys Fluent, a fluid-dynamics modeling package, to design a simulated test loop built around a stainless-steel pipe with an outer diameter of about two inches. In the model, the loop connects to a hydrogen pump and circulates hot hydrogen through a test section adjacent to a heating element.
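To put the quoted temperature in context, a simple steady-flow energy balance ties the heater power in such a loop to the hydrogen flow rate and temperature rise. The sketch below is only an illustrative estimate; the flow rate, inlet temperature and specific-heat value are assumed numbers, not parameters from Searight's published model.

```python
# Illustrative steady-flow energy balance for a hydrogen test loop:
# Q = m_dot * cp * (T_out - T_in). All numbers below are assumptions,
# not values from the published simulation.

def heater_power_w(m_dot_kg_s, t_in_k, t_out_k, cp_j_per_kg_k=14_300.0):
    """Heater power (W) needed to raise hydrogen from t_in_k to t_out_k."""
    return m_dot_kg_s * cp_j_per_kg_k * (t_out_k - t_in_k)

t_out_k = (2200 - 32) * 5.0 / 9.0 + 273.15      # ~2,200 deg F expressed in kelvin (~1,478 K)
power_kw = heater_power_w(0.01, 300.0, t_out_k) / 1000.0   # 0.01 kg/s is an assumed flow rate
print(f"Approximate heater power: {power_kw:.0f} kW")
```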

Winged microchip is smallest-ever human-made flying structure

 

Northwestern University engineers have added a new capability to electronic microchips: flight.

About the size of a grain of sand, the new flying microchip (or “microflier”) does not have a motor or engine. Instead, it catches flight on the wind — much like a maple tree’s propeller seed — and spins like a helicopter through the air toward the ground.

By studying maple trees and other types of wind-dispersed seeds, the engineers optimized the microflier’s aerodynamics to ensure that it — when dropped at a high elevation — falls at a slow velocity in a controlled manner. This behavior stabilizes its flight, ensures dispersal over a broad area and increases the amount of time it interacts with the air, making it ideal for monitoring air pollution and airborne disease.
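As a rough illustration of why a large area and a tiny mass translate into a slow descent, the fall speed can be estimated from a simple drag balance in which drag equals weight at terminal velocity. The values below are assumptions chosen only to show the scaling, not figures from the Northwestern study.

```python
import math

# Back-of-the-envelope terminal velocity from a drag balance: m*g = 0.5*rho*Cd*A*v^2.
# Every parameter value here is an assumption chosen only to show the scaling.

def terminal_velocity_m_s(mass_kg, area_m2, drag_coeff=1.0, rho_air=1.2, g=9.81):
    """Speed at which aerodynamic drag balances weight."""
    return math.sqrt(2.0 * mass_kg * g / (rho_air * drag_coeff * area_m2))

# A sand-grain-sized flier: ~1 milligram with ~1 mm^2 of effective area (assumed).
print(f"{terminal_velocity_m_s(1e-6, 1e-6):.1f} m/s")
# Doubling the area (or halving the mass) cuts the fall speed by roughly 30 percent.
```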

As the smallest-ever human-made flying structures, these microfliers also can be packed with ultra-miniaturized technology, including sensors, power sources, antennas for wireless communication and embedded memory to store data.

The research is featured on the cover of the Sept. 23 issue of Nature.

“Our goal was to add winged flight to small-scale electronic systems, with the idea that these capabilities would allow us to distribute highly functional, miniaturized electronic devices to sense the environment for contamination monitoring, population surveillance or disease tracking,” said Northwestern’s John A. Rogers, who led the device’s development. “We were able to do that using ideas inspired by the biological world. Over the course of billions of years, nature has designed seeds with very sophisticated aerodynamics. We borrowed those design concepts, adapted them and applied them to electronic circuit platforms.”

Monday, September 20, 2021

High-speed alloy creation might revolutionize hydrogen’s future

 
Researchers from Sandia National Laboratories and international collaborators used computational approaches, including explainable machine learning models, to identify new high-entropy alloys with attractive hydrogen storage properties and to direct laboratory synthesis and validation.

A Sandia National Laboratories team of materials scientists and computer scientists, working with international collaborators, has spent more than a year creating 12 new alloys — and modeling hundreds more — that demonstrate how machine learning can help accelerate the future of hydrogen energy by making it easier to create hydrogen infrastructure for consumers.

Vitalie Stavila, Mark Allendorf, Matthew Witman and Sapan Agarwal are part of the Sandia team that published a paper detailing its approach in conjunction with researchers from Ångström Laboratory in Sweden and the University of Nottingham in the United Kingdom.

“There is a rich history in hydrogen storage research and a database of thermodynamic values describing hydrogen interactions with different materials,” Witman said. “With that existing database, an assortment of machine-learning and other computational tools, and state-of-the-art experimental capabilities, we assembled an international collaboration group to join forces on this effort. We demonstrated that machine learning techniques could indeed model the physics and chemistry of complex phenomena which occur when hydrogen interacts with metals.”

Having a data-driven modeling capability to predict thermodynamic properties can rapidly increase the speed of research. In fact, once constructed and trained, such machine learning models only take seconds to execute and can therefore rapidly screen new chemical spaces: in this case 600 materials that show promise for hydrogen storage and transmission.
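The kind of rapid screening described here can be pictured as a generic supervised-learning workflow: fit a regression model on known material-hydrogen thermodynamic data, then score a long list of candidates in seconds. The sketch below uses scikit-learn-style calls and synthetic data as assumptions; it is not the Sandia/Ångström team's actual model, features or database.

```python
# Illustrative screening loop: train a regressor on known hydride thermodynamics,
# then rank unexplored candidate alloys. Synthetic data; not the published model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X_known = rng.random((500, 8))          # composition-derived descriptors (assumed)
y_known = rng.normal(30, 10, 500)       # e.g., hydride formation enthalpy, kJ/mol H2 (synthetic)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_known, y_known)

X_candidates = rng.random((600, 8))     # 600 unexplored candidate compositions
scores = model.predict(X_candidates)    # seconds to score, versus months of experiments
top = np.argsort(np.abs(scores - 35))[:10]   # closest to an assumed target enthalpy
print("Top candidate indices:", top)
```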

“This was accomplished in only 18 months,” Allendorf said. “Without the machine learning it could have taken several years. That’s big when you consider that historically it takes something like 20 years to take a material from lab discovery to commercialization.”

Monday, September 13, 2021

New tool for analyzing large superconducting circuits

The next generation of computing and information processing lies in the intriguing world of quantum mechanics. Quantum computers are expected to be capable of solving large, extremely complex problems that are beyond the capacity of today’s most powerful supercomputers.

New research tools are needed to advance the field and fully develop quantum computers. Now Northwestern University researchers have developed and tested a theoretical tool for analyzing large superconducting circuits. These circuits use superconducting quantum bits, or qubits, the smallest units of a quantum computer, to store information.

Circuit size is important since protection from detrimental noise tends to come at the cost of increased circuit complexity. Currently there are few tools that tackle the modeling of large circuits, making the Northwestern method an important contribution to the research community.

“Our framework is inspired by methods originally developed for the study of electrons in crystals and allows us to obtain quantitative predictions for circuits that were previously hard or impossible to access,” said Daniel Weiss, corresponding and first author of the paper. He is a fourth-year graduate student in the research group of Jens Koch, an expert in superconducting qubits.

Koch, an associate professor of physics and astronomy in Weinberg College of Arts and Sciences, is a member of the Superconducting Quantum Materials and Systems Center (SQMS) and the Co-design Center for Quantum Advantage (C2QA). Both national centers were established last year by the U.S. Department of Energy (DOE). SQMS is focused on building and deploying a beyond-state-of-the-art quantum computer based on superconducting technologies. C2QA is building the fundamental tools necessary to create scalable, distributed and fault-tolerant quantum computer systems.

“We are excited to contribute to the missions pursued by these two DOE centers and to add to Northwestern’s visibility in the field of quantum information science,” Koch said. 

In their study, the Northwestern researchers illustrate the use of their theoretical tool by extracting from a protected circuit quantitative information that was unobtainable using standard techniques. 

Details were published in the open access journal Physical Review Research.

The researchers specifically studied protected qubits. These qubits are protected from detrimental noise by design and could yield coherence times (how long quantum information is retained) that are much longer than those of current state-of-the-art qubits.

These superconducting circuits are necessarily large, and the Northwestern tool is a means for quantifying the behavior of these circuits. There are some existing tools that can analyze large superconducting circuits, but each works well only when certain conditions are met. The Northwestern method is complementary and works well in regimes where those other tools give suboptimal results.

The research was supported by the Army Research Office (Contract No. W911NF-17-C-0024).

Source/Credit: Northwestern University/Megan Fellman

tn091321_02

Researchers Create Materials for Shape-Shifting Architecture

 


Researchers at North Carolina State University have developed materials that can be used to create structures capable of transforming into multiple different architectures. The researchers envision applications ranging from construction to robotics.

“The system we’ve developed was inspired by metamorphosis,” says Jie Yin, corresponding author of a paper on the work and an associate professor of mechanical and aerospace engineering at NC State. “With metamorphosis in nature, animals change their fundamental shape. We’ve created a class of materials that can be used to create structures that change their fundamental architecture.”

Kirigami is a fundamental concept for Yin’s work. Kirigami is a variation of origami that involves cutting and folding paper. But while kirigami traditionally uses two-dimensional materials, Yin applies the same principles to three-dimensional materials.

The metamorphosis system starts with a single unit of 3D kirigami. Each unit can form multiple shapes in itself. But these units are also modular – they can be connected to form increasingly complex structures. Because the individual units themselves can form multiple shapes, and can connect to other units in multiple ways, the overall system is capable of forming a wide variety of architectures.

“Think of what you can build with conventional materials,” Yin says. “Now imagine what you can build when each basic building block is capable of transforming in multiple ways.”

Yin’s lab previously demonstrated a similar concept, in which 3D kirigami units were stacked on each other. In that system, the units could be used to assemble a structure – but the structure could also then be disassembled.

The metamorphosis system involves actually connecting the kirigami units. In other words, once the units are connected to each other they cannot be disconnected. However, the larger structures they create are capable of transforming into multiple, different architectures.


“There are two big differences between our first kirigami system and the metamorphosis system,” Yin explains.

“The first kirigami system involved units that could be assembled into architectures and then disassembled, which is an advantage. However, when the units were assembled, the architecture wouldn’t be capable of transforming. Because the sides of the unit were not rigid and fixed at 90-degree angles, the assembled structure could bend and move – but it could not fundamentally change its geometry.

“The metamorphosis kirigami system does not allow you to disassemble a structure,” Yin says. “And because the sides of each cubic unit are rigid and fixed at 90-degree angles, the assembled structure does not bend or flex very much. However, the finished structure is capable of transforming into different architectures.”

In proof-of-concept testing, the researchers demonstrated that the metamorphosis system was capable of creating many different structures that are capable of bearing significant weight while maintaining their structural integrity.

That structural integrity is important, because Yin thinks construction is one potential application for the metamorphosis system.

“If you scale this approach up, it could be the basis for a new generation of construction materials that can be used to create rapidly deployable structures,” Yin says. “Think of the medical units that have had to be expanded on short notice during the pandemic, or the need for emergency housing shelters in the wake of a disaster.”

The researchers also think the metamorphosis system could be used to create a variety of robotic devices that can transform in order to respond to external stimuli or to perform different functions.

“We also think this system could be used to create a new line of toys – particularly toys that can help people explore some fundamental STEM concepts related to physics and engineering,” Yin says. “We’re open to working with industry collaborators to pursue these and other potential applications for the system.”

The paper, “Metamorphosis of three-dimensional kirigami-inspired reconfigurable and reprogrammable architected matter,” is published in the journal Materials Today Physics. First author of the paper is Yanbin Li, a Ph.D. student at NC State. The work was done with support from the National Science Foundation, under grant 2005374.

Source/Credit: North Carolina State University

tn091321_01

Friday, September 10, 2021

Silicon, Subatomic Particles and Possible ‘Fifth Force’

 

As neutrons pass through a crystal, they create two different standing waves – one along atomic planes and one between them. The interaction of these waves affects the path of the neutron, revealing aspects of the crystal structure.  Credit: NIST
Using a groundbreaking new technique at the National Institute of Standards and Technology (NIST), an international collaboration led by NIST researchers has revealed previously unrecognized properties of technologically crucial silicon crystals and uncovered new information about an important subatomic particle and a long-theorized fifth force of nature.

By aiming subatomic particles known as neutrons at silicon crystals and monitoring the outcome with exquisite sensitivity, the NIST scientists were able to obtain three extraordinary results: the first measurement of a key neutron property in 20 years using a unique method; the highest-precision measurements of the effects of heat-related vibrations in a silicon crystal; and limits on the strength of a possible “fifth force” beyond standard physics theories.

The researchers report their findings in the journal Science.

In a regular crystal such as silicon, there are many parallel sheets of atoms, each of which forms a plane. Probing different planes with neutrons reveals different aspects of the crystal.  Credit: NIST
To obtain information about crystalline materials at the atomic scale, scientists typically aim a beam of particles (such as X-rays, electrons or neutrons) at the crystal and detect the beam’s angles, intensities and patterns as it passes through or ricochets off planes in the crystal’s lattice-like atomic geometry.

That information is critically important for characterizing the electronic, mechanical and magnetic properties of microchip components and various novel nanomaterials for next-generation applications including quantum computing. A great deal is known already, but continued progress requires increasingly detailed knowledge.

“A vastly improved understanding of the crystal structure of silicon, the ‘universal’ substrate or foundation material on which everything is built, will be crucial in understanding the nature of components operating near the point at which the accuracy of measurements is limited by quantum effects,” said NIST senior project scientist Michael Huber.

Neutrons, Atoms and Angles

Like all quantum objects, neutrons have both point-like particle and wave properties. As a neutron travels through the crystal, it forms standing waves (like a plucked guitar string) both in between and on top of rows or sheets of atoms called Bragg planes. When waves from each of the two routes combine, or “interfere” in the parlance of physics, they create faint patterns called pendellösung oscillations that provide insights into the forces that neutrons experience inside the crystal.
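In the standard two-beam picture of dynamical diffraction, the intensity traded between those two wave branches oscillates with the thickness of crystal traversed; schematically,

\[ I_H(t) \propto \sin^2\!\left(\frac{\pi t}{\Delta_H}\right), \]

where t is the path length through the crystal and \Delta_H is the pendellösung length set by the crystal's structure factor. Measuring the oscillation period therefore gives access to the structure factor and, through it, to the forces the neutron experiences inside the lattice. (This is a textbook schematic rather than the full treatment used in the Science paper.)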

“Imagine two identical guitars,” said Huber. “Pluck them the same way, and as the strings vibrate, drive one down a road with speed bumps — that is, along the planes of atoms in the lattice — and drive the other down a road of the same length without the speed bumps — analogous to moving between the lattice planes. Comparing the sounds from both guitars tells us something about the speed bumps: how big they are, how smooth, and do they have interesting shapes?”

The latest work, which was conducted at the NIST Center for Neutron Research (NCNR) in Gaithersburg, Maryland, in collaboration with researchers from Japan, the U.S. and Canada, resulted in a fourfold improvement in precision measurement of the silicon crystal structure.

Not-Quite-Neutral Neutrons

Each neutron in an atomic nucleus is made up of three elementary particles called quarks. The three quarks’ electrical charges sum to zero, making the neutron electrically neutral. But the distribution of those charges is such that positive charges are more likely to be found in the center of the neutron, and negative charges toward the outside.  Credit: NIST

In one striking result, the scientists measured the electrical “charge radius” of the neutron in a new way with an uncertainty in the radius value competitive with the most-precise prior results using other methods. Neutrons are electrically neutral, as their name suggests. But they are composite objects made up of three elementary charged particles called quarks with different electrical properties that are not exactly uniformly distributed.

As a result, predominantly negative charge from one kind of quark tends to be located toward the outer part of the neutron, whereas net positive charge is located toward the center. The distance between those two concentrations is the “charge radius.” That dimension, important to fundamental physics, has been measured by similar types of experiments whose results differ significantly. The new pendellösung data is unaffected by the factors thought to lead to these discrepancies.

Measuring the pendellösung oscillations in an electrically charged environment provides a unique way to gauge the charge radius. “When the neutron is in the crystal, it is well within the atomic electric cloud,” said NIST’s Benjamin Heacock, the first author on the Science paper.

“In there, because the distances between charges are so small, the interatomic electric fields are enormous, on the order of a hundred million volts per centimeter. Because of that very, very large field, our technique is sensitive to the fact that the neutron behaves like a spherical composite particle with a slightly positive core and a slightly negative surrounding shell.”

Vibrations and Uncertainty

A valuable alternative to neutrons is X-ray scattering. But its accuracy has been limited by atomic motion caused by heat. Thermal vibration causes the distances between crystal planes to keep changing, and thus changes the interference patterns being measured.

The scientists employed neutron pendellösung oscillation measurements to test the values predicted by X-ray scattering models and found that some significantly underestimate the magnitude of the vibration.

The results provide valuable complementary information for both x-ray and neutron scattering. “Neutrons interact almost entirely with the protons and neutrons at the centers, or nuclei, of the atoms,” Huber said, “and x-rays reveal how the electrons are arranged between the nuclei. This complementary knowledge deepens our understanding.

“One reason our measurements are so sensitive is that neutrons penetrate much deeper into the crystal than x-rays – a centimeter or more – and thus measure a much larger assembly of nuclei. We have found evidence that the nuclei and electrons may not vibrate rigidly, as is commonly assumed. That shifts our understanding of how silicon atoms interact with one another inside a crystal lattice.”

Force Five

The Standard Model is the current, widely accepted theory of how particles and forces interact at the smallest scales. But it’s an incomplete explanation of how nature works, and scientists suspect there is more to the universe than the theory describes.

The Standard Model describes three fundamental forces in nature: electromagnetic, strong and weak. Each force operates through the action of “carrier particles.” For example, the photon is the force carrier for the electromagnetic force. But the Standard Model has yet to incorporate gravity in its description of nature. Furthermore, some experiments and theories suggest the possible presence of a fifth force.

“Generally, if there’s a force carrier, the length scale over which it acts is inversely proportional to its mass,” meaning it can only influence other particles over a limited range, Heacock said. But the photon, which has no mass, can act over an unlimited range. “So, if we can bracket the range over which it might act, we can limit its strength.” The scientists’ results improve constraints on the strength of a potential fifth force by tenfold over a length scale between 0.02 nanometers (nm, billionths of a meter) and 10 nm, giving fifth-force hunters a narrowed range over which to look.
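Heacock's rule of thumb is the standard Yukawa relation: a force carried by a particle of mass m falls off beyond a range set by its reduced Compton wavelength,

\[ V(r) \propto \frac{e^{-r/\lambda}}{r}, \qquad \lambda = \frac{\hbar}{m c}, \]

so bounding a new interaction over the 0.02 nm to 10 nm window simultaneously bounds the coupling strength of any carrier whose mass corresponds to that range.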

The researchers are already planning more expansive pendellösung measurements using both silicon and germanium. They expect a possible factor of five reduction in their measurement uncertainties, which could produce the most precise measurement of the neutron charge radius to date and further constrain — or discover — a fifth force. They also plan to perform a cryogenic version of the experiment, which would lend insight into how the crystal atoms behave in their so-called “quantum ground state,” which accounts for the fact that quantum objects are never perfectly still, even at temperatures approaching absolute zero.

Source/Credit: National Institute of Standards and Technology

tn091021_01

Thursday, September 9, 2021

Newly developed software unveils relationships between RNA modifications and cancers

Researchers from CSI Singapore have developed software called ModTect that identifies relationships between RNA modifications and the development of diseases as well as survival outcomes
In a research breakthrough, a team of researchers from the Cancer Science Institute of Singapore (CSI Singapore) at the National University of Singapore has developed software that can help reveal the relationships between RNA modifications and the development of diseases and disorders.

Led by Professor Daniel Tenen and Dr Henry Yang, the scientists devised ModTect – a new computational tool that can identify RNA modifications using pre-existing sequencing data from clinical cohort studies. With ModTect, the team carried out their own novel pan-cancer study covering 33 different cancer types. They found associations between these RNA modifications and the different survival outcomes of cancer patients.

“This work is one of few studies demonstrating the association of mRNA modification with cancer development. We show that the epitranscriptome was dysregulated in patients across multiple cancer types and was additionally associated with cancer progression and survival outcomes,” explained Dr Henry Yang, Research Associate Professor from CSI Singapore.

"In the past decade, the ability to sequence the Human Genome has transformed the study of normal processes and diseases such as cancer. We anticipate that studies like this one, eventually leading to complete sequencing of RNA and detecting modifications directly in RNA, will also have a major impact on the characterization of disease and lead to novel therapeutic approaches," commented Prof Tenen, Senior Principal Investigator from CSI Singapore.

What are RNA modifications?

While most people are familiar with DNA, RNA plays just as much of a vital role in the human body’s cellular functions. Unlike DNA, which has the double-helix structure that most people are familiar with, RNA is a family of single-stranded molecules that perform various essential biological roles.

For example, messenger RNA (mRNA) conveys genetic information that directs the production of different proteins. Imagine DNA as an expansive library filled with books that carry instructions on how to make different proteins. Each letter in the sequences that make up the books’ contents is a nucleotide, a small molecule used to store genetic information. To make sure these instructions are followed, mRNA makes copies of the books and carries them from a cell’s nucleus, where DNA is stored, to the ribosomes. These ribosomes are the “factories” where proteins are synthesized. Without RNA, the valuable genetic instructions stored in our cells would never be used.

Additional types of RNA perform other important functions. Some help catalyze biochemical reactions, just like enzymes, while others regulate gene expression.

Small chemical modifications to RNA can sometimes occur and alter the function and stability of the molecules. The study of these modifications and their effects is called ‘epitranscriptomics’. Past research has suggested a link between certain RNA modifications and the development of diseases like Alzheimer’s disease and cancer. However, despite multiple attempts to study these associations in deeper detail, the study of epitranscriptomes has proven to be difficult until this breakthrough by scientists from CSI Singapore.

In large patient cohorts, collecting and processing patient samples is challenging. Detecting RNA modifications often involves technically complex processes, such as treating the samples with chemicals that are difficult to access. These techniques often also require the use of large quantities of sample that are hard to obtain for rarer conditions. Because of this, scientists have been limited in their capacity to establish relationships between specific RNA modifications and various human diseases.

Software makes epitranscriptomics easier

The software that the CSI Singapore team created uses RNA sequences available from other large clinical cohort studies. To detect modifications in these RNA sequences, ModTect looks for mismatch signals and deletion signals. Mismatch signals arise when the enzymes scientists use to turn RNA back into DNA incorporate random nucleotides during sequencing. Deletion signals arise when the enzymes skip a portion of the sequence. Together, these signals are referred to as misincorporation signals.

Unlike other models, ModTect does not require a database of misincorporation signal profiles corresponding to different types of RNA modifications to identify or classify them. ModTect can even identify new signal profiles that drastically differ from what has been previously recorded.
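The mismatch and deletion counting described above can be pictured as a toy per-position tally over aligned reads; the sketch below is a hedged illustration of the general idea, not ModTect's actual statistical model or thresholds.

```python
# Toy per-position misincorporation tally: flag positions where aligned reads
# disagree with the reference (mismatches) or skip bases (deletions).
# Illustrative only; ModTect's real statistical filtering is more involved.
from collections import Counter

def misincorporation_signal(reference: str, reads: list, min_rate: float = 0.2):
    """Return positions whose combined mismatch+deletion rate exceeds min_rate.

    Reads are pre-aligned strings the same length as the reference,
    with '-' marking a deletion at that position.
    """
    flagged = []
    for i, ref_base in enumerate(reference):
        tally = Counter(read[i] for read in reads)
        mismatches = sum(n for base, n in tally.items() if base not in (ref_base, "-"))
        deletions = tally["-"]
        rate = (mismatches + deletions) / len(reads)
        if rate >= min_rate:
            flagged.append((i, round(rate, 2)))
    return flagged

reads = ["ACGTAC", "ACTTAC", "AC-TAC", "ACGTAC", "ACTTAC"]
print(misincorporation_signal("ACGTAC", reads))   # position 2 stands out
```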

By applying the software to around 11,000 cancer patient RNA-sequencing datasets, the CSI Singapore team was able to embark on a novel study that investigated the associations between RNA modifications and clinical outcomes in patients. ModTect was able to utilize these large datasets and process them with robust statistical filtering. It revealed that some types of epitranscriptomic modification were associated with cancer progression and survival outcomes in patients. This finding highlighted the potential use of RNA modifications as biomarkers – molecules that can be used to test for diseases.

Unravelling the mystery of sequence differences that escape detection

As explored before, the transmission of genetic information from DNA in a cell’s nucleus to RNA molecules that carry it to a cell’s ribosomes is a critical process. However, this transmission process is not perfect and leads to differences between RNA and DNA sequences. The sites of these mismatches have been widely documented. Yet it is unclear whether these observations are caused by modifications in mRNA and why these sites have escaped detection by Sanger sequencing (one of the most popular methods of DNA sequencing).

The group at CSI Singapore uncovered a potential explanation as to why these RNA modification signals have eluded detection over the years. They explained how some epitranscriptomes impede the use of standard reverse transcriptase (RT), the enzyme that is used to convert RNA into DNA. This enzyme is used by scientists in genome sequencing and its use is one of the most critical steps for experimental success. Hence, RNAs that had these impeding modifications were under-represented in Sanger sequencing techniques.

To combat this, the team used newly developed RT enzymes known for their ability to bypass the effects of these modification sites. This allowed them to observe epitranscriptomic modifications that were previously undetectable with Sanger sequencing.

The discipline of epitranscriptomics is still an emerging and rapidly developing field, with around 170 RNA modifications detected so far. By harnessing ModTect, Prof Tenen and his team were able to provide novel insights into the relationships between human diseases – like cancer – and such RNA modifications. The software will be publicly available on GitHub for other scientists to use.

The team is hopeful that their contribution will help further research that establishes any potential causal or mechanistic relationships between RNA modifications and tumor formation.

Source/Credit: National University of Singapore

scn090921_01

Tuesday, September 7, 2021

Scientists awarded $6 million to plan brain-inspired computer that runs on probability

 

Conventional computers can look at the optical illusion on the left and normally only see a vase or two faces. Sandia National Laboratories is laying the groundwork for a computer that, like our brains, can glance many times and see both.
(Image by Laura Hatfield)

If you’ve ever asked a car mechanic how long a part will last until it breaks, odds are they shrugged their shoulders. They know how long parts last on average, and they can see when one is close to breaking. But knowing how many miles are left is extremely difficult, even using a supercomputer, because the exact moment a belt snaps or a battery dies is to some extent random.

Scientists at Sandia National Laboratories are creating a concept for a new kind of computer for solving complex probability problems like this. They propose that a “probabilistic computer” could not only create smarter maintenance schedules but also help scientists analyze subatomic shrapnel inside particle colliders, simulate nuclear physics experiments and process images faster and more accurately than is possible with conventional computers.

As part of a new microelectronics codesign research program, the Department of Energy’s Office of Science recently awarded the project $6 million over the next three years to develop the idea. Sandia will be working with Oak Ridge National Laboratory, New York University, the University of Texas at Austin and Temple University in Philadelphia.

A codesign microelectronics project involves multidisciplinary collaboration that takes into account the interdependencies among materials, physics, architectures and software. Researchers also will look at ways to incorporate machine learning methods.

The concept for a probabilistic computer runs opposite to how computers are normally built and programmed, Sandia scientist Brad Aimone said. Instead of making one that is perfectly predictable, Sandia wants one with built-in randomness that computes information differently every time.

“To a large degree, and at a great energy cost, we engineer computers to eliminate randomness. What we want to do in this project is to leverage randomness. Instead of fighting it, we want to use it,” said Aimone, who leads the project he and his team call COINFLIPS (short for CO-designed Improved Neural Foundations Leveraging Inherent Physics Stochasticity).

“What if, when I’m communicating with you, I flip a coin?” Aimone said. “If heads, you act on my message; if tails, you ignore it. We want to discover how you can use randomness like this to solve problems where probability is important.”
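Aimone's coin-flip analogy maps naturally onto Bernoulli sampling: each message is acted on only with some probability. The toy sketch below illustrates that idea in ordinary software; it is not COINFLIPS hardware, which aims to generate this randomness physically at far lower energy cost.

```python
import random

# Toy probabilistic message passing: each downstream unit acts on a message
# only with probability p, loosely mirroring the coin-flip analogy.
# Software illustration only; the COINFLIPS project targets hardware randomness.

def broadcast(message_strength: float, n_receivers: int, p: float = 0.25) -> float:
    """Sum the responses of receivers that (randomly) chose to act on the message."""
    return sum(message_strength for _ in range(n_receivers) if random.random() < p)

random.seed(1)
trials = [broadcast(1.0, n_receivers=1000) for _ in range(5)]
print(trials)   # fluctuates around 250: the average carries the signal, samples carry noise
```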

Concept modeled after unpredictable connections between brain cells

Aimone is an expert in technology that mimics the brain, including machine learning. He got his idea for a probabilistic computer from how brain cells talk to each other.

Inside your brain there are billions of cells called neurons that pass information across trillions of cell-to-cell connections called synapses, Aimone said. Whenever one neuron has a message, it sends a signal to lots of other neurons at the same time. But only a random fraction of the receiving cells carry the message on to more cells. Neuroscientists don’t agree on why, but Aimone thinks it could be a reason why brains do some tasks better than computers, such as learning and adapting, or why they use less energy.

To imitate this brain behavior, scientists need to figure out how to generate trillions of random numbers at a time. That much randomness is too complex and takes too much power for computers, said Sandia’s Shashank Misra, who leads the COINFLIPS hardware team.

“We will need to get creative with new approaches, including new materials, atomic-scale control and machine learning-driven designs to generate the sheer volume of randomness needed and to make it useful for computation,” Misra said.

COINFLIPS will also identify tasks that benefit from randomness.

Probabilistic computers are part of a larger effort at Sandia to explore what computers in the future might look like. Researchers around the world have recognized that the rate at which computers are improving is slowing down, Aimone said. To break past the apparent limits of computers, scientists are looking at new, original ways of designing them.

Conrad James, the Sandia manager of the COINFLIPS team, said, “Several of us at Sandia have been exploring brain-inspired computing and new design approaches for years. Encouraging more communication between mathematicians, algorithm developers and device physicists led to the formation of this team and research proposal.”

Sandia adds to other efforts to rethink computers

COINFLIPS was one of only 10 proposals selected nationwide to receive funding to design new, energy-efficient microelectronics. Separately, Sandia is lending its expertise in nanotechnology and computer modeling to another selected project led by Lawrence Berkeley National Laboratory.

These researchers will be redesigning nanosized sensors used in communications, imaging, remote sensing and surveillance technologies to be more compact, efficient and integrated into a computer processor.

“The photon absorption, the transduction to an electrical event and the measurement will all be part of one quantum system,” said Sandia physicist François Léonard, who is a member of the collaboration.

They will also attempt to enhance these sensors with advanced materials, such as carbon nanotubes, hollow carbon straws that are 100,000 times thinner than a strand of hair.

A third Sandia team consisting of researchers Alec Talin and Matt Marinella will be supporting another selected project that Oak Ridge National Laboratory is leading. Their research could help improve the energy efficiency of processing information from sensors in autonomous vehicles, handheld devices and satellites.

Most of the time and energy that a computer chip needs are spent shuttling information between where it is stored and where it is processed, Talin said. But it might be possible to slash the power computers use by combining these two elements using brain-inspired devices developed at Sandia.

“The key idea is that in the brain, the memory and the logic (processing) are co-located in the same basic element, the neuron,” Talin said.

Fast, energy-efficient systems could potentially process complex tasks, such as recognizing images and translating languages in real time, on portable devices like smartphones without needing the computing power of the cloud, Talin said.

Source/Credit: Sandia National Laboratories

tn090721_01

Thursday, September 2, 2021

Z Turns Twenty-five Years Old

 

An open-shutter photo showing electrical energy coursing through the transmission line sections of Sandia National Laboratories’ Z machine. (Photo by Randy Montoya)
Sandia National Laboratories is celebrating 25 years of research conducted at its Z Pulsed Power Facility — a gymnasium-sized accelerator commonly referred to as Z or the Z machine.

Z began with a simple idea — running large pulsed electrical currents through targets at the center of the machine — that has resulted in startling science even after 25 years.

“We have seen continuous innovation over the history of Z, and we still have about another decade of exciting research lined up,” says Dan Sinars, Sandia’s pulsed power sciences director.

The adventure began 25 years ago, said former director Don Cook, when Sandia researchers modified a machine built in 1985 called the Particle Beam Fusion Accelerator. That machine — Z’s ancestor, in a sense — employed a very high voltage and smaller current to make lithium-ion beams for fusion research. The experimental output was powerful, about 15 terawatts, but had hardly increased in a decade of testing.

So, trying a different approach, the machine was restructured to deliver very high currents and lower voltages. Currents 100 times larger than those in a bolt of lightning efficiently vaporized arrays of tiny wires into clouds of ions. Then the powerful magnetic field accompanying the electric current slammed the ions into each other, a process that emitted copious X-rays that could be used for fusion research and other applications.

The new method, attempted first on a smaller Sandia machine called Saturn, immediately increased the output to 40 terawatts, and led to many experiments to improve the number, size, material choice and placement of succeeding arrays.

“Once it was confirmed in experiments in 1996 on a machine temporarily called PBFA II-Z that enormous pressures (millions of atmospheres) and very high temperatures (millions of degrees Celsius) could be produced by z-pinches, we renamed the machine simply Z in 1996. So, 2021 is the 25th anniversary of Z,” said Cook.

Researchers around the world marveled at the huge output increase, which quickly reached more than 200 terawatts, said former Sandia vice president and early Z leader Gerry Yonas. The Z-pinch work — called Z because the operation occurs along the Z axis in three-dimensional graphs — generated data for the U.S. nuclear stockpile.

Z hasn’t yet created fusion ignition, though the effort to increase its fusion output continues. “Achieving nuclear fusion in the lab isn’t for people who give up easily,” said Sandia fellow Keith Matzen, Z director from 2005-2013 and again from 2015-2019, who cautions it will take a bigger version of Z to demonstrate that the fusion energy emitted by the process is equal to the electrical energy stored in the facility, a milestone known as break-even.

Meanwhile, Z researchers have delved into other areas, including determining where life elsewhere in our galaxy may have evolved; investigating the existence of diamonds on Neptune and liquid helium on Saturn; determining the age of white dwarf stars and the behavior of black holes in space; and measuring the amount of water in the universe and its age, said Sinars.

To achieve these unusual capabilities, he said, researchers over decades reimagined work on Z so that the huge magnetic fields naturally accompanying Z’s powerful electrical discharges became instruments in their own right, testing materials by creating pressures exceeding those at Earth’s core or aiding in the effort to create breakeven nuclear fusion by pre-compressing the target fuel environs.

“In the meantime,” said Cook, “Z has become the most energetic source of X-rays for fusion research and for stockpile stewardship on the planet. Its capabilities as a pre-eminent research facility for high energy density sciences are known and appreciated worldwide.”

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

Press Release
Source/Credit: Sandia National Laboratories

tn090221_01

Wednesday, September 1, 2021

Using Liquid Metal to Turn Motion into Electricity

 

Photo credit: Veenasri Vallem
Researchers at North Carolina State University have created a soft and stretchable device that converts movement into electricity and can work in wet environments.

“Mechanical energy – such as the kinetic energy of wind, waves, body movement and vibrations from motors – is abundant,” says Michael Dickey, corresponding author of a paper on the work and Camille & Henry Dreyfus Professor of Chemical and Biomolecular Engineering at NC State. “We have created a device that can turn this type of mechanical motion into electricity. And one of its remarkable attributes is that it works perfectly well underwater.”

The heart of the energy harvester is a liquid metal alloy of gallium and indium. The alloy is encased in a hydrogel – a soft, elastic polymer swollen with water.

The water in the hydrogel contains dissolved salts called ions. The ions assemble at the surface of the metal, which can induce charge in the metal. Increasing the area of the metal provides more surface to attract charge. This generates electricity, which is captured by a wire attached to the device.
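One way to picture the mechanism is as a variable-area electrical double-layer capacitor: the ion layer at the metal surface stores charge in proportion to the exposed area, so deforming the device changes that area and pushes charge through the external wire. The capacitance density, potential and areas in the sketch below are assumed illustrative values, not measurements from the NC State device.

```python
# Variable-area double-layer picture: Q = c_dl * A * V, so changing the area at a
# fixed interfacial potential displaces charge through the external circuit.
# Capacitance density, potential and areas are assumed illustrative values.

C_DL = 0.1      # double-layer capacitance per unit area, F/m^2 (assumed)
V = 0.1         # interfacial potential, volts (assumed)

def harvested_charge(area_start_m2: float, area_end_m2: float) -> float:
    """Charge (coulombs) displaced through the wire when the wetted area changes."""
    return C_DL * V * (area_end_m2 - area_start_m2)

dq = harvested_charge(1e-4, 2e-4)       # stretching doubles a 1 cm^2 interface
print(f"{dq * 1e6:.1f} microcoulombs per stretch")
```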


“Since the device is soft, any mechanical motion can cause it to deform, including squishing, stretching and twisting,” Dickey says. “This makes it versatile for harvesting mechanical energy. For example, the hydrogel is elastic enough to be stretched to five times its original length.”

In experiments, researchers found that deforming the device by only a few millimeters generates a power density of approximately 0.5 mW/m². This amount of electricity is comparable to that of several popular classes of energy-harvesting technologies.

“However, other technologies don’t work well, if at all, in wet environments,” Dickey says. “This unique feature may enable applications from biomedical settings to athletic wear to marine environments. Plus, the device is simple to make.

“There is a path to increase the power, so we consider the work we described here a proof-of-concept demonstration.”

The researchers already have two related projects under way.

One project is aimed at using the technology to power wearable devices by increasing the harvester’s power output. The second project evaluates how this technology could be used to harvest wave power from the ocean.

The paper, “A Soft Variable-Area Electrical-Double-Layer Energy Harvester,” is published in the journal Advanced Materials. First author of the paper is Veenasri Vallem, a Ph.D. student at NC State. Co-authors include Erin Roosa and Tyler Ledinh, who were undergrads at NC State when the work was done; Sahar Rashid-Nadimi and Abolfazl Kiani, who were visiting scholars at NC State and are now at California State University, Bakersfield; and Woojin Jung and Tae-il Kim of Sungkyunkwan University in South Korea, who worked on the project while visiting NC State.

The work was done with support from NC State’s ASSIST Center, which is funded by the National Science Foundation under grant EEC-1160483. Additional support came from the Coastal Studies Institute of North Carolina and the Fostering Global Talents for Innovative Growth Program supervised by the Korea Institute for Advancement of Technology.

Source/Credit: North Carolina State University

tn090121_01

Tuesday, August 31, 2021

Sandia uncovers hidden factors that affect solar farms during severe weather

Sandia National Laboratories researchers Thushara Gunda, front, and Nicole Jackson examine solar panels at Sandia’s Photovoltaic Systems Evaluation Laboratory as summer monsoon clouds roll by. Using machine learning and data from solar farms across the U.S., they uncovered that the age of a solar farm, as well as the amount of cloud cover, has pronounced effects on farm performance during severe weather.
(Photo by Randy Montoya)

Sandia National Laboratories researchers combined large sets of real-world solar data and advanced machine learning to study the impacts of severe weather on U.S. solar farms and to sort out which factors affect energy generation. Their results were published earlier this month in the scientific journal Applied Energy.

Hurricanes, blizzards, hailstorms and wildfires all pose risks to solar farms both directly in the form of costly damage and indirectly in the form of blocked sunlight and reduced electricity output. Two Sandia researchers scoured maintenance tickets from more than 800 solar farms in 24 states and combined that information with electricity generation data and weather records to assess the effects of severe weather on the facilities. By identifying the factors that contribute to low performance, they hope to increase the resiliency of solar farms to extreme weather.

“Trying to understand how future climate conditions could impact our national energy infrastructure is exactly what we need to be doing if we want our renewable energy sector to be resilient under a changing climate,” said Thushara Gunda, the senior researcher on the project. “Right now, we’re focused on extreme weather events, but eventually we’ll extend into chronic exposure events like consistent extreme heat.”

Hurricanes and snow and storms, oh my!

The Sandia research team first used natural-language processing, a type of machine learning used by smart assistants, to analyze six years of solar maintenance records for key weather-related words. The analysis methods they used for this study have since been published and are freely available for other photovoltaic researchers and operators.

“Our first step was to look at the maintenance records to decide which weather events we should even look at,” said Gunda. “The photovoltaic community talks about hail a lot, but the data in the maintenance records tell a different story.”

While hailstorms tend to be very costly, they did not appear in solar farm maintenance records, likely because operators tend to document hail damage in the form of insurance claims, Gunda said. Instead, she found that hurricanes were mentioned in almost 15% of weather-related maintenance records, followed by the other weather terms, such as snow, storm, lightning and wind.
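At its simplest, the first step the team describes amounts to scanning free-text maintenance tickets for weather-related terms and counting how often each appears. The sketch below illustrates that idea with a plain keyword tally; it should not be read as the published natural-language-processing pipeline.

```python
import re
from collections import Counter

# Toy keyword scan over free-text maintenance tickets; the published pipeline
# uses fuller natural-language processing, so this is illustrative only.
WEATHER_TERMS = ["hurricane", "snow", "storm", "lightning", "wind", "hail", "flood"]

def weather_mentions(tickets):
    counts = Counter()
    for text in tickets:
        for term in WEATHER_TERMS:
            if re.search(rf"\b{term}\w*", text.lower()):
                counts[term] += 1
    return counts

tickets = [
    "Inverter offline after hurricane; racking damage on rows 3-5.",
    "Snow covering array, production near zero.",
    "Lightning arrestor replaced after storm.",
]
print(weather_mentions(tickets))
```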

“Some hurricanes damage racking — the structure that holds up the panels — due to the high winds,” said Nicole Jackson, the lead author on the paper. “The other major issue we’ve seen from the maintenance records and talking with our industry partners is flooding blocking access to the site, which delays the process of turning the plant back on.”

Using machine learning to find the most important factors

Next, they combined more than two years of real-world electricity production data from more than 100 solar farms in 16 states with historical weather data to assess the effects of severe weather on solar farms. They used statistics to find that snowstorms had the highest effect on electricity production, followed by hurricanes and a general group of other storms.

Then they used a machine learning algorithm to uncover the hidden factors that contributed to low performance from these severe weather events.

“Statistics gives you part of the picture, but machine learning was really helpful in clarifying what are those most important variables,” said Jackson, who primarily conducted statistical analysis and the machine learning portion of the project. “Is it where the site is located? Is it how old the site is? Is it how many maintenance tickets were submitted on the day of the weather event? We ended up with a suite of variables and machine learning was used to home in on the most important ones.”
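The "home in on the most important variables" step Jackson describes is the classic feature-importance use of a tree-based model. The sketch below shows that general pattern with scikit-learn on synthetic data; the study's actual variables, model choice and data are not reproduced here.

```python
# Generic feature-importance pattern on synthetic data: fit a tree ensemble to
# per-event performance, then rank the candidate explanatory variables.
# Not the study's actual data, variables, or model configuration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
features = ["site_age_yr", "cloud_cover", "snowfall_cm", "rainfall_mm", "wind_speed", "tickets_open"]
X = rng.random((400, len(features)))
y = 1.0 - 0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.05 * rng.normal(size=400)  # synthetic performance

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
for name, score in sorted(zip(features, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:>14s}  {score:.2f}")
```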

She found that across the board, older solar farms were affected the most by severe weather. One possibility for this is that solar farms that had been in operation for more than five years had more wear-and-tear from being exposed to the elements longer, Jackson said.

Gunda agreed, adding, “This work highlights the importance of ongoing maintenance and further research to ensure photovoltaic plants continue to operate as intended.”

For snowstorms, which unexpectedly were the type of storm with the highest effect on electricity production, the next most important variables were low sunlight levels at the location due to cloud cover and the amount of snow, followed by several geographical features of the farm.

For hurricanes — principally hurricanes Florence and Michael — the amount of rainfall and the timing of the nearest hurricane had the next highest effect on production after age. Surprisingly, low wind speeds were also significant. This is likely because when high wind speeds are predicted, solar farms are preemptively shut down so that the employees can evacuate, leading to no production, Gunda said.

Expanding the approach to wildfires, the grid

As an impartial research institution in this space, Sandia was able to collaborate with multiple industry partners to make this work feasible. “We would not have been able to do this project without those partnerships,” Gunda said.

The research team is working to extend the project to study the effect of wildfires on solar farms. Since wildfires aren’t mentioned in maintenance logs, they were not able to study them for this paper. Operators don’t stop to write a maintenance report when their solar farm is being threatened by a wildfire, Gunda said. “This work highlights the reality of some of the data limitations we have to grapple with when studying extreme weather events.”

“The cool thing about this work is that we were able to develop a comprehensive approach of integrating and analyzing performance data, operations data and weather data,” Jackson said. “We’re extending the approach into wildfires to examine their performance impacts on solar energy generation in greater detail.”

The researchers are currently expanding this work to look at the effects of severe weather on the entire electrical grid, add in more production data, and answer even more questions to help the grid adapt to the changing climate and evolving technologies.

This research was supported by the Department of Energy’s Solar Energy Technologies Office and was conducted in partnership with the National Renewable Energy Laboratory.

Source/Credit: Sandia National Laboratories

tn083121_01

Monday, August 30, 2021

Pathways to production

 Biologists at Sandia National Laboratories developed comprehensive software that will help scientists in a variety of industries create engineered chemicals more quickly and easily. Sandia is now looking to license the software for commercial use, researchers said.

Sandia’s stand-alone software RetSynth uses a novel algorithm to sort through large, curated databases of biological and chemical reactions, which could help scientists synthetically engineer compounds used in the production of biofuels, pharmaceuticals, cosmetics, industrial chemicals, dyes, scents and flavors.

A graphic illustration of the kind of retrosynthetic analysis conducted by RetSynth software developed at Sandia National Laboratories. Using a novel algorithm, the software identifies the biological or chemical reactions needed to create a desired biological product or compound.
(Graphic by Laura Hatfield)

The software platform uses retrosynthetic analysis to help scientists identify possible pathways to production — the series of biological and chemical reactions, or steps, needed to engineer and modify the molecules in a cell — to create the desired biological product or compound. By using the software to rapidly analyze all pathways, scientists can determine the production sequence with the fewest steps, the sequences that can be completed with available resources or the most economically viable process.
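Conceptually, finding the production sequence with the fewest steps is a shortest-path search over a reaction network in which nodes are compounds and edges are known reactions. The breadth-first sketch below illustrates that framing on an invented network; it is not RetSynth's algorithm or its curated databases.

```python
from collections import deque

# Toy shortest-pathway search over a compound -> compound reaction network.
# The network below is invented for illustration; RetSynth's curated databases
# and algorithm are far richer than this breadth-first sketch.
REACTIONS = {
    "glucose": ["pyruvate"],
    "pyruvate": ["acetyl-CoA", "lactate"],
    "acetyl-CoA": ["target_compound"],
    "lactate": [],
}

def fewest_step_pathway(start, target):
    """Return the shortest chain of compounds from start to target, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for product in REACTIONS.get(path[-1], []):
            if product not in seen:
                seen.add(product)
                queue.append(path + [product])
    return None

print(fewest_step_pathway("glucose", "target_compound"))
```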

Synthetic biology involves redesigning organisms for useful purposes by engineering them to have new abilities. Researchers and companies around the world are using synthetic biology to harness the power of nature to solve problems in medicine — such as the development of vaccines, antibodies and therapeutic treatments — as well as in manufacturing and agriculture.

“Synthetic biology is becoming a critical capability for U.S. manufacturing. It has the potential to dramatically reduce waste, eliminate or curtail emissions and create next-generation therapeutics and materials,” said Corey Hudson, a computational biologist at Sandia. “That is where people will see RetSynth have the biggest impact.”

“The diverse functionality of RetSynth opens a lot of opportunities for researchers, giving them multiple options, including biological, chemical or hybrid pathways to production,” Hudson said. “All the while, the software is accelerating the research and development process associated with bioproduction. Traditionally, this process has been relatively slow and complex.”

RetSynth is designed to save researchers time and money by suggesting process modifications to maximize theoretical yield, or the amount of bioproduct that could be produced, Hudson said. All available pathways are rendered using clear visual images, enabling software users to quickly interpret results.

Commercial licensing for broader impact

The RetSynth software was originally developed as part of the Department of Energy’s Co-Optimization of Fuels & Engines initiative, a consortium of national lab, university and industry researchers who are creating innovative fuels and combining them with high-efficiency engines to reduce emissions and boost fuel economy.

Today, RetSynth has been expanded to support a variety of diverse applications, and Sandia is ready to license the software to an industry partner for commercial use, Hudson said.

Source/Credit: Sandia National Laboratories

tn083021_02

Carbon nanotube fibers woven into clothing gather accurate EKGs

 

Rice University graduate student Lauren Taylor shows a shirt with carbon nanotube thread that provides constant monitoring of the wearer’s heart. Photo by Jeff Fitlow
There’s no need to don uncomfortable smartwatches or chest straps to monitor your heart if your comfy shirt can do a better job.

That’s the idea behind “smart clothing” developed by a Rice University lab, which employed its conductive nanotube thread to weave functionality into regular apparel.

The Brown School of Engineering lab of chemical and biomolecular engineer Matteo Pasquali reported in the American Chemical Society journal Nano Letters that it sewed nanotube fibers into athletic wear to monitor the heart rate and take a continual electrocardiogram (EKG) of the wearer.

The fibers are just as conductive as metal wires, but washable, comfortable and far less likely to break when a body is in motion, according to the researchers.

On the whole, the shirt they enhanced was better at gathering data than a standard chest-strap monitor taking live measurements during experiments. When matched with commercial medical electrode monitors, the carbon nanotube shirt gave slightly better EKGs.

“The shirt has to be snug against the chest,” said Rice graduate student Lauren Taylor, lead author of the study. “In future studies, we will focus on using denser patches of carbon nanotube threads so there’s more surface area to contact the skin.”

The researchers noted nanotube fibers are soft and flexible, and clothing that incorporates them is machine washable. The fibers can be machine-sewn into fabric just like standard thread. The zigzag stitching pattern allows the fabric to stretch without breaking them.

The fibers not only provided steady electrical contact with the wearer’s skin but also served as electrodes to connect electronics like Bluetooth transmitters to relay data to a smartphone, or to connect to a Holter monitor that can be stowed in a user’s pocket, Taylor said.

Pasquali’s lab introduced carbon nanotube fiber in 2013. Since then the fibers, each containing tens of billions of nanotubes, have been studied for use as bridges to repair damaged hearts, as electrical interfaces with the brain, for use in cochlear implants, as flexible antennas and for automotive and aerospace applications. Their development is also part of the Rice-based Carbon Hub, a multiuniversity research initiative led by Rice and launched in 2019.

A Rice lab uses a custom device that weaves carbon nanotube fibers into larger threads for sewing. Photo by Jeff Fitlow

The original nanotube filaments, at about 22 microns wide, were too thin for a sewing machine to handle. Taylor said a rope-maker was used to create a sewable thread, essentially three bundles of seven filaments each, woven into a size roughly equivalent to regular thread.

“We worked with somebody who sells little machines designed to make ropes for model ships,” said Taylor, who at first tried to weave the thread by hand, with limited success. “He was able to make us a medium-scale device that does the same.”

She said the zigzag pattern can be adjusted to account for how much a shirt or other fabric is likely to stretch. Taylor said the team is working with Dr. Mehdi Razavi and his colleagues at the Texas Heart Institute to figure out how to maximize contact with the skin.

Fibers woven into fabric can also be used to embed antennas or LEDs, according to the researchers. Minor modifications to the fibers’ geometry and associated electronics could eventually allow clothing to monitor vital signs, force exertion or respiratory rate.

Taylor noted other potential uses could include human-machine interfaces for automobiles or soft robotics, or as antennas, health monitors and ballistic protection in military uniforms. “We demonstrated with a collaborator a few years ago that carbon nanotube fibers are better at dissipating energy on a per-weight basis than Kevlar, and that was without some of the gains that we’ve had since in tensile strength,” she said.

“We see that, after two decades of development in labs worldwide, this material works in more and more applications,” Pasquali said. “Because of the combination of conductivity, good contact with the skin, biocompatibility and softness, carbon nanotube threads are a natural component for wearables.”

He said the wearable market, although relatively small, could be an entry point for a new generation of sustainable materials that can be derived from hydrocarbons via direct splitting, a process that also produces clean hydrogen. Development of such materials is a focus of the Carbon Hub.

“We’re in the same situation as solar cells were a few decades ago,” Pasquali said. “We need application leaders that can provide a pull for scaling up production and increasing efficiency.”

Co-authors of the paper are Rice graduate students Steven Williams and Oliver Dewey, and alumni J. Stephen Yan, now at Boston Consulting Group, and Flavia Vitale, an assistant professor of neurology at the University of Pennsylvania. Pasquali is director of the Carbon Hub and the A.J. Hartsook Professor of Chemical and Biomolecular Engineering and a professor of chemistry and of materials science and nanoengineering.

The research was supported by the U.S. Air Force (FA9550-15-1-0370), the American Heart Association (15CSA24460004), the Robert A. Welch Foundation (C-1668), the Department of Energy (DE-EE0007865, DE-AR0001015), the Department of Defense (32 CFR 168a) and a Riki Kobayashi Fellowship from the Rice Department of Chemical and Biomolecular Engineering.


NEWS RELEASE
Source/Credit: Rice University

tn083021_01
