Researchers at Rochester Institute of Technology are creating a novel sensor system based on the exceptional design and detection range of harbor seal whiskers.
Xudong Zheng, an associate professor in RIT’s Kate Gleason College of Engineering, received a three-year, $746,000 award from the Naval Research Laboratory to build an autonomous underwater detection and tracking system with biological-level sensitivity, accuracy, and intelligence.
Amid demand for new sensor capabilities, increased sensitivity and accuracy could significantly advance underwater scientific exploration, such as tracking anomalies and seismic events in currently inaccessible areas, as well as improve robotic functions and military stealth missions.
“This is the next stage of development of stronger sensors,” said Zheng, whose team published findings in Frontiers in Robotics and AI. “Some early results of our computer simulations show that the sensor array combined with ‘smart’ algorithms could provide more smart perceptions and better reasoning regarding the signal pattern and how it corresponds to flow patterns.”
Water can be an unpredictable medium in which to recognize objects because of limited visibility and disturbances. Zheng’s work will further demonstrate how the spatial, or three-dimensional, aspects of an object can be consistently recognized. Also key to the research is how artificial intelligence is broadly introduced into the overall system to better predict movement and identify objects in marine settings.
“We are trying to mimic seals’ highly sensitive sensors using the bio-inspired shape of their whisker array because it can detect a disturbance at 254 microns per second,” he said.
Seals’ whiskers have varied lengths and orientations, making their detection abilities highly accurate. They innately recognize spatial indicators as well as distance, speed, directional movement, and location. RIT’s research team is combining these capabilities into one comprehensive system. Current whisker-inspired passive sensors, although very sophisticated, often provide only single-point measurements.
“Single sensors are unable to detect spatial information,” he added.
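To illustrate why an array can recover spatial information that a single sensor cannot, the short Python sketch below is a purely hypothetical example (not from the RIT project; the sensor layout and readings are invented for illustration). It estimates the bearing of a disturbance from the relative vibration amplitudes of whisker-like sensors mounted at different angles.

```python
import math

# Hypothetical whisker array: each sensor sits at a known angle (degrees)
# around the hull and reports the amplitude of its flow-induced vibration.
sensor_angles = [0, 45, 90, 135, 180, 225, 270, 315]

def estimate_bearing(amplitudes):
    """Estimate the direction of a disturbance from per-sensor amplitudes.

    Uses an amplitude-weighted circular mean: sensors that vibrate more
    strongly pull the estimate toward their mounting angle. A single
    sensor could only report that something is moving, with no direction.
    """
    x = sum(a * math.cos(math.radians(theta))
            for a, theta in zip(amplitudes, sensor_angles))
    y = sum(a * math.sin(math.radians(theta))
            for a, theta in zip(amplitudes, sensor_angles))
    return math.degrees(math.atan2(y, x)) % 360

# Simulated readings: the disturbance lies roughly "northeast" of the array,
# so the 45-degree sensor responds most strongly.
readings = [0.4, 1.0, 0.5, 0.1, 0.05, 0.05, 0.1, 0.2]
print(f"Estimated bearing: {estimate_bearing(readings):.1f} degrees")
```

In this toy case the weighted average lands near 43 degrees, close to the most strongly excited sensor; the actual RIT system would combine far richer signals and learned models rather than a simple average.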
Two key additions to the system can increase sensor accuracy: spatial recognition, the ability to visualize objects from multiple perspectives and to understand how objects relate to one another, and interpretable learning, a feature of the artificial intelligence integration.
“We are designing very sensitive sensors that can be tightly packed into smaller spaces and that can extract spatial information to recognize the surrounding environment with an algorithm to accurately predict the shape of an object. Based on this information, we can understand why this signal corresponds to certain types of bodies, or objects,” said Zheng.
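As a rough illustration of how an algorithm might tie a signal pattern back to the body that produced it, the sketch below is hypothetical (the feature values and object classes are invented and are not taken from Zheng’s work). It matches a measured signature from the array against a few reference signatures and reports the distances, the kind of transparent reasoning step that interpretable learning aims to expose.

```python
import math

# Hypothetical reference signatures: per-shape features extracted from
# whisker-array signals (dominant shedding frequency in Hz and the ratio
# of cross-stream to streamwise vibration energy). Values are invented
# for illustration only.
REFERENCE_SIGNATURES = {
    "cylinder": (12.0, 1.8),
    "flat plate": (7.5, 3.2),
    "streamlined body": (3.0, 0.6),
}

def classify(signature):
    """Match a measured signature to the closest reference shape.

    Returns the best match plus per-shape distances, so a user can see
    why the system preferred one shape over another.
    """
    distances = {
        shape: math.dist(signature, ref)
        for shape, ref in REFERENCE_SIGNATURES.items()
    }
    best = min(distances, key=distances.get)
    return best, distances

measured = (11.2, 2.0)  # simulated reading from the whisker array
shape, scores = classify(measured)
print(f"Closest match: {shape}")
for name, d in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"  {name}: distance {d:.2f}")
```

A real system would learn these associations from flow simulations and experiments rather than a fixed lookup table, but the idea of reporting why a signal corresponds to a certain type of body is the same.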
The work began several years before Zheng came to RIT. He is an expert in flow physics and biomechanics and leads the Flow Physics and Modeling Lab in the Department of Mechanical Engineering in the Kate Gleason College. The research has continued since he and his research partner and spouse, Qian Xue, began teaching at RIT in 2022. Xue, also an associate professor of mechanical engineering, is an expert in flow-structure interaction. She received a CAREER award in 2022 to explore how vibrations are captured and interpreted.
Joining them as a collaborator is Dongfang Liu, assistant professor of computer engineering, who brings expertise in AI and vision-language intelligence to the project. Associates from RIT’s 3D printing laboratories are also collaborating with the team to build test prototypes.
“We are leveraging several very strong fields at RIT—AI and 3D printing—to build this type of technology. We need these technologies to generate smaller, compact sensors and using AI we can add to the system to build even smarter sensors,” Zheng said.
Published in journal: Frontiers in Robotics and AI
Source/Credit: Rochester Institute of Technology
Reference Number: eng021924_01