Ralph W. Bernstein is the senior business developer at a lab you’ve probably never heard of in Norway — the SINTEF MiNaLab.
MiNaLab is operated by SINTEF, the largest independent research organization in Scandinavia. The lab is an 800 m² cleanroom with a full processing line for 150 mm silicon wafers, and it takes a lab-to-fab approach to producing sensors and actuators.
Bernstein says the lab’s mission is to increase customers’ competitiveness by developing and delivering customized microchips as integral components of their products rather than offering off-the-shelf components.
“MiNaLab’s strategic research is focused on establishing enabling process technology platforms for next-generation sensor and actuator chips,” said Bernstein. “Our commercialization projects are, on the other hand, almost without exception focusing on developing applications.”
“MiNaLab has spent close to twenty years developing and refining our world-class thin-film piezoelectric actuation technology platform (3D ultrasonic sensors),” said Bernstein.
“The goal of the project was to push the boundaries of existing ultrasonic applications, evolving them into something capable of providing higher resolution and greater environmental adaptability,” said Bernstein. “Traditional sensors were limited in their ability to function across diverse conditions and in delivering 3D imaging in air.”
MiNaLab’s expertise in microfabrication and novel materials allowed them to develop this new technology for real-world industrial and robotic applications.
Here come the robots
Sonair is using the 3D ultrasound sensors, exclusively licensed from MiNaLab, to enable autonomous robots to see and make the right decisions. Sonair recently closed a $6.8 million funding round led by Skyfall Ventures, which it hopes will propel the robotics industry into a new era of sensing technology.
The new 3D ultrasonic sensors will transform a mobile robot’s spatial awareness from 2D to 3D, which offers improved safety performance over traditional vision systems.
“MiNaLab has focused on this niche for two decades because we recognized early on that ultrasonic technology, particularly in the context of 3D sensing, holds untapped potential,” said Bernstein. “Over the years, the market has been dominated by optical and electromagnetic technologies, but they come with their own constraints.”
Bernstein says that ultrasonic technology, especially in the air, fills a unique role, particularly in environments where traditional sensors fail to see all objects.
“The next stage for robotics, with ADAR and 3D ultrasonic sensors, is seamless and safe real-time spatial awareness,” said Bernstein. “As robotics evolves toward more autonomous, adaptive, and even collaborative roles (working side by side with humans), having sensors that can operate reliably in unpredictable conditions becomes critical.”
He says that in the future, they see this technology enabling smarter, safer robots that can be deployed in even more challenging environments, whether indoor or outdoor.
New sensing technology
Knut Sandven, CEO of Sonair, says that images can be constructed by analyzing the echoes of high-frequency sound waves, inaudible to human ears, with advanced signal processing. “Sonair’s breakthrough is to make ultrasound imaging possible for in-air applications for the first time.”
These silicon ultrasound transmitters are essentially millimetre-scale loudspeakers capable of generating sound levels comparable to a rock concert, said Sandven.
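The arithmetic behind echo ranging is simple, even if Sonair’s real signal chain is not. As a minimal sketch (illustrative values only, not Sonair’s implementation), a single distance measurement comes down to timing an echo and halving the round trip:

```python
# Minimal echo-ranging sketch: distance from an ultrasound time-of-flight.
# Illustrative only; Sonair's actual signal processing is far more involved.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_distance(round_trip_s: float) -> float:
    """Distance to a reflector, given the round-trip time of its echo."""
    return SPEED_OF_SOUND * round_trip_s / 2.0  # halve: sound travels out and back

# An echo that returns after 5.8 ms puts the reflector about 1 m away.
print(f"{echo_distance(0.0058):.2f} m")  # -> 0.99 m
```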
“Sonair is focused on applying 3D ultrasound technology to automation and robotics because we are in the middle of an automation and robotics revolution,” said Sandven.
“There are two main challenges the robotics companies are all working on: generative autonomy and human-level perception,” said Sandven.
“With generative autonomy, a robot would get a prompt such as ‘go empty the dishwasher’ and then would work out how to do that task on its own without any detailed programming,” he said.
Sandven says this is similar to what we see with text-based generative artificial intelligence (AI).
In a September 2024 MIT study, researchers said they had developed a method that enables robots to make intuitive, task-relevant decisions. The team’s new approach, named Clio, enables a robot to quickly map a scene and identify the items it needs to complete a given set of tasks.
“But with human-level perception, the robots can perceive their physical environment as well as or better than humans,” said Sandven. “Sonair is working on solving the second challenge of human-level perception with 3D ultrasonic sensors.”
From 2D to 3D sensors for robots
One of the most expensive components of an autonomous mobile robot (AMR) is its sensor package, which is crucial for obstacle detection and safe navigation.
“The depth-ranging sensors used for robotics are currently laser (LiDAR), radar and 1D ultrasound,” said Sandven. “The 1D ultrasound sensors are the ones used as parking sensors on cars, and 3D ultrasonic sensing in air is a new and emerging depth-ranging category with very few players.”
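The jump from 1D to 3D ultrasound is essentially the jump from a single transducer to an array. A classic way to extract direction from an array is delay-and-sum beamforming; the sketch below uses a hypothetical 8-element linear array and a simulated echo (none of these parameters come from Sonair) to show the principle:

```python
# Delay-and-sum beamforming sketch: how a transducer array turns echoes
# into direction estimates. Hypothetical geometry and signal, not ADAR.
import numpy as np

C = 343.0        # speed of sound in air, m/s
FS = 1_000_000   # simulation sample rate, Hz
SPACING = 0.004  # 4 mm element pitch (< half the 40 kHz wavelength)
mic_x = np.arange(8) * SPACING  # element positions along a line

def plane_wave_echo(angle_deg: float, n: int = 4096) -> np.ndarray:
    """Simulated 40 kHz echo pulse arriving from angle_deg, one row per element."""
    t = np.arange(n) / FS
    pulse = np.sin(2 * np.pi * 40e3 * t) * np.exp(-((t - 1e-3) / 2e-4) ** 2)
    delays = mic_x * np.sin(np.radians(angle_deg)) / C
    return np.stack([np.interp(t - d, t, pulse) for d in delays])

def steered_power(signals: np.ndarray, angle_deg: float) -> float:
    """Undo the per-element delays for a candidate angle and sum the energy."""
    t = np.arange(signals.shape[1]) / FS
    delays = mic_x * np.sin(np.radians(angle_deg)) / C
    aligned = [np.interp(t + d, t, s) for d, s in zip(delays, signals)]
    return float(np.sum(np.sum(aligned, axis=0) ** 2))

echoes = plane_wave_echo(angle_deg=25.0)
angles = np.arange(-60, 61)
best = angles[np.argmax([steered_power(echoes, a) for a in angles])]
print(f"estimated direction: {best} deg")  # peaks near 25 deg
```

The summed energy peaks when the assumed angle matches the true arrival direction; a 2D array of elements extends the same idea to azimuth and elevation, which is what makes in-air 3D imaging possible.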
Bernstein says that robots, especially in dynamic environments like warehouses, factories or the outdoors, need to perceive spaces accurately in 3D. “While technologies like LiDAR and cameras are often used, they have their limitations. LiDAR can struggle in certain reflective environments, and cameras require good lighting and struggle with depth perception in specific scenarios.”
MiNaLab’s 3D ultrasonic sensors, based on their ADAR (acoustic detection and ranging) technology, offer a complementary sensing method that can overcome these limitations. “They are robust in various environments (dust, fog, low light), offer precise object detection at close and mid-range distances, and can even detect soft objects that other sensor types might miss,” said Bernstein.
Bernstein says this makes them critical for robotic applications where safety, precision, and environmental adaptability are paramount.
“We are on a mission to replace 2D lasers (LiDAR) with our 3D ultrasonic sensors,” said Sandven. “Most industrial robots today that move autonomously use a 2D laser scanner to provide the safety function, i.e. that the robot reliably stops for objects.”
Sandven says LiDAR shoots a horizontal-plane laser out about 20 cm over the ground and tells the robot to stop if something gets too close. “This is an expensive technology with significant shortcomings because it can only see objects in the laser plane, and it struggles with transparent surfaces and reflective objects.”
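In safety terms, a planar scanner reduces to a very simple test, which is also its weakness: only returns in that one plane can ever trigger a stop. A toy sketch (hypothetical threshold, not any vendor’s certified safety logic):

```python
# Toy 2D safety-stop check: a planar scanner only sees ranges in its plane.
# Hypothetical threshold; not a certified safety implementation.

STOP_DISTANCE = 0.5  # metres, assumed protective-field radius

def should_stop(scan_ranges_m: list[float]) -> bool:
    """Stop if any return in the horizontal scan plane is too close."""
    return any(r < STOP_DISTANCE for r in scan_ranges_m)

print(should_stop([2.1, 1.7, 0.4, 3.0]))  # True: obstacle at 0.4 m in-plane
# Anything above or below the ~20 cm plane never appears in scan_ranges_m.
```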
Sandven says the bottom line is that robots, like humans, need to observe and understand their environment.
“For that, they need a 3D view. If they only see in 2D, they see only a slice of the physical environment,” said Sandven. “Most robots today are separated from humans because of safety concerns.”
Sandven says this separation will change.
“If a robot approaches a human leaning towards the robot, the 2D laser scanner will see the legs of the human and not the head or hand, which may be closer,” he said. “In contrast, Sonair’s patented ADAR technology detects people and objects in 3D, with low energy and computational requirements.”
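Sandven’s leaning-person example can be made concrete with a hedged sketch (the coordinates are invented for illustration): a planar scan only measures the slice near leg height, while a 3D sensor takes the minimum distance over the whole point cloud.

```python
# Why 3D sensing catches a leaning person that a 2D scan plane misses.
# Invented point cloud (x, y, z) in metres, robot sensor at the origin.
import math

person = [
    (1.2, 0.0, 0.2),  # legs, at the ~20 cm scan-plane height
    (0.5, 0.0, 0.9),  # a hand leaning in, well above the plane
]

def dist(p):
    return math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2)

planar = min(dist(p) for p in person if abs(p[2] - 0.2) < 0.05)  # 2D view
full_3d = min(dist(p) for p in person)                           # 3D view

print(f"2D scanner sees {planar:.2f} m; 3D sensor sees {full_3d:.2f} m")
# -> 2D scanner sees 1.22 m; 3D sensor sees 1.03 m
```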
Sandven says their technology is where AI, machine learning, and automation truly converge.