Image credit: Jean-Philippe Roberge.
During his undergraduate studies, Jean-Philippe Roberge saw a video in class that explained how rat neurons had learned to fly a fighter plane simulator. This was a major trigger. It took him time to develop his new interest, through advanced courses and discussions with faculty members at his university, but he never forgot the trigger. “I still show that video to my own students,” he says.
Combining Vision and Touch
A professor in the Department of Systems Engineering at École de technologie supérieure (ÉTS) since 2020, Jean-Philippe Roberge focuses on artificial intelligence applications in robotics.
In his work, Jean-Philippe Roberge collaborates with many industrial partners, including Quebec SMEs facing a shortage of qualified workers. “This is an issue that needs to be addressed,” he says. “Especially since our rate of robotization is well below that of other countries. We have some catching up to do.”
Artificial vision, which allows robots to understand their surroundings, move around, and grasp the right object at the right time, has occupied researchers for decades. But the eye, whether human or artificial, is not always enough. A robot’s field of vision can be obstructed, for example by its own gripping device. In such cases, the sense of touch can prove to be a real advantage. Handling fragile or deformable objects also benefits greatly from this second source of information.
To exploit touch, the robot must be equipped with sophisticated tactile sensors. These detect contact with surfaces and the vibrations emitted during handling operations. Artificial intelligence algorithms can analyze the captured tactile data to identify objects by touch, or to detect dynamic phenomena such as an object sliding out of the robot’s gripping device, allowing it to rectify its grip before the object falls to the ground. Such reflexes, driven by tactile perception, allow robots to demonstrate greater skill in handling tasks.
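The slip-detection reflex described above can be sketched in a few lines. This is an illustrative toy, not the lab’s actual pipeline: the function names, window size, and energy threshold are all assumptions. The idea is simply that an object sliding in the gripper produces high-frequency vibration in the tactile signal, which a short-window energy measure can pick up.

```python
# Illustrative sketch (assumed, not the actual method): detect incipient
# slip by thresholding the vibration energy of a tactile signal over a
# short sliding window. Window size and threshold are arbitrary here.

def slip_detected(samples, window=8, threshold=0.5):
    """Return True if any short window of the tactile signal carries
    enough vibration energy to suggest the object is slipping."""
    for start in range(0, len(samples) - window + 1):
        chunk = samples[start:start + window]
        mean = sum(chunk) / window
        energy = sum((s - mean) ** 2 for s in chunk) / window  # variance as proxy
        if energy > threshold:
            return True
    return False

# A steady grip yields a flat signal; a slipping object adds vibration.
steady = [0.1] * 32
slipping = [0.1] * 16 + [0.1 + (-1) ** i * 1.5 for i in range(16)]
```

In a real controller, a detection would trigger an immediate increase in grip force rather than a boolean return.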
Learning through Experience
Jean-Philippe Roberge hopes to improve dexterity by teaching robots to use both vision and touch. The project, funded by the Natural Sciences and Engineering Research Council of Canada, requires processing vast quantities of visual and tactile data quickly enough for robots to make decisions.
“These are advanced perception problems that require huge quantities of data,” says Roberge. “It’s very difficult to program these kinds of behaviours manually, which makes machine learning essential.”
Reinforcement learning is the preferred artificial intelligence technique because it allows a robot to learn which actions best serve a specific objective. With this approach, an autonomous agent such as a robot performs experiments, makes decisions based on its state and environment, and adjusts its behaviour accordingly. Over time, the robot learns to maximize a “reward function” that describes the goal to be achieved.
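The loop described above (try an action, observe the reward, adjust behaviour) can be illustrated with a minimal tabular Q-learning sketch. Everything here is a toy assumption for illustration, not the lab’s actual setup: the agent must learn the right grip force for two object types, and the reward function favours holding the object without dropping or crushing it.

```python
import random

# Minimal tabular Q-learning sketch (illustrative assumptions throughout):
# states are object types, actions are grip-force levels 0..3.
OBJECTS = ["light", "heavy"]
FORCES = [0, 1, 2, 3]

def reward(obj, force):
    # Assumed toy reward: the light object needs force 1, the heavy one 2;
    # too little drops it (-1), too much crushes it (-1), just right (+1).
    target = 1 if obj == "light" else 2
    return 1.0 if force == target else -1.0

q = {(o, f): 0.0 for o in OBJECTS for f in FORCES}  # value estimates
alpha, epsilon = 0.1, 0.2                           # learning / exploration rates
rng = random.Random(0)

for _ in range(2000):
    obj = rng.choice(OBJECTS)                        # observe the state
    if rng.random() < epsilon:                       # explore...
        force = rng.choice(FORCES)
    else:                                            # ...or exploit what was learned
        force = max(FORCES, key=lambda f: q[(obj, f)])
    r = reward(obj, force)                           # experiment, get feedback
    q[(obj, force)] += alpha * (r - q[(obj, force)]) # adjust behaviour

best = {o: max(FORCES, key=lambda f: q[(o, f)]) for o in OBJECTS}
```

After training, `best` holds the grip force the agent has learned for each object type. Real robotic problems use continuous states and actions and far more sophisticated algorithms, but the maximize-the-reward principle is the same.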
Synthesizing Artificial Data
Machine learning involves training data, lots of data. However, allowing a robot to experiment freely can prove to be expensive. Especially if it has to handle delicate and expensive objects! This is where another of Jean-Philippe Roberge’s research areas comes into play—generating synthetic data.
“During my PhD, it took me months of experimentation to obtain 1,500 samples,” he says. Today, his PhD students can produce tens of thousands of synthetic training samples that allow robots to learn without handling real objects.
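One simple way to turn a handful of real recordings into thousands of training samples is data augmentation: perturbing each real signal with small random variations. The sketch below is an assumed illustration of that idea, not the group’s actual generation method; the perturbation types and magnitudes are invented for the example.

```python
import random

# Illustrative sketch (assumed method): expand a few real tactile readings
# into many synthetic samples via random gain variation, time shifts, and
# additive noise.

def synthesize(real_samples, n_per_sample, rng):
    synthetic = []
    for signal in real_samples:
        for _ in range(n_per_sample):
            scale = 1.0 + rng.uniform(-0.1, 0.1)    # simulate sensor-gain variation
            shift = rng.randrange(len(signal))      # random circular time shift
            rotated = signal[shift:] + signal[:shift]
            noisy = [scale * s + rng.gauss(0, 0.01) for s in rotated]
            synthetic.append(noisy)
    return synthetic

rng = random.Random(42)
real = [[0.0, 0.2, 0.5, 0.2], [0.1, 0.1, 0.9, 0.1]]  # e.g. two real recordings
data = synthesize(real, n_per_sample=5000, rng=rng)   # 10,000 synthetic samples
```

More advanced approaches generate data from physics simulators or learned generative models, but the payoff is the same: the robot learns without months of handling real objects.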
A Lot of Research Still on the Drawing Board
Robots may eventually make their way into our homes, but major breakthroughs in visual/tactile signal processing will be needed before that happens.
And while artificial intelligence is a powerful tool, it is not a cure-all. “We are still far from human consciousness. Three-dimensional vision is expensive and complicated. And despite all the power of algorithms, there are still hardware problems to solve.” Something to keep the next generation of researchers quite busy!