Header picture is from the author: Substance CC license applies.
Motion capture “involves measuring an object’s position and orientation in physical space, then recording that information in a computer-usable form. Objects of interest include human and non-human bodies, facial expressions, camera or light positions, and other elements in a scene”. The drawbacks of existing motion capture systems are their high fabrication and setup costs and their reliance on external infrastructure (e.g., outside-in systems that require a dedicated room equipped with cameras).
Concerning indoor pedestrian navigation, the current trend is to use Local Positioning Systems (LPS) built on a complex network of external sensors. Our system offers the following advantages:
- High accuracy at a low production cost;
- Easy and stealthy deployment anywhere, with no external aid;
- A customizable application with many features, giving data-processing students and engineers an easy way to analyze inertial technologies.
The ibNav project uses MicroElectroMechanical System (MEMS) inertial sensors, combined with optimized inertial algorithms, for motion capture and indoor pedestrian navigation. Figure 1 gives a complete description of the system.
The ibNav system has the following characteristics:
- Motion Capture: Body movement recognition with 3D display, using attitude and heading reference system (AHRS) algorithms (figure 2).
- Indoor Pedestrian Navigation: Movement represented on a map using inertial navigation system (INS) algorithms with no external sources.
Figure 2: Examples of motion patterns shown in 3D
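As a rough illustration of what an AHRS does, the sketch below fuses gyroscope and accelerometer data with a simple complementary filter. This is a deliberately minimal stand-in for the EKF-based AHRS used in ibNav, not the project's actual algorithm; the function name, the small-angle treatment of the gyro rates, and the blending weight `alpha` are all illustrative assumptions:

```python
import numpy as np

def ahrs_step(roll_pitch, gyro, accel, dt, alpha=0.98):
    """One complementary-filter update: blend integrated gyro rates with
    the gravity direction sensed by the accelerometer.
    roll_pitch: current [roll, pitch] in radians
    gyro: [p, q, r] body angular rates (rad/s)
    accel: [ax, ay, az] specific force (m/s^2)."""
    # Propagate attitude with the gyroscope (accurate short-term, but drifts);
    # treating p, q directly as roll/pitch rates is a small-angle approximation
    predicted = roll_pitch + gyro[:2] * dt
    # Recover roll/pitch from the accelerometer's gravity vector
    # (noisy short-term, but stable long-term)
    ax, ay, az = accel
    acc_roll = np.arctan2(ay, az)
    acc_pitch = np.arctan2(-ax, np.hypot(ay, az))
    # Weighted fusion: trust the gyro over short intervals, the accelerometer over long ones
    return alpha * predicted + (1 - alpha) * np.array([acc_roll, acc_pitch])
```

Run at each sensor sample, the filter converges to the tilt implied by gravity while rejecting accelerometer noise between samples; an EKF such as the one in ibNav generalizes this idea with a full state covariance and heading from the magnetometers.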
The prototype is a network of 14 low-cost Inertial Measurement Units (IMUs) using MEMS sensors. These IMUs (Figure 5) are embedded in a suit at each main joint of the body. Each one combines accelerometers, gyroscopes, and geomagnetic sensors (Figure 4) measuring forces in a Cartesian reference frame, providing kinematic information with 9 Degrees of Freedom (DOF). Raw measurements and computed data (position, velocity, and attitude) are obtained from each IMU and sent over a Serial Peripheral Interface (SPI) bus to the central IMU. The computed data result from a calibration process followed by strapdown navigation algorithms. The calibration algorithm uses a multi-position method together with an Extended Kalman Filter (EKF) and a stance detector. The navigation algorithms combine the Attitude and Heading Reference System (AHRS) and the Inertial Navigation System (INS) within an EKF framework covering several models. The fused navigation state is then broadcast as full data frames over Wi-Fi to an Apple iPad.
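The strapdown mechanization and stance detector mentioned above can be sketched as follows. This is a simplified Euler-integration sketch under stated assumptions, not the ibNav implementation: the attitude matrix is assumed to be supplied by the AHRS, Earth-rate and Coriolis terms are ignored, and the stance-detector output is reduced to a boolean zero-velocity flag:

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # navigation-frame gravity (m/s^2), z axis up

def ins_step(pos, vel, C_nb, accel_b, dt, zero_velocity=False):
    """One simplified strapdown mechanization step.
    pos, vel: position and velocity in the navigation frame
    C_nb: 3x3 body-to-navigation rotation matrix (from the AHRS)
    accel_b: specific force measured in the body frame (m/s^2)."""
    # Rotate the specific force into the navigation frame and remove gravity
    accel_n = C_nb @ accel_b + GRAVITY
    # Integrate acceleration -> velocity -> position (Euler integration)
    vel = vel + accel_n * dt
    if zero_velocity:
        # Stance detector: when the foot is flat on the ground, clamp the
        # velocity to zero (a ZUPT) to bound the drift of the double integration
        vel = np.zeros(3)
    pos = pos + vel * dt
    return pos, vel
```

Because MEMS sensor errors are doubly integrated, uncorrected position drifts rapidly; zero-velocity updates at each detected stance are what make pedestrian-grade MEMS navigation usable without external sources.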
A Graphical User Interface (GUI) for the iPad was developed to support further R&D on this prototype platform. On the one hand, it can display either a 3D representation of the moving subject or the subject's movement on a map, in real time or in post-processing. On the other hand, it allows the performance of the calibration, navigation, and model algorithms (together forming the inertial algorithms) to be studied through charts, with the possibility of changing their parameters (Figure 6).
ibNav is designed as a research and development platform for industry-related applications. Potential uses are numerous and span four main sectors: entertainment, sports, healthcare, and the military.
René Jr Landry
René Jr Landry is a professor in the Electrical Engineering Department at ÉTS and the Director of LASSENA. His expertise in embedded systems, navigation, and avionics applies notably to transportation, aeronautics, and space technologies.
Program : Electrical Engineering
Adrien Mixte is a project manager at the LASSENA laboratory. His master's thesis at ÉTS consisted in developing a body motion capture and indoor navigation prototype based on low-cost MEMS-IMU platforms.
Program : Electrical Engineering
Research laboratories : LASSENA – Laboratory of Space Technologies, Embedded Systems, Navigation and Avionics