SCIENTIFIC NEWS AND
INNOVATION FROM ÉTS

Indoor and Body Navigation Prototype Using Low-Cost Sensors


René Jr Landry
René Jr Landry is a professor in the Electrical Engineering Department at ÉTS and the Director of LASSENA. His expertise includes embedded systems, navigation, and avionics.

Adrien Mixte
Adrien Mixte is a project manager at the LASSENA laboratory. His master's thesis at ÉTS consisted in developing a body and indoor navigation prototype based on low-cost MEMS-IMU platforms.


The header picture is from the author; the Substance CC license applies.

Motion capture “involves measuring an object’s position and orientation in physical space, then recording that information in a computer-usable form. Objects of interest include human and non-human bodies, facial expressions, camera or light positions, and other elements in a scene” [1]. The drawbacks of existing motion capture systems are their high fabrication and setup costs, and their dependence on external sources (e.g., outside-in systems requiring a dedicated room and cameras).

Concerning indoor pedestrian navigation, the trend is to use Local Positioning Systems (LPS) built on a complex network of external sensors. Our system has the following advantages:

  • High accuracy at a low production cost;
  • Easy and unobtrusive deployment anywhere, without external aid;
  • A customizable application with many features, giving data processing students and engineers an easy way to analyze inertial technologies.

Figure 1 General description of the ibNav system

The ibNav project uses MicroElectroMechanical System (MEMS) inertial sensors, combined with optimized inertial algorithms, for motion capture and indoor pedestrian navigation. Figure 1 gives a complete description of the system.

The ibNav system has the following characteristics:

  • Motion Capture: Body movement recognition with 3D display, using attitude and heading reference system (AHRS) algorithms (Figure 2).
  • Indoor Pedestrian Navigation: Movement represented on a map using inertial navigation system (INS) algorithms with no external sources.

Figure 2 Examples of motion patterns shown in 3D [Img2]



Figure 4 Example of a gyroscope operating with freedom in all three axes. Note that the rotor maintains its spin-axis direction regardless of the orientation of the outer frame [Img3]

The prototype is a network of 14 low-cost inertial measurement units (IMUs) using MEMS sensors. These IMUs (Figure 5) are embedded in a suit, one at each main joint of the body. Each unit combines accelerometers, gyroscopes, and geomagnetic sensors (Figure 4), measuring along the three axes of a Cartesian reference frame and providing kinematic information with 9 degrees of freedom (DOF).
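Each IMU thus delivers three 3-axis measurements per sample. As a minimal sketch (the names and layout here are hypothetical, not ibNav's actual data format), such a 9-DOF sample could be modeled as follows; a static unit should read only gravity on its accelerometer:

```python
from dataclasses import dataclass
from math import sqrt

@dataclass
class ImuSample:
    """One 9-DOF reading from a single joint-mounted IMU (illustrative layout)."""
    accel: tuple  # specific force, m/s^2, body-frame x/y/z
    gyro: tuple   # angular rate, rad/s, body-frame x/y/z
    mag: tuple    # magnetic field, arbitrary units, body-frame x/y/z

def accel_magnitude(sample: ImuSample) -> float:
    """Norm of the specific-force vector; near 9.81 m/s^2 when the IMU is static."""
    return sqrt(sum(a * a for a in sample.accel))

# A motionless IMU ideally measures only gravity along its vertical axis:
static = ImuSample(accel=(0.0, 0.0, 9.81), gyro=(0.0, 0.0, 0.0), mag=(0.2, 0.0, 0.4))
print(round(accel_magnitude(static), 2))  # 9.81
```

This static-gravity property is what calibration and stance-detection routines typically exploit to estimate sensor biases.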


Figure 5 Example of the STEVAL-MKI062V2, the inertial measurement unit used in the ibNav project. Source [Img 1]

Raw measurements and computed data (position, velocity, and attitude) are obtained from each IMU and sent through a Serial Peripheral Interface (SPI) bus to the central IMU. The computed data result from a calibration process followed by strapdown navigation algorithms. The calibration algorithm uses a multi-position method with an Extended Kalman Filter (EKF) and a stance detector. The navigation algorithms combine the attitude and heading reference system (AHRS) and the inertial navigation system (INS) within an EKF framework covering several motion models. The fused navigation state is broadcast as full data frames over Wi-Fi to an Apple iPad.
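ibNav's sensor fusion is EKF-based; as a simplified, hypothetical illustration of the underlying AHRS idea, a complementary filter blends short-term gyroscope integration with a long-term accelerometer tilt reference:

```python
import math

def complementary_pitch(pitch_prev, gyro_y, accel_x, accel_z, dt, alpha=0.98):
    """One pitch-angle update blending gyro and accelerometer information.

    Simplified stand-in for the AHRS stage: gyros track fast motion well but
    drift over time; accelerometers give a drift-free tilt reference from
    gravity but are noisy during motion. The weight alpha trades them off.
    """
    gyro_pitch = pitch_prev + gyro_y * dt        # short-term: integrate angular rate
    accel_pitch = math.atan2(-accel_x, accel_z)  # long-term: tilt from gravity
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Static IMU with a wrong initial estimate: repeated updates pull the
# estimate toward the accelerometer tilt (0 rad for a level sensor).
pitch = 0.5
for _ in range(300):
    pitch = complementary_pitch(pitch, gyro_y=0.0, accel_x=0.0, accel_z=9.81, dt=0.01)
print(round(pitch, 3))
```

In this static case the gyro term contributes nothing, so the accelerometer reference bounds the error, illustrating how fusion keeps gyroscope drift in check; the EKF used in ibNav does this weighting adaptively from the estimated uncertainties.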

A Graphical User Interface (GUI) for iPad was developed to support further R&D on this prototype platform. On the one hand, it can display either a 3D representation of a moving subject or the subject's movement on a map, in real time or in post-processing. On the other hand, it makes it possible to study the performance of the calibration, navigation, and model algorithms (together forming the inertial algorithms) through charts, with the option of changing parameters (Figure 6).


Figure 6 The ibNav GUI objectives. Source [Img1]


Figure 7 Location of the sensors on the suit. Source [Img1]



Figure 8 Images of the ibNav System, Body Wear and Interface. Source [Img1]

Possible Usage

ibNav is a project designed as a research and development platform for industry-related applications. Potential uses are numerous and essentially cover four distinct sectors: entertainment, sports, healthcare, and the military.

 

To inquire about future projects of the LASSENA team, please consult the following link.


