Just as in-vehicle navigation systems have already revolutionized finding street addresses, pedestrian navigation systems aim to revolutionize finding indoor locations.
To enable a whole range of pedestrian navigation applications, Sensor Platforms in San Jose, Calif. — a software specialist that licenses motion algorithms — has added Pedestrian Dead Reckoning (PDR) to its FreeMotion Library of algorithms.
Sensor Platforms' PDR system uses 10-axis sensor fusion on the data from micro-electro-mechanical system (MEMS) sensors — accelerometers, gyroscopes, magnetometers, and barometric pressure sensors (for altitude) — to calculate the distance traveled by a user as well as the user's direction (bearing), working from the last known waypoint as read off a Global Positioning System (GPS) chip. By calibrating to the user's context, Sensor Platforms claims its PDR solution provides accuracy within a few percent of the distance traveled from the last known waypoint.
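The core dead-reckoning step can be sketched as follows. This is a minimal illustration, not Sensor Platforms' algorithm: it assumes a step length and fused heading are already available, and advances the position from the last known fix using a flat-earth approximation, which is adequate over indoor distances. The function name and parameters are hypothetical.

```python
import math

def pdr_update(lat, lon, step_length_m, bearing_deg):
    """Advance a position estimate by one detected step.

    lat/lon: last known fix in degrees (e.g. the GPS waypoint
    at the building entrance).
    bearing_deg: fused heading, 0 = north, clockwise positive.
    Flat-earth approximation; fine over a few hundred meters.
    """
    earth_radius_m = 6_371_000.0
    d_north = step_length_m * math.cos(math.radians(bearing_deg))
    d_east = step_length_m * math.sin(math.radians(bearing_deg))
    new_lat = lat + math.degrees(d_north / earth_radius_m)
    new_lon = lon + math.degrees(
        d_east / (earth_radius_m * math.cos(math.radians(lat))))
    return new_lat, new_lon
```

Repeating this update for every detected step is what accumulates distance and bearing away from the waypoint — and also why per-step errors compound, making the context calibration described below important.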
(Source: Sensor Platforms)
“Rather than go straight from sensor fusion to pedestrian dead reckoning, the approach that Sensor Platforms takes is to determine the user context after the sensor fusion and use that to enable more accurate pedestrian dead reckoning,” Frank Shemansky, vice president of business development at Sensor Platforms, told us. “Our software uses machine-learning algorithms to determine if you are sitting, if you are standing, if you are walking or running — in fact, given enough data we can determine just about any user context.”
Other indoor navigation systems require RF receivers that triangulate their location from Wi-Fi router signals or RF beacons, but dead-reckoning systems calculate location by keeping track of the distance and direction they have traveled since the last known waypoint — usually the GPS fix at the entrance to an indoor facility.
Besides the usual contexts, Sensor Platforms has been asked to develop many custom contexts for specialized devices and applications, such as swimming, biking, and playing basketball. Its approach is to develop specific sensor-data signatures for each context, relying primarily on accelerometer data, with the other sensors making the context detection more accurate. Once the signature of a steady state is recognized — say, sitting — the algorithms detect transitions to another steady state — say, standing — usually in under five seconds, Shemansky claimed.
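To make the idea of a sensor-data signature concrete, here is a deliberately simple sketch — a variance threshold on accelerometer magnitude, standing in for the machine-learning classifiers Sensor Platforms actually uses. The function name and threshold value are assumptions for illustration only.

```python
from statistics import pstdev

def classify_window(accel_magnitudes_ms2, still_threshold=0.3):
    """Toy context classifier over one window of accelerometer
    magnitude samples (m/s^2).

    Low variance suggests a still context (sitting/standing);
    higher variance suggests walking. Real systems use richer
    features (step frequency, gyro, barometer) and learned models.
    """
    if pstdev(accel_magnitudes_ms2) < still_threshold:
        return "still"
    return "walking"
```

A production classifier would also debounce transitions between steady states (the sub-five-second detection Shemansky describes) rather than switching context on a single noisy window.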
Context awareness also assesses how a device like a smartphone is being carried — handheld in front of the user, held at the user's side, or bouncing around in a pocket. Without carry-context detection, according to Sensor Platforms, switching from portrait to landscape orientation could erroneously be interpreted as a 90-degree change in direction, but adding carry-context awareness to sensor fusion yields a constant bearing despite device rotation.
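The rotation problem can be sketched as a simple correction: if carry-context detection supplies how far the device itself has been rotated in the user's hand, that offset can be subtracted from the device heading so the walking bearing stays constant. This is an illustrative reduction of the idea, not Sensor Platforms' method; the function and its inputs are hypothetical.

```python
def compensated_bearing(device_heading_deg, device_yaw_offset_deg):
    """Remove device-in-hand rotation from the fused heading.

    device_heading_deg: heading reported by sensor fusion.
    device_yaw_offset_deg: how far the device has been rotated
    relative to the user's direction of travel (e.g. +90 after a
    portrait-to-landscape flip), as estimated by carry-context
    detection. Result is the user's bearing, wrapped to [0, 360).
    """
    return (device_heading_deg - device_yaw_offset_deg) % 360.0
```

With this correction, a user walking at a bearing of 45 degrees who flips the phone to landscape (heading jumps to 135, yaw offset jumps to 90) is still reported as traveling at 45 degrees.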
(Source: Sensor Platforms)
Designed for any mobile device — from smartphones and tablets to smartglasses and smartwatches — all Sensor Platforms PDR solutions perform automatic sensor calibration and magnetic anomaly compensation with routines from the FreeMotion Fusion library that run in the background.