It was a happy accident that we got our hands on a thermographic camera and started experimenting with it in the lab. Our R&D team had already been tasked with developing natural and intuitive ways of interacting with augmented reality applications on head-mounted displays, but this first foray was really just playing around with new toys.
After measuring the temperature of my coffee mug and my display, and discovering interesting temperature patterns in my face, I noticed that wherever I left my hand resting on the desk, residual heat became apparent in the thermal image. Brief experiments showed that this was not unique to my desk: most objects exhibit a warm spot after being touched.
The camera module also included a visible light camera, which makes it possible to recognize and track objects in its field of view. The idea arose that combining touch detection in the thermal image with detection and tracking of the touched objects in the visible image would enable a natural way to interact with those objects and the digital information associated with them, particularly on wearable headsets.
As technology users, we've been conditioned to expect touch; it's now the default user interface for most devices. Wearable device makers have proposed multiple interface solutions: voice navigation, depth tracking for finger detection, companion devices, and even things as novel as shoulder-mounted projectors. Though these options are a great start, we've found many of them lacking, or even frustrating, for the average user.
But what if we could turn any surface into a touchscreen? That is the challenge we set ourselves after discovering the potential of marrying thermal imaging with traditional computer vision algorithms.
Our mobile prototype runs on a Lenovo ThinkPad tablet, to which we attached a combined thermal and visible light camera module. The fixture is simply a joist hanger I purchased at a local hardware store.
Our proof-of-concept software is based on the Metaio augmented reality SDK, which provides tracking capabilities for both planar and three-dimensional objects. It also provides the ability to render virtual objects on top of tracked objects, which we need in order to associate virtual information with a physical surface.
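To make that association concrete, a touch detected in the thermal image has to be mapped into the coordinate frame of the tracked surface. Here is a minimal sketch in Python with OpenCV, assuming a planar target, a one-time calibration homography between the thermal and visible cameras, and a homography from the visible image to the target plane supplied by the tracker; the names and signatures are illustrative, not part of the Metaio API:

```python
import cv2
import numpy as np

def touch_to_surface(touch_px, H_thermal_to_visible, H_visible_to_surface):
    """Map one touch point from thermal-image pixels to tracked-surface coordinates.

    H_thermal_to_visible: 3x3 homography from a one-time two-camera calibration
    (a single fixed homography suffices only because the touched surface is planar).
    H_visible_to_surface: 3x3 homography derived from the tracker's current pose
    estimate of the planar target. Both are assumptions of this sketch.
    """
    pt = np.array([[touch_px]], dtype=np.float32)            # shape (1, 1, 2)
    pt = cv2.perspectiveTransform(pt, H_thermal_to_visible)  # thermal -> visible pixels
    pt = cv2.perspectiveTransform(pt, H_visible_to_surface)  # visible -> surface coords
    return tuple(pt[0, 0])
```

Once the touch point is expressed in the surface's own coordinates, hit-testing it against virtual buttons laid out on that surface becomes an ordinary 2D lookup.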
Thermal imaging is, of course, not yet an inherent function of the SDK, so we had to extend it to support capturing images from the thermographic camera. The one thing still missing was a touch detection algorithm, so we developed one.
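To give a flavor of what such a detector involves, here is a deliberately simplified sketch in Python with OpenCV: it models the scene's baseline temperature with a slowly adapting background and reports small blobs that are suddenly warmer than that background. Every threshold here is a placeholder assumption, and this is a sketch of the idea rather than the algorithm as implemented; a practical detector must additionally distinguish the finger itself from the residual spot it leaves behind.

```python
import cv2
import numpy as np

class ThermalTouchDetector:
    """Finds residual warm spots (candidate touches) in a stream of thermal frames.

    A simplified illustration of the idea; all thresholds are untuned placeholders.
    """

    def __init__(self, alpha=0.05, delta_t=1.5, min_area=10, max_area=400):
        self.alpha = alpha        # background adaptation rate per frame
        self.delta_t = delta_t    # minimum rise over background, in sensor units
        self.min_area = min_area  # plausible fingertip-spot area range, in pixels
        self.max_area = max_area
        self.background = None    # slowly adapting per-pixel temperature model

    def update(self, thermal_frame):
        """Feed one thermal frame; return (x, y) centroids of candidate touches."""
        frame = thermal_frame.astype(np.float32)
        if self.background is None:
            self.background = frame.copy()
            return []
        # Pixels significantly hotter than the adapting background are candidates.
        mask = ((frame - self.background) > self.delta_t).astype(np.uint8) * 255
        # Fold slow ambient drift into the background so only fresh heat stands out.
        cv2.accumulateWeighted(frame, self.background, self.alpha)
        # Suppress single-pixel sensor noise, then extract blob centroids.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)  # OpenCV >= 4
        touches = []
        for c in contours:
            if self.min_area <= cv2.contourArea(c) <= self.max_area:
                m = cv2.moments(c)
                touches.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return touches
```

A convenient property of the residual-heat cue is that it appears only on actual contact, so a touch can be distinguished from a finger merely hovering over the surface.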
Next, we searched for ideal use cases and applications that we could easily demonstrate. After some successful field testing, we decided to gather both conceptual and actual uses of this technology into the short video below.
The prototype is both heavy and expensive. This is not a consumer technology — yet. But this new way of interacting with augmented reality is clearly meant for wearable computers and head-mounted displays.
We believe that ubiquitous adoption of wearable computing is a matter of when rather than if, but the question of the user interface remains open. This “thermal touch” interface, made more robust, would be an ideal way to bridge the transition from touch devices to ones that leave users' hands free.
We will continue to improve our prototype's robustness and latency, and we are looking into how this fundamental approach can enable more advanced interaction techniques. For example, touching an object with different fingers might trigger different effects.
It may be some time until the first head-mounted devices ship with embedded thermographic cameras, but these cameras will become available in a small form factor and at an affordable price. A mobile phone add-on enabling thermal imaging will become available this year, and this is only the beginning.
Though it may be years away, embedding infrared cameras into wearable computers is not beyond the realm of possibility, especially in an industry that is still iterating on form factors and hardware, let alone the ideal graphical user interface.
This article was originally published on EBN's sister publication EE Times.