Wearables Get a New Interface

It was a happy accident that we got our hands on a thermographic camera and played around with it in the lab. Our R&D team had already been tasked with developing natural and intuitive ways of interacting with augmented reality applications when using head-mounted displays, but this first foray was really just playing around with new toys.

After measuring the temperature of my coffee mug and my display, and discovering interesting temperature patterns in my face, I noticed that, wherever I left my hand resting on the desk, residual heat would become apparent in the thermal image. Brief experiments with different objects showed that this was not a unique property of my desk; most objects exhibit warm spots after being touched.

The camera module also included a visible light camera, which allows recognizing and tracking objects in its field of view. The idea arose that combining touch detection in the thermal image with detection and tracking of the touched objects in the visible image would enable a natural way to interact with those objects and the digital information associated with them, particularly for wearable headsets.

We've been conditioned as technology users to look for touch — it's really the default user interface for most technology now. Wearable device makers have proposed multiple interface solutions: voice navigation, depth tracking for finger detection, companion devices, and even things as novel as shoulder-mounted projectors. Though these options are a great start, we've found many of them lacking or even frustrating for the average user.

But what if we could turn any surface into a touchscreen? This was the idea we tasked ourselves with after discovering the potential gain in marrying thermal imaging with traditional computer vision algorithms.

Our mobile prototype runs on a Lenovo ThinkPad tablet, to which we attached a combined thermal and visible light camera module. The fixture is simply a joist hanger I purchased at a local hardware store.

Our proof-of-concept software implementation is based on the Metaio augmented reality SDK, which provides tracking capabilities for both planar and three-dimensional objects. It also provides the ability to render virtual objects on top of tracked objects, which we need in order to associate virtual information with a physical surface.

The Thermal Touch prototype is heavy and expensive but points a way for future hands-free wearables.

Thermal imaging is, of course, not yet a native capability of our software, so we had to extend the Metaio SDK to support capturing images from the thermographic camera. The only piece still missing was a touch detection algorithm, so we developed one.
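The article doesn't publish the detector itself, but the idea it describes can be sketched: compare the current thermal frame against a reference frame, threshold the temperature difference, and keep only connected warm blobs of fingertip-like size. Everything below (function name, thresholds, blob-size limits) is our own assumption for illustration, not Metaio's actual algorithm:

```python
# Hypothetical sketch of thermal touch detection. Frames are 2-D lists of
# temperatures in degrees Celsius; thresholds are illustrative guesses.

def detect_touches(reference, current, delta=1.5, min_px=2, max_px=12):
    """Return (row, col) centroids of fingertip-sized residual-heat spots."""
    rows, cols = len(reference), len(reference[0])
    # 1. Pixels noticeably warmer than the reference frame are candidates.
    hot = {(r, c)
           for r in range(rows) for c in range(cols)
           if current[r][c] - reference[r][c] >= delta}
    # 2. Group candidates into 4-connected blobs via flood fill.
    touches = []
    while hot:
        stack = [hot.pop()]
        blob = []
        while stack:
            r, c = stack.pop()
            blob.append((r, c))
            for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if nb in hot:
                    hot.remove(nb)
                    stack.append(nb)
        # 3. Reject blobs too small (sensor noise) or too large (a resting palm).
        if min_px <= len(blob) <= max_px:
            cy = sum(r for r, _ in blob) / len(blob)
            cx = sum(c for _, c in blob) / len(blob)
            touches.append((cy, cx))
    return touches
```

A detected centroid in the thermal image would then be mapped into the visible camera's coordinates, where the SDK's object tracking determines which tracked surface was touched.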

Next we searched for ideal use cases and applications that we could easily demonstrate. After success with some field testing, we decided to gather both conceptual and actual uses of this technology into the short video below.

The prototype is both heavy and expensive. This is not a consumer technology — yet. But this new way of interacting with augmented reality is clearly meant for wearable computers and head-mounted displays.

We believe that ubiquitous adoption of wearable computing is a matter of when rather than if, but the question of the user interface remains. This “thermal touch” interface, made more robust, would be an ideal way to transition between a touch device and one that leaves users' hands free.

We will continue to improve our prototype in terms of robustness and latency, and we are looking into how this fundamental approach can allow more advanced interaction techniques. For example, touching an object with different fingers might have different effects.

It may be some time until the first head-mounted devices ship with embedded thermographic cameras, but these cameras will become available in a small form factor and at an affordable price. A mobile phone add-on enabling mobile thermal imaging will become available this year, and this is only the beginning.

Though it may be years away, embedding infrared cameras into wearable computing is not beyond the realm of possibility, especially in an industry that is still iterating on form factors and hardware, let alone the ideal graphical user interface.

This article was originally published on EBN's sister publication EE Times.

13 comments on “Wearables Get a New Interface”

  1. prabhakar_deosthali
    June 25, 2014

    This seems to be something really innovative. By using thermal imaging to sense human touch, one can do away with touch-sensing panels, which are not practical in many applications.

    The thermal imaging technique can be universally applied to all kinds of objects, irrespective of their composition, surface smoothness, and so on.

    I congratulate the research team for thinking something outside the box!

  2. SunitaT
    June 25, 2014

    @Daniel, thanks for the post. I just watched the video and I really liked the concept of using the heat generated by touch to tell whether an object's surface has been touched. I am curious to know whether the same concept works for hot objects where the surface temperature is already high?

  3. SunitaT
    June 25, 2014

    The thermal imaging technique can be universally applied to all kinds of objects, irrespective of their composition, surface smoothness, and so on.

    @prabhakar, I agree with you that the thermal imaging technique can be universally applied to all kinds of objects, but what if the temperature of the surface is varying? Will it identify any change in temperature as a touch?

  4. Daniel Kurz
    June 25, 2014

    Thanks a lot. I'm glad you like our concept!

  5. Daniel Kurz
    June 25, 2014

    Tirlapur, that's a very good question. Our current prototype assumes that the temperature of the surface is lower than the temperature of the fingertip.

    Conceptually, however, it's only important that there is a difference in temperature between the finger and the surface — it doesn't really matter which one is warmer than the other. In fact there are also some people whose fingertips are cold, and they leave a cold trace after touching surfaces at room temperature (~25 degrees Celsius).

    We also have constraints on the shape and physical size of residual heat to be classified as resulting from a touch by a fingertip. But of course false positive detections still occur at this early prototype stage. More details on the underlying algorithm can also be found in our research paper, which will appear in the proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR) in September.

    We are working hard on making our approach deal with as many situations as possible in the future.
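As an illustration of the sign-agnostic test Daniel describes, the per-pixel candidate check would compare the absolute temperature difference against a threshold, so a cold fingertip on a warm surface is detected the same way as a warm one (the function name and threshold below are hypothetical):

```python
# Hypothetical per-pixel candidate test: using the absolute difference makes
# detection work whether the fingertip is warmer or colder than the surface.
def is_candidate(ref_temp, cur_temp, delta=1.5):
    return abs(cur_temp - ref_temp) >= delta
```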

  6. Himanshugupta
    June 25, 2014

    I think that infrared imaging has big limitations: false identification, background noise, saturation, delta-temperature sensitivity, and so on. But such a camera can be a good add-on when most other sensory inputs fail. I think there are some research groups working on superconducting detectors that can detect a single photon, so basically IR cameras with super-high sensitivity. These devices will surely push the applications to new boundaries.

  7. Taimoor Zubar
    June 25, 2014

    @Daniel: An interesting invention, I must say. I think this will be particularly useful in designing touch-based surfaces that are non-flat. Currently, touch screens are only available in a flat format, and that puts up a big restriction. This technology may easily be used in designing touch-based non-flat surfaces like a mug or a pen.

  8. Taimoor Zubar
    June 25, 2014

    “In fact there are also some people whose fingertips are cold, and they leave a cold trace after touching surfaces at room temperature.”

    @Daniel: It might sound like a very naive question, but don't the fingerprints left behind create a problem for the sensor? Do you have to clean them up after every use?

  9. Hailey Lynne McKeefry
    June 26, 2014

    @Daniel, thanks so much for explaining that. It's so interesting, and so much more complex than I ever imagined. The devil is always in the details, isn't it?

  10. t.alex
    June 26, 2014

    Pretty interesting idea! Is it possible to recognize the object without even touching it?

  11. Taimoor Zubar
    June 28, 2014

    “Pretty interesting idea! Is it possible to recognize the object without even touching it?”

    @t.alex: Yes, I have seen touch technologies where the surface of the screen does not need to be touched. Instead, the sensor is able to detect the finger as soon as it moves close to it. However, when you're talking about wearable technologies, that doesn't apply. The object has to be part of the natural environment and hence needs to be “touched” like any other normal daily-life object.

  12. t.alex
    July 2, 2014

    TaimoorZ, the Samsung tablet can recognize the stylus when you just start moving it nearer to the screen, and it shows a small dot as a guide to where the stylus is pointing.

  13. Wale Bakare
    July 20, 2014

    >>This technology may easily be used in designing touch-based non-flat surfaces like a mug or a pen<<

    You are right. It would be really nice to have this technology in more application areas, and in other temperature-measuring devices.
