This project is solving the We Love Data challenge.

Use the Leap Motion controller as an age- and background-independent, human-friendly, natural way of interacting with spatial and space-related data and/or applications. In practical terms, given the time limit, this would be a custom-built version of an open-source app (such as Stellarium or Celestia) patched with custom interaction patterns (zoom, pan, rotate, etc.).
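The gesture-to-interaction mapping could be sketched roughly as below. This is an illustrative example only, not the actual Leap SDK or Stellarium API: it classifies a palm-position delta (in millimetres, as the Leap reports positions) into one of the interaction patterns mentioned above. The function name, dead-zone threshold, and axis conventions are all assumptions for the sketch.

```cpp
#include <cmath>
#include <string>

// Hypothetical classifier: map a palm-position delta (mm) between two
// tracking frames to a high-level interaction. Real integration would
// feed these into Stellarium's view/zoom controls instead of strings.
std::string classifyGesture(double dx, double dy, double dz) {
    const double kDeadZone = 10.0;  // assumed jitter threshold in mm
    double ax = std::fabs(dx), ay = std::fabs(dy), az = std::fabs(dz);
    if (ax < kDeadZone && ay < kDeadZone && az < kDeadZone)
        return "idle";              // hand is (nearly) still
    if (az >= ax && az >= ay)       // dominant motion toward/away from screen
        return dz < 0 ? "zoom-in" : "zoom-out";
    if (ax >= ay)                   // dominant horizontal motion
        return "pan-horizontal";
    return "pan-vertical";
}
```

A dead zone plus a dominant-axis test is a common way to keep hand jitter from firing spurious commands; the real prototype would tune these thresholds against live Leap tracking data.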

Why is this under "We Love Data"? Just look at the challenge description: "Rethink how people interact with space data in new and meaningful ways." Human hands provide bandwidth and interaction richness far beyond keyboards and mice, are always at hand (pun intended), and, best of all, we have a dedicated brain region for controlling them! And yet most current interaction can be described as "use a simple input device to show pictures". That is what this project wants to change, by demoing a prototype interaction via the Leap.

Simple English version: navigate time, space, and data in general with a simple hand wave or finger point.

Project Information

License: LGPL
Source Code/Project URL:


Leap Motion development info -
Base Stellarium sources -
Qt Creator/libs (dev environment) -