Leap Motion Prototype: Leap Designer

Leap Motion released version 2 of its devkit. I decided to give it a try by writing a 3D system design application. Mostly a pretext to play with virtual objects, to be honest.

Quick intro: Leap Motion is a 3D tracking device that allows you to interact with a computer by waving your hands around.

Three infrared LEDs light up the space above the device, and two cameras record what is happening. From these two streams, the driver reconstructs your hands in space. Here's the trailer for the second version of the driver:

Remember that this is marketing; they did the same for the first version, and that was far from reality. But the data produced by the new SDK is much, much more robust than before. They are (I guess) using a realistic hand model to constrain the 3D reconstruction algorithms.

This "Skeletal Tracking" adds actions such as pinching something or closing your hand. As a developer you don't need to reinvent detectors. As a user you get a consistent detection among applications, good news.

Try or watch the demos here; it's worth a look even without a Leap Motion.

Leap Designer

Now the API promises more than a few "poke the screen and raise your hand"
interactions. Let's start envisioning the ability to design and manipulate
systems without any traditional interface. Elon Musk already tried and, well, Iron Man...

Iron Man Holograms

I'm not sure why, but I also remember the scene in Swordfish where Hugh Jackman designs a virus by jumping around his computer.
He finally gets some cubes to align around another cube, and the virus is complete.

Physical analogies look easy when it comes to interactions.

  • What if you could design and watch over thousands of components as a physical system?
  • How different would it be from fixing XML files and checking logs? Would it be more intuitive?

Back to reality: the following is a base project for a 3D design tool. It's a pretext to play with the new SDK and see if my Leap Motion can turn into something useful.

Demo

Statistically speaking, you probably don't own a Leap Motion device. Here's a GIF recording of what it looks like:

Leap Designer Demo

You can only pinch and move cubes around, but this is an honest capture of how a 3D world and a Leap Motion interact. Sometimes it works well, sometimes it's glitchy as hell. My main complaint is that the Leap Motion API doesn't protect you enough against the noise.

Leap Motion is getting better: I gave up when "noisy detection" was its main feature. Now noise seems like a constraint you can work around.

Usability

Interacting with a 3D world on a screen is hard!
The absence of depth perception makes a subtle difference: it's hard to tell what is above, below, or behind.

Escher Ascending Descending

Escher hacked this subtle difference in his drawings, and ustwo made a (beautiful) game out of it.

Mario 64 nailed the transition to 3D using unrealistic shadows:

On the TV screen, objects don't have the same kind of physicality, that's what makes it difficult to make people grasp the physicality and depth.
One solution is adding shadow. [...] Every floating object would have a reference point on the ground [...] It might not be realistic, but it's much easier to play with the shadow directly below.
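
The same trick works in a Three.js scene. Here's a minimal sketch (not from the project; scene and cube are assumptions made for this example): keep a flat dark disc on the ground directly below each floating object and move it along with it.

// Fake "drop shadow": a dark disc kept directly below a floating cube.
// `scene` and `cube` are assumptions made for this sketch.
var shadow = new THREE.Mesh(
    new THREE.CircleGeometry(0.6, 32),
    new THREE.MeshBasicMaterial({ color: 0x000000, transparent: true, opacity: 0.3 })
);
shadow.rotation.x = -Math.PI / 2;   // lay the disc flat on the ground plane
scene.add(shadow);

function updateShadow() {
    // Follow the cube on the ground, ignoring its height
    shadow.position.set(cube.position.x, 0.01, cube.position.z);
}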

Virtual worlds also need a lot of feedback. There's no physical feedback when you touch "something" virtual... so these "somethings" need to light up, move, ring... react in some explicit way.
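
As an illustration (a hypothetical sketch, not the project's code): make a cube light up while it's being pinched, so the user knows the grab registered.

// Visual feedback in place of physical feedback. Assumes `cube` uses a
// material with an emissive color (e.g. THREE.MeshPhongMaterial) and that
// `pinched` is set by the hand callback.
function updateFeedback(pinched) {
    cube.material.emissive.setHex(pinched ? 0x444400 : 0x000000);
}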

Technicalities

Leap Motion API

Some details about how to use the driver (still in beta!):

The Leap Motion library provides callbacks called on each frame: either browser rendering frames (60 per second) or Leap Motion frames (between 20 and 200 per second).

Leap.loop({/* options */}, {
    hand: function(hand) {
        // Called for each hand, at each frame
    },
    frame: function(frame) {
        // Called once per frame,
        // I'd use it to detect disappearing hands
    }
}).use('riggedHand', {
    scale: 1
    // Add a renderer here, or use the default one
});

The variable framerate depends on the context: machine load, what's being detected, and the settings.
For a complex application it may be worth looking at its impact on detection: you want to be more conservative when the device gets slow and imprecise.
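
For instance (a sketch under my own assumptions; the 40 fps threshold and the freezeManipulation helper are made up), the frame callback can read frame.currentFrameRate and back off when the device slows down:

// Back off when the device slows down. frame.currentFrameRate comes from
// the Leap API; the 40 fps threshold and `freezeManipulation` are arbitrary
// choices for this sketch.
Leap.loop({/* options */}, {
    frame: function(frame) {
        var degraded = frame.currentFrameRate < 40;
        freezeManipulation(degraded);  // e.g. stop moving objects, keep rendering hands
    }
});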

Each hand object contains the finger coordinates and some other values shown on the features page.
The second part, .use, loads the rigged hand plugin, which renders hand meshes in a Three.js scene.
The plugin also provides tools to translate from Leap Motion space to Three.js coordinates.
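
To give an idea of what the hand object looks like in practice, here's a hypothetical sketch of the pinch-and-move interaction from the demo. hand.pinchStrength and tipPosition come from the API; cube and leapToScene (a millimetres-to-scene conversion) are assumptions for this example.

// Hypothetical sketch of "pinch and move a cube".
// hand.pinchStrength is 0..1, tipPosition is [x, y, z] in millimetres.
Leap.loop({/* options */}, {
    hand: function(hand) {
        if (hand.pinchStrength > 0.8) {                // a firm pinch
            var tip = hand.fingers[1].tipPosition;     // index finger tip (thumb is 0)
            cube.position.copy(leapToScene(tip));      // assumed conversion helper
        }
    }
});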

That's all you need to start writing a Leap Motion app. All the details about the Three.js integration are in the GitHub project linked below.

Finally

If you look at the store, it's all regular applications controlled with a finger rather than a mouse. That's both imprecise and exhausting for an $80 input device.
This new SDK is a step towards doing something unique with the Leap Motion.

If you have one hidden somewhere in a dusty box, give it another try. The project is on GitHub.

If you're just curious, there's an unofficial site that lists cool Leap Motion projects.