Watch on YouTube: youtu.be/acOSi8GVyz0

About two weeks ago Aaron Frost presented Web Gestures With getUserMedia: Part 1 (which was inspired by Tim Taubert's work) at UtahJS.

I was so inspired that I wanted to try to figure out unassisted (meaning: freehand, no tracker images or "green screen") motion tracking that night. However, it wasn't until yesterday at the Salt Lake Open Space Conference (@SLOpenSpace) that I actually started the implementation (during a live code session). In under an hour we were able to fork and review Aaron's code and get a proper demo up. I spent several hours last night trying to refine and smooth the tracking, but it hasn't been an easy task.

Demo

Well, without further ado, here's my demo page: http://coolaj86.github.com/getusermedia-gestures-preso.

Github

And here's my fork of Aaron's work: https://github.com/coolaj86/getusermedia-gestures-preso.

How it works

Aaron started off with the "green screen" technique. He loops through the imageData array and maps any value above the green threshold to a second 2d array with only the values 0 (not enough green) and 1 (enough green).
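That thresholding step might be sketched like this (the function name, threshold, and green-dominance check are my own illustration, not Aaron's exact code):

```javascript
// Map flat RGBA pixel data to a 0/1 array: 1 = "green enough", 0 = not.
// `imageData` mimics the object returned by ctx.getImageData(...):
// `data` is a flat array with four bytes (R, G, B, A) per pixel.
function greenMap(imageData, threshold) {
  var map = [];
  var data = imageData.data;
  for (var i = 0; i < data.length; i += 4) {
    var r = data[i];
    var g = data[i + 1];
    var b = data[i + 2];
    // Count a pixel as green when the green channel exceeds the
    // threshold and dominates both red and blue.
    map.push((g > threshold && g > r && g > b) ? 1 : 0);
  }
  return map;
}
```

The resulting map has one entry per pixel, which makes the scoring pass below a simple array walk.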

He then iterates through the map and scores each pixel by adding up the values of that pixel's neighbors up to 5 pixels away in each direction (left, right, up, down). Thus a greenish pixel that has greenish neighbors will now have a score between 2 and 21 (the pixel itself plus up to 4 × 5 neighbors).
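A sketch of that scoring pass (my own reconstruction from the description above, not the repo's code):

```javascript
// Score each cell of a 0/1 map by summing the cell itself plus its
// neighbors up to `reach` cells away left, right, up, and down.
// With reach = 5 the maximum possible score is 4 * 5 + 1 = 21.
function scoreMap(map, width, height, reach) {
  var scores = [];
  for (var y = 0; y < height; y += 1) {
    for (var x = 0; x < width; x += 1) {
      var score = map[y * width + x];
      for (var d = 1; d <= reach; d += 1) {
        // Skip neighbors that fall outside the frame.
        if (x - d >= 0)     { score += map[y * width + (x - d)]; }
        if (x + d < width)  { score += map[y * width + (x + d)]; }
        if (y - d >= 0)     { score += map[(y - d) * width + x]; }
        if (y + d < height) { score += map[(y + d) * width + x]; }
      }
      scores.push(score);
    }
  }
  return scores;
}
```

An isolated noisy pixel stays near 1 while a pixel inside a solid green blob climbs toward 21, so a simple cutoff on the score filters out speckle.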

For my initial approach I just created a second array to store the previous frame's values and changed the green detection to difference detection.

function draw() {
  // Keep the previous frame so we can diff against it.
  oldPixels = newPixels;
  newPixels = ctx.getImageData(0, 0, vidWidth, vidHeight);

  // ... compare oldPixels to newPixels instead of checking for green ...

}
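The comparison itself can be sketched like this (the function name and threshold are mine; my repo may slice the data differently):

```javascript
// Mark pixels whose brightness changed noticeably between two frames.
// `oldData` and `newData` are flat RGBA arrays of equal length, as
// found in the .data property of getImageData() results.
function diffMap(oldData, newData, threshold) {
  var map = [];
  for (var i = 0; i < newData.length; i += 4) {
    // Compare rough brightness (sum of R, G, B) across frames;
    // the alpha byte is ignored.
    var oldSum = oldData[i] + oldData[i + 1] + oldData[i + 2];
    var newSum = newData[i] + newData[i + 1] + newData[i + 2];
    map.push(Math.abs(newSum - oldSum) > threshold ? 1 : 0);
  }
  return map;
}
```

The output is the same kind of 0/1 map the green-screen version produces, so the existing neighbor-scoring pass works on it unchanged.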

[Not done yet]: I'll update this more within the next day or two


By AJ ONeal

If you loved this and want more like it, sign up!


Did I make your day?
Buy me a coffee

(you can learn about the bigger picture I'm working towards on my Patreon page)