getUserMedia() part 3: simple motion detection in a live video

Now that you know how to build a live green screen and an EyeToy-like mini-game using nothing but plain JavaScript and a modern WebRTC-capable browser, let us move on to another interesting example: simple motion detection in a live video.

The initialization code

To detect motion in a video we need to compare at least two frames. We will use typed arrays to store the lightness data of the previous frames:

function initialize() {
  // ... code to initialize the canvas and video elements ...

  // Prepare buffers to store lightness data.
  for (var i = 0; i < 2; i++) {
    buffers.push(new Uint8Array(width * height));
  }

  // Get the webcam's stream.
  navigator.getUserMedia({video: true}, startStream, function () {});
}

We want two frame buffers - a single one results in a heavily flickering motion video, while the more frames we store, the more motion blur we will see. Two seems like a good value for demonstration purposes.

Illustrating lightness changes

The main draw() function from part 1 did not change except that we now call markLightnessChanges() for every frame. This is also probably the most interesting function of the whole demo:

function markLightnessChanges(data) {
  // Pick the next buffer (round-robin).
  var buffer = buffers[bufidx++ % buffers.length];

  for (var i = 0, j = 0; i < buffer.length; i++, j += 4) {
    // Determine lightness value.
    var current = lightnessValue(data[j], data[j + 1], data[j + 2]);

    // Set color to black.
    data[j] = data[j + 1] = data[j + 2] = 0;

    // Full opacity for changes.
    data[j + 3] = 255 * lightnessHasChanged(i, current);

    // Store current lightness value.
    buffer[i] = current;
  }
}
We determine the lightness value of every pixel in the canvas and compare it to its values in the previously captured frames. If the difference to one of those buffers exceeds a specific threshold, the pixel will be black; if not, it becomes transparent.

function lightnessHasChanged(index, value) {
  return buffers.some(function (buffer) {
    return Math.abs(value - buffer[index]) >= 15;
  });
}

Blend mode difference

The simple method we use to detect motion is called a blend mode difference. That is a fancy way of saying: we compare two images (also called layers or frames) by putting them on top of each other and subtracting the bottom from the top layer. In this example we do it for every pixel’s L-value of the HSL color model.

function lightnessValue(r, g, b) {
  return (Math.min(r, g, b) + Math.max(r, g, b)) / 255 * 50;
}

If the current frame is identical to the previous one, the lightness difference will be exactly zero for all pixels. If the frames differ because something in the picture has moved, then there is a good chance that lightness values change where the motion occurred. A small threshold ensures that we ignore noise in the signal.
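To make the idea concrete, here is a small standalone sketch of a blend mode difference applied to two lightness buffers. The function name and the sample buffers are illustrative only, they are not part of the demo code:

```javascript
// Compare two lightness buffers and build a motion mask:
// 255 where the lightness changed by at least the threshold, 0 elsewhere.
function blendDifference(bottom, top, threshold) {
  var mask = new Uint8Array(bottom.length);
  for (var i = 0; i < bottom.length; i++) {
    mask[i] = Math.abs(top[i] - bottom[i]) >= threshold ? 255 : 0;
  }
  return mask;
}

// Identical pixels yield zero; a sufficiently changed pixel shows up as 255.
var prev = new Uint8Array([10, 40, 90]);
var curr = new Uint8Array([12, 70, 90]);
blendDifference(prev, curr, 15); // only the middle pixel counts as motion
```

The first and last pixels changed by less than the threshold (or not at all), so only the middle one survives as motion - exactly the filtering behavior described above.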

Demo and screencast

That is all! Take a look at the live demo or watch the screencast below:

You can create some really great demos with this simple technique. Here is a neat one of a xylophone you can play by waving your hands (which unfortunately does not work in Firefox).

Whatever your ideas may be, I encourage you to fiddle around with the small demos I provided in my three getUserMedia() examples so far and let me know if you built something amazing!

Note to myself: Don’t be lazy

Back in October 2012 I wrote two blog posts, getUserMedia part 1 and part 2, including demos which unfortunately ran in Firefox only. I did not explicitly want to be exclusive, but I think I just did not feel like looking up why my code did not work in Opera, or why exactly webkitGetUserMedia() behaved differently from mozGetUserMedia(). I was being lazy.

I also intended to mix in a couple of nice JavaScript features, like block-scoped variable definitions with let, destructuring assignments, or Sets (did I just do it again?). In hindsight this does not really make sense, as I should not expect visitors of a getUserMedia() post to want to learn about cutting-edge JavaScript features at the same time.

Before finishing my third piece on getUserMedia() I decided to update the demos of my older posts to run in any modern browser. I also seized the chance to overhaul code examples which did not adhere to my coding standards anymore.

If you should ever be in a similar situation - please take a couple of minutes to write code that runs in all modern browsers so people can enjoy your demos in their browser of choice. Please don’t be lazy.

getUserMedia() part 2: building an EyeToy-like mini-game

This post is a follow-up to my previous one about building a live green screen with getUserMedia() and MediaStreams. If you have not read it yet, this might be a good time. We will extend the small example to build an EyeToy-like mini-game.

Some additions

var video, width, height, context;
var revealed = Object.create(null);

function initialize() {
  // ... same initialization code as in part 1 ...
}

First, we will add a variable called revealed that keeps track of all pixels that have already been revealed by holding a green object in front of the camera. Instead of replaceGreen() we will call our method revealGreen() from now on:

function revealGreen(data) {
  var len = width * height;

  for (var i = 0, j = 0; i < len; i++, j += 4) {
    // This pixel has already been revealed.
    if (i in revealed) {
      data[j + 3] = 0;
      continue;
    }

When iterating over all of the canvas’ pixels we check whether the current index is marked as revealed. If so we do not need to check its color but set its opacity to zero and continue with the next iteration.

    // Convert from RGB to HSL...
    var hsl = rgb2hsl(data[j], data[j + 1], data[j + 2]);
    var h = hsl[0], s = hsl[1], l = hsl[2];

    // ... and check if we have a somewhat green pixel.
    if (h >= 90 && h <= 160 && s >= 25 && s <= 90 && l >= 20 && l <= 75) {
      data[j + 3] = 0;
      revealed[i] = true;
    }
  }
}

If the pixel has not been revealed yet but is a green one, we make it transparent like before and mark it to stay that way.
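The snippets rely on a helper called rgb2hsl() that is not shown in these posts. A standard RGB-to-HSL conversion matching the ranges checked above (hue in degrees 0-360, saturation and lightness in percent 0-100) could look like this:

```javascript
// Convert RGB (0-255 per channel) to HSL with h in degrees
// and s/l in percent, matching the green check above.
function rgb2hsl(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  var max = Math.max(r, g, b), min = Math.min(r, g, b);
  var h = 0, s = 0, l = (max + min) / 2;

  if (max !== min) {
    var d = max - min;
    s = l > 0.5 ? d / (2 - max - min) : d / (max + min);

    switch (max) {
      case r: h = (g - b) / d + (g < b ? 6 : 0); break;
      case g: h = (b - r) / d + 2; break;
      case b: h = (r - g) / d + 4; break;
    }

    h /= 6;
  }

  return [h * 360, s * 100, l * 100];
}
```

Pure green, rgb2hsl(0, 255, 0), comes out as roughly [120, 100, 50], which sits comfortably inside the h >= 90 && h <= 160 window used by the green check.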

Demo and screencast

That is all! Take a look at the live demo or watch the screencast below:

I know…

… this is not much of a game but rather a small demo one could turn into a mini-game with little effort. Play around with the code and see what you can come up with!

Building a live green screen with getUserMedia() and MediaStreams

While recently watching a talk about the new WebRTC features I was reminded of Paul Rouget’s great green screen demo and thought that this would be a cool thing to have for live video as well. Let us build a live green screen!

The markup

  <video id="v" width="320" height="240"></video>
  <canvas id="c" width="320" height="240"></canvas>

Those are the parts we need. A <video> element that plays the media stream and a canvas we will use to read and transform image data.

The JavaScript

function initialize() {
  // Get the webcam's stream.
  navigator.getUserMedia({video: true}, startStream, function () {});
}

function startStream(stream) {
  video.src = URL.createObjectURL(stream);
  video.play();

  // Ready! Let's start drawing.
  requestAnimationFrame(draw);
}
We call navigator.getUserMedia() and pass {video: true} as the first argument which indicates that we want to receive a video stream. We assign the MediaStream to the video’s .src property to connect it to the <video> element.

The video starts playing (which means the camera will be activated and you will see your webcam’s live video) and we request an animation frame using the requestAnimationFrame() API. This is perfect for drawing to our canvas as the browser schedules the next repaint and we will be called immediately before that happens. Now for the last and most important part of our green screen:

function draw() {
  var frame = readFrame();

  if (frame) {
    replaceGreen(frame.data);
    context.putImageData(frame, 0, 0);
  }

  // Wait for the next frame.
  requestAnimationFrame(draw);
}

function replaceGreen(data) {
  var len = data.length;

  for (var i = 0, j = 0; j < len; i++, j += 4) {
    // Convert from RGB to HSL...
    var hsl = rgb2hsl(data[j], data[j + 1], data[j + 2]);
    var h = hsl[0], s = hsl[1], l = hsl[2];

    // ... and check if we have a somewhat green pixel.
    if (h >= 90 && h <= 160 && s >= 25 && s <= 90 && l >= 20 && l <= 75) {
      data[j + 3] = 0;
    }
  }
}

What happens here is actually quite simple: we read the current video frame and extract its image data. We then iterate over all pixels in the frame and check whether each one is green - if so, its opacity byte is set to zero, which means fully transparent. The manipulated image data is put back into the canvas and we are done until the next animation frame is ready.
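The readFrame() helper is referenced above but not shown. A minimal version, assuming the video, context, width and height variables set up in initialize(), could look like this sketch:

```javascript
// Sketch of readFrame(): draw the current video frame onto the
// canvas and hand back its pixel data for manipulation.
// Assumes video, context, width and height from initialize().
function readFrame() {
  try {
    context.drawImage(video, 0, 0, width, height);
  } catch (e) {
    // The video element is not ready yet.
    return null;
  }

  return context.getImageData(0, 0, width, height);
}
```

Returning null while the camera is still warming up is what makes the if (frame) guard in draw() necessary.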

The demo

Take a look at the live demo, you will need a recent Firefox/Chrome/Opera build. Make sure that getUserMedia() support is enabled in your browser of choice. Hold a green object in front of the camera and try it out yourself. Your camera and light setup is probably very different from mine, so you might need to adjust the color check a little to make it work. Alternatively, here is a screencast of the demo:

The end

This is an admittedly very simple example of a green screen but you can use this little template to manipulate your webcam’s live video stream and build all kinds of fancy demos with it.

Force Octopress/Jekyll to use a specific time zone

I could not be happier since I switched from WordPress to Octopress. I usually write and publish blog posts from where I live, Berlin. The time zone here is CET (UTC+1). While recently visiting Mozilla’s HQ in Mountain View I wrote another blog post just as usual and typed “rake generate” to turn my Markdown files into static HTML files.

Looking at the output, though, left me a little puzzled. All timestamps had been calculated off the PDT time zone. While that is certainly not a big deal - they still represent the same points in time - I did not feel like adjusting all of them every time I happen to be in a different time zone.

If you want to use a “static” time zone when generating your page, do it like this:

TZ=CET rake generate

TL;DR - put your time zone into the TZ environment variable if you want to force Jekyll to use a specific time zone when generating your HTML files.
