Monday, 27 June 2016

Live imaging with Google cardboard

3D viewing of live data - or perhaps timelapse video - sounded fun, and I especially wanted to have a go at cloud timelapse with extra-large camera spacing, since I have a great view of the sky to the north from our house.

While there is lots of content out there, and apps that do fake 3D from single photos (with much the same results as fake stereo from mono in the 1980s), there didn't seem to be much info about rolling your own. Well, after a few hours of messing about and a lot of lessons learned, things are looking pretty good.

After a bit of judicious makery, here is a snapshot of the web page that displays live images from 2 webcams. This isn't proper video: it is 1 frame per second served up by motion and stuck into a single web page using some simple HTML that allows some tweaking. The reasons for the upside-down display and for the HTML are all explained below. The webcams here are 2.3 m apart.

Displayed on an original Nexus 7 in Google Cardboard (painlessly extended to accept the larger device), this looks pretty good, although the screen door effect is blatant.

The standard Google app is about a 'VR experience', but I am far more interested in a straight 3D image viewer - along the lines of the old View-Masters - so I am not expecting (so far!) any ability to track head motion. While there are a couple of 'ordinary' stereo image viewers around, they only work with static images - not streams.

Of course, looking through 2 high-magnification lenses at a phone screen 2 - 3 inches away is never going to be that wonderful, but resolution keeps improving, so next year.......

Below is some detail on what I have done and how I have done it.

The solution (so far)

  1. 2 webcams - exactly the same model, so any distortions in the lenses match up and the auto settings will deliver similar results.
  2. 1 laptop running Linux with motion installed (so I can move it around easily - a pair of Raspberry Pis would probably work well as long as the wifi is good).
  3. Appropriate motion config files.
  4. A web page set up to pick up and display the 2 feeds from motion at an appropriate size.
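For what it's worth, the motion side can be sketched roughly like this. This is an illustrative fragment rather than my exact config - option names vary between motion versions (3.x used thread and webcam_port; motion 4.x renamed these to camera and stream_port), and the device paths and ports are just examples:

      # motion.conf - shared settings
      framerate 1              # 1 frame per second is all the page needs
      width 800
      height 600

      # one config file per webcam
      thread /etc/motion/thread1.conf
      thread /etc/motion/thread2.conf

      # thread1.conf - left camera
      #   videodevice /dev/video0
      #   webcam_port 8081

      # thread2.conf - right camera
      #   videodevice /dev/video1
      #   webcam_port 8082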

Setting up the webcams

The webcam alignment needs to be good but not perfect - the eye/brain combination will quickly adjust to minor errors in vertical or horizontal alignment, but not to any rotation!

This web page overlays the images which makes it easy to get the two webcams pointing in the same direction. Align on something as far away as practical.
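The overlay page itself isn't reproduced here, but a minimal version is easy to knock up - stack the two feeds with CSS and make the top one semi-transparent, so any misalignment shows up immediately. The URLs here are placeholders for wherever your motion streams are served:

      <div style="position:relative;">
        <img style="position:absolute; top:0; left:0;" src="http://cam-host:8081/" />
        <img style="position:absolute; top:0; left:0; opacity:0.5;" src="http://cam-host:8082/" />
      </div>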

You can do some initial testing inside with the cameras just 6 - 7 cm apart (pretty similar to a normal interpupillary distance).

Once the cameras are aligned set up the web page for use with cardboard:

The cardboard web page

      <table><tr>
        <td><img width="400" style="transform:rotate(180deg);" src="" /></td>
        <td><img width="400" style="transform:rotate(180deg);" src="" /></td>
      </tr></table>

This is a very simple web page. It uses a table to get the two images side by side. The Nexus 7 insists on keeping a few system buttons permanently on display at the bottom of the screen (with Chrome at least), and these were visible through Cardboard - so lock the screen orientation and turn the device (and its contents) upside down, which is why the images are rotated 180 degrees.

I'm creating the images in motion at 800 x 600 (actually, square would probably be slightly better). Adjust the width settings on the images so that they are just (just!) larger than the field of view through Cardboard. The right number will vary wildly depending on the pixel size of the display; for newer devices with much smaller pixels, the image size produced by motion will need to be larger as well.
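As a back-of-envelope check on why the number depends on the display (this is just arithmetic, nothing measured):

      400 px / 216 ppi  =  1.85 in  =  ~47 mm per eye   (original Nexus 7)

so a panel at, say, 440 ppi would need roughly double the pixel width - and correspondingly larger images out of motion - to fill the same physical field of view.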

Put the device in Cardboard so the centre line between the images lines up with the centre of Cardboard, and scroll the display until the bottom of the image is just on the bottom of the visible area in Cardboard.

Now look through Cardboard and select a prominent feature as far away as practical. Close each eye in turn; as you switch from left eye to right eye and back, the feature should not move (well, only by a few pixels). If it moves significantly, the img width value needs to be adjusted (if your cameras are properly aligned, don't try to fix this by moving the cameras).

Once you are close, the images will visually merge into a coherent whole.

If the right eye image jumps to the right try reducing the img width first.

Adjust the width by 30 or 40 pixels at a time until it is pretty close. Very small changes are almost imperceptible.
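Why width is the right knob: with the two images butted together at the centre line, each image's centre sits half its own width from that line, so a width change of N pixels moves each centre by N/2. Rough numbers for the 216 ppi Nexus 7 (again, just arithmetic):

      40 px width change  ->  each centre moves 20 px
      20 px / 216 ppi     =   ~0.09 in  =  ~2.4 mm

which fits the observation that 30 or 40 pixel steps are clearly visible while very small changes are not.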

On the Nexus 7 the pixels are blatant and quite distracting (216 ppi). If you wear specs or contacts, removing them might make the image more convincing by making the pixels less obvious.

Making it even better

Being able to deliver proper live video to the web page. I thought this wouldn't be too hard, but it turns out to be pretty difficult. I want a solution that works without special plugins and plays in a browser, as now. I have been playing with ffmpeg and ffserver, but I cannot get the delay down to acceptable levels, or even get reliable behaviour. WebM seems to be the way to go, but all I have so far is a headache and a lot of things that don't work.

Counteracting the terrible pincushion distortion in the Cardboard lenses. This is one of the things the Google app does, but I cannot find a way to deliver this sort of content through that app (or any other Cardboard app, come to that).

A MUCH higher ppi screen. I think working really well would require upwards of 500 ppi, and probably closer to 1000 ppi. Visible pixels - and especially any sort of screen door effect - significantly interfere with the eye/brain image processing that extracts the 3D information.


