Motorized, Live 360° Video
(Before Facebook did it)

I love bouncing ideas off friends. Sometimes a novel idea just strikes you out of nowhere. We came up with this one when discussing the recent development of remote-controlled robotic hands.

“Great minds discuss ideas; average minds discuss events; small minds discuss people.”

The software was done by me (age 18), and the physical construction was done by my pal Kyle (age 17). (FYI, most of the software was just ‘glue-work’). Facebook recently unveiled their live 360 video rig — and to be fair, it’s much better than ours — but maybe we get street cred for doing it first?

We built this contraption as a proof-of-concept some time in early November. All the pieces lined up just right for its creation. First of all, Mozilla’s WebVR API was mature enough to give us uninhibited realtime access to smartphone orientation data. Secondly, the iPhone 6 (and similar models) had massive screens that could completely fill a person’s field of view, plus speedy hardware that could constantly repaint the canvas while simultaneously sending crucial telemetry data. Lastly, 802.11n was fast enough to transmit our live MPEG streams with minimal latency.

[Photo: our messy lab area]

Parts list:

  • Wood scraps (free)
  • Arduino ($4)
  • Two servos ($10)
  • USB 720p Webcam ($15)
  • iPhone (borrowed)
  • “Google Cardboard”-based HMD ($10)

This was a very cheap project. Heck, we already had everything except the USB webcam, which we bought on Amazon.

Diagram:

[Diagram: how the whole thing works… (hint: I’m not an artist…) Also, I meant MPEG instead of MJPEG.]

Video clips:

  • https://drive.google.com/file/d/0B8ow8UFcYneeOHNrb1JGWFZWd3M/view
  • https://drive.google.com/file/d/0B8ow8UFcYneecVI1NGdBRHdPcTg/view

Code:

I dumped all of the code online here: ryanmjacobs/funbox. Sadly “funbox” was the best name we could come up with.

I sectioned the code into three main parts. First off, there is the simple servo-control firmware, which basically reads two bytes from a serial connection and uses them, respectively, as the X-axis and Y-axis positions of the servo rig.
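
For a rough idea of what the host side of that protocol looks like, here is a sketch. It is not the actual funbox code: the device path and the sendAngles() name are made up, and it assumes the serialport npm package (older-style constructor).

    // Illustrative host-side sketch, not the real funbox code.
    const SerialPort = require("serialport");
    const arduino = new SerialPort("/dev/ttyACM0", { baudRate: 9600 });

    // The firmware reads byte 0 as the X-axis angle and byte 1 as the
    // Y-axis angle, so both values are clamped to 0-180 before sending.
    function sendAngles(xDeg, yDeg) {
      const clamp = (v) => Math.max(0, Math.min(180, Math.round(v)));
      arduino.write(Buffer.from([clamp(xDeg), clamp(yDeg)]));
    }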

Secondly, there is a static directory that is hosted with any HTTP server. I prefer darkhttpd, but Node.js’s http-server works just as well. This is the home of the JavaScript code that renders the MPEG stream and transmits orientation data to the server after converting it to Euler angles. The meat of the code is right here: static/js/main.js.
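
The gist of that client loop is something like the sketch below. It is not the real main.js: the WebSocket endpoint is invented, getOrientationQuaternion() is a stand-in for however WebVR hands back the current quaternion, and quaternionToEuler() is sketched under Difficulties below.

    // Rough client-side sketch, not the actual static/js/main.js.
    const socket = new WebSocket("ws://laptop.local:8081"); // endpoint is made up

    function tick() {
      // Stand-in for however WebVR hands back the current device quaternion.
      const q = getOrientationQuaternion();
      const angles = quaternionToEuler(q); // see the Difficulties section below

      // Only pan (yaw) and tilt (pitch) matter for a two-servo rig.
      socket.send(JSON.stringify({ yaw: angles.yaw, pitch: angles.pitch }));
      requestAnimationFrame(tick);
    }
    requestAnimationFrame(tick);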

Lastly, the server code runs on my laptop. The server accepts orientation data and sends it across the serial connection to the Arduino, which then drives the servos to match the phone’s orientation. There are actually three components here. First and foremost is the Node.js server that parses the phone orientation data and relays it. Second, there is a Node.js server that transmits the MPEG stream from the webcam to the client phone. Finally, an ffmpeg process uses v4l2 to read the video stream from the USB webcam mounted on the two servos.
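
Here is a rough sketch of how those pieces fit together. Again, this is not the real funbox code: it assumes the ws npm package, the sendAngles() helper sketched above, made-up port numbers, and the ffmpeg flags are just one plausible invocation.

    // Rough server-side sketch, not the actual funbox code.
    const WebSocket = require("ws");
    const { spawn } = require("child_process");

    // 1) Accept orientation updates from the phone and forward them to the
    //    Arduino using the two-byte sendAngles() helper sketched earlier.
    const wss = new WebSocket.Server({ port: 8081 }); // port number is made up
    wss.on("connection", (ws) => {
      ws.on("message", (msg) => {
        const { yaw, pitch } = JSON.parse(msg);
        // The exact mapping depends on how the servos are mounted;
        // shifting by 90 degrees just centers them as a rough default.
        sendAngles(yaw + 90, pitch + 90);
      });
    });

    // 2) Read the USB webcam with v4l2 and encode a live stream for the phone.
    //    These flags are one plausible invocation, not necessarily the exact ones.
    spawn("ffmpeg", [
      "-f", "v4l2", "-i", "/dev/video0",
      "-f", "mpegts", "-codec:v", "mpeg1video",
      "http://localhost:8082/stream", // wherever the stream server listens
    ]);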

Difficulties:

[Photo: ahhh… too many letters and not enough numbers]
WebVR’s API returns quaternions to represent the phone’s orientation in 3D space, and they are really confusing (for a high schooler, at least). It was difficult to comprehend them at first, but now I see how useful they are. With our rig, though, we only need two axes of rotation, so in this case quaternions are kinda overkill because gimbal lock will never occur. Initially I used some formulas I found online to do the conversion to typical Euler angles… but then I discovered that Three.js already does it, so now I use that.
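
In case it helps anyone else, the conversion boils down to something like this sketch (not the exact main.js code; quaternionToEuler() is just an illustrative name):

    // Quaternion -> Euler conversion sketch using Three.js.
    function quaternionToEuler(q) {
      const quat  = new THREE.Quaternion(q.x, q.y, q.z, q.w);
      const euler = new THREE.Euler().setFromQuaternion(quat, "YXZ");
      return {
        yaw:   euler.y * 180 / Math.PI, // pan servo
        pitch: euler.x * 180 / Math.PI, // tilt servo
      };
    }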

Follow me on Twitter to get notified of new posts: @ryan_mjacobs



