Aussie Expat codes Amazon's Fire Phone features in a weekend

Amazon’s making a lot of noise today about its Fire Phone, which we almost certainly won’t see in Australia. That probably doesn’t matter given that it’s feasible to code its head tracking features rather quickly, as one enterprising Aussie expat coder did last weekend.
Patrick Morris-Suzuki* is an Aussie expat developer based in New York. Last weekend, prior to the Fire Phone’s announcement, he took part in the Video Experience Hackathon in New York, developing a head-coupled perspective interface, built primarily for the Oculus Rift but also capable of being rendered in an HTML5-compliant browser.
Sadly for Patrick, he didn’t win the main $50,000 prize, although he was one of four finalists.
What he realised this morning as news of the Fire Phone broke was that, in the space of a weekend, he’d coded a method very similar to the one the Fire Phone uses.
The winning entry, if you’re curious, was a location aware video platform that included iBeacon, GPS and AR features.
Patrick’s entry took him an estimated 36 hours of coding around a number of open source utilities to build the experience you can check out at http://streammersion.com/. You’ll need to be running either the Dev or Canary builds of Chrome to get it all working, and to allow access to your webcam and microphone if you want to use its head tracking and voice capabilities. If you’re feeling timid about that, it also supports WASD movement within a video once you click into a video file. Laptops with inbuilt gyroscopes also let you look around by changing the orientation of the laptop itself, another feature the Fire Phone is touting heavily.
“I just implemented every single unique feature of that phone,” he told me.
“And I didn’t even realise that Amazon was about to make it a big deal.”
Patrick’s code isn’t taking on Amazon in the shopping space so much as in the way the Fire Phone uses head tracking to deliver an immersive, tilted perspective based on head position.
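At its core, head-coupled perspective of this kind boils down to mapping a tracked head position into an opposing rotation of the scene, so the view appears to tilt as you move. Here’s a minimal JavaScript sketch of the idea (not Patrick’s actual code; the function name and the assumption that a face tracker reports a normalised position in the range −1 to 1 are mine):

```javascript
// Map a normalised head position into a CSS 3D transform string.
// headX/headY in [-1, 1], as a webcam face tracker might report them.
function headToTransform(headX, headY, maxShiftDeg = 15) {
  // Clamp to guard against noisy tracker output.
  const clamp = (v) => Math.max(-1, Math.min(1, v));
  const x = clamp(headX);
  const y = clamp(headY);
  // Rotate the scene opposite the head movement to simulate parallax:
  // head moves right -> scene yaws left; head moves up -> scene pitches up.
  const rotY = -x * maxShiftDeg;
  const rotX = y * maxShiftDeg;
  return `rotateX(${rotX}deg) rotateY(${rotY}deg)`;
}
```

In a browser you’d run this each animation frame with the tracker’s latest estimate and assign the result to the scene element’s `style.transform`, giving the parallax illusion from a single ordinary webcam.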
Here’s a quick video demo of Streammersion in action.

Having played around with Streammersion, it does work for voice and head tracking, although the experience varies with the capabilities of the computer you’re using. On an older iMac I’ve found it a little twitchy, and Patrick notes it can become particularly problematic if you’re a little click-happy: “Right now if you click head tracking more than once the head trackers can conflict.” It’s a bug he intends to iron out.
Patrick’s aim wasn’t to take on Amazon in any case, but to develop frameworks for low-cost creation of 3D content, especially for tools such as the Oculus Rift. “I bought a Rift and realised quickly that content creation is a huge process,” he says. “I could render a small 3D world, but to create the assets for an interesting virtual world is a huge process. So I have been wanting to work on ways to make interesting Rift content that is practical for ordinary people to do so quickly.”
With that in mind, he’s made the underlying code available under the AGPL license (“While I figure out what to do with it”), which can be accessed here.
Disclaimer: I’ve known Patrick for a number of years. That number seems to always be growing, which means he hasn’t perfected time travel — yet.
