Great project! While listening to an interview with Alan Yates[0] (main designer of the Lighthouse), I was thinking about an application just like this.
I recently shopped around for motion capture systems (cameras tracking markers), and one of the cheapest systems with performance comparable to OP's came out to $5-8.5K.
A bit off-topic, but I was wondering:
How was the base station visualization[1] done?
[1] https://camo.githubusercontent.com/d9241f231a03d177d215f98bd...
Fantastic!
I've been developing with the Vive for most of the last year for Left-Hand Path (http://store.steampowered.com/app/488760), and I had a lot of mocap experience before that, so I can confirm that Lighthouse's tracking is ridiculously good.
It's not just as good as something like an Optitrack system: it's significantly better.
If this provides comparable tracking to what the Vive offers, it's an absolutely unbeatable price / performance combo.
I actually just arrived in Seattle to take the official HTC course on using the Steam VR positional tracking system, and this was one of the first things I saw as I got off the plane.
I've been mulling an installation work based on exploring video solids[1] using cheap phone-VR mounts. This project is exactly what I need to find the viewer's position in the video/space. Is there any cheaper way to do this without the Lighthouse devices?
Hey, author here. I used the system described in the link for indoor stabilization of a drone, plus precise landing, to support an automatic battery swap station project (there's a video in the link). It worked pretty well, so I decided to open-source it in hopes it would help fellow hackers.
Let me know if you have any questions!