Project Tango and Visual-Inertial Localization

This project, entitled "Large-Scale, Real-Time, Visual-Inertial Localization," is interesting: it uses Google's experimental Tango hardware to improve real-time tracking of location and position.

The hardware is a tablet computer with a motion-tracking camera, a 4-megapixel camera with 2 µm pixels, integrated depth sensing, and a high-performance processor. This equipment aids in tasks like scanning rooms. A limited number of kits were produced and given or sold to professional developers with the intent of spurring technological development.

One day we may see more accurate and interesting augmented reality. I've often thought that overlaying information onto our current reality would be interesting; imagine walking down a street and seeing for-sale signs hovering over houses. It may just end up overloaded with advertising, though, making a virtual eyesore.

Source:

Get Out of My Lab: Large-scale, Real-Time Visual-Inertial Localization
Simon Lynen, Torsten Sattler, Michael Bosse, Joel Hesch, Marc Pollefeys and Roland Siegwart.
Autonomous Systems Lab, ETH Zurich
Computer Vision and Geometry Group, Department of Computer Science, ETH Zurich
http://www.roboticsproceedings.org/rss11/p37.pdf