3D Interactions on the Google Glass

January 2017 - June 2017


Advisors/Collaborators: Gregory Abowd, Thad Starner

Notes:
  1. Zhang, Cheng, et al. "SoundTrak: Continuous 3D Tracking of a Finger Using Active Acoustics." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) 1.2 (2017): Article 30.
  2. Technical report of the design of the mount. (Please email me if you would like access to a link for this report.)

The small size of wearable eyewear such as the Google Glass limits the efficiency and scope of possible user interactions, since input is typically constrained to two dimensions: the device's touchpad surface. In this project, I explored three-dimensional (3D) gestures as a way to extend the input capabilities of the Glass. We used the SoundTrak acoustic sensing system [1] to track the finger in 3D, developed a form factor for mounting the SoundTrak sensors on the Glass [2], considered the social appropriateness of performing mid-air gestures on the Glass, and ran initial tests of the mounted system.

Tasks Performed:
  1. Fabrication and 3D Printing
  2. User Study
  3. Programming on the Google Glass
  4. Signal Processing