AI spectacles - the ultimate self-tracking tool

Hey folks!
I’m Sreeraj, co-founder of Panoculon Labs. We are a small team of recent engineering grads building AI spectacles: frames equipped with a camera, microphone, speakers, and other sensors that can see and hear what the wearer does. Think Ray-Ban Meta glasses, but with developer access for connecting your own apps, much better battery life, lower manufacturing cost, and an architecture that respects the privacy of the wearer and the people around them.
I recently stumbled upon Quantified Self and thought about how AI spectacles could be the ultimate tool for tracking yourself. Some examples off the top of my head: a virtual dietitian that can see and track whatever you are eating and give feedback, a coach that keeps track of habits and helps change them, a memory aid that remembers conversations. The applications are virtually endless.
We are in the late prototyping stage (attaching an image of our prototype and expected final design below). It would be great to have your feedback on how useful you think such a device would be, and whether it would be worth buying once it is on the market (we are planning to retail for under $250). You can check us out here. Thanks!



Welcome to QS! One thing that came to mind immediately is that the glasses could track focus time. Specifically, for individuals with ADHD, they could track how long the wearer stays seated (simple recognition of the place view: is it changing or not?), count the number of changes, and count returns to the baseline desired view. Even if not designed and approved as a medical device, the glasses could provide this data for discussion with clinicians, and could assist individuals working with a coach to improve their focus time when testing new strategies to sit longer, or to improve mental fitness for working on detailed tasks, especially boring ones, without breaking focus.
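As a rough illustration of the "is the view changing or not" idea, here is a minimal Python sketch that compares each frame's intensity histogram against a baseline "desired view" and counts departures and returns. All names, the threshold, and the toy frames are my own assumptions for illustration, not part of any real device API:

```python
def histogram(frame, bins=16):
    """Normalized intensity histogram of a flat list of 0-255 pixel values."""
    hist = [0] * bins
    for px in frame:
        hist[min(px * bins // 256, bins - 1)] += 1
    total = sum(hist)
    return [h / total for h in hist]

def count_view_changes(frames, baseline, threshold=0.25):
    """Count transitions away from the baseline view and returns to it.

    A frame is 'away' when the L1 distance between its histogram and the
    baseline's exceeds `threshold`. Returns (changes, returns_to_baseline).
    """
    base = histogram(baseline)
    away = False
    changes = returns = 0
    for frame in frames:
        dist = sum(abs(a - b) for a, b in zip(histogram(frame), base))
        if dist > threshold and not away:
            changes += 1  # wearer looked away from the desired view
            away = True
        elif dist <= threshold and away:
            returns += 1  # wearer came back to the desired view
            away = False
    return changes, returns

# Toy usage: a dark "desk" view interrupted by one bright "looked away" frame.
desk = [40] * 64
bright = [220] * 64
print(count_view_changes([desk, desk, bright, desk], desk))  # (1, 1)
```

A real implementation would of course use proper scene recognition rather than raw histograms, but even something this crude could give a coach a daily count of focus breaks.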

Privacy concerns would be substantial. I’m completely uninformed about the capabilities of glasses such as these, but I’d recommend your glasses have a distortion setting: the glasses could be set to grossly distort a chosen area, so they can be worn for tracking without ‘reading’ the computer screen or paper documents while the user measures their ability to stay seated and/or persist on a detailed or boring task.
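A distortion setting like that could be as simple as coarsely pixelating a chosen rectangle of each frame before anything is stored. A minimal sketch in plain Python (the function name, parameters, and frame layout are illustrative assumptions, not any real API):

```python
def pixelate_region(frame, x0, y0, x1, y1, block=4):
    """Return a copy of `frame` (a list of rows of pixel values) with the
    rectangle x0<=x<x1, y0<=y<y1 coarsely pixelated: each block-by-block
    tile collapses to its average value, making text in that area
    unreadable while the rest of the image stays usable for tracking."""
    out = [row[:] for row in frame]
    for by in range(y0, y1, block):
        for bx in range(x0, x1, block):
            ys = range(by, min(by + block, y1))
            xs = range(bx, min(bx + block, x1))
            vals = [frame[y][x] for y in ys for x in xs]
            avg = sum(vals) // len(vals)
            for y in ys:
                for x in xs:
                    out[y][x] = avg
    return out

# Toy usage: an 8x8 "frame" with a horizontal gradient; pixelating the
# left half destroys its fine detail but leaves the right half intact.
frame = [[x * 30 for x in range(8)] for _ in range(8)]
blurred = pixelate_region(frame, 0, 0, 4, 8, block=4)
print(blurred[0][:4])  # one flat averaged value repeated across the block
print(blurred[0][4:])  # untouched gradient
```

Doing this on-device, before storage, would be the key design choice: the readable pixels would never exist anywhere retrievable.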

For food, I’d suggest the glasses be able to recognize places (set by each user) so they can say … you wandered into the kitchen 23 times today (whether for a drink or food).

So many applications. One could iterate endlessly.


Memories of Memoto!


Thanks for pointing out the applications! As for privacy, it was a major factor for us during development, since people have the same concern about many wearables already on the market. We are working on a privacy-focused architecture for the glasses that allows the device to record data for analytics and process it securely, but prevents the data from being retrieved in its original format (for example, the wearer cannot use the glasses to record others and view the footage later).

A nice point on the geofencing (location detection) application. We could implement some sort of image recognition to detect when the user enters particular locations.

A main concern among the investors and others we talked to was whether people would actually be willing to pay $250 for such a device, or whether it is just a nice-to-have.


Thanks for sharing the blog. It shows that people are definitely excited about this kind of application!

Regarding privacy: even if the data cannot be accessed or exported in its ‘original’ format, if it can be queried by a script that looks for or quantifies certain colors, traits, objects, or motions, it could certainly still be used to obtain data about others. This comment is not specific to your product.

The thing with food is that if the glasses can intervene to ask "Are you thirsty (vs. hungry)? Are you bored (vs. hungry)?" etc., then that goes a long way toward building awareness and increasing interoception.


Got it. Will keep it in mind.