Using facial images to track mood?

Just curious, as I’m currently analyzing facial images to see whether one can detect micro-expressions that hint at mood. One issue is obtaining images of someone in a natural setting, which is hard given that most people tend to smile for photos. Video might help here, since it can capture a wider range of emotions.

Figured that most of us have a fairly large collection of self-images, mostly on the web, so we could find ways to take advantage of this. Partly inspired by the many everyday selfie videos (e.g.

Any thoughts on how we could use a more image/video-based system to track this, or whether this would even be of interest?

Stan James’ Lifeslice project is probably of interest. He uses the webcam on his laptop to periodically take pictures of himself. He has released the software for others to use. It’s a good example of capturing people’s “natural” expressions, and a fairly big dataset for any individual user.
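For anyone who wants to roll their own Lifeslice-style capture, the scheduling part is simple. Here is a minimal sketch with the actual camera call injected as a parameter; the OpenCV snippet in the comment is just one assumed way to supply it, not how Stan's released software works:

```python
import time

def lifeslice_loop(capture_frame, save_frame, interval_s=1800,
                   n_shots=None, sleep=time.sleep):
    """Take a picture every `interval_s` seconds, Lifeslice-style.

    `capture_frame` and `save_frame` are injected so the loop can be
    tried without a webcam; with opencv-python one might pass
    `lambda: cv2.VideoCapture(0).read()[1]` as the capture step.
    Runs forever when `n_shots` is None.
    """
    shot = 0
    saved = []
    while n_shots is None or shot < n_shots:
        frame = capture_frame()          # grab one webcam frame
        saved.append(save_frame(frame, shot))  # persist it, keep the path
        shot += 1
        if n_shots is None or shot < n_shots:
            sleep(interval_s)            # wait until the next capture
    return saved
```

With a real camera you would call it as `lifeslice_loop(grab, save)` and let it run in the background; over weeks this yields the kind of "natural expression" dataset described above.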

How are you currently analyzing people’s facial images?

Stan’s QS Talk:
Lifeslice project page:
Cors Brinkman’s QS talk:

Thanks Steven, this is quite helpful. I like the approach of using webcam images and will definitely check out the Lifeslice project.

Right now I’m using Facebook photos since many people have photos of themselves over time on this single platform.

The analysis right now is manual, but I plan to automate it using Python and machine vision. The hope is to derive a number of health insights from facial images - mood, stress levels, body fat % / BMI, etc. - both as a means of passive health tracking and as a tangible emotional motivator (self-visualization has a powerful effect).
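A minimal sketch of what the automated pipeline might look like. The vision step is stubbed out as a hypothetical `score_mood` function (a real version might detect the face with OpenCV and feed it to a trained classifier); the rest just turns timestamped photos into a monthly mood timeline:

```python
from collections import defaultdict
from datetime import date
from statistics import mean

def score_mood(image_path):
    """Hypothetical stand-in for a machine-vision emotion scorer.

    A real implementation would analyze the face in the image and
    return a score in [-1, 1]; here it returns a neutral placeholder
    so the pipeline below is runnable.
    """
    return 0.0

def mood_timeline(photos):
    """Aggregate per-photo mood scores into monthly averages.

    `photos` is an iterable of (date, image_path) pairs, e.g. the
    timestamped Facebook photos mentioned above.
    """
    by_month = defaultdict(list)
    for taken, path in photos:
        by_month[(taken.year, taken.month)].append(score_mood(path))
    return {month: mean(scores) for month, scores in sorted(by_month.items())}

photos = [
    (date(2013, 1, 5), "selfie_a.jpg"),
    (date(2013, 1, 20), "selfie_b.jpg"),
    (date(2013, 2, 2), "selfie_c.jpg"),
]
print(mood_timeline(photos))  # → {(2013, 1): 0.0, (2013, 2): 0.0}
```

The same aggregation shape would work for the other signals (stress, BMI estimates) by swapping in a different scorer.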

Doing some initial work on making this into a product here: . Feel free to sign up for updates; I’ll also let folks know when we have something more production-ready to use.

Below are some more materials for reference, where I looked specifically at BMI.

Originally posted at

Video of talk can be seen here:

Slides from talk here: