I’m at the MIT Media Lab, working on a wristband that uses an array of sensors to tune into your body and its actions, so it can detect where you are and what you’re doing.
For example, it knows whether you’re standing, sitting, or lying down, and whether you’re in the car, on a bike, at your computer, shaking someone’s hand, hugging someone, etc.
The goal is to use these contextual cues to build an IFTTT for your body. Lying down? Disable screen rotation on your phone and make sure your alarm is set. Running with a meeting coming up? You’re probably going to be late, so it can offer canned messages to send to the meeting’s attendees. Just put on your gym shoes? It can show you the traffic on the way to the gym.
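Those trigger-action rules could be sketched as a simple context-to-action lookup. This is a hypothetical illustration, not the actual system: the `Context` fields, `actions_for` function, and action names are all made up for the sake of the examples above.

```python
from dataclasses import dataclass

# Hypothetical snapshot of what the wristband's sensors have detected.
@dataclass
class Context:
    posture: str = "standing"        # e.g. "lying", "sitting", "running"
    meeting_soon: bool = False       # a calendar event starts shortly
    wearing_gym_shoes: bool = False  # detected a shoe change

def actions_for(ctx: Context) -> list[str]:
    """Map the detected body context to phone-side actions, IFTTT-style."""
    actions = []
    if ctx.posture == "lying":
        actions += ["disable_screen_rotation", "confirm_alarm_set"]
    if ctx.posture == "running" and ctx.meeting_soon:
        actions.append("suggest_canned_late_message")
    if ctx.wearing_gym_shoes:
        actions.append("show_traffic_to_gym")
    return actions

print(actions_for(Context(posture="lying")))
# → ['disable_screen_rotation', 'confirm_alarm_set']
```

The point of the sketch is that each rule only reads context and emits an action name; the phone decides how to carry the action out.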