Quantified-self metrics (e.g., estimating cognitive flexibility) from uploads of video data?

Does anybody know of novel cognitive measures derived from video data (e.g., an OBS Studio => Twitch stream)? Analyzing people’s speech alone is already enough to predict Parkinson’s and Alzheimer’s well ahead of diagnosis. The same approach could be used for predictive analytics of processing speed, memory, etc.
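For concreteness, here is a minimal sketch of the kind of feature extraction this implies: crude speech-timing features (pause fraction, number of speech segments) pulled from a clip’s audio track with librosa. Pause and timing statistics are among the features used in the speech-based Parkinson’s/Alzheimer’s work; the file names, sample rate, and silence threshold below are placeholder assumptions.

```python
# Minimal sketch: speech-timing features from an uploaded clip's audio.
# Assumes the audio track was extracted first, e.g.:
#   ffmpeg -i upload.mp4 -ac 1 -ar 16000 audio.wav
import librosa

def speech_timing_features(wav_path: str, top_db: float = 30.0) -> dict:
    y, sr = librosa.load(wav_path, sr=16000)
    total_s = len(y) / sr
    # Non-silent intervals; everything quieter than top_db below the peak
    # is treated as a pause.
    intervals = librosa.effects.split(y, top_db=top_db)
    voiced_s = sum(end - start for start, end in intervals) / sr
    return {
        "duration_s": total_s,
        "pause_fraction": 1.0 - voiced_s / total_s if total_s else 0.0,
        "num_speech_segments": len(intervals),
    }

print(speech_timing_features("audio.wav"))
```

Longer pauses and more fragmented speech are the sort of raw signal a downstream model would turn into a cognitive estimate.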

I’ve heard of software and services that can take a video feed or upload and track specific physical behaviors, including the completion of assigned work tasks. It wouldn’t surprise me if much more sophisticated software already exists that can determine a lot about an individual from a simple video appearance.
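For a rough idea of how that kind of behavior tracking tends to work under the hood, here is a short sketch using OpenCV plus MediaPipe Pose to pull a per-frame wrist trajectory out of a video file. The file name and choice of landmark are arbitrary; a real service would layer task-specific logic (reach counts, dwell times, completion rules) on top of trajectories like this.

```python
# Sketch: per-frame body-landmark tracking from a video file.
import cv2
import mediapipe as mp

cap = cv2.VideoCapture("clip.mp4")  # placeholder file name
pose = mp.solutions.pose.Pose(static_image_mode=False)

wrist_track = []  # (frame_idx, x, y) of the right wrist, normalized coords
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV decodes to BGR.
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks:
        lm = results.pose_landmarks.landmark[mp.solutions.pose.PoseLandmark.RIGHT_WRIST]
        wrist_track.append((frame_idx, lm.x, lm.y))
    frame_idx += 1

pose.close()
cap.release()
# wrist_track can now feed whatever behavioral heuristics you care about.
```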

With that said, have you considered the reverse of your original query, where instead of analyzing pre-made video uploads and feeds for human KPIs, one generates original video and audio based on those same metrics?

This would be done with the assistance of generative AI technologies of some sort. I would think basic biofeedback input(s) could provide all of the information that a computer would need to create perfectly customized experiences for each unique user and desired state.

This is already done for C. elegans worm tracking (to test movement/robustness of C. elegans as a proxy for aging rate). Ora Biomedical does it, and one of Morten’s students does it for Drosophila.

It can also be applied to mouse data.

The Datta lab did it at one point (see the ODIN symposium YouTube upload), someone in Morten’s lab is working on it, and it has since generalized to many more labs.

https://twitter.com/OraBiomedical/status/1672255427260846081

https://twitter.com/nstroustrup1/status/1379085950937104386
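For anyone curious what the simplest version of this looks like, here is a toy OpenCV sketch that tracks a single dark animal’s centroid frame by frame and reduces the track to a movement-vigor number. The file name and thresholding scheme are assumptions, and real pipelines (e.g., the Datta lab’s depth-camera MoSeq for mice) are far more sophisticated.

```python
# Toy sketch: centroid tracking of one dark animal on a light background.
import cv2
import numpy as np

cap = cv2.VideoCapture("worms.avi")  # placeholder file name
centroids = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dark animal on light plate: inverted Otsu threshold, largest blob wins.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
cap.release()

# Mean per-frame displacement as a crude movement-vigor / robustness metric.
pts = np.array(centroids)
if len(pts) > 1:
    print("mean displacement:", np.linalg.norm(np.diff(pts, axis=0), axis=1).mean(), "px/frame")
```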

It’s called phenotype screening. But imagine doing it for mouse cursor data.
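A hypothetical sketch of the cursor version: given a (timestamp, x, y) trace logged by a browser or OS hook, compute a few movement features of the kind that get linked to processing speed and motor control. The input format, pause threshold, and feature names are all assumptions.

```python
# Hypothetical sketch: movement features from a cursor trace.
import numpy as np

def cursor_features(trace: np.ndarray) -> dict:
    """trace: shape (n, 3), columns (t_seconds, x_px, y_px)."""
    t, xy = trace[:, 0], trace[:, 1:]
    dt = np.clip(np.diff(t), 1e-6, None)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    speed = step / dt
    path_len = step.sum()
    direct = np.linalg.norm(xy[-1] - xy[0])
    return {
        "mean_speed_px_s": float(speed.mean()),
        "pause_fraction": float((speed < 5.0).mean()),  # arbitrary threshold
        # ~1.0 for straight movements; lower values suggest hesitation
        # or corrective submovements.
        "path_efficiency": float(direct / path_len) if path_len else 1.0,
    }
```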
