Automating export, combining and analysis of data from many apps?

Do you think it’s possible to do automatic export and analysis of tracking data from ALL of these apps?:
-Notion
-Cronometer
-Track & Graph
-MySymptoms
-RescueTime
-Instant
-Google Calendar

I want to do a monthly review where I look at correlations from any of the data I collect, and I don’t want to manually export all the CSV files and combine them.
I am willing to spend some days setting this up so it becomes automatic.

What tools could I use to automate exporting the CSVs, combining the data, and analyzing the data from the above-mentioned apps?
Or do I need to code my own solution for combining the data and running the analysis automatically?

I don’t know about the majority of those apps, but I have automated the export of Fitbit, RescueTime and Google Photos data. Here’s a bit of a mind dump to possibly help you figure out your preferred course of action:

Automating data export from multiple services is tricky because each app may have different requirements to get data out of their systems…

Fitbit and Google Photos (along with Google Health and many other Google services) require you to either manually download your data or use their APIs, which in turn require OAuth authentication. OAuth is a bit of a beast to understand and it took me a long time to get it all figured out - it requires programming knowledge (I used PHP and wrote my own OAuth server, but there are libraries available for various programming languages).
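To make the flow a bit more concrete, here’s a sketch in Python of the first leg of the OAuth 2.0 authorization-code flow - building the URL you send the user to so they can grant access. The authorization endpoint shown is Fitbit’s documented one, but the client ID and redirect URI are placeholders you’d get by registering an app with the service:

```python
from urllib.parse import urlencode

# Fitbit's documented OAuth 2.0 authorization endpoint; other services
# have their own. CLIENT_ID and REDIRECT_URI are placeholders you get
# when you register your app with the service.
AUTH_URL = "https://www.fitbit.com/oauth2/authorize"
CLIENT_ID = "YOUR_CLIENT_ID"
REDIRECT_URI = "https://example.com/callback"

def build_authorization_url(scopes):
    """Step 1 of the authorization-code flow: the user visits this URL,
    approves access, and the service redirects back with a one-time code
    that you then exchange for an access token."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": " ".join(scopes),
    }
    return AUTH_URL + "?" + urlencode(params)

print(build_authorization_url(["activity", "sleep"]))
```

The later steps (exchanging the code for tokens, refreshing expired tokens) are where most of the complexity lives, which is why an existing OAuth library for your language is usually worth it.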

RescueTime uses a user generated key, so it’s easier to set up.
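For example, with RescueTime the key just goes in as a query parameter on their Analytic Data API - no token dance required. A minimal sketch (the key is a placeholder, and the parameter names are worth double-checking against RescueTime’s API docs):

```python
from urllib.parse import urlencode

API_KEY = "YOUR_RESCUETIME_KEY"  # generated in your RescueTime account settings

def rescuetime_export_url(start, end):
    """Build a request URL for RescueTime's Analytic Data API,
    asking for one row per day in CSV format between two dates."""
    params = {
        "key": API_KEY,
        "format": "csv",
        "perspective": "interval",
        "resolution_time": "day",
        "restrict_begin": start,   # YYYY-MM-DD
        "restrict_end": end,       # YYYY-MM-DD
    }
    return "https://www.rescuetime.com/anapi/data?" + urlencode(params)

print(rescuetime_export_url("2024-01-01", "2024-01-31"))
```

Fetching that URL on a schedule (cron, Task Scheduler, etc.) gets you an automatic CSV export.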

Once you get the data, the next step when dealing with APIs is to map it - that means diving into their API specs or playing with the responses to extract what you want.

Then you need a place to put it. I chose an online MySQL database, but you could use a local Excel file, a cloud Google Sheet, or a third-party app like https://zenobase.com/ that has analysis tools you configure yourself.
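Whatever the destination, the combining step is essentially an outer join on date. A minimal stdlib-only sketch that merges two per-day series (the names and numbers are made up) into one CSV you could load into Excel, Sheets, or a database:

```python
import csv
import io

# Two hypothetical per-day extracts, one per app, keyed by date.
steps = {"2024-01-01": 8421, "2024-01-02": 10307}
screen_minutes = {"2024-01-01": 312, "2024-01-02": 260}

def combine_to_csv(*named_series):
    """Outer-join several (name, {date: value}) series on date and
    return a single CSV string with one row per date. Missing days
    are left blank rather than dropped."""
    names = [name for name, _ in named_series]
    dates = sorted(set().union(*(s.keys() for _, s in named_series)))
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["date"] + names)
    for d in dates:
        writer.writerow([d] + [s.get(d, "") for _, s in named_series])
    return out.getvalue()

print(combine_to_csv(("steps", steps), ("screen_minutes", screen_minutes)))
```

Keeping blanks for missing days (instead of dropping the row) matters later, because correlation math needs you to decide explicitly how to handle gaps.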

I wrote my own Pearson’s correlation function and used Google Visualizations but there are lots of other, more advanced options out there.
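Mine was in PHP, but for reference, Pearson’s r is short enough to write in a few lines in any language - it’s just the covariance of the two series divided by the product of their standard deviations. A Python version:

```python
import math

def pearson(xs, ys):
    """Pearson's correlation coefficient: covariance of the two series
    divided by the product of their standard deviations. Returns a
    value between -1 (perfect inverse) and 1 (perfect direct)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson([1, 2, 3, 4], [2, 4, 6, 8]), 6))  # 1.0
```

Both series must be the same length and paired by date, which is why the blank-handling decision in the combining step matters.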

Open Humans offers Jupyter Notebooks - I haven’t delved into those because they’re Python-based, I believe. They also have some free tools to help in the collection of various types of data. https://www.openhumans.org/explore-share/

exist.io is a paid service that tries to aggregate data from many different services and find correlations. You can see the list of supported services at the bottom of this page https://exist.io/ but I don’t see a lot of the apps you use.

There is a good list of aggregation apps on Mark Krynsky’s website at https://lifestreamblog.com/lifelogging/ (scroll down to the “Data Aggregation App, Services and API’s” section), though some of the services may be stale.

My gut instinct is that it’s probably best to pick the 2 or 3 most meaningful/interesting data sources from your list, extract the data manually, and mash it up in a Google Sheet or Excel file. If you find it interesting/compelling, then look at steps to automate and expand your data collection with complementary sources from your list.

Hope this helps,
Dean