Service to combine and analyse all your data?

Is there a (web) service that allows you to upload data/connect APIs from different sources so you can combine, correlate and analyse the data?
I’m currently doing this in Excel, but that’s not very convenient: it’s a pain to maintain, and you might miss (important) correlations.


Yes, but it depends on 1. what your data sources are, and 2. what kind of analysis you are doing.

Hi Chris,
If you are using Excel, I wonder if you have tried the Get & Transform extension, which may help you out significantly more than out-of-the-box Excel. G&T lets you define a custom data model, connect to APIs, transform your table data on the fly, and more.
Sergio

I’m not familiar with Get & Transform yet, so that will probably be a great help.
Although I must say I’m always a bit hesitant to use Excel as a database :wink:

As for my markers and data sources:

  • Quite a few custom daily health markers; Excel
  • Daily morning HRV; HRV4Training
  • Heart rate monitoring of sleep, showers and exercise with a Polar H7 and HRV Logger;
    – HR (per second, plus a manual custom 10-minute average in Excel and the average HR for the whole night)
    – rMSSD avg
    – RR intervals (Poincaré graphs in Excel)
  • Resting HR from Jawbone Up
  • Sleep tracking with Jawbone Up
    – time per sleep phase
    – number of times awake per night
  • Sleep location; manual input in Excel
  • Steps with Jawbone Up
  • Coffee, water, alcohol and other drinks intake - Nomie
  • Medication intake - Nomie
  • Weight with fat percentage - Nomie

Currently I’m putting it all in Excel. I’m having quite a few health issues at the moment, so literally any correlation (to start with :wink: ) between any of my markers would be interesting.

I’m thinking of moving it all into a database and seeing if I can get something working with Python, but I can’t imagine I’m the only one struggling to maintain, combine and analyse this kind of data. Surely there must be solutions out there? :slight_smile:
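For what it’s worth, the database/Python route can start very small. Here is a minimal sketch of the idea (the column names and sample values are made up for illustration, not taken from any real export): merge daily exports from different trackers on the date column, then compute pairwise correlations with pandas.

```python
import pandas as pd

# Hypothetical daily exports from two sources, both keyed by date.
hrv = pd.DataFrame({
    "date": ["2017-01-01", "2017-01-02", "2017-01-03", "2017-01-04"],
    "rmssd": [62.0, 55.5, 48.0, 70.2],
})
sleep = pd.DataFrame({
    "date": ["2017-01-01", "2017-01-02", "2017-01-03", "2017-01-04"],
    "deep_sleep_min": [95, 80, 60, 110],
    "times_awake": [1, 3, 4, 0],
})

# Outer-join on date so days missing from one source are kept (as NaN)
# instead of silently dropped.
daily = hrv.merge(sleep, on="date", how="outer")

# Pairwise Pearson correlations between all numeric markers.
corr = daily.corr(numeric_only=True)
print(corr.round(2))
```

In practice you would read each tracker’s CSV export with `pd.read_csv` instead of building the frames inline, but the merge-then-correlate pattern stays the same. Correlations on small samples like this are of course only a starting point, not evidence.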


I started with Excel a few years ago and eventually migrated to a data model I defined and implemented in Microsoft Azure SQL. If you are interested, the DDL to create the data model is on GitHub - https://github.com/familysmarts/ostlog/tree/master/mssql

Another benefit of this approach is that I can use Microsoft Power BI to analyze and visualize the results easily, with very flexible features.

I use PicnicHealth. They take all my medical data and put it into one place, which includes generating plots of my bloodwork.

I’m biased towards Zenobase, which works with both Jawbone and Nomie (plus you can add your own data sources with a bit of scripting), and gives you a basic dashboard for viewing, filtering and correlating data. But it doesn’t provide much assistance for figuring out what you are looking for :slight_smile:


Hi! I only recently got into this area, but I decided to build my very own system. I don’t want my information to depend on some external service and one day end up beyond my control.

I use files to keep my personal information (for now it’s passwords, my chronology, contacts, and the clouds and repos where all my info is backed up). There are also scripts for retrieving and filtering information. I launch them in the console and get the result there, or redirect it to a local (HTML) file. Maybe it’s not much to begin with, but I don’t see any big barriers to implementing whatever I need in the future while staying within the same framework. Next steps could be writing data to a Google spreadsheet (I already have functions to read from one), functions to work with some service’s API, or visualizing data in a treemap.
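The pipeline described above (plain files, filter scripts, console output or redirection to a local HTML file) is easy to sketch. Denis uses Racket; purely as an illustration, here is a hypothetical Python analogue in which the line-based record format, the tag names, and the sample entries are all invented:

```python
import html

# Hypothetical plain-text log: one record per line, "date | tag | note".
LOG = """\
2017-01-01 | health | slept badly, HRV low
2017-01-02 | contacts | added new GP's phone number
2017-01-03 | health | felt better after light exercise
"""

def filter_records(text, tag):
    """Return (date, note) pairs for records whose tag matches."""
    out = []
    for line in text.splitlines():
        date, rec_tag, note = (part.strip() for part in line.split("|"))
        if rec_tag == tag:
            out.append((date, note))
    return out

def to_html(records):
    """Render filtered records as a minimal HTML list for a local file."""
    items = "".join(
        f"<li><b>{html.escape(d)}</b>: {html.escape(n)}</li>" for d, n in records
    )
    return f"<ul>{items}</ul>"

records = filter_records(LOG, "health")
print(to_html(records))  # e.g. redirect to a local .html file from the shell
```

The appeal of this style is that every piece (storage format, filter, renderer) is a few lines you fully understand and can swap out independently, which matches Denis’s point about not depending on an external service.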

I would like to meet people who use the same approach, so we can share, if not the code (I use the rather exotic Racket programming language), then algorithms, API docs and such. At the end of the day, if you have your own code base, you don’t need some central external service, and you are much more flexible in what you can do and what kind of information you can manage.

I agree with you Denis. I think part of the usefulness of any information system depends on how intimate the user is with the way it works and how it is built. This translates to transparency for the user and transparency = trust. In the area of self-tracking, we are dealing with very personal data, and I feel it is only natural that the system managing this data is personal in similar ways too.

This puts commercial providers at a disadvantage in terms of accessing the full breadth of our personal data (i.e. we have to be willing to upload it). They have an advantage, however, in terms of innovating in the processing and algorithms that make sense of and unlock insight from this data. As AI and machine learning become more and more prevalent, I suspect even do-it-yourselfers will be tempted to tap into the insight-generating power offered by commercial personal health and wellbeing algorithms in the cloud.


I’ve taken a “roll my own” approach as well and built my own data warehouse/API you can check out here - https://www.bobapi.com/

Here’s a post I wrote that talks about my reason/motivation for building it - https://www.quantifiedbob.com/2016/10/personal-api-bobapi/

I’ve brought a handful of data sets online, and will be incorporating many more over the coming weeks/months!


Thanks all for the comments. I agree that it is (highly) personal and it’s great to hear some different approaches and services people use.

I’ll try a few different things and see what suits me best.

I use Google Sheets. This makes it the easiest for me to share it with health care providers and care-takers where relevant.