Sharing my odd collection of custom-built, self-quantification software

As part of my thermal camera experiment, I decided to upgrade the computer hardware being used from a Raspberry Pi Zero 2 WH to a Raspberry Pi 5. This detail is significant and telling, in my opinion.

The relevant wireless access point within my homelab/LAN environment uses WPA3 for security, but the RPi Zero 2 WH can only join WPA2 networks. Instead of downgrading my network security, I am choosing to invest in a more capable computer, one that supports WPA3.

The main reason for keeping this experiment on my wireless network is so I can periodically check in (via remote SSH access) and make sure the Python program is still collecting data.

So we wait until the new computer arrives before continuing.


Two weeks ago I mentioned building a device in the near future to measure the “vibe of a space”. I also shared a screenshot of the hardware I purchased from Adafruit to bring this quantified-self experiment to life.

The hardware arrived yesterday, as seen in the photo below:

It may not look like much, but with (1) a small computer, (2) a 24-inch monitor and (3) Python, I will turn these sensors into a biometric monitoring station, intended to explore whether a person’s state of being can be casually measured, and whether mindfulness practices can influence those metrics.

I also have the Raspberry Pi 5 computer (needed for another, related experiment) ordered and expect to have it in hand by tomorrow afternoon. Once the RPi 5 unit arrives, I will launch my next self-quantification effort: measuring the position of my body while sleeping.

Just to be clear, the hardware featured in the picture above is for the experiment after my next project.


Hello everyone. Today I have something a little atypical to share. I recently built software which I am referring to as “Orbital Oracle”. Essentially, users submit info about their birth to Divine API, and the response (containing personalized data from a western natal astrological paradigm) is fed to a locally deployed AI model in my home for summarization and delivery.

Here is a screenshot of the frontend UI as it stands right now:

On the left side of the Orbital Oracle application, the user has a form to enter data about themselves, including an “Ask the Oracle” button. After submitting their info, a chat message begins printing on the right side of the screen, containing the local AI’s assessment of their astrological data, as seen in the screenshot below:

For context, since my last post, I deployed several AI models on a computer within my private LAN. I am now actively learning how to program software which can interact with these models, through projects such as the one featured in this post.

Details About How “Orbital Oracle” Functions

First I activate a Python script (a Flask server) on my main computer, which acts as the connector between Divine API, the user and the LLM in my homelab. On the same computer, I then launch a frontend UI written in JavaScript, CSS and HTML, enabling a user to submit the place, date and time of their birth.

Once that information is entered, the user presses a submit button and the data goes off to 5 separate Divine API endpoints, including planetary positions, moon phases, house cusps and others. Each endpoint returns numeric values representing how an individual is weighed and measured in western natal astrology, based on the location and moment of their birth.

That data, along with a well-structured prompt and deeper context, is passed to an AI model on another computer in my homelab for review. The model processes the values returned from Divine API, looking for patterns, likelihoods and anything else of interest, and finally delivers a chat response to the user.
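To make the pipeline concrete, here is a minimal sketch of the prompt-assembly step described above. The endpoint names and JSON fields in the example are placeholders of my own choosing, not the actual Divine API schema, and the wording of the system prompt is illustrative.

```python
# Minimal sketch of the prompt-assembly step. The endpoint names and
# JSON fields below are placeholders, not the actual Divine API schema.

import json

def build_oracle_prompt(api_results: dict) -> str:
    """Combine responses from several astrology endpoints into one LLM prompt.

    api_results maps an endpoint name (e.g. "planetary_positions")
    to the parsed JSON that endpoint returned.
    """
    sections = []
    for endpoint, payload in sorted(api_results.items()):
        sections.append(f"### {endpoint}\n{json.dumps(payload, indent=2)}")

    return (
        "You are an oracle versed in western natal astrology.\n"
        "Review the chart data below, note any patterns of interest,\n"
        "and reply with a short, friendly assessment.\n\n"
        + "\n\n".join(sections)
    )
```

The resulting string would then be sent to the locally deployed model, whose streamed reply becomes the chat message on the right side of the UI.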

Relationship To Self-Quantification

Orbital Oracle is related to self-quantification because the values returned from Divine API are treasured and respected by many, and have been for a long time. In other words, western natal astrology can be considered a quasi form of self-quantification, at least in the context of such a project.

With this post, I am trying to show how a person (represented by their birth time, date and location) can interact with ancient forms of self-quantification using modern technologies. Regardless of what one thinks about the information source and subject, that is interesting and worth experimenting with, especially when tied in with modern AI.


Hello all. I am still waiting for the right inspiration to return to my latest Hackaday hardware self-quantification experiment, the project involving an IR camera and my sleeping positions. In the meantime I have continued tinkering with a fun service known as Divine API.

Similar to my last update in this thread, while using the Divine API platform, I built an application rooted in Western Astrology. This software is a little different from my last update, though.

First, the Divine API is pinged using Python, with values representing myself (or anyone), such as the date and location of my birth. JSON is returned with positions of, and other metadata about, various celestial objects, looking something like this:

The frontend of my WIP application takes the saved JSON file and creates a unique UI for understanding interactions between the planet(oid)s, among half a dozen other major features.
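The fetch-and-save step feeding this UI could be sketched roughly as below. The base URL, endpoint paths and parameter names are all placeholder guesses of mine, not Divine API's real schema; only the general shape (POST birth data, collect JSON per endpoint) reflects the description above.

```python
# Hedged sketch of the request step. The base URL, endpoint paths and
# parameter names here are illustrative guesses, not Divine API's real schema.

import json
import urllib.request

BASE_URL = "https://example-divine-api.test/api"  # placeholder, not the real host
ENDPOINTS = ["planetary-positions", "moon-phase", "house-cusps"]

def build_birth_payload(day, month, year, hour, minute, lat, lon, tzone):
    """Package the user's birth moment/place the way a natal-chart API expects."""
    return {"day": day, "month": month, "year": year, "hour": hour,
            "minute": minute, "lat": lat, "lon": lon, "tzone": tzone}

def fetch_chart(payload):
    """POST the payload to each endpoint and collect the parsed JSON replies."""
    results = {}
    for endpoint in ENDPOINTS:
        req = urllib.request.Request(
            f"{BASE_URL}/{endpoint}",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            results[endpoint] = json.load(resp)
    return results
```

Saving `results` to disk as JSON is what lets the frontend work offline without resubmitting API requests.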

For example, the slider at the top center of the user interface moves the celestial bodies (graphed in the center of the screen) around the signs of the zodiac, and it can auto-advance. Detailed information about each body is available on hover, along with other interesting, interactive measurements and displays.
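The math the slider relies on is simple: the zodiac wheel divides the 360-degree ecliptic into twelve 30-degree signs. The actual frontend is JavaScript, but the mapping can be sketched in Python:

```python
# The zodiac wheel divides the 360-degree ecliptic into twelve 30-degree
# signs, so placing a body is a simple division/modulo on its longitude.

SIGNS = ["Aries", "Taurus", "Gemini", "Cancer", "Leo", "Virgo",
         "Libra", "Scorpio", "Sagittarius", "Capricorn", "Aquarius", "Pisces"]

def sign_of(longitude_deg: float) -> tuple[str, float]:
    """Return the zodiac sign and the degrees into that sign for a longitude."""
    lon = longitude_deg % 360.0
    return SIGNS[int(lon // 30)], lon % 30.0

def advance(longitude_deg: float, step_deg: float) -> float:
    """What an auto-advancing slider does: move a body along the ecliptic."""
    return (longitude_deg + step_deg) % 360.0
```

For instance, a body at 45 degrees of ecliptic longitude sits 15 degrees into Taurus.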

There is a lot more I could say about this application, but I think the screenshots and descriptions above speak volumes. I don’t know of anything else like this tool. More to come on this soon, probably.

On A Personal Note

Regardless of one’s perspective, there is value in seeing this kind of app in operation. This is the kind of real-time open-source intelligence once exclusive to rooms filled with scholars, priests and kings.

With today’s tools, said information is available to essentially anyone, and eventually to everyone, in a form that those few who once had exclusive access would downright envy.

The relevance of these projects to self-quantification is immense, in my opinion. Just to be able to play with astrological data like this, in a way that creates new meaning, is pretty cool.

My next step along these lines of thinking, if there is to be more, is to combine this application with my previously mentioned “Orbital Oracle” software, so I can get on-demand advice from a locally deployed LLM about specific (future, present and/or past) planetary positions without having to resubmit API requests. We live in amazing times.


Wanted to take a moment and express how positively ingrained self-quantification has become in my everyday life. With the rise of Python automation and D3.js visualizations, much of what I do daily is measured by programs I have developed or help maintain.

With this post I would like to acknowledge that there is an array of quantified-self projects I have worked on but haven’t discussed. Let me now provide some context for that statement.

Measuring Character Interactions

A first example of an undisclosed QS project is my “character interactions” application, which takes a .txt file as input and generates a JSON file plus a network graph visualization featuring all direct interactions between the characters therein.
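One plausible way to count “direct interactions” is co-occurrence within a sentence; my sketch below uses that definition, though the real program may define an interaction differently. The nodes/links output shape is the one a D3.js force-directed graph typically consumes.

```python
# Hedged sketch of one way to count "direct interactions": two characters
# appearing in the same sentence. The real program's definition may differ.

import itertools
import re

def character_interactions(text: str, characters: list[str]) -> dict:
    """Count per-pair co-occurrences within sentences of the input text."""
    counts = {}
    for sentence in re.split(r"[.!?]+", text):
        present = [c for c in characters if c in sentence]
        for a, b in itertools.combinations(sorted(present), 2):
            counts[(a, b)] = counts.get((a, b), 0) + 1
    # Shape the result the way a D3.js force graph likes it: nodes + links.
    return {
        "nodes": [{"id": c} for c in characters],
        "links": [{"source": a, "target": b, "value": n}
                  for (a, b), n in sorted(counts.items())],
    }
```

Dumping the returned dict with `json.dump` gives both the JSON file and the data the visualization needs.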

As an example, here is a screenshot of output from William Gibson’s “Neuromancer”:

If you are a fan of cyberpunk novels, you have probably read “Neuromancer”. What this output taught me is that Molly (with nearly twice as many interactions as any other character) is the center of the story, not Case.

Expanded Audio Analyses And Visualizations

Another example is the 11 new metrics added to the “Detailed Audio Analyses and Visualizations” (DAAV) program mentioned in the first post at the top of this thread. There are now 20 audio features measured for each sound file processed.
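The post does not list DAAV's 20 features, so as a flavor of the kind of per-file metric involved, here are two common audio features (RMS energy and zero-crossing rate) computed in plain Python over a list of samples. These are my picks for illustration, not necessarily DAAV's actual metrics.

```python
# Two of the simpler audio features a tool like DAAV might measure:
# RMS energy and zero-crossing rate, computed over raw sample values.

import math

def rms(samples: list[float]) -> float:
    """Root-mean-square energy: roughly, the loudness of the frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples: list[float]) -> float:
    """Fraction of adjacent sample pairs that change sign
    (a rough noisiness/brightness indicator)."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    return crossings / (len(samples) - 1)
```

A unit-amplitude sine wave, for example, has an RMS of about 0.707 regardless of its frequency, while its zero-crossing rate scales with frequency.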

Here is a screenshot from the live demo of DAAV:

This data-rich application is fueling a much larger project, one where video and audio are intelligently fused together programmatically into entirely new experiences. More to come on this project at a later date.

Dozens Of Other Examples

Since starting this thread, I have developed 80+ unique software programs, each with some form of logging or JSON output. Other specific examples I am especially proud of include PDF Finder, AV-Sync and many others, each with a unique purpose:

In the long run I am interested in integrating my love for data analysis and automation into biofeedback systems, which has been a dream of mine since college, and is something I can now imagine (in part) because of the development this thread has inspired.

I hope those who are following along enjoy the learning taking place as much as I do.

Also, a big thank you to those who have encouraged me here. It has been a wonderful journey so far. There is much more to come.


Hello all. I hope your 2026 has been productive so far.

The sleeping-position experiment involving an AMG8833 IR thermal sensor from August 2025 has taken a fruitful turn.

Instead of using the single AMG8833 camera to measure my body heat, I am changing course. I have invested in 8 Force-Sensitive Resistor (FSR) square pads from Adafruit to measure where my body weight is positioned while sleeping.

In essence, as an alternative to mounting a thermal camera above my bed, I will be using pressure sensors below the mattress.
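Hardware aside (FSRs are analog, so on a Raspberry Pi the readings would typically come through an ADC such as an MCP3008), here is a sketch of how 8 pad readings might be turned into a sleeping-position estimate. The 2x4 pad layout and the classification thresholds are my assumptions about this build, not a finished design.

```python
# Hardware-agnostic sketch: given one reading per FSR pad, estimate where
# body weight sits. The 2x4 pad layout and thresholds are assumptions
# about this build, not a finished design.

# Pad positions as (x, y) on the bed: x = 0.0 left edge .. 1.0 right edge.
PAD_LAYOUT = [(x / 3, y) for y in (0.0, 1.0) for x in range(4)]

def weight_center(readings: list[float]) -> tuple[float, float]:
    """Pressure-weighted centroid of the 8 pad readings."""
    total = sum(readings)
    if total == 0:
        raise ValueError("no pressure detected")
    x = sum(r * px for r, (px, _) in zip(readings, PAD_LAYOUT)) / total
    y = sum(r * py for r, (_, py) in zip(readings, PAD_LAYOUT)) / total
    return x, y

def side_of_bed(readings: list[float]) -> str:
    """Classify the sleeper's lateral position from the centroid."""
    x, _ = weight_center(readings)
    if x < 0.4:
        return "left"
    if x > 0.6:
        return "right"
    return "center"
```

Logging the centroid over a night would give a time series of how my body weight shifts, which is exactly the data the thermal camera was meant to provide.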

Here is a screenshot of the parts I will be building with:

And here is the hardware I purchased on Amazon to help secure the FSR pads to my bed frame:

I had tinkered with the AMG8833 IR thermal camera (including soldering) but ran into connectivity issues on two different Raspberry Pi computers. So I am pivoting to an alternative approach in order to complete this self-quantification experiment.

This undertaking has been on my mind for over a year. It is time to complete it and move on to other, even more interesting measurements and automations.