Week 5: Prototyping the A.E.G.I.S. System
Transitioning from theoretical planning to building the first working prototype of A.E.G.I.S.
This week, I moved from theoretical planning to building the first working prototype of A.E.G.I.S. Because a system is useless if it only works on paper, I needed to physically integrate the microcontrollers and the Python backend to process live environmental disturbances. The objective was to create a scaled prototype that detects sudden human falls and triggers a local alert, all whilst strictly preserving user privacy.
Mathematical Engine Preparation
Before testing the physical hardware, I had to ensure the data extraction was mathematically sound. Because preserving dignity is the primary motive behind A.E.G.I.S., optical cameras cannot be used. Therefore, I wrote a Python script to parse raw Channel State Information (CSI) from the Wi-Fi network.
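As a rough illustration of that parsing step: ESP32 CSI commonly arrives as interleaved signed (imaginary, real) byte pairs per subcarrier, which get converted to per-subcarrier amplitudes before any statistics are run. The exact layout and scaling depend on the ESP-IDF version, so this is a sketch of the idea, not the project's actual parser.

```python
import math

def csi_amplitudes(raw):
    """Convert interleaved (imaginary, real) signed CSI bytes into
    per-subcarrier amplitudes for the variance stage.
    NOTE: this interleaving order is an assumption; it varies
    between ESP-IDF versions."""
    return [math.hypot(raw[i], raw[i + 1]) for i in range(0, len(raw) - 1, 2)]
```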
Instead of relying on basic signal strength, which fluctuates wildly and causes false alarms, I developed a Wave-Shift algorithm. This calculates the kinetic variance (measured as the standard deviation of CSI amplitude) across a rolling 10-frame buffer. The logic is simple: if the room is still, the variance remains low; if a person falls, the physical drop violently distorts the Wi-Fi waves, causing a massive variance spike. Keeping the maths this lightweight is what allows the system to run locally on the Raspberry Pi 5, entirely removing the need for external cloud computing.
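The Wave-Shift idea can be sketched in a few lines. The threshold value below is purely illustrative, not the tuned figure from the project:

```python
from collections import deque
import statistics

WINDOW = 10  # the rolling 10-frame buffer described above

class WaveShiftDetector:
    """Minimal sketch: flag a disturbance when the standard deviation
    of the last 10 CSI amplitude frames spikes past a threshold."""

    def __init__(self, threshold=5.0):  # threshold is an assumed value
        self.frames = deque(maxlen=WINDOW)
        self.threshold = threshold

    def push(self, amplitude):
        self.frames.append(amplitude)
        if len(self.frames) < WINDOW:
            return False  # not enough history yet
        # "Kinetic variance": population std-dev across the buffer
        return statistics.pstdev(self.frames) > self.threshold
```

A still room feeds in near-identical amplitudes, so the std-dev sits near zero; one violently distorted frame drags it well past the threshold.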
Hardware and Edge Deployment
With the math engine established, I moved to the physical hardware setup. I configured two ESP32 microcontrollers, a Beacon and a Receiver, to generate an invisible Wi-Fi tripwire. However, Wi-Fi CSI alone has a flaw: it detects a sudden drop, but if the person lies perfectly still unconscious, the variance returns to normal. The system wouldn't know if a person fell or if a heavy book simply dropped off a table.
To solve this, I integrated an HLK-LD2410C mmWave radar to track micro-movements. The cause-and-effect logic is now absolute: if the Wi-Fi detects a massive variance spike and the radar subsequently detects human respiration at floor level with no walking movement, the system definitively confirms a fall.
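That "spike, then stationary presence" rule can be modelled as a small state machine. The confirmation window length and the sensor-event names here are assumptions for illustration:

```python
import time

class FallConfirmer:
    """Sketch of the dual-sensor rule: a Wi-Fi variance spike arms a
    pending window; if the mmWave radar then reports a stationary
    presence (respiration detected, no walking motion) inside that
    window, the fall is confirmed. window_s is an assumed value."""

    def __init__(self, window_s=5.0, clock=time.monotonic):
        self.window_s = window_s
        self.clock = clock       # injectable for testing
        self.spike_at = None

    def on_variance_spike(self):
        self.spike_at = self.clock()

    def on_radar(self, presence, moving):
        if self.spike_at is None:
            return False  # no recent spike: a book falling, ignore
        if self.clock() - self.spike_at > self.window_s:
            self.spike_at = None  # window expired
            return False
        return presence and not moving
```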
For the alert output, a simple buzzer provides no detailed context. Consequently, I wired an ILI9341 SPI TFT display directly to the Pi’s GPIO pins. When the dual-sensor threshold is breached, the display dynamically flashes a red alert state.
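The flashing behaviour itself is just a colour toggle per refresh tick; the actual ILI9341 SPI draw call is driver-specific, so this sketch only decides the fill colour a driver would render:

```python
ALERT_RED = (255, 0, 0)
IDLE_BLACK = (0, 0, 0)

def alert_fill(tick, alarm_active):
    """Return the full-screen fill colour for the current refresh
    tick; alternating red/black produces the flashing alert state."""
    if not alarm_active:
        return IDLE_BLACK
    return ALERT_RED if tick % 2 == 0 else IDLE_BLACK
```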
Reflections and Challenges
Integrating the math with the physical hardware immediately revealed several real-world flaws. Firstly, if the Python script takes too long to process the CSI matrices, incoming frames back up in the Raspberry Pi's USB serial buffer. This caused a massive 15-second latency between a physical movement and the screen updating. Because a 15-second delay is disastrous in a medical emergency, I coded a low-latency failsafe that constantly monitors and flushes the queue, forcing the processing back to real-time.
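The core of that failsafe can be sketched as a drain-to-latest step: throw away any backlog and process only the newest frame, so slow matrix maths can never push the display seconds behind the live signal. The queue name is illustrative:

```python
import queue

def drain_to_latest(frame_queue):
    """Discard any backlog in the serial frame queue and return only
    the newest frame, or None if the queue is already empty."""
    frame = None
    while True:
        try:
            frame = frame_queue.get_nowait()
        except queue.Empty:
            return frame
```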
Secondly, I encountered the "Quiet Room" problem. If the system relies on a smartphone to generate network traffic, modern operating systems will eventually drop the closed-loop Wi-Fi connection to save battery. If the phone disconnects, the wave transmission ceases, and the entire safety net fails. To fix this, I programmed the ESP32s using C and FreeRTOS to autonomously fire a UDP ping ten times a second, ensuring the system is completely self-sustaining.
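On the device this keepalive runs as a FreeRTOS task written in C, but the behaviour is easy to model in Python: fire a small UDP datagram at a fixed rate so the CSI wave transmission never depends on a phone. The host, port, and payload below are placeholders:

```python
import socket
import time

def udp_keepalive(host, port, rate_hz=10, duration_s=1.0):
    """Python model of the ESP32 keepalive: send one small UDP
    datagram every 1/rate_hz seconds for duration_s seconds.
    Returns the number of datagrams sent."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = 1.0 / rate_hz
    deadline = time.monotonic() + duration_s
    sent = 0
    try:
        while time.monotonic() < deadline:
            sock.sendto(b"ping", (host, port))
            sent += 1
            time.sleep(interval)
    finally:
        sock.close()
    return sent
```

At ten pings a second the traffic is negligible, but it guarantees the Wi-Fi channel is continuously exercised, keeping the safety net self-sustaining.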