Translating Ritual into Multimodal Human-Computer Interaction.
Tangible HCI | Embedded Systems | Affective Computing | Experience Design
The Reflection Pool is a multisensory installation that explores how materiality and ritual can deepen our connection to digital information. By modernizing the ancient act of tossing coins into water for luck or memory, the project allows users to "program" physical coins with spoken narratives. Utilizing NFC technology, offline speech-to-text, and real-time emotion analysis, the system translates these narratives into a coordinated display of light, sound, and water vibration. This project demonstrates the power of Tangible User Interfaces (TUIs) to bridge the gap between the ephemeral nature of digital data and the profound weight of human emotion.
In a digital world where memories are often flattened behind glass screens, the Reflection Pool restores the physicality and weight of an intention.
The Artifact: Each coin serves as a physical container for a specific memory.
The Ritual: The act of "tossing" the coin into the pool acts as a symbolic release, transforming a private thought into a shared environmental expression.
The Medium: Water acts as a natural amplifier, using ripples and light reflections to make the digital processing of emotion visible and felt.
As a developer on this cross-disciplinary team, I helped engineer a robust system that balances real-time hardware sensing with complex software logic.
Core Processing: A Raspberry Pi 4 serves as the central orchestrator, managing the SQLite database and executing the Vosk speech-recognition model for local, privacy-preserving transcription.
Sensing & Feedback: An Arduino UNO R4 WiFi manages the capacitive touch triggers for recording and drives the LED rings and vibration motors.
Tangible Identification: Each coin is embedded with a unique NFC tag, which a PN532 reader module identifies. When a coin touches the water's surface, the system retrieves its specific emotional profile and triggers a unique multimodal response.
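The coin-to-response logic can be sketched in a few lines of Python. This is a minimal illustration, not the installation's actual code: the function names, the UID format, and the specific light/sound/vibration pairings are all assumptions made for the example.

```python
# Hypothetical sketch: map a coin's NFC tag UID to its stored emotional
# profile, then build the multimodal cue for the pool to perform.
# Sentiment names follow the project's categories; the cue details are illustrative.

SENTIMENT_RESPONSES = {
    "Joy":        {"light": "warm amber", "sound": "bright chimes", "vibration": "short pulses"},
    "Sadness":    {"light": "deep blue",  "sound": "low drone",     "vibration": "slow waves"},
    "Reflection": {"light": "soft white", "sound": "ambient pad",   "vibration": "gentle ripple"},
}

def respond_to_coin(uid: str, profiles: dict) -> dict:
    """Look up the coin's emotional profile and assemble its light/sound/vibration cue."""
    sentiment = profiles.get(uid, "Reflection")  # unknown coins fall back to a neutral cue
    return {"uid": uid, "sentiment": sentiment, **SENTIMENT_RESPONSES[sentiment]}

# Example: a coin previously "programmed" with a joyful memory
profiles = {"04:A2:3B:19": "Joy"}
cue = respond_to_coin("04:A2:3B:19", profiles)
```

In practice the profile lookup would hit the SQLite database on the Pi rather than an in-memory dict, but the shape of the dispatch is the same.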
The user journey is a three-stage process that blends physical interaction with computational feedback:
Programming (Speak): When a user picks up a coin, a capacitive sensor activates the USB microphone. The user speaks a memory, and the system uses a word-bank model to classify its emotion into categories such as Joy, Sadness, or Reflection.
Release (Toss): As the coin hits the water, the NFC reader identifies the specific coin and signals the Raspberry Pi.
Reflection (Feel): The pool responds with a multimodal display:
Light: LED rings cast color-coded auras based on the sentiment.
Sound: Ambient soundscapes play through integrated speakers.
Vibration: Motors create physical ripples, making the digital memory tangible.
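The word-bank classification in the Programming stage can be sketched as a keyword-counting pass over the transcript. The word lists below are illustrative assumptions, not the project's actual lexicon; only the category names come from the text above.

```python
# Hypothetical word-bank emotion classifier: count keyword hits per category
# in the transcribed memory and return the best-scoring emotion.

WORD_BANK = {
    "Joy":        {"happy", "laughed", "wonderful", "love", "celebrate"},
    "Sadness":    {"miss", "lost", "cried", "goodbye", "alone"},
    "Reflection": {"remember", "wonder", "quiet", "think", "still"},
}

def classify_emotion(transcript: str) -> str:
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    scores = {label: sum(w in bank for w in words)
              for label, bank in WORD_BANK.items()}
    best = max(scores, key=scores.get)
    # Fall back to a neutral category when no keywords match at all.
    return best if scores[best] > 0 else "Reflection"
```

A richer model (sentiment lexicons with weights, or a small on-device classifier) would slot in behind the same function signature without changing the rest of the pipeline.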
Offline Privacy: To encourage vulnerability, we implemented Vosk for local speech-to-text. User memories are never uploaded to the cloud; all audio processing stays on the device, protecting each participant's privacy.
Tangible "Cost": We found that the physical effort of picking up and tossing a coin makes the interaction more meaningful than a digital click. The "weight" of the interaction matches the emotional weight of the memory.
Robust Scaling: By using a SQLite database on the Pi, we enabled the pool to "remember" coins over time, allowing for a shared, evolving environment of collective memories.
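A minimal sketch of the persistence layer shows how SQLite lets the pool remember coins across sessions. The schema and column names here are assumptions for illustration; the actual database layout may differ.

```python
# Hypothetical SQLite schema for the Pi's coin memory. Each coin is keyed by
# its NFC tag UID; repeat tosses increment a counter rather than creating rows.
import sqlite3

def open_pool_db(path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS coins (
            uid        TEXT PRIMARY KEY,   -- NFC tag UID
            emotion    TEXT NOT NULL,      -- classified sentiment category
            transcript TEXT,               -- locally transcribed memory
            toss_count INTEGER DEFAULT 0   -- how often the coin has been released
        )""")
    return conn

def record_toss(conn, uid, emotion, transcript):
    # First toss inserts the coin; later tosses just bump its counter (UPSERT).
    conn.execute("""
        INSERT INTO coins (uid, emotion, transcript, toss_count)
        VALUES (?, ?, ?, 1)
        ON CONFLICT(uid) DO UPDATE SET toss_count = toss_count + 1""",
        (uid, emotion, transcript))
    conn.commit()

conn = open_pool_db()
record_toss(conn, "04:A2:3B:19", "Joy", "the summer we spent by the lake")
record_toss(conn, "04:A2:3B:19", "Joy", "the summer we spent by the lake")
count = conn.execute("SELECT toss_count FROM coins WHERE uid = ?",
                     ("04:A2:3B:19",)).fetchone()[0]
```

The `ON CONFLICT ... DO UPDATE` upsert requires SQLite 3.24+, which ships with current Raspberry Pi OS images.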
This project was a collaborative effort at the MIT Media Lab involving engineers and designers from Harvard GSD, Harvard GSE, MIT, and MassArt.
A 15-page deep dive into the theoretical framework of "Reflective Informatics" and the technical implementation of the multisensory loop.
Visual documentation of the fabrication process, hardware wiring, and user testing scenarios.