An exploration of AI’s emotional intelligence, using fine-tuned models, a chatbot, the Suno API, and a Raspberry Pi to sense emotions, associate them with colors, and express them through music and light—delivered as both a responsive conversational lamp with a built-in speaker and projection, and an app/web platform for journaling and archiving.
Our Momentone prototype runs on a Raspberry Pi, which hosts the conversation and feeds it into our fine-tuned Gemini model. The model, trained to map emotions to colors and to generate optimized music prompts, produces a text prompt, which is then sent to the Suno API to generate music.
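The pipeline above can be sketched as a small orchestration step: parse the fine-tuned model's reply and assemble a request for the music generator. The JSON field names (`emotion`, `color`, `energy`) and the prompt wording are our assumptions for illustration, not the actual Momentone schema or Suno request format.

```python
import json

def build_music_prompt(model_output: str) -> dict:
    """Parse the model's JSON reply and assemble a Suno-ready prompt.

    Field names here are illustrative assumptions about the
    fine-tuned model's output, not a documented schema.
    """
    data = json.loads(model_output)
    emotion = data["emotion"]   # e.g. "melancholy"
    color = data["color"]       # e.g. "#4A6FA5"
    energy = data["energy"]     # 0.0 (calm) .. 1.0 (intense)
    tempo = "slow, ambient" if energy < 0.5 else "upbeat, driving"
    return {
        "prompt": (
            f"A {tempo} instrumental piece evoking {emotion}, "
            f"with a mood matching the color {color}"
        ),
        "duration_s": 30,  # matches the 30-second Suno track below
        "color": color,
        "energy": energy,
    }

reply = '{"emotion": "melancholy", "color": "#4A6FA5", "energy": 0.3}'
print(build_music_prompt(reply)["prompt"])
```

Keeping this step as a pure function makes it easy to test offline, before any API keys or hardware are involved.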
03 - Music & Visuals
Create dynamic visuals with p5.js
→ Dominant color
→ Map energy level to motion
Use the Suno API to create a 30-second track
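One way the color and energy values could drive the visuals — the actual sketch runs in p5.js, but the mapping math is shown here in Python, and the parameter names and ranges (speed, noise scale, particle count) are illustrative assumptions, not values from the build:

```python
def hex_to_rgb(hex_color: str) -> tuple:
    """Convert the dominant color '#RRGGBB' into (r, g, b) 0..255."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def motion_params(energy: float) -> dict:
    """Map a normalized energy level (0..1) to motion parameters
    a p5.js sketch could consume. Ranges are illustrative."""
    energy = max(0.0, min(1.0, energy))  # clamp to [0, 1]
    return {
        "speed": 0.5 + energy * 4.5,          # particle speed, px/frame
        "noise_scale": 0.01 + energy * 0.04,  # Perlin-noise step size
        "particle_count": int(50 + energy * 450),
    }
```

A calm reading (low energy) yields slow, sparse motion; an intense one yields fast, dense motion, so the visual's pace tracks the emotional reading.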
04 - Physical Attributes
Live response from a lamp that changes color and plays music
Using a Raspberry Pi to control the lamp and play music
It acts as the central processor, taking in all the data and producing the output
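A minimal sketch of the lamp side on the Pi, assuming a common-cathode RGB LED driven via gpiozero and audio played through `mpg123`. The GPIO pin numbers and wiring are illustrative assumptions, not the prototype's actual layout:

```python
import subprocess

def hex_to_duty(hex_color: str) -> tuple:
    """Convert '#RRGGBB' to (r, g, b) duty cycles in 0..1 for PWM pins."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) / 255 for i in (0, 2, 4))

def light_and_play(hex_color: str, track_path: str) -> None:
    """Set the lamp's color and play the generated track (runs on a Pi)."""
    # Assumes an RGB LED on GPIO 17/27/22; pins are illustrative.
    from gpiozero import RGBLED
    led = RGBLED(red=17, green=27, blue=22)
    led.color = hex_to_duty(hex_color)
    # Play the Suno track through the built-in speaker
    # (assumes mpg123 is installed on the Pi).
    subprocess.run(["mpg123", "-q", track_path], check=True)
```

Separating the color conversion from the hardware call lets the mapping be tested off-device, while `light_and_play` only runs on the Pi itself.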
INSPIRATION
Sho Shibuya paints over the news with the everyday beauty of nature, later confronting it directly to explore his own emotions and reactions to the daily onslaught of stories—sad, happy, and always immediate. Inspired by this approach, we drew on The Pudding’s Emotion Wheel to train our model to recognize and express complex feelings—sometimes even those we struggle to name ourselves—through music and color.