Designing With Dogs in Shared Home Space

My ongoing thesis project asks: how can natural canine behaviors serve as design inputs for a dog-led, home-based interaction experience that benefits both dogs and humans? I am exploring a system that captures everyday behaviors (sniffing, pausing, touching, and persistence) and translates them into a legible visual language. The goal is to move shared living from ‘managing’ a dog’s behavior toward noticing patterns of comfort, curiosity, and engagement, so humans can make kinder everyday care decisions while the dog retains the agency to initiate or disengage.

I am interested in creating technologies that support interspecies relationships. As a dog parent, I want a better way to ‘see’ the hidden patterns in my dog’s daily life: which spaces she returns to, what she investigates, and what conditions seem to make her feel safe or curious at home. As a designer, I want to shift the home from a human-centered floor plan to a relational space where nonhuman preferences can be noticed and addressed.


Quick Prototype I

Prototype I: Shared agency through motion and light. I built a small mobile lamp with a front affordance for biting and pulling; interacting with it made the lamp move and project soft light, creating a shared spatial rhythm (motion → light → co-presence). Feedback: the cause-and-effect was emotionally convincing and playful, showing how the dog can influence her environment; however, the prototype still needed clearer framing and greater durability for repeated use.


Quick Prototype II

Prototype II: Scent-driven ‘drawing’ traces. I explored turning sniffing into a visible record (an open canvas rather than a task). The goal was to let the dog ‘speak’ in marks (where, how often, and how long she engaged) without training. Feedback highlighted two needs: humans want simple, readable summaries, and the system must respond reliably (visual confirmation and gentle rewards) to avoid frustration when scent cues imply an outcome.
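The ‘simple, readable summary’ that feedback asked for could be produced by aggregating the raw sniff log. The sketch below is a minimal illustration, assuming a hypothetical event format of (spot, seconds-of-engagement) pairs; the spot names and thresholds are placeholders, not part of the prototype.

```python
from collections import defaultdict

def summarize_sniff_events(events):
    """Roll raw sniff events up into a per-spot summary for humans.

    `events` is a list of (spot, duration_seconds) pairs as a sensing
    layer might log them -- an assumed format for illustration.
    Returns spots sorted by total engagement, most-visited first.
    """
    totals = defaultdict(lambda: {"visits": 0, "seconds": 0.0})
    for spot, seconds in events:
        totals[spot]["visits"] += 1
        totals[spot]["seconds"] += seconds
    return sorted(totals.items(), key=lambda kv: kv[1]["seconds"], reverse=True)

# Hypothetical day of logged events:
events = [("rug", 4.0), ("doorway", 1.5), ("rug", 6.5), ("sofa", 2.0)]
for spot, stats in summarize_sniff_events(events):
    print(f"{spot}: {stats['visits']} visits, {stats['seconds']:.1f}s total")
```

A summary like this keeps the dog-facing canvas open-ended while giving the human a glanceable digest of where attention actually went.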



Quick Prototype III

Prototype III: The painting interface (proof of concept). I created a foraging interface from everyday materials (egg cartons, bottle caps) with embedded FSR pressure sensors that capture intensity and duration. I developed a visual grammar: higher pressure generates larger, denser shapes; sustained time generates trails. Outcome: a real-time abstract ‘painting’ that makes effort visible. Limitations: the wired setup limits where behaviors can be captured, and the human side needs clearer guidance on how to read the visualization.
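The visual grammar described above is essentially a mapping from sensor readings to mark parameters. A minimal sketch, assuming a 10-bit analog FSR reading (0–1023, as on an Arduino) and illustrative pixel ranges that are not the thesis’s actual values:

```python
def pressure_to_mark(pressure, duration, max_pressure=1023):
    """Map one FSR reading to parameters for a mark in the live painting.

    pressure: raw analog reading (0..max_pressure; ADC range is assumed).
    duration: seconds of sustained contact.
    Returns (radius_px, density, trail_len) -- names and ranges are
    illustrative design choices for this sketch.
    """
    p = max(0.0, min(1.0, pressure / max_pressure))  # normalize to 0..1
    radius = 5 + p * 45              # harder presses draw larger shapes
    density = 0.2 + p * 0.8          # ...with denser fills
    trail = min(duration * 10, 200)  # longer engagement leaves longer trails
    return radius, density, trail
```

A light touch (`pressure_to_mark(50, 0.5)`) yields a small, sparse dot, while a firm, sustained press approaches the maximum radius and trail length, so effort reads directly in the image.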




Quick Prototype IV

Prototype IV: Wireless co-presence. I hacked a motion sensor and a Hue bridge so that the room’s lighting changes when Hexa performs a comforting sniffing behavior (burying her head in a bag). This prototype turns the entire room into a responsive interface, making her behavior immediately legible without cables.
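On the software side, a trigger like this amounts to sending a light-state payload to the bridge. The sketch below builds one with the standard library only; the field names follow the Philips Hue REST API, but the specific values (brightness, warmth, fade time) are guesses at a warm, slow-fading glow, not the settings used in the prototype.

```python
import json

def comfort_scene_payload(transition_seconds=4):
    """Build the JSON body for a warm 'comfort sniffing' light scene.

    Field names are from the Hue REST API; the values are assumptions.
    """
    state = {
        "on": True,
        "bri": 180,   # moderate brightness (Hue range is 1-254)
        "ct": 450,    # warm color temperature in mireds
        "transitiontime": transition_seconds * 10,  # Hue counts 100 ms steps
    }
    return json.dumps(state)

# In the prototype, a motion-sensor event would trigger an HTTP PUT of
# this body to http://<bridge-ip>/api/<username>/lights/<id>/state.
```

Keeping the scene definition separate from the trigger makes it easy to tune what the room ‘says’ without touching the sensing logic.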



Next

NATURE’S PLAYBOOK FOR INNOVATION