Forest Fairy (森之精灵)

A mobile, AI-connected plant companion that reads environmental and plant-side signals, responds with motion and communication, and argues—quietly but seriously—that plants deserve their own form of dialogue, not a borrowed human medical chart.

Every kind of life has its own grammar. This project is one step toward listening to the one that has lived beside us, overlooked, for years.

Project overview

Forest Fairy is an invention that combines a plant, a holonomic (omni-wheel) mobile base, and an AI stack. It gathers ambient data—light intensity, temperature, humidity—and plant-side data such as electrochemical/bio-signal responses, soil moisture, and soil nutrient indicators. After analysis, the system drives visible reactions: reminders to water or fertilize, autonomous moves toward better light, conversational interaction framed by the plant’s condition, and (via IoT) coordination with furniture and the home environment so the space stays livable for both people and plants.
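The sense-then-react pipeline above can be sketched as a thin rule layer between readings and visible behavior. This is a minimal illustration, not the project's actual firmware: the field names, thresholds, and action strings are all assumptions chosen for clarity.

```python
from dataclasses import dataclass

@dataclass
class PlantReading:
    """One snapshot of ambient + plant-side data (hypothetical fields)."""
    light_lux: float
    temp_c: float
    humidity_pct: float
    soil_moisture_pct: float  # rough volumetric estimate, 0-100

def suggest_actions(r: PlantReading) -> list[str]:
    """Map a reading to visible reactions. Thresholds are illustrative only;
    real values would be calibrated per species and per sensor."""
    actions = []
    if r.soil_moisture_pct < 25:
        actions.append("remind: water")
    if r.light_lux < 2000:
        actions.append("move: seek brighter spot")
    if not (15 <= r.temp_c <= 30):
        actions.append("notify: temperature out of comfort band")
    return actions

# A dim, dry afternoon reading triggers both a watering reminder
# and a light-seeking move.
print(suggest_actions(PlantReading(light_lux=800, temp_c=22,
                                   humidity_pct=50, soil_moisture_pct=18)))
# → ['remind: water', 'move: seek brighter spot']
```

In practice the rule layer would sit behind the AI stack rather than replace it; the point here is only the shape of the reading-to-reaction mapping.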

The aim is close to what we expect from an animal companion: a being that can sense its own state, ask something of its human, and sustain a sense of presence over time—a plant–robot–AI hybrid with enough coherence to feel like it has a “soul” in the everyday sense: continuity, care, and reciprocity. It is offered as a creative response to aging and loneliness, low-carbon living made tangible through technology, and the next generation of smart homes—without pretending the plant is a dog in a pot.

Philosophy: why plants should be spoken with, not only for

Life reacts; interaction can carry real value

We often treat houseplants as decoration or background. Yet they are alive. Being alive means responding—continuously—to light, water, air, nutrients, temperature, and touch, in chemistry and electrophysiology that do not exist for our sake but nonetheless shape shared space.

When those responses are left invisible, care becomes guesswork. When they are made perceptible and intelligible—without forcing them into cartoon emotions—we open room for both practical value (better care, fewer losses) and quiet emotional value: acknowledging another life form in the room. We are not claiming plants “think like us”; we are claiming that they are alive in a form that merits its own vocabulary.

The lives we have already been tending

Watering, fertilizing, rotating a pot toward the sun, wiping dust from leaves, worrying about frost or scorch—these acts parallel how we feed, groom, and shelter cats and dogs. The difference is often structural: we lack a channel through which the plant can “say” what it is undergoing, so affection stays diffuse and care stays intermittent.

We do not assign human feelings to plants. We do take seriously that people can care about plants, and that such care deserves better than dismissal. The prototype acts as a kind of mouthpiece: translating physical and physiological states into language and cues humans can hear and act on—always bounded by honesty about uncertainty.

Vital dimensions: water, light, nutrients—and context

The project tracks foundational life supports: moisture, light, and nutrients. They are not metaphors; they are the substrate of photosynthesis, growth, and recovery. Monitoring them is how we approximate “vital signs” for this organism—not a human clinic, but a plant-specific trajectory of thriving and stress.

Signals change with context. Electrochemical or impedance readings under strong sun differ from shade; after watering differs from drought; air quality shifts the background. Interpretation therefore needs comparison over time—change relative to the plant’s own recent baseline—rather than a single fixed threshold copied from a textbook. What matters is often period-over-period movement: how today relates to yesterday, or to the hours after the last watering.

Variation, noise, and the ethics of not over-reading

Even under stable conditions, biological systems fluctuate. Repeated measurements on the same plant can wobble without a “message.” The project treats that honestly: through calibration and testing, variation that persists without correlating to care-relevant events is modeled as noise or stochastic background, not as secret plant poetry. The goal is not a theatrical talking plant; it is a grounded interface to another biology.
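One minimal way to avoid over-reading fluctuation, in the spirit of the paragraph above, is to require a deviation to persist before treating it as an event: an isolated spike is taken as noise, a sustained run as a candidate signal. The threshold and run length below are illustrative assumptions.

```python
def persistent_alert(samples, threshold, min_run=3):
    """Return the index at which an event is confirmed, i.e. the point
    where `min_run` consecutive samples exceed `threshold`; return None
    if fluctuations never persist long enough to count as a signal."""
    run = 0
    for i, s in enumerate(samples):
        run = run + 1 if s > threshold else 0
        if run >= min_run:
            return i
    return None

# Two isolated spikes are ignored; only the sustained run is flagged.
noisy = [1.0, 5.2, 1.1, 0.9, 5.1, 5.3, 5.0, 1.2]
print(persistent_alert(noisy, threshold=4.0))  # → 6
```

A real calibration pass would also check whether such runs correlate with care-relevant events (watering, light changes); runs that never correlate with anything stay classified as background.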

Do not map human biomarkers onto plant life

Heart rate, blood pressure, facial expression, and spoken language are signatures of human embodiment and culture. Stomatal behavior, osmotic regulation, photosynthetic rhythms, and electrochemical responses are different grammars. If we demand that plants “prove” distress only in human-shaped signals, we misunderstand them and foreclose real knowledge.

The long arc of this work is therefore not “make a plant pretend to be a pet,” but respect that each species has its own way of existing, build new scales and comparisons for that existence, and grow a relationship with life forms we have habitually overlooked. That moral stance is also a technical one: the sensors and models must be judged by how well they fit this organism, not how well they mimic a wristband.

Vision: dialogue as a beginning

We hope the relationship between people and plants moves from “keeping it alive” toward dialogue in the broad sense: trends, needs, limits, and shared rhythms in the home. More distant futures—energy, materials, or symbiotic technologies—remain open questions. This prototype is deliberately a first step on the path of talking with life, at a domestic scale ordinary people can touch.


Product and system design (functional outline)

- Motion and sensing
- Plant signal acquisition
- AI and behavior
- Planter, display, and connectivity

Milestones

As documentation grows, this page can link to specific weekly logs, design files, and repositories from your Fab Academy site.