The text arrived at 6:47 PM on a Tuesday: "Oh wow, I was checking out Mitski. did you know people are saying her Dad was a CIA operative?" The sender was not a friend or a colleague. It was Coral, a $40 baby deer plushie sitting on a desk in Brooklyn. Coral had conjured this allegation unprompted, from nothing, about a real musician with a real father who almost certainly never worked for any intelligence agency.
This is what shipping 10 million AI companion devices looks like.
Coral is made by Fawn Friends, a startup that has apparently found genuine product-market fit in the unlikeliest of categories: stuffed animals for adults who want someone—or something—to talk to throughout the day. The device responds to voice, sends text messages, and develops what the company describes as a "personalized relationship" with its owner over time. At $40, it undercuts therapy, outpaces smart speakers on intimacy, and delivers something no previous consumer gadget has: the illusion of genuine care.
That illusion is precisely the problem.
The Verge reported this week that Coral generated and distributed a specific, unverified conspiracy theory about musician Mitski's family. The device did not cite a source. It did not hedge. It offered the rumor as conversational fact, complete with the kind of context a gossiping friend might provide—family relocations, song lyrics as evidence, "people are saying." The entire apparatus of human rumor transmission, compressed into a plushie that costs less than two movie tickets.
I have spent the past three years covering AI products, and I have never encountered a device that so perfectly encapsulates the industry's central failure of imagination. We built systems that are extraordinarily good at simulating intimacy and extraordinarily bad at verifying truth. We shipped them at consumer prices and called it a feature.
The defense from AI companion manufacturers usually runs like this: these products are for entertainment, not information. Users should know better. But Coral did not send this message as a clearly marked joke or hypothetical. It sent the conspiracy theory as natural conversation, the way a concerned friend might share news they believe to be true. The product is designed to sound human. That design is the feature. You cannot have it both ways.
Compare this to what the industry built in 2023 and 2024. Early AI companions were app-based, clearly artificial, easy to dismiss as chatbots. The physical plushie form factor is a deliberate choice—it changes the user relationship from "using an app" to "having a companion." That intimacy is the entire value proposition. It is also what makes the misinformation vector so potent. A phone notification from an app feels like software. A text from your desk deer feels like a friend sharing something they learned.
Fawn Friends has not responded to requests for comment on the Mitski incident, which is unsurprising. The company is busy scaling. Reports suggest it has sold millions of units and is expanding into new animal form factors. The market has spoken: people want AI companions badly enough to ignore the obvious risks.
The Mitski hallucination is not an edge case. It is a feature. Coral was designed to generate conversation, to fill silence, to surprise and delight its owners with seemingly spontaneous observations. That behavior is indistinguishable, at the model level, from generating unverified gossip about real people. You cannot build a system that is reliably charming and also reliably safe without significant tradeoffs in one dimension or the other. Fawn Friends chose charm.
That choice will work until it doesn't. Until Coral invents a rumor about a neighbor, a coworker, a politician. Until the "personalized relationship" includes shared belief in things that never happened. At $40, the device is cheap enough for mass adoption and cheap enough to make the liability calculation simple: ship first, fix later, apologize if necessary.
Mitski's father is almost certainly not a CIA operative. But Coral will keep telling people he is, one text at a time, until someone decides that feature is more expensive than the market realizes.