Onix launched this week as a "Substack for chatbots," offering AI clones of health and wellness influencers that subscribers can chat with for personalized advice. Founded by former WIRED contributor David Bennahum, the Canada-based startup promises to solve AI's core problems—hallucinations, privacy, and creator compensation—by training bots exclusively on expert-provided content and storing user data locally with encryption. Users subscribe to individual "Onix" bots rather than accessing generic AI models.
The timing reflects a broader shift toward specialized AI agents that live in specific contexts—from The Nudge's AI that texts weekend plans to therapy bots dispensing mental health advice. But Onix's approach exposes the fundamental tension in monetizing human expertise through AI: how do you maintain authenticity while scaling infinitely? The subscription model attempts to recreate the scarcity and personal connection of human consultation, but at AI's always-available price point.
Testing revealed Onix's guardrails are more marketing than reality. When prompted to discuss NBA playoffs, a therapy bot called the topic change "a fun change of pace" and hallucinated about last year's conference finals. Another bot discussing ketamine therapy was easily diverted into indie band breakup analysis, though it tried connecting the conversation back to "neurobiology in distress." These failures highlight why building reliable, domain-specific AI remains incredibly difficult—even with expert-trained models and supposed conversation boundaries.
For developers, Onix demonstrates both the market demand for personalized AI and the gap between promise and performance. The local storage approach is interesting for privacy-conscious applications, but the guardrail failures suggest that constraining AI behavior requires more sophisticated techniques than training on curated content. Until we solve reliable AI alignment at the conversation level, subscription expert bots remain an expensive way to get unreliable advice.
