Granola, an AI-powered meeting notes app, markets itself as "private by default," yet its default sharing setting makes notes accessible to anyone who has the link. The app, which captures audio from calendar-integrated meetings and generates AI summaries, ships with public link sharing enabled out of the box. Testing confirmed that shared notes remain viewable in private browser windows without authentication, exposing the note owner's identity, the creation timestamp, and partial transcript quotes when bullet points are selected.

This represents a familiar pattern in AI tooling where convenience trumps privacy in default configurations. As AI meeting assistants proliferate across enterprise environments, the gap between marketing language and actual privacy posture becomes a serious liability. The issue isn't just theoretical—sensitive business discussions, strategic planning sessions, and confidential conversations are being inadvertently exposed through shareable links that users may not realize are public.

Users can change the default sharing setting to "Only my company" or "Private" through profile settings, but this opt-out approach shifts the burden to users who reasonably assume their "private" notes are actually private. The company's documentation states that full transcript access requires opening notes "inside the Granola desktop app," though it leaves unclear whether that means any Granola user can access transcripts or only invited collaborators.

For teams evaluating AI meeting tools, this serves as a reminder to audit privacy defaults before deployment. The productivity gains from AI note-taking mean nothing if your competitive intelligence ends up accidentally public. Check your sharing settings, and assume any AI service defaults to the least private option until proven otherwise.
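The manual check described above, fetching a share link from a session with no cookies or credentials, can be scripted as part of a deployment audit. The sketch below is a minimal, generic example, not a Granola-specific tool: the login-wall markers are heuristic assumptions, and any URL you pass in would be a real share link exported from your own workspace.

```python
# Minimal sketch of a share-link exposure audit for AI note-taking tools.
# The classifier and its login-wall markers are heuristic assumptions,
# not any vendor's documented behavior.
import urllib.request
import urllib.error

# Strings that typically indicate the response is a login page,
# not the note itself (assumed markers; extend per service).
LOGIN_MARKERS = ("sign in", "log in", "authentication required")


def classify_response(status: int, body: str) -> str:
    """Classify an unauthenticated fetch of a share link.

    Returns "public" if note content came back without a login wall,
    "gated" if the service demanded authentication, "blocked" otherwise.
    """
    if status in (401, 403):
        return "gated"
    if status == 200:
        lowered = body.lower()
        if any(marker in lowered for marker in LOGIN_MARKERS):
            return "gated"
        return "public"
    return "blocked"


def audit_share_link(url: str) -> str:
    """Fetch url with no cookies or tokens and classify the result."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            return classify_response(resp.status, body)
    except urllib.error.HTTPError as err:
        return classify_response(err.code, "")
    except urllib.error.URLError:
        return "blocked"
```

A "public" result on a link you believed was private is exactly the failure mode described here; running such a check against a sample of links before rolling a tool out to a team surfaces it early.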