Ohio State researchers have published details on "Serpent," an attack on Apple Intelligence's authentication in macOS 26.0 (Tahoe); the paper is on arXiv as 2604.15637. Apple's design uses a two-stage token system (Token Granting Tokens, TGTs, and One-Time Tokens, OTTs, issued via the Privacy Pass protocol) to authenticate devices anonymously to Apple Intelligence services. Serpent demonstrates that the tokens end up in the login keychain in plaintext, can be extracted with standard macOS tools, and can be replayed on a different device to impersonate the original.
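To make the two-stage shape concrete, here is a minimal sketch of a TGT-then-OTT lifecycle. All names are hypothetical, and an HMAC stands in for Privacy Pass's blind-signature issuance, so this captures the token exchange but not the anonymity property:

```python
import hashlib
import hmac
import secrets

ISSUER_KEY = secrets.token_bytes(32)  # issuer-side secret (illustrative)

def issue_tgt() -> bytes:
    """Long-lived Token Granting Token: random body plus issuer MAC."""
    body = secrets.token_bytes(16)
    return body + hmac.new(ISSUER_KEY, b"tgt:" + body, hashlib.sha256).digest()

def redeem_for_ott(tgt: bytes) -> bytes:
    """Exchange a valid TGT for a short-lived One-Time Token."""
    body, tag = tgt[:16], tgt[16:]
    expect = hmac.new(ISSUER_KEY, b"tgt:" + body, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise PermissionError("invalid TGT")
    ott_body = secrets.token_bytes(16)
    return ott_body + hmac.new(ISSUER_KEY, b"ott:" + ott_body, hashlib.sha256).digest()

# Note: neither token encodes the device it was issued to. Both are pure
# bearer values, which is exactly the property Serpent exploits.
```

The point of the sketch is the last comment: validity is a property of the bytes alone, so whoever holds the bytes holds the identity.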
The extraction path is the engineering detail to understand. Malware running with ordinary user permissions on the victim's Mac can pull TGT and OTT values out of the login keychain via the SecItemCopyMatching API or the /usr/bin/security CLI, provided the user clicks through the routine keychain-access prompt. With the tokens in hand, the attacker writes the victim's tokens into their own local keychain, and subsequent Apple Intelligence requests identify the attacker's device as the victim's. No kernel exploit or privileged access is needed. The design flaw is that the tokens are portable bearer values stored in a container reachable by ordinary app-level malware.
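The theft-and-replay step itself needs no cryptography: because the tokens are bearer values, moving the bytes moves the identity. A toy model, with a plain dict standing in for each device's login keychain and all names hypothetical:

```python
# Toy model: each device's "keychain" is a dict readable by any
# user-level process on that device (hypothetical names throughout).
victim_keychain = {"aiservices.tgt": "tgt-bytes-for-victim-device"}
attacker_keychain = {"aiservices.tgt": "tgt-bytes-for-attacker-device"}

# Server side: identity is simply whatever token the request carries.
ISSUED_TO = {
    "tgt-bytes-for-victim-device": "victim",
    "tgt-bytes-for-attacker-device": "attacker",
}

def server_identify(token: str) -> str:
    return ISSUED_TO.get(token, "unknown")

# Step 1: malware with ordinary user permissions reads the victim's store.
stolen = victim_keychain["aiservices.tgt"]
# Step 2: the attacker overwrites their own keychain entry.
attacker_keychain["aiservices.tgt"] = stolen
# Step 3: the attacker's next request is attributed to the victim.
print(server_identify(attacker_keychain["aiservices.tgt"]))  # → victim
```

Nothing server-side can distinguish the two requests, because the protocol was designed so that it cannot: anonymity and bearer-token theft look identical on the wire.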
Apple patched the issue in macOS 26.2 by moving the tokens from the login keychain to the iCloud keychain and adding kernel permission checks. The researchers note this is a partial fix rather than a complete one: a sufficiently privileged attacker can still bypass entitlements via kernel extensions, and the broader class of "per-device anonymous auth tokens that can be stolen from local storage" is architectural rather than implementation-specific. The issue is tracked as CVE-2025-43509. For anyone building privacy-preserving anonymous-auth flows on client platforms, the lesson is that the storage layer matters as much as the protocol layer: Privacy Pass's guarantees break down if the tokens can be moved off the device they were issued to.
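One architectural mitigation is to stop the token being a pure bearer value: bind it at issuance to a key that never leaves the device (a non-exportable Secure Enclave key, in Apple terms) and require a fresh proof of possession on every use, in the style of OAuth DPoP. This is not Apple's fix, just a sketch of the binding idea, with symmetric HMAC standing in for an asymmetric signature and all names invented:

```python
import hashlib
import hmac
import secrets

# Enrollment: each device holds a key that never leaves it. The server
# keeps a copy here only because this sketch uses symmetric HMAC as a
# stand-in for verifying an asymmetric proof-of-possession signature.
victim_key = secrets.token_bytes(32)
attacker_key = secrets.token_bytes(32)

token = secrets.token_hex(16)
BOUND_KEY = {token: victim_key}  # server records the binding at issuance

def proof(key: bytes, challenge: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

def server_verify(tok: str, challenge: bytes, pop: bytes) -> bool:
    """Accept only a token accompanied by proof of the bound device key."""
    key = BOUND_KEY.get(tok)
    return key is not None and hmac.compare_digest(pop, proof(key, challenge))

challenge = secrets.token_bytes(16)  # fresh per request, to block replay

# Legitimate device: holds the bound key, so its proof verifies.
ok = server_verify(token, challenge, proof(victim_key, challenge))
# Thief: has the token bytes but not the device key, so it fails.
stolen = server_verify(token, challenge, proof(attacker_key, challenge))
print(ok, stolen)  # → True False
```

Under this design, exfiltrating the keychain yields only half a credential; the attacker would also need the non-exportable key, which is the part the hardware is good at protecting.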
Two points for builders. One, if your product uses anonymous access tokens tied to device identity (for AI services, DRM, API rate limiting, anything similar), the Serpent pattern is the template to test your storage design against. "Tokens in an OS-provided keychain" is not sufficient if that keychain is readable by ordinary user-permission code behind a routine consent prompt. Two, the Ohio State paper arrives the same week as the Mozilla/Mythos 271 zero-days and the Anthropic Glasswing leak report. The pattern is consistent: AI platform security is concentrating on model-access control (who gets the weights, who gets the API) while the classical app-and-OS authentication layer remains a softer target. The client side of AI deployment is where the next year of practical exploits will live.
