The European Parliament voted to delay the EU AI Act's core deadlines by 18-28 months while simultaneously backing a ban on nudify apps. High-risk AI systems now have until December 2027 to comply instead of August 2026, with sector-specific systems like medical devices getting until August 2028. AI watermarking requirements moved to November 2026. The nudify ban emerged after widespread outrage over Grok's sexualized deepfakes flooding X earlier this year, though details remain vague beyond an exemption for systems with "effective safety measures."
This delay exposes the EU's regulatory reality: ambitious timelines colliding with practical impossibility. Companies have already struggled with shifting guidance, and the EU itself has missed deadlines for publishing implementation details. The urgency around nudify apps set against the 18-month extension for everything else reveals how political pressure, not technical readiness, drives AI policy. Parliament's vote still needs European Council approval, adding more uncertainty to an already chaotic rollout.
While Europe grapples with basic compliance timelines, other platforms continue building. Roblox just open-sourced an MCP server letting Claude directly modify game experiences, and they're completing their dynamic head migration to enable better user expression. The contrast is stark: European regulators debating watermark deadlines while developers ship AI tools that actually work.
For AI builders, this means more breathing room in Europe but continued regulatory uncertainty. The nudify ban's vague language around "effective safety measures" will likely create compliance headaches. Companies should prepare for enforcement inconsistencies as different EU member states interpret these rules differently when they eventually take effect.
