Few AI companies can trace their origin story directly to the paper that started it all. Cohere was co-founded in 2019 by Aidan Gomez, Ivan Zhang, and Nick Frosst. Gomez was a co-author of "Attention Is All You Need," the 2017 Google Brain paper that introduced the Transformer architecture — the foundation beneath virtually every large language model in existence. He was an intern at the time, just 20 years old, working under researchers who would later scatter across the industry to found or lead some of its most important companies. Gomez took a different path from his co-authors: rather than building a consumer-facing AI product, he set out to build an enterprise AI platform from day one. That focus on business customers, not chatbots, has defined Cohere ever since.
While OpenAI chased consumer mindshare with ChatGPT and Anthropic carved out a safety-first niche, Cohere made a deliberate bet on the enterprise market. Their core products reflect this: Command (a family of instruction-following LLMs), Embed (text-to-vector embedding models), and Rerank (a re-ranking model for improving search results). These aren't flashy chatbot demos — they're the building blocks that companies need to deploy retrieval-augmented generation, semantic search, and document processing at scale. Cohere's models are designed to run anywhere: on major clouds, on-premises, or even in air-gapped environments. This cloud-agnostic, deployment-flexible approach is a meaningful differentiator in a market where most frontier models lock you into a single provider's ecosystem. Multilingual support has been another pillar — their models handle over 100 languages, making them attractive to global enterprises that can't afford English-only solutions.
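The embed-then-rerank retrieval pattern these products target can be sketched in plain Python. Everything below is an illustrative stand-in — toy three-dimensional vectors instead of real model embeddings, and cosine similarity in place of both models — not Cohere's actual API or scoring:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Stage 0: an embedding model maps each document to a vector.
# These toy 3-d vectors stand in for real embeddings.
doc_store = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "warranty terms": [0.7, 0.2, 0.3],
}

def retrieve(query_vec, k=2):
    """Stage 1: cheap first-pass retrieval over the whole corpus,
    ranked by vector similarity (what an Embed-style model enables)."""
    scored = sorted(doc_store.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc for doc, _ in scored[:k]]

def rerank(query_vec, candidates, top_n=1):
    """Stage 2: re-score only the short list. A real reranker is a
    cross-encoder that reads query and document text together; here
    we trivially reuse cosine similarity to show the control flow."""
    scored = sorted(candidates,
                    key=lambda d: cosine(query_vec, doc_store[d]),
                    reverse=True)
    return scored[:top_n]

query = [0.85, 0.15, 0.1]           # pretend this came from an embedding model
candidates = retrieve(query, k=2)   # broad, cheap first pass
best = rerank(query, candidates)    # narrow, more expensive second pass
```

The two-stage design is the point: embeddings make the first pass fast enough to run over millions of documents, and the reranker spends its heavier per-pair computation only on the handful of survivors.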
Cohere has raised significant capital, including a $270 million Series C in 2023 that valued the company at roughly $2.2 billion, followed by a $500 million Series D in 2024 that pushed the valuation past $5 billion. Their investor roster includes NVIDIA, Oracle, Salesforce Ventures, and Inovia Capital. The company has partnership deals with Oracle Cloud, AWS, and Google Cloud, leaning into a strategy of meeting enterprises wherever their infrastructure already lives. Cohere has also made a point of staying headquartered in Toronto — part of a broader effort to build a credible AI ecosystem outside Silicon Valley. Gomez has been vocal about Canada's role in AI research, particularly the legacy of Geoffrey Hinton and the Toronto machine learning community that helped spark the deep learning revolution. This positioning has helped Cohere attract talent that might not want to relocate to San Francisco and has given the company a distinct identity in a field dominated by Bay Area firms.
Cohere's challenge is that the enterprise AI market is getting crowded fast. OpenAI, Anthropic, Google, and Amazon are all aggressively courting the same Fortune 500 customers, often with deeper pockets and more brand recognition. Cohere's counter-argument is specialization: they aren't trying to build AGI or a consumer product. Their entire stack is optimized for enterprise deployment — data privacy, on-prem flexibility, and models purpose-built for retrieval and search workflows. The launch of their Compass embedding model, their North secure AI workspace platform, and their persistent investment in RAG tooling all reinforce this positioning. Whether that focused strategy can sustain Cohere as a standalone company in an increasingly consolidated market — or whether the sheer gravitational pull of the hyperscalers and frontier labs proves too strong — remains the central question for their next chapter.