Cohere

Enterprise AI Language Models | Toronto, Ontario | Founded 2019

Cohere is a leading enterprise artificial intelligence company that builds large language models (LLMs) tailored for business deployments. Unlike competitors focused on consumer AI, Cohere differentiates itself with cloud-agnostic, privacy-first LLMs that can be deployed in any cloud environment, including on-premises, positioning it as a preferred AI infrastructure partner for regulated industries and global enterprises. For investors seeking to buy Cohere stock or gain private market exposure to enterprise AI, Cohere represents one of the most compelling pure-play opportunities in the sector.

Company Overview

Founded: 2019
Headquarters: Toronto, Ontario, Canada
Industry: Generative AI / Enterprise Software
Total Funding: ~$1.57 billion
Current Valuation: $7 billion (September 2025) [1]
Annual Recurring Revenue: $240 million (2025) [2]
Employee Count: 201–500
Website: cohere.com

Highlights for Cohere

  • Sacra estimates Cohere reached $240 million in ARR in 2025, up 287% year-over-year from $62 million at the end of 2024. [2]
  • Valued at $7 billion following an extended $600 million fundraise in August–September 2025 led by Radical Ventures and Inovia Capital. [1]
  • Approximately 85% of revenue generated from private cloud deployments, with enterprise customers including Oracle, Fujitsu, RBC, LG, and Notion. [2]
  • Cloud-agnostic architecture enables deployment on AWS, Google Cloud, Azure, virtual private clouds, and on-premises servers. [3]
  • Co-founded by Aidan Gomez, co-author of "Attention Is All You Need," the seminal paper introducing the Transformer architecture that underpins modern LLMs. [2]
  • Strategic partnerships with SAP (Business Suite integration) and Dell Technologies (Cohere North on-premises offering) extend enterprise reach. [2]
  • CEO Aidan Gomez has publicly stated Cohere is "on a clear path to profitability" and expects to go public, potentially as the first pure-play AI model lab IPO. [2]

Product & Technology

Core Offerings:

  • Command: Cohere's flagship text-generation model, fine-tunable on enterprise data for tasks such as drafting emails, generating press releases, answering dataset questions, and summarizing documents. In August 2025, Cohere launched Command A Reasoning, an upgraded model optimized for enterprise customer-service workflows. [3]
  • Coral: An enterprise knowledge assistant that uses generative AI to support business operations, combining internal and external data sources with citations to mitigate hallucinations. Coral is deployable across cloud and on-premises environments. [3]
  • Embed: Converts text into vectors for semantic search and classification, enabling context-aware information retrieval across enterprise knowledge bases. [3]
  • Rerank: Re-orders search results by semantic relevance, improving the quality of information surfaced from large document repositories. [3]
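To make the Embed and Rerank concepts concrete, here is a toy sketch in plain Python. It does not use Cohere's SDK; the bag-of-words "embedding" stands in for a real embedding model, and the cosine-similarity sort stands in for a learned reranker:

```python
import math

def embed(text, vocab):
    """Toy embedding: bag-of-words vector over a fixed vocabulary.
    A real system would call an embedding model instead."""
    words = text.lower().split()
    return [words.count(term) for term in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "quarterly revenue report for the finance team",
    "employee onboarding checklist and HR policies",
    "cloud deployment guide for on-premises servers",
]
vocab = sorted({w for d in docs for w in d.lower().split()})

query = "revenue report"
q_vec = embed(query, vocab)
# Rerank step: order documents by similarity to the query.
ranked = sorted(docs, key=lambda d: cosine(embed(d, vocab), q_vec), reverse=True)
print(ranked[0])  # the finance document surfaces first
```

A production pipeline would embed documents once, store the vectors in an index, and rerank only the top retrieval candidates.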

Technology Stack:

  • Proprietary large language models trained for enterprise accuracy, safety, and compliance.
  • Cloud-agnostic deployment across all major hyperscalers and on-premises environments.
  • Integration with SAP Business Suite and Dell Technologies for enterprise distribution. [2]
  • Retrieval-Augmented Generation (RAG) capabilities for grounding outputs in real business data. [3]
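The RAG pattern above can be sketched in a few lines. This is an illustrative toy, not Cohere's implementation: retrieval here is naive word overlap over a hypothetical two-entry knowledge base, where a production system would use vector search plus reranking and return citations:

```python
# Minimal RAG loop: retrieve the most relevant snippet, then ground
# the prompt in it so the model answers from real business data.
knowledge_base = {
    "pricing": "The Production Plan charges $1.50 per million input tokens.",
    "deployment": "Models can run on AWS, Azure, GCP, or on-premises.",
}

def retrieve(question: str) -> str:
    # Pick the document sharing the most words with the question.
    q_words = set(question.lower().split())
    return max(knowledge_base.values(),
               key=lambda doc: len(q_words & set(doc.lower().split())))

def build_prompt(question: str) -> str:
    context = retrieve(question)
    # Constraining the model to retrieved text is what mitigates hallucination.
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("Where can the models run?"))
```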

Competitive Advantages

  • Cloud Agnosticism: Unlike OpenAI (tied to Microsoft Azure) or Anthropic (AWS/Google), Cohere deploys across any cloud or on-premises, a critical advantage for regulated industries and government clients. [2]
  • Enterprise Focus: By avoiding the consumer chatbot space (it offers no ChatGPT equivalent), Cohere does not compete with its own API customers, a structural advantage OpenAI lacks. [2]
  • Sovereign AI Positioning: As Canada's leading AI model lab with government support through the Sovereign AI Compute Strategy, Cohere is well-positioned to serve Western governments seeking AI independence from U.S. hyperscalers. [2]
  • Talent Cost Advantage: Toronto's talent pool is 30–40% cheaper than the Bay Area, providing a structural cost advantage in a capital-intensive industry. [2]
  • Gross Margin Expansion: Gross margins averaged around 70% in 2025, expanding by 25 basis points year-over-year, reflecting efficient scaling. [2]

Market Opportunity

The global enterprise AI market is projected to exceed $300 billion by 2026, driven by enterprise adoption of generative AI tools for knowledge work, customer service, and business automation. Cohere addresses the fast-growing market for enterprise LLM APIs and private cloud deployments, where demand for data privacy, customization, and compliance is driving companies away from general-purpose consumer AI products. The global cloud computing market, Cohere's primary distribution channel, was valued at over $600 billion in 2024. [4]

Market Trends:

  • Enterprises increasingly demanding private, customizable AI deployments to protect sensitive data.
  • Governments globally accelerating investment in "sovereign AI" to reduce dependence on U.S. hyperscalers.
  • Multi-LLM adoption patterns emerging, with enterprises using Cohere, OpenAI, and Anthropic in parallel to avoid vendor lock-in. [2]
  • AI integration expanding across verticals including financial services, healthcare, and manufacturing.

Financial Overview

Annual Recurring Revenue: Sacra estimates Cohere hit $240 million in ARR in 2025, up 287% year-over-year from $62 million at the end of 2024, surpassing its internal $200 million target. Quarter-over-quarter growth exceeded 50% throughout 2025. [2]
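The growth figures quoted above are internally consistent, as a quick check shows:

```python
# Sanity check of the ARR growth figures: $62M at the end of 2024
# growing to an estimated $240M in 2025.
arr_2024 = 62e6
arr_2025 = 240e6
yoy_growth = (arr_2025 - arr_2024) / arr_2024 * 100
print(f"{yoy_growth:.0f}% year-over-year")  # ≈ 287%
```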

Revenue Model: Cohere combines usage-based API fees (pay-per-token) with fixed model licensing subscriptions for enterprise clients. The Free Plan provides rate-limited developer access at no cost; the Production Plan offers pay-as-you-go pricing at $1.50 per million input tokens and $2.00 per million output tokens; the Enterprise Plan provides custom pricing for dedicated model instances and specialized deployments. [3]
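At the Production Plan rates stated above, per-request cost is straightforward to estimate. A small sketch, using the rates as quoted in this profile (current pricing should be confirmed at cohere.com):

```python
# Usage-based API cost at the quoted Production Plan rates.
INPUT_RATE = 1.50   # USD per million input tokens
OUTPUT_RATE = 2.00  # USD per million output tokens

def api_cost(input_tokens: int, output_tokens: int) -> float:
    """Pay-per-token cost in USD for one workload."""
    return (input_tokens / 1_000_000) * INPUT_RATE + \
           (output_tokens / 1_000_000) * OUTPUT_RATE

# e.g. summarizing 2M tokens of documents into 1M tokens of output:
print(f"${api_cost(2_000_000, 1_000_000):.2f}")  # $5.00
```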

Gross Margins: Approximately 70% in 2025, supported by the company's capital-efficient flexible deployment model. [2]

Funding History and Investment Rounds

Key Investors: Nvidia, Oracle, Salesforce Ventures, Radical Ventures, Inovia Capital, AMD Ventures, Healthcare of Ontario Pension Plan, DTCP, Index Ventures, Thomvest Ventures [1][5]

Round            Amount   Total Raised   Valuation   Notable Investors
Seed             $5M      $5M            N/A         Multiple angels
Series A         $40M     $45M           $200M       Index Ventures
Series B         $125M    $170M          N/A         Thomvest Ventures
Series C         $270M    $440M          $2.2B       Nvidia, Oracle, Salesforce Ventures
Series D         $500M    $940M          $5.5B       PSP Investments, Cisco, Fujitsu
Series E (ext.)  $600M    ~$1.57B        $7B         Radical Ventures, Inovia Capital, AMD Ventures

Leadership Team

  • Aidan Gomez, CEO & Co-Founder: Co-authored "Attention Is All You Need," the foundational Transformer paper. Previously an intern at Google Brain and researcher at For.ai. PhD in Computer Science, University of Oxford; B.Sc. Computer Science, University of Toronto. [6]
  • Martin Kon, President & COO: Previously CFO at YouTube and Senior Partner & Managing Director at Boston Consulting Group. MBA from Queen's University; undergraduate at McGill University. [6]
  • Nick Frosst, Co-Founder: Previously a researcher at Google Brain's Toronto lab; among the first hires at Google's Toronto AI lab. B.Sc. Computer Science & Cognitive Science, University of Toronto. [6]

Investment Considerations

Growth Drivers:

  • Accelerating enterprise AI adoption creates sustained demand for private, customizable LLM deployments.
  • SAP and Dell partnerships provide distribution leverage across thousands of enterprise customers. [2]
  • Sovereign AI mandates from Western governments represent a growing revenue opportunity. [2]
  • Potential IPO on a "clear path to profitability" before 2029; it would be the first pure-play AI model lab to go public. [2]

Risks and Challenges:

  • Intense competition from well-capitalized rivals including OpenAI, Anthropic, Google, and Meta's open-source LLaMA.
  • Rapid model commoditization could compress API pricing and margins over time.
  • Heavy dependence on enterprise sales cycles, which can be long and unpredictable.

Future Outlook:

  • Continued expansion of enterprise LLM use cases across financial services, healthcare, and government sectors.
  • Development of more autonomous AI agents capable of performing complex business tasks end-to-end.
  • Potential public market debut as a pure-play AI infrastructure company. [2]

References

[1] Source: Reuters.com / TechCrunch.com

[2] Source: Sacra.com

[3] Source: Cohere.com

[4] Source: Grandviewresearch.com

[5] Source: Pitchbook.com

[6] Source: LinkedIn.com

Rainmaker Securities, LLC ("RMS") is a FINRA (FINRA.org) registered broker-dealer and SIPC (SIPC.org) member. Find this broker-dealer and its agents at brokercheck.finra.org. Our relationship summary can be found at rainmakersecurities.com/disclosures.

RMS is engaged by its clients to make referrals to buyers or sellers of private securities ("Securities"). If such client closes a Securities transaction with a buyer or seller so referred, RMS is entitled to a success fee from the client. Such success fee may be in the form of cash or in warrants to purchase securities of the client or client's affiliate. RMS or RMS representatives may hold equity in its issuer clients or in the issuers of securities purchased or sold by the parties to a transaction.

This communication is confidential and is addressed only to its intended recipient. This communication does not represent an offer or solicitation to buy or sell Securities. Such an offer must be made via definitive legal documentation by the seller of securities.

Jeremy Nelson