Nvidia has agreed to a non-exclusive license for AI chip startup Groq’s inference technology and will hire away Groq’s top leadership, including founder and CEO Jonathan Ross, in a deal that stops short of a formal acquisition.
Groq said the agreement gives Nvidia access to its inference-focused chip know-how, while Groq continues operating as an independent company under new CEO Simon Edwards. The company also said its cloud business will keep running.
The announcement landed after CNBC reported that Nvidia had agreed to buy Groq for about $20 billion in cash. Neither Nvidia nor Groq confirmed that figure or described the arrangement as an acquisition, and Groq’s own statement framed it as a licensing deal paired with a leadership transition.
License and leadership shift
Groq said Ross, who previously helped start Google’s AI chip efforts, will join Nvidia along with Groq President Sunny Madra and other engineering team members. A person close to Nvidia confirmed the licensing agreement to Reuters.
While financial details were not disclosed, the structure fits a pattern that has become more common across Big Tech: pay for technology access and hire key talent, while avoiding a full takeover that could trigger heavier regulatory scrutiny. Reuters pointed to similar arrangements involving Microsoft and Meta, as well as talent moves tied to Amazon and earlier Nvidia deals.
Analysts also flagged antitrust as a key risk. Bernstein’s Stacy Rasgon wrote that the non-exclusive structure may preserve the appearance of competition even as leadership and technical talent move to Nvidia.
Inference race heats up
Groq specializes in inference, the stage where trained AI models generate responses for users in real time. Nvidia still dominates the training side of the AI hardware market, but inference is attracting more competition from AMD and newer chip startups, including Groq and Cerebras Systems.
Groq has positioned itself around an approach that relies on on-chip SRAM rather than external high-bandwidth memory (HBM), which can help avoid bottlenecks tied to tight memory supply. The tradeoff is that on-chip memory is far smaller, limiting the size of models that can be served per chip, even as it improves speed for certain workloads.
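To make the capacity tradeoff concrete, here is a minimal back-of-the-envelope sketch. The capacities used below are illustrative assumptions (roughly 0.23 GB of SRAM per chip versus 80 GB of HBM per chip), not figures stated by Groq or Nvidia, and the calculation considers only model weights, ignoring activations and KV caches.

```python
import math

def chips_needed(params_billions: float, bytes_per_param: int,
                 mem_per_chip_gb: float) -> int:
    """Minimum number of chips needed to hold a model's weights
    entirely in the given memory tier."""
    model_gb = params_billions * bytes_per_param  # billions of params -> GB
    return math.ceil(model_gb / mem_per_chip_gb)

# Hypothetical capacities: ~0.23 GB SRAM per chip vs. ~80 GB HBM per chip.
sram_chips = chips_needed(70, 2, 0.23)  # 70B-param model, 2-byte (FP16) weights
hbm_chips = chips_needed(70, 2, 80)
print(sram_chips, hbm_chips)  # many SRAM chips vs. a handful of HBM chips
```

Under these assumptions, serving the model from SRAM requires hundreds of chips networked together, while a couple of HBM-equipped chips suffice on capacity alone; the SRAM approach trades that chip count for much higher on-chip bandwidth per token.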
Reuters also reported that Groq more than doubled its valuation to $6.9 billion from $2.8 billion after a $750 million funding round in September, underscoring investor demand for inference alternatives as the AI market broadens beyond training clusters.
Key numbers (from reported statements)
| Item | What was reported |
|---|---|
| Deal structure | Non-exclusive license plus executive and engineering hires |
| Acquisition rumor | CNBC reported about $20B cash, not confirmed by Nvidia or Groq |
| Groq valuation | $6.9B after a $750M funding round (Sept) |