Is Nvidia a monopoly? The real question is how to energize competition in the semiconductor top 10
In 2024 the FTC and DOJ accused Nvidia of antitrust behavior, specifically of fostering an AI chip monopoly [1][2], and more recently the Trump administration considered breaking the company up [3]. The underlying concern is the lack of AI chip competition in the US semiconductor industry, which is understandable as we watch major players like AMD and Broadcom (building custom chips for Big AI) -- not to mention startups like Cerebras, Groq, Positron, NextSilicon, etc. -- struggle to gain traction. Instead of antitrust litigation, which against Nvidia is a totally losing effort, the government can take a win-win approach that energizes competition in the semiconductor top 10.
Challenges facing Intel and AMD are widely discussed, and even ChatGPT knows that since the early 2000s Nvidia has been the only continuously committed semiconductor player in High Performance Computing (HPC, aka "fast compute", which includes AI). But Nvidia did have a natural competitor -- Texas Instruments. TI and Nvidia went head-to-head, and TI did extremely well technologically. By 2014 TI had multicore performance density [4] advantages, as well as a market cap of $55B compared to Nvidia's $11B. The AI revolution was happening; key events included a convolutional model dominating industry image recognition in 2012 [5], Google purchasing DeepMind for $400M in early 2014, and Amazon launching Alexa in 2014.
The rise of AI started in speech and image recognition, areas that rely heavily on signal processing and in which TI already had a huge customer base. With their multicore signal processing CPUs, TI had the semiconductor recipe for low-cost, energy-efficient, and fast inference. After 30 years of waiting, TI's killer app had arrived.
The old saying goes that it's darkest before the dawn, but in TI's case it was morning and they still couldn't see. The "AI winter" was Pleistocene history by 2015, but TI leadership still failed to recognize the significance -- and growing revenue -- of AI, considering it a costly science project akin to "fuzzy logic" (their words; I was in the room). They decided analog (analog?!) was their future and cut their HPC product line. Since then, the divergence between TI and Nvidia has been painful to watch; Nvidia's market cap surpassed TI's in 2018 and is now 15x larger.
While TI executives squandered their opportunity to compete in fast compute and the next industrial revolution, Nvidia's leadership spent those same 30 years persevering at all costs, outworking everyone else, surviving brushes with commercial death, and incrementally, painstakingly building their HPC software ecosystem. Hardly antitrust behavior!
Wall Street analysts say Steve Mollenkopf (Qualcomm) was the 2010s' "worst tech company CEO". Well, I have a better candidate: Rich Templeton, who could not understand AI even when it was explained to him by his own executives, third-party partners, and Bay Area tech industry experts. Now TI has Haviv Ilan, who started at TI nearly 25 years ago and was responsible for radio modem chips, which also failed. I see no indication he has an AI strategy. And don't let anyone tell you TI has ADAS [6] chips for automotive -- that doesn't cut it. Ask them where their multicore AI accelerator cards for servers are, where their AI-related open source presence is, and where their partnerships with Super Micro, Dell, et al. are. Without those, they are nowhere in AI.

Compared to Musk, Huang, Bezos, and other multidisciplinary CEOs who have engineering degrees and relentlessly invest in software ecosystems for their hardware, a lack of technology leadership, vision, and mojo was, and remains, TI's underlying issue. Nvidia's best defense in antitrust litigation is to point to TI's lack of technology vision -- nobody "forced" or conspired with Templeton to turn a blind eye to AI and take down Nvidia's strongest competitor; he did it on his own.
Even with Nvidia's dominance, there remains plenty of room for competition. Nvidia pays little attention to the massive, unsustainable energy consumption required by AI data centers, which leads to prohibitive emissions and water usage. Instead of attacking the energy problem at the semiconductor and algorithm level, Big AI wants to power data centers with nuclear reactors [7]. To this day TI has home-grown, energy-efficient HPC technology that Nvidia does not. The fastest way to ramp up US competition with Nvidia -- and at the same time tackle unsustainable data centers -- is to get Texas Instruments back in the game. [8]
And how to do that? Currently TI gets 50+% of their revenue from China, and this comes from mid-node analog and discrete chips -- any of which, should CCP planners decide they need it in sufficient volume, they can reproduce with zero concern about patent litigation, rendering TI's 45,000+ patent portfolio essentially useless. The US government should not be granting US semiconductor companies China sales waivers and $1.6B in subsidies [9] unless they are working hard on valuable, forward-looking products and R&D vital to the US economy and future. It's just a matter of time before TI's China revenue drops off; between their own complacency and the government's help, they are slowly becoming irrelevant, churning out non-state-of-the-art chips. Before TI faces existential challenges like Boeing, Intel, and Ford, the government should use its leverage to push TI's leadership to regain AI technology leadership and bring back their HPC line of products.
What specific policies can the government implement? It helps to look at the solar power industry, adjacent to silicon integrated circuits but less complex. A "solar race" played out over the last 20 years, and the US lost convincingly, as explained in David Fickling's extremely well researched and documented article "How the US Lost the Solar Power Race to China". In fast compute the US still leads, largely due to one driven, visionary, and ultra-hard-working CEO, Jensen Huang. But when he ages out and Nvidia is run by an MBA -- or even a less technically visionary engineer -- it will be game over.

Before we lose again, the government should leverage subsidies, tax breaks, and even tariffs if politically forced into them. There should be a "too important to not compete" designation that requires certain key companies to be run by visionary technical management, to spend a set percentage of revenue on AI chip R&D, and to meet other accountability criteria. Milestones can be enforced; for example, if results are not achieved on schedule, subsidy payments stop or tariffs are reduced. If we had done that in solar, instead of just handing US manufacturers free cash in the form of unconditional tariffs, the situation today might be different.
Let's see the US government use their leverage for a positive, productive outcome instead of ineffective antitrust litigation!
[4] Performance density is the ratio of performance to power consumption and chip size. Size includes the square area and height of the chip package itself, plus the heat sinks, liquid cooling, and fans required per chip. Increased performance density yields a wide range of systemic advantages -- for example reduced cooling and single-slot-thickness PCIe accelerator cards, allowing hundreds of cores to be added to a 1U server. Since the 2010s, and to this day, Nvidia GPU accelerator cards are double-slot thickness, limiting the number of cards in a 1U server.
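As a rough illustration of the ratio defined above, performance density can be sketched in a few lines. All numbers below are invented for the example (not measured GPU or DSP specs), and PCIe slot count stands in for "size":

```python
# Hypothetical illustration of performance density: throughput per watt
# per PCIe slot occupied. The figures are made up, not vendor specs.

def performance_density(gflops, watts, card_slots):
    """Performance density in GFLOPS / W / slot."""
    return gflops / (watts * card_slots)

# A double-slot card vs. a single-slot card with the same raw throughput:
double_slot = performance_density(gflops=1000.0, watts=300.0, card_slots=2)
single_slot = performance_density(gflops=1000.0, watts=150.0, card_slots=1)

print(round(double_slot, 2))  # 1.67
print(round(single_slot, 2))  # 6.67
```

The point of the metric: at equal raw throughput, the single-slot, lower-power card packs several times more usable performance into a 1U server.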
[5] The first large models were Convolutional Neural Networks (CNNs). A key event on the AI timeline came on Sep 30th, 2012, when a CNN decisively won the ImageNet image recognition competition.
[6] Advanced Driver Assistance Systems
[8] A new-generation 10-20 nm C66xx multicore chip would be roughly 10x more energy efficient at a given performance level than an Nvidia H100. For example: 32 cores, 2 GHz clock rate, 16 GB of addressable external memory, matrix multiply acceleration, and an expanded SIMD instruction set. The objective would be to compete with Nvidia, as well as Groq and Cerebras, in matrix multiply performance density, while also providing C/C++ coding and external I/O options the others cannot.
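As a back-of-envelope sanity check on those example specs, peak matrix-multiply throughput scales as cores x clock x MACs per cycle. The MACs-per-cycle figure below is my assumption for illustration, not a TI spec:

```python
# Back-of-envelope peak throughput for the hypothetical chip in [8].
# macs_per_cycle is an assumed figure, not a published TI number.

def peak_gflops(cores, clock_ghz, macs_per_cycle):
    """Peak throughput in GFLOPS; each MAC counts as 2 floating-point ops."""
    return cores * clock_ghz * macs_per_cycle * 2

# 32 cores at 2 GHz, assuming 16 MACs/cycle/core from a matrix-multiply unit:
print(peak_gflops(cores=32, clock_ghz=2.0, macs_per_cycle=16))  # 2048.0
```

The interesting comparison is not raw GFLOPS, where an H100 wins easily, but that figure divided by the chip's power draw and card size -- the performance density metric from [4].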
#nvidia, #texasinstruments, #ai, #inference, #datacenters, #gpu, #antitrust, #doj, #ftc, #monopoly