Anthropic’s top executive has a message for a technology world rushing to pour billions into artificial intelligence: bigger isn’t always better.

Daniela Amodei, who co-founded the company and leads it as president, often talks about an idea that shapes everything the business does. She calls it “do more with less.”

That thinking puts Anthropic at odds with what most of Silicon Valley believes right now. The largest technology companies and their financial backers act as if size determines who wins. They are gathering unprecedented amounts of money, buying computer chips years before they need them, and constructing enormous buildings full of servers across middle America. Their bet is simple: whoever builds the biggest operation wins.

OpenAI shows this approach most clearly. The firm has made commitments worth about $1.4 trillion for computing power and related infrastructure. Working with various partners, it is setting up huge data center facilities and getting hold of advanced chips faster than the industry has ever managed before.

Anthropic thinks there is a different path. The company believes careful spending, better algorithms, and smarter ways of using technology can keep it competitive without trying to outspend everyone else.

The situation carries extra weight because Daniela Amodei and her brother Dario helped create the very philosophy they are now working against. Dario runs Anthropic as chief executive and previously worked at Baidu and Google. He was part of the research team that popularized the scaling approach now guiding how companies build AI models. The basic principle says that adding more computing power, adding more data, and making models larger tends to make them better in ways you can predict.

Scaling laws drive industry economics

That pattern now supports the entire financial structure of the AI competition.
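The predictability behind that principle is usually expressed as a power law: loss falls smoothly, and forecastably, as model size and training data grow. Here is a minimal illustrative sketch of that shape; every constant below is hypothetical, chosen only to show the behavior, not drawn from Anthropic or any published fit:

```python
# Illustrative power-law scaling curve of the common form
#   loss(N, D) = E + A / N**alpha + B / D**beta
# where N is parameter count and D is training tokens.
# All constants here are made up for illustration only.

E, A, B = 1.7, 400.0, 410.0   # hypothetical irreducible loss and coefficients
alpha, beta = 0.34, 0.28      # hypothetical power-law exponents

def predicted_loss(params: float, tokens: float) -> float:
    """Predicted loss for a model with `params` parameters trained on `tokens` tokens."""
    return E + A / params**alpha + B / tokens**beta

# Scaling the model 10x and the data 10x lowers predicted loss by a
# predictable amount -- which is why the forecasts feel bankable.
small = predicted_loss(1e9, 2e10)    # 1B params, 20B tokens
large = predicted_loss(1e10, 2e11)   # 10B params, 200B tokens
assert large < small
```

The point of the sketch is only that the curve is smooth and monotone in both inputs, which is what lets companies project returns on ever-larger training runs.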
It explains why cloud providers spend so much money, why chip manufacturers command such high stock prices, and why private investors put huge valuations on companies still losing money as they grow.

But Anthropic wants to show that the next stage of the competition won’t be won simply by whoever can afford the biggest initial training runs. Its plan focuses on higher-quality training data, post-training techniques that improve how models think through problems, and product decisions that make models cheaper to operate and easier for customers to use at scale. That last part matters because the computing bills never end once models are actually running.

Anthropic isn’t working with pocket change. The company has around $100 billion in computing commitments and expects those needs to grow if it wants to stay at the leading edge. As reported by Cryptopolitan recently, Amazon powered Anthropic’s Claude model with its new Rainier AI infrastructure featuring over one million Trainium2 chips.

“The compute requirements for the future are very large,” Daniela Amodei told CNBC. “So our expectation is, yes, we will need more compute to be able to just stay at the frontier as we get bigger.”

Even so, the company says the big numbers being reported throughout the sector often can’t be compared directly, and industry-wide confidence about the correct amount to spend isn’t as firm as it appears.

“A lot of the numbers that are thrown around are sort of not exactly apples to apples, because of just how the structure of some of these deals are kind of set up,” she said, referring to how companies feel pushed to commit early so they can get hardware years later.

The larger reality, she noted, is that even people who helped develop the scaling theory have been caught off guard by how steadily performance and business results have grown.
“We have continued to be surprised, even as the people who pioneered this belief in scaling laws,” Daniela Amodei said. “Something that I hear from my colleagues a lot is that the exponential continues until it doesn’t. And every year we’ve been like, ‘Well, this can’t possibly be the case that things will continue on the exponential,’ and then every year it has.”

What happens when growth stops?

Daniela Amodei separated the technology trend from the economic trend, an important distinction that often gets blurred in public discussion. On the technology alone, she said Anthropic doesn’t see progress slowing based on what it has observed.

“Regardless of how good the technology is, it takes time for that to be used in a business or sort of personal context,” she said. “The real question to me is: How quickly can businesses in particular, but also individuals, leverage the technology?”

“The exponential continues until it doesn’t,” Daniela Amodei said. The question for 2026 is what happens to the AI race, and to the companies building it, if the industry’s favorite growth pattern finally stops working. As the industry grapples with AI compute demand growing twice as fast as Moore’s Law, requiring an estimated $500 billion annually through 2030, Anthropic’s bet on efficiency over raw scale may prove prescient. Or it may find that in the AI race, there is no substitute for overwhelming computational power.