OpenAI, the company behind ChatGPT, fired back last week with a countersuit against Elon Musk—marking another chapter in what has become a very public legal battle between Elon Musk, Sam Altman, OpenAI, and a select few others.
Musk has been at war with OpenAI and CEO Sam Altman for nearly a year now, accusing the company of abandoning its original mission. Musk’s original lawsuit centers on claims that OpenAI violated its founders’ agreement and broke away from its original nonprofit roots in pursuit of commercial gain—specifically through the creation of OpenAI Global LLC, its for-profit arm, as well as in its quest to convert its nonprofit entity into a for-profit company.
In its countersuit, OpenAI filed a legal response accusing Musk of engaging in “unlawful and unfair business practices” designed to disrupt OpenAI’s operations and smear its reputation. OpenAI also claims Musk is doing all of this primarily to benefit his own AI company, xAI.
If you’re just catching up on this feud, we are still in its early innings, and now is the time to get up to speed. Our previous coverage breaks down the legal filings, the history between Musk and OpenAI, and what’s at stake for both companies.
Claude’s $200 subscription
Last week, Anthropic rolled out a new “Max plan” for its AI chatbot Claude, a subscription with $100 and $200 per month tiers that offer what the company calls “expanded usage”—in plainer terms, higher usage limits than before. The $100/month tier offers 5x the usage of the standard Pro plan, and the $200/month tier ups that to 20x.
A move like this will probably be welcomed by developers and startups who have Claude integrated somewhere in their tech stack. But under the hood, it’s about more than performance for users; it’s about profitability for Anthropic, the company behind Claude.
Anthropic is probably hoping that this new Max plan opens up a new revenue channel. After all, OpenAI’s own $200/month Pro plan is rumored to have brought in an additional $300 million after it launched.
This pricing shift also highlights a bigger trend that’s been playing out behind the scenes of the AI boom. Despite billions in spending, none of these leading AI companies have turned a profit yet, and investors are starting to get concerned, which is why they are beginning to ask when and where a return on their investment will come from.
Offering a more expensive product is one way to get closer to the profitability that investors are beginning to pressure these AI companies to produce, but relying on that one stream of revenue from subscription models alone is unlikely to get any of the companies there—especially when you begin to analyze what consumer demand for AI goods and services actually looks like.

IEA report explores AI energy consumption
The International Energy Agency (IEA) released a report last week titled Energy and AI, which explored the growing relationship between artificial intelligence and global energy consumption.
At 301 pages, it’s a dense report, but here are a few takeaways that stood out:
1. AI is spiking electricity demand
According to the report, electricity consumption by data centers is projected to more than double by 2030, and AI is the number one driver of that growth. The United States is expected to be responsible for more than half of the global increase. By the end of the decade, U.S. data center electricity usage could exceed the total power used to produce steel, aluminum, cement, chemicals, and all other energy-intensive goods combined.
2. Where will the power come from?
It’s not just about building more data centers; the IEA notes that several energy grids worldwide are already under heavy strain. Without significant infrastructure upgrades—especially new transmission lines, which can take 4 to 8 years to build—many of the data center expansion plans we keep hearing about may be delayed or canceled.
3. AI’s energy impact isn’t being treated like crypto’s
As I was going through the report, I realized that the tone around AI energy consumption is very different from the attitude these same agencies had toward block reward mining. Even though data centers could be using more power than all of Japan by 2030, the IEA didn’t argue that the industry is consuming too much electricity. Instead, it argues that AI’s contributions to innovation—especially in energy efficiency and grid optimization—may ultimately justify its consumption.
Overall, the report brings some of the artificial intelligence industry’s less explored yet crucial components to the surface. While AI companies have been saying for a while now that the U.S. needs more data centers to stay competitive, the IEA report underscores a portion of the argument that we typically don’t hear from AI companies: that it’s not just about the data centers, it’s about the power sources as well. If power generation and delivery solutions aren’t explored and implemented quickly, they have the potential to significantly slow down the plans some of the tech giants have for the AI industry.
In order for artificial intelligence (AI) to work right within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership—allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why Enterprise blockchain will be the backbone of AI.
Watch: Micropayments are what are going to allow people to trust AI