The week on AI – October 20, 2024

Behind OpenAI’s audacious plan to make AI flow like electricity

Sam Altman, the CEO of OpenAI, aims to create a global pool of computing power dedicated to developing the next generation of artificial intelligence. Initially, Altman sought to raise trillions of dollars by engaging US officials, Middle Eastern investors, and Asian manufacturing giants; he has since scaled his ambitions down to hundreds of billions. Altman envisions making AI as pervasive as electricity, but the United States is struggling to keep building data centers fast enough, underscoring how much power AI requires to operate. To grasp the scale of Altman’s vision: he wants to construct data centers costing USD 1 billion each, at least 5 times more than current data centers. Each would house two million AI chips and consume 5 gigawatts of electricity. TSMC, for its part, does not appear to take Altman’s plan very seriously. In parallel, OpenAI is seeking funding for its ongoing operations, which continue to consume more cash than they generate. Read
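To get a feel for those figures, here is a rough back-of-envelope calculation (illustrative only, using the article's numbers; the per-chip result folds in cooling and facility overhead, not just chip draw):

```python
# Per-chip power implied by the article's figures:
# 5 gigawatts of facility power spread across two million AI chips.
facility_power_w = 5e9      # 5 GW, as stated in the article
chips = 2_000_000           # two million AI chips per data center

per_chip_w = facility_power_w / chips
print(f"~{per_chip_w / 1000:.1f} kW of facility power per chip")  # ~2.5 kW
```

That is several times the draw of a single current GPU accelerator, which is consistent with the article's point about how power-hungry these planned data centers would be.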

What’s at Stake in a Strained Microsoft-OpenAI Partnership

Microsoft has invested billions of dollars in OpenAI, most recently through a funding round totaling $6.6 billion that included cash and access to substantial computing power. OpenAI expects to spend as much as $37.5 billion annually on computing resources in the coming years. Tensions between the two companies are rising, however: Microsoft alleges that OpenAI is not delivering the AI software it expected, while OpenAI complains that Microsoft is not providing enough computing capacity. Against this backdrop, Microsoft has begun diversifying its AI strategy, hiring top talent such as former Google executive Mustafa Suleyman, while OpenAI has started partnering with Microsoft’s competitors to secure additional computing resources. OpenAI’s continued lack of profitability remains a significant challenge. At the same time, Google is preparing to sharpen its competitiveness in the AI sector. Read

Cerebras, an AI chipmaker trying to take on Nvidia, files for an IPO

The Silicon Valley company would be one of the first artificial intelligence companies to go public since the release of ChatGPT about two years ago. Rather than developing small chips, Cerebras is betting on going big: its chips are up to 56 times larger than the chips traditionally used for artificial intelligence. A Cerebras chip measures up to 21.5 cm by 21.5 cm; nobody else produces chips that large. Read

                         Cerebras      Nvidia Blackwell (planned)
Transistors on chip      4 trillion    208 billion
AI cores                 900,000       25,000
Petaflops of peak AI     125           20
Manufacturing process    5 nm          4 nm
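To put the comparison in perspective, a quick ratio calculation using the table's own figures (the Blackwell numbers are for Nvidia's planned chip):

```python
# Ratios of Cerebras vs. planned Nvidia Blackwell, from the table above.
cerebras = {"transistors": 4e12, "ai_cores": 900_000, "petaflops": 125}
blackwell = {"transistors": 208e9, "ai_cores": 25_000, "petaflops": 20}

for metric in cerebras:
    ratio = cerebras[metric] / blackwell[metric]
    print(f"{metric}: Cerebras ~{ratio:.1f}x")
```

By these figures, Cerebras packs roughly 19x the transistors, 36x the AI cores, and about 6x the peak petaflops of a planned Blackwell chip, which is the payoff of its wafer-scale approach.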
