“OpenAI Chief Seeks New Microsoft Funds to Build ‘Superintelligence’: Sam Altman Expects Big Tech Group Will Back Start-Up’s Mission to Create Software As Intelligent As Humans”, Madhumita Murgia, 2023-11-13:

…Asked if Microsoft would keep investing further, Altman said: “I’d hope so.” He added: “There’s a long way to go, and a lot of compute to build out between here and AGI . . . training expenses are just huge.”

Altman said “revenue growth had been good this year”, without providing financial details, and that the company remained unprofitable due to training costs. But he said the Microsoft partnership would ensure that “we both make money on each other’s success, and everybody is happy”.

…To build out the enterprise business, Altman said he hired executives such as Brad Lightcap, who previously worked at Dropbox and start-up accelerator Y Combinator, as his chief operating officer. Altman, meanwhile, splits his time between two areas: research into “how to build superintelligence” and ways to build up computing power to do so. “The vision is to make AGI, figure out how to make it safe . . . and figure out the benefits.”

…The company is also working on GPT-5, the next generation of its AI model, Altman said, although he did not commit to a timeline for its release. It will require more data to train on, which Altman said would come from a combination of publicly available data sets on the internet, as well as proprietary data from companies. OpenAI recently put out a call for large-scale data sets from organizations that “are not already easily accessible online to the public today”, particularly for long-form writing or conversations in any format. While GPT-5 is likely to be more sophisticated than its predecessors, Altman said it was technically hard to predict exactly what new capabilities and skills the model might have. “Until we go train that model, it’s like a fun guessing game for us”, he said. “We’re trying to get better at it, because I think it’s important from a safety perspective to predict the capabilities. But I can’t tell you here’s exactly what it’s going to do that GPT-4 didn’t.” To train its models, OpenAI, like most other large AI companies, uses Nvidia’s advanced H100 chips, which became Silicon Valley’s hottest commodity over the past year as rival tech companies raced to secure the crucial semiconductors needed to build AI systems.

[Hardware shortage] Altman said there had been “a brutal crunch” all year due to supply shortages of Nvidia’s $40,000-a-piece chips. He said his company had received H100s and was expecting more soon, adding that “next year looks already like it’s going to be better”. However, as other players such as Google, Microsoft, AMD, and Intel prepare to release rival AI chips, the dependence on Nvidia is unlikely to last much longer. “I think the magic of capitalism is doing its thing here. And a lot of people would like to be Nvidia now”, Altman said.

…“[Other companies] have a lot of smart people. But they did not do it. They did not do it even after I thought we kind of had proved it with GPT-3”, he said.

…Ultimately, Altman said “the biggest missing piece” in the race to develop AGI is what is required for such systems to make fundamental leaps of understanding. “There was a long period of time where the right thing for [Isaac] Newton to do was to read more math textbooks, and talk to professors and practice problems . . . that’s what our current models do”, said Altman, using an example a colleague had previously used. But he added that Newton was never going to invent calculus by simply reading about geometry or algebra. “And neither are our models”, Altman said. “And so the question is, what is the missing idea to go generate net new . . . knowledge for humanity? I think that’s the biggest thing to go work on.” [is this a reference to Q?]