2016 was a remarkable year in the history of computing. That year, numerous experienced chip designers struck out on their own to create novel kinds of parts to improve the performance of artificial intelligence. It has taken a few years, but the world is finally seeing what those hopefuls have been working on.
…Bajic, and other chip teams, are responding to the explosion in the size of deep learning models, such as BERT and OpenAI's "GPT-2", but also even newer models such as Nvidia's "Megatron-LM", Microsoft's "Turing-NLG", and neural net models, which Bajic said he couldn't discuss publicly, that will have on the order of one trillion parameters.