Popular OSS codebase sizes compared to the context length of GPT-4: (This filters for runtime code, e.g. removing docs and tests)
A 32k-token context length is a massive unlock for codegen. For example: all the TypeScript files in the popular project tRPC sum to ~82k tokens. You can fit over a third of the entire project into context at any given point in time. (And it will cost you ~$2)
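The "~$2" figure checks out as a back-of-envelope calculation, assuming GPT-4-32k's March 2023 list price of $0.06 per 1k prompt tokens (the price is an assumption here, not stated in the thread):

```python
# Rough cost of filling the full 32k context with prompt tokens.
# Assumes GPT-4-32k pricing of $0.06 per 1k prompt tokens (March 2023 rates).
CONTEXT_TOKENS = 32_000
PRICE_PER_1K_PROMPT_USD = 0.06

cost = CONTEXT_TOKENS / 1000 * PRICE_PER_1K_PROMPT_USD
print(f"${cost:.2f}")  # → $1.92, i.e. "~$2"
```

Completion tokens were billed at a higher rate, so a full round trip would cost somewhat more.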

Mar 15, 2023 · 9:47 PM UTC

Replying to @mathemagic1an
Have you tried making unit tests and e2e tests for the codebase using GPT-4?
Replying to @mathemagic1an
Add the Linux kernel to this chart!