"PanGu-Coder: Program Synthesis With Function-Level Language Modeling", Fenia Christopoulou, Gerasimos Lampouras, Milan Gritta, Guchun Zhang, Yinpeng Guo, Zhongqi Li, Qi Zhang, Meng Xiao, Bo Shen, Lin Li, Hao Yu, Li Yan, Pingyi Zhou, Xin Wang, Yuchi Ma, Ignacio Iacobacci, Yasheng Wang, Guangtai Liang, Jiansheng Wei, Xin Jiang, Qianxiang Wang, Qun Liu (2022-07-22):

We present PanGu-Coder, a pretrained decoder-only language model adopting the PanGu-α architecture for text-to-code generation, i.e. the synthesis of programming language solutions given a natural language problem description.

We train PanGu-Coder using a two-stage strategy. The first stage employs Causal Language Modeling (CLM) to pre-train on raw programming language data (Python). The second stage uses a combination of Causal Language Modeling and Masked Language Modeling (MLM) training objectives that focus on the downstream task of text-to-code generation, training on loosely curated pairs of natural language problem definitions and code functions. Finally, we discuss PanGu-Coder-FT, which is fine-tuned on a combination of competitive programming problems and code with continuous integration tests.
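The two-stage data setup described above can be sketched as follows. This is a minimal illustration, not the paper's actual preprocessing: the special markers (`<descr>`, `<code>`) and function names are hypothetical stand-ins for whatever delimiters the model's vocabulary actually uses.

```python
def stage1_sample(source_code: str) -> str:
    """Stage 1 (CLM): the model trains on raw code as-is,
    predicting each token from the preceding ones."""
    return source_code


def stage2_sample(description: str, code: str) -> str:
    """Stage 2: a natural-language problem definition is paired with
    its code function, so the language-modeling objectives condition
    code generation on the description. Markers are hypothetical."""
    return f"<descr> {description} <code> {code}"


# Example: a docstring/function pair formatted as a stage-2 sample.
sample = stage2_sample(
    "Return the sum of two integers.",
    "def add(a, b):\n    return a + b",
)
```

At inference time, text-to-code generation then amounts to prompting with the `<descr> ... <code>` prefix and letting the decoder complete the function body.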

We evaluate PanGu-Coder with a focus on whether it generates functionally correct programs, and demonstrate that it achieves equivalent or better performance than similarly sized models (up to 2.6B parameters), such as Codex, while attending over a smaller context window and training on less data.
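Functional correctness in this line of work is typically reported as pass@k: the probability that at least one of k sampled programs passes the reference unit tests. A sketch of the standard unbiased estimator (popularized by the Codex paper, assumed here rather than taken from this abstract) over n samples of which c pass:

```python
from math import comb


def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: 1 - C(n-c, k) / C(n, k).

    n: total generated samples per problem
    c: number of samples that pass all unit tests
    k: budget of samples the user is allowed to try
    """
    if n - c < k:
        # Every size-k subset contains at least one passing sample.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)
```

Averaging this quantity across all benchmark problems gives the pass@k score used to compare models such as PanGu-Coder and Codex.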