Bibliography (6):
MegatronLM: Training Billion+ Parameter Language Models Using GPU Model Parallelism
2021-junseong-hyperclova.html
GPT-3: Language Models are Few-Shot Learners
Wikipedia Bibliography:
Nvidia DGX: https://en.wikipedia.org/wiki/Nvidia_DGX
SK Telecom: https://en.wikipedia.org/wiki/SK_Telecom
Kakao: https://en.wikipedia.org/wiki/Kakao