- See Also
Links
- “Universal Self-Consistency for Large Language Model Generation”, Chen et al 2023
- “A Systematic Comparison of Syllogistic Reasoning in Humans and Language Models”, Eisape et al 2023
- “Android in the Wild: A Large-Scale Dataset for Android Device Control”, Rawles et al 2023
- “Google's Newest A.I. Model Uses Nearly 5× More Text Data for Training Than Its Predecessor”, Elias 2023
- “Pretraining Language Models With Human Preferences”, Korbak et al 2023
- Miscellaneous
- Link Bibliography
See Also
Links
“Universal Self-Consistency for Large Language Model Generation”, Chen et al 2023
“A Systematic Comparison of Syllogistic Reasoning in Humans and Language Models”, Eisape et al 2023
“Android in the Wild: A Large-Scale Dataset for Android Device Control”, Rawles et al 2023
“Google's Newest A.I. Model Uses Nearly 5× More Text Data for Training Than Its Predecessor”, Elias 2023
“Pretraining Language Models With Human Preferences”, Korbak et al 2023
Miscellaneous
Link Bibliography
- “Google's Newest A.I. Model Uses Nearly 5× More Text Data for Training Than Its Predecessor”, Jennifer Elias: https://www.cnbc.com/2023/05/16/googles-palm-2-uses-nearly-five-times-more-text-data-than-predecessor.html