glad that awkwardly worded tweet was redeemed by a good discussion. still, i shouldn't post half-asleep in the middle of the night while trying to calm down a dog shaking in fear of severe thunderstorms
Can GPT-3 invent new words that make sense? If not I’m not interested
in reality what i was trying to ask is whether GPT-3 can do word play, punning, etc.
Strongly recommend @gwern's writeup at gwern.net/GPT-3, which does a great job of exploring some of the strengths and limitations of GPT-3 when it comes to word play, if you haven't read it already!
Tl;dr Word tokenisation is the main obstacle: GPT-3 is aware of puns, but can't necessarily tell which words are actually similar, since it never sees individual characters.
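A toy sketch of the point above (the vocabulary and greedy matcher here are hypothetical, not GPT-3's actual BPE): a subword tokenizer maps pieces of text to opaque integer IDs, so two words that rhyme or share letters can end up as completely unrelated tokens.

```python
# Hypothetical toy vocabulary mapping whole words to token IDs.
toy_vocab = {"pun": 101, "pan": 102, "puns": 103, "fun": 104}

def tokenize(word, vocab):
    """Greedy longest-match tokenization over a tiny toy vocabulary."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab:
                tokens.append(vocab[piece])
                i = j
                break
        else:
            raise ValueError(f"no token for {word[i:]!r}")
    return tokens

# "pun" and "fun" rhyme and share the letters "un", but the model
# only ever sees the IDs 101 vs 104, which encode no spelling at all.
print(tokenize("pun", toy_vocab))  # [101]
print(tokenize("fun", toy_vocab))  # [104]
```

So any character-level similarity a pun depends on is invisible unless it happens to survive in the token boundaries.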
Replying to @NineOfNein @gwern
One thing I've noticed is that when generating other alphabets such as Russian, the tokens are generally one character in size, I think. I haven't tried it yet, but I wonder if puns and rhymes would actually work better in Russian and other languages.

Jul 24, 2020 · 7:02 PM UTC

On the other hand, the strength of the model is definitely not as good outside of English.
This tweet is unavailable
Yes, I was a bit overoptimistic. Tokens are definitely shorter than in English, but the performance is even worse. Getting it to explain its thinking, it clearly can't tell at all which sentences/words sound the same, which is odd, since homonyms tend to have the same letters in Russian.