Here are ChatGPT convos that will be interesting to NLP experts and nobody else. Should this be possible? Think about the tokenizer.
Here's another task that the tokenizer should make impossible. Fortunately for my sanity, ChatGPT almost always gets it wrong.

Dec 9, 2022 · 7:16 AM UTC
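
To see why the tokenizer is the suspect here, it helps to look at what the model actually receives. Here's a minimal sketch using OpenAI's tiktoken library; the "gpt2" encoding is an assumption, a stand-in for whatever tokenizer the deployed model uses.

```python
# A minimal sketch of what the model "sees" instead of letters or
# sounds, using OpenAI's tiktoken library (pip install tiktoken).
# The "gpt2" encoding is an assumption.
import tiktoken

enc = tiktoken.get_encoding("gpt2")

for word in ["strawberry", "haiku", "unbelievable"]:
    ids = enc.encode(word)
    pieces = [enc.decode([t]) for t in ids]
    print(f"{word!r} -> {ids} -> {pieces}")

# The subword pieces are chosen by frequency, not phonology, so their
# boundaries rarely line up with syllables -- and common words are a
# single opaque integer with no internal structure at all.
```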

Replying to @tomgoldsteincs
I don’t get it. What did it get wrong?
The tokenized representation of these words doesn't contain information about how they're pronounced or how many syllables they have. For this reason, the model can't know how many syllables are in each line of the poem. In other words: it can't write haikus.
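
To make that concrete: the integer IDs the model sees carry nothing about sound. A small sketch, under the same tiktoken and "gpt2" encoding assumptions as above:

```python
# Words that sound alike get unrelated token IDs, and words that look
# alike need not sound alike. Note the leading space: in GPT-2-style
# BPE, " through" with the space is typically a single token.
import tiktoken

enc = tiktoken.get_encoding("gpt2")

for w in [" through", " threw", " though", " tough"]:
    print(f"{w!r:12} -> {enc.encode(w)}")

# Nothing in these IDs encodes that "through" and "threw" rhyme,
# or that "though" and "tough" don't.
```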
Replying to @tomgoldsteincs
Any insight into why it can’t get haiku right? I noticed this, too… seems like a simple task to learn syllable count?
The tokenizer represents most words in a way that removes any information about how many syllables they have. A typical language model doesn't know what words sound like, although this model seems to sometimes figure it out.
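
For contrast, here's how syllables actually get counted: a pronunciation-dictionary lookup, which is exactly the information the tokens discard. A sketch using the CMU Pronouncing Dictionary via nltk (pip install nltk, then nltk.download('cmudict')); it has no out-of-vocabulary handling, so it's illustrative only.

```python
# Count syllables by looking words up in the CMU Pronouncing
# Dictionary. In ARPAbet, vowel phonemes end in a stress digit
# (0, 1, or 2), so counting those gives the syllable count.
from nltk.corpus import cmudict

d = cmudict.dict()

def syllables(word: str) -> int:
    # Use the first listed pronunciation; no fallback for
    # out-of-vocabulary words in this sketch.
    return sum(ph[-1].isdigit() for ph in d[word.lower()][0])

line = "an old silent pond"
print(sum(syllables(w) for w in line.split()))  # -> 5
```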
Replying to @tomgoldsteincs
In general, it struggles with meter or anything to do with counting syllables, giving wonderfully nonsensical and contradictory answers on cross-examination.
Replying to @tomgoldsteincs
Try asking it for a joke and then to explain the joke. You can learn something about how it relates words to each other that have no obvious relationship for humans. It's really weird.
Replying to @tomgoldsteincs
To be fair, the true haiku luminaries aren’t quite so strict about the syllable count
Replying to @tomgoldsteincs
Imo this is a perfectly fine haiku. The exact syllable count is really not all that important to the form