Prompting is like teaching/instructing kids. When I was asked how to write a chain-of-thought prompt, I showed them my kid's homework.
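To make the idea concrete, here is a minimal sketch of a few-shot chain-of-thought prompt in Python. The exemplar problem, its step-by-step solution, and the prompt wording are illustrative assumptions written in the style of the homework-like arithmetic discussed in this thread, not the actual homework.

```python
# A minimal few-shot chain-of-thought prompt, built as a plain string.
# The worked exemplar below is hypothetical: it shows the model the
# reasoning steps, not just the final answer.

EXEMPLAR = """\
Q: I picked a number, subtracted 2, then divided by 3, and got 5.
   What was my number?
A: Work backwards. Before dividing by 3, the value was 5 * 3 = 15.
   Before subtracting 2, the value was 15 + 2 = 17.
   The answer is 17.
"""

def build_cot_prompt(question: str) -> str:
    """Prepend the worked exemplar so the model imitates its reasoning steps."""
    return f"{EXEMPLAR}\nQ: {question}\nA:"

if __name__ == "__main__":
    # A new question that requires generalizing to different operations.
    print(build_cot_prompt(
        "I picked a number, added 2, then divided by 4, and got 5. "
        "What was my number?"
    ))
```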

Oct 31, 2022 · 4:15 PM UTC

Replying to @denny_zhou
Oh wow, I'm surprised that the homework includes a clear chain-of-thought example! Also, I think this is a challenging example for LMs: it's nontrivial to work out the generalization from the <-2, /3> operations to <+2, /4>, even for me 20 years ago.
Leave out the first one, and the problem reduces to the one in the example.
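To spell out the generalization being discussed: inverting <-2, /3> means multiplying by 3 and then adding 2, while inverting <+2, /4> means multiplying by 4 and then subtracting 2. A small check, with made-up numbers:

```python
# Illustrative check of the two inverse operations (numbers are made up).
# <-2, /3>: forward f(x) = (x - 2) / 3, so the inverse is g(y) = 3*y + 2.
# <+2, /4>: forward f(x) = (x + 2) / 4, so the inverse is g(y) = 4*y - 2.

def invert_sub2_div3(y): return 3 * y + 2
def invert_add2_div4(y): return 4 * y - 2

assert (invert_sub2_div3(5) - 2) / 3 == 5   # 17 round-trips back to 5
assert (invert_add2_div4(5) + 2) / 4 == 5   # 18 round-trips back to 5
```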
Replying to @denny_zhou
The good news is that you can give a model a set of instructions and get a correct response. The bad news is that you have to use the Socratic method.