Here is an experiment: using ChatGPT to emulate a Jupyter notebook. You can even get it to run GPT inside ChatGPT.
And you can also train neural networks from scratch inside ChatGPT.🤯
Here's a walkthrough of how it works.
Jan 30, 2023 · 3:51 AM UTC
We start with a clever prompt that asks ChatGPT to be a Jupyter notebook.
It correctly prints out "hello", and can do basic arithmetic. So far, so good!
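The first cells are as simple as it gets. A sketch of the kind of cells the simulated notebook ran (my reconstruction, not the exact screenshots):

```python
# Hypothetical first cells for the emulated notebook (my reconstruction).
print("hello")  # the emulated kernel should echo: hello
2 + 2           # basic arithmetic; a real notebook would display 4
```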
Let's see if it can run some numpy.
Ah, much better! So it knows about numpy arrays.
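For reference, here's roughly the kind of numpy cell involved (the exact code in the screenshots may differ; array contents are my assumption):

```python
import numpy as np

# Create an array and do some elementwise operations,
# the kind of thing the emulated notebook handled correctly.
a = np.array([1, 2, 3])
b = a * 2 + 1    # elementwise: array([3, 5, 7])
m = a.mean()     # 2.0
print(b, m)
```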
Can we do something more interesting? What about pre-trained models in the HuggingFace transformers library?
Let's import a sentiment analysis pipeline and try to run it.
Great, it also knows about HuggingFace transformers! It predicts this sentence to be "Label 1". I guess that means positive sentiment?
This sentence has a negative sentiment, and it gets "Label 0". Looks pretty good to me! It's not exactly in the format that the transformers library would output, but it's very close.
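In a real notebook, the sentiment cells would look something like this (the example sentences are my stand-ins; note that the default checkpoint reports "POSITIVE"/"NEGATIVE", so the "Label 1"/"Label 0" output above suggests the simulation imagined a model with unnamed labels):

```python
from transformers import pipeline

# Default sentiment-analysis pipeline
# (distilbert-base-uncased-finetuned-sst-2-english).
classifier = pipeline("sentiment-analysis")

print(classifier("I love this movie!"))               # label: POSITIVE
print(classifier("This was a terrible waste of time."))  # label: NEGATIVE
```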
What about something a bit more fun, like translation?
We get an excellent translation, merci beaucoup!
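The real-library equivalent of the translation cell, as a sketch (the input sentence is my assumption; the `translation_en_to_fr` task defaults to a T5 checkpoint):

```python
from transformers import pipeline

# English-to-French translation pipeline.
translator = pipeline("translation_en_to_fr")
result = translator("Thank you very much!")
print(result[0]["translation_text"])  # something along the lines of "Merci beaucoup !"
```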
Ah, but if ChatGPT can run sentiment analysis and translation models... can it also run GPT?
Oh, yes it can!
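For comparison, running GPT via the actual transformers library looks like this (the prompt and the choice of GPT-2 are my assumptions, not the thread's exact cell):

```python
from transformers import pipeline

# Text-generation pipeline with GPT-2, the library's classic GPT checkpoint.
generator = pipeline("text-generation", model="gpt2")
out = generator("Once upon a time,", max_new_tokens=20, num_return_sequences=1)
print(out[0]["generated_text"])  # the prompt plus a sampled continuation
```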
Not quite the ending I had in mind, but it works just as well.
But if ChatGPT can emulate HF transformers running GPT, can it emulate HF transformers running GPT running HF transformers?
Yes, it can!
It tries to be helpful by adding "the output would be", but I didn't prompt it to be concise, so no complaints on my end.
Okay, but running pre-trained models is too easy. What about training new models? Can my Jupyter notebook do that?
Let's try it out! I'll use Keras to create a simple neural net with one hidden layer.
I'm curious if it can learn a simple function like x**2, so I create the data.
Let's train it! The training loss is already pretty small here, maybe it already went down to 3e-4 in the first epoch? :)
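A sketch of the setup described above: one hidden layer, trained to fit x**2. The layer width, optimizer, epoch count, and data range are my assumptions, not the thread's exact cells:

```python
import numpy as np
from tensorflow import keras

# Small net with one hidden layer, fitting y = x**2.
model = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Training data: x in [-1, 1], target x**2.
x = np.linspace(-1, 1, 200).reshape(-1, 1)
y = x ** 2
history = model.fit(x, y, epochs=50, verbose=0)
print(history.history["loss"][-1])  # final MSE; should be small after 50 epochs
```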
Okay, let's try something slightly more complex like a sine function.
The loss seems a bit high on this one, but that's okay.
How does it do for some other values?
It gets sin(pi) = 0, and sin(1.0) is correct as well. We can double-check in the simulated notebook (and in a real one too, just to be sure 🙃)
sin(2.0) is also correct
Although to be fair, the model makes errors on some more esoteric inputs. Maybe we need to train it for longer 🤔
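The sine experiment, sketched with the same spot-checks as above (architecture, training length, and data range are my assumptions; a short run won't match sin exactly):

```python
import numpy as np
from tensorflow import keras

# Same idea as before, but the target is sin(x).
model = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.linspace(-np.pi, np.pi, 500).reshape(-1, 1)
y = np.sin(x)
model.fit(x, y, epochs=100, verbose=0)

# Spot-check the values discussed above.
for v in [np.pi, 1.0, 2.0]:
    pred = model.predict(np.array([[v]]), verbose=0)[0, 0]
    print(f"sin({v:.4f}) ~ {pred:.3f} (true: {np.sin(v):.3f})")
```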
Okay, let's try a more complicated function, and let's strip out the comments to make it harder.
This is trying to learn 5 / (1 + x)**2 - 1
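With the comments added back in, the harder target looks something like this (depth, width, and the training domain, chosen to stay clear of the pole at x = -1, are my assumptions):

```python
import numpy as np
from tensorflow import keras

# A slightly deeper net for the harder target f(x) = 5 / (1 + x)**2 - 1.
model = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Training data on [0, 3], away from the singularity at x = -1.
x = np.linspace(0.0, 3.0, 500).reshape(-1, 1)
y = 5 / (1 + x) ** 2 - 1
history = model.fit(x, y, epochs=100, verbose=0)
print(history.history["loss"][-1])
```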
So far, we've run several ML models in ChatGPT, run GPT inside ChatGPT, and even trained some small neural nets in Keras.
But this demo would not be complete if we did not try one last thing: prompting GPT to create a Jupyter notebook in a Jupyter notebook in ChatGPT.