GPT-3 vs Human Brain

00:00:00.000 | 
The human brain has at least 100 trillion synapses. 00:00:05.880 | 
A synapse is a channel connecting two neurons 00:00:08.500 | 
through which an electrical or chemical signal is transferred 00:00:12.000 | 
and is the loose inspiration for the synapses, or weights, of an artificial neural network. 00:00:18.640 | 
GPT-3, the recently released language model from OpenAI 00:00:23.280 | 
that has been captivating people's imagination, has 175 billion parameters. 00:00:45.440 | 
Based on Lambda's test on V100 cloud instances, 00:00:48.640 | 
the cost of training this neural network is $4.6 million. 00:00:55.420 | 
If the model with 175 billion parameters does very well, 00:00:59.420 | 
how well will a model do that has the same number of parameters as the human brain? 00:01:05.300 | 
Let's set aside the fact that both our estimate 00:01:07.900 | 
of the number of synapses and the intricate structure 00:01:10.580 | 
of the brain might require a much, much larger network to approximate it. 00:01:15.500 | 
But it's very possible that even just this 100 trillion-parameter model will achieve something remarkable. 00:01:23.700 | 
And one way of asking how far away we are 00:01:29.340 | 
is: how much would it cost to train a model with 100 trillion parameters? 00:01:39.440 | 
Let's call it GPT-4HB with 100 trillion parameters. 00:01:45.660 | 
Assuming linear scaling of compute requirements with the number of parameters, 00:01:51.580 | 
the cost in 2020 for training this neural network would be $2.6 billion. 00:02:02.740 | 
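As a sanity check, the linear-scaling arithmetic can be sketched in a few lines of Python. The $4.6 million GPT-3 training cost and the two parameter counts are the figures quoted above; the linear-scaling assumption is the video's, not a claim about how training cost actually scales:

```python
# Figures quoted in the video
gpt3_params = 175e9    # GPT-3 parameter count
brain_params = 100e12  # rough synapse count of the human brain
gpt3_cost_usd = 4.6e6  # Lambda's estimated GPT-3 training cost (2020)

# Assume training cost scales linearly with parameter count
scale = brain_params / gpt3_params       # ~571x more parameters
gpt4hb_cost_usd = gpt3_cost_usd * scale

print(f"~${gpt4hb_cost_usd / 1e9:.1f} billion")  # ~$2.6 billion
```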
The OpenAI paper "Measuring the Algorithmic Efficiency of Neural Networks" indicates that for the past seven years, the compute needed to train a neural network to the same performance has been halving every 16 months. If this trend continues, the cost in 2024 00:02:20.740 | 
would be $325 million, decreasing to $40 million in 2028, 00:02:25.740 | 
and in 2032, coming down to approximately the same price as training GPT-3 today. 00:02:34.140 | 
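The projected costs can be reproduced with a short sketch. I'm assuming a 2020 baseline of $2.6 billion (the rounded figure from $4.6M × 100T/175B under linear scaling) and the 16-month cost-halving trend the video cites; note that the exact 2028 result rounds to $41M rather than the $40M quoted, purely a rounding difference:

```python
base_cost = 2.6e9    # 2020 cost estimate for a 100T-parameter model, USD
halving_months = 16  # compute for equal performance halves every 16 months

for year in (2024, 2028, 2032):
    halvings = (year - 2020) * 12 / halving_months
    cost = base_cost / 2 ** halvings
    # 2024: $325M, 2028: ~$41M, 2032: ~$5M (close to GPT-3's cost today)
    print(f"{year}: ${cost / 1e6:,.0f}M")
```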
Now, it's important to note, as the paper indicates, 00:02:36.300 | 
that as the size of the network and the compute increases, 00:02:39.460 | 
the improvement in the network's performance shows diminishing returns. Still, 00:02:49.700 | 
it's fascinating to think what a language model 00:02:53.100 | 
with 100 trillion parameters might be able to accomplish. 00:03:00.140 | 
This video focuses on a single, simple idea about the basics of GPT-3, 00:03:04.540 | 
including technical, even philosophical implications, 00:03:08.500 | 
along with highlighting how others are using it. 00:03:12.060 | 
So if you enjoy this kind of thing, subscribe, 00:03:14.620 | 
and remember, try to learn something new every day.