ChatGPT: 30 Year History | How AI Learned to Talk

1,015,373 views
Published 2023-11-27
This video explores the journey of AI language models, from their modest beginnings through the development of OpenAI's GPT models. It takes us through the key moments in generative neural network research on next-word prediction. We delve into the early experiments with tiny language models in the 1980s, highlighting significant contributions by researchers like Jordan, who introduced recurrent neural networks, and Elman, whose work on learning word boundaries revolutionized our understanding of language processing. It leaves us with a question: what is thought? Is simulated thought, thought? Featuring Noam Chomsky, Douglas Hofstadter, Michael I. Jordan, Jeffrey Elman, Geoffrey Hinton, Ilya Sutskever, Andrej Karpathy, Yann LeCun, Sam Altman, and more.

My script, references & visualizations here: docs.google.com/document/d/1s7FNPoKPW9y3EhvzNgexJa…

consider joining my channel as a YouTube member: youtube.com/channel/UCotwjyJnb-4KW7bmsOoLfkg/join


This is the last video in the series "The Pattern Machine"; you can watch it all here: • Artificial Intelligence

00:00 - Introduction
00:32 - Hofstadter's thoughts on ChatGPT
01:00 - recap of supervised learning
01:55 - first paper on sequential learning
02:55 - first use of state units (RNN)
04:33 - first observation of word boundary detection
05:30 - first observation of word clustering
07:16 - first "large" language model Hinton/Sutskever
10:10 - sentiment neuron (Ilya | OpenAI)
12:30 - transformer explanation
15:50 - GPT-1
17:00 - GPT-2
17:55 - GPT-3
18:20 - In-context learning
19:40 - ChatGPT
21:10 - tool use
23:25 - philosophical question: what is thought?

All Comments (21)
  • @ArtOfTheProblem
    STAY TUNED: Next video will be on "History of RL | How AI Learned to Feel" SUBSCRIBE: youtube.com/@ArtOfTheProblem?sub_confirmation=1 WATCH AI series: youtube.com/playlist?list=PLbg3ZX2pWlgKV8K6bFJr5dh… Timestamps: 00:32 Hofstadter's thoughts on ChatGPT 01:00 recap of supervised learning 01:55 first paper on sequential learning 02:55 first use of state units (RNN) 04:33 first observation of word boundary detection 05:30 first observation of word clustering 10:10 sentiment neuron (Ilya & Hinton) 12:30 transformer explanation 15:50 GPT-1 17:00 GPT-2 17:55 GPT-3 18:20 In-context learning 19:40 ChatGPT 21:10 tool use 23:25 philosophical question: what is thought?
  • @clamhammer2463
    As an AI researcher and developer, this is the first video where I did not leave thinking that the author was just saying words without a second thought. There is so much misinformation in this space, stemming from fear of the ramifications of AI and amplified by the negative feedback loop among those same people. Well done.
  • @belibem
    There are only a handful of YouTube channels that can make such concepts accessible to everyone with a curious mind (e.g. 3blue1brown, Veritasium, Sabine). Art of the Problem is one of them. Brit, you are a legend. Thank you for giving us this series.
  • @Just4Growers
    I have watched countless AI videos but nobody explained it like you. Many thanks.
  • @PotatoCider
    Please never stop teaching on the internet. I can tell you have an intense passion for learning. I'm in love with how you explain concepts and their connections. Feynman would be proud.
  • @jay_sensz
    Great video, but please fix your audio mix. The background music is way too loud. There likely is an option for audio ducking in your video editing program that will automatically lower the background music volume when foreground audio is playing.
  • Perfect. 12 years ago or so I learned about RSA from you. I'm glad to see you are still producing quality videos! Thumbs up!
  • @situranjankar
    One thing I like about your videos is that you are very good at explaining things, and your presentation is intuitive for any age group. Although there are thousands of videos and articles available on the same topics, this is what makes your videos unique and the best. Thanks, Brit. Please keep making such videos in the future.
  • @Nick-Quick
    TIMESTAMPS: 00:05 Neural networks learned to talk, leading to more general-purpose systems. 02:30 Recurrent neural networks (RNNs) use state units to create a state of mind that depends on the past and can affect the future. 04:52 Neural networks can learn word boundaries and cluster words based on meaning. 07:10 Language models saw limited progress until 2011 when a larger network showed the potential for higher performance. 09:34 Neural networks can learn language and complex concepts with minimal human intervention. 11:43 Neural networks struggled to handle long-range dependencies in text sequences. 14:02 Neural networks use distance in concept space to find similarities and adjust their meaning. 16:20 Neural networks with self-attention and fully connected layers can generate coherent and contextually relevant text. 18:27 In-context learning allows changing the behavior of the network without changing the network weights. 20:38 Language models like ChatGPT are more than just chatbots, they serve as the kernel process of an emerging operating system. 22:41 Training networks on prediction empowered by self-attention leads to a more general system that can be retasked on any narrow problem. 24:43 Deep learning community is divided due to differing opinions on the nature of AI's linguistic abilities and thought process.
  • @CutStudio4
    I'm the one solving the rubiks cube!!!!
  • Wow, this is hands-down the best video I've ever seen on this subject. Thank you! It is in my favorites now!
  • @jamesreilly7684
    Quoting one of the heroes of tech (Hofstadter) makes this very well executed video that much more compelling. T9 (predictive texting) is the first really useful LLM-ish tech (not a word wheel but a predictive wheel from numbers) that end users experienced. The notion that this simple premise is how it all works is nothing short of astonishing.
  • @SecretEyeSpot
    This channel changed my life about 10 years ago when I found "The Language of Coins." By serendipity, I've found this channel again while I'm formally studying computer science, which this channel inspired me to do. Thank you.
  • @seschaitanya5676
    This is the first video I watched on this channel, and the quality of the content, communication, analysis, depth, and even the audio selection is so on point. Hooked from start to end. Immediately liked and subscribed.
  • @AlexGeo925
    This video deserves so many more views and likes than it has… It’s pure gold. Thanks! 🙏🏼
  • @maivincent2659
    Been a big fan of your videos for years! I'm always delighted, even as I rewatch some of your old videos time and again, not only by how you distill a complex concept down to its essence but also by how you transform that which is abstract into something extremely palpable and relatable through filmmaking techniques.
  • @theK594
    This video is just fantastic, extremely up to date, and very useful. It resonates well with the discussions I am now having in the community. I would love to see it extended once more history gets written.
  • @les_crow
    This is the most perfect video on the entire internet. I will use it to enlighten people on the topic. If I were rich, I'd pay you big for this. Thank you, sir.
  • @JonDotExe
    This is easily the most cogent video on the topic I have ever seen. (I even sent this to my mom!) Hard to believe you've had these amazing topics for 12 years and aren't cracking 100k yet. Count me in for the journey!