Get 20% off my favorite book summary service at https://shortform.com/artem
=====
My name is Artem, and I'm a neuroscience PhD student at Harvard University.
🌎 Website and Social links: https://kirsanov.ai/
📥 "Receptive Field" neuro-newsletter: https://artemkirsanov.substack.com/
✨ Support me on Patreon to get access to the Discord community: https://patreon.com/artemkirsanov
=====
Most neural networks have no concept of time: they analyze each input in complete isolation, with no memory of what came before. In this video, we explore how Recurrent Neural Networks (RNNs) solve this problem by adding a single new term to the network equation: the echo. We build up the intuition from scratch, starting with feedforward networks, then showing why the naive approach to memory fails, and arriving at gated architectures like LSTMs and GRUs through a natural chain of reasoning — discovering along the way that the simplest working memory mechanism turns out to be the same one biology already uses.
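The three memory mechanisms covered in the video can be sketched in a few lines of code. This is a toy illustration with made-up scalar weights (w_x, w_h, alpha, w_g are all hypothetical values chosen for demonstration), not the actual equations or parameters used in the video:

```python
import math

def recurrent_step(x_t, h_prev, w_x=0.5, w_h=0.9):
    # Plain recurrence: the "echo" term w_h * h_prev mixes the
    # previous state back into the current input.
    return math.tanh(w_x * x_t + w_h * h_prev)

def leaky_step(x_t, h_prev, alpha=0.1, w_x=0.5):
    # Leaky integration: keep a fixed fraction (1 - alpha) of the old
    # state and blend in a fraction alpha of the new evidence.
    return (1 - alpha) * h_prev + alpha * math.tanh(w_x * x_t)

def gated_step(x_t, h_prev, w_x=0.5, w_g=1.0):
    # Gated memory (GRU-style update gate): the gate z is computed from
    # the input, so the network decides per step how much past to keep.
    z = 1 / (1 + math.exp(-w_g * x_t))  # gate value in (0, 1)
    return (1 - z) * h_prev + z * math.tanh(w_x * x_t)

# Feed the same short sequence (one pulse, then silence) through each
# mechanism and see how much of the pulse each one remembers.
seq = [1.0, 0.0, 0.0, 0.0]
for step in (recurrent_step, leaky_step, gated_step):
    h = 0.0
    for x in seq:
        h = step(x, h)
    print(step.__name__, round(h, 4))
```

Running this shows the key contrast: the leaky and gated versions hold onto the initial pulse differently than plain recurrence, because they explicitly control how much of the old state survives each step.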
🕒 OUTLINE:
00:00 Introduction
02:10 ANN Background
05:53 Adding Recurrence
11:04 Sponsor: Shortform
12:05 Leaky Integration
14:40 Gated Memory
17:18 Putting it together
=====
Icons by Freepik and Biorender
Music by Artlist
This video was sponsored by Shortform
=====
*Disclaimer:* This channel is my personal project. The views and content expressed here are my own and are separate from my research role at Harvard University.
#artificialintelligence #machinelearning #deeplearning