Mu Li, principal scientist for AI at Amazon, needs little introduction. Half a year ago, *Dive into Deep Learning* (《動手學深度學習》), built by Mu Li, Aston Zhang, and others, officially went online, free for everyone to read. It is a runnable, discussable deep learning textbook written for Chinese readers.
紅色石頭 has shared this resource here before; the links are attached again:
Online preview:
GitHub project:
https://github.com/d2l-ai/d2l-zh
Course videos:
https://space.bilibili.com/209599371/channel/detail?cid=23541
As we know, Mu Li is one of the authors of MXNet, so *Dive into Deep Learning* is also written with the MXNet framework. But many newcomers to machine learning use PyTorch instead. It would be even better if there were a PyTorch implementation of the book's code!
Good news: today I'm bringing you exactly that, the PyTorch source code for this book. Recently, a data science group from the Indian Institute of Technology "translated" *Dive into Deep Learning* from MXNet to PyTorch. After three months of work, the project is essentially complete and has made it onto GitHub's trending list.
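To give a sense of what "translating" from MXNet to PyTorch involves, here is a minimal, illustrative sketch (not code taken from the repository) of how an MXNet-style automatic-differentiation snippet maps onto the PyTorch API:

```python
import torch

# In the MXNet original, tensors come from mxnet.nd and gradients are tracked
# with x.attach_grad() inside an autograd.record() scope; in PyTorch the
# equivalents are torch.tensor / torch.arange with requires_grad=True.
x = torch.arange(4.0, requires_grad=True)   # MXNet: x = nd.arange(4); x.attach_grad()
y = 2 * torch.dot(x, x)                     # MXNet: with autograd.record(): y = 2 * nd.dot(x, x)
y.backward()                                # MXNet: y.backward()
print(x.grad)                               # gradient of y = 2 * x.x is 4x -> tensor([ 0.,  4.,  8., 12.])
```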
First, the GitHub address of this resource:
https://github.com/dsgiitr/d2l-pytorch
The detailed table of contents is as follows:
- Ch02 Installation
- Installation
- Ch03 Introduction
- Introduction
- Ch04 The Preliminaries: A Crashcourse
- 4.1 Data Manipulation
- 4.2 Linear Algebra
- 4.3 Automatic Differentiation
- 4.4 Probability and Statistics
- 4.5 Naive Bayes Classification
- 4.6 Documentation
- Ch05 Linear Neural Networks
- 5.1 Linear Regression
- 5.2 Linear Regression Implementation from Scratch
- 5.3 Concise Implementation of Linear Regression
- 5.4 Softmax Regression
- 5.5 Image Classification Data (Fashion-MNIST)
- 5.6 Implementation of Softmax Regression from Scratch
- 5.7 Concise Implementation of Softmax Regression
- Ch06 Multilayer Perceptrons
- 6.1 Multilayer Perceptron
- 6.2 Implementation of Multilayer Perceptron from Scratch
- 6.3 Concise Implementation of Multilayer Perceptron
- 6.4 Model Selection Underfitting and Overfitting
- 6.5 Weight Decay
- 6.6 Dropout
- 6.7 Forward Propagation Backward Propagation and Computational Graphs
- 6.8 Numerical Stability and Initialization
- 6.9 Considering the Environment
- 6.10 Predicting House Prices on Kaggle
- Ch07 Deep Learning Computation
- 7.1 Layers and Blocks
- 7.2 Parameter Management
- 7.3 Deferred Initialization
- 7.4 Custom Layers
- 7.5 File I/O
- 7.6 GPUs
- Ch08 Convolutional Neural Networks
- 8.1 From Dense Layers to Convolutions
- 8.2 Convolutions for Images
- 8.3 Padding and Stride
- 8.4 Multiple Input and Output Channels
- 8.5 Pooling
- 8.6 Convolutional Neural Networks (LeNet)
- Ch09 Modern Convolutional Networks
- 9.1 Deep Convolutional Neural Networks (AlexNet)
- 9.2 Networks Using Blocks (VGG)
- 9.3 Network in Network (NiN)
- 9.4 Networks with Parallel Concatenations (GoogLeNet)
- 9.5 Batch Normalization
- 9.6 Residual Networks (ResNet)
- 9.7 Densely Connected Networks (DenseNet)
- Ch10 Recurrent Neural Networks
- 10.1 Sequence Models
- 10.2 Language Models
- 10.3 Recurrent Neural Networks
- 10.4 Text Preprocessing
- 10.5 Implementation of Recurrent Neural Networks from Scratch
- 10.6 Concise Implementation of Recurrent Neural Networks
- 10.7 Backpropagation Through Time
- 10.8 Gated Recurrent Units (GRU)
- 10.9 Long Short Term Memory (LSTM)
- 10.10 Deep Recurrent Neural Networks
- 10.11 Bidirectional Recurrent Neural Networks
- 10.12 Machine Translation and DataSets
- 10.13 Encoder-Decoder Architecture
- 10.14 Sequence to Sequence
- 10.15 Beam Search
- Ch11 Attention Mechanism
- 11.1 Attention Mechanism
- 11.2 Sequence to Sequence with Attention Mechanism
- 11.3 Transformer
- Ch12 Optimization Algorithms
- 12.1 Optimization and Deep Learning
- 12.2 Convexity
- 12.3 Gradient Descent
- 12.4 Stochastic Gradient Descent
- 12.5 Mini-batch Stochastic Gradient Descent
- 12.6 Momentum
- 12.7 Adagrad
- 12.8 RMSProp
- 12.9 Adadelta
- 12.10 Adam
Each subsection is a runnable Jupyter notebook; you can freely modify the code and hyperparameters to get immediate feedback and build hands-on deep learning experience.
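The snippet below is a minimal sketch of the kind of experiment those notebooks invite: a small PyTorch training loop in which you can tweak hyperparameters such as the learning rate, batch size, and number of epochs and observe the effect. The data, model, and values here are purely illustrative, not taken from the repository.

```python
import torch
from torch import nn
from torch.utils.data import TensorDataset, DataLoader

# Synthetic linear-regression data: y = X @ w + b + noise (illustrative only).
true_w, true_b = torch.tensor([2.0, -3.4]), 4.2
features = torch.randn(1000, 2)
labels = (features @ true_w + true_b + 0.01 * torch.randn(1000)).unsqueeze(1)

# Hyperparameters you could experiment with in a notebook.
lr, batch_size, num_epochs = 0.03, 10, 3

net = nn.Linear(2, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(net.parameters(), lr=lr)
loader = DataLoader(TensorDataset(features, labels), batch_size=batch_size, shuffle=True)

for epoch in range(num_epochs):
    for X, y in loader:
        optimizer.zero_grad()
        loss_fn(net(X), y).backward()
        optimizer.step()
    print(f"epoch {epoch + 1}, loss {loss_fn(net(features), labels).item():.4f}")
```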
At the moment, six subsections of the PyTorch code remain unfinished, but the overall level of completion is already quite high. The development team hopes more enthusiasts will join in and contribute.
Finally, the GitHub address once more:
https://github.com/dsgiitr/d2l-pytorch