【Reading Notes 1】【2017】MATLAB Deep Learning: Deep Learning (2)

Posted by 梅花香——苦寒來 on 2018-11-09

The backpropagation algorithm experiences the following three primary difficulties in the training process of the deep neural network:

Vanishing gradient

Overfitting

Computational load

Vanishing Gradient

The gradient in this context can be thought of as a concept similar to the delta of the back-propagation algorithm.

The vanishing gradient occurs in training with the back-propagation algorithm when the output error fails to reach the nodes farther from the output.

The back-propagation algorithm trains the neural network by propagating the output error backward to the hidden layers.

However, as the error hardly reaches the first hidden layer, the weights there cannot be adjusted.

Therefore, the hidden layers that are close to the input layer are not properly trained.

There is no point in adding hidden layers if they cannot be trained (see Figure 5-2).

Figure 5-2. The vanishing gradient
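
The decay is easy to see numerically. Below is a minimal MATLAB sketch (not the book's code; the layer count and the random pre-activations are made up, and the weight factors in the delta propagation are ignored for simplicity) showing how the delta shrinks as each layer multiplies it by the sigmoid derivative, which never exceeds 0.25:

```matlab
% Minimal sketch: with the sigmoid activation, the back-propagated delta
% is scaled by sigma'(x) <= 0.25 at every layer, so it decays toward zero.
sigmoidGrad = @(x) exp(-x) ./ (1 + exp(-x)).^2;  % derivative of the sigmoid

rng(1);                 % hypothetical seed, for reproducibility only
numLayers = 10;         % hypothetical network depth
delta     = 1;          % magnitude of the error at the output layer
for k = 1:numLayers
  x     = randn;                    % hypothetical pre-activation at layer k
  delta = delta * sigmoidGrad(x);   % each layer multiplies by sigma'(x)
  fprintf('after layer %2d: delta = %.2e\n', k, delta);
end
```

Since every factor is at most 0.25, the delta after ten layers is at most 0.25^10, on the order of 10^-6: the layers near the input see almost none of the output error, and their weights barely move.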

The representative solution to the vanishing gradient is the use of the Rectified Linear Unit (ReLU) function as the activation function.

The ReLU function is known to transmit the error better than the sigmoid function.

The ReLU function is defined as follows:

$$\varphi(x) = \begin{cases} x, & x > 0 \\ 0, & x \le 0 \end{cases} = \max(0, x)$$

Figure 5-3 depicts the ReLU function.

Figure 5-3. The ReLU function

It produces zero for negative inputs and passes positive inputs through unchanged.

It earned its name because its behavior is similar to that of the rectifier, an electrical element that converts alternating current into direct current by cutting off negative voltage.
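
As a quick illustration, here is a minimal MATLAB sketch (mine, not the book's code) implementing ReLU and comparing its derivative with the sigmoid's. The point is that the ReLU derivative is exactly 1 for positive inputs, so the back-propagated delta passes through those nodes undiminished, while the sigmoid derivative is at most 0.25:

```matlab
% Minimal sketch: ReLU, its derivative, and the sigmoid derivative.
relu     = @(x) max(0, x);                    % phi(x) = max(0, x)
reluGrad = @(x) double(x > 0);                % 1 for positive inputs, 0 otherwise
sigmGrad = @(x) exp(-x) ./ (1 + exp(-x)).^2;  % at most 0.25

x = -3:3;                                     % a few sample inputs
disp([x; relu(x); reluGrad(x); sigmGrad(x)]); % rows: x, ReLU, ReLU', sigmoid'
```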

Translated from MATLAB Deep Learning by Phil Kim.
