TensorFlow/TFLearn Tutorial: Titanic

Posted by banq on 2016-07-27
In this tutorial, you will learn to use TFLearn and TensorFlow to estimate a Titanic passenger's chance of survival from personal information such as sex and age. To tackle this classic machine learning task, we will build a deep neural network classifier.

Prerequisites: install TensorFlow and TFLearn by following their installation guides.

On April 15, 1912, the Titanic sank after colliding with an iceberg, killing 1502 of the 2224 passengers and crew. Although there was some element of luck in surviving the disaster, some groups of people, such as women, children, and upper-class passengers, had a better chance of surviving. In this tutorial we carry out an analysis to find out who these people were.

Dataset
TFLearn will automatically download the Titanic dataset, which contains the following fields:

VARIABLE DESCRIPTIONS:
survived        Survived
                (0 = No; 1 = Yes)
pclass          Passenger Class
                (1 = 1st; 2 = 2nd; 3 = 3rd)
name            Name
sex             Sex
age             Age
sibsp           Number of Siblings/Spouses Aboard
parch           Number of Parents/Children Aboard
ticket          Ticket Number
fare            Passenger Fare


Building the Classifier
The dataset is stored in a CSV file and can be loaded with TFLearn's load_csv() function. We use target_column to indicate that the labels (survived or not) are in the first column of the dataset, survived. The function returns a tuple of arrays: (data, labels).

import numpy as np
import tflearn

# Download the Titanic dataset
from tflearn.datasets import titanic
titanic.download_dataset('titanic_dataset.csv')

# Load CSV file, indicate that the first column represents labels
from tflearn.data_utils import load_csv
data, labels = load_csv('titanic_dataset.csv', target_column=0,
                        categorical_labels=True, n_classes=2)
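
As a quick sanity check (not part of the original tutorial), you can inspect what load_csv() returned; with categorical_labels=True the labels come back one-hot encoded:

# Inspect the loaded data (assumed helper snippet, not from the original tutorial)
print(len(data))      # number of passengers in the file (1309 in the log below)
print(data[0])        # one passenger as a list of raw field values
print(labels[0])      # one-hot label, e.g. [1. 0.] means "did not survive"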


Preprocessing
The data needs some preprocessing. The name field is not useful for prediction, so we drop the name and ticket fields. Also, a neural network works only with numbers, so the sex field is converted from male/female to 0 or 1.

# Preprocessing function
def preprocess(data, columns_to_ignore):
    # Sort by descending id and delete columns
    for id in sorted(columns_to_ignore, reverse=True):
        [r.pop(id) for r in data]
    for i in range(len(data)):
        # Converting 'sex' field to float (id is 1 after removing labels column)
        data[i][1] = 1. if data[i][1] == 'female' else 0.
    return np.array(data, dtype=np.float32)

# Ignore 'name' and 'ticket' columns (id 1 & 6 of data array)
to_ignore=[1, 6]

# Preprocess data
data = preprocess(data, to_ignore)
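
After preprocessing, each passenger is reduced to 6 numeric features (pclass, sex, age, sibsp, parch, fare). A small check, assuming the preprocess() call above has run:

# Verify the preprocessed feature matrix (assumed check, not from the original tutorial)
print(data.shape)    # expected: (1309, 6)
print(data[0])       # first passenger as a float32 feature vector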


Building a Deep Neural Network
We use TFLearn to build a 3-layer neural network. We have to specify the shape of the input data: each sample has 6 features, and we process samples in batches to save memory, so the input shape is [None, 6], where None stands for an unknown dimension, which lets us change the total number of samples processed in a batch.

# Build neural network
net = tflearn.input_data(shape=[None, 6])
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 2, activation='softmax')
net = tflearn.regression(net)
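
The regression layer above relies on TFLearn's defaults, namely the Adam optimizer and categorical cross-entropy (consistent with the "Adam" entries in the training log below). If you prefer to spell those choices out or tune the learning rate, the same network can be written with explicit arguments; the values shown are simply TFLearn's defaults, not settings taken from this tutorial:

# Same network with optimizer, learning rate and loss made explicit
net = tflearn.input_data(shape=[None, 6])
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 2, activation='softmax')
net = tflearn.regression(net, optimizer='adam', learning_rate=0.001,
                         loss='categorical_crossentropy')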


Training
TFLearn provides the DNN model wrapper, which automatically handles neural network classifier tasks such as training, prediction, and save/restore. We train for 10 epochs, meaning the network sees all the data 10 times, with a batch size of 16:

# Define model
model = tflearn.DNN(net)
# Start training (apply gradient descent algorithm)
model.fit(data, labels, n_epoch=10, batch_size=16, show_metric=True)


Output:

---------------------------------
Run id: MG9PV8
Log directory: /tmp/tflearn_logs/
---------------------------------
Training samples: 1309
Validation samples: 0
--
Training Step: 82  | total loss: 0.64003
| Adam | epoch: 001 | loss: 0.64003 - acc: 0.6620 -- iter: 1309/1309
--
Training Step: 164  | total loss: 0.61915
| Adam | epoch: 002 | loss: 0.61915 - acc: 0.6614 -- iter: 1309/1309
--
Training Step: 246  | total loss: 0.56067
| Adam | epoch: 003 | loss: 0.56067 - acc: 0.7171 -- iter: 1309/1309
--
Training Step: 328  | total loss: 0.51807
| Adam | epoch: 004 | loss: 0.51807 - acc: 0.7799 -- iter: 1309/1309
--
Training Step: 410  | total loss: 0.47475
| Adam | epoch: 005 | loss: 0.47475 - acc: 0.7962 -- iter: 1309/1309
--
Training Step: 492  | total loss: 0.51677
| Adam | epoch: 006 | loss: 0.51677 - acc: 0.7701 -- iter: 1309/1309
--
Training Step: 574  | total loss: 0.48988
| Adam | epoch: 007 | loss: 0.48988 - acc: 0.7891 -- iter: 1309/1309
--
Training Step: 656  | total loss: 0.55073
| Adam | epoch: 008 | loss: 0.55073 - acc: 0.7427 -- iter: 1309/1309
--
Training Step: 738  | total loss: 0.50242
| Adam | epoch: 009 | loss: 0.50242 - acc: 0.7854 -- iter: 1309/1309
--
Training Step: 820  | total loss: 0.41557
| Adam | epoch: 010 | loss: 0.41557 - acc: 0.8110 -- iter: 1309/1309
--


The model finishes training with an overall accuracy of about 81%, which means it can predict whether a passenger survived with 81% accuracy.
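
The DNN wrapper also covers the validation and save/restore features mentioned earlier. A minimal sketch, where the validation split and the file name 'titanic_model.tflearn' are assumptions for illustration rather than part of the original tutorial:

# Hold out 10% of the samples for validation while training
model.fit(data, labels, n_epoch=10, batch_size=16,
          validation_set=0.1, show_metric=True)

# Save the trained weights, then restore them into a model built on the same graph
model.save('titanic_model.tflearn')   # hypothetical file name
model.load('titanic_model.tflearn')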

Now let's try the model on the data of Jack and Rose, the male and female leads of the Titanic movie:

# Let's create some data for DiCaprio and Winslet
dicaprio = [3, 'Jack Dawson', 'male', 19, 0, 0, 'N/A', 5.0000]
winslet = [1, 'Rose DeWitt Bukater', 'female', 17, 1, 2, 'N/A', 100.0000]
# Preprocess data
dicaprio, winslet = preprocess([dicaprio, winslet], to_ignore)
# Predict surviving chances (class 1 results)
pred = model.predict([dicaprio, winslet])
print("DiCaprio Surviving Rate:", pred[0][1])
print("Winslet Surviving Rate:", pred[1][1])


The output is:
DiCaprio Surviving Rate: 0.13849584758281708
Winslet Surviving Rate: 0.92201167345047

The model predicts a high survival probability of about 92% for Rose, while the opposite holds for Jack.
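
If you want a hard survived/not-survived decision rather than probabilities, you can take the argmax over the two output classes (a small addition, not part of the original tutorial):

# Convert class probabilities into 0/1 predictions (0 = did not survive, 1 = survived)
classes = np.argmax(pred, axis=1)
print("DiCaprio predicted class:", classes[0])
print("Winslet predicted class:", classes[1])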

More generally, this analysis shows that women and children travelling in first class had the highest chance of survival, while third-class male passengers had the lowest.

Source: tflearn/quickstart.md at master · tflearn/tflearn
