Getting Started with Kubernetes (Part 4): Deploying a TensorFlow Machine Learning Model as an External Service on Kubernetes

Published by bjehp on 2020-09-19

Machine learning models are commonly deployed with Docker, but how do you manage a fleet of such containerized models? The industry answer is Kubernetes, which manages and orchestrates containers. Kubernetes theory is not the focus of this article, and its advantages are easy to look up, so they are not repeated here. This Getting Started with Kubernetes series emphasizes hands-on practice: the first three parts covered installing Kubernetes, installing the Dashboard, and deploying a stateless application. This final part shows how to deploy a TensorFlow machine learning model on Kubernetes and expose it as an external service.

I hope this series serves as a useful reference for Kubernetes beginners. If you see things differently, or have suggestions on deploying machine learning models in production, you are welcome to discuss them in the comments.

1. Running TensorFlow Serving in Docker

  • Run the half_plus_two model [1]
# Download the TensorFlow Serving Docker image and repo
docker pull tensorflow/serving

mkdir /data0/modules
cd /data0/modules
git clone https://github.com/tensorflow/serving
# Location of demo models
TESTDATA="/data0/modules/serving/tensorflow_serving/servables/tensorflow/testdata"

# Start TensorFlow Serving container and open the REST API port
docker run -d --rm -p 8501:8501 \
    -v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" \
    -e MODEL_NAME=half_plus_two tensorflow/serving

# Query the model using the predict API
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
    -X POST http://localhost:8501/v1/models/half_plus_two:predict

# Returns => { "predictions": [2.5, 3.0, 4.5] }
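The demo model implements y = x/2 + 2, which is exactly why the three inputs above map to 2.5, 3.0, and 4.5. A minimal Python sketch of the same arithmetic, handy as a sanity check against the server's output:

```python
# The half_plus_two demo model computes y = x/2 + 2 for each input.
def half_plus_two(x):
    return x / 2 + 2

instances = [1.0, 2.0, 5.0]
predictions = [half_plus_two(x) for x in instances]
print(predictions)  # [2.5, 3.0, 4.5]
```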

2. Building a Docker Image for the TensorFlow Model

  • Run a serving base container in the background (without `--rm`, so it can be committed and then removed explicitly below)
docker run -d --name serving_base tensorflow/serving
  • Copy the model data into the container's models directory
docker cp /data0/modules/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu serving_base:/models/half_plus_two
  • Commit a new image that bundles the model
docker commit --change "ENV MODEL_NAME half_plus_two" serving_base ljh/half_plus_two
  • Stop and remove the serving base container
docker kill serving_base
docker rm serving_base
  • Start the service
docker run -d --rm -p 8501:8501 \
    -e MODEL_NAME=half_plus_two ljh/half_plus_two
  • Query the model
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
    -X POST http://localhost:8501/v1/models/half_plus_two:predict

# Returns => { "predictions": [2.5, 3.0, 4.5] }
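The same predict call can be issued from Python instead of curl. A sketch using only the standard library; the host and port assume the container started above is still running, so only the request construction is exercised here, with the actual send left commented out:

```python
import json
import urllib.request

def build_predict_request(host, model, instances):
    """Build the TensorFlow Serving REST predict URL and JSON body."""
    url = f"http://{host}/v1/models/{model}:predict"
    body = json.dumps({"instances": instances}).encode("utf-8")
    return url, body

url, body = build_predict_request("localhost:8501", "half_plus_two",
                                  [1.0, 2.0, 5.0])
print(url)  # http://localhost:8501/v1/models/half_plus_two:predict

# Sending the request requires the serving container to be reachable:
# req = urllib.request.Request(url, data=body,
#                              headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```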

3. Deploying the TensorFlow Model on Kubernetes

Create a Deployment for the model

  • YAML file
cat deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: halfplustwo-deployment
spec:
  selector:
    matchLabels:
      app: halfplustwo
  replicas: 1
  template:
    metadata:
      labels:
        app: halfplustwo
    spec:
      containers:
        - name: halfplustwo
          image: ljh/half_plus_two:latest
          imagePullPolicy: IfNotPresent
          ports:
            - containerPort: 8501
              name: restapi
            - containerPort: 8500
              name: grpc
  • Create the Deployment:
kubectl apply -f deployment.yaml
  • Show information about the Deployment:
kubectl get deployment -o wide
kubectl describe deployment halfplustwo-deployment
  • List the pods created by the Deployment:
kubectl get pods -l app=halfplustwo
  • Show information about a specific pod:
kubectl describe pod <pod-name>
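Rather than eyeballing `kubectl get pods` until the replicas come up, a small script can poll until every pod behind the label is Running. A hedged sketch, assuming `kubectl` is on PATH and configured for this cluster; the label `app=halfplustwo` matches the Deployment above:

```python
import subprocess
import time

def all_running(phases):
    """True when there is at least one pod and every phase is 'Running'."""
    return bool(phases) and all(p == "Running" for p in phases)

def wait_for_pods(label, timeout=120):
    """Poll kubectl until all pods matching the label are Running."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        out = subprocess.run(
            ["kubectl", "get", "pods", "-l", label,
             "-o", "jsonpath={.items[*].status.phase}"],
            capture_output=True, text=True,
        ).stdout
        if all_running(out.split()):
            return True
        time.sleep(2)
    return False

# Usage (requires a configured kubectl):
# wait_for_pods("app=halfplustwo")
```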

Expose your application with a Service

  • YAML file
cat service.yaml
apiVersion: v1
kind: Service
metadata:
  labels:
    run: halfplustwo-service
  name: halfplustwo-service
spec:
  ports:
    - port: 8501
      targetPort: 8501
      name: restapi
    - port: 8500
      targetPort: 8500
      name: grpc
  selector:
    app: halfplustwo
  type: LoadBalancer
  • Create the Service
kubectl create -f service.yaml
or
kubectl apply -f service.yaml
  • View the Service
kubectl get service
#output:
NAME                  TYPE           CLUSTER-IP      EXTERNAL-IP   PORT(S)                         AGE
halfplustwo-service   LoadBalancer   10.96.181.116   <pending>     8501:30771/TCP,8500:31542/TCP   4s
kubernetes            ClusterIP      10.96.0.1       <none>        443/TCP                         8d
nginx                 NodePort       10.96.153.10    <none>        80:30088/TCP                    29h
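In the PORT(S) column, an entry like `8501:30771/TCP` means service port 8501 is exposed as NodePort 30771 on every cluster node. A small helper to pull the NodePort out of that string, useful when scripting against `kubectl get service` output:

```python
def node_port(ports_field, service_port):
    """Extract the NodePort mapped to service_port from a
    'kubectl get service' PORT(S) field such as
    '8501:30771/TCP,8500:31542/TCP'. Returns None if no
    NodePort is allocated for that service port."""
    for entry in ports_field.split(","):
        mapping = entry.split("/")[0]          # e.g. "8501:30771"
        port, _, node = mapping.partition(":")
        if node and int(port) == service_port:
            return int(node)
    return None

print(node_port("8501:30771/TCP,8500:31542/TCP", 8501))  # 30771
```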

Test

Since the EXTERNAL-IP is still `<pending>` (no cloud load balancer is provisioned in this setup), reach the service through the NodePort that Kubernetes assigned to service port 8501 (30771 in the output above) on any cluster node, or through the cluster IP from inside the cluster:

curl -d '{"instances": [1.0, 2.0, 5.0]}' \
    -X POST http://localhost:30771/v1/models/half_plus_two:predict
{"predictions": [2.5, 3.0, 4.5]}
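To make this check reproducible in a script, the JSON response can be validated programmatically against the model's known behavior. A sketch that parses a response body like the one above and compares each prediction to x/2 + 2:

```python
import json

def check_half_plus_two(instances, response_body, tol=1e-6):
    """Verify each prediction in the serving response equals x/2 + 2."""
    predictions = json.loads(response_body)["predictions"]
    expected = [x / 2 + 2 for x in instances]
    return len(predictions) == len(expected) and all(
        abs(p - e) < tol for p, e in zip(predictions, expected)
    )

body = '{"predictions": [2.5, 3.0, 4.5]}'
print(check_half_plus_two([1.0, 2.0, 5.0], body))  # True
```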

Delete the Deployment and Service

kubectl delete -f deployment.yaml
kubectl delete -f service.yaml

4. References

[1] https://www.tensorflow.org/tfx/serving/docker — TensorFlow Serving with Docker
[2] https://www.tensorflow.org/tfx/serving/serving_kubernetes?hl=zh_cn — Use TensorFlow Serving with Kubernetes
[3] https://towardsdatascience.com/scaling-machine-learning-models-using-tensorflow-serving-kubernetes-ed00d448c917 — Scaling Machine Learning Models using Tensorflow Serving & Kubernetes
[4] http://www.tuwee.cn/2019/03/03/Kubernetes+Tenserflow-serving%E6%90%AD%E5%BB%BA%E5%8F%AF%E5%AF%B9%E5%A4%96%E6%9C%8D%E5%8A%A1%E7%9A%84%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0%E5%BA%94%E7%94%A8/ — Building a machine learning application exposed as an external service with Kubernetes + TensorFlow Serving
