A 48x performance boost! Optimizing bulk data writes to Redis in Python

Posted by 花菜 on 2020-09-14

Background

  • Recently, while testing with a large dataset, I needed to write a large volume of data into Redis

1. The original version: one hset call per field, which is very inefficient

Writing 300,000 records this way took 365 seconds. There are two problems with this approach:

  • When writing multiple fields under the same key, hmset should replace the repeated hset calls
  • A pipeline can also be used to avoid a round trip to the Redis server for every command, greatly reducing network I/O


import datetime
import json
import time

import redis


def get_conn():
    r = redis.Redis(host='localhost', port=6379, decode_responses=True)
    return r


def test_set_redis():
    conn = get_conn()
    machineId = 43696000000000
    device_no = 88800000
    work_in = time.time()
    source = "1"
    factory_no = "factory"
    today = datetime.date.today()
    oneday = datetime.timedelta(days=1)
    tomorrow = str(today + oneday).replace("-", "")
    afterTomorrow = str(today + oneday + oneday).replace("-", "")
    todayZero = int(time.mktime(today.timetuple()))
    today = str(today).replace("-", "")
    for i in range(300000):
        upAxisId = "uxi" + str(device_no)
        axisVarietyId = "axi" + str(device_no)
        varietyId = "vi" + str(device_no)
        axisNum = "axn" + str(device_no)
        # axisInfo and fbfcalue/fbfcalue2/fbfcalue3 are payloads built elsewhere
        # in the original program; defined as placeholders here so the snippet runs.
        axisInfo = {"upAxisId": upAxisId, "axisVarietyId": axisVarietyId,
                    "varietyId": varietyId, "axisNum": axisNum}
        fbfcalue = fbfcalue2 = fbfcalue3 = {}
        try:
            # every hset/expire below is its own network round trip -- the bottleneck
            conn.hset('mykey_prefix' + str(device_no), "machineId", str(machineId))
            conn.hset('mykey_prefix' + str(device_no), "machineNum", str(machineId))
            conn.hset('mykey_prefix' + str(device_no), "factoryId", factory_no)
            conn.hset('mykey_prefix' + str(device_no), "groupId", "group_id")
            conn.hset('mykey_prefix' + str(device_no), "groupName", "groupName11")
            conn.hset('mykey_prefix' + str(device_no), "workshopId", "workshopId11")
            conn.hset('mykey_prefix' + str(device_no), "workshopName", "workshopName11")
            conn.hset('mykey_prefix' + str(device_no), "source", source)
            conn.hset('mykey_prefix' + str(device_no), "errorTimeLimit", str(20))
            conn.expire('mykey_prefix' + str(device_no), 864000)  # 10-day TTL
            conn.hset('mykey_prefix' + str(device_no), "axisInfo", json.dumps(axisInfo))
            conn.hset('mykey_another_prefix:' + today, str(machineId), json.dumps(fbfcalue))
            conn.hset('mykey_another_prefix:' + tomorrow, str(machineId), json.dumps(fbfcalue2))
            conn.hset('mykey_another_prefix:' + afterTomorrow, str(machineId), json.dumps(fbfcalue3))
            conn.hset('mykey_another_prefix1:' + today, str(machineId), json.dumps(fbfcalue))
            conn.hset('mykey_another_prefix1:' + tomorrow, str(machineId), json.dumps(fbfcalue2))
            conn.hset('mykey_another_prefix1:' + afterTomorrow, str(machineId), json.dumps(fbfcalue3))

            conn.expire('mykey_another_prefix:' + today, 259200)  # 3-day TTL
            conn.expire('mykey_another_prefix:' + tomorrow, 259200)
            conn.expire('mykey_another_prefix:' + afterTomorrow, 259200)
            conn.expire('mykey_another_prefix1:' + today, 259200)
            conn.expire('mykey_another_prefix1:' + tomorrow, 259200)
            conn.expire('mykey_another_prefix1:' + afterTomorrow, 259200)

            conn.hset('fy:be:de:ma', str(device_no), str(machineId))
            conn.expire('fy:be:de:ma', 864000)
            machineId += 1
            device_no += 1
        except Exception as e:
            print("write failed, error:", e)

2. Use a pipeline instead of sending a request for every key

The change is simple and only takes two small modifications, sketched below.
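The original post showed this step as a screenshot. Here is a minimal sketch of the same idea, assuming the get_conn() helper and the loop body from step 1 (the key and field names below are just illustrative): create a pipeline from the connection, queue commands on it, and flush in batches with execute().

def test_set_redis_with_pipeline():
    conn = get_conn()
    pipe = conn.pipeline()  # change 1: commands queue locally instead of going out one by one
    for i in range(300000):
        key = 'mykey_prefix' + str(88800000 + i)
        # ... the same hset/expire calls as step 1, issued on `pipe` instead of `conn` ...
        pipe.hset(key, "machineId", str(43696000000000 + i))
        pipe.expire(key, 864000)  # 10-day TTL, queued like any other command
        if i % 1000 == 0:
            pipe.execute()  # change 2: flush 1000 devices' worth of commands in one round trip
    pipe.execute()  # flush whatever is left in the queue

Note that redis-py wraps a pipeline in MULTI/EXEC by default; since these writes are independent, conn.pipeline(transaction=False) would give a plain pipeline.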

The pipeline's effect is dramatic: runtime drops from 365 seconds to 126, shaving off 239 seconds, nearly 4 minutes, in one step!


3. Use pipeline + hmset

Assemble all the fields and values that belong to the same key into a dict, and write them in one shot with hmset, as in the sketch below.

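This step was also a screenshot in the original. A minimal sketch under the same assumptions as above: the nine per-field hset calls for each key collapse into a single hmset of a dict.

def test_set_redis_with_hmset():
    conn = get_conn()
    pipe = conn.pipeline(transaction=False)
    for i in range(300000):
        key = 'mykey_prefix' + str(88800000 + i)
        info = {
            "machineId": str(43696000000000 + i),
            "machineNum": str(43696000000000 + i),
            "factoryId": "factory",
            "groupId": "group_id",
            "groupName": "groupName11",
            "workshopId": "workshopId11",
            "workshopName": "workshopName11",
            "source": "1",
            "errorTimeLimit": str(20),
        }
        pipe.hmset(key, info)  # one command writes the whole hash
        pipe.expire(key, 864000)
        if i % 1000 == 0:
            pipe.execute()
    pipe.execute()

In redis-py 3.5+, hmset is deprecated in favour of hset(key, mapping=info), which issues the same multi-field HSET command.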

hmset squeezes the time down again, from 126 seconds to 98, another 28 seconds saved, nearly half a minute.


To push the time down further, I reimplemented it in golang, and the performance is impressive.

Python's 98 seconds becomes 7.5 seconds, a full 13x speedup, and 48x faster than the original 365 seconds!


// rdb, ctx1, devices, deviceInfoKey, fystTodayKey, fymstTodayKey, shiftToday
// and countFail come from the surrounding program, which the post doesn't show.
func setDevice() {
    var deviceNo string
    var deviceInfo map[string]interface{}
    // get a Redis pipeline
    pipe := rdb.Pipeline()
    defer pipe.Exec(ctx1) // flush any commands still queued when we return

    for i := 0; i < len(devices); i++ {
        device := devices[i]
        // each device is a single-entry map: deviceNo -> deviceInfo
        for k, v := range device {
            deviceNo = k
            deviceInfo = v
        }

        deviceKey := fmt.Sprintf("%s:%s", deviceInfoKey, deviceNo)

        machineId := deviceInfo["machineId"].(string)
        // set the shift-schedule info
        shiftInfo, _ := json.Marshal(shiftToday)
        pipe.HSetNX(ctx1, fystTodayKey, machineId, shiftInfo)
        pipe.Expire(ctx1, fystTodayKey, time.Hour*24)
        pipe.HSetNX(ctx1, fymstTodayKey, machineId, shiftInfo)
        pipe.Expire(ctx1, fymstTodayKey, time.Hour*24)

        // HMSet instead of HSet: write the whole map in one command
        pipe.HMSet(ctx1, deviceKey, deviceInfo)
        pipe.Expire(ctx1, deviceKey, time.Hour*72)
        // flush the pipeline every 1000 devices
        if i%1000 == 0 && i >= 1000 {
            failCmd, err1 := pipe.Exec(ctx1)
            log.Printf("setting collector #%d\n", i)
            if err1 != nil {
                // on error, count the commands in this batch as failures
                countFail += len(failCmd)
            }
        }
    }
}

4. Summary

  • For bulk writes, a pipeline improves performance dramatically
  • Fields and values that share the same key can be written with hmset instead of hset, another solid performance win
  • When processing large volumes of data, replacing Python with golang is a great choice
