Quantitative Contract / Contract Quantization / Contract Copy-Trading System Development (Strategies and Details): Example Source Code

Posted by xiaofufu on 2023-02-28

  First of all, we should clarify the basic concept of quantitative trading:


  Quantitative trading is an investment approach that applies modern statistics and mathematics, executed through computer technology. It mines massive amounts of historical data for "high-probability" events capable of producing excess returns, formulates strategies around them, uses quantitative models to verify and solidify those rules and strategies, and then strictly executes the solidified strategies to guide investment, aiming for sustained, stable returns above the market average.
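To make the "solidified rule" idea concrete, here is a minimal illustrative sketch (not from the original post) of one classic rule-based signal, a moving-average crossover. The function names and window sizes are our own assumptions for illustration:

```cpp
#include <cstddef>
#include <vector>

// Simple moving average of the `window` prices ending at index i (inclusive).
double sma(const std::vector<double>& px, std::size_t i, std::size_t window) {
    double sum = 0.0;
    for (std::size_t k = i + 1 - window; k <= i; ++k) sum += px[k];
    return sum / static_cast<double>(window);
}

// Solidified rule: +1 = long signal, -1 = short signal, 0 = not enough history.
int crossoverSignal(const std::vector<double>& px, std::size_t i,
                    std::size_t fast, std::size_t slow) {
    if (i + 1 < slow) return 0;  // need at least `slow` points of history
    double f = sma(px, i, fast);
    double s = sma(px, i, slow);
    if (f > s) return 1;   // fast average above slow: upward momentum
    if (f < s) return -1;  // fast average below slow: downward momentum
    return 0;
}
```

Once such a rule is verified against historical data, the system executes it mechanically, which is the "strict implementation" step the paragraph above describes.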


  The sample below is the entry point of MNN's model-quantization tool (`quantized.out`): it loads a source model, runs calibration-based quantization, and writes the quantized model to a new file.

```cpp
#include <fstream>
#include <memory>
#include <sstream>

// Project-local headers from MNN's quantization tool (paths as in the MNN source tree).
#include "MNN_generated.h"
#include "calibration.hpp"
#include "logkit.h"

int main(int argc, const char* argv[]) {
    if (argc < 4) {
        DLOG(INFO) << "Usage: ./quantized.out src.mnn dst.mnn preTreatConfig.json\n";
        return 0;
    }
    const char* modelFile      = argv[1];
    const char* dstFile        = argv[2];
    const char* preTreatConfig = argv[3];

    DLOG(INFO) << ">>> modelFile: " << modelFile;
    DLOG(INFO) << ">>> preTreatConfig: " << preTreatConfig;
    DLOG(INFO) << ">>> dstFile: " << dstFile;

    std::unique_ptr<MNN::NetT> netT;
    { // Read the original model file and build a Net object via FlatBuffers.
        std::ifstream input(modelFile);
        std::ostringstream outputOs;
        outputOs << input.rdbuf();
        netT = MNN::UnPackNet(outputOs.str().c_str()); // obtain the Net object
    }

    // Temporarily rebuild the net for inference.
    flatbuffers::FlatBufferBuilder builder(1024);
    auto offset = MNN::Net::Pack(builder, netT.get()); // pack the model into the buffer
    builder.Finish(offset);
    int size      = builder.GetSize();
    auto ocontent = builder.GetBufferPointer();

    // Create two buffers, both holding the serialized model data.
    std::unique_ptr<uint8_t[]> modelForInference(new uint8_t[size]);
    memcpy(modelForInference.get(), ocontent, size);
    std::unique_ptr<uint8_t[]> modelOriginal(new uint8_t[size]);
    memcpy(modelOriginal.get(), ocontent, size);

    netT.reset();
    netT = MNN::UnPackNet(modelOriginal.get());

    // Run the quantization itself; the Calibration class does the main work.
    DLOG(INFO) << "Calibrate the feature and quantize model...";
    std::shared_ptr<Calibration> calibration(
        new Calibration(netT.get(), modelForInference.get(), size, preTreatConfig));
    calibration->runQuantizeModel();
    DLOG(INFO) << "Quantize model done!";

    // Write the quantized model into a FlatBufferBuilder.
    flatbuffers::FlatBufferBuilder builderOutput(1024);
    builderOutput.ForceDefaults(true);
    auto len = MNN::Net::Pack(builderOutput, netT.get());
    builderOutput.Finish(len);

    // Write the FlatBufferBuilder contents to a file, producing the quantized model.
    {
        std::ofstream output(dstFile);
        output.write((const char*)builderOutput.GetBufferPointer(), builderOutput.GetSize());
    }
}
```


From the ITPUB blog. Link: http://blog.itpub.net/69956839/viewspace-2937458/. Please credit the source when reposting; unauthorized reproduction may incur legal liability.
