Flink Table API
1. The various APIs for creating Flink Table environments
// Imports assume Flink 1.11/1.12 with flink-table-api-scala-bridge on the classpath
import org.apache.flink.api.scala.ExecutionEnvironment
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.table.api.{EnvironmentSettings, TableEnvironment}
import org.apache.flink.table.api.bridge.scala.{BatchTableEnvironment, StreamTableEnvironment}

// Old-planner batch environment, built directly from an ExecutionEnvironment
def oldFlinkBatchTable(): BatchTableEnvironment = {
  val batchEnv = ExecutionEnvironment.getExecutionEnvironment
  val oldBatchTableEnv = BatchTableEnvironment.create(batchEnv)
  oldBatchTableEnv
}

// Old-planner streaming environment, configured via EnvironmentSettings
def oldFlinkStreamTable(): StreamTableEnvironment = {
  val env = StreamExecutionEnvironment.getExecutionEnvironment
  val settings = EnvironmentSettings.newInstance()
    .useOldPlanner()
    .inStreamingMode()
    .build()
  val oldStreamTableEnv = StreamTableEnvironment.create(env, settings)
  oldStreamTableEnv
}

// Blink-planner batch environment; batch mode uses the pure TableEnvironment
// (no DataStream/DataSet bridge)
def blinkBatchTable(): TableEnvironment = {
  val blinkBatchSettings = EnvironmentSettings.newInstance()
    .useBlinkPlanner()
    .inBatchMode()
    .build()
  val blinkBatchTableEnv = TableEnvironment.create(blinkBatchSettings)
  blinkBatchTableEnv
}

// Blink-planner streaming environment, bridged to the DataStream API
def blinkStreamTable(): StreamTableEnvironment = {
  val env = StreamExecutionEnvironment.getExecutionEnvironment
  val blinkStreamSettings = EnvironmentSettings.newInstance()
    .useBlinkPlanner()
    .inStreamingMode()
    .build()
  val blinkStreamTableEnv = StreamTableEnvironment.create(env, blinkStreamSettings)
  blinkStreamTableEnv
}
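As a quick sanity check, here is a minimal sketch of using one of these environments end to end (a hypothetical example: the object name, the tuple stream, the view name events, and the query are all made up for illustration; Flink 1.11/1.12 is assumed). It registers a DataStream as a view, queries it with SQL, and bridges the result back to a DataStream:

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api._
import org.apache.flink.table.api.bridge.scala._

object BlinkStreamDemo {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tableEnv = StreamTableEnvironment.create(
      env,
      EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build())

    // Made-up input stream of (itemId, behavior) pairs
    val events = env.fromElements(("item-1", "pv"), ("item-2", "buy"), ("item-1", "pv"))

    // Register the stream as a temporary view and query it with SQL
    tableEnv.createTemporaryView("events", events, $"itemId", $"behavior")
    val pvOnly = tableEnv.sqlQuery(
      "SELECT itemId, behavior FROM events WHERE behavior = 'pv'")

    // Bridge back to the DataStream API and print the results
    pvOnly.toAppendStream[(String, String)].print()
    env.execute("blink stream table demo")
  }
}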
2. To reference fields with the dollar-sign ($) syntax, you need the following import:
import org.apache.flink.table.api._
val dataTable = tableEnv.fromDataStream(dataStream, $"itemId", $"behavior", $"timestamp".rowtime as "ts")
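The rowtime attribute ts declared above can then drive an event-time window. Below is a hedged sketch of a tumbling-window count (the 10-second size and the output names windowEnd and cnt are illustrative; dataTable is the table from the line above, and dataStream is assumed to already carry timestamps and watermarks):

// Count behaviors per item in 10-second tumbling event-time windows,
// grouping on the "ts" rowtime attribute declared when dataTable was created
val windowedCounts = dataTable
  .window(Tumble over 10.seconds on $"ts" as $"w")
  .groupBy($"itemId", $"w")
  .select($"itemId", $"w".end as "windowEnd", $"behavior".count as "cnt")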