Is it possible to use the DataStream API in PyFlink and then use Table API SQL?

oknrviil · posted 2022-12-09 in Apache

In PyFlink, is it possible to create a DataStream via StreamExecutionEnvironment's addSource(...), perform transformations on that stream with the DataStream API, and then convert the stream into a form on which SQL statements can be executed using the Table API?
I have a single stream containing many different types of events, so I'd like to create several data streams from that one source, each carrying a single type of data. I was thinking I could use side outputs based on the data in the initial stream and then run different SQL operations against each of those streams, knowing exactly what data each separate stream contains. I don't want a separate Flink job for each data type in the stream.


hk8txs48 · 1#

Yes, you can convert a DataStream into the Table API, which then lets you execute SQL statements against it: https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/table/data_stream_api/
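For reference, here is a minimal sketch of that flow, assuming a recent PyFlink release (1.15+). The field names (`event_type`, `amount`) and the sample data are made up for illustration, and `from_collection` stands in for your real source/connector:

```python
from pyflink.common import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

# Build a DataStream with the DataStream API; from_collection stands in
# for a real source such as addSource(...) or a connector.
events = env.from_collection(
    [("order", 10), ("click", 1), ("order", 25)],
    type_info=Types.ROW_NAMED(["event_type", "amount"],
                              [Types.STRING(), Types.INT()]))

# Any DataStream-API transformations can happen here.
orders = events.filter(lambda row: row[0] == "order")

# Hand the (transformed) stream over to the Table API and register a view.
t_env.create_temporary_view("orders", t_env.from_data_stream(orders))

# Run SQL against the view; the result can be turned back into a DataStream.
result = t_env.sql_query(
    "SELECT event_type, SUM(amount) AS total FROM orders GROUP BY event_type")
t_env.to_changelog_stream(result).print()

env.execute("datastream-to-table-sql")
```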
I also think your approach of splitting the stream by type is reasonable.
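As a rough sketch of that splitting step (note that side outputs in the PyFlink DataStream API require Flink 1.16 or later; the event types and field names below are again hypothetical):

```python
from pyflink.common import Types
from pyflink.datastream import (OutputTag, ProcessFunction,
                                StreamExecutionEnvironment)

env = StreamExecutionEnvironment.get_execution_environment()

row_type = Types.ROW_NAMED(["event_type", "amount"],
                           [Types.STRING(), Types.INT()])
clicks_tag = OutputTag("clicks", row_type)

class SplitByType(ProcessFunction):
    def process_element(self, value, ctx):
        if value[0] == "order":
            yield value              # main output: order events
        else:
            yield clicks_tag, value  # side output: everything else

events = env.from_collection(
    [("order", 10), ("click", 1), ("order", 25)], type_info=row_type)

orders = events.process(SplitByType(), output_type=row_type)
clicks = orders.get_side_output(clicks_tag)

# Each resulting stream can now be registered as its own temporary view
# (as in the snippet above) and queried with type-specific SQL, all within
# a single Flink job.
```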
