
I found that the JDBC connector section of the documentation only shows DDL and YAML format configuration, and I don't know how to use them. How can I read stream data from a JDBC data source when writing a Flink application jar? And if that is possible, does the stream get updated when the data changes in the data source?

1 Answer


Support for ingesting CDC streams from JDBC databases is coming in Flink 1.11. See FLIP-105. This will do what you're asking for, including updating the stream as the underlying database tables are changed.

For examples of what's already possible in Flink 1.10, see the Flink SQL Demo shown in this talk from Flink Forward by Timo Walther and Fabian Hueske. In Flink 1.10, for instance, you can join a stream with a lookup table in MySQL. In the demo (linked to above) this is done by using a Hive catalog to describe some MySQL tables, and then this query

SELECT
  l_proctime AS `querytime`,
  l_orderkey AS `order`,
  l_linenumber AS `linenumber`,
  l_currency AS `currency`,
  rs_rate AS `cur_rate`, 
  (l_extendedprice * (1 - l_discount) * (1 + l_tax)) / rs_rate AS `open_in_euro`
FROM prod_lineitem
JOIN hive.`default`.prod_rates FOR SYSTEM_TIME AS OF l_proctime ON rs_symbol = l_currency
WHERE
  l_linestatus = 'O';

is used to compute euro-normalized amounts using the current exchange rates stored in MySQL.
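If you'd rather not go through a Hive catalog, you can also register the MySQL table directly with the DDL form of the JDBC connector you found in the documentation. Here is a rough sketch for Flink 1.10 (untested; the schema, URL, and credentials are placeholders, so check the property keys against the JDBC connector docs for your exact version):

CREATE TABLE prod_rates (
  rs_symbol STRING,
  rs_rate DOUBLE
) WITH (
  'connector.type' = 'jdbc',
  'connector.url' = 'jdbc:mysql://localhost:3306/demo',
  'connector.table' = 'prod_rates',
  'connector.driver' = 'com.mysql.jdbc.Driver',
  'connector.username' = 'flink',
  'connector.password' = 'secret',
  'connector.lookup.cache.max-rows' = '5000',
  'connector.lookup.cache.ttl' = '10min'
);

In an application jar you can submit this DDL (and the join query above) through the Table API, e.g. with TableEnvironment#sqlUpdate in Flink 1.10, rather than via the SQL client's YAML configuration. Keep in mind that a lookup join like this only sees whatever is in MySQL at lookup time (subject to the cache TTL); it is not a changelog of the table, which is what FLIP-105 adds in Flink 1.11.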