Kafka Connect JDBC source not working on a view

zz2j4svz · asked 2021-06-04 · in Kafka

To avoid numeric columns being mapped to bytes, I created a view over the table in the source database and cast the columns to NUMBER. Now, when I run the standalone connector, it throws an invalid-number error. What is the reason behind this?
[2019-08-26 15:19:30,598] Failed to run query for table TimestampIncrementingTableQuerier{table="SYSTEM"."V_DTD", query='null', topicPrefix='corpneft-test-', incrementingColumn='TRAN_ID', timestampColumns=[]}: {} (io.confluent.connect.jdbc.source.JdbcSourceTask:332)
java.sql.SQLSyntaxErrorException: ORA-01722: invalid number
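For context, the setup described above is roughly the following. This is a hypothetical sketch: only the view name V_DTD and the column TRAN_ID come from the question and config; the underlying table and other column names are assumptions.

```sql
-- Hypothetical sketch of the view described in the question: the numeric
-- columns are cast explicitly so the connector does not map them to bytes.
-- The base table DTD_BASE and the TRAN_AMOUNT column are assumed names.
CREATE OR REPLACE VIEW v_dtd AS
SELECT
    TO_NUMBER(tran_id)     AS tran_id,     -- explicit NUMBER type
    TO_NUMBER(tran_amount) AS tran_amount  -- explicit NUMBER type
FROM dtd_base;
```

Note that a TO_NUMBER (or implicit) conversion in a view is evaluated lazily, row by row, at query time. A single row whose source value is not a valid number raises ORA-01722 only when a query actually scans that row, which is why the error can first surface when the connector polls the view rather than when the view is created.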


#
# Copyright 2018 Confluent Inc.
#
# Licensed under the Confluent Community License (the "License"); you may not use
# this file except in compliance with the License.  You may obtain a copy of the
# License at
#
# http://www.confluent.io/confluent-community-license
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OF ANY KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations under the License.
#

# A simple example that copies all tables from a SQLite database. The first few settings are
# required for all connectors: a name, the connector class to run, and the maximum number of
# tasks to create:

name=test-source-sqlite-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1

# The remaining configs are specific to the JDBC source connector. In this example, we connect to a
# SQLite database stored in the file test.db, use an auto-incrementing column called 'id' to
# detect new rows as they are added, and output to topics prefixed with 'test-sqlite-jdbc-', e.g.
# a table called 'users' will be written to the topic 'test-sqlite-jdbc-users'.

connection.url=jdbc:oracle:thin:system/manager@10.101.142.25:3164/CBSPT
table.whitelist=V_DTD
table.types=VIEW
mode=incrementing
poll.interval.ms=10000
incrementing.column.name=TRAN_ID
topic.prefix=corpneft-test-

# Define when identifiers should be quoted in DDL and DML statements.
# The default is 'always' to maintain backward compatibility with prior versions.
# Set this to 'never' to avoid quoting fully-qualified or simple table and column names.
# quote.sql.identifiers=always

validate.non.null=false

# transforms=TimestampConverter
# transforms.TimestampConverter.type=org.apache.kafka.connect.transforms.TimestampConverter$Value
# transforms.TimestampConverter.format=yyyy-MM-dd
# transforms.TimestampConverter.target.type=string
