This article collects code examples for the Java class org.apache.kafka.connect.data.Decimal, showing how the class is used in practice. The examples are drawn from selected projects on GitHub, Stack Overflow, Maven, and similar platforms, so they should serve as useful references. Details of the Decimal class:
Package: org.apache.kafka.connect.data.Decimal
Class name: Decimal
An arbitrary-precision signed decimal number. The value is unscaled * 10^-scale, where:
* unscaled is an integer
* scale is an integer representing how many places the decimal point has been shifted in the unscaled value
Decimal does not provide a fixed schema because it is parameterized by the scale, which is fixed on the schema rather than being part of the value.
The underlying representation of this type is bytes containing a two's complement integer.
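The representation described above can be illustrated with a JDK-only sketch (plain java.math, no Connect dependency): the payload is the two's complement bytes of the unscaled value, and the scale travels separately, as it would on the schema.

```java
import java.math.BigDecimal;
import java.math.BigInteger;

// Illustrative sketch of how a logical decimal maps to the byte
// representation Decimal uses on the wire (class name is hypothetical).
public class DecimalBytesSketch {
    public static void main(String[] args) {
        BigDecimal logical = new BigDecimal("12.34");   // unscaled = 1234, scale = 2

        // Serialization side: the payload is the two's complement bytes
        // of the unscaled value; the scale lives in the schema, not the bytes.
        byte[] wireBytes = logical.unscaledValue().toByteArray();
        int schemaScale = logical.scale();

        // Deserialization side: rebuild the BigDecimal from bytes + schema scale.
        BigDecimal restored = new BigDecimal(new BigInteger(wireBytes), schemaScale);

        System.out.println(restored);                   // 12.34
        System.out.println(restored.equals(logical));   // true
    }
}
```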
Code example source: origin: debezium/debezium
/**
 * Returns a {@link SchemaBuilder} for a decimal number depending on {@link JdbcValueConverters.DecimalMode}. You
 * can use the resulting schema builder to set additional schema settings such as required/optional, default value,
 * and documentation.
 *
 * @param mode the mode in which the number should be encoded
 * @param precision the precision of the decimal
 * @param scale the scale of the decimal
 * @return the schema builder
 */
public static SchemaBuilder builder(DecimalMode mode, int precision, int scale) {
    switch (mode) {
        case DOUBLE:
            return SchemaBuilder.float64();
        case PRECISE:
            return Decimal.builder(scale)
                    .parameter(PRECISION_PARAMETER_KEY, String.valueOf(precision));
        case STRING:
            return SchemaBuilder.string();
    }
    throw new IllegalArgumentException("Unknown decimalMode");
}
Code example source: origin: hpgrahsl/kafka-connect-mongodb
@TestFactory
@DisplayName("tests for logical type decimal field conversions (legacy)")
public List<DynamicTest> testDecimalFieldConverterLegacy() {
    SinkFieldConverter converter =
            new DecimalFieldConverter(DecimalFieldConverter.Format.LEGACYDOUBLE);
    List<DynamicTest> tests = new ArrayList<>();
    new ArrayList<>(Arrays.asList(
            new BigDecimal("-1234567890.09876543210"),
            BigDecimal.ZERO,
            new BigDecimal("+1234567890.09876543210")
    )).forEach(
            el -> tests.add(dynamicTest("conversion with "
                    + converter.getClass().getSimpleName() + " for " + el,
                    () -> assertEquals(el.doubleValue(),
                            ((BsonDouble) converter.toBson(el)).getValue())
            ))
    );
    tests.add(dynamicTest("optional type conversions", () -> {
        Schema valueOptionalDefault = Decimal.builder(0).optional().defaultValue(BigDecimal.ZERO);
        assertAll("checks",
                () -> assertThrows(DataException.class, () -> converter.toBson(null, Decimal.schema(0))),
                () -> assertEquals(new BsonNull(), converter.toBson(null, Decimal.builder(0).optional())),
                () -> assertEquals(((BigDecimal) valueOptionalDefault.defaultValue()).doubleValue(),
                        ((BsonDouble) converter.toBson(null, valueOptionalDefault)).getValue())
        );
    }));
    return tests;
}
Code example source: origin: hpgrahsl/kafka-connect-mongodb
public DecimalFieldConverter() {
    super(Decimal.schema(0));
    this.format = Format.DECIMAL128;
}
Code example source: origin: org.apache.kafka/connect-api
@Override
public Headers addDecimal(String key, BigDecimal value) {
    if (value == null) {
        return add(key, null, null);
    }
    // Check that this is a valid decimal ...
    Schema schema = Decimal.schema(value.scale());
    Decimal.fromLogical(schema, value);
    return addWithoutValidating(key, value, schema);
}
Code example source: origin: com.datamountaineer/kafka-connect-common
@Override
public Object convert(Schema schema, Object value) {
    if (!(value instanceof byte[]))
        throw new DataException("Invalid type for Decimal, underlying representation should be bytes but was " + value.getClass());
    return Decimal.toLogical(schema, (byte[]) value);
}
});
Code example source: origin: org.apache.kafka/connect-api
return Decimal.toLogical(toSchema, (byte[]) value);
return Decimal.fromLogical(toSchema, (BigDecimal) value);
Code example source: origin: hpgrahsl/kafka-connect-mongodb
@TestFactory
@DisplayName("tests for logical type decimal field conversions (new)")
public List<DynamicTest> testDecimalFieldConverterNew() {
    SinkFieldConverter converter = new DecimalFieldConverter();
    List<DynamicTest> tests = new ArrayList<>();
    new ArrayList<>(Arrays.asList(
            new BigDecimal("-1234567890.09876543210"),
            BigDecimal.ZERO,
            new BigDecimal("+1234567890.09876543210")
    )).forEach(
            el -> tests.add(dynamicTest("conversion with "
                    + converter.getClass().getSimpleName() + " for " + el,
                    () -> assertEquals(el,
                            ((BsonDecimal128) converter.toBson(el)).getValue().bigDecimalValue())
            ))
    );
    tests.add(dynamicTest("optional type conversions", () -> {
        Schema valueOptionalDefault = Decimal.builder(0).optional().defaultValue(BigDecimal.ZERO);
        assertAll("checks",
                () -> assertThrows(DataException.class, () -> converter.toBson(null, Decimal.schema(0))),
                () -> assertEquals(new BsonNull(), converter.toBson(null, Decimal.builder(0).optional())),
                () -> assertEquals(valueOptionalDefault.defaultValue(),
                        ((BsonDecimal128) converter.toBson(null, valueOptionalDefault)).getValue().bigDecimalValue())
        );
    }));
    return tests;
}
Code example source: origin: com.github.jcustenborder.kafka.connect/connect-utils
static Object decimal(Schema schema, Object value) {
    if (value instanceof byte[]) {
        byte[] bytes = (byte[]) value;
        return Decimal.toLogical(schema, bytes);
    }
    if (value instanceof BigDecimal) {
        BigDecimal decimal = (BigDecimal) value;
        final int scale = Integer.parseInt(schema.parameters().get(Decimal.SCALE_FIELD));
        if (scale == decimal.scale()) {
            return decimal;
        } else {
            return decimal.setScale(scale);
        }
    }
    return value;
}
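Note that the `setScale(scale)` call in the helper above passes no rounding mode; in that form, BigDecimal succeeds only when the adjustment is exact. A minimal JDK-only sketch of the behavior:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Sketch of BigDecimal.setScale semantics (class name is hypothetical).
public class SetScaleSketch {
    public static void main(String[] args) {
        // Widening the scale is always exact: 1.2 -> 1.200
        System.out.println(new BigDecimal("1.2").setScale(3)); // 1.200

        // Narrowing the scale past nonzero digits requires a rounding mode;
        // without one, BigDecimal throws ArithmeticException.
        try {
            new BigDecimal("1.25").setScale(1);
        } catch (ArithmeticException e) {
            System.out.println("needs a rounding mode: " + e.getMessage());
        }
        System.out.println(new BigDecimal("1.25").setScale(1, RoundingMode.HALF_UP)); // 1.3
    }
}
```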
Code example source: origin: debezium/debezium
protected List<SchemaAndValueField> schemaAndValuesForMoneyTypes() {
    return Collections.singletonList(new SchemaAndValueField("csh", Decimal.builder(0).optional().build(),
            BigDecimal.valueOf(1234.11d)));
}
Code example source: origin: hpgrahsl/kafka-connect-mongodb
public DecimalFieldConverter(Format format) {
    super(Decimal.schema(0));
    this.format = format;
}
Code example source: origin: com.github.jcustenborder.kafka.connect/kafka-connect-cdc-test
static Object decimal(Schema schema, Object value) {
    if (value instanceof byte[]) {
        byte[] bytes = (byte[]) value;
        return Decimal.toLogical(schema, bytes);
    }
    if (value instanceof BigDecimal) {
        BigDecimal decimal = (BigDecimal) value;
        final int scale = Integer.parseInt(schema.parameters().get(Decimal.SCALE_FIELD));
        if (scale == decimal.scale()) {
            return decimal;
        } else {
            return decimal.setScale(scale);
        }
    }
    if (value instanceof Number) {
        Number number = (Number) value;
        int scale = Integer.parseInt(schema.parameters().get(Decimal.SCALE_FIELD));
        BigDecimal decimal = BigDecimal.valueOf(number.longValue(), scale);
        return decimal;
    }
    return value;
}
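The `Number` branch above rebuilds the decimal by treating `number.longValue()` as the unscaled value. A small JDK-only sketch of `BigDecimal.valueOf(unscaled, scale)` and its truncation caveat:

```java
import java.math.BigDecimal;

// Sketch of the Number-to-BigDecimal reconstruction (class name is hypothetical).
public class ValueOfScaleSketch {
    public static void main(String[] args) {
        // BigDecimal.valueOf(unscaledVal, scale) == unscaledVal * 10^-scale,
        // matching the Number branch of decimal(...) above.
        BigDecimal fromLong = BigDecimal.valueOf(123450L, 4);
        System.out.println(fromLong); // 12.3450

        // Caveat: number.longValue() truncates any fractional part first,
        // so a Double input such as 12.9 contributes only 12 here.
        Number n = 12.9d;
        System.out.println(BigDecimal.valueOf(n.longValue(), 1)); // 1.2
    }
}
```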
Code example source: origin: debezium/debezium
protected List<SchemaAndValueField> schemasAndValuesForBigDecimalEncodedNumericTypes() {
    final Struct dvs = new Struct(VariableScaleDecimal.schema());
    dvs.put("scale", 4).put("value", new BigDecimal("10.1111").unscaledValue().toByteArray());
    final Struct nvs = new Struct(VariableScaleDecimal.schema());
    nvs.put("scale", 4).put("value", new BigDecimal("22.2222").unscaledValue().toByteArray());
    final Struct dvs_int = new Struct(VariableScaleDecimal.schema());
    dvs_int.put("scale", 0).put("value", new BigDecimal("10").unscaledValue().toByteArray());
    final Struct nvs_int = new Struct(VariableScaleDecimal.schema());
    nvs_int.put("scale", 0).put("value", new BigDecimal("22").unscaledValue().toByteArray());
    final List<SchemaAndValueField> fields = new ArrayList<SchemaAndValueField>(Arrays.asList(
            new SchemaAndValueField("d", Decimal.builder(2).parameter(TestHelper.PRECISION_PARAMETER_KEY, "3").optional().build(), new BigDecimal("1.10")),
            new SchemaAndValueField("dzs", Decimal.builder(0).parameter(TestHelper.PRECISION_PARAMETER_KEY, "4").optional().build(), new BigDecimal("10")),
            new SchemaAndValueField("dvs", VariableScaleDecimal.optionalSchema(), dvs),
            new SchemaAndValueField("d_nn", Decimal.builder(2).parameter(TestHelper.PRECISION_PARAMETER_KEY, "3").build(), new BigDecimal("3.30")),
            new SchemaAndValueField("n", Decimal.builder(4).parameter(TestHelper.PRECISION_PARAMETER_KEY, "6").optional().build(), new BigDecimal("22.2200")),
            new SchemaAndValueField("nzs", Decimal.builder(0).parameter(TestHelper.PRECISION_PARAMETER_KEY, "4").optional().build(), new BigDecimal("22")),
            new SchemaAndValueField("nvs", VariableScaleDecimal.optionalSchema(), nvs),
            new SchemaAndValueField("d_int", Decimal.builder(2).parameter(TestHelper.PRECISION_PARAMETER_KEY, "3").optional().build(), new BigDecimal("1.00")),
            new SchemaAndValueField("dvs_int", VariableScaleDecimal.optionalSchema(), dvs_int),
            new SchemaAndValueField("n_int", Decimal.builder(4).parameter(TestHelper.PRECISION_PARAMETER_KEY, "6").optional().build(), new BigDecimal("22.0000")),
            new SchemaAndValueField("nvs_int", VariableScaleDecimal.optionalSchema(), nvs_int)
    ));
    return fields;
}
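Unlike `Decimal`, where the scale is fixed on the schema, the `VariableScaleDecimal` structs above carry both the scale and the unscaled bytes inside the value itself. A JDK-only sketch of that encode/decode round trip (the Connect struct plumbing is omitted; the class name is hypothetical):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class VariableScaleSketch {
    public static void main(String[] args) {
        // Encode: both the scale and the unscaled bytes travel in the value,
        // mirroring the "scale"/"value" struct fields above.
        BigDecimal original = new BigDecimal("10.1111");
        int scale = original.scale();                         // 4
        byte[] value = original.unscaledValue().toByteArray();

        // Decode: no schema-side scale is needed; it is read from the payload.
        BigDecimal decoded = new BigDecimal(new BigInteger(value), scale);
        System.out.println(decoded); // 10.1111
    }
}
```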
Code example source: origin: debezium/debezium
(int)LocalDate.of(2016, Month.NOVEMBER, 6).toEpochDay()
)),
new SchemaAndValueField("numeric_array", SchemaBuilder.array(Decimal.builder(2).parameter(TestHelper.PRECISION_PARAMETER_KEY, "10").optional().build()).optional().build(),
Arrays.asList(
new BigDecimal("1.20"),
Code example source: origin: org.apache.kafka/connect-api
/**
 * Convert the specified value to a {@link Decimal decimal} value.
 * Not supplying a schema may limit the ability to convert to the desired type.
 *
 * @param schema the schema for the value; may be null
 * @param value the value to be converted; may be null
 * @param scale the scale of the resulting decimal
 * @return the representation as a decimal, or null if the supplied value was null
 * @throws DataException if the value cannot be converted to a decimal value
 */
public static BigDecimal convertToDecimal(Schema schema, Object value, int scale) {
    return (BigDecimal) convertTo(Decimal.schema(scale), schema, value);
}
Code example source: origin: debezium/debezium
assertRecordSchemaAndValues(expectedBefore, updatedRecord, Envelope.FieldName.BEFORE);
List<SchemaAndValueField> expectedAfter = Collections.singletonList(new SchemaAndValueField("num_val", Decimal.builder(2).parameter(TestHelper.PRECISION_PARAMETER_KEY, "5").optional().build(), new BigDecimal("123.45")));
assertRecordSchemaAndValues(expectedAfter, updatedRecord, Envelope.FieldName.AFTER);
Collections.singletonList(new SchemaAndValueField("num_val", Decimal.builder(1).parameter(TestHelper.PRECISION_PARAMETER_KEY, "6").optional().build(), new BigDecimal("123.4"))), updatedRecord, Envelope.FieldName.AFTER);
Collections.singletonList(new SchemaAndValueField("num_val", Decimal.builder(4).parameter(TestHelper.PRECISION_PARAMETER_KEY, "12").optional().build(), new BigDecimal("2.4800"))), updatedRecord, Envelope.FieldName.AFTER);
Collections.singletonList(new SchemaAndValueField("num_val", Decimal.builder(0).parameter(TestHelper.PRECISION_PARAMETER_KEY, "12").optional().build(), new BigDecimal("1238"))), updatedRecord, Envelope.FieldName.AFTER);
Code example source: origin: org.apache.kafka/connect-api
return new SchemaAndValue(Schema.FLOAT64_SCHEMA, dValue);
Schema schema = Decimal.schema(decimal.scale());
return new SchemaAndValue(schema, decimal);
} catch (NumberFormatException e) {
Code example source: origin: debezium/debezium
private void assertBigintUnsignedPrecise(Struct value) {
    Struct after = value.getStruct(Envelope.FieldName.AFTER);
    Integer i = after.getInt32("id");
    assertThat(i).isNotNull();
    // Validate the schemas first. We expect org.apache.kafka.connect.data.Decimal (bytes-backed)
    // for the unsigned BIGINT columns, since their values can exceed the signed 64-bit range.
    assertThat(after.schema().field("c1").schema()).isEqualTo(Decimal.builder(0).schema());
    assertThat(after.schema().field("c2").schema()).isEqualTo(Decimal.builder(0).schema());
    // Signed BIGINT fits in a long, so it maps to an INT64 schema.
    assertThat(after.schema().field("c3").schema()).isEqualTo(Schema.INT64_SCHEMA);
    // Validate candidate values
    switch (i) {
        case 1:
            assertThat(after.get("c1")).isEqualTo(new BigDecimal("18446744073709551615"));
            assertThat(after.get("c2")).isEqualTo(new BigDecimal("18446744073709551615"));
            assertThat(after.getInt64("c3")).isEqualTo(9223372036854775807L);
            break;
        case 2:
            assertThat(after.get("c1")).isEqualTo(new BigDecimal("14446744073709551615"));
            assertThat(after.get("c2")).isEqualTo(new BigDecimal("14446744073709551615"));
            assertThat(after.getInt64("c3")).isEqualTo(-1223372036854775807L);
            break;
        case 3:
            assertThat(after.get("c1")).isEqualTo(new BigDecimal("0"));
            assertThat(after.get("c2")).isEqualTo(new BigDecimal("0"));
            assertThat(after.getInt64("c3")).isEqualTo(-9223372036854775808L);
    }
}
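The reason `c1` and `c2` above map to `Decimal` rather than `INT64` is that unsigned BIGINT values can exceed the signed 64-bit range. A JDK-only sketch (class name is hypothetical):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class UnsignedBigintSketch {
    public static void main(String[] args) {
        // 2^64 - 1: the UNSIGNED BIGINT maximum.
        BigInteger unsignedMax = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE);
        System.out.println(unsignedMax);              // 18446744073709551615

        // It does not fit in a signed 64-bit long ...
        System.out.println(unsignedMax.compareTo(BigInteger.valueOf(Long.MAX_VALUE)) > 0); // true

        // ... but round-trips exactly as a scale-0 BigDecimal.
        BigDecimal asDecimal = new BigDecimal(unsignedMax);
        System.out.println(asDecimal.toBigIntegerExact().equals(unsignedMax)); // true
    }
}
```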
Code example source: origin: com.github.jcustenborder.kafka.connect/connect-utils
public Parser() {
    this.typeParsers = new HashMap<>();
    registerTypeParser(Schema.BOOLEAN_SCHEMA, new BooleanParser());
    registerTypeParser(Schema.FLOAT32_SCHEMA, new Float32TypeParser());
    registerTypeParser(Schema.FLOAT64_SCHEMA, new Float64TypeParser());
    registerTypeParser(Schema.INT8_SCHEMA, new Int8TypeParser());
    registerTypeParser(Schema.INT16_SCHEMA, new Int16TypeParser());
    registerTypeParser(Schema.INT32_SCHEMA, new Int32TypeParser());
    registerTypeParser(Schema.INT64_SCHEMA, new Int64TypeParser());
    registerTypeParser(Schema.STRING_SCHEMA, new StringTypeParser());
    registerTypeParser(Decimal.schema(1), new DecimalTypeParser());
    registerTypeParser(Date.SCHEMA, new DateTypeParser());
    registerTypeParser(Time.SCHEMA, new TimeTypeParser());
    registerTypeParser(Timestamp.SCHEMA, new TimestampTypeParser());
}