This article collects code examples for the Java method org.apache.hadoop.hive.metastore.api.Decimal.<init>(), showing how Decimal.<init>() is used in practice. The examples are extracted from selected projects hosted on GitHub, Stack Overflow, Maven, and similar platforms, so they are representative references. Details of the Decimal.<init>() method:

Package path: org.apache.hadoop.hive.metastore.api.Decimal
Class name: Decimal
Method name: <init>

Javadoc summary: Performs a deep copy on <i>other</i>.
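Before the examples, here is a minimal sketch (plain JDK only, no Hive dependency) of the encoding that the Decimal constructors in the examples below consume: a big-endian two's-complement unscaled value plus a short scale, exactly what BigDecimal.unscaledValue().toByteArray() and BigDecimal.scale() produce. The class and method names here are illustrative, not part of the Hive API.

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

// Illustrative helper (not part of Hive): shows the (unscaled bytes, scale)
// pair that org.apache.hadoop.hive.metastore.api.Decimal wraps.
public class MetastoreDecimalSketch {

  /** Encode a BigDecimal the way the examples below feed new Decimal(...). */
  static ByteBuffer toUnscaledBytes(BigDecimal d) {
    // Big-endian two's-complement representation of the unscaled value.
    return ByteBuffer.wrap(d.unscaledValue().toByteArray());
  }

  /** Decode the pair back into a BigDecimal (the inverse operation). */
  static BigDecimal fromUnscaledBytes(ByteBuffer unscaled, short scale) {
    byte[] bytes = new byte[unscaled.remaining()];
    unscaled.duplicate().get(bytes); // duplicate() leaves the buffer position untouched
    return new BigDecimal(new BigInteger(bytes), scale);
  }

  public static void main(String[] args) {
    BigDecimal original = new BigDecimal("123.45");
    ByteBuffer unscaled = toUnscaledBytes(original);
    short scale = (short) original.scale();
    BigDecimal roundTripped = fromUnscaledBytes(unscaled, scale);
    System.out.println(roundTripped); // prints 123.45
  }
}
```

Keeping the encode and decode steps symmetric like this is why the examples below can pass the pair straight into either Decimal constructor overload regardless of argument order.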
Code example source: apache/hive

public static Decimal getDecimal(ByteBuffer unscaled, short scale) {
  return new Decimal((short) scale, unscaled);
}
Code example source: apache/hive

public Decimal deepCopy() {
  return new Decimal(this);
}
Code example source: prestodb/presto

public static Decimal toMetastoreDecimal(BigDecimal decimal)
{
  return new Decimal(ByteBuffer.wrap(decimal.unscaledValue().toByteArray()), (short) decimal.scale());
}
Code example source: apache/hive

public static Decimal getDecimal(int number, int scale) {
  ByteBuffer bb = ByteBuffer.allocate(4);
  bb.asIntBuffer().put(number);
  return new Decimal((short) scale, bb);
}
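The getDecimal(int, scale) helper above routes the int through a 4-byte buffer; asIntBuffer().put(...) writes the value big-endian at offset 0 without advancing the outer buffer's position, so the resulting bytes read back as a two's-complement unscaled value. A small sketch, using only the JDK, to confirm that byte layout (the class and method names here are illustrative):

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

public class IntBufferLayoutSketch {

  /** Re-create the buffer that getDecimal(int, int) above passes to Decimal. */
  static ByteBuffer encode(int number) {
    ByteBuffer bb = ByteBuffer.allocate(4);
    bb.asIntBuffer().put(number); // big-endian write; bb's own position stays at 0
    return bb;
  }

  public static void main(String[] args) {
    ByteBuffer bb = encode(123);
    // The 4 bytes read back as the two's-complement unscaled value 123,
    // so with scale 2 the logical value is 1.23.
    BigDecimal value = new BigDecimal(new BigInteger(bb.array()), 2);
    System.out.println(value); // prints 1.23
  }
}
```

Note that negative ints also survive this round trip, because the 4-byte two's-complement encoding sign-extends correctly when BigInteger interprets the array.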
Code example source: apache/hive

public static Decimal createThriftDecimal(String s) {
  BigDecimal d = new BigDecimal(s);
  return new Decimal((short) d.scale(), ByteBuffer.wrap(d.unscaledValue().toByteArray()));
}
Code example source: apache/drill

private Decimal convertToThriftDecimal(HiveDecimal d) {
  return new Decimal(ByteBuffer.wrap(d.unscaledValue().toByteArray()), (short) d.scale());
}
Code example source: apache/hive

/**
 * Performs a deep copy on <i>other</i>.
 */
public DecimalColumnStatsData(DecimalColumnStatsData other) {
  __isset_bitfield = other.__isset_bitfield;
  if (other.isSetLowValue()) {
    this.lowValue = new Decimal(other.lowValue);
  }
  if (other.isSetHighValue()) {
    this.highValue = new Decimal(other.highValue);
  }
  this.numNulls = other.numNulls;
  this.numDVs = other.numDVs;
  if (other.isSetBitVectors()) {
    this.bitVectors = org.apache.thrift.TBaseHelper.copyBinary(other.bitVectors);
  }
}
Code example source: apache/hive

case 1: // LOW_VALUE
  if (schemeField.type == org.apache.thrift.protocol.TType.STRUCT) {
    struct.lowValue = new Decimal();
    struct.lowValue.read(iprot);
    struct.setLowValueIsSet(true);
  }
  break;
case 2: // HIGH_VALUE
  if (schemeField.type == org.apache.thrift.protocol.TType.STRUCT) {
    struct.highValue = new Decimal();
    struct.highValue.read(iprot);
    struct.setHighValueIsSet(true);
  }
  break;
Code example source: prestodb/presto

@Test
public void testDecimalStatsToColumnStatistics()
{
  DecimalColumnStatsData decimalColumnStatsData = new DecimalColumnStatsData();
  BigDecimal low = new BigDecimal("0");
  decimalColumnStatsData.setLowValue(new Decimal(ByteBuffer.wrap(low.unscaledValue().toByteArray()), (short) low.scale()));
  BigDecimal high = new BigDecimal("100");
  decimalColumnStatsData.setHighValue(new Decimal(ByteBuffer.wrap(high.unscaledValue().toByteArray()), (short) high.scale()));
  decimalColumnStatsData.setNumNulls(1);
  decimalColumnStatsData.setNumDVs(20);
  ColumnStatisticsObj columnStatisticsObj = new ColumnStatisticsObj("my_col", DECIMAL_TYPE_NAME, decimalStats(decimalColumnStatsData));
  HiveColumnStatistics actual = fromMetastoreApiColumnStatistics(columnStatisticsObj, OptionalLong.of(1000));
  assertEquals(actual.getIntegerStatistics(), Optional.empty());
  assertEquals(actual.getDoubleStatistics(), Optional.empty());
  assertEquals(actual.getDecimalStatistics(), Optional.of(new DecimalStatistics(Optional.of(low), Optional.of(high))));
  assertEquals(actual.getDateStatistics(), Optional.empty());
  assertEquals(actual.getBooleanStatistics(), Optional.empty());
  assertEquals(actual.getMaxValueSizeInBytes(), OptionalLong.empty());
  assertEquals(actual.getTotalSizeInBytes(), OptionalLong.empty());
  assertEquals(actual.getNullsCount(), OptionalLong.of(1));
  assertEquals(actual.getDistinctValuesCount(), OptionalLong.of(19));
}
Code example source: apache/hive

@Override
public void read(org.apache.thrift.protocol.TProtocol prot, DecimalColumnStatsData struct) throws org.apache.thrift.TException {
  TTupleProtocol iprot = (TTupleProtocol) prot;
  struct.numNulls = iprot.readI64();
  struct.setNumNullsIsSet(true);
  struct.numDVs = iprot.readI64();
  struct.setNumDVsIsSet(true);
  BitSet incoming = iprot.readBitSet(3);
  if (incoming.get(0)) {
    struct.lowValue = new Decimal();
    struct.lowValue.read(iprot);
    struct.setLowValueIsSet(true);
  }
  if (incoming.get(1)) {
    struct.highValue = new Decimal();
    struct.highValue.read(iprot);
    struct.setHighValueIsSet(true);
  }
  if (incoming.get(2)) {
    struct.bitVectors = iprot.readBinary();
    struct.setBitVectorsIsSet(true);
  }
}
Code example source: apache/drill

} else if (fName.equals("lowValue")) {
  BigDecimal d = new BigDecimal(value);
  decimalStats.setLowValue(new Decimal(ByteBuffer.wrap(d
      .unscaledValue().toByteArray()), (short) d.scale()));
} else if (fName.equals("highValue")) {
  BigDecimal d = new BigDecimal(value);
  decimalStats.setHighValue(new Decimal(ByteBuffer.wrap(d
      .unscaledValue().toByteArray()), (short) d.scale()));
} else {
Code example source: org.apache.hive/hive-standalone-metastore

public static Decimal getDecimal(ByteBuffer unscaled, short scale) {
  return new Decimal((short) scale, unscaled);
}
Code example source: org.apache.hive/hive-standalone-metastore

public static Decimal getDecimal(int number, int scale) {
  ByteBuffer bb = ByteBuffer.allocate(4);
  bb.asIntBuffer().put(number);
  return new Decimal((short) scale, bb);
}
Code example source: prestosql/presto

public static Decimal toMetastoreDecimal(BigDecimal decimal)
{
  return new Decimal(ByteBuffer.wrap(decimal.unscaledValue().toByteArray()), (short) decimal.scale());
}
Code example source: org.apache.hive/hive-standalone-metastore

public static Decimal createThriftDecimal(String s) {
  BigDecimal d = new BigDecimal(s);
  return new Decimal((short) d.scale(), ByteBuffer.wrap(d.unscaledValue().toByteArray()));
}
Code example source: org.spark-project.hive/hive-metastore

private static Decimal createThriftDecimal(String s) {
  BigDecimal d = new BigDecimal(s);
  return new Decimal(ByteBuffer.wrap(d.unscaledValue().toByteArray()), (short) d.scale());
}
Code example source: com.teradata.tempto/tempto-core

private static Decimal toHiveDecimal(Object objectValue, int scale)
{
  double value = ((Number) objectValue).doubleValue();
  BigInteger bigInteger = BigInteger.valueOf(Math.round(value * scale));
  return new Decimal(ByteBuffer.wrap(bigInteger.toByteArray()), checkedCast(scale));
}
Code example source: prestodb/tempto

private static Decimal toHiveDecimal(Object objectValue, int scale)
{
  double value = ((Number) objectValue).doubleValue();
  BigInteger bigInteger = BigInteger.valueOf(Math.round(value * scale));
  return new Decimal(ByteBuffer.wrap(bigInteger.toByteArray()), checkedCast(scale));
}
Code example source: com.alibaba.blink/flink-connector-hive

@VisibleForTesting
protected static org.apache.hadoop.hive.metastore.api.Decimal fromFlinkDecimal(Decimal flinkDecimal) {
  BigDecimal bigDecimal = flinkDecimal.toBigDecimal();
  return new org.apache.hadoop.hive.metastore.api.Decimal(
      ByteBuffer.wrap(bigDecimal.unscaledValue().toByteArray()), (short) bigDecimal.scale());
}
Code example source: com.facebook.presto.hive/hive-apache

private Decimal convertToThriftDecimal(HiveDecimal d) {
  return new Decimal(ByteBuffer.wrap(d.unscaledValue().toByteArray()), (short) d.scale());
}