Pig error on the SUM function

zfycwa2u · published 2021-05-30 in Hadoop

I have data like this:

store   trn_date    dept_id sale_amt
1       2014-12-14  101     10007655
1       2014-12-14  101     10007654
1       2014-12-14  101     10007544
6       2014-12-14  104     100086544
8       2014-12-14  101     1000000
9       2014-12-14  106     1000000

I want to get the sum of the sale amount. First, I load the data with:

table = LOAD 'table' USING org.apache.hcatalog.pig.HCatLoader();

Then I group the data by store, transaction date, and dept id:

grp_table = GROUP table BY (store, tran_date, dept_id);

Finally, I try to generate the sum with:

grp_gen = FOREACH grp_table GENERATE 
           FLATTEN(group) AS (store, tran_date, dept_id),
           SUM(table.sale_amt) AS tota_sale_amt;

I get the error below:

================================================================================
Pig Stack Trace
---------------
ERROR 2103: Problem doing work on Longs

org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: grouped_all: Local Rearrange[tuple]{tuple}(false) - scope-1317 Operator Key: scope-1317): org.apache.pig.backend.executionengine.ExecException: ERROR 2103: Problem doing work on Longs
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:289)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POLocalRearrange.getNextTuple(POLocalRearrange.java:263)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigCombiner$Combine.processOnePackageOutput(PigCombiner.java:183)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigCombiner$Combine.reduce(PigCombiner.java:161)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigCombiner$Combine.reduce(PigCombiner.java:51)
        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
        at org.apache.hadoop.mapred.Task$NewCombinerRunner.combine(Task.java:1645)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1611)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
        at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:700)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2103: Problem doing work on Longs
        at org.apache.pig.builtin.AlgebraicLongMathBase.doTupleWork(AlgebraicLongMathBase.java:84)
        at org.apache.pig.builtin.AlgebraicLongMathBase$Intermediate.exec(AlgebraicLongMathBase.java:108)
        at org.apache.pig.builtin.AlgebraicLongMathBase$Intermediate.exec(AlgebraicLongMathBase.java:102)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:330)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNextTuple(POUserFunc.java:369)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.getNext(PhysicalOperator.java:333)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.processPlan(POForEach.java:378)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNextTuple(POForEach.java:298)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:281)
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Number
        at org.apache.pig.builtin.AlgebraicLongMathBase.doTupleWork(AlgebraicLongMathBase.java:77)
================================================================================

Since I am reading the table with the HCatalog loader, and the column data type in Hive is string, I have also tried casting in the script, but I still get the same error.

Answer 1 (oxcyiej7):

I don't have HCatalog installed on my system, so I tried it with a plain file, but the approach and code below will work for you.
1. SUM works only on numeric types (int, long, float, double, bigdecimal, biginteger, or bytearray cast as double). It looks like the sale_amt column is a string, so you need to cast this column to long or double before applying SUM.
2. You should not use store as a field name, because it is a reserved keyword in Pig; you have to rename it to something else or you will get an error. I renamed it to stores.
Example:

Table:

1       2014-12-14      101     10007655
1       2014-12-14      101     10007654
1       2014-12-14      101     10007544
6       2014-12-14      104     100086544
8       2014-12-14      101     1000000
9       2014-12-14      106     1000000

Pig script:

A = LOAD 'table' USING PigStorage() AS (store:chararray,trn_date:chararray,dept_id:chararray,sale_amt:chararray);
B = FOREACH A GENERATE $0 AS stores,trn_date,dept_id,(long)sale_amt; --Renamed the variable store to stores and typecasted the sale_amt to long.
C = GROUP B BY (stores,trn_date,dept_id);
D = FOREACH C GENERATE FLATTEN(group),SUM(B.sale_amt);
DUMP D;

Output:

(1,2014-12-14,101,30022853)
(6,2014-12-14,104,100086544)
(8,2014-12-14,101,1000000)
(9,2014-12-14,106,1000000)
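Since the example above was verified only with PigStorage, here is an untested sketch of how the same fix (cast plus rename) could be applied back to the original HCatLoader setup; the relation names are illustrative:

raw = LOAD 'table' USING org.apache.hcatalog.pig.HCatLoader();
-- rename the reserved word 'store' via position and cast the string sale_amt to long
casted = FOREACH raw GENERATE $0 AS stores, $1 AS trn_date, $2 AS dept_id, (long)$3 AS sale_amt;
grp = GROUP casted BY (stores, trn_date, dept_id);
result = FOREACH grp GENERATE FLATTEN(group), SUM(casted.sale_amt);
DUMP result;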
