In Hadoop Hive, from_unixtime complains the data is a string when it expects int/bigint

vwkv1x7d  posted 2021-05-29 in Hadoop

My query is as follows:

SELECT
    to_date(from_unixtime(time_first_touch)) AS sDate
FROM (
    SELECT
        MIN(GET_JSON_OBJECT(swanviraw.textcol, '$.ev_time')) AS time_first_touch,
        COUNT(*) AS number_of_events
    FROM swanviraw
) v

It throws an error while compiling the statement: FAILED: SemanticException [Error 10014]: Line 2:10 Wrong arguments 'time_first_touch': No matching method for class org.apache.hadoop.hive.ql.udf.UDFFromUnixTime with (string). Possible choices: _FUNC_(bigint)  _FUNC_(bigint, string)  _FUNC_(int)  _FUNC_(int, string)
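The type mismatch can be reproduced with standalone expressions; the epoch value below is just an illustrative literal, not data from swanviraw:

    SELECT from_unixtime(1622264400);        -- OK: integer argument; returns a 'yyyy-MM-dd HH:mm:ss' string in the session time zone
    -- SELECT from_unixtime('1622264400');   -- fails with the same "no matching method ... with (string)" error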
Now, the point is that the query below works fine, and ev_time seems to hold int/bigint values, because MIN works well here:

SELECT
    MIN(GET_JSON_OBJECT(swanviraw.textcol, '$.ev_time')) AS time_first_touch,
    COUNT(*) AS number_of_events
FROM swanviraw

Any help is sincerely appreciated. Thanks.

acruukt9  1#

Since GET_JSON_OBJECT returns a JSON string, and as the error indicates, from_unixtime expects an int or bigint, you need to cast time_first_touch to bigint:

SELECT
    to_date(from_unixtime(time_first_touch)) AS sDate
FROM (
    SELECT
        MIN(CAST(GET_JSON_OBJECT(swanviraw.textcol, '$.ev_time') AS bigint)) AS time_first_touch,
        COUNT(*) AS number_of_events
    FROM swanviraw
) AS v
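A quick way to see why the cast is needed (the JSON literal below is a hypothetical example): get_json_object always returns a string, even when the underlying JSON value is numeric, so the value has to be converted before it reaches from_unixtime:

    SELECT GET_JSON_OBJECT('{"ev_time": 1622264400}', '$.ev_time');   -- returns the string '1622264400'
    SELECT from_unixtime(CAST('1622264400' AS bigint));               -- works once the value is a bigint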

or

SELECT
    to_date(from_unixtime(time_first_touch)) AS sDate
FROM (
    SELECT
        MIN(unix_timestamp(GET_JSON_OBJECT(swanviraw.textcol, '$.ev_time'))) AS time_first_touch,
        COUNT(*) AS number_of_events
    FROM swanviraw
) AS v
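Note that the two options are not interchangeable. With a single string argument, unix_timestamp parses timestamps in the default 'yyyy-MM-dd HH:mm:ss' format, so the second query only applies if ev_time is stored as such a date/time string; if ev_time is an epoch number serialized as a string (which the original error suggests), the CAST(... AS bigint) version is the one to use. A rough illustration with literal values:

    SELECT unix_timestamp('2021-05-29 05:00:00');   -- parses the default format into epoch seconds (value depends on the session time zone)
    SELECT unix_timestamp('1622264400');            -- typically NULL, since the string is not in the default format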
