Error when running an INSERT query via PyHive

wrrgggsh · posted 2021-06-24 · in Hive

I'm using a basic script to test access to my HiveServer2 through PyHive. A SELECT returns the expected results, but when I try to insert a new row I hit the following error, which I'm finding hard to debug:

Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/yootel/.local/lib/python3.6/site-packages/pyhive/hive.py", line 365, in execute
_check_status(response)
File "/home/yootel/.local/lib/python3.6/site-packages/pyhive/hive.py", line 495, in _check_status
raise OperationalError(response)
pyhive.exc.OperationalError: TExecuteStatementResp(status=TStatus(statusCode=3, infoMessages=['*org.apache.hive.service.cli.HiveSQLException:Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. No such file or directory:28:27', 'org.apache.hive.service.cli.operation.Operation:toSQLException:Operation.java:335', 'org.apache.hive.service.cli.operation.SQLOperation:runQuery:SQLOperation.java:226', 'org.apache.hive.service.cli.operation.SQLOperation:runInternal:SQLOperation.java:263', 'org.apache.hive.service.cli.operation.Operation:run:Operation.java:247', 'org.apache.hive.service.cli.session.HiveSessionImpl:executeStatementInternal:HiveSessionImpl.java:541', 'org.apache.hive.service.cli.session.HiveSessionImpl:executeStatement:HiveSessionImpl.java:516', 'sun.reflect.GeneratedMethodAccessor31:invoke::-1', 'sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43', 'java.lang.reflect.Method:invoke:Method.java:498', 'org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:78', 'org.apache.hive.service.cli.session.HiveSessionProxy:access$000:HiveSessionProxy.java:36', 'org.apache.hive.service.cli.session.HiveSessionProxy$1:run:HiveSessionProxy.java:63', 'java.security.AccessController:doPrivileged:AccessController.java:-2', 'javax.security.auth.Subject:doAs:Subject.java:422', 'org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1730', 'org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:59', 'com.sun.proxy.$Proxy42:executeStatement::-1', 'org.apache.hive.service.cli.CLIService:executeStatement:CLIService.java:282', 'org.apache.hive.service.cli.thrift.ThriftCLIService:ExecuteStatement:ThriftCLIService.java:563', 'org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement:getResult:TCLIService.java:1557', 
'org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement:getResult:TCLIService.java:1542', 'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39', 'org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39', 'org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56', 'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286', 'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1149', 'java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:624', 'java.lang.Thread:run:Thread.java:748', '*org.apache.hadoop.io.nativeio.NativeIOException:No such file or directory:62:35', 'org.apache.hadoop.io.nativeio.NativeIO$POSIX:chmodImpl:NativeIO.java:-2', 'org.apache.hadoop.io.nativeio.NativeIO$POSIX:chmod:NativeIO.java:234', 'org.apache.hadoop.fs.RawLocalFileSystem:setPermission:RawLocalFileSystem.java:861', 'org.apache.hadoop.fs.ChecksumFileSystem$1:apply:ChecksumFileSystem.java:508', 'org.apache.hadoop.fs.ChecksumFileSystem$FsOperation:run:ChecksumFileSystem.java:489', 'org.apache.hadoop.fs.ChecksumFileSystem:setPermission:ChecksumFileSystem.java:511', 'org.apache.hadoop.fs.FileSystem:mkdirs:FileSystem.java:676', 'org.apache.hadoop.mapreduce.JobResourceUploader:mkdirs:JobResourceUploader.java:660', 'org.apache.hadoop.mapreduce.JobResourceUploader:uploadResourcesInternal:JobResourceUploader.java:174', 'org.apache.hadoop.mapreduce.JobResourceUploader:uploadResources:JobResourceUploader.java:135', 'org.apache.hadoop.mapreduce.JobSubmitter:copyAndConfigureFiles:JobSubmitter.java:99', 'org.apache.hadoop.mapreduce.JobSubmitter:submitJobInternal:JobSubmitter.java:194', 'org.apache.hadoop.mapreduce.Job$11:run:Job.java:1570', 'org.apache.hadoop.mapreduce.Job$11:run:Job.java:1567', 'java.security.AccessController:doPrivileged:AccessController.java:-2', 'javax.security.auth.Subject:doAs:Subject.java:422', 
'org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1730', 'org.apache.hadoop.mapreduce.Job:submit:Job.java:1567', 'org.apache.hadoop.mapred.JobClient$1:run:JobClient.java:576', 'org.apache.hadoop.mapred.JobClient$1:run:JobClient.java:571', 'java.security.AccessController:doPrivileged:AccessController.java:-2', 'javax.security.auth.Subject:doAs:Subject.java:422', 'org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1730', 'org.apache.hadoop.mapred.JobClient:submitJobInternal:JobClient.java:571', 'org.apache.hadoop.mapred.JobClient:submitJob:JobClient.java:562', 'org.apache.hadoop.hive.ql.exec.mr.ExecDriver:execute:ExecDriver.java:423', 'org.apache.hadoop.hive.ql.exec.mr.MapRedTask:execute:MapRedTask.java:149', 'org.apache.hadoop.hive.ql.exec.Task:executeTask:Task.java:205', 'org.apache.hadoop.hive.ql.exec.TaskRunner:runSequential:TaskRunner.java:97', 'org.apache.hadoop.hive.ql.Driver:launchTask:Driver.java:2664', 'org.apache.hadoop.hive.ql.Driver:execute:Driver.java:2335', 'org.apache.hadoop.hive.ql.Driver:runInternal:Driver.java:2011', 'org.apache.hadoop.hive.ql.Driver:run:Driver.java:1709', 'org.apache.hadoop.hive.ql.Driver:run:Driver.java:1703', 'org.apache.hadoop.hive.ql.reexec.ReExecDriver:run:ReExecDriver.java:157', 'org.apache.hive.service.cli.operation.SQLOperation:runQuery:SQLOperation.java:224'], sqlState='08S01', errorCode=1, errorMessage='Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. No such file or directory'), operationHandle=None)

Here is what my code does:

from pyhive import hive

conn = hive.Connection(host="x.x.x.x", database="my_db", username="hadoop")
cursor = conn.cursor()
cursor.execute("select * from customer")
cursor.fetchall()
# this query works fine and returns the appropriate data
cursor.execute("insert into customer (id, name, balance) values (5, 'customer name', 10)")
# this query generates the error
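As a side note on reading the error: in the TExecuteStatementResp above, the entries of the infoMessages list that start with '*' mark the beginning of each server-side exception, and the last such entry is the innermost (root) cause. A small, hypothetical helper (not part of PyHive) can pull those out of the payload; the sample entries below are copied from the traceback in the question:

```python
def root_causes(info_messages):
    """Return the exception headers (entries starting with '*') from a
    HiveSQLException infoMessages list; the last one is the innermost cause."""
    return [m.lstrip("*") for m in info_messages if m.startswith("*")]

# Entries taken from the traceback above (truncated to the two '*' markers
# plus one stack frame each):
messages = [
    "*org.apache.hive.service.cli.HiveSQLException:Error while processing statement: "
    "FAILED: Execution Error, return code 1 from "
    "org.apache.hadoop.hive.ql.exec.mr.MapRedTask. No such file or directory:28:27",
    "org.apache.hive.service.cli.operation.Operation:toSQLException:Operation.java:335",
    "*org.apache.hadoop.io.nativeio.NativeIOException:No such file or directory:62:35",
    "org.apache.hadoop.io.nativeio.NativeIO$POSIX:chmodImpl:NativeIO.java:-2",
]

# The innermost cause here is the NativeIOException raised while Hadoop was
# chmod-ing a directory during MapReduce job submission.
print(root_causes(messages)[-1])
```

Applied to this traceback, the root cause is the NativeIOException thrown from NativeIO chmod while JobResourceUploader was creating the job staging directory, i.e. the failure happens on the server during MapReduce job submission, not in the PyHive client.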

Any idea what the problem could be and how to fix it? I have already tried checking the Hive permissions.
